• @mannycalavera@feddit.uk · 29 points · 1 year ago

    Is there an example of AI-generated images that aren’t hyper-realistic or don’t have perfect bokeh? I’m talking about an out-of-focus shot, or one where the subject looks like a regular slob like you and me?

    • @BetaDoggo_@lemmy.world · 23 points · 1 year ago

      It’s mostly bias in the training data. Most people aren’t posting mediocre images of themselves online, so models rarely see that. Most models are also fine-tuned specifically to avoid outputting that kind of stuff, because people don’t want it.

      An out-of-focus look is easy for most base models, but getting an average-looking person is harder.

      • @hoshikarakitaridia@sh.itjust.works · 7 points · 1 year ago

        I would usually try adding things to the prompt that you’d expect to find in a more casual scenario, like “smartphone” at half weight, or “video”, or maybe “Facebook”. Just meta information you’d associate with more casual photos. Maybe even add “photo”. Something like the sketch below.
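
        As a rough sketch of what I mean, assuming a plain diffusers Stable Diffusion pipeline (the model name, prompts, and steps here are only examples, and half-weight syntax like “(smartphone:0.5)” is a front-end convention such as AUTOMATIC1111’s, which plain diffusers doesn’t parse):

        # Sketch: bias the prompt toward casual, mediocre phone photos.
        # Model name and prompt text are illustrative, not a recommendation.
        import torch
        from diffusers import StableDiffusionPipeline

        pipe = StableDiffusionPipeline.from_pretrained(
            "runwayml/stable-diffusion-v1-5",  # any SD-1.5-style checkpoint
            torch_dtype=torch.float16,
        ).to("cuda")

        prompt = (
            "candid smartphone photo of an average person in a cluttered kitchen, "
            "harsh flash, slightly out of focus, low quality jpeg, "
            "uploaded to Facebook in 2012"
        )
        # Steer away from the model's usual "pretty" defaults.
        negative_prompt = "professional photography, bokeh, studio lighting, flawless skin"

        image = pipe(prompt, negative_prompt=negative_prompt, num_inference_steps=30).images[0]
        image.save("casual_photo.png")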

    • @Mahlzeit@feddit.de · 4 points · 1 year ago

      The models are deliberately engineered to create “good” images, just like cameras get autofocus, anti-shake and stuff. There are many tools that will auto-prettify people, not so many for the reverse.

      There are enough imperfect images around for the model to know what that looks like.

    • @Unforeseen@sh.itjust.works · 4 points · 1 year ago (edited)

      I assumed this was because it’s effectively producing an average. Human attraction is highly sensitive to symmetry, so the averaging creates that symmetry as a side effect of how it works.