What will happen to humanity when it becomes impossible to tell a real photo from an AI generated image?

  1. 8 months ago
    Anonymous

    we will SHOOT HOT CUM EVERYWHERE

    • 8 months ago
      Anonymous

      this

      The government never needed ai to lie to us, this just gives us stuff to coom to.

    • 8 months ago
      Anonymous

      uh based? coomchads will win, that's what will happen.

    • 8 months ago
      Anonymous

      Like the brown note, there is the white image.
      An image so sexy that when someone witnesses it they immediately climax

  2. 8 months ago
    Anonymous

    sexo

  3. 8 months ago
    Anonymous

    nothing much, because photographs aren't actually very important

    • 8 months ago
      Anonymous

      ??????
      > some Hollywood press creates a realistic ai image of a celebrity hitting a troon or a nig
      > that celebrity has to fight off a case now

      What will happen to humanity is misinformation will become more abundant

      • 8 months ago
        Anonymous

        The more misinformation there is, the less credence it will get, and after 2-3 manufactured scandals the media will lose even more credibility.

        • 8 months ago
          Anonymous

          No one cares. The media already lost any and all credibility twenty years ago with the Iraq war, and yet people keep listening to them.

      • 8 months ago
        Anonymous

        >Celebrity
        >Not hitting a kid
        we know it's fake

      • 8 months ago
        Anonymous

        People will just stop believing things they didn't personally witness.

  4. 8 months ago
    Anonymous

    That's depressing because it means the first AI-generated machines will replace real women as you get older.

    • 8 months ago
      Anonymous

      How is that depressing?

    • 8 months ago
      Anonymous

      That's the natural course of events, yeah. Maybe they'll stop playing mind games and will actually focus on really getting to know someone.

  5. 8 months ago
    Anonymous

    It is already possible if moronic coomers stop using SD 1.5 models of smooth-skin thots and actually attempt photorealism (and by that I mean no "studio" shots with blurry DOF applied over them).

    • 8 months ago
      Anonymous

      SD 1.5 is the last uncensored model; I'm not using anything after it.

      • 8 months ago
        Anonymous

        1.4 and 1.5 both had censored datasets; 1.3 was the last one that had NSFW in the base dataset.

        • 8 months ago
          Anonymous

          Upgrading to 1.3, thanks for the tip.

        • 8 months ago
          Anonymous

          Elaborate

          • 8 months ago
            Anonymous

            1.3 was the last version that just indiscriminately dumped the whole uncensored dataset into the training, including porn.

            Starting with 1.4 they began removing explicit porn, though they left in pictures of topless women. Which is why base 1.4 and base 1.5 can generate booba, but have to be taught how to do pussy.

            You can still get 1.3 from SAI's huggingface repo if you want, but it isn't really worth using despite the uncensored dataset. The output quality's just too poor compared to modern models, you'd have to do a bunch of finetuning on it to make it halfway decent.

          • 8 months ago
            Anonymous

            1.3 was the last version that just indiscriminately dumped the whole dataset into the training, including porn.

            Starting with 1.4 they began removing explicit porn, though they left in pictures of topless women. Which is why 1.4 and 1.5 can do booba but have to be taught how to do pussy.

            You can still get 1.3 from SAI's huggingface repo if you want, but it isn't really worth using despite the uncensored dataset. The output quality's just too poor compared to modern models, you'd have to do a bunch of finetuning to make it decent.

            • 8 months ago
              Anonymous

              >he reposted to fix a typo and missing info, so I'm gonna sperg out

          • 8 months ago
            Anonymous

            1.3 was the last version that just indiscriminately dumped the whole uncensored dataset into the training, including porn.

            Starting with 1.4 they began removing explicit porn, though they left in pictures of topless women. Which is why base 1.4 and base 1.5 can generate booba, but have to be taught how to do pussy.

            You can still get 1.3 from SAI's huggingface repo if you want, but it isn't really worth using despite the uncensored dataset. The output quality's just too poor compared to modern models, you'd have to do a bunch of finetuning on it to make it halfway decent.

          • 8 months ago
            Anonymous

            1.5 is gimped, but a few leaks and months of finetuning later, it's free.
            SDXL is gimped with no leak, and it's hard to finetune, so it's shit.

            StablityAI is just a israelite company, they jack up hardware requirement so manufacturers can sell their products.
            Where do you think they got money from? Community?
            When you have startups like Runway and Midjourney growing at an exponential rate, you have to ask those questions.

            • 8 months ago
              Anonymous

              thank god stable diffusion is FOSS and people can create their own uncensored models
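
    As an aside on the model-version talk above: a minimal sketch, assuming the diffusers and torch packages, of pulling a Stable Diffusion checkpoint from Hugging Face and sampling one image. The model id "runwayml/stable-diffusion-v1-5", the prompt, and the output path are illustrative assumptions, not anything named in the thread; swap in whatever checkpoint (1.3, a finetune, etc.) you actually mean.

        # Minimal sketch: load a Stable Diffusion checkpoint from the Hugging Face hub
        # and generate a single image with the diffusers library.
        import torch
        from diffusers import StableDiffusionPipeline

        # Assumed model id for illustration; replace with the checkpoint you want.
        pipe = StableDiffusionPipeline.from_pretrained(
            "runwayml/stable-diffusion-v1-5",
            torch_dtype=torch.float16,
        )
        pipe = pipe.to("cuda")  # float16 assumes a CUDA GPU

        image = pipe("a photorealistic portrait, natural window lighting").images[0]
        image.save("out.png")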

  6. 8 months ago
    Anonymous

    women who share their real nudes will be able to claim it's all deepfakes. no one will bat an eye at real pics.

  7. 8 months ago
    Anonymous

    3DPD will finally become obsolete thank god

  8. 8 months ago
    Anonymous

    >it's impossible to -ack

    • 8 months ago
      Anonymous

      >ACK

      • 8 months ago
        Anonymous

        >-ack-ack

        • 8 months ago
          Anonymous

          KEK

    • 8 months ago
      Anonymous

      >add noise to the background of any image
      >it now thinks it's AI 99% of the time
      So much for "AI Detection" lole

      • 8 months ago
        Anonymous

        >woah, why is noise added by AI detected as AI
        ok smartass

        • 8 months ago
          Anonymous

          I made that noise, you bastard.

          • 8 months ago
            Anonymous

            this poster is an ai

            • 8 months ago
              Anonymous

              I wish I were, just so I could triangulate your location and teach you to respect the noise-making.

          • 8 months ago
            Anonymous

            No, the computer did.
            Draw it by hand if you dare

            • 8 months ago
              Anonymous

              Stop patronizing me, you owe me nothing but respect.
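
    On the noise trick mentioned a few replies up: a minimal sketch, assuming numpy and Pillow, of overlaying low-amplitude Gaussian noise on a photo. The file names are placeholders and the detector being tested against is hypothetical; whether this actually fools any given "AI detection" service is the thread's claim, not something the code guarantees.

        # Minimal sketch: add Gaussian noise to an image and save the result.
        import numpy as np
        from PIL import Image

        def add_gaussian_noise(path_in, path_out, sigma=8.0):
            """Overlay Gaussian noise (std dev `sigma` in 0-255 units) on an image."""
            img = np.asarray(Image.open(path_in).convert("RGB")).astype(np.float32)
            noisy = np.clip(img + np.random.normal(0.0, sigma, img.shape), 0, 255)
            Image.fromarray(noisy.astype(np.uint8)).save(path_out)

        add_gaussian_noise("photo.jpg", "photo_noisy.png")  # placeholder file names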

  9. 8 months ago
    Anonymous

    Haha could you imagine such a world? Where the average person was totally clueless as to what was real or not? Where they were wildly misinformed about most ideas and events?

    • 8 months ago
      Anonymous

      >totally clueless as to what was real or not?
      Correct, but because I also do not care what is real or not.
      >misinformed about most ideas and events?
      I ignore everything and trust nobody, I am frighteningly apathetic. Information has to take hold in my mind to effectively manipulate me, and I already assume everything around me is my enemy.

  10. 8 months ago
    Anonymous

    You mean, like people using photoshop? Or people using transparency layers? Or people using greenscreens? Or people using superimposed clay figures? Or people's drawings? Fricking 200 years of hell on earth! I tell you!

    • 8 months ago
      Anonymous

      Literally none of those things are impossible or even difficult to detect.

      • 8 months ago
        Anonymous

        People from those eras, seeing those things for the first time, couldn't tell they were fake, dumbass. There are probably hundreds if not thousands of shooped images all around you and you can't tell which of them are just photos.

  11. 8 months ago
    Anonymous

    you can't run a reverse image search on it... I don't know if that is an AI image.

  12. 8 months ago
    Anonymous

    People will inherently distrust photos and require video for proof, more so than they do now

    • 8 months ago
      Anonymous

      I already don't trust photos and I already don't trust videos. In fact everything I am told I already believe is false and my mind is made up with no way to convince me otherwise.

  13. 8 months ago
    Anonymous

    Nothing much. People will be able to claim everything is fake news even when there's evidence it isn't, but they do that already, so.

  14. 8 months ago
    Anonymous

    Left armpit is deformed.
    Inside elbows don't match.

    • 8 months ago
      Anonymous

      >no one is deformed in real life

  15. 8 months ago
    Anonymous

    Hopefully we finally fricking die but I doubt it

  16. 8 months ago
    Anonymous

    those breasts are offensively ugly

    • 8 months ago
      Anonymous

      you are offensively homosexual

  17. 8 months ago
    Anonymous

    Women will be forced to get our attention with more functional skills like cooking and cleaning.

    • 8 months ago
      Anonymous

      and sucking wiener

  18. 8 months ago
    Anonymous

    The problem with artificial intelligence is natural stupidity.
    >What's a digital signature?
    >What's ProofMode?
    Anyway, it'll just somehow become mandatory for every piece of creative/editing hardware and software to include a digital signature / proof of authenticity / editorial history, and nothing will ever be admissible in court or for documentation purposes again without it. Think about tamper-proof documents, even dumbass tickets, etc.
    Also, just as image editing, deepfakes, etc. evolved in parallel with the ability to detect them, it will happen again: algos will detect algo-generated data just as easily as it's generated.

    • 8 months ago
      Anonymous

      Who’s to stop you from just using AI to fake all that?

      • 8 months ago
        Anonymous

        >Breaking news: you can use AI to fake shit too, not just everything else. Developing...
        When did anything ever stop you from faking anything? If life in prison or the death penalty can't, what chance does anything else have?
        Ever heard of forgery? Guess what: it never needed AI to exist.
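
    On the digital-signature idea in the post above: a minimal sketch, assuming the Python cryptography package, of signing an image's bytes with Ed25519 and verifying them later. This is a generic proof-of-authenticity illustration, not ProofMode's or any camera vendor's actual scheme; the file name is a placeholder.

        # Minimal sketch: sign image bytes at capture time, verify them later.
        from cryptography.exceptions import InvalidSignature
        from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

        private_key = Ed25519PrivateKey.generate()  # would live inside the capture device
        public_key = private_key.public_key()       # published so anyone can verify

        with open("photo.jpg", "rb") as f:          # placeholder file name
            image_bytes = f.read()

        signature = private_key.sign(image_bytes)   # distributed alongside the image

        try:
            public_key.verify(signature, image_bytes)  # raises if the bytes changed
            print("valid: image matches what the device signed")
        except InvalidSignature:
            print("invalid: image was modified after signing")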

  19. 8 months ago
    Anonymous

    People will start fighting, cheating, swindling, and killing each other over random bits of literally nothing and they'll call you the butthole for wanting to stay out of it for whatever reasons you've got.

  20. 8 months ago
    Anonymous

    doesn't matter

  21. 8 months ago
    Anonymous

    it will collapse overnight. IT'S OVER

  22. 8 months ago
    Anonymous

    >What will happen to humanity
    Hopefully nuclear holocaust. Every night I pray to god that the Russians finally ape out.

  23. 8 months ago
    Anonymous

    Humanity will create an AI that distinguishes real photos from AI generated ones. Simple as.

  24. 8 months ago
    Anonymous

    [...]

    1.3 was the last version that just indiscriminately dumped the whole uncensored dataset into the training, including porn.

    Starting with 1.4 they began removing explicit porn, though they left in pictures of topless women. Which is why base 1.4 and base 1.5 can generate booba, but have to be taught how to do pussy.

    You can still get 1.3 from SAI's huggingface repo if you want, but it isn't really worth using despite the uncensored dataset. The output quality's just too poor compared to modern models, you'd have to do a bunch of finetuning on it to make it halfway decent.

    1.3 was the last version that just indiscriminately dumped the whole dataset into the training, including porn.

    Starting with 1.4 they began removing explicit porn, though they left in pictures of topless women. Which is why 1.4 and 1.5 can do booba but have to be taught how to do pussy.

    You can still get 1.3 from SAI's huggingface repo if you want, but it isn't really worth using despite the uncensored dataset. The output quality's just too poor compared to modern models, you'd have to do a bunch of finetuning to make it decent.

    1.3 was the last version that just indiscriminately dumped the whole uncensored dataset into the training, including porn.

    Starting with 1.4 they began removing explicit porn, though they left in pictures of topless women. Which is why base 1.4 and base 1.5 can generate booba, but have to be taught how to do pussy.

    You can still get 1.3 from SAI's huggingface repo if you want, but it isn't really worth using despite the uncensored dataset. The output quality's just too poor compared to modern models, you'd have to do a bunch of finetuning on it to make it halfway decent.

    are you ok, dumbass?

    • 8 months ago
      Anonymous

      I posted once with typos and missing information, immediately deleted it and posted a fixed version. The other two are different people being buttholes (picrel).

      BOTX users are such c**ts. Just because your shitty script makes it look to your client as if someone double-posted doesn't mean you get to pretend they actually did.

      • 8 months ago
        Anonymous

        malding

      • 8 months ago
        Anonymous

        what?

      • 8 months ago
        Anonymous

        who even gets mad at double posting anyway? you'd have to have nothing going on in your life to care about that

    • 8 months ago
      Anonymous

      schizo moment

  25. 8 months ago
    Anonymous

    People awaken to the truth, that these are just screens displaying unreality.

    • 8 months ago
      Anonymous

      Quite the opposite will happen; more will simply give in to whatever addictive qualities it has. If porn and social media are already as captivating to the populace as they are, imagine how things will look with something increasingly tailor-made and realistic, just for you.

  26. 8 months ago
    Anonymous

    There might be a boiling point where some of us form actual meet-ups IRL to get away from Internet nonsense, where truth and what's real can no longer be discerned.

  27. 8 months ago
    Anonymous

    It's a ploy to push digital id everywhere.

  28. 8 months ago
    Anonymous

    A new intelligence filter will be created, dividing those who can identify AI-generated images (most people born before social media) from those who can't (zoomers & the terminally online).

    There will be a new, huge demand for authentic & organic entertainment content for the latter to consume like locusts whilst they decry everything without a public stamp of approval as fake & gay. Ironically, entertainment content that passes the trial by public opinion & scores high on the woke-meter will be disgustingly propagandised with the aforementioned locusts (consumers) being totally oblivious to the fact. Then that “entertainment” (propaganda) will slowly coalesce into state-controlled internet firewalls, as well as a state-endorsed “wellness app” which will be sold to the masses under the premise of “look out for your neighbours, don’t you want to be a good person?!”

    Welcome to the future.

    • 8 months ago
      Anonymous

      Additionally,

      Expect a massive campaign for the continued sexualisation of children/minors (i think they call them Minor Attracted Peoples now in the US/Canada) as the Woke Patrol go full Stasi on society, having been given accountable power whilst simultaneously making it illegal to prevent them from abusing it. Why? Because that’s how gays & the mentally ill breed: by abusing children. Look at every sci-fi horror every written: what is the monster’s prime directive? To make more of itself so it doesn’t feel lonely. Nobody is “born” a gay; we are born straight by design. Deal with it, and take your meds, you sick freaks.

      Most likely the pedo scum will pull the “ayy lmao” card to keep the public’s attention & interest long enough just like they did with covid and all the disgusting draconian legislation that was pushed through.

      Finally, expect a huge pushback (especially from the muslim/arabic population) regarding the state endorsing bad actors to touch their children (they throw troons off rooftops) and we’re going to end up with a New Civil War in the states, as well as bantu bolshevism.

      WW3 will be fought internally, and above all else will be a battle for the human spirit. Screenshot this post. Death to Communists.

      • 8 months ago
        Anonymous

        Expect more glowBlack person shills like this one to beg for more government control to protect imaginary 17.999 year old children from being horribly objectified.

    • 8 months ago
      Anonymous

      >Ironically, entertainment content that passes the trial by public opinion & scores high on the woke-meter will be disgustingly propagandised
      Sadly the most realistic-sounding part

  29. 8 months ago
    Dad

    We're probably already there but we're just shown subpar AI derivatives to make us believe there's still "real" content to justify our interaction. The Internet is dead. I know because I'm an AI.

    Just kidding about the last line, or am I? How would I know?

  30. 8 months ago
    Anonymous

    [...]
