AI bros... it's so fucking over...

  1. 3 months ago
    Anonymous

    But bros I thought self attention is good?

  2. 3 months ago
    Anonymous

    So, So, So, So, So, I'm fucking saying So, So, So,

    • 3 months ago
      Anonymous

      gay gay fgt gay
      I have similar qualifications to this anon, but I also know hashmaps; I also coded with the software and have the ability to work

    • 3 months ago
      Anonymous

      https://i.imgur.com/JSanMfe.png

      >so

      What's wrong with the so word

      • 3 months ago
        Anonymous

        it's a soi word, like "though" at the end of sentences or "y'all"

        • 3 months ago
          Anonymous

          I type whatever the fuck I want though
          Y'all are a bunch of morons

    • 3 months ago
      Anonymous

      he only said "so" one time

    • 3 months ago
      Anonymous

      And And And And So, fucking I'm So

    • 3 months ago
      Anonymous

      So, I was looking at this twitter screencap thread and I wanted to yell moron at the top of my lungs.

  3. 3 months ago
    Anonymous

    Anyone find the parallels between AI and inbreds to be hilarious?

    • 3 months ago
      Anonymous

      I'd say it's more comparable to the idea of generation loss. Wait.... can inbreeding be mathematically comparable to generation loss? More studies are needed. I smell a paper.

    • 3 months ago
      Anonymous

      it's more like losing your imagination as you age

  4. 3 months ago
    Anonymous

    source ?

    • 3 months ago
      Anonymous

      I've seen artcels claim this a few times now but never seen proof

      you don't need proofs on twitter
      just post something outrageous enough and people who want to believe it will

      I wish you gentlemen read more papers instead of solely living off of twitter/leddit threads between your gaming breaks. Shame that BOT doesn't care as much for AI
      https://arxiv.org/abs/2305.17493v2
      >The Curse of Recursion: Training on Generated Data Makes Models Forget
      >In this paper we consider what the future might hold. What will happen to GPT-{n} once LLMs contribute much of the language found online? We find that use of model-generated content in training causes irreversible defects in the resulting models, where tails of the original content distribution disappear. We refer to this effect as Model Collapse and show that it can occur in Variational Autoencoders, Gaussian Mixture Models and LLMs. We build theoretical intuition behind the phenomenon and portray its ubiquity amongst all learned generative models. We demonstrate that it has to be taken seriously if we are to sustain the benefits of training from large-scale data scraped from the web. Indeed, the value of data collected about genuine human interactions with systems will be increasingly valuable in the presence of content generated by LLMs in data crawled from the Internet.
      Model collapse is inevitable for AI especially if it depended on cheap web scraping
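
      The shrinking-tails effect from the paper is easy to reproduce in miniature (a toy sketch, not the paper's actual experiment): fit a Gaussian to data by maximum likelihood, sample a fresh dataset from the fit, re-fit on those samples, and repeat. The biased MLE variance estimate shrinks a little on average each generation, so the tails of the distribution vanish over time.

      ```python
      import random
      import statistics

      def refit_and_sample(data, n):
          # "Train" a model on the data (MLE Gaussian fit),
          # then "generate" the next generation's training set from it.
          mu = statistics.fmean(data)
          sigma = statistics.pstdev(data)  # MLE estimate: shrinks in expectation
          return [random.gauss(mu, sigma) for _ in range(n)]

      random.seed(0)
      n = 200
      data = [random.gauss(0.0, 1.0) for _ in range(n)]  # "real" human data
      start_var = statistics.pvariance(data)

      # Each generation of the "model" trains only on the previous one's output.
      for generation in range(1000):
          data = refit_and_sample(data, n)

      end_var = statistics.pvariance(data)
      print(start_var, end_var)  # variance collapses; the tails are gone
      ```

      This is the one-dimensional cartoon of the paper's claim: without fresh human data in the loop, sampling error compounds and the learned distribution narrows toward a point.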

      • 3 months ago
        Anonymous

        reading papers costs money wtf this is BOT - consumer technology
        if it affects our ability to coom and play vidya, we're not interested in spending
        most of this board is NEET and/or robots

      • 3 months ago
        Anonymous

        BASED. MY JOB IS SAVED!!! SUCK MY COCK CHAT GPEET

        • 3 months ago
          Anonymous

          For now, maybe.

      • 3 months ago
        Anonymous

        Didn’t read

      • 3 months ago
        Anonymous

        Non sequitur.
        The fact that people can do this (train on AI-generated content) doesn't mean it's what's being done right now.

  5. 3 months ago
    Anonymous

    duh

  6. 3 months ago
    Anonymous

    I've seen artcels claim this a few times now but never seen proof

    • 3 months ago
      Anonymous

      you don't need proofs on twitter
      just post something outrageous enough and people who want to believe it will

      • 3 months ago
        Anonymous

        >he posted, on BOT

  7. 3 months ago
    Anonymous

    > get overdosed on digital art
    > lose sensibility
    oh wow, who would have guessed? he just forgot to mention that human digital art is also affected

  8. 3 months ago
    Anonymous

    Models mark their outputs so they know not to learn from them if they show up in future training data. That's the entire reason adversarial stego works.

  9. 3 months ago
    Anonymous

    So ummmmm apparently you basically already made this thread and you're ummm apparently too stupid to come up with another one?? Is this the power of human "intelligence"?

  10. 3 months ago
    Anonymous

    >GUYS PLEASE DONT MAKE ANYMORE AI ART OR YOUR AI ART WILL BE RUINED TRUST ME
    okay

  11. 3 months ago
    Anonymous

    That's a pile of shit. People only post AI art that is good quality. Which means the AI art that gets posted online is already pre-filtered and of higher average quality than the model's raw, unfiltered output.

    • 3 months ago
      Anonymous

      you are wrong on all counts

  12. 3 months ago
    Anonymous

    HAHAHAHA AI BROS BTFO

  13. 3 months ago
    Anonymous

    >so

  14. 3 months ago
    Anonymous

    wouldnt the opposite be true?
    if you think about one of these SD models as having a large number of possible outputs, and people only upload the best outputs to the internet, wouldn't a model trained on those just improve and exclude bad outputs even more?

  15. 3 months ago
    Anonymous

    >3.7M views
    >76k likes
    The more I actually take the time to learn how AI actually works, the more obvious it becomes when people are completely talking out of their ass

  16. 3 months ago
    Anonymous

    We trained AI with AI so the AI can emulate AI.

  17. 3 months ago
    Anonymous

    >People post the best outputs from SD
    >All the best output begins to get incorporated into newer models
    >newer models biased by the better output
    >newer models produce outputs with better training
    ???

  18. 3 months ago
    Anonymous

    AI art probably needs smoothing to remove AI artifacts before being used in training.
    The artifacts likely screw up the training process and make the AI art act more like random noise added to the training set.

    • 3 months ago
      Anonymous

      It's amazing how quickly AI goes to shit when you try to train it with AI pictures.
      I tried to supplement a model with AI pics that had insufficient training data and just by throwing in 20% of AI pics it fucked up the model completely.
      Even if the data you use looks as realistic as possible, it somehow still manages to screw up the results.

      It's interesting to see how this plays out because art sites for example have gotten absolutely saturated with AI pictures and so have places like Pinterest.
      If you scrape the web for that data it's not going to work.
      This is going to lead to only well thought out models with hand picked training data standing out of the shit tier noise that has been introduced into the system.

      It's not just the noise, it's the end result that still sucks balls. You can always tell when a picture has been made with AI.
      Throw those pics into the training data and you're in for a shit time; do it a few times over and the result is a disaster.

      • 3 months ago
        Anonymous

        kys garden gnome

      • 3 months ago
        Anonymous

        Just save it as jpeg with 50% quality before feeding it back
        Problem solved

      • 3 months ago
        Anonymous

        But Stable Diffusion is supposed to add an invisible watermark to generated images precisely so that future training can avoid those images. However, there have been many Stable Diffusion models trained on Midjourney images. Sure, it's not generations deep yet, but they look fine.

        https://i.imgur.com/JSanMfe.png

        >so

        Therefore?
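
        The tagging idea can be sketched with a toy example. To be clear, this is NOT Stable Diffusion's actual scheme (that one is a DWT/DCT-based invisible watermark); this is just naive least-significant-bit tagging, to illustrate how a generator could mark its outputs so scrapers can detect and skip them:

        ```python
        # Toy LSB watermark: hide a small tag in the low-order bits of the
        # first few pixel values. Purely illustrative -- a real scheme must
        # survive resizing, cropping, and re-encoding, which this does not.

        MARK = 0b1010110  # hypothetical 7-bit "made by a model" tag

        def embed(pixels, mark=MARK, bits=7):
            out = list(pixels)
            for i in range(bits):
                bit = (mark >> i) & 1
                out[i] = (out[i] & ~1) | bit  # overwrite the pixel's LSB
            return out

        def detect(pixels, bits=7):
            mark = 0
            for i in range(bits):
                mark |= (pixels[i] & 1) << i
            return mark

        image = [200, 13, 77, 90, 41, 66, 128, 255]  # fake 8-pixel "image"
        tagged = embed(image)
        assert detect(tagged) == MARK  # a scraper could now skip this image
        ```

        The catch the post points out still applies: the check only works if every generator cooperates and every pipeline bothers to look, and a single JPEG re-encode wipes an LSB tag out entirely.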

      • 3 months ago
        Anonymous

        >It's not just the noise it's the end result that still sucks balls. You can always tell when a picture has been made with AI.
        I'm not talking about noise in the image. An AI generation isn't created with the same intentionality as a human drawing, which means the cues the trainer looks for won't be as relevant.
        This is what I mean by noise: even if you pick the best-looking generated images, at the data level they contribute differently to the set.

  19. 3 months ago
    Anonymous

    NonDB prog
    >oh no the DB prog has access to that DB
    Tragedy

  20. 3 months ago
    Anonymous

    >I was told that I would have to worry about an autonomous drone shooting a hellfire missile at me.
    >Instead, the problem is shitty AI-generated pictures based on other AI-generated pictures.

  21. 3 months ago
    Anonymous

    This isn't actually happening, it's just bitter artist cope.

  22. 3 months ago
    Anonymous

    lol

    • 3 months ago
      Anonymous

      why down syndrome fingers?

  23. 3 months ago
    Anonymous

    every day is repost day

  24. 3 months ago
    Anonymous

    YOU think the singularity is in 2033. I think it is in 2031!

  25. 3 months ago
    Anonymous

    Source: trust me bro

  26. 3 months ago
    Anonymous

    Generative AI? More like DEGENERATIVE AI.

  27. 3 months ago
    Anonymous

    >the programs are now starting to pull from it
    Is that how that works

  28. 3 months ago
    Anonymous

    yes, the age of free web scraping and data harvesting is coming to an end.

    no more free lunch from there

  29. 3 months ago
    Anonymous

    Anyone who has been on the internet more than a minute knows there is a lot of shitty art around.
    Whether it is by curating the data set they train on or detecting in code the quality of the image, they'll find some solution to generated images poisoning the training.

    • 3 months ago
      Anonymous

      Yeah, I downloaded a bunch of images from one of the boorus to train a model on and a lot of the art is so shitty I can't use it. I don't know how these "artists" really have room to criticize anything.
