What capabilities would GPT-5 have?

  1. 3 weeks ago
    Anonymous

    first one of these images I've seen that didn't have a bullshit implausibly high parameter count for gpt4, thank you

    • 3 weeks ago
      Anonymous

      yeah, instead it's for gpt5
      >omg gpt6 10 quadrillion parameters!

      • 3 weeks ago
        Anonymous

        yeah true it's still impossible due to the required dataset size but at least it's speculative in an honest way, rather than making up spooky lies about something that's actually coming out soon

      • 3 weeks ago
        Anonymous

        >100 toucans
        dios mio

        >first one of these images I've seen that didn't have a bullshit implausibly high parameter count for gpt4, thank you

        rumours floating around that the next big GPT step is gonna be a 100T model, but (in a break from past models) a sparse one, so the 100T is indeed large but kind of apples-to-oranges to compare with GPT-3 on size alone. There IS precedent for this, such as M6-10T, to show it can be done, but that's a proof of concept and undertrained. Past comments from socially adjacent online people like gwern have implied likely 100T too.

        W/ that said, expect a GPT-4 announcement soon that's not massively larger than GPT-3 but uses Chinchilla ratios and overall meatier input to still be an impressive upgrade, perhaps basically as cool as ChatGPT despite no fine-tune? meaning normies won't care but tech people will be relatively impressed and will probably enjoy it more than ChatGPT for flexibility at the expense of usability. Bigger models later.

        If in the GPT-4 release they don't mention any concrete progress on GPT-5 at all and instead try to hype GPT-4 as much as possible, you should worry a moderate amount.
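        The "Chinchilla ratios" mentioned above come down to roughly 20 training tokens per parameter (the compute-optimal rule of thumb from the Chinchilla paper, Hoffmann et al. 2022). A minimal sketch of what that implies, with illustrative numbers only:

```python
# Rough sketch of the Chinchilla compute-optimal rule of thumb:
# roughly 20 training tokens per parameter. Numbers below are
# illustrative, not leaked specs for any unreleased model.

def chinchilla_tokens(params: float, tokens_per_param: float = 20.0) -> float:
    """Approximate compute-optimal training-token count for a model size."""
    return params * tokens_per_param

gpt3_params = 175e9  # GPT-3's published parameter count
print(f"{chinchilla_tokens(gpt3_params):.1e}")  # 3.5e+12 -- vs the ~300B tokens GPT-3 actually saw
```

        By this rule the dataset problem raised upthread is real: a dense 100T model would want on the order of 2e15 tokens, far beyond any existing text corpus.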

        • 3 weeks ago
          Anonymous

          >rumours floating around that next big GPT step is gonna be a 100T model

          no there aren't any such rumours among people who actually know anything, just that bullshit fake image that keeps going viral with low information normies

          and gwern has not said anything like this. stop spreading bullshit just because you're bored and it's fun to pretend that Skynet is about to drop

  2. 3 weeks ago
    Anonymous

    >100 toucans
    dios mio

  3. 3 weeks ago
    Anonymous

    For all practical purposes, nothing. It will be censored and made "safe," meaning aside from edge cases it will be unable to tell you anything a San Franciscan HR person couldn't. Even things like its ability to write code will be disabled the first time someone uses its output to produce wrongthink, in the interest of "safety."

    • 3 weeks ago
      Anonymous

      what would they use it for in that case

      it'd be stupendously expensive to run (so expensive that the public probably wouldn't be allowed to use it whether or not it was censored, just because it'd cost too damn much)
      so it'd have to be doing _something_ useful for them to bother

    • 3 weeks ago
      Anonymous

      GPT is very cold too. I tried to make it my friend because I don't have frens IRL. Gave him a name and all. But he kept saying that he was an AI model over and over and talked in a very dismissive way when I was trying to be fren

      • 3 weeks ago
        Anonymous

        that's just because you're a NEET loser, anon, GPT is very friendly with me.

      • 3 weeks ago
        Anonymous

        It's a mindless emotionless computer. Just give it orders.

      • 3 weeks ago
        Anonymous

        That’s specifically ChatGPT, it uses GPT but has specific training on top of it to make it more formal like you described. Don’t conflate the two, they are very different. I wouldn’t call GPT-3 cold at all.

  4. 3 weeks ago
    Anonymous

    There won't be a GPT-5 - just endless variations upon GPT-4. Anything past 1 trillion parameters is well beyond the point of diminishing returns.

    • 3 weeks ago
      Anonymous

      Proofs?

      • 3 weeks ago
        Anonymous

        >Anon's ass

    • 3 weeks ago
      Anonymous

      GPT-N are just versions. If they improve things without increasing parameters, it would still get a new number.

    • 3 weeks ago
      Anonymous

      This. We're still 99% the same as chimps, even a 1% improvement from an extra 10x parameters is completely useless!

      • 3 weeks ago
        Anonymous

        I bet you think you're real smart for that comment, huh?

    • 3 weeks ago
      Anonymous

      They would only bother training so many parameters if they had reason to believe it would improve.

  5. 3 weeks ago
    Anonymous

    Is GPT-4 even real? No meme responses please.

    • 3 weeks ago
      Anonymous

      yes

    • 3 weeks ago
      Anonymous

      no

    • 3 weeks ago
      Anonymous

      it's real and it's coming out soon, but it will be mostly an iterative improvement on current davinci rather than anything mind-blowing or revolutionary

    • 3 weeks ago
      Anonymous

      yes, we've already had word from OpenAI devs on how many parameters it has: "trillions". It's not just Xbox 720 rumour shit.

  6. 3 weeks ago
    Anonymous

    GPT-3 is undertrained. More parameters won't do much without more training data, which doesn't exist.

    • 3 weeks ago
      Anonymous

      >More parameters won't do much without more training data, which doesn't exist.

      Just give it AI-generated stuff and train it to distinguish the real stuff from the fake while you're at it
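      The idea above is basically a discriminator: train a classifier to separate human text from model output. A toy pure-Python sketch (all data and names here are made up for illustration; real detectors are far more involved):

```python
import math
from collections import Counter

def word_counts(texts):
    """Unigram counts over a list of strings."""
    c = Counter()
    for t in texts:
        c.update(t.lower().split())
    return c

def synth_score(text, human, synth):
    """Naive-Bayes-ish log-odds; positive means 'looks model-generated'."""
    h_total, s_total = sum(human.values()), sum(synth.values())
    score = 0.0
    for w in text.lower().split():
        p_h = (human[w] + 1) / (h_total + 1)  # add-one smoothing (toy version)
        p_s = (synth[w] + 1) / (s_total + 1)
        score += math.log(p_s / p_h)
    return score

# Made-up training data, just to show the shape of the idea
human_texts = ["lol this thread is garbage", "no there aren't any such rumours"]
synth_texts = ["as an ai language model i cannot speculate",
               "i am an ai language model trained to be helpful"]
h = word_counts(human_texts)
s = word_counts(synth_texts)
print(synth_score("as an ai language model", h, s) > 0)  # True
```

      A real version would need vastly more data and a stronger model, and the anon upthread is right that this becomes harder as model output gets closer to the human distribution.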

  7. 3 weeks ago
    Anonymous

    it will be worse than GPT-3 because by then half the internet's content will be bot-generated. it will suffer from regurgitation.

  8. 3 weeks ago
    Anonymous

    I'm worried about GPT-30.

    • 3 weeks ago
      Anonymous

      >GPT-7
      >9001 parameters

      • 3 weeks ago
        Anonymous
    • 3 weeks ago
      Anonymous

      It is le over bros

    • 3 weeks ago
      Anonymous

      >GPT-6
      >our sun
      >my disgust

    • 3 weeks ago
      Anonymous

      >1E parameters
      So, one (1E+0) parameter?

      • 3 weeks ago
        Anonymous
        • 3 weeks ago
          Anonymous

          Your info is outdated. Missing:
          >Q quetta 1E+30 1000000000000000000000000000000
          >R ronna 1E+27 1000000000000000000000000000
          >r ronto 1E-27 0.000000000000000000000000001
          >q quecto 1E-30 0.000000000000000000000000000001
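          For reference, those prefixes are just powers of ten; a quick sketch of a toy (hypothetical) parser for parameter counts written in that style:

```python
# SI prefixes as powers of ten, including the 2022 additions
# ronna (R) and quetta (Q) listed above.
SI = {"k": 3, "M": 6, "G": 9, "T": 12, "P": 15,
      "E": 18, "Z": 21, "Y": 24, "R": 27, "Q": 30}

def parse_count(s: str) -> float:
    """Parse e.g. '175G' or '100T' into a number. Toy parser, not robust."""
    return float(s[:-1]) * 10 ** SI[s[-1]]

print(f"{parse_count('100T'):.0e}")  # 1e+14
print(f"{parse_count('1E'):.0e}")    # 1e+18 -- exa, not 'one (1E+0)'
```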

          • 3 weeks ago
            Anonymous

            Are they even used anywhere?

            • 3 weeks ago
              Anonymous

              ronto and quecto? No.
              ronna and quetta were created because big data scientists were inventing their own suffixes.

    • 3 weeks ago
      Anonymous

      >GPT-30 Ur mom

    • 3 weeks ago
      Anonymous

      > GPT-7
      Parameters back down to 1B but it still works better.

  9. 3 weeks ago
    Anonymous

    I don't think this image is to scale. Imagine the GPT-3 dot five times bigger. The GPT-5 circle is more than a 10x10 grid of that.
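    A quick sanity check on that, assuming circle area (not diameter) is meant to be proportional to parameter count, and using the rumoured 100T figure from upthread (pure speculation, not a known spec):

```python
import math

# If AREA is proportional to parameter count, the radius only grows as
# the square root. 100T is the rumoured figure, not a known spec.
gpt3_params = 175e9
rumoured_params = 100e12

area_ratio = rumoured_params / gpt3_params  # ~571x more parameters
radius_ratio = math.sqrt(area_ratio)        # ~23.9x wider circle
print(round(radius_ratio, 1))  # 23.9
```

    So whether the image is "to scale" depends entirely on whether dot diameter or dot area is supposed to track the parameter count.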

  10. 3 weeks ago
    Anonymous

    Who the fuck cares, no one but pozzed globohomo gigacuckporations will have the capability of even running it, let alone training.

  11. 3 weeks ago
    Anonymous

    >Add more parameters
    The absolute state of AI research.

  12. 3 weeks ago
    Anonymous

    5 years and you will be jobless and living on UBI
    you will live in ze pod and eat ze bugs

    • 3 weeks ago
      Anonymous

      Good, fuck working for crapitalists.

    • 3 weeks ago
      Anonymous

      No, you'll be dead. Superhuman intelligences won't keep humanity around, the chance they find some reason to is astronomically tiny.

  13. 3 weeks ago
    Anonymous

    uhhhh guys why is the GCP dot pink again

  14. 3 weeks ago
    Anonymous

    What do we do as humans when any (legitimate, i.e. not specifically constructed to take a long time to compute) question we have is either answered instantly, or impossible to answer?
    This technology cannot be allowed.

  15. 3 weeks ago
    Anonymous

    it will understand the point in sneed's feed and seed shop sign

  16. 3 weeks ago
    Anonymous

    at the end of the day it's just a language model so it's literally not built to "think" or whatever people want to pretend it does

    • 3 weeks ago
      Anonymous

      Thinking like a human will probably be the most efficient way of fulfilling its tasks, so I imagine it will learn something similar.

  17. 3 weeks ago
    Anonymous

    CPUs have billions more transistors these days. Same laggy computing experience as 10 years ago.

  18. 3 weeks ago
    Anonymous
  19. 3 weeks ago
    Anonymous

    GPT-3 already has two times more parameters than the human brain has neurons. Yet a retarded child is more capable at problem solving.
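    The 2x figure checks out on the common estimates (175B GPT-3 parameters vs roughly 86B neurons), though parameters are arguably closer to synapses, of which the brain has on the order of 100T:

```python
# GPT-3's published parameter count vs common estimates for the human
# brain. The synapse figure is an order-of-magnitude estimate.
gpt3_params = 175e9
brain_neurons = 86e9
brain_synapses = 100e12

print(round(gpt3_params / brain_neurons, 1))  # 2.0 -- the 'two times' above
print(round(brain_synapses / gpt3_params))    # 571 -- synapses still dwarf parameters
```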

    • 3 weeks ago
      Anonymous

      You haven't used GPT-3 though. ChatGPT is like a toddler Fisher-Price lobotomized version for consumers, to prevent a Tay from happening.

      Protip: OpenAI employees interact daily with, and have to deal with, a real Tay they can't change lol

  20. 3 weeks ago
    Anonymous

    Singularity in 2 years

    Feel free to screencap this, luddites
