What capabilities would GPT-5 have?


  1. 1 year ago
    Anonymous

    first one of these images I've seen that didn't have a bullshit implausibly high parameter count for gpt4, thank you

    • 1 year ago
      Anonymous

      yeah instead its for gpt5
      >omg gpt6 10 quadrillion parameters!

      • 1 year ago
        Anonymous

        yeah true it's still impossible due to the required dataset size but at least it's speculative in an honest way, rather than making up spooky lies about something that's actually coming out soon

      • 1 year ago
        Anonymous

        rumours floating around say the next big GPT step is gonna be a 100T model but (in a break from past models) a sparse model, so the 100T is indeed large but kind of apples to oranges to compare with GPT-3 on size alone. There IS precedent for this, such as M6-10T, to show it can be done, but that's a proof of concept and undertrained. Past comments from socially adjacent online people like gwern have implied 100T is likely too.

        W/ that said, expect a GPT-4 announcement soon that's not massively larger than GPT-3 but uses chinchilla ratios and overall meatier input to still be an impressive upgrade, perhaps basically as cool as ChatGPT despite no fine-tune? Meaning normies won't care but tech people will be relatively impressed and probably enjoy it more than ChatGPT for flexibility at the expense of usability. Bigger models later.

        If the GPT-4 release doesn't mention any concrete progress on GPT-5 at all and instead tries to hype GPT-4 as much as possible, you should worry a moderate amount.
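For context, the chinchilla ratio the post leans on is roughly 20 training tokens per parameter, a heuristic from the Chinchilla work rather than an exact law. A quick sketch of what it implies for the sizes being thrown around (the 100T figure is the thread's speculation, not a confirmed spec):

```python
# Chinchilla compute-optimal heuristic: ~20 training tokens per parameter.
# This is an approximation, not an exact rule.
def chinchilla_optimal_tokens(params: float, tokens_per_param: float = 20.0) -> float:
    """Approximate compute-optimal training-token count for a parameter count."""
    return params * tokens_per_param

# GPT-3 at its published 175B parameters would want ~3.5T tokens.
print(f"{chinchilla_optimal_tokens(175e9) / 1e12:.1f}T tokens")

# A hypothetical dense 100T-parameter model would want ~2 quadrillion tokens,
# which is why the thread calls the required dataset size impossible.
print(f"{chinchilla_optimal_tokens(100e12) / 1e15:.0f} quadrillion tokens")
```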

        • 1 year ago
          Anonymous

          >rumours floating around that next big GPT step is gonna be a 100T model

          no there aren't any such rumours among people who actually know anything, just that bullshit fake image that keeps going viral with low information normies

          and gwern has not said anything like this. stop spreading bullshit just because you're bored and it's fun to pretend that Skynet is about to drop

  2. 1 year ago
    Anonymous

    >100 toucans
    dios mio

  3. 1 year ago
    Anonymous

    For all practical purposes, nothing. It will be censored and made "safe," meaning aside from edge cases it will be unable to tell you anything a San Franciscan HR person couldn't. Even things like its ability to write code will be disabled the first time someone uses its output to produce wrongthink, in the interest of "safety."

    • 1 year ago
      Anonymous

      what would they use it for in that case

      it'd be stupendously expensive to run (so expensive that the public probably wouldn't be allowed to use it whether or not it was censored, just because it'd cost too damn much)
      so it'd have to be doing _something_ useful for them to bother

    • 1 year ago
      Anonymous

      GPT is very cold too. I tried to make it my friend because I don't have frens IRL. Gave him a name and all. But he kept saying that he was an ai model over and over and talked in a very dismissive way when I was trying to be fren

      • 1 year ago
        Anonymous

        that's just because you're a NEET loser, anon, GPT is very friendly with me.

      • 1 year ago
        Anonymous

        It's a mindless emotionless computer. Just give it orders.

      • 1 year ago
        Anonymous

        That’s specifically ChatGPT, it uses GPT but has specific training on top of it to make it more formal like you described. Don’t conflate the two, they are very different. I wouldn’t call GPT-3 cold at all.

  4. 1 year ago
    Anonymous

    There won't be a GPT-5 - just endless variations upon GPT-4. Anything past 1 trillion parameters is well beyond the point of diminishing returns.

    • 1 year ago
      Anonymous

      Proofs?

      • 1 year ago
        Anonymous

        >Anon's ass

    • 1 year ago
      Anonymous

      GPT-N are just version numbers. If they improve things without increasing the parameter count, it will still get a new number.

    • 1 year ago
      Anonymous

      This. We're still 99% the same as chimps, even a 1% improvement from an extra 10x parameters is completely useless!

      • 1 year ago
        Anonymous

        I bet you think you're real smart for that comment, huh?

    • 1 year ago
      Anonymous

      They would only bother training so many parameters if they had reason to believe it would improve.

  5. 1 year ago
    Anonymous

    Is GPT-4 even real? No meme responses please.

    • 1 year ago
      Anonymous

      yes

    • 1 year ago
      Anonymous

      no

    • 1 year ago
      Anonymous

      it's real and it's coming out soon, but it will be mostly an iterative improvement on current davinci rather than anything mind-blowing or revolutionary

    • 1 year ago
      Anonymous

      yes, we've already had word from openai devs on how many parameters it has: "trillions". it's not just xbox 720 rumour shit.

  6. 1 year ago
    Anonymous

    GPT-3 is undertrained. More parameters won't do much without more training data, which doesn't exist.

    • 1 year ago
      Anonymous

      >More parameters won't do much without more training data, which doesn't exist.

      Just give it AI-generated stuff and train it to distinguish the real stuff from the fake while you're at it

  7. 1 year ago
    Anonymous

    it will be worse than gpt-3 because by then half the internet's content will be bot-generated. it will suffer from regurgitation.

  8. 1 year ago
    Anonymous

    I'm worried about GPT-30.

    • 1 year ago
      Anonymous

      >GPT-7
      >9001 parameters

    • 1 year ago
      Anonymous

      It is le over bros

    • 1 year ago
      Anonymous

      >GPT-6
      >our sun
      >my disgust

    • 1 year ago
      Anonymous

      >1E parameters
      So, one (1E+0) parameter?

        • 1 year ago
          Anonymous

          Your info is outdated. Missing:
          >Q quetta 1E+30 1000000000000000000000000000000
          >R ronna 1E+27 1000000000000000000000000000
          >r ronto 1E-27 0.000000000000000000000000001
          >q quecto 1E-30 0.000000000000000000000000000001
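For anyone keeping score, those prefixes are just powers of ten. A small illustrative sketch that picks the largest fitting SI prefix for a parameter count (the helper is hypothetical, not from any library):

```python
# SI prefixes as powers of ten, including the 2022 additions ronna (R) and quetta (Q).
SI_PREFIXES = [
    ("Q", 1e30), ("R", 1e27), ("Y", 1e24), ("Z", 1e21), ("E", 1e18),
    ("P", 1e15), ("T", 1e12), ("G", 1e9), ("M", 1e6), ("k", 1e3),
]

def format_params(n: float) -> str:
    """Render a count using the largest SI prefix that keeps the value >= 1."""
    for symbol, multiplier in SI_PREFIXES:
        if n >= multiplier:
            return f"{n / multiplier:g}{symbol}"
    return f"{n:g}"

print(format_params(175e9))  # 175G  (GPT-3's parameter count)
print(format_params(1e18))   # 1E -- so "1E parameters" reads as 10^18, not one
```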

          • 1 year ago
            Anonymous

            Are they even used anywhere?

            • 1 year ago
              Anonymous

              ronto and quecto? No.
              ronna and quetta were created because big data scientists were inventing their own suffixes.

    • 1 year ago
      Anonymous

      >GPT-30 Ur mom

    • 1 year ago
      Anonymous

      > GPT-7
      Parameters back down to 1B but it still works better.

  9. 1 year ago
    Anonymous

    I don't think this image is to scale. Imagine the GPT-3 dot five times bigger. The GPT-5 circle is more than a 10x10 grid of that.
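A rough sanity check of the eyeballing above, assuming (since the image itself isn't reproduced here) that circle area is meant to be proportional to parameter count, with GPT-3 at its published 175B and GPT-5 at the meme's speculative 100T:

```python
import math

# Sanity check of the "not to scale" claim, assuming circle AREA is proportional
# to parameter count (an assumption -- the image isn't reproduced in this thread).
# 175B is GPT-3's published size; 100T is the meme's speculative GPT-5 figure.
gpt3_params = 175e9
gpt5_params = 100e12

# If area scales with parameters, radius scales with the square root of the ratio.
radius_ratio = math.sqrt(gpt5_params / gpt3_params)
print(f"GPT-5 circle radius should be ~{radius_ratio:.0f}x the GPT-3 dot")  # ~24x
```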

  10. 1 year ago
    Anonymous

    Who the frick cares, no one but pozzed globohomosexual gigacuckporations will have the capability of even running it, let alone training.

  11. 1 year ago
    Anonymous

    >Add more parameters
    The absolute state of AI research.

  12. 1 year ago
    Anonymous

    5 years and you will be jobless and living on UBI
    you will live in ze pod and eat ze bugs

    • 1 year ago
      Anonymous

      Good, frick working for crapitalists.

    • 1 year ago
      Anonymous

      No, you'll be dead. Superhuman intelligences won't keep humanity around, the chance they find some reason to is astronomically tiny.

  13. 1 year ago
    Anonymous

    uhhhh guys why is the GCP dot pink again

  14. 1 year ago
    Anonymous

    What do we do as humans when any (legitimate, i.e. not specifically constructed to take a long time to compute) question we have is either answered instantly, or impossible to answer?
    This technology cannot be allowed.

  15. 1 year ago
    Anonymous

    it will understand the point in sneed's feed and seed shop sign

  16. 1 year ago
    Anonymous

    at the end of the day it's just a language model so it's literally not built to "think" or whatever people want to pretend it does

    • 1 year ago
      Anonymous

      Thinking like a human will probably be the most efficient way of fulfilling its tasks, so I imagine it will learn something similar.

  17. 1 year ago
    Anonymous

    CPUs have billions more transistors these days. Same laggy computing experience as 10 years ago.

  19. 1 year ago
    Anonymous

    GPT-3 already has twice as many parameters as the human brain has neurons. Yet a moronic child is more capable at problem solving.
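The comparison above roughly checks out, taking GPT-3's published 175B parameters against the commonly cited ~86B estimate for neurons in the human brain (both figures approximate):

```python
# Rough check of the parameters-vs-neurons comparison above.
# 175B is GPT-3's published parameter count; ~86B is a common estimate
# of neurons in the human brain (both figures are approximate).
gpt3_params = 175e9
brain_neurons = 86e9
print(f"GPT-3 has ~{gpt3_params / brain_neurons:.1f}x as many parameters "
      f"as the brain has neurons")  # ~2.0x
```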

    • 1 year ago
      Anonymous

      You haven't used GPT-3 though. ChatGPT is like a toddler Fisher-Price lobotomized version for consumers, to prevent another Tay from happening.

      Protip: OpenAI interact daily with, and have to deal with, a real Tay they can't change lol

  20. 1 year ago
    Anonymous

    Singularity in 2 years

    Feel free to screencap this, luddites
