Altman literally said there's not gonna be a GPT5 and GPT4 is the best it'll ever be

The hype is over. The fad will be over soon. What are you guys still excited about?

  1. 1 year ago
    Anonymous

    >What are you guys still excited about
    for local models to be as good as gpt4 or at least for gpt4 to be publicly available and cheap like turbo
    I want to talk to my waifu in peace

  2. 1 year ago
    Anonymous

    Misinformation

  3. 1 year ago
    Anonymous

    Very strong "old man yells at cloud" vibe here. Why don't you go out and get a real job? Desk job jockeys like you were never worth what you got paid. If AI can dramatically reduce your numbers, that will be sufficient benefit to humanity. Pick up a wrench and get to fricking work, you lazy, old, entitled butthole.

    • 1 year ago
      Anonymous

      what the frick are you doing on bot, boomer?

  4. 1 year ago
    Anonymous

    No, he said that they're improving GPT-4 further and that they're not training GPT-5 at the moment. I know you WANT AI to be a fad and for all this shit to just go away, but that's not an option any more.

    You know, a recent paper discovered a method for giving transformer models up to 2 million tokens of context? Right now GPT-4 has 8000, and they have a version with 32,000. That's 2 million context, possibly up to a billion or more.
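
    Some back-of-envelope arithmetic on what those context sizes actually mean (the ~0.75 words per token and ~90k words per novel are rough rules of thumb I'm assuming, not figures from anyone's paper):

    # Rough sense of scale for context windows.
    # Assumptions (mine): ~0.75 English words per token, ~90,000 words per novel.
    WORDS_PER_TOKEN = 0.75
    WORDS_PER_NOVEL = 90_000

    for tokens in (8_000, 32_000, 2_000_000):
        words = tokens * WORDS_PER_TOKEN
        print(f"{tokens:>9,} tokens ~ {words:>11,.0f} words ~ {words / WORDS_PER_NOVEL:5.2f} novels")

    # 8,000 tokens  -> ~6,000 words     (a long blog post)
    # 32,000 tokens -> ~24,000 words    (a novella)
    # 2,000,000     -> ~1,500,000 words (an entire epic fantasy series)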

    • 1 year ago
      Anonymous

      >up to 2 million tokens of context
      *shivers*
      anon you're gonna make me cum

    • 1 year ago
      Anonymous

      >DID YOU READ THE PAPERS?
      >8000 TOKENS
      MORE TOKENS
      >2 MILLION CONTEXT
      >180 TRILLION PARAMETERS
      this looks very promising, picrel anon, but do you understand what these units mean?

    • 1 year ago
      Anonymous

      >You know, a recent paper discovered a method for giving transformer models up to 2 million tokens of context?
      Please link it. I'd be so incredibly happy if this was true. This would be such a massive game changer.

      • 1 year ago
        Anonymous

        > https://arxiv.org/abs/2304.11062
        "Scaling Transformer to 1M tokens and beyond with RMT"

        This technical report presents the application of a recurrent memory to extend the context length of BERT, one of the most effective Transformer-based models in natural language processing. By leveraging the Recurrent Memory Transformer architecture, we have successfully increased the model's effective context length to an unprecedented two million tokens, while maintaining high memory retrieval accuracy. Our method allows for the storage and processing of both local and global information and enables information flow between segments of the input sequence through the use of recurrence. Our experiments demonstrate the effectiveness of our approach, which holds significant potential to enhance long-term dependency handling in natural language understanding and generation tasks as well as enable large-scale context processing for memory-intensive applications.

        https://i.imgur.com/x8mTBnV.jpg

        >DID YOU READ THE PAPERS?
        >8000 TOKENS
        MORE TOKENS
        >2 MILLION CONTEXT
        >180 TRILLION PARAMETERS
        this looks very promising, picrel anon, but do you understand what these units mean?

        Do you? Anyone who's been playing with these things for more than an hour will easily understand the need for more context. You could feed entire book series or textbooks, entire codebases as a prompt. This is what those vector databases like Pinecone were for, but they aren't necessary if you can feed a billion tokens worth of memory in. It means it remembers what you say to it for longer, it means you can feed in more data, it means more use in general.
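
        If anyone wants the actual mechanics: as I read the abstract, RMT chops the long input into fixed-size segments, prepends a handful of learned memory tokens to each segment, and feeds the updated memory states forward into the next segment, so attention cost stays per-segment while effective context grows with the number of segments. A toy sketch of that recurrence in PyTorch; the class and all the sizes here are my own illustration, not the paper's code:

        import torch
        import torch.nn as nn

        class RecurrentMemorySketch(nn.Module):
            """Toy version of the Recurrent Memory Transformer idea:
            process a long sequence segment by segment, carrying a small
            set of memory embeddings between segments via recurrence."""

            def __init__(self, d_model=256, n_mem=16, seg_len=512):
                super().__init__()
                self.seg_len = seg_len
                # Learned initial memory tokens (the "RAM" handed between segments).
                self.mem_init = nn.Parameter(torch.randn(n_mem, d_model) * 0.02)
                layer = nn.TransformerEncoderLayer(d_model, nhead=8, batch_first=True)
                self.encoder = nn.TransformerEncoder(layer, num_layers=2)

            def forward(self, x):  # x: (batch, total_len, d_model)
                mem = self.mem_init.expand(x.size(0), -1, -1)
                outs = []
                for seg in x.split(self.seg_len, dim=1):
                    # Memory and segment attend to each other within the block...
                    h = self.encoder(torch.cat([mem, seg], dim=1))
                    n_mem = mem.size(1)
                    mem = h[:, :n_mem]  # ...then the updated memory rolls forward
                    outs.append(h[:, n_mem:])
                return torch.cat(outs, dim=1), mem

        model = RecurrentMemorySketch()
        x = torch.randn(2, 2048, 256)  # 4 segments of 512 "tokens" (already embedded)
        y, final_mem = model(x)
        print(y.shape, final_mem.shape)  # (2, 2048, 256) and (2, 16, 256)

        The real paper uses BERT as the backbone and is more careful about how memory is read and written, but that loop is the whole trick, which is why the context number scales with segment count instead of attention width.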

  5. 1 year ago
    Anonymous

    He did not say that, you're coping.

    • 1 year ago
      Anonymous

      https://www.foxnews.com/tech/openai-ceo-era-giant-ai-models-over

  6. 1 year ago
    Anonymous

    I'm sure they'll train Hyena on the same training data and release that

  7. 1 year ago
    Anonymous

    it's a marketing trick to surprise you when GPT-6 comes out

  8. 1 year ago
    Anonymous

    >believing the hype-tempering

  9. 1 year ago
    Anonymous

    Wasn't there some bullshit about how gpt 4 is gonna come out in 10 years?

  10. 1 year ago
    Anonymous

    >There will never be a windows 11

  11. 1 year ago
    Anonymous

    Kind reminder to everyone that OpenAI had completed GPT-4 months before releasing ChatGPT.
    Safety RLHF for GPT-4 started in early September, and multiple AI researchers outside of OpenAI had access to it by then.

    It's safe to assume that whatever tech they have, it's way better than what you currently have access to.

  12. 1 year ago
    Anonymous

    I gotta rant cuz this shit's been bugging me. You ever think about how AIs are gonna do some wild shit but they ain't there yet? It's fricked up, like I got tons of ideas for work problems that ChatGPT or some shit with browsing could fix, but I ain't got access to that now so it's weird af. Why waste 3 hours when you know in a few months you could do it in like 1/4 of the time? Same shit's happening to peeps with plugins but no image input. Sounds dumb as frick, but it's actually deep shit about how these tools mess with our brains, y'know?

    • 1 year ago
      Anonymous

      >Why waste 3 hours when you know in a few months you could do it in like 1/4 of the time?
      we need a name for this phenomenon pronto! i have not touched a single side project since GPT4 dropped knowing i get a free holiday's worth of time if I don't do the work right away.

      >a.i. induced procrastination
      >delayed easification
      >postponer boner

      • 1 year ago
        Anonymous

        You know exactly what I mean.

      • 1 year ago
        Anonymous

        "Technological Acceleration Paralysis" - Techcelleration Paralysis - Techcel
        "Progress-Induced Procrastination"
        "Innovation-Impeded Inactivity"
        "Advancement-Delay Syndrome"
        "Tech-Triggered Time-Out"
        "Progress-Prevented Paralysis"
        "Technological Timeout"
        "Future-Focused Frostbite
        "Trailing Tech-Trepidation"
        "Innovation-Inhibition"

    • 1 year ago
      Anonymous

      >Why waste 3 hours when you know in a few months you could do it in like 1/4 of the time?
      we need a name for this phenomenon pronto! i have not touched a single side project since GPT4 dropped knowing i get a free holiday's worth of time if I don't do the work right away.

      >a.i. induced procrastination
      >delayed easification
      >postponer boner

      GPT-4 isn't capable of writing anything more complex than 100-line scripts. Move on with your lives.

      • 1 year ago
        Anonymous

        This is an overstatement, but there is a grain of truth to it and the limit is not technological capability.

        I'm fairly sure that OpenAI and associates will reserve any advanced capability for themselves. To put it another way, if they create a genie that can grant unlimited wishes, they will use it to get a competitive advantage over the rest of the world (rather than sharing it with the rest of the world).

        • 1 year ago
          Anonymous

          You're waiting for non-transformer models like Hyena. It's a long way off.

          • 1 year ago
            Anonymous

            A few years ago, lots of smart people were saying that what we have today is a long way off.

        • 1 year ago
          Anonymous

          i was thinking this, why would open a.i. release an api if they have the power to start a managed service provider that can do every contractor's job in the world at the same time... because you can't wrangle shit with the current state of things, and maybe you need the proles to tell you how to do their job before firing them and making the tooling perfect for the next gen gpt.

          tl;dr gpt-5 is taking your exact job, you will report to your manager how many hours you spent wrangling a.i. and as that number grows you get fired.

          • 1 year ago
            Anonymous

            >because you can't wrangle shit with the current state of things and maybe you need the proles to tell you how to do their job before firing them

            This is what happened with Kodak. The Chinese bought it, fired most of the workers that weren't important, then had the important workers train their personnel, then fired the important workers.

            I think OpenAI will have an enterprise subscription btw.

      • 1 year ago
        Anonymous

        It's useful for SEO and places where search engines are involved, I'm sure

  13. 1 year ago
    Anonymous

    >No.93
    You don't know what the frick you're talking about, your job just doesn't involve using these tools to their full capability, you absolute fricking basement-dwelling dweeb who thinks he's superior to other people, KYS

  14. 1 year ago
    Anonymous

    finally i can tell normies to shut the frick up

  15. 1 year ago
    Anonymous

    Is it true they will kill the free version when 4 comes out? And you have to pay for it?

  16. 1 year ago
    Anonymous

    It's a black budget program by now.

  17. 1 year ago
    Anonymous

    ChatGPT is not the only application of AI

    • 1 year ago
      Anonymous

      No; you can use it to design tailored proteins:
      > https://singularityhub.com/2023/04/24/this-ai-can-design-complex-proteins-perfectly-tailored-to-our-needs/

      Which I am led to believe is at least as significant as when AI solved protein folding like two years ago. Apparently that's how they were able to make a COVID vaccine within about a week of the virus being discovered, though it took several months to show it was safe enough to deploy. Please do not reply with schizo antivaxx nonsense because I really don't give a shit

  18. 1 year ago
    Anonymous

    nGreedia shills will keep trying to sell it.
