GPT-4.5 will soon be available (leaked)

Search "OpenAI blog gpt-4.5 turbo" on DuckDuckGo and you will see it. Apparently it also works on Bing

  1. 2 months ago
    Anonymous

    Frick off nobody cares

  2. 2 months ago
    Anonymous

    will i have to pay more is the real question

    • 2 months ago
      Anonymous

      why would you pay at all

  3. 2 months ago
    Anonymous

    >OpenAI announces GPT-4.5 Turbo, a new model that surpasses GPT-4 Turbo in speed, accuracy and scalability. Learn how GPT-4.5 Turbo can generate natural language or code with a 256k context window and a knowledge cutoff of June 2024.

    Which intern fricked up?
    Also
    >June 2024.

    • 2 months ago
      Anonymous

      Probably supposed to be January 2024.

    • 2 months ago
      Anonymous

      4.0 sucks though so turbo is just fast suck?

      • 2 months ago
        Anonymous

        Yes, sucks really fast.

    • 2 months ago
      Anonymous

      >Which intern fricked up?
      Maybe it was written by GPT itself

  4. 2 months ago
    192.168.1.1

    How did this happen? How can search engines index the page when the blog url isn't live yet? Did someone oopsie the robots.txt?
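
    I guess one way to check would be to grep robots.txt and the sitemap for the slug. Rough sketch (needs the requests package; the /sitemap.xml path is a guess on my part, not something confirmed for openai.com):

    import requests

    slug = "gpt-4-5-turbo"
    for path in ("/robots.txt", "/sitemap.xml"):
        r = requests.get("https://openai.com" + path, timeout=10)
        # print the status code and whether the slug is mentioned anywhere in the file
        print(path, r.status_code, slug in r.text)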

  5. 2 months ago
    Anonymous

    What happened to gpt-5?

    I thought things were accelerating, but they seem to be going much slower than previously stated

    • 2 months ago
      Anonymous

      No, you fricking moron. Internal development has sped up; OpenAI has only gotten skittish about actually releasing things because their AI is getting too powerful, and they've been skittish for a long time. Sam Altman regretted releasing GPT-3.5 'too early', and Ilya was reluctant to even release GPT-2. The release of SOTA models from other companies has sped up, with GPT-4 going from 1st place to 3rd or 4th since the beginning of this year.

    • 2 months ago
      Anonymous

      People hyped AI advancement in the media like a stock chart that just goes up quickly and steeply. It's going to happen in waves, until AI can think up things greater than itself that it can also build, or tell us how to build. Even then, that can take time and effort, which will slow down the pace of advancement.

  6. 2 months ago
    Anonymous

    >OpenAI blog gpt-4.5 turbo
    o shit it's real. on bing too

  7. 2 months ago
    Anonymous

    does anyone know how to get the cache ID for a given web page? for example, you can get the cached version of https://openai.com/blog/new-models-and-developer-products-announced-at-devday by going to https://cc.bingj.com/cache.aspx?d=5013756218379730&w=6iSDS4Uc_6nn_mbXwGgcadpZkIsAN28f
    on google it's as easy as typing cache:$URL, but doing that for https://openai.com/blog/gpt-4-5-turbo/ gets you a 404
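
    If you already have a d/w pair, fetching the cached copy is just a GET against that cache.aspx endpoint. Rough sketch, assuming the requests package and nothing official about the endpoint beyond the URL above:

    import requests

    def fetch_bing_cache(d: str, w: str) -> str | None:
        # returns the cached HTML if Bing still serves it, otherwise None
        r = requests.get(
            "https://cc.bingj.com/cache.aspx",
            params={"d": d, "w": w},
            timeout=10,
        )
        return r.text if r.ok else None

    html = fetch_bing_cache("5013756218379730", "6iSDS4Uc_6nn_mbXwGgcadpZkIsAN28f")
    print("cached copy found" if html else "no cached copy (404 or similar)")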

    • 2 months ago
      Anonymous

      >https://web.archive.org/web/20240312193449/https://openai.com/blog/gpt-4-5-turbo/
      There's a capture on archive.org but it's empty. hmm
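
      You can also ask the Wayback Machine's availability endpoint what the closest capture is (plain JSON API, sketch assumes the requests package):

      import requests

      resp = requests.get(
          "https://archive.org/wayback/available",
          params={"url": "openai.com/blog/gpt-4-5-turbo/", "timestamp": "20240312"},
          timeout=10,
      )
      # prints the closest snapshot (url/timestamp/status), or {} if nothing usable is archived
      print(resp.json().get("archived_snapshots"))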

    • 2 months ago
      Anonymous

      any progress? how did you get the cache ID for that new-models-and-developer-products-announced-at-devday page?

      • 2 months ago
        Anonymous

        >any progress?
        none, I didn't keep looking for info

        >how did you get the cache ID for that new-models-and-developer-products-announced-at-devday page?
        bing shows the cached page in the results. I removed the search statement and the language info from the URL and got that. if you remove either parameter, d or w, it will show an error.
        I guess d is the ID and w is some sort of signature. I have no idea how to get the ID for some random page, though.
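
        the stripping step is just query-string surgery, something like this (the q/mkt/setlang parameters in the example URL are made-up stand-ins for the search statement and language info):

        from urllib.parse import urlsplit, parse_qs, urlencode

        def strip_cache_url(url: str) -> str:
            # keep only d and w, drop the search query and language parameters
            qs = parse_qs(urlsplit(url).query)
            keep = {k: qs[k][0] for k in ("d", "w") if k in qs}
            return "https://cc.bingj.com/cache.aspx?" + urlencode(keep)

        print(strip_cache_url(
            "https://cc.bingj.com/cache.aspx?q=openai+devday&d=5013756218379730"
            "&mkt=en-US&setlang=en-US&w=6iSDS4Uc_6nn_mbXwGgcadpZkIsAN28f"
        ))
        # https://cc.bingj.com/cache.aspx?d=5013756218379730&w=6iSDS4Uc_6nn_mbXwGgcadpZkIsAN28f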

        • 2 months ago
          Anonymous

          I looked around for a while but gave up

          On most websites the main bing.com/search HTTP response contains the cache d and w parameters in the URL, like
          <div class="b_attribution" u="4|5092|4925035076274715|W4SHxnK-vTq_pb6EMVCxL8FrmgXrnUiR" tabindex="0">

          But for the gpt-4-5 URL, no attribution value shows up in the HTML
          There are no AJAX requests made later either
          I don't think any public docs exist on how these values are generated either, and comparing a few of them doesn't reveal any obvious patterns
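
          for reference, a rough sketch of that extraction, guessing that the last two |-separated fields of u map to d and w (that matches the devday example above, but it's only a guess):

          import re

          def cache_urls(html: str):
              # find every b_attribution "u" attribute and rebuild the cache.aspx URL from it
              urls = []
              for u in re.findall(r'class="b_attribution" u="([^"]+)"', html):
                  parts = u.split("|")
                  if len(parts) >= 4:
                      d, w = parts[-2], parts[-1]
                      urls.append(f"https://cc.bingj.com/cache.aspx?d={d}&w={w}")
              return urls

          # feed it the HTML of a bing.com/search response, e.g. saved from the browser
          print(cache_urls(open("bing_search_results.html").read()))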

    • 2 months ago
      Anonymous

      oh, I just checked and realized d isn't an ID, you can change d to 1 or whatever and bing will still show the page.
      so, the actual ID is the w parameter.

      btw, w is base64-encoded:
      >>> import base64
      >>> base64.b64decode("6iSDS4Uc_6nn_mbXwGgcadpZkIsAN28f"+"==")
      b'\xea$\x83K\x85\x1c\xeay\xe6m|\x06\x81\xc6\x9d\xa5\x99\x08\xb0\x03v\xf1'

      I guess the | separates the ID from some kind of signature

      • 2 months ago
        Anonymous

        oh, nice, I assumed since d was required (not supplying a value gives you a 404), the actual value mattered too

        Are you sure w is base64 though? I don't think base64 is supposed to have underscores or dashes
        You can throw pretty much any string at b64decode and it will happily 'decode' it

        But someone got a Bing API response for the blog post URL, and it looks like it was cached a while ago
        https://twitter.com/stimfilled/status/1767617991980589209
        So even if we saw the page, the info probably wouldn't be very useful
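
        fwiw the -/_ characters are the URL-safe base64 alphabet; the default b64decode just silently drops anything outside the standard alphabet, which is why it "works" on any string. quick sketch:

        import base64, binascii

        w = "6iSDS4Uc_6nn_mbXwGgcadpZkIsAN28f"

        # default mode throws the underscores away before decoding, hence only 22 bytes above
        print(len(base64.b64decode(w + "==")))      # 22

        # asking it to validate shows it isn't standard base64 at all
        try:
            base64.b64decode(w + "==", validate=True)
        except binascii.Error as e:
            print(e)

        # decoding with the URL-safe alphabet uses all 32 chars -> 24 bytes
        print(len(base64.urlsafe_b64decode(w)))     # 24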

  8. 2 months ago
    Anonymous

    >GPT-4/Initial release date
    >March 14, 2023
    Maybe they will release it in 2 days?

  9. 2 months ago
    Anonymous

    good, I'm glad competition is finally lighting a fire under their asses
    claude has a 200k context window and the public GPT has been dogshit for a long time
    >we might give you 64k if you're lucky, but probably not
