>pay $20 per month for GPT-4
>only 25 GPT-4 messages are allowed every 3 hours at this moment

  1. 10 months ago
    Anonymous

    free 3.5 is enough for me
    you just need a few brain cells to ask the right question

  2. 10 months ago
    Anonymous

    NO REFUNDS!

  3. 10 months ago
    Anonymous

    Really, is this a thing?

    • 10 months ago
      Anonymous
      • 10 months ago
        Anonymous

        LMFAO

      • 10 months ago
        Anonymous

        >rename GPT-3 to GPT-4
        >morons buy it in droves

  4. 10 months ago
    Anonymous

    >AI is just hype
    >chooses to pay for it
    Or
    >AI IS HECKIN REVOLUTION!!! CYBER KILL HOLOCAUST!!!! DEATH STARVATION END END END HAPPENING GNG?!!!!!!
    >chooses to pay for it.

    How do you justify either of these?

  5. 10 months ago
    Anonymous

    Just use the API and pay the costs entirely yourself, you get unlimited messages then

    • 10 months ago
      Anonymous

      1. How does the API sidestep message limits?
      >pay the costs entirely yourself
      2. Wut? Can you explain how this is different than paying $20/month entirely yourself?

      • 10 months ago
        Anonymous

        1. The API bills per token instead of a flat rate, so you don't get a message cap since you're paying as you go instead of on a subscription.

        2. See 1. Because you're paying as you go, you don't pay a flat $20, you pay per token. It's $0.03 per 1000 input tokens and $0.06 per 1000 output tokens. Anything you send to it is an input token, anything it spits out is an output token. It's limited to 8k tokens per conversation history, and every time you exchange an input and output with it, that goes into your next input as a primer so it can "remember" the conversation history.

        Roughly, 1 word is about 1 token, give or take (around 750 tokens per 1k words when you get to large messages).

        An example:

        You send a message that has 10 words, so that's 10 tokens, and you're billed $0.0003 (0.03 * 10 / 1000). It replies and its reply is also 10 words, or 10 tokens, so that's $0.0006 (0.06 * 10 / 1000). So far, that's $0.0009.

        You toggled the API option to "remember message history" and send another message that's also 10 tokens. Because of how it works, it primes the input with the entire message history so far: it sends your initial input + its initial output as context, then your new input. So you only wrote 10 tokens for this second message, but the bot got 30 tokens worth of input, which is $0.0009 on top of the $0.0009 you've already spent, so $0.0018 total.

        Then it replies with another message, this one 20 tokens or so. That's $0.0012 on top of the $0.0018 so far, so you're now at $0.0030. And so on, with the input cost climbing every message as the conversation gets longer.

        If you don't want it to remember the whole conversation or you don't want it to remember anything past the input you're giving it at the time, then naturally your costs go down.
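
        If you want to sanity check the math yourself, here's a rough Python sketch. It just reuses the $0.03 / $0.06 per 1k token prices and the word-count approximation from above, so treat it as an estimate rather than the actual billing logic:

          # Rough cost estimator for GPT-4 8k API usage, assuming the
          # prices quoted above and the crude "1 word ~= 1 token" rule.
          INPUT_PRICE_PER_1K = 0.03
          OUTPUT_PRICE_PER_1K = 0.06

          def conversation_cost(turns, remember_history=True):
              """turns: list of (user_tokens, assistant_tokens) pairs."""
              total = 0.0
              history_tokens = 0
              for user_tokens, assistant_tokens in turns:
                  # With history on, everything said so far is resent as input.
                  input_tokens = user_tokens + (history_tokens if remember_history else 0)
                  total += input_tokens * INPUT_PRICE_PER_1K / 1000
                  total += assistant_tokens * OUTPUT_PRICE_PER_1K / 1000
                  history_tokens += user_tokens + assistant_tokens
              return total

          # The example above: a 10-token message with a 10-token reply, then
          # another 10-token message (30 input tokens once history is included)
          # with a 20-token reply.
          print(conversation_cost([(10, 10), (10, 20)]))  # ~0.0030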

        • 10 months ago
          Anonymous

          To expand on this further, remember that there's an 8k total token limit, so if you get to a long conversation you'll be stopped from sending that particular instance any more messages unless you trim down what it remembers. And an 8k token input is $0.24.

          The tradeoff of the 25 messages every 3 hours is that you aren't capped on conversation length (as far as I can tell; it likely uses the 32k context model, which isn't available via the API yet, so after 32k tokens it probably just selectively stops remembering parts of the conversation to be able to continue). There's a bit of a benefit there, but if you aren't using it for super long exchanges anyway, you might get more mileage out of the API even with its 8k token context limit.
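
          If you want to stay under that cap programmatically, the usual trick is to drop the oldest messages once the running total would blow the budget. A rough Python sketch, using the same word-count approximation as above rather than a real tokenizer:

            MAX_CONTEXT_TOKENS = 8000

            def count_tokens(message):
                # crude stand-in for a real tokenizer
                return len(message["content"].split())

            def trim_history(messages, budget=MAX_CONTEXT_TOKENS):
                """Keep only the most recent messages that fit in the budget."""
                kept, used = [], 0
                for message in reversed(messages):  # newest to oldest
                    tokens = count_tokens(message)
                    if used + tokens > budget:
                        break
                    kept.append(message)
                    used += tokens
                return list(reversed(kept))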

          • 10 months ago
            Anonymous

            Is a message a conversation?

            • 10 months ago
              Anonymous

              A message is a single message, a conversation is the entire collection of messages so far. So if you wanted a comparison to this, each of our back and forths is a message each, and our entire back and forth (assuming you're the OP) is 5-6 messages (depending on whether or not you want to condense my two posts where the second post just expands on the first) which is currently 1 conversation.

        • 10 months ago
          Anonymous

          >and $0.06 per 1000 output tokens.
          HAHAHAHAHAHAHAHAHHAAHAHAHAHAHAHHAHAHAHAHAHAHHAHAHAHAHAHAHAHHAHAHAHAHAHAHAHAHHAHAHAH no wonder it gives you a 15 page essay when you ask it why black people commit so much crime

          • 10 months ago
            Anonymous

            yeah fricking exactly, the more verbose it is the more shekels they get

            • 10 months ago
              Anonymous

              especially considering that 75% of those "tokens" are not even AI generated, they are just boilerplate disclaimers they put on every output.

        • 10 months ago
          Anonymous

          >10 words, so that's 10 tokens
          >limited to 8k tokens per conversation
          So instead of 25 messages, you're limited to 8,000 words?

          This is interesting as far as it explains how GPT remembers a conversation. You're saying it essentially resends the entire conversation with every new request?

          • 10 months ago
            Anonymous

            >You're saying it essentially resends the entire conversation with every new request?
            Yep, and the API version is limited to 8k tokens where the chat-based one is 32k tokens. That explains why, if you have a long conversation with it, it starts to go off the rails, repeat itself, or forget what you discussed with it earlier.
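
            If you've never poked at the chat endpoint, this is easy to see for yourself: your client keeps the whole conversation in a list and ships all of it with every call. A minimal sketch, assuming the openai Python package (the pre-1.0 ChatCompletion interface) and an API key already set in the environment:

              import openai

              messages = []  # the whole conversation lives client-side

              def ask(prompt):
                  messages.append({"role": "user", "content": prompt})
                  response = openai.ChatCompletion.create(
                      model="gpt-4",      # the 8k context model on the API side
                      messages=messages,  # entire history goes out with every call
                  )
                  reply = response["choices"][0]["message"]["content"]
                  messages.append({"role": "assistant", "content": reply})
                  return reply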

            • 10 months ago
              Anonymous

              Yeah, I've noticed that too. Conversation history is really only useful for saving some answers, since deep development of a topic or creative work cuts off if you try working on something for more than a few hours. I know now it's not time dependent but conversation length. I've basically stopped saving conversations for anything other than not having to write down an answer I liked.

    • 10 months ago
      Anonymous

      I was one of the first to get GPT-4 through the API, but the costs are absurd. It is much cheaper to go with ChatGPT Plus. I use two accounts, so I barely ever hit the limit.

  6. 10 months ago
    Anonymous

    >scrapes millions of documents ignoring every terms of service and copyright law in the process
    >charges you $20 per month and rate limits you

  7. 10 months ago
    Anonymous

    Use ChatGPT on Poe. It's not as good, but it's free, you don't need a jailbreak, and if your prompt fu is strong enough, it can still provide a standard fap, just not a life changing one.

  8. 10 months ago
    Anonymous

    Just get API access and use playground in chat mode.
