Now that the dust has settled, should I pay to use GPT-4?

  1. 1 year ago
    Anonymous

    never pay for anything

  2. 1 year ago
    Anonymous

    You're actually considering shelling out money for something as frivolous and potentially dangerous as using GPT-4 to fulfill your secret fetishes? Have you not heard about the countless data breaches that have occurred in recent years, putting sensitive information at risk? And yet here you are, willingly exposing yourself to even more potential harm just to satisfy some twisted desire. Get a grip man, this isn't worth it.

    • 1 year ago
      Anonymous

      I don't believe in sensitive information, so...

  3. 1 year ago
    Anonymous

    Try to get the API instead, use an open-source ChatGPT frontend with your API key, then pay as you use instead of $20 a month

    • 1 year ago
      Anonymous

      depends if you use it a lot... in either case, that anon is right: get an API key and pay as you go. the frontend is for normies who don't know any better.
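
      The pay-as-you-go trade-off above can be sketched with some rough arithmetic. The per-token rates below are assumptions based on GPT-4's launch pricing ($0.03/1K prompt tokens, $0.06/1K completion tokens) and may be out of date; the usage profiles are made up for illustration.

```python
# Hedged sketch: compare the flat $20/month subscription against
# pay-as-you-go API billing. Rates are assumed launch prices in
# dollars per 1K tokens and may have changed.

def monthly_api_cost(prompt_tokens, completion_tokens,
                     prompt_rate=0.03, completion_rate=0.06):
    """Estimated monthly API bill in dollars (rates per 1K tokens)."""
    return (prompt_tokens / 1000) * prompt_rate \
         + (completion_tokens / 1000) * completion_rate

# light user: ~100 exchanges/month, ~500 prompt + ~500 completion tokens each
light = monthly_api_cost(100 * 500, 100 * 500)
print(f"light use: ${light:.2f}/month")   # -> light use: $4.50/month

# heavy user: ~2000 exchanges of the same size
heavy = monthly_api_cost(2000 * 500, 2000 * 500)
print(f"heavy use: ${heavy:.2f}/month")   # -> heavy use: $90.00/month
```

      So the API wins for occasional use and the flat subscription wins for heavy use, which is why the answer above is "depends if you use it a lot".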

    • 1 year ago
      Anonymous

      >use an open-source ChatGPT frontend

      What are the good ones these days? i haven't been keeping up

  4. 1 year ago
    Anonymous

    It's expensive to use and thus unusable.

  5. 1 year ago
    Anonymous

    isn't Bing Chat GPT-4?

    • 1 year ago
      Anonymous

      Can you use Bing's ai without Edge?

      • 1 year ago
        Anonymous

        mmm believe so
        https://chrome.google.com/webstore/search/user%20agent%20switcher?hl=nl
        change the user agent to edge in chrome

        • 1 year ago
          Anonymous

          okay i tried and it works
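
          What the user-agent-switcher trick above amounts to is presenting an Edge User-Agent string so Bing's browser gate lets the request through. A minimal sketch of the same idea at the HTTP level, using Python's standard library; the UA string is one example Edge string, not an exact requirement:

```python
# Hedged sketch: build a request that identifies itself as Edge.
# The "Edg/" token at the end is what distinguishes Edge's UA
# from plain Chrome's.
from urllib.request import Request

EDGE_UA = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
           "AppleWebKit/537.36 (KHTML, like Gecko) "
           "Chrome/110.0.0.0 Safari/537.36 Edg/110.0.1587.57")

req = Request("https://www.bing.com/chat",
              headers={"User-Agent": EDGE_UA})

# urllib normalizes the header name to "User-agent" internally
print(req.get_header("User-agent"))  # the spoofed Edge string
```

          A browser extension does the same thing for every request the browser sends, which is why switching the UA in Chrome is enough to get past the Edge check.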

  6. 1 year ago
    Anonymous

    Why would you use GPT-4 when local hosted models are now better for non-professional tasks?

    • 1 year ago
      Anonymous

      >non-professional tasks
      wtf are professional tasks?

      • 1 year ago
        Anonymous

        Using it as an automated customer support system, coding assistant, co-pilot integration for Office, etc. If you're using it as a personal assistant, for roleplay, or anything else, local models are better.

        • 1 year ago
          Anonymous

          right nvm I forget people use this trash for anything beyond its intended purpose (cooming)

          • 1 year ago
            Anonymous

            You seem like an immensely creative person

        • 1 year ago
          Anonymous

          >coding

          So local finetuned models are still shit at coding?

          • 1 year ago
            Anonymous

            Nope

  7. 1 year ago
    Anonymous

    The dust has most assuredly not settled.

  8. 1 year ago
    Anonymous

    Is GPT-4 better at being factually correct?
    GPT-3 keeps giving me falsehoods about stuff like French literature even after asking it to double-check multiple times.

    • 1 year ago
      Anonymous

      try it for yourself
      https://www.bing.com/search

      • 1 year ago
        Anonymous

        Is there any reliable way to stop it from using bing search? Because when the LLM answers, it answers well - but the binged answers are just awful, unrelated to what I want to know and incomplete.

        Bing is a moronic tumor attached to a brilliant technology.

    • 1 year ago
      Anonymous

      >Is GPT-4 better at being factually correct?

      Slightly, but at this point I wouldn't be surprised if closedai is poisoning gpt3.5 results to make gpt4 look better. in my experience gpt4 is marginally better than gpt3.5 and often exactly the same, just much more bloated and slow.

      Trouble with closedai being closed is that we can never know if they've poisoned their older models to make their newer, more costly ones look better... because, well, they're closed.

  9. 1 year ago
    Anonymous

    What dust?

  10. 1 year ago
    Anonymous

    if you don't mind the 25 message limit, yes, pay for it. 20 bucks to make your life easier isn't much. i think it's a bit of a rip off, but until local models get good this will do. depends on how desperate you are for it honestly
