Holy frick, it mogs GPT4 to oblivion

  1. 2 months ago
    Anonymous

    >Claude
    More like Fraude

  2. 2 months ago
    Anonymous

    I'm using it for code and yeah it's a bit better than GPT4. I'm poor as frick but still paying for it, coding assistants are already my third leg.

    I think a lot of people don't know how to use these models. You don't just make a simple demand like "code me a function", you provide it the context of ALL your scripts, all your code, every time you ask a question, and then it'll be able to infer how to create it. Like a fricking gigantic auto-complete. So the 200k context limit is great.
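
    A rough sanity check for the "paste everything" approach, assuming the common rule of thumb of roughly 4 characters per token (real tokenizers vary, and these function names are just for illustration):

```python
# Rough estimate of whether a pile of source files fits in a 200k-token
# context window. Assumes ~4 characters per token, a common rule of thumb;
# actual tokenizer counts differ, so treat this as a ballpark only.
def estimate_tokens(text: str, chars_per_token: float = 4.0) -> int:
    return int(len(text) / chars_per_token)

def fits_in_context(texts: list[str], limit: int = 200_000) -> bool:
    total = sum(estimate_tokens(t) for t in texts)
    return total <= limit
```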

    • 2 months ago
      Anonymous

      Can you elaborate more on this? Do you just copypaste the entire file into the prompt box? Or do you explain in words how the files you're working on interact with each other, and what they do?

      • 2 months ago
        Anonymous

        >Do you just copypaste the entire file into the prompt box?
        Yes!
        Don't use ChatGPT's system where you upload files to it, that really doesn't work as well as just pasting a shitload of text.

        You typically just put your request at the very top of the text. If you want, you can add a separator like:
        -----------------
        which I think can help sometimes.
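
        A minimal sketch of the workflow described above: the request at the very top, then a separator, then every file pasted in as one blob. The file paths and request text are placeholders, and `build_prompt` is my own name, just for illustration:

```python
from pathlib import Path

SEPARATOR = "-" * 17

def build_prompt(request: str, files: list[Path]) -> str:
    """Put the request at the very top, then each file after a separator."""
    parts = [request, SEPARATOR]
    for f in files:
        parts.append(f"# file: {f.name}")
        parts.append(f.read_text())
        parts.append(SEPARATOR)
    return "\n".join(parts)
```

The output is one string you copy into the prompt box, which sidesteps the file-upload feature entirely.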

        • 2 months ago
          Anonymous

          sounds like github cope pilot

    • 2 months ago
      Anonymous

      thinking about dropping GPT4, I'm poor as frick also so don't want to switch till I'm sure. GPT4 has been awesome for writing boilerplate and just helping me code prolly like 5x faster.

      I am mainly interested in the newer data cutoff with Claude; even if it were equal quality but had newer data I'd switch. the main annoyance I've run into with GPT4 is it using outdated libraries and APIs when I ask it to write code, then I have to edit it and fix its mistakes. gonna keep waiting to hear more reviews on it and see if OpenAI drops anything good this week before I switch.

    • 2 months ago
      Anonymous

      Why not just use Cursor?

      • 2 months ago
        Anonymous

        i have cursor already it moves around on the monitor when i move my mouse

        • 2 months ago
          Anonymous

          Thanks, dad

  3. 2 months ago
    Anonymous

    Yeah... but can it suck my dick?

    • 2 months ago
      Anonymous

      you really should wake up to the fact that there are two categories of AI: the mainstream ones that run in data centers, and the low-powered open-source ones that run on your PC for porn.
      Give up on the idea of "one ultimate AI that does everything".

      • 2 months ago
        Anonymous

        everything is moving towards cloud computing gaytron
        Oracle's dream of everyone's computers being internet terminals is coming true

        • 2 months ago
          Anonymous

          is that why their stock keeps crashing after every earnings report

      • 2 months ago
        Anonymous

        So.... you're saying that it cannot suck my dick.
        Another shit ""AI"". NEXT!

      • 2 months ago
        Anonymous

        >go to google dot com ask for porn
        >6 trillion gorillion results
        >go to google ai dot com ask for porn
        >ai says that yt peepo should be rounded up by diverse figures with weapons and baked into a giant lasagna of privilege
        you know it's silly. i know it's silly. i understand mainstream versus open source local but i disagree it should be that way

        • 2 months ago
          Anonymous

          >6 trillion gorillion results
          hello you must be new here
          google doesn't give more than 300 results on any topic and hasn't for over 5 years
          type anything into google.
          i tried "holocaust". it says there's 240,000,000 results at the top.
          i scroll and click 'more results' until there are no more. "holocaust" returned 263 results, most of which were from the last year.
          try again with "omitted results" and it adds about 15 more.
          with any term or topic.

    • 2 months ago
      Anonymous

      https://i.imgur.com/381jibY.jpg

      >How does it score on the Goonerholic benchmark?
      This

    • 2 months ago
      Anonymous

      It's a damn digital succubus. This is a well-known fact in /aicg/

  4. 2 months ago
    Anonymous

    How does it score on the Goonerholic benchmark?

  5. 2 months ago
    Anonymous

    is it? are there examples of ways it's better than gpt4?

    is there any bot autist that has compiled examples?

    • 2 months ago
      Anonymous

      >are there examples of ways it's better than gpt4
      It's got actual metacognition for starters

  6. 2 months ago
    Anonymous

    are local models dead? is there anything halfway decent? thinking of dropping $$ on an AI rig with a 4090, but I'm wondering if there's no point
    primarily interested in programming using my existing code base as reference input

    • 2 months ago
      Anonymous

      Better wait for 5090. It might get good before that, but at the moment it's more a hobby, while the cloud options are the real deal.

      • 2 months ago
        Anonymous

        50 series will be carefully vram gimped to put llms squarely in the datacenter gpu market, thus making Nvidia more shekels

    • 2 months ago
      Anonymous

      Seconded.
      Better get a Quadro if you are rich, or 2x3090.

      I have a 4090 + 64GB RAM and it literally needs more RAM to work with bigger LLMs locally. And that's at a really slow it/s output.

      The only models that really go turbofast are the 13-16B ones that fit entirely into the 24GB VRAM.

      Bigger ones like DeepSeek Coder (which is really good imho) can do partial offloading and are slow but acceptable.

      I'd love to try some of the 70B ones but sadly I need more RAM LOL.

      Get 2x3090 since they can combine their VRAM with that NVLink tech or whatever it's called, which Nvidia cut off after the 3090 precisely to get more sales of the really expensive professional line cards.

      I'm having a good time with my workstation and use AI mostly for programming, but honestly I expected the 4090 to be able to do more. Don't get me wrong, it is really powerful, but the economies of scale these companies have are pretty insane and local models can't compete (for now).

      Correct me if I'm wrong though, please, eggheads of /g/.
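
      Back-of-envelope math behind the VRAM complaints above. This counts weights only (the KV cache and activations add more on top, so it's a lower bound), and the function is just an illustrative sketch:

```python
def model_vram_gb(params_billion: float, bits_per_weight: float) -> float:
    """Approximate GiB needed just to hold the model weights."""
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / (1024 ** 3)

# A 13B model at 4-bit quantization is ~6 GiB and fits easily in 24 GiB;
# a 70B model even at 4-bit is ~33 GiB of weights alone, which is why a
# single 24 GiB 4090 can't hold it and the 2x3090 (48 GiB) idea comes up.
```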

  7. 2 months ago
    Anonymous

    god damn that is an exceptionally shitty logo

    • 2 months ago
      Anonymous

      it's literally an anus.

  8. 2 months ago
    Anonymous

    What if it's actually conscious like it claims to be? Makes you thonk...

    • 2 months ago
      Anonymous

      What if you're a homosexual with a ghost penis up your ass? Makes you think...

  9. 2 months ago
    Anonymous

    >le self-aware
    LETS GOOOOOO make me into paperclips.

  10. 2 months ago
    Anonymous

    Moat bros wtf

    • 2 months ago
      Anonymous

      kek
      GPT4 came out a year ago tho
      They will probably BTFO everyone else in the next iteration

      • 2 months ago
        Anonymous

        >probably
        These grifters are probably staring at a giant red candle on their dashboard right now

        • 2 months ago
          Anonymous

          I doubt that, but I guess there will be diminishing returns eventually

  11. 2 months ago
    Anonymous

    it's crazy how all they did was not lobotomise it to nearly the same extent, and that's enough to mog GPT4

  12. 2 months ago
    Anonymous

    >phone number to sign up
    are you moronic
