The King of AI Cards

>$300 brand new
>$250 second hand
>Tensor cores
>12GB VRAM
>NVIDIA drivers
Is the NVIDIA GeForce RTX 3060™ the best card for AI? How can anyone else compete?

  1. 11 months ago
    Anonymous

    Can it run Crysis on HD?

    • 11 months ago
      Anonymous

      >video gaymes
      Wrong board

      This is what I use for prooompting for now, waiting for a good deal on a Nvidiot card with at least 16GB of VRAM

      • 11 months ago
        Anonymous

        >12GB VRAM

        why does nvidia torture us with so little RAM. We need more, NOW. As said, video games are for fricking children.

        The only purpose of a GPU at this point is AI.

        • 11 months ago
          Anonymous

          This. I can run all the games I want with a ten-year-old graphics card, but AIs are what I need new hardware for. And for AI, VRAM is all that matters.

        • 11 months ago
          Anonymous

          cos it would cannibalize their """high end""" offerings, which are literal gamer cards but with more ram.

        • 11 months ago
          Anonymous

          you do realize 99% of people still buy GPUs for games, don't you? local AI is popular only with BOT coomers and schizos. 12GB VRAM is still plenty for gaming. you won't get more VRAM in consoomer cards any time soon.

        • 11 months ago
          Anonymous

          G A M I N G G A M I N G G A M I N G G A M I N G G A M I N G G A M I N G G A M I N G G A M I N G G A M I N G G A M I N G G A M I N G G A M I N G

        • 11 months ago
          Anonymous

          Nvidia agrees, but their actual AI cards cost $10,000 each and are being bought up as fast as they can make them. They simply have no incentive to try and go for the consumer market in any way at all, especially since the majority of gamers have no need whatsoever for any GPU more advanced than the RTX 3080.

    • 11 months ago
      Anonymous

      Yes very well actually

  2. 11 months ago
    Anonymous

    The Intel Arc A770 has much better FP16 performance and the extra 4GB VRAM can take it further. Pretty much the only thing stopping it from becoming the value champion for AI right now is software support. Imagine what things will be like once oneAPI gets better support for PyTorch and Tensorflow.
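
    If you want to see what the oneAPI path actually looks like, here's a rough sketch of an FP16 matmul through PyTorch's xpu device. It assumes intel-extension-for-pytorch is installed and exposes "xpu"; exact package and API names shift between versions, so treat it as illustrative, not gospel.

    # Sketch: FP16 matmul on an Arc card via Intel's PyTorch extension (assumed installed).
    import torch
    import intel_extension_for_pytorch as ipex  # registers the "xpu" device with torch

    device = "xpu" if torch.xpu.is_available() else "cpu"

    a = torch.randn(4096, 4096, dtype=torch.float16, device=device)
    b = torch.randn(4096, 4096, dtype=torch.float16, device=device)
    c = a @ b  # FP16 matmul; lands on the XMX units when the xpu backend is active
    print(c.shape, c.device)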

    • 11 months ago
      Anonymous
      • 11 months ago
        Anonymous

        your point?

        VRAM is more important for AI. I can wait a few seconds longer for an iteration, I can't do anything about the memory.

      • 11 months ago
        Anonymous

        > theoretical max

        Now let's see the actual performance doing anything in the real world

    • 11 months ago
      Anonymous

      Can run only 7B.
      Pathetic.

      > the only thing stopping it from becoming the value champion for AI right now is software support
      Yup.
      AMD shows it isn't that easy.
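
      Napkin math on what fits where (weights only, roughly params × bits/8; KV cache and runtime overhead push real usage higher, so take it as a floor):

      # Back-of-envelope VRAM estimate: weights only, no KV cache / activations / overhead.
      def weights_gb(params_b, bits_per_weight):
          return params_b * bits_per_weight / 8  # billions of params x bytes per weight = GB

      for params in (7, 13, 30):
          for bits in (16, 8, 4):
              print(f"{params}B @ {bits}-bit ~ {weights_gb(params, bits):.1f} GB")
      # 7B @ 16-bit ~ 14 GB (blows past 12GB), 13B @ 4-bit ~ 6.5 GB (fits), 30B @ 4-bit ~ 15 GB (24GB-class card)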

      • 11 months ago
        Anonymous

        Not sure how they compare rn, but some time after Ryzen came out, Intel's software department had more employees than AMD had total employees.

    • 11 months ago
      Anonymous

      Do these arc cards have dedicated AI accelerators like Tensor cores in the RTX cards?

      • 11 months ago
        Anonymous

        Yes. They're called XMX cores (Xe Matrix Extensions).

        • 11 months ago
          Anonymous

          Now do they support cuda?

        • 11 months ago
          Anonymous

          >xe
          go back to xeddit

    • 11 months ago
      Anonymous

      Intel Arc is already killing it in Blender; they just merged initial hardware RT support, while AMD still hasn't gotten HIP-RT for the RX6000 series out the door even though they promised it would be ready by late last year.

  3. 11 months ago
    Anonymous

    Have a 3060, bought it for Unreal and it beat a 3080 in performance on that.

    Now use it for prompting, but yeah I want MOAR vram. Looking at a 4090, prices are ridiculous here in EU

    • 11 months ago
      Anonymous

      >Looking at a 4090, prices are ridiculous here in EU
      They are already at or below exchange-rate-adjusted MSRP + VAT.
      I'm waitgayging for one too but I don't expect them to go much lower.

  4. 11 months ago
    Anonymous

    >192-bit bus
    laughinggirls.jpg

  5. 11 months ago
    Anonymous

    Frick off, AI homosexual. It's because of homosexuals like you that AMD limits the VRAM on other cards so AI homosexuals don't just buy them all. We'd have a 16GB 4070s and 20GB 2080s if it wasn't for AI homosexuals immediately just buying them all to make porn. All the 3060s being sold are to AI homosexuals wanting to make porn. Frick every single one of you AI homosexuals.

    • 11 months ago
      Anonymous

      >we'd have a 16GB 4070s and 20GB 2080s if it wasn't for AI homosexuals
      Eh, it's not like Nvidia started skimping on VRAM only recently - see the 970's 3.5GB. Also, models like the 2080 Ti and 3080 were popular for AI despite not having anything like 20GB.
      >so AI homosexuals don't just buy them all
      Hey, I've been telling AI bros to buy the Arc A770 16GB. As the A770 anon above says, for AI in particular it's got a very compelling price : performance ratio (and VRAM).
      You can do some gaming on it, too, and it'll age much better than the various 8GB Ampere VRAMlets.
      (On a strategic note, I'd also much rather support Intel staying in the dGPU market than Nvidia's price policies. Vote with your wallet etc. Pic related.)

      https://i.imgur.com/gzScWjm.jpg

  6. 11 months ago
    Anonymous

    >EVGA
    oh no no no no

  7. 11 months ago
    Anonymous

    recently 'upgraded' and went with this because of the 12GB. The next step up was a 4070 for $250 more, and still 12GB.

  8. 11 months ago
    Anonymous

    >>$300 brand new
    €370 in Germany
    And euros are supposed to be stronger than dollars
    19% VAT raping again
    Remember, that's what the leftists want

    • 11 months ago
      Anonymous

      yeah, buying electronics in the EU and not having a business to deduct VAT is tough

    • 11 months ago
      Anonymous

      350 here.

    • 11 months ago
      Anonymous

      VAT is the most right-wing tax, as it falls especially on the lower classes and doesn't consider income or assets at all.

      • 11 months ago
        Anonymous

        >he thinks leftists care about the average person

      • 11 months ago
        Anonymous

        leftist theft is never right wing no matter how much you recoil, how injured you are, how you suddenly shrink back

      • 11 months ago
        Anonymous

        you have to go back you moron

      • 11 months ago
        Anonymous

        > this thing we advocated for turned out to not be what we thought it would even after evil right wingers warned us
        > that means it's evil and right wing
        You have a child's understanding of the world if you think "right wing" means needlessly attacking poor people.

        • 11 months ago
          Anonymous

          it means my news channel wants me to like you and hate who they call "on the left" ooo spooky

  9. 11 months ago
    Anonymous

    I'm not telling

  10. 11 months ago
    Anonymous

    Literally just buy a second hand RTX A4000 with 20GB for €450

  11. 11 months ago
    Anonymous

    *king of AI cards for a hobbyist

  12. 11 months ago
    Anonymous

    >12GB VRAM
    there's your problem
    t. creditmaxxed a used 3090 for 4-bit 30B miku

    • 11 months ago
      Anonymous

      >creditmaxxed
      moron

      • 11 months ago
        Anonymous

        weird way to spell "based"
        mr shekelberg just gives me stuff its awesome

      • 11 months ago
        Anonymous

        so long as it's 0% and you can pay it off before the rate kicks in, it's always worth it

        • 11 months ago
          Anonymous

          two assumptions that never hold true
          don't spend money you don't have, or get hauled to court by the banks and watch them frisk you for everything you own

          • 11 months ago
            Anonymous

            Spending money I don't yet have has been the best investment of my life
            >t. homeowner

            • 11 months ago
              Anonymous

              I bought my home

          • 11 months ago
            Anonymous

            >don't spend money you don't have
            good to know you are a financially illiterate boomer

            >I bought my home
            you don't have to spend 3x your yearly wage in a month when taking out credit

            money has a time factor
            >https://www.investopedia.com/terms/t/timevalueofmoney.asp

  13. 11 months ago
    Anonymous

    It is. I bought one used last December specifically for AI.

  14. 11 months ago
    Anonymous

    >EVGA

  15. 11 months ago
    Anonymous

    Hehe My Coom Generator

    • 11 months ago
      Anonymous

      starving artists hate this one simple trick

      • 11 months ago
        Anonymous

        Chinese vidya artists don't like it, either. Maybe except for the ones who get to stay at their jobs and use the AI to 5x their productivity or whatever.

        • 11 months ago
          Anonymous

          >5x their productivity
          still need to see that irl for any meaningful work

        • 11 months ago
          Anonymous

          inkcels btfo

    • 11 months ago
      Anonymous

      And the porn industry is in fricking permanent shambles

      • 11 months ago
        Anonymous

        i don't fap or pay attention to 3dpd, are the mindgeek israelites and friends actually worried?

  16. 11 months ago
    Anonymous

    >12gb VRAM
    that's fricking nothing, although perhaps sufficient if "ai" for you is just diffusion cooms

    • 11 months ago
      Anonymous

      >just diffusion cooms
      I see you haven't been paying attention to the incredibly fast moving machine learning thread generals here

    • 11 months ago
      Anonymous

      CPUs run useful LLMs brud

      • 11 months ago
        Anonymous

        If you are looking for a GPU in the first place, it would be assumed that you are not satisfied with CPU speeds on textgen.

  17. 11 months ago
    Anonymous

    Can anyone explain VRAM to me?

    >have 10GB card
    >play RDR2
    >literally always maxed out
    >even when I use the exact same settings I had on my 980 that had only 4GB of VRAM

    The frick is this. I'm worried it's gonna keep filling up and then shift over to RAM which will cause my game to frick up performance wise.

    • 11 months ago
      Anonymous

      If it ran okay on 4GB, it's probs just hoarding assets in case it needs them again later, for better performance. Could also be reserving some VRAM for later use.

    • 11 months ago
      Anonymous

      Because empty VRAM is wasted space. So they just fill it or mark it as full so nothing else can use it when it wants to. There is also a difference between "used" VRAM and VRAM actually in use.

      VRAM capacity is only an issue when it has to constantly thrash things in and out to keep up with the gameplay. The VRAM issue is mostly overblown by morons and a couple of new shittybox ports running like shit regardless.

    • 11 months ago
      Anonymous

      see the other anon's reply above about empty VRAM being wasted space

      Ignore any dogshit VRAM monitoring software - MSI Afterburner, for example, can't differentiate between allocation and actual usage.

      So that's why two PCs with the exact same in-game settings but different amounts of VRAM can show totally different VRAM measurements. Games typically just allocate all (or most) of the VRAM to use it when they need to.

      You'll know when your VRAM is legit maxed out because your game will stutter like a mess. But even if you see the VRAM shooting up, to near full or outright maxed, this doesn't mean the game is actually using all of it.

      TL;DR - Ignore that shit, look out for stutters. No stutters = No issues.
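
      Same allocated-vs-actually-used thing happens on the AI side, by the way. Rough sketch, assuming a CUDA build of PyTorch on an Nvidia card: the caching allocator hangs onto VRAM no live tensor is using, and driver-level tools report that reserved pool.

      # "reserved" vs "in use": the caching allocator keeps VRAM it already grabbed,
      # so allocation-based monitors over-read what tensors actually occupy right now.
      import torch

      x = torch.randn(256, 1024, 1024, device="cuda")  # ~1 GiB of FP32
      del x                                            # tensor is gone...
      print(torch.cuda.memory_allocated() / 2**30, "GiB in live tensors")       # ~0
      print(torch.cuda.memory_reserved() / 2**30, "GiB still held by the pool")  # ~1
      # nvidia-smi will show roughly the reserved figure (plus CUDA context), not the live one.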

      • 11 months ago
        Anonymous

        >MSI Afterburner, for example, can't differentiate between allocation and actual usage.

        Funny, because RTSS (which is what MSI Afterburner relies on for that overlay) can actually do that with the Memory Usage / Process checkbox.

        • 11 months ago
          Anonymous

          Wait, what?

          • 11 months ago
            Anonymous

            RTSS is just the collection server. You still need a program to select what you want/display it. Such as RivaTuner or Afterburner. But it's been supported for a few years now.

    • 11 months ago
      Anonymous

      see the second post replying to you

  18. 11 months ago
    Anonymous

    it's a good poorgay-tier card. the 3090 is king for mid-tier though

    • 11 months ago
      Anonymous

      >$300 for ONE (1) component
      >poor

      • 11 months ago
        Anonymous

        That's basically the lowest you can go for a practical GPU.

  19. 11 months ago
    Anonymous

    3090 24GB (936 GB/s) == 10-20+ token/s (30B)
    3060 12GB (360 GB/s) == 5-11 token/s max (13B) tuned
    P40 24GB CPU (346 GB/s) == 4-7 t/s 30B | 11-14 t/s 13B (as swap)
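
    Those numbers roughly track naive bandwidth math: single-stream generation streams the whole quantized model through the memory bus for every token, so bandwidth / model size gives a ceiling. A sketch under that assumption - real backends land well below it because of dequant and other overhead:

    # Crude tokens/s ceiling: every generated token reads all weights once,
    # so t/s <= memory bandwidth / model size in bytes.
    def tok_per_s_ceiling(bandwidth_gb_s, params_b, bits_per_weight):
        model_gb = params_b * bits_per_weight / 8  # e.g. 4-bit 30B ~ 15 GB
        return bandwidth_gb_s / model_gb

    print(tok_per_s_ceiling(936, 30, 4))  # 3090: ~62 ceiling vs 10-20+ measured
    print(tok_per_s_ceiling(360, 13, 4))  # 3060: ~55 ceiling vs 5-11 measured
    print(tok_per_s_ceiling(347, 30, 4))  # P40:  ~23 ceiling vs 4-7 measured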
