When did you realize that Nvidia is no longer a GPU company, but a machine learning company?

  1. 2 months ago
    Anonymous

    around 2016, after talking to some of their employees

  2. 2 months ago
    Anonymous

    When I started using CUDA and stopped giving a fuck about video games.

  3. 2 months ago
    Anonymous

    When they started making more money from datacenter than they did from games, which wasn't that long ago.

  4. 2 months ago
    Anonymous

    when I went to GTC 2016

  5. 2 months ago
    Anonymous

    is that a bad thing? why do gaymers think they’re entitled to the hardware instead of the people actually creating the most important technology in human history?

    • 2 months ago
      Anonymous

      >actually creating the most important technology in human history?
      Like... creating AI porn?

      • 2 months ago
        Anonymous

        unironically yes

      • 2 months ago
        Anonymous

        If you genuinely think people are willing to spend $1600 to create extremely mediocre 512x512 porn, with maybe a 2% yield rate even with good prompts, 10x faster than on a card that costs $90, you are kidding yourself

        • 2 months ago
          Anonymous

          >mediocre 512x512 porn

        • 2 months ago
          Anonymous

          >extremely mediocre 512x512 porn
          speak for yourself

        • 2 months ago
          Anonymous

          >mediocre
          expand your diction

  6. 2 months ago
    Anonymous

    tomorrow

  7. 2 months ago
    Anonymous

    3xxx series

  8. 2 months ago
    Anonymous

    Yeah, imagine their faces when all gamers switch to Intel/AMD and, looking at their annual income, they realize that they were a gaming company all this time.
    A machine learning company that can't sell its GPUs now that all the cryptominers have left, lmao, what a joke

    • 2 months ago
      Anonymous

      Oh yeah, AMD will be on top this gen. Nvidia collapse imminent.
      Just two more weeks.

      • 2 months ago
        Anonymous

        >Just two more weeks
        No, more like four weeks.

        • 2 months ago
          Anonymous

          >It'll be different this time!
          lmao
          When was the last time AMD had a majority of dGPU sales? It was definitely pre-Tesla.

          • 2 months ago
            Anonymous

            AMD actually is outselling Nvidia in the compute sector now

            • 2 months ago
              Anonymous

              I'm starting to see why you think RDNA3 will make a difference. You inhabit a wonderland of delusional cope.

              • 2 months ago
                Anonymous

                >You inhabit a wonderland of delusional cope.
                Lmao moron, do you realize that the compute sector has clients who need individual solutions, and AMD knows how to work with them? Meanwhile Nvidia is known to be shitty with individual contracts, and working with them is a pain in the ass (for example, the Apple and Nvidia cooperation).

              • 2 months ago
                Anonymous

                Still waiting on the proofs for your pants-on-head retarded post.
                This should give you a sense not just of being wrong, but how wrong you are.

              • 2 months ago
                Anonymous

                not him but that's already 3 years ago. AMD released some new hardware and got some pretty sweet deals after that, and recent deals matter more for future direction than old generations still being used. I can't find any numbers on this tho, could very well be that nv is still selling way more cards as we speak

          • 2 months ago
            Anonymous

            >When was the last time AMD had a majority of dGPU sales
            Who cares? If RDNA 3 beats Lovelace, but sheep keep buying NVIDIA anyway, then that's only positive for me. More cards for me at a better price.

            • 2 months ago
              Anonymous

              >Everyone will switch to AMD/Intel! Nvidia's gaming sales will dry up!
              >Well, uh, actually, maybe neither will happen...
              We're on the same page.

              • 2 months ago
                Anonymous

                Right, I'm playing into your hypothetical scenario where all consumers are sheep that will keep buying NVIDIA forever even if AMD performs better for a better price. Ignore AMD Ryzen, it never happened.

            • 2 months ago
              Anonymous

              >If RDNA 3 beats Lovelace
              perf per dollar and perf per watt almost certainly, but for the absolute halo product performance probably not. and even if they did, great many people will still get nvidia just because that's what they're used to.

    • 2 months ago
      Anonymous

      fucking retard, the 3090 is only ~0.6% of gaming market share according to Steam stats. gamers are piss-poor children or manchildren in mom's basement who can't afford something as cheap as a $2000 GPU, while it's common in the AI/ML space because it massively increases your productivity and cuts the time your work takes. videogames were just the starting point for GPUs, but after 30+ years we finally realized GPUs can be used for better and more important things

      • 2 months ago
        Anonymous

        >cant afford something as cheap as $2000 GPU
        I've recently had a revelation that most people could afford this just fine; they just don't want to buy it. Thanks to the history of Moore's law, a lot of tech enthusiasts have become drunk on abstract value propositions and marginal upgrade paths. In reality, if you spend $3000 a year buying the best new shit, that's still the cheap end of hobbies (obviously you haven't bought any games yet, but again, cheap end) and is the equivalent of replacing the tires on your car twice, which anyone would do without blinking. I blame Linus Tech Tips and underages

      • 2 months ago
        Anonymous

        Only brain-damaged people, or people who are not spending their own money, would spend $2000 on a GPU.
        No fucking shit people don't get them for gaming machines

        • 2 months ago
          Anonymous

          stop being poor

    • 2 months ago
      Anonymous

      Retard.
      Nvidia doesn't care about gamers anymore. One or two more gens and I can actually see the leather jacket not even talking about fps performance anymore, but exclusively about how fast an AI can run on these.
      They are probably leaving the gamer market to AMD

  9. 2 months ago
    Anonymous

    20 series launch was the first one with tensor cores wasn't it? so then

    • 2 months ago
      Anonymous

      Tensor cores have nothing to do with it. We're talking about CUDA cores and VRAM, the only things that can be used for neural networks. Tensor cores are shit for gaymers so they can halve their framerate and be happy, real developers don't have any access to them.

      • 2 months ago
        Anonymous

        You can run neural nets on tensor cores under a narrow range of circumstances, and if you do find yourself in those circumstances you should use the tensor cores because they're about 15x faster
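
        To make the "narrow range of circumstances" concrete: historically, cuBLAS would only route a matmul to tensor cores when the inputs were half precision and every matrix dimension was a multiple of 8; otherwise it fell back to the regular CUDA cores. A rough Python sketch of that eligibility rule (the function name is illustrative, not a real API):

```python
# Illustrative sketch, not a real API: the rule older cuBLAS versions used
# to decide whether a GEMM could run on tensor cores (fp16 inputs, and
# M, N, K all multiples of 8) or had to fall back to plain CUDA cores.

def tensor_core_eligible(m, n, k, dtype):
    """Hypothetical eligibility check for an fp16 tensor-core GEMM."""
    return dtype == "float16" and all(dim % 8 == 0 for dim in (m, n, k))

print(tensor_core_eligible(512, 512, 512, "float16"))  # True
print(tensor_core_eligible(512, 512, 510, "float32"))  # False
```

        Newer CUDA releases relax the alignment rule by padding internally, but the fp16-input requirement is the kind of "narrow circumstance" being described, and hitting it is what unlocks the large speedup.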

    • 2 months ago
      Anonymous

      Volta was the first with tensor cores.

      Tensor cores have nothing to do with it. We're talking about CUDA cores and VRAM, the only things that can be used for neural networks. Tensor cores are shit for gaymers so they can halve their framerate and be happy, real developers don't have any access to them.

      >Tensor cores are shit for gaymers
      >real developers don't have any access to them
      Retard.

  10. 2 months ago
    Anonymous

    But they just got btfo'd by Tesla's new dojo

  11. 2 months ago
    Anonymous

    When Turing became a compute architecture. All we get now is compute scraps where the theoretical performance is not even close to being realized. Maxwell was so damn efficient when it came out because it was gaming focused.

  12. 2 months ago
    Anonymous

    When the $1200 3080TI was released

  13. 2 months ago
    Anonymous

    it's a computing company

    GPU still has "computing" in its name; later they'll probably adopt a new "General Processing Unit", while "Central Processing Unit" remains only for logical operations and security.
    So, nothing about machine learning, cope retard. Not even tensor cores have anything to do with AI; they're optimized matrix-operation cores

    • 2 months ago
      Anonymous

      >not even tensor cores have anything to do with AI, they're optimized matrix operations cores
      They're optimized for low-precision matrix operations which is exactly what AI needs.
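
      For the curious, the primitive a tensor core implements is a small fused matrix multiply-accumulate, D = A x B + C, with fp16 inputs and fp32 accumulation. A minimal numpy emulation of that numeric scheme (pure software, just to show the precision behavior, not real tensor-core execution):

```python
import numpy as np

# Emulate the tensor-core primitive D = A @ B + C on a 4x4 tile:
# inputs stored in fp16, products accumulated in fp32. This is a
# software sketch of the numeric scheme, not actual hardware behavior.
rng = np.random.default_rng(0)
a = rng.standard_normal((4, 4)).astype(np.float16)  # low-precision input
b = rng.standard_normal((4, 4)).astype(np.float16)  # low-precision input
c = np.zeros((4, 4), dtype=np.float32)              # fp32 accumulator

# Upcast before the multiply so accumulation happens in fp32; this is
# what keeps low-precision matmul numerically usable for neural nets.
d = a.astype(np.float32) @ b.astype(np.float32) + c

print(d.dtype)  # float32: results come out at accumulator precision
```

      Neural-net training tolerates fp16 inputs because weights and activations don't need many significant digits, while the fp32 accumulator prevents the long sums in a matmul from losing precision.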

  14. 2 months ago
    Anonymous

    When did you realize that GPUs have more practical applications than gaymen?

  15. 2 months ago
    Anonymous

    Test

  16. 2 months ago
    Anonymous

    fuck A.I

  17. 2 months ago
    Anonymous

    A.I.? More like ayy I fucked your mom last night! lmaoooo

  18. 2 months ago
    Anonymous

    2016, when Nvidia announced the P100.
    Nvidia removed the G (graphics/GeForce) from datacenter products and announced NVLink.

  19. 2 months ago
    Anonymous

    I don't get why cuda baby ducks are seething at amd's rise. Once nvidia is thoroughly defeated and amd reigns supreme in the gpu sector, they'll open source cuda like they did gsync, and cuda cucks will still be able to reuse their expertise. No guarantees that cuda continues to dominate LOL, imagine wasting your time on a proprietary programming language, what fucking retards.
