Do I really need a powerful computer if I want to get into machine learning?

  1. 2 years ago
    Anonymous

    You don't need a powerful computer, you just need an Nvidia computer.

    • 2 years ago
      Anonymous

      Man, why did proprietary CUDA get to win over OpenCL? Imagine what the world would be like if it were the other way around.

      • 2 years ago
        Anonymous

        Nvidia had good timing on this. They had workstation cards before, but multicore CPUs are what helped them out. AMD had their Athlon 64 X2s and Intel released Core 2 chips, and that meant multicore/threaded development was becoming much more approachable, instead of having to hog lab setups or supercomputers at the time. They bought Ageia a few years after that, which gave them PhysX, and then they worked on putting it together with the GPU on a single card.
        Then it was as simple as pitching the idea to companies and labs, and it made sense for them. Instead of buying additional servers and core licenses, why not add these PCIe cards to your server and split the workload into small simple chunks, which GPUs excel at. And given how ATI was losing ground to Nvidia after the 6000 series, and AMD got blown away by Core 2 duos and quads, we're lucky they made it to 2016.

    • 2 years ago
      Anonymous

      im not BOTermin

      • 2 years ago
        Anonymous

        Only BOTermins use AyyMD. If you do Machine Learning, you have to get Nvidia. If you do anything scientific, you get Nvidia.

        • 2 years ago
          Anonymous

          False.

          • 2 years ago
            Anonymous

            I mean, feel free to train your models on 16GB cards without ANY of the CUDA goodness and libraries. You will come to Nvidia, you will (Unless mama Lisa Su buys a CUDA license)

        • 2 years ago
          Anonymous

          ROCm is good now
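
For what it's worth, PyTorch's ROCm builds reuse the CUDA device API, so device-agnostic code runs unchanged on AMD cards (a minimal sketch; assumes a ROCm or CUDA build of torch is installed, and falls back to CPU otherwise):

```python
# Sketch: on a ROCm build of PyTorch, torch.cuda.is_available()
# reports True on supported AMD GPUs, and "cuda" maps to the AMD
# device -- no code changes needed versus an Nvidia box.
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
x = torch.randn(4, 4, device=device)
print(x.device)
```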

    • 2 years ago
      Anonymous

      bump

  2. 2 years ago
    Anonymous

    Yes, it will handle everything you throw at it.

  3. 2 years ago
    Anonymous

    just run the model on the CPU, yeah it'll take hours but at least you aren't limited
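
"Not limited" is literal: frameworks run the same training loop on CPU, just slower. A minimal sketch (assumes PyTorch is installed; the model and data here are made up):

```python
# Sketch: a tiny regression model trained entirely on CPU.
# Identical code would run on a GPU if one were available.
import torch
import torch.nn as nn

torch.manual_seed(0)
model = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
opt = torch.optim.SGD(model.parameters(), lr=0.1)
x, y = torch.randn(64, 8), torch.randn(64, 1)

for _ in range(100):  # a few CPU training steps
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    opt.step()
print(f"final loss: {loss.item():.4f}")
```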

  4. 2 years ago
    Anonymous

    just ask your IT department to get access to a computing cluster

    • 2 years ago
      Anonymous

      My IT department's computing cluster is 5 unpowered ThinkPads in a warehouse.

  5. 2 years ago
    Anonymous

    [...]

    Because Nvidia invested hard in it, and AyyMD at the time was too busy being bankrupt to invest in theirs.
    Fun fact: Nvidia offered AMD a CUDA license for $0.50 per GPU. AMD declined. Thanks Raja. Very helpful.

    • 2 years ago
      Anonymous

      >Fun fact: Nvidia offered AMD a CUDA license for $0.50 per GPU. AMD declined. Thanks Raja. Very helpful.
      Get tha frick outta here! Seriously?

      • 2 years ago
        Anonymous

        Yes, seriously.

  6. 2 years ago
    Anonymous

    just use colab

    • 2 years ago
      Anonymous

      Google will time you out of Colab after a few hours, so you're very limited in the complexity of the models you can train. But for learning basic concepts it's fine. You can rent a GPU on GCP, but that quickly becomes expensive.

  7. 2 years ago
    Anonymous

    Look, to do any serious work, no personal desktop is going to be enough. For serious work you need multiple GPUs designed for deep learning, like the NVIDIA V100 and A100, which you can only rent via AWS or shit like vast.ai.
    BUT, a decent GPU with at least 6GB of VRAM is pretty nice to have if you want to learn while being independent from Kaggle and Colab and their shitty Jupyter notebooks.

    • 2 years ago
      Anonymous

      >v100 and a100
      <not dgx

  8. 2 years ago
    Anonymous

    You can just use the CPU at first to follow along with tutorials and courses.

    I did my first "real" working tensorflow project on an athlon 5350.

    Just use whatever you typed your post on until you actually have competence and start running into performance issues with your code.
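
A first "real" project really can be this small: a toy TensorFlow fit that trains in seconds on any CPU (a sketch; assumes tensorflow is installed, and the dataset is synthetic):

```python
# Sketch: learn a simple threshold rule from synthetic data,
# entirely on CPU. No GPU required to follow along.
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(0)
x = rng.random((256, 4)).astype("float32")
y = (x.sum(axis=1) > 2).astype("float32").reshape(-1, 1)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.fit(x, y, epochs=5, verbose=0)
```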

  9. 2 years ago
    Anonymous

    not really, you can do all your models in google colab.

  10. 2 years ago
    Anonymous

    google TPU has a free tier I think
