Machine Learning

I want to build my own machine learning rig and train on my own data. I want to train it on technical documentation so I can make AI my b***h and have it write code for me.

Where the frick do I even start?

  1. 10 months ago
    Anonymous

    start by putting your finger in your butt

    • 10 months ago
      Anonymous

      now what??

      • 10 months ago
        Anonymous
  2. 10 months ago
    Anonymous

    >used 3090, get 2 if you really wanna go balls deep. 4090 is also an option if you have $$$ to burn
    >7950x
    >64GB RAM
    >2TB nvme SSD + 16+TB HDD
    >overkill PSU
    you'll never look back. you're welcome
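
    once it's built, a quick sanity check that both cards actually show up — rough sketch, assuming a CUDA build of PyTorch is installed:

    import torch

    print(torch.cuda.is_available())         # should print True
    print(torch.cuda.device_count())         # 2 if you went dual 3090
    for i in range(torch.cuda.device_count()):
        props = torch.cuda.get_device_properties(i)
        print(i, props.name, round(props.total_memory / 1024**3), "GiB")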

    • 10 months ago
      Anonymous

      based. would you recommend this as a standalone tower or can i use my main rig. obviously stand alone is better but a lil $ conscious here

      • 10 months ago
        Anonymous

        from experience it really depends on how well you can handle the heat and noise. Normally I'd put this in a separate room with better airflow and just ssh into it from wherever but if you can handle the heat (especially if you're in a cold climate area where you may want the extra heat), making it your main rig is an option.
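
        if you do park it in another room, kicking off runs over ssh is basically all you need. rough sketch with paramiko — hostname, user and paths are made up:

        import paramiko

        client = paramiko.SSHClient()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        client.connect("ml-box.lan", username="anon")   # made-up host/user
        # detach the run so it keeps going after you disconnect
        client.exec_command("cd ~/train && nohup python train.py > train.log 2>&1 &")
        client.close()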

        • 10 months ago
          Anonymous

          hmm good point. i guess it will be hard to tell unless i try it out. im also in an apartment so....

          • 10 months ago
            Anonymous

            power limits are your friend then. if you're really sensitive to heat and/or your electric rate really sucks (i.e. you're a europoor), get a single 4090. if you have even more money to burn and wanna be a local ML gigachad look into the RTX A6000.
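
            capping a card is one command (e.g. sudo nvidia-smi -pl 280 for a ~280 W cap), and if you want to watch what the cards are actually pulling, here's a rough sketch with pynvml (pip install nvidia-ml-py):

            import pynvml

            pynvml.nvmlInit()
            for i in range(pynvml.nvmlDeviceGetCount()):
                h = pynvml.nvmlDeviceGetHandleByIndex(i)
                draw = pynvml.nvmlDeviceGetPowerUsage(h) / 1000               # mW -> W
                cap = pynvml.nvmlDeviceGetPowerManagementLimit(h) / 1000
                print(f"GPU {i}: {draw:.0f} W of {cap:.0f} W cap")
            pynvml.nvmlShutdown()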

            • 10 months ago
              Anonymous

              thanks fren. what do you mean by power limits though? or what should i research? apologies for the n00b questions

    • 10 months ago
      Anonymous

      This guy is almost there, but it's not efficient. No need for the monster CPU, you just need something with a bunch of PCIe lanes.

      >Chinese X99 / DDR3 motherboard with 3 x16 slots
      >Xeon E5 2673 v3
      >4x16 or 4x32 GB DDR3 ECC, shit's dirt cheap
      >Biggish NVMe
      >3x 3090, shouldn't be any more expensive than his recommendation
      >You can pick up 3x P40 for about the same price as one 3090; that will also let you run any model with huge context, but it will be a lot slower. They're passively cooled server cards, so you'll need to sort out cooling yourself, and you'll need extra CPU power cables to power them
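
      whichever 3 cards you end up with, splitting one model across them is roughly this with huggingface transformers + accelerate (model name is just a placeholder):

      from transformers import AutoModelForCausalLM, AutoTokenizer

      model_id = "some/large-model"           # placeholder, not a real repo
      tok = AutoTokenizer.from_pretrained(model_id)
      model = AutoModelForCausalLM.from_pretrained(
          model_id,
          device_map="auto",    # shards the layers across all visible GPUs
          torch_dtype="auto",
      )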

  3. 10 months ago
    Anonymous

    It's not worth it. Just rent servers with GPUs.

    • 10 months ago
      Anonymous

      i agree this is probably cheaper but i prefer to roll my own. also i am a schizo and i dont want to send data to a 3rd party. that being said...what service would you recommend in this case?

  4. 10 months ago
    Anonymous

    Buy 8 A100s and read the "Textbooks Are All You Need" paper

    • 10 months ago
      Anonymous

      >not installing a distributed cluster of H100s in your mom's basement
      ngmi

      • 10 months ago
        Anonymous

        >distributed
        enjoy your latency
        Shit gets way too complicated once you leave the comfort of a single machine

  5. 10 months ago
    Anonymous

    What kind of data?

    • 10 months ago
      Anonymous

      i want to train it on an SDK so it will write code for me. tbh i have no idea if its gonna work, but i do have some cash to burn and i think its the best way to learn and get involved in machine learning. lets see where it takes me
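
      for what it's worth, one common starting point for that is fine-tuning an existing code model on your SDK docs with LoRA rather than training from scratch. very rough sketch, assuming transformers/peft/datasets are installed; model name and file path are placeholders:

      from datasets import load_dataset
      from peft import LoraConfig, get_peft_model
      from transformers import (AutoModelForCausalLM, AutoTokenizer,
                                DataCollatorForLanguageModeling, Trainer,
                                TrainingArguments)

      base = "some/base-code-model"                      # placeholder
      tok = AutoTokenizer.from_pretrained(base)
      if tok.pad_token is None:
          tok.pad_token = tok.eos_token
      model = AutoModelForCausalLM.from_pretrained(base)
      model = get_peft_model(model, LoraConfig(r=16, lora_alpha=32, task_type="CAUSAL_LM"))

      # plain-text dump of the SDK docs (placeholder path)
      ds = load_dataset("text", data_files={"train": "sdk_docs.txt"})
      ds = ds.map(lambda x: tok(x["text"], truncation=True, max_length=1024), batched=True)

      trainer = Trainer(
          model=model,
          args=TrainingArguments("out", per_device_train_batch_size=1,
                                 gradient_accumulation_steps=8, num_train_epochs=1),
          train_dataset=ds["train"],
          data_collator=DataCollatorForLanguageModeling(tok, mlm=False),
      )
      trainer.train()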
