Is the hardware bottleneck real for AI? Or is it cope?

Is the hardware bottleneck real for AI
Or is it cope

  1. 2 months ago
    Anonymous

    Given the low-efficiency algorithms on which it's based, I'd say the bottleneck is more real than reality itself

  2. 2 months ago
    Anonymous

    Well, considering llama.cpp was built by a hobbyist in a weekend and it lets you run models that used to require multiple $40k GPUs on your gaymin' CPU and some RAM in reasonable time, I'd say there's quite a bit more to optimize. I ran LLaMA 65B on my laptop, it was great.
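
    For a sense of why that works, here is a rough weights-only memory sketch (it assumes a 65B-parameter model and ignores the KV cache, activations, and runtime overhead):

      # Why llama.cpp-style quantization puts big models on consumer machines:
      # weights-only footprint at different precisions.
      params = 65e9  # 65B-parameter model

      for name, bytes_per_weight in [("fp16", 2.0), ("8-bit", 1.0), ("4-bit", 0.5)]:
          gib = params * bytes_per_weight / 2**30
          print(f"{name:>5}: ~{gib:.0f} GiB of weights")

      # fp16 is ~121 GiB (multi-GPU territory); 4-bit is ~30 GiB, which a
      # well-specced desktop or laptop can actually hold in RAM.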

    • 2 months ago
      Anonymous

      You're moronic.

      >the "hardware bottleneck" is currently imposed by ivory tower python shitters spewing garbage code across 5000 CPUs [...]
      >all the program logic is in Python [...] but the GPU kernels are also written by python shitters and will have terrible performance on their own
      You're also moronic. Being moronic and making dunning kruger posts like these should be a bankable offense.

      https://i.imgur.com/Cp2P8UR.jpg

      >Is the hardware bottleneck real for AI
      >Or is it cope
      Probably, and I have a suspicion that there's a data bottleneck as well.

      • 2 months ago
        Anonymous

        >bankable offense
        your Freudian slip is my business strategy

      • 2 months ago
        Anonymous

        Calling others stupid doesn't make you seem any smarter. Nobody says dunning kruger ironically, it's always a sperg with a lot to prove

        • 2 months ago
          Anonymous

          >Nobody says dunning kruger ironically, it's always a sperg with a lot to prove
          nta, you're right. dunning kruger is an ironic phrase used to describe dunning kruger. the rest of the world that isn't stuck on sucking academic toe cheese uses the phrases "survivorship bias" and "imposter syndrome"

  3. 2 months ago
    Anonymous

    the "hardware bottleneck" is currently imposed by ivory tower python shitters spewing garbage code across 5000 CPUs and uploading terrible GPU kernels with huge cyclomatic complexities that stall the gpu pipelines constantly.
    things could easily get a couple orders of magnitude more efficient by hiring a decent programmer or 2.
    but we're still physically limited by data transfer rates on every component of the hardware except maybe the cpu. and don't misunderstand, I'm talking bandwidth not latency. the human brain has enormous bandwidth even if it only runs at 80hz
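
    To put rough numbers on the bandwidth point (a back-of-the-envelope sketch; the parameter count and memory bandwidth below are assumed illustrative figures, not measurements): single-stream decoding has to stream essentially all the weights from memory for every generated token, so memory bandwidth, not FLOPS, caps tokens per second.

      # Back-of-the-envelope: why single-stream decode is bandwidth-bound.
      # All figures are illustrative assumptions, not measured numbers.
      params = 65e9             # assumed model size
      bytes_per_param = 2       # fp16/bf16 weights
      mem_bandwidth = 2.0e12    # assumed ~2 TB/s of accelerator memory bandwidth

      # Each generated token reads (roughly) every weight once, so the ceiling is:
      print(mem_bandwidth / (params * bytes_per_param))   # ~15 tokens/s at fp16
      print(mem_bandwidth / (params * 0.5))               # ~62 tokens/s at 4-bit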

    • 2 months ago
      Anonymous

      Python is only used to pull levers, not for the actual compute-intensive code. This is why many 'Python' apps outperform C/C++ apps: they end up calling super-optimized libraries instead. Brains aren't comparable to computers at all; no number can meaningfully quantify their differences.
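
      A quick illustration of the lever-pulling point (a minimal sketch; assumes NumPy is installed, and the array size is arbitrary): both versions below compute the same dot product, but one goes through the interpreter element by element while the other hands the whole thing to optimized native code.

        import time
        import numpy as np

        n = 10_000_000
        a = np.random.rand(n)
        b = np.random.rand(n)

        # Pure-Python loop: every multiply/add round-trips through the interpreter.
        t0 = time.perf_counter()
        acc = 0.0
        for i in range(n):
            acc += a[i] * b[i]
        t_loop = time.perf_counter() - t0

        # Same dot product, but Python only "pulls the lever": np.dot dispatches
        # to compiled (typically BLAS-backed) code.
        t0 = time.perf_counter()
        acc_np = float(np.dot(a, b))
        t_np = time.perf_counter() - t0

        print(f"python loop: {t_loop:.2f}s   numpy: {t_np:.4f}s")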

      • 2 months ago
        Anonymous

        >python is only used to pull levers
        all the program logic is in Python; all the synchronization, all the data collection and routing, all the "safety" features.
        but the GPU kernels are also written by python shitters and will have terrible performance on their own
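
        One concrete way the Python side can hurt even when the kernels themselves are fine (a minimal sketch; assumes PyTorch is installed, falls back to CPU without CUDA, and the shapes are arbitrary): driving a thousand tiny ops from a Python loop versus letting one batched kernel do the same math.

          import time
          import torch

          device = "cuda" if torch.cuda.is_available() else "cpu"
          a = torch.randn(1000, 32, 32, device=device)
          b = torch.randn(1000, 32, 32, device=device)

          def sync():
              if device == "cuda":
                  torch.cuda.synchronize()

          # Driven from Python: one tiny op (with its dispatch/launch overhead)
          # per matmul, so the device spends much of its time waiting on the host.
          sync(); t0 = time.perf_counter()
          out_loop = [a[i] @ b[i] for i in range(1000)]
          sync(); t_loop = time.perf_counter() - t0

          # Same 1000 matmuls expressed as one batched call.
          sync(); t0 = time.perf_counter()
          out_bmm = torch.bmm(a, b)
          sync(); t_bmm = time.perf_counter() - t0

          print(f"python loop: {t_loop*1e3:.1f} ms   batched: {t_bmm*1e3:.1f} ms")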

        • 2 months ago
          Anonymous

          get the AAAA game devs on the case

        • 2 months ago
          Anonymous

          I don't believe you. Do you have a source?

          • 2 months ago
            Anonymous

            the voices in my head told me, Black person. frick off back to del.icio.us

            • 2 months ago
              Anonymous

              I'm gonna write a really passive aggressive blog post about your microaggressions to all my xeri followers

  4. 2 months ago
    Anonymous

    I am not really that educated on how AI works, but isn't there a brand-new chip architecture being made that specialises in AI? One where the processor and RAM are basically in the same place, or rather millions of tiny processors connected to millions of tiny RAMs, all wired together into one big thing, instead of the current setup where one big processor is connected to one big RAM through wires. Shouldn't that improve things in a significant way?
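
    The design described here sounds like the wafer-scale / processing-in-memory direction. A conceptual sketch of why it helps (NumPy only; the core count and matrix sizes are arbitrary illustration): each "core" keeps its own slice of the weights in local memory, so the big data never re-crosses a slow shared bus; only the small input and output vectors move.

      import numpy as np

      # Conceptual sketch of "many tiny processors, each with its own RAM".
      # Each core holds a row-slice of the weight matrix in local memory; the
      # weights are placed once, and only small vectors move at inference time.
      n_cores = 8                      # arbitrary illustrative core count
      d_in, d_out = 1024, 4096
      rng = np.random.default_rng(0)

      W = rng.standard_normal((d_out, d_in)).astype(np.float32)
      x = rng.standard_normal(d_in).astype(np.float32)

      core_weights = np.array_split(W, n_cores, axis=0)   # "load" weights onto cores

      # Broadcast x, let each core multiply against its local slice, then gather
      # the small partial outputs back together.
      y = np.concatenate([w_local @ x for w_local in core_weights])

      assert np.allclose(y, W @ x, atol=1e-4)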

    • 2 months ago
      Anonymous

      You've described an A100 or newer

  5. 2 months ago
    Anonymous

    Both. The hardware limitation is precisely why AI is a cope. Consider how most normies won't experience hardware more powerful than a heavily subsidized vidya gayming box, and now tell me how the frick every person, their mother, and their dog is going to have AGI sitting in their basement within the next "2 more weeks". Much like raytracing, unless you have a basement full of used supercomputers, you aren't doing shit with it. Even if you have the compute, it doesn't justify the electricity costs in 95.99% of situations. Unless you're Pixar about to release the next blockbuster, you're just pissing away resources on something a single person could animate in their bedroom and achieve 60-80% of the expected outcome. "b-but muh perfect reflections for every single frame at 60fps@4k" cool, nobody cares. Likewise, nobody cares if you can shit out as much propaganda as the CIA with a $750,000-per-day (ChatGPT expenses) server farm.

  6. 2 months ago
    Anonymous

    Jim Keller thinks it's software-related and hardware-dependent.

    Basically he’s making silicon right now that will rape Nvidia.

  7. 2 months ago
    Anonymous

    small in size, big in attitude. i love those dogs.
    > some shit about ai
    nobody cares.

  8. 2 months ago
    Anonymous

    It is one problem, but not the biggest one by any stretch of the imagination. Sure, efficiency could be much better if there were a greater focus on optimization, but the largest issue still is, and will be for a long time, the datasets and their quality, or lack thereof in most cases.
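
    On the dataset point, the unglamorous work is filtering and deduplication. A toy sketch of the idea (exact-hash dedup only, with made-up quality thresholds; real pipelines use fuzzy methods like MinHash and far more careful heuristics):

      import hashlib

      def clean_corpus(docs):
          """Drop exact duplicates and obviously low-quality documents."""
          seen, kept = set(), []
          for doc in docs:
              text = " ".join(doc.split())                  # normalize whitespace
              digest = hashlib.sha256(text.lower().encode()).hexdigest()
              if digest in seen:
                  continue                                  # exact duplicate
              words = text.split()
              if len(words) < 20:                           # too short (arbitrary cutoff)
                  continue
              if len(set(words)) / len(words) < 0.3:        # highly repetitive boilerplate
                  continue
              seen.add(digest)
              kept.append(text)
          return kept

      print(len(clean_corpus(["spam " * 100, "spam " * 100, "a short line"])))  # prints 0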
