Has AMD caught up with Nvidia as far as AI yet?

  1. 7 days ago
    Anonymous

    Hell no, Intel is catching up at a faster rate.

    • 7 days ago
      Anonymous

      Wow really ty

  2. 7 days ago
    Anonymous

    Not at consumer level.

  3. 7 days ago
    Anonymous

    They would if they gave a frick, but sadly they don't. I don't even know why they're sabotaging themselves on purpose.

    • 7 days ago
      Anonymous

      Their whole business practice feels like it was made to make Nvidia look even better. I wouldn't be surprised if they're being paid by them behind closed doors.

    • 7 days ago
      Anonymous

      It's just a weird coincidence, the reasons for which we will never know. Anyone else feeling sleepy?

  4. 7 days ago
    Anonymous

    AMD has been avoiding implementing a lot of features that their workstation cards have, and it's not like they're good at those features anyway; it's insane how little they care. I know they've been having issues committing shit to their open source drivers (because of the Khronos group), but they don't do shit on their closed source drivers either.

  5. 7 days ago
    Anonymous

    Overpriced shit, so no. Frick no.

    • 6 days ago
      Anonymous

      Overpriced compared to what?

  6. 7 days ago
    Anonymous

    AHHAHAHA

  7. 7 days ago
    Anonymous

    The display driver often crashes when running Stable Diffusion on a Linux desktop.

    So no.

  8. 7 days ago
    Anonymous

    An AMD engineer said a while ago that they had no need to focus on AI.

    • 7 days ago
      Anonymous

      Doesn't he know gimmicky chatbots are the future?

    • 7 days ago
      Anonymous

      I think this refers more to training than inference. Nvidia will remain the king of training hardware for decades to come. AMD/Intel/Apple all just want semi-strong inference engines to sell to customers, and the normal GPU architecture is semi-good for that.

    • 7 days ago
      Anonymous

      It's over. Intel's entire GPU venture is for AI. Gaming is a sideshow. No AI = no money.

      • 6 days ago
        Anonymous

        I would not say that as an A770 owner. It's not Nvidia-class in terms of training or inference, but it was certainly better than AMD the last time I tried ROCm a year ago. Intel's Extension for PyTorch is annoying as an additional dependency for some things, but it works really well outside of showstopper bugs, which disrupt workflows but not enough to affect hobbyist use, and if you aren't using the best optimizations available it's going to be a bit slow. But it's the cheapest 16GB card you can get that is well supported, since RDNA2 cards weren't officially supported the last time I checked.
        I know you can use nightly PyTorch for ROCm now, but it was hacks upon hacks to get it running on a Vega 64 at the time, I don't know whether that has improved with the officially supported 7900 XT class cards, and HIP compatibility was poor when I checked. Intel is much better here: all their Arc cards run on the same stack, with no worrying about which card is or isn't supported. ROCm needs a serious rearchitecting; no one should need a feature matrix to find out what is supported. (A rough sketch of the two setups follows below this reply.)

        > The really big expensive models from Google and Microsoft (OpenAI) still suck so why would demand for AI go up? Seems like leftover hype from a year ago driving the market at this point.

        They "suck", but have you seen what Amazon and especially Apple are putting up in comparison? Amazon is at least rumored to be trying to double up at 2 trillion parameters, but it's still a no-show. Apple released OpenELM and got lapped by Microsoft's Phi-3 release and even by its own older models. Demand is only going to go up if they want a piece of the big boys' AI pie.
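
        For reference on the setup difference described in the reply above, here is a minimal, hedged sketch of how the two stacks are commonly selected from Python. It assumes intel-extension-for-pytorch is installed for an Arc card and a ROCm build of PyTorch for a Radeon card; the "xpu"/"cuda" device names and the HSA_OVERRIDE_GFX_VERSION workaround are the commonly documented ones, not something verified in this thread.

        ```python
        # Sketch only: device selection across the Intel (IPEX) and AMD (ROCm) stacks.
        import torch

        def pick_device():
            # Intel Arc: importing IPEX registers the "xpu" backend with PyTorch.
            # This is the "additional dependency" the post above complains about.
            try:
                import intel_extension_for_pytorch  # imported for its side effects
                if torch.xpu.is_available():
                    return torch.device("xpu")
            except ImportError:
                pass
            # AMD: ROCm builds of PyTorch expose HIP devices through the regular
            # "cuda" API. Cards outside the official support matrix reportedly need
            # environment hacks such as HSA_OVERRIDE_GFX_VERSION=10.3.0 (RDNA2),
            # set before launching Python.
            if torch.cuda.is_available():
                return torch.device("cuda")
            return torch.device("cpu")

        device = pick_device()
        x = torch.randn(8, 3, 224, 224, device=device)  # dummy batch to exercise the device
        print(device, float(x.mean()))
        ```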

  9. 7 days ago
    Anonymous

    AMD GPUs use twice as much power as NVIDIA's, and for that reason alone I would stay away from them. Those costs add up over time, and there's a higher risk of heat damage.

    • 7 days ago
      Anonymous

      Hey Rakesh, how did you get hired to spread moronic nonsense on the internet?

  10. 7 days ago
    Anonymous

    The MI300X pretty much has, yeah.

  11. 7 days ago
    Anonymous

    Y'all posting in a pajeet thread

  12. 7 days ago
    Anonymous

    The really big expensive models from Google and Microsoft (OpenAI) still suck so why would demand for AI go up? Seems like leftover hype from a year ago driving the market at this point.
