Do I need an Nvidia GPU for AI voices? I already got cucked out of stable diffusion, flowframes and chatbots...


  1. 11 months ago
    Anonymous

    Actually a good question, I want to know too.
    Probably belongs in one of the many AI generals though

    • 11 months ago
      Anonymous

      https://i.imgur.com/JkGwSaO.png

      >Do I need an Nvidia GPU for AI voices?
      >I already got cucked out of stable diffusion, flowframes and chatbots...

      Depends what you want to do.
      I'm using an integrated GPU on a laptop, but I end up needing to cut the audio into one-minute chunks for conversions to finish.

      As for generations, I don't know.
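
      That chunking step can be scripted. Below is a minimal sketch, assuming plain PCM WAV input and Python's stdlib wave module; the function name, output naming scheme, and 60-second default are all made up for illustration, not from any specific tool:

```python
import wave

def split_wav(path, chunk_seconds=60):
    """Split a PCM WAV file into pieces of at most chunk_seconds each.

    Returns the list of chunk filenames written next to the input file.
    """
    out_paths = []
    with wave.open(path, "rb") as src:
        params = src.getparams()
        frames_per_chunk = params.framerate * chunk_seconds
        index = 0
        while True:
            frames = src.readframes(frames_per_chunk)
            if not frames:
                break
            out_path = f"{path}.part{index:03d}.wav"
            with wave.open(out_path, "wb") as dst:
                # Copy channel count, sample width, and rate; the wave
                # module fixes up the frame count in the header on close.
                dst.setparams(params)
                dst.writeframes(frames)
            out_paths.append(out_path)
            index += 1
    return out_paths
```

      For compressed formats (mp3, ogg) you'd need something like ffmpeg or pydub instead, since wave only handles uncompressed WAV.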

  2. 11 months ago
    Anonymous

    >another thread of a local lintroon that can't into X because of incompatibility
    many such cases

    • 11 months ago
      Anonymous

      I don't use loonix...

      • 11 months ago
        Anonymous

        Then you're even more doomed than normal. AMD only cares about AI in the consumer space to the extent that they can use it for hype to snag scientific computing contracts.

        At least on Linux, if you have one of the few cards that usually works, you can probably install or compile code with ROCm support. Windows still has nothing but ONNX and OpenCL.

    • 11 months ago
      Anonymous

      >dumb wintroony is dumb
      picture me surprised.

  3. 11 months ago
    Anonymous

    I think you can use an AMD GPU with mrq's Tortoise. But I think RVC only supports Nvidia (at least CPU works fast with it).

  4. 11 months ago
    Anonymous

    >chatbots
    No anon, real chads use CPU for chatbots.

  5. 11 months ago
    Anonymous

    Just use colab or whatever.
    No point in buying an expensive new graphics card just for a hobby you'll probably grow out of.

    • 11 months ago
      Anonymous

      >Just use colab or whatever
      y-yeah I love Google IP locking me out of the shit for an entire week after 15 minutes of using it

  6. 11 months ago
    Anonymous

    can you guys post the rentry with the voice samples.

  7. 11 months ago
    Anonymous

    The GPGPU world is only barely starting to take its first baby steps away from CUDA. Just assume that you need an Nvidia GPU for everything unless you are specifically told otherwise.
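
    In PyTorch terms, that "assume you need Nvidia" advice usually comes down to a fallback check like the sketch below. pick_device is a hypothetical helper name; the one real detail relied on is that ROCm builds of PyTorch also report their GPU through the torch.cuda interface:

```python
def pick_device():
    """Return "cuda" if a usable GPU backend is visible, else "cpu".

    Hypothetical helper for illustration. Note that ROCm builds of
    PyTorch masquerade as CUDA, so torch.cuda.is_available() covers
    supported AMD cards too.
    """
    try:
        import torch  # optional dependency; fall back to CPU without it
    except ImportError:
        return "cpu"
    return "cuda" if torch.cuda.is_available() else "cpu"
```

    Most voice tools hardcode the equivalent of this check, which is why an unsupported card silently lands you on the slow CPU path.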

  8. 11 months ago
    Anonymous

    nu

  9. 11 months ago
    Anonymous

    https://vocaroo.com/1o8eAidN8B8v
    Anyway I'm a newbie to this but I'm having a lot of fun with it. Took a while to get anything decent but I think this one sounds pretty good. Could definitely be better if I knew what I was doing though.

    Carl singing Country Roads.

  10. 11 months ago
    Anonymous

    It's your best bet as long as PyTorch and CUDA reign supreme in machine learning.

  11. 11 months ago
    Anonymous

    In general, the best answer right now is yes. Personally I'm seeing a lot of progress outside of Nvidia, with work on Apple's Metal/CoreML, Intel, and AMD, but right now the best bet is still Nvidia: you just can't compare how well everything just works, and they've had market dominance with CUDA for over a decade. No big surprise to see them join the trillion dollar club today.

  12. 11 months ago
    Anonymous

    Stable Diffusion runs fine with AMD cards.

    • 11 months ago
      Anonymous

      On fricking Linux. I'm never touching that shit.
