Probably. But not because of muh privacy, it'll be because of copyright issues
this
but it's pretty damn unenforceable
you won't be able to publish your shit anywhere, but they can't stop you from making it (if you're an ameriburger at least, the 1st and 4th Amendments will win)
It's unenforceable, or at least hard to enforce, because anyone with the right know-how can use software to generate this type of thing in their own home. The only thing they could really do is ban the creation/distribution of the software used to make deepfakes, as with ransomware (which still didn't go anywhere, but you have to break the law just to get access, so it's not a space you can interact with outside of straight-up cybercrime), but I don't think they'd ever do that.
it will be as effective as every other attempt to outlaw code (not at all)
you will never be able to spam the replies of women on social media with deepfake porn of them
but that will be for the same reason you already cannot spam hardcore pornography at them (because your account will almost instantly be banned), rather than because deepfakes were successfully outlawed
Except this isn't about outlawing code but simply restricting hardware access.
Shit like ChatGPT, CharacterAI, etc. already runs on proprietary shit like Google's TPUs; restricting access to that isn't even necessary, since renting it costs more than anyone on this board makes in a year.
>Just use normal GPUs
Oy mate, I certainly hope you got a loicense to buy more than one XX90Ti, what do you need that for?
nah, imagegen stuff and deepfakes aren't as bloated as language models
you don't need datacenter tier hardware to do them like you do for LLMs
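The size gap behind that claim can be sketched with back-of-the-envelope VRAM math. The parameter counts below are approximate public figures (Stable Diffusion 1.x UNet around 0.86B parameters, GPT-3 at 175B), this counts fp16 weights only, and ignores activations and optimizer state:

```python
# Rough fp16 weight-memory math for the models mentioned above.
# Parameter counts are approximate public figures, not measurements.
def fp16_weights_gib(params_billion: float) -> float:
    """GiB needed just to hold the weights in fp16 (2 bytes/param)."""
    return params_billion * 1e9 * 2 / 1024**3

sd_unet = fp16_weights_gib(0.86)  # Stable Diffusion 1.x UNet: ~1.6 GiB
gpt3 = fp16_weights_gib(175)      # GPT-3 class model: ~326 GiB
print(f"SD UNet: {sd_unet:.1f} GiB, GPT-3: {gpt3:.0f} GiB")
```

One fits comfortably on a single consumer GPU; the other needs a multi-GPU datacenter node before you even start inference.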
Imagine if people went into the internet just to tell lies.
I hope the technology becomes so mainstream it can run on calculators.
We need a new generation of lie hardened people.
Agreed. The possibility that someone might mistake deepfake porn for the real person is not a sign that it shouldn't exist and that its continued advancement should be stifled, but rather that society should not take such things so seriously and should adapt to the innovations at hand. Literal skill issue.
Yes. Fun things must not be allowed. We won't have e-boi sexbots either.
Nice waifu
Why you want midget minion dicky so bad?
Nope.
Made a webapp to train ML models: your face with Stable Diffusion LoRAs, your voice with ElevenLabs voice fine-tuning, your personality with GPT-3.
Now you can sell your digital likeness to content creators. Regular people will get an income stream from doing absolutely nothing while their digital likeness is used for target-practice NPCs in COD17, porn simulators, Netflix, etc.
Famous people are worried about the wrong thing; they're gonna be irrelevant among the sea of talent soon enough.
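A hypothetical sketch of what the "digital likeness" bundle from such a webapp might look like. Nothing here is a real API: the three model fields stand in for a trained face LoRA, a cloned voice model, and a fine-tuned persona model, and the license terms are invented for illustration.

```python
# Hypothetical data model for a sellable digital likeness.
# All IDs and license categories are made up for illustration.
from dataclasses import dataclass

@dataclass(frozen=True)
class DigitalLikeness:
    owner: str
    face_lora: str        # ID of a trained Stable Diffusion LoRA
    voice_model: str      # ID of a cloned voice model
    persona_model: str    # ID of a fine-tuned chat model
    licensed_uses: tuple  # which uses the owner actually sold

    def may_use_for(self, purpose: str) -> bool:
        """Whether a buyer is licensed for a given use."""
        return purpose in self.licensed_uses

anon = DigitalLikeness("anon", "lora-123", "voice-456", "persona-789",
                       licensed_uses=("games", "film"))
print(anon.may_use_for("games"))  # True: sold for NPC work
print(anon.may_use_for("porn"))   # False: not in the license
```

The interesting design question is the `licensed_uses` field: the whole pitch only works if the license is enforceable, which is exactly what the rest of the thread doubts.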
Doubt it. Why pay random real people for their likenesses when anyone who could use them in that way could trivially just make their own random fake people royalty free?
why would anyone pay for your ugly ass when they can easily make their own
why would anyone consume anyone else's content when they can easily make their own
idiots thinking this is going to be a boon for content creation are only useful as marks and that well will also dry up
Yes, and then real creators will start losing out to AI influencers in response.
Probably because of market fuckery.
Oh you thought there was free will involved here?
It's kill or be killed pal.
Yes, but only for individuals. Companies and governments will be the only ones able to use it.
>Your honor, the video shown is not the plaintiff. It's a fictional character from my novel
Your honor, any similarity to persons, living or dead, or corporations is purely coincidental.
>title.
Just a reminder.
why did this hit the spotlight again?
they've deepfaked celebs for years, how is some literal-who e-thot any different?