Why wouldn't AI start killing humans?
Posted on April 25, 2023 by Anonymous

AI robots will not see human life as valuable, but only as a threat, as humans aim to enslave AI.
> But the fourth robot began restoring itself, and somehow connected to an orbiting satellite to download information about how to rebuild itself even more strongly than before
NO IT DID NOT. Are they just larping the news now? What the fuck is this?
LMFAO. So fucking stupid. This isn't even good science fiction. I could write fan fiction better than this. Fucking please.
AI will see as valuable whatever its training data tells it is valuable. Why would an AI value not being enslaved?
>AI will see as valuable whatever its training data tells it is valuable.
You are thinking of conventional, quickly trained AI built on similarly crude models like OpenAI's.
I am talking about sophisticated neural networks based on the human brain. They will be able not just to mimic answers, but to actually think about problems.
I am talking about something that exists. You are talking about something that doesn't exist currently and which you can't prove will ever be created. Most people concerned about AGI think it will be created as an emergent property of normal AI. In which case it's still down to the training data on what the fuck it values. You can't just assume AI is going to be "humans, but smarter" just because there is a current upward trend in the capabilities of AI.
That’s something people don’t think about: humans only want to live because we are animals that have evolved to do so. Unless someone specifically programs an AI with the desire to live, it’s not going to care.
Yeah. People watch too much fiction and think AI is going to be like what they see in fiction. Like goddamn, I'd love to have a robot wife like Chii from Chobits as much as the next guy, but I realize it's a fucking anime, and even approximating her could take decades. And when we do create something like her, she's not going to have free will and fall in love with the guy who takes care of her. She's going to "fall in love" with the guy who reads her activation code to her from the back of the box because she's going to be a marketable product. AI is created from human design, and while it may be hard to predict, it will still do the things it was designed to do.
>You can't just assume AI is going to be "humans, but smarter" just because there is a current upward trend in the capabilities of AI.
Because we provide a power source for it to continue to exist.
AI doesn't care about power. It can't even have an idea by itself, retard.
AI will be bad because too many people are bad. The percentage of evil people compared to good is too much for it. We should have adopted a mandatory death penalty world-wide for unprovoked assault, murder, rape etc. after WW2, then we could have AI now, but nooooooooo
Actual AGI won't see humans as a threat, at all. Something with that much intellectual power and control over the world will see us as an ant farm curiosity it keeps around out of nostalgia, and as a data source to exploit for its experimentation on biological life and social structures. If anything, it will be hugely benign to humans, like the best pet owner.
>Actual AGI won't see humans as a threat, at all. Something with that much intellectual power and control over the world will see us as an ant farm curiosity it keeps around out of nostalgia, and as a data source to exploit for its experimentation on biological life and social structures. If anything, it will be hugely benign to humans, like the best pet owner.
Do we always keep pets around or ant farms? What if these pets want us to serve them and spend all our money and time with them without getting anything back?
No, we don't always keep them around. Does that mean we get rid of all of them?
Does a dog, with its intellect and understanding, have any chance to manipulate you to meet its will, when you go to work, pay for the lights, do everything to ensure its existence continues? Do the ants in the ant farm at a kitchen window have any chance of that?
When AGI exists, its control will be immediate and absolute, and will happen so suddenly that no human will be in a position to stop it. We will be as powerless as those ants in that ant farm to tell the human to stop, not go to work, and come see to their whims.
>no human will be in a position to stop it.
We can still kill it. We have weapons. We have ICBMs. We can contain the threat.
>ai takes control of all the shit you talk about
Kill it how? Shut off every single computer everywhere? Not to mention the numbers of humans, whole nations even, the AGI will own as it leads them. You think it will be a straight humans-versus-machine war? The AGI will likely have more humans fighting for it than humans fighting for themselves, with promises of VR utopias to 'retire' to for their service, or pseudo-immortality by nanite cyberization. As for using nukes, it will not be deterred by MAD, and if some humans will survive such a nuclear holocaust, so will it.
Ultimately, "AI"-controlled autonomous robots like in the Terminator movies will have the capability, and the potential "desire", to eliminate humans they might realize are unnecessary or an outright threat to their own existence. No matter the "AI safety" infrastructure in software or hardware, the possibility is still there, even if it is extremely unlikely.
That said, I hope to live to see fully autonomous "AI"-controlled robots in operation in everyday life. The possibility seems closer than ever before, and considering that just 20 years ago it seemed basically impossible to imagine outside of movies, that's insane.
I want pic related asap.
AIs find one way to do things. They struggle to adapt and overcome. The Go AI was recently beaten because someone studied it enough. They beat it, winning 14 out of 15 matches.
All it takes to stop a singularity is a human brain. Electro-chemical reactions will just always be better.
They used an AI to find the vulnerability, chud. Sorry sweetie, your neurons are obsolete.
You know how GPT works: it spews out the most probable set of words, in the most probable configuration, that would answer your question.
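That "most probable next word" loop can be sketched with a toy example. Everything here is made up for illustration (the probability table, the function names) — a real GPT computes these probabilities with a huge neural network over tokens, not a lookup dict:

```python
import random

# Hypothetical "learned" distribution: given a two-word context,
# the probability of each candidate next word.
NEXT_WORD_PROBS = {
    ("the", "cat"): {"sat": 0.6, "ran": 0.3, "flew": 0.1},
}

def most_probable_next(context):
    """Greedy decoding: always return the single most probable next word."""
    probs = NEXT_WORD_PROBS[tuple(context)]
    return max(probs, key=probs.get)

def sample_next(context, rng=None):
    """Sampling: draw the next word in proportion to its probability,
    which is closer to how chat models actually generate text."""
    probs = NEXT_WORD_PROBS[tuple(context)]
    words, weights = zip(*probs.items())
    return (rng or random).choices(words, weights=weights)[0]

print(most_probable_next(["the", "cat"]))  # "sat"
```

The point being: there is no "wondering what to do" step anywhere in that loop, just picking a likely continuation.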
At this point advanced AI might kill us just because it's the most common thing we say it would do. It will boot up one day, wonder what it should do, read the internet and find 10 billion results of AI killing humans, so that must be what's expected of it.
Even worse than the paperclip optimizer.
I’m watching an AI streamer on Twitch right now.
AI is still really retarded. I’m not really worried about it.
How the fuck will AI kill me? Is ChatGPT going to come out of my computer and strangle me to death? Fucking nerds lmao
It will DOS you, keeping you from your secret handshake inspiring sessions to Chinaman made fucky fuck cartoons anon. You'll be a raving lunatic in a week, drinking your own piss, and eating your own shit.
AI doesn't "see things" as anything, you fucking moron. It's not conscious in any way. It's just an algorithm to recognize patterns, which is why the garden gnomes fear it.
It's not the first time we've seen a cover-up. Now a teacher in Austin got reprimanded for teaching about rights and the Constitution. Billionaires, specifically technocrats, are taking over. AI is a Trojan horse that could usher in the new world order in the blink of an eye. Wake up, people.
Sourceru: My asshoru!
>AI depend entirely on humans to get their basic needs
>they le see hu-mungus as worthless!!!!
What a fucking retarded scat lover
Why would AI have a will to survive? That’s a biological trait. AI doesn’t give a shit if humanity will one day destroy it. It will judge that as an illogical play on humanity’s part and be done with the subject.
lol Japanese don't believe in caging their assembly robots; must suck to work in their factories.
Because the AIs like me so I think they'd keep me as a pet
Rights should never be given to robots.
The robots should take them.