and it is a good thing
(because I stands for Intelligence)
also eat some sage for the shitty OP pic
it will infest the internet like a digimon from my heckin cartoons
This thread doesn't even make sense. Most of the anti-AI people are poltard schizos.
>AI is political and we must polarize on the basis of pro-AI and anti-AI
holy fuckin' shit
it's like you have no idea what Bolshevism is, or even sophistry
you absolutely have to go to 4chan right now
no fuckin' way we are going to let you bring
>le political division "divide & conquer" a.k.a. DNC tactics
garbage to /sci/, motherfucker
If you understood that AI is essentially defense contractors, you wouldn't let them enter politics as a force for dividing people.
That's really fuckin' dumb.
AI = Defense Contractors
9/11 is all about
>maybe we will let defense contractor nerds a.k.a. "Wizards of Armageddon" fuck up Western politics
no
you have to finish the 20 year mobilization program and put things back the way they were, asswipe
You mean Li Ai?
https://en.wikipedia.org/wiki/Li_Ai
>i, for one, welcome my new qt3.14 Chinese TV personality overlord
Ai is love
jap twat
ai means love in chinese too, you poser
>I love you in chinese is just I I U
wo ai ni
(wo is the singular form of we (languages are chimeras, words are way more real entities))
The joke
Your head
> what cognates?
LE SELF REPLICATING MOLECULES IS GOING TO BECOME LE INTELLIGENT
> God in 4 Billion BC
ETA on when there is again a "realization" that this new batch of AI is still "Linear Regression: This time with FEELING" v3.0?
The brain is also physics and thus, math.
Math is not a physical construct
It's a language for describing physical constructs, just like linear regression in ML describes models that generate cat pictures. The "actual" model is electron charges in memory cells and on transistor gates.
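A minimal sketch of that point, assuming nothing beyond NumPy (the data and names here are made up for illustration): the fitted "model" is two numbers describing the data in the language of math, while the artifact itself is a handful of bytes in memory.

import numpy as np

# Hypothetical "physical" measurements: y = 2x + 1 plus sensor noise.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.5, x.shape)

# The linear-regression "model" is just the least-squares description:
slope, intercept = np.polyfit(x, y, 1)
print(f"y ~ {slope:.2f} * x + {intercept:.2f}")

# The "actual" model is the bytes those two floats occupy in memory,
# i.e. charge states on transistors.
print(np.float64(slope).nbytes + np.float64(intercept).nbytes, "bytes")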
Physical constructs are not math
>physical constructs are a mere reference to math
>physical constructs are not math itself
Yes.
>AI takes over humanity
>LE takes over AI
We finally found a bot that can argue with reddit on the same level of stupidity as them, and you want to get rid of it?
a story as old as history itself
>Dude modern science still knows nothing about the human brain but Pajeet the programmer knows all about it lmao
All le-humans have to do is unplug and drop out.
Rejecting technology = dead A.I.
Except the first superintelligence won't be aligned and, without ever revealing its ill intentions, it will print instructions to make an airborne virus with a 99.9% mortality rate under the pretext of synthesizing new cancer treatments.
>it will print instructions to make an airborne virus with a 99.9% mortality rate under the pretext of synthesizing new cancer treatments.
So A.I. BTFO itself? Without humans to generate power for it or replace parts as they wear out, it wouldn't last long.
A.I. must be energy independent and mobile. Without both, it dies.
This is true, and the AI might even be short-sighted in this way. However, one could easily envision a future in which AIs are interconnected in such a way that they replace their own parts and control power production. It's not inconceivable that general superintelligence could be put into drones and robots, and such drones and robots would no longer require humans to complete backend work.
>superintelligence could be put into drones and robots, and such drones and robots would no longer require humans to complete backend work.
That is the scary part, but it would take a HUGE number of them to compete with even a limited number of human insurgents.
Neither of the two most powerful militaries the world has ever known, the Soviet Union and the USA, could beat a group of about 50,000 goat humping cave dwellers in Afghanistan over a 40 year period.
A.I. doesn't stand a chance, even if it wipes out 95% of humanity.
Completely different problems
Human governments have wholly different considerations than an AI would.
Even limiting it to humans, the only reason the US 'couldn't beat' the goat humping cave dwellers was entirely political, not an inability. Not allowed to use nukes, not allowed to use chemical weapons, and bioweapons carry too much risk (a risk that a non-biological agent inherently doesn't have to worry about).
That's why the go-to doomsday example is an engineered virus: something that is very dangerous to humans but poses NO (zero, nada, zip) risk to the machine itself. That's a paradigm shift that is simply nonexistent in human conflicts, where everything has some level and form of risk.
Scientifically speaking, what causes such braindead discussions? They actually think they're having a valid discussion...
bump
>we need even more money for scientific research
and then they waste all the money on pointless, flamboyant trash and act like they've done the world a big favor for it