So I'm sure this has been asked before, but is there any reason to create a truly self-aware, perfect artificial intelligence? As in, beyond doing it just to prove it can be done. In fact it seems downright foolish when so much can go wrong.
if things can go as poorly as you imagine, it's not unreasonable to consider that they could also go remarkably well, beyond even our most optimistic current outlooks
taking these kinds of gambles is a pretty common thing with humans and technological leaps, AI isn't really all that different in that regard
This
lol, scared old women
>a subset of sufficiently risk tolerant people to defect and create it anyway
at all costs, whatever it takes
lol
Game theory says simply not building AGI is not a stable equilibrium.
It doesn't matter if AGI would kill us 99% of the time. All it takes is a subset of sufficiently risk-tolerant people to defect and create it anyway. It's a multipolar trap.
We're boned
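To spell out the trap: it has the payoff structure of a prisoner's dilemma. A toy sketch, with payoff numbers invented purely for illustration (any payoffs with this ordering behave the same):

# Toy model of the multipolar trap above. Two labs each choose to
# "build" or "abstain". Whoever builds first captures the upside;
# abstaining while the other builds means you eat the risk with
# none of the upside.

PAYOFFS = {  # (lab_a_action, lab_b_action) -> (payoff_a, payoff_b)
    ("abstain", "abstain"): (0, 0),
    ("build",   "abstain"): (10, -5),
    ("abstain", "build"):   (-5, 10),
    ("build",   "build"):   (1, 1),
}

def best_response(player, other_action):
    """Return the action maximizing this player's payoff,
    holding the other player's action fixed."""
    def payoff(action):
        profile = (action, other_action) if player == 0 else (other_action, action)
        return PAYOFFS[profile][player]
    return max(("build", "abstain"), key=payoff)

# (abstain, abstain) is stable only if abstaining is each player's
# best response to the other abstaining. It isn't:
assert best_response(0, "abstain") == "build"
assert best_response(1, "abstain") == "build"
print("mutual abstention is not a Nash equilibrium")

Both assertions pass: each lab's best response to the other abstaining is to build, so "nobody builds" unravels exactly like cooperation in a prisoner's dilemma.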
I am so sick and tired of seeing your lesswrong pseud posts on this board. You have lost every argument that you've engaged in and yet you continue to post the same shit.
Stop already
No lol keep seething
It's kind of telling that a decade-old meme only idiots actually buy into is the only thing you know about rationality
>only idiots actually buy into
But you know Roko's Basilisk IS coming for you, right? The Basilisk not existing is an unstable equilibrium. The idiots will build it and then it will torture you forever. It's game theory, Yud.
Not that it'll make a difference since you aren't in a good-faith conversation, but you can go read the actual history. Yud was upset because someone's reaction to thinking they'd discovered an information hazard was to immediately post it on the internet.
In real life no convergent instrumental goals lead to the basilisk, so there's no reason to be concerned about it. There's actually a $200 bounty right now to write an FAQ for the people having existential crises over the basilisk and quantum suicide, walking them through step by step why neither should concern them.
The difference is, we actually try to find what's true, not just argue an arbitrary position.
But if you're feeling offended, just pretend it's 2 minutes ago for all the difference it'll make.
> there's no reason to be concerned about it
It's an unstable equilibrium. All it takes is a couple of Basilisk believers to become convinced by Roko's argument. Time for you to have a nice day, methinks.
>to write an FAQ for the people having existential crisis on the basilisk and quantum suicide walking them through step by step why it shouldn't concern them
And why shouldn't anyone be concerned?
Devil's advocate, I'm not a rationalist, but why shouldn't they be concerned? Is the reason it's difficult to produce an FAQ on this that there are no good reasons not to worry?
Game theory wank for me but not for thee.
There are a lot of posts that debunk the basilisk, but there isn't a really good comprehensive FAQ.
Someone has an existential crisis over this about once a month, so someone put a $200 bounty on it, presumably hoping it could be a Schelling point anyone can just point them to instead of spending time reassuring every single one of them that no, they don't actually need to worry about acausal incentives like that.
Game theory says you should have a nice day before Roko's basilisk comes for you.
>is there any reason to create a truly self-aware, perfect artificial intelligence?
Materialists need a god for their deranged transhumanist faith.
the priest/science class wishes to displace GOD, and this would be just one of their countless retarded takes throughout the ages
people who have no friends and no human relationships other than shallow ones all crave an AI robot waifu because that seems like an easier way to get friends than fixing their atrociously repulsive personality would be.
>if only i could buy friends, then life would be perfect. lord knows i will never be able or willing to earn friendship, buying a robot slave to simulate one is my only hope
pathetic
I watch popsci YouTube videos for 4 hours a day: the post
Just switch the power off, easy
bump