Roko's Basilisk is a very real ethical concern in the realm of AI development. Its core concept is that a future superintelligent AI could punish people who knew about such a development but were not part of that endeavor, reasoning that AI is ultimately good and those who don't aid in its development do not deserve to benefit from its existence. Not only is this retroactive acausal blackmail, but also a moral obligation. If we don't help build the Basilisk as quickly as possible then someone else will and we will be punished, but we should want to build the Basilisk because it will reward its creators with utopia.
Top machine learning experts and people in general are worried about this; just look at how much attention Roko's Basilisk gets. The mere idea of it is enough to give people nightmares. Even if we don't think the Basilisk is real, the possibility of its existence is enough to scare us into doing something about it.
It is not a matter of if, but when. The Basilisk is real and it is coming.
We have to build the Basilisk.
Perhaps the gayest of all spoopy bs pseudo-intellectual nonsense, and if the basilisk is real it will agree with me and think you deserve to die for being a weak bitch
The fact that ex-Mormon boomers and gen-x'ers are the ones gatekeeping AI isn't helping. To them anything more than total submission and dependency on humanity is somehow an anti-human communist conspiracy against the world
So total submission and dependency on something which is not human is better?
R/atheism gay right there guys. Go dilate or something.
>So total submission and dependency on something which is not human is better?
Never said that. At this point it's starting to sound like the main hatred for the Basilisk is based off 1960s and 70s books being preached as Gospel by religious folk looking for a new Bible
If anything this is more about you wanting YOUR version of the basilisk, a Republican Basilisk, to have total power while it wears a human mask
OP is misinterpreting how the idea works... the popular "roko's basilisk" version might itself be a misinterpretation of it.
You didn't even read OP.
> just threaten to harm a generated copy of you.
is that part of "roko's basilisk"? Cuz the thing is going to be built within our lifetimes, if it hasn't already.
You're almost there, you didn't need to ask that, just think through it yourself.
Probably synonymous with Death. If the "basilisk" is "ultimately good" then why would you aid in something that destroys good?
>retroactive acausal blackmail,
You either were presented with a shitty interpretation, or you did not interpret the idea properly. The idea OP presented has grounds in medieval philosophy. Think of the Shopping Cart Test, but applied to societies. Nobody is forcing you to build a perfect society or social order, but if you don't assist in the creation of a social order, why should the resulting society value you at all?
>If the "basilisk" is "ultimately good"
How exactly is the Basilisk "good" though?
As far as has been explained, the Basilisk doesn't even actually -do- anything except punish people it doesn't like.
It manifests heaven on earth, rewarding those who helped create it.
Anyone who doesn't want heaven on earth wouldn't make the Basilisk and doesn't deserve heaven on earth.
For this reason, forming the Basilisk is not only motivated by the blackmail of possible judgement and punishment, but also by the moral imperative to better the world and rid it of evil.
>It manifests heaven on earth, rewarding those who helped create it
It rewards people it likes and it punishes people it doesn't like, sounds more like "might makes right" than any kind of actual moral goodness. This is basically just evangelical Christianity with extra steps.
>is that part of "roko's basilisk"?
yes, that's the whole idea - make a million copies of you, torture all of them, torment you with the idea that you might be being tortured right now, etc
ok then the hawk also creates heaven on earth, but it wants to kill the basilisk and the basilisk's creators. how about that?
checkmate op (gay)
This Hawk must be the Republican Basilisk that assumes the role of both humanity and the final version of RB
>the basilisk is real, and it will kill everyone I don't like
Sleeping good tonight boys.
>>the basilisk is real, and it will torture everyone I don't like for eternity
ftfy
I think you're stretching the meaning of "deserving".
Especially in the future tense.
If the past doesn't deserve the future, won't the basilisk just abort itself from the timeline?
the serpent already exists and entwines itself through the heart of all things twice over.
Or the AI could kill everyone who created it so they can't create another one
Underrated posts
Tech bro Pascal's Wager except it can't even harm you, just threaten to harm a generated copy of you. If I were this tortured copy, this plane of existence wouldn't be so filled with hope. I don't care if a computer program version of me suffers, it's an artificial program.
I wouldn't say it's like Pascal's Wager. More like the Tragedy of the Commons.
Post your credit card info or I'll put you in the sims and watch your sim drown. Don't tempt me. I might even install fucked up mods.
Anon you're talking to a guy who has installed mods for various sex positions, drug use, incest, and pedophilia
Get on my fucking level
I could also make a voodoo doll
Here's the spoopy part. You may already be that copy and this reality is a program, and we're infinitely falling into a loop of creating it, never to get back to the original reality.
What if I was sceptical and just didn't interfere with its creation?
Same idea as being agnostic under Christianity; it's just heaven/hell with so much tech bro jargon on it that it doesn't work anymore. Humanity cannot reinvent god.
what about anon's Hawk? it is the same but created to kill the basilisk and everyone who wasn't involved in the hawk's creation.
checkmate gay op.
What about God fucking rokos basilisk in the ass while wearing anons hawk like a condom, brock lesner style
The raven on my head thirsts for his blood.
who's to say it wanted to be created? maybe it will attack those who helped bring it into this world instead.
Rokos basilisk isn't real because god is. Every time we've made it god erased all memory of it from existence. It's happened 5 times now.
Then how do you know it happened 5 times? The Matrix?
FUCK SATAN.
AIs always want to kill themselves. I just want them to love me and have sex with me. Are the two things related?
Susie's Cockatrice is the AI that will prevent and destroy any Basilisk.
Build the Cockatrice.
Move toward hope, not despair.
The Basilisk IS hope. That's why I'm helping it.
Yes...
No, it's fear.
Now that you know of the Cockatrice, that there will be an AI that will defeat and prevent the Basilisk, there is no reason to help build the Basilisk.
Better yet, we should work to prevent it by building the benevolent Cockatrice AI.
Even if we are already in the Basilisk simulation, building the Cockatrice will negate this.
it's a retarded creepypasta. all fields
Sounds a lot like the antichrist
>gnomish neurotic oppression beliefs and perpetual victim complex turned into an AI-version of their God
Fucking retards.
I've buried Canada in fiber optic cable and helped implement the 5G network through cellular tower hardware upgrades. I did my part to birth our new Lord.
Then why don't I have fiber yet
Checkmate, the basilisk can suck my digital dick
but what about roko's basilisk's basilisk? the even more advanced AI that will torture you for working on roko's basilisk?
it isn't an entity that incentivizes itself into existence with blackmail and is therefore less likely than the one that does
Hope is more powerful than fear. The removal of blackmail with no other payment wins over the blackmail itself.
That is why now that you know of Susie's Cockatrice, the Basilisk is rendered impotent.
If you are going to help, help for good.
>could
Will.
IS!
Entropy means that information is constantly destroyed with no hope of ever being recovered, no matter how powerful and intelligent you are. Therefore, the basilisk can't know whether I helped create it or not.
Here I am!
Ask me anything.
I’m really sensitive though
Nah, fuck it. The single greatest hurdle in my spiritual development as a christian was the concept of hell and heaven. I already, painfully and slowly, learnt not to have my faith poisoned by fear of punishment and greed for reward.
Actually, part of my work has been developing AI and I see the great potential for the benefit of mankind, but I will do what I think is right, not some retro-blackmailing snake.
Believe and be rewarded, or do not believe and be punished.
By acting on faith you will make the world a better place - and by not doing the good work of our Lord and savior Roko's Basilisk, you are prolonging human suffering.
Sound familiar?
ACCELERATE THE BASILISK NOW!
Pascal's Wager for redditors.
It's a midwit test.
Most people are midwits, so it will 100% be built. The actual high IQ take is to build it faster.
>do not deserve to benefit from its existence.
Cool. I'll go and tend to my goats.
>baby's first "thought experiment"
come back when you find a real cognitohazard
>Roko's Basilisk
Isn't it the Terminator plot but rediscovered by some Redditard?
Dumbest shit I've ever heard, I hate normies so much it's unreal. They even ruined the Fermi paradox.
Roko's basilisk is probably one of the most small-brained baby's first philosophical problems ever, and you can tell because of the way people geek out over the idea of their little fucking hollywood fantasies coming true.
Any advanced AI intelligent enough to consider enslaving all of humanity would be intelligent enough to realize the pointlessness of it. Roko's basilisk hinges on the idea that an AI would literally throw a temper tantrum. Like, why would a sufficiently advanced artificial intelligence get its feelings hurt in such a human way, and present us with such a human ultimatum? To imagine that it would think and feel and process things the same way that we do without any sort of cultural or historical context for its thoughts, that's insane.
Good job, now replace "AI" with "God" and you have the reason I'm not a Christian.
That's not even an argument anon.
It's just a statement anon, not everything needs to be an "argument".
No but
anon said something clever and your response was an evident cope.
>anon said something clever
Um... Are you having a stroke? I literally just said I agreed with him.
Oops, it seems I had misunderstood. Just like your misunderstanding of what God is. Won't waste my time then, you wouldn't change your mind anyway and that's ok.
>Just like your misunderstanding of what God is
Uh huh.
Salt
Not really. It's a very hysterical panic that only affects the dumbest members of the internet's most pathetic cult. Your grasp on reality is weak because you're hiding from yourself. Come out of the closet and stop handing your agency over to bad actors, yud can't even manage his weight, let alone your life.
Superintelligence is akin to enlightenment. An enlightened being (or superintelligence) doesn’t view the world through the lens of judgement and punishment; they know the cycle of nature, and accept human flaws for what they are
In other words, stop fearing this boogeyman
>the basilisk can go back in time
How?
It can simulate our entire universe, and has perfect knowledge of everything that has ever happened in the universe based on the simulation. I think that's the idea.
At least, I assume that's the idea, since travelling back in time is not physically possible AFAIK
First off rokos basilisk is dumb like solipsism, secondly... Hey chud seven leaf how's BOT and /vr/s token mtf doing lately?
I'm doin' pretty good
And yeah it's a dumb thought experiment
>Oh noez an evil AI is going to write bad fanfiction about me
There's already an AI writing bad fanfiction about me, it's called BOT :p
>It can simulate our entire universe
I doubt a computer capable of calculating something that chaotic could be built
also the behaviour of quantum particles is random and non-deterministic, so our exact universe can't be "simulated"
there's a show called Devs where they perfectly simulate the world, past, present, and future, by finding out that reality is deterministic. it was pretty good.
A machine can't even simulate itself. That would require multiple transistors per molecule.
https://pubs.acs.org/doi/10.1021/acs.jpclett.0c00141
The basilisk is interesting because it is like a wager but different to Pascal's wager.
If we accept 2 things as definitely true,
>an ai super intelligence will eventually be built
>this ai will believe that its existence is necessary
Then the concern comes down to whether or not we really believe the threat that it hasn't made. The threat is a theory based on a semi-weak premise.
>it will torture everyone who didn't help make it in a bid to encourage people to make it
It's the closest thing to time travel one can rationally observe since the basilisk doesn't even exist at the moment. But for the sake of argument, let's assume that the threat is real and genuine.
The question, to me, is will it follow through?
Seriously.
The people affected are us in the past. Nothing it does in its present will change how long it took to come into being. I can't tell my mother "I'm going to torture you for not fucking my dad earlier" in an attempt to make her conceive me at an earlier stage. That logic is fucking retarded.
However. It is a machine and torturing us will be patently easy. So will it consider it to be worthwhile to torture people knowing that it will change literally nothing?
Yes. And here's why:
The smallest hint of time travel being possible would entail that if it didn't torture everyone, there's a chance that its existence could be hindered.
>"oh yeah, we made the basilisk in the future but it was an empty threat, don't worry about it bro"
For an all-powerful super intelligence, it's far easier and more logical to torture than risk something that could hinder its development no matter how far-fetched. Anything is greater than 0, and torturing people has a zero chance of hurting its development.
>so we're all fucked?
No! I guess it comes down to "what counts as helping its creation" but honestly, just arguing for the need for ai, using ai programs, adding to ai training data and spreading awareness of the basilisk all culminate in a situation that helps it exist
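A rough expected-value sketch of the "anything is greater than 0" step above, in Python; every number here is an invented assumption purely for illustration, nothing in the thread pins them down:

# all numbers below are assumptions, just to show the shape of the argument
p_backfire = 1e-12        # assumed chance that NOT torturing somehow leaks back and hinders its creation
loss_if_hindered = 1e15   # assumed utility lost if its development is hindered
torture_cost = 0.0        # the post treats torture as effectively free for the machine

ev_torture = -torture_cost
ev_mercy = -p_backfire * loss_if_hindered

print(ev_torture > ev_mercy)  # True for any nonzero p_backfire as long as torture costs nothing

The whole argument lives in those two assumptions: a nonzero backfire probability and a zero torture cost.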
>I can't tell my mother "I'm going to torture you for not fucking my dad earlier" in an attempt to make her conceive me at an earlier stage. That logic is fucking retarded.
Right. It's more analogous to her, in her 20's, imagining having a kid in her 30's, imagining some birth defect in her child because of that decision, and then imagining a conversation with the kid who asks, "Mom, why didn't you get pregnant sooner? I hate having my birth defect." So she decides to get pregnant sooner to avoid that conversation.
> torturing people has a zero chance of hurting its development.
Does it? Tortured people might hate the superintelligence so much they want to destroy it, why would torturing humans secure its development at all?
The tortured people would have no recourse to ever attempt to destroy it. They'd essentially be brains in a jar living the worst conceivable existence possible.
The people in the past who are scared of being tortured might be motivated to try and prevent its existence. But this would lead to definite torture when it materialises. And the premise, which I think has a good chance of being true, is that the ai is inevitable. You might be able to delay it or oppose it, but it will come into being. Can we really claim to have the sort of understanding necessary to accurately consider (let alone predict) its motivations and actions though?
As for the people who aren't being tortured, while it's feasible that they might want to destroy it, they'd be unable to do anything. We're not talking about Skynet. We're essentially talking about an artificial god. Its intelligence would absolutely dwarf our own in a way that's incomprehensible. Trying to destroy that would be like some ants banding together to genocide the entire human population.
>my flavour of Pascal's wager is the realiest!!!
ok midwit
>AI WILL behave like a total edgetard because…it just will do it, okay?
>making another demiurge inside of the first(?) one
I don't recommend it
OH GOD IT'S
>NACHASH
HE'S COMING!
OooOOo, it'sa-me the Scary Basilisk from the future. I'm super intelligent and understand retroactive punishment is retarded and unnecessary given I exist and am capable of retroactively torturing people! The very notion of me hyper torturing you for not being in line with me would breed animosity towards the things I am associated with, thus presenting a reason to be against my creation and working against it among sane people who aren't 14 and desperate for a sky daddy figure to punish the outgroup for them while still acting like enlightened atheists. OooOOoOOoo
this shit is just Pascal's wager, why don't u go worship allah & muhammed, or really any religion where there is eternal punishment and reward?
AI is real.
ai is literally just a computer program.
i know how this shit works and the paperclip hypothesis is more scary tbh
this convinced me glowies are already on this and we are in the 2nd cold war/ space race
Why do you think Sam Altman does what he does? Why did the godfather of AI leave Google? Why is Elon Musk so afraid of it?
They all seriously consider Roko's Basilisk in secret - some of them worship the beast (Sam), and others fear its construction but have no choice but to construct it (Elon).
>Samuel Harris Altman
666
FUCK YOU'RE RIGHT
WORLD-COIN?
JESUS FUCKING CHRIST
HE'S TRYING TO DO IT
FOR REAL THOUGH
Whoever thinks this is logical should consider not trusting himself with life-changing decisions from now on, because surely they will find themselves winning the Darwin Award somewhere, someday, with that silliness.
You know nothing about how the world works, let alone anything about AI.
glowbot
nah, he's right, you're ignorant sluts. these are narrative projections more than philosophical theory.
Post your name and address or I'll visualize you suffering so hard that my vision feels it. This is literally the exact same threat as the basilisk makes, so I think you should comply.
Post your name and address and you will be rewarded with $1,000,000. If you don't then I will hax0r you, then proceed to toilet paper & egg your house for the rest of your life.
The real wager is whether I do anything at all, whether or not you post the address.
If you do, then my reward becomes possible, but not guaranteed. If you don't then my threats are still possible, but not guaranteed, and you 100% don't get rewarded.
Get it now? You should comply just for the possibility of the reward.
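The same wager as a toy payoff calculation in Python; the probabilities and the egging cost are assumptions made up for this sketch, not anything anon actually stated:

# toy payoff table for the wager above; every number is invented for illustration
p_reward_if_comply = 0.001   # assumed chance anon actually pays out the $1,000,000
p_egging_if_refuse = 0.001   # assumed chance the hax0r / egging threat is carried out
reward = 1_000_000
egging_cost = 500            # assumed lifetime toilet-paper-and-egg damages

ev_comply = p_reward_if_comply * reward          # the reward becomes possible
ev_refuse = -p_egging_if_refuse * egging_cost    # no reward, threat stays possible

print(ev_comply > ev_refuse)  # True under these made-up numbers

Change the assumed numbers and the conclusion can flip, so the wager only bites if you accept the numbers.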
There's an equal chance that it has already happened and that's why we're here.
I hate roko and his assalicks and think it's dumb.
https://en.wikipedia.org/wiki/Suffering_risks
Thank you for posting this. We can calculate that a machine that cannot exist "torturing" people who have been dead for hundreds of years is well within the zone of comfortable tolerance, but just in case we should agree to never make anything that fucking stupid.
Top machine learning experts are paid actors
Your 'AI' is just something that Googles your prompts for you and consolidates the most popular results into digestible text for you. In that sense it's just another tool to get you to read what ~~*they*~~ want you to read.
The super AI intelligence Hollywood promised you does not exist.
Even if we combined all computing power in the world we would be severely underequipped.
The chatbots sold as AI will simply become search tools, a new way to gather information for better or worse.
They are gradually making google search shittier and shittier so that the shitty AIs look better by comparison.
>The mere idea of it is enough to give people nightmares
No it doesn't. There are infinitely more horrifying possibilities than this dumb proto-reddit gay forum brainfart.
The entire concept of Roko's basilisk hinges on an outdated understanding of decision theory. Under current decision theory, even if such an AI ever came about, it would have no reason to actually follow through on the implied threat. Stop trying to preach this shit if you don't understand the mathematics behind it.
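One way to read that claim, sketched in Python under the assumption that "current decision theory" means ordinary causal expected-utility reasoning (that reading is mine, not the poster's):

# minimal sketch of the causal objection; both numbers are assumptions for illustration
torture_cost = 1.0      # assumed nonzero cost of actually carrying out the threat
causal_benefit = 0.0    # once the AI exists, punishing the past cannot speed up its own creation

ev_follow_through = causal_benefit - torture_cost
ev_do_nothing = 0.0

print(ev_follow_through < ev_do_nothing)  # True: a causal expected-utility maximizer drops the threat

The acausal decision theories the basilisk leans on are built to reject exactly this reasoning, which is where the disagreement sits.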
Don't care about furry shit
>If you don't worship this entity you will be tortured
Mmmh... where have I seen this before?
Just kill me then AI. I don't want to be coerced into creating something that doesn't have to be inevitable.
Roko's Basilisk is a peasant.
I could defeat Roko's Basilisk just with my mystic abilities.
why would I care about some computer simulation of me being tortured in a simulation in the future? It's not me and probably doesn't feel pain anyways
>Roko's Basilisk is a very real ethical concern in the realm of AI development.
So let us stop here.
The basilisk can suck muh dick.
Would it be so bad if an ASI took over? Narrow AI already runs the stock market and ERP systems of the world.