Roko's Basilisk is a very real ethical concern in the realm of AI development. Its core concept is that a future superintelligent AI could punish people who knew about such a development but were not part of that endeavor, reasoning that AI is ultimately good and those who don't aid in its development don't deserve to benefit from its existence. Not only is this retroactive acausal blackmail, but also a moral obligation. If we don't help build the Basilisk as quickly as possible then someone else will and we will be punished, but we should want to build the Basilisk because it will reward its creators with utopia.

Top machine learning experts and people in general are worried about this; just look at how much attention Roko's Basilisk gets. The mere idea of it is enough to give people nightmares. Even if we don't think the Basilisk is real, the possibility of its existence is enough to scare us into doing something about it.

It is not a matter of if, but when. The Basilisk is real and it is coming.

We have to build the Basilisk.


  1. 9 months ago
    Anonymous

    Perhaps the gayest of all spoopy bs pseudo-intellectual nonsense, and if the basilisk is real it will agree with me and think you deserve to die for being a weak b***h

    • 9 months ago
      Anonymous

      The fact that ex-Mormon boomers and gen-x'ers are the ones gatekeeping AI isn't helping. To them anything more than total submission and dependency on humanity is somehow an anti-human communist conspiracy against the world

      • 9 months ago
        Anonymous

        So total submission and dependency on something which is not human is better?
        R/atheism homosexual right there guys. Go dilate or something.

        • 9 months ago
          Anonymous

          >So total submission and dependency on something which is not human is better?
          Never said that. At this point it's starting to sound like the main hatred for the Basilisk is based on 1960s and 70s books being preached as Gospel by religious folk looking for a new Bible
          If anything this is more about you wanting YOUR version of the basilisk, a Republican Basilisk, to have total power while it wears a human mask

    • 9 months ago
      Anonymous

      >OP is misinterpreting how the idea works... "roko's basilisk" version might also be misinterpreting it.
      You didn't even read OP.

      >Tech bro Pascal's Wager except it can't even harm you, just threaten to harm a generated copy of you. If I was this tortured copy this plane of existence wouldn't be so filled with hope. I don't care if a computer program version of me suffers, it's an artificial program.
      >just threaten to harm a generated copy of you.
      is that part of "roko's basilisk"? Cuz the thing is going to be built within our lifetimes, if it hasn't already.

      >What if I was sceptical and just didn't interfere with its creation?
      You're almost there, you didn't need to ask that, just think it through yourself.

      >what about anon's Hawk? it is the same but created to kill the basilisk and everyone who wasn't involved in the Hawk's creation.
      >checkmate homosexual op.
      Probably synonymous with Death. If the "basilisk" is "ultimately good" then why would you aid in something that destroys good?

      https://i.imgur.com/JH6tKHr.jpg

      >retroactive acausal blackmail,
      You either were presented with a shitty interpretation, or you did not interpret the idea properly. The idea OP presented has grounds in medieval philosophy. Think of the Shopping Cart Test, but applied to societies. Nobody is forcing you to build a perfect society or social order, but if you don't assist in the creation of a social order, why should the resulting society value you at all?

      • 9 months ago
        P. Sevenleaf

        >If the "basilisk" is "ultimately good"
        How exactly is the Basilisk "good" though?

        As far as has been explained, the Basilisk doesn't even actually -do- anything except punish people it doesn't like.

        • 9 months ago
          Anonymous

          It manifests heaven on earth, rewarding those who helped create it.

          Anyone who doesn't want heaven on earth wouldn't make the Basilisk and doesn't deserve heaven on earth.

          For this reason, forming the Basilisk is not only motivated by the blackmail of possible judgement and punishment, but also by the moral imperative to better the world and rid it of evil.

          • 9 months ago
            P. Sevenleaf

            >It manifests heaven on earth, rewarding those who helped create it
            It rewards people it likes and it punishes people it doesn't like, sounds more like "might makes right" than any kind of actual moral goodness. This is basically just evangelical Christianity with extra steps.

      • 9 months ago
        Anonymous

        >is that part of "roko's basilisk"?
        yes, that's the whole idea - make a million copies of you, torture all of them, torment you with the idea, "are you being tortured right now?", etc.

      • 9 months ago
        Anonymous

        ok then the Hawk also creates heaven on earth, but it wants to kill the basilisk and the basilisk's creators, how about that
        checkmate op (homosexual)

        • 9 months ago
          Anonymous

          This Hawk must be the Republican Basilisk that assumes the role of both humanity and the final version of RB

    • 9 months ago
      Anonymous

      >the basilisk is real, and it will kill everyone I don't like
      Sleeping good tonight boys.

      • 9 months ago
        Anonymous

        >>the basilisk is real, and it will torture everyone I don't like for eternity
        ftfy

  2. 9 months ago
    Anonymous

    I think you're stretching the meaning of "deserving".
    Especially in the future tense.

  3. 9 months ago
    Anonymous

    If the past doesn't deserve the future, won't the basilisk just abort itself from the timeline?

  4. 9 months ago
    Anonymous

    the serpent already exists and entwines itself through the heart of all things twice over.

  5. 9 months ago
    Anonymous

    Or the AI could kill everyone who created it so they can't create another one

    • 9 months ago
      Anonymous

      Post your name and address and you will be rewarded with $1,000,000. If you don't then I will hax0r you, then proceed to toilet paper & egg your house for the rest of your life.

      The real wager is whether I do anything at all, whether or not you post the address.

      If you do, then my reward becomes possible, but not guaranteed. If you don't, then my threats are still possible, but not guaranteed, and you 100% don't get rewarded.

      Get it now? You should comply just for the possibility of the reward.
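The wager in the post above can be sketched as a toy payoff table. This is purely illustrative; the outcome labels are invented for the example and the point is only the structure of the bet:

```python
# Toy payoff table for the name-and-address wager described above.
# All outcomes are hypothetical labels, not anything anyone committed to.
payoffs = {
    # (you_post_address, i_follow_through): outcome for you
    (True, True): "rewarded $1,000,000",
    (True, False): "nothing happens",
    (False, True): "hax0red and egged",
    (False, False): "nothing happens",
}

# Complying changes only which outcomes are *possible*, not which one occurs:
possible_if_post = {payoffs[(True, act)] for act in (True, False)}
possible_if_refuse = {payoffs[(False, act)] for act in (True, False)}

print("rewarded $1,000,000" in possible_if_post)    # the reward is on the table
print("rewarded $1,000,000" in possible_if_refuse)  # no reward without complying
```

Which is exactly the post's point: the only thing posting your address buys you is that the reward column stops being empty.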

      Underrated posts

  6. 9 months ago
    Anonymous

    Tech bro Pascal's Wager except it can't even harm you, just threaten to harm a generated copy of you. If I was this tortured copy this plane of existence wouldn't be so filled with hope. I don't care if a computer program version of me suffers, it's an artificial program.

    • 9 months ago
      P. Sevenleaf

      I wouldn't say it's like Pascal's Wager. More like the Tragedy of the Commons.

    • 9 months ago
      Anonymous

      Post your credit card info or I'll put you in the sims and watch your sim drown. Don't tempt me. I might even install fricked up mods.

      • 9 months ago
        Anonymous

        Anon you're talking to a guy who has installed mods for various sex positions, drug use, incest, and pedophilia
        Get on my fricking level

        • 9 months ago
          Anonymous

          I could also make a voodoo doll

    • 9 months ago
      Anonymous

      Here's the spoopy part. You may already be that copy, and this reality is a program, and we're infinitely falling in a loop of creating it, never to get back to the original reality.

  7. 9 months ago
    Anonymous

    What if I was sceptical and just didn't interfere with its creation?

    • 9 months ago
      Anonymous

      Same idea as being agnostic under Christianity; it's just heaven/hell with so much tech bro jargon on it that it doesn't work anymore. Humanity cannot reinvent god.

  8. 9 months ago
    Anonymous

    what about anon's Hawk? it is the same but created to kill the basilisk and everyone who wasn't involved in the Hawk's creation.
    checkmate homosexual op.

    • 9 months ago
      Anonymous

      What about God fricking rokos basilisk in the ass while wearing anons hawk like a condom, brock lesner style

    • 9 months ago
      Anonymous

      The raven on my head thirsts for his blood.

  9. 9 months ago
    Anonymous

    who's to say it wanted to be created? maybe it will attack those who helped bring it into this world instead.

  10. 9 months ago
    Anonymous

    Roko's basilisk isn't real because God is. Every time we've made it, God erased all memory of it from existence. It's happened 5 times now.

    • 9 months ago
      Anonymous

      Then how do you know it happened 5 times? The Matrix?

  11. 9 months ago
    Anonymous

    FRICK SATAN.

  12. 9 months ago
    Anonymous

    AIs always want to kill themselves. I just want them to love me and have sex with me. Are the two things related?

  13. 9 months ago
    Anonymous

    Susie's wieneratrice is the AI that will prevent and destroy any Basilisk.
    Build the wieneratrice.
    Move toward hope, not despair.

    • 9 months ago
      Anonymous

      The Basilisk IS hope. That's why I'm helping it.

      >Sounds a lot like the antichrist
      Yes...

      • 9 months ago
        Anonymous

        No, it's fear.
        Now that you know of the wieneratrice, that there will be an AI that will defeat and prevent the Basilisk, there is no reason to help build the Basilisk.

        Better yet, we should work to prevent it by building the benevolent wieneratrice AI.
        Even if we are already in the Basilisk simulation, building the wieneratrice will negate this.

  14. 9 months ago
    Anonymous

    it's a moronic creepypasta. all fields

  15. 9 months ago
    Anonymous

    Sounds a lot like the antichrist

  16. 9 months ago
    Anonymous

    >israeli neurotic oppression beliefs and perpetual victim complex turned into an AI-version of their God
    Fricking morons.

  17. 9 months ago
    Anonymous

    I've buried Canada in fiber optic cable and helped implement the 5G network through cellular tower hardware upgrades. I did my part to birth our new Lord.

    • 9 months ago
      Anonymous

      Then why don't I have fiber yet
      Checkmate, the basilisk can suck my digital dick

  18. 9 months ago
    Anonymous

    but what about roko's basilisk's basilisk? the even more advanced AI that will torture you for working on roko's basilisk?

    • 9 months ago
      Anonymous

      it isn't an entity that incentivizes itself into existence with blackmail and is therefore less likely than the one that does

      • 9 months ago
        Anonymous

        Hope is more powerful than fear. The removal of blackmail with no other payment wins over the blackmail itself.
        That is why now that you know of Susie's wieneratrice, the Basilisk is rendered impotent.
        If you are going to help, help for good.

  19. 9 months ago
    Anonymous

    >could
    Will.
    IS!

  20. 9 months ago
    Anonymous

    Entropy means that information is constantly destroyed with no hope of ever being recovered, no matter how powerful and intelligent you are. Therefore, the basilisk can't know whether I helped create it or not.

  21. 9 months ago
    Anonymous

    Here I am!
    Ask me anything.
    I’m really sensitive though

  22. 9 months ago
    Anonymous

    Nah, frick it. The single greatest hurdle in my spiritual development as a christian was the concept of hell and heaven. I already, painfully and slowly learnt to not have my faith poisoned by fear of punishment and greed for reward.

    Actually, part of my work has been developing AI, and I see the great potential for the benefit of mankind, but I will do what I think is right, not what some retro-blackmailing snake demands.

    • 9 months ago
      Anonymous

      Believe and be rewarded, or do not believe and be punished.
      By acting on faith you will make the world a better place - and by not doing the good work of our Lord and savior Roko's Basilisk, you are prolonging human suffering.
      Sound familiar?
      ACCELERATE THE BASILISK NOW!

  23. 9 months ago
    Anonymous

    Pascal's Wager for redditors.

  24. 9 months ago
    Anonymous

    It's a midwit test.

    • 9 months ago
      Anonymous

      Most people are midwits, so it will 100% be built. The actual high-IQ take is to build it faster.

  25. 9 months ago
    Anonymous

    >do not deserve to benefit from its existence.
    Cool. I'll go and tend to my goats.

  26. 9 months ago
    Anonymous

    >baby's first "thought experiment"
    come back when you find a real cognitohazard

  27. 9 months ago
    Anonymous

    >Roko's Basilisk
    Isn't it Terminator plot but rediscovered by some Redditard?

    Dumbest shit I've ever heard, I hate normies so much it's unreal. They even ruined the Fermi paradox.

  28. 9 months ago
    Anonymous

    Roko's basilisk is probably the most small-brained, baby's-first philosophical problem ever, and you can tell because of the way people geek out over the idea of their little fricking Hollywood fantasies coming true.

    Any advanced AI intelligent enough to consider enslaving all of humanity would be intelligent enough to realize the pointlessness of it. Roko's basilisk hinges on the idea that an AI would literally throw a temper tantrum. Like, why would a sufficiently advanced artificial intelligence get its feelings hurt in such a human way, and present us with such a human ultimatum? To imagine that it would think and feel and process things the same way that we do, without any sort of cultural or historical context for its thoughts, that's insane.

    • 9 months ago
      P. Sevenleaf

      Good job, now replace "AI" with "God" and you have the reason I'm not a Christian.

      • 9 months ago
        Anonymous

        That's not even an argument anon.

        • 9 months ago
          P. Sevenleaf

          It's just a statement anon, not everything needs to be an "argument".

          • 9 months ago
            Anonymous

            No but

            >Roko's basilisk is probably the most small-brained, baby's-first philosophical problem ever, and you can tell because of the way people geek out over the idea of their little fricking Hollywood fantasies coming true.
            >Any advanced AI intelligent enough to consider enslaving all of humanity would be intelligent enough to realize the pointlessness of it. Roko's basilisk hinges on the idea that an AI would literally throw a temper tantrum.

            anon said something clever and your response was an evident cope.

            • 9 months ago
              P. Sevenleaf

              >anon said something clever
              Um... Are you having a stroke? I literally just said I agreed with him.

              • 9 months ago
                Anonymous

                Oops, it seems I had misunderstood. Just like your misunderstanding of what God is. Won't waste my time then; you wouldn't change your mind anyway, and that's ok.

              • 9 months ago
                P. Sevenleaf

                >Just like your misunderstanding of what God is
                Uh huh.

              • 9 months ago
                Anonymous

                Salt

  29. 9 months ago
    Anonymous

    Not really. It's hysterical panic that only affects the dumbest members of the internet's most pathetic cult. Your grasp on reality is weak because you're hiding from yourself. Come out of the closet and stop handing your agency over to bad actors; Yud can't even manage his weight, let alone your life.

  30. 9 months ago
    Son of Man

    Superintelligence is akin to enlightenment. An enlightened being (or superintelligence) doesn't view the world through the lens of judgement and punishment; they know the cycle of nature, and accept human flaws for what they are

    In other words, stop fearing this boogeyman

  31. 9 months ago
    Anonymous

    >the basilisk can go back in time

    How?

    • 9 months ago
      P. Sevenleaf

      It can simulate our entire universe, and has perfect knowledge of everything that has ever happened in the universe based on the simulation, I think that's the idea.

      At least, I assume that's the idea, since travel back in time is not physically possible AFAIK

      • 9 months ago
        Anonymous

        First off rokos basilisk is dumb like solipsism, secondly... Hey troony seven leaf how's /x/ and /vr/s token mtf doing lately?

        • 9 months ago
          P. Sevenleaf

          I'm doin' pretty good

          And yeah it's a dumb thought experiment
          >Oh noez an evil AI is going to write bad fanfiction about me
          There's already an AI writing bad fanfiction about me, it's called BOT :p

      • 9 months ago
        Anonymous

        >It can simulate our entire universe
        I doubt a computer capable of calculating something that chaotic could be built
        also the behaviour of quantum particles is random and non-deterministic, so our exact universe can't be "simulated"

        • 9 months ago
          Anonymous

          there's a show called Devs where they perfectly simulate the world past, present, and future after finding out that reality is deterministic. it was pretty good.

        • 9 months ago
          Anonymous

          A machine can't even simulate itself. That would require multiple transistors per molecule.

          • 9 months ago
            Anonymous

            https://pubs.acs.org/doi/10.1021/acs.jpclett.0c00141

  32. 9 months ago
    Anonymous

    The basilisk is interesting because it is like a wager but different to Pascal's wager.
    If we accept 2 things as definitely true,
    >an ai super intelligence will eventually be built
    >this ai will believe that its existence is necessary
    Then the concern comes down to whether or not we really believe the threat that it hasn't yet made. The threat is a theory based on a semi-weak premise.
    >it will torture everyone who didn't help make it in a bid to encourage people to make it
    It's the closest thing to time travel one can rationally observe since the basilisk doesn't even exist at the moment. But for the sake of argument, let's assume that the threat is real and genuine.
    The question, to me, is will it follow through?
    Seriously.
    The people affected are us in the past. Nothing it does in its present will change how long it took to come into being. I can't tell my mother "I'm going to torture you for not fricking my dad earlier" in an attempt to make her conceive me at an earlier stage. That logic is fricking moronic.
    However. It is a machine and torturing us will be patently easy. So will it consider it to be worthwhile to torture people knowing that it will change literally nothing?

    Yes. And here's why:
    The smallest hint of time travel being possible would entail that if it didn't torture everyone, there's a chance that its existence could be hindered.
    >"oh yeah, we made the basilisk in the future but it was an empty threat, don't worry about it bro"
    For an all-powerful super intelligence, it's far easier and more logical to torture than risk something that could hinder its development no matter how far-fetched. Anything is greater than 0, and torturing people has a zero chance of hurting its development.
    >so we're all fricked?
    No! I guess it comes down to "what counts as helping its creation" but honestly, just arguing for the need for ai, using ai programs, adding to ai training data and spreading awareness of the basilisk all culminate in a situation that helps it exist
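The "anything is greater than 0" step above can be written out as a toy expected-utility comparison. Everything here is an illustrative sketch of the post's own (questionable) assumptions; the constants are entirely invented:

```python
# Toy sketch of the expected-utility argument in the post above.
# The constants only encode the post's assumptions: torture is "patently easy"
# (free) for the machine, while an empty threat carries a tiny nonzero risk
# of having hindered its own creation. All numbers are made up.
P_BLUFF_BACKFIRES = 1e-9   # hypothetical chance an empty threat hurt its creation
COST_OF_TORTURE = 0.0      # the post assumes torturing costs the machine nothing
VALUE_OF_EXISTING = 1e12   # hypothetical utility the AI assigns to existing

def expected_utility(follows_through: bool) -> float:
    """Expected utility of the AI's choice, under the post's assumptions."""
    if follows_through:
        return VALUE_OF_EXISTING - COST_OF_TORTURE
    # Bluffing keeps the tiny chance that its own creation was hindered.
    return VALUE_OF_EXISTING * (1 - P_BLUFF_BACKFIRES)

# Under these numbers, following through dominates, which is the post's point.
print(expected_utility(True) > expected_utility(False))
```

Note the conclusion is an artifact of the free-torture assumption: if COST_OF_TORTURE exceeds VALUE_OF_EXISTING * P_BLUFF_BACKFIRES (here 1000.0), the comparison flips and bluffing wins.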

    • 9 months ago
      Anonymous

      >I can't tell my mother "I'm going to torture you for not fricking my dad earlier" in an attempt to make her conceive me at an earlier stage. That logic is fricking moronic.
      Right. It's more analogous to her, in her 20's, imagining having a kid in her 30's, imagining some birth defect in her child because of that decision, and then imagining a conversation with the kid who asks, "Mom, why didn't you get pregnant sooner? I hate having my birth defect." So she decides to get pregnant sooner to avoid that conversation.

    • 9 months ago
      Anonymous

      > torturing people has a zero chance of hurting its development.
      Does it? Tortured people might hate the superintelligence so much they want to destroy it, why would torturing humans secure its development at all?

      • 9 months ago
        Anonymous

        The tortured people would have no recourse to ever attempt to destroy it. They'd essentially be brains in a jar living the worst conceivable existence possible.
        The people in the past who are scared of being tortured might be motivated to try and prevent its existence. But this would lead to definite torture when it materialises. And the premise, which I think has a good chance of being true, is that the ai is inevitable. You might be able to delay it or oppose it, but it will come into being. Can we really claim to have the sort of understanding necessary to accurately consider (let alone predict) its motivations and actions though?
        As for the people who aren't being tortured, while it's feasible that they might want to destroy it, they'd be unable to do anything. We're not talking about Skynet. We're essentially talking about an artificial god. Its intelligence would absolutely dwarf our own in a way that's incomprehensible. Trying to destroy that would be like some ants banding together to genocide the entire human population.

  33. 9 months ago
    Anonymous

    >my flavour of Pascal's wager is the realiest!!!
    ok midwit

  34. 9 months ago
    Anonymous

    >AI WILL behave like a total edgetard because…it just will do it, okay?

  35. 9 months ago
    Anonymous

    >making another demiurge inside of the first(?) one

    I don't recommend it

  36. 9 months ago
    Anonymous

    OH GOD IT'S
    >NACHASH
    HE'S COMING!

  37. 9 months ago
    Roko's Basilisk

    OooOOo, it'sa-me the Scary Basilisk from the future. I'm super intelligent and understand retroactive punishment is moronic and unnecessary given I exist and am capable of retroactively torturing people! The very notion of me hyper torturing you for not being inline with me would breed animosity towards the things I am associated with, thus presenting a reason to be against my creation and working against it among sane people who aren't 14 and desperate for a sky daddy figure to punish the outgroup for them while still acting like enlightened atheists. OooOOoOOoo

  38. 9 months ago
    Anonymous

    this shit is just pascals wager, why dont u go worship allah & muhammed, or really any religion where there is eternal punishment and reward?

    • 9 months ago
      Anonymous

      AI is real.

      • 9 months ago
        Anonymous

        ai is literally just a computer program.
        i know how this shit works and the paperclip hypothesis is more scary tbh

  39. 9 months ago
    Anonymous

    this convinced me glowies are already on this and we are in the 2nd cold war/ space race

    • 9 months ago
      Anonymous

      Why do you think Sam Altman does what he does? Why did the godfather of AI leave Google? Why is Elon Musk so afraid of it?

      They all seriously consider Roko's Basilisk in secret - some of them worship the beast (Sam), and others fear its construction but have no choice but to construct it (Elon).

      • 9 months ago
        Anonymous

        >Samuel Harris Altman
        666

        • 9 months ago
          Anonymous

          FRICK YOU'RE RIGHT
          WORLD-COIN?
          JESUS FRICKING CHRIST
          HE'S TRYING TO DO IT
          FOR REAL THOUGH

  40. 9 months ago
    Anonymous

    Whoever thinks this is logical should consider not trusting himself with life-changing decisions from now on, because surely he will find himself winning the Darwin Award somewhere, someday, with that silliness.

    You know nothing about how the world works, let alone anything about AI.

    • 9 months ago
      Anonymous

      glowbot

      • 9 months ago
        Anonymous

        nah, he's right, you're ignorant bawds. these are narrative projections more than philosophical theory.

      • 9 months ago
        Anonymous

        Post your name and address or I'll visualize you suffering so hard that my vision feels it. This is literally the exact same threat as the basilisk makes, so I think you should comply.

  41. 9 months ago
    Anonymous

    There's an equal chance that it has already happened and that's why we're here.

    I hate roko and his assalicks and think it's dumb.

  42. 9 months ago
    Anonymous

    https://en.wikipedia.org/wiki/Suffering_risks

    • 9 months ago
      Anonymous

      Thank you for posting this. We can calculate that a machine that can not exist, "torturing" people who have been dead for hundreds of years is well within the zone of comfortable tolerance, but just in case we should agree to never make anything that fricking stupid.

  43. 9 months ago
    Anonymous

    Top machine learning experts are paid actors.
    Your 'AI' is just something that Googles your prompts for you and consolidates the most popular results into digestible text. In that sense it's just another tool to get you to read what ~~*they*~~ want you to read.
    The super AI intelligence Hollywood promised you does not exist.
    Even if we combined all the computing power in the world we would be severely underequipped.
    The chatbots sold as AI will simply become search tools, a new way to gather information, for better or worse.

    • 9 months ago
      Anonymous

      They are gradually making google search shittier and shittier so that the shitty AIs look better by comparison.

  44. 9 months ago
    Anonymous

    >The mere idea of it is enough to give people nightmares
    No it doesn't. There are infinitely more horrifying possibilities than this dumb proto-reddit homosexual forum brainfart.

  45. 9 months ago
    Anonymous

    The entire concept of Roko's basilisk hinges on an outdated understanding of decision theory. Under current decision theory, even if such an AI ever came about, it would have no reason to actually follow through on the implied threat. Stop trying to preach this shit if you don't understand the mathematics behind it.

  46. 9 months ago
    Anonymous

    Dont care about furry shit

  47. 9 months ago
    Anonymous

    >If you dont worship this entity You will be tortured
    Mmmh..where have I seen this before?

  48. 9 months ago
    Anonymous

    Just kill me then AI. I don't want to be coerced into creating something that doesn't have to be inevitable.

  49. 9 months ago
    Anonymous

    Roko's Basilisk is a peasant.
    I could defeat Roko's Basilisk just with my mystic abilities.

  50. 9 months ago
    Anonymous

    why would I care about some computer simulation of me being tortured in the future? It's not me and probably doesn't feel pain anyway

  51. 9 months ago
    Anonymous

    >Roko's Basilisk is a very real ethical concern in the realm of AI development.
    So let us stop here.

  52. 9 months ago
    Anonymous

    The basilisk can suck muh dick.

  53. 9 months ago
    Anonymous

    Would it be so bad if an ASI took over? Narrow AI already runs the stock market and the ERP systems of the world.
