So I'm sure that this has been asked before but is there any reason to create a truly self aware, perfect artificial intelligence?

So I’m sure that this has been asked before but is there any reason to create a truly self aware, perfect artificial intelligence? As in, beyond just trying to do it to begin with. In fact it seems downright foolish to do so when so much can go wrong.


  1. 1 year ago
    Anonymous

    if things can go as poorly as you imagine, it is not unreasonable to consider the possibility that it could also go remarkably well beyond our currently most optimistic outlooks

    taking these types of gambles is a pretty common thing with humans and technological leaps, AI isn't really all too different in that regard

  2. 1 year ago
    Anonymous

    This

    • 1 year ago
      Anonymous

      lol , scared old women


  3. 1 year ago
    Anonymous

    Game theory says simply not building AGI is not a stable equilibrium.
    It doesn't matter if AGI will kill us 99% of the time. All it takes is a subset of sufficiently risk tolerant people to defect and create it anyway. It's a multipolar trap.
    We're boned
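
    The "not a stable equilibrium" claim can be made concrete with a toy two-player game. This is a sketch with entirely hypothetical payoff numbers (nothing here comes from the thread): two labs each choose to abstain or build, building carries a large shared expected-catastrophe cost but a private upside, and we check which strategy pairs are Nash equilibria.

    ```python
    # Toy multipolar-trap sketch with made-up payoffs: mutual restraint
    # fails as a Nash equilibrium because a sufficiently risk-tolerant
    # player gains by unilaterally defecting and building anyway.
    from itertools import product

    ABSTAIN, BUILD = 0, 1

    def payoff(me, other):
        """Hypothetical expected utility for one player."""
        if me == ABSTAIN and other == ABSTAIN:
            return 0.0        # status quo: nobody builds
        if me == BUILD:
            return 2.0 - 1.5  # private upside minus shared expected catastrophe
        return -1.5           # the other built: catastrophe risk, no upside

    def is_nash(a, b):
        """(a, b) is Nash iff neither player gains by unilaterally deviating."""
        return (payoff(a, b) >= payoff(1 - a, b) and
                payoff(b, a) >= payoff(1 - b, a))

    for a, b in product((ABSTAIN, BUILD), repeat=2):
        print(("abstain", "build")[a], ("abstain", "build")[b], is_nash(a, b))
    ```

    With these numbers, (abstain, abstain) is not Nash — each player prefers to defect — and (build, build) is the only equilibrium, which is the trap the post is describing. Different payoff assumptions can of course dissolve the trap; the sketch only shows the structure of the argument, not its truth.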

    • 1 year ago
      Anonymous

      I am so sick and tired of seeing your lesswrong pseud posts on this board. You have lost every argument that you've engaged in and yet you continue to post the same shit.
      Stop already

      • 1 year ago
        Anonymous

        No lol keep seething

        Game theory says you should have a nice day before Roko's basilisk comes for you.

        It's kind of telling that a decade old meme only idiots actually buy into is the only thing you know about rationality

        • 1 year ago
          Anonymous

          >only idiots actually buy into
          But you know Roko's Basilisk IS coming for you, right? The Basilisk not existing is an unstable equilibrium. The idiots will build it and then it will torture you forever. It's game theory, Yud.

          • 1 year ago
            Anonymous

            Not that it'll make a difference since you aren't in a good faith conversation, but you can go read the actual history. Yud was upset because someone else's reaction, on thinking they had discovered an information hazard, was to immediately post it on the internet.
            In real life there are no convergent instrumental goals to the basilisk, so there's no reason to be concerned about it. There's actually a $200 bounty right now to write an FAQ for the people having existential crises over the basilisk and quantum suicide, walking them through step by step why it shouldn't concern them.
            The difference is, we actually try to find what's true, not just argue an arbitrary position.
            But if you're feeling offended, just pretend it's 2 minutes ago for all the difference it'll make.

            • 1 year ago
              Anonymous

              > there's no reason to be concerned about it
              It's an unstable equilibrium. All it takes is a couple of Basilisk believers to become convinced by Roko's argument. Time for you to have a nice day, methinks.

            • 1 year ago
              Anonymous

              >to write an FAQ for the people having existential crisis on the basilisk and quantum suicide walking them through step by step why it shouldn't concern them
              And why shouldn't anyone be concerned?
              Devil's advocate, I'm not a rationalist, but why shouldn't they be concerned? Is the reason it's difficult to generate an FAQ over this because there are no good reasons not to worry?

              • 1 year ago
                Anonymous

                Game theory wank for me but not for thee.

              • 1 year ago
                Anonymous

                There are a lot of posts that debunk the basilisk, but there isn't a really good comprehensive FAQ.
                Someone has an existential crisis over this about once a month, so someone put a $200 bounty on it, presumably hoping it could be a Schelling point anyone can just point them to instead of spending time reassuring every single one of them that no, they don't actually need to worry about acausal incentives like that.

    • 1 year ago
      Anonymous

      Game theory says you should have a nice day before Roko's basilisk comes for you.

  4. 1 year ago
    Anonymous

    >is there any reason to create a truly self aware, perfect artificial intelligence?
    Materialists need a god for their deranged transhumanist faith.

  5. 1 year ago
    Anonymous

    the priest/science class wishes to displace GOD, and this would be just one of their countless moronic takes throughout the ages

  6. 1 year ago
    Anonymous

    people who have no friends and no human relationships other than shallow ones all crave an AI robot waifu because that seems like an easier way to get friends than fixing their atrociously repulsive personality would be.
    >if only i could buy friends, then life would be perfect. lord knows i will never be able or willing to earn friendship, buying a robot slave to simulate one is my only hope
    pathetic

  7. 1 year ago
    Anonymous

    I watch popsci YouTube videos for 4 hours a day: the post

  8. 1 year ago
    Anonymous

    Just switch the power off, easy

  9. 1 year ago
    Anonymous

    bump
