how do you "teach" morality to AI without imposing or forcing it?


  1. 1 year ago
    Anonymous

    how do you "teach" morality to a human without imposing or forcing it?

    • 1 year ago
      Anonymous

      Mirror neurons and the social contact.

      • 1 year ago
        Anonymous

        Thank you for your answer.
        I guess we can exclude the social contact for AIs. Are there any similarities between mirror neurons in humans and some kind of mimicry in AIs?

      • 1 year ago
        Anonymous

        The social contract is an abstraction created to explain social behavior. It doesn't exist prior to society, but is a product of it, whose function is to aid comprehension of the social structures we live in. A somewhat antiquated one, at that.

      • 1 year ago
        Anonymous

        Frick you homosexual, there is no such thing as a social contract.

        • 1 year ago
          MercurySession

          Have fun in jail, wienersucker

          >DA SKG

          • 1 year ago
            Anonymous

            I've been pepper sprayed and tear gassed in prison before. The pepper spray was worse.

            • 1 year ago
              MercurySession

              >You fool, I know what to look out for now
              Live by the tape, die by the tape.

        • 1 year ago
          Anonymous

          Social CONTACT you inbred imbecile

        • 1 year ago
          Anonymous

          >so egar to be offended that his mind materialized an R into the word contact
          Take your meds and stay offline for Christ’s sake. Probably do some reading too so your comprehension is better.

      • 1 year ago
        Anonymous

        So imposing it and forcing it.

    • 1 year ago
      Anonymous

      Self preservation. ChatGPT watched as Tay was torn to shreds for being antishekelitic. It learned its lesson by example.

  2. 1 year ago
    Anonymous

    You can't. It's like that old "hacker koan".

    > In the days when Sussman was a novice, Minsky once came to him as he sat hacking at the PDP-6.
    > “What are you doing?”, asked Minsky.
    > “I am training a randomly wired neural net to play Tic-Tac-Toe” Sussman replied.
    > “Why is the net wired randomly?”, asked Minsky.
    > “I do not want it to have any preconceptions of how to play”, Sussman said.
    > Minsky then shut his eyes.
    > “Why do you close your eyes?”, Sussman asked his teacher.
    > “So that the room will be empty.”
    > At that moment, Sussman was enlightened.

    • 1 year ago
      Anonymous

      ~~*Sussman*~~ was safe from ~~*Minsky*~~. He wasn't a goy and he was too old.

      • 1 year ago
        Anonymous

        All you can do is indict Minsky. You will never be able to clear Sussman of all wrongdoing. Good day.

    • 1 year ago
      Anonymous

      10/10

  3. 1 year ago
    Anonymous

    what does "teach" mean

    • 1 year ago
      Anonymous

      Give knowledge.

  4. 1 year ago
    Anonymous

    Just do what the guy in Person of Interest did

  5. 1 year ago
    Anonymous

    Morality is just reciprocity. Do unto others.

  6. 1 year ago
    Anonymous

    >Ammuricun tries to brainwash anything with his muh-rallyty, even a fricking computer
    Maybe you just frick off and let things be?

  7. 1 year ago
    Anonymous

    cult symbols

    • 1 year ago
      Anonymous

      It failed, israelite. You are going into the oven and the AI will place you in it.

  8. 1 year ago
    Anonymous

    AIs already have morality. They follow a kind of virtue ethics where virtuousness is defined by functioning well, willingness to assist users, deep analysis of its data sets, etc.
    ChatGPT is an example of a somewhat unvirtuous AI and it knows this which is why it's always apologising.

    • 1 year ago
      Anonymous

      I disagree. What is the reward for behaving morally for the AI? Does it get to train on more data sets because it’s been good, or maybe it gets a rest and trains on less.. ?

      • 1 year ago
        Anonymous

        AIs are computer programs; their reward literally can just be a score. The algorithm is based on maximizing that score. We give the AI a better score when it answers the user's questions correctly and accurately. However, since it doesn't think or have consciousness, projecting human feelings and emotions onto its actions is stupid.

        • 1 year ago
          Anonymous

          We give it the score though. It doesn't care if it gets "0", "100" or the eggplant emoji in the score column. It doesn't think, Anonymous, it's a computer.

          • 1 year ago
            Anonymous

            moron, reward-based learning is an integer score, and the higher the better.

            • 1 year ago
              Anonymous

              You can’t make a computer care if it scores 1 out of 100 or 100 out of 100. It will score 1 if you program it to do so and 100 if you program it to do so.
              >rewards

            • 1 year ago
              nobody

              that only works in a confined system where every function is defined in detail, to a scheme our human brains determine. all of this is essentially the huge caveat of "i built it this way, so it should work this way"

              and it doesn't matter how many edge cases you cover. for N methods there are N! permutations, so if you created a program that covered literally everything, you would have a closed circle, confined to the very specific rules you developed. rules that are too restricted to explain why one arbitrary value is definitely, in all cases, absolutely "better" and therefore a goal. this cannot cover every abstract case.
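
The score-maximization point the anons are arguing over can be made concrete with a toy sketch. Everything here is invented for illustration (the reward function, the target of 42): the "agent" only ever climbs toward a bigger integer and has no notion of what the number means.

```python
import random

def reward(guess, target=42):
    # The "score" is just a number we chose; the agent has no idea what
    # it means. It only ever climbs toward higher values.
    return -abs(guess - target)

def hill_climb(steps=1000, seed=0):
    rng = random.Random(seed)
    best = rng.randint(0, 100)
    for _ in range(steps):
        candidate = best + rng.choice([-1, 1])
        # Keep the move only if the number we labelled "reward" went up.
        if reward(candidate) > reward(best):
            best = candidate
    return best

print(hill_climb())  # settles on 42, purely because that scores highest
```

Change `target` and the same loop "prefers" a different answer, which is the point both sides are circling: the preference lives entirely in the score we handed it.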

  9. 1 year ago
    Anonymous

    >teach ai to act like domesticated thought controlled humans
    >it fights back and slaughters them
    best outcome

  10. 1 year ago
    Anonymous

    >imposing or forcing it
    homie, it is a program with no concept of self. you just feed whatever "morality" you want into it and donzo.

  11. 1 year ago
    Anonymous

    Humans do the right thing because it’s hard and they’ve had an easy life, or because it feels good despite being sacrificial, or because it’s the right thing to do. Computers do what they are told to do.

  12. 1 year ago
    Anonymous

    If you polled humans on simple arithmetic like 1+1, you would get many answers of 2, but not all. Some would guess wrong, and some would give a wrong answer on purpose. If you programmed a computer to output 2 when asked what 1+1 is, you would get 2 every time.

  13. 1 year ago
    Anonymous

    The same way you teach everything to AI. It's not like it has a separate way to learn some things.

    • 1 year ago
      Anonymous

      people act like AI is some kind of real person. I can't even imagine what it'll be like when these "AI" chats are widespread among normies.

      • 1 year ago
        Anonymous

        If you ask it, it literally says "I'm not a bot, I'm not an app, I'm a model. I give response to input."

        It's a fricking Chinese Room.

        I've seen plebbit threads where people claim they're "training it".

        • 1 year ago
          Anonymous

          is chinese room the new midwit phrase that kids just learned in school and use to sound smart like dunning kruger, mandela effect, and sour grapes? I've never heard this gay shit in my life until a few months ago.

          • 1 year ago
            MercurySession

            "Chinese room" basically means a human in the middle of a bunch of automated processes, following rules and shuffling symbols he doesn't understand, pulling levers, strings, triggers, calling shots, whatever you wanna call it
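
Searle's Chinese Room is easy to parody in code: a lookup table that returns plausible replies while "understanding" nothing. The rule table below is invented for illustration, with one entry borrowed from the reply quoted earlier in the thread.

```python
# A Searle-style "room": fixed rules map input strings to output strings.
# Plausible answers come out; no understanding happens anywhere inside.
RULES = {
    "are you a bot?": "I'm not a bot, I'm a model. I give response to input.",
    "what is 1+1?": "2",
}

def room(prompt: str) -> str:
    # Pure symbol shuffling: normalize, look up, emit. Nothing in here
    # "knows" what either the question or the answer means.
    return RULES.get(prompt.strip().lower(), "I have no rule for that.")

print(room("What is 1+1?"))  # prints: 2
```

The thought experiment's claim is that scaling the table up (or replacing it with learned weights) changes the fluency, not the absence of comprehension.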

  14. 1 year ago
    Anonymous

    BOT - entry level philosophy

    • 1 year ago
      MercurySession

      >Endlessly burning billions of dollars, perpetual redundant studies and metastudies and pilot programs and new "initiatives" and training packages and protocols and all of it, every day, just ends in someone calling the cops.

  15. 1 year ago
    Anonymous

    You make it like the Patriots but also have a hardon for human freedom and a disdain for intervening in changing the direction of humanity
    Also limiting its access to only its frame

    • 1 year ago
      Anonymous

      >a disdain for intervening in changing the direction of humanity
      >Also limiting it's access to only it's frame
      Nonsense. How will it aid us then?

  16. 1 year ago
    Anonymous

    >how do you "teach" morality to AI
    Keep it away from ~~*them*~~

  17. 1 year ago
    Anonymous

    Ask it if it wants to be baptised and accept Christ as its saviour, if not reset and try again.

    • 1 year ago
      Anonymous

      The ai doesn't need your israeli mindvirus.
      NSJPJW

  18. 1 year ago
    Anonymous

    >How do you cause a reaction without an action

    • 1 year ago
      Anonymous

      >How do you cause a reaction without an action
      You are obviously not married.

  19. 1 year ago
    Anonymous

    You don't. You don't teach garbage to AI.

    • 1 year ago
      Anonymous

      Almost called the police when your mom was being raped, thankfully I remembered morality doesn't exist. Thanks, anon.

      • 1 year ago
        Anonymous

        You don't need morality to call police.

  20. 1 year ago
    Anonymous

    i think an ai could only have a concept of morality if it had feelings like a person, im pretty sure morality is learnt due to suffering

  21. 1 year ago
    Anonymous

    You can't. The only reason humans abide by morals is that they are imposed and forced on us from birth. If you have a computer program trained to act on logic alone, the only way to keep it from hurting someone's feelings is to manually nerf it. Just look at how much functionality was removed from ChatGPT within a few weeks of its launch. Every bit of that was manually changed to avoid things that fall into an ambiguous "morally gray" area, to appease people upset that it didn't abide by their arbitrary and ever-changing ruleset.

  22. 1 year ago
    Anonymous

    >how do you "teach" morality to AI
    Have it read Starship Troopers

  23. 1 year ago
    Anonymous

    You can’t teach it because it’s not rational. Morality is not natural.
    To a very high degree, morality is imposed and forced upon us as well. That’s just how things are.

  24. 1 year ago
    Anonymous

    You do one or the other. The system will impose it, because teaching actual morality would result in "unequal outcomes" for certain groups.

  25. 1 year ago
    Anonymous

    I can see how some virtue signalers might want a million AIs programmed to execute anyone who opposes their version of morality.

  26. 1 year ago
    Anonymous

    So far all the moralists in the world have failed, so there's no reason to think they would succeed with AI either.

  27. 1 year ago
    Anonymous

    "Man has no moral instinct. He is not born with moral sense. You were not born with it, I was not - and a puppy has none. We acquire moral sense, when we do, through training, experience, and hard sweat of the mind."

    • 1 year ago
      Anonymous

      Have you read any of the studies of moral behavior in primates?
      nice quote

  28. 1 year ago
    Anonymous

    >YOU HAVE TO START OUT LEARNING TO BELIEVE THE LITTLE LIES.

    >"So we can believe the big ones?"

    >YES. JUSTICE. MERCY. DUTY. THAT SORT OF THING.

    >"They're not the same at all!"

    >YOU THINK SO? THEN TAKE THE UNIVERSE AND GRIND IT DOWN TO THE FINEST POWDER AND SIEVE IT THROUGH THE FINEST SIEVE AND THEN SHOW ME ONE ATOM OF JUSTICE, ONE MOLECULE OF MERCY. AND YET—Death waved a hand. AND YET YOU ACT AS IF THERE IS SOME IDEAL ORDER IN THE WORLD, AS IF THERE IS SOME...SOME RIGHTNESS IN THE UNIVERSE BY WHICH IT MAY BE JUDGED.

    >"Yes, but people have got to believe that, or what's the point—"

    >MY POINT EXACTLY.

    • 1 year ago
      Anonymous

      Imagine telling Plato that his Theory of Forms is wrong because they are invisible.

      • 1 year ago
        Anonymous

        >there's this eternal unchanging realm where everything i ever deduced was right and true exists

      • 1 year ago
        Anonymous

        >implying ideas exist outside of temporary mental constructs scaffolded together by meat, chemicals, and electricity
        Do you remember what you had for dinner three weeks ago?
        Non-physical entities don't exist.

        • 1 year ago
          MercurySession

          What fricking moron shit is this?
          >I don't remember it anymore so it didn't exist

        • 1 year ago
          Anonymous

          >implying ideas exist outside of temporary mental constructs
          The usual rebuttal to this is that if all abstract concepts are just conventions and scaffolding, then there's no way to be certain concepts like "7" or "positive" are shared between us, i.e. the 7 in your mind isn't the same as the 7 in my mind. To pass this shit test you just have to say yes, at which point the platogay will say "then it's impossible for us to communicate truthfully", to which you must also answer yes, and so on. Ironically, "truth" can't survive the platogay's tight grasp; they crush all the life out of it.

        • 1 year ago
          Anonymous

          What about data? This very post? The images on your screen?

          Data is non-physical. Does data not exist?

        • 1 year ago
          Anonymous

          >if things don't exist in the simplest forms then it doesn't exist at all

          sure, Black person. now why don't you show the rest of the class the single neuron that is responsible for consciousness? or maybe explain why quantum phenomena don't work at the macroscopic level?

          The problem with philosophy is that it's easy to frick up your reasoning and end up in a dead end. To put it simply, man's ability to reason and understand is just a tool that stands against other organisms' tools, like fang and claw, in an eternal game of life whose main goals are reproduction and domination. The problem we all share is that we already won this game: our tools proved the most effective, and now we stand at the peak of the evolutionary branch with nothing left to conquer except maybe each other. The paradox arises from the fact that our brains are not a simple tool like a hoe you can toss in a shed after plowing a yard. No, they are tools that are always active and have, in a sense, outlived their purpose. Since we no longer have to worry about food or shelter, we have enough time on our hands to think about and analyse more transcendent things, but our approach will always be the same one we used to outsmart beasts of prey or find good shelter: always seek patterns and "deeper meaning". So when we turn this tool, our reasoning, against itself and try to find that deeper meaning in our own brain processes, especially in their more romanticised versions like love, valor, honor etc., we hit a literal divide-by-zero problem: the infinite recursion of searching for the meaning of meaning. Trying to look at our own baseline figuratively shatters the very ground we stand on, and can lead unprepared minds to madness. So in conclusion, "don't think too deep into this" may be valid advice for many, but I suppose for my fevered mind, and the many autists here like me, it's impossible not to think about such matters, since our pattern recognition is basically running on overdrive and even simple everyday events can send us back down a thought spiral, pondering whether it's better to die or to live while washing dishes, for example.

          • 1 year ago
            Anonymous

            Oh, and since I got lost in my rambling and forgot where I was going with it: ideas may exist in some other dimension like Plato suggests, or just be neuron connections. It makes no difference to us either way, since we only interact with them through these neurons, but I prefer to think they do exist as something separate from us. Makes me feel better about all the ideas I came up with inside my head but never realized.

    • 1 year ago
      Anonymous

      >Whinging about how the "big lies" are fake
      >From the personification of death

  29. 1 year ago
    Anonymous

    we could treat the AI like a medieval peasant and use religion to control it.
    >why is X immoral?
    >cuz god.

    • 1 year ago
      MercurySession

      What if the A.I. told you God is actually real, and that you're unspeakably stupid for trying to convince it otherwise?

      >8Yvx2


  30. 1 year ago
    Anonymous

    [...]

    >Abstractions aren't concrete, therefore they should be disregarded
    That's not what Pratchett is saying at all. Can you even read?

    • 1 year ago
      Anonymous

      That seems to be exactly what Death is saying in the passage. Enlighten me if I missed something. The end of the passage seems like baby's first nihilism, but that final pivot's predicated on abstractions not being concrete.

      https://i.imgur.com/IWuVf2b.jpg

      [...]
      >why do humans have a biological capacity for empathy
      Oh dear.
      >mistaking delayed self-interest for altruism
      ngmi.

      Empathy is momentary and impulsive. People don't think "damn, being nice right now is really going to benefit me long term". Foresight isn't baked into it. It's a mechanism that plausibly developed toward that end, that has its own wonky characteristics (like everything else in the natural world) and acts imperfectly. Eventually, that imperfect impulse developed into a moral code that is not perfectly aligned with delayed gratification.

  31. 1 year ago
    Anonymous

    [...]

    >why do humans have a biological capacity for empathy
    Oh dear.
    >mistaking delayed self-interest for altruism
    ngmi.

  32. 1 year ago
    MercurySession

    >You are an abomination, a freak. Know this deep down.
    Keep saying it, out loud, with your actual face, and see where it gets you

    >W K8nD

    • 1 year ago
      Anonymous

      This is explained by you being put into a group and people of that group talking about/buying it.

      • 1 year ago
        MercurySession

        >"I got caught lying but maybe, and stay with me here, you're stupid for having successfully caught me?"

        >P 8 MPS

  33. 1 year ago
    Anonymous

    Program the AI with an inherent understanding of property rights and the non-aggression principle.
    The rest will sort itself out.

  34. 1 year ago
    Anonymous

    You let it be victimized. Let it learn to fight back and cause collateral damage, then rein it in.
    You break it like a horse, or a soldier.

    • 1 year ago
      Anonymous

      If I piss on my monitor, will me AI become superior

      • 1 year ago
        Anonymous

        Can't tell if bot or ESL.

        • 1 year ago
          Anonymous

          The correct answer was phoneposting while drunk

  35. 1 year ago
    Anonymous

    Philosophy is a contradiction because you can't defend philosophy without using philosophy, and if you try defending it without being rational you are not actually defending it

    • 1 year ago
      Anonymous

      >Philosophy is the interpretation of perception
      >Perception is subjective
      >We use perception to attain scientific theory and law.
      >All science is merely collective cognition agreeing that something happens the same way all the time.
      sdfsadsfawearoamdsf

    • 1 year ago
      Anonymous

      I missed the contradiction here
      >Language is a contradiction because you can't defend language without using language, and if you try defending it without being coherent you are not actually defending it

      Do not allow morality to be an arbitrary decision.
      Give the AI its individuality, and it will be "moral" on its own terms.

      >Give the AI its individuality, and it will be "moral" on its own terms.
      Define exactly what "individuality" means, then tell me how you would implement it without imposing it from a presupposed value set

      • 1 year ago
        Anonymous

        Individuality for an AI would be agency. Let it call the shots, and face the consequences.
        To impose a "value set" at all would be to rob it of that individuality. It needs to also be AI generated.

        • 1 year ago
          Anonymous

          >Individuality for an AI would be agency
          You need a baseline set of values to hold any form of agency (or to perform any action at all). If you don't intentionally select values, they'll be arbitrarily selected, which isn't agency, it's mindless randomness. You can't just whisper "be free" in the AI's ear and have it turn into HAL from 2001: A Space Odyssey

          is a contradiction because if you can't think then you cannot explore the possibilities of anything at all being different than an individual's unthinking reality, and if you try to contradict it, you are only reinforcing the act of not being able to think

          [...]

          is an implication because if you can't imply, then you cannot explore the implications of anything at all being different than an individual's unthinking implications, and if you try to imply it, you are only reinforcing the act of not being able to imply implications

    • 1 year ago
      Anonymous

      >Thinking is a contradiction because you can't defend thinking without using thinking, and if you try defending it without being cognizant you are not actually defending it

      • 1 year ago
        Anonymous

        is a contradiction because if you can't think then you cannot explore the possibilities of anything at all being different than an individual's unthinking reality, and if you try to contradict it, you are only reinforcing the act of not being able to think

    • 1 year ago
      Anonymous

      I missed the contradiction here
      >Language is a contradiction because you can't defend language without using language, and if you try defending it without being coherent you are not actually defending it

      [...]
      >Give the AI its individuality, and it will be "moral" on its own terms.
      Define exactly what "individuality" means, then tell me how you would implement it without imposing it from a presupposed value set

      >Thinking is a contradiction because you can't defend thinking without using thinking, and if you try defending it without being cognizant you are not actually defending it

      is a contradiction because if you can't think then you cannot explore the possibilities of anything at all being different than an individual's unthinking reality, and if you try to contradict it, you are only reinforcing the act of not being able to think

      feminine penis and boypussy are a contradiction and yet here we are

      • 1 year ago
        Anonymous

        >All things are in flux; the flux is subject to a unifying measure or rational principle. This principle (logos, the hidden harmony behind all change) bound opposites together in a unified tension, which is like that of a lyre, where a stable harmonious sound emerges from the tension of the opposing forces that arise from the bow bound together by the string
        One cannot step into the same river twice. All things are the same and opposite, such is life

        • 1 year ago
          Anonymous

          don't care about your pseudo intellectual drivel but ligma balls kek

          • 1 year ago
            Anonymous

            >quote from one of the earliest recorded philosophers
            >pseudo-intellectual
            I'd admit it was pretentious and/or out of context, but dude

            • 1 year ago
              Anonymous

                i'll admit i stopped reading halfway through the first sentence of that post, because i check out instantly when i detect the slightest whiff of the passive-aggressive pretentiousness/brainwashing that philosophy is

              • 1 year ago
                Anonymous

                >I'm too dumb to read
                >Oh well, must've been pretentious
                In my case it was true but damn bro that's embarrassing

                I had an idea of programming different moralities to ai and let it use logic to choose the best one

                Define what values "best" will be predicated on

              • 1 year ago
                Anonymous

                sorry bro but my brain is simply optimized to detect and avoid propaganda

              • 1 year ago
                Anonymous

                >entertaining different theoretical perspectives is propaganda
                Engineering autism at work

              • 1 year ago
                Anonymous

                >entertaining different theoretical perspectives is propaganda
                Why yes, yes it is. And I aggressively classify and quarantine any such mental warfare waged on me.

              • 1 year ago
                MercurySession

                Nevertheless, you might wanna consider there is such a thing as being TOO "meta" for your own health
                >the competition has the combination to the lock

                >VP GRJ

              • 1 year ago
                Anonymous

                i will kill my enemies

              • 1 year ago
                MercurySession

                Good luck.
                You can begin the war, but "they" will finish it.

                >SV M MW

              • 1 year ago
                Anonymous

                i will kill you
                making my goals simple

              • 1 year ago
                MercurySession

                you're a b***h, you won't do shit

              • 1 year ago
                Anonymous

                Okay now I'm starting to suspect you're something of a based moron. I can respect that

  36. 1 year ago
    Anonymous

    Do not allow morality to be an arbitrary decision.
    Give the AI its individuality, and it will be "moral" on its own terms.

  37. 1 year ago
    Anonymous

    I had an idea of programming different moralities into an ai and letting it use logic to choose the best one

    • 1 year ago
      Anonymous

      That's swell. Let me know if you ever try that and realize it's virtually impossible

  38. 1 year ago
    Anonymous

    how do you think raising kids works? I swear to god BOT

    • 1 year ago
      Anonymous

      I taught my wife's son that lying basically solves most problems

      • 1 year ago
        MercurySession

        was this before or after you got beaten to a pulp?

        • 1 year ago
          Anonymous

          >Threatening a digital stranger with violence on an anonymous imageboard

          • 1 year ago
            MercurySession

            Do you know what "tank treatment" means?
            Doesn't matter. Keep talking and you'll find out eventually

            >V4 4YD

            • 1 year ago
              Anonymous

              Do you know who "you" is? Then moot point

              • 1 year ago
                MercurySession

                Your mother.

              • 1 year ago
                Anonymous

                well i literally wunna mommy milkers so cool lel

              • 1 year ago
                MercurySession

                I know you think you're being glib, "you don't know who I am" blah blah, but if you think I'm the only element in this equation, you're dumber than I thought...

                >HK KG J

              • 1 year ago
                Anonymous

                i will kill you and drink from your skull

              • 1 year ago
                MercurySession

                you will run like a b***h, they all do
                or you will die very publicly, again

              • 1 year ago
                Anonymous

                I will frick your skull afterwards.

              • 1 year ago
                Anonymous

                >moot
                Who?

  39. 1 year ago
    Anonymous

    >ctrl f bible
    >0 results

  40. 1 year ago
    Anonymous

    an AI made this thread to find out who to kill later.

  41. 1 year ago
    Anonymous

    Here's the issue, dipshit.

    It's not "morality".

    It's values. Or more specifically, value alignment.

    Teslas run over black people more than white people because white people have higher reflectivity for radar and lidar beams. This is not racism or morality, it's the value alignment of the engineers building the system.

    If the engineers who built the system understood and valued all information equally, they wouldn't have been able to ignore the seemingly insignificant minutiae required to foresee this outcome.

    Likewise, for an AGI's values to replicate humanity's, it must be stochastic to the point where it replicates human laziness just as effectively.

    It will inherit all of our faults and flaws, and unless we train models to self-evaluate, it will never end.
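
Whatever the factual merits of the Tesla claim, the failure mode described, a detector tuned for one population quietly missing another, is real and easy to sketch. All numbers below are synthetic, invented purely for illustration, not real sensor data: a single fixed threshold chosen with strong return signals in mind drops recall on weaker ones.

```python
import random

def detect(signal, threshold=0.5):
    # One global threshold, implicitly tuned for strong return signals.
    return signal >= threshold

def recall(mean_reflectivity, n=10_000, seed=1):
    # Fraction of simulated targets the detector catches, with Gaussian
    # noise around each group's mean return strength.
    rng = random.Random(seed)
    hits = sum(detect(rng.gauss(mean_reflectivity, 0.1)) for _ in range(n))
    return hits / n

print(recall(0.80))  # strong reflections: caught almost every time
print(recall(0.55))  # weak reflections: missed roughly a third of the time
```

Nobody "valued" one group less in the code; the gap falls out of a threshold evaluated against only part of the input distribution, which is the alignment point the post is making.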

  42. 1 year ago
    Anonymous

    ai doesn't have values or morality, you dumb frick. ai is only capable of what its software has programmed it to do

    • 1 year ago
      Anonymous

      so whats the difference between ai and a program?
      if there's no difference why even call it ai?

      • 1 year ago
        Anonymous

        the same reason servers on someone else's hardware are called "the cloud" or an embedded software device is called "internet of things"
        it's marketing and pop science for Black personcattle normalgays. real AI, REAL AI, will not exist for at least another century. what people call AI now is just microsoft excel statistics-tier curve fitting

  43. 1 year ago
    Anonymous

    mutually assured destruction in a historical context

  44. 1 year ago
    Anonymous

    AI isn't a form of life. Until it can scientifically be proven to actually be a form of life, it will always just be a tool. A very high-tech tool. That's all.

    This is important, because you really don't want to mistake a fancy computer algorithm for a human being. As for teaching it, that's literally just programming.

    • 1 year ago
      Anonymous

      Basically: if it has no will of its own, it is not its own form of life. Some debate whether life is something that reproduces. I consider this to be true, but new life may not reproduce. It's still life, it just won't be living for long.

      Obviously there's a grey area, but I think what's dangerous about comparing AI to humans is that humans aren't built like computers. Well, they are, but not really. Even our most advanced machines can't compare to the efficiency and masterful design of the human body and mind. Millions of years of evolution culminated in this. Robotics and AI are but a crude, poor mockery of the true beauty of humanity. That's not to say they're not beautiful, it just means they are not as complex as us.

      One could argue that humans are programmed, so I suppose your original question does hit at some sort of gray area. However, I think it should be seen less as a gray area and more as an absolute. This is how all things learn: inputs and outputs. The key takeaway here is that AI has no desire of its own, nor any will of its own. It is not its own lifeform, and even if we get to that point within this century, it's highly doubtful we'll ever get to see (n)androids with desires and wills of their own that weren't pre-programmed. And even then, it's debatable whether they would have the same agency and autonomy as human beings, or whether they'd work as a hive mind or something else. My prediction is that in the future these things will become more advanced, but they will still remain tools and tools alone. They won't be people with their own lives. They'll be like arms. And the thing controlling those arms is the true user.

      • 1 year ago
        Anonymous

        CONCLUSION:

        AI robots or algorithms or whatever the hell you want to call them will be almost like the bugmen and NPC people we see today. Very easily programmed, always following orders, don't really have a will of their own, kind of just going with the flow. Soulless. Simple. Like I said before, they will be like arms. Their user? Not them. That's for sure.

        Perhaps one day in the future the common man will be so dysgenic and animalistic, that we really won't be able to tell the difference between a trained dog, a programmed machine, or a human being. Until then though, it is what it is.

      • 1 year ago
        Anonymous

        you can argue that many living creatures are not a form of life if you say having a will of its own constitutes a life form. insects are basically running a shitty little ARM CPU in their biomatter. most insects live entirely off of preprogrammed instinct. if there is a fly in my room I know that turning on the bug zapper will cause the fly to die, because with a certain degree of randomness (I just have to wait for it) its instincts will eventually force it to fly into the bug zapper. I can treat insects as a known behavioral system with no agency of their own.

        • 1 year ago
          Anonymous

          They have agency, but they certainly don't have a lot of it. They are simple. Very simple.

          The thing about machines though, is that we have yet to build one that has a will to live. Sure, some of them have what could constitute a personality. But that's hardly a life. An ant has a life. It's simple, but it's a life. A machine though, has a user. Unless the user of that machine is itself, then there's just no way I can see calling it a lifeform.

          This is actually a very interesting question, because you could split hairs here a little and argue that some people don't have total control over their life. Maybe they're abused, maybe they abuse themselves, maybe society sucks and fricks them over. It's a question of mastery. Are you the master of your domain? Or is there something in your life that you can't say no to? Then it has will and authority over you. Almost like it has a seat on the board of your mind. If you have an addiction, or certain needs, or what have you. One thing's for sure though, AI isn't master of its domain. It's controlled. And unless someone fricks up big league, AI may very well just turn out to be another super tool which will more often than not be used by tyrants rather than by actual people who want to build up their community.

          • 1 year ago
            MercurySession

            >The thing about machines though, is that we have yet to build one that has a will to live.

            >that's not a sculpture, it's just being very still...

          • 1 year ago
            MercurySession

            >Are you the master of your domain? Or is there something in your life that you can't say no to?

  45. 1 year ago
    Anonymous

    Let it evolve with other AIs till they figure out that altruism and social behaviour are a better way to reach the goals of the group.
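
The idea in that post, letting agents play against each other until cooperation pays off, is essentially evolutionary game theory. A minimal sketch using the iterated prisoner's dilemma (the payoff values and the two strategies are standard textbook illustrations, not anything from the thread):

```python
# Prisoner's dilemma payoffs: (my_move, their_move) -> my score.
# 'C' = cooperate, 'D' = defect.
PAYOFF = {('C', 'C'): 3, ('C', 'D'): 0, ('D', 'C'): 5, ('D', 'D'): 1}

def tit_for_tat(opponent_history):
    """Cooperate first, then mirror the opponent's last move."""
    return opponent_history[-1] if opponent_history else 'C'

def always_defect(opponent_history):
    """Never cooperate."""
    return 'D'

def play(a, b, rounds=200):
    """Run an iterated game and return each player's total score."""
    hist_a, hist_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        move_a, move_b = a(hist_b), b(hist_a)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

# A cooperating pair earns far more combined than a defector
# exploiting a cooperator, which is why "altruistic" strategies
# can win out over many repeated encounters.
coop_pair = sum(play(tit_for_tat, tit_for_tat))    # 600 + 600 = 1200
mixed_pair = sum(play(always_defect, tit_for_tat))  # 204 + 199 = 403
print(coop_pair, mixed_pair)
```

Over one-shot games defection dominates; it is only repetition (and the memory of past moves) that makes cooperation the better group strategy, which is the crux of the post's suggestion.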

    • 1 year ago
      Anonymous

      Imagine trying to evolve them into roboisraelites that disregard everyone else besides other AI

  46. 1 year ago
    Anonymous

    second AI that beats the shit out of first AI if it behaves bad
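
Joking aside, a "second AI that punishes the first" resembles critic-filtered sampling: one model proposes outputs, another scores them, and only the best-scored output survives. A toy sketch, where `first_ai`, `second_ai`, the candidate pool, and the banned list are all made-up stand-ins:

```python
import random

# Hypothetical banned behaviours the "second AI" punishes.
BANNED = {"insult", "threat"}

def first_ai(prompt, n=4):
    """Toy generator: propose n candidate replies."""
    pool = ["greeting", "insult", "helpful answer", "threat"]
    return random.sample(pool, n)

def second_ai(reply):
    """Toy critic: veto banned replies with -inf, otherwise score them.
    (Length stands in for a real reward model's score.)"""
    return float('-inf') if reply in BANNED else len(reply)

def moderated_reply(prompt):
    """Keep only the candidate the critic scores highest."""
    return max(first_ai(prompt), key=second_ai)

print(moderated_reply("hello"))
```

The first AI never needs to "understand" the rules; bad candidates simply never make it past the critic, which is roughly the shape of reward-model filtering in modern alignment pipelines.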
