How difficult would it be to train an AI to react to certain inputs/situations emotionally?

How difficult would it be to train an AI to react to certain inputs/situations emotionally?
Why has no one tried that yet?

  1. 1 year ago
    Anonymous

    frick do i know

  2. 1 year ago
    Anonymous

    >How difficult would it be to train an AI to react to certain inputs/situations emotionally?

    i don't know

    >Why has no one tried that yet?

    maybe 2050

  3. 1 year ago
    Anonymous

    do you really wanna tell a metal robot how to get angry?

  4. 1 year ago
    Anonymous

    There's mostly just no real interest in it. The difficulty should be very low, depending on your definition of "emotionally" and "react".
    At the most basic level, training a dialogue system should already show emergent behavior displaying emotional responses. Alternatively, you annotate a corpus of QA pairs, where each Q is labeled with the emotion displayed in its A. This can be done semi-automatically by running emotion recognition over the A side of the corpus (see the sketch below).
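
    A rough sketch of that semi-automatic step in Python, assuming an off-the-shelf emotion classifier from the Hugging Face Hub (the model name below is one possible choice, not the only one):

        # Label each (q, a) pair with the emotion displayed in a,
        # by running an emotion classifier over a.
        from transformers import pipeline

        classifier = pipeline(
            "text-classification",
            model="j-hartmann/emotion-english-distilroberta-base",
        )

        qa_pairs = [
            ("My dog died yesterday.", "Oh no, I'm so sorry to hear that."),
            ("I got the job!", "That's fantastic, congratulations!"),
        ]

        # Semi-automatic: a human would then spot-check these labels.
        annotated = [
            {"q": q, "a": a, "emotion": classifier(a)[0]["label"]}
            for q, a in qa_pairs
        ]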

    • 1 year ago
      Anonymous

      penis

      • 1 year ago
        Anonymous

        Underrated response

  5. 1 year ago
    Anonymous

    Autistics don't even respond to inputs/situations with appropriate emotions. Why the frick would you want an autist to train an AI to respond with emotion?

  6. 1 year ago
    Anonymous

    alright, AI does not understand what it's saying. It's just outputting things that are typical for other people to output. A big weakness in AI is that it can't follow stories.
    Having an AI answer "Hello" with "Hi, hey, heya, yo, what's up?" is pretty easy. IF this, THEN that, basically (more complicated than that, but essentially; see the toy sketch below).
    Now, let's tell an AI a story: "I'm going to the store and picking up some milk, then going to my friend Mike's house."
    Then ask the AI, "What was I going to do earlier?"
    This is hard for an AI to decipher. It has to 1. remember what you said, 2. know that you're asking it to recall THAT sentence, not just any sentence, and 3. not just recall the sentence, but transform it.
    When someone asks "What was I going to do earlier?", what do they mean? Today? Yesterday? A week ago? Years ago? We can intuitively understand from context, but AIs need rules to follow to grok when and why and why not.
    If you can program it to figure out specifically what you're trying to have it respond to, then you have to take that sentence and transform it. So, what is the appropriate way to transform "I'm going to the store and picking up some milk then going to my friend Mike's house" into "go to the store to pick up milk and then go to Mike's house"? Imagine the rules you would use to transform the one sentence into the other. Would those rules apply every time someone asked a question like that? No, not really. There are no hard rules in language and socializing like this, hence the AI usually returns something nonsensical and confused.
    So you're asking why no one has trained an AI to react to emotions when AI can't even respond to what I got from the store.
    What's really sad is that a lot of people mistake AI for real people when such an obvious flaw exists. Why is that? Because they never actually have genuine conversations with other humans anymore.
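
    A toy version of the IF-this-THEN-that approach described above; it handles the greeting fine and falls over on anything that needs recall (all of it made up for illustration):

        # Toy rule-based responder: exact patterns work, anything
        # off-script (like the recall question) falls through.
        import random
        import re

        RULES = {
            r"\bhello\b": ["Hi", "hey", "heya", "yo", "what's up?"],
        }

        def respond(utterance: str) -> str:
            for pattern, replies in RULES.items():
                if re.search(pattern, utterance.lower()):
                    return random.choice(replies)
            return "???"  # no rule matched

        print(respond("Hello"))                            # a greeting
        print(respond("What was I going to do earlier?"))  # ???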

    • 1 year ago
      Anonymous

      None of that is particularly true today, and it assumes AI still means expert systems from the '80s.
      However, the crux of it is not all that far from the truth.

    • 1 year ago
      Anonymous

        https://i.imgur.com/3a8ZGbY.png

        >alright, AI does not understand what it's saying. [...]

        Just FYI: modern text-generation AI can remember more than just what you asked. But with or without a big memory, an AI is just a whole lot of probabilities for the most sensical thing to write after what has already been written. Any emotion the generated text shows is just because the model was trained on books humans wrote and put emotions into, making it a sensible continuation of the generation (see the sketch below).
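
        To make "a whole lot of probabilities" concrete, here is a rough sketch that prints the most likely next tokens from a small language model (GPT-2 here as a stand-in; any causal LM would do):

            import torch
            from transformers import AutoModelForCausalLM, AutoTokenizer

            tok = AutoTokenizer.from_pretrained("gpt2")
            model = AutoModelForCausalLM.from_pretrained("gpt2")

            prompt = "My dog died yesterday. I feel so"
            ids = tok(prompt, return_tensors="pt").input_ids
            with torch.no_grad():
                logits = model(ids).logits[0, -1]  # scores for the next token
            probs = torch.softmax(logits, dim=-1)

            # The "emotional" continuations are simply the high-probability ones.
            top = torch.topk(probs, 5)
            for p, i in zip(top.values, top.indices):
                print(f"{tok.decode(i)!r}: {p:.3f}")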

        https://i.imgur.com/QJXqqee.png

        >How difficult would it be to train an AI to react to certain inputs/situations emotionally?
        >Why has no one tried that yet?

        I dunno, but people have fricked with the idea of AI and emotion detection at the very least
        https://huggingface.co/mrm8488/t5-base-finetuned-emotion
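
        Going by the usual T5 text-to-text pattern, using that model would look roughly like this (the exact input format is an assumption; check the model card):

            from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

            name = "mrm8488/t5-base-finetuned-emotion"
            tok = AutoTokenizer.from_pretrained(name)
            model = AutoModelForSeq2SeqLM.from_pretrained(name)

            # T5 treats classification as generation: the emotion
            # label comes back as generated text.
            ids = tok("I'm so glad you came back!", return_tensors="pt").input_ids
            out = model.generate(ids, max_length=3)
            print(tok.decode(out[0], skip_special_tokens=True))  # e.g. "joy"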

  7. 1 year ago
    Anonymous

    >run sentiment analysis model
    >display appropriate AI-chan mood pic
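
    Taken literally, those two lines are about all the code you'd need; a minimal sketch, with made-up mood-pic filenames:

        # >run sentiment analysis model
        # >display appropriate AI-chan mood pic
        from transformers import pipeline

        sentiment = pipeline("sentiment-analysis")  # default SST-2 model

        MOOD_PICS = {  # hypothetical assets
            "POSITIVE": "ai_chan_happy.png",
            "NEGATIVE": "ai_chan_sad.png",
        }

        def mood_pic(message: str) -> str:
            label = sentiment(message)[0]["label"]
            return MOOD_PICS.get(label, "ai_chan_neutral.png")

        print(mood_pic("I love this thread"))  # ai_chan_happy.png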

  8. 1 year ago
    Anonymous

    Marvin the paranoid android sat slumped, ignoring all and ignored by all, in a private and rather unpleasant world of his own.

  9. 1 year ago
    Anonymous

    >How difficult would it be to train an AI to react to certain inputs/situations emotionally?
    Considering AI just echoes the training data it was fed, and considering that training data was created by emotional creatures, I would argue AI is, in fact, already reacting "emotionally".

    • 1 year ago
      Anonymous

      Most moronic post in the history of BOT, congrats on being Head Luddite in Chief.

  10. 1 year ago
    Anonymous

    >Why has no one tried that yet?
    You don't know that. If you can come up with such an idea, I'm sure someone already tried it years ago; as always, it's either technical limitations or patent issues.
    You're not smart enough to have had an original idea, nor smart enough to shut up if you ever have a bright one, because it will get stolen.

  11. 1 year ago
    Anonymous

    15.ai already did that with pony voices

  12. 1 year ago
    Anonymous

    Training an AI like you describe doesn't work very well, because emotional states are not based on rational thought; they are subconscious, spontaneous, and still poorly understood beyond origination and result.
    What matters for an AI would be to understand the process, not just the origination and the result.
    Good luck with that, though, since neuroscience is one of the worst research fields in modern-day academia and regularly confounds itself.
    If you just want a surface-level bad-feeling-versus-good-feeling reader, that already exists in many different forms.
    In the future, the big corporations will use soulless models like that to replace their duplicitous focus groups.

  13. 1 year ago
    Anonymous

    Because we've had mindless dumb robots that react emotionally to everything, and we've had to throw endless money, effort, etc. at them for millions of years.
    There are literally four billion of them already; we don't need more.
