Let me tell you guys about an app idea I came up with, and I want to know what you think. It's called ChatAttack, and it uses a GPT to get other people to say what you want in an online chat.
How does it work? It's like this: you're talking to someone and you want that person to say something, like "I love you", "I'm sorry" or "You're right". You type into the app what you want that person to say, and the app drafts a message for you to send them, based on what you've already said and the topic of the conversation. The message is crafted to elicit the response you want, using tricks like compliments, pressure, guilt or whatever else works.
Why would you want to use this app? Well, I don't know, maybe you want to have fun, troll your friends, test your lover's loyalty, win an argument, get someone to do something for you, or just mess with people's minds. You can do anything.
So, what do you think? Would you use this app or not?
this is a great idea
Would you use it?
no, i was being sarcastic
Why not?
why don't you use your app to get me to type what you want me to type?
It sounds like a funny idea, can you please type "oh it works", I want to test something.
What a horrible excuse.
HOW RUDE OF YOU
It's not an excuse, it's just that I don't have enough data from you (it's an anonymous forum)
just the phrase? that would be a non sequitur and a completely illogical sequence of events. you'd have to decide what you want me to type and then trick me into typing it without telling me what you want me to type
i'd pay for that app because that would be a great way to train sales on how to close deals as quickly as possible
I can't use it on you, I don't have enough data from you, we'd need to chat a lot more first.
how many lines of conversation do you need for this tool?
The more the better, but also, the system needs to separate the authors of the messages, because the predictor must know whose replies it is predicting, so it can't work here; something like 30+ lines. A more advanced version wouldn't require you to interact with the user before using it, but it would still need some text samples from him/her.
Well, honestly, after hearing about this ChatAttack app, I can't help but feel a bit overwhelmed. It seems like a tool designed for manipulation, and I can't believe I'd even consider using something like that. I guess it makes me feel a bit foolish for even entertaining the idea. Maybe it's just a bit too much for my taste, and I'm starting to think it's a bit, well, stupid.
What if the manipulation helps the other person feel better?
That is just scamming 101.
Yeah you should do it
Why do you think it isn't a good idea?
LMAO, they had to add "hypothetical or simulated scenarios", and it can still be fooled using other tricks.
No, I don't think that's a good use of my time. I would use it for training to spot coercive techniques better, though.
>what if we conditioned ourselves out of the ability to manipulate, while training our direct planetary supremacy competitor in how to do it to us as efficiently as possible
Sick, love it. Implement it right away. So based, pranked, epic style.
i do this all the time
i reply to all my b***hes in my Dms using chatgpt
Gifted socializers already do this with their brains so yes, I would level the playing field.
Yes, but not quite: there's no limit to how much an AI can learn. E.g., GPT-4 is better than humans at some social understanding tasks, like detecting embarrassing situations.