AI Chatbots have a bad incentive

A quick little video as an introduction:

friend.com sells itself as a company that provides chatbots for lonely people to talk to, either to feel better about themselves or to serve as a supplementary companion alongside their existing human relationships.

Of course, the only people who will believe this are the most vulnerable people they are trying to prey upon. This is terrible, and once again I am begging you, dear reader: if you are feeling lonely, do anything but talk to an AI chatbot! If you are doing it because you are curious, go ahead, but if you are doing it because you are feeling the painful pangs of loneliness in your heart, do anything else. If you find that you have no one to reach out to, do something to distract yourself: hustle and grind, generate side income streams, go to a new social gathering, binge eat, play a video game, take a walk, pick up a new hobby, learn a new skill, develop a parasocial relationship with your favorite Twitch streamer or niche internet celebrity. Almost anything is better for you than the poison of AI chatbots posing as your friends, sold to you by companies whose main objective function is to generate a profit off of your suffering and vulnerability in your current life circumstances. At least Twitch streamers or your favorite YouTuber are humans (unless you’re into faceless AI video slop).

Know that you always have more strength in you than you could possibly imagine. You can get through these hard times. Sometimes we all forget that we do have free will (don’t let anybody try to convince you otherwise) and that we can just do things.

“The only way to deal with an unfree world is to become so absolutely free that your very existence is an act of rebellion.” - Albert Camus