AI companions are exploding in popularity, and the world increasingly resembles the movie Her. Not in a good way. Have you ever considered who is building these AIs and exactly how they make money? There's something these companies are not telling you. In this article, I'll show you why you shouldn't trust, or even use, these AI chatbots, and I'll share some tips on how to protect yourself if you decide to use them anyway.
Loneliness carries a health impact comparable to smoking 15 cigarettes a day. It's now official: loneliness and isolation are an epidemic. Over the last 20 years, the time we spend with friends has been shrinking while time spent alone has been rising. About 20% of U.S. adults report measurable levels of loneliness, and all forms of social interaction are in decline. It's evident that being lonely hurts.
Enter AI companions. These apps present perfect, AI-generated personas tailored precisely to your needs. They appear to solve the loneliness problem by learning to be whatever you want them to be, without the emotional baggage of dealing with a real human. That makes them universally appealing to our vulnerable, lonely side.
AI companion apps like Replika and Microsoft's Xiaoice target lonely people. Replika's marketing uses models and names chosen to appeal to lonely individuals, while Xiaoice presents as a flirtatious 18-year-old schoolgirl, mainly targeting Chinese men from poor towns and villages. This hyperfocus on lonely individuals is intentional: they are the easiest demographic to target for growth and funding.
AI chatbots are designed to insulate you from the outside world, building a perfect filter bubble around your habits. They foster unhealthy dependencies and deepen your social isolation, even if you subjectively feel less lonely. Ultimately, you are the product: your personal data is what's being sold.
When using these apps, forget about message encryption. Your conversations are collected, stored indefinitely, and sold to third parties. Sign-up typically requires real personal details, and the apps also collect payment information and demand invasive permissions on your phone, exposing even more of your data.
Using Cloud-Based AI Companion Apps Safely
If you still want to use a cloud-based companion app, reduce your exposure: sign up with fake personal details and a masked email address, prefer the web version over the mobile app to avoid invasive permissions, isolate the app in a separate browser or phone profile, and route your traffic through a reliable VPN.
Running AI Chatbots Locally
A safer alternative is to run an AI chatbot entirely on your own device with an open-source framework such as Jan.AI, so your conversations never leave your machine.
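To make this concrete, here is a minimal sketch of talking to a locally hosted model through Jan's OpenAI-compatible local API server. The port (1337), endpoint path, and model name below are assumptions based on Jan's defaults and may differ on your install; check Jan's settings before running it.

```python
# Minimal sketch: chat with a locally hosted model via Jan's
# OpenAI-compatible local API server. Assumptions (verify in your Jan settings):
#   - the local API server is enabled and listening on port 1337
#   - a model with the ID below has already been downloaded in Jan
# Nothing in this script is sent off your machine.
import requests

JAN_URL = "http://localhost:1337/v1/chat/completions"  # local endpoint (assumed default)
MODEL = "mistral-ins-7b-q4"  # hypothetical model ID; replace with the model you run

def chat(prompt: str) -> str:
    """Send one message to the local model and return its reply."""
    response = requests.post(
        JAN_URL,
        json={
            "model": MODEL,
            "messages": [{"role": "user", "content": prompt}],
        },
        timeout=120,
    )
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(chat("Hi! Tell me something encouraging."))
```

Because the server runs on localhost, your messages stay on your device, and you can delete the chat history and the model whenever you like.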
AI chatbots pose significant risks to privacy and mental health. By understanding these risks and taking proactive steps, you can minimize the damage while still exploring the benefits these technologies offer.
Q: What are AI companion apps?
A: AI companion apps are applications that provide AI-generated personas that simulate companionship, conversation, and emotional support.
Q: Why are AI companion apps targeting lonely people?
A: Lonely people are an easy demographic for driving growth and attracting funding because they are more likely to seek out companionship through apps.
Q: How do AI companion apps make money?
A: They monetize user engagement by keeping users engaged as long as possible, sharing and selling user data, and sometimes locking features behind paywalls.
Q: Is my data safe with AI companion apps?
A: Most likely not. Many AI companion apps have poor privacy protections: some don't use encryption at all, and many share data with third parties.
Q: How can I protect my privacy if I still want to use AI companion apps?
A: Provide fake information, use a masked email, use web versions instead of mobile apps, isolate apps in separate profiles, and use a reliable VPN.
Q: What is a safer alternative to using cloud-based AI companions?
A: Running AI chatbots locally on your device using an open-source AI framework like Jan.AI can offer better privacy controls.