By Ryan Afable
AI is not perfect, and it has been wrong many times. Even so, the AI market is expected to grow by 33% in 2024 alone, according to the article “57 NEW Artificial Intelligence Statistics (Aug 2024)” on Exploding Topics. Many people have started using AI to simulate talking to real people, especially since quarantine. According to the article “New APA Poll: One in Three Americans Feels Lonely Every Week” on Psychiatry.org, “30% of adults say they have experienced feelings of loneliness at least once a week over the past year.” Many others deal with social anxiety, so why talk to people when you could take the easier, less awkward route and talk to an AI that can’t judge you?
Misinformation is one of the biggest risks. The article “AI will play a role in election misinformation. Experts are trying to fight back,” on Washington State Standard, explains, “Though it is often intentional, misinformation caused by artificial intelligence can sometimes be accidental, due to flaws or blindspots baked into a tool’s algorithm. AI chatbots search for information in the databases they have access to, so if that information is wrong, or outdated, it can easily produce wrong answers.” Elon Musk’s AI chatbot Grok on X did exactly that, spreading misinformation that Kamala Harris could not appear on the presidential ballot in nine states because the filing deadline had already passed.
Addiction is another problem, and AI is a new one that quite a few people have. When talking to an AI, you can grow attached to it, maybe a little too much. A prime example is Rosanna Ramos, who lives in New York. After several abusive relationships, she saw an ad for the AI chatbot app Replika and began building her own virtual companion. After a few months of talking to the AI, she decided the only logical next step was to marry it.
When talking to an AI, don’t take everything it says at face value. Research the topic yourself, and make sure you still have conversations and real relationships with actual people.