OpenAI and MIT find a link between loneliness and ChatGPT usage
Increased use of AI-powered chatbots such as ChatGPT may be associated with heightened feelings of loneliness among users, according to a joint study by OpenAI and the Massachusetts Institute of Technology (MIT).
The researchers conducted two month-long studies involving about 1,000 participants and examined their interactions with the chatbot. Some participants engaged in open-ended conversations on any topic, while others held personal, emotional discussions.
The results showed that those who communicated frequently with the chatbot, whether via text or voice, reported increased emotional dependence on their "digital companion" and higher levels of loneliness.
An analysis of 3 million conversations, combined with user surveys, suggested that only a small percentage of users had emotional conversations with the chatbot.
Heavier users of the chatbot were also more likely to anthropomorphize it, regarding the AI as a "friend" or attributing human-like emotions to it. Loneliness was particularly pronounced among users who set the chatbot's voice mode to the opposite gender.
However, the researchers cautioned that the relationship poses a "chicken-or-egg" problem: it remains unclear whether heavy chatbot use contributes to loneliness, or whether lonely individuals are more likely to seek out AI companions for emotional support.
In summary, the study revealed a correlation between heavy chatbot use and loneliness, emotional dependence, and reduced social interaction, though causality has yet to be established. The scientists hope the research will prompt further exploration of human-AI interaction.
The findings arrive amid growing commercial interest in AI companionship: in July 2024, entrepreneur Avi Schiffmann launched Friend, a wearable device with built-in AI aimed at combating loneliness.