Two studies conducted by OpenAI and the MIT Media Lab found that a small number of participants who used ChatGPT extensively reported increased loneliness and emotional dependence, along with decreased social interaction. Put another way, the findings suggest that lonely people are more likely to seek emotional connection with an AI chatbot. This says a lot about how people approach relationships, how dependent on technology we are becoming, and how deeply we are integrating it into our lives beyond simply completing tasks. It also raises the question of how we will interact with chatbots in the future, and what impact that will have on us.

In the OpenAI study, the team analyzed more than 4 million ChatGPT conversations from 4,076 participating users who voluntarily reported how they felt about the service. In the other study, researchers at the MIT Media Lab had 981 people use ChatGPT for at least five minutes a day for four weeks. The researchers then surveyed participants about their perceptions of ChatGPT, their loneliness and sense of connection in the real world, the social interactions they took part in, and whether they saw any problems with using AI services.

If it wasn't immediately obvious: OpenAI developed and sells ChatGPT. So yes, this is a fairly self-aware move by the company, examining whether its own product is harming its target audience and whether it can learn anything to keep those effects from getting worse.

Across both studies (neither of which has yet been peer-reviewed), the researchers found that most people do not develop a deep emotional connection with ChatGPT, even among some of the heaviest users of its realistic advanced voice mode (which allows a fairly natural back-and-forth conversation with the bot). The studies found a modest correlation between using ChatGPT for "private" conversations and loneliness.
At the same time, this type of use was associated with lower levels of emotional attachment, so the results are a mixed bag. As Casey Newton writes in his Platformer newsletter, it is possible that "sufficiently compelling chatbots could pull people away from their relationships, potentially making them feel lonelier and more dependent on artificial companions whom they must pay to maintain contact." Deeper, more targeted research is needed to get a clearer picture of how continued use of such services affects well-being. But some companies are already tapping into the human need for connection, and AI companions offer a way to feel as though you are making one.

None of this means AI chatbots are bad for us in every respect. For some people, they can meaningfully ease loneliness and provide a private space to express and reflect on what they are going through. The studies do suggest, however, that platforms need to develop these bots more responsibly, mindful of how much emotional energy people invest in connecting with them. Regulators, for their part, need frameworks that prevent companies from taking advantage of deeply engaged users and that encourage AI developers to actively prioritize the well-being of their audiences.