Two studies conducted by OpenAI and the MIT Media Lab found that a small number of test subjects who used ChatGPT extensively reported increased loneliness and emotional dependence, and decreased social interaction. In other words, the studies suggest that lonely people are more likely to seek an emotional connection with an AI chatbot. This says a lot about how people approach relationships, how dependent on technology we have become, and how deeply we are integrating it into aspects of our lives beyond simply completing tasks. It also raises the question of how we will interact with chatbots in the future and what impact that will have on us.

In the first study, the OpenAI team analyzed more than 4 million ChatGPT conversations from 4,076 participating users who voluntarily reported how they felt about the service. In the second, researchers at the MIT Media Lab had 981 people use ChatGPT for at least five minutes a day for four weeks. The researchers then surveyed the participants about their perceptions of ChatGPT, their loneliness and sense of connectedness in the real world, the social interactions they took part in, and whether they saw any problems with using AI services.

If you didn’t immediately catch this: OpenAI developed and sells ChatGPT. So, yes, this is a fairly self-aware move by the company to examine whether its product is having a negative impact on its target audience, and whether it can learn anything to keep those effects from getting worse.

From both studies (neither of which has yet been peer-reviewed), the researchers found that most people don’t develop a deep emotional connection with ChatGPT, even among some of the most frequent users of its realistic advanced voice mode (where you can have a fairly natural back-and-forth conversation with the bot). The studies did find a modest correlation between using ChatGPT for “private” conversations and loneliness.
At the same time, that type of use was associated with lower levels of emotional attachment, so the picture is a mixed one. As Casey Newton writes in his Platformer newsletter, it’s possible that “sufficiently compelling chatbots could pull people away from their relationships, potentially making them feel lonelier and more dependent on artificial companions whom they must pay to maintain contact.”

Deeper, more specific research is needed to get a clearer picture of how continued use of such services affects people’s well-being. But some companies are already tapping into the human interest in, and need for, connection, and AI companions offer a way to feel like you’re making one.

This is not to say that AI chatbots are bad for us in every way. For some people, they can provide a meaningful way to ease loneliness and a private space to express and reflect on what they’re going through. The studies suggest, however, that platforms need to develop bots more responsibly, staying aware of how much emotional energy people invest in connecting with them. At the same time, regulators need to develop frameworks that prevent companies from taking advantage of deeply engaged users and that encourage companies building AI systems to actively prioritize the well-being of their audiences.