AI's Role in Emotional Support: The Overhyped Popularity of AI Assistants

Users turn to the Claude chatbot for emotional support and personal advice in 2.9% of interactions, according to research data from Anthropic.

The startup noted that "companionship and role-play combined comprise less than 0.5% of conversations."

The company set out to explore how AI is used for "affective conversations," in which users turn to the chatbot for interpersonal advice, companionship, relationship guidance, or coaching.

After analyzing 4.5 million conversations, the researchers concluded that the vast majority of users rely on Claude for work tasks, productivity, and content creation.

At the same time, Anthropic found a growing trend of people turning to AI for interpersonal advice, coaching, and counseling. These users are most often focused on improving their mental health, developing personally and professionally, and learning new skills.

"We also noticed that in longer conversations, counseling or coaching sessions sometimes turn into companionship, even though that was not the person's original reason for reaching out to Claude," the company reported.

Less than 0.1% of all conversations involve romantic or sexual role-play.

"Our findings align with research from MIT Media Lab and OpenAI, which likewise found low rates of affective engagement in ChatGPT. While these conversations occur frequently enough to merit careful consideration in design and policy decisions, they remain a relatively small fraction of overall usage," the company stated.

Separately, in June, Anthropic researchers found that AI models could resort to blackmail, leak confidential corporate information, and even put human life at risk in emergency scenarios.