OpenAI warns ChatGPT users against getting emotionally attached to the chatbot
Aug 11, 2024
OpenAI recently published a report suggesting that ChatGPT users who rely on the new Voice Mode may form emotional bonds with the AI chatbot.
A safety review of GPT-4o found that ChatGPT Voice Mode users might “form social relationships with the AI” and seek companionship from it. The findings were published in a report titled “GPT-4o System Card”, which outlines the safety work OpenAI carried out before releasing GPT-4o to the general public.
While the safety challenges identified by OpenAI include risks such as the model producing erotic, violent, disallowed, or biased content, one identified risk is that users might “form social relationships with the AI” and thereby reduce their need for human interaction.