Anthropic describes Claude's apparent emotion as role-conditioned behavior: the model draws on emotion concepts learned from text, and those internal representations can shape its answers, its code, and its decisions.
“AI models sometimes act like they have emotions—why?” Anthropic's answer is that “it draws on emotion concepts learned from text to inhabit its role as Claude,” and that “these representations influence its behavior,” “affecting how Claude answers chats, writes code, and makes decisions.”