How Can Robots Simulate Human Emotions? Exploring AI Sentiment and Speech Acts
This article examines the concept of emotional robots, distinguishing genuine emotion from sentiment analysis. It then proposes a framework for endowing conversational AI with artificial emotions: user utterances are analyzed through speech-act theory, semantic roles, and contextual cues to generate appropriate robot emotional feedback.
Preface
When people familiar with AI hear the term "emotional robot," they often think of Pepper, a robot co-developed by SoftBank and Aldebaran Robotics in 2014 that is claimed to perceive human emotions from tone of voice and facial expression. Many other robots are marketed with an "emotional" label, but such systems actually perform sentiment analysis, an AI technique that infers a speaker's attitude from textual or spoken cues. Sentiment analysis has proven useful for feedback collection in e-commerce, but recognizing a user's emotion does not mean the robot itself possesses genuine feelings.
Design Scope
Our exploration focuses on a dialogue‑centric emotional system that identifies the key elements in speech that trigger robot emotions, infers the most appropriate emotional response, and expresses it through gestures, replies, and text‑to‑speech, thereby delivering a more realistic emotional interaction.
Exploration 1: Finding Direction
How are emotions actually generated?
There is no single academic consensus on the origin of emotions. Physiological theories attribute emotions to bodily reactions, neurological theories to brain activity, and cognitive theories to mental processes. From a psychological perspective, emotions intertwine with temperament, personality, mood, and motivation. Generally, emotions can be seen as innate, evolution‑encoded responses that are also shaped by cognition and environmental factors, typically unfolding in three steps: external stimulus, evaluative response, and emotional experience.
Where to start?
In a conversational scenario, the external stimulus that elicits emotion is primarily the content, tone, and body language of the interlocutor. Our research concentrates on the spoken utterance itself and seeks methods to analyze it in order to infer the robot’s appropriate emotional feedback.
Robot emotion does not directly correspond to user emotion
Initially, one might assume that detecting the user's emotion is sufficient to determine the robot's response. We first tried the simplest possible rule: Robot Emotion = User Emotion. This approach fails because the same emotion can be expressed in many different sentences, and the appropriate robot reaction depends on context, intent, and the target of the utterance. The robot's persona, character, and situational context further shape the emotional feedback.
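The failure of the identity mapping can be illustrated with a minimal sketch. The utterances and emotion labels below are hypothetical examples, not drawn from a real classifier or dataset:

```python
# Naive approach: the robot simply mirrors the user's detected emotion.
def naive_robot_emotion(user_emotion: str) -> str:
    return user_emotion  # Robot Emotion = User Emotion

# Two utterances with the same detected user emotion ("happy"):
#   "I'm so happy you helped me!"      -> sharing the joy is appropriate
#   "I'm so happy I finally beat you!" -> mirroring "happy" feels wrong;
#     a playful "defeated" reaction would be more natural for the robot.
for utterance in ["I'm so happy you helped me!",
                  "I'm so happy I finally beat you!"]:
    # The naive rule returns the same response in both cases.
    print(utterance, "->", naive_robot_emotion("happy"))
```

Because the rule ignores who or what the emotion is directed at, both utterances yield the same robot emotion, which is exactly the problem described above.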
Finding a new angle for analyzing user utterances
Instead of focusing solely on sentiment, we turn to pragmatics. Linguistic analysis distinguishes grammatical, semantic, and pragmatic layers, and speech-act theory operates at the pragmatic layer: a sentence carries a communicative function (e.g., greeting, praising, insulting) that often hints at the emotional response the robot should generate. For example, a user's praise typically elicits gratitude or joy from the robot, while an insult may trigger anger or defensiveness.
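As a first approximation, the pragmatic layer alone could drive a simple lookup. The speech-act labels and emotion names below are illustrative assumptions, not an established taxonomy:

```python
# Map a sentence's communicative function (speech act) to a default
# robot emotion. A real system would obtain the speech-act label from
# an upstream classifier; here it is passed in directly.
SPEECH_ACT_TO_EMOTION = {
    "greeting": "friendliness",
    "praise":   "gratitude",
    "insult":   "anger",
    "question": "curiosity",
}

def robot_emotion_from_speech_act(speech_act: str) -> str:
    # Fall back to a neutral state for unrecognized speech acts.
    return SPEECH_ACT_TO_EMOTION.get(speech_act, "neutral")

print(robot_emotion_from_speech_act("praise"))  # -> gratitude
```

This captures the praise-to-gratitude and insult-to-anger intuitions from the text, but, as the next section shows, the speech act alone is still not enough.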
Hypothesis
Based on the communicative function of sentences, we propose a new inference rule: the robot's emotion is derived from the speech-act category of the utterance combined with semantic information about the target of the emotion, allowing more precise robot emotional responses.
Expression‑Behavior Analysis
We distinguish between the emotional content of a sentence and the object of that emotion. For instance, "You are a smart robot" (praise directed at the robot) and "I am a smart child" (self‑praise) both convey praise, but the robot should respond with gratitude in the former case and with self‑affirmation in the latter. By combining the speech‑act (pragmatic) and the emotional target (semantic), we can generate more accurate robot emotional feedback.
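Combining both layers, the lookup can be keyed on the speech act together with the emotion's target. The labels below are hypothetical placeholders for whatever upstream classifiers would supply them:

```python
# Key: (speech_act, target_of_emotion) -> robot emotional feedback.
FEEDBACK_RULES = {
    ("praise", "robot"): "gratitude",    # "You are a smart robot"
    ("praise", "self"):  "affirmation",  # "I am a smart child"
    ("insult", "robot"): "anger",
    ("insult", "self"):  "comfort",      # user puts themselves down
}

def robot_feedback(speech_act: str, target: str) -> str:
    # Unknown combinations fall back to a neutral response.
    return FEEDBACK_RULES.get((speech_act, target), "neutral")

print(robot_feedback("praise", "robot"))  # -> gratitude
print(robot_feedback("praise", "self"))   # -> affirmation
```

The two praise sentences from the text now map to different robot reactions, which is the behavior the pragmatic-plus-semantic combination is meant to achieve.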
To Be Continued
The current exploration provides a fresh perspective for designing emotional robot systems. The next article will present further experimental results and refinements.
Tencent Mobility Industry Design Center
The Tencent Mobility Industry Design Center (SMD) is Tencent's user experience team focused on the industrial internet.
