Welcome back, AI prodigies!

In today’s Sunday Special:

  • 📜The Prelude

  • 😱What’s Emotion?

  • 🔍Can AI “Detect” Emotions?

  • 👀How Machines See Feelings

  • 🔑Key Takeaway

Read Time: 7 minutes

🎓Key Terms

  • Deep Learning (DL): Mimics the structure and function of the human brain by processing data through multiple layers of artificial neurons.

  • Principal Component Analysis (PCA): Reduces complex data dimensions into simple principal components that capture variation.

🩺 PULSE CHECK

Can you confidently detect emotions from facial expressions?

Vote Below To View Live Results


📜THE PRELUDE

You adjust your posture: chin centered, shoulders squared, and hands hidden. It’s time for your virtual job interview, but there’s no interviewer.

A set of instructions fills your laptop’s screen:

  • 🔘 STEP #1: You will be given a prompt.

  • 🔘 STEP #2: You have 30 seconds to prepare.

  • 🔘 STEP #3: You will have 2 minutes to respond.

  • 🔘 STEP #4: Your response will be recorded.

You speak carefully, conscious of your cadence, striving to project confidence without cockiness and balance authority with approachability. There’s no interviewer to offer an encouraging nod, supportive glance, or reassuring smile. Just a blinking red timer, counting down with an unyielding rhythm.

Your response is recorded and instantly processed by AI-powered interview analysis software to scan sentiment, assess authenticity, and evaluate engagement. You think your answer was engaging, impactful, and memorable. Yet, you wonder: Was my tone too flat? Was my body too stiff? Did I smile enough?

Job interview performance has always had a subjective component. Now, it’s automatically measured. So, how does AI quantify the emotion conveyed through your tone, pace, and pitch?

😱WHAT’S EMOTION?

⦿ 1️⃣ The Theory of Basic Emotions?

In 1972, American psychologist Paul Ekman developed Basic Emotion Theory (BET), proposing six universal emotions: fear, anger, disgust, surprise, sadness, and happiness. He believed emotions are automatic appraisals influenced by our evolutionary past and personal development. In simple terms, emotions are instinctive impulses triggered by a mix of biological hardwiring and life experiences. So, what inspired his theory?

In 1872, English naturalist Charles Darwin published “The Expression of the Emotions in Man and Animals,” in which he argued that emotional expressions are evolutionary adaptations. In other words, if emotions evolved for our survival (e.g., “fear to avoid danger!”), then the facial expressions associated with those emotions are also biologically hardwired and should be widely shared across cultures and societies.

⦿ 2️⃣ The Fore People?

To test this theory, Ekman traveled to Papua New Guinea to photograph the Fore people of the Eastern Highlands, a relatively isolated indigenous society with minimal exposure to Western culture. He shared short stories that corresponded to specific emotions; for example, the loss of a loved one evoked sadness. He then photographed the Fore people’s facial expressions as they reacted to these short stories.

The photographed facial expressions were shown to 316 American college students, who identified the Fore people’s emotions with remarkable consistency:

  • 😡 Anger: Correctly identified about 84% of the time.

  • 🤢 Disgust: Correctly identified about 83% of the time.

  • 🥶 Sadness: Correctly identified about 79% of the time.

  • 😁 Happiness: Correctly identified about 92% of the time.

Ekman helped overturn the long-held belief that emotional facial expressions are learned behaviors that vary from culture to culture. He demonstrated that while “display rules” (i.e., when it’s appropriate to show emotion) are culturally learned, the “motor programs” (i.e., the muscle movements of the face) are biologically hardwired.

Ekman’s findings led to the discovery of micro-expressions: fleeting facial movements that can reveal concealed emotions. Today, micro-expression analysis is widely taught at law enforcement agencies like the FBI to help detect deception in domestic intelligence work. In fact, AI emotion recognition relies heavily on BET because it assumes stable, cross-cultural mappings between facial muscle patterns and discrete emotional states.

🔍CAN AI “DETECT” EMOTIONS?

⦿ 3️⃣ AI Beats Humans at Reading Emotions?

Swiss psychologists at the University of Bern (UoB) recently discovered that frontier AI models are better than humans at recognizing and responding to emotional cues, identifying emotion-related behavioral patterns through observable expressions such as facial cues or vocal bursts.

Being able to communicate your emotions is crucial for developing and maintaining social bonds. Individuals with the ability to recognize, understand, express, and respond to emotions achieve better outcomes across life domains. For example, individuals with higher Emotional Intelligence (EI) are perceived as warmer and more competent during workplace conflicts. As frontier AI models become increasingly integrated into our everyday lives, it’s essential that they’re not only functionally efficient but also emotionally intelligent.

⦿ 4️⃣ The Emotion-Mimicking Machines?

The researchers examined the ability of six frontier AI models to recognize and respond to emotional cues across five standard EI Tests. Surprisingly, all six frontier AI models outperformed humans across all five standard EI Tests: OpenAI’s “o1,” Google’s “Gemini 1.5 Flash,” and Anthropic’s “Claude 3.5 Haiku” scored an average of about 81% accuracy, while humans averaged about 56%.

So, what’s going on here? In a nutshell, frontier AI models are trained on vast, high-quality datasets of books, articles, movies, and podcasts rich in emotional language and emotional reasoning. While frontier AI models can’t “feel,” they can mimic emotional understanding by recognizing the data-driven patterns that underpin emotional scenarios, enabling them to predict appropriate emotional reactions better than humans on average.
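
To see what “taking an EI test” looks like in practice, here’s a minimal sketch of how a test item might be posed to a frontier model, assuming the official OpenAI Python SDK with an API key configured. The vignette and answer options below are illustrative inventions, not items from the study.

```python
# A sketch of posing one EI-style test item to a frontier model. The
# vignette below is an illustrative invention, not an item from the study.
# Assumes the official OpenAI Python SDK and an OPENAI_API_KEY env variable.
from openai import OpenAI

client = OpenAI()

ITEM = """Maya's coworker takes credit for her idea in a team meeting.
Which emotion is Maya most likely to feel?
(a) surprise  (b) anger  (c) sadness  (d) fear
Answer with a single letter."""

response = client.chat.completions.create(
    model="o1",  # any frontier chat model could be substituted here
    messages=[{"role": "user", "content": ITEM}],
)
print(response.choices[0].message.content)  # e.g., "b"
```

The real test items are standardized and scored against consensus answers; the point here is simply that the entire “test” reduces to text in, text out.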

👀HOW MACHINES SEE FEELINGS

⦿ 5️⃣ What It Can Look Like?

Noldus’s AI-enabled facial recognition software gives a complete, contact-free view of facial expressions for behavioral research. The latest version, FaceReader 10, accurately analyzes and assesses emotions about 88% of the time. Here’s how it performs across the six universal emotions, plus contempt:

  • 😨 Fear: 82%

  • 😡 Anger: 76%

  • 🤢 Disgust: 92%

  • 😳 Surprise: 94%

  • 🥶 Sadness: 86%

  • 🙄 Contempt: 80%

  • 😁 Happiness: 96%

⦿ 6️⃣ How It All Works?

  1. 🔴 Facial Mapping:

    • An advertising agency shows roughly 300 participants a commercial and records their reactions via FaceReader 10’s webcam. These recorded reactions are processed by a DL Model that applies 468 Facial Landmarks, specific 3D coordinate points like {-0.349, 0.523, 0.149}, to key facial features such as lip corners, cheek raises, and eyebrow lifts. (A toy version of this whole pipeline appears after this list.)

  2. 🟠 Facial Compression:

    • Each micro-movement constantly shifts the Facial Landmarks. For example, a slight smile is recorded as hundreds of numerical shifts across the facial map. For context, at 30 Frames Per Second (FPS) for 60 seconds, with 468 Facial Landmarks tracked per frame, around 842,400 landmark positions (30 × 60 × 468) are recorded for each participant.

    • Tracking every numerical shift independently is extremely inefficient. Instead, FaceReader 10 leverages PCA to identify coordinated patterns that tend to move together. When a participant slightly smiles, PCA compresses everything into a single compact vector: a short list of numbers the machine can work with. For example, the concept of a “slight smile” might be represented by numbers like “{0.23, -1.51, 0.07, 0.54, ...}.”

  3. 🟡 Classifying New Faces:

    • For each frame, FaceReader 10 generates a probability score between 0 and 1 for the likelihood of an emotion. A probability score of “{0.82}” for happiness means there’s an “82% chance” that the participant feels happy. In this case, the advertising agency might calculate peak happiness and average happiness right after product reveals.
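
FaceReader 10’s internals are proprietary, so the following is only a toy sketch of those three steps, assuming synthetic landmark data in place of webcam frames and scikit-learn’s PCA and logistic regression in place of Noldus’s trained models. Every number and label in it is illustrative.

```python
# Toy sketch of the FaceReader-style pipeline above. All data is synthetic:
# random numbers stand in for webcam-derived landmarks, and the labels the
# classifier learns from are random stand-ins for human-annotated emotions.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(seed=42)

FPS, SECONDS, LANDMARKS = 30, 60, 468
frames = FPS * SECONDS  # 1,800 frames; 1,800 x 468 = 842,400 tracked landmarks

# Step 1 (Facial Mapping): each frame flattens 468 (x, y, z) landmarks
# into a single 1,404-number vector.
X = rng.normal(size=(frames, LANDMARKS * 3))

# Step 2 (Facial Compression): PCA collapses coordinated movements
# (e.g., both lip corners rising together) into 20 components per frame.
pca = PCA(n_components=20)
X_compressed = pca.fit_transform(X)  # shape: (1800, 20)

# Step 3 (Classifying New Faces): a classifier maps each compressed frame
# to a probability score between 0 and 1.
y = rng.integers(0, 2, size=frames)  # 1 = "happy", 0 = "not happy"
clf = LogisticRegression().fit(X_compressed, y)

happiness = clf.predict_proba(X_compressed)[:, 1]  # per-frame score in [0, 1]
print(f"Peak happiness:    {happiness.max():.2f}")
print(f"Average happiness: {happiness.mean():.2f}")
```

In a real pipeline, the classifier would be trained on large datasets of faces annotated with emotion labels, and the compressed representation might come from a learned encoder rather than plain PCA.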

🔑KEY TAKEAWAY

“Emotion AI,” also known as affective computing, dates back to at least 1995. We’ve always aimed to achieve more natural interactions between humans and machines. Today, our emotions aren’t just “felt”; they’re quantifiable signals waiting to be interpreted by machines. We’re witnessing the boundary between human intuition and artificial perception blur.

📒FINAL NOTE

FEEDBACK

How would you rate today’s email?

It helps us improve the content for you!


❤️TAIP Review of The Week

“The world of AI feels a lot more accessible now.”

-Ezra (1️⃣ 👍Nailed it!)

REFER & EARN

🎉Your Friends Learn, You Earn!

{{rp_personalized_text}}

Share your unique referral link: {{rp_refer_url}}