🧠 Are you just a brain in a meat suit?
PLUS: How Embodied AI Makes Robots More Human

Welcome back AI prodigies!
In today’s Sunday Special:
📜The Prelude
💪What’s Embodied Cognition?
🦾What’s Embodied AI?
🪞Can Embodied AI = Embodied Cognition?
🔑Key Takeaway
Read Time: 7 minutes
🎓Key Terms
Humanoid Robots: Robots designed to resemble the human body in form, feel, and function.
Embodied AI: AI that takes physical form in robots, drones, or self-driving cars to interact with the real world.
🩺 PULSE CHECK
As AI gains physical form, what worries you? Vote Below to View Live Results.
📜THE PRELUDE
What if you woke up fully conscious, with every thought clear, rational, and detailed, yet found yourself trapped within a motionless shell? Unable to move, touch, or balance, your mind became a restless prisoner, alive and aware but painfully disconnected from reality.
For centuries, we viewed human intelligence as an abstract concept residing solely within the brain. However, our everyday experiences suggest otherwise. For instance, when you catch a falling cup, your hand moves before your mind “decides” to act, revealing how tightly intertwined our brain and body truly are.
Advances in robotics are beginning to mirror this connection, with AI-powered humanoid robots learning to see, act, and do just like humans. As technological advancements blur the boundary between mind and machine, we’re compelled to ask: “Is human intelligence something that can be programmed, or is it fundamentally rooted in the inseparable dance between body, brain, and soul?”
💪WHAT’S EMBODIED COGNITION?
⦿ 1️⃣ Mind = Machine?
In the mid-20th century, cognitive scientists supported the Information Processing Theory (IPT), which proposed that humans record, store, and retrieve information in a manner similar to that of computers.
We often acquire information through our five main senses (i.e., sight, smell, sound, taste, and touch), encode this information into memory, and later retrieve it to guide our decisions. Similarly, computers take in several types of inputs (e.g., mouse, keyboard, or microphone), process these inputs using algorithms, and produce relevant outputs. This comparison implies that the brain depends on our five main senses to receive information about the world, rather than experiencing the world directly.
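To make the brain-as-computer analogy concrete, here’s a tiny illustrative sketch of the record, store, and retrieve loop that IPT describes. Every name here (`encode`, `retrieve`, `decide`) is made up purely for illustration, not taken from any real system.

```python
# Toy illustration of the Information Processing Theory analogy:
# sense -> encode into memory -> retrieve later -> produce an output.
# All names below are invented purely for illustration.

memory: dict[str, str] = {}

def encode(sense: str, observation: str) -> None:
    """Record what a sense picked up, like writing an input to memory."""
    memory[sense] = observation

def retrieve(sense: str) -> str | None:
    """Look the stored observation back up, like reading saved data."""
    return memory.get(sense)

def decide() -> str:
    """Turn retrieved information into an output (a decision)."""
    if retrieve("sight") == "cup is falling":
        return "reach out and catch it"
    return "carry on"

encode("sight", "cup is falling")  # input from the senses
print(decide())                    # output: reach out and catch it
```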
French phenomenological philosopher Maurice Merleau-Ponty challenged this comparison, arguing that it assumes there’s a clear line between “me” and “the world around me.” In other words, it’s a mistake to imagine that you’re completely separate from the world you experience.
Instead, he argued that our perception is shaped by our bodily presence in the world, meaning we’re deeply intertwined with our surroundings. He formalized this idea into a theory: human consciousness isn’t a spectator of the world; instead, it dwells within it.
For example, when you drive a car, it gradually stops feeling like a separate object; in a way, it starts to feel like an extension of your body. When parallel parking, you instinctively feel how close the tires are to the curb, not through conscious measurement, but through bodily awareness.
⦿ 2️⃣ Thinking Through Metaphors.
Several decades later, American cognitive linguist George Lakoff expanded this theory by exploring the Conceptual Metaphor, which helps us understand one idea by comparing it to another. For instance, when we “see” someone’s point, “carry” emotional baggage, or “run” out of time, we’re trying to understand abstract ideas by relating them to the familiar physical actions of “see,” “carry,” and “run.”
As a result, we often understand abstract ideas through physical experiences. When people imagine physical actions, the same brain regions used for movement are activated. For example, when you imagine kicking a ball, it triggers activity in the motor cortex. This suggests that thinking is a form of simulating physical action.
⦿ 3️⃣ Knowledge Beneath Words.
In 1958, Hungarian-British polymath Michael Polanyi introduced the concept of Tacit Knowledge: skills we possess but struggle to put into words. We might know how to shoot a basketball or play an instrument without being able to fully explain how we do it.
For example, seasoned surgeons and professional golfers show poorer performance when forced to verbalize the processes they usually perform implicitly, a phenomenon known as verbal overshadowing.
⦿ 4️⃣ So, What?
We possess three commonly overlooked forms of human intelligence:
🤗 Feeling: Our body affects how we experience the world.
🤔 Thinking: We think by mentally rehearsing actions.
🧐 Knowing: We know things we can’t always explain.
So, the big question is: “Can AI mimic these three deeply human abilities?”
🦾WHAT’S EMBODIED AI?
⦿ 5️⃣ From AI to Action?
Embodied AI gives AI a physical form, enabling it to interact with the real world. A prime example of this new frontier is Tesla’s Robotaxi: an autonomous ride-hailing service powered by self-driving cars. So, how does it work?
⦿ 6️⃣ Tesla’s AI-Powered Robotaxi.
Tesla Vision utilizes a network of cameras to provide real-time video feeds of the Robotaxi’s surroundings. These real-time video feeds are processed with Computer Vision (CV) to do the following (a simplified sketch of this loop appears after the list):
Detect objects like cars, cyclists, and pedestrians.
Identify road rules like traffic lights, stop signs, and lane lines.
Estimate the distance (i.e., depth) and speed (i.e., velocity) of objects relative to the Robotaxi.
Predict what might happen next (e.g., “The cyclist might merge into your lane!”).
Generate a safe path (e.g., “Change lanes to avoid the cyclist!”).
Implement real-world driving commands (e.g., “Apply brakes, adjust the steering wheel, and signal a lane change!”).
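Strung together, those steps form a classic perceive, predict, plan, and act loop. Below is a heavily simplified sketch of one tick of that loop; every class name, threshold, and maneuver is invented for illustration and is not Tesla’s actual software.

```python
from dataclasses import dataclass

# A simplified perceive -> predict -> plan -> act loop, loosely mirroring
# the steps above. All names and numbers are invented for illustration;
# this is NOT Tesla's actual Robotaxi stack.

@dataclass
class DetectedObject:
    kind: str            # "car", "cyclist", "pedestrian", ...
    distance_m: float    # estimated distance ahead, in meters
    closing_mps: float   # speed at which the gap is shrinking, in m/s

def predict(obj: DetectedObject) -> str:
    """Guess what the detected object might do next."""
    if obj.kind == "cyclist" and obj.distance_m < 20:
        return "may merge into our lane"
    return "likely stays on course"

def plan(obj: DetectedObject, prediction: str) -> str:
    """Generate a safe maneuver based on the prediction."""
    if "merge" in prediction:
        return "change lanes to avoid the cyclist"
    if obj.distance_m / max(obj.closing_mps, 0.1) < 2.0:  # under 2 s to contact
        return "apply brakes"
    return "keep lane"

def act(maneuver: str) -> None:
    """Translate the plan into a driving command."""
    print(f"Command: {maneuver}")

# One tick of the loop: a cyclist detected 15 m ahead, closing at 4 m/s.
cyclist = DetectedObject(kind="cyclist", distance_m=15.0, closing_mps=4.0)
act(plan(cyclist, predict(cyclist)))  # -> Command: change lanes to avoid the cyclist
```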
🪞CAN EMBODIED AI = EMBODIED COGNITION?
⦿ 7️⃣ Humanoid Robots With Human Intelligence?
Figure AI is actively exploring how to give Figure 03 (F.03) the same internal sense of bodily awareness as humans.
They’ve already leveraged principles from a Generalized Multisensory Correlational Model (GeMuCo), which helps humanoid robots build a basic awareness of their own body by constantly comparing what they expect to feel with what they actually feel. For example, each time F.03 moves a robotic limb, it measures the gap between the predicted sensation and the real one. These measurements help it build an internal sense of how physics, friction, and velocity shape its own movements.
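The underlying idea, comparing a predicted sensation against the measured one and nudging the internal model toward reality, can be sketched in a few lines. The linear model, learning rate, and numbers below are placeholders invented for illustration; they are not Figure AI’s or GeMuCo’s actual implementation.

```python
# Minimal sketch of prediction-error-driven body awareness: predict what a
# movement should feel like, measure what it actually felt like, and adjust
# the internal model to shrink the gap. All values are invented placeholders.

def predict_sensation(command: float, gain: float) -> float:
    """Internal body model: expected feedback for a given motor command."""
    return gain * command

def update_body_model(gain: float, command: float,
                      measured: float, lr: float = 0.1) -> float:
    """Move the model toward whatever the sensors actually reported."""
    error = measured - predict_sensation(command, gain)
    return gain + lr * error * command  # simple gradient-style correction

gain = 1.0           # the robot's initial guess about its limb's dynamics
true_gain = 1.5      # the real (unknown) physics of the limb
for _ in range(100):
    command = 0.8                   # send the same motor command each step
    measured = true_gain * command  # what the force sensors actually report
    gain = update_body_model(gain, command, measured)

print(round(gain, 3))  # ~1.5: the internal model now matches the real limb
```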
🔑KEY TAKEAWAY
The essence of human intelligence springs from the holistic fusion of body, brain, and soul. Most importantly, the body plays a BIG role in how humans perceive the world around them.
Embodied AI aims to replicate this perception by giving AI a physical form to see, act, and do just like humans. It’ll never be exactly like us, but if it performs just as well as we do, does it even matter?
📒FINAL NOTE
FEEDBACK
How would you rate today’s email? It helps us improve the content for you!
❤️TAIP Review of The Week
“Just totally awesome!”
REFER & EARN
🎉Your Friends Learn, You Earn!
You currently have 0 referrals, only 1 away from receiving 🎓3 Simple Steps to Turn ChatGPT Into an Instant Expert.
Share your unique referral link: https://theaipulse.beehiiv.com/subscribe?ref=PLACEHOLDER