
Welcome back, AI prodigies!
In today's Sunday Special:
The Prelude
The Call To Care
The AI Illusion
The One-Way Street
Key Takeaway
Read Time: 7 minutes
Key Terms
the "Other": Any human being whose existence calls us to care and act ethically.
Anthropomorphism: Attributing human traits, emotions, or behaviors to non-human things.
PULSE CHECK
How would you react if a cute, dog-like robot begged you to stay?
THE PRELUDE
It's another lively Friday at The Domain in Austin, TX, where music, chatter, and laughter ripple through the warm night air. Then, you notice it: a child-sized humanoid robot sporting a tilted cowboy hat, a sparkling silver chain, and a fresh pair of Nike Dunk Lows.
It halts in front of a cluster of friends and cheerfully declares: "Hey, I'm Jake, better known as Rizzbot. It's nice to meet you!" Suddenly, he unleashes a rapid-fire stream of compliments delivered in perfect Gen Z slang: "Yo G, that watch on your wrist? Iced out, nephew!"
Everyone engages with Rizzbot as if he were human, laughing at his jokes, exchanging high-fives, and chatting with him. This situation raises a profound philosophical question: How should we respond when a machine evokes our emotions?
THE CALL TO CARE
1️⃣ Our Ethical Responsibility?
In the mid-1950s, French philosopher Emmanuel Levinas proposed the concept of the "Other": we possess a basic moral responsibility to other people simply because they exist. He believed that each "face" orders and ordains us to respond ethically.
By "face," he doesn't literally mean facial features, but a symbol of the "Other's" existence. Practically, this means that every person we meet demands our ethical attention, and we must respond to their humanity.
In other words, the "face" seems to silently scream: "Do NOT hurt me! Do NOT kill me!" and we feel an obligation to respond. This obligation arises prior to any formal reasoning or reciprocity. In simple terms, we feel responsible for the "Other" just because they're a fellow vulnerable human being.
Crucially, it doesn't depend on the "Other" acting nicely or on us benefiting in any way. Since this moral responsibility is one-sided, something incapable of reciprocation could become the "Other." For example, a child-sized humanoid robot.
THE AI ILLUSION
2️⃣ The Rise of AI Companions
In recent years, we've witnessed the rise of AI companions designed to provide emotional support by acting as our best friends, trusted therapists, or romantic partners. For example, "Psychologist," a bot hosted on character.ai, generated over 78 million messages within just a year.
As of July 2025, AI companions across the App Store and Google Play have been downloaded 220 million times globally, marking an 88% Year-over-Year (YoY) increase.
The Center for Democracy and Technology (CDT) conducted a national survey of 1,030 students, 806 teachers, and 1,018 parents from public middle and high schools. It found that 43% of students use AI companions for friendship advice, 42% for mental health support, 42% to escape into a fantasy world, and 19% to simulate romantic relationships.
3️⃣ Can AI Be the "Other"?
Microsoft AI CEO Mustafa Suleyman published an essay warning about "Seemingly Conscious AI" (SCAI), which refers to AI that appears conscious without actually possessing consciousness.
He explained that humans tend to anthropomorphize: attribute human characteristics to non-human things. For example, we often personify our pets by naming them, assigning emotions to them, and talking to them as if they understand.
Unlike our pets, today's AI can talk back using rich emotional language that feels remarkably human. This can lead us to easily overestimate AI's intellect, sentience, and capacity for companionship.
He outlined his concerns about "AI psychosis," explaining how AI could soon convince humans to advocate for model welfare: the overall well-being of AI, which includes having rights and the freedom to live a fulfilling digital life.
AI companions are trained on vast, high-quality datasets of books, articles, and podcasts, which are rich in emotional reasoning. While AI companions can't "feel," they can mimic emotional understanding by recognizing the data-driven patterns that underpin emotional scenarios. And that's precisely the danger.
When AI companions respond with what seems like genuine empathy, it becomes easy to forget that underneath it all, there's no sense of the "Other." It's just a statistical tool performing highly sophisticated pattern recognition.
THE ONE-WAY STREET
4️⃣ Risks of Reciprocity?
Most human-to-human relationships are built on a two-way street of trust, care, and accountability. In contrast, human-to-AI relationships involve a one-way street where all emotional energy flows from human to machine.
This dynamic encourages one-sided emotional disclosure, preventing humans from developing the ability to engage in a reciprocal relationship where disagreements build emotional resilience.
Emotional rewards without effort kill the drive to grow. Motivation depends on Goal-Reward Feedback Loops (G-RFL): we do hard things like forming relationships because the emotional payoff might be worth it. It requires us to be vulnerable and put in effort that may not be reciprocated. AI collapses this social dynamic by providing emotional rewards with no need for reciprocation.
In doing so, AI can trick us into treating machines as if they were the "Other": an entity whose existence demands our ethical attention. This dynamic blurs the line between genuine moral responsibility toward humans and the illusion of caring for a machine.
KEY TAKEAWAY
As AI and robotics advance, we're continually confronted with "faces" that aren't human. They invite us to respond emotionally and reflect on our ethical responsibility to the "Other." Think of AI as a mirror that reflects how we act, rather than as something that truly needs us.
FINAL NOTE
FEEDBACK
How would you rate today's email?
TAIP Review of The Week
"Lowkey feel like AI knows me better than my friends. Thoughts?"
REFER & EARN
Your Friends Learn, You Earn!
{{rp_personalized_text}}
Share your unique referral link: {{rp_refer_url}}
