- The AI Pulse
🤖 Humans Don’t Want to Talk to Machines?
PLUS: Humans Correct, Guide, and Influence Robotic Behavior; “PhD-Level AI,” Explained.

Welcome back, AI enthusiasts!
In today’s Daily Report:
⚙️Humans Don’t Want to Talk to Machines?
🦾Humans Correct, Guide, and Influence Robotic Behavior
🎓“PhD-Level AI,” Explained.
🛠Trending Tools
🥪Brief Bites
💰Funding Frontlines
💼Who’s Hiring?
Read Time: 3 minutes
🗞RECENT NEWS
TAVUS
⚙️Humans Don’t Want to Talk to Machines?

Image Source: Tavus/“The OS for human-AI interaction + Building blocks to empower your AI agent to see, hear, respond, and look human.”/Screenshot
Tavus, a company that helps AI Agents look and feel more humanlike, recently launched a Conversational Video Interface (CVI).
Key Details:
CVI helps AI Agents replicate how humans naturally perceive, interpret, and respond to the world around them.
It relies on three AI models to bring AI Agents to life:
Phoenix-3: Generates natural facial movements that involve the eyebrows, cheeks, and mouth to replicate human expressions.
Raven-0: Tracks body language to perceive a human’s intent, context, and emotion.
Sparrow-0: Handles conversational timing to eliminate awkward pauses.
CVI helps AI Agents look and feel more human during video interactions.
Why It’s Important:
CVI opens up new possibilities for AI Agents in areas that require nuanced communication, such as virtual therapy.
AI Agents that can understand human emotions and interpret human intentions are better equipped to offer more personalized support to humans.
AI RESEARCH
🦾Humans Correct, Guide, and Influence Robotic Behavior

Image Source: MIT CSAIL Interactive Robotics Lab/NVIDIA Seattle Robotics/“Inference-Time Policy Steering through Human Interactions”/Screenshot
Researchers from MIT CSAIL and NVIDIA recently developed “Inference-Time Policy Steering (ITPS),” which enables humans to correct, guide, and influence a robot’s behavior.
Key Details:
Imagine a robot is helping you fold the laundry, and you ask it to “grab the hamper” or “take the detergent,” but the robot’s gripper messes up.
ITPS allows you to correct the robot’s behavior with these two simple interactions:
Drawing: If the robot’s arm movements are awkward, you can trace the desired path.
Touching: If the robot’s grip is too loose or too tight, you can adjust the gripper.
ITPS doesn’t require you to collect new datasets to retrain the AI system powering the robot’s brain.
Why It’s Important:
Different humans may have different preferences. ITPS allows robots to adapt to these individual preferences right away.
“The consumer will expect the robot to work right out of the box, and if it doesn’t, they want an intuitive mechanism to customize it,” said NVIDIA Research Scientist Yanwei Wang.
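The core idea of inference-time steering can be sketched in a few lines: instead of retraining the policy, you pick among the trajectories it already proposes, favoring the one closest to the human’s input. This is a toy illustration under assumed 2-D waypoints, not the actual ITPS method; the function and data below are hypothetical.

```python
import math

def steer_trajectory(candidates, user_point):
    """Toy inference-time steering: from trajectories sampled by a
    (hypothetical) policy, pick the one passing closest to a waypoint
    the human sketched. Illustrative only, not the real ITPS algorithm."""
    def dist_to_sketch(traj):
        # Minimum Euclidean distance from any point on the trajectory
        # to the user's sketched waypoint.
        return min(math.dist(p, user_point) for p in traj)
    return min(candidates, key=dist_to_sketch)

# Two candidate arm paths (as 2-D waypoints) sampled from the policy.
candidates = [
    [(0, 0), (1, 2), (2, 4)],    # swings wide
    [(0, 0), (1, 0.5), (2, 1)],  # stays low
]
# The human sketches a point they want the arm to pass near.
chosen = steer_trajectory(candidates, user_point=(1, 0.4))
print(chosen)  # the low path, closest to the sketch
```

Because steering happens at selection time, the policy’s weights never change, which is why no new datasets or retraining are needed.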
AI TRENDS
🎓“PhD-Level AI,” Explained.

Image Source: Canva’s AI Image Generators/Magic Media
AI firms have a new buzzword: “PhD-Level AI.” For instance, OpenAI reportedly plans to launch an AI Agent that supports “PhD-Level Research” for $20,000 a month. So, what exactly does “PhD-Level AI” mean?
AI firms like OpenAI base their “PhD-Level AI” claims on specific benchmarks. For example, OpenAI recently introduced “Deep Research,” an AI Agent that finds, analyzes, and synthesizes content across hundreds of websites to develop comprehensive summaries like a “PhD-Level Research Analyst.”
OpenAI grounds the “PhD-Level Research Analyst” claim in the fact that “Deep Research” achieved 26.6% accuracy on Humanity’s Last Exam (HLE), a benchmark designed to assess advanced reasoning across mathematics, the humanities, and the natural sciences at the level expected of a PhD candidate.
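Benchmark “accuracy” figures like the 26.6% above are simply the fraction of questions graded correct. A minimal sketch, assuming a hypothetical list of per-question grading results:

```python
def benchmark_accuracy(graded):
    """Fraction of benchmark questions answered correctly.
    `graded` is a list of booleans (hypothetical grading results)."""
    return sum(graded) / len(graded)

# Toy example: 266 correct answers out of 1,000 questions
# would be reported as 26.6% accuracy, like the HLE figure above.
graded = [True] * 266 + [False] * 734
print(f"{benchmark_accuracy(graded):.1%}")  # 26.6%
```

The number says nothing by itself; what makes it “PhD-level” is the difficulty of the questions being graded.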
🩺 PULSE CHECK
Is “PhD-Level AI” a misleading term? Vote below to view live results.
🛠TRENDING TOOLS
👏Swatle is an AI-powered team collaboration tool.
📦Kive instantly creates visually on-brand product visuals.
🚀Mochii.AI is where intelligence meets infinite possibility.
🍋Lemni builds AI Agents for all your customer interactions.
📊Graphy enables anyone to become a skilled data storyteller.
🔮Browse our always Up-To-Date AI Tools Database.
🥪BRIEF BITES
Luma AI introduced “Ray2 Flash,” a large-scale video generator capable of creating realistic visuals with natural, coherent motion.
Chinese AI Firm Manus launched the world’s first General AI Agent that handles real-world tasks through step-by-step replays.
Engineer Zakwan Ahmad is building an AI-powered smart pen that integrates with ChatGPT to convert handwritten notes into digital conversations.
💰FUNDING FRONTLINES
9 U.S. AI startups have landed over $100M so far in 2025.
WilsonAI raises a $1.7M Pre-Seed Round for an AI-Based Paralegal Assistant.
Knostic secures an $11M Funding Round to eliminate Enterprise AI Data Leaks.
💼WHO’S HIRING?
Rad AI (Remote): Software Engineer Intern, Summer 2025
Tesla (Palo Alto, CA): Rendering Engineer, AI Simulation, Summer 2025
NVIDIA (Santa Clara, CA): GPU Power Architect, Entry-Level
xAI (Palo Alto, CA): Full Stack Software Engineer, Starfleet, Mid-Level
Meta (Sunnyvale, CA): Design Verification Engineer, Senior-Level
📒FINAL NOTE
FEEDBACK
How would you rate today’s email? It helps us improve the content for you!
❤️TAIP Review of The Day
“I can use some of this almost daily. Great stuff”
REFER & EARN
🎉Your Friends Learn, You Earn!
You currently have 0 referrals, only 1 away from receiving ⚙️Ultimate Prompt Engineering Guide.
Copy and paste this link to friends: https://theaipulse.beehiiv.com/subscribe?ref=PLACEHOLDER