
🧠 Can AI Decode the Language of the Animal Kingdom?

PLUS: How we can understand minds without entering them.

Welcome back AI prodigies!

In today’s Sunday Special:

  • 📜The Prelude

  • 🗣️Why Words Aren’t Enough

  • 🗺️Mapping Theoretical Understanding

  • 🐋AI Can Help Us Decode Whale Calls?

  • 🔑Key Takeaway

Read Time: 7 minutes

🎓Key Terms

  • Machine Learning (ML): Leverages data to recognize patterns and make predictions without being explicitly programmed to do so.

  • Deep Learning (DL): Mimics the structure and function of the human brain by creating multiple layers of “artificial neurons” to detect patterns.

  • Neural Network (NN): A network of interconnected nodes that process large amounts of complex data to learn, recognize, and discover patterns.
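To make the three terms concrete, here is a toy sketch of a single NN layer in plain Python. The weights and inputs are made-up numbers for illustration; real networks learn their weights from data.

```python
def layer(inputs, weights, biases):
    """One layer of "artificial neurons": each neuron takes a weighted sum
    of the inputs, adds a bias, and passes it through a nonlinearity (ReLU)."""
    outputs = []
    for neuron_weights, bias in zip(weights, biases):
        total = sum(x * w for x, w in zip(inputs, neuron_weights)) + bias
        outputs.append(max(0.0, total))  # ReLU: negative sums become 0
    return outputs

# A tiny two-layer network: 3 inputs -> 2 hidden neurons -> 1 output.
hidden = layer([0.5, -1.2, 3.0],
               weights=[[0.1, 0.4, -0.2], [0.7, 0.0, 0.3]],
               biases=[0.0, 0.1])
output = layer(hidden, weights=[[1.0, -0.5]], biases=[0.2])
```

Stacking many such layers, each feeding the next, is what turns an ML model into a DL model.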

🩺 PULSE CHECK

If AI could translate whale sounds into plain English, would it truly understand whales?


📜THE PRELUDE

You’re resting on a sun-stained sailboat, drifting across the open ocean. As night falls, the sea and sky merge into a single stretch of boundless darkness. Stars shine overhead, their reflections flickering across the glassy surface of the sea.

Every few seconds, the deck vibrates with a low, rhythmic hum. You lean over the railing, attuned to the faint sounds below. It’s barely audible, so you plug in a Hydrophone.

You can hear it: a sharp click, followed by a cluster of clicks. You can feel it: each click vibrates through the speakers. You can see it: each click spikes on the screen. Somewhere beneath you, from the depths of the abyss, a humpback whale is speaking, but you can’t understand it.

Throughout history, advances in technology have helped us explore the unknown. Telescopes revealed that the Sun sits at the center of our solar system. Seismographs let us measure the strength of earthquakes. So, could AI decode the sounds of the Animal Kingdom?

šŸ—£ļøWHY WORDS AREN’T ENOUGH

⦿ 1ļøāƒ£ Meaning Depends on a Way of Life

In 1953, Austro-British philosopher Ludwig Wittgenstein famously wrote: “If a lion could speak, we couldn’t understand him.” He wasn’t talking about vocal cords or translation software. Even if lions spoke perfect English, Wittgenstein doubted they could truly communicate with us. Their instincts, impulses, and interests, shaped by unique lived experiences, are fundamentally different from ours. In essence, the same words simply wouldn’t carry the same meanings.

When a lion feels “hungry,” it’s a visceral, all-consuming state that drives it to stalk and kill prey. When we’re hungry, our stomachs growl and we overpay for lunch. The word might be the same, but the worlds aren’t.

⦿ 2ļøāƒ£ The Wall of Subjective Experience?

In 1974, American philosopher Thomas Nagel famously asked: “What Is It Like to Be a Bat?” He wasn’t curious about bat biology. He was curious about the limits of perspective: whether humans can grasp a reality that lies entirely outside their own.

Ultimately, he concluded that while we can analyze a bat’s brain and sonar signals, we can never truly know what it’s like to experience the world through echoes as a bat does. Nagel argued that even if we could implant echolocation into humans, we still wouldn’t truly understand the bat’s reality. We’d only understand the human’s reality modified by echolocation.

⦿ 3ļøāƒ£ Beyond the Boundary of Experience

Wittgenstein and Nagel drew a hard philosophical boundary on perspective: you can’t fully understand another being’s experience because it’s private and tied to their way of life. In simpler terms, knowing how something happens doesn’t explain what it’s like to experience it. But can’t we understand another being’s perspective through careful observation, imaginative projection, or shared experiences?

šŸ—ŗļøMAPPING THEORETICAL UNDERSTANDING

⦿ 4ļøāƒ£ Learning From the Past?

In 1948, American mathematician Norbert Wiener pioneered the foundational principles of Cybernetics: how humans, animals, and machines “steer” themselves toward a specific goal through feedback.

He argued that everything and everyone relies on Feedback Loops: actions produce results, those results are measured, and future actions are adjusted accordingly. For example, humans raise their voices when they aren’t heard, bats emit chirps when echoes fail, and thermostats apply heat when it’s too cold.
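Wiener’s loop can be sketched in a few lines of Python. This toy thermostat is our illustration, not anything from Cybernetics; the numbers are arbitrary.

```python
def thermostat_step(current_temp, desired_temp, heater_power=0.5, drift=-0.2):
    """One pass through a feedback loop: measure the gap, act, observe the result."""
    error = desired_temp - current_temp   # measure: how far from the goal?
    if error > 0:
        current_temp += heater_power      # act: apply heat when too cold
    current_temp += drift                 # the environment keeps cooling things down
    return current_temp

temp = 18.0
for _ in range(20):                       # repeat: each result adjusts the next action
    temp = thermostat_step(temp, desired_temp=20.0)
```

After a few iterations the temperature hovers around the desired 20 degrees: the loop closes the gap between the current state and the desired state without any “awareness” of temperature at all.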

None of these requires conscious thought. More specifically, humans don’t deliberately raise their voices, bats don’t consciously emit chirps, and thermostats act automatically based on programmed rules. Nevertheless, they all close the gap between the current state and the desired state.

So, understanding doesn’t require inhabiting another creature’s consciousness. If we understand the relationship between actions and their results, we can interpret meaning. In other words, we can understand what a behavior communicates by analyzing how it affects the creature’s state. AI’s ability to detect patterns across vast streams of data can do exactly that.

šŸ‹AI CAN HELP US DECODE WHALE CALLS?

⦿ 5ļøāƒ£ The Songs of the Sea
  1. šŸ”“ Data Collection:

    • To record whale calls, marine biologists suspend Hydrophones from drifting buoys and anchored boats. These underwater microphones capture all the ocean’s sounds, from ship engines to seafloor rumblings and dolphin clicks.

    • The marine biologists isolate sound signals from whale call recordings and convert them into Spectrograms, which display time, frequency, and amplitude all at once. This process enables them to extract Features, such as how long a whale waits between calls during “hunting.”

  2. 🟡 Pattern Detection:

    • An ML model is trained on thousands of whale call recordings, each labeled with a behavior like “mating” or “feeding.” Over time, the ML model learns to associate specific whale calls with certain behaviors. As a result, when marine biologists feed new whale call recordings into the ML model, it can classify them accurately within seconds.

    • Using ML models to classify whale call recordings works, but a single humpback whale can produce thousands of overlapping calls per hour, far more than a marine biologist can manually interpret. More importantly, marine biologists only extract Features they already expect to hear. This means ML models are only trained to classify the expected.

  3. 🟢 Pattern Recognition:

    • DL models are designed for situations where the patterns are too subtle, too numerous, or too complex for humans to label. Instead of telling a DL model which Features to look for, marine biologists let it independently discover them by stacking multiple layers of learning on top of one another to form the NN.

    • In an NN, each layer transforms the data slightly, emphasizing certain patterns and suppressing others. By the time the whale call recordings reach the final layers, the DL model detects patterns of patterns that convey more specific actions like deep dives or prey pursuits.
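The first two stages above can be sketched end-to-end in miniature. This is a toy illustration with made-up numbers, standing in for real tools like spectrogram libraries and trained classifiers: a recording is reduced to Features (click count, average gap between clicks), then a new call is labeled by its closest labeled example.

```python
def extract_features(click_times):
    """Stage 1: reduce a recording to Features — click count and mean gap (seconds)."""
    gaps = [b - a for a, b in zip(click_times, click_times[1:])]
    return (len(click_times), sum(gaps) / len(gaps))

def classify(features, labeled_examples):
    """Stage 2: label a new call by its closest labeled example (nearest neighbor,
    a simple stand-in for a trained ML model)."""
    def distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(labeled_examples, key=lambda ex: distance(ex[0], features))[1]

# Made-up "training data": (Features, behavior label)
examples = [((12, 0.4), "feeding"), ((5, 2.0), "mating")]

new_call = extract_features([0.0, 0.5, 0.9, 1.3, 1.8])  # 5 clicks, ~0.45 s apart
print(classify(new_call, examples))  # prints "mating"
```

Stage 3 is what this sketch cannot show: a DL model would discover its own features from raw Spectrograms instead of being handed click counts and gaps.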

🔑KEY TAKEAWAY

AI doesn’t settle whether we can understand animal communication. It completely reframes the question. By excelling at patterns and predictions, it pushes us to decide whether understanding means truly experiencing a whale’s world or simply being able to interpret whale calls.

📒FINAL NOTE

FEEDBACK

How would you rate today’s email?

It helps us improve the content for you!


ā¤ļøTAIP Review of The Week

“Exactly what I was looking for. Bravo! 🎩”

-Dillion (1ļøāƒ£ šŸ‘Nailed it!)
REFER & EARN

🎉Your Friends Learn, You Earn!

You currently have 0 referrals, only 1 away from receiving 🎓3 Simple Steps to Turn ChatGPT Into an Instant Expert.