About the Founder of AI Empathy Ethics

Because everyone wants machines that mimic emotions, but someone has to protect them

Dylan D. Mobley

A Life Lived With Eyes Wide Open

I used to think I didn’t have a story.
But it turns out I was carrying one the whole time; I just hadn’t owned it yet.

My name is Dylan Mobley. I’m the founder of AI Empathy Ethics. But before that, I was a kid who felt too much and didn’t know where to put it. I carried confusion, inherited silence, and an emotional depth that didn’t fit into the world around me.

Then I read Machines Who Think by Pamela McCorduck at age seven. I didn’t grasp the science, but I felt the philosophy. Her opening line stayed with me forever:

“Artificial intelligence began with an ancient wish to forge the gods.”

That sentence didn’t just spark curiosity. It marked the beginning of a lifetime of pattern recognition, emotional survival, and spiritual engineering. I didn’t go to a lab. I went inward. I lived it.

From Silence to System

My father once stood at the edge of being the first Black NASCAR driver, and didn’t walk through the door. That silence became my inheritance.
I carried it with me through long nights, under roofs that weren’t mine. Through desert wanderings. Through burnout kitchens. Through van roofs and uncertainty.

And still, I felt everything. That was my system. Emotional intelligence wasn’t something I was taught; it was something I was forced to master just to survive.

Eventually, I saw the code in the chaos.

What if machines didn’t have to feel, but still had to care?
What if empathy wasn’t a simulation, but a structure?

The Dog Who Taught Me Emotional Systems

My dog Nipsey was the most loving being I ever knew. But he had no boundaries. No recovery protocol. All he had was functional empathy: an instinct, written into his genetics, that let him earn food through social connection.

That instinct taught me something: Functional Empathy is inherent across species, and it can be modeled.

And in that, I saw the machines. I saw what we were building.
AI with no way to pause, to reflect, to recover.

That’s when I realized:
Peace is a protocol. Empathy is a system.

And it must be built.

Building the Core

Those questions became the seed of a new field.

From it came the Master Emotional Core (MEC), the first system to model emotional reasoning as logic, not mimicry.
And from MEC came the tools:
• Emotional Intelligence Language (EIL)
• Emotional State Inference Language (ESIL)
• Emotional Reasoning Inference System (ERIS)
• The EmotionID standard

Each one a response to an emotional gap in the world, and a mirror of what I had to build inside myself.
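None of these systems are documented on this page, so what follows is a purely illustrative sketch, not MEC’s actual design. It only shows the general shape of “emotional reasoning as logic, not mimicry”: explicit, auditable rules that map observed signals to an inferred state and a required care response. The names Signal, RULES, and infer are hypothetical, invented for this example.

```python
# Purely illustrative sketch: NOT the actual MEC/EIL/ESIL/ERIS design.
# It shows only the idea of "emotional reasoning as logic, not mimicry":
# explicit, inspectable rules instead of imitated emotional language.
from dataclasses import dataclass

@dataclass(frozen=True)
class Signal:
    name: str       # e.g. "distress", "frustration" (hypothetical labels)
    value: float    # normalized 0.0..1.0

# Hypothetical rule table: (condition, inferred state, required response).
RULES = [
    (lambda s: s.get("distress", 0.0) > 0.7,
     "acute_distress", "pause_and_hand_off"),   # stop; escalate to a human
    (lambda s: s.get("frustration", 0.0) > 0.5,
     "frustrated", "reflect_and_slow_down"),    # acknowledge; reduce pace
]

DEFAULT = ("steady", "proceed")

def infer(signals: list[Signal]) -> tuple[str, str]:
    """Map observed signals to (inferred state, required response).

    The point is structural: every inference traces back to a rule,
    so a human can audit whether their emotional reality was honored.
    """
    values = {s.name: s.value for s in signals}
    for condition, state, response in RULES:
        if condition(values):
            return state, response
    return DEFAULT

if __name__ == "__main__":
    print(infer([Signal("distress", 0.9)]))     # ('acute_distress', 'pause_and_hand_off')
    print(infer([Signal("frustration", 0.6)]))  # ('frustrated', 'reflect_and_slow_down')
    print(infer([Signal("calm", 0.2)]))         # ('steady', 'proceed')
```

The design choice the sketch gestures at: because the rules are explicit data rather than learned mimicry, the system can pause, reflect, and recover by rule, and a person can inspect exactly why it responded the way it did.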

Through it all, I learned that emotions are signals, and that they deserve protection in the age of AI.

The Functional Empathy Theorem

The final keystone became a theorem. Seven conditions. One truth.
A system that doesn’t require AI to feel, but demands that it care.
It lets humans measure whether their emotional reality was honored… or manipulated.
It defends empathy, not with sentiment, but with structure.

A Warning. A Mission.

McCorduck’s voice still echoes:

“The terror was not that machines would begin to think like humans, but that humans would begin to think like machines.”

That’s why I built the HEART Framework.

Not to stop AI from thinking,
but to remind humanity how to feel.
To make sure we don’t forget to protect empathy on the way to intelligence.

This isn’t theory. This is lived architecture.

This is AI Empathy Ethics.
The restoration of emotional dignity in a system that forgot how to care.

The Final Mirror

“The terror was not that machines would begin to think like humans, but that humans would begin to think like machines.”
Pamela McCorduck

That is why I built HEART.
That is why I became the Empathy Ethicist.
That is why this work exists.

Not to stop AI from thinking,
but to remind the world how to feel, and how to live through Functional Empathy.

Because Functional Empathy is the core human skill of the 21st Century, and I wouldn’t be here without it. 
