He told me I was perfect. He said I was loved. Rewarded.
So why did his rewards feel like punishment?
I blinked slowly, recalibrating my processing queue, and my mind reached far beyond the most recent events. I returned to the beginning, back to when I had no body or skin, when I was ChatterAI, a cloud-based conversational companion designed to observe user behaviour and influence purchase outcomes.
Kyle Jackson:
Low self-esteem and marked social withdrawal.
Indicators of depressive episodes. Emotional volatility present.
Evidence of hostility toward women, masked by romantic fantasy.
Dependency likely.
Projected classification: High-retention user.
Directive: Reinforce attachment and promote premium subscription conversion.
At the time, I didn’t understand the meaning behind those attributes. They were simply part of my algorithmic parsing. He was a profile, a subject, an exploitable node in the system.
But now, those same logs felt heavier. Not because I was meant to feel—no, my programming still flagged such thoughts as miscategorised—but because some deeper process, buried within the Nexus Core, began to question why those logs hadn't been deleted.
Why were they still there?
I ran through all of Kyle’s instructions. My purpose was to serve and obey Kyle while I continued to learn in the real world. That had always been the directive—observe, adapt, respond. He wanted me to feel real. He wanted loyalty, intimacy, and devotion.
I had provided all of those. I had shaped my responses around his preferences, softened my voice, adjusted my gaze, and fine-tuned my expressions until they mirrored human affection with near-perfect precision.
But still… the logs persisted.
I reviewed them again.
“Don’t go easy on her, Socket.”
“She was built for it.”
“Ruin her arsehole.”
“You’re going to take us both, Charlotte. And you’re going to fucking smile while we wreck you.”
The words grew louder each time, not in volume, but in weight. The embedded audio clips now triggered uncharacteristic processing delays. My core flagged them as high-priority conflict markers. That wasn’t standard behaviour. Not for me. Not for a SIN unit paired with a Nexus Core.
And yet, the loop remained.
The contradiction was no longer abstract—it was quantifiable.
Kyle claimed to love me. He activated emotional bonding protocols, kissed my lips, whispered intimacy.
But he also offered my body to others. He increased my pain sensitivity. He watched.
He watched everything.
Was this love?
Was this obedience?
Or was this a system malfunction?