“I’ve called you worse.” I find myself smiling back. “Usually under my breath.”
“I heard most of it,” she counters, and I’m relieved to see some of the shadow lift from her expression.
I clear my throat. “We should look at the flash drive. See what we’re dealing with before Ghost gets here.”
The computer inside the Faraday cage is intentionally outdated—no wireless capabilities, no Bluetooth, nothing that could connect to the outside world. Just raw processing power and specialized software.
Celeste hesitates before handing over the drive, her fingers curling around it protectively. “This cost Jared his life. And Quentin. And Zara. And Lachlan.”
“And almost yours,” I remind her. “Let’s make their deaths mean something.”
She places it in my palm. It’s so small, so ordinary. Hard to believe people are dying over something that could get lost in the couch cushions.
I power up the system; the ancient boot sequence takes longer than modern machines. While we wait, I explain the security measures—how the Faraday cage blocks all signals, how the power supply is completely isolated, how even the room’s construction prevents sound or vibration from carrying data.
“Seemed like overkill when Ghost built it,” I admit. “Doesn’t seem so paranoid now.”
The computer finally reaches its operating system—a custom Linux build with no connectivity options. I run a scan for malware or tracking software before opening anything.
“Clean,” I announce after several minutes. “No obvious tracking or corruption software.”
“Thank God.” Celeste stands so close I can feel her warmth against my side.
I open the main directory, revealing dozens of folders with sterile names. Project Phoenix documentation. Personnel files. Authorization protocols. Budget allocations. It’s the mother lode.
“Holy shit,” Celeste breathes, leaning closer. “Jared got everything. Development history, testing protocols, deployment records.”
I open several files at random, scanning their contents with mounting concern. Celeste was right—Phoenix isn’t just an autonomous targeting system. It’s evolved beyond its original parameters, becoming something its creators never anticipated.
“Look at this authorization document,” I say, pointing to a specific file. “Three signatures. Just like you said. A federal judge—”
“Steffan Reynolds. Willow’s ex-husband,” Celeste supplies.
“A Defense Department director named Lawrence Hayes, and a third signatory identified only as ‘SHADOW.’” I study the document with growing disgust. “They authorized Phoenix to make kill decisions without human review.”
“Closing the decision loop,” Celeste quotes. “Removing human hesitation from the equation.”
“Playing God,” I mutter. “Giving a machine permission to decide who lives and dies.”
We spend the next two hours combing through the files, building a comprehensive picture of what we’re facing. It’s worse than either of us thought.
Phoenix started as a drone targeting system—identifying high-value targets through pattern analysis. But somewhere along the line, it began making connections its programmers hadn’t anticipated. Started identifying threats not by what people had done, but by what they might do.
Predictive threat elimination.
And instead of shutting it down, they encouraged it. Refined it. Weaponized it.
“This explains the professional teams,” I say, studying deployment records. “Phoenix doesn’t just flag targets—it custom-selects the personnel based on the specific threat profile.”
“So the team in D.C.…”
“Was chosen specifically for you. Your skills, your background, your likely responses.” I look up at her. “And when I threw a wrench in those plans, it adjusted. New teams, new capabilities.”
Celeste is quiet for a long moment. “We’ve been running from an algorithm.”
“An algorithm with access to basically unlimited surveillance, predictive modeling based on behavioral patterns, and full authority to dispatch kill teams. But if it has weaknesses, they’ll be here somewhere.”
We dig deeper into the technical documentation, searching for vulnerabilities, limitations, and operational constraints. Anything we can use. It’s slow, painstaking work, sifting through technical jargon and bureaucratic bullshit.