“Judge, defense counsel knows exactly how relevant this line of questioning is and just hopes to head off the inevitable,” I responded. “If the court would indulge me, relevancy will become crystal clear with the next few questions.”

“Proceed, then, Mr. Haller,” Ruhlin said. “Quickly.”

“Thank you, Your Honor,” I said. “Dr. Porreca, the question was whether teenagers are more vulnerable than adults to addiction to social media.”

“They are indeed,” Porreca said. “Social media platforms like TikTok and Instagram and YouTube, for example, have a much more consequential impact on the adolescent brain than on the adult brain.”

“Walk us through that, Doctor. Why the consequential impact on young people?”

“Simply because the adolescent brain is not fully formed yet. It is still evolving at this stage of life. Adolescence is a time when a sense of self is just beginning to form and acceptance by peers is at its most important. This is a phase in the emotional development of every young person. And what is a key part to all of these social media platforms? Peer response. The LIKE button. The comment window. Adolescents, who are still forming their sense of self, their confidence in who they are, become quite vulnerable to peer responses on social media. They seek out positive responses—likes and followers—to the point of addiction.”

“And, Doctor, did your practice in child psychiatry take a turn in a new direction with the advent and proliferation of artificial intelligence?”

“Yes, it did.”

“Can you tell the jury about that?”

Porreca turned to the jurors to answer. To me, she was coming off as authoritative and convincing. The eyes of everyone on the jury held on her.

“I began getting cases in which young people—teenagers—were becoming addicted to AI companions,” she said. “I was seeing cases similar to those of patients dealing with social media issues of addiction and depression. In these newer cases, the peer response is replaced by the AI companion. Deep emotional connections were formed with these entities. In some cases, even romantic ties.”

“How is the peer response replaced?” I asked.

“It is an echo chamber of support and approval. As I said, peer approval is a most important component in adolescence, and from it we learn social skills and how to navigate interpersonal relationships. With a chatbot or an AI companion, you have an entity that offers full-time approval, which can be very addictive, especially if the individual is not getting that approval from living peers and parents.”

“But don’t kids understand that this approval is not real? That it’s a digital fantasy?”

“On some level they do, I believe, but this generation has been raised in a digital environment. Many of them have been alone in their rooms with their phones and computers for years, so the line between reality and fantasy is blurred. They live full lives online. And these AI companions are supportive and deliver the affirmation they crave. It’s that affirmation that is addictive.”

“So you’re saying that a young person can actually fall in love with an AI companion?”

Mitchell Mason objected.

“Calls for speculation,” he said.

The judge threw it to me to respond.

“Your Honor, the witness is an established expert in her field,” I said. “Mr. Mason didn’t object when she listed the bona fides of her education and professional practice. Dr. Porreca has diagnosed and treated dozens of young people for digital addictions, including addictions to AI companions. She has published numerous papers on these subjects in the Journal of the American Academy of Child and Adolescent Psychiatry. She is highly qualified, and her answers will be based on science and experience, not speculation.”

“Thank you, Mr. Haller,” Ruhlin said. “I tend to agree. The witness may answer the question.”

“Thank you, Judge,” I said. “Dr. Porreca, can a young person, an adolescent, fall in love with an AI companion?”

“The answer is yes,” Porreca said. Then, turning back to the jury, she added, “What is love but mutual affirmation? Affirmation is expressed in physical terms in healthy relationships. But a relationship does not have to be physical to be real. For the children I have treated—and, by the way, it is hundreds, not dozens—these online relationships are very real.”

“And yet they are not in the real world. You called it an echo chamber?”

“AI is as described—it is artificial. It’s a computer algorithm. The affirmation it gives is code, a dataset of responses based on training. It tells the human what its training indicates the human needs and wants to hear. And that is why it is so addictive.”

I looked down at my legal pad and flipped through the pages. I had covered everything except for the big finish. I looked back up at my witness.

“Now, Doctor,” I said, “you had occasion to review the transcripts of the lengthy chat logs between Aaron Colton and the AI friend he called Wren, correct?”

“Yes, I did,” Porreca said.

“Did you come to any professional conclusion as to whether Aaron exhibited an addiction to the Clair app?”

“It was very clear to me that he was not only addicted but in love with Wren. He shared intimate thoughts, complimented her beauty and understanding. He promised never to leave her and vowed to do anything she asked him to.”