In a courtroom, truth often hinges on storytelling. But when that story involves hex values, file systems, packet captures or metadata timestamps, even the most seasoned judge can struggle to follow the plot.
Imagine a public defender who can’t afford a digital forensics expert. Or a police officer trying to explain technical evidence clearly enough to secure a search warrant. Or a jury staring blankly as an expert witness describes how a crime unfolded inside a hard drive. As technology seeps into nearly every criminal case, justice increasingly depends on whether complex cyber evidence can be understood by nontechnical people.
Three students in the School of Computing and Augmented Intelligence, part of the Ira A. Fulton Schools of Engineering at Arizona State University, think artificial intelligence, or AI, might help.
In CSE 598 Forensics Computing, a graduate-level course that blends cybersecurity, law and real-world investigation, Ariadne Dimarogona, Aditi Ganapathi and Easton Kelso built Legal Laysplainer, an AI-powered system designed to translate cyber forensics evidence into plain language that judges, lawyers, jurors and law enforcement officers can easily understand.
When technology takes the stand
“Cyber forensics are needed when a crime is committed and there’s technology involved,” says Kelso, a master’s student in cybersecurity who expects to graduate in May 2026.
That might mean a computer used to launch an attack, a phone that holds key location data or a server that quietly logged every step of a digital break-in. At its core, digital forensics pulls truth from these devices by reconstructing events from the data left behind.
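To make that idea concrete: one of the simplest forensic techniques is building a timeline from file metadata, ordering artifacts by when they were last touched. The toy sketch below is an illustration of that general concept, not part of the students' Legal Laysplainer system; the function name and the demo files are invented for the example.

```python
import os
import tempfile
from datetime import datetime, timezone

def build_timeline(root: str) -> list[tuple[str, str]]:
    """Walk a directory tree and return (ISO timestamp, path) pairs
    sorted chronologically by last-modification time -- a toy version
    of the timeline reconstruction forensic examiners perform."""
    events = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            mtime = os.path.getmtime(path)
            stamp = datetime.fromtimestamp(mtime, tz=timezone.utc).isoformat()
            events.append((stamp, path))
    return sorted(events)

# Demo: create two files with distinct modification times, then
# recover the order in which they were touched.
with tempfile.TemporaryDirectory() as root:
    first = os.path.join(root, "note.txt")   # hypothetical artifact
    second = os.path.join(root, "log.txt")   # hypothetical artifact
    open(first, "w").close()
    open(second, "w").close()
    os.utime(first, (1_600_000_000, 1_600_000_000))   # older timestamp
    os.utime(second, (1_600_000_100, 1_600_000_100))  # newer timestamp
    timeline = build_timeline(root)
    print([os.path.basename(p) for _, p in timeline])  # note.txt listed first
```

Real investigations use far richer sources (file-system journals, registry hives, packet captures), but the principle is the same: timestamps left behind on a device can be assembled into a narrative of events.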
Dimarogona is a Fulton Schools student finishing her master’s degree in computer science with a cybersecurity concentration as part of the CyberCorps: Scholarship for Service program. She says that in an era when nearly every crime leaves a digital trail, cyber forensics are no longer niche.
“Technology shows up in almost every case now,” Dimarogona says. “Even crimes that aren’t considered ‘cyber’ often rely on digital evidence.”
The problem, the students quickly learned, isn’t just collecting digital evidence. It’s explaining it.
During the semester, the class heard directly from experts on the front lines: agents from the Federal Bureau of Investigation’s Phoenix division and officers from the Scottsdale Police Department. Those professionals told students that expert testimony is now routine — and routinely difficult.
“The officers told us that they’re brought in to do expert testimony quite a lot,” Kelso says. “But judges, lawyers and jurors all have their own jobs. It’s not their role to have a computer scientist’s understanding of how technology works.”
To bridge the gap, the experts told the students that they often rely on metaphors, comparing deleted files to footprints in the sand or data packets to envelopes in the mail. But that kind of explanation takes time, experience and access to trained specialists, and not every case has that luxury.
“That’s when we started asking how we could combine that need with tools like large language models and AI to make these explanations easier to understand,” Kelso says.
The result was Legal Laysplainer.