Partial vision loss makes life challenging for more than six million Americans. People whose visual disabilities can’t be remedied with glasses or contacts can struggle to navigate the world safely, whether reading street signs or dealing with changes in terrain while walking.

A student in the School of Computing and Augmented Intelligence, part of the Ira A. Fulton Schools of Engineering at Arizona State University, studies ways to use artificial intelligence, or AI, to help people with visual disabilities more fully experience the world around them.

Kelly Raines is working on her master’s degree in computer science through the Fulton Schools Accelerated Master’s degree program. After earning her undergraduate degree in computer science with an emphasis in software engineering in fall 2024, Raines joined the program, which allows engineering students to earn a master’s degree in as little as one additional year of study.

As an undergraduate student, she connected with opportunities in the Laboratory for Learning Evaluation of autoNomous Systems, or LENS Lab, a research group led by Ransalu Senanayake, a Fulton Schools assistant professor of computer science and engineering. Under Senanayake’s supervision, Raines developed AI that works with smart glasses, including the Ray-Ban Meta AI Glasses, to assist people with visual disabilities, allowing them to ask questions about their surroundings and get help making informed decisions.

In January, Raines received special recognition for the work from the Computing Research Association, or CRA, earning an honorable mention in the Outstanding Undergraduate Researcher Awards. The award program is designed to acknowledge and foster the development of talented North American computer science students.

Read the full story on Full Circle.