In southeastern Mesa, Arizona, construction crews are hard at work on a state-of-the-art data center. The $1 billion facility will open in 2026 and provide approximately 2.5 million square feet of data processing space, the equivalent of more than 43 football fields, for Meta, the parent company of Facebook.

This is a world where every text message, phone call and website click leaves a virtual trace, and all that information must be stored somewhere. The new Meta Mesa Data Center is a response to the surging demand for digital information management. The advent of artificial intelligence, or AI, has accelerated both the demand for more data centers and the need for them to become more energy efficient.

Zhichao Cao, an assistant professor of computer science and engineering in the School of Computing and Augmented Intelligence, part of the Ira A. Fulton Schools of Engineering at Arizona State University, is studying improvements in data storage systems that are designed to enhance the performance, resource management and sustainability of sites like the Meta Mesa Data Center.

Cao has received a 2025 Faculty Early Career Development (CAREER) Award from the U.S. National Science Foundation, or NSF, for his work on innovative data storage solutions.

Valuable new solutions for key-value stores

Over the last 50 years, data centers have evolved. For many years, these facilities were large banks of identical servers: powerful computers that worked cooperatively to complete tasks.

“But we found out that using homogeneous servers for different kinds of jobs and applications can waste resources and sometimes not be sustainable,” Cao says. “Some types of data analysis require a lot of processing power but don’t need high-performance storage systems. These systems, or databases, need tons of storage but not powerful CPUs or GPUs.”

Today, data centers are moving toward what’s known as a disaggregated model. Storage, the processing power of CPUs or GPUs, and memory, the part of the system that holds the data and instructions a computer is actively using, all exist separately in different machines and resource pools. This allows engineers to tap and provision the resources they need and not waste the ones they don’t.
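To make the idea concrete, here is a minimal Python sketch of resource pools that are provisioned independently. The pool names, capacities and job numbers are invented for illustration; this is a toy model, not a description of how any real data center scheduler allocates hardware.

```python
from dataclasses import dataclass

@dataclass
class Pool:
    """One independently managed resource pool in a disaggregated center."""
    name: str
    capacity: float   # arbitrary units
    used: float = 0.0

    def allocate(self, amount: float) -> None:
        if self.used + amount > self.capacity:
            raise RuntimeError(f"{self.name} pool exhausted")
        self.used += amount

# Separate pools of compute, memory and storage (hypothetical sizes)
compute = Pool("compute", capacity=1_000)    # e.g., vCPUs
memory  = Pool("memory",  capacity=4_096)    # e.g., GB of DRAM
storage = Pool("storage", capacity=50_000)   # e.g., GB on disk

def provision(job: str, cpu: float, mem: float, disk: float) -> None:
    """Draw only what a job needs from each pool, leaving the rest free."""
    compute.allocate(cpu)
    memory.allocate(mem)
    storage.allocate(disk)
    print(f"{job}: cpu={cpu}, mem={mem} GB, disk={disk} GB")

# A storage-heavy database job and a compute-heavy analysis job
provision("key-value store", cpu=20, mem=64, disk=20_000)
provision("data analysis", cpu=600, mem=512, disk=100)
```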

Cao’s work is designed to leverage this structure. He is creating new and better ways to manage data that are tailored to disaggregated centers. His work specifically concerns persistent key-value stores. In these kinds of systems, data is stored as key-value pairs. A key, a unique identifier such as a Social Security number or a phone number, is used to quickly find records, known as values.

The data is designed to be persistent, or to remain stored for long-term preservation, even if there is a power loss.
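As a rough illustration of both ideas, the Python sketch below keeps key-value pairs in memory for fast lookups but appends every write to a log file and replays that log on startup, so the data survives a restart. The file name and records are made up for the example; production stores such as RocksDB build far more machinery, including log-structured merge trees, compaction and caching, on the same principle.

```python
import json
import os

class TinyKVStore:
    """A toy persistent key-value store: writes go to an append-only log
    on disk before updating memory, so data is recoverable after a crash."""

    def __init__(self, log_path: str = "kv.log"):
        self.log_path = log_path
        self.table = {}          # in-memory index: key -> value
        self._replay()           # rebuild state from the on-disk log

    def _replay(self) -> None:
        if not os.path.exists(self.log_path):
            return
        with open(self.log_path) as f:
            for line in f:
                record = json.loads(line)
                self.table[record["key"]] = record["value"]

    def put(self, key, value) -> None:
        # Persist the write first, then update the in-memory table.
        with open(self.log_path, "a") as f:
            f.write(json.dumps({"key": key, "value": value}) + "\n")
            f.flush()
            os.fsync(f.fileno())   # force the bytes onto stable storage
        self.table[key] = value

    def get(self, key):
        return self.table.get(key)

# The phone number is the key; the record it identifies is the value.
store = TinyKVStore()
store.put("480-555-0123", {"name": "A. Student", "city": "Mesa"})
print(store.get("480-555-0123"))   # {'name': 'A. Student', 'city': 'Mesa'}
```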

Data centers consume significant amounts of energy and water, and Cao’s research aims to conserve these valuable resources. Historically, computer engineers focused on processing speed with little concern for power use. Quicker was always better. But not every task needs to be completed lightning fast.

“It’s good to start redesigning existing data systems for the new data center architecture, focusing more on the tradeoff between performance and sustainability,” Cao says. “We’re redesigning the persistent key-value stores to make them more efficient and provide very precise control. This allows those stores to scale up or down as needed when tasks require more processing or more storage.”

He is forging relationships with key partners, including experts at Samsung, Snowflake, Western Digital and Meta, to ensure the solutions being developed meet industry needs.

Read the full story on Full Circle.