English and most other human languages make use of a feature known as negation. For example, the sentence “Sarah never leaves her house in the morning without coffee” includes two negations: never and without. Yet the most common interpretation of this sentence is positive. People infer that Sarah does actually leave her house — after coffee.

“Roughly 20% of sentences in English use one or more negations. So, they are very common. Even young children understand and use them,” says Eduardo Blanco, an associate professor of computer science in the Ira A. Fulton Schools of Engineering at Arizona State University. “But computers find it incredibly challenging to comprehend negation in human or natural language. Even state-of-the-art machine translation systems experience substantial drops in performance when faced with negations in textual inputs.”

Blanco is an expert in natural language processing, a subfield of computer science that seeks to design algorithms and models that enable machines to better understand people. His research within the School of Computing and Augmented Intelligence, one of the seven Fulton Schools, centers on computational semantics, the construction of meaning representations from text such as human-generated questions.
