January 25, 2016

A new quantum approach to big data


This diagram illustrates the kind of simplified result that quantum analysis could extract from enormous, complex sets of data. Shown here are the connections between different regions of the brain in a control subject (left) and in a subject under the influence of the psychedelic compound psilocybin (right). The analysis reveals a dramatic increase in connectivity, which may explain some of the drug's reported effects (such as "hearing" colors or "seeing" smells). Such an analysis, involving billions of brain cells, would be too complex for conventional techniques but could be handled easily by the new quantum approach, the researchers say. Courtesy of the researchers

System for handling massive digital datasets could make impossibly complex problems solvable.

From gene mapping to space exploration, humanity continues to generate ever-larger sets of data — far more information than people can actually process, manage, or understand.
Machine learning systems can help researchers deal with this ever-growing flood of information. Some of the most powerful of these analytical tools are based on a branch of mathematics called topology, which deals with properties that stay the same even when an object is bent and stretched every which way.

Such topological systems are especially useful for analyzing the connections in complex networks, such as the internal wiring of the brain, the U.S. power grid, or the global interconnections of the Internet. But even with the most powerful modern supercomputers, such problems remain daunting and impractical to solve. Now researchers at MIT, the University of Waterloo, and the University of Southern California have developed a new approach that uses quantum computers to streamline these problems.

The researchers describe their theoretical proposal this week in the journal Nature Communications. Seth Lloyd, the paper's lead author and the Nam P. Suh Professor of Mechanical Engineering, explains that algebraic topology is key to the new method. This approach, he says, helps to reduce the impact of the inevitable distortions that arise every time someone collects data about the real world.

In a topological description, basic features of the data (How many holes does it have? How are the different parts connected?) are considered the same no matter how much they are stretched, compressed, or distorted. Lloyd explains that it is often these fundamental topological attributes “that are important in trying to reconstruct the underlying patterns in the real world that the data are supposed to represent.”
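To make that idea concrete, here is a minimal sketch, not drawn from the paper itself, that computes the two simplest topological invariants of a network in plain Python: the number of connected pieces and the number of independent loops ("holes"). The calculation uses only the list of connections; node coordinates never appear, which is exactly why bending or stretching a drawing of the network leaves the answer unchanged.

```python
# Illustrative sketch (not from the paper): the simplest topological
# invariants of a network, its Betti numbers, depend only on which nodes
# are connected, not on where they sit in space.
# For a graph, b0 = number of connected components and
# b1 = edges - nodes + b0 = number of independent loops ("holes").

def betti_numbers(nodes, edges):
    """Return (b0, b1) for an undirected graph given as node/edge lists."""
    nodes = list(nodes)
    parent = {n: n for n in nodes}

    def find(x):
        # Union-find root lookup with path halving.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    for u, v in edges:
        # Merge the components containing the two endpoints of each edge.
        parent[find(u)] = find(v)

    b0 = len({find(n) for n in nodes})        # connected components
    b1 = len(edges) - len(nodes) + b0         # independent cycles ("holes")
    return b0, b1


# A six-node loop: one connected piece containing one hole, no matter how
# the drawing is stretched, because coordinates never enter the calculation.
loop = [(i, (i + 1) % 6) for i in range(6)]
print(betti_numbers(range(6), loop))           # -> (1, 1)

# Two disjoint triangles: two pieces, two holes.
two_triangles = [(0, 1), (1, 2), (2, 0), (3, 4), (4, 5), (5, 3)]
print(betti_numbers(range(6), two_triangles))  # -> (2, 2)
```

For networks with billions of nodes, tracking such invariants across many scales (the job of algebraic-topology tools) becomes computationally overwhelming for classical machines, which is the bottleneck the proposed quantum approach aims to address.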
