The distinct strengths and limitations of classical, super-, and quantum computers are pushing the boundaries of what machines can do.
From the phone in your pocket to the world’s fastest supercomputers to the still-experimental world of quantum technology, computers come in many forms, each with different strengths and limitations. But what might computers of the future be able to do?
While classical computers power our daily lives, supercomputers tackle some of humanity’s toughest challenges. Quantum computers, meanwhile, may one day solve problems we haven’t even thought of yet. Understanding how these three types of computers differ, and how they build on one another, highlights the unique role each plays in modern technology.
At the University of Rochester, researchers study how to improve the operation of today’s computers, use supercomputers to advance research, and explore fundamental breakthroughs in quantum science and technology. Their work helps connect discoveries in fields such as physics and engineering to advances in computing.
Says John Nichol, an associate professor in the Department of Physics and Astronomy, who studies experimental quantum computing and quantum behavior in nanoscale systems: “It’s a really exciting time to be studying quantum mechanics and computing.”
Classical computers: the everyday mainstay
Classical computers are the machines most of us know and use in our daily lives, from laptops and phones to desktops and servers.
At the heart of every computer is a processor, an integrated circuit (or chip) containing billions of transistors. Transistors are tiny switches that represent bits and can be either a 0 (off) or a 1 (on). The processor performs calculations by turning the transistors on and off, following instructions stored in memory. These instructions come from software (or code), which tells the computer step by step how to process information and carry out tasks.
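For readers who like to see the idea in code, here is a toy sketch in Python (chosen only for readability; a real processor works with hardware transistors and machine instructions, not Python). It stores a number as a pattern of bits and then follows a short list of instructions that flip some of them, the way transistors switch between off and on.

    # A toy illustration only: a number stored as bits, and a tiny "program"
    # that changes those bits one step at a time.

    number = 202                      # an ordinary integer
    print(format(number, "08b"))      # its 8-bit pattern: 11001010

    # The "program" below is just a list of steps to follow in order.
    # Each step toggles one bit, like a transistor switching off/on.
    program = [0, 3, 7]               # which bit positions to flip
    for position in program:
        number ^= 1 << position       # XOR with a one-bit mask flips that bit

    print(format(number, "08b"))      # the bit pattern after the program runs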
STUCK ON YOU: URochester students bring a personal flair to laptops, a ubiquitous form of classical computing. (University of Rochester photo / Nicholas Foti ’19)
Strengths: Many of today’s classical computers have multiple processor cores, usually between 4 and 16, which allow them to work on several tasks at once. Specialized chips for artificial intelligence can contain thousands of smaller processing cores. Classical computers are versatile, affordable, and reliable for daily tasks such as typing documents, browsing the internet, emailing, and streaming videos.
Limitations: Some problems are too large or too complex for classical computers to solve in a reasonable amount of time. For decades, chip performance improved as manufacturers shrank transistors to fit more of them on a chip, a trend known as Moore’s Law. But this approach has its limits. Eby Friedman, a professor in the Department of Electrical and Computer Engineering, studies how to design processors and computing systems to maximize performance and energy efficiency. He explains, “Transistors have been shrinking for decades, but we’re now at the point where the transistors and wires are only a few nanometers wide. You can’t shrink forever.”
Once you reach transistors that are so small they are on the scale of atoms, physics gets in the way: heat builds up, electrical current becomes excessive, and efficiency plummets. Friedman’s research addresses these issues without relying solely on shrinking transistors but instead by rethinking how processors and entire computing systems are designed. His work explores ways to boost performance and energy efficiency through smarter circuits and architectures, allowing classical computers to keep advancing even as the physical limits are reached.
Although classical computers are sufficiently fast for everyday tasks, improving their performance is important for tackling complex engineering problems, analyzing large datasets, and supporting emerging technologies such as artificial intelligence.
WHEN CLASSICAL WAS CUTTING-EDGE
From punch cards to parallel processors, trace URochester’s decades-long history of harnessing computational power.
URochester computing center. (University of Rochester photo / Department of Rare Books, Special Collections, and Preservation)
URochester computing center circa 1970. (University of Rochester photo / Department of Rare Books, Special Collections, and Preservation)
Brian McIntyre and John Wilkerson in 1987. (University of Rochester photo / Department of Rare Books, Special Collections, and Preservation)
Computer terminals in the reference area in 1994 in what is now Lam Square. (University of Rochester photo / Liz Argentieri)
Campus computing station with rows of desktop computers in 2001. (University of Rochester photo / Elizabeth Lamark)
Chase Hermsen in a Hylan computer lab in 2010. (University of Rochester photo / J. Adam Fenster)
Wendi Heinzelman, professor of electrical and computer engineering and of computer science and dean of graduate studies for Arts, Sciences & Engineering, with tablet computers used in her research on wireless communications and networking, mobile computing, and multimedia communication, in her Hopeman Hall lab in May 2014. (University of Rochester photo / J. Adam Fenster)
Supercomputers: the scaled-up powerhouse
Supercomputers take classical computing to the next level. At their core, supercomputers are classical computers, but they are much bigger and faster. Instead of a handful of processor cores, supercomputers harness hundreds of thousands of processors working together.
But supercomputers aren’t the only way to scale up classical computing. Friedman says that many tasks once reserved for supercomputers are now handled by data centers, warehouses full of connected servers. While supercomputers are designed to solve one massive problem at incredible speed, data centers such as Google’s or Amazon’s cloud server farms are built to handle many independent tasks at once, such as hosting websites, storing data in the cloud, and running AI tools. Their power comes from sheer scale rather than from extreme speed on a single problem.
While these systems may be built for different goals, classical computers, supercomputers, and data centers all share the same basic components and principles.
“The technology is optimized differently, the architectures are optimized differently, and the software is optimized differently, but, at the same time, it’s still transistors, they still compute in memory, it’s still binary, and the programming languages are ones you’ll recognize,” Friedman says.
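To make that contrast concrete, here is a loose sketch in Python, not tied to any particular machine. The first part splits one large problem into chunks that workers solve together, the supercomputer pattern; the second runs many unrelated small tasks side by side, the data-center pattern.

    from concurrent.futures import ProcessPoolExecutor

    def partial_sum(bounds):
        """One worker's share of a single large problem."""
        start, stop = bounds
        return sum(i * i for i in range(start, stop))

    def handle_request(request_id):
        """One self-contained task, unrelated to the others."""
        return f"request {request_id} served"

    if __name__ == "__main__":
        with ProcessPoolExecutor() as pool:
            # Supercomputer pattern: carve one big sum into chunks, then combine.
            chunks = [(i, i + 1_000_000) for i in range(0, 8_000_000, 1_000_000)]
            total = sum(pool.map(partial_sum, chunks))

            # Data-center pattern: many independent requests, nothing to combine.
            responses = list(pool.map(handle_request, range(8)))

        print(total)
        print(responses[:3])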
POWER UP: In addition to being one of the world’s most powerful computer systems, URochester’s Conesus supercomputer is among the most energy efficient, as ranked by the TOP500 list. (University of Rochester photo / J. Adam Fenster)
Strengths: Supercomputers excel at computation-heavy problems that would take ordinary computers far too long to solve. They can predict weather, model pandemics, design new structures, and simulate complex systems. At Rochester’s Laboratory for Laser Energetics (LLE), for example, researchers use the supercomputer Conesus to model physics at extreme temperatures and densities, like those at the center of stars. That work includes simulating fusion to advance clean energy and national security applications, analyzing experimental data, and planning more effective experiments.
“The supercomputer Conesus has dramatically enhanced LLE’s computational capabilities, allowing for an understanding of very complex physics phenomena in three dimensions with unprecedented details,” says Valeri Goncharov, LLE’s theory division director and an assistant professor of research in the Department of Mechanical Engineering. “A significant increase in computational speed also opens up opportunities to use simulation results to train AI models, enabling transformative advances in areas like fusion research.”
Limitations: Despite their massive power, and even with advances in classical computer circuits, architecture, and efficiency, there are still calculations that are beyond the reach of a supercomputer. For instance, supercomputers can simulate simple molecules, but simulating complex molecules with high accuracy can overwhelm even the best machines.
“Simulating my favorite molecule—and perhaps your favorite molecule—caffeine, would take a number of transistors roughly equal to the number of silicon atoms in the entire planet Earth,” Nichol says. “We’re actually not far away from individual transistors being the size of an individual atom. If we want to continue the scaling of modern computing technology, we will be confronted with the quantum properties of individual atoms and, in a sense, will be forced into quantum computing.”
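To get a feel for the scale Nichol is describing, consider a back-of-envelope calculation in Python. The assumptions here are illustrative only (a brute-force simulation that stores one complex number, at 16 bytes, for every possible configuration of n two-level quantum systems) and are not the specific model behind his caffeine estimate.

    # Back-of-envelope only; the assumptions (full state-vector storage,
    # 16 bytes per complex amplitude) are illustrative, not from the article.
    BYTES_PER_AMPLITUDE = 16

    for n in (10, 50, 100, 200):
        amplitudes = 2 ** n                      # possible configurations
        terabytes = amplitudes * BYTES_PER_AMPLITUDE / 1e12
        print(f"{n:>3} two-level systems: {amplitudes:.2e} amplitudes, "
              f"roughly {terabytes:.2e} terabytes")

By a few hundred two-level systems, the memory required dwarfs anything classical hardware could provide, which is the blow-up Nichol is pointing to.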
Get to know URochester’s supercomputer
Meet Conesus, the University’s state-of-the-art supercomputer. Named after one of the Finger Lakes, it fuses physics modeling with AI and machine learning to accelerate breakthroughs in high-energy-density science. What once took 30 weeks to compute can now be done in just days.
Quantum computers: the new paradigm
Quantum computers are fundamentally different from classical computers and supercomputers. Instead of ordinary bits, which can only be 0 or 1, quantum computers use quantum bits, called qubits. Due to quantum mechanical principles like superposition and entanglement, qubits can exist in multiple states at once; that is, they can be both 0 and 1 simultaneously, opening up computational possibilities beyond the reach of classical computers. Some quantum computers go beyond the qubit and use qudits, which can take not only the values 0 and 1 but also 2, 3, and so on.
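For readers who want to see superposition written down, here is a minimal sketch using plain NumPy rather than any quantum hardware or quantum software library. A qubit’s state is a pair of complex amplitudes, a gate is a small matrix, and the squared amplitudes give the odds of measuring 0 or 1.

    import numpy as np

    ket0 = np.array([1, 0], dtype=complex)                # the qubit state |0>
    hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # a basic one-qubit gate

    state = hadamard @ ket0                                # an equal mix of 0 and 1
    probabilities = np.abs(state) ** 2                     # chance of reading 0 or 1

    print(state)          # approximately [0.707, 0.707]
    print(probabilities)  # [0.5, 0.5]

The two 0.5s at the end are the sense in which the qubit is “both 0 and 1”: until it is measured, it carries an amplitude for each outcome.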
The foundations of quantum mechanics were developed about 100 years ago, when scientists realized that existing classical physics couldn’t explain certain phenomena, such as why heated metal glows red instead of blue. They discovered that light comes in tiny packets, later called photons. In the decades that followed, these ideas developed into quantum mechanics, the branch of physics that studies how very tiny particles, such as electrons and atoms, behave in ways that are often strange and unpredictable.
FRIDGE BENEFITS: A dilution refrigerator in Nichol’s lab. Such refrigerators can cool atoms to temperatures near absolute zero, which improves the performance of quantum computers. (University of Rochester photo / J. Adam Fenster)
Progress in quantum computing therefore depends on understanding the fundamentals of quantum mechanics, which is why scientists like John Nichol study these systems at their most fundamental level.
“Information is physical,” Nichol says. “Computers manipulate information. Physics tells us how that information is manipulated. So quantum computing is actually a really great example of this interplay between physics and computing.”
Strengths: Quantum computers aren’t currently faster at everything, but they could be transformative in certain areas, most notably encryption and decryption, by allowing new methods for securing and transmitting data. They are also well suited to tasks such as simulating complex molecules and materials, which could transform drug discovery and materials science. Researchers are exploring how quantum computers might help with optimization problems, such as improving logistics or traffic flow.
Limitations: Quantum technologies are in their infancy. Because they are still experimental and expensive, “the main market for quantum computers is governments and other scientists,” Nichol says. One of the biggest hurdles is error correction—developing a quantum computer that can reliably detect and fix its own errors so it can run long, complex calculations without breaking down.
“A universal fault-tolerant quantum computer is still some years away in the future,” Nichol says. “But there are companies and academic research groups that now have quantum processors with hundreds of qubits. These machines are starting to be able to do some pretty significant calculations that can’t be done by even the best classical supercomputers.”
The future of computing
Nichol and Friedman both note that classical computers won’t be going away—at least, not anytime soon. Classical computers will remain essential to power daily life, while supercomputers will continue to tackle the hardest scientific and engineering challenges. Quantum computers, meanwhile, may one day open up entirely new problems and solutions.
So, might we one day have quantum computers in our homes?
Nichol says quantum computers aren’t likely to show up in people’s homes in the near future, since the algorithms and calculations they run aren’t especially useful for everyday tasks. But, he points out, it’s hard to predict how the technology will evolve.
“Suppose we travel back in time to 1960, and we told people what we’re doing today with computers,” Nichol says. “They wouldn’t believe us. I think we’re in the same spot with quantum computers. We have no idea all the things that they’ll be useful for.”