Throughout my career, I’ve always been impressed by Intel’s involvement in a wide spectrum of computing and electronics technologies, ranging from the mundane and practical on one hand to forward-looking and disruptive advances on the other. Many of these weren’t technologies that Intel ever intended to exploit directly over the long term. I think a lot about how Intel facilitated the creation of and early advances in USB. Intel even sold USB chips in the first couple of years of USB’s emergence, but stepped aside from that business with the knowledge that their main focus was selling processors.
USB made computers and a myriad of consumer electronic devices better and easier to use, and that, Intel knew, advanced the whole industry in which their microprocessors thrived. Today, look around your home, your office and even your car and count the number of USB connectors there are. It’s pretty obvious that USB’s impact has been truly universal.
Aside from mainstream, practical solutions like USB, Intel also continues to participate in the most forward-looking compute technologies. Exemplifying that, in January at the Consumer Electronics Show (CES) in Las Vegas, Intel announced two major milestones in its efforts to develop future computing technologies. In his keynote address, Intel CEO Brian Krzanich announced the successful design, fabrication and delivery of a 49-qubit superconducting quantum test chip. The keynote also focused on the promise of neuromorphic computing.
In his speech, Krzanich explained that, just two months after delivery of a 17-qubit superconducting test chip, Intel that day unveiled “Tangle Lake,” a 49-qubit superconducting quantum test chip. The chip is named after a chain of lakes in Alaska, a nod to the extreme cold temperatures and the entangled state that quantum bits (or “qubits”) require to function.
According to Intel, achieving a 49-qubit test chip is an important milestone because it will allow researchers to assess and improve error correction techniques and simulate computational problems.
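As a back-of-the-envelope illustration of why roughly 50 qubits is such a notable threshold (my own sketch, not an Intel figure): simulating n qubits exactly on a conventional computer means storing 2^n complex amplitudes, and that memory footprint explodes right around Tangle Lake’s size. In other words, chips in this range begin to outgrow what researchers can fully check by classical simulation.

```python
# Sketch (author's illustration): memory needed to hold a full quantum
# state vector for n qubits on a classical machine. Each of the 2**n
# complex amplitudes takes 16 bytes (double-precision complex).

def state_vector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

def human(nbytes: float) -> str:
    """Format a byte count with binary units for readability."""
    for unit in ("B", "KiB", "MiB", "GiB", "TiB", "PiB"):
        if nbytes < 1024:
            return f"{nbytes:.0f} {unit}"
        nbytes /= 1024
    return f"{nbytes:.0f} EiB"

for n in (17, 49):
    print(f"{n} qubits -> {human(state_vector_bytes(n))}")
# 17 qubits fit in a couple of megabytes; 49 qubits need about 8 PiB.
```

The jump from Intel’s 17-qubit chip (about 2 MiB of amplitudes) to the 49-qubit Tangle Lake (about 8 PiB) shows why each added qubit doubles the classical simulation cost.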
Krzanich predicts that quantum computing will solve problems that today might take our best supercomputers months or years to resolve, such as drug development, financial modeling and climate forecasting. While quantum computing has the potential to solve problems conventional computers can’t handle, the field is still nascent.
Mike Mayberry, VP and managing director of Intel Labs, weighed in on the progress of the efforts. “We expect it will be 5 to 7 years before the industry gets to tackling engineering-scale problems, and it will likely require 1 million or more qubits to achieve commercial relevance,” said Mayberry.
Krzanich said the need to scale to greater numbers of working qubits is why Intel, in addition to investing in superconducting qubits, is also researching another type called spin qubits in silicon. Spin qubits could have a scaling advantage because they are much smaller than superconducting qubits. A spin qubit resembles a single-electron transistor, which is similar in many ways to conventional transistors and could potentially be manufactured with comparable processes. In fact, Intel has already invented a spin qubit fabrication flow on its 300-mm process technology.
At CES, Krzanich also showcased Intel’s research into neuromorphic computing—a new computing paradigm inspired by how the brain works that could unlock exponential gains in performance and power efficiency for the future of artificial intelligence. Intel Labs has developed a neuromorphic research chip, code-named “Loihi,” which includes circuits that mimic the brain’s basic operation.
While the concepts seem futuristic and abstract, Intel is thinking of the technology in terms of real-world uses. Intel says neuromorphic chips could ultimately be used anywhere real-world data needs to be processed in evolving real-time environments. For example, these chips could enable smarter security cameras and smart-city infrastructure designed for real-time communication with autonomous vehicles. In the first half of this year, Intel plans to share the Loihi test chip with leading university and research institutions while applying it to more complex data sets and problems.
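To make “circuits that mimic the brain’s basic operation” a little more concrete, here’s a toy software model of a leaky integrate-and-fire neuron, a common building block in spiking neuromorphic designs. This is purely my own illustrative sketch, not Intel’s Loihi implementation: the neuron leaks charge over time, integrates incoming current, and emits a spike when its potential crosses a threshold.

```python
# Toy leaky integrate-and-fire (LIF) neuron -- a standard spiking-neuron
# abstraction, sketched here for illustration (not Loihi's actual circuit).

def lif_spikes(inputs, leak=0.9, threshold=1.0):
    """Return a 0/1 spike for each input current sample."""
    v = 0.0                        # membrane potential
    spikes = []
    for current in inputs:
        v = v * leak + current     # leak some charge, then integrate input
        if v >= threshold:         # potential crossed the threshold...
            spikes.append(1)       # ...so the neuron fires a spike
            v = 0.0                # ...and resets its potential
        else:
            spikes.append(0)
    return spikes

print(lif_spikes([0.3, 0.3, 0.3, 0.3, 0.0, 0.9, 0.9]))
# -> [0, 0, 0, 1, 0, 0, 1]
```

Information lives in the timing of the spikes rather than in dense numeric activations, which is part of why spiking hardware can be so power-efficient: the circuit does work only when events occur.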
For me to compare quantum and neuromorphic computing to USB is about as apples-and-oranges as you can get. But, who knows? When the day comes that quantum or neuromorphic chips are in our everyday devices, maybe my comparison won’t seem far-fetched at all.
This appears in the February (331) issue of Circuit Cellar magazine
Jeff served as Editor-in-Chief for both LinuxGizmos.com and its sister publication, Circuit Cellar magazine 6/2017—3/2022. In nearly three decades of covering the embedded electronics and computing industry, Jeff has also held senior editorial positions at EE Times, Computer Design, Electronic Design, Embedded Systems Development, and COTS Journal. His knowledge spans a broad range of electronics and computing topics, including CPUs, MCUs, memory, storage, graphics, power supplies, software development, and real-time OSes.