Commentary
12/11/2015
09:06 AM
Larry Loeb

Spy Agencies Fund IBM's Quantum Computing Research

Big Blue gets a multi-year grant from the US intelligence community for the development of quantum computing technology. A universal quantum computer could tackle challenges such as safeguarding against cyberattacks and speeding up medical R&D.

The US Intelligence Advanced Research Projects Activity (IARPA) has notified IBM that it will award the company's scientists a major multi-year research grant to advance the building blocks for a universal quantum computer. IBM announced the grant on December 8.

IBM is not disclosing further terms of the award because "it is subject to completion of final contract negotiations," according to the company.

IARPA is the research arm of the 17-member US Intelligence Community, which includes the CIA. We think a universal quantum computer would be well suited to intelligence challenges such as deciphering encrypted data, but IBM has other use cases in mind. According to the company, "This type of leap forward in computing could one day shorten the time to discovery for life-saving cancer drugs to a fraction of what it is today; unlock new facets of artificial intelligence by vastly accelerating machine learning; or safeguard cloud computing systems to be impregnable from cyber-attack."

A quantum computer differs from a traditional computer, in which each computational element represents either a 0 or a 1. A quantum computer's atom-sized bits can represent 0, 1, or a superposition of both at the same time. These quantum bits, or qubits, allow the machine to work on multiple parts of a calculation at once, which makes quantum computers far more powerful than traditional ones for certain kinds of problems.
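To make the idea of superposition a little more concrete, here is a rough sketch (our own illustration, not anything from IBM or IARPA) that models a single qubit as a two-element vector of complex amplitudes; the squared magnitudes of those amplitudes give the odds of reading a 0 or a 1 when the qubit is measured.

```python
import numpy as np

# A qubit's state is a 2-element complex vector: amplitudes for |0> and |1>.
ket0 = np.array([1, 0], dtype=complex)          # behaves like a classical "0"
ket1 = np.array([0, 1], dtype=complex)          # behaves like a classical "1"

# An equal superposition of 0 and 1 (the standard Hadamard gate applied to |0>).
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ ket0                                  # amplitude ~0.707 for each outcome

# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(psi) ** 2
print(probs)                                    # [0.5 0.5] -- reads 0 or 1 with equal odds
```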

This all comes at a price. Qubits are fragile hardware: they have to be shielded from heat and electromagnetic interference, and cooled to near absolute zero (-459 degrees F). Otherwise, they return damaging errors.

The award is funded under IARPA's Logical Qubits (LogiQ) program, led by Dr. David Moehring. LogiQ seeks to overcome the limitations of current quantum systems by building a logical qubit from a number of imperfect physical qubits.
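The idea of building one dependable logical qubit out of several unreliable physical ones has a loose classical analogy in redundancy. The sketch below is that analogy only, not the actual LogiQ scheme (real quantum codes must also catch phase errors, which simple copying cannot): it encodes one bit as three noisy copies and recovers it by majority vote.

```python
import random

def noisy_copy(bit, error_rate=0.1):
    """Return the bit, flipped with probability error_rate (a simulated physical error)."""
    return bit ^ 1 if random.random() < error_rate else bit

def logical_read(bit, copies=3, error_rate=0.1):
    """Encode one logical bit as several noisy physical copies and majority-vote the result."""
    votes = [noisy_copy(bit, error_rate) for _ in range(copies)]
    return int(sum(votes) > copies / 2)

# With a 10% physical error rate, the 3-copy logical bit fails only when
# two or more copies flip at once -- roughly 2.8% of the time.
trials = 100_000
errors = sum(logical_read(1) != 1 for _ in range(trials))
print(errors / trials)
```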

(Image: bestdesigns/iStockphoto)

According to an article in Quartz, the most powerful quantum computer that IBM has built contains only eight qubits. This machine implements its qubits as Josephson junctions: two layers of superconductor separated by a thin insulating layer.

[IBM's not the only one making the quantum leap. Read Google, NASA Bet on Quantum Computing.]

By arranging its qubits into a 2D array, IBM has shown how to correct for the two kinds of errors that can occur in this kind of machine: bit-flip errors and phase errors. That is a major step toward getting useful, trustworthy output from a quantum computer.

Alternative approaches, such as those taken by D-Wave, NASA, and Google -- which put the qubits in a line -- can't detect both bit-flip and phase errors at the same time.
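For readers wondering what "bit-flip" and "phase" errors actually look like, the toy sketch below (again our own illustration, not IBM's code) applies the standard Pauli X and Z operators to the superposition state from the earlier snippet. The point it makes is that a phase error leaves the measurement odds untouched even though the state has changed, which is why an error-correcting layout has to track both error types, not just bit flips.

```python
import numpy as np

# The superposition state (|0> + |1>) / sqrt(2) from the earlier sketch.
psi = np.array([1, 1], dtype=complex) / np.sqrt(2)

X = np.array([[0, 1], [1, 0]], dtype=complex)   # bit-flip error: swaps |0> and |1>
Z = np.array([[1, 0], [0, -1]], dtype=complex)  # phase error: flips the sign on |1>

bit_flipped = X @ psi     # amplitudes swapped (invisible here, since they are equal)
phase_flipped = Z @ psi   # (|0> - |1>) / sqrt(2): same 50/50 odds, but a different state

# Measurement probabilities alone cannot reveal the phase error...
print(np.abs(psi) ** 2, np.abs(phase_flipped) ** 2)   # both print [0.5 0.5]
```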


Larry Loeb has written for many of the last century's major "dead tree" computer magazines, having been, among other things, a consulting editor for BYTE magazine and senior editor for the launch of WebWeek. He has written a book on the Secure Electronic Transaction Internet ... View Full Bio
Comments
kstaron (User Rank: Ninja), 12/22/2015 2:35:44 PM
Leaping into the future
Feels like we are getting into the realm of science fiction. If they can figure a way around the errors at normal temperatures, this could change computing and everything that is touched by it, which is basically every aspect of our lives at this point. Can't wait to see what comes of this research.

larryloeb (User Rank: Author), 12/22/2015 2:56:01 PM
Re: Leaping into the future
Well, that's why they are getting all this money thrown at them. Lots of people want it done.