The quantum computer, a promise for the future of the nuclear industry

Philippe Duluc

Chief Technology Officer, Atos Big Data & Security and member of the Scientific Community

 Paul da Cruz

Global Business Development Director for Energy and Utilities at Atos

Posted on: 11 September 2018

The nuclear industry needs more and more computing power to meet the myriad challenges it faces. With today's supercomputers expected to be inadequate within a few decades, quantum computing is generating considerable interest and hope.

As a key component of the energy mix, the nuclear industry faces innumerable challenges in maintaining its current role and preparing for the future. For existing plants, the most pressing need is to reduce costs: for nuclear energy to remain competitive, we estimate that costs must come down by around 30%.

There are various ways to achieve this. Behavioral simulation opens up new scenarios for optimizing the safety and efficiency of operations, giving technicians better information and predicting the behavior of plant equipment with unparalleled accuracy and completeness. Augmented reality means they can act more quickly, more efficiently, and with greater safety, in perfect coordination with the control room. Using data to better control production, or to apply planning and 5D design, makes it possible to optimize maintenance and construction operations.

Based on the Internet of Things, 3D, Big Data, artificial intelligence, and complex simulation, these operational applications require computing power that, until recently, was available only for research.

Computation at the heart of the nuclear industry

The development, industrial validation, and construction of new-generation reactors, such as small modular reactors (SMRs), high- and very-high-temperature reactors (HTRs, VHTRs), and fast neutron reactors (FNRs), require ever greater computing power for modelling and simulation.

While nuclear research and the nuclear industry were among the first to use High Performance Computing (HPC), and the CEA (Commissariat à l’énergie atomique) operates some of the most powerful supercomputers in the world, all of these new developments will only increase demand at an exponential rate. Conventional IT, however, is already reaching its limits: the traditional race for computing power, which is above all a race for miniaturization, is approaching the insurmountable wall of the atomic scale. The sector is therefore watching closely the development of quantum computers, which should make it possible to go beyond this limit and to multiply performance exponentially for certain algorithms.

Quantum, beyond high performance computing

Quantum computers are based on subatomic physical objects (qubits) whose states can encode binary information. The big difficulty is managing to entangle and stabilize these qubits to obtain the equivalent of a processor: qubits form unstable structures, weakened by so-called "decoherence" phenomena. To date, researchers have successfully stabilized twenty qubits in an operational setting and fifty in an experimental setting. Systems known as Noisy Intermediate-Scale Quantum (NISQ) devices tolerate a certain level of disturbance and can offer 50 to 100 qubits whose short periods of stability are sufficient to perform certain operations. Quantum computers are not necessarily destined to become general-purpose machines: they serve as accelerators for certain sections of a calculation, or tackle specific problems for which they are particularly well suited, such as n-body problems or combinatorial optimization.
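To make the ideas of superposition and entanglement a little more concrete, here is a minimal classical state-vector sketch in Python. It uses only NumPy (not the actual QLM software) and simply illustrates the principle: a Hadamard gate puts one simulated qubit into superposition, and a CNOT gate entangles it with a second one.

```python
import numpy as np

# Single-qubit basis states |0> and |1> as 2-dimensional vectors.
ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

# Elementary gates: Hadamard (creates superposition) and CNOT (creates entanglement).
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Start with two qubits in |00>, i.e. the tensor product |0> (x) |0>.
state = np.kron(ket0, ket0)

# Apply H to the first qubit (identity on the second), then CNOT across both.
state = np.kron(H, np.eye(2)) @ state
state = CNOT @ state

# The result is the Bell state (|00> + |11>) / sqrt(2): the two qubits are entangled,
# so measuring one immediately fixes the value of the other.
for bits, amplitude in zip(["00", "01", "10", "11"], state):
    print(f"|{bits}>: amplitude {amplitude:+.3f}, probability {abs(amplitude)**2:.2f}")
```

Note that simulating n qubits this way requires tracking 2^n amplitudes, which is precisely why classical simulation quickly hits a wall and why native quantum hardware promises exponential gains on suitable problems.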

Of course, both of these are of great interest to the nuclear industry! In research, being able to simulate the dynamics of strongly correlated n-body systems would be a decisive step forward in improving our understanding of such systems. As for combinatorial problems, they are a key issue for the industry, such as the optimization of plant shutdowns (a toy sketch follows below). So it is hardly surprising that the US Department of Energy (DOE) is among the first customers of the Atos Quantum Learning Machine (QLM) simulator, or that the CEA is creating an industrial research position focused in particular on developing more robust qubits. From atom to atom, the nuclear industry and quantum computing are walking into the future hand in hand.
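To give a concrete feel for the combinatorial nature of shutdown optimization, the toy sketch below brute-forces a tiny, entirely hypothetical outage-scheduling problem (made-up capacities and prices, unrelated to any real fleet or to the QLM's programming interface). The number of candidate plans grows exponentially with the number of units and weeks, which is exactly where quantum accelerators are expected to help.

```python
from itertools import product

# Hypothetical toy example: schedule refuelling outages for 4 reactor units over
# 6 weeks. Each unit must be offline for exactly one week, and at most one unit
# may be offline in any given week. We look for the plan with the lowest lost
# revenue, given an assumed price forecast per week.

UNITS = 4
WEEKS = 6
CAPACITY_MW = [900, 900, 1300, 1300]          # assumed unit capacities
PRICE = [42.0, 40.0, 38.0, 35.0, 37.0, 45.0]  # assumed EUR/MWh forecast per week

best_cost, best_plan = float("inf"), None

# Brute force over every assignment of one outage week per unit (6^4 = 1296 plans).
# Real fleet-wide problems explode combinatorially, which is why they are natural
# candidates for quantum or quantum-inspired combinatorial optimization.
for plan in product(range(WEEKS), repeat=UNITS):
    if len(set(plan)) < UNITS:     # two units offline the same week: infeasible
        continue
    cost = sum(CAPACITY_MW[u] * PRICE[w] * 168 for u, w in enumerate(plan))
    if cost < best_cost:
        best_cost, best_plan = cost, plan

print("Best outage weeks per unit:", best_plan)
print(f"Lost revenue: {best_cost:,.0f} EUR")
```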



About Philippe Duluc

Chief Technology Officer, Atos Big Data & Security and member of the Scientific Community
Philippe Duluc graduated from Ecole Polytechnique in Paris and initially worked as a military engineer, first for the French Ministry of Defence and then for the Prime Minister’s office, in various security and technology management positions. After 20 years of service, he joined the private sector, first as Corporate CSO for the Orange Group, then as Manager of Bull’s Cybersecurity Business Unit. After putting in place mission-critical systems within Atos’ Big Data & Cybersecurity Division, he is now CTO of Atos’ Big Data & Security Division. He is an adviser to the European Network and Information Security Agency (ENISA) and has a keen interest in the scientific and technical domains driving the development of the information society: cryptography, security, computing, communications, and Big Data, including future directions such as the quantum revolution.

About Paul da Cruz

Global Business Development Director for Energy and Utilities at Atos
Paul is the Global Business Development Director for Energy and Utilities at Atos. With 30 years’ experience, he has worked right across the value chain in nuclear, fossil, and renewables, and has a deep understanding of transmission, distribution, trading, and retail. He has been the European Chair of the American Nuclear Society on ICHMI; a member of the Special Interest Group of the Institution of Engineering and Technology for Power Generation Control; a member of the organizing committee of PowerGrid Europe; and a member of the Steering Group of the Institution of Nuclear Engineers of Great Britain.