
The value of Exascale

Why life sciences must pay attention to exascale computing

Exascale is a measure of computer speed: an exascale system can do an exa-something every second or, in other words, 10¹⁸ somethings a second. Usually, the ‘somethings’ are calculations, so an exascale computer is one that can perform 10¹⁸ calculations (a billion billion) every second.
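
To put that number in context, here is a rough back-of-the-envelope comparison (a minimal Python sketch; the 10 gigaflops-per-core figure is an assumption for illustration, not a measured value):

```python
# Illustrative only: how long an "ordinary" processor core would take to match
# one second of exascale work. The per-core rate is an assumed figure.
EXA_OPS_PER_SEC = 10**18        # exascale: 10^18 calculations per second
CORE_OPS_PER_SEC = 10**10       # assumed ~10 gigaflops for a single CPU core

seconds = EXA_OPS_PER_SEC / CORE_OPS_PER_SEC    # 1e8 seconds of single-core work
years = seconds / (60 * 60 * 24 * 365)
print(f"A single core would need ~{years:.1f} years "
      "to do what an exascale machine does in one second.")
# prints roughly 3.2 years
```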

Power of supercomputing in life sciences

As we approach an era where the supercomputing community is talking more vocally about exascale computing systems, with high performance computing (HPC) teams around the world making concrete plans for their implementation within the next few years, why does it make sense to put resources into what, at first sight, may seem a hubristic goal? These systems will cost hundreds of millions and consume as much power as a small town – are they worth all that investment?

In recent decades, supercomputing has proven itself a highly effective complement to traditional science, with HPC simulation augmenting lab science and the ability to handle large data volumes opening up new investigative domains. The obvious example in life sciences is large-scale gene sequencing and the new scientific avenues opened up by the ability to sequence the genomes not just of individuals, but of whole populations. The huge benefits to disease understanding, as well as treatment and outcomes, are obvious – helping medical treatments become less invasive and more effective at the same time.

Reaching new limits

But there are two reasons why this progress might stall – the first is on the science side, the second is on the computational side. From the scientific point of view, there are many situations where phenomena operate across a very wide range of length scales, or time scales, or both. For example, many cancerous tissues grow at a molecular scale of maybe tens of Angstroms (10⁻¹⁰ m) and at a cellular scale of 100 µm or so, and on up to the human scale of ~2 m. So, length scales vary by perhaps a factor of a billion, all for the same disease.


While we may have effective HPC models for the processes operating at each individual scale, tying these models together effectively is an immense undertaking. In a similar way, time scales vary across many orders of magnitude, from molecular bond vibration in femtoseconds, to protein folding in milliseconds, to disease progression in months or years (and lots in between). This means we know that computer models are generally effective, and we know that we have models that work well for individual aspects of (in this example) cancer; but the computers we have are nowhere near big enough to handle the manifold calculations needed to draw meaningful insights about disease progression.
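
To see why this range of scales is so punishing computationally, here is a rough sketch of the gaps involved (all figures are assumed order-of-magnitude values for illustration, not taken from any particular model):

```python
# Illustrative only: the scale gaps a multiscale cancer model must bridge.
# All figures are assumed order-of-magnitude values, not from a real model.

# Length scales: tens of Angstroms up to the whole human body
molecular_m = 10 * 1e-10        # ~10 Angstroms, in metres
human_m = 2.0                   # whole-body scale, in metres
print(f"Length-scale span: ~{human_m / molecular_m:.0e}x")             # ~2e+09

# Time scales: femtosecond bond vibrations up to months of disease progression
step_s = 1e-15                  # femtosecond timestep
month_s = 30 * 24 * 3600        # one month, in seconds
print(f"Naive femtosecond steps per month: ~{month_s / step_s:.0e}")   # ~3e+21
```

Even at 10¹⁸ operations per second, something like 10²¹ timesteps cannot be brute-forced one at a time, which is why the goal is to couple well-understood models at each scale rather than simulate everything at the finest resolution.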

This brings us to the second issue, on the computational side. We have all been used to a world where, if you couldn't compute something, you simply waited a few years for computers to get faster, and away you went. Unfortunately, the world isn't like that anymore – the core frequency of computer systems hasn't substantially changed in over a decade. What used to happen is that the little flywheel that spits out calculations would spin faster, so software ran faster without anyone having to change it. But there came a point where the physics of silicon made it impossible to spin the flywheel any faster. It has been relatively easy to hide this fact, as processor manufacturers were able to increase the efficiency of their products. But these no-cost or low-cost advances are getting harder to achieve; so we must look elsewhere if we want to solve the sorts of high-complexity challenges discussed above.
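
Since clock frequency is no longer the lever, the realistic route to exascale is massive parallelism. A back-of-the-envelope sketch of what that implies (the per-core rate is again an assumed illustrative figure, not a vendor specification):

```python
# Illustrative only: with clock frequencies flat, exascale performance has to
# come from parallelism. The per-core rate is an assumed illustrative figure.
TARGET_OPS = 10**18             # exascale target: operations per second
PER_CORE_OPS = 10**10           # assumed ~10 gigaflops sustained per core

cores_needed = TARGET_OPS / PER_CORE_OPS
print(f"On the order of {cores_needed:.0e} cores (or equivalent accelerator "
      "lanes) required")
# ~1e+08 -- hence the emphasis on accelerators and on software that scales
```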

Game-changers to come

We see teams around the world making great strides towards solving these problems. These teams are collaborations of hardware, software, data science and applications science specialists, because the way forward requires contributions across a range of disciplines. And for every cure-for-cancer application you can think of (and there are lots more where that came from: smart grids, earthquakes, climate change and its mitigation all need exascale), you can also think of either a game-changing economic application or a defence application.

So, it is my belief that the economies of industrial societies will be dependent on exascale computing for decades to come, just as they have been on supercomputers over the last few decades.

If you would like to hear more about exascale computing and its impact, I’ll be speaking at the CompBioMed Conference, running from 15–17 September. You can view the details and register for this conference here.


By Dr. Crispin Keable

Posted on: August 18, 2021



About Crispin Keable
Head of Big Data and HPC and member of the Scientific Community
With over 25 years’ experience in supercomputing, Crispin has worked through the evolution of High Performance Computing (HPC) from proprietary systems in hardware and software to open standards, open source and commodity technologies. He works with a wide range of scientific and technology organizations to help define, translate and realize their technical and simulation goals.
