What is exascale computing?

Conventionally, the speed of a computer is gauged by the number of arithmetic operations it can perform per second (floating-point operations per second, or FLOPS). At present, the fastest computers run at 93 petaFLOPS, a whopping 93×10¹⁵ operations per second. This threshold will be broken by exascale computers, which will be capable of delivering 10¹⁸ operations per second. To put this in perspective, a standard desktop computer delivers speeds in the range of gigaFLOPS (10⁹ operations per second), so an exascale computer will be more powerful than a desktop by a factor of one billion.
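
To make the arithmetic concrete, here is a minimal Python sketch using the figures quoted above:

```python
# Back-of-envelope comparison of computing scales (figures from the text above).
desktop_flops = 1e9       # a typical desktop: gigaFLOPS, i.e. 10**9 ops/sec
petascale_flops = 93e15   # 93 petaFLOPS, the fastest machine cited here
exascale_flops = 1e18     # the exascale threshold: 10**18 ops/sec

print(f"Exascale vs. desktop:   {exascale_flops / desktop_flops:.0e}x")    # ~1e+09
print(f"Exascale vs. 93 PFLOPS: {exascale_flops / petascale_flops:.1f}x")  # ~10.8x
```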

Barriers to exascale computing

One might think that simply connecting many standard computers together would yield a faster system, by virtue of the machines acting in unison. However, this approach does not work. By analogy, bolting together 1,000 car engines does not produce a car that is a thousand times faster, and the energy such a contraption would consume and the pollution it would create are unfathomable. Hence, a different kind of mechanism is needed to create exascale computers.
A standard computer works by fetching data from a memory unit, transporting it to a computing unit, calculating the results, and moving the results back to memory. If the same mechanism were used for an exascale computer, the energy required would be of a herculean scale, enough to power entire countries. Hence, a saner approach to building an exascale computer is to store data as close as possible to the computing unit that needs it.
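A rough, hypothetical estimate illustrates the scale of the problem. The ~10 picojoules per byte used below is an assumed order-of-magnitude cost for moving data between memory and processor, not a measured figure for any real machine:

```python
# Hypothetical energy cost of data movement at exascale.
# Assumption: ~10 pJ to move one byte between memory and the compute unit
# (an illustrative order of magnitude, not a vendor specification).
energy_per_byte = 10e-12   # joules per byte moved (assumed)

ops_per_second = 1e18      # exascale operation rate
bytes_per_op = 1           # assume each operation moves just one byte

power_watts = ops_per_second * bytes_per_op * energy_per_byte
print(f"Power spent on data movement alone: {power_watts / 1e6:.0f} MW")  # ~10 MW
```

Even under these conservative assumptions, moving the data dominates the power budget, which is why keeping data next to the compute unit matters so much.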
One technique for achieving this closeness between data and computing unit is to extend the layout of traditional two-dimensional electronic circuits by stacking circuits in the third dimension. Not only does this consume less energy, it also increases the speed of data transfer. At present, the major constraint on processing speed is the speed of data transfer, not raw arithmetic. To reach exascale speeds, thousands of such processing and memory units with fast interconnects are required, and producing such a system is prohibitively expensive.
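
To see why data transfer, rather than arithmetic, is the bottleneck, consider a simple roofline-style estimate. All numbers below are illustrative assumptions, not specifications of any particular chip:

```python
# Roofline-style sketch: achievable speed is capped by either raw compute
# or by how fast memory can feed the processor, whichever is lower.
peak_flops = 1e12        # assume a 1 TFLOPS compute unit
bandwidth_bps = 100e9    # assume 100 GB/s of memory bandwidth
flops_per_byte = 0.25    # assume the workload does 1 operation per 4 bytes fetched

achievable = min(peak_flops, bandwidth_bps * flops_per_byte)
print(f"Achievable: {achievable / 1e9:.0f} GFLOPS "
      f"out of a {peak_flops / 1e9:.0f} GFLOPS peak")
# -> 25 GFLOPS: memory traffic, not arithmetic, limits the speed
```

Stacking memory on top of the processor shortens the path the data travels, raising the effective bandwidth and lifting this ceiling.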


Applications of exascale computing

Fight against pollution

Exascale computing opens new vistas in the fight against pollution caused by burning fossil fuels, which still satisfy a major share of the world's energy needs. By understanding the combustion process and the chemical reactions that make it up in greater depth, we can optimize combustion and minimize the pollution it creates. With exascale computing, it may be possible to increase the efficiency of combustion systems in engines and turbines by 25-50 percent. Such increased efficiency would translate into lower consumption of fossil fuels and reduced pollution levels.


Furthering material science

Discovering new materials is critical for the development of novel technologies. One way to do this is to run complex calculations that simulate how materials behave in nature, combined with a massive database of existing compounds that helps combine them into new materials with desirable properties. At present, deep learning and classical simulation techniques are used for this purpose, but exascale computing is an ideal candidate for accelerating the process dramatically. One critical application may be aiding the discovery of more efficient batteries, which would revolutionize energy storage.


Enhancing healthcare

Exascale computing will further cancer research by modelling the protein interactions in the cells of cancer patients and by analyzing the records of millions of patients to arrive at optimal treatment procedures. Treatments for various infections may also be discovered with the help of this technology.


Improving the quality of life

Exascale computing will contribute significantly to our quality of life by improving life in cities. This can be achieved by optimizing infrastructure such as transportation, housing, and energy, which are key determinants of quality of life. With this technology, large-scale calculations can be performed on huge datasets of sensor readings and simulation results, and key insights derived from them. Such inferences will be crucial in designing and planning new cities and creating smart infrastructure, and big data will play a crucial role in this technological evolution.
In addition, weather can be predicted faster and more accurately with the help of exascale computing, and severe weather can be understood more clearly, for example by predicting the paths of hurricanes. This could be a boon for saving precious lives during extreme weather events.


Race to build the first exascale computer

America’s first exascale computer, named Aurora, is expected to be operational by the end of 2021. However, China is giving the US stiff competition in the race to develop supercomputers and is expected to unveil its first exascale machine as early as 2020. Once Aurora comes online in 2021, it is hoped that it will reclaim the title of the world’s fastest supercomputer for the US.