Supercomputing and academic resources marshaled to accelerate research into the virus are shifting gears as vaccines to fight the pandemic near production.

Joao-Pierre S. Ruth, Senior Writer

November 24, 2020

4 Min Read
Image: oz - stock.adobe.com

The COVID-19 High Performance Computing (HPC) Consortium, which formed in March in the early weeks of the pandemic, is updating its focus and taking on more research projects in light of the data gathered about the spread of the virus and with potential vaccines nearing production.

When the consortium initially appeared, the public-private effort brought together supercomputing and other resources from IBM, NASA, Amazon Web Services, Google Cloud, Microsoft, Massachusetts Institute of Technology, Rensselaer Polytechnic Institute, and others. The intent at the time was to apply computing power to epidemiology, molecular modeling, and bioinformatics to support research into COVID-19. Those resources include Summit, an IBM-built supercomputer operated by the Department of Energy with 200 petaflops of compute capacity.

The mission was updated recently to help identify possible near-term treatments for people infected with the virus, in the hopes of improving patient outcomes. The shift in direction stems in part from the volume of data now available about the virus.

The consortium stated that some additional research project areas may include:

  • The use of large clinical datasets to understand and model patient response to the virus.

  • The use of multiple clinical trials to learn and validate vaccine response models.

  • Evaluation of combination therapies by using repurposed molecules.

  • Creation of epidemiological models by using large multi-modal datasets.

Jamie Thomas, general manager for strategy and development with IBM Systems, says the launch of the consortium and the elements it brought together represented a novel concept for pooling compute resources. “There are now 43 parties involved across 15 countries,” she says. The consortium began with 330 petaflops of compute capacity, Thomas says, which offered researchers access to supercomputing power typically limited to universities and national laboratories. In the latest phase, the consortium scaled up capacity to 600 petaflops, she says. “That increases the reach for researchers trying to understand something associated with the pandemic and COVID-19,” Thomas says.

In the first phase of the consortium, she says, resources were applied to research such as a Utah State University study of droplet behavior and the aerosol nature of the virus. Such work can inform understanding of transmission within indoor facilities and of how to protect against viruses comparable to COVID-19 in the future, she says.

Though there have been announcements about breakthroughs in vaccines to combat the virus, Thomas says research needs to continue. “The outcomes we see today are substantial evidence that the rapid rate of discovery has benefited all of us,” she says.

Ongoing research continues to test additional treatment options because not every treatment may prove equally effective for everyone, she says. Iowa State, for example, is studying why certain segments of the population are affected more severely than others, Thomas says.

The next phase for the consortium will include a focus on additional studies of patient outcomes, she says, which may include treatments and drug resistance. Researchers might also look into environmental conditions and other factors related to the virus. Thomas says these efforts establish the need for an ongoing capability in the United States to handle scientific situations of this nature.

There might be a long-term role for such a public-private collaboration to play in the years ahead. IBM CEO Arvind Krishna recently wrote an open letter to President-elect Joe Biden’s incoming administration about developing a science reserve for the betterment of the country. “Perhaps this is an example of what can be attained with a scientific reserve of this nature,” Thomas says.

Progress continues to be made on COVID-19 vaccines, and even after they enter production, it will take time to cover the world’s population, she says. The need to support further research and new areas of study will remain. “Understanding this disease will serve us well for anything that comes forward in the future,” Thomas says. “We have accomplished in a short period of time what would have taken much, much longer without the capabilities of supercomputers.”

Thanks to consortium resources such as Summit, Thomas says, researchers have been able to make progress in days rather than over many months. “That’s why it is so important to have this kind of capability at our fingertips,” she says. Quantum computing, Thomas says, could also be a significant accelerator in addressing complex problems such as the pandemic and other issues in the future. “These next-generation machines are really artificial intelligence caliber supercomputing devices that allow us to achieve these results,” she says.


For more content on supercomputing, follow up with these stories:

What Quantum Computing Could Mean for Software Development

Is Quantum Computing Ready for Prime Time?

Supercomputers Recruited to Work on COVID-19 Research


About the Author

Joao-Pierre S. Ruth

Senior Writer

Joao-Pierre S. Ruth has spent his career immersed in business and technology journalism, first covering local industries in New Jersey, later serving as the New York editor for Xconomy and delving into the city’s tech startup community, and then freelancing for such outlets as TheStreet, Investopedia, and Street Fight. Joao-Pierre earned his bachelor’s in English from Rutgers University. Follow him on Twitter: @jpruth.
