A host of companies, including IBM, Microsoft, and Google, along with universities and national labs, have teamed up to form the COVID-19 High Performance Computing (HPC) Consortium. The new partnership is designed to provide scientists with supercomputing resources as they figure out how to combat the coronavirus-caused disease known as covid-19.
Faced with a rapidly spreading illness, scientists can create thousands of models on supercomputers in order to better understand the epidemic, characterise the virus, and devise potential vaccines and drug treatments. The organisers of the new consortium will provide 16 supercomputing systems to researchers, as well as a community to engage in the fight together.
“The benefit of having the consortium is to speed up and accelerate the scientific discovery that has to happen in order to develop a vaccine, understand the virus, and eventually kill it,” Michael Rosenfeld, vice president of Data Centric Solutions at IBM, told Gizmodo. He said supercomputers might be able to do in minutes or hours what regular computers would take days, months, or years to do.
The consortium currently represents supercomputers from companies including IBM, Amazon, Google, and Microsoft; universities including Massachusetts Institute of Technology and Rensselaer Polytechnic Institute; and Department of Energy National Laboratories including Lawrence Livermore, Oak Ridge, and Los Alamos, as well as NASA and the National Science Foundation. The consortium is encouraging covid-19 researchers to submit proposals through a central portal, which a steering committee will review in order to connect researchers with the right supercomputing resources.
Supercomputing centres have always supplied discretionary computing time for emergencies such as hurricane response, said Kelly Gaither, Texas Advanced Computing Centre’s director of health analytics. She told Gizmodo it was a no-brainer to devote time to fighting the coronavirus.
As for what scientists will actually do with supercomputers during this pandemic, many are trying to understand the structure of the virus and its “spike” protein, as well as how it differs from other coronaviruses, like the virus behind SARS. Supercomputers have already shown their worth in fighting the disease on this front; the Summit supercomputer at Oak Ridge National Laboratory allowed researchers to whittle 8,000 potential virus-fighting molecules down to just 77, for example. Others are using the computers to generate simulations of how the pandemic could play out: when the peak will occur, how long it will last depending on what measures are in place, and which locations will be in most need of supplies.
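At their simplest, the epidemic simulations described above are compartmental models that track how people move between “susceptible,” “infected,” and “recovered” groups over time. The sketch below is a minimal, illustrative SIR model; every number in it (population size, transmission rate `beta`, recovery rate `gamma`) is an assumption chosen for the example, not a figure from any consortium project, and real research models are vastly more detailed.

```python
def simulate_sir(population, infected0, beta, gamma, days, dt=0.1):
    """Integrate the classic SIR equations with a simple Euler step.

    beta  - average infections caused per infected person per day
    gamma - recovery rate (1 / average infectious period in days)
    Returns one (susceptible, infected, recovered) tuple per day.
    """
    s, i, r = population - infected0, float(infected0), 0.0
    history = []
    for step in range(int(days / dt)):
        new_infections = beta * s * i / population * dt
        new_recoveries = gamma * i * dt
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        if step % int(1 / dt) == 0:  # record once per simulated day
            history.append((s, i, r))
    return history

# Illustrative run: 1 million people, 100 initial cases, each case
# infecting ~0.3 people per day and recovering in ~10 days.
history = simulate_sir(1_000_000, 100, beta=0.3, gamma=0.1, days=180)
peak_day = max(range(len(history)), key=lambda d: history[d][1])
```

Even a toy model like this shows why researchers run such simulations: varying `beta` (which distancing measures reduce) shifts the timing and height of the infection peak, which is exactly the kind of question the article says supercomputer-scale versions are being used to answer.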
Researchers have already begun submitting proposals to study the virus using U.S. supercomputing resources. The National Science Foundation (NSF) issued a call for proposals relating to covid-19 earlier this month and has already funded 10 quick-turnaround grants totaling $US1,592,789 ($2,749,868), an NSF spokesperson told Gizmodo.
Rosenfeld told Gizmodo that the consortium offers researchers an opportunity to collaborate in ways they might not have done before, such as by helping one another get their code up and running more quickly on the processors. Gaither said that this encourages scientists from disparate specialties to link up and solve problems in new ways and to think creatively about how to incorporate supercomputers into their research.
While it’s impossible to predict how long this pandemic will last, we can only hope that new scientific advances will help us beat it sooner and increase our defences against future pandemics.