Harvard University researchers have successfully used Google's public cloud infrastructure to replicate the capabilities of a supercomputer for a heart disease study. The approach addresses a common challenge in research: limited access to powerful supercomputers.
The study aimed to simulate a novel therapy for dissolving blood clots and tumor cells in the circulatory system. The researchers faced a familiar problem: they lacked sufficient access to supercomputing resources for their calculations. They could run one simulation on a supercomputer, but its limited availability meant they could not repeat or refine the run.
To overcome this hurdle, the researchers partnered with Citadel Enterprise Americas LLC and used Google Cloud's public infrastructure to recreate a supercomputer-like environment.
Though public cloud platforms are designed primarily for tasks such as web hosting and streaming, the team harnessed thousands of virtual machines on Google Cloud and achieved 80% of the efficiency of dedicated supercomputers. Bill Magro, Google Cloud's chief technologist for high-performance computing, highlighted the potential of cloud technology to solve scientific computing challenges.
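To make the 80% figure concrete, the sketch below shows one simple way such a comparison could be expressed: cloud performance relative to a dedicated machine as a ratio of time-to-solution. The function name and all numbers are illustrative placeholders, not values reported by the Harvard team or Google Cloud.

```python
# Hypothetical sketch: comparing a cloud-cluster run against a dedicated
# supercomputer run on the same problem with an equivalent core count.
# All values are invented for illustration; nothing here comes from the study.

def relative_efficiency(cloud_time_s: float, supercomputer_time_s: float) -> float:
    """Cloud performance as a fraction of dedicated-supercomputer performance,
    measured by time-to-solution (lower time means higher performance)."""
    return supercomputer_time_s / cloud_time_s

if __name__ == "__main__":
    # Example: a job that takes 100 s on the supercomputer and 125 s on the
    # cloud cluster corresponds to 80% relative efficiency.
    print(f"{relative_efficiency(cloud_time_s=125.0, supercomputer_time_s=100.0):.0%}")
```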
While this breakthrough offers an alternative for research organizations that need extensive computing power, experts such as Holger Mueller of Constellation Research Inc. do not find it surprising.
Google Cloud's configurability and advanced capabilities have already been demonstrated in tasks such as translation models. And although the public cloud is emerging as a potential competitor in high-performance computing, supercomputer providers need not be overly concerned: cloud capacity remains constrained by heavy demand, particularly from the growing interest in AI workloads.