- It bears repeating: high-performance computing, big data, and high-speed research networks are essential components of contemporary science.
- They are also inevitable outgrowths of human strengths and desires.
- Distributed computational power saves time and lives.
Whenever we ask researchers whether they could have accomplished their goals without high-performance and distributed computing resources, the question is usually met with a chuckle, followed by some quick math showing how far out of reach the work would have been otherwise.
Though the question has become laughable, it’s hardly a laughing matter. In 2015, computation is essential for scientific discovery, and that means it’s transforming our world.
“A full modeling of all of the behavior of an ultrasound scan would have required a century using the average desktop computer, a time frame that eliminates the value of those simulations.”
That’s the bottom line for Andrew Hesford, senior engineer at HABICO and research assistant professor at the University of Rochester. Hesford spoke with us in February; he and his team are working on ultrasound detection of breast tumors. Thanks to supercomputers, Hesford’s research is honing a doctor’s analytical tools, which means earlier detection and more lives saved.
“If you want to give students a robust experience in science, get them engaged, and provide access they wouldn’t have had otherwise, this is the way to do it.”
That’s key for Sue Schmidt, the North American Network of Science Labs Online (NANSLO) project coordinator at the Western Interstate Commission for Higher Education (WICHE). In May, she spoke to us about bringing lab tools to previously underserved areas. Since some students cannot attend traditional classes and some colleges cannot afford high-end equipment, NANSLO uses national research and education networks like Internet2 to connect students to scientific labs.
Five years ago, such barriers were insurmountable, and students and colleges would have faced difficult choices. But now the network speeds and infrastructure are in place to democratize access to the educational resources our students deserve.
“What took only a few days on Mira would have taken a few hundred years on the average desktop — in other words, not doable.”
So says Michael Deem, chair of the bioengineering department at Rice University. Deem told us in April how his team uses supercomputers to screen 2.6 million molecule-sized sponges (zeolites) used by the petrochemical industry.
Deem’s research has found a way to reduce ethanol purification to a single step, saving time and money for medical and research facilities — and for customers like us. Finding the appropriate zeolite is also improving oil viscosity, which means better fuel efficiency and longer-lasting engines.
If you’ve followed our publication for any amount of time, you’ve probably read some variant of the preceding sentiments more than a few times. It’s obvious in retrospect, but it bears a closer look: computation is transforming our lives.
And, according to Irene Qualters, division director of advanced cyberinfrastructure at the US National Science Foundation (NSF), expectations are increasing. “The community we support is not only multidisciplinary and highly internationally collaborative, but researchers expect their work to have broad societal impact.”
HPC, high-speed networks, big data — these technologies are amplifications of human strengths. They signify we’re expanding our capacities through digital means to understand the enormous complexity within and without us. Our cyber-extension is the latest stage in humanity’s age-old reach for knowledge, and is a harbinger of strides we’ve yet to take. Like every phase in human history, this one may bring peril, but it holds even more promise.