
Revolutionizing ocean observation

Speed read
  • Scientists have an incomplete understanding of the world's oceans
  • Technological advances, from sensors to robotics, are now making ocean data easily accessible
  • 25 years of consistent data collection will shed light on critical issues such as climate change, ocean acidification, and carbon cycling

Although they cover 71 percent of the Earth’s surface, the world’s oceans are still not as well understood as they could be. In fact, we currently have better maps of the surface of Mars than we do of the ocean floor.

Access the ocean. The OOI collects continuous observations of the seafloor, ocean, and atmosphere in the Atlantic and Pacific. Collected data is freely disseminated to scientists researching coastal ocean dynamics, ecosystem health, plate-scale seismicity, seafloor volcanism, and more. Courtesy Ocean Observatories Initiative.

An enormous amount of ocean data still needs to be collected, but it will require a collaborative effort. That's why scientists are turning to large-scale operations like the Ocean Observatories Initiative (OOI).

This project, funded by the National Science Foundation (NSF), utilizes data from a collection of sensors and platforms all over the globe that measure physical, chemical, geological, and biological properties of the ocean.

These systems are managed by Rutgers, The State University of New Jersey; the University of Washington; and Oregon State University. The coalition is led by the Woods Hole Oceanographic Institution.

Of course, collecting so much data demands a robust cyberinfrastructure (CI). To learn how Rutgers made this happen, we talked with Ivan Rodero, OOI CI program manager and associate director of the Rutgers Discovery Informatics Institute (RDI2).

A race to the finish line

The ultimate goal of the OOI is to give efficient and free access to researchers and educators. Here, University of Washington undergraduate student Kearstin Williams samples seawater for ship- and shore-based analyses to verify instruments on the moorings. Courtesy Mitch Elend, University of Washington.

The world would be a truly wonderful place if researchers had unlimited time and funding for their projects. Sadly, that’s often not how modern science works. Like many others, Rodero was working under a major time crunch to complete the OOI’s cyberinfrastructure.

“The NSF decided to move the cyberinfrastructure to Rutgers in 2014, and the deadline for moving into operations was before 2016,” says Rodero. “We had to construct the cyberinfrastructure within 13 months. The anticipated timeframe for doing that was five years.”

Meeting this timeframe was Rodero’s proudest moment on the project, and rightly so. The OOI is a large and complicated system, and it demands a lot from its cyberinfrastructure.

“We have over a hundred servers,” says Rodero. “It’s a data-centric infrastructure. We have about 20 petabytes of storage in tape, and we have four petabytes on SSD and disk across two different (CI) sites.”

Considering the amount of information the OOI handles, this storage architecture makes perfect sense. In 2018, the OOI enabled its research community users to download over 7 terabytes of data in just a single month.

Outside of storage and backup needs, the OOI’s physical infrastructure is massive. It contains 89 platforms that carry a combined total of more than 830 instruments.

These instruments, from cameras and acoustic receivers to fluid samplers and seismometers, have enabled more than 100,000 data products to be designed, built, and deployed. Additionally, almost all of the necessary scientific devices have been delivered:

  • 100 percent of OOI arrays are delivered, deployed, and operational
  • 95 percent of planned moored platforms are delivered and deployed
  • 98 percent of planned instruments are delivered
  • 100 percent of planned gliders are delivered and commissioned

Tracking undersea eruptions. This Bottom Pressure and Tilt Meter installed at the summit of Axial Seamount measures the rise and fall of the seafloor due to melt migration in the subsurface. Courtesy NSF-OOI/UW/CSSF (CC BY-NC-ND, https://creativecommons.org/licenses/by-nc-nd/3.0/us/).

One way to conceptualize the impact of all this data is to look at a real-world example. Axial Volcano, a submarine volcano on the Juan de Fuca Ridge off the Pacific Northwest coast, erupted on April 23, 2015. The OOI’s Bottom Pressure and Tilt Meter (BOTPT) recorded the underwater event, from earthquake tremors to lava flows. Analysis of the collected data helped scientists establish an inflation threshold that can be used to forecast the volcano’s future eruptions.
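To make the idea of an inflation threshold concrete, here is a minimal Python sketch of how bottom-pressure readings could be converted into apparent seafloor uplift and checked against a forecast threshold. This is not the OOI's or the researchers' actual pipeline; the seawater density, threshold value, and sample readings are illustrative assumptions.

```python
# Minimal sketch (not OOI's actual analysis): a drop in bottom pressure
# indicates less overlying water, i.e. the seafloor has risen.
RHO_SEAWATER = 1030.0   # kg/m^3, typical deep-ocean density (assumed)
G = 9.81                # m/s^2

def pressure_to_uplift_m(delta_pressure_pa):
    """Convert a change in bottom pressure (Pa) into apparent uplift (m)."""
    # uplift = -delta_p / (rho * g): lower pressure means a shallower seafloor
    return -delta_pressure_pa / (RHO_SEAWATER * G)

def inflation_alert(pressure_series_pa, threshold_m=2.0):
    """Return True when uplift since the first sample exceeds threshold_m.

    threshold_m is purely illustrative; a real forecast threshold would be
    fitted to the volcano's historical inflation-eruption cycle.
    """
    uplift = pressure_to_uplift_m(pressure_series_pa[-1] - pressure_series_pa[0])
    return uplift >= threshold_m

# A ~21 kPa pressure drop corresponds to roughly 2.1 m of uplift.
readings_pa = [2.6000e7, 2.5990e7, 2.5979e7]
print(inflation_alert(readings_pa))   # True
```

The real analysis at Axial combines pressure and tilt records with corrections for tides and instrument drift, but the underlying thresholding idea is the same.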

What the future holds

Thankfully, it looks like these instruments will be well funded in the coming years. The NSF has awarded a five-year, $220 million contract to organizations involved in the OOI, supporting the project’s planned 25-year lifetime.

Big buoy. A Global Southern Ocean Surface Buoy waits for loading in Chile before the Southern Ocean Array deployment cruise. Credit: Bob Weller, Woods Hole Oceanographic Institution.

Of course, the true goal of this entire operation is to give researchers and educators efficient and free access. The OOI Data Portal allows anyone with an internet connection to investigate a wide variety of ocean data.

“We have served over 150 million requests already since the project started operations,” says Rodero. “That’s over 100 terabytes of data delivered to the user community from over 100 distinct countries.”
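For readers who would rather script their downloads than browse interactively, the sketch below shows the general shape of pulling a time-bounded slice of data over HTTP. The host, endpoint path, site and instrument codes, and credential names are hypothetical placeholders, not the portal's real API; consult the OOI Data Portal documentation for the actual interface.

```python
# Illustrative only: the URL, query parameters, and codes below are invented
# placeholders, not the OOI Data Portal's real API.
import requests

BASE_URL = "https://example-ooi-portal.org/api"     # placeholder host
API_USER, API_TOKEN = "YOUR_USER", "YOUR_TOKEN"     # assumed portal credentials

def fetch_instrument_data(site, instrument, start, end):
    """Request a time-bounded slice of data for one instrument (sketch)."""
    resp = requests.get(
        f"{BASE_URL}/{site}/{instrument}",
        params={"beginDT": start, "endDT": end},
        auth=(API_USER, API_TOKEN),
        timeout=60,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    data = fetch_instrument_data(
        site="AXIAL-SUMMIT",                  # hypothetical site code
        instrument="bottom-pressure-tilt",    # hypothetical instrument code
        start="2015-04-20T00:00:00Z",
        end="2015-04-30T00:00:00Z",
    )
    print("payload received:", type(data))
```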

Scientists have been separated by geography and disciplines in the past, but projects like the OOI show that era is coming to an end. 

“Science is changing a lot, and we are moving from traditional approaches in which a single researcher in a lab would be able to obtain discoveries,” says Rodero.

“Right now, we are in a different era where data is extremely important for research and for enabling interdisciplinary discoveries in a global system. We really believe that having this data is going to give a lot of opportunities.”

