A multi-institutional team of oceanographers and information-technology researchers based at the University of California, San Diego, is developing a science-driven, networked "one-stop shop" cyberinfrastructure that will revolutionize how scientists share data about the world's oceans.
The $400 million Ocean Observatories Initiative (OOI), funded by the National Science Foundation (NSF), marks an unprecedented collaboration to monitor and forecast environmental changes in the oceans on global, regional and coastal scales. Scientists will be able to extrapolate from data gathered by an array of more than 50 diverse sensor types and other scientific instruments that will communicate through permanently installed seafloor cables and satellite telemetry. Scientists will then be able to share data with their colleagues around the world via OOI's networked cyberinfrastructure, which is being implemented by a team of computer scientists, engineers and geophysicists. Partners in the OOI network include the California Institute for Telecommunications and Information Technology (Calit2) at UC San Diego and UC San Diego's Scripps Institution of Oceanography.
"We are facing some real challenges in terms of climate variability and secular change, and we want to be able to collect data, make it available and understand what's happening to the planet," said Principal Investigator John Orcutt, a professor of geophysics at Scripps Oceanography, a distinguished researcher at the San Diego Supercomputer Center and a member of the Calit2 Executive Council. "With this project, we will be pushing the limits of our understanding of ecology, biology and oceanography so we can provide the means for scientists to mitigate threats to the world’s largest ecosystem and the flywheel of Earth’s climate system."
Once it receives final design approval from the NSF, expected later this spring, the OOI project will join the ranks of ocean-monitoring projects already in existence, including the U.S. Integrated Ocean Observing System (IOOS) administered by the National Oceanic and Atmospheric Administration (NOAA), as well as the Global Ocean Observing System (GOOS) managed by the United Nations.
IOOS data, which focus on direct applications to everyday societal needs, will feed into GOOS, while OOI's research will concentrate on discoveries enabled by new technologies. In addition, the NSF’s TeraGrid (an open scientific discovery infrastructure) and the National LambdaRail (an innovative network for research and education) will interface with OOI, permitting scientists to share data faster (low latency), in greater volume (high bandwidth) and with greater flexibility (high performance).
“It is exciting to imagine how OOI will bring us a real-time understanding of the ocean on an enormous scale, which requires automated data product generation and an ability to make that data accessible everywhere," said Calit2 Director Larry Smarr. "The goal is to create a semantic infrastructure that allows the research communities to collectively modify the infrastructure to fit their own needs."
Another important component of the initiative is its effort to broaden science and education outreach. The general public will be able to view and interact with oceanographic data by way of OOI's Web portal and its automated data products. During a lesson on the El Niño phenomenon, for example, a science teacher might use a combination of OOI’s raw scientific data, data graphs and annotated Google Maps to write an application that visualizes ocean temperature distribution near the California coast. A lesson on deep-sea microbial ecologies might involve watching a high-definition video feed that captures an underwater volcano emitting plumes of superheated water.
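A classroom application of the kind described above might look like the following minimal sketch. Everything here is illustrative: `fetch_sst` is a stand-in for a request to OOI's Web portal (no real endpoint or API is implied), and the temperature grid it returns is synthetic placeholder data.

```python
# Hypothetical sketch of a classroom app that colour-codes sea-surface
# temperature (SST) near the California coast. fetch_sst is a placeholder
# for an OOI portal request; the grid it returns is synthetic.

def fetch_sst(lat_min, lat_max, lon_min, lon_max, n=5):
    """Placeholder for an OOI data request: returns an n x n grid of
    sea-surface temperatures (deg C) over the requested bounding box."""
    # Synthetic data: temperatures warm toward the south and east.
    return [[12.0 + 0.8 * row + 0.3 * col for col in range(n)]
            for row in range(n)]

def classify(temp_c):
    """Bucket a temperature into a display category for a colour-coded map."""
    if temp_c < 13.0:
        return "cold"
    if temp_c < 15.0:
        return "mild"
    return "warm"

grid = fetch_sst(32.0, 35.0, -122.0, -117.0)
categories = [[classify(t) for t in row] for row in grid]
for row in categories:
    print(" ".join(f"{c:>4}" for c in row))
```

In a real lesson the category grid would be overlaid on an annotated map rather than printed, but the pipeline is the same: fetch raw data, derive a product, visualize it.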
"The data being gathered through this project will be available to anyone in the world in real time," explained Orcutt. "There will only be a few-seconds delay between the time the data are collected and the time they are made available to the public. Normally, ocean scientists have sole access to the data for about two years before they’re made public, so this is a real revolution in the democratization of oceanography."
Beyond its educational component, the project comprises three physical components designed to collect data on global, regional and coastal scales. The global component, for example, will deploy four deep-ocean moorings – two in the Atlantic and two in the Pacific – which will measure physical, chemical, geological and biological variables and relay the data through satellite communications.
A sea-floor cable (the regional component) will extend 1,000 miles into the ocean from the Pacific Northwest shoreline, providing power and networking for instruments that can be plugged in at several locations in the northeast Pacific. An array of buoys off Massachusetts’ Nantucket Island and the Oregon and Washington coasts makes up the coastal component: the Massachusetts buoys will initially take measurements on the continental shelf, but will be moved to a different location after 5-10 years.
“The scientific opportunity is to collect long-term, continuous time-series measurements,” remarked Frank Vernon, a research geophysicist with Scripps and deputy director of the project. “Before, we could only make episodic or temporary measurements in many regions of the world’s oceans, and when it comes to a source of data this vast, you need to be measuring 24/7.”
Calit2’s Michael Meisinger, a software architect for the project, says that prior to efforts like OOI and IOOS, scientists relied largely on ad hoc forms of data acquisition, storage and collaboration. It was up to individual scientists to know who was researching what – from ocean water temperatures and salinity to currents and plankton counts – and determine how they might pool their resources to address larger systemic issues.
“Even when those relationships were established, collaboration was often hindered by incompatible data products,” explained Meisinger. “What OOI does is put an operating system on top of all the data, so acquisition and sharing can be very broadly and openly available.”
"Researchers will have access to an entire array of applications and computing resources and can thereby cover a much broader range of data,” added OOI cyberinfrastructure project manager Matthew Arrott, a technical professional with Calit2. “We'll be able to get the information from an instrument to an academic institution at 10 gigabits per second speeds in a secure, scalable and self-healing manner."
Chief System Architect Ingolf Krueger, who is a Calit2 participant and associate professor of Computer Science and Engineering in UC San Diego’s Jacobs School of Engineering, envisions the OOI cyberinfrastructure as a “system within a larger effort.”
"The infrastructure we are providing is designed to be service-oriented from its inception, so that others can utilize our services and we can utilize others' services as they become available,” he explained. “This notion of service orientation facilitates scalability, reliability, flexibility and the ability to interface seamlessly with a wide range of resources. This way, it doesn't create a hiccup in the system if someone brings in a new instrument. There are defined processes for these instruments from cradle to grave."
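The plug-and-play behavior Krueger describes can be sketched in miniature. The names below (`ServiceRegistry`, `register`, `read`) are illustrative assumptions, not actual OOI interfaces: consumers look data services up by name through a registry, so a newly connected instrument is just one more registration and causes no "hiccup" for existing consumers.

```python
# Illustrative sketch of service-oriented instrument access; these class
# and method names are assumptions, not the real OOI cyberinfrastructure API.
from typing import Callable, Dict

class ServiceRegistry:
    """Registers instrument data services by name. Adding a new instrument
    is just another register() call; consumers are unaffected."""
    def __init__(self) -> None:
        self._services: Dict[str, Callable[[], float]] = {}

    def register(self, name: str, read: Callable[[], float]) -> None:
        self._services[name] = read

    def read(self, name: str) -> float:
        return self._services[name]()

registry = ServiceRegistry()
registry.register("ctd-temperature", lambda: 11.7)  # existing instrument
registry.register("new-hydrophone", lambda: -42.5)  # "plugged in" later

print(registry.read("ctd-temperature"))
print(registry.read("new-hydrophone"))
```

The design choice mirrored here is that consumers depend only on the service contract (a name and a read interface), never on a particular device, which is what makes the "cradle to grave" lifecycle manageable.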
Construction on OOI is expected to start in the summer of 2009, pending approval by the NSF and the National Science Board. The program is designed to unfold over the course of 25-30 years, with the expectation that scientists will eventually "close the loop" and begin to look for ways to mitigate, in real time, deleterious changes to the ocean environment, Arrott said.
"The greatest innovation of this program over pre-existing programs is the notion of real-time interactivity," he added. "We look toward robotics and autonomous systems and we hope to be able to use the data we have gathered to change the observatory itself, in real-time, and in ways that will be significant to all of us."