Ocean Observatories Initiative (OOI): Status of Design, Capabilities, and Implementation
The National Science Foundation's (NSF) Ocean Observatories Initiative (OOI) will
construct and operate an interactive, integrated ocean observing network. This research-driven,
multi-scale network will provide the broad ocean science community with access to advanced
technology to enable studies of fundamental ocean processes. The OOI will afford observations at coastal,
regional, and global scales on timeframes of milliseconds to decades in support of investigations into climate
variability, ocean ecosystems, biogeochemical processes, coastal ocean dynamics, circulation and mixing
dynamics, fluid-rock interactions, and the sub-seafloor biosphere. The elements of the OOI include arrays of
fixed and re-locatable moorings, autonomous underwater vehicles, and cabled seafloor nodes. All assets
combined, the OOI network will provide data from over 45 distinct types of sensors, comprising over 800 total
sensors distributed in the Pacific and Atlantic oceans. These core sensors for the OOI were determined
through a formal process of science requirements development. This core sensor array will be integrated
through a system-wide cyberinfrastructure allowing for remote control of instruments, adaptive sampling, and
near-real time access to data. Implementation of the network will stimulate new avenues of research and the
development of new infrastructure, instrumentation, and sensor technologies. The OOI is funded by the NSF
and managed by the Consortium for Ocean Leadership, which focuses on the science, technology, education,
and outreach for an emerging network of ocean observing systems.
Cyberinfrastructure for the US Ocean Observatories Initiative: Enabling Interactive Observation in the Ocean
The Ocean Observatories Initiative (OOI) is an environmental observatory covering a diversity of oceanic environments, ranging from the coastal to the deep ocean. It recently passed final design review. Construction is planned to begin in mid-2010, with deployment phased over five years. The key integrating element of the OOI is a comprehensive cyberinfrastructure whose design is based on loosely coupled distributed services, and whose elements are expected to reside throughout the OOI observatories, from seafloor instruments to deep-sea moorings to shore facilities to computing and archiving infrastructure.

There are six main components to the design: four elements in the core capability container, providing services for users and distributed resources, and two infrastructural elements providing core services. The sensing and acquisition component provides capabilities to acquire data from, and manage, distributed seafloor instrument resources, including their interactions with each other and with the infrastructure's power, communication, and time distribution networks. It includes services to publish instrument data and a repository for instrument behaviors and processes. The data management component provides capabilities to distribute and archive OOI data, including cataloging, versioning, metadata management, and attribution and association services. A core component will be an OOI-standard data/metadata model.

The analysis and synthesis element provides a wide range of services to users, including control and archival of models, event detection services, quality control services, and collaboration capabilities to enable the creation of virtual laboratories and classrooms. The planning and prosecution element gives the ability to plan, simulate, and execute observation missions using taskable instruments, and is the cyberinfrastructure component that turns the OOI into an interactive observatory.
The remaining elements are the common operating infrastructure (COI) and the common execution infrastructure (CEI). The COI provides core services to manage distributed, shared resources in a policy-based framework, including a distributed service infrastructure for the secure, scalable, and fault-tolerant operation and federation of the operational domains of authority comprising the OOI. It includes capabilities to manage identity and policy, to manage the resource life cycle, and to provide catalog/repository services for observatory resources. It also manages interactions with resources on an end-to-end basis. The CEI provides an elastic computing framework to initiate, manage, and store processes that may range from initial operations on data at a shore station to the execution of a complex numerical model on the national computing infrastructure.
Drinking From a Firehose: Managing Video From Ocean Observatories
In early 2009, the Monterey Bay Aquarium Research Institute (MBARI) deployed a novel low-light camera
system on the Monterey Accelerated Research System (MARS), a deep-sea cabled observatory in Monterey
Canyon, California, USA. Ample access to power and Ethernet communications enables this moored camera
system, ORCA's Eye-in-the-Sea (EITS), to stream video back to researchers near-continuously for months at a
time. In order to use such large volumes of video data for science, MBARI has assembled a system for
detecting, tracking, cataloguing, and visualizing events. The video management system has two main
components: the Automated Visual Event Detection (AVED) system and the Video Annotation and Reference
System (VARS). As cabled observatories are deployed, the use of long-term video cameras will increase
significantly. An overview of the system will be presented as an example of how other researchers might
manage video data from deployed video systems.
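To illustrate what automated event detection involves at its simplest, the sketch below flags frames that differ substantially from their predecessor. This is a toy stand-in, not AVED's actual algorithm (which performs far more sophisticated saliency-based detection and tracking); the function name, thresholds, and frame sizes are invented for the demonstration.

```python
import numpy as np

def detect_events(frames, threshold=10.0, min_pixels=50):
    """Flag frame indices where enough pixels changed since the previous
    frame to suggest a moving object. A toy stand-in for AVED-style
    event detection on an observatory video stream."""
    events = []
    prev = frames[0].astype(np.float64)
    for i, frame in enumerate(frames[1:], start=1):
        cur = frame.astype(np.float64)
        changed = np.abs(cur - prev) > threshold   # per-pixel change mask
        if changed.sum() >= min_pixels:            # enough motion -> event
            events.append(i)
        prev = cur
    return events
```

A real pipeline would then hand each flagged clip to an annotation system (VARS, in MBARI's case) rather than storing every frame, which is the point of event detection: reducing months of continuous video to a reviewable set of candidate events.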
Design, Observing and Data Systems, and Final Installation of the NEPTUNE Canada Regional Cabled Ocean Observatory
NEPTUNE Canada (NC; www.neptunecanada.ca) will complete most of the installation of the world's first
regional cabled ocean observatory in late 2009 off Canada's west coast. It will comprise five main observatory
nodes (100-2700 m water depths) linked by an 800 km backbone cable delivering 10 kV DC power and 10 Gbps
communications bandwidth to hundreds of sensors, with a 25-year design life. Infrastructure funding ($100M) and
initial operational funding ($20M) are secured. The University of Victoria (UVic) leads a consortium of 12 Canadian
universities, hosts the coastal VENUS cabled observatory, and manages both observatories through Ocean Networks Canada (ONC).
The observatory architecture has a trunk-and-branch topology. Installed in late 2007, the backbone cable loops
from/to UVic's Port Alberni shore station. The wet plant's design, manufacture, and installation were contracted
to Alcatel-Lucent. Each node provides six interface ports for the connection of science instrument arrays or
extensions. Each port provides dual optical Ethernet links and up to 9 kW of electrical power at 400 V DC.
Junction boxes, designed and built by OceanWorks, support up to 10 instruments each and can be daisy-chained.
They accommodate both serial and 10/100 Ethernet instruments, and provide a variety of voltages
(400 V, 48 V, 24 V, 15 V). All backbone equipment has been qualified and installed; shore station re-equipping is
complete; junction boxes are manufactured. A major marine program will deploy nodes and instruments in
July-September 2009; instruments for one node will probably be deferred until 2010.
Observatory instruments will be deployed in the subsurface (in boreholes), on the seabed, and buoyed through the water
column. More than 130 instruments (of over 40 different types) will host several hundred sensors; mobile assets
include a tethered crawler and a 400 m vertical profiler. Experiments will address: earthquake dynamics and
tsunami hazards; fluid fluxes in both ocean crust and sediments, including gas hydrates; ocean/climate
dynamics, including acidification and nutrient fluxes; deep-sea ecosystem dynamics; and engineering and
computer science research.
NC's software system interfaces between users and the cabled observatory, and responds to a three-fold
mandate: acquire data from the various instruments/sensors underwater; provide lifetime storage and
redistribution capabilities for all data; and allow authorized users to remotely and interactively control
experiments. The Data Management and Archiving System (DMAS) is being developed in-house, adopting a
Service-Oriented Architecture (SOA) and using Web Services to expose the functionality of DMAS's various
components. An internal messaging bus, implemented in Java, allows the various functional components to
interact through the publish/subscribe paradigm. DMAS is building a modern environment for
users: data access, data processing, and experiment control within a Web 2.0 environment. This will allow
users, in addition to accessing data and instrumentation, to perform data visualization and analysis online with either
default or custom processing code, as well as to interact with each other simultaneously. These social
networking aspects will reside within NC's new Oceans 2.0 environment.
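The decoupling that a publish/subscribe bus buys can be sketched in a few lines. The sketch below is illustrative only, rendered in Python rather than the Java used by DMAS; the topic names and component roles are hypothetical, not DMAS's actual interfaces.

```python
from collections import defaultdict

class MessageBus:
    """Minimal in-memory publish/subscribe bus. Components register
    interest in topics; publishers need not know who is listening,
    which is what lets acquisition, archiving, and QC evolve independently."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        """Register a callable to receive every message on `topic`."""
        self._subscribers[topic].append(handler)

    def publish(self, topic, message):
        """Deliver `message` to all handlers subscribed to `topic`."""
        for handler in self._subscribers[topic]:
            handler(message)
```

In a real deployment the bus would be a networked, persistent messaging system with delivery guarantees; the design point is the same, though: an instrument driver publishes readings to a topic, and any number of downstream services (archiver, quality control, live plotting) subscribe without the driver changing.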
The observatory is designed to be expandable in its footprint, nodes, and instruments, and provides a
magnificent facility for testing prototypes of new technologies, monitored and demonstrated in real time. NC
and ONC invite new scientific and industrial participation, experiments, instrumentation, and data services.
Real-Time Measurements of a Tsunami Wave Height on DART Records
In 2005-2008, an array of tsunameters (Deep-ocean Assessment and Reporting of Tsunamis, or DART, buoys)
was deployed in the world's oceans. The real-time measurements of instantaneous sea surface height with
1 mm accuracy provided by the buoys are used to detect and measure a tsunami wave, to deduce some of its
characteristics (such as specifying the tsunami source), and eventually to evaluate the tsunami hazard at the
coast in advance of the tsunami's arrival. The accuracy of detecting tsunami waves at DART buoys largely
determines the accuracy of any forecast of the future tsunami evolution. However, the tsunami component of a
DART buoy signal is commonly under a few centimeters in amplitude and is masked by a much more powerful
tidal component with typical amplitudes of one meter or more.
Tidal prediction and digital filtering are the two major techniques for removing the tidal component of a record.
Tidal predictions require a priori knowledge of tidal processes at a particular buoy location and, in the best
case, are accurate to 2-4 cm, which is not accurate enough for tsunami quantification. Extracting the tsunami in
real time via conventional digital filtering (such as convolution with a sliding window) also faces some
difficulties. First, an ongoing tsunami signal is located at the edge of the record. Second, a record might have
missing data; in particular, as a feature of the buoy design, an "event" recording is likely to start with a gap of a few hours.
A filtering technique robust to "edge effects" and gaps in data has been developed at the NOAA Center for Tsunami
Research for de-tiding DART records in real time (Tolkova, E., "Principal Component Analysis of Tsunami Buoy
Record: Tide Prediction and Removal," Dyn. Atmos. Oceans, Vol. 46/1-4, pp. 62-82, 2009). The method employs
a pre-computed orthogonal set of functions defined on a specific time interval.
It has been shown, experimentally and analytically, that the sub-space spanned by these functions holds any
tidal fragment of a specific length recorded anywhere in the deep ocean. In particular, a set of eight such
functions approximates any 24.75-hour-long fragment of any DART tidal record with 0.2-0.3 cm RMS error per
reading. Fitting a fragment of a DART record with these functions provides an efficient tool for solving a number
of problems, such as separating the tidal and non-tidal components of the fragment, filling in gaps in tidal
records, and short-term tidal prediction.
The above technique, applied to existing deep-ocean tsunami records, resulted in the successful extraction of a
tsunami wave with a few mm accuracy from the first minutes of its appearance on a DART record.
Though developed for tidal records, the method might be applicable to a wide range of
processes with a relatively narrow frequency band.
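The core fitting step amounts to a least-squares projection of the record onto a small basis, with the tsunami left in the residual. The sketch below is illustrative only: it substitutes raw tidal harmonics (M2, S2, K1, O1 periods) for the pre-computed PCA-derived orthogonal basis of the published method, and the synthetic record, sampling, and pulse are invented for the demonstration.

```python
import numpy as np

# Illustrative stand-in basis: principal lunar/solar tidal periods in hours.
# The published method instead uses eight PCA-derived orthogonal functions.
TIDAL_PERIODS_H = [12.42, 12.00, 23.93, 25.82]

def detide(t_hours, record):
    """Least-squares fit of a constant plus sin/cos pairs at tidal
    frequencies; the residual approximates the non-tidal (e.g. tsunami)
    component of the record."""
    cols = [np.ones_like(t_hours)]
    for period in TIDAL_PERIODS_H:
        w = 2.0 * np.pi / period
        cols.append(np.sin(w * t_hours))
        cols.append(np.cos(w * t_hours))
    basis = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(basis, record, rcond=None)
    return record - basis @ coef   # residual = record minus fitted tide
```

Because the fit is a projection, a narrow tsunami pulse riding on a meter-scale tide contributes almost nothing to the fitted tide and survives in the residual, which is what makes centimeter-scale tsunami extraction from a meter-scale tidal signal feasible.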
Toward Understanding the Recurrence Cycle of Megathrust Earthquakes and Tsunamis Around the Nankai Trough, Southwestern Japan: Real-Time Monitoring (DONET), Simulation, and Comprehensive Analyses
The Nankai Trough, located off southwestern Japan, is well known as a megathrust earthquake seismogenic zone.
It contains four megathrust seismogenic zones: the Tokai, Tonankai, Nankai, and Hyuga (off Kyushu) zones.
In the Nankai Trough, megathrust earthquakes occur at intervals of 100-200 years.
Therefore, we have to improve the structure model and the recurrence-cycle simulation model to higher
reliability. In particular, estimating the recurrence cycle between the Tonankai and Nankai earthquakes is very
important for disaster prevention. To understand and anticipate the next megathrust earthquakes, we are developing
a real-time monitoring system for crustal activity around the Nankai Trough, and improving simulation models
based on detailed structures and drilling results.
Finally, we have to integrate the results of these research efforts.
The research approaches are as follows:
1) Construct a detailed crustal medium model around the Nankai Trough using controlled sources and seismic
tomography with dense seismic lines and OBS network arrays.
2) Observe crustal activity around the Nankai Trough and northeastern Japan as comparative
research on the Nankai Trough seismogenic zone.
3) Construct a database of long-term plate-coupling dynamics and study the diversity in recurrence pattern
and scale of the next megathrust earthquakes.
4) Develop advanced simulation methods using crustal medium structures, drilling results, and real-time
monitoring data.
5) Improve the large-scale recurrence-cycle simulation model based on theoretical and experimental analyses
and data assimilation.
6) Evaluate precise strong motions and tsunamis for disaster prevention, based on the detailed crustal structure models.
7) Develop a reliable risk-management system for the next megathrust earthquake.
8) Develop and construct the real-time monitoring system around the Nankai Trough
(DONET: Dense Oceanfloor Network system for Earthquakes and Tsunamis).
Finally, to understand the next megathrust earthquakes and to propose an integrated new approach to risk
management, it is quite necessary and important to synthesize the scientific results. Furthermore, if unusual
crustal activities are observed, especially in the case of seismic linkage between the Tonankai and Nankai
megathrust earthquakes, we have to analyze these phenomena and determine whether or not they are
precursors, using real-time monitoring data and comprehensive assessments.
RAMA: Research Moored Array for African-Asian-Australian Monsoon Analysis and Prediction
The Indian Ocean is unique among the three tropical ocean basins in that it is blocked at 25°N by the Asian land mass. Seasonal heating and cooling over this land mass sets the stage for dramatic monsoon wind reversals and intense rains over areas surrounding the basin. These climate variations have significant societal and economic impacts that affect half the world's population. Despite the importance of the Indian Ocean for both regional and global climate, however, it is the most poorly observed and least well understood of the three tropical oceans. This presentation describes the Research Moored Array for African-Asian-Australian Monsoon Analysis and Prediction (RAMA), which has been designed to provide sustained, basin-scale time series data in the Indian Ocean for climate research and forecasting. RAMA is intended to complement other satellite and in situ components of the Indian Ocean Observing System, and it is being implemented through a coordinated multi-national effort involving institutions in several countries. We will review the scientific rationale, design criteria, and implementation status of RAMA. We will also illustrate some of the important intraseasonal-to-interannual time scale phenomena in the region observed with new RAMA time series data. Potential applications of the data for forecasting purposes will also be discussed.
The Mobile Buoy: An Autonomous Surface Vehicle for Integrated Ocean-Atmosphere Studies
A solar-powered Autonomous Surface Vessel (ASV) called OASIS (Ocean-Atmosphere Sensor Integration
System) has been developed that makes measurements spanning the ocean mixed layer and the lower
atmospheric surface layer. An OASIS ASV can be remotely commanded to act as a boat, a drifter, or an untethered
buoy (when programmed to keep station). OASIS has performed cross-shelf transect surveys within the
mid-Atlantic Bight (63 km) and the Gulf of Maine, along with additional field tests to develop techniques for
mapping harmful algal blooms.
One example of the utility of the OASIS ASV involves carbon dioxide (CO2) fluxes: predicting future climate
change will require that scientists understand what controls the exchange of carbon dioxide between the
atmosphere and the ocean interior. OASIS measurements from the Gulf of Maine transect included surface-ocean
CO2 partial pressures from 320 to 670 μatm, sea-to-air CO2 fluxes from -3.2 to +12.2
mmol m^-2 d^-1, upper-ocean currents (0-50 m depth), surface-ocean fluorescence, temperature,
salinity, and several additional measurements. We are also installing a cabled, autonomous ocean mixed-layer
hydrographic profiling system for future deployments.
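The bulk formula behind such flux estimates can be sketched as follows. This is a generic Wanninkhof-style quadratic wind-speed parameterization with commonly cited coefficients, not necessarily the formulation or coefficients used by the OASIS team; the input values in the usage example are illustrative.

```python
def air_sea_co2_flux(u10, k0, pco2_sw, pco2_air, schmidt=660.0):
    """Bulk air-sea CO2 flux in mmol m^-2 d^-1 (positive = sea to air).

    Parameters (units are the conventional ones for this formula):
      u10      wind speed at 10 m height [m/s]
      k0       CO2 solubility in seawater [mol L^-1 atm^-1]
      pco2_sw  surface-ocean CO2 partial pressure [uatm]
      pco2_air atmospheric CO2 partial pressure [uatm]
      schmidt  CO2 Schmidt number for the water temperature
    """
    # Quadratic wind-speed gas transfer velocity in cm/hr,
    # scaled by the Schmidt number relative to 660 (20 C seawater).
    k_cm_hr = 0.251 * u10**2 * (schmidt / 660.0) ** -0.5
    # Factor 0.24 converts (cm/hr)*(mol/L/atm)*(uatm) to mmol m^-2 d^-1.
    return 0.24 * k_cm_hr * k0 * (pco2_sw - pco2_air)
```

With the transect's observed pCO2 range straddling the atmospheric value, this formula yields fluxes of both signs, consistent with the mix of uptake and outgassing reported above: supersaturated water (pCO2 above atmospheric) gives a positive, sea-to-air flux, and undersaturated water a negative one.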
The complete integration of atmosphere and ocean measurements onboard an autonomous navigating
vehicle is a key advance for ocean observation technology and observational science programs. ASVs have
great potential for ocean and climate studies, and can become a major component of earth observation
systems in the coming decades.