
Feature Article

Marine Spatial Data in the Cloud

By Timothy Kearns


TerraSond has collected more than 50 terabytes of data around the world since its founding in 1994.

When Thomas Newman co-founded TerraSond Ltd. (Palmer, Alaska) in 1994, the idea of a terabyte (TB) of data seemed baffling in its magnitude.

Nineteen years later, Newman’s precision geospatial survey firm has collected data around the globe and serves clients from offices on three continents with increasingly advanced technology. TerraSond now deploys equipment that ingests raw data at the rate of gigabytes per hour and processing software that renders it in dozens of formats. Its servers hold more than 50 terabytes of data collected from more than 15 countries.

Data of a volume that once seemed unimaginable is now very real and presents major challenges. “Visually organizing two decades’ worth of survey work in a single place is no easy task,” said Newman. “We’ve spent a considerable amount of time, effort and money seeking an efficient way to manage the data itself.”

TerraSond’s IT department has grown in sophistication and capability over the years, but a comprehensive approach to managing and sharing data at such extreme volumes has remained elusive. As complexity grew, TerraSond began to look to software and server-side solutions from enterprise vendors and consultants. After months and tens of thousands of dollars invested in researching possibilities, the firm sought a different approach.

“We began to realize that managing enterprise-level storage on our own was far more expensive and complex than was worthwhile,” said Newman. “That approach also didn’t solve the problem of easy and secure access for our teams and clients around the world.”

TerraSond started to discuss the potential of cloud storage, particularly with OneOcean Corp., a start-up based in the high-tech hub of Seattle, Washington, which is home to cloud leaders such as Amazon.com and Microsoft.


The Big-Data Problem
“Data is growing faster than IT can manage and there isn’t yet a bandwidth connection on the planet that will allow you to easily upload a terabyte of data with a web browser or attach it to an e-mail,” said Don Davis, president of OneOcean.
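To put Davis’s point in perspective, a back-of-the-envelope calculation is telling. (The connection speeds in this sketch are assumed, illustrative values, not figures cited by Davis.)

    # Illustrative sketch: how long a 1-terabyte upload takes at assumed
    # connection speeds. The speeds are example values, not from the article.
    TERABYTE_BITS = 1e12 * 8  # 1 TB of data, expressed in bits

    for label, mbps in [("10 Mbps", 10), ("100 Mbps", 100), ("1 Gbps", 1000)]:
        hours = TERABYTE_BITS / (mbps * 1e6) / 3600
        print(f"{label}: about {hours:.0f} hours of sustained upload")

Even a dedicated 100-megabit-per-second line, running flawlessly, needs nearly a full day of sustained transfer to move a single terabyte.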

These problems are coming to a head as organizations face the ever-increasing costs and demands of proprietary software formats and ever-tighter limits on time, budget and capacity.

The growth of big data leads to a crisis of big waste, with data that are underutilized after a first application; data that are not shared, accessed or applied beyond the original commission; and data that lie dormant because they are too costly or difficult to deal with.

OneOcean saw the potential to wean users away from dependence on software to get value from their information: data sets expressed as rich abstracts, at a fraction of the size of the source files, agnostic to locked formats and accessible anywhere via the cloud.
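As a minimal sketch of that idea (not OneOcean’s actual implementation, and with hypothetical field names), consider reducing a large set of soundings to summary statistics plus a heavily decimated preview:

    # Minimal sketch of the "rich abstract" idea -- NOT OneOcean's actual
    # implementation. A large sounding set is reduced to summary statistics
    # plus a decimated preview, a tiny fraction of the source size.
    import random

    # Stand-in source data: one million (lon, lat, depth) soundings.
    soundings = [(random.uniform(-150.0, -149.0),
                  random.uniform(60.0, 61.0),
                  random.uniform(-120.0, -80.0))
                 for _ in range(1_000_000)]

    lons, lats, depths = zip(*soundings)
    abstract = {
        "num_soundings": len(soundings),
        "lon_range": (min(lons), max(lons)),
        "lat_range": (min(lats), max(lats)),
        "depth_range": (min(depths), max(depths)),
        "preview": soundings[::10_000],  # 1-in-10,000 sample for display
    }
    print(abstract["num_soundings"], "soundings ->",
          len(abstract["preview"]), "preview points")

An abstract like this carries enough to browse, search and evaluate the data set from anywhere, while the full-resolution source files stay put until they are actually needed.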

Ocean scientists and data analysts alike seek to convert data into valuable knowledge, but data locked in a silo hinders this process. Enterprise software providers help tackle the problem by turning ocean data into maps, charts and other processed products with value.

Turning raw data into a useful product requires controlling it, and software companies have a vested interest in maintaining that control. There is a steep cost to this processing that goes beyond software licenses. The greater concern is the often inevitable side effect: data locked into a proprietary format and stored as files and catalogs in that format.

The dependency on software is deepening as data gets bigger. The volume of marine data is indeed growing at a staggering pace. The means of data collection now range from autonomous drones and self-charging floating buoys to scientific vessels outfitted with potentially dozens of different data-gathering platforms. Just over 10 years ago, a typical hydrographic survey utilizing the latest multibeam echo sounders in approximately 100 meters of water would yield less than 1 gigabyte of data per hour. Modern systems being deployed today average 1 to 10 gigabytes per hour.
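A rough calculation shows what those acquisition rates mean over a full campaign. (The cruise length and daily survey hours below are assumed values for illustration, not figures from the article.)

    # Illustrative sketch: total volume from a survey cruise at modern
    # acquisition rates. Cruise length and daily hours are assumptions.
    hours_per_day = 12
    days = 30

    for rate_gb_per_hour in (1, 10):
        total_gb = rate_gb_per_hour * hours_per_day * days
        print(f"{rate_gb_per_hour} GB/hr -> {total_gb / 1000:.2f} TB per cruise")

At the high end, a single month-long cruise can generate several terabytes of raw data before any processing begins.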

Larry Mayer, director of the Center for Coastal and Ocean Mapping at the University of New Hampshire, speculates that with the advent of new echo sounders that collect water column information and keep multiple pings in the water at once, data collection rates could explode to 400 to 500 gigabytes per hour in the future.

TerraSond’s 50 terabytes are modest compared to some major repositories. NOAA’s National Geophysical Data Center alone is managing more than 750 terabytes of marine data, and adding 20 to 30 terabytes quarterly through its network of private and public survey vessels.

Big data requires specialized knowledge, training, software and, most importantly, a community of individuals who understand both the processing and the products of raw information. Increasingly, these people are dispersed across offices, organizations and countries around the globe, making them far more dependent on the systems that collect and process ocean data today than ever before.

This dependency comes with tangible expenses—licensing fees, training, hardware upgrades, expensive consultants—and a significant opportunity cost to productivity and progress. A single ocean cruise takes weeks or months, utilizes thousands of labor hours and requires a significant financial investment. Furthermore, this type of survey is usually conducted for a single purpose, with the data later shelved. This puts valuable data out of reach for potential recipients in other industries or sectors.

“Data is growing exponentially in volume, velocity and variety,” said Ed Lazowska, the Bill and Melinda Gates chair of computer science and engineering at the University of Washington. “The challenge is putting it to work efficiently and effectively—getting it out of its silos and into the hands of the people who need it.”


Timothy Kearns is OneOcean’s vice president of services. He has worked in technology, hydrography and bathymetry, and is an expert in marine GIS. He is versed in software application design, business development and marketing of maritime-related services, enterprise solution architectures and chart-production systems. He incubated and launched GIS-based systems at Esri and the Canadian Hydrographic Service for the management, modeling and visualization of bathymetric and hydrographic data. At OneOcean, he tackles the big-data challenges of clients with specialized solutions.



