
By Quentin Finney, Navy/Marine Corps Sales Manager, Google


Whether you’re mapping the best routes around a new city, looking up how to get to a restaurant, or deciding which trail to take on a hike, chances are you’ll pull out your smartphone for directions.
But in a submarine, getting directions, data, and information is a much tougher—and more time-consuming—task. Situational awareness (SA) flows in on several screens, each sourcing data differently and each presenting a different perspective.

The challenge to integrate these perspectives and improve the speed and ease of submarine SA to near-smartphone levels drove a team, headed by Google Enterprise, to develop a solution now being tested for Virginia-class subs.

The highly collaborative development process would come to engage many partners, large and small, and involve many of the venues for developing and demonstrating innovations. Google’s prior experience with clients including the National Geospatial-Intelligence Agency and the Office of Naval Intelligence, as well as its long-term cooperative research and development agreements already in place, made it the right partner to help the U.S. Navy find an answer.

The project began two years ago when Mark Steele, Information and Knowledge Manager at COMSUBLANT in Norfolk, Va., got underway with a few submarines. The retired surface warfare line officer had spent his career on ships, but his role as knowledge manager in the submarine force required he learn how information flows while underway—beneath the seas. As he stood in the sub’s control room watching the information flow, he was struck by the difference. He asked: How can the commander get improved situational awareness in less time? Is there a more natural way to display navigation information while submerged?

“Four hundred feet below the surface, there are no bridge wings to look out from to get contextual data—just flat-screen displays of paper charts,” he says. “Current navigation tools (Voyage Management System) display a submarine’s position in two dimensions, while the submarine is operating in a three-dimensional environment.”

Screens with multiple data streams
In a submarine, a crew’s SA is limited to what comes in on its system screens. The commander could be looking at 6 to 36 control screens, each providing different and critical data, and many functioning as digitized versions of a paper chart. One screen might show sonar, another fathometer data, another GPS or radar data, depending on whether the boat is surfaced or submerged.

To make navigation and tactical decisions in real time, data from multiple screens must be rapidly blended, comprehended, assessed, and analyzed. Where does the synthesis of these complex data streams take place? Solely in the submarine commander’s or the officer of the deck’s brain.

Granted, technology developers and crew alike say that’s a pretty good place for such analysis to happen. The combination of experience, expertise, and understanding of mission doesn’t get much better.

But even the sharpest leader can experience fatigue and human error. Understanding and communicating data among humans takes time. And that “server space” in the commander’s brain might be better used another way: in executing a speedier critical decision cycle, for instance, and in exercising that uniquely human faculty, judgment.

As with all technology, separating what’s better done by machine than by humans and managing the user interface are the keys. Submarines had long relied on stovepipe-style systems, with data coming in on independent, non-coordinated streams. The challenge was not only to integrate the streams, but to tap the most complete and unified data—and to make the results as easy as possible to apply to the decision-making process.

That is, it would be like getting directions and traffic information on a smartphone—but in a tougher environment, with more complex data and much higher stakes.

Steele had partnered with Google in the past, exploring potential use cases for search engines onboard submarine networks. As he considered subsurface navigation after his submarine trip, he asked: Can we employ Google Earth technology that everyone knows and trusts in the terrestrial plane and render existing certified navigation data to provide a more natural view of the submarine navigational picture?

So Steele reached out to Google again to explain his hypothesis. In creating the solution, Google and its partners worked with the Navy and key mission partners from Lockheed Martin Area 51 and Johns Hopkins University (JHU) Applied Physics Lab (APL) seeking a rich mission-planning tool that would:

Reduce the margin for error: Let machines do what humans had done.

Shrink the decision cycle: The less time spent building a mental picture, the more time is available to focus on the decision cycle.

Increase speed-to-capability: Get a solution that will reduce training time and stand up quickly.

The goal was, in essence, to reproduce part of the integrative function of the commander’s brain and to get it on the screen quickly, fully visualized using modern data visualization techniques, in a way that didn’t demand a lot of training resources. Also important in development were savings in time and money and improved odds of mission success.

Flexible software meets massive data
What’s happening around you? Where are you going? How will you get there? Geospatial information is the blanket term that covers these basic navigation questions and others. On a sub, these classic queries were being answered with contextually disconnected data sets. Each screen represented an independently operated system with its own baseline reference. Another hitch: the tools and interfaces used to gather and present the data tended to have been developed at different times, meaning they often had differing frames of reference as well.

In addition, the current software didn’t represent the full dimensionality of SA needed. To navigate safely, a car or a ship essentially needs to consider two spatial dimensions: what surrounds it on its own level. For submarines and aircraft, the challenge is upped to three dimensions: what’s around, what’s above, and what’s below. This demands a data solution that can handle multiple layers of information.

Google worked with Steele, taking a minimalist and no-cost approach to developing the initial concept. The Google Enterprise team also engaged longtime partner Thermopylae Sciences + Technology, a small and highly innovative service-disabled veteran-owned technology company. While Thermopylae worked on developing the application, Google provided the massive data capability and spatially rendered framework using Google Earth.

Google Earth began as the product of a small company called Keyhole Inc., funded in part by In-Q-Tel, and has become a widely used virtual globe, map, and geospatial information program. It layers satellite and aerial imagery and other information onto a 3D virtual globe to map terrain all over the earth. A hugely popular commercial product, Google Earth in various forms has also continued to be an important part of defense mapping. Now it was time to put it to the test with submarine use cases.

The approach from the beginning was to use Commercial Off-the-Shelf (COTS) data visualization techniques, which increase familiarity and allow cost savings. Operators would use Google’s 3D globe software as well as its application programming interfaces (APIs) to allow the software components to interact with each other and integrate the data. But building on APIs from scratch can be time-consuming and costly—both factors the Navy clearly seeks to avoid.

Enter Thermopylae’s solution, iSpatial, which leverages common denominators for software that mesh with Google Earth and other Google geo-technologies. The team terms iSpatial an enabler—on top of a Google platform, it helps to build rapid capabilities around a specific mission set.

With it, customers can quickly and easily build platforms over Google Earth. Users would benefit from three integrated layers: extensive geospatial data from Google, their own familiar web interface, and incoming data based on their specific mission and location—all in one user-friendly place. The result is a true fusion plot in real time viewed as a virtual terrain map.
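
To make the layering idea concrete, here is a minimal, purely illustrative sketch, not the Navy’s or Thermopylae’s actual implementation, of how a mission-specific data layer could be published as KML, the open markup format Google Earth loads natively on top of its base imagery and terrain. The contact names, positions, and depths are invented for demonstration.

# Illustrative sketch only: publish a hypothetical "mission layer" as KML,
# the open format Google Earth can load on top of its base imagery and
# terrain. Contact names, positions, and depths are invented examples.
import xml.sax.saxutils as su

KML_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2">
  <Document>
    <name>Mission layer (illustrative)</name>
{placemarks}
  </Document>
</kml>
"""

PLACEMARK_TEMPLATE = """    <Placemark>
      <name>{name}</name>
      <Point>
        <altitudeMode>absolute</altitudeMode>
        <coordinates>{lon},{lat},{alt}</coordinates>
      </Point>
    </Placemark>"""


def build_layer(contacts):
    """Render (name, lon, lat, depth_m) tuples as one KML document string."""
    placemarks = "\n".join(
        PLACEMARK_TEMPLATE.format(
            name=su.escape(name),
            lon=lon,
            lat=lat,
            alt=-abs(depth_m),  # negative altitude = meters below sea level
        )
        for name, lon, lat, depth_m in contacts
    )
    return KML_TEMPLATE.format(placemarks=placemarks)


if __name__ == "__main__":
    # Hypothetical contacts near the Strait of Juan de Fuca.
    layer = build_layer([
        ("Contact Alpha", -124.50, 48.40, 150.0),
        ("Contact Bravo", -124.10, 48.35, 90.0),
    ])
    with open("mission_layer.kml", "w", encoding="utf-8") as f:
        f.write(layer)

In a scheme like this, the base globe, the web interface, and the mission data stay decoupled: the layer file can be regenerated as new data arrives without touching the underlying terrain or imagery.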


Illustration: Mock-up sample of a Virginia-class submarine at depth, viewed within Google Earth.


Familiarity speeds training and trust
One of the biggest advantages the solution offers is simplicity. The user interface is intuitive and familiar to anyone who has used Google Earth, particularly younger operators, so training and adoption time is vastly reduced. The solution is also highly customizable and technology agnostic, allowing Navy users to pick and choose what works for their needs.

“Often when government customers switch to a new technology, it means a full start-over,” says John-Isaac Clark, chief innovation officer at Thermopylae. “We wanted to eliminate that. You’re still using innovative technology, but you don’t have to build it from scratch and it really shrinks the training time.”

Another big consideration: submarines deal in big data. Any solution had to stand up to the rigors and sheer size of high-resolution terrain data and have the capacity to combine and analyze several such huge data streams. Google Earth has extensive data on ocean-floor contours as well as surface data. Tightening up the disparities currently found among chart and map data and moving to a single-map approach can significantly reduce costs.

Space is always at a premium on a sub, and IT space is no exception. Eliminating redundant data sets and pulling everything together on a common foundation platform frees IT space for adding other new capabilities.

Finally, Google Earth has the advantage of trust. Its data sources are authoritative: the system can leverage National Geospatial-Intelligence Agency-certified bathymetric data as well as Navy meteorology and oceanography data, among other key data sources, all while operating in the fully disconnected environments that subs transit.

The security and cloud considerations were already in place. Google Earth clients include the Air Force Weather Agency, Joint Task Force - Homeland Defense, National Geospatial-Intelligence Agency, and Office of Naval Intelligence. Google Earth has a long track record of supporting the geospatial needs of these and other agencies. Google has also earned a reputation for speeding innovation and getting the best out of open, collaborative processes.


Illustration: Sample visualization of a “minimum safe operating envelope,” a box around a sub at depth enabling better off-hull visualization of terrain and water column relationships.


Navigating the presentation process
Fusing Google Earth with iSpatial, Steele and the Google/Thermopylae team produced an unclassified mockup based on a Virginia-class submarine. To demonstrate integrating and digitizing information streams, they rendered navigation data through Google Earth algorithms, building a 3D realization of a vessel’s path. The resulting video tracks the sub as it navigates a common sub transit area: the terrain-rich environment of the Strait of Juan de Fuca and Puget Sound outside Seattle.
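
As a rough, hypothetical sketch of what such a path rendering can look like at the file level (the team’s actual pipeline is not described here), a time-stamped 3D track can be expressed with KML’s gx:Track extension, which Google Earth can animate. The waypoints, times, and depths below are invented.

# Illustrative sketch only: express a hypothetical submerged transit as a
# time-stamped KML gx:Track that Google Earth can animate in 3D.
# Waypoints, times, and depths are invented for demonstration.

# Each fix: (ISO 8601 time, longitude, latitude, depth in meters).
FIXES = [
    ("2013-05-01T10:00:00Z", -124.70, 48.45, 100.0),
    ("2013-05-01T10:30:00Z", -124.40, 48.42, 120.0),
    ("2013-05-01T11:00:00Z", -124.10, 48.38, 140.0),
]


def build_track_kml(fixes):
    """Build a KML document holding one gx:Track from (time, lon, lat, depth) fixes."""
    whens = "\n".join(f"        <when>{t}</when>" for t, _, _, _ in fixes)
    # gx:coord takes space-separated lon lat alt; negative altitude = below sea level.
    coords = "\n".join(
        f"        <gx:coord>{lon} {lat} {-abs(depth)}</gx:coord>"
        for _, lon, lat, depth in fixes
    )
    return f"""<?xml version="1.0" encoding="UTF-8"?>
<kml xmlns="http://www.opengis.net/kml/2.2"
     xmlns:gx="http://www.google.com/kml/ext/2.2">
  <Document>
    <Placemark>
      <name>Transit track (illustrative)</name>
      <gx:Track>
        <altitudeMode>absolute</altitudeMode>
{whens}
{coords}
      </gx:Track>
    </Placemark>
  </Document>
</kml>
"""


if __name__ == "__main__":
    with open("transit_track.kml", "w", encoding="utf-8") as f:
        f.write(build_track_kml(FIXES))

A viewer that understands the gx extensions can then play the track back over terrain, which is roughly the kind of 3D path visualization the mockup demonstrated.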

“The Google team took the algorithm from my garage-based project and developed it into a digitized navigation display that would be more readily conversant to the natural eye,” Steele says.

At the invitation of the Submarine Tactical Response Group (STRG), which is charged with identifying and consolidating fleet tactical needs and prioritizing them for the software developers, and the Submarine Navigation Improvement Program (SNIP), charged with exploring future navigation capabilities, Steele presented a working model from a laptop at JHU APL. STRG, which is chaired by SUBDEVRON 12, makes recommendations that become the basis for those presented to the acquisitions community to guide technical development.

Presenting to STRG opened up the ability to get the Google Earth technology into the acquisitions process and fueled involvement from more partners. Next, Steele and the Google team would shift focus to showing leaders just what this solution could do.

Subsequent actions included attendance at the Submarine Technology Symposium and exposure to the process developed through a new Navy innovation workshop called Tactical Advancements for the Next Generation (TANG), set up by SUBDEVRON 12, staff from the Undersea Warfare Business Area in JHU APL’s Force Projection Department, and NAVSEA’s Program Executive Office, Integrated Warfare Systems (PEO IWS 5). There, a group specifically charged with innovation tried the navigation model and tweaked it, providing further improvements to the system’s functionality for Submariners.

The solution was then installed in the Area 51 Lab in Manassas, Va., the Lockheed Martin future-concepts lab designed to evaluate and showcase systems for PEO IWS. As a systems integrator for other submarine programs, Lockheed Martin helped set up the project for ongoing testing and demonstration. The installation gave commanders and other Navy members the opportunity to try out the solution and give feedback.

At every stage, the response was highly positive. In spring of 2013, after several rounds spent evaluating varied alternatives, PEO IWS selected Google Earth to begin forming a common geospatial foundation. Google Earth is now planned for inclusion in the architecture of the next 42 submarines as part of technical insertion 14, and the system will begin rolling out with Advanced Process Build 15.

These same tools could have broader applications as well. The navigation model has also caught the eye of surface fleets. Those who have tried the solution say they see the possibilities: Google Earth mapping could be used on every Navy ship. Commanders of all types of vessels would see Navy data atop their own familiar platforms built on top of Google Earth. And this experience would be available in support of missions anywhere in the world, with or without an Internet connection.

While the main purpose is to make conditions on subs safer and empower faster decision-making, the solution also serves another end: It shows how ideas that start with Sailors themselves can take advantage of commercial technologies and partners for a better solution for all.

“The application of COTS technology in this capacity has the real possibility of helping Submariners be more effective in their tactical and navigational decision-making processes,” Steele says, “further enabling their ability to handle more complex scenarios with greater probability of safety and mission success.”

TANG Juices Up Innovation
An important step in sharpening the new solution was to present it to those who would be working with it in the field. That’s where Tactical Advancements for the Next Generation (TANG) came in.

There’s sometimes a perception in the military that out-of-the-box thinking can be unpopular or even discouraged. Yet coming up with creative solutions on the fly can be where a Sailor is at his or her best—and the military has been the site of some of the most exciting technological innovation. To get in touch with that innovative spirit, SUBDEVRON 12 and the Johns Hopkins Applied Physics Lab (APL) Force Projection Department’s Undersea Warfare Business Area helped establish the TANG workshops.

Applying innovation techniques borrowed from the corporate and tech worlds, the workshops would gather ideas from junior submarine officers and sonar and fire control technicians. Civilian clothes and free brainstorming were the rules at the first such workshop, held in San Diego in late 2012.

An APL report described the workshop: “TANG worked because of submarine culture: focused, agile, and willing to try new things. ‘The environment inspires a can-do attitude, as well as creativity and the ability to find workarounds,’ says Don Noyes, Operator Machine Interface (OMI) Working Group co-chair, of the Signal and System Analysis Group in the USW Business Area. ‘They’re always problem-solving. It’s the culture of the sub force to always innovate.’”

Sidebar source: http://www.jhuapl.edu/newscenter/stories/st121119.asp