Augmented Reality Brings Digital Information Into View
By Marianne Molchan, President,
and Ric Walker, Maritime Safety and Security Specialist
Molchan Marine Sciences
Augmented Reality (AR) is changing the way we receive information in our homes, on street corners and on military platforms. AR applications combine a real-world scene with a virtual scene, augmenting the user's view with relevant information.
Brunswick, Maine-based Technology Systems Inc. (TSI) has created a working AR application with the U.S. Navy called the Augmented Reality Visualization of the Common Operational Picture (ARVCOP).
The software provides users with access to intelligence and navigation information via a unique display. This increases the user's situational awareness by taking a wide variety of relevant information and rendering it into a real-world view in 3D and in real time.
Military users of ARVCOP have reported increased navigational accuracy and shorter operational timelines for critical missions. Field testing of ARVCOP with newly trained operators has repeatedly shown that users require less training time to accomplish the same tasks than colleagues operating without it.
The software works by combining real and virtual imagery. For example, when an operator is navigating a vessel in coastal waters, rivers or along the shoreline using a global positioning system and electronic charts, information from the charts is overlaid onto real-time video imagery of the surrounding environment. Data associated with specific locations or objects are georeferenced and positioned in their correct location in the combined video image. For navigational purposes, a camera image may be augmented with information such as route waypoints, bearing to next waypoint, vessel heading, local hazards, channel centerline, etc.
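The georeferencing step described above can be illustrated with a simplified sketch: given the vessel's GPS position and heading and the camera's horizontal field of view, a charted waypoint is placed at the correct column of the video frame. This is an illustrative approximation, not ARVCOP's actual code; the function names (`bearing_to`, `overlay_x`) and parameters (a 1,920-pixel-wide image, a 60° field of view) are assumptions chosen for the example.

```python
import math

def bearing_to(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees true."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360

def overlay_x(own_lat, own_lon, heading_deg, wp_lat, wp_lon,
              image_width_px=1920, hfov_deg=60.0):
    """Pixel column where a georeferenced waypoint appears in the camera image,
    or None if the waypoint lies outside the camera's field of view."""
    # Relative bearing, wrapped to the range [-180, 180) degrees
    rel = (bearing_to(own_lat, own_lon, wp_lat, wp_lon) - heading_deg + 180) % 360 - 180
    if abs(rel) > hfov_deg / 2:
        return None
    # Map relative bearing linearly across the image width (pinhole approximation)
    return int(round((rel / hfov_deg + 0.5) * image_width_px))

# A waypoint dead ahead renders at the center of the frame;
# one 10 degrees to starboard renders right of center.
print(overlay_x(43.90, -69.90, 0.0, 43.91, -69.90))
print(overlay_x(43.90, -69.90, 350.0, 43.91, -69.90))
```

In a real system this projection would also account for camera pitch and roll from an inertial sensor, lens distortion and the waypoint's elevation, but the core idea is the same: every overlay element is anchored to a geographic position, then reprojected into the live image each frame.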
The virtual navigation aids and virtual 'rails' on either side of the ship track provide additional steering guidance to the vessel operator in fog (or at night) for improved track-keeping performance.
Since this digital information is layered over the real-world image, operators can also see real-world hazards and threats and their physical relationship to the planned route.
Navigators can also enter premission information, including mine fields, safe navigation lanes, kelp beds and loiter boxes into their electronic database. They can then transpose this information via ARVCOP onto the real-world video image. Operators use this information for mission planning, execution and post-mission analysis.
Providing this critical link between sensors and real-time and archived data allows navigators, tactical action officers and sensor operators to communicate with each other with a common understanding of the environment and mission-related information.
ARVCOP's Genesis and Future
ARVCOP shares its technical roots with video gaming and simulation. Mixing virtual and live war fighters, with computer simulation operating in the real world, was a logical step in the evolution toward more sophisticated AR applications for air, land and sea.
ARVCOP and its predecessors evolved in response to capability requirements from a variety of government, academic and commercial customers. The software was further refined based on customer feedback and design staff input.
As interest in ARVCOP grew, additional support for AR system development came from several sources, including the U.S. Department of Defense.
The Maine Technology Institute's economic stimulus development program awarded a grant supporting Pervasive AR, a personal AR system in which the user wears a set of glasses and a small belt-worn unit that transmits the augmented data into the wearer's view. Under this effort, core system algorithms were devised that allowed the virtual world to be georegistered on real-world, real-time video.
Next, the U.S. Coast Guard explored the possible use of AR in developing a 'virtual buoy' system. The system would require users to have a visual capability (video or wearable goggles) to view virtual buoys in all visibility conditions. This would reduce the need for and expense of setting and maintaining buoys that are run down, dragged, lost, icebound or simply in need of repair.
AR developments that would enhance the core system capabilities were subsequently developed and patented by TSI. Following the patents, use of a nonmilitary AR navigation system was also explored by the megayacht community.
Navy Warfare Development Command and the Coast Guard Research and Development Center supported a limited objective experiment in which ARVCOP, including the virtual buoy data, was installed on the bridge of the high-speed vessel Joint Venture (HSV-X) and used in various conditions. HSV-X operators were impressed by the ease of navigating into ports with complicated entrance channels in fog and at night. In reduced visibility, the augmented view provided a much more intuitive display of critical navigation information, reducing mental workload for the operators and improving safety, they reported.
The U.S. Navy's Small Business Innovation Research (SBIR) program also supported the development of a modular mission planning toolkit to capitalize on ARVCOP's ability to incorporate mission planning functions. This effort resulted in a plug-in architecture to support a range of geographic information system products, communications systems, sensor packages, and analysis and display capabilities.
Into the Wild
The Navy's Explosive Ordnance Disposal Mobile Unit 1 uses ARVCOP on their 11-meter rigid-hulled inflatable boats (RHIBs). These craft have both operator and navigator stations and have the capability to be linked to a headquarters station on shore. RHIB operators are responsible for deploying and retrieving divers, marine mammals and unmanned systems in harm's way. They are largely night operators. ARVCOP provides essential information about preprogrammed hazards, retrieval points and navigation information for improved mission performance.
The Navy has also been examining a tailored version of ARVCOP, which is being developed to support operations in unique environments, such as rivers. One capability being investigated will use unmanned aerial vehicles (UAVs) for riverine operations, with the goal of creating a mosaic of near-real-time aerial images of the operational area. During a riverine training exercise, ARVCOP-equipped boats and vehicles shadowed active components to compare their effectiveness with the units using standard equipment. Metrics such as 'time to kill' and 'time to rescue' vividly demonstrated the advantages of AR in enhancing battlefield effectiveness.
The need for unmanned system operators to monitor multiple environments such as sea surface, underwater, desert and other rugged terrain inspired the development of the Unmanned Planning and Execution Augmented Reality System (U-PEARS). U-PEARS takes the mixed reality concept a step further by providing the operator with a seamless transition between manned, remote and autonomous operations in all environments. ARVCOP incorporates all navigation systems into a single integrated display, and U-PEARS provides this display at the remote site, thus enabling the operator to proceed as if he or she were on the craft.
Amphibious assault vehicle (AAV) operators have their own challenges during navigation in predetermined assault lanes from ship to shore. To address this issue, ARVCOP-AL (assault lane) has been developed for the AAV. ARVCOP cameras and data presentation for both the operator and assault troops below deck provide improved situational awareness that is intended to offset the problem of limited visibility caused by the AAV's low position in the water.
A related product, the ARVCOP-MPS (mission planning and support) station, is used by the navigator of an expeditionary craft and is the primary user interface for controlling information flow between the operator and all external systems. This allows access to mission-critical data from headquarters, as well as external data sources such as the Mine Warfare and Environmental Decision Aids Library. The system supports interoperability with unmanned underwater vehicle and UAV systems and accepts direct, real-time video feeds. Finally, it also provides a real-time position interface for tracking team assets.
ARVCOP embodies not only a technical blending of real and virtual worlds in a powerful end product, but it also uniquely merges capabilities developed in domains ranging from gaming to military combat operations. AR is the accumulation of the best features from a range of evolutionary and revolutionary developments, captured by intelligent design and driven by a truly diverse range of end-users.
Marianne Molchan is the president of Molchan Marine Sciences (MMS), a veteran and woman-owned small business enterprise supporting the development and execution of projects in international marine transportation safety, mine warfare, maritime security, littoral and underwater operations, and marine technology transfer.
Ric Walker joined Molchan Marine Sciences as the maritime safety and security specialist after 30 years of federal service at the U.S. Coast Guard Research and Development Center. Walker's interests center on advanced navigation systems and underwater port security.