‘High-velocity learning’ at Keyport


What do View-Master, Google Cardboard, and a free Captain America VR viewer from Kellogg’s have to do with national defense?

Answer: they, or something like them, might someday be part of the Navy’s new learning strategy based on research being done at the Naval Undersea Warfare Center — Keyport.

In January, newly installed Chief of Naval Operations Adm. John Richardson introduced a strategy that he called “high-velocity learning.” The plan called for making better use of technology in order to achieve learning goals, according to military.com.

While Richardson’s plan to improve learning throughout the fleet is still in its infancy, he said officials have already launched a pilot program at Navy shipyards.

The Naval Postgraduate School says the term “high-velocity learning” was coined by Steven J. Spear in his book, “The High-Velocity Edge,” which explores methods for building a system of “dynamic discovery” — attacking and solving problems when they occur, converting weaknesses into strengths, sharing information and developing leaders invested in their subordinates’ successes.

“You bring people in the door at a great rate and great commitment by the nation and you run into a training program that really hasn’t adapted for decades,” Richardson said. “How long does it take to bring someone in through the door, get them processed and then make them an effective worker down on the shop floor, in the dry dock, whatever their job may be?”
More elements of Richardson’s high-velocity learning strategy have yet to roll out. According to his campaign design for the fleet, the Navy will also expand its use of simulators, online gaming and analytics in the near future to improve training and use of technology as a tool.

Enter the virtual/augmented-reality research program at Naval Base Kitsap — Keyport, where researchers have been working with what they call “mixed reality systems” since 1999, when they developed their first vest-worn computer, according to Corey Countryman, systems engineer for the project.
With Navy Innovative Science and Engineering funding, NUWC Keyport is investigating the maintenance and training efficiencies, and the total ownership cost reductions, that can be gained by applying augmented- and virtual-reality display technologies to the kinds of industrial tasks involved in the military system maintenance and training support the center routinely provides. The R&D project aims to build prototype systems that give the operator a combination of visual and auditory information tied to procedural tasks or training objectives, according to the article “Augmented Reality in the US Navy.” That article notes that prior research indicated a potential 30 to 50 percent reduction in time-on-task and a 50 percent reduction in rework compared with traditional paper and fixed-computer procedural displays.

To help understand the terms “augmented reality” and “virtual reality,” think Pokemon Go and World of Warcraft. The first is an example of augmented reality, where you see an image of the real world with a computer-generated overlay; the second is an example of virtual reality, where everything you see is computer-generated. Put the two together, and you get “mixed reality.”
The lab space where the work gets done is tucked away in one corner of a warehouse on the Keyport base. There you find the aforementioned View-Master VR viewer, along with Google’s and Kellogg’s cardboard VR goggles. Insert your smartphone into one of the latter, download the apps and there you are: 360-degree, 3-D virtual reality. Simple and inexpensive.

Augmented reality research is made possible by a motion-capture setup that would not be out of place in a small movie studio. Stand at the worktable in the center of the setup and you are surrounded by LED spotlights and motion-capture cameras hanging from a spidery rectangular aluminum framework. Put on the AR spectacles and you see what’s physically in front of you on the table — in this case a dummy computer rack. But, thanks to the glasses, instructions that appear to float in mid-air explain how to properly insert a software card into the rack. Attempt to put the card in upside down and the motion-capture cameras pass that information along to a computer that graphically shows you how to do it right.

At the moment, the AR rig is something of a Rube Goldberg lash-up, with various items taped and glued to the frames of the glasses, which in turn are tethered by black cables to two nearby personal computers. But that’s only to be expected, explained Countryman, the project’s civilian engineer. The group’s mission is to survey what is available in the civilian game technology sector and evaluate its strengths and weaknesses. The ultimate goal is to write the specifications and requirements for new technologies and products that can safely meet the Navy’s needs, and to map out the components required for every aspect of the system.

“Our objective is tools to help fleet personnel do their jobs better,” Cmdr. Deborah White said. An aerospace experimental psychologist, she was transferred to the project when her husband, Cmdr. Robert Jezek, assumed command of the Naval Submarine Support Center Bangor last February.
The major choke point, both agree, is that the new technology needs to be safe from a security perspective.

According to Countryman, game technology may look great but it often is not robust enough from a security perspective to meet Navy requirements. “We ask ourselves what can we put into place to make it safe?” he said.
The Keyport team is part of the Navy Augmented Reality Community (NARC), a self-organized group with a common passion for AR/VR and a desire to work more effectively across Naval Research Enterprise (NRE) boundaries to advance the state of the art. The NARC held its first face-to-face meeting in San Diego in July 2015. Here are some of the security challenges the group identified:


Wi-Fi

Some of the best potential applications of AR in the Navy require a wireless connection for data retrieval and communication, yet Wi-Fi is currently rare in the Navy. Where it has been officially implemented, it is in office spaces rather than where most of the work is done — on ships or in depots. The process of connecting a new device to these rare Wi-Fi networks is onerous. Security is a valid concern with these networks; high levels of security are possible, but the work has not been done to verify how secure or insecure they are.


Bluetooth

Many emerging devices in the AR realm connect to one another over Bluetooth. Current policy prohibits the use of Bluetooth devices on ships.


Cameras

Computer vision is a big part of how AR interfaces with the world, and it requires camera input into the system. However, cameras are prohibited in many sensitive spaces, including entire shipyards and areas where classified data is present.

Mobile devices

The Navy’s adoption of mobile devices has been slow, though there are signs of movement on that front. Even when devices are approved, they are often not used to their full advantage; instead they serve solely as phones or for email. While they could replace binders and paper, they are not yet viewed as a permissible alternative.


Experimentation

Before any technology can progress toward operational use, it must be tested in an operational environment. The barrier to entry for this sort of experimentation currently means a newly developed technology is tested only once or twice a year. With the rapid advance of information and computing technology, that barrier produces an increasing lag between what is available commercially and what is approved for experimentation.

Ideally, sailors would be involved in technology development from cradle to grave, but every time software or hardware is changed, or a new component is added to the system, the system must be re-certified. Even the process for obtaining approval to demonstrate a stand-alone system is onerous: without flag-level intervention, a simple demonstration takes a minimum of six months after the system has been locked down.

Agile development is impossible within the current experimentation environment. The industry standard for testing in an agile environment is closer to a two-week development-and-test period. It is difficult today to imagine the Navy being agile enough to accommodate two-week development/experimentation cycles, but that is precisely what will be needed as this technology continues to accelerate. Part of the problem is that the same process used to reconfigure essential combat systems is applied to minor experiments that do not affect any other system, and the tolerance for risk is extremely low.

Human testing

The process for getting approval to collect feedback or conduct human testing is the same. An Institutional Review Board (IRB) review is required, and obtaining permission to begin testing can take six months to a year. That presents a barrier to entry for exactly the kind of work needed to determine whether a technology or process will even improve capability. As with experimentation, the same strict rules and multiple layers of approval appear to apply to human testing that endangers a sailor’s life and health as to a test that simply asks a sailor whether they prefer the blue button or the red one.

Data in the Cloud

Some emerging services, such as voice recognition and image recognition, require immense amounts of computing power and are best processed using third-party cloud services such as Google’s or Amazon’s. Because these services are not government-owned, we cannot assume they are secure enough to hold even unclassified, sensitive information. The Navy needs to determine how secure these platforms are, how they can be used, and for what types and classifications of information. Depending solely on government-owned and -managed cloud resources is likely to be impractical and less effective.

“Why [should] I care”

One big initial challenge will be helping policymakers understand why AR is important and why they need to enact policies regarding AR.

About the Naval Undersea Warfare Center Division, Keyport

The Keyport base opened in 1914 as the Pacific Coast Torpedo Station, performing torpedo repair, ranging and testing, and operating a torpedo school. In 1930 it was renamed the Naval Torpedo Station, and it was the major center of torpedo production and testing during World War II. It became the Naval Undersea Warfare Engineering Station in 1978, and in 1992 its responsibilities expanded when it became the Naval Undersea Warfare Center Division, Keyport. The other NUWC division is located in Newport, Rhode Island.

Today, NUWC Keyport’s mission is to provide advanced technical capabilities for test and evaluation, in-service engineering, maintenance and industrial base support, fleet material readiness, cybersecurity, and obsolescence management for undersea warfare. While its focus is on the undersea fleet, the program works with other Naval Surface and Undersea Warfare Centers and with several universities, including the University of Washington.
To learn more about the challenges and benefits of employing mixed reality in the naval environment, see “Augmented Reality in the US Navy: A Roadmap,” http://thearea.org/download/published/white-paper/AR_in_the_Navy-A-_Roadmap_DistA.pdf.