
The Situational Awareness Challenge

Shipboard technology should be designed around human factors


Published Dec 9, 2016 4:49 PM by Gary Gomez

Today’s maritime decision-maker must act on an increasingly complex set of situational awareness factors: a vast operating space; common sea lanes but no established routes of the kind air traffic control provides; unavoidable and often severe weather; vessel security; and intermodal issues such as port status and rail and truck connections. The visual and mental complexity of these factors can be overwhelming.

In response, surveillance technologies continue to provide more and more data in the often mistaken belief that more data leads to better decision-making. Google the term “maritime situational awareness,” for instance, and the images you get are of sensor networks and maps filled with dots. But situational awareness is not a technology. It is a state of mind, and the technology needs to be designed around the human brain.

Unfortunately, technology development has thus far focused on adding more dots rather than on designing situational awareness systems around the human component, and we need to reverse that trend.

The Human Component

Situational awareness is defined as “the perception of the elements in the environment within a volume of time and space, the comprehension of their meaning, and the projection of their status in the near future.” (i) Note that “perception,” “comprehension” and “projection” describe mental acts of understanding, analyzing and deciding. In other words, information systems do not create situational awareness. They enable it by providing the raw data on which situational awareness is based.

So the proper integration of these human-factors considerations into system design should enhance the relationship between the data and the decision-maker, the goal being to create an information environment compatible with the way decisions are actually made.

Providing additional data, supplemented by artificial intelligence, may be appropriate in some cases, especially if the previous volume of information was inadequate or presented poorly. But increased data levels can heighten user concerns about reliability, stemming from uncertainty about the system’s logic for filtering information or a lack of user control over the automation process. (ii) Uncertainty can also derive from concerns about data credibility and format.

The human decision-maker tends to work from a risk-mitigation rather than risk-avoidance perspective, and he or she can be comfortable making decisions based on imperfect information. Situational awareness systems can help mitigate some of the uncertainty by providing context and exposing anomalies so the decision-maker can make adjustments.

One way to achieve this is through adaptive situational awareness, in which the human operator can adjust the level of automation based on factors such as system performance and human confidence levels.
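
A minimal sketch of that idea, assuming a hypothetical track list in which each track carries a confidence value (the level names, the track structure and the 0.6 threshold are illustrative assumptions, not drawn from any fielded system):

```python
from enum import Enum

class AutomationLevel(Enum):
    MANUAL = 0    # no filtering: the operator sees every raw track
    ASSISTED = 1  # low-confidence tracks are flagged but still shown
    MANAGED = 2   # low-confidence tracks are suppressed entirely

def visible_tracks(tracks: list[dict], level: AutomationLevel,
                   min_confidence: float = 0.6) -> list[dict]:
    """Select the tracks to display for the operator-chosen automation level."""
    if level is AutomationLevel.MANUAL:
        return tracks
    if level is AutomationLevel.ASSISTED:
        # annotate rather than hide, so the operator keeps the full picture
        return [{**t, "flagged": t["confidence"] < min_confidence} for t in tracks]
    return [t for t in tracks if t["confidence"] >= min_confidence]

# Dialing down from MANAGED lets the operator check the automation's
# filtering against the raw picture when confidence in the system drops.
tracks = [{"id": "V1", "confidence": 0.9}, {"id": "V2", "confidence": 0.4}]
print(visible_tracks(tracks, AutomationLevel.ASSISTED))
```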

A ship’s location and projected location, for example, are common pieces of information in situational awareness systems. But not all data are equal, and the projected course and location of a vessel rest on a number of complex factors. The decision-maker can be helped by qualifying this data and showing, on request, the probable accuracy of a given projection and the factors on which it is based: Is the ship reporting via AIS? How accurate have this ship’s past reports been? What does the historical data on its movements show? And how accurate have previous projections for this ship been?
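
To make this concrete, here is one way (hypothetical factor names and weights, not a calibrated model) those four questions could be folded into a single confidence score displayed alongside a projection:

```python
from dataclasses import dataclass

@dataclass
class ProjectionInputs:
    """Factors behind a vessel's projected position (illustrative only)."""
    ais_reporting: bool               # is the ship reporting via AIS?
    past_report_accuracy: float       # 0..1: historical accuracy of this ship's reports
    movement_history_coverage: float  # 0..1: how much historical movement data exists
    past_projection_accuracy: float   # 0..1: accuracy of earlier projections for this ship

def projection_confidence(p: ProjectionInputs) -> float:
    """Fold the factors into one 0..1 score the display can show on request.

    The weights are assumptions for this sketch; a real system would
    calibrate them against observed projection error.
    """
    return (0.25 * (1.0 if p.ais_reporting else 0.0)
            + 0.25 * p.past_report_accuracy
            + 0.20 * p.movement_history_coverage
            + 0.30 * p.past_projection_accuracy)

# An AIS-reporting ship with a strong track record scores high:
print(projection_confidence(ProjectionInputs(True, 0.9, 0.7, 0.85)))  # ≈ 0.87
```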

Display Complexity

Another source of problems is display complexity, driven by factors like information density, grouping, layout, and the pace and consistency of information updates. (iii) User interface displays – intended to better manage the presentation of information – often have the opposite effect, distracting the user from a consistent train of thought with additional windows.
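
As a small illustration of managing update pace (the class name and the one-second interval are assumptions for the sketch), incoming updates can be batched so the display refreshes on a steady rhythm instead of redrawing with every new report:

```python
import time

class DisplayThrottle:
    """Batch incoming updates so the screen refreshes at a steady pace."""

    def __init__(self, interval_s: float = 1.0):
        self.interval_s = interval_s
        self._pending: list = []
        self._last_flush = time.monotonic()

    def submit(self, update) -> None:
        self._pending.append(update)  # hold updates until the next redraw

    def flush_if_due(self):
        """Return a batch for the renderer once per interval, else None."""
        now = time.monotonic()
        if self._pending and now - self._last_flush >= self.interval_s:
            batch, self._pending = self._pending, []
            self._last_flush = now
            return batch
        return None
```

A steady, predictable redraw rhythm trades a little latency for a display that does not flicker and jump, which is one way to keep update pace and consistency under control.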

The physical layout of the situational awareness center is another consideration. Human factors and ergonomics can help determine where and how display screens are positioned, where individuals sit based on the type of information they handle, and how information flows among them.

How to Account for Intuition?

While there are many ways to integrate human factors into the design of situational awareness systems, an ongoing challenge is trying to incorporate the elusive quality of intuition. How do you capture the intuition of the experienced mariner?

Intuition is, unfortunately, not taught in a classroom. It is not found in a guidebook. And while an information system cannot substitute for that human capability, it can be designed to take intuition into account by determining the type of information to be presented and the proper format and context to stimulate further thought. Decision-makers can help in the design process by providing input ranging from unstructured anecdotal experience to structured, objective, rules-based direction.

One common technique to capture this knowledge is the so-called Critical Decision Method (CDM), developed in order to “elicit information regarding expert decision making . . . for system development.” CDM “has been applied in a number of domains, including . . . air traffic control . . . naval warfare, [and] rail.” (iv)
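
By way of illustration only (CDM defines its own interview probes and analysis procedures), a design team might record such sessions in a simple structured form, so the cues experienced mariners report can feed directly into display design:

```python
from dataclasses import dataclass, field

@dataclass
class CDMProbe:
    """One probe question and answer from a CDM session (hypothetical schema)."""
    question: str  # e.g. "What were you noticing at that point?"
    answer: str    # the mariner's account, kept verbatim
    coded_cues: list[str] = field(default_factory=list)  # cues coded by the analyst

@dataclass
class CDMIncident:
    """A critical incident walked through with an experienced mariner."""
    summary: str
    probes: list[CDMProbe] = field(default_factory=list)

    def candidate_display_cues(self) -> list[str]:
        """Collect coded cues as candidate elements for the display design."""
        return [cue for p in self.probes for cue in p.coded_cues]
```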

Functionality vs. Usability

In the final analysis, it is up to the end-user to ensure that the situational awareness system displays the right type and amount of information in a useful fashion. Money is always a factor in IT expenditures, but when conducting the cost-benefit analysis, keep in mind the risks of focusing too much on functionality at the expense of usability.

 

Gary Gomez is a technology consultant based in Washington D.C. He is a retired naval aviator and the author of numerous studies on information support to decision-makers.

Endnotes

i Mica R. Endsley, Betty Bolte and Debra G. Jones, “Designing for Situation Awareness” (Boca Raton, FL: CRC Press, Taylor & Francis Group, 2003), 13.

ii Hasmik Atoyan, Jean-Marc Robert and Jean-Remi Duquet, “Human Factors Analysis of Different Types of Uncertainties in Complex Systems” in Elisa Shahbazian and Galina Rogova (Editors), “Human Systems Integration to Enhance Maritime Domain Awareness for Port/Harbour Security” (Fairfax, VA: IOS Press Inc., 2010), 64.

iii Endsley, Bolte and Jones, op. cit., 141-142.

iv Neville A. Stanton, Paul M. Salmon, Guy H. Walker, Chris Baber and Daniel P. Jenkins (Editors), “Human Factors Methods: A Practical Guide for Engineering and Design” (Burlington, VT: Ashgate Publishing Company, 2005), 98.

The opinions expressed herein are the author's and not necessarily those of The Maritime Executive.