Monday, March 17, 2008

Tactical Decision Making Under Stress (TADMUS)

The Tactical Decision Making Under Stress (TADMUS) decision support system program was initiated by the U.S. Navy in response to the 1988 accidental shootdown of an Iranian Airbus aircraft by the USS Vincennes. The Vincennes was a state-of-the-art guided missile cruiser with the Navy’s most advanced Combat Information Center (CIC), equipped with the AEGIS combat system. The AEGIS system was itself a decision support system that should have helped to prevent the accidental destruction of the Iranian aircraft. A paper on the development of TADMUS can be found at: http://www.pacific-science.com/kmds/TADMUS_DSS.pdf

Aegis, which means shield, is the Navy’s most modern surface combat system. It was designed and developed as a complete system, integrating state-of-the-art radar and missile systems: the missile launching system, the computer programs, the radar, and the displays all work together, making Aegis the first fully integrated combat system built to defend against advanced air, surface, and subsurface threats. The system is capable of simultaneous warfare on several fronts -- air, surface, subsurface, and strike. Its anti-air warfare elements include the AN/SPY-1B/D radar system, the Command and Decision System, and the Weapons Control System.

According to the article, emotional stress might have had an effect on the decision making process that led to the tragic incident described above. The TADMUS program was established to answer how stress might affect decision making and what might be done to minimize those effects. According to the article, “developing a prototype decision support system (DSS) that minimized the effects of stress was one of the goals of the TADMUS project. The approach taken in designing the DSS was to analyze the cognitive tasks performed by the decision makers in the shipboard Combat Information Center (CIC) and then to develop a set of display modules to support these tasks based upon the underlying decision making processes naturally used by the CO/TAO team.”

The results of the study showed that experienced decision makers were not particularly well served by the current systems in demanding missions. In 87% of the decisions made, the evaluators determined that the information transactions associated with tactical situation assessment involved the subjects trying to match observed events in the scenario to events they had previously experienced. In 12% of the cases, the subjects developed a novel hypothesis to explain the events they were observing. With regard to selecting a course of action, 94% of the subjects applied tactics based upon established rules of engagement, while the remaining 6% developed a strategy extrapolated from their previous experience.
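This recognition-primed pattern can be made concrete with a small sketch: an assessor first tries to match a new track against profiles drawn from prior experience, and only falls back to constructing a novel hypothesis when no match exists. All names, profiles, and thresholds below are invented for illustration; they are not taken from the TADMUS paper.

```python
# Hypothetical sketch of recognition-primed track assessment:
# match observed track features against known profiles first,
# and signal "no match" (novel hypothesis needed) otherwise.
from dataclasses import dataclass

@dataclass
class TrackProfile:
    name: str
    speed_kts: float      # typical speed in knots
    altitude_ft: float    # typical altitude in feet
    iff_response: bool    # answers identification-friend-or-foe query

# Illustrative experience base, not real tactical data.
KNOWN_PROFILES = [
    TrackProfile("commercial airliner", 450, 30000, True),
    TrackProfile("attack aircraft", 550, 2000, False),
    TrackProfile("patrol helicopter", 120, 500, True),
]

def assess_track(speed_kts, altitude_ft, iff_response, tolerance=0.25):
    """Return the best-matching known profile name, or None to signal
    that the assessor must generate a novel explanation."""
    def close(observed, expected):
        return abs(observed - expected) <= tolerance * max(expected, 1.0)

    for profile in KNOWN_PROFILES:
        if (close(speed_kts, profile.speed_kts)
                and close(altitude_ft, profile.altitude_ft)
                and iff_response == profile.iff_response):
            return profile.name
    return None  # no experiential match: fall back to hypothesis generation
```

In this toy model the 87%/12% split in the study corresponds to how often `assess_track` returns a profile name versus `None`.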

The tests also showed that experienced decision makers were not well served by the current DSS in demanding missions. The teams experienced periodic loss of situation awareness, often linked to limitations in human memory and shared attention capacity. According to the article, “environmental stressors such as time compression and highly ambiguous information increased decision biases, e.g. confirmation bias, hyper-vigilance, task fixation, etc. Problems related to decision bias included: (a) carrying initial threat assessment throughout the scenario regardless of new information (framing error) and (b) assessing a track based on information other than that associated with the track (i.e. old intelligence data, e.g. confirmation bias).”

The prototype DSS was developed with the goals of: (a) minimizing the mismatches between cognitive processes and the data available in the CIC to facilitate decision making; (b) correcting the shortcomings of the current displays, which imposed high information processing demands and exceeded the limitations of human memory; and (c) transforming data from numeric form to a graphical representation wherever possible. Basically, the prototype DSS would use improved graphics to increase human recognition of the meaning of the data. The goal of the new displays was to reduce errors, reduce workload, and improve adherence to the rules of engagement.
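Goal (c) can be illustrated with a minimal sketch, assuming nothing about the actual TADMUS displays: a raw number such as a track's remaining range is rendered as a proportional bar that an operator can read at a glance instead of parsing digits under time pressure.

```python
# Illustrative numeric-to-graphical transformation (not the TADMUS design):
# render a closing track's range as a text bar that shrinks as the threat
# closes, so the operator recognizes urgency without reading the number.
def range_bar(range_nm, max_range_nm=40, width=20):
    """Render remaining range as a proportional bar, e.g. '[#####.....] 10nm'."""
    filled = round(width * min(range_nm, max_range_nm) / max_range_nm)
    return "[" + "#" * filled + "." * (width - filled) + f"] {range_nm:g}nm"
```

The same idea scales up to the paper's display modules: the underlying data is unchanged, but its presentation is matched to human pattern recognition rather than to mental arithmetic.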

The article is extremely interesting because it addresses two key issues. First, the need to constantly improve an existing DSS due to changes in the environment the DSS was designed to support. Second, the use of graphical displays to reduce bias in the decisions reached by the system’s users. The article also supports several of the observations reached in our text, including the limit on the amount of information an individual can process at a given time, and how individuals process information differently based upon their level of experience.

I have had the pleasure of working with the original decision support system (AEGIS) while at the Naval Academy. The system required extensive use to become familiar with the various graphic symbols, input devices, automated functions, etc. The combat simulations that I dealt with were not what I would describe as complicated. The use of graphics to help evaluate more complicated, target-rich environments would be helpful. In the example given in the paper concerning the accidental shootdown of the Iranian Airbus, I would not have considered that particular scenario a difficult problem to track with the AEGIS system. The Iranian tragedy was most likely the result of stress.

The detection, evaluation, and prosecution decisions that must be made by both the DSS and the human users must in some cases be made in a matter of moments. Many systems have been automated, but few captains want to trust the safety of their commands to a software program. Failure to do so resulted in the USS Stark being struck by two French-built Exocet anti-ship missiles fired from an Iraqi fighter. The ship’s defensive systems tracked and identified the incoming missiles as hostile, and had the Phalanx close-in weapon system been activated, it would most probably have engaged and shot down both missiles. Human error, a failure to trust the defensive system to properly perform its mission, resulted in the ship being struck by both missiles. Based upon historical data and personal observation, I would suggest further automation of several of the tasks currently performed by humans, to increase response time and eliminate bias on the part of the decision maker. In addition, the individuals who use these systems must be better trained and must be taught to trust the system to reach the proper decision and manage the ship’s weapon systems against the various threats.
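The kind of rule-based automation advocated above can be sketched in a few lines. The rules, thresholds, and priority order here are invented for illustration; they are not actual rules of engagement, and a real system would keep a human in the loop for weapons release authorization.

```python
# Hedged sketch of automated engagement logic under simple, invented
# rules-of-engagement checks, applied in priority order.
def engagement_decision(is_hostile, range_nm, closing, weapons_free):
    """Return a recommended action for a tracked contact."""
    if not is_hostile:
        return "monitor"                 # no hostile ID: keep watching
    if not weapons_free:
        return "request authorization"   # human remains in the loop
    if closing and range_nm < 15:
        return "engage"                  # inside the self-defense envelope
    return "track and warn"              # hostile but not yet an imminent threat
```

The point of encoding the rules this way is exactly the one the paper makes: a deterministic rule set responds in milliseconds and carries no confirmation bias or framing error, while the human retains the authorization decision.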
