Monday, March 17, 2008

Tactical Decision Making Under Stress (TADMUS)

The Tactical Decision Making Under Stress (TADMUS) decision support system program was initiated by the U.S. Navy in response to the 1988 accidental shootdown of an Iranian Airbus by the USS Vincennes. The Vincennes was a state-of-the-art guided missile cruiser with the Navy's most advanced Combat Information Center (CIC), equipped with the AEGIS combat system. The AEGIS system was itself a decision support system that should have helped to prevent the accidental destruction of the Iranian aircraft. A paper concerning the development of TADMUS can be found at: http://www.pacific-science.com/kmds/TADMUS_DSS.pdf

Aegis, which means shield, is the Navy's most modern surface combat system. It was designed and developed as a complete system, integrating state-of-the-art radar and missile systems: the missile launching system, the computer programs, the radar, and the displays all work together. This makes Aegis the first fully integrated combat system built to defend against advanced air, surface, and subsurface threats, and it is capable of simultaneous warfare on several fronts -- air, surface, subsurface, and strike. Its anti-air warfare elements include the AN/SPY-1B/D radar system, the Command and Decision System, and the Weapons Control System.

According to the article, emotional stress might have had an effect on the decision making process that led to the tragic incident described above. The TADMUS program was established to determine how stress might affect decision making and what might be done to minimize those effects. According to the article, “developing a prototype decision support system (DSS) that minimized the effects of stress was one of the goals of the TADMUS project. The approach taken in designing the DSS was to analyze the cognitive tasks performed by the decision makers in the shipboard Combat Information Center (CIC) and then to develop a set of display modules to support these tasks based upon the underlying decision making processes naturally used by the CO/TAO team.”

The results of the study showed that experienced decision makers were not particularly well served by the current systems in demanding missions. In 87% of the decisions made, the evaluators determined that the information transactions associated with tactical situation assessment involved the subjects trying to match the observed events in the scenario to those they had previously experienced. In 12% of the cases, the subjects developed a novel hypothesis to explain the events they were observing. In selecting a course of action, 94% of the subjects applied tactics based upon established rules of engagement, while the remaining 6% developed a strategy extrapolated from their previous experience.

The tests also showed that the teams experienced periodic loss of situation awareness, often linked to limitations in human memory and shared attention capacity. According to the article, “environmental stressors such as time compression and highly ambiguous information increased decision biases, e.g. confirmation bias, hyper-vigilance, task fixation, etc. Problems related to decision bias included: (a) carrying initial threat assessment throughout the scenario regardless of new information (framing error) and (b) assessing a track based on information other than that associated with the track (i.e. old intelligence data, e.g. confirmation bias).”

The prototype DSS was developed with the goals of: (a) minimizing the mismatches between cognitive processes and the data available in the CIC to facilitate decision making; (b) correcting the shortcomings of the current displays, which imposed high information processing demands and exceeded the limitations of human memory; and (c) translating data from numeric form into graphical representations wherever possible. In essence, the prototype DSS would use improved graphics to help users grasp the meaning of the data. The goal of the new displays was to reduce errors, reduce workload, and improve adherence to the rules of engagement.

The article is extremely interesting because it addresses two key issues: first, the need to constantly improve an existing DSS as the environment it was designed to support changes; and second, the use of graphical displays to reduce bias in the decisions reached by the system's users. The article also supports several of the observations made in our text, including the limit on the amount of information an individual can process at a given time and the way individuals process information differently depending on their level of experience.

I have had the pleasure of working with the original decision support system (AEGIS) while at the Naval Academy. The system required extensive use to become familiar with its various graphic symbols, input devices, automated functions, etc. The combat simulations that I dealt with were not what I would call complicated, and the use of graphics to help evaluate more complicated, target-rich environments would be helpful. In the example given in the paper concerning the accidental shootdown of the Iranian Airbus, I would not have considered that particular scenario a difficult problem to track with the AEGIS system; the Iranian tragedy was most likely the result of stress.

The detection, evaluation, and prosecution decisions that must be made by both the DSS and its human users must, in some cases, be made in a matter of moments. Many systems have been automated, but few captains want to trust the safety of their commands to a software program. That failure of trust contributed to the USS Stark being struck by two French-built Exocet anti-ship missiles fired from an Iraqi fighter. The ship's combat system did automatically track and identify the incoming missiles as hostile, and had the Phalanx anti-missile system been activated, it would have engaged the threat and most probably shot both missiles down. Human error, the failure to trust the DSS to properly perform its mission, resulted in the ship being struck by both missiles. Based upon historical data and personal observation, I would suggest further automating several of the tasks currently performed by humans in order to improve response time and eliminate bias on the part of the decision maker. In addition, the individuals who use these systems must be better trained and must be taught to trust the system to reach the proper decision and manage the ship's weapon systems against the various threats.

Wednesday, March 12, 2008

Military Decision Modeling

Doctors Ross, Klein, Thunholm, Schmitt, and Baxter have written an article entitled “The Recognition-Primed Decision Model”. The article can be found at: http://www.au.af.mil/au/awc/awcgate/milreview/ross.pdf. The authors discuss the US Army's testing of an improved DSS as a means of increasing its operations tempo (the speed at which it conducts military operations) and allowing the army to act and react faster than the enemy.

The article details the army's experimentation with and testing of a new DSS called the RPM (Recognition-Primed Decision Model) and compares the results of its use against the existing DSS, the MDMP (Military Decision Making Process). The MDMP uses a decision-analytic method called multi-attribute utility analysis. The authors consider the MDMP too time consuming in reaching its conclusions, and they argue that the delay degrades the value of the system's findings.
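For readers unfamiliar with multi-attribute utility analysis, a minimal sketch follows. The criteria, weights, and scores here are invented for illustration, not taken from the article; the point is only the mechanics of weighting and ranking candidate courses of action.

```python
# Hypothetical multi-attribute utility (MAU) comparison of three COAs.
# Each COA is scored 1-10 against each criterion; the weights reflect
# the relative importance the staff assigns to each criterion.
criteria_weights = {"speed": 0.4, "casualty_risk": 0.35, "logistics": 0.25}

coa_scores = {
    "COA-1": {"speed": 8, "casualty_risk": 5, "logistics": 6},
    "COA-2": {"speed": 6, "casualty_risk": 8, "logistics": 7},
    "COA-3": {"speed": 4, "casualty_risk": 9, "logistics": 9},
}

def utility(scores, weights):
    """Weighted sum of attribute scores -- the MAU figure of merit."""
    return sum(weights[c] * scores[c] for c in weights)

# Rank the COAs from highest to lowest overall utility.
ranked = sorted(coa_scores,
                key=lambda coa: utility(coa_scores[coa], criteria_weights),
                reverse=True)
for coa in ranked:
    print(coa, round(utility(coa_scores[coa], criteria_weights), 2))
```

Note that all three candidates must be fully scored before any ranking is possible, which illustrates the authors' complaint: the analytic method spends effort evaluating options that will never be used.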

The MDMP and the RPM both follow a series of four phases in the development of operational orders for the command. These phases are:
1. Identify the Mission (Conceptualize the Course of Action - COA)
2. Test/Operationalize the COA
3. Wargame the COA (For executors as well as planners)
4. Develop Orders (Operations Orders)

The RPM was designed to build on the experience and expertise associated with lessons learned from the army's use of the MDMP. The RPM requires a more experienced user to operate successfully, and it eliminates the previous system's requirement to develop three possible courses of action, instead accepting the original COA as proposed by the commanding officer. Studies showed that the first COA proposed was the one actually used in over 90% of the cases. If the COA proves not viable during the wargaming phase of the process, a new COA is proposed and the remaining steps are repeated to test the new course of action. Under the current system (MDMP), all three required COAs are processed simultaneously and consume a great deal of processing time.
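The serial, test-and-revise flow described above can be sketched as a simple loop. The function names and the toy viability check are hypothetical, not part of the article; the sketch only contrasts RPM's one-COA-at-a-time cycle with MDMP's parallel evaluation of three COAs.

```python
# Hypothetical sketch of the RPM's serial COA handling: take the
# commander's first proposed COA, wargame it, and only generate a
# replacement if the wargame shows it is not viable.
# (MDMP, by contrast, develops and evaluates three COAs in parallel.)

def rpm_planning(initial_coa, wargame, revise, max_iterations=5):
    """Return the first COA that survives wargaming."""
    coa = initial_coa
    for _ in range(max_iterations):
        if wargame(coa):      # phase 3: wargame the COA
            return coa        # phase 4 would follow: develop orders
        coa = revise(coa)     # propose a new COA and repeat
    raise RuntimeError("no viable COA found within iteration limit")

# Toy usage: COAs are integers, and a COA is "viable" once it reaches 3.
result = rpm_planning(1, wargame=lambda c: c >= 3, revise=lambda c: c + 1)
print(result)  # 3
```

Because the first COA is accepted in over 90% of cases, the loop usually terminates after a single wargame pass, which is where the claimed tempo advantage comes from.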

The viability of the chosen course of action is tested in the third phase of the process, war gaming. This phase uses graphics and war gaming (confrontation models) to evaluate the COA prior to execution and allows the decision makers to modify the COA quickly based on lessons learned.

The new system forces planners to communicate, especially the CO, who must articulate his initial COA and the reasons behind it. The newer system facilitates input from the staff officers, who usually possess more expertise in their specialty areas than the Commanding Officer does, and it allows them to modify aspects of the proposed COA to optimize the possibility of a successful outcome. The RPM also requires additional training in its use and usually requires a more experienced staff to operate efficiently.

An initial review by the army officers testing the system indicated that they preferred the RPM over the existing MDMP, but that additional testing and possible modifications were still required before they would switch from their current system.

Monday, March 3, 2008

Decision Making at Berkeley

While surfing the web looking for an article on decision making for this blog, I happened upon the Career Services page of the University of California, Berkeley. The site is located at: http://career.berkeley.edu/Plan/MakeDecisions.stm. The page breaks decision making down into three basic areas: 1) factors influencing the individual's decisions, 2) decision-making styles, and 3) a Take Action: Decision Making Models section. The purpose of the site is to assist students in picking a career path.

The site describes three factors that affect our decisions. These factors include:
1. Information Factors
2. Decision-Making Experience
3. Personal Factors

Additional information is available on each of the previously listed factors via a link located on the site.

The site goes on to inform the reader that individual decisions will depend on the decision-making style and the importance associated with the outcome of the decision, indicating that different decision styles will generate different outcomes. The site recommends a planned decision-making style (a structured decision making process). This would indicate that the university considers all of its students to be novice decision makers.

The Berkeley site goes on to list three different styles of structured decision making: a Pros & Cons model, an Analytical Decision-Making Worksheet model, and an Imaginative-Visualization Experience model. The Pros and Cons model is an un-weighted decision making model, whereas the Analytical Decision-Making Worksheet weights the various factors; either of these two models would be ideal for the novice decision maker. The Imaginative-Visualization Experience model has the decision maker trying to imagine and internally experience the possible outcomes of a decision, and would most likely be utilized by an experienced decision maker. In addition, each of these decision styles has time constraints associated with its use. Most college students have only limited time to research a topic, and that constraint will directly affect which decision model they choose.
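The difference between the first two models can be shown in a short sketch. The career options, factors, and weights below are invented for illustration; they are not taken from the Berkeley site.

```python
# Pros & Cons (un-weighted) vs. Analytical Worksheet (weighted) for a
# hypothetical career choice. In the un-weighted model every factor
# counts equally; the weighted worksheet lets important factors dominate.

factors = ["salary", "interest", "job_security"]
weights = {"salary": 1, "interest": 3, "job_security": 2}  # worksheet weights

# +1 = pro, -1 = con, for each factor of each option
options = {
    "engineering": {"salary": +1, "interest": -1, "job_security": +1},
    "teaching":    {"salary": -1, "interest": +1, "job_security": +1},
}

results = {}
for name, marks in options.items():
    unweighted = sum(marks[f] for f in factors)          # Pros & Cons tally
    weighted = sum(weights[f] * marks[f] for f in factors)  # worksheet score
    results[name] = (unweighted, weighted)
    print(name, unweighted, weighted)
```

With these made-up numbers the two options tie under the simple Pros & Cons tally, but the weighted worksheet separates them once "interest" is weighted heavily, which is exactly why the site offers the worksheet model for more consequential decisions.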

I found that the site provided an interesting look into basic decision making models and a rudimentary version of a decision support system. The site is aimed at the student body, and therefore at largely inexperienced decision makers. It isn't designed so much to help a student make a decision as to show a student what to look at when making a decision. For example, the site does not provide any drill-down capabilities for helping a student pick a major or a career path.