NAVAL POSTGRADUATE SCHOOL
Monterey, California

THESIS

HUMAN FACTORS ENGINEERING AND OPERABILITY IN THE DESIGN OF ELECTRONIC WARFARE SPACES ABOARD UNITED STATES NAVAL COMBATANTS

by

David J. Blauser, Jr.

September 1985

Thesis Advisor: C. W. Hutchins, Jr.

Approved for public release; distribution unlimited

T226037

REPORT DOCUMENTATION PAGE

4. TITLE (and Subtitle): Human Factors Engineering and Operability in the Design of Electronic Warfare Spaces Aboard United States Naval Combatants
5. TYPE OF REPORT AND PERIOD COVERED: Master's Thesis; September 1985
7. AUTHOR: David J. Blauser, Jr.
9. PERFORMING ORGANIZATION NAME AND ADDRESS: Naval Postgraduate School, Monterey, California 93943-5100
11. CONTROLLING OFFICE NAME AND ADDRESS: Naval Postgraduate School, Monterey, California 93943-5100
12. REPORT DATE: September 1985
13. NUMBER OF PAGES: 117
16. DISTRIBUTION STATEMENT (of this Report): Approved for public release; distribution unlimited
19. KEY WORDS: Human Factors Engineering, Operability, Electronic Warfare, Link Analysis, Task Analysis, Criticality, MOAT, Operability Analysis
20. ABSTRACT: The purpose of this thesis is to present and discuss a method of assessing the effectiveness of a work space layout.
In addition, this method will provide the framework for pinpointing those areas of layout design where redesign will be most cost effective. The objective is to address inefficiencies in the layout of warfare modules on U.S. Navy combatants. In particular, the Electronic Warfare Module on aircraft carriers is assessed due to the highly time-critical nature of electronic warfare. The method chosen in this thesis is a modification of two techniques of assessment: Integration Analysis and the Mission Operability Assessment Technique (MOAT). The portions of these techniques used are Link Analysis, Task Analysis, and Operability Analysis. The application herein concludes that the EW Module layout design on the latest NIMITZ-class aircraft carriers was less than 40% effective in promoting mission accomplishment.

Approved for public release; distribution unlimited

Human Factors Engineering and Operability in the Design of Electronic Warfare Spaces Aboard United States Naval Combatants

by

David J. Blauser, Jr.
Lieutenant Commander, United States Navy
B.A., Illinois College, 1972

Submitted in partial fulfillment of the requirements for the degree of

MASTER OF SCIENCE IN SYSTEMS ENGINEERING (ELECTRONIC WARFARE)

ABSTRACT

The purpose of this thesis is to present and discuss a method of assessing the effectiveness of a work space layout. In addition, this method will provide the framework for pinpointing those areas of layout design where redesign will be most cost effective. The objective is to address inefficiencies in the layout of warfare modules on U.S. Navy combatants. In particular, the Electronic Warfare Module on aircraft carriers is assessed due to the highly time-critical nature of electronic warfare.
The method chosen in this thesis is a modification of two techniques of assessment: Integration Analysis and the Mission Operability Assessment Technique (MOAT). The portions of these techniques used are Link Analysis, Task Analysis, and Operability Analysis. The application herein concludes that the EW Module layout design on the latest NIMITZ-class aircraft carriers was less than 40% effective in promoting mission accomplishment.

TABLE OF CONTENTS

I. INTRODUCTION 10
   A. BACKGROUND 10
   B. NEED 12
   C. PURPOSE OF THESIS 14
II. THE NATURE OF THE DIFFICULTY 15
   A. CURRENT METHOD 15
   B. DEFICIENCIES 16
      1. Lack of User Input 18
      2. Lack of Human Factors Engineering 18
      3. Lack of a Learning Curve 19
      4. No Plan for Growth 19
III. APPROACH 23
   A. IMPROVEMENT TO LAYOUTS 23
   B. TASK ANALYSIS 29
   C. LINK ANALYSIS 33
   D. OPERABILITY ANALYSIS 42
   E. QUESTIONNAIRES 46
IV. RESULTS 51
   A. LINK ANALYSIS 53
      1. Link Analysis Figures 53
      2. Link Analysis Table 56
   B. OPERABILITY ANALYSIS 58
V. DISCUSSION AND RECOMMENDATION 66
   A. LINK ANALYSIS DISCUSSION 66
   B. OPERABILITY ANALYSIS DISCUSSION 67
   C. EXTRAPOLATION 68
   D. CONCLUSION 68
   E. RECOMMENDATION 69
APPENDIX A: COPY OF QUESTIONNAIRES 70
APPENDIX B: QUESTIONNAIRE RESULTS 91
APPENDIX C: THE DELTA METHOD 102
LIST OF REFERENCES 114
BIBLIOGRAPHY 115
INITIAL DISTRIBUTION LIST 117

LIST OF TABLES

1. OPERATORS TASKS AND SUBTASKS 31-32
2. RANKING MATRIX 48
3. LINK ANALYSIS BY POSITION 54
4. MEAN RANK ORDER AND STANDARD DEVIATION FOR EACH RATING MATRIX CELL 60
5. RANK ORDER OF OPERATOR RATING MATRIX 60
6. FINAL RANK ORDER INVERTED FOR DELTA METHOD 61
7. DELTA METHOD SOLUTION FOR OPERATOR SUBTASK RATING SCALE 61
8. NORMALIZED INTERVAL SCALE 62
9. RANK ORDER OF SUBTASKS BY CUMULATIVE WEIGHT 64

LIST OF FIGURES

1. EW Module Layout (Proposed) 17
2. USS CARL VINSON EW Module 21
3. Link Correlation Matrix 35
4. Internal Communication Links 38
5.
External Visual Links - Operators 40
6. External Visual Links - Status Boards 41
7. External Manual Links 43
8.

ACKNOWLEDGEMENT

I would like to publicly thank several people who helped make this effort possible: CDR Charles Hutchins, who served as my thesis advisor and helped me out of the many tight spots I backed myself into; Captain (Ret.) Wayne Hughes, my second reader, who encouraged me as I sought to define the thesis; LCDR Paul Fishbeck, who read the manuscript and offered much valuable advice; the Electronic Warfare Officer, USS CARL VINSON, LT Inman, for graciously allowing me the use of the time of his men for my research; EW1 Tom Fox and EW1 Marvin Daughtrey, EW Supervisors, USS CARL VINSON, for arranging for the testing to come off as envisioned; and my wife, Tami, for encouraging me at every step. My most heartfelt and deepest gratitude belongs to the Lord God Most High, who presented me with the idea for this thesis, led me through it, and showed anew that "all things work for good to them who are the called according to His purpose" (Romans 8:28).

I. INTRODUCTION

A. BACKGROUND

As seen in several recent wars and conflicts, speed and timing are crucial in modern warfare. In the Falklands War, the lack of time available to react to a threat caused the loss of HMS SHEFFIELD. HMS SHEFFIELD was sunk by fires that could not be brought under control after a strike by an Exocet missile. Even though the ship had weapons systems that could have defeated the Exocet, its inability to detect the missile at long range rendered these defenses useless. The Electronic Warfare (EW) operators on the SHEFFIELD had little warning of the Exocet due to self-induced jamming. When the self-jamming (inadvertent, of course) ceased, the Exocet was immediately detected, but it was too late to engage. The missile struck about ten seconds later.
Although technologically superior, the British did not correctly manage the Radio Frequency

[Figure 2. USS CARL VINSON EW Module]

a passageway within CIC that was heavily traveled. There was simply no room in the EW Module for both the EW operators and MUTE--one or the other had to go. In addition to the problems cited above, to accommodate the inclusion of all the equipment in the EW Module, some severe space economies had to be made. The layout now took on the appearance shown in Figure 2. To allow some passage of operators and maintenance people among and around the equipment, a "straight line" layout was adopted. This had the sole advantage of allowing all the equipment possible to be placed in the space. However, the question can logically be asked, "Does such an arrangement add to or detract from the efficiency and effectiveness of space utilization in accomplishing the mission?" New equipment added to a space that was not designed for it may cause integration problems due to its intrinsic nature (i.e., in the equipment itself), its new location (e.g., the SLQ-17 computer rack), and reduced workspace (in our example, several racks where one used to be, to the exclusion of another piece of equipment--MUTE). The remainder of this thesis will be given over to attempting to find a workable solution to the problem of adequately designing a work space, in particular, an EW Module. As indicated earlier, this is an area where the costs are in dollars and effort, but the payoff is in shorter reaction time and, ultimately, in ships and lives saved.

III. APPROACH

A. IMPROVEMENT TO LAYOUTS

The solution to layout/arrangement improvement is neither simple nor straightforward. An improvement, however, can be found in a threefold approach to the problem.
These are: (1) a ship-class module mock-up at a land-based laboratory, (2) fleet inputs added to it on a regular basis, and (3) a quantifiable measure that can be used to determine overall effectiveness and pinpoint problem areas.

Establishing a class module mock-up at a land-based laboratory makes good sense. Here, the results of several mock-ups can be stored and compared. Here, too, a "learning curve" can be established. What does not work for one class and module may never work, or it may work for another ship class and another module. The cost of mock-ups can be kept low. Mock-ups of new equipment entering the fleet can be sent to just one location and then incorporated into the design or redesign. Mock-ups of new ship classes can easily be done there. NOSC at San Diego seems to be a good place to have this mock-up facility for several reasons. First, experts there have already done some mock-up work and have a certain amount of experience in this area. Second, they are near a good source of fleet inputs in San Diego. Once a mock-up was designed (or redesigned), NOSC could request some fleet operators from one of the ships of that particular class, and these operators could critique the mock-up and make suggestions for improvement. For added realism and additional inputs, a mock scenario could be played out by the operators on the mock-up. This has the added possible benefit of uncovering any oversight by either NOSC or the operators' critique. The two logical places for the mock-up site are Norfolk, Va., and San Diego, Ca.

Fleet input in the design/redesign of the layout is of the utmost importance. The fleet operators are the people who have to use the equipment and accomplish the mission within the space. They, with the benefit of several years' individual and many years' collective experience, will be able to note problems with the mock-up that the designers may have missed.
Designers of single equipments tend to think of their equipment in isolation from all others. Layout designers are often not familiar with the operating characteristics of all the equipment. Fleet operators suffer from neither of these deficiencies. However, operators do have a bias toward doing things as they are currently done and may resist change. Nevertheless, they are still probably the best ones to evaluate the mock-up.

As seemingly complete as the combination of laboratory mock-up and fleet input might be, there is one more area that needs to be covered. This is a quantitative assessment of the present layout and mock-up layouts. There are several reasons for this. First, a quantitative assessment of a present layout may indicate that it does not need improving or that the cost of improving the layout is not justified by the amount of improvement. Second, a quantitative assessment based in part on questionnaires to fleet operators may awaken thoughts of some inadequacy that was not present in the conscious memory but was tucked away in the recesses of the mind. Finally, a quantitative assessment is necessary to be able to compare functional layouts one to another.

The final aspect of this approach is a way of assessing the effectiveness of the layout. Various techniques have been developed that will aid in assessing effectiveness. However, these methods have been used on systems that are dissimilar to those found on ships and must be modified. The method that will be utilized is a combination of three different but related techniques: Task Analysis, Link Analysis, and Operability Analysis. Two major studies have been reviewed to determine the extent of these analyses and how they might be modified for a layout improvement application. These are Integration Analysis and the Mission Operability Assessment Technique (MOAT). A brief look at each of these will indicate the salient portions of each for this application.
Integration Analysis is the integration of Task Analysis, Operator Interviews, and Link Analysis to evaluate a system's Functional Mock-up. Integration Analysis was designed as a viable Test and Evaluation technique for the earlier stages of Developmental Test and Evaluation (DT&E) in order to reduce design discrepancies and minimize acquisition costs and time [Ref. 1].

MOAT, an evaluation methodology, measures the operability of a system or subsystem in terms of operator tasks performed during a mission. It essentially is an Operability Analysis. In general, MOAT addresses the problem of how well an operator can use a system or subsystem to perform tasks within the mission context. Contrasted with evaluations using human engineering design criteria, which present only pass or fail information, this technique provides information on the degree of system and/or subsystem success or failure [Ref. 2:pp. 3-4]. The underlying techniques of task analysis, scaling methodology, and multi-attribute utility ... two operators [Ref. 3:p. 204]. These links may be visual (such as an instrument scan), functional or manual (hand to control), or verbal (communications). Inefficiencies are present when links are comparatively long, crossing one another, blocked, or outside optimal visual or reach envelopes. The links are produced from the task analysis and illustrate all the operator-required functional, visual, and communication tasks. Link Analysis can be applied to all scenarios involved during all operational and emergency conditions [Ref. 3:p. 205 and Ref. 4].

Link Analyses are normally of two types: panel layout, and tactical compartment or multiple operator work area. With the development and procurement of individual subsystems (i.e., WLR-1, SLQ-17, etc.) a certain amount of panel layout link analysis has been done. However, little if any has been done on the combination of systems arranged in a workspace (in this example, the EW Module).
Hence, the Link Analysis will necessarily be of the latter type (i.e., multiple operator work area). The multiple operator work area type of link analysis is dependent on the correlation matrix. Beginning with the correlation matrix and an area layout, all interactions (links) required to perform a particular functional task are examined in terms of the frequency with which they occur and their criticality. If the criticality is assigned a numerical value, it may be multiplied by the frequency in order to obtain a weighted link value. The work area is overlaid with the weighted links, permitting a picture of all the interactions taking place within the system being analyzed. The system design can then be modified to shorten the distance between the workstations that are connected by the weighted links [Ref. 5].

Figure 3 contains the correlation matrix for the EW Module in CARL VINSON. A correlation matrix is a figure that provides an indication of the links between two operators, between two positions, or between an operator and a position. Usually a criticality associated with the particular links is included in the matrix. In Figure 3, only the links of interest are listed. The figure is read by selecting the two entries for which links are desired and reading diagonally down from the top one and diagonally up from the bottom one until the intersecting diamond is reached. The diamond contains both the particular links between the two

[Figure 4. Internal Communication Links]

[Figure 5. External Visual Links - Operators]
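The weighted-link computation described above reduces to a few lines of arithmetic. The sketch below (in Python, with illustrative positions, link types, frequencies, and criticalities rather than the thesis data) builds weighted link values from a correlation-matrix-style structure; overlaying the work area with these values is then a graphical step:

```python
# A link correlation matrix records, for each pair of positions, the links
# between them and an assigned criticality (here 1-3, as in the text).
# Multiplying criticality by observed frequency of use gives the weighted
# link value. All names and numbers below are illustrative.
matrix = {
    ("WLR-1", "NTDS"): {"internal-comm": 3, "external-visual": 3},
    ("WLR-1", "MUTE"): {"external-visual": 2},
}
frequency = {  # estimated uses per eight-hour watch (illustrative)
    ("WLR-1", "NTDS", "internal-comm"): 44.0,
    ("WLR-1", "NTDS", "external-visual"): 73.3,
    ("WLR-1", "MUTE", "external-visual"): 3.0,
}

# Weighted link value = criticality x frequency, one entry per link.
weighted = {}
for (a, b), links in matrix.items():
    for kind, criticality in links.items():
        weighted[(a, b, kind)] = criticality * frequency[(a, b, kind)]

# Sorting highest-first shows which workstation pairs a redesign should
# move closer together.
for key, value in sorted(weighted.items(), key=lambda kv: kv[1], reverse=True):
    print(key, round(value, 2))
```

The dictionary-of-pairs structure mirrors the diamond-matrix reading described in the text: pick two positions, then read off their links and criticalities.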
[Figure 6. External Visual Links - Status Boards]

This equipment is not adjacent to the SLQ-17, and one of the operators (normally the SLQ-17 operator) may need to reload, reboot, or reconfigure the system in the event of a casualty or during normal operations. Figure 7 shows the external manual links.

D. OPERABILITY ANALYSIS

In the introduction to this section, it was stated that Operability Analysis is comprised of MAU and scaling theory. MAU is a Bayesian-oriented decision-making paradigm. There are three major aspects of the MAU model which are particularly important to this application. First, the basic structuring principle in MAU is hierarchical decomposition. The mission is broken down into a hierarchical grouping of tasks and subtasks. The model provides the structure and rules necessary to investigate and integrate the interrelationships of all these tasks and subtasks. Second, the definition of utility used in the MAU model allows for the optimum evaluation of alternatives, which is dependent upon the selection of a single criterion. This means that multidimensional outcomes must be transformed into a single figure of merit such as utility, system worth, system effectiveness, or, as in this application, operability. Third is a scaling of the selected criterion. The scaling methodology used in this application, as in MOAT, was conjoint measurement.

[Figure 7. External Manual Links]

Recall that operability can be viewed as a function of task criticality, operator workload, and space effectiveness. Therefore, when considering each task from an operability standpoint, each task that is performed has some combination of these three dimensions. There is a difficulty in assessing the degree of each attribute and combining them into a meaningful measure of operability.
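The hierarchical decomposition and single-figure-of-merit ideas can be made concrete with a small sketch. The mission, task, and subtask names and numbers below are illustrative, not the thesis decomposition, and the criticality-weighted mean is assumed here as one common MAU aggregation rule:

```python
# Illustrative MAU-style hierarchy: a mission decomposed into tasks, each
# task into subtasks. Each subtask carries a utility (an operability value
# on 0-100) and a weight (its criticality). Utilities roll up as weighted
# means into a single figure of merit for the mission.
mission = {
    "Detect": [  # task -> list of (subtask, operability 0-100, criticality)
        ("Search assigned bands", 70.0, 5),
        ("Identify emitter",      55.0, 4),
    ],
    "Report": [
        ("Pass contact to NTDS",  80.0, 3),
    ],
}

def rollup(subtasks):
    """Criticality-weighted mean operability over one level of the hierarchy."""
    total = sum(crit for _, _, crit in subtasks)
    return sum(op * crit for _, op, crit in subtasks) / total

task_scores = {task: rollup(subs) for task, subs in mission.items()}
# Tasks are combined the same way; equal task weights are assumed here.
mission_score = rollup([(t, s, 1) for t, s in task_scores.items()])
```

The point is the mechanics: multidimensional subtask ratings collapse level by level into one operability number for the whole mission, which is what allows alternative layouts to be compared on a single criterion.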
Since this cannot be assessed directly by objective methods, the scaling methodology of conjoint measurement was devised to assess space operability subjectively. The problem of scaling tasks in dimensions of criticality, frequency, and system effectiveness has been successfully solved by using objectively anchored rating scales [Ref. 2:p. 20]. Therefore, a similar rating scale procedure seemed suitable in this instance. The major difficulties involved with this approach are those of measuring the degree of operator effort (or watch section effort) and the layout effectiveness. This was to be expected, however, since not only were these different from any known previous study, but they also involved interactions on a higher scale than that experienced before. There is a substantial correlation between ratings of task difficulty and subsystem effectiveness. The attempt to solve the rating scale problem is accomplished by dividing it into two separate ratings. On the F/A-18 program it was desirable to have two ratings, one with respect to pilot workload

[Figure: Operator Workload / Space Effectiveness rating scale. Recoverable Space Effectiveness anchors: "Layout Enhances Specific Task Accomplishment"; "Adequate Performance Achievable (Layout Sufficient)"; "Inadequate Performance due to Layout, Cannot Compensate for Sub-Par Performance." Recoverable Workload anchors range from "Workload at Critical Level" through "Workload as Anticipated," improving left to right.]

ranked. This was of the utmost importance since the conjoint measurement of the subtask assessment and Module operability depended upon it. Table 2 contains a blank Ranking Matrix. The EW personnel were asked to rank the intersections from best to worst for the "typical" subtask. It was assumed that the rank order for the matrix would vary little from subtask to subtask.
Helms found this to be true [Ref. 2:p. 34]. This may have been the most difficult part of the questionnaire, and the EW operators were forced to draw upon all their knowledge and previous experience in order to produce a rank order that was meaningful and replicable. This matrix, the intersection of two ordinal scales (OW and SE), is part of conjoint measurement. The Ranking Matrix was expanded and an interval scale constructed via a linear expansion known as the delta method. This resulted in an interval scaling from 0 to 100 and was used to evaluate the total Module operability. The delta method of converting two of these ordinal scales to an interval scale is described in Appendix C.

Using this interval scale, the intersection of any particular set of Operator Workload and Space Effectiveness values on the returned Operator Subtask Questionnaire gives a predetermined Operability Value between 0 and 100. An Operator Workload value between one and four inclusive served to identify a column, while a Space Effectiveness value identified a row. The intersection of the row with the column indicated the assessment of that particular subtask by an operator. For every subtask, this Operability Value was obtained for each rater, and the mean and standard deviation were calculated. This mean value represented the Operability Value for that subtask.

The remaining ordinal scale is that of the Criticality. There was no attempt to convert this to an interval scale. Although operators' skills might vary, causing significant deviations in the ratings from rater to rater, there should be only one standard for the criticality of a subtask as it relates to mission accomplishment. This single measure of criticality was taken to be the mean of the criticality ratings. The Operability Value was multiplied by the Criticality, resulting in a Weighted Operability Value. A Weighted Deficit Value was computed as (100 - Operability Value) multiplied by the Criticality of the subtask.
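The lookup-and-weighting procedure just described can be sketched as follows. The interval-scale table below is a hypothetical stand-in for the delta-method output (the real normalized scale is Table 8, derived in Appendix C); only the mechanics -- column by Operator Workload, row by Space Effectiveness, mean across raters, then the two weighted values -- follow the text:

```python
# Hedged sketch of the operability computation described in the text.
# SCALE stands in for the normalized interval scale (Table 8): rows are
# Space Effectiveness (SE) ratings, columns Operator Workload (OW) ratings,
# each cell a predetermined Operability Value between 0 and 100.
# The cell values here are invented for illustration.
SCALE = [
    [100, 90, 75, 55],  # SE = 1 (layout enhances task accomplishment)
    [ 85, 70, 55, 40],  # SE = 2
    [ 60, 45, 30, 20],  # SE = 3
    [ 35, 25, 10,  0],  # SE = 4 (inadequate performance due to layout)
]

def operability_value(se, ow):
    """Row = Space Effectiveness rating, column = Operator Workload rating."""
    return SCALE[se - 1][ow - 1]

def subtask_scores(ratings, criticality):
    """ratings: list of (SE, OW) pairs, one per rater, for one subtask.
    criticality: mean criticality of the subtask (1 to 5).
    Returns (Operability Value, Weighted Operability, Weighted Deficit)."""
    values = [operability_value(se, ow) for se, ow in ratings]
    ov = sum(values) / len(values)                # mean across raters
    weighted_operability = ov * criticality       # "goodness" of the layout
    weighted_deficit = (100 - ov) * criticality   # improvement still needed
    return ov, weighted_operability, weighted_deficit
```

For example, a subtask rated (SE=2, OW=2) and (SE=3, OW=1) by two raters, with a mean criticality of 4, yields a mean Operability Value of 65, a Weighted Operability Value of 260, and a Weighted Deficit Value of 140 under this stand-in table.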
Whereas the Weighted Operability Value gives an indication of the "goodness" of the layout for a particular subtask, the Weighted Deficit Value gives an indication of how much improvement is required to optimize Module Operability for that subtask.

The Link Analysis Questionnaire was given approximately one week before the Operator Subtask Questionnaire. It was hoped that the brief exposure to the first questionnaire increased the accuracy of the second.

IV. RESULTS

To test this application, a suitable platform was required. The Electronic Warfare Module on a U.S. Navy aircraft carrier was selected. The particular ship, USS CARL VINSON (CVN-70), was chosen for three reasons: availability, accessibility, and familiarity. CARL VINSON had just returned from a seven-month cruise and was in a stand-down period and, so, available. The ship's homeport, Alameda, Ca., was readily accessible for the test. Finally, the ship's layout was familiar enough to the test director to allow a minimum amount of time to be spent on the ship and, therefore, lessen the impact upon the ship's daily work and schedule.

There were limitations to the scope of testing. First, the test was not done at sea, which produced two limitations. In regard to Link Analysis, operator usage of the various links and the associated frequencies could not be monitored. This was considered to be a major limitation, but only for the Link Analysis portion of the test. The compensation for this was the Link Analysis Questionnaire concerning the frequency of link usage. A minor limitation concerned the inability to observe the actual Subtasks and ascertain the criticalities under actual conditions. This was compensated for by the Operator Subtask Questionnaire, which was considered adequate. A further limitation was the small number of valid responses for the questionnaires.
There were three valid responses for the Link Analysis Questionnaire, five for the Rank Ordering portion of the Operator Subtask Questionnaire, and from five to seven for the Subtask Assessment portion of the Operator Subtask Questionnaire. While these numbers are small from a statistical point of view, they cannot be discounted. The limited sample size should be an inducement for further testing. Furthermore, the sample size for any aircraft carrier will never be much greater than about twelve due to manning levels. The sample size was seven due to leave and various schools, but it included the personnel with the most experience. It may be argued that not testing other platforms is a limitation. However, since no two EW Modules on U.S. aircraft carriers are alike, the lack of multiple testing is a moot question.

The test was conducted in the EW Module of USS CARL VINSON (CVN-70). The Module was used so that the personnel could refresh their memories with regard to the layout as they evaluated the subtasks in relation to the layout. The guidance given to the EW personnel before and during the test stressed that they could ask any question they wished of anyone they wished. They were encouraged to confer with each other about the workload, effectiveness, and criticality.

A. LINK ANALYSIS

The results of the Link Analysis were taken from the Link Analysis figures and from the Link Analysis Questionnaire. The questionnaire was produced from the Link Analysis figures and the Task Analysis in order to determine the frequency with which these links were used. The EW operators on USS CARL VINSON were asked to estimate how many times during a standard eight (8) hour watch they utilized the links. The Link Analysis Questionnaire is listed in Appendix A, and the results of the Link Analysis are shown in Table 3.

1. Link Analysis Figures

The most critical links were assessed to be the communication links between operators and the visual links between positions.
The criticality of the links was chosen to reflect mission accomplishment, and the frequency of usage confirmed the criticality. There were four links considered in the Link Analysis: internal communication, external communication, external visual, and external manual. Of these four, the two most important links are the internal communications and external visual. This is because the external communication will generally involve only one operator (the EW Supervisor/NTDS operator) and there is little requirement for manual links outside of one's own position.

TABLE 3. LINK ANALYSIS BY POSITION

Position and Tasks                                       Frequency  Link  Criticality  Weighted Link Value

WLR-1 Operator:
 1. Talk/communicate with the SLQ-17 operator?            16.667    IC        3              50.00
 2. View the presentation on the SLQ-17 console?          77.333    EV        3             231.99
 3. Talk/communicate with the NTDS operator?              44.000    IC        3             132.00
 4. View the presentation on the NTDS console?            73.333    EV        3             219.99
 5. View the NTDS Status Board (SB)?                      33.333    EV        1              33.33
 6. View the WLR-1 Status Board?                          33.333    EV        2              66.67
 7. Update the WLR-1 Status Board?                         1.667    EM        2               3.34
 8. Check (visually) the SLQ-17 computer?                 31.000    EV        1              31.00
 9. Reboot, reset, or work with the SLQ-17 computer?       1.667    EM        1               1.67
10. Check MUTE?                                            3.000    EV        2               6.00
11. Change any settings on MUTE?                           0.667    EM        2               1.34

SLQ-17 Operator:
 1. Talk/communicate with the NTDS operator?              54.000    IC        3             162.00
 2. View the presentation on the NTDS console?            73.333    EV        3             219.99
 3. Talk/communicate with the WLR-1 operator?             71.667    IC        3             215.00
 4. View the presentation on the WLR-1 console?           48.333    EV        3             144.99
 5. View the WLR-1 Status Board?                          29.500    EV        2              59.00
 6. View the NTDS Status Board?                           31.333    EV        3              93.99
 7. Update the NTDS Status Board?                          9.000    EM        1               9.00
 8. Check (visually) the SLQ-17 computer?                 26.667    EV        2              53.34
 9. Reboot, reset, or work with the SLQ-17 computer?       4.000    EM        3              12.00
10. Check MUTE?                                            1.000    EV        1               1.00
11. Change any settings on MUTE?                           0.667    EM        1               0.67

NTDS Operator/EW Supervisor:
 1. Talk/communicate with the SLQ-17 operator?            45.667    IC        3             137.00
 2. View the presentation on the SLQ-17 console?          65.667    EV        3             197.00
 3. Talk/communicate with the WLR-1 operator?             72.333    IC        3             216.99
 4. View the presentation on the WLR-1 console?           60.667    EV        3             182.00
 5. View the WLR-1 Status Board?                          45.000    EV        3             135.00
 6. View the NTDS Status Board?                           48.333    EV        3             144.99
 7. Update the NTDS Status Board?                         16.000    EM        3              48.00
 8. Check (visually) the SLQ-17 computer?                 34.667    EV        1              34.67
 9. Reboot, reset, or work with the SLQ-17 computer?       4.000    EM        1               4.00
10. Check MUTE?                                            6.667    EV        1               6.67
11. Change any settings on MUTE?                           4.000    EM        1               4.00
12. Communicate outside the Module?                       35.333    EC        3             105.99

KEY: IC - Internal Communications; EC - External Communications; EV - External Visual; EM - External Manual

An external communications link example is the link between the EW Supervisor/NTDS operator and the communications that enable him to communicate outside the Module. However, this requires that the operator rise from his seat to communicate. As a remedy, the NTDS operator uses a hand mike that hangs down near his console. This is a partial solution because he still needs to rise from his seat to select another station on the communication box. Additionally, the hand mike hanging so close to his console presents a clutter problem.

Note that the communication and visual link between the NTDS operator (EW Supervisor) and the WLR-1 operator is the longest and partially blocked. The links between the WLR-1 operator and the NTDS and SLQ-17 operators are long, allowing him to view very little of the environment. The WLR-1 operator's visual links are very long, and the parallax effect severely degrades his observation. Note the long link lines between the SLQ-17 and NTDS positions and the WLR-1 Status Board, and between the WLR-1 operator and the NTDS Status Board. Finally, note the very long external visual links to the SSQ-82 MUTE and that they cross. MUTE is required to be checked periodically for faults or changes in the various monitor boxes. The distance is great enough between MUTE and the rest of the Module that only the WLR-1
MUTE is required to be checked periodically for faults or changes in the various monitor boxes. The distance is great enough between MUTE and the rest of the Module that only the WLR-1 55 operator can effectively monitor it. However, this requires considerable movement on the part of the WLR-1 operator. 2. Link Analysis Table The frequency of the various links were determined by the Link Analysis Questionnaire. The ideal way to determine link frequency is to count the actions/link interactions during the watch. Since this was not possible, the questionnaire approach was chosen. The Link Analysis is intended here to focus attention at the links that are used most often. The frequency of link usage is multiplied by the weight (criticality ) of the link and an indication of its relative importance is determined. When the links associated with the WLR-1 operator are considered, it can be noted the longest links are the internal communication and external visual links between him and the SLQ-17 and NTDS positions. These links are also the most critical and the most frequently used. The average number of times the operator tries to view the NTDS console is 73.333. Yet this console is the furthest away (see Figures 4 and 5) . The WLR-1 operator communicates more with the NTDS operator for two reasons. Many times the NTDS operator is also the EW Supervisor. The fullest picture of the entire environment of surface, subsurface, and air contacts is present on the NTDS. The other large frequency usage is the visual links for the presentation on the SLQ-17 56 console. This console is only slightly closer than the NTDS console. In the case of the SLQ-17 operator, the first six entries in Table 3 are the ones with the greatest criticality and the highest frequency of use. The high criticality and frequency associated with checking the SLQ- 17 computer is understandable since the SLQ-17 operator is specifically trained to know what to look for on the computer face. 
Note that the SLQ-17 operator views the presentation at the NTDS console much more than that at the WLR-1 position. It can be seen from Figure 5 that the external visual links between the SLQ-17 and the NTDS are much shorter than those between the SLQ-17 and the WLR-1. At the same time, the SLQ-17 operator communicates more with the WLR-1 operator than with the NTDS operator. This suggests that the SLQ-17 operator gets a better picture of the environment from the NTDS but better information concerning the environment from the WLR-1. The NTDS operator and EW Supervisor are treated as a combined position because many times the EW Supervisor will man the NTDS console for a major portion of the watch. This is necessary because all the external communications are at or near the NTDS console. Note the large frequency and high criticality associated with communications outside the Module (the external communications link). There appears to be a reversal of interaction between the NTDS operator/EW Supervisor and the WLR-1 and SLQ-17 positions. He views the SLQ-17 console more than he communicates with its operator, but talks more to the WLR-1 operator than he views the WLR-1 displays. Recall from Figures 4 and 5 that both the internal communications and the external visual links between NTDS and WLR-1 are very long. Additionally, note how much he looks at the WLR-1 Status Board even though it is the furthest away (Figure 6). The Link Analysis is important since it serves to indicate which links are long, important, and possibly overworked. As such it can be used as a starting point in the redesign of a layout by showing which links need to be reduced in length. The Link Analysis results should also support the results of the Operability Analysis.

B. OPERABILITY ANALYSIS

The Operator Subtask Questionnaire was divided into two parts: the Subtask Assessment and the Ranking Matrix. The Subtask Assessment was given first. The criteria for this evaluation and the test itself are given in Appendix A.
The second half of the Operator Subtask Questionnaire was the Ranking Matrix. All returned valid rankings (n = 5) were entered into the matrix, a mean was determined for each block, and the matrix was numbered accordingly. The standard deviation was calculated in case of a tie. This matrix with the mean rank values and the standard deviations is illustrated in Table 4. The resultant rank matrix is shown in Table 5. Next this rank ordering was converted to an interval scale. This was done by reversing the order of the numbering so that the best combination of Operator Workload and Space Effectiveness is #16 and the worst is #1 (see Table 6). Using this as a base, the delta method of linear expansion was used to determine an interval scale. See Appendix C for a brief description and example of the delta method and the final work sheet for this application. Table 7 shows the result of the delta method, which is the desired interval scale. The results of the delta method were normalized by dividing all the blocks by the highest value in the table; in this application it was 102. Table 8 is the normalized interval scale for this application. The Operability Value was weighted (multiplied) by the mean assessed Criticality of that particular Subtask to derive the Weighted Operability Value. The Weighted Operability Value has the potential to range from an absolute minimum of 0 (0 x 1) to an absolute maximum of 500 (100 x 5). The range noted was 14.14 to 418.87. The Weighted Deficit Value gives an indication of how much improvement is needed to optimize Layout Effectiveness for a particular Subtask. The greatest Weighted Deficit Value was 485.00 while the least was 28.09.

TABLE 4. MEAN RANK ORDER AND STANDARD DEVIATION FOR EACH RATING MATRIX CELL
(each cell: mean rank / standard deviation; Space Effectiveness improves upward, Operator Workload improves to the right)

|  9.6 / 2.50 |  4.6 / 1.14 |  2.4 / 0.55 |  1.0 /  --  |
| 11.0 / 2.55 |  8.0 / 1.58 |  5.2 / 1.10 |  2.8 / 0.84 |
| 13.8 / 1.64 | 10.8 / 1.10 |  8.8 / 1.10 |  5.4 / 1.34 |
| 16.0 /  --  | 14.0 / 1.22 | 12.2 / 2.17 | 10.4 / 3.13 |

TABLE 5. RANK ORDER OF OPERATOR RATING MATRIX

SE |  9 |  4 |  2 |  1 |
   | 12 |  7 |  5 |  3 |
   | 14 | 11 |  8 |  6 |
   | 16 | 15 | 13 | 10 |
                         OW

TABLE 6. FINAL RANK ORDER INVERTED FOR DELTA METHOD

SE |  8 | 13 | 15 | 16 |
   |  5 | 10 | 12 | 14 |
   |  3 |  6 |  9 | 11 |
   |  1 |  2 |  4 |  7 |
                         OW

TABLE 7. DELTA METHOD SOLUTION FOR OPERATOR SUBTASK RATING SCALE
(each cell is the sum of its SE row value {55, 40, 24, 0} and its OW column value {0, 22, 35, 47})

SE | 55 | 77 | 90 | 102 |
   | 40 | 62 | 75 |  87 |
   | 24 | 46 | 59 |  71 |
   |  0 | 22 | 35 |  47 |
                          OW

TABLE 8. NORMALIZED INTERVAL SCALE

SE | 54 | 75 | 89 | 100 |
   | 40 | 61 | 74 |  86 |
   | 24 | 45 | 58 |  70 |
   |  0 | 21 | 34 |  46 |
                          OW

The Weighted Deficit Value can range from 500 to 0; the larger the number, the greater the amount of improvement needed. The Total Module Operability for this particular EW Module was computed to be 39.2%. This computation is as follows. There were 49 Subtasks evaluated. The sum of the criticalities of these Subtasks was 200.84. By assuming a perfect layout, in which every Subtask has an Operability Value of 100, we can multiply by 100 to obtain a maximum score of 20,084. Next the Weighted Operability Values were summed to obtain the actual score of the Module, 7872.31. When the actual score is divided by the maximum, an indication of the effectiveness of the layout is obtained. Table 9 contains an ordering of the Subtasks by cumulative weight. This was determined by dividing each Weighted Deficit Value by the optimum layout score to determine how much of the total deficit each Subtask comprises. These were then ranked from most to least. This table gives an indication of which Subtasks should be improved first in order to achieve the most cost-effective approach to improving the Module.
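The scoring arithmetic described above can be sketched as follows. The helper names are mine, not the thesis's; the two totals (criticality sum 200.84, Weighted Operability sum 7872.31) are the figures reported in the text, and the subtasks passed to the ranking helper are illustrative stand-ins.

```python
# Sketch of the Operability Analysis scoring and of Table 9's
# cumulative-weight ordering, as described in the surrounding text.

def weighted_operability(operability_value, criticality):
    """Operability Value (0-100) times mean Criticality (1-5): 0..500."""
    return operability_value * criticality

def weighted_deficit(operability_value, criticality):
    """Deficit (100 - Operability Value) times Criticality: 0..500."""
    return (100.0 - operability_value) * criticality

def total_module_operability(criticality_sum, weighted_operability_sum):
    """Actual score over the perfect-layout score, as a percentage.
    A perfect layout gives every subtask an Operability Value of 100,
    so the maximum score is 100 times the sum of the criticalities."""
    return 100.0 * weighted_operability_sum / (100.0 * criticality_sum)

def cumulative_weights(deficits, criticality_sum):
    """Each Weighted Deficit Value as a share of the optimum score,
    ranked largest first, with a running total.
    deficits: list of (subtask name, weighted_deficit_value)."""
    optimum = 100.0 * criticality_sum
    ranked = sorted(deficits, key=lambda d: d[1], reverse=True)
    total, rows = 0.0, []
    for name, wd in ranked:
        share = 100.0 * wd / optimum
        total += share
        rows.append((name, share, total))
    return rows

print(f"{total_module_operability(200.84, 7872.31):.1f}%")  # 39.2%, as reported
```

The ranking helper makes the "fix the top of Table 9 first" logic explicit: the subtasks with the largest deficit shares are exactly the ones where redesign buys the most improvement per unit of effort.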
Table 9 contains the rank ordering by cumulative weights, the Subtask number, a brief description of the Subtask and its associated position, the seven operators polled with their evaluations of the Subtask in terms of Operator Workload and Space Effectiveness converted to the interval scale, the Operability Value (the mean of the operators' evaluations), the Deficit Value (100 minus the Operability Value), the mean Criticality, the cumulative weight (the percentage of the total deficit that the particular Subtask comprises), and the running total percentage.

TABLE 9. RANK ORDER OF SUBTASKS BY CUMULATIVE WEIGHT
(the per-operator interval-scale values and the numeric columns are not legible in this copy; the ordering is as printed)

 1. 1.2.1 Report EW Info (Sup)
 2. 1.2.2 Provide EW Recs (Sup)
 3. 4.1.1 Search Assigned Bands (WLR)
 4. 4.2.2 Report EW Info (WLR)
 5. 4.2.1 Provide Data - NTDS (WLR)
 6. 4.1.4 DF Signals (WLR)
 7. 4.1.2 Analyze Signals (WLR)
 8. 5.1.1 Communicate (SB)
 9. 2.1.1 Enter IDs (NTDS)
10. 4.2.4 Log Intercepts (WLR)
11. 2.1.4 Enter EW Fix (NTDS)
12. 2.1.2 Enter ESM Assoc (NTDS)
13. 5.1.3 Update SB (SB)
14. 2.1.5 Bearing Resolution (NTDS)
15. 2.1.3 Triangulate ESM Bearing (NTDS)
16. 1.1.6 Manual ID Request - force (Sup)
17. 1.1.4 Assign ESM Resp - force (Sup)
18. 5.1.2 Advise Module (SB)
19. 2.2.1 Report EW Info (NTDS)
20. 1.2.5 Nav by ESM (Sup)
21. 3.2.4 Update SB (17)
22. 1.1.3 Assign ESM Resp - ship (Sup)
23. 1.1.5 Manual ID Request - ship (Sup)
24. 3.1.5 Evaluate Data (17)
25. 3.1.3 Enter Parameters (17)
26. 4.3.4 Monitor MUTE (WLR)
27. 1.2.4 De/Brief Airwing (Sup)
28. 4.3.1 Monitor EMCON (WLR)
29. 4.3.2 Report EMCON (WLR)
30. 1.3.1 ECM Criteria (Sup)
31. 1.1.2 Assign Search (Sup)
32. 3.1.1 Monitor Auto Correlation (17)
33. 2.1.6 Evaluate ESM Bearing (NTDS)
34. 2.2.2 Update SB (NTDS)
35. 4.3.3 Log Violations (WLR)
36. 1.2.3 Update SB (Sup)
37. 3.2.1 Report EW (17)
38. 3.2.2 Provide Data (17)
39. 3.3.2 Establish ECM Modes (17)
40. 4.1.5 Assist Eval ECM (WLR)
41. 3.1.2 Establish Op Modes (17)
42. 1.1.1 Assign SLQ-17 (Sup)
43. 3.3.3 Assist ECM Employ (17)
44. 3.2.3 Monitor EW Entry (17)
45. 3.3.1 Engage Targets - ECM (17)
46. 3.1.4 Monitor Environment (17)
47. 4.2.3 Update SB (WLR)
48. 1.1.7 Monitor SLQ-17 (Sup)
49. 4.1.3 Check Images (WLR)

V. DISCUSSION AND RECOMMENDATIONS

There has been no attempt to ascertain what Weighted Deficit Value or Weighted Operability Value is acceptable. This is beyond the scope of this effort. The purpose has been to identify which areas are in need of improvement and which should be addressed first in order to realize the greatest amount of improvement for a given effort. Answering the question of what Weighted Deficit or Operability Value is acceptable will call for additional research targeting the Subtasks individually in greater detail than was attempted here.

A. LINK ANALYSIS DISCUSSION

The Link Analysis results show that there is only one position that might be considered acceptable in relation to the lengths of its links: the SLQ-17 position. This can be seen in part from the relatively good showing that the SLQ-17 console position had in comparison to the other two positions. The SLQ-17 operator can easily view what is displayed on the NTDS console and, without excessive movement, view the WLR-1 displays. He is within good viewing distance of the NTDS Status Board and his own SLQ-17 computer. The viewing distances to the WLR-1 position and its associated Status Board are rather long, but still viewable.
Because of its relatively good positioning in relation to the rest of the Module, the SLQ-17 entries fell much lower in Table 9. This would indicate that the layout forces increased operator compensation, since the other positions did not score as well. A score of 39.2% is an indication of a poor Module layout contributing to an increased operator compensation burden. Were the Module layout better, the operators would have felt much better about the Module and the score would have been higher.

B. OPERABILITY ANALYSIS DISCUSSION

Several observations can be made from Table 9. First, the SLQ-17 appears to be the best position in the EW Module, since its first entry is in twenty-first place in the table and most of its entries are at the bottom. Almost 27% of the possible improvement can be made in the first seven entries, and these are just for the EW Supervisor and the WLR-1 operator. Note that the criticalities of these Subtasks are the highest. In other words, these Subtasks, which are very critical, are poorly supported by the layout relative to the less critical Subtasks. Most of the lower criticalities are associated with Subtasks that have relatively good layout effectiveness. It can reasonably be argued that a Module Operability of 39.2% is not sufficient for an EW Module. What cannot be argued is how much improvement is enough. Nor can it be extrapolated from this study what improvement a rearrangement can result in. However, it can be seen that improvement can be made in certain areas, as is indicated by careful perusal of Table 9.

C. EXTRAPOLATION

Further, this approach can be used for possible extrapolations. For example, comparing Figure 1 and Figure 2, similarities are noted. They have the same arrangement of positions (i.e., from left to right, WLR-1, SLQ-17, and NTDS). The positions are arranged in a "straight line" type of layout. This resulted in a low Layout Effectiveness rating for USS CARL VINSON.
It may readily be conjectured that another arrangement would work better, namely a "crescent" shaped layout with the NTDS between the WLR-1 and SLQ-17 and the supervisor's position raised and directly behind the NTDS operator.

D. CONCLUSION

It can be concluded that the present configuration of the EW Module on USS CARL VINSON does not result in an optimal utilization of this Module in terms of EW mission accomplishment. Further, there is a real need to assess the layout operability of the warfare modules onboard U.S. Naval combatants. This thesis has provided one way in which a measure of the effectiveness of a particular layout can be determined. Although this was a limited test, indications are that this approach works, and further testing is warranted. Building a new layout is urged, using the Link Analysis and Operability Analysis illustrated in this work, in the hope that testing will prove it better than the last one. What is significant and useful from the Link Analysis is that any improvement in layout design should probably start with ensuring that the critical links are not overly long or taxed beyond their limit. Any improvement to the layout design should take these critical links into account and reduce them to their optimum, and any changes must not adversely affect the links, since in that case any gain in layout design may be cancelled by a loss in link utilization. By conducting tests at land-based test sites, the risks of error are reduced. By the use of mock-ups and fleet inputs, the risks can be reduced even further. The result is a more effective layout enhancing mission accomplishment.

E. RECOMMENDATION

It is recommended that a land-based test facility be established that would incorporate the ideas, recommendations, and methods indicated in this thesis as a first step toward upgrading our combat workspaces.
APPENDIX A
OPERATOR SUBTASK QUESTIONNAIRE

GENERAL INSTRUCTIONS

The purpose of this questionnaire is a subjective evaluation of the layout effectiveness of the EW Module, for use in an algorithm to determine, in an objective sense, the effectiveness of the layout in accomplishing the mission of the Module. To do this there is a series of subtasks, differentiated by operator, that must be assessed in terms of operator workload per subtask, space effectiveness per subtask, and criticality of the subtask toward overall mission accomplishment. What is required is to make this assessment based on your experience and expertise. There is no time limit, you may ask questions of anyone you wish, and you should go and look at the Module to make sure of your answers, especially if you are unsure of some of the questions concerning movements. There are no right or wrong answers, but try to be as precise as you can. A scenario, hopefully similar to your recent operations in the Sea of Japan, has been constructed. For each of the subtasks on the next page, mark with an "X" the description that best describes the operator workload (OW) and the space effectiveness (SE). If the arrangement of the space has little or no effect on subtask accomplishment, then it would rate the highest (4). Conversely, if the layout or arrangement of the space negatively impacts subtask accomplishment, then it would rate a (1). Give your assessment of the criticality of the subtask in relation to overall mission accomplishment. The descriptions of criticality, operator workload, and space effectiveness are listed on a separate sheet.

SCENARIO

This scenario begins with the assumption of the watch by a particular section. They are the on-coming watch section in the EW Module of a NIMITZ class aircraft carrier that is steaming in the open ocean with six escorts.
The escorts are one VIRGINIA class cruiser, two SPRUANCE class destroyers, one OLIVER HAZARD PERRY class frigate, one LOS ANGELES class submarine, and an oiler. There are heightened tensions world-wide, with a probable confrontation between the two super-powers. There is a Soviet task group within 200 NM. The task group is comprised of a KIEV class aircraft carrier, a KIROV class cruiser, a SOVREMENNYY class destroyer, two KRIVAK III class frigates, a SLAVA class cruiser, and three auxiliaries. Additionally, ECHO II, VICTOR III, and OSCAR class submarines are known to be in the area but have been unlocated for the past twelve hours. A Mod-KASHIN is the tattletale for the Battle Group. Both forces are within range of Soviet air power. General Quarters is not set, but a heightened Condition III steaming watch is manned. There has been a momentary lapse of 400 Hz power and the NTDS is being reloaded. The SLQ-17 needs to be reloaded and reprogrammed. As the NTDS is brought back on the line, the WLR-1 operator is told to recheck the past entries in his log and verify that they are still active. After 15 minutes, the WLR-1 operator reports that he has intercepted several new signals. One is an airborne mapping and reconnaissance radar. One appeared to be a brief intercept of a submarine radar. Another is an air search radar, and the last is a missile acquisition radar. The NTDS air trackers report jamming on both the long range and 3-D air search radars. The SLQ-17 alarms and displays hostile missile symbols from both the suspected direction of the Soviet task group and two angles 30 degrees to either side of the task group. Deck Launched Interceptors are airborne within one minute. The EW operator at the NTDS console is entering ESM bearing lines and attempting to identify unknown contacts. The SLQ-17 operator shifts operation of the ECM portion to automatic as the TAO frees weapons.
The WLR-1 operator is attempting to search the known hostile missile homing radar ranges to facilitate identification. General Quarters is sounded. The TAO orders EMCON to be set for battle and the WLR-1 operator selects EMCON D on MUTE. A quick check of both the NTDS scope and that of the SLQ-17 indicates that the number and direction of the inbound unknowns do not match. The EW watch section tries to match the emerging identifications from the WLR-1 and SLQ-17 to both the SLQ-17 and the NTDS presentations. If you have trouble envisioning this scenario, recall the Sea of Japan operations on your last deployment and consider the signal environment and tactics you saw then.

OPERATOR WORKLOAD, SPACE EFFECTIVENESS, AND CRITICALITY

WORKLOAD/COMPENSATION/INTERFERENCE (Mental and Physical)
(1) Workload Extreme; Compensation Extreme; Interference Extreme
(2) Workload High; Compensation High; Interference High
(3) Workload Moderate; Compensation Moderate; Interference Moderate
(4) Workload Low; Compensation Low; Interference Low

SPACE EFFECTIVENESS
(1) Inadequate Performance Due to Layout
(2) Adequate Performance Achievable; Layout Sufficient to Specific Task
(3) Layout Enhances Specific Task Accomplishment
(4) Layout Design Integrates Multiple Tasks

CRITICALITY: How important is it that the operator/team be able to perform this task as compared to the other tasks in successfully completing the mission?

Scale Value:
(1) Of very small importance. Ability to perform this task as compared to other tasks in this duty is unimportant, or almost unimportant, in order to successfully complete the mission of the Module.
(2) Of small importance. This task within this duty is less important than most tasks required to successfully complete the mission of the Module.
(3) Of moderate importance. This task within this duty is about as important as most tasks required to successfully complete the mission of the Module.
(4) Of substantial importance.
This task within this duty is more important than most tasks required to successfully complete the mission of the Module.
(5) Of extreme importance. This task within this duty is extremely important in order to successfully complete the mission of the Module.

EW Supervisor

Task: 1.1 Direct ESM search
Subtasks
1.1.1 Assign search parameters to SLQ-17
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____
1.1.2 Assign search parameters to WLR-1
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____
1.1.3 Assign ESM sensor report responsibilities -- own ship
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____
1.1.4 Assign ESM sensor report responsibilities -- force
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____
1.1.5 Initiate manual ID request - ship
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____
1.1.6 Initiate manual ID request - force
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____
1.1.7 Monitor automatic correlations/associations (SLQ-17)
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____

Task: 1.2 Report/Disseminate EW Information
Subtasks
1.2.1 Report evaluated EW information
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____
1.2.2 Provide EW recommendations
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____
1.2.3 Update status board near NTDS console
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____
1.2.4 Brief/debrief embarked Airwings
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____
1.2.5 Navigation by passive EW
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____

Task: 1.3 Counter Hostile Environment
Subtasks
1.3.1 Promulgate
ECM employment criteria
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____

NTDS Operator

Task: 2.1 Collect and enter EW data into NTDS
Subtasks
2.1.1 Enter manual ID information into NTDS
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____
2.1.2 Enter manual ESM/NTDS track associations
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____
2.1.3 Perform triangulation of ESM bearing lines
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____
2.1.4 Enter EW fixes
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____
2.1.5 Advise operators of bearing resolution
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____
2.1.6 Evaluate externally reported ESM bearings
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____

Task: 2.2 Report/Disseminate EW Information
Subtasks
2.2.1 Report evaluated EW information
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____
2.2.2 Update status board near console
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____

SLQ-17 Operator

Task: 3.
1 Conduct ESM Search
Subtasks
3.1.1 Monitor automatic correlations/associations
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____
3.1.2 Establish operating modes of SLQ-17 (ESM)
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____
3.1.3 Enter detection and response parameters (ESM/ECM)
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____
3.1.4 Monitor environment on NTDS console
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____
3.1.5 Evaluate displayed data
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____

Task: 3.2 Report/Disseminate EW Information
Subtasks
3.2.1 Report evaluated EW information
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____
3.2.2 Provide ESM/ECM data to NTDS
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____
3.2.3 Monitor entry of EW data into NTDS
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____
3.2.4 Update status board
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____

Task: 3.3 Counter Hostile Environment
Subtasks
3.3.1 Engage targets with ECM
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____
3.3.2 Establish ECM operating modes
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____
3.3.3 Assist in promulgation of ECM employment criteria
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____

WLR-1 Operator

Task: 4.1 Conduct
ESM Search
Subtasks
4.1.1 Search assigned bands
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____
4.1.2 Analyze intercepted signals
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____
4.1.3 Check intercepts for images/harmonics
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____
4.1.4 Accurately DF intercepted signals
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____
4.1.5 Assist in evaluating ECM
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____

Task: 4.2 Report/Disseminate EW Information
Subtasks
4.2.1 Provide ESM data to NTDS
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____
4.2.2 Report evaluated EW information
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____
4.2.3 Update status board near position
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____
4.2.4 Log all intercepts
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____

Task: 4.3 EMCON
Subtasks
4.3.1 Monitor EMCON
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____
4.3.2 Report violations of EMCON
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____
4.3.3 Log violations of EMCON
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____
4.3.4 Monitor MUTE
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____

EW Status Board

Task: 5.1 Maintain Status Boards
Subtasks
5.1.1 Communicate with operators
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____
5.1.2 Advise operators of any information received
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____
5.1.3 Update status boards
      Operator Workload: (1) (2) (3) (4)   Space Effectiveness: (1) (2) (3) (4)   Criticality: ____

Rating Matrix Cell Rank Order

For the following matrix, rank the blocks from best to worst (1 to 16). The lowest numbered block is the intersection of the best of the rows and columns. The number two block is next best, and so on. Note the arrows and the phrases associated with them. Design means the design of the layout or arrangement. Do not think of the layout of one workstation, such as the WLR-1 or SLQ-17, but of the entire EW Module. Think of the scenario already presented in order to properly consider the workload. Ask any questions you want, talk among yourselves, or go and look at the layout.
Rows (Space Effectiveness improving upward):
    Multiple Tasks Integrated
    Layout Enhances Specific Task Accomplishment
    Adequate Performance Achievable (Layout Sufficient)
    Inadequate Performance due to Layout, Cannot Compensate For Sub-Par Performance

Columns (Workload improving left to right):
    Workload At Critical Level; Compensation Very Excessive
    Workload Considerably Higher Than Anticipated; Compensation High; Interference Major
    Workload Slightly Higher Than Anticipated; Moderate Compensation; Minor Interference
    Workload As Anticipated; No Interference; No Compensation

LINK ANALYSIS QUESTIONNAIRE

WLR-1 Operator: How many times in an eight-hour watch do you:
1. Talk/communicate with the SLQ-17 operator?
2. View the presentation on the SLQ-17 console?
3. Talk/communicate with the NTDS operator?
4. View the presentation on the NTDS console?
5. View the NTDS Status Board (SB)?
6. View the WLR-1 Status Board?
7. Update the WLR-1 Status Board?
8. Check (visually) the SLQ-17 computer?
9. Reboot, reset, or work with the SLQ-17 computer?
10. Check MUTE?
11. Change any settings on MUTE?

SLQ-17 Operator: How many times in an eight-hour watch do you:
1. Talk/communicate with the NTDS operator?
2. View the presentation on the NTDS console?
3. Talk/communicate with the WLR-1 operator?
4. View the presentation on the WLR-1 console?
5. View the WLR-1 Status Board?
6. View the NTDS Status Board?
7. Update the NTDS Status Board?
8. Check (visually) the SLQ-17 computer?
9. Reboot, reset, or work with the SLQ-17 computer?
10. Check MUTE?
11. Change any settings on MUTE?

NTDS Operator/EW Supervisor: How many times in an eight-hour watch do you:
1. Talk/communicate with the SLQ-17 operator?
2. View the presentation on the SLQ-17 console?
3. Talk/communicate with the WLR-1 operator?
4. View the presentation on the WLR-1 console?
5. View the WLR-1 Status Board?
6. View the NTDS Status Board?
7. Update the NTDS Status Board?
8. Check (visually) the SLQ-17 computer?
9.
Reboot, reset, or work with the SLQ-17 computer?
10. Check MUTE?
11. Change any settings on MUTE?
12. Communicate outside the Module?

APPENDIX B
QUESTIONNAIRE RESULTS

Rating Matrix Cell Rank Order

In the following matrix, the blocks rank from best to worst (1 to 16). The lowest numbered block is the intersection of the best of the rows and columns. The number two block is next best, and so on. Note the arrows and the phrases associated with them. Design means the design of the layout or arrangement. The rows and columns are those of the Appendix A rating matrix (Space Effectiveness improving upward, Workload improving to the right).

|  9 |  4 |  2 |  1 |
| 12 |  7 |  5 |  3 |
| 14 | 11 |  8 |  6 |
| 16 | 15 | 13 | 10 |

RESULTANT INTERVAL SCALE FOR OPERABILITY

| 54 | 75 | 89 | 100 |
| 40 | 61 | 74 |  86 |
| 24 | 45 | 58 |  70 |
|  0 | 21 | 34 |  46 |

RESULTS OF ALL OPERATORS

NOTES: The following results are the Operability Values of all responses for each subtask. Generally, n equalled 7 for the subtasks, although there were some with only six responses. The Standard Deviation is that of the sample and not the population (i.e., the standard deviation was calculated using n vice n-1).
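The cell-by-cell aggregation behind the rating matrices (mean rank over the valid rankings, with the standard deviation computed using n rather than n-1, as the notes state) can be sketched as follows; the five sample rankings are hypothetical, not values from the thesis.

```python
# Sketch of the mean-rank / standard-deviation computation for one
# rating-matrix cell, dividing by n ("n vice n-1") as the notes describe.

from math import sqrt

def cell_stats(rankings):
    """rankings: the rank each respondent assigned to one matrix cell.
    Returns (mean rank, standard deviation with an n divisor)."""
    n = len(rankings)
    mean = sum(rankings) / n
    var = sum((r - mean) ** 2 for r in rankings) / n  # n, not n - 1
    return mean, sqrt(var)

# Hypothetical rankings of one cell by five respondents:
mean, sd = cell_stats([9, 10, 12, 8, 9])
print(f"mean rank {mean:.1f}, s.d. {sd:.2f}")
```

Running this over all sixteen cells, for each respondent's full 1-16 ranking, reproduces the structure of Table 4 and of the Appendix B matrix, with the standard deviation available to break ties in mean rank.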
The Interval Scale shown was based on the Ranking Scale which the EW operators provided through the questionnaire. The Ranking (an ordinal) Scale was then converted to an interval scale by means of the Delta Method [Ref. 6].

The first inequality, BP > AP, reduces to d1 > 0. This inequality is clearly satisfied for any positive value of d1, so it is not necessary to redefine this value. It will be recalled that values were assigned to A and B so that B would have a higher value than A. The second inequality is AQ > BP. Substituting the values in this inequality gives d3 > d1. This inequality is not true for all values of d3 and d1. However, since d3 > d1, it is possible to replace d3 by d1 + d3', for positive d3'. Now, for any choice of positive d1 and d3', the inequality d1 + d3' > d1 holds. On the work sheet, d3 is replaced by d1 + d3'; that is, in any row with a d3, a d1 and a d3' are added and the d3 is deleted. For convenience, and because the d3' column looks exactly like the d3 column, the d3' column is put where d3 was. This is merely a relabeling of columns. Note that as many "marks" as were in the d3 column are added to the d1 column. The work sheet at this point looks like Figure C-4. The next inequality, BQ > AQ, implies 2d1 + d3' > d1 + d3', or d1 > 0. Again it is not necessary to make any changes to satisfy this constraint.

[Figure C-4: work sheet with cell columns A, B, C, P, Q, R, CR, BR, CQ, AR, CP, BQ, AQ, BP, AP and d columns d1, d2, d3', d4; tally marks omitted. Annotations:
AQ > BP: d3 > d1, so d3 = d1 + d3'.
BP > AP: d1 > 0, do nothing.]

Figure C-4

Proceeding up the work sheet, the next inequality is CP > BQ. This implies that d2 > d1 + d3'.
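The work-sheet substitution just performed, deleting each mark in the d3 column and adding one mark each to the d1 and d3' columns, amounts to bookkeeping on multisets of d's. A minimal Python sketch of that step follows; the function name is this note's own, and the cell contents are written out from the example's inequalities (BP = d1, AQ = d3, BQ = d1 + d3 before the step).

```python
from collections import Counter

# Each work-sheet cell holds a tally of d's, consistent with the example:
cells = {"BP": Counter({"d1": 1}),
         "AQ": Counter({"d3": 1}),
         "BQ": Counter({"d1": 1, "d3": 1})}

def substitute(cells, old, new_parts):
    """Delete every mark in the `old` column and, for each mark deleted,
    add one mark in each `new_parts` column (a relabeling of columns)."""
    for tally in cells.values():
        marks = tally.pop(old, 0)
        if marks:
            for part in new_parts:
                tally[part] += marks
    return cells

# AQ > BP gave d3 > d1, so replace d3 by d1 + d3' in every row:
substitute(cells, "d3", ["d1", "d3'"])
print(cells["AQ"], cells["BQ"])  # AQ = d1 + d3', BQ = 2 d1 + d3'
```

After the step, BQ > AQ reads 2d1 + d3' > d1 + d3', exactly as in the text.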
Here we do the same as before, in that we redefine d2 as d1 + d3' + d2', consequently replacing d2 in every row in which it occurs by a d1, a d3', and a d2'. Again d2' is put in the column where d2 was. Consequently, the procedure involves relabeling the d2 column d2' and adding a d1 and a d3' for each d2 in any row in which a d2 appears. The work sheet at this point looks like Figure C-5.

[Figure C-5: work sheet with d columns d1, d2', d3', d4; tally marks omitted. Annotations:
CP > BQ: d2 > d1 + d3', so d2 = d1 + d3' + d2'.
BQ > AQ: d1 > 0, do nothing.
AQ > BP: d3 > d1, so d3 = d1 + d3'.
BP > AP: d1 > 0, do nothing.]

Figure C-5

The inequality AR > CP implies a similar inequality among the d's and is handled the same way. The work sheet at this point is given in Figure C-6.

[Figure C-6: work sheet with d columns d1, d2', d3', d4'; tally marks omitted. Annotations:
AR > CP: d4 > d1 + d2', so d4 = d1 + d2' + d4'.
CP > BQ: d2 > d1 + d3', so d2 = d1 + d3' + d2'.
BQ > AQ: d1 > 0, do nothing.
AQ > BP: d3 > d1, so d3 = d1 + d3'.
BP > AP: d1 > 0, do nothing.]

Figure C-6

The ordering CQ > AR implies an inequality among the d's which is handled somewhat differently from the previous inequalities. CQ > AR implies that d1 + d3' > d4'. Since there are two d's on the left side of the inequality, it is not possible simply to make the substitution d1 + d3' = d4' + d1' + d3''. Some rows may have different numbers of d1 and d3', so this replacement rule would be impossible to implement.

The following three-step method is used to redefine the d's. First, d4' is split into two parts, d4'' and d5. Since d1 + d3' > d4', this division may be done arbitrarily so that d1 > d4'' and d3' > d5. The preceding two inequalities may be handled by previously discussed methods. Thus the three replacements are d4' = d4'' + d5, d1 = d4'' + d1', and d3' = d3'' + d5.

Now, for any choice of positive d's, CQ > AR is satisfied. The work sheet now looks like Figure C-7. The methods of handling the remaining inequalities have already been discussed. Note that the number of steps to complete the work sheet is (matrix size - 1), or in the example, 8. For the Module Operability application there were 15 steps. When completed, the top half of the work sheet shows the relationship between scale values and the newly defined d's. For the example given, the following relationships hold:

A = 0
B = d1' + d3'' + d4'''
C = 2d1' + d2' + 3d3'' + 2d4''' + d5
P = 0
Q = d1' + 2d3'' + d4''' + d5
R = 2d1' + d2' + 4d3'' + 3d4''' + 2d5

[Figure C-7: work sheet with the d columns after all substitutions; tally marks omitted. Annotations:
CQ > AR: d1 + d3' > d4', so d4' = d4'' + d5, d1 = d4'' + d1', d3' = d3'' + d5.
AR > CP: d4 > d1 + d2', so d4 = d1 + d2' + d4'.
CP > BQ: d2 > d1 + d3', so d2 = d1 + d3' + d2'.
BQ > AQ: d1 > 0, do nothing.
AQ > BP: d3 > d1, so d3 = d1 + d3'.
BP > AP: d1 > 0, do nothing.]

Figure C-7
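The scale relationships listed for the example can be checked numerically. The sketch below (Python; the function name is this note's own, and d1, d3, d4 stand for d1', d3'', d4''') sets every d to 1 and confirms that the resulting additive cell values reproduce the empirical rank order CR > BR > CQ > AR > CP > BQ > AQ > BP > AP.

```python
# Scale values for the worked example, written out from the relationships above
def scales(d1, d2, d3, d4, d5):
    rows = {"A": 0,
            "B": d1 + d3 + d4,
            "C": 2*d1 + d2 + 3*d3 + 2*d4 + d5}
    cols = {"P": 0,
            "Q": d1 + 2*d3 + d4 + d5,
            "R": 2*d1 + d2 + 4*d3 + 3*d4 + 2*d5}
    return rows, cols

rows, cols = scales(1, 1, 1, 1, 1)   # the common minimal choice: all d's = 1
print(rows, cols)                    # A=0, B=3, C=9; P=0, Q=5, R=12

# The additive representation: each cell's value is row value + column value,
# and sorting the nine cells recovers the empirical rank order
cells = {r + c: rows[r] + cols[c] for r in rows for c in cols}
ranked = sorted(cells, key=cells.get, reverse=True)
print(ranked)  # ['CR', 'BR', 'CQ', 'AR', 'CP', 'BQ', 'AQ', 'BP', 'AP']
```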
Any choice of positive d's will give a scale for Factors I and II, as well as for overall performance, which is an additive representation and in which the overall ordering agrees with the empirical ordering. Figure C-8 contains the results of this example.

[Figure C-8: completed work sheet for the example with all d's set to 1; tally marks omitted. Resulting scale values: A = 0, B = 3, C = 9; P = 0, Q = 5, R = 12; cell values: CR = 21, BR = 15, CQ = 14, AR = 12, CP = 9, BQ = 8, AQ = 5, BP = 3, AP = 0.]

Figure C-8

A common choice of d's is to make them all equal to 1. This choice yields a set of scale values which represents the set of minimal integers that will produce the required rank order of the matrix cells. The completed work sheet for the Module Operability application follows. Note that the tallies of d's are written as Arabic numerals for viewing ease, and the totals are listed at the side.

[Figure C-9: completed work sheet for the Module Operability application (4 x 4 matrix) with all d's set to 1; tallies omitted. Resulting scale values: A = 0, B = 24, C = 40, D = 55; P = 0, Q = 22, R = 35, S = 47; cell totals, from highest to lowest: SD = 102, RD = 90, SC = 87, QD = 77, RC = 75, SB = 71, QC = 62, RB = 59, PD = 55, SA = 47, QB = 46, PC = 40, RA = 35, PB = 24, QA = 22, PA = 0.]

Figure C-9

LIST OF REFERENCES

1.
Carlson, D.L., Integration Analysis: A Proposed Integration of Test and Evaluation Techniques for Early On Detection of Human Factors Engineering Discrepancies, p. 15, Master's Thesis, Naval Postgraduate School, Monterey, California, March 1983.

2. Pacific Missile Test Center Report TP-79-31, Mission Operability Assessment Technique: A System Evaluation Methodology, by Lt. W.R. Helm and M.L. Donnell, 10 October 1979.

3. Hutchinson, R.D., New Horizons for Human Factors in Design, McGraw-Hill, 1981.

4. McCormick, Ernest J. and Sanders, Mark S., Human Factors in Engineering and Design, 5th ed., pp. 252-254, McGraw-Hill Book Co., 1982.

5. Air Force Aerospace Medical Research Laboratory Report TR-81-35, Human Engineering Procedures Guide, by Charles W. Geer, pp. 134, 137, September 1981.

6. Coombs, Clyde H., A Theory of Data, pp. 96-102, John Wiley & Sons, Inc., 1964.

BIBLIOGRAPHY

Blauser, D.J., Lieutenant Commander, U.S. Navy, and others, USS CARL VINSON Combat Direction Center Doctrine, Draft, December 1981. (unpublished)

Brandquist, Roland, Captain, U.S. Navy, and Sidrow, Michael R., Lieutenant, U.S. Navy, "Falklands Fallout: Strengthen the Surface Navy!", U.S. Naval Institute Proceedings, v. 110, July 1984.

Clarke, Michele, "A Picture is Worth..., A Look at Military Display Technology", Journal of Electronic Defense, v. 7, December 1984.

Gallotta, RADM Albert A., Jr., "EW, Information and Battle Management", Journal of Electronic Defense, v. 7, July 1984.

Grant, Peter M., Lieutenant, U.S. Navy, "Getting the Big Picture into Our CICs", U.S. Naval Institute Proceedings, v. 110, January 1984.

Gwyn, J.R., ESM Information Integration and the Advanced Combat Direction System, Master's Thesis, Naval Postgraduate School, Monterey, California, September 1983.

Morison, Samuel L., "Falklands (Malvinas) Campaign: A Chronology", U.S. Naval Institute Proceedings, v. 109, June 1983.
Roch, Donald R., "EW Information Display Systems", International Countermeasures Handbook, 10th ed., 1985.

Ruhe, William J., Captain, U.S. Navy (Retired), "Antiship Missiles Launch New Tactics", U.S. Naval Institute Proceedings, v. 108, December 1982.

Snurkowski, Charles, Some Operational Observations on Surface EW Problems, Master's Thesis, Naval Postgraduate School, Monterey, California, October 1982.

Stiles, Gerald J., "BEKAA II? The Evolution of Close-in EW", Defense Science & Electronics, v. 4, February 1985.

Turner, Stansfield, Admiral, U.S. Navy (Retired), "The Unobvious Lessons of the Falklands War", U.S. Naval Institute Proceedings, v. 109, April 1983.

Wickens, Christopher D., Engineering Psychology and Human Performance, Charles E. Merrill Publishing Co., 1984.

INITIAL DISTRIBUTION LIST

                                                          No. Copies

1. Defense Technical Information Center                        2
   Cameron Station
   Alexandria, Virginia 22304-6145

2. Library, Code 0142                                          2
   Naval Postgraduate School
   Monterey, California 93943-5100

3. Superintendent                                              1
   ATTN: Chairman, Electronic Warfare Academic Group, Code 73
   Naval Postgraduate School
   Monterey, California 93943-5100

4. Commander, Naval Sea Systems Command                        1
   ATTN: CDR J.N. Edwards, Code SEA-61X3
   Naval Sea Systems Command Headquarters
   Washington, D.C. 20362

5. Commander, Space and Naval Warfare Command                  1
   ATTN: LCDR W.R. DeMain, Code PDW 107-3C1
   Space and Naval Warfare Command Headquarters
   Washington, D.C. 20362

6. Commander, Naval Ocean System Center                        1
   ATTN: Mr. David Lutz, Code 433
   Naval Ocean System Center
   San Diego, California 92152

7. Commanding Officer, USS CARL VINSON (CVN-70)                1
   ATTN: LT Inman, EWO
   USS CARL VINSON (CVN-70)
   FPO San Francisco, California 96629-2840

8. Superintendent                                              2
   ATTN: Department of Operations Research, Code 55Hu
   Naval Postgraduate School
   Monterey, California 93943-5100

9. LCDR D.J. Blauser                                           8
   Fleet Air Reconnaissance Squadron One
   FPO San Francisco, California 96601-6550