DIGITISING COMMAND AND CONTROL
Human Factors in Defence
Series Editors:
Human factors is key to enabling today’s armed forces to implement their vision to ‘produce
battle-winning people and equipment that are fit for the challenge of today, ready for the tasks
of tomorrow and capable of building for the future’ (source: UK MoD). Modern armed forces
fulfil a wider variety of roles than ever before. In addition to defending sovereign territory and
prosecuting armed conflicts, military personnel are engaged in homeland defence and in undertaking
peacekeeping operations and delivering humanitarian aid right across the world. This requires top
class personnel, trained to the highest standards in the use of first class equipment. The military has
long recognised that good human factors is essential if these aims are to be achieved.
The defence sector is far and away the largest employer of human factors personnel across the
globe and is the largest funder of basic and applied research. Much of this research is applicable to
a wide audience, not just the military; this series aims to give readers access to some of this high
quality work.
Ashgate’s Human Factors in Defence series comprises specially commissioned books from
internationally recognised experts in the field. They provide in-depth, authoritative accounts of key
human factors issues being addressed by the defence industry across the world.
Digitising Command and Control
A Human Factors and Ergonomics Analysis of Mission Planning and
Battlespace Management
Neville A. Stanton
University of Southampton, UK
Daniel P. Jenkins
Sociotechnic Solutions Ltd, UK
Paul M. Salmon
Monash University, Australia
Guy H. Walker
Heriot-Watt University, UK
Kirsten M. A. Revell
University of Southampton, UK
&
Laura A. Rafferty
University of Southampton, UK
© Neville A. Stanton, Daniel P. Jenkins, Paul M. Salmon, Guy H. Walker, Kirsten M. A. Revell and
Laura A. Rafferty 2009
All rights reserved. No part of this publication may be reproduced, stored in a retrieval system or transmitted
in any form or by any means, electronic, mechanical, photocopying, recording or otherwise without the prior
permission of the publisher.
Neville A. Stanton, Daniel P. Jenkins, Paul M. Salmon, Guy H. Walker, Kirsten M. A. Revell and
Laura A. Rafferty have asserted their moral right under the Copyright, Designs and Patents Act, 1988, to be
identified as the authors of this work.
Published by
Ashgate Publishing Limited
Wey Court East
Union Road
Farnham
Surrey, GU9 7PT
England

Ashgate Publishing Company
Suite 420
101 Cherry Street
Burlington
VT 05401-4405
USA
www.ashgate.com
UG485.D53 2009
355.3'3041--dc22
2009011260
Contents
4 Constraint Analysis 29
Method 30
Results 34
Conclusions 37
References 195
Index 203
Author Index 209
List of Figures
Figure 2.1 Illustration showing Human Factors effort is better placed in early stages of the
design process 10
Figure 2.2 Application of Human Factors methods by phase of the design process 11
Figure 3.1 Battle Group Headquarters 16
Figure 3.2 Planning timeline on a flipchart 17
Figure 3.3 Threat integration on map and overlay 18
Figure 3.4 Mission Analysis on a whiteboard 19
Figure 3.5 Effects Schematic drawn on a flipchart and laid on the map 19
Figure 3.6 COAs developed on a flipchart 20
Figure 3.7 DSO on map and overlay 21
Figure 3.8 DSOM on a flipchart 22
Figure 3.9 Coordination of force elements on map and overlay via a wargame 22
Figure 3.10 Coordination Measures captured on a whiteboard 23
Figure 3.11 Fire control lines on map and overlay also recorded in staff officer’s
notebook 24
Figure 3.12 Relationships between the cells in Battle Group Headquarters during mission
planning 26
Figure 4.1 Abstraction Hierarchy for the command process 31
Figure 4.2 Subjective opinion of the digital system in work domain terms 33
Figure 4.3 Concordance of positive ratings between levels 35
Figure 4.4 Concordance of negative ratings between levels 36
Figure 5.1 Hierarchical Task Analysis procedure 40
Figure 5.2 Combat Estimate Seven Questions task model 41
Figure 5.3 Question One HTA extract (1) 44
Figure 5.4 Question One HTA extract (2) 45
Figure 5.5 Question One HTA extract (3) 46
Figure 5.6 Question Two Mission Analysis HTA 48
Figure 5.7 Question Three HTA 49
Figure 5.8 Questions Four–Seven HTA 51
Figure 5.9 DSO construction HTA 52
Figure 5.10 Synchronisation matrix construction HTA 53
Figure 5.11 SHERPA EEM taxonomy 56
Figure 5.12 SHERPA flowchart 57
Figure 5.13 Advantages and disadvantages of each planning process 61
Figure 6.1 Propositional network example 67
Figure 6.2 Bde/BG HQ layout showing component cells 70
Figure 6.3(a) Question 1 SA requirements 72
Figure 6.3(b) Question 2 SA requirements 73
Figure 6.4 Combat Estimate task model 74
Figure 6.5 Question one propositional network 75
Figure 6.6 Question two propositional network 76
Figure 6.7 Question three propositional network 77
Figure 6.8 Question four propositional network 78
Figure 8.8 Diagram displaying a number of icons on top of one another 131
Figure 8.9 Diagram showing the LOP with the e-map turned off 133
Figure 8.10 Diagram displaying the ability to hide all icons except the user’s own 133
Figure 8.11 Diagram showing the purple colour coding of certain buttons 134
Figure 9.1 Overall median values for Visual Clarity 141
Figure 9.2 Comparison of median values for Visual Clarity by group 142
Figure 9.3 Overall median values for Consistency 143
Figure 9.4 Comparison of median values for Consistency by group 144
Figure 9.5 Overall median values for Compatibility 144
Figure 9.6 Comparison of median values for Compatibility by group 145
Figure 9.7 Overall values for Informative Feedback 146
Figure 9.8 Comparison of median values for Informative Feedback by group 147
Figure 9.9 Overall median values for Explicitness 148
Figure 9.10 Comparison of median values for Explicitness 148
Figure 9.11 Overall median values for Appropriate Functionality 149
Figure 9.12 Comparison of median values for Appropriate Functionality by group 150
Figure 9.13 Overall median values for Flexibility and Control 151
Figure 9.14 Comparison of median values for Flexibility and Control by group 151
Figure 9.15 Overall values for Error Prevention and Correction 152
Figure 9.16 Comparison of median values for Error Prevention and Correction by
group 153
Figure 9.17 Overall values for User Guidance and Support 154
Figure 9.18 Comparison of median values for User Guidance and Support by group 154
Figure 9.19 Overall median values for System Usability Problems 155
Figure 9.20 Comparison of median values for System Usability Problems by group 156
Figure 9.21 Overall median values for categories 1 to 9 156
Figure 9.22 Comparison of median values for categories 1 to 9 by group 157
Figure 10.1 Graph showing how PMV values map on to the predicted percentage of
people thermally dissatisfied 162
Figure 10.2 Longitudinal overview of the thermal environment extant in Bde HQ 165
Figure 10.3 Longitudinal overview of the thermal environment extant in BG HQ 165
Figure 10.4 Longitudinal overview of relative humidity extant in Bde HQ 166
Figure 10.5 Longitudinal overview of relative humidity extant in BG HQ 167
Figure 10.6 Noise levels measured in dB(A) at Bde HQ during the CPX 168
Figure 10.7 Noise levels measured in dB(A) at BG HQ during the CPX 169
Figure 10.8 The Cornell Office Environment Survey 172
Figure 10.9 BG and Bde responses to questions about environmental conditions 173
Figure 10.10 BG and Bde responses to questions about physical symptoms 173
Figure 10.11 Bar chart showing the extent of non-compliance with environmental
guidelines 178
Figure 11.1 Key enablers to enhance performance 187
Acknowledgements
The HFI DTC is a consortium of defence companies and universities working in cooperation on a series of defence-related projects. The consortium is led by Aerosystems International and comprises Birmingham University, Brunel University, Cranfield University, Lockheed Martin, MBDA and SEA. The consortium was recently awarded The Ergonomics Society President’s Medal for work that has made a significant contribution to original research, the development of methodology, and the application of knowledge within the field of ergonomics.
Jo Partridge
We are grateful to DSTL who have managed the work of the consortium, in particular (and in
alphabetical order) to Geoff Barrett, Bruce Callander, Jen Clemitson, Colin Corbridge, Katherine
Cornes, Roland Edwards, Alan Ellis, Helen Forse, Beejal Mistry, Alison Rogers, Jim Squire and
Debbie Webb.
This work from the HFI DTC was part-funded by the Human Sciences Domain of the UK
Ministry of Defence Scientific Research Programme.
Further information on the work and people that comprise the HFI DTC can be found on www.hfidtc.com.
Glossary
3D Three dimensional
AH Abstraction Hierarchy
AoA Avenue of Approach
BAE Battlefield Area Evaluation
Bde Brigade
BG Battle Group
BS British Standard
C2 Command and Control
CAST Command and Staff Trainer
CCIR Commander’s Critical Information Requirements
CDM Critical Decision Method
CO Commanding Officer
CoA Course of Action
Comms Communications
CoS Chief of Staff
COTS Commercial-off-the-Shelf
CPX Command Post Exercise
CRI Colour Rendering Index
CSSO Combat Service Support for Operations
CWA Cognitive Work Analysis
dB decibels
DP Decision Point
DSA Distributed Situation Awareness
DSO Decision Support Overlay
DSOM Decision Support Overlay Matrix
EEM External Error Mode
EEMUA Engineering Equipment & Materials Users Association
EN European Standard
EXCON Exercise Control
FRAGO Fragmentary Order
GUI Graphical User Interface
HCI Human Computer Interaction
HEI Human Error Identification
HF High Frequency
HFI DTC Human Factors Integration Defence Technology Centre
HQ Headquarters
HTA Hierarchical Task Analysis
Hz Hertz
IR Information Requests
ISO International Organization for Standardization
ISTAR Intelligence, Surveillance, Target Acquisition and Reconnaissance
LOP Local Operational Picture
About the Authors

Professor Neville A. Stanton
HFI DTC, Transportation Research Group, School of Civil Engineering and the Environment,
University of Southampton, Highfield, Southampton, SO17 1BJ, UK.
n.stanton@soton.ac.uk
Professor Stanton holds a Chair in Human Factors in the School of Civil Engineering and the
Environment at the University of Southampton. He has published over 140 peer-reviewed journal
papers and 14 books on Human Factors and Ergonomics. In 1998 he was awarded the Institution
of Electrical Engineers Divisional Premium Award for a co-authored paper on Engineering
Psychology and System Safety. The Ergonomics Society awarded him the Otto Edholm medal
in 2001 and The President’s Medal in 2008 for his contribution to basic and applied ergonomics
research. In 2007 The Royal Aeronautical Society awarded him the Hodgson Medal and Bronze
Award with colleagues for their work on flight deck safety. Professor Stanton is an editor of the
journal Ergonomics and on the editorial boards of Theoretical Issues in Ergonomics Science and
the International Journal of Human Computer Interaction. Professor Stanton is a Fellow and
Chartered Occupational Psychologist registered with The British Psychological Society, and a
Fellow of The Ergonomics Society. He has a B.Sc. (Hons) in Occupational Psychology from the
University of Hull, an M.Phil. in Applied Psychology and a Ph.D. in Human Factors from Aston
University in Birmingham.
Dr Daniel P. Jenkins
Dr Paul M. Salmon
Human Factors Group, Monash University Accident Research Centre, Building 70, Clayton
Campus, Monash University, Victoria 3800, Australia.
Dr Salmon is a Senior Research Fellow in the Human Factors Group at Monash University and
holds a B.Sc. in Sports Science and an M.Sc. in Applied Ergonomics (both from the University
of Sunderland). He has over six years’ experience in applied human factors research in a number
of domains, including the military, civil and general aviation, rail and road transport and has
previously worked on a variety of research projects in these areas. This has led to him gaining
expertise in a broad range of areas, including human error, situation awareness and the application
of Human Factors Methods, including human error identification, situation awareness measurement,
teamwork assessment, task analysis and cognitive task analysis methods. Dr Salmon’s current
research interests include the areas of situation awareness in command and control, human error
and the application of human factors methods in sport. He has authored and co-authored various
scientific journal articles, conference articles, book chapters and books and was recently awarded
the Royal Aeronautical Society Hodgson Prize for a co-authored paper in the society’s Aeronautical Journal.
Dr Guy H. Walker
School of the Built Environment, Heriot-Watt University, Edinburgh, EH14 4AS, UK.
Dr Walker read for a B.Sc. Honours degree in Psychology at Southampton University
specialising in engineering psychology, statistics and psychophysics. During his undergraduate
studies he also undertook work in auditory perception laboratories at Essex University and the
Applied Psychology Unit at Cambridge University. After graduating in 1999 he moved to Brunel
University, gaining a Ph.D. in Human Factors in 2002. His research focused on driver performance,
situational awareness and the role of feedback in vehicles. Since this time he has worked for a
human factors consultancy on a project funded by the Rail Safety and Standards Board, examining
driver behaviour in relation to warning systems and alarms fitted in train cabs.
Kirsten M. A. Revell
HFI DTC, Transportation Research Group, School of Civil Engineering and the Environment,
University of Southampton, Highfield, Southampton, SO17 1BJ, UK.
Ms Revell graduated from Exeter University in 1995 with a B.Sc. (Hons) in Psychology, where
her dissertation focused on the use of affordances in product design. After graduating, she spent six
years working for Microsoft Ltd., implementing and managing the Microsoft Services Academy
which prepared graduates for technical and consulting roles across Europe, the Middle East and
Africa. In 2005 she undertook a second degree in Industrial Design at Brunel University. As part
of her degree, she spent 10 months on industrial placement with the Ergonomics Research Group.
During this time, she took part in a major field trial for the HFI DTC, assisting in data collection and
analysis. She intends to bring together her psychology and design disciplines by pursuing a Human
Factors approach to design, with a particular interest in affordances.
Laura A. Rafferty
HFI DTC, Transportation Research Group, School of Civil Engineering and the Environment,
University of Southampton, Highfield, Southampton, SO17 1BJ, UK.
Ms Rafferty completed her undergraduate studies in 2007, graduating with a B.Sc. (Hons) in Psychology from Brunel University. In the course of this degree she completed two industrial placements,
the second of which was working as a Research Assistant in the Ergonomics Research Group.
During this 7-month period she helped to design, run and analyse a number of empirical studies
being run for the HFI DTC. Within this time Ms Rafferty also completed her dissertation exploring
the qualitative and quantitative differences between novices and experts within military command
and control. She is currently in her second year of Ph.D. studies, focusing on team work and
decision making associated with combat identification.
Preface
This book aims to show how Human Factors and Ergonomics can be used to support system
analysis and development. As part of the research work of the Human Factors Integration Defence
Technology Centre (HFI DTC), we are often asked to comment on the development of new
technologies. For some time now we have looked in-depth at Command and Control activities and
functions. The reader is guided to our other books on Modelling Command and Control, Cognitive
Work Analysis, Distributed Situation Awareness and Socio-Technical Systems (all published under
the Human Factors in Defence series) for a fuller appreciation of our work. The research reported
in this book brought all of these areas together to look in-depth at a proposal for a new digitised
system that would support Command and Control at Brigade Headquarters and below. For us it
was a good opportunity to apply the methods we had been developing to a system that was in
development. The pages within this book show you how we went about this task and what we
found.
It is often the cry of Human Factors and Ergonomics that we are not asked for our involvement
in system development early enough. In the past we have written books on Human Factors Methods
(published by Ashgate and others), which explain how to apply the methods to system design and
evaluation. Here we were given the opportunity, although we also feel that involvement when the
system was being tested was too late, as we would have preferred to have been involved in system
concept, design and development. Nevertheless, it is pleasing to have been involved in the testing
phase, so that any shortcomings could be addressed in subsequent design.
As with all projects of this nature, we have gone to great pains to disguise the system under
test for reasons of commercial confidentiality. This means that we are not allowed to disclose the
name of the products nor any screen shots of the equipment. We have redrawn all the pictures and
removed any reference to the company involved. It is a pity that such steps are required and we
wish organisations could be more open about the testing of their products. Any short-term pain
would turn into longer-term gain for the products, the users and the organisations involved.
As the contents of this book show, we started our analysis by understanding how mission
planning and battlespace management works with traditional materials. The research team not only
observed people conducting the tasks but also undertook the training in those tasks themselves.
There is much insight to be gained through participant-observation, more than mere observation
allows. It also enhanced the understanding of our subsequent observations, because we had
performed the tasks for ourselves.
People may approach this book with many different requirements, goals and agendas. For
those who want an overview of Human Factors Methods, we recommend chapter two. For those
who want to understand mission planning processes, we recommend chapter three. If you are
interested in any particular method, read the overview in chapter two, then chapter four for
Cognitive Work Analysis, chapter five for Hierarchical Task Analysis, chapter seven for Social
Network Analysis, chapter eight for SCADA Analysis, chapter nine for Usability Analysis and
chapter ten for Environmental Analysis. For those interested in collaboration and communication
in military headquarters, we recommend chapters three, six and seven. Finally, for those interested
in our recommendations for future design of digital Command and Control we recommend chapter
eleven. We have tried to write each chapter to be stand-alone, but accept that people may want to dip
in and out of chapters to suit their particular needs. We also feel that this book serves as a perfectly
compatible accompaniment to any of our other books on Human Factors Methods, Modelling
Command and Control, Cognitive Work Analysis, Distributed Situation Awareness and Socio-
Technical Systems. This book brings all of the topics presented in the previous books together to
focus on the analysis of a mission planning and battlespace management system.
Chapter 1
Overview of the Book
This book presents a Human Factors and Ergonomics evaluation of a digital Mission Planning and
Battlespace Management (MP/BM) system. Emphasis was given to the activities occurring within
Brigade (Bde) and Battle Group (BG) level headquarters (HQ), and the Human Factors team from
the HFI DTC distributed their time evenly between these two locations. The insights contained in
this volume arise from a wide-ranging and multi-faceted approach, comprising:
• observation of people using the traditional analogue MP/BM processes in the course of their
work to understand how analogue MP/BM is used in practice;
• constraint analysis (Cognitive Work Analysis, CWA) of the digital MP/BM system
to understand if digital MP/BM is better or worse than the conventional paper-based
approach;
• analysis of the tasks and goal structure required by the digital MP/BM, to understand the
ease with which the activities can be performed and identify the likely design-induced
errors;
• analysis of Distributed Situation Awareness (DSA), to understand the extent to which digital
MP/BM supports collaborative working;
• analysis of the social networks that the digital system allows to form spontaneously (to
understand the way in which people choose to communicate via voice and data);
• assessment against EEMUA 201 (Engineering Equipment & Materials Users Association)
to understand if digital MP/BM meets with best Human Factors practice in control system
interface design;
• assessment against a Usability Questionnaire, to gauge user reactions about the ease or difficulty of using the digital MP/BM system; and
• an environmental survey, to understand the extent to which the Bde and BG environment
within which people are working meets with British Standard BS/EN/ISO 11064
Environmental Requirements for Control Centres.
A brief summary of the chapters of the book is presented next, with the detailed description of
methods, approach, findings and recommendations within the main body of the book.
Chapter two presents an overview of the Human Factors and Ergonomics discipline and
the methods associated with it. The discipline is introduced with a few examples of how it has
contributed to improved display and control design. This is consistent with the overall aim of
improving the well-being of workers, as well as their work, and the general goal of improved
system performance. Two examples in particular resonate with the purpose of this book, both
taken from aviation over 60 years ago but still with relevance today. Safety of systems is of
major importance in Human Factors, and safety-critical environments have much to gain from its
application. Human Factors and Ergonomics offers unique insights into the way in which people
work, through the understanding of the interactions between humans, technology, tools, activities,
products and their constraints. This understanding is assisted through the application of Human
Factors and Ergonomics methods, which are also introduced. Some of these are pursued through
the rest of the book. They offer complementary perspectives on the problem and can be used in an
integrated manner.
Chapter three presents observational studies of the tasks people were undertaking in the HQs
prior to digitisation. The conventional, analogue mission planning process is examined with the
objective of identifying the Ergonomics challenges for digitisation. Prototypes of digital mission
planning systems are beginning to be devised and demonstrated, but concern has been expressed over the design of such systems, many of which fail to incorporate the human aspects of socio-technical systems design. Previous research has identified many of the
potential pitfalls of failing to take Ergonomic considerations into account, as well as the multiplicity
of constraints acting on the planners and planning process. An analysis of mission planning in
a BG is presented, focussing on the tasks and the products produced. This study illustrates the
efficiency of an analogue process, one that has evolved over many generations to form the Combat
Estimate, a process that is mirrored by forces throughout the world. The challenges for digitisation
include ensuring that the mission planning process remains easy and involving, preserving the
public nature of the products, encouraging the collaboration and cooperation of the planners, and
maintaining the flexibility, adaptability and speed of the analogue planning process. It is argued
that digitisation should not become an additional constraint on mission planning.
Chapter four presents the constraint analysis performed on the digital MP/BM. This approach, realised through CWA, deconstructs the system into different levels of abstraction (a brief sketch in code follows the list):
• Functional Purpose (that is, the reason that the system exists; for example, conduct planning
to enact higher command’s intent);
• Values and Priorities (that is, the measures of success for the system; for example, maintain
digital MP/BM effectiveness, minimise casualties, reduce time to generate products and so
on);
• Purpose Related Functions (that is, the functions the system is performing; for example,
coordination of units, position and status, threat evaluation, resource allocations and so on);
and
• Object Related Functions (that is, what the physical objects in the system do; for example,
data transmission, voice communication, blue positions, red positions, effects schematic
and so on).
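To make the shape of this analysis concrete, the sketch below shows one way the four levels just listed might be represented in code, together with the means-ends links that tie each level to the one above. It is a minimal illustration under our own assumptions: the node names echo the examples in the list, the links between them are invented, and the sketch bears no relation to the CWA tooling actually used in the study.

```python
# Minimal sketch of an Abstraction Hierarchy (AH) for the command process.
# Node names echo the examples above; the means-ends links are invented.

ABSTRACTION_HIERARCHY = {
    "functional purpose": ["enact higher command's intent"],
    "values and priorities": ["minimise casualties",
                              "reduce time to generate products"],
    "purpose related functions": ["coordination of units",
                                  "threat evaluation"],
    "object related functions": ["data transmission", "voice communication",
                                 "blue positions", "effects schematic"],
}

# Means-ends links: each node maps to the nodes one level up that it supports.
MEANS_ENDS = {
    "data transmission": ["coordination of units"],
    "voice communication": ["coordination of units"],
    "blue positions": ["threat evaluation", "coordination of units"],
    "effects schematic": ["coordination of units"],
    "coordination of units": ["reduce time to generate products"],
    "threat evaluation": ["minimise casualties"],
    "minimise casualties": ["enact higher command's intent"],
    "reduce time to generate products": ["enact higher command's intent"],
}

def why(node):
    """Read the links upwards: which higher-level ends does this node serve?"""
    return MEANS_ENDS.get(node, [])

print(why("blue positions"))  # ['threat evaluation', 'coordination of units']
```

Reading the links upwards answers the question of why a node exists; reading them downwards answers how an end is achieved, which is the same why-what-how logic used when walking interviewees through the hierarchy.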
This Abstraction Hierarchy (AH) was then used as a basis for interviewing staff officers at
BG and Bde level, to find out if digital MP/BM was significantly better, the same, or significantly
worse than conventional approaches. The findings showed that the system offered little to support
planning, with none of the respondents offering a positive opinion of the system’s ability to aid
the planning process. Further examination of the results showed that the digital MP/BM estimate
process was generally unsupported by the digital system and that in many cases digitisation had
a negative effect on tempo, efficiency, effectiveness and flexibility. The participants offered a
positive rating for the system’s ability to support battlefield management; however, examination of
the results reveals that this positive rating can be mainly attributed to the secure voice radio facility
rather than the digital MP/BM elements of the system.
Chapter five presents a deconstruction of the activities performed in the operation of the digital
MP/BM system. The deconstruction takes place under the rubric of Hierarchical Task Analysis
(HTA) and creates a hierarchy of goals, sub-goals and plans. This analysis produced a task model
for each of the seven questions in the Combat Estimate. This offers a much higher fidelity of analysis
for the steps involved in producing the Combat Estimate products, as overviewed in chapter three.
The HTA was used as the basis for investigating the ease or difficulty with which the operations
on the digital MP/BM system could be performed. Examples of the difficulties encountered are
presented together with suggested remedies in the redesign of the system. The HTA also formed the
foundations for human error identification analysis using the Systematic Human Error Reduction
and Prediction Approach (SHERPA) method. The SHERPA taxonomy was applied to the HTA in
order to identify likely error modes. Examples of the types of design-induced errors that users may
be likely to commit are presented. These errors are also intended to focus attention on remedial
strategies and stimulate design solutions.
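As a rough sketch of how a taxonomy-based prediction of this kind proceeds, the fragment below applies a cut-down External Error Mode (EEM) list to individual task steps. The task steps shown are hypothetical placeholders rather than extracts from the actual HTA, and only four modes from the full SHERPA taxonomy (Figures 5.11 and 5.12) are included.

```python
# Cut-down sketch of a SHERPA-style pass over HTA task steps. The steps are
# invented; the error modes are a small subset of the SHERPA EEM taxonomy.

TASK_STEPS = [
    {"id": "1.1", "description": "select enemy icon on e-map", "type": "action"},
    {"id": "1.2", "description": "read grid reference from panel", "type": "check"},
    {"id": "1.3", "description": "enter DP in synchronisation matrix", "type": "action"},
]

EEM_SUBSET = {
    "action": ["A6: right operation on wrong object", "A8: operation omitted"],
    "check": ["C1: check omitted", "C2: check incomplete"],
}

def predict_errors(steps):
    """For each step, list credible error modes; the analyst then judges
    consequence, recovery, probability, criticality and a design remedy."""
    rows = []
    for step in steps:
        for mode in EEM_SUBSET.get(step["type"], []):
            rows.append((step["id"], step["description"], mode))
    return rows

for step_id, description, mode in predict_errors(TASK_STEPS):
    print(f"{step_id} ({description}): {mode}")
```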
Chapter six presents an evaluation of DSA during mission planning and execution activities
supported by the digital MP/BM system. The analysis was undertaken using a mind mapping
approach in order to understand how information and knowledge were spread around the various
agents (including the human players, artefacts, products and materials). This analysis was split into
three parts: Situation Awareness (SA) requirements analysis, analysis of SA during planning tasks
and a corresponding analysis for operational execution tasks. The SA requirements analysis indicated
that the system is not designed to support the range of distinct user SA requirements present in the
MP/BM system. The analysis of the DSA during the planning phases revealed questions about the
timeliness and accuracy of information, the tempo of planning in digital MP/BM, the accuracy of
information from digital MP/BM (as staff had to engage in additional checking activities) and the
poor support for different SA requirements in the different planning cells. Analysis of the operational execution tasks revealed that the Local Operational Picture (LOP) was often out of date or spurious
(clarification of Own Situation Position Reports (OSPR) data was required, placing more load on
the secure voice channel for updates of the blue force positions) and that there was a low level
of trust in the LOP and OSPR (requiring the operations cell to compensate for digital MP/BM’s
shortcomings by drawing blue force positions directly on to the Smartboard – but these were wiped
off every time digital MP/BM updated or was changed). In summary, it was found that DSA was
not well supported by digital MP/BM as different personnel have different SA requirements, which
are subject to change, depending upon their tasks and goals at any moment in time.
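A propositional network of this kind is, at root, a graph of information elements linked by use, plus a record of which agents hold which elements. The sketch below illustrates the idea; every element name and agent label in it is invented for illustration, and the networks actually derived in the study are those of Figures 6.5 to 6.8.

```python
# Toy propositional network: information elements as nodes, edges where one
# element is used with another, and each node tagged with the agents assumed
# to hold it. All names below are invented.

edges = [
    ("enemy position", "avenue of approach"),
    ("avenue of approach", "decision point"),
    ("own position", "decision point"),
    ("decision point", "synchronisation matrix"),
]

holders = {
    "enemy position": {"int cell", "ops cell"},
    "avenue of approach": {"int cell"},
    "own position": {"ops cell"},
    "decision point": {"ops cell", "commander"},
    "synchronisation matrix": {"commander"},
}

def linked(element):
    """Information elements directly connected to the given element."""
    return {b for a, b in edges if a == element} | \
           {a for a, b in edges if b == element}

def shared_holders(a, b):
    """Agents holding both elements - a crude index of compatible SA."""
    return holders[a] & holders[b]

# The decision point depends on elements held by different cells, so its
# accuracy rests on cross-cell exchange (voice, overlay or digital system).
for neighbour in sorted(linked("decision point")):
    print(neighbour, "->",
          shared_holders("decision point", neighbour) or "no common holder")
```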
Chapter seven considers the analysis of networks in digital Network Enabled Technology. The
ideas behind self-synchronisation of people in networks adapting to changes in the environment
are presented. Social Network Analysis (SNA) offers the means to populate the NATO SAS-050
model of command and control with live data, ascribing network metrics to each of the NATO
model’s three primary axes: patterns of interaction (from fully-hierarchical to fully-distributed),
distribution of information (from tight-control to broad-control) and decision rights (from unitary
to peer-to-peer). This usefully extends the model and enables it to meet several critical needs: firstly, to understand not where a command and control organisation formally places itself in the model but where it ‘actually’ places itself; secondly, to see how the command and control organisation’s position changes as a result of function and time; and finally, to understand the match between the place(s) the organisation occupies in the so-called ‘approach space’ and how these map onto a corresponding ‘problem space’. In this respect the analysis revealed a mismatch, which powerful
examples of emergent user behaviour (in terms of unexpected system use) tried to resolve. In other
words, the human-system interaction proved to be highly unstable, but the good news was that
the underlying communications architecture was able to facilitate rapid reconfigurations. Using
SNA to numerically model the behaviour of the system/organisation provides insight into tempo
(with characteristic patterns of reconfigurations in evidence), agility (as modelled by the different
areas occupied by the organisation in the NATO model) and self-synchronisation (as evidenced by
emergent behaviours). As well as these insights, the modelling work now provides a benchmark
for future iterations of the system.
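To illustrate how network metrics can place an organisation on one of the NATO model’s axes, the sketch below computes degree centralisation and density from a toy communications log. The log, the node names and the reading given to the numbers are all assumptions for the purpose of illustration; the study itself used live exercise data and a fuller set of metrics.

```python
# Sketch: locating an organisation on the hierarchical-to-distributed axis of
# the NATO SAS-050 approach space from a communications log (invented data).
from collections import Counter
from itertools import combinations

comms_log = [("CO", "CoS"), ("CoS", "ops"), ("CoS", "int"),
             ("CoS", "sigs"), ("ops", "int")]

ties = {frozenset(pair) for pair in comms_log}   # who talked to whom, undirected
nodes = sorted({v for tie in ties for v in tie})
n = len(nodes)

degree = Counter()
for tie in ties:
    for v in tie:
        degree[v] += 1

# Freeman degree centralisation: 1.0 for a pure star (fully hierarchical),
# 0.0 when every node has the same degree (fully distributed).
max_deg = max(degree.values())
centralisation = sum(max_deg - degree[v] for v in nodes) / ((n - 1) * (n - 2))

# Density: observed ties as a fraction of all possible ties.
density = len(ties) / len(list(combinations(nodes, 2)))

print(f"n={n}, centralisation={centralisation:.2f}, density={density:.2f}")
# n=5, centralisation=0.83, density=0.50 - a strongly hub-centred network
```

Tracking such figures over successive planning cycles is one way the characteristic patterns of reconfiguration mentioned above could be made visible.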
In chapter eight the ‘look and feel’ of the MP/BM’s Human Computer Interface is assessed
for compliance with EEMUA 201 guidelines. EEMUA 201 represents accepted industry
best practice for the design and operation of control system Human Computer Interfaces.
Ideally, the interface for digital MP/BM should be designed to allow staff officers and clerks
to conduct their tasks effectively and efficiently. This means in turn that it should conform to
their expectations and allow them to find information and perform tasks in a straightforward
manner. The EEMUA guidelines are therefore an excellent basis for a review of existing
systems or for the design of new systems. EEMUA 201 covers all of the important Human
Factors concerns in this setting, such as: the number of screens, navigation techniques, use
of windows, screen format and layout considerations. The findings of this analysis show that, measured against the 35 EEMUA 201 principles, the digital MP/BM met only eight of them.
Twelve principles were partially met (for which some improvements to the current system are
recommended), whilst a further eight principles failed to be met at all (for which significant
shortcomings in design were identified). A further seven of the EEMUA 201 principles were
deemed not applicable to digital MP/BM.
Usability assessment was undertaken with a Human Computer Interaction (HCI) questionnaire
in chapter nine, which was completed by the staff officers and clerks who used the digital MP/BM
at BG (13 respondents) and Bde levels (26 respondents). There were fewer staff in BG, which was
reflected in the respondent numbers. The questionnaire comprised nine main sections designed to
assess the usability of a particular device or system:
• visual clarity (the clarity with which the system displays information);
• consistency (that is, consistency of the interface in terms of how it looks, the ways in which
it presents information and also the ways in which users perform tasks);
• compatibility (that is, the system’s compatibility with other related systems);
• informative feedback (that is, the level, clarity and appropriateness of the feedback provided
by the system);
• explicitness (that is, the clarity with which the system transmits its functionality, structure
and capability);
• appropriate functionality (that is, the level of appropriateness of the system’s functionality
in relation to the activities that it is used for);
• flexibility and control (that is, the flexibility of the system, and the level of control that the
user has over the system);
• error prevention and correction (that is, the extent to which the system prevents user errors
from either being made or impacting task performance); and
• user guidance and support (that is, the level of guidance and support that the system provides
to its end users).
The overall ratings were generally lower at BG, but even at Bde level the overall ratings failed
to go beyond neutral. The system was rated particularly low on ‘explicitness’ and ‘error prevention
and correction’. This is mainly because the personnel using the system did not find it intuitive, with
some losing work altogether due to inadvertent errors.
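The analysis behind ratings of this kind is straightforward: because the questionnaire items are ordinal Likert scales, medians rather than means are reported, both pooled and split by HQ group. The sketch below shows the shape of that computation on invented responses for two of the nine sections; the rating scale shown is an assumption.

```python
# Sketch of the questionnaire analysis: median Likert ratings per section,
# pooled and by HQ group. Responses are invented; scale assumed 1 (very
# poor) to 5 (very good).
from statistics import median

responses = {
    "BG":  {"visual clarity": [2, 3, 2, 3], "explicitness": [1, 2, 2, 1]},
    "Bde": {"visual clarity": [3, 3, 4, 3], "explicitness": [2, 2, 3, 2]},
}

sections = sorted({s for group in responses.values() for s in group})
for section in sections:
    pooled = [r for group in responses.values() for r in group[section]]
    by_group = {g: median(vals[section]) for g, vals in responses.items()}
    print(f"{section}: overall median={median(pooled)}, by group={by_group}")
```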
For the sake of completeness, chapter ten presents assessments of the physical environment
within which digital MP/BM was being used. This was not intended to inform the design of digital
MP/BM, rather it was to consider if the surrounding environment met with current standards for
control centres (that is, BS/EN/ISO 11064 Environmental Requirements for Control Centres).
Whilst the comfortable and benign operational environment found in civilian domains, and to which current best practice and guidelines apply, may not be directly relevant to military domains, there
remains an inviolable duty of care and opportunities to learn lessons. From a human performance
point of view, the command and control environment is generally too cold. Digitisation, and the
requirement this brings for sedentary computer-based work, is unlikely to improve this situation.
Noise levels approach harmful long-term exposure levels and sustained levels are well in excess
of best practice. Air quality (and associated low-level health symptoms) is poor. Lighting would also
fail to meet comparable civilian standards; it is too dark overall but has poor directivity, meaning
that, paradoxically, there is also too much glare. Given the safety critical nature of the tasks being
undertaken, comparison against acknowledged best practice sees the present environment as being
sub-optimal for safe and efficient human performance.
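As a point of reference for the thermal findings above, the curve plotted in Figure 10.1 follows, we assume, the standard ISO 7730 relation between the Predicted Mean Vote (PMV) and the Predicted Percentage of Dissatisfied occupants (PPD):

\[
\mathrm{PPD} = 100 - 95\,\exp\!\left(-\left(0.03353\,\mathrm{PMV}^{4} + 0.2179\,\mathrm{PMV}^{2}\right)\right)
\]

Even at thermal neutrality (PMV = 0) the relation predicts 5 per cent of occupants dissatisfied, and the percentage climbs steeply as conditions depart from neutral in either direction, which is consistent with the ‘generally too cold’ finding reported above.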
Chapter eleven of the book presents a summary of all the preceding chapters, drawing all of
the main findings together. Conclusions on the extent to which the digital MP/BM system meets
Human Factors and Ergonomics criteria are drawn from the Constraint Analysis, HTA, DSA, SNA, Supervisory Control and Data Acquisition (SCADA) Analysis, Usability Analysis and Environmental
Survey. Taken together, this represents a very thorough assessment of the new system; digitisation brings with it additional requirements that have important ramifications, and therefore cannot be undertaken
lightly. The recommendations for short-term improvements in the current generation of digital
MP/BM system are divided into five sections: general design improvements, user-interface design
improvements, hardware improvements, infrastructure improvements and support improvements.
Looking forward to next generation digital MP/BM systems, general Human Factors design
principles are presented and Human Factors issues in digitising mission planning are considered.
Future system design would do well to consider the Human Factors methods presented in chapter
two, the understanding gained from mission planning demands and constraints in chapter three,
together with the insights gained in the various analyses from chapters four to ten. The design of the
digital MP/BM system should not become one of the operational constraints in mission planning
and battlespace management. The science and practice of Human Factors has much to offer in
terms of resolving this situation, provided that it is applied at the beginning of (and then throughout) the system design lifecycle, rather than at the end when its impact is significantly diminished.
Chapter 2
Human Factors in System Design
Human Factors
Human Factors and Ergonomics have over 100 years of history in the UK and USA, from humble
beginnings at the turn of the last century to the current day. A detailed account of the historical
developments in both the USA and UK may be found in Meister (1999). This account even
covers the pre-history of the discipline. To cut a long story short, the discipline emerged out of
the recognition that analysis of the interaction between people and their working environment
revealed how work could be designed to reduce errors, improve performance, improve quality of
work and increase the work satisfaction of the workers themselves. Two figures stand out at the
early beginnings of the discipline in the 1900s, Frank and Lillian Gilbreth (Stanton, 2006). The
Gilbreths sought to discover more efficient ways to perform tasks. By way of a famous example
of their work, they observed that bricklayers tended to use different methods of working. With
the aim of seeking the best way to perform the task, they developed innovative tools, job aids
and work procedures. These changes to the work reduced the laying of a brick dramatically, from approximately 18 movements by the bricklayer down to some four, so the task was performed much more efficiently. This analysis
amongst others led to the discovery of ‘laws of work’, or ‘Ergo-nomics’ as it was called (Oborne,
1982). Although the discipline has become much more sophisticated in the way it analyses work
(as indicated in the next section), the general aims to improve system performance and quality of
working life remain the principal goals. Human Factors and Ergonomics has been defined variously
as ‘the scientific study of the relationship between man and his working environment’ (Murrell,
1965), ‘a study of man’s behaviour in relation to his work’ (Grandjean, 1980), ‘the study of how
humans accomplish work-related tasks in the context of human-machine systems’ (Meister, 1989),
‘applied information about human behaviour, abilities, limitations and other characteristics to the
design of tools, machines, tasks, jobs and environments’ (Sanders & McCormick, 1993), and ‘that
branch of science which seeks to turn human-machine antagonism into human-machine synergy’
(Hancock, 1997). From these definitions, it may be gathered that the discipline of Human Factors
and Ergonomics is concerned with: human capabilities and limitations, human-machine interaction,
teamwork, tools, machines and material design, environments, work and organisational design. The
definitions also place some implied emphasis on system performance, efficiency, effectiveness,
safety and well-being. These remain important aims for the discipline.
The role Ergonomics has to play in the design of displays of information and input controls is
particularly pertinent to the contents of this book, as the main focus is on the design of digital Mission
Planning and Battlespace Management (MP/BM) systems. The genesis of Ergonomics in military
systems display and control design can be traced back to the pioneering works of Paul Fitts and
Alphonse Chapanis in aviation. Chapanis (1999) recalls his work at the Aero Medical Laboratory in
the early 1940s where he was investigating the problem of pilots and co-pilots retracting the landing
gear instead of the landing flaps after landing. His investigations in the B-17 (known as the ‘Flying
Fortress’) revealed that the toggle switches for the landing gear and the landing flaps were identical
and next to each other. Chapanis’s insight into human performance enabled him to understand how the
pilot might have confused the two toggle switches, particularly after the stresses of a combat mission.
He proposed coding solutions to the problem: separating the switches (spatial coding) and/or shaping
the switches to represent the part they control (shape coding), so the landing flap switch resembles a
‘flap’ and the landing gear switch resembles a ‘wheel’. Thus the pilot can tell by looking at, or touching,
the switch what function it controls. In his book, Chapanis also proposed that the landing gear switch
could be deactivated if sensors on the landing struts detected the weight of the aircraft.
Grether (1949) reports on the difficulties of reading the traditional three-needle altimeter which
displays the height of the aircraft in three ranges: the longest needle indicates 100s of feet, the
broad pointer indicates 1,000s of feet and the small pointer indicates 10,000s of feet. The work of
Paul Fitts and colleagues had previously shown that pilots frequently misread the altimeter. Numerous fatal and non-fatal accidents had been attributed to this error. Grether devised an experiment
to see if different designs of altimeter could have an effect on the interpretation time and the error
rate. If misreading altimeters really was a case of ‘designer error’ rather than ‘pilot error’, then
different designs should reveal different error rates. Grether tested six different variations of the
dial and needle altimeter containing combinations of three, two and one needles with and without
an inset counter as well as three types of vertically moving scale (similar to a digital display). Pilots
were asked to record the altimeter reading. The results of the experiment showed that there were
marked differences in the error rates for the different designs of the altimeters. The data also show
that displays that took longer to interpret produced more errors. The traditional three-
needle altimeter took some 7 seconds to interpret and produced over 11 per cent errors of 1,000
feet or more. By way of contrast, the vertically moving scale altimeters took less than 2 seconds to
interpret and produced less than 1 per cent errors of 1,000 feet or more.
Both of these examples, one from control design and one from display design, suggest that it is not
‘pilot error’ that causes accidents; rather it is ‘designer error’. This notion of putting the blame on the
last person in the accident chain (for example, the pilot), has lost credibility in modern Ergonomics.
Modern day researchers take a systems view of error, by understanding the relationships between all
the moving parts in a system, both human and technical, from concept, to design, to manufacture,
to operation and maintenance (including system mid-life upgrades) and finally to dismantling and
disposal of the system. These stages map nicely onto the UK MoD’s CADMID life cycle stages
(Concept – Assessment – Demonstration – Manufacture – In-service – Disposal).
The term ‘Human Factors’ seems to have come from the USA to encompass any aspect of
system design, operation, maintenance and disposal that has a bearing on input or output. The terms
Human Factors and Ergonomics are often used interchangeably or together. In the UK, Ergonomics
is mostly used to describe physiological, physical, behavioural and environmental aspects of human
performance whereas Human Factors is mostly used to describe cognitive, social and organisational
aspects of human performance. Human Factors and Ergonomics encompass a wide range of topics
in system design, including: Manpower, Personnel, Training, Communications Media, Procedures,
Team Structure, Task Design, Allocation of Function, Workload Assessment, Equipment Design,
System Safety and Health Hazards. The term Human Factors will be used throughout this book,
although this may also mean Ergonomics. Modern day Human Factors focuses on integration with
other aspects of System Engineering. According to the UK MoD, Human Factors Integration is
about ‘... providing a balanced development of both the technical and human aspects of equipment
provision. It provides a process that ensures the application of scientific knowledge about human
characteristics through the specification, design and evaluation of systems.’ (MoD, 2000, p. 6). This
book focuses on the examination of a digital command and control system that was developed for
both mission planning and battlespace management. Human Factors methods have been developed
over the past century, to help design and evaluate new systems. These methods are considered in
the following section.
Human Factors Methods
Human Factors methods are designed to improve product design by understanding or predicting
user interaction with the devices (Stanton & Young, 1999); these approaches have a long tradition
in system design and tend to have greater impact (as well as reduced cost) when applied early on in
the design process (Stanton & Young, 1999), long before the hard-coding and hard-build have begun.
The design life cycle has at least ten identifiable stages from the identification of product need up
to product release, namely: identification of design need, understanding the potential context of
product use, development of concepts, presentation of mock-ups, refinement of concepts, start
of coding and hard-build, iterative design process, release of a prototype, minor refinements and
then release of the first version of the product. As illustrated in Figure 2.1 by the brown shaded
area, there is often far too little Human Factors effort involved far too late in the design process
(the dark shaded area is a caricature of the Human Factors effort involved in developing the digital
MP/BM system). Ideally, the pattern should be reversed, with the effort front-loaded in the project
(as illustrated by the light shaded area). Such a strategy would undoubtedly have led to a much
improved digital MP/BM system, at considerably reduced cost and with more timely delivery.
As a rough heuristic, the more complex a product is, the more important Human Factors input becomes. Complexity is not a binary state of either ‘complex’ or ‘not complex’; the level of
complexity lies on a non-numerical scale that can be defined through a set of heuristics (Woods,
1988):
• Dynamism of the system: To what extent can the system change states without intervention
from the user? To what extent can the nature of the problem change over time? To what
extent can multiple ongoing tasks have different time spans?
• Parts, variables and their interconnections: The number of parts and the extensiveness of
interconnections between the parts or variables. To what extent can a given problem be due to
multiple potential causes and to what extent can it have multiple potential consequences?
• Uncertainty: To what extent can the data about the system be erroneous, incomplete or
ambiguous – how predictable are future states?
• Risk: What is at stake? How serious are consequences of users’ decisions?
The environment that the command and control system operates within can be seen to be highly
complex. The system is dynamic as it frequently changes in unexpected ways, there are a huge
number of interconnected parts within the system, data within the system is frequently incorrect or
out of date, and the risk inherent in the system is ‘life or death’.
It is to the British Army’s immense credit that even the most recalcitrant of equipment issues
‘can be made to work’, but the new era of networked interoperability (and the complexity this
brings) challenges even this ability. Whilst convoluted ‘workarounds’ may appear to overcome
some of the system design problems in the short term, opportunities may be lost in gaining greater
tempo, efficiency, effectiveness, flexibility and error reduction. Indeed, this practice is, arguably,
fast becoming an optimum strategy for increasing error potential, reducing tempo, efficiency
and operational effectiveness. A new era of networked interoperability requires a new approach,
one that confronts the challenges of harnessing human capability effectively using structured
methodologies.
There are a wide range of Human Factors methods available for the analysis, design and
evaluation of products and systems. A detailed description of the most commonly used approaches
can be found in Stanton et al. (2005a, b). The choice of method used is influenced by a number
of factors; one of these factors is the stage in the design process. By way of an example of how
Figure 2.1 Illustration showing Human Factors effort is better placed in early stages of the
design process
structured approaches to human/system integration can be employed, Figure 2.2 relates a number
of specific methods to the design life cycle.
Figure 2.2 shows at least 11 different types of Human Factors methods and approaches that
can be used through the design life cycle of a new system or product (Stanton & Young, 1999;
Stanton et al., 2005a, 2005b). As the figure shows, many of these are best applied before the
software coding and hard-build of a system starts. The approach places emphasis on the analysis
and development of early prototypes. The assessment described within this book has come very
late in the design process, far too late to make fundamental changes in system design; even small
design modifications would be costly at this stage. It is extremely likely that an early involvement
of Human Factors expertise and methodology would have resulted in a better implementation of
the digital MP/BM system. The Human Factors methods advocated within the design life cycle
are described further below. The methods starred (thus*) were applied to the case study presented
within this book.
Cognitive Work Analysis (CWA)* is a structured framework for considering the development
and analysis of complex socio-technical systems. The framework leads the analyst to consider the
environment the task takes place within and the effect of the imposed constraints on the system’s
ability to perform its purpose. The framework guides the analyst through the process of answering
the questions of why the system exists and what activities are conducted within the domain, as well
as how this activity is achieved and who is performing it. The analysis of constraints provides the
basic formulation for development of the early concept for the system and the likely activities of
the actors within it. Thus CWA offers a formative design approach.
Systems design methods are often used to provide structure to the design process, and also to
ensure that the end-user of the product or system in question is considered throughout the design
process. For example, allocation of function analysis is used by system designers to determine
whether jobs, tasks and system functions are allocated to human or technological agents within
a particular system. Focus group approaches use group interviews to discuss and assess user
opinions and perceptions of a particular design concept. In the design process, design concepts are
evaluated by the focus group and new design solutions are offered. Scenario-based design involves
the use of scenarios or storyboard presentations to communicate or evaluate design concepts. A set
of scenarios depicting the future use of the design concept are proposed and performed, and the
design concept is evaluated. Scenarios typically use how, why and what-if questions to evaluate
and modify a design concept.
Figure 2.2 Application of Human Factors methods by phase of the design process
Workload, Error, Situation Awareness, Time and Teamwork (WESTT) is a Human Factors
tool produced under the aegis of the HFI DTC. The aim of the tool is to integrate a range of Human
Factors analyses around a tripartite, closely-coupled network structure. The three networks are Task, Knowledge and Social networks, and these are analysed to identify their likely effects on system
performance. The WESTT tool developed by the HFI DTC models potential system performance
and therefore enables the analyst to consider alternative system structures.
Usability testing* methods are used to consider the usability of software on three main
dimensions from ISO9241-11: effectiveness (how well does the product performance meet the
tasks for which it was designed?); efficiency (how much resource, for example, time or effort, is
required to use to the product to perform these tasks?) and attitude (for example, how favourably do
users respond to the product?). It is important to note that it is often necessary to conduct separate
evaluations for each dimension rather than using one method and hoping that it can capture all
aspects.
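As a small illustration of why the three dimensions are measured separately, the sketch below scores each from a different kind of observation: task completion for effectiveness, time on task for efficiency and a rating scale for attitude. The trial data are invented.

```python
# Illustration of the three ISO 9241-11 dimensions scored separately:
# task completion (effectiveness), time on task (efficiency) and a
# rating scale (attitude). Trial data below are invented.

trials = [
    {"completed": True,  "seconds": 184, "attitude": 3},
    {"completed": False, "seconds": 420, "attitude": 2},
    {"completed": True,  "seconds": 205, "attitude": 4},
]

effectiveness = sum(t["completed"] for t in trials) / len(trials)  # success rate
efficiency = sum(t["seconds"] for t in trials) / len(trials)       # mean time on task
attitude = sum(t["attitude"] for t in trials) / len(trials)        # mean rating (1-5)

print(f"effectiveness={effectiveness:.0%}, "
      f"efficiency={efficiency:.0f}s, attitude={attitude:.1f}/5")
```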
Human Error Identification (HEI)* methods can be used either during the design process
to highlight potential design induced error, or to evaluate error potential in existing systems. HEI
works on the premise that an understanding of an employee’s work task and the characteristics of
the technology being used allows us to indicate potential errors that may arise from the resulting
interaction (Baber and Stanton, 1996). The output of HEI techniques usually describes potential
errors, their consequences, recovery potential, probability, criticality and offers associated design
remedies or error reduction strategies. HEI approaches can be broadly categorised into two groups,
qualitative and quantitative techniques. Qualitative approaches are used to determine the nature
of errors that might occur within a particular system, whilst quantitative approaches are used to
provide a numerical probability of error occurrence within a particular system. There is a broad
range of HEI approaches available to the HEI practitioner, ranging from simplistic External Error
Mode (EEM) taxonomy-based approaches to more sophisticated human performance simulation
techniques.
Hierarchical Task Analysis (HTA)* is used to describe systems in terms of their goals and
sub-goals. HTA works by decomposing activities into a hierarchy of goals, subordinate goals,
operations and plans, which allows systems to be described exhaustively. There are at least 12
additional applications to which HTA has been put, including interface design and evaluation,
training, allocation of functions, job description, work organisation, manual design, job aid design,
error prediction and analysis, team task analysis, workload assessment and procedure design.
These extensions make HTA particularly useful in system development when the design has begun
to crystallise.
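The goal hierarchy itself is straightforward to represent as a nested data structure, as the following sketch shows. The goal names and plan strings are hypothetical, and a full HTA would also record plans and stopping rules at every level of the decomposition.

```python
# A minimal sketch of an HTA as a nested goal hierarchy with per-level plans.
from dataclasses import dataclass, field

@dataclass
class Goal:
    name: str
    plan: str = ""                       # e.g. "do 1 then 2; repeat until done"
    subgoals: list["Goal"] = field(default_factory=list)

def walk(goal: Goal, depth: int = 0) -> None:
    """Print the hierarchy with indentation, HTA-style."""
    suffix = f"  [plan: {goal.plan}]" if goal.plan else ""
    print("  " * depth + goal.name + suffix)
    for sub in goal.subgoals:
        walk(sub, depth + 1)

hta = Goal("0. Produce BG plan", "do 1 to 3 in order",
           [Goal("1. Analyse enemy"),
            Goal("2. Analyse terrain"),
            Goal("3. Mission analysis", "read each line; deduce tasks")])
walk(hta)
```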
Interface Evaluation* methods are used to assess the human-machine interface of a particular
system, product or device. These methods can be used to assess a number of different aspects
associated with a particular interface, including user performance, user satisfaction, error, layout,
labelling, and the controls and displays used. The output of interface analysis methods is then
typically used to improve the interface through redesign. Such techniques are used to enhance
design performance, through improving the device or system’s usability, user satisfaction, and
reducing user errors and interaction time.
Design and Test studies are needed to determine if any measured differences between the new
systems and their baselines are real, statistically significant, differences that are likely to generalise
beyond the cases studied. There are two broad approaches for Design and Test studies: quantitative
and qualitative. Quantitative testing is a formal, objective, systematic process in which numerical data are used to obtain information; it tends to produce data that compare one design with another, or a design against a benchmark. Qualitative testing considers
opinions and attitudes toward designs. Whilst the attitudes can be measured on scales, often the
approach involves an in-depth understanding of the reasons underlying human behaviour. Whilst
quantitative studies are concerned with relative differences in performance, qualitative studies are
concerned with the reasons for those differences. Typically quantitative research requires larger
random samples whereas qualitative research requires smaller purposely selected samples.
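As a sketch of the quantitative case, a comparison of a new system against its baseline might look as follows. The completion times are entirely hypothetical, and Welch's t-test (via the scipy library) is chosen here merely as one common option, not as the method used in the studies reported in this book.

```python
# A minimal quantitative-testing sketch: compare task completion times for a
# new system against its baseline with an independent-samples t-test.
from scipy import stats

baseline_s = [212, 198, 240, 225, 231, 208]   # completion times, baseline system
digital_s = [185, 190, 170, 205, 178, 192]    # completion times, new system

t, p = stats.ttest_ind(digital_s, baseline_s, equal_var=False)  # Welch's t-test
print(f"t={t:.2f}, p={p:.3f}")
if p < 0.05:
    print("difference unlikely to be chance at the 5% level")
```

A real study would also report effect size and check the test's assumptions, and it would use the larger random samples the paragraph above calls for.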
Teamwork Assessment* methods are used to analyse those instances where actors within a
team or network coordinate their behaviour in order to achieve tasks related to the team’s goals.
Team-based activity involves multiple actors with multiple goals performing both teamwork
and task-work activity. The activity is typically complex (hence the requirement for a team) and
may be dispersed across a number of different geographical locations. Consequently there are a
number of different team performance techniques available to the Human Factors practitioner,
each designed to assess certain aspects of team performance in complex systems. The team
performance techniques can be broadly classified into the following categories: team task analysis
techniques; team cognitive task analysis techniques; team communication assessment techniques;
team Situation Awareness (SA) measurement techniques; team behavioural assessment techniques;
and team Mental Work Load (MWL) assessment techniques.
Workload Assessment should be used throughout the design life cycle, to inform system
and task design as well as to provide an evaluation of workload imposed by existing operational
systems and procedures. There are a number of different workload assessment procedures
available. Traditionally, using a single approach to measure workload has proved inadequate, and
as a result a combination of the methods available is typically used. The assessment of workload
may require a battery of techniques, including primary task performance measures, secondary task
performance measures (reaction times, embedded tasks), physiological measures and subjective
rating techniques.
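By way of illustration of the subjective rating element of such a battery, the sketch below scores the widely used NASA-TLX, in which six subscales rated 0 to 100 are weighted by the outcomes of 15 pairwise comparisons. The ratings and weights shown are invented for the example.

```python
# A sketch of one widely used subjective rating technique, the NASA-TLX:
# six subscales rated 0-100, weighted by how often each subscale was chosen
# across the 15 pairwise comparisons. All values below are hypothetical.
ratings = {"mental": 70, "physical": 20, "temporal": 80,
           "performance": 40, "effort": 65, "frustration": 55}
weights = {"mental": 4, "physical": 0, "temporal": 5,
           "performance": 2, "effort": 3, "frustration": 1}

assert sum(weights.values()) == 15  # each of the 15 pairs yields one tally
overall = sum(ratings[d] * weights[d] for d in ratings) / 15
print(f"overall weighted workload: {overall:.1f} / 100")
```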
Situation Awareness (SA)* refers to an individual’s, team’s or system’s awareness of ‘what
is going on’ (Endsley, 1995a). SA measures are used to assess the level of awareness during task performance. The assessment of SA can be used throughout the design life cycle, either to determine the SA requirements a new design must support or to evaluate the SA afforded by existing systems.
Mission Planning and Battlespace Management
Mission failure is often thought to be the result of poor mission planning (Levchuk et al., 2002), which places considerable demands on the planners and the planning process. This observation is further compounded by two general principles of warfare. The first is the 'fog of war' (that is, the many uncertainties about the true nature of the environment; Clausewitz, 1832) and the second is that 'no battle plan survives contact with the enemy' (that is, no matter how thorough the planning, the enemy is unlikely to be compliant and may act in unpredictable ways; von Moltke, undated). These three tenets (that is, the effects of uncertainty, the enemy and failure on mission planning) require the planning process to be robust, auditable and flexible. Mission planning has to be a continuous, iterative and adaptable process, optimising mission goals, resources and constraints (Levchuk et al., 2002). Roth et al. (2006) argue that the defining characteristic
of command and control is the continual adaptation to a changing environment. Constant change
in the goals, priorities, scale of operations, information sources and systems being used means that
the planning systems need to be extremely adaptable to cope with these changes. According to
Klein and Miller (1999) there are many constraints acting on mission planning, including scarcity
of resources, time pressure, uncertainty of information, availability of expertise and the structure
of the tasks to be undertaken. Mission planning requires knowledge of the domain, objects in the
domain and their relationships as well as the constraints acting on the domain, the objects and their
relations (Kieweit et al., 2005). Klein and Miller (1999) also note that the planning cycles can
range from a couple of hours to a few days depending upon the complexity of the situation and the
time available. Given all of the constraints acting on the planning process and the need for the plan
to be continually revised and modified in the light of the enemy actions and changing situation,
Klein and Miller (1999) argue that ‘simpler plans might allow better implementation and easier
modification’ (p. 219). This point is reinforced by Riley et al. (2006) who assert that ‘plans need
to be simple, modifiable, flexible, and developed so that they are quickly and easily understood’
(p. 1143).
Mission planning is an essential and integral part of battle management. Although there are
some differences within and between the armed services (and the coalition forces) in the way
they go about mission planning, there are also some generally accepted aspects that all plans
need to assess. These invariants include: enemy strength, activity and assumed intentions,
the goals of the mission, analysis of the constraints in the environment, the intent of the
commander, developing Courses of Action (CoAs), choosing a CoA, identifying resources
requirements, synchronising the assets and actions, and identifying control measures. A
summary of the planning process for the US land military may be found in Riley et al. (2006), and for the Canadian land forces in Prefontaine (2002). Their descriptions have much in common with land-based planning in the British Army, which is described in The Combat Estimate booklet (CAST, 2007).
The mission planning process has been observed by the authors at the Land Warfare Centre at
Warminster and on training exercises in Germany. The observations at Warminster were made both as participant-observers and as non-participant observers. This section describes the observed activities in the
planning process following a Warning Order (WO) received from Brigade (Bde). For the purpose
of this analysis, only the conventional materials (whiteboards, maps, overlays, paper, flipcharts and
staff officers’ notebooks) will be examined. As Figure 3.1 shows, the planning is undertaken in a 'public' environment where various people contribute and all can view the products. This 'public' nature of the products is particularly useful at briefings, encouraging collaboration and cooperation. It also helps to focus the planners' minds on the important issues and the command intent.
The WO arrived and was handed to the Chief of Staff (CoS), who read the whole document first, highlighting material relevant to themselves and to Company level.
The WO was too detailed for Company level, so some editing by CoS was necessary, as well as the
inclusion of some additional material to clarify the anticipated task requirements.
The modified and edited WO was then sent to the companies below the Battle Group (BG), so
that they would have advance notice of the intention of the orders when they arrived. This gives
them an opportunity to prepare in advance of the actual orders.
The CoS created a planning timeline for the production of a plan to defeat an assault team that had
parachuted into their area. There were 2 hours available to construct the plan (from 13:00 to 15:00),
which meant approximately 17 minutes per question (of the Combat Estimate’s seven questions).
The planning timeline was drawn on a flipchart as shown in Figure 3.2.
Question 1 was undertaken by the Engineer and the Intelligence Officer in parallel with question
2. Key terrain features were marked (such as slow-go areas like forests and rivers), as were the
approximate disposition of the enemy forces and likely locations, potential Avenues of Approach
(AoAs), and likely CoA (see Figure 3.3). In this case, it was thought that the enemy assault force
was likely to try and meet up with the main armoured forces approaching from the west. The enemy had landed in an area surrounded by forest which gave them some protection, although it was thought that they had not landed where they intended.
Figure 3.2 Planning timeline drawn on a flipchart: Battle Group level plan; two hours of planning time; 120/7 ≈ 17 minutes per question
Figure 3.3 Terrain and enemy analysis marked on the map: enemy location and strength, mobility corridors, go and slow-go areas, and known concepts and doctrine
The CoS interpreted the orders from Bde together with the BG Commander to complete the Mission
Analysis. Each line of the orders was read and the specified and implied tasks were deduced.
These were written by hand on to a whiteboard as shown in Figure 3.4. The Commander’s Critical
Information Requirements (CCIRs) and Information Requests (IRs) were identified and noted for
each task, when appropriate.
When the CCIRs/IRs had been completed, the CoS read them off the Mission Analysis
whiteboard (expanding where necessary to improve intelligibility) to a clerk who typed them
directly on to the Requests For Information (RFI) sheet. The requests were radioed up to Bde and
the responses were tracked on the whiteboard.
The CO then drew their required effects on to a flipchart (see Figure 3.5). Three effects were placed
above the planning line (SCREEN, CLEAR and DEFEAT) and four effects were placed below
the planning line (SCREEN, DEFEAT, GUARD and DEFEND). The two SCREEN effects were
placed to prevent the enemy from the west coming to the aid of the group who were being attacked.
Figure 3.4 Mission Analysis whiteboard: interpret orders, identify explicit tasks, deduce implicit tasks, raise CCIRs and RFIs
Figure 3.5 Effects Schematic drawn on a flipchart and laid on the map
The CLEAR effect was intended to remove any enemy from the forest, if they were there. The
DEFEAT effect was intended to incapacitate the enemy.
The CoS and BG Commander worked on three CoAs to achieve the Commander’s effects, as shown
in Figure 3.6. This was a very quick way to propose and compare three potential CoAs in response
to the CO’s Effects Schematic (remembering that the planning timeline only allowed 17 minutes for
each of the seven questions of the Combat Estimate).
Meanwhile the Engineer took the CO's Effects Schematic and put the effects onto the ground, using a TALC (the term is believed to derive from talc mica, a crystalline mineral once used as a glass substitute; in military use it refers to a clear plastic sheet on which map overlays are drawn) on a paper map (see Figure 3.7). Each effect became either a Named Area of Interest (NAI) or a Target Area of Interest (TAI). Decision Points (DPs) were placed between NAIs and TAIs. The resultant overlay is called the Decision Support Overlay (DSO).
It is worth noting that it took the Engineer approximately 15 minutes to construct the DSO on the TALC. Between the end of question four (Where can I best accomplish each action and
effect?) and the start of question five (What resources do I need to accomplish each action and
effect?) the Combat Estimate process was interrupted by the return of the CO, who requested a
briefing. The CO reviewed the CoAs and made some revisions, which was followed by a briefing
by the CoS to all cells in the HQ.
Figure 3.7 Effects placed on the ground: the chosen CoA's SCREEN, CLEAR, DEFEAT, GUARD and DEFEND effects mapped onto the map, with the relations between NAIs, DPs and TAIs checked
The Engineer then constructed the Decision Support Overlay Matrix (DSOM) on paper, taking
the NAIs, TAIs and DPs from the paper map and linking them to each other, their location and
purpose, and the asset that would be used to achieve the effect (see Figure 3.8). There is a clear link between the NAIs, TAIs and DPs on the hand-written flipchart. The manual production of the
DSOM on the paper flipchart offers a process of checking the logic of the DSO, making sure that
the NAIs, TAIs and DPs link together and that the assets are being deployed to best effect (that
is, relating each asset to a purpose as the columns are next to each other in the flipchart version
of the DSOM).
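The same logic check lends itself to a simple mechanical form. The sketch below, with invented NAI/TAI entries, flags DSOM rows that lack a location, purpose, asset or decision point, mirroring what the adjacent flipchart columns let the Engineer verify by eye.

```python
# A minimal sketch of the logic check the flipchart DSOM affords: every
# NAI/TAI row should name a location, purpose, asset and decision point.
# All entries below are hypothetical.
dsom = [
    {"id": "NAI 1", "location": "forest edge", "purpose": "detect enemy move",
     "asset": "recce troop", "dp": "DP 1"},
    {"id": "TAI 1", "location": "route west", "purpose": "defeat armour",
     "asset": "armoured sqn", "dp": "DP 1"},
    {"id": "TAI 2", "location": "bridge", "purpose": "block approach",
     "asset": None, "dp": None},  # incomplete row, should be flagged
]

for row in dsom:
    missing = [k for k in ("location", "purpose", "asset", "dp") if not row[k]]
    if missing:
        print(f"{row['id']}: missing {', '.join(missing)}")
```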
Question 6. When and Where do the Actions Take Place in Relation to Each Other?
The CoS led the discussion of how the force elements would move together through the battle
(through a mixture of forward recce, mounted and dismounted troops, and armoured vehicles) with
logistical support and coordinated indirect fire ahead of them (controlled by the fire control lines
– see question seven). This was enacted on the map from the start position (on the left of Figure
3.9) to the end position (on the right of Figure 3.9) to capture the synchronisation issues, which
were recorded on to the Coordination Measures whiteboard (see Figure 3.10).
The coordination measures were used as a precursor to the construction of the synchronisation
matrix.
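A synchronisation matrix of this kind is, in essence, a grid of force elements against time phases, each cell recording the coordinated action. The sketch below renders a toy version with invented phases, elements and entries to show the structure the coordination measures feed into.

```python
# A minimal sketch of a synchronisation matrix: force elements against time
# phases, each cell holding the coordinated action. Entries are hypothetical.
phases = ["H-hour", "H+1", "H+2"]
elements = ["recce", "armour", "infantry", "indirect fire"]

matrix = {("recce", "H-hour"): "screen west",
          ("indirect fire", "H-hour"): "suppress TAI 1",
          ("armour", "H+1"): "advance to DP 1",
          ("infantry", "H+2"): "clear forest"}

# Print as a simple grid; a dash means no tasked action in that phase.
print(f"{'':14}" + "".join(f"{p:18}" for p in phases))
for e in elements:
    print(f"{e:14}" + "".join(f"{matrix.get((e, p), '-'):18}" for p in phases))
```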
Figure 3.8 Decision Support Overlay Matrix on a flipchart: mapping the relations between NAIs, TAIs and DPs; identifying location, purpose and asset; synchronising actions and coordinating assets
Figure 3.9 Coordination of force elements on map and overlay via a wargame