United States Military Academy
West Point, New York 10996
Simulation Roadmap for Program
Executive Office (PEO) Soldier
OPERATIONS RESEARCH CENTER OF EXCELLENCE
TECHNICAL REPORT NO: DSE-R-0421
DTIC #: ADA425648
Lead Analysts
Captain Eric S. Tollefson, M.S.
Analyst, Operations Research Center
Captain Greg L. Boylan, M.S.
Instructor, Department of Systems Engineering
Senior Investigators
Bobbie L. Foote, Ph.D.
Visiting Professor, Department of Systems Engineering
Paul D. West, Ph.D.
Assistant Professor, Department of Systems Engineering
Directed by
Lieutenant Colonel Michael J. Kwinn, Jr., Ph.D.
Director, Operations Research Center of Excellence
Approved by
Colonel William K. Klimack, Ph.D.
Associate Professor and Acting Head, Department of Systems Engineering
July / 2004
The Operations Research Center of Excellence is supported by the
Assistant Secretary of the Army (Financial Management & Comptroller)
Distribution A; Approved for public release; distribution is
unlimited.
REPORT DOCUMENTATION PAGE - SF298
Form Approved
OMB No. 0704-0188
Public reporting burden for this collection of information is estimated to average 1 hour per response, including the time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing this collection of information. Send comments regarding this burden estimate or any other aspect of this collection of information, including suggestions for reducing this burden to Department of Defense, Washington Headquarters Services, Directorate for Information Operations and Reports (0704-0188), 1215 Jefferson Davis Highway, Suite 1204, Arlington, VA 22202-4302. Respondents should be aware that notwithstanding any other provision of law, no person shall be subject to any penalty for failing to comply with a collection of information if it does not display a currently valid OMB control number. PLEASE DO NOT RETURN YOUR FORM TO THE ABOVE ADDRESS.
1. REPORT DATE (DD-MM-YYYY)
31-07-2004
2. REPORT TYPE
Technical Report
3. DATES COVERED (From - To)
01 Oct 03 - 31 May 04
4. TITLE AND SUBTITLE
Simulation Roadmap for Program Executive Office (PEO) Soldier
5a. CONTRACT NUMBER
5b. GRANT NUMBER
5c. PROGRAM ELEMENT NUMBER
6. AUTHOR(S)
CPT Eric S. Tollefson, CPT Gregory L. Boylan, LTC Michael J. Kwinn, Jr., Dr. Bobbie L. Foote, Dr. Paul D. West
5d. PROJECT NUMBER
DSE-R-0421
5e. TASK NUMBER
5f. WORK UNIT NUMBER
7. PERFORMING ORGANIZATION NAME(S) AND ADDRESS(ES)
Operations Research Center
Dep't of Systems Engineering
US Military Academy
West Point, NY 10996
845-938-5661
8. PERFORMING ORGANIZATION REPORT NUMBER
DSE-R-0421
9. SPONSORING / MONITORING AGENCY NAME(S) AND ADDRESS(ES)
PEO Soldier
5901 Putnam Road, Bldg 328
Fort Belvoir, VA 22060-5422
10. SPONSOR/MONITOR'S ACRONYM(S)
11. SPONSOR/MONITOR'S REPORT NUMBER(S)
12. DISTRIBUTION / AVAILABILITY STATEMENT
Distribution A:
Approved for public release; distribution is unlimited.
13. SUPPLEMENTARY NOTES
14. ABSTRACT
Maintaining an edge during this time of unprecedented technological growth requires that
the Army field Infantry soldier systems quickly.
The risk of doing so without some
assessment of utility, however, is quite high, and the acquisition community must estimate
the operational impact of proposed systems with an increasing degree of accuracy. For this,
the Army has turned to combat simulations.
However, the representation of the individual
soldier in simulations has not kept pace with other representations. We applied the systems
engineering process to support the needs of the Infantry soldier system acquisition community
by identifying the best path forward to utilize and/or develop simulation capabilities that
meet program managers' needs. Our recommendation was that PEO Soldier pursue the enhancement of and linkage between Combat XXI, the Infantry Warrior Simulation, and Objective One Semi-Automated Forces. In this report, we discuss the process that we applied and our results, as well as our initial plan for implementation.
15. SUBJECT TERMS
Combat Simulations, Infantry Soldier Systems, Acquisition
16. SECURITY CLASSIFICATION OF:
a. REPORT
Unclassified
b. ABSTRACT
Unclassified
c. THIS PAGE
Unclassified
17. LIMITATION OF ABSTRACT
None
18. NUMBER OF PAGES
166
19a. NAME OF RESPONSIBLE PERSON
CPT Eric S. Tollefson
19b. TELEPHONE NUMBER (include area
code)
845-938-5663
Standard Form 298 (Rev. 8-98)
Prescribed by ANSI Std. Z39.18
Simulation Roadmap for Program Executive Office (PEO)
Soldier
Lead Analysts
Captain Eric S. Tollefson, M.S.
Analyst, Operations Research Center
Captain Greg L. Boylan, M.S.
Instructor, Department of Systems Engineering
Senior Investigators
Bobbie L. Foote, Ph.D.
Visiting Professor, Department of Systems Engineering
Paul D. West, Ph.D.
Assistant Professor, Department of Systems Engineering
OPERATIONS RESEARCH CENTER OF EXCELLENCE
TECHNICAL REPORT NO: DSE-R-0421
DTIC#: ADA425648
Directed by
Lieutenant Colonel Michael J. Kwinn, Jr., Ph.D.
Director, Operations Research Center of Excellence
Approved by
Colonel William K. Klimack, Ph.D.
Associate Professor and Acting Head, Department of Systems Engineering
July / 2004
The Operations Research Center of Excellence is supported by the Assistant Secretary of the Army
(Financial Management & Comptroller)
This Research was sponsored by: PEO Soldier
Distribution A; Approved for public release; distribution is unlimited.
Abstract
One of the primary challenges facing the United States Army acquisition community is
that of quickly fielding technologically-advanced equipment to the force. The high cost of doing
so brings with it significant risk, as demonstrated by the Crusader artillery and Comanche
helicopter programs, both multi-billion-dollar programs cancelled within the last two years. As a result, program managers must be able to reasonably guarantee the utility of their products early
in the design phase and to continue doing so throughout the product's lifecycle. To that end, the
Army has turned to simulation to evaluate the combat effectiveness of its proposed systems.
Unfortunately, the development of technological capabilities, especially with respect to
information sharing, is outpacing improvements in current combat simulation capabilities.
Moreover, until recently, the focus of the combat modeling community has been on large battlefield platforms and unit-level analyses. As a result, the representation of the individual
soldier on the battlefield has not kept pace with other representations. These Infantry soldier
models require unprecedented fidelity in terms of the Infantry soldier entity and his environment.
The Program Executive Office Soldier (PEO Soldier), the Army program manager for the
acquisition of nearly all the items carried or worn by the Infantry soldier, requires such high-fidelity models of the Infantry soldier in order to evaluate the effectiveness of its products. It has realized that the existing simulation capability is not up to the task.
We used the Systems Engineering and Management Process (SEMP), taught at the
United States Military Academy as the standard problem solving methodology, to identify PEO
Soldier's requirements for an Infantry combat simulation and to recommend a path forward to
them. We recommended that they pursue the modification of and linkage between three
simulation software packages under development: Combat XXI, the Infantry Warrior Simulation, and Objective OneSAF.
Our application of standard systems engineering tools led to a unique characterization of
the requirements of a simulation capability specific to Infantry soldier system acquisition. Those
requirements differ in many ways from Infantry simulations developed for other purposes and
often applied unsuccessfully to the project management domain. Using those requirements, we
were able to describe the ideal simulation and compare current simulations to that model.
In this report, we briefly discuss the problem background and methodology. We then
delineate our findings and results in detail, pointing out the unique insights obtained through the
application of the systems engineering process. Finally, we conclude with our recommendation
to PEO Soldier for addressing their unique project management needs and our plans for
implementation.
About the Authors
CPT ERIC S. TOLLEFSON is an Instructor and Analyst in the Operations Research Center at the United States Military Academy (USMA), West Point. He is an Infantry officer and has served in various positions, including platoon leader, company commander, and staff officer. He holds a BS in Engineering Physics from USMA and an MS in Operations Research from the Georgia Institute of Technology (May 2002).
CPT GREGORY L. BOYLAN is an Instructor in the Department of Systems
Engineering at the USMA, West Point. He is an Infantry officer and has served in various
positions including platoon leader, company executive officer, rifle company commander,
and staff officer. He holds a BS in Environmental Science from USMA and an MS in
Industrial Engineering from the Georgia Institute of Technology.
LTC MICHAEL J. KWINN, JR., Ph.D., is an Associate Professor in the Department of Systems Engineering and Director of the Operations Research Center of Excellence at the United States Military Academy, West Point. He has a B.S. from USMA, an M.S. in Systems and Industrial Engineering from the University of Arizona, and a Ph.D. in Management Science and Information Systems from the University of Texas at Austin. His research interests include operational assessment methodology, efficiency analysis, recruiting analysis (especially marketing effects), and capability analysis and modeling. Lieutenant Colonel Kwinn may be contacted at michael.kwinn@usma.edu.
BOBBIE L. FOOTE, Ph.D., is Senior Faculty in the Department of Systems
Engineering at USMA. Dr. Foote served 32 years as a faculty member at the University of
Oklahoma in Industrial Engineering. He was a two-time director of the department and was
also associate director of the Oklahoma Center for Integrated Design and Manufacturing
located at Oklahoma State University in Stillwater as well as Professor of Industrial
Engineering at OU. After retirement he worked with a team to re-engineer corporations such
as Compaq. He joined USMA in 2001 where he teaches and conducts research in statistics,
modeling and optimization as applied to Army problems.
PAUL D. WEST, Ph.D., is an Assistant Professor and the Director of Technology in
the Department of Systems Engineering at the United States Military Academy at West
Point. His research interests include network-centric systems, combat modeling, and
information visualization. He received a BS from the State University of New York at
Albany, an MBA from Long Island University, a Master of Technology Management from
Stevens Institute of Technology, and an interdisciplinary PhD in Systems Engineering and
Technology Management, also from Stevens.
Acknowledgements
This study was funded by PEO Soldier based on the proposal entitled "Proposal to Develop a Roadmap for Simulation Support of PEO Soldier Programs" and was completed in May 2004.
The information contained here has been obtained with the help of numerous people.
We would specifically like to thank Major Pedro Habic and Lieutenant Colonel Larry
Larimer from TRAC-WSMR; Dr. Dale Malabarba, Mr. Robert Auer, and Mr. Paul Short
from the Natick Soldier Center; COL(Ret) Pat Toffler, Director, Research and Studies
Partnership, United States Military Academy; and, Mr. Brad Bradley and Mr. Dean
Muscietta from AMSAA. Interviews with the above provided tremendous insights that were
critical in development of these requirements. We would also like to make special mention
of those who reviewed the drafts of various documents and who provided comments and
specific recommendations that have been integrated into the content of those documents.
They are Dr. Robin Buckelew, SES, Director, Applied Sensors, Guidance, and Electronics,
Directorate Aviation and Missile RDEC; Mr. John D'Errico, Operations Research Analyst at
TSM-Soldier; and, Mr. Charlie Tamez, Systems Integration, PEO Soldier.
Our mention of contributors does not imply their approval of our results. The opinions contained herein are those of the authors and do not necessarily reflect those of PEO Soldier, the United States Military Academy, the United States Army, or the Department of Defense.
Table of Contents
Abstract ..... iii
About the Authors ..... v
Acknowledgements ..... vii
Table of Contents ..... viii
List of Figures ..... xii
List of Tables ..... xiv
Chapter 1: Introduction ..... 15
1.1 General ..... 15
1.2 Background ..... 15
1.2.1. SMART ..... 15
1.2.2. PEO Soldier ..... 16
1.2.3. DoD Models and Simulations ..... 17
1.2.4. Current Efforts ..... 17
1.3 General Methodology ..... 18
Chapter 2: Problem Definition ..... 21
2.1 Needs Analysis ..... 21
2.1.1. Project Impetus ..... 21
2.1.2. Initial Problem Statement ..... 21
2.1.3. System Decomposition ..... 21
2.1.4. Review of Literature and Related Research ..... 23
2.1.4.1 Modeling and Simulation Policies and Guidelines ..... 23
2.1.4.2 Acquisition Policies and Guidelines ..... 25
2.1.4.3 PEO Soldier Related Documents ..... 26
2.1.4.4 Methodology Documents ..... 27
2.1.4.5 Doctrine ..... 27
2.1.4.6 General Documents ..... 28
2.1.4.7 Conferences ..... 29
2.1.5. Stakeholder Analysis ..... 30
2.1.5.1 PEO Soldier and Subordinate Organizations ..... 31
2.1.5.2 Other Army Agencies ..... 32
2.1.6. Functional Analysis of a Simulation Capability ..... 33
2.1.7. Requirements Analysis ..... 33
2.1.7.1 Requirements Methodology ..... 33
2.1.7.2 Requirements Methodology - Advantages and Challenges ..... 37
2.1.7.3 Requirements ..... 38
2.1.8. Revised Problem Statement ..... 50
2.2 Value System Design ..... 50
2.2.1. Functions and Objectives ..... 50
2.2.2. Evaluation Measures ..... 51
2.2.2.1 Joint/Combined Arms Modeling Capability ..... 52
2.2.2.2 STMS Modeling Capability ..... 52
2.2.2.3 Modifiable Equipment Types ..... 52
2.2.2.4 User Control over Conditions ..... 53
2.2.2.5 Modify TTPs ..... 53
2.2.2.6 PEO Soldier Control ..... 53
2.2.2.7 Fielding Risk ..... 54
2.2.2.8 Time until Available ..... 54
2.2.3. Weighting ..... 54
Chapter 3: Design and Analysis ..... 56
3.1 Alternatives Generation ..... 56
3.1.1. Generation ..... 56
3.1.1.1 Existing Simulations ..... 56
3.1.1.2 Simulations under Development ..... 57
3.1.1.3 Modifying Simulations under Development ..... 58
3.1.1.4 Combination of Existing, Developing, & Modified Simulations ..... 58
3.1.1.5 New Simulation ..... 58
3.1.2. Feasibility Screening ..... 59
3.1.2.1 Constraints ..... 59
3.1.2.2 Feasibility Screening Matrix ..... 59
3.1.2.3 Eliminated Alternatives ..... 60
3.1.3. Alternative Optimization ..... 61
3.2 Modeling and Analysis ..... 61
3.2.1.1 Joint/Combined Arms Modeling Capability ..... 62
3.2.1.2 STMS Modeling Capability ..... 63
3.2.1.3 Modifiable Equipment Types ..... 64
3.2.1.4 User Control over Conditions ..... 65
3.2.1.5 Modify TTPs ..... 65
3.2.1.6 PEO Soldier Control ..... 66
3.2.1.7 Fielding Risk ..... 67
3.2.1.8 Time until Available ..... 67
Chapter 4: Decision Making ..... 69
4.1 Alternative Scoring ..... 69
4.2 Decision ..... 70
4.2.1. Alternative Comparison ..... 70
4.2.2. Sensitivity Analysis ..... 72
4.2.2.1 Sensitivity of the Weights ..... 72
4.2.2.2 Sensitivity of the Alternative Modeling ..... 73
4.2.2.3 Sensitivity of the Value Curves ..... 74
4.2.3. Cost-benefit Analysis ..... 75
4.2.4. Recommendation ..... 76
Chapter 5: Implementation ..... 77
5.1 Planning for Action ..... 77
5.2 Execution ..... 77
5.3 Assessment and Control ..... 78
Chapter 6: Conclusions ..... 79
Bibliography ..... 80
Appendix A: List of Abbreviations ..... 87
Appendix B: Soldier Functional Hierarchy ..... 92
Appendix C: STMS Simulation Functional Requirements ..... 98
Appendix D: Engineering Problem Statement ..... 134
Appendix E: Soldier MAWG Questionnaire ..... 139
Appendix F: Value Scales ..... 149
Appendix G: Sensitivity Graphs ..... 157
Appendix H: Cost Estimation ..... 161
Distribution List ..... 164
REPORT DOCUMENTATION PAGE - SF298 ..... 166
List of Figures
Figure 1. PEO Soldier organizational structure ..... 16
Figure 2. DoD combat model and simulation hierarchy ..... 17
Figure 3. Systems Engineering and Management Process (McCarthy, et al., 2003) ..... 19
Figure 4. Systems context diagram ..... 22
Figure 5. Simulation capability functional hierarchy ..... 33
Figure 6. Primary simulation software functions ..... 34
Figure 7. Primary simulation modeling functions ..... 34
Figure 8. Functional decomposition of primary STMS functions ..... 35
Figure 9. Functional decomposition for requirements analysis ..... 35
Figure 10. Value hierarchy ..... 51
Figure 11. STMS modeling capability value curve ..... 69
Figure 12. Stacked bar graph of the value scores within the simulate function ..... 71
Figure 13. Stacked bar graph of the value scores within the support decision making function ..... 71
Figure 14. Graph of the sensitivity of the global weight of STMS modeling capability ..... 72
Figure 15. Total value scores assuming linear value curves ..... 74
Figure 16. Cost-benefit graph ..... 75
Figure 17. Functional decomposition of assessing the situation ..... 92
Figure 18. Functional decomposition of making sensing decisions ..... 92
Figure 19. Functional decomposition of sensing ..... 93
Figure 20. Functional decomposition of making engagement decisions ..... 93
Figure 21. Functional decomposition of engaging ..... 94
Figure 22. Functional decomposition of making movement decisions ..... 94
Figure 23. Functional decomposition of moving ..... 95
Figure 24. Functional decomposition of making communications decisions ..... 95
Figure 25. Functional decomposition of communicating ..... 96
Figure 26. Functional decomposition of making enabling decisions ..... 96
Figure 27. Functional decomposition of enabling ..... 97
Figure 28. Hatley and Pirbhai architecture template (1998) ..... 102
Figure 29. CA/Joint modeling capability value scale ..... 149
Figure 30. CA/Joint modeling capability value curve ..... 149
Figure 31. STMS modeling capability value scale ..... 150
Figure 32. STMS modeling capability value curve ..... 150
Figure 33. Modifiable equipment types value scale ..... 151
Figure 34. Modifiable equipment types value curve ..... 151
Figure 35. User control over conditions value scale ..... 152
Figure 36. User control over conditions value curve ..... 152
Figure 37. TTP modifiability value scale ..... 153
Figure 38. TTP modifiability value curve ..... 153
Figure 39. PEO Soldier control value scale ..... 154
Figure 40. PEO Soldier control value curve ..... 154
Figure 41. Fielding risk value scale ..... 155
Figure 42. Fielding risk value curve ..... 155
Figure 43. Time until available value scale ..... 156
Figure 44. Time until available value curve ..... 156
Figure 45. Graph of the sensitivity of the global weight of Joint/CA modeling capability ..... 157
Figure 46. Graph of the sensitivity of the global weight of STMS modeling capability ..... 157
Figure 47. Graph of the sensitivity of the global weight of modifiable equipment types ..... 158
Figure 48. Graph of the sensitivity of the global weight of user control over conditions ..... 158
Figure 49. Graph of the sensitivity of the global weight of TTP modifiability ..... 159
Figure 50. Graph of the sensitivity of the global weight of PEO Soldier control ..... 159
Figure 51. Graph of the sensitivity of the global weight of fielding risk ..... 160
Figure 52. Graph of the sensitivity of the global weight of time until available ..... 160
List of Tables
Table 1. Local weights of functions and objectives ..... 55
Table 2. Evaluation measures sorted by global weight ..... 55
Table 3. Feasibility screening matrix ..... 60
Table 4. Raw data matrix (RDM) ..... 62
Table 5. Decision matrix (DM) ..... 70
Table 6. Cost estimation for cost-benefit analysis ..... 161
Chapter 1: Introduction.
1.1 General.
The Army acquisition community requires high-resolution simulations that represent the
dismounted Infantry soldier in enough detail to estimate the operational effectiveness of soldier
weapons and equipment. Such a simulation package must be robust enough to model advanced
concepts like tactics, command and control, shared situational awareness, soldier fatigue, and
information overload, in addition to weapons effects, soldier functions, and terrain
characteristics. It must be sensitive to technological advances in weapons, sensors, and
equipment. Traditionally, simulation innovations in these areas have been a result of a
capabilities-based approach in which specific capabilities have been added to existing software
to satisfy a certain niche. However, we believe that the Army requires a needs-based analysis
that will identify the precise simulation requirements necessary to support acquisition decision
making. From there, a simulation solution can be designed to meet the Army's Infantry soldier
system acquisition needs. In this report, we discuss our approach beginning with a brief
description of the problem background and the methodology we used. We then delineate our findings and results in detail, pointing out the unique insights obtained through the application of
the systems engineering process. Finally, we conclude with our recommendation to PEO Soldier
for addressing their unique simulation needs and our plan for implementation.
1.2 Background.
1.2.1. SMART.
Over the last decade, the United States Army has emphasized the importance and utility
of using simulations throughout the acquisition process. This initiative was initially called
simulation-based acquisition (SBA), and is now part of the Simulation and Modeling for
Acquisition, Requirements, and Training (SMART) program. The SMART program "involves
rapid prototyping using M&S [modeling and simulation] media to facilitate systems engineering
so that materiel systems meet users' needs in an affordable and timely manner while minimizing
risk" (Army Model and Simulation Office, 2002b). With this initiative come the challenges of
identifying and developing the appropriate simulation packages for each step in the acquisition
process, from concept development to lifecycle management.
1.2.2. PEO Soldier.
The Program Executive Office (PEO) Soldier faces such a challenge. The mission of
PEO Soldier is "to arm and equip Soldiers to dominate the full spectrum of peace and war, now and in the future" (PEO Soldier, 2003); therefore, that organization is responsible for the
acquisition of most of the weapons and equipment carried and used by the Infantry soldier. PEO
Soldier has three subordinate Project Manager organizations, which are divided by the type of
systems they manage. Each Project Manager is further broken down into Product Managers,
who manage the acquisition of more specific equipment. See Figure 1 below.
[Figure 1 shows PEO Soldier's structure: Project Manager Soldier Warrior (Product Managers for Air Warrior, Land Warrior, and Future Warrior), Project Manager Soldier Equipment (Product Manager for Clothing and Individual Equipment), and Project Manager Soldier Weapons (Product Managers for Crew Served Weapons and Individual Weapons).]
Figure 1. PEO Soldier organizational structure.
A current focus of PEO Soldier, and the focus of this study, is to identify a simulation
package that will provide the means to quantify the platoon-level operational effectiveness of a
new system or component as part of the SMART concept. PEO Soldier's need, as a system
acquirer and integrator, is shared by materiel developers and item-level evaluators as well. Thus
the identification of an appropriate simulation will have far-reaching application to organizations
throughout the Army.
1.2.3. DoD Models and Simulations.
The Department of Defense (DoD) organizes its models and simulations into levels based
on their degree of aggregation. Figure 2 is a diagram of the four categories of DoD models and
simulations. PEO Soldier is interested in simulations that operate in the middle region: mission/battle-level and engagement-level models. While the military has been involved in
developing simulations for many years, the emphasis was on large-scale, campaign-level models
designed to represent the potential engagements of a full-scale war with the Soviet Union. Even
most simulations that were able to represent the individual soldier focused on larger units and
battlefield systems, often modeling the individual soldier using the same algorithms as a tank,
but changing its speed, range, weapons, etc., to match soldier attributes. Now, the decentralized,
networked nature of current conflicts and the multiple roles of the US soldier dictate the need for
higher fidelity simulation models at the soldier level.
[Figure 2 arrays DoD models and simulations by resolution, from Joint/Combined Forces at the top through Corps (3-5 Divisions / 20-45,000 Soldiers), Division (3 Brigades / 10-15,000 Soldiers), Brigade (3 Battalions / 3-5,000 Soldiers), Battalion (4-5 Companies / 600 Soldiers), Company (3 Platoons / 130 Soldiers), Platoon (3 Squads / 35 Soldiers), Squad (2 Fire Teams / 9 Soldiers), and Fire Team (4 Soldiers), down to the Individual Combatant, Weapon System, and Trigger Assembly. Unit sizes are approximate and vary greatly by type of unit.]
Figure 2. DoD combat model and simulation hierarchy.
1.2.4. Current Efforts.
To date, various Army agencies and contractors have recognized the need and have made
considerable efforts to tackle the problem. From our observations, their efforts to identify
requirements fall into two categories. The first is an upgrade-based approach, wherein the
recognition of a unique requirement drives changes to an existing (legacy) simulation to meet
that need. This is an iterative approach that leads to a continual upgrade cycle, often resulting in
numerous concurrent versions of the same software, and is limited by the architecture and design
of the software. This method is used primarily by organizations that either do not have the
funding or the time to create a new simulation, or are the primary proponents for the legacy software. For software that has the architecture to support such changes, this method can be a valid technique for meeting requirements. However, this process rarely results in a comprehensive set of requirements that fully identifies an organization's needs, a valuable product in itself.
The other category is a characteristics-based approach. In this approach, an organization
identifies its requirements based on the characteristics upon which the system will be evaluated.
While this approach does result in a comprehensive set of requirements, it too has some
drawbacks. One is that the characteristic itself may not be well-defined or translate well into
simulation requirements. For instance, a commonly-used term for a capability improvement of
modern soldier systems is their ability to enhance a soldier's situational awareness. Not only is
the definition of this term not widely agreed upon, but it is also difficult to decompose into requirements. A
soldier's situational awareness directly affects, and is directly affected by, other high-level
characteristics like mobility, lethality, and survivability, which themselves overlap for many of
the soldier's functions. That interdependence complicates the logical decomposition into
simulation requirements. Also, the terminology of the resulting product may not be easily
understandable by all simulation stakeholders.
1.3 Systems Engineering and Management Process (SEMP).
We used the Systems Engineering and Management Process (SEMP), taught in the
Department of Systems Engineering at the United States Military Academy (USMA) as the
standard systems engineering process - shown in Figure 3. We will briefly describe the process.
Overall, the SEMP is an iterative process, allowing the analyst to make refinements to any
products based upon new information or discoveries regardless of where in the process the
information is discovered.
The first phase is Problem Definition, consisting of Needs Analysis and Value System
Design. The Needs Analysis begins with an initial problem statement from the client. With that, the analyst utilizes various tools and techniques to convert that initial problem statement into a revised problem statement that fully encapsulates the client's true need. The four main techniques used in this step are 1) a full decomposition of the system under study into its subsystems and components; 2) a review of literature and related research; 3) a stakeholder
analysis; and, 4) a functional decomposition of the system leading to requirements definition.
Other important techniques include analyses of system inputs and outputs, futures analyses, and
Pareto-type analyses. With an accurate revised problem statement, the study moves into the
second step of Problem Definition - Value System Design. Here, the engineer uses value-focused thinking to transform the required functions of the system into objectives and measures
to evaluate those objectives. The resulting value system represents the values of the primary
stakeholders and provides a basis to evaluate future alternative solutions to the problem under
study.
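The roll-up from local weights to the global weights used later in the study (see Tables 1 and 2) can be sketched in a few lines. The functions, measures, and weight values below are illustrative assumptions only, not the study's actual figures.

```python
# Sketch: in a two-level value hierarchy, each evaluation measure's global
# weight is its local weight times the local weight of its parent function.
# The functions, measures, and weights below are hypothetical placeholders.

hierarchy = {
    # function: (local function weight, {measure: local measure weight})
    "simulate": (0.6, {"stms_modeling_capability": 0.5,
                       "joint_ca_modeling_capability": 0.3,
                       "modifiable_equipment_types": 0.2}),
    "support_decision_making": (0.4, {"fielding_risk": 0.5,
                                      "time_until_available": 0.5}),
}

def global_weights(hierarchy):
    weights = {}
    for _, (fn_weight, measures) in hierarchy.items():
        for measure, local in measures.items():
            weights[measure] = fn_weight * local
    return weights

gw = global_weights(hierarchy)
# If every level's local weights sum to 1, the global weights also sum to 1.
```

Sorting `gw` by value yields the kind of ranked measure list shown in Table 2.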
[Figure 3 situates the SEMP in its environment, showing the progression from a descriptive scenario to a normative scenario with continuous assessment and feedback.]
Figure 3. Systems Engineering and Management Process (McCarthy, et al., 2003).
After determining the requirements, the study moves into Phase II of the SEMP - Design
and Analysis. During the first step of this phase - Alternative Generation - we generate
potential alternative simulation solutions that will facilitate PEO Soldier's evaluation of selected
soldier systems. During Modeling and Analysis, we analyze each alternative to determine the
degree to which that alternative meets the requirements. We do that by evaluating the
alternatives against the measures developed during Phase I.
Following the above analysis, we move into Phase III of the SEMP - Decision Making - and use the value hierarchy to score each of the alternatives, allowing us to compare them on a common scale. With the results of the comparison, a thorough sensitivity analysis, and a cost-benefit analysis, we are then able to recommend a simulation alternative to the client.
The final phase, pending acceptance of the recommendation, is Implementation (Phase
IV of the SEMP). Implementation consists of three steps - Planning for Action, Execution, and
Assessment and Control. Any proposed implementation plan will include, at a minimum, 1) a
phased timeline that identifies intermediate objectives, essential tasks, and the critical path, 2) an
estimation of the required implementation and life-cycle costs associated with the solution, and
3) other assessment and control mechanisms to help manage the plan.
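The critical-path element of such a phased timeline can be computed with a standard forward pass over the task network. The tasks, dependencies, and durations below are purely notional, not the study's actual implementation plan.

```python
# Sketch of critical-path identification for a phased implementation plan.
# Tasks, dependencies, and durations (in months) are notional examples.

tasks = {
    # task: (duration, list of predecessor tasks)
    "define_interfaces":  (2, []),
    "enhance_simulation": (6, ["define_interfaces"]),
    "enhance_iwars":      (4, ["define_interfaces"]),
    "link_and_test":      (3, ["enhance_simulation", "enhance_iwars"]),
}

def critical_path(tasks):
    finish = {}

    def earliest_finish(t):
        # forward pass: a task finishes its duration after its latest predecessor
        if t not in finish:
            dur, preds = tasks[t]
            finish[t] = dur + max((earliest_finish(p) for p in preds), default=0)
        return finish[t]

    end = max(tasks, key=earliest_finish)
    path = [end]
    # walk back through the predecessor that drives each earliest finish
    while tasks[path[-1]][1]:
        path.append(max(tasks[path[-1]][1], key=lambda p: finish[p]))
    return list(reversed(path)), finish[end]
```

Tasks off the critical path carry slack; shortening them does not move the end date, which is why a phased timeline singles the critical path out.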
The framework of this report follows the structure of the SEMP.
Chapter 2: Problem Definition.
2.1 Needs Analysis.
2.1.1. Project Impetus.
In a meeting with LTC Michael Kwinn on 01 October 2003, BG James Moran, PEO
Soldier, stated that he needed a simulation for decision support in choosing between alternative
systems or components for the Infantry soldier, for expressing quantitatively the differences
between candidate systems, and for justifying resource allocation. BG Moran commissioned the
Operations Research Center of Excellence (ORCEN) at USMA to conduct a study to develop a
roadmap for the use, modification, or development of a simulation, or family of simulations, that
will support PEO Soldier decision making.
2.1.2. Initial Problem Statement.
Based upon discussions with our primary stakeholder during our initial project meeting
with the Deputy PEO Soldier, Mr. Charles Rash, on 14 November 2003, we obtained the
following initial problem statement:
PEO Soldier needs a simulation that allows the evaluation of platoon
effectiveness based upon changes in Soldier tactical mission system (STMS)
characteristics.
An STMS is any system, or system of systems, worn or carried by the Infantry soldier on
the battlefield and includes such equipment as weapons, load-bearing equipment,
communications devices, GPS devices, sensors, tools, etc.
2.1.3. System Decomposition.
We began by looking at a simulation system in general and decomposing it into its
primary functions, components, and structural hierarchy. This initial look at the system allowed
us to consider its key aspects and served as a starting point for the remainder of our analysis.
The primary functions of a simulation in this case are to support decision-making and to
simulate. The primary subfunctions of the simulation system itself are interfacing with the user,
processing inputs, controlling processes, transforming inputs to outputs, processing outputs, and
maintaining, self-testing, and managing redundancy. We discuss functions in more depth in the
functional and requirements analyses sections.
Components are divided by type into structural, operating, and flow. Structural
components are static and do not change during the transformation process. For a simulation
system, the primary examples are computer hardware (CPU, monitor, I/O devices) and the facility
itself. Operating components transform inputs into outputs and are dynamic. Primary examples
for simulation are the software, architecture, computer code, and algorithms that drive the
interactions. Additionally, users, analysts, programmers, maintainers, and subject matter experts
(SMEs) provide expertise and controls that affect the transformation. Flow components are
those inputs that are transformed into outputs. Examples in this case include actual simulation
inputs entered by the user or internal to the software - STMS characteristics, scenarios, entities,
units, equipment, etc. Outputs include data that may be converted into measures of effectiveness
(MoEs) or measures of merit (MoMs), and visual outputs, such as 2D and 3D playback of the
scenario. Other things that flow through the system that may be of particular interest to PEO
Soldier are time and money.
Figure 4. Systems context diagram.
The structural hierarchy within which a simulation falls includes its super-systems (those
systems of which the simulation is a part), lateral systems (other systems that are similar in
objectives, goals, or structure to the simulation), and its sub-systems. A simulation that meets
PEO Soldier's needs is primarily part of the following super-systems: the Department of the
Army's (DA's) SMART program, DoD/DA acquisition systems, DA's Modeling and Simulation
(M&S) Master Plan, PEO Soldier, US Army Infantry Center (USAIC), simulation federations
such as MATREX (Modeling Architecture for Technology, Research and Experimentation), and
computer networks. Lateral systems include live and virtual simulations, other analytical
methods, prototypes, computer-aided design (CAD) software, training systems, data analysis
software, and STMS. Critical subsystems include data, empirical/engineering-level simulations,
input systems, output systems, control systems, transformation systems, user interfaces, and
maintenance systems. Figure 4, on the previous page, shows a systems context diagram that
captures the hierarchical structure and components of the system.
By conducting the above analysis, we identified key aspects of the system under study
that must be considered throughout the process. Additionally, by putting the entire system in
context, we were able to identify other systems and factors in the environment that directly affect
any proposed simulation solution.
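The structural/operating/flow component split above is mechanical enough to capture in a small data model. The sketch below is our illustrative construction, not a tool used in the study; the component names are examples drawn from this section.

```python
from dataclasses import dataclass
from enum import Enum

class ComponentType(Enum):
    STRUCTURAL = "structural"  # static during the transformation (hardware, facility)
    OPERATING = "operating"    # dynamic; performs the transformation (software, people)
    FLOW = "flow"              # transformed by the system (inputs and outputs)

@dataclass(frozen=True)
class Component:
    name: str
    ctype: ComponentType

# Illustrative components drawn from the decomposition in the text.
inventory = [
    Component("CPU", ComponentType.STRUCTURAL),
    Component("monitor", ComponentType.STRUCTURAL),
    Component("combat algorithms", ComponentType.OPERATING),
    Component("analysts/SMEs", ComponentType.OPERATING),
    Component("STMS characteristics", ComponentType.FLOW),
    Component("MoE data", ComponentType.FLOW),
]

def group_by_type(components):
    """Group component names under their type, as in the decomposition."""
    groups = {t: [] for t in ComponentType}
    for c in components:
        groups[c.ctype].append(c.name)
    return groups
```

Grouping the inventory this way makes it easy to verify that every part of the simulation system has been classified exactly once.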
2.1.4. Review of Literature and Related Research.
Upon project initiation, our team began to conduct an extensive search of literature. A
comprehensive list of references can be found at the end of this report in the Bibliography
section. Those references are grouped by the topic areas that we are using to organize the
following discussion.
2.1.4.1 Modeling and Simulation Policies and Guidelines.
We started by familiarizing ourselves with the relevant Army and DoD policies,
regulations, and guidelines governing modeling and simulation. Some of the most helpful were
the documents related to the Army's SMART program. These provided the critical tie-in
between policy and best-practices and between acquisition and M&S. The SMART Reference
Guide briefly describes many of the various simulations being used today (AMSO, 2001). The
Planning Guide directly applies M&S to system development and describes its intended use in
the acquisition community (AMSO, 2002b). Thus, the SMART documents served as a good
starting point for the simulations we would consider and the ideal application of those tools in
the acquisition process.
Also useful in the same respect as the SMART documents were the Army Modeling and
Simulation Master Plan (1997) and the Army Modeling and Simulation Roadmap (2003). Key
information we found in the Master Plan included the Army M&S vision and strategic guidance,
M&S management, the components of the M&S lifecycle, and Army-level M&S requirements
with their objectives and measures of performance. With the Master Plan currently under
revision, we referred to the Roadmap for updated guidance. That document, still in draft form,
updated the Army's purpose, vision and mission for M&S, its goals and objectives, and the
organizational structure for M&S management. Recently, a draft version of the new master plan
was published, now called United States Army Modeling and Simulation (M&S) Processes and
Procedures (2004); however, due to the timing of its release, the content of that document was
not a factor in our initial results.
Also critical to our understanding of the systems within which we were working were DA
regulations, pamphlets, and memoranda. The primary of these is AR 5-11, Management of Army
Models and Simulations (1997). That document lays out Army policy on M&S management,
and guidance on verification, validation, and accreditation (VV&A), configuration management
(CM), and data management. DoDI 5000.61, DoD Modeling and Simulation Verification,
Validation, and Accreditation (VV&A) (2003), and DA Pam 5-11, Verification, Validation, and
Accreditation of Army Models and Simulations (1999), discuss VV&A requirements in more
detail and provided key information that would be crucial to any implementation we
recommended. DA Memo 5-15 (2002), concerning high level architecture (HLA) compliance,
delineates the requirement that all simulations be HLA-compliant, which later became one of our
key constraints. DA Pam 5-XX, Simulation Support Planning and Plans (2003), in draft form,
describes the requirement for major acquisition programs to have simulation support plans
(SSPs) and details the required information for SSPs. The details in this document are very
similar to those in Appendix C of the SMART Planning Guide. From both, we learned a great deal
about the recommended integration of M&S, specifically simulation, into the acquisition
lifecycle of major products. AR 5-5, Army Studies and Analyses (1996), and its corollary
pamphlet, DA Pam 5-5 (1996), describe the general analyses and studies requirements and
include study source information and study report formats. Additionally, they provide detail
about study advisory groups (SAGs). Since our recommended solution would be used as part of
studies and analyses, those documents helped us understand how the simulation capability would
fit into the larger picture.
The final type of regulations we reviewed in this group were those published by the US
Army Training and Doctrine Command (TRADOC). Most of these documents serve as
clarifications of DA publications with specific application to TRADOC. Therefore, we will only
discuss one of those publications here - TRADOC Regulation 71-4, TRADOC Scenarios for
Combat Development (2001). This document describes the use, development, and approval of
scenarios to be used in TRADOC studies and analyses, and, thus, would directly impact any
simulation capability we might recommend.
2.1.4.2 Acquisition Policies and Guidelines.
We quickly realized that in order to understand the context of the required simulation
capability, we had to gain a sufficient understanding of the DoD and Army acquisition processes.
We relied on two different types of documents to do this. The first type directly addresses the
acquisition system. At the DoD level, they include Department of Defense Directives (DoDDs),
Instructions (DoDIs), and Guidebooks. DoDI 5000.2, Operation of the Defense Acquisition
System (2003), and the Interim Defense Acquisition Guidebook (2002), provide detailed
information about the DoD acquisition process including procedures, regulatory information,
information technology (IT) considerations, test and evaluation (T&E), resource estimation,
human systems integration, acquisition of services, and program management. For the most part,
we used these documents to answer specific questions instead of reading the entire publication.
At the Army level, the key acquisition system document that we reviewed was AR 70-1, Army
Acquisition Policy, which generally covers the same topic areas as the DoD Guidebook, but with
the addition of Army-specific policies. We also reviewed TRADOC Regulation 71-12,
TRADOC System Management, which discusses management of high-priority materiel systems,
to include the establishment of TRADOC Systems Managers (TSMs), TRADOC Program
Integration Offices (TPIOs), and TRADOC Project Offices (TPOs).
The other type of publications that we reviewed to increase our understanding of the
acquisition system governs capabilities and requirements development. At the DoD level, these
include the Chairman of the Joint Chiefs of Staff Manuals (CJCSM) and Instructions (CJCSI)
and DoD architecture documents. CJCSI 3170.01C, Joint Capabilities Integration and
Development System (2003), provided us a detailed explanation of the process of integrating and
developing capabilities and of the required products throughout the materiel lifecycle. The DoD
Architecture Framework, Version 1.0, Volumes I&II (2003), describe, in detail, how to define
the architecture of warfighting operations and business operations within the overall DoD
framework. Thus, our simulation solution may, in time, become part of another system's
architecture and be used to tie into the overall framework.
At the Army level, we used TRADOC guides to get further detail, specific to the Army,
on the development of Initial Capabilities Documents (ICDs) and Capability Development
Documents (CDDs). Additionally, we used an AMSO slide to clarify the specific M&S review
criteria for those documents. TRADOC Pamphlet 71-9, Requirements Determination (1999),
describes the requirements determination process, organizational rules, integrated concept teams
(ICTs), capstones, future operational capabilities (FOC), science and technology programs
(S&T), Advanced Warfighting Experiments (AWE), Advanced Technology Demonstrations
(ATDs), and Advanced Concept Technology Demonstrations (ACTDs), studies and analyses,
warfighting materiel requirements, and, most importantly for us, M&S requirements integration
and approval. All of the above publications not only showed us how our simulation fit into the
capabilities development process, but also pointed us to the types of questions that our solution
may have to answer for the systems it will support.
2.1.4.3 PEO Soldier Related Documents.
With a basic understanding of the acquisition process and M&S policies, we needed to
develop a greater understanding of PEO Soldier - its organization and products. We obtained
much of this information through interviews; however, we did rely on some documents for
specific information. The PEO Soldier Portfolio 2003: America's Most Deployed Combat
System provided a general overview of the organization and descriptions of the materiel currently
being developed. This was a tremendous resource for developing simulation requirements since
it describes the capabilities of the various systems that would need to be modeled. The other
documents that we reviewed were all related to the Land Warrior (LW) program. Since Land
Warrior is an integrated system of systems, it is an ideal example of what must be modeled in
simulation. Also, since an analysis of alternatives (AoA) is currently being conducted by TRACWSMR for the Land Warrior Milestone (MS) C decision, that program drove, in part, our initial
implementation timeline. The LW documents consisted of previous studies conducted by
AMSAA and other organizations, and taskers for the AoA. Additionally, we used the slides
created by TRAC-WSMR for their AoA briefing (Larimer, 2004) to the study advisory group
(SAG), which delineated the questions to be answered and their plan for the conduct of the AoA.
This presentation proved to be a very valuable source of information. Finally, we referred to a
draft SSP, written by Booz-Allen & Hamilton (2000), that attempts to address the integration of
simulation into the Land Warrior program; however, that draft was never approved.
2.1.4.4 Methodology Documents.
Much of the systems engineering methodology expertise resided with the members of our
study team. However, we used Hatley and Pirbhai's book (1988) to facilitate our functional
decomposition and the Department of Commerce's IDEF standards (1993) to compare our
methodology with a similar process. Finally, we used Logical Decisions for Windows, Version
5.125 (2003) for the decision analysis portion of our methodology.
2.1.4.5 Doctrine.
We referred to doctrinal manuals when trying to define the simulation modeling
requirements, specifically in terms of scenarios, combined arms representation, and the modeling
of the functions of the individual soldier. For scenarios, we looked at potential Joint operations
using CJCSM 3170.01, Universal Joint Task List (UJTL) (2002), and Army operations using FM
7-15, The Army Universal Task List (2003). For the operational capabilities that fulfilled the
Objective Force vision, we referred to TRADOC Pamphlet 525-66, Force Operating
Capabilities (2003). For individual soldier tasks and functions we looked at FM 7-8, Infantry
Rifle Platoon and Squad (2001), and ARTEP 7-5-MTP, Mission Training Plan for the Stryker
Brigade Combat Team Infantry Rifle Platoon and Squad (2003). We looked at the latter because
the current version of Land Warrior is to be Stryker-interoperable.
2.1.4.6 General Documents.
In this category of documents, we were seeking any other work relating to our study. We
discovered quickly that others have been researching various aspects of individual soldier
modeling in simulation. In fact, we found four studies that directly addressed aspects of our
study. The first was a study that served as one of the drivers for our own. That study, titled
Finding a Simulation that Models Both Present and Future Individual Soldier Systems (Aviles, et
al., 2000), conducted at the USMA, sought to identify the best simulation to evaluate capabilities
of individual soldier systems. They concluded that no simulation met all the requirements. That
study was done on behalf of PM Soldier (PEO Soldier's predecessor organization). However,
the information contained therein needed to be updated and their requirements broken down in
greater detail.
The second study was conducted by the Soldier Modeling and Analysis Working Group
(MAWG) chaired at TRAC-WSMR. That group, consisting of members of numerous Army
agencies, identified key soldier modeling requirements and evaluated existing simulations
against those requirements. Their report, Soldier Modeling and Analysis Working Group
(MAWG) Evaluation Report (Larimer, et al., 2004), provides a roll-up of current simulation
capabilities by requirement. Because of the shortened nature of our timeline, we relied heavily
upon the simulation proponents' responses to the Soldier MAWG surveys (copies furnished to us
by TRAC-WSMR), in order to evaluate current simulations. See Appendix E for a list of the
questions contained in the surveys.
The third study was conducted by a NATO working group. The report is titled Model(s)
for the Assessment of the Effectiveness of Dismounted Soldier System [sic] (2001). In this study,
the NATO working group evaluated key simulations from numerous member countries against
required characteristics. As with the first study we discussed, the NATO results are somewhat
dated, and their requirements were not in the detail we desired. However, they served as another
base upon which we could conduct our study.
The fourth study was conducted in the Department of Systems Engineering at the USMA
in 2002. In it, Dr. Paul West attempted to recreate in simulation the first major battle of the
Vietnam War, the battle in the Ia Drang Valley in 1965. The purpose of his study was to
"explore the feasibility of replicating a Vietnam-era battle in simulation, using 1960s and Land
Warrior technologies," and given that, to "recreate a Vietnam-era infantry battle to provide
vignettes for comparing Land Warrior and non-Land Warrior equipped forces (West, 2002)."
Dr. West used the Joint Conflict and Tactical Simulation (JCATS). His conclusions were that
the lack of situational awareness and semi-autonomous behaviors in JCATS required interactor
support; therefore, simulating a Vietnam-era battle in a closed-loop simulation was not possible
at that time. His conclusions directly reflect the crux of PEO Soldier's problem.
In addition to the above studies, we found some other key documents relating to our
research. One of these was NATO Soldier Modernisation [sic]: Measurements for Analysis - a
Framework for Modelling [sic] and Trials (1999). The objective of this NATO report was to
quantify the achievement goals of individual soldiers and groups of soldiers. The document set
measurement levels, described ideal missions and vignettes, discussed key command and control
(C2) areas, and listed special operations-other-than-war (OOTW) considerations. The paper by
Cosby and Severinghaus (2003) identifies shortcomings in individual and small group soldier
modeling due to the changing nature of modern warfare. The paper by Rodriguez (2003)
identifies current individual and small team simulation efforts in the virtual training domain.
The paper by Cioppa, et al. (2003), describes the current efforts of the Military Operations in
Urban Terrain (MOUT) Focus Area Collaborative Team (FACT), or the Urban Operations (UO)
FACT as it is now called, to address current gaps in the modeling of urban terrain.
Other references we used included a paper covering analyses of alternatives written by
the Army Logistics Management College (ALMC, 2003) that gave a good overview of the
conduct of an AoA. We also looked at a paper by the National Security Directorate (2001) that
examines the future requirements of Objective Force Warrior (OFW), now Future Force Warrior
(FFW). The technical report by Kwinn and Smith (2001) provided a framework to view the
soldier's potential role in network-centric warfare. Finally, the NATO presentation by Jacobs
and Brooks gave us a synopsis of human behavior representations (HBR).
2.1.4.7 Conferences.
In addition to reviewing literature and documents, we attended related conferences to
gain an understanding of the current state of research. We will discuss three of those
conferences here.
We attended the Interservice/Industry Training, Simulation, and Education Conference
(I/ITSEC) held in Orlando in December 2003. I/ITSEC was a very large conference bringing
together the latest in technology and techniques from all of the Armed Services and from
industry. The focus of the conference was on simulation for training and education purposes,
although analysis was also covered. We spent a tremendous amount of time viewing technology
through government and industry exhibits and learning about current research in tutorials and
paper presentations. The primary topics that we focused on during our attendance were
simulation interoperability, HBR, simulation requirements definition, environmental
representation requirements, and communications network modeling. The information we took
from I/ITSEC was very valuable to our study.
The second conference that we attended, also in December 2003, was the Winter
Simulation Conference (WSC) in New Orleans. WSC was an academic and industry conference
covering a wide variety of simulation topics, most of them not military-related. Although there
were exhibits here as well, the tutorials and paper presentations were the most valuable source of
information. The topics that we focused on here were simulation interoperability, HBR,
requirements definition, distributed computing, verification, validation and accreditation
(VV&A), and urban operations M&S research.
The third and last conference that we attended was the US - German High Resolution
Infantry Workshop held in Monterey, CA, in March 2004. The purpose of this conference was
to discuss Infantry soldier modeling issues being faced in common by US and German modelers.
During the first portion of the workshop, representatives from both countries presented current
research and progress on a variety of related topics. During the second portion of the workshop,
participants worked in small groups on the following focus areas: weapons effects and target
engagement, human-centric modeling, and urban terrain and environment. Although the
workshop was relatively late in the progress of our study, we learned a great deal and were able
to refine some of our products as a result.
Our review of literature and related research was a continuous process, and a critical
foundation to our study.
2.1.5. Stakeholder Analysis.
In conjunction with our search of pertinent literature and research relating to our project,
we were also attempting to capture the needs and opinions of all critical stakeholders. A
stakeholder, for our purposes, was anyone who had a vested interest in our problem. We
identified stakeholders by category: simulation users, simulation developers and maintainers,
analysts, SMEs, acquisition organizations, and others. While we attempted to interview a broad
range of stakeholders, the compressed nature of our timeline prevented us from interviewing
everyone we would have liked and caused us to rely on research and information developed by
others.
Our interviews encompassed personnel from the following organizations: TRAC-WSMR's
Land Warrior AoA team, TSM-Soldier at Fort Benning, the Modeling and Analysis
Team at the Natick Soldier Center (NSC), the Infantry Warrior Team at the Army Materiel
Systems Analysis Activity (AMSAA), PEO Soldier and its subordinate project and product
managers (specifically members of PM-Soldier Warrior (PM-SWAR), PM-Clothing and
Individual Equipment (PM-CIE), PM-Soldier Weapons (PM-SW), and PM-Land Warrior
(PM-LW)), combat developers at TRAC-WSMR, Objective OneSAF (OOS) developers,
TRAC-MTRY, and SMEs internal to our organization.
We obtained a tremendous amount of input from our interviews and discussions and we
integrated much of what we learned into our products. However, we will discuss further
information we received from two particular groups.
2.1.5.1 PEO Soldier and Subordinate Organizations.
We held our initial project meeting on 14 November 2003 at USMA. At that meeting, we
briefed Mr. Charles Rash, Deputy, PEO Soldier, on our project plan. As a result, we received
additional guidance and clarification. We will only discuss the most pertinent of that
information here. Mr. Rash indicated that he believed that the solution to their problem would be
a suite of simulations, and that a single simulation that could meet all of their needs would not be
possible for a long time. He stated that he understood that M&S has limitations and that no
M&S will do everything; therefore, he wanted to know what is achievable as input to making
informed decisions. He asked us to look at the technologies of other services, but not
simulations from other countries. Finally, he asked us to have a recommendation by 31 March
2004 (in 4.5 months).
We presented our first Interim Progress Report (IPR) to Mr. Charles Rash, at PEO
Soldier in Fort Belvoir, VA, on 19 December 2003. There, we reported the results of our review
of literature and stakeholder analysis to that point, and we received additional clarification.
When asked about what specific types of questions they needed answering, Mr. Rash cited
comparative studies between new, proposed systems and current systems. He needed a
simulation capability for value-added analysis of new equipment. He wanted to be able to get
collective (platoon-level), aggregate measures of effectiveness, but with individual soldier
fidelity. He pointed out that often an engineering-level model will show great benefit at
the individual system level, but when aggregated into larger, force-on-force models the overall
benefit is smaller or even larger. He wanted to be able to quantify that benefit, especially the
quantification of the value of situational awareness (SA) / situational understanding (SU). He
was looking for a model that would produce results acceptable to the Operations Research /
Systems Analysis (ORSA) community. Some of the specific capabilities he brought up were:
• the ability to switch out soldier systems in the simulation;
• the ability to change the equipment mix;
• the representation of platoons, squads, and individuals using tactics, techniques, and
procedures (TTPs);
• a human behavior representation that reflects interaction;
• a representation of the handling of casualties;
• a representation of what happens to soldier performance if a system fails; and,
• the ability to replicate unmanned systems.
We presented a second IPR on 22 March 2004 to Mr. Ross Guckert, Director of Systems
Integration, PEO Soldier. However, because of the timing of the IPR, we did not obtain new
information that impacted our study.
In addition to meetings with our primary client, we also spoke with M&S representatives
at the Project and Product Manager levels. From them, we learned that the PMs are frequently
asked to conduct short-suspense "what-ifs," but do not really have the simulation capability to
answer those sufficiently. They need simulations for conducting engineering trade-off analyses
as well. They also felt that PEO Soldier needed to have greater influence in model development
and use.
2.1.5.2 Other Army Agencies.
The primary comment from stakeholders in other M&S agencies was that the data did not
exist to populate simulation databases properly. Most recognized the shortcomings of current
simulation technology to meet fully the needs of the Infantry acquisition community. They also
pointed us in the direction of related research and projects, many of which were referenced
earlier in this report.
2.1.6. Functional Analysis.
We will discuss our functional analysis for this study in two parts. The first part,
discussed in this section, covers the general functions of a simulation capability; the second part,
in the next section, includes a more-detailed analysis of soldier functions in order to define
requirements.
We identified that, at the highest level, this simulation capability must support PEO
Soldier decision-making, as well as simulate or represent the soldier in his battlefield
environment. While these seem obvious, our original inclination had been to focus on the
simulation piece at the expense of decision support. By doing so, we had no way to account for
risk or time-related issues. Our resulting functional hierarchy is shown graphically in Figure 5.
Figure 5. Simulation capability functional hierarchy.
2.1.7. Requirements Analysis.
2.1.7.1 Methodology.
We applied a common methodology in a unique way to define our requirements. The first step
was conducting a decomposition of the required simulation software functions and arranging
them into a functional hierarchy. At the highest level we used Hatley and Pirbhai's generic
template for representing a system's physical architecture, decomposing each function into six
subfunctions (Figure 6): 1) provide user interface, 2) format inputs, 3) transform inputs to
outputs, 4) control processing, 5) format outputs, and 6) enable maintenance, conduct self-test,
and manage redundancy processing (Hatley and Pirbhai, 1988).
Figure 6. Primary simulation software functions.
Each of these six subfunctions can then be broken down further, as necessary. Five of the six
subfunctions above capture the architecture and desired characteristics of the simulation itself.
One sub-function - transforming inputs to outputs (the process model) - addresses the modeling
aspects of the simulation. We decomposed this function as shown in Figure 7 below.
Figure 7. Primary simulation modeling functions.
From here, we focused on modeling the represented entity, and conducting a functional
decomposition on it. Since we wanted to have resolution down to the individual Infantryman,
the next step consisted of decomposing all of the functions that the soldier executes in the
performance of his mission. Because we focused on soldier functions, information was easily
obtained from subject matter experts (SMEs), i.e. soldiers. The challenge, then, was organizing
the functions into a hierarchy. We worked from both top-down and bottom-up to ensure
completeness. Figure 8 shows the highest level of our soldier decomposition. Each of those
sub-functions was further broken down. See Appendix B for the entire soldier functional hierarchy.
Figure 8. Functional decomposition of primary STMS functions.
The intent of the final set of requirements was to provide a simulation capable of
comparing alternate materiel solutions. For that reason, we needed only decompose down to
subfunctions that either affect the performance of the soldier system or component, or allow for
differentiation between alternative systems. However, in order to have a hierarchy robust
enough to consider future systems, we decomposed to the lowest sub-function by which the
modeler can reasonably expect a future system to be affected or differentiated. We will address
this further in a moment.
Figure 9. Functional decomposition for requirements analysis.
Once we decomposed the soldier entity's functions and arranged them into a hierarchy,
we then conducted input-output analyses. For each function, we identified all of the inputs
transformed by the function and the outputs produced. During this process, we also captured the
attributes that affect the transformation, as shown in Figure 9. While this is seemingly simple in
theory, it was quite involved in practice.
As an example, we will consider the soldier function of choosing a target to engage.
Inputs into this decision include the soldier's own location, the target location, the threat
presented by the target, the soldier's perceived probability of hitting the target, other targets, the
terrain, the weather, his sector of responsibility, his location in the formation, etc. The primary
output is a target choice, which may be an input to the actual engagement function. Attributes
of the soldier would include his training level, experience, doctrine, rules of engagement (ROE),
role in the unit, etc. All of these attributes affect how the soldier transforms his inputs to an
output and may be unique to that soldier. We considered these part of the processing instructions
that control the transformation.
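The record produced by this input-output analysis can be sketched as a small data structure. The class below is our illustrative construction (not a tool used in the study); the field values are taken from the target-selection example above.

```python
from dataclasses import dataclass, field

@dataclass
class FunctionNode:
    """One node in a functional hierarchy: a function plus the inputs it
    transforms, the outputs it produces, the attributes (processing
    instructions) that control the transformation, and its child
    subfunctions."""
    name: str
    inputs: set = field(default_factory=set)
    outputs: set = field(default_factory=set)
    attributes: set = field(default_factory=set)
    children: list = field(default_factory=list)

# The target-selection example from the text, recorded as one node.
choose_target = FunctionNode(
    name="choose target to engage",
    inputs={"own location", "target location", "target threat",
            "perceived hit probability", "other targets", "terrain",
            "weather", "sector of responsibility", "formation position"},
    outputs={"target choice"},  # feeds the actual engagement function
    attributes={"training level", "experience", "doctrine", "ROE",
                "role in unit"},
)
```

Recording every soldier function in this uniform shape is what makes the completeness checks described next possible.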
The methodology we used had checks to ensure completeness and prevent unintentional
redundancy. First, the inputs of a parent node should include all, and omit none, of the inputs of
its children nodes. The same is true of the outputs. Failure to meet the above conditions could
indicate missing subfunctions, or an incomplete identification of inputs or outputs. Another
check was to ensure that any input was either an output of another function or an input into the
system itself from the environment. Such analysis assisted us in identifying other combat
systems, or entities, on the battlefield that must be represented and would require similar
functional decompositions as well.
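Both checks lend themselves to direct automation. The sketch below assumes each function is recorded as a dict of input and output sets; the three-function example is invented.

```python
def parent_child_check(parent_inputs, children_inputs):
    """A parent node must include all, and omit none, of its children's inputs;
    any discrepancy flags missing sub-functions or missing inputs."""
    union = set().union(*children_inputs) if children_inputs else set()
    return {"missing_from_parent": union - parent_inputs,
            "unaccounted_for": parent_inputs - union}

def dangling_inputs(functions, environment_inputs):
    """Every input must be the output of another function or enter the system
    from the environment; leftovers point at entities not yet modeled."""
    produced = {o for f in functions.values() for o in f["outputs"]}
    consumed = {i for f in functions.values() for i in f["inputs"]}
    return consumed - produced - environment_inputs

# Invented example: "enemy fire" is neither produced internally nor declared
# environmental, so an enemy entity must be represented.
funcs = {
    "sense":         {"inputs": {"ambient light"},   "outputs": {"target location"}},
    "choose target": {"inputs": {"target location"}, "outputs": {"target choice"}},
    "engage":        {"inputs": {"target choice", "enemy fire"},
                      "outputs": {"rounds fired"}},
}
```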
Upon completion of the overall hierarchy, we then converted the functional hierarchy
into simulation requirements. The first step would have been determining the fidelity required of
the simulation model, based on the soldier system being evaluated. Ideally, we would determine,
for each branch in the model hierarchy, how far down was necessary for system evaluation. We
would draw "cuts" in the hierarchy, below which are subfunctions that do not require explicit
representation in the current study. From there, we would determine objectives for each lowest-level sub-function (above the cut-line in the model hierarchy) based on the primary stakeholders'
input. We would then choose evaluation measures for each of the objectives. For simulation
modeling requirements, these measures will likely be binary - whether or not the simulation has
that modeling capability - or a constructed scale that is incremented by the model's ability to
achieve the objective. However, because we were developing requirements for all potential PEO
Soldier systems, not just one under study, we did not draw cut lines, but considered all functions.
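Under binary evaluation measures, comparing candidate simulations reduces to a coverage calculation. The sub-function names below are invented for illustration.

```python
def coverage_score(required_subfunctions, capabilities):
    """Fraction of required sub-functions (those above the cut-line) that a
    candidate simulation can model, using binary yes/no measures."""
    met = sum(1 for f in required_subfunctions if capabilities.get(f, False))
    return met / len(required_subfunctions)

# A candidate that models two of three required sub-functions:
score = coverage_score(["choose target", "move", "communicate"],
                       {"choose target": True, "move": True})  # 2/3
```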
We should note that our approach is similar to the Integration Definition for Function
Modeling (IDEF0), developed by the Air Force in the 1970s and released by the National
Institute of Standards and Technology (NIST) as a standard for function modeling in 1993.
IDEF0 is a formalized modeling language, with rules and techniques for creating a graphical
representation of a system (Department of Commerce, 1993). IDEF0 and our approach share the
same underlying methodology centered on a logical extension of common requirements
definition tools - functional decomposition and input-output analysis. However, our approach
does not demand the same rigorous application of rules and techniques as IDEF0. The sheer
complexity of the soldier system on a fluid battlefield makes IDEF0 impractical for our
purposes. Therefore, we stopped short of applying a meticulous rule set, such as IDEF0's, to our
problem.
2.1.7.2 Methodology Advantages and Challenges.
The primary advantage of our process does not concern its mechanics, but rather the fact
that we have a process. While many of the simulations currently in development have
operational requirements documents (ORDs) that identify simulation requirements, we have not
found a document that describes a methodology for developing them.
Another advantage of our process is its contribution to an understanding of the
requirements. The functional hierarchy creates a concise picture of the required functions of a candidate simulation. Such a tool is indispensable for presenting results to a client or a manager,
or for quickly conveying needs to potential simulation developers. Moreover, the clarity
achieved through well-defined functions makes the subsequent requirements more easily
understood by the client, modelers, program managers, software engineers, and the subject
matter experts (who will help to validate the model). This invariably reduces the potential for
disconnect between the client and the software engineers.
An additional advantage of our methodology is that, by identifying the inputs, we
implicitly defined the requirements for the representation of the environment. For instance, if an
input for the function of firing the soldier's rifle is the level of ambient light, that characteristic
must be represented as part of the environment. Also, identifying inputs and outputs illuminated
linkages between functions, and between the system (in this case the soldier) and other systems
on the battlefield (e.g. an enemy soldier). Thus, we could identify other battlefield systems that
must be represented, and their required level of fidelity.
The entity functional hierarchy with cut-lines has advantages as well. The extensive
decomposition of functions, even though the current study may not require it, allows the
hierarchy to be used again for other studies. It should also be robust enough to consider future
soldier systems with minor modifications instead of having to start over again. Thus, a properly-constructed hierarchy can expand and contract to meet the needs of the study. This prevents
unnecessary detail that might slow the simulation or consume excessive memory. Those points
at which the hierarchy expands and contracts can also serve as 'hooks' onto which other higher
fidelity models can link, if necessary. Therefore, instead of having one large model, one can
have a lower-fidelity model that links to high-fidelity, engineering-level simulations when
needed.
The methodology results could be useful as well after a simulation, or federation of
simulations, is chosen for the study. By comparing the simulation to the value hierarchy, the
analyst can quickly determine areas for which "work-arounds" will have to be developed, or
assumptions made, to bridge capability gaps. Since the inputs and outputs used by the functions
are identified, the modeler can see precisely which missing inputs and outputs will have to be
generated by alternative means.
All of the above being said, our methodology is not without its challenges. For one, it
requires an in-depth analysis for each study, which alone can be quite time-consuming.
Additionally, our process requires that the entity functional analysis be done for any class of
entity that could be encountered on the battlefield - a step we have not yet done. The required
level of fidelity of the other systems may not necessarily be as high as that of the soldier, but
analysis must be done for each entity class to determine that level.
2.1.7.3 Requirements.
Early in our analysis, we discovered that the simulation model requirements flowed from
two primary needs: the need for realism and the need for a tool to compare candidate soldier
tactical mission systems (STMS). The simulation model has to produce valid outcomes based
upon the inputs. This fact is certainly not unique to our study, but is the goal of all combat
simulations. But how much realism is required? Resource and technology constraints dictate
that we define an appropriate level of fidelity. The answer to that question depends primarily
upon the purpose of the simulation. The purpose of our simulation is to provide a decision aid
for comparing STMS configurations and distribution. Therefore, the simulation model must
represent those inputs and outputs affecting or affected by the system being considered, while
still producing a valid result. Otherwise, unique aspects of the systems being compared will not
factor into the simulation output, potentially resulting in an uninformed decision. This is
currently the case in existing simulations and the reason PEO Soldier commissioned this study.
Any comparison between systems must consider the system's performance with respect
to its desired characteristics. According to an engineering problem statement written by the
Department of Systems Engineering (2000) at USMA for PEO Soldier, the main STMS
characteristics are mission capability, survivability, and trustworthiness. Mission capability and
survivability further consist of lethality, mobility, protection, communications, and situational
awareness. Trustworthiness consists of reliability, availability, maintainability, sustainability,
and usability. The simulation model, then, must provide measures of the system's performance
in terms of those characteristics.
Measures used to evaluate the predicted outcomes for one or more characteristics are
called measures of effectiveness (MoEs). For instance, a common MoE used to evaluate
lethality is the total number of enemy kills. Thus, a higher total number of enemy kills
represents a higher degree of lethality. That MoE may depend upon a large number of measures
of performance (MoPs). MoPs are lower-level measures that quantify the performance of a
specific piece of equipment or human task. Using the lethality example, the total number of
enemy kills may be a function of the following MoPs: weapon rate of fire, accuracy, reliability,
human aiming error, target location error, etc. It is quite apparent that weapon reliability, a MoP
that directly affects the total enemy kills, also directly affects system trustworthiness. This is an
example of the interdependence that led us to decompose by function.
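The MoE-from-MoPs relationship can be made concrete with a deliberately simple lethality model. The functional form and all values are invented; note that weapon reliability appears here and would also feed a trustworthiness measure, which is exactly the interdependence at issue.

```python
def expected_enemy_kills(rate_of_fire_rpm, engagement_min,
                         weapon_reliability, p_hit, p_kill_given_hit):
    """Toy MoE: expected enemy kills as a product of the underlying MoPs."""
    shots_fired = rate_of_fire_rpm * engagement_min * weapon_reliability
    return shots_fired * p_hit * p_kill_given_hit

# 10 rds/min for 2 min at 90% reliability, 50% hit and 50% kill-given-hit:
kills = expected_enemy_kills(10, 2, 0.9, 0.5, 0.5)  # 4.5 expected kills
```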
Interestingly, our combination of functional decomposition with input-output analyses
actually improved our understanding of the desired performance outputs (or MoEs). By
identifying the inputs and outputs of every function, we were also identifying MoPs. Since those
MoPs directly affect MoEs, we were able to identify unexpected sources of performance
contribution that we would have missed using other methods. Thus, for a comparative analysis,
our results give PEO Soldier a clearer picture of how their individual systems contribute to the
effectiveness of the soldier system of systems. The following summarizes our requirements. For
the detailed functional requirements document we developed, see Appendix C.
Attributes
As we mentioned in the methodology section, a simulation must represent the soldier
entity using a complete set of attributes that affect the entity's performance of a function (or
transformation of inputs to outputs). These attributes can act as inputs or controls. For example,
attributes acting as controls may be rule sets for making a decision, a general knowledge base
drawn upon by cognitive processes, or physical constraints affecting performance. Additionally,
these attributes can be affected or changed by the process itself. For instance, movement can
reduce the soldier's energy level, operations can increase his experience level, and equipment
damage can change its physical and performance characteristics. Attributes can be entered
directly or be fed by engineering level simulations.
We group the soldier attributes into three categories - mission, personal, and equipment.
Mission attributes reflect the soldier's knowledge about his mission and how he is expected to
accomplish that mission. They generally apply to all soldiers in the unit. Personal attributes
reflect characteristics of the soldier himself, and may, or may not, vary from soldier to soldier.
While the simulation may take such data from engineering level models of human performance,
these attributes should still factor into the mission level simulation by affecting soldier
performance. Personal attribute subcategories include physical, physiological, psychological,
mental, and readiness.
Equipment attributes reflect characteristics of the equipment, weapons, or clothing worn
by the soldier. They may differ by soldier, depending on the type of equipment he is carrying,
but are normally constant for a particular piece of equipment. The actual attributes greatly
depend on the specific type and model of equipment being represented. These include weapons
and ammunition, sensors, communications, clothing, and other equipment. It is these attributes
that PEO Soldier would be interested in altering to reflect different types of equipment.
Therefore, these attributes must be modeled explicitly.
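The three categories, with the personal subcategories, might be organized as nested records. Every name and value below is invented, and the energy-drain rate stands in for data an engineering-level model would supply.

```python
soldier = {
    "mission":  {"objective": "secure building", "ROE": "weapons tight"},
    "personal": {
        "physical":      {"load_kg": 32.0},
        "physiological": {"energy": 100.0},
        "psychological": {"stress": 0.2},
        "mental":        {"training_level": "expert"},
        "readiness":     {"missions_completed": 4},
    },
    "equipment": {"weapon": {"type": "M4", "reliability": 0.98}},
}

def apply_movement(soldier, distance_km):
    """Attributes are changed by the processes themselves: here, movement
    drains energy at a rate scaled by the carried load (rate is invented)."""
    load = soldier["personal"]["physical"]["load_kg"]
    soldier["personal"]["physiological"]["energy"] -= distance_km * (1 + load / 20)
```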
Assess the Situation
This first main function serves as a primary driver for practically all soldier decision-making and actions, and may well be the most difficult to model. Within the context of this
paper, the assessment function involves the basic aspects of what the military commonly refers
to as METT-TC analysis. Such an analysis centers on the soldier's assessment of his Mission,
Enemy situation, Terrain (referred to as environment in this paper), Troops available (such as
friendly situation or knowledge of supporting and adjacent units), Time available, and Civil
considerations. For the purposes of brevity, any further use of METT-TC directly refers to the
soldier's assessment of the situation.
Where PEO Soldier's requirements are concerned, this assess function is critical.
Not only does an effective assessment obviously enhance the soldier's situational awareness, it
has direct and indirect impacts in other areas, such as lethality and mobility (e.g., soldier
assessment of weapons/equipment needed to complete mission). In fact, many of the proposed
capability enhancements of PEO Soldier systems aim to provide the soldier with improved
means for collecting and analyzing information critical to his situational assessment.
Accordingly, any simulation that seeks to offer comparative analysis between PEO Soldier
systems must certainly model how those systems enhance or detract from the soldier's ability to
assess the situation. It is also important to recognize that the soldier's assessment phase never
truly ends. In fact, throughout the duration of a particular mission, the soldier is constantly
updating his assessment based on physical observations, encounters, and external data fed to him
through various conduits (analog or digital communications, voice/hand signals from a squadmate, etc.).
Sense and Make Sensing Decisions
On the asymmetric battlefield of today, soldiers at every level make decisions based on a
sensing of the battlefield. In fact, much, if not all, of what Infantry soldiers do on the battlefield
involves varying degrees of sensing. This includes the specific functions of searching, acquiring
and tracking targets. Accordingly, the soldier's sensed perception of the battlefield plays a
critical role in his decision processes and resulting actions.
PEO Soldier systems directly address this function by providing sensing capabilities that
affect inputs into the decision cycle. For example, PEO Soldier systems, such as thermal weapon
sights, night vision devices, GPS systems, and other video systems, enhance the soldier's ability
to detect, acquire, identify, and track potential targets. Similarly, such systems enable the soldier
to refine his METT-TC assessment more efficiently. Logically, by virtue of the enhanced
capabilities they provide, these system components will affect soldier decisions. Therefore, any
potential simulation must model those decisions and how they are affected by sensing
equipment.
PEO Soldier products also serve to enhance the soldier's physical ability to observe the
battlefield. Most are designed specifically for improving his ability to see throughout the EM
spectrum - visual, image intensification, infrared, and thermal; however, improvements to the
soldier's ability to use his other senses are probably not far into the future. In that vein, we must
consider both the soldier's natural and technologically-enhanced sensing capabilities.
Additionally, for the sake of realism, the simulation should model the soldier's ability to detect
other cues, e.g. hearing movement or weapon reports, or making observations based on
fortuitous glances that may shift his attention.
Ultimately, the simulation should reflect how a particular soldier system affects the
soldier's sensing capabilities. For example, a future system may propose a fully enclosed
helmet. Does this enhance or detract from natural sensing methods (e.g. does it impede
peripheral vision and thereby create a tunnel effect; does it dampen sound to such an extent that
soldiers are more susceptible to surprise, etc.)? Soldier systems could be differentiated based on
how they overcome such problems and to what extent they enhance natural sensing capabilities.
Engage and Make Engagement Decisions
Any simulation must explicitly model the soldier's ability to engage enemy forces, since
it is one of the soldier's primary functions. As with any function, however, the actual act of
engaging a target cannot occur without some form of decision process associated with it, no
matter how hasty. While the decisions rely heavily upon the weapons and equipment that the
soldier is carrying, the unit assets at his disposal, and his means of bringing those assets to bear,
they are also impacted by the quality of the soldier's METT-TC assessment and his sensing
decisions/actions that led to the target in the first place.
PEO Soldier has made great strides in weapon and sensor enhancements that improve
engagement decisions. As addressed under the sense function, enhanced sensory capabilities
enable the soldier to acquire, identify, and track targets more efficiently. Advanced
communications and digital equipment, coupled with GPS and laser targeting devices, will allow
the soldier to call upon networked fires with shorter response times and greater accuracy.
Such capabilities, while affecting the soldier's engagement decisions, will also have the
obvious additional impact of affecting the actual act of engagement. Products such as aiming
lights, the Integrated Laser White Light Pointer, Multi-Function Laser System, and the Sniper
Night Sight should enable the soldier to engage targets with greater accuracy, thereby
influencing such measures as the probability of kill, probability of hit, etc. Accordingly, any
simulation seeking a comparative analysis of soldier systems must effectively address both the
engagement decision cycle and the resulting engagement process.
Move and Make Movement Decisions
Any soldier system will have some impact on his ability to move, including navigating,
changing posture, and changing location through various means of movement. Accordingly, any
simulation must address the effects, positive or negative, a system has on these functions. For
example, while weapons and sensor enhancements certainly increase a soldier's lethality, all of
these additional pieces of equipment must be carried. This may have a counter-balancing effect
on lethality in that more weight translates to a more fatigued soldier. Thus, the modeling
requirements must capture the impacts of the physical weight and arrangement of the systems on
soldier movement, navigational capabilities, and signature (his detectability).
Through many of its products, PEO Soldier seeks capabilities that 1) do not physically
impede movement and 2) enhance the soldier's ability to navigate. For example, navigational
decisions are aided by GPS equipment, the Helmet Mounted Display (HMD), or the
Commander's Digital Assistant (CDA). For comparative analysis, a simulation must represent
the benefits of those types of equipment in conjunction with comparative cases (e.g.,
navigational errors caused by the use of only a map and a compass). On a smaller scale, these
decisions encompass soldier movement towards cover and concealment.
As previously discussed, the soldier's continual METT-TC assessment triggers the
decide/act cycles associated with these functions. The information shared via the CDA, HMD,
and communication equipment can aid the soldier in making movement decisions, thereby
necessitating representation. Again, the simulation should model any mistakes made in the
absence of these devices, as well as the choice of movement methods to avoid detection (slower,
crouched, deliberate, etc.).
The actual physical functions of moving from one location to another are not directly
aided by current PEO Soldier programs, with the exception of some climbing aids for urban
operations (UO). However, future soldier equipment could feasibly include exoskeletons and
other muscular aids that would serve to enhance basic human movements and strength.
Regardless, the weight of the equipment carried and worn by the soldier affects these functions.
Consequently, in order to differentiate between soldier tactical mission systems (STMS), the
simulation should model the impacts of equipment on soldier movement rates, degrees of motion
on his joints, limitations in his fine motor skills, etc.
Lastly, the soldier's selection of a particular posture (standing, kneeling, lying down) has
a great impact on the outcome of an engagement, both in terms of the accuracy of the firer and
the exposure of the target, as well as the exposure of the firer to returned fires. The obvious
impacts of such a choice, coupled with the fact that current PEO systems afford soldiers the
ability to fire from a reduced exposure position, require representation in any simulation.
Communicate and Make Communications Decisions
The ability to communicate serves as a vital battlefield multiplier for the soldier since it
enhances situational awareness, lethality, and protection. Accordingly, the simulation model
must consider communication capabilities above, below, and lateral to the soldier. It must
consider issues like integration and interoperability with other battlefield digital systems, both
mounted and dismounted. It should model enhancements in range and non line-of-sight (NLOS)
capabilities. In the case of Land Warrior, communications include not only radios, but the HMD
and CDA.
Communication is critically important to current and future soldier systems. In addition
to the ability to transmit voice data, soldiers can transmit digital data as well, allowing for a
greater exchange of information. Much of the information sent and received by personnel and
devices (e.g., information transmitted via satellite or robot) supports soldier decisions on the
battlefield and thereby influences his actions. Therefore, it is important that the simulation
capture these decisions and functions in order to represent the soldier accurately.
Transmitting includes all types of communication such as verbal, hand-signaling, writing,
typing information into a CDA, etc. While a plethora of communication modes creates
redundancy, the devices must compete for bandwidth. Bandwidth constraints or system
overload translate to lost information and degraded situational awareness. The simulation must
represent these transmissions and the associated impacts on the soldier. The simulation must
also model communication with soldier-controlled systems like unmanned aerial vehicles
(UAVs) and robots, as well as the time required to communicate (e.g., time required to type and
send a written order or graphics).
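The contention effect can be sketched minimally by serving messages in priority order against a fixed per-window budget; all sizes, priorities, and the capacity below are invented.

```python
def transmit(messages, capacity_kbits):
    """Toy bandwidth model: messages compete for a fixed budget; whatever
    exceeds it is lost, degrading situational awareness."""
    delivered, lost, budget = [], [], capacity_kbits
    for msg in sorted(messages, key=lambda m: m["priority"]):
        if msg["kbits"] <= budget:
            budget -= msg["kbits"]
            delivered.append(msg["id"])
        else:
            lost.append(msg["id"])
    return delivered, lost

sent = [{"id": "voice",    "priority": 0, "kbits": 16},
        {"id": "video",    "priority": 2, "kbits": 512},
        {"id": "position", "priority": 1, "kbits": 4}]
delivered, lost = transmit(sent, capacity_kbits=64)
# delivered: ["voice", "position"]; lost: ["video"]
```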
As with transmissions, receiving communications from other soldiers or data devices is
critical to the success of PEO Soldier-equipped entities. It cannot be assumed that all
information available to the soldier will be received. Therefore, the distinction between what is
received by the soldier and what is not is critical for simulation.
Enable and Make Enabling Decisions
Enabling functions reflect the soldier's ability to operate in his surroundings and perform
common tasks critical to survival and the performance of the other four main functions. Thus,
these actions enable the soldier to engage, move, communicate, and sense, either directly or
indirectly, as well as operate in the basic human sense.
In the course of battlefield operations, the soldier will engage in various decide/act cycles
that support one or more other functions. Such cycles include the altering of terrain, load
manipulation, and basic human operation. In choosing to alter terrain, the soldier acts either
defensively to counter a threat or offensively to gain advantage. These alterations might include
digging a fighting position to enhance protection or clearing fields of fire to enhance his ability
to engage. If moving, he may decide to breach through or move an obstacle in his path by
cutting wire, removing mines, etc. These decisions are especially important in urban operations
(UOs), in which soldiers must open doors and windows, move furniture out of their way, etc.
Because any STMS might positively or negatively impact this ability, the simulation must model
it.
Similarly, the soldier will make decisions concerning his load based on mission necessity.
For instance, upon making contact, the soldier will probably drop his rucksack until the
conclusion of the engagement in order to enhance his agility and mobility (thereby enhancing
lethality). Likewise, when a unit moves into a pre-assault position, they may leave their
rucksacks and unnecessary equipment behind, under guard, until the mission is complete. Other
load manipulations might include choosing to pick up an enemy weapon if ammunition is low, or
shifting equipment around to make some available for use (e.g., pulling equipment from the
rucksack or putting unnecessary equipment into it). Any system will affect the soldier's ability
to perform these necessary functions and will likely come with various configuration options.
Therefore, at a minimum, the most basic and common of these decisions should be modeled in a
simulation.
Other enabling functions include conducting bodily functions (eating, drinking, and
sleeping), performing first aid to self and others, and performing equipment maintenance,
resupply, and repair. These functions affect how the soldier and his equipment operate and so
require representation, to some degree, in the simulation model.
Requirements for Other-Than-Soldier Representation
The previous section discussed our decomposition of the functions of the Infantry soldier,
since that was the focus of our effort. However, our requirements would be incomplete if we did
not mention other aspects of the simulation model that must be considered. In some cases, the
functional requirements imply a modeling capability that should be expounded upon due to its
importance. In other cases, we integrate requirements of numerous functions into a cohesive
topic.
Representation of the Environment
Representation of the environment is a tremendously complex issue. Our methodology
provides a means to determine environmental requirements by identifying inputs and outputs of
each function. If some aspect of the environment is an input into a function, then it should be
represented to model that function accurately. Additionally, if an output of a function is an effect
on the environment, the simulation should model that as well. The requirements and discussion
here only touch upon the more important aspects of this representation as they relate to the
requirements of PEO Soldier.
At the highest level, the simulation must be capable of representing any type of
environment in which the soldier might operate - urban, desert, jungle, swamp, forest, plains,
mountains, arctic, and littoral. Within the environment, the simulation must model various
aspects of the terrain. Terrain relief, combined with the weight of the soldier's equipment, will
affect his ability to move. Relief also impacts whether one entity can see and, in the case of
direct fire weapons, engage other entities. The model should represent the effects of vegetation
on round, fragment, and shrapnel trajectories. Additionally, soldiers seek cover from enemy fire
behind trees, and concealment from detection behind various types of vegetation. Therefore,
those aspects should be modeled. However, the simulation need not necessarily represent each
individual plant explicitly. Instead, a random draw to determine whether the soldier can find
such cover or concealment can occur based on the type of environment. This implies that the
model does not require 3-D representation.
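The random-draw approach suggested above might look like the following, with the per-environment probabilities purely assumed:

```python
import random

# Assumed probabilities of finding cover/concealment, by environment type:
COVER_P = {"forest": 0.80, "urban": 0.70, "desert": 0.10}

def finds_cover(environment, rng):
    """Single draw against an environment-dependent probability, instead of
    representing each plant explicitly."""
    return rng.random() < COVER_P.get(environment, 0.30)

rng = random.Random(42)  # seeded so runs are reproducible
```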
One of the key aspects of terrain that must be modeled is urban terrain, as statistics point
to an increased likelihood of military operations in that type of environment. The accurate
modeling of structures is critical for assessing the effectiveness of any system in an urban
environment. The structure models should be able to represent interior and exterior
characteristics, with multiple rooms, multiple floors, construction material properties, windows,
doors, and furniture. It should also have attributes that allow the assessment of weapons effects
on the various components of the structure. It should affect the soldier's ability to communicate
within and between buildings. Additionally, the simulation must model other urban features:
vehicles, infrastructure (electric and phone cables; poles; gas, sewer, and water lines; etc.), paved
areas, businesses, and general urban layout (roads, alleys, blocks, industrial parks, yards and
fences, etc.).
Another major aspect of the environment is the climate, which can have a significant
impact on soldier and equipment performance. Weather conditions (e.g., temperature, humidity,
pressure, wind, and precipitation) affect equipment reliability and performance, as well as the
soldier's ability to perform tasks. Light conditions affect his ability to sense surroundings. Man-made conditions, such as battlefield obscurants like smoke and dust, chemical and biological
contamination, and illumination devices also have a tremendous impact on the soldier and so
require representation.
The simulation environment should be dynamic. This requirement reflects the ability of
the simulation to alter the terrain and climate during a single run and would account for the
effects of the soldier and his weapons, such as blast craters, damage to structures, fire damage,
and changes to vegetation from deliberate soldier action. Dynamic climate allows for changes
as the day progresses (e.g., temperature, humidity, and barometric changes) and changes due to
the effects of the soldier and his weapons.
Representation of Other Entities
We did not go into the same detail for other types of entities as that discussed for the
representation of the individual soldier. For the required simulation capability, we are concerned
primarily with the representation of the soldier. Therefore, models only need represent other
entities to the degree that the soldier will observe or interact with them.
For instance, there may be less need to represent an artillery piece explicitly, only the
fires request process, the incoming rounds, and their effects. However, for a tank, the simulation
may have to model its physical and vulnerability characteristics, its capabilities, and a realistic
portrayal of its behavior. The same is true of aircraft, personnel carriers, trucks, and other
systems the soldier may physically encounter on the battlefield.
The simulation must also represent higher headquarters and lateral units, but only to the
degree necessary for communications and directives purposes. For instance, if the platoon leader
is attempting to communicate with his company commander on the company net, then the traffic
from all company elements on that net should be simulated to ensure a realistic representation of
delay. Other representations of higher headquarters might include the ability to attach company
mortars or an extra squad, for example.
Representation of Network-Centric Warfare
Network-centric warfare is implied in the discussions of many of the soldier functions;
however, we will discuss it here as one integrated topic, focusing on its effects on the target
engagement process. Clearly, this characteristic of warfare must be represented in any
simulation that might be used to evaluate future soldier systems.
The target engagement process can be broken down into five distinct functions, called
battlefield information functions, which are search/detect, identify, track/target, engage, and
assess. The responsibility for the performance of these functions is shifting away from the
individual soldier to a host of systems distributed throughout the battlefield. Thus, sensors may
search/detect, identify, and track/target potential enemy targets, engagement logic may trigger an
unmanned weapons platform to engage, and sensors may assess the effects of the engagement
(Kwinn, 2001).
The soldier may fill any of the battlefield information function roles as either a sensing or
engagement platform; however, he may no longer fill all of the roles. Thus, the discussion of
soldier functions does not alone capture the network-centric process. While the soldier's core
functions may capture his role in that process, the simulation must account for the digital transfer
of information between the soldier and other network platforms and how that information affects
the functions of the soldier and those platforms.
Representation of System Reliability and Power Requirements
Technological advances in equipment invariably create increased power requirements,
integration issues, and special maintenance and repair issues. Hence, it becomes necessary to
model equipment reliability and power systems.
The modeling of reliability should account for the failure rates of each of the components
and how the various potential component failures would affect the system as a whole. This does
not require explicit modeling but perhaps an association with probability functions for potential
system errors and probabilistic estimations of the repair times required for those failures.
Likewise, the system failures should affect the soldier's ability to perform certain functions and
require the soldier to switch to an alternate method of performing that function, if available.
The simulation should represent power requirements based on the mission, power load,
and power source capacity, as well as the ability to resupply. Furthermore, the modeled power
requirements coupled with reliability should account for the effects of the environment on system attributes.
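The reliability and power representations described above can be sketched as a simple stochastic model. The component names, mean-time-between-failure values, and power draws below are hypothetical placeholders chosen for illustration, not data from this report.

```python
import random

# Hypothetical component parameters (illustrative values only):
# mean time between failures (hours) and steady-state power draw (watts).
COMPONENTS = {
    "radio":   {"mtbf_hours": 400.0,  "power_watts": 4.0},
    "display": {"mtbf_hours": 900.0,  "power_watts": 2.5},
    "gps":     {"mtbf_hours": 1200.0, "power_watts": 1.0},
}

def sample_failures(mission_hours, rng):
    """Return the set of components that fail during the mission,
    drawing an exponential failure time from each component's MTBF."""
    failed = set()
    for name, params in COMPONENTS.items():
        time_to_failure = rng.expovariate(1.0 / params["mtbf_hours"])
        if time_to_failure <= mission_hours:
            failed.add(name)
    return failed

def power_margin(mission_hours, battery_watt_hours):
    """Energy remaining after powering all components for the mission;
    a negative margin means the power source cannot sustain the load."""
    load = sum(p["power_watts"] for p in COMPONENTS.values())
    return battery_watt_hours - load * mission_hours

rng = random.Random(42)
failed = sample_failures(mission_hours=12.0, rng=rng)
margin = power_margin(mission_hours=12.0, battery_watt_hours=140.0)
```

In a full simulation, each sampled failure would disable the affected soldier function and force a switch to an alternate method, as the text above requires.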
Representation of Weapons and Ammunition Effects
While our functional discussion implies this type of representation, it merits special
mention. In short, the simulation must represent all types of weapons and ammunition that the
soldier may carry or encounter on the battlefield.
For direct fire weapons, the simulation should represent kinetic energy weapons, non-lethal weapons, electromagnetic energy weapons, and other types of weapons delivered via
soldier, vehicle, or aircraft-mounted platforms. The model should consider area and point firing,
as well as the various firing modes (single shot, burst, and fully automatic). Similarly, it must
include all types of direct fire ammunition (to include non-lethal types).
Indirect fire weapons necessitate similar representation. The model should represent
lethal and non-lethal weapons delivered via soldier, towed, vehicle, or aircraft-mounted
platforms and should accurately depict the characteristics of the rounds they fire. These
characteristics include the particular type of round (high explosive rounds (air burst, point
detonated, and delay), white phosphorous, illumination, smoke, smart munitions, etc.), as well as
the time to fire, time of flight, and round adjustment requirements. Likewise, the model should
account for chemical and biological weapons, their means of delivery, and their effects on the
environment and the soldier.
In conjunction with the actual direct and indirect fire systems, the simulation should also
model their key firing characteristics (either explicitly or implicitly). Important direct fire
parameters include maximum effective range, rates of fire, bias (variable and fixed), random
error, and probabilities of hit, kill, and incapacitation for all possible weapon-munition-target
groupings. These must also account for all weapon-sensor pairings, as they will affect the
aforementioned probabilities. Important indirect fire parameters include range, lethal radius,
ballistic error, dispersion, aim error, target location error, and probabilities of kill and
incapacitation due to fragments and blast effects.
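The weapon-munition-target groupings described above are naturally represented as lookup tables keyed by pairing and range band. The weapon name, range bands, and probabilities below are invented placeholders to show the mechanism, not doctrinal data.

```python
from bisect import bisect_left

# Hypothetical P(hit) table for one weapon-munition-target grouping.
# Each range band extends up to and including its bound (meters).
PH_TABLES = {
    ("M4", "ball", "standing_soldier"): ([50, 150, 300, 500],
                                         [0.90, 0.75, 0.50, 0.20]),
}

def p_hit(weapon, munition, target, range_m):
    """Look up P(hit) for the range band containing range_m;
    beyond the last band the probability is treated as zero."""
    bounds, probs = PH_TABLES[(weapon, munition, target)]
    i = bisect_left(bounds, range_m)
    return probs[i] if i < len(probs) else 0.0
```

For example, `p_hit("M4", "ball", "standing_soldier", 25)` falls in the first band and returns 0.90, while any range beyond 500 m returns 0.0. Weapon-sensor pairings could be handled by extending the key with a sensor field.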
The representation of weapons, ammunition, and explosives must include their effects
on targets (humans with various levels/types of protection, structures, vehicles, vegetation,
terrain, other objects). Such representation should include effects based on the part of the target
struck and the level of protection in that area. Injuries should be affected by treatment, time, and
the environment. These effects include not only the effects of hitting the target, but also
suppressive effects on personnel nearby (varied suppression duration and level based on the
ammunition characteristics, the soldier's protection, and his state of mind).
Again, the latest complete version of our detailed functional requirements document can
be found in Appendix C.
2.1.8. Revised Problem Statement.
Based upon our needs analysis, the following is our revised problem statement, or
effective need:
Identify and/or develop tactical combat simulation capability for Light Infantry
missions at the level of Platoon and below with resolution down to the individual
Soldier. The simulation capability must accept, as input, scenarios and Soldier
STMS characteristics. It must model the functions of the Soldier in a tactical
environment, and provide, as output, the measures of effectiveness (MOEs) used
to evaluate STMS. The simulation(s) will provide the analytical capability to
support Program Executive Office (PEO) Soldier decision making.
Appendix D includes our engineering problem statement, which lays out our revised
problem statement in more detail.
2.2 Value System Design.
2.2.1. Functions and Objectives.
With our initial Needs Analysis complete, we turned to value-focused thinking to provide
a system with which we could evaluate the alternative solutions we would develop later. We
started with the functional hierarchy we developed previously (Figure 5), keeping those lowest-
level functions (or sub-functions) that would allow us to differentiate between candidate
alternatives. For each of those lowest-level functions we determined objectives (defined for our
purposes as the desired direction of attainment of an evaluation consideration) based upon the
needs and desires of the stakeholders. In our process, we treat total cost as an independent
variable and therefore do not account for it in the value hierarchy. We will look at cost after all
alternatives have been scored. The functions and objectives can be seen in the completed value
hierarchy shown below in Figure 10.
[Figure omitted: the completed value hierarchy, with the root node "Simulation Support for STMS Acquisition Decision-Making" branching into the functions Simulate (LW = 0.5) and Support Decision-Making (LW = 0.5), then into their sub-functions and objectives with local weights, and finally into the evaluation measures with their scales and global weights.]
Figure 10. Value hierarchy.
2.2.2. Evaluation Measures.
Below each objective, we selected an evaluation measure that we would use to measure
the degree of attainment of that objective. Each evaluation measure consists of a name, a
complete description, its units, and whether more or less is better. Because of the subjective
nature of our system requirements, most of our evaluation measures were defined in terms of
constructed scales, each level indicating an incremental change in the attainment of that
objective. The following sections describe our evaluation measures.
2.2.2.1 Joint/Combined Arms Modeling Capability.
This evaluation measure supports the objective accurately model the Joint, combined
arms environment under the sub-function account for Joint / combined arms effectiveness. It is a
constructed scale, with integer values between 0 and 10. We defined it as a measure of the
accurate modeling of the relevant combined arms and Joint assets, to the degree required for
STMS comparative analysis (e.g., autonomous calls for fire, communications interfaces with
naval gunfire (NGF), close air support (CAS), unmanned aerial vehicles (UAVs)). Therefore, a
level of 0 corresponds to no relevant Joint or combined arms assets being modeled and a level of
10 corresponds to all relevant Joint and combined arms assets being modeled. For more detail
about the scale between the endpoints, see Figure 29 in Appendix F.
2.2.2.2 STMS Modeling Capability.
This evaluation measure supports the objective accurately model the Infantry STMS
under the sub-function model STMS. It is a constructed scale, with integer values between 0 and
10. We defined it as a measure of the accurate modeling of STMS functions, including actions
and decisions, required for STMS comparative analysis. See the detailed requirements in
Appendix C for more information about the soldier functions (and their inputs and outputs) that
should be modeled. Therefore, a level of 0 corresponds to no STMS functions being modeled
and a level of 10 corresponds to all STMS functions being modeled. For more detail about the
scale between the endpoints, see Figure 31 in Appendix F.
2.2.2.3 Modifiable Equipment Types.
This evaluation measure supports the objective maximize the types of STMS equipment
the user can modify under the sub-function modify equipment. It is a constructed scale, with
integer values between 0 and 10. We defined it as a measure of the amount of STMS equipment
that can be changed by the user for comparative analyses. Implicit in that definition is the
requirement that the STMS equipment be represented in the first place. See the PEO Soldier
Portfolio (2003) for more detail about the types of equipment that must be modeled and
modified. Therefore, a level of 0 corresponds to no STMS equipment being modifiable and a
level of 10 corresponds to all STMS equipment being modifiable. For more detail about the
scale between the endpoints, see Figure 33 in Appendix F.
2.2.2.4 User Control over Conditions.
This evaluation measure supports the objective maximize the user's potential for control
over conditions under the sub-function modify conditions. It is a constructed scale, with integer
values between 0 and 10. We defined it as a measure of the amount of control the user has over
conditions to the degree required for comparative analysis, to include dynamic weather and
terrain conditions. Therefore, a level of 0 corresponds to no control over conditions and a level
of 10 corresponds to complete control over conditions. For more detail about the scale between
the endpoints, see Figure 35 in Appendix F.
2.2.2.5 Modify TTPs.
This evaluation measure supports the objective maximize the user's ability to modify
tactics, techniques, and procedures (TTPs) under the sub-function modify TTPs. It is a
constructed scale, with integer values between 0 and 3. We defined it as a measure of the user's
ability to modify the TTPs within the simulation. Therefore, a level of 0 corresponds to no
modification capability and a level of 3 corresponds to direct user modification (e.g., via decision
tables or behavior composition GUI). For more detail about the scale between the endpoints, see
Figure 37 in Appendix F.
2.2.2.6 PEO Soldier Control.
This evaluation measure supports the objective minimize the risk to achieving required
capabilities supporting the objective minimize risk under the function support decision making.
It is a constructed scale, with integer values between 0 and 10. We defined it as a measure of the
amount of control ("seat at the table") PEO Soldier will have for the given alternative with
regards to system development and future modifications. Therefore, a level of 0 corresponds to
no control and a level of 10 corresponds to complete control. For more detail about the scale
between the endpoints, see Figure 39 in Appendix F.
2.2.2.7 Fielding Risk.
This evaluation measure supports the objective minimize the risk that the capabilities are
not fielded supporting the objective minimize risk under the function support decision making. It
is a constructed scale, with integer values between 0 and 5. We defined it as a measure of the
risk that the simulation development will not be completed (e.g. due to budget cuts, project
failure, policy and regulation changes, etc). Therefore, a level of 0 corresponds to no risk and a
level of 5 corresponds to very high risk. For more detail about the scale between the endpoints,
see Figure 41 in Appendix F.
2.2.2.8 Time until Available.
This evaluation measure supports the objective minimize the time until the simulation is
available for use supporting the function support decision making. It is a natural scale, measured
in months. We defined it as the number of months from May, 2004, until a capability usable for
AoA purposes (to the degree that it will be developed) is available for use by PEO Soldier or its
designated representative. See Figure 43 in Appendix F.
2.2.3. Weighting.
Once the hierarchy was complete, we assigned local weights (LW) to the functions and
objectives based upon the needs and desires of our client, collected as part of our stakeholder
analysis. For this discussion, it is helpful to refer back to Figure 10, the completed value
hierarchy. Local weights indicate the value of that function or objective versus others on the
same branch and level in the hierarchy, and must sum to 1.0 (within each branch and level). For
example, the highest level functions are simulate and support decision-making. We assigned
them both a value of 0.5, indicating that they are equally important to the client. On the next
level, under simulate, we assigned account for Joint / combined arms effectiveness a value of 0.3
and model STMS effectiveness a value of 0.7, indicating the much higher value to the client of
modeling the Infantry soldier. See Table 1 for a summary of the local weights assigned to the
functions and objectives. Assigning weights from the top down is normally less accurate.
However, due to the simplicity of our hierarchy (few levels and branches), we determined that
working top-down would still provide accurate weights. Our resulting global weights confirmed
that.
We derived the global weights (GW) for each of the evaluation measures by multiplying
the local weight of each objective and function along the branch of the hierarchy from the
evaluation measure to the top. Using Table 1 below, the equivalent calculation would be to
multiply the local weights from right to left for each row in the table. The resulting value for
each row would be the global weight of the evaluation measure supporting the right-most
objective. The global weights indicate the relative stakeholder value of each of the evaluation
measures with respect to all of the others and sum to 1.0. Table 2 contains a list of the evaluation
measures, sorted by global weight. Therefore, the measures are listed by relative importance to
the client.
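The global-weight calculation just described can be expressed directly: each measure's global weight is the product of the local weights along its branch of the hierarchy, using the local weights summarized in Table 1.

```python
# Local weights along each branch of the value hierarchy (from Table 1);
# a measure's global weight is the product of the local weights on the
# path from the measure up to the root.
BRANCHES = {
    "Joint/CA Modeling Capability": [0.5, 0.3, 1.0],
    "STMS Modeling Capability":     [0.5, 0.7, 0.7, 1.0],
    "Modifiable Equipment Types":   [0.5, 0.7, 0.1, 1.0],
    "User Control over Conditions": [0.5, 0.7, 0.1, 1.0],
    "TTP Modifiability":            [0.5, 0.7, 0.1, 1.0],
    "PEO Soldier Control":          [0.5, 0.6, 0.5],
    "Fielding Risk":                [0.5, 0.6, 0.5],
    "Time until Available":         [0.5, 0.4],
}

def global_weights(branches):
    """Multiply local weights down each branch; round for presentation."""
    weights = {}
    for measure, local_weights in branches.items():
        product = 1.0
        for w in local_weights:
            product *= w
        weights[measure] = round(product, 3)
    return weights

gw = global_weights(BRANCHES)
# e.g., gw["STMS Modeling Capability"] == 0.245, and the weights sum to 1.0
```

This reproduces the global weights in Table 2 and provides a quick consistency check: the products must sum to 1.0 across all measures.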
Function (LW)                 | Sub-function/Objective (LW)                                   | Sub-function/Objective (LW)                                    | Objective (LW)
Simulate (0.5)                | Account for Joint/CA Effectiveness (0.3)                      | Accurately Model the Joint/CA Environment (1.0)                |
                              | Model STMS Effectiveness (0.7)                                | Model STMS (0.7)                                               | Accurately Model the Infantry STMS (1.0)
                              |                                                               | Modify Equipment (0.1)                                         | Maximize the Types of STMS Equipment the User Can Modify (1.0)
                              |                                                               | Modify Conditions (0.1)                                        | Maximize the User's Potential for Control over Conditions (1.0)
                              |                                                               | Modify TTPs (0.1)                                              | Maximize the User's Ability to Modify TTPs (1.0)
Support Decision-Making (0.5) | Minimize Risk (0.6)                                           | Minimize the Risk to Achieving Required Capabilities (0.5)     |
                              |                                                               | Minimize the Risk That the Capabilities Are Not Fielded (0.5)  |
                              | Minimize the Time until Simulation is Available for Use (0.4) |                                                                |
Table 1. Local weights of functions and objectives.
We can now use the completed value hierarchy, shown in Figure 10, to evaluate
alternatives using multiple, and often competing, objectives or attributes. Functions in that
figure are identified by a solid line, objectives by a dotted line, and evaluation measures by a
dashed line.
Evaluation Measure           | Global Weight
STMS Modeling Capability     | 0.245
Time until Available         | 0.200
Joint/CA Modeling Capability | 0.150
PEO Soldier Control          | 0.150
Fielding Risk                | 0.150
Modifiable Equipment Types   | 0.035
User Control over Conditions | 0.035
TTP Modifiability            | 0.035
Table 2. Evaluation measures sorted by global weight.
Chapter 3:
Design and Analysis
3.1 Alternatives Generation.
3.1.1. Generation.
Our next step after initial completion of the Problem Definition phase - the most
important and resource-intensive phase - was to look at candidate solutions to the problem. We
generated a large number of alternatives from the following categories: 1) using existing
simulation capabilities; 2) using simulations under development; 3) modifying simulations under
development; 4) using a combination of the previous three categories; and, 5) creating a new
simulation capability. For existing simulations and simulations under development, we generally
considered the same simulations as those looked at by the Soldier MAWG, with some
exceptions. We considered only domestic simulations, since we received direct guidance from
our client not to consider models developed by foreign countries. The following are brief
descriptions of the alternatives we developed, by group.
3.1.1.1 Existing Simulations.
In this set of alternatives, PEO Soldier would use a currently-existing simulation. They
may attempt to influence future developments by presenting their requirements; however, they
would not provide any resources to the developers, except as required to obtain, use, and
maintain the system. The following are the alternatives we considered in this set.
Agent-based models (ABMs). We considered the following ABMs, all part of the
Marine Corps' Project Albert: MANA, Pythagoras, and Socrates. These models are closed-loop,
agent-based, time-sequenced, stochastic models. They resolve down to the individual
combatant. These models (often called distillations) represent first-order behaviors, but do not
use the complex, physics-based algorithms characteristic of other constructive simulations.
Combined Arms and Support Task Force Evaluation Model (CASTFOREM). The
proponent for this model is TRAC - White Sands Missile Range (TRAC-WSMR). It is a
closed-loop, event-sequenced, stochastic simulation. It resolves down to the individual combatant,
vehicle, or platform.
Janus. The proponent for this simulation model is TRAC-WSMR. It is a primarily
HITL (but with closed-loop execution possible), event-sequenced (with some time-sequenced
exceptions), stochastic model. It resolves down to the individual combatant, vehicle, or
platform.
Joint Conflict and Tactical Simulation (JCATS). The proponents for this simulation
are Lawrence Livermore National Laboratories (LLNL) and the Joint Warfighting Center
(JWFC). It is a primarily HITL (but with closed-loop execution possible), event-sequenced,
stochastic model. It resolves down to the individual combatant, vehicle, or platform.
One Semi-Automated Forces Testbed Baseline (OTB). The proponent for this
simulation is PEO Simulation, Training and Instrumentation (PEO-STRI). It is capable of both
HITL and closed-loop execution, event-sequenced, and stochastic. It resolves down to the
individual combatant, vehicle, or platform.
Squad Synthetic Environment (SSE). This virtual simulation is developed by the
Advanced Interactive Systems (AIS) Company and is used extensively at the Soldier Battle Lab
at Fort Benning, GA. It is a HITL, virtual simulation. It resolves down to the individual soldier.
Vector in Commander (VIC). The proponent for this model is TRAC - Fort
Leavenworth (TRAC-FLVN). VIC is a closed-loop, event-sequenced, deterministic model that
resolves down to the battalion level.
3.1.1.2 Simulations under Development.
In this set of alternatives, PEO Soldier would use the end product as it is developed.
They may attempt to influence the process along the way by presenting their requirements;
however, they would not provide any resources to the developers, except as required to obtain,
use, and maintain the system. The following are the alternatives we considered in this set.
Advanced Warfighting Simulation (AWARS). The proponent for this model is
TRAC-FLVN. AWARS is a closed-loop, event-sequenced, deterministic model currently being
developed to replace VIC. The model will resolve down to the battalion level.
Combined Arms Analysis Tool for the XXIst Century (COMBATXXI). The proponent
for this model is TRAC-WSMR. COMBATXXI is a closed-loop, event-sequenced, stochastic model
currently being developed to replace CASTFOREM. It will resolve down to the individual
combatant, vehicle, or platform.
Infantry Warrior Simulation (IWARS). The proponents for IWARS are the Natick
Soldier Center and AMSAA. IWARS is a primarily closed-loop (but capable of HITL
operation), time-sequenced with event interruptions, stochastic model currently being developed
to replace both the Natick Soldier Center's Integrated Unit Simulation System (IUSS) and
AMSAA's Infantry MOUT Simulation (AIMS). It will resolve down to the individual soldier.
One Semi-Automated Forces Objective System (OOS). The proponent for this
simulation is PEO Simulation, Training and Instrumentation (PEO-STRI). It is capable of both
HITL and closed-loop execution, event-sequenced, and stochastic. It resolves down to the
individual combatant, vehicle, or platform. It is currently being developed to replace Janus,
JCATS MOUT, and OTB.
3.1.1.3 Modifying Simulations under Development.
In this set of alternatives, PEO Soldier would provide resources to a simulation currently
under development in order to have their requirements met. The amount of resources depends
upon, among other things, the projected capability of the simulation, the architecture of the
software, the programming, operating, and maintenance costs of the simulation developer, and
the developer's willingness to implement PEO Soldier requirements. For this set, we considered
the modification of all four simulations under development already considered: AWARS,
COMBATXXI, IWARS, and OOS.
3.1.1.4 Combining and/or Modifying Existing and Developing Simulations.
This set of alternatives could encompass any combination of the previous three
categories. At this point, we are only considering combinations of the eleven simulations already
discussed. We will refine, and further define, this alternative set later.
3.1.1.5 New Simulation.
This is a single alternative in which PEO Soldier would entirely resource a simulation
capability that will meet their requirements. They would assume all responsibilities associated
with the entire lifecycle of this product.
3.1.2. Feasibility Screening.
3.1.2.1 Constraints.
Not all of the alternatives that we identified above were feasible. Our previous analyses
(primarily our stakeholder analysis) revealed constraints that any candidate solution would have
to satisfy in order to be considered further. The primary constraints that we considered (with the
source of the constraint in parentheses) were:
• the user must be able to alter STMS characteristics within the simulation (client);
• the user must be able to enter and modify a scenario into the simulation (client);
• the simulation must be at the resolution of the individual Infantry soldier (client);
• the simulation software must be high level architecture (HLA) compliant (regulatory requirement); and,
• the simulation must be capable of closed-loop (non-human-in-the-loop) execution (our team).
We determined that the last constraint was appropriate because our solution would be used
primarily for analysis. Therefore, in order to reduce the error associated with human learning
and experience, as well as the challenges of finding subjects on short notice, we required that the
simulation solution be able to run non-HITL. We did, however, leave open the possibility that in
some cases, HITL execution may be appropriate.
3.1.2.2 Feasibility Screening Matrix.
After identifying the key constraints, we had to screen out those alternatives that were
clearly infeasible. To do that, we constructed a matrix in which we compared each of our
original alternatives to each constraint. See Table 3. If an alternative failed to satisfy a
constraint, it received an "NG," indicating no-go, in the table with the reason why; otherwise, it
received a "G," indicating go. If an alternative failed to meet even one constraint, it was
eliminated from further consideration, indicated by an "NG" in the RECAP column. As a result
of this analysis, the following alternatives were eliminated.
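The go/no-go mechanics of the screening matrix amount to a conjunction over constraints. The sketch below illustrates that logic with an abridged, illustrative subset of the Table 3 results; the short constraint keys are invented names, not the report's wording.

```python
# Feasibility screening: an alternative survives only if it satisfies
# every constraint. Entries below are an illustrative subset of Table 3.
CONSTRAINTS = ["alter_stms", "enter_scenario", "hla_compliant",
               "soldier_resolution", "closed_loop"]

ALTERNATIVES = {
    "ABMs":      {"alter_stms": False, "hla_compliant": False},
    "CASTFOREM": {"hla_compliant": False},
    "SSE":       {"closed_loop": False},
    "VIC":       {"soldier_resolution": False},
    "JCATS":     {},  # satisfies all constraints
}

def screen(alternatives, constraints):
    """Return the alternatives that pass every constraint; a constraint
    not listed for an alternative is assumed satisfied (a "G")."""
    feasible = []
    for name, results in alternatives.items():
        if all(results.get(c, True) for c in constraints):
            feasible.append(name)
    return feasible

# screen(ALTERNATIVES, CONSTRAINTS) → ["JCATS"]
```

A single failed constraint eliminates the alternative, exactly as the RECAP column does.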
3.1.2.3 Eliminated Alternatives.
The elimination of an alternative does not imply that the simulation is bad. It only
indicates that, given the constraints discussed above, it cannot be a solution to our problem. In
fact, the use of many of the eliminated simulations for Infantry soldier modeling is quite
appropriate in many cases. For instance, currently only ABMs have the capability to answer
first-order behavior questions that may be critical for certain types of analyses. The SSE is a
premier simulator that can answer equipment usability questions that no other simulation can.
Therefore, elimination from our consideration here does not imply that it will not factor into the
final overall solution indirectly.
Alternative    | Alter STMS Characteristics? (Yes) | Enter Scenario? (Yes) | HLA Compliant? (Yes) | Soldier Entity Resolution? (Yes) | Aggregate to Platoon? (Yes) | Closed Loop Capable? (Yes) | RECAP
ABMs           | NG (No) | G | NG (No) | G       | G | G       | NG
CASTFOREM      | G       | G | NG (No) | G       | G | G       | NG
Janus          | G       | G | G       | G       | G | G       | G
JCATS          | G       | G | G       | G       | G | G       | G
OTB            | G       | G | G       | G       | G | G       | G
SSE            | G       | G | G       | G       | G | NG (No) | NG
VIC            | G       | G | G       | NG (No) | G | G       | NG
AWARS*         | G       | G | G       | NG (No) | G | G       | NG
COMBATXXI*     | G       | G | G       | G       | G | G       | G
IWARS*         | G       | G | G       | G       | G | G       | G
OOS*           | G       | G | G       | G       | G | G       | G
Combination    | G       | G | G       | G       | G | G       | G
New Simulation | G       | G | G       | G       | G | G       | G
* These simulations, for screening purposes, are considered together whether modified or used as is.
Table 3. Feasibility screening matrix.
ABMs. These models failed to meet two of the constraints. First, because they are
distillations, they are not intended to model the wide array of physics-based effects associated
with the combat environment. Therefore, they do not have the level of detail with regard to
STMS characteristics to allow the creation of STMS components within the simulation.
Secondly, they are not HLA compliant since they are not yet intended to federate directly with
existing combat simulations.
CASTFOREM. This model is not HLA-compliant and was eliminated as a result.
However, COMBATXXI, which is to replace CASTFOREM, was not eliminated and would be the
appropriate model to consider anyway.
SSE. The Squad Synthetic Environment is a virtual simulation and was eliminated
because it cannot execute without a human-in-the-loop.
VIC. This simulation was eliminated because it does not resolve down to the individual
soldier.
AWARS. This simulation was eliminated because it does not resolve down to the
individual soldier. Elimination of this alternative also implied the elimination of the
modification of this software as an alternative. Since the intent of this simulation is for higherlevel analyses, modifying it to meet PEO Soldier's requirements would not fit into the purpose of
the model.
3.1.3. Alternative Optimization.
The only alternative (or alternative set) that required further optimization was the
combination set. We mentioned previously that this set could include any combination and/or
modification to the eleven simulations listed in both the existing and under-development
categories. However, we narrowed down that list to one optimized alternative: the modification
of and linkage between COMBATXXI, IWARS, and OOS. In this alternative, PEO Soldier provides
resources to all three feasible simulations currently under development in order to have their
requirements met. The amount of resources depends upon, among other things, the parsing of
the requirements between the simulations, the projected capabilities of the simulations, the
architectures of the software, the programming, operating, and maintenance costs of the
simulation developers, and the developers' willingness to implement PEO Soldier requirements.
Our reasoning for considering only this alternative in the set is as follows. Five of the
original eleven simulations were eliminated in feasibility screening and were not necessary to
consider further. OOS is intended to replace Janus, JCATS MOUT, and OTB. That left three
simulations: COMBATXXI, IWARS, and OOS. Since each of them has unique characteristics, we
saw no need to narrow them down any further. Fortunately, their status as simulations under
development gives PEO Soldier a better opportunity to integrate their requirements into the
initial development of the product.
3.2 Modeling and Analysis.
With our initial list of alternatives narrowed down to only those which were feasible, we
modeled each of the alternatives to determine data for each of the evaluation measures. As
mentioned previously, most of the evaluation measures were constructed scales. Therefore, the
process of determining values was subjective. Additionally, all but three of the eleven
alternatives were either under development or did not exist at all. As a result, we had to estimate
values based upon existing documentation, projected capability, and subject-matter expertise.
Because of the subjectivity and incomplete information involved, we often had to assess values
for the alternatives by comparing them against each other, instead of using the ideal method of
assessing each alternative by itself. Therefore, the scores are more accurate when considered
relative to each other rather than independently. The following discussion describes our scoring
of the alternatives by evaluation measure. Recall the discussions in Section 2.2.2 and the figures
in Appendix F defining each of the measures. Table 4 below, the raw data matrix (RDM),
summarizes the results of the alternative modeling. Be aware that, when looking at the RDM, a
lower value may be more desirable than a higher one in some cases.
Evaluation Measure            | Janus | JCATS | OTB | CbtXXI | IWARS | OOS | Mod CbtXXI | Mod IWARS | Mod OOS | Combine | New Sim
Joint/CA Modeling Capability  |   5   |   6   |  6  |   6    |   4   |  7  |     7      |     5     |    8    |    9    |   10
STMS Modeling Capability      |   1   |   3   |  3  |   4    |   7   |  5  |     5      |     8     |    6    |    9    |   10
Modifiable Equipment Types    |   5   |   6   |  6  |   7    |   8   |  7  |     8      |     9     |    8    |    9    |   10
User Control over Conditions  |   6   |   6   |  6  |   7    |   7   |  7  |     8      |     8     |    8    |    9    |   10
TTP Modifiability             |   1   |   1   |  2  |   3    |   3   |  3  |     3      |     3     |    3    |    3    |    3
PEO Soldier Control           |   0   |   0   |  0  |   1    |   2   |  1  |     4      |     6     |    4    |    6    |   10
Fielding Risk                 |   0   |   0   |  0  |   3    |   3   |  2  |     2      |     2     |    2    |    2    |    4
Time until Available (months) |   0   |   0   |  0  |  12    |  29   | 17  |    12      |    29     |   17    |   29    |   71
Table 4. Raw data matrix (RDM).
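Because some RDM columns are "more is better" and others are "less is better," combining them requires direction-aware value functions before applying the global weights. The report's actual constructed-scale value functions are in Appendix F; the linear normalization below is only an illustrative stand-in showing the mechanics over a subset of the measures.

```python
# Illustrative additive scoring over a subset of the RDM, using a
# simplified linear value function (a sketch, not the report's method).
WEIGHTS = {"STMS Modeling": 0.245,
           "Time until Available (months)": 0.200,
           "Joint/CA Modeling": 0.150}
MORE_IS_BETTER = {"STMS Modeling": True,
                  "Time until Available (months)": False,
                  "Joint/CA Modeling": True}
SCALE_MAX = {"STMS Modeling": 10,
             "Time until Available (months)": 71,
             "Joint/CA Modeling": 10}

def partial_value(scores):
    """Weighted sum of linearly normalized scores; 'less is better'
    measures are flipped so that higher value is always better."""
    total = 0.0
    for measure, raw in scores.items():
        v = raw / SCALE_MAX[measure]
        if not MORE_IS_BETTER[measure]:
            v = 1.0 - v
        total += WEIGHTS[measure] * v
    return total

# Janus's raw scores on these three measures, taken from the RDM.
janus = partial_value({"STMS Modeling": 1,
                       "Time until Available (months)": 0,
                       "Joint/CA Modeling": 5})
```

Here Janus's zero months until availability contributes its full 0.200 weight, while its low STMS modeling score contributes little, illustrating the trade-offs the value model captures.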
3.2.1.1 Joint/Combined Arms Modeling Capability.
This evaluation measure assesses the alternative's ability to represent the Joint and
combined arms battlefield to the degree required for STMS comparative analysis. To model
alternatives on this scale, we considered the following capabilities: representing the
Joint/combined arms assets themselves in terms of their actions and their behaviors, and
representing the soldier's interaction with them (calls for fire, communications, exchange of
digital information, etc.). Existing simulations (Janus, JCATS, and OTB) as a whole represent
Joint and combined arms fairly well; however, they do not represent the interaction with the
soldier as well. In fact, those three simulations are primarily designed for human-in-the-loop
(HITL) interaction. Therefore, we assigned them all a baseline level of 5. JCATS scored
slightly higher (a value of 6) because of its use of limited behaviors for robotics control and its
use for special operations training (thereby integrating special operations assets that interact with
62
the individual special operations soldier). We scored 0TB a level of 6 because of its limited
behaviors for certain combined arms assets (e.g., standard battle drills).
For Cbt XXI, we assessed a level of 6 because that software will model many of the
interactions with Joint and combined arms assets through decision tables; however, its focus is
not at the individual soldier level and thus might not capture interactions with the soldier in
detail. On the other hand, IWARS is designed to represent the individual soldier with a high
level of fidelity, but its requirements for Joint and combined arms asset representation are much
less demanding. We scored it at a 4. OOS intends to have a large Joint and combined arms
capability and behavior composition tools. We scored it at a 7.
For the modified alternatives, we made the assumption that PEO Soldier infusion of
resources into any of the previous three simulations could raise their level by a point, giving Mod
Cbt XXI a 7, Mod IWARS a 5, and Mod OOS an 8. We scored the Combine alternative at a 9
because using a combination allows PEO Soldier to capitalize on the combined arms and Joint
strengths of Cbt XXI and OOS while representing interactions with the high-fidelity soldiers in
IWARS. Therefore, it should at least have the level of Mod OOS (8) and gain another point
through linkage. Finally, for the New Sim alternative, we assumed that PEO Soldier could have
programmed exactly what it needed into the simulation, giving it a level of 10.
3.2.1.2 STMS Modeling Capability.
This evaluation measure assesses the alternative's ability to represent the functions of the
soldier in the detail required for STMS comparative analysis. When first looking at the levels we
assigned to the various models, the reader may be surprised to see particularly low values. Keep
in mind that we want a model that can represent such items as the Commander's Digital
Assistant (CDA) and Helmet-Mounted Display (HMD) and their effects on soldier performance.
Our decomposition of the soldier revealed eleven functions of the soldier that require modeling.
Six of those fall under the decide function. Therefore, any simulation that does not model soldier
decisions, or mental processes and their effects on the soldier, started with a level of 4.5 (five
elevenths of the total ten points), and that assumes complete representation of soldier action
functions. Appendix C (our requirements document) indicates the high level of fidelity we are
seeking; therefore, scores start very low. This does not indicate a poor simulation, only that it
does not score well against the highly-detailed requirements we developed.
We scored Janus at a 1 because representation of the soldier is only at the most basic
level. It does not have behavior models and, at the soldier level, does not represent actions at a
very high fidelity. JCATS scored a 3 because of its somewhat more detailed individual soldier
representation, especially in terms of the urban operations (UO) environment and special
operations forces. We scored OTB a level of 3 because of its limited behaviors (e.g., standard
battle drills), its absorption of Infantry soldier modeling originally programmed into DISAF
(Dismounted Semi-Automated Forces), and its integration with the Squad Synthetic
Environment used for Infantry training and analysis at Fort Benning.
For Cbt XXI, we assessed a score of 4 because, while it can model behaviors through use of
decision tables, it does not model high-fidelity soldiers well, especially in a MOUT environment.
On the other hand, IWARS is designed to represent the individual soldier with a high level of
fidelity and with behaviors, and so we scored it at a 7. OOS will model the Infantry soldier,
integrating all of what OTB already has, and adding extensive behaviors. However, its projected
soldier entity capability does not appear to be at the fidelity that IWARS will provide. We gave
it a level of 5.
For the modified alternatives, we again made the assumption that PEO Soldier infusion of
resources into any of the previous three simulations could raise their level by one, giving Mod
Cbt XXI a 5, Mod IWARS an 8, and Mod OOS a 6. We scored the Combine alternative at a 9
because using a combination allows PEO Soldier to leverage the capabilities of the three
simulations. Thus, it should at least score the level of Mod IWARS (8) and gain another level
through linkage. Finally, for the New Sim alternative, we assumed that PEO Soldier could have
programmed exactly what it needed into the simulation, giving it a level of 10.
3.2.1.3 Modifiable Equipment Types.
This measure represents the user's ability to modify the equipment that the individual
soldier is carrying (or components of the STMS). The three existing simulations, whether or not
they model soldier functions in enough detail, have a fairly extensive capability to alter
equipment, giving them at least a level of 5, our baseline. We scored Janus at the baseline level
of 5 because its representation of the soldier is only at the most basic level. JCATS and OTB
scored slightly better than Janus (level of 6) because of extended capabilities programmed
through association with the Soldier Battle Lab (SBL) at Fort Benning.
For Cbt XXI, we assessed a level of 7 because it is a tool being designed for analysis and,
therefore, will have a robust capability to alter equipment for analysis purposes. OOS will
include an extensive capability to alter equipment as well, and therefore was scored a 7.
IWARS, because it is being designed for Infantry STMS analysis, should have an even greater
ability to alter equipment, especially STMS equipment, and thus scored an 8.
For the modified alternatives, we again made the assumption that PEO Soldier infusion of
resources into any of the previous three simulations could raise their level by one, giving Mod
Cbt XXI an 8, Mod IWARS a 9, and Mod OOS an 8. We scored the Combine alternative at a 9
because using a combination should at least score the level of Mod IWARS (9), but would not
quite reach the ideal level of 10. Finally, for the New Sim alternative, we again assumed that
PEO Soldier could have programmed exactly what it needed into the simulation, giving it a level
of 10.
3.2.1.4 User Control over Conditions.
This measure represents the user's ability to modify conditions, to include the scenario
and the environment. The three existing simulations allow the user to control the initial
conditions by entering the mission/scenario and altering the environmental conditions. However,
the detail of the conditions is not at the required resolution for Infantry soldier analysis. For
instance, the user cannot control environmental conditions that change rapidly in time and space
during a single run of the simulation. Therefore, we scored Janus, JCATS, and OTB at a level of
6. For the simulations under development - Cbt XXI, IWARS, and OOS - we assumed an
improvement in the modifiable resolution of the environment, giving them a level of 7. For the
modified alternatives, we again made the assumption that PEO Soldier infusion of resources into
any of the previous three simulations could raise their score by a point, giving Mod Cbt XXI, Mod
IWARS, and Mod OOS a level of 8. We scored the Combine alternative at a 9 and the New Sim
alternative a value of 10.
3.2.1.5 Modify TTPs.
For this measure, we used only four levels (0-3) to evaluate the user's ability to model
tactics, techniques, and procedures (TTPs). At the lowest level (0), no modification of TTPs is
possible. No alternatives fell into this category. The next level (1) allows modification through
scripting only. Simulations in this category have no behavior models. The only way to affect
TTPs in these is to script movement and decisions or to use HITL. Janus and JCATS fell into
this category. The next level (2) allows modification through changes in the code only. OTB,
with its limited behaviors, fell into this category. The highest level (3) indicates a capability
that allows users to modify TTPs through the use of a behavior modification tool and/or decision
tables. The remaining simulations will all be in this category.
3.2.1.6 PEO Soldier Control.
This is a measure of the amount of control that PEO Soldier will have over the
development of the simulation capability and future modifications. We assumed that, for
simulations that already exist, PEO Soldier would have no influence over their development or
modification (since this alternative does not include any type of funding by PEO Soldier).
Therefore, Janus, JCATS, and OTB all receive a value of 0. For the simulations under
development, we assumed that PEO Soldier (again, not funding any of these efforts) may have
some input, but very little control, and we therefore scored Cbt XXI and OOS a level of 1. We
scored IWARS a level of 2, because that simulation is being designed specifically with support
of PEO Soldier's acquisition programs in mind and therefore would give PEO Soldier a larger
voice.
Alternatives that include funding will naturally give PEO Soldier more influence over the
development of the simulation capability. Thus, PEO Soldier is likely to have some control
(corresponding to a level of 4) in alternatives Mod Cbt XXI and Mod OOS, still limited by the fact
that these simulation programs encompass a tremendous number of requirements, of which PEO
Soldier's part is small. Mod IWARS, on the other hand, should allow PEO Soldier significant
control (a level of 6) since support of organizations like PEO Soldier is the primary purpose of
the simulation. In the Combine alternative, PEO Soldier would still have significant control, but
no more, thus giving the alternative a level of 6. One might be inclined to reduce the value since
it includes two simulations over whose development PEO Soldier would have less control, but
the measure reflects control over the development of
requirements, not control over a specific program. Finally, if PEO Soldier developed a new
simulation, they would have complete control (a level of 10).
3.2.1.7 Fielding Risk.
Fielding risk is a measure of the risk of project failure and ranges in levels from none (0)
to very high risk (5). Since existing alternatives are already fielded, there is no fielding risk.
Thus, Janus, JCATS, and OTB have levels of 0. We assigned OOS a level of 2 (low risk)
because of the amount of resources and effort behind its development. On the other hand, we
assigned medium risk (level of 3) to Cbt XXI and IWARS because they receive significantly
lower levels of funding and had not yet released working models (whereas, at the time of this
evaluation, OOS had already released Block B).
With the infusion of resources into any existing developmental effort, we assumed that
the risk is reduced to low (a level of 2). Thus Mod Cbt XXI, Mod IWARS, Mod OOS, and
Combine all received levels of 2. We did not feel that the amount of resources PEO Soldier
might infuse into OOS (which would be a very small percentage of that program's overall
budget) would affect its risk rating in this regard. We also rated the Combine alternative at low
risk. We scored the development of a new simulation, however, at a level of 4 (high risk)
because of the inherent risks of starting a new, long-term venture. PEO Soldier would be
competing for the same programming expertise with other, already-established programs.
Additionally, while all three simulations under development have received some higher-level
support, whether implicit or explicit, any new simulation proposed by PEO Soldier would have
to gain such support in an M&S leadership environment leaning towards fewer, not more,
simulations.
3.2.1.8 Time until Available.
This measure has the only natural scale of our eight evaluation measures. As such, this
was the most objective (but not without some subjectivity). Since existing simulations are usable
now, they were given a value of 0 months. At the time of this evaluation, Cbt XXI was planning a
usable release for the Land Warrior (LW) AoA beginning in June 2005, and thus was 12 months
from release. We assumed that IWARS' second release, Version 2.0, with its completed
information-centric capability, would be the first usable version. Its release was scheduled for the
end of FY06, giving it a value of 29 months. However, we have since learned that the developers
have altered their timeline and priority of requirements to have a usable version by June 2005 for
the LW AoA. Nonetheless, we did not learn that information until our evaluation was complete.
OOS' Block D release (what we considered the first usable version for our purposes) was
scheduled for the end of FY05, giving it a value of 17 months.
For the modified alternatives - Mod Cbt XXI, Mod IWARS, and Mod OOS - we assumed that
the timelines would not change with the infusion of PEO Soldier resources, but the requirements
might. For the Combine alternative, we used the longest timeline of the three component
simulations, or 29 months. Finally, for the New Sim alternative, we used developmental
information from CASTFOREM, Combat XXI, IUSS, and IWARS to estimate the developmental
time of a new simulation. We assumed that development would not begin until the new fiscal
year (in 5 months) and would take approximately 5.5 years - the average of our four
estimates - for a total of 71 months.
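The 71-month estimate follows directly from the stated assumptions; a minimal sketch (the 5.5-year development time is the report's average of its four estimates, which are not itemized here):

```python
# New Sim availability: development starts at the new fiscal year
# (5 months out) and runs about 5.5 years, per the report's assumptions.
START_DELAY_MONTHS = 5
DEV_YEARS = 5.5  # stated average of the four developmental estimates

total_months = START_DELAY_MONTHS + int(DEV_YEARS * 12)
print(total_months)  # 71
```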
Chapter 4:
Decision Making
4.1 Alternative Scoring.
During this step, we developed value curves that would be used to convert the raw scores
determined earlier during modeling for each evaluation measure into value scores (related to, but
not the same as, utility) on a common scale. That scale ranged from 0 to 100, with 100 being the
most desirable. The value curves were based upon the information obtained during our analyses,
primarily our stakeholder analysis. For the seven constructed-scale evaluation measures, we
chose as the endpoints for our curves the lowest and highest possible levels for that scale.
Therefore, the least desirable level on that scale corresponded to a value of 0 and the most
desirable level on that scale corresponded to a value of 100. For the one natural measure, time
until available, we chose 0 months as our 100 value endpoint (since less is better), and 120
months as our 0 value endpoint (since we chose 10 years as our planning horizon).
[Figure: value curve chart - value (0 to 100) on the y-axis versus raw level (0 to 10) on the x-axis, rising in a slight s-curve.]
Figure 11. STMS modeling capability value curve.
After determining the endpoints, we used various methods to determine the shape of the
curve between the points. The resulting curves took the following shapes: linear, piecewise
linear, s-curve, and exponentially increasing. An example is shown in Figure 11 for the
evaluation measure STMS modeling capability. Based on the slight s-curve shape, we see that
changes in level at the lower and higher ends of the scale are not as valuable as changes in the
middle portion of the scale. Similar reasoning, based upon the slope of the curve, can be applied
for the other curve shapes. Appendix F shows the value curves for all evaluation measures.
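Curves of these shapes are easy to encode as lookup functions. The sketch below builds a piecewise-linear value curve from (raw level, value) knots; the knots shown are our reading of the PEO Soldier control measure, whose raw scores in Table 4 and value scores in Table 5 (raw 0, 1, 2, 4, 6, 10 mapping to values 0, 17.5, 35, 70, 80, 100) are consistent with linear interpolation between them:

```python
def piecewise_linear(knots):
    """Return a value function interpolating linearly between (raw, value) knots."""
    pts = sorted(knots)
    def value(x):
        if x <= pts[0][0]:
            return pts[0][1]
        for (x0, v0), (x1, v1) in zip(pts, pts[1:]):
            if x <= x1:
                return v0 + (v1 - v0) * (x - x0) / (x1 - x0)
        return pts[-1][1]
    return value

# Knots consistent with the PEO Soldier control entries in Tables 4 and 5.
peo_control = piecewise_linear([(0, 0), (2, 35), (4, 70), (6, 80), (10, 100)])
print(peo_control(1))   # 17.5 (the Cbt XXI and OOS raw level)
print(peo_control(6))   # 80.0 (the Mod IWARS and Combine raw level)
```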
After we constructed value scales, we used them to convert the raw data into value
scores. We then multiplied the value scores by the global weight of the evaluation measure and
summed the result for all evaluation measures to obtain a total value score for each alternative.
The decision matrix (DM) in Table 5 summarizes the conversion to value scores and the
resulting total value scores for each alternative. The number listed for each alternative and
evaluation measure corresponds to a value score. The numbers in the bottom row correspond to
the total value scores for the alternatives (value scores multiplied by the global weights and
summed for that column).
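This rollup is a standard additive value model; as a sketch (weights and value scores transcribed from Table 5), it reproduces two of the totals in the bottom row:

```python
# Global weights for the eight evaluation measures, in Table 5 order.
weights = [0.150, 0.245, 0.035, 0.035, 0.035, 0.150, 0.150, 0.200]

# Value scores (0-100) per measure, transcribed from Table 5.
scores = {
    "Janus":   [50, 6.9, 25, 69, 20, 0, 100, 100],
    "Combine": [90, 93.1, 81, 97.7, 100, 80, 70, 81.6],
}

def total_value(vals, wts):
    """Additive value model: weighted sum of the value scores."""
    return sum(v * w for v, w in zip(vals, wts))

for name, vals in scores.items():
    print(f"{name}: {total_value(vals, weights):.1f}")
# Janus: 48.2 and Combine: 84.9, matching the bottom row of Table 5
```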
Evaluation Measure              Global Weight  Janus  JCATS   OTB  Cbt XXI  IWARS   OOS  Mod Cbt XXI  Mod IWARS  Mod OOS  Combine  New Sim
Joint/CA Modeling Capability        0.150         50     60    60      60      40    70          70         50        80       90      100
STMS Modeling Capability            0.245        6.9   22.9  22.9    35.6    77.1    50          50       86.7      64.4     93.1      100
Modifiable Equipment Types          0.035         25     36    36      49      64    49          64         81        64       81      100
User Control Over Conditions        0.035         69   69.1    69      84    84.1  84.1        93.3       93.3      93.3     97.7      100
TTP Modifiability                   0.035         20     20    50     100     100   100         100        100       100      100      100
PEO Soldier Control                 0.150          0      0     0    17.5      35  17.5          70         80        70       80      100
Fielding Risk                       0.150        100    100   100      50      50    70          70         70        70       70       10
Time Until Available                0.200        100    100   100    95.7    81.6  92.3        95.7       81.6      92.3     81.6     43.9
Total Value Score                     -         48.2   54.0  55.1    55.1    62.6  62.5        71.9       77.1      76.3     84.9     75.3

Table 5. Decision matrix (DM).
4.2 Decision.
4.2.1. Alternative Comparison.
By looking at the decision matrix in Table 5, the reader can see that the alternative with
the highest total value score is Combine - the modification of and linkage between Combat XXI,
IWARS, and OOS. The next four scores (Modified IWARS, Modified OOS, New Simulation,
and Modified Combat XXI) are grouped fairly close together, but significantly behind that of
Combine. Although we mentioned it earlier in the paper during modeling, it is important to point
out again that these scores mean nothing in isolation. They are a means to compare the value
of the alternatives in terms of the stakeholders' objectives, on a relative basis. Thus, Combine,
relative to the other alternatives, best meets the objectives of the stakeholders. Figure 12 and
Figure 13 below show stacked bar graphs of the value scores within each of the two primary
subfunctions: the simulate and support decision-making functions, respectively.
Ranking for Simulate Function/Objective:

Alternative          Value
New Simulation      100.000
Combination          92.118
Modified IWARS       76.673
Modified OOS         73.390
IWARS                67.128
Modified Cbt XXI     63.50
OOS                  61.817
Cbt XXI              51.742
OTB                  40.102
JCATS                38.002
Janus                26.383

(The bars are stacked by the five simulate-function measures: STMS modeling capability, user control over conditions, Joint/CA modeling capability, TTP modifiability, and modifiable equipment types.)

Figure 12. Stacked bar graph of the value scores within the simulate function.
Ranking for Support Decision Making Function/Objective:

Alternative          Value
Modified Cbt XXI     80.286
Modified OOS         78.925
Modified IWARS       77.623
Combination          77.623
Janus                70.000
JCATS                70.000
OTB                  70.000
OOS                  63.175
Cbt XXI              58.536
IWARS                58.123
New Simulation       50.566

(The bars are stacked by the three support-decision-making measures: time until available, PEO Soldier control, and fielding risk.)

Figure 13. Stacked bar graph of the value scores within the support decision making function.
As one would expect, New Sim scores the highest within the simulate function; however, it
scores the lowest within the support decision-making function. The graphs above also show the
contribution of each of the evaluation measures to its parent function.
4.2.2. Sensitivity Analysis.
Much of the process that led to the total value scores (weighting, modeling, and value
conversions) was subjective. Therefore, we conducted sensitivity analysis on our results to
determine if our best alternative would be different given small changes to the subjective ratings.
4.2.2.1 Sensitivity of the Weights.
First, we looked at the weighting assigned to each of the evaluation measures. Figure 14
below is a screen shot from Logical Decisions for Windows showing the sensitivity of the total
value scores to the global weight of the STMS modeling capability evaluation measure.
[Figure: sensitivity graph - total value score (y-axis) versus the percent of weight on the STMS modeling capability evaluation measure (x-axis), with one line per alternative (Combination, Modified IWARS, Modified OOS, New Simulation, Modified Cbt XXI, IWARS, OOS, Cbt XXI, OTB, JCATS, Janus) and a vertical line at the original weight.]
Figure 14. Graph of the sensitivity of the global weight of STMS modeling capability.
The y-axis represents total value score and the x-axis represents the global weight of the
evaluation measure. Each line in the graph represents an alternative. The vertical line indicates
the original global weight of the evaluation measure (in this case, 0.245). At the original global
weight, the reader can see that the best alternative (highest line) is the Combine alternative. By
moving along the line representing the best alternative, one can find the point (the point of
indifference) at which another alternative scores a higher total value score. In this case, at a
global weight of about 0.68, the New Sim alternative becomes the highest scoring. Thus, the
client would have to shift his global weight of this measure to 0.68 for our recommended
alternative to change. The rule of thumb we used for such analysis is that a point of indifference
within 0.1 of the original global weight indicates sensitivity of our alternative to the global
weight of that evaluation measure. We constructed such graphs for all eight evaluation measures
(refer to Appendix G) and found that our alternative was not sensitive to the global weighting of
any of the evaluation measures.
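The crossover shown in Figure 14 can be reproduced directly: hold each alternative's value scores fixed, sweep the weight on one measure, and rescale the remaining weights proportionally so they still sum to one. A sketch using the Combine and New Sim scores from Table 5 (the proportional-rescaling convention is our assumption about how the tool handles the other weights):

```python
# Weights and value scores transcribed from Table 5.
weights = [0.150, 0.245, 0.035, 0.035, 0.035, 0.150, 0.150, 0.200]
combine = [90, 93.1, 81, 97.7, 100, 80, 70, 81.6]
new_sim = [100, 100, 100, 100, 100, 100, 10, 43.9]
SWEPT = 1  # index of STMS modeling capability (baseline global weight 0.245)

def total_at(w_swept, vals):
    """Total value with the swept measure at weight w_swept and the
    remaining weights rescaled proportionally to sum to 1 - w_swept."""
    rest = 1.0 - weights[SWEPT]
    total = w_swept * vals[SWEPT]
    for i, (v, w) in enumerate(zip(vals, weights)):
        if i != SWEPT:
            total += v * w * (1.0 - w_swept) / rest
    return total

# Step the weight up from its baseline until New Sim overtakes Combine.
w = weights[SWEPT]
while total_at(w, combine) > total_at(w, new_sim):
    w += 0.001
print(f"indifference point near w = {w:.3f}")  # about 0.68
```

The crossover lands near 0.68, consistent with the point of indifference read off the LDW graph, and far outside the 0.1 rule-of-thumb band around 0.245.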
4.2.2.2 Sensitivity of the Alternative Modeling.
We also looked at the sensitivity of the scores we assigned to the alternatives for each of
the evaluation measures during our modeling step. For this, we first eliminated six of the
alternatives from consideration (Janus, JCATS, OTB, Cbt XXI, IWARS, and OOS) using the
following analysis. We assumed that all of the scores we assigned to these six alternatives were
underestimates, and we therefore raised or lowered them (depending on which level was more
preferable) one level for all relevant evaluation measures. Even with this, the highest scoring of
the six alternatives fell 10 points below the Combine alternative. Since their total value scores
were still significantly lower than our recommended alternative's total value, we determined that
no changes, within realistic bounds, to the scores we assigned them would cause any of them to
be the highest scoring alternative.
That left five alternatives to analyze. We had already assigned the highest possible levels
to the New Sim alternative for all but two of the measures - fielding risk and time until available.
Even when we reduced the fielding risk to medium and the time until available by a year, New
Sim still fell slightly behind the Combine alternative, although by only just over a point.
However, we were fairly confident in the values we assigned originally for the two measures.
The remaining three alternatives that scored below the Combine alternative were Mod IWARS,
Mod OOS, and Mod Cbt XXI. Since the Combine alternative includes the modification of those
three simulations already, we determined that, at worst, it would score the same as those three
alternatives for the five measures under the simulate function. We applied similar reasoning for
the three measures under the support decision-making function. Only the time until available
measure seemed to have the potential for a lower score (more time) than the other three. On the
other hand, by using all three simulations in the Combine alternative, we could assume that PEO
Soldier would be increasing their control over the solution (by shifting requirements between the
simulations) and reducing their fielding risk (by not putting all of their resources into a single
simulation).
As a result of the above analysis, we determined that the recommended solution was not
sensitive to the scores we assigned during modeling (assuming realistic bounds on those scores).
4.2.2.3 Sensitivity of the Value Curves.
Our final sensitivity analysis involved the value curves. To test their sensitivity, we
determined whether the recommended alternative would change if all of the value curves
were linear (versus piecewise linear, s-curve, or increasing exponential). Figure 15 shows
the resulting total value scores with linear value curves. The difference in total value scores
between Combine and the three modification alternatives actually increases slightly. On the
other hand, the difference between Combine and New Sim decreases to about 2 points.
However, the recommended alternative still does not change. Therefore, we determined that the
recommendation is not sensitive to the value curves.
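This check is straightforward to reproduce from the raw data in Table 4. Under straight lines between each scale's endpoints (0 to 10 for most constructed scales, 0 to 3 for TTP modifiability, 0 to 5 for fielding risk, and 0 to 120 months for time, with the last two reversed since lower is better), the recomputed totals match the legible entries in Figure 15; those endpoint conventions are our inference from that match, not something the report states explicitly:

```python
# Raw scores from Table 4, in measure order:
# [Joint/CA, STMS, Equip, User control, TTP, PEO control, Risk, Time]
raw = {
    "OOS":         [7, 5, 7, 7, 3, 1, 2, 17],
    "Mod Cbt XXI": [7, 5, 8, 8, 3, 4, 2, 12],
}
weights = [0.150, 0.245, 0.035, 0.035, 0.035, 0.150, 0.150, 0.200]
scale_max = [10, 10, 10, 10, 3, 10, 5, 120]  # top of each measure's scale
less_is_better = [False] * 6 + [True, True]  # fielding risk, time until available

def linear_total(vals):
    """Total value score assuming a straight-line curve between each
    scale's endpoints (flipped for the two less-is-better measures)."""
    total = 0.0
    for v, w, m, flip in zip(vals, weights, scale_max, less_is_better):
        frac = v / m
        total += w * 100 * ((1 - frac) if flip else frac)
    return total

for name, vals in raw.items():
    print(f"{name}: {linear_total(vals):.3f}")
# OOS: 58.817 and Mod Cbt XXI: 64.850, matching Figure 15's legible entries
```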
Ranking for Simulation Support for STMS Acquisition Decision Making Function/Objective (linear value curves):

Alternative          Value
Combination          78.517
New Simulation       76.167
Modified IWARS       69.717
Modified OOS         67.967
Modified Cbt XXI     64.850
OOS                  58.817
OTB                  57.883
JCATS                56.717
IWARS                56.067
Cbt XXI              52.700
Janus                49.967

Figure 15. Total value scores assuming linear value curves.
4.2.3. Cost-benefit Analysis.
We did not account for total cost in our value hierarchy because we treat cost as an
independent variable. Therefore, after conducting sensitivity analyses, we conducted a cost-benefit
analysis. Similar to what we faced with our other modeling, the determination of costs
for alternatives that are still under development or not in existence was imprecise, at best. We
did not have the detailed information necessary to determine the cost of modifications, since
those costs would depend greatly on which requirements needed to be added to the simulation(s)
and how hard the integration of those requirements would be. Therefore, we used the
information we had available to be within order-of-magnitude accuracy. Our detailed cost
estimations can be found in Appendix G. Figure 16 shows the resulting cost-benefit graph.
Since creating a new simulation was both orders-of-magnitude more costly and provided less
benefit, it was dominated by the highest-scoring alternative.
[Figure: cost-benefit scatter plot - total benefit (total value score) on the y-axis versus cost in millions of dollars on the x-axis, with one point for each of the eleven alternatives (Janus, JCATS, OTB, Cbt XXI, IWARS, OOS, Modified Cbt XXI, Modified IWARS, Modified OOS, Combination, New Simulation).]
Figure 16. Cost-benefit graph.
Our analysis can also be viewed by cost groupings. In that view, the alternatives
generally fell into three groups: low-cost (existing simulations and simulations under
development), mid-cost (modifying and/or combining existing simulations and simulations under
development), and high-cost (developing a new simulation). The groups are labeled in Figure 16.
Again, creating a new simulation was both more costly than any member of the mid-cost
group and still provided less benefit than the highest-scoring alternative. Also, since the
Combine alternative fell within the same cost category of the closest-scoring alternatives, it
could not be ruled out due to costs. Thus, the Combine alternative remained our recommended
solution.
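Because cost is treated as an independent variable, the comparison above reduces to a dominance check. A minimal sketch, using the total value scores from Table 5 and stand-in cost-group indices (0 = low, 1 = mid, 2 = high, per the groupings in the text; the actual dollar estimates are in Appendix G and not reproduced here):

```python
# (cost_group, total_value) per alternative: cost groups are stand-ins
# for the Appendix G estimates, values are from Table 5.
alternatives = {
    "OOS":       (0, 62.5),
    "Mod IWARS": (1, 77.1),
    "Combine":   (1, 84.9),
    "New Sim":   (2, 75.3),
}

def dominates(a, b):
    """a dominates b if it costs no more and delivers at least as much
    value, with at least one strict inequality."""
    (ca, va), (cb, vb) = alternatives[a], alternatives[b]
    return ca <= cb and va >= vb and (ca < cb or va > vb)

print(dominates("Combine", "New Sim"))  # True: cheaper group, higher value
print(dominates("New Sim", "Combine"))  # False
```

With group-level costs, New Sim is dominated by Combine exactly as the text argues; within the mid-cost group, only the detailed Appendix G figures could separate the alternatives on cost.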
4.2.4. Recommendation.
We presented our results and recommendation to our client, PEO Soldier, on 14 May
2004. They agreed with our recommendation and have begun the process of generating
community buy-in and implementing the solution.
It is important to note that we are not recommending that PEO Soldier and those
conducting analyses on their products cease to use other simulations. In the short term, we must
continue to use tools that are currently in existence, many of which have tremendous strengths
and capabilities. In fact, even after this recommendation comes to fruition, there may very well
be appropriate times to use other simulations to conduct analyses within certain niches where
their strengths lie. However, as far as the investment of PEO Soldier resources to solve this
particular problem in the long-term, we make the above recommendation.
Chapter 5:
Implementation
5.1 Planning for Action.
We are now in the process of conducting joint presentations with PEO Soldier to selected
major players in the DoD analysis and Infantry soldier system acquisition communities. These
presentations will explain our methodology, results, and recommended course of action. Such
meetings will ensure that key members of the community understand PEO Soldier's needs and
how those needs are met by modifying and combining Combat XXI, IWARS, and OOS.
Additionally, we can consider and integrate suggestions from others who may have important
insights into the problem and the solution.
Once we have consensus in the community, we will move forward with implementation.
The first primary task that we must accomplish is to establish points of contact (POCs) and/or
liaisons with each of the simulation proponents in order to open lines of communication and to
facilitate the flow of information. From there, we can begin to negotiate agreements between
PEO Soldier and the simulation proponents. While such negotiations can begin in generalities,
we will need to move rapidly into specifics in order to move forward with the implementation.
Therefore, we must begin the process of converting our functional requirements into simulation
specifications that will allow simulation managers and programmers to implement PEO Soldier
requirements into their software. Integral to that process is determining how to divide the
requirements among the simulations either through direct modification or through a linkage.
Factors that might affect this parsing of requirements, especially in cases where more than one
option exists, include the planned and existing capabilities of the simulations, simulation
architectures, cost of implementing the requirement, usefulness to the simulation proponent,
basis for the requirement, implementation time, and synergy with other requirements. Thus, we
must attempt to maximize the benefit while minimizing the costs, both financial and otherwise.
5.2 Execution.
With the initial agreements complete, a set of specifications, a comprehensive plan to
implement those specifications, and initial timelines, costs, and resources determined, we will
begin execution of our recommendation. During this step, simulation proponents will integrate
PEO Soldier requirements into their software and develop architectures and interfaces that will
facilitate the linkages required for successful implementation of the solution.
5.3 Assessment and Control.
Our role during implementation (after the planning step is complete) will primarily shift
to that of monitoring progress, renegotiating agreements due to unexpected changes, quality
assessment, and supervising VV&A. Our responsibility will be to ensure not only that the
requirements are met, but also that they remain valid and updated to reflect PEO Soldier's
evolving needs.
Chapter 6:
Conclusions
PEO Soldier approached us with a challenging problem and a short suspense.
Nonetheless, our application of a systematic systems engineering approach provided us the tools
necessary to conduct a thorough analysis in a relatively short period of time. As a result, we
were able to present a robust recommendation that PEO Soldier has accepted and begun to
implement. The modification of and linkage between Combat XXI, the Infantry Warrior
Simulation (IWARS), and the OneSAF Objective System (OOS) offers the greatest
benefit in terms of simulation capability and decision support.
Each simulation has unique strengths that, when combined, should have great synergistic
effects on the resulting federation. IWARS will have tremendous strengths in terms of
individual soldier modeling, whereas OOS and Combat XXI will have robust combined arms
representations. Combat XXI will provide great analytical capability as a closed-loop simulation,
whereas OOS' greatest strength will be in HITL. IWARS, designed for both types of human
interaction, will be able to link to both by operating in either mode. OOS' environmental
runtime component, being used by all three simulations, will facilitate a detailed representation
of the environment. By being able to link to OOS, Combat^'^" and IWARS will benefit from the
numerous concurrent efforts being conducted to support OOS, such as the work of the UO
FACT. Thus, where one simulation may have weaknesses, another simulation has strengths.
Finally, PEO Soldier support for efforts already underway conserves scarce resources and
leverages valuable work already invested. Their support benefits the simulation proponents, as
well, by adding value to their efforts and expanding their application areas. Clearly, this
recommendation will benefit all involved.
Bibliography
Modeling and Simulation Policy and Guidelines
Army Model and Simulation Office, DAMO-ZS, "Army Model and Simulation Office
(AMSO) Position on the Simulation Support Plan," Memorandum, dtd 18
September 2002a.
Army Model and Simulation Office, Planning Guidelines for Simulation and Modeling
for Acquisition, Requirements, and Training, Change 1, 20 September 2002b.
Army Model and Simulation Office, Simulation and Modeling for Acquisition,
Requirements, and Training: Execution Plan, 06 November 2000.
Army Model and Simulation Office, Simulation and Modeling for Acquisition,
Requirements and Training (SMART) Reference Guide, April 2001.
Assistant Secretary of the Army (Research, Development and Acquisition), SARD-DO,
"Modeling and Simulation (M&S) Support of the Army Acquisition Process,"
Memorandum, dtd 20 September 1996.
Department of Defense, DoD Instruction 5000.61: DoD Modeling and Simulation (M&S)
Verification, Validation, and Accreditation (VV&A), 13 May 2003.
Department of the Army, Army Regulation 5-11: Management of Army Models and
Simulations, 10 July 1997.
Department of the Army, Army Regulation 5-5: Army Studies and Analyses, 30 June
1996.
Department of the Army, DA Memo 5-15: Management of Army Models and Simulations:
Army High Level Architecture (HLA) Implementation Procedures, 01 November
2002.
Department of the Army, Department of the Army Pamphlet 5-11: Verification,
Validation, and Accreditation of Army Models and Simulations, 30 September
1999.
Department of the Army, Department of the Army Pamphlet 5-5: Guidance for Army
Study Sponsors, Sponsor's Study Directors, Study Advisory Groups, and
Contracting Officer Representatives, 01 November 1996.
Department of the Army, Department of the Army Pamphlet 5-XX: Simulation Support
Planning and Plans, 11 July 2003.
Deputy Under Secretary of the Army (Operations Research), "Army Vision and Goals for
Simulation and Modeling for Acquisition, Requirements and Training (SMART),"
Memorandum, dtd 03 November 1999.
Office of the Deputy Chief of Staff for Operations and Plans and the Office of the Deputy
Under Secretary of the Army (Operations Research), The Army Model and
Simulation Master Plan, Army Model and Simulation Office, October 1997.
Office of the Deputy Chief of Staff, Army G-3, and the Office of the Deputy Under
Secretary of the Army (Operations Research), United States Army Model and
Simulation (M&S) Processes and Procedures - DRAFT, Army Model and
Simulation Office, May 2004.
Office of the Deputy Chief of Staff, G-3, and the Office of the Deputy Under Secretary of
the Army (Operations Research), Army Modeling and Simulation Roadmap—
DRAFT, February 2003.
US Army Training and Doctrine Command, ATCD-EM, "Simulation Support Plan,"
Memorandum, dtd 20 January 2003.
US Army Training and Doctrine Command, TRADOC Regulation 11-8: TRADOC
Studies and Analyses, 31 July 1991.
US Army Training and Doctrine Command, TRADOC Regulation 5-11: U.S. Army
Training and Doctrine Command (TRADOC) Models and Simulations (M&S) and
Data Management, 16 November 1998.
US Army Training and Doctrine Command, TRADOC Regulation 5-3: The U.S Army
Training and Doctrine Command (TRADOC) Study Program, 04 March 1991.
US Army Training and Doctrine Command, TRADOC Regulation 71-4: TRADOC
Scenarios for Combat Development, 02 January 2001.
Acquisition Policy and Guidelines
Army Model and Simulation Office, PowerPoint slide, "M&S Review Criteria for
ORD/CDD/CPD."
Assistant Deputy Chief of Staff for Developments, ATCD-ZC, "Development and
Approval of Army Warfighting Requirements," Memorandum, dtd 31 May 2002.
Assistant Secretary of the Army (Acquisition, Logistics and Technology) and US Army
Materiel Command, SAAL-ZR, "Policy for Standardization of Collaborative
Environments for Weapons System Acquisition Programs," Memorandum, dtd 26
August 2003.
Chairman of the Joint Chiefs of Staff, CJCSI 3170.01C: Joint Capabilities Integration
and Development System, 24 June 2003.
Chairman of the Joint Chiefs of Staff, CJCSM 3170.01: Operation of the Joint
Capabilities Integration and Development System, 24 June 2003.
Department of Defense, DoD Architecture Framework, Version 1.0, Volume I: Definitions
and Guidelines, 15 August 2003.
Department of Defense, DoD Architecture Framework, Version 1.0, Volume II: Product
Descriptions, 15 August 2003.
Department of Defense, DoD Directive 5000.1: The Defense Acquisition System, 12 May
2003.
Department of Defense, DoD Instruction 5000.2: Operation of the Defense Acquisition
System, 12 May 2003.
Department of Defense, Interim Defense Acquisition Guidebook, 30 October 2002.
Department of the Army, Army Regulation 70-1: Army Acquisition Policy, 31 December
2003.
US Army Training and Doctrine Command, Guide for Development of Army Initial
Capabilities Documents (ICDs), 22 October 2003.
US Army Training and Doctrine Command, Guide for Development of Army Capability
Development Documents (CDD)—DRAFT, 20 January 2004.
US Army Training and Doctrine Command, TRADOC Pamphlet 71-9: Requirements
Determination, 05 November 1999.
US Army Training and Doctrine Command, TRADOC Regulation 71-12: TRADOC
System Management, 01 March 2002.
PEO Soldier Related Documents
Booz-Allen & Hamilton, Inc., PM Soldier System: Land Warrior Simulation Support
Plan, Draft Version .7.6, November 2000.
Department of the Army, Force Development, "Analyses Supporting Land Warrior (LW)
Milestone (MS) C Decision for Blocks I & II," Memorandum, dtd 16 January
2003.
Department of the Army, Force Development, "Update to Requirements for Analyses
Supporting Milestone C Decision for Land Warrior Stryker Interoperable and
Special Operations Forces," Memorandum, Draft.
Larimer, Larry (TRAC-WSMR), "Land Warrior Block II Analysis Study Plan," Draft
Briefing to the Land Warrior Study Advisory Group (SAG), 10 March 2004.
Operational Requirements Document for Land Warrior, ACAT I, Prepared for MS III
Decision, 31 October 2001.
PEO Soldier, PEO Soldier Portfolio 2003: America's Most Deployed Combat System,
2003.
TRADOC System Manager - Soldier, Land Warrior Program Studies and Analyses
Summary.
US Army Materiel Systems Analysis Activity, "A Review of the Land Warrior
Performance & Effectiveness Analyses Tasks," White Paper, 13 August 2001.
US Army Materiel Systems Analysis Activity, Land Warrior Performance Analysis,
Technical Report No. TR-639, September 1996.
Methodology
Department of Commerce, National Institute of Standards and Technology, Draft Federal
Information Processing Standards Publication 183, Announcing the Standard for
Integration Definition for Function Modeling (IDEF0), 1993.
Hatley, Derek J. and Pirbhai, Imtiaz A., Strategies for Real-Time System Specification,
Dorset House, New York, 1988.
Logical Decisions, Logical Decisions for Windows: Version 5.1, 2003.
McCarthy, D. J., McFadden, W. J. and McGinnis, M. L., "Put Me in Coach; I'm Ready to
Play! A Discussion of an Evolving Curriculum in Systems Engineering,"
Proceedings of the 13th Annual International Symposium of INCOSE,
Washington, DC, 493-501, 2003.
Doctrine
Chairman of the Joint Chiefs of Staff, CJCSM 3170.01: Universal Joint Task List (UJTL),
01 July 2002.
Department of the Army, ARTEP 7-5-MTP: Mission Training Plan for the Stryker
Brigade Combat Team Infantry Rifle Platoon and Squad, May 2003.
Department of the Army, FM 7-15: The Army Universal Task List, August 2003.
Department of the Army, FM 7-8: Infantry Rifle Platoon and Squad, Change 1, 01 March
2001.
US Army Training and Doctrine Command, TRADOC Pamphlet 525-66: Force
Operating Capabilities, 30 January 2003.
General Documents
Army Logistics Management College, Analysis of Alternatives, PM-2009-ISE, 29
September 2003.
Aviles, D., Bowers, R., Garden, L., Howard, J., MacCarter, N., Morales, B., Larimer, L.,
Finding a Simulation that Models Both Present and Future Individual Soldier
Systems, United States Military Academy, West Point, NY, 2000.
Cioppa, T. M., Willis, J. B., Goerger, N. D., and Brown, L. P., "Research Plan
Development for Modeling and Simulation of Military Operations in Urban
Terrain," In Proc of the Winter Simulation Conference 2003, New Orleans, LA.
Clark, A., Garret, D., Muscietta, D., and Zaky, A., "Infantry Simulation Behaviors
Development Update," PowerPoint Presentation, 17 October 2003.
Cosby, Neale and Severinghaus, Rick, "The M&S Void: Simulations for Individual and
Small Teams," MSIAC's Journal Online, 5(1), 2003.
Department of Systems Engineering, USMA, "Engineering Problem Statement: Soldier
Tactical Mission System (STMS)," 2000.
Jacobs, R. and Brooks, P., "Human Behavior Representation: Relevant Technologies and
Their Development," PowerPoint Presentation as Part of NATO Research &
Technology Organization Studies, Analysis and Simulation Panel Lecture Series
222.
Kwinn, Michael J., Jr. A Framework for the Analysis of the Future Combat System
Conceptual Design Alternatives, ORCEN Technical Report, 2001.
Larimer, L., Habic, P., Bosse, T., Hardin, D., Mackey, D., and Tulloh, D., Soldier
Modeling and Analysis Working Group (MAWG) Evaluation Report, TRADOC
Analysis Center Technical Report, TRAC-WSMR-TR-04-009, 2004.
Mission Needs Statement for the One Semi-Automated Forces (OneSAF).
Muscietta, D., "AMSAA/Natick Collaboration on Dismounted Infantry Modeling and
Simulation," PowerPoint Presentation, August 2003.
National Security Directorate, Objective Force Warrior: "The Art of the Possible" ...A
Vision, Oak Ridge National Laboratory, December 2001.
North Atlantic Treaty Organization, "TTCP: Joint Systems and Analysis Group,
Technical Panel 4 - Dismounted Combatant Operations (JSA-TP-5); Terms of
Reference."
North Atlantic Treaty Organization, Model(s) for the Assessment of the Effectiveness of
Dismounted Soldier System, Document AC/225(LG/3), Ed. Th.L.A. Verghagen,
01 September 2001.
North Atlantic Treaty Organization, NATO Soldier Modernisation: Measurements for
Analysis - a Framework for Modelling and Trials, Document AC/225(LG/3)D/25,
AC/225(LG/3-WG/3)D/9, AC/225(LG/3-WG/4)D/6, 04 October 1999.
One Semi-Automated Forces (OneSAF) Operational Requirements Document (ORD),
Version 1.1, ACAT El, Prepared for Milestone C Decision.
Rodriguez, Wilfred, "Virtual Simulations & Infantry Training," Infantry Magazine, 92(1),
21-25, 2003.
Tollefson, E. S., Boylan, G. L., Kwinn, M. J., Foote, B. L., West, P., and Martin, P. G.,
"Using Systems Engineering to Define Simulation Requirements for the
Acquisition of Infantry Soldier Systems," to be published in Proc of the ICSE and
INCOSE 2004 Conference, Las Vegas, NV, 2004.
Tollefson, E. S., Boylan, G. L., Kwinn, M. J., Jr., and Foote, B. L., "Simulation Modeling
Requirements For Determining Soldier Tactical Mission System Effectiveness," to
be published in the Proc of the Winter Simulation Conference 2004, Washington,
DC, 2004.
Tollefson, E. S., Kwinn, M. J., Boylan, G. L., Foote, B. L., and West, P., "A Needs-based
Analysis of Analytical, High-Resolution Infantry Simulations," In Proc of the
2004 IIE Annual Conference, Houston, TX, 2004.
US Army Materiel Systems Analysis Activity and US Army Soldier Systems Center,
Infantry Warrior Simulation, IWARS, Required Capabilities, Draft, April 2004.
US Army Soldier Systems Center, "Analytical Support to Soldier System Decision
Makers," Draft.
West, Paul, "Land Warrior at LZ X-Ray: A Historical Analysis of the 21st Century
Soldier," PowerPoint Presentation, 08 May 2002.
Appendix A: List of Abbreviations
2D	2-Dimensional
3D	3-Dimensional
A
ABM	Agent-Based Model
ACR	Advanced Concepts and Requirements
ACTD	Advanced Concept Technology Demonstration
AI	Artificial Intelligence
AIMS	AMSAA Infantry MOUT Simulation
AMSAA	U.S. Army Materiel Systems Analysis Activity
AMSO	Army Model and Simulation Office
AoA	Analysis of Alternatives
API	Application Programming Interface
AR	Army Regulation
ARL	Army Research Lab
ATD	Advanced Technology Demonstration
AUTL	Army Universal Task List
AWARS	Advanced Warfare Simulation
AWE	Advanced Warfighting Experiment
B
BCT	Brigade Combat Team
BG	Brigadier General
BIS	Battlefield Information System
BLOS	Beyond Line of Sight
BOI	Basis of Issue
BOS	Battlefield Operating System
C
C2	Command and Control
C4ISR	Command, Control, Computers, Communication, Intelligence, Surveillance, and Reconnaissance
CA	Combined Arms
CAD	Computer-Aided Design
CAS	Close Air Support
CASTFOREM	Combined Arms and Support Task Force Evaluation Model
CDA	Commander's Digital Assistant
CDD	Capability Development Document
CJCSI	Chairman of the Joint Chiefs of Staff Instruction
CJCSM	Chairman of the Joint Chiefs of Staff Manual
CM	Configuration Management
COL	Colonel
CombatXXI	Combined Arms Analysis Tool for the XXIst Century
COP	Common Operating Picture
CPD	Capability Production Document
CPT	Captain
CPU	Central Processing Unit
CRD	Capstone Requirements Document
D
DA	Department of the Army
DIS	Distributed Interactive Simulation
DISAF	Dismounted Infantry Semi-Automated Forces
DM	Decision Matrix
DMSO	Defense Model and Simulation Office
DoD	Department of Defense
DoDD	Department of Defense Directive
DoDI	Department of Defense Instruction
DOTLMPF	Doctrine, Organization, Training, Leadership, Materiel, Personnel, and Facilities
DTIC	Defense Technical Information Center
DUSA(OR)	Deputy Under Secretary of the Army for Operations Research
E
EPS	Engineering Problem Statement
F
FAA	Functional Area Analysis
FACT	Focus Area Collaborative Team
FFW	Future Force Warrior
FM	Field Manual
FNA	Functional Needs Analysis
FOC	Force Operating Capabilities
FSA	Functional Solution Analysis
FY	Fiscal Year
G
GUI	Graphical User Interface
GW	Global Weight
H
HBR	Human Behavior Representation
HITL	Human in the Loop
HLA	High Level Architecture
HMD	Helmet Mounted Display
I
ICD	Initial Capabilities Document
ICT	Integrated Concept Teams
IDEF	Integration Definition for Function Modeling
IO Device	Input-Output Device
IPR	Interim Progress Report
IR	Infrared
IT	Information Technology
I/ITSEC	Interservice/Industry Training, Simulation and Education Conference
IUSS	Integrated Unit Simulation System
IWARS	Infantry Warrior Simulation
J
JCATS	Joint Conflict and Tactical Simulation
JCIDS	Joint Capabilities Integration and Development System
JUTL	Joint Universal Task List
K
L
LLNL	Lawrence Livermore National Laboratories
LOS	Line of Sight
LTC	Lieutenant Colonel
LW	Land Warrior
LW	Local Weight
M
M&S	Modeling and Simulation
MAJ	Major
MATREX	Modeling Architecture for Technology, Research and Experimentation
MAWG	Modeling and Analysis Working Group
METT-TC	Mission, Enemy situation, Terrain, Troops available, Time available, Civil considerations
MNS	Mission Need Statement
MOA	Memorandum of Agreement
MoE	Measure of Effectiveness
MoM	Measure of Merit
MoP	Measure of Performance
MOU	Memorandum of Understanding
MOUT	Military Operations in Urban Terrain (now Urban Operations)
MS	Milestone
N
NATO	North Atlantic Treaty Organization
NGF	Naval Gunfire
NLOS	Non-line of Sight
NSC	Natick Soldier Center
NSC	National Simulation Center
O
OFW	Objective Force Warrior (now Future Force Warrior)
OOS	Objective OneSAF
OOTW	Operation Other Than War
ORCEN	Operations Research Center of Excellence
ORD	Operational Requirements Document
ORSA	Operations Research / Systems Analysis
OTB	OneSAF Testbed Baseline
P
PEO	Program Executive Office
PEO STRI	PEO for Simulation, Training, and Instrumentation
PM-CIE	Product Manager - Clothing and Individual Equipment
PM-LW	Product Manager - Land Warrior
PM-OneSAF	Product Manager - OneSAF
PM-SW	Project Manager - Soldier Weapons
PM-SWAR	Project Manager - Soldier Warrior
POC	Point of Contact
Q
R
RDA	Research, Development, and Acquisition
RDEC	Research and Development Engineering Center
RDM	Raw Data Matrix
ROE	Rules of Engagement
S
SA	Situational Awareness
SAF	Semi-Automated Forces
SAG	Study Advisory Group
SASO	Stability and Support Operations
SBA	Simulation Based Acquisition
SBL	Soldier Battle Lab
SE	Systems Engineering
SEMP	Systems Engineering and Management Process
SES	Senior Executive Service
SMART	Simulation and Modeling for Acquisition, Requirements, and Training
SME	Subject Matter Expert
SSE	Squad Synthetic Environment
SSP	Simulation Support Plan
S&T	Science and Technology
STMS	Soldier Tactical Mission System
STO	Science and Technology Objective
SU	Situational Understanding
T
T&E	Test and Evaluation
TEMO	Training, Exercises, and Military Operations
TEMP	Test and Evaluation Master Plan
TPIO	TRADOC Program Integration Office
TPO	TRADOC Project Office
TRAC	TRADOC Analysis Center
TRAC-FLVN	TRAC at Fort Leavenworth, KS
TRAC-MTRY	TRAC at Monterey, CA
TRAC-WSMR	TRAC at White Sands Missile Range, NM
TRADOC	Training and Doctrine Command
TSM	TRADOC System Manager
TTP	Tactics, Techniques, and Procedures
U
UA	Unit of Action (now Brigade Combat Team)
UAV	Unmanned Aerial Vehicle
UE	Unit of Employment
UGS	Unattended Ground Sensor
UGV	Unattended Ground Vehicle
UO	Urban Operations
USAIC	US Army Infantry Center
USMA	United States Military Academy
V
VIC	Vector in Commander
VV&A	Verification, Validation, and Accreditation
W
WSC	Winter Simulation Conference
XYZ
This table is sorted alphabetically.
Appendix B: Soldier Functional Hierarchy
[Diagram: "Assess the Current Situation" decomposed into: Assess the Mission; Assess the Enemy Situation; Assess the Friendly Situation; Assess the Environment; Assess the Time Available; Assess the Neutral/Civilian Situation.]
Figure 17. Functional decomposition of assessing the situation.
[Diagram: "Make Sensing Decisions" decomposed into: Make Search Decisions (Choose to Search; Choose Search Location; Choose Search Method; Choose Search Timing; Choose Detection to Focus On); Make Tracking / Designation Decisions (Choose to Track / Designate Targets; Choose Targets to Track / Designate; Choose Track / Designate Duration); Choose Acquisition Method; Choose to Correct Sensing Equipment Malfunction.]
Figure 18. Functional decomposition of making sensing decisions.
[Diagram: "Sense" decomposed into: Make Acquisition Decisions (Assess Threat; Assess Target Characteristics); Search; Track / Designate; Acquire (Locate; Recognize); Orient (Orient Self; Orient Sensing Equipment); Manipulate Sensing Equipment (Change Sensing Equipment Status; Emplace Sensing Equipment; Correct Sensing Equipment Malfunction); See; Hear; Smell; Feel; Taste.]
Figure 19. Functional decomposition of sensing.
[Diagram: "Make Engagement Decisions" decomposed into: Choose Engagement Method; Choose Target; Assess Engagement Effects; Select Engagement Timing; Decide to Disengage; Decide When to Correct Device Malfunction.]
Figure 20. Functional decomposition of making engagement decisions.
[Diagram: engagement functions including: Conduct Close Quarters Fight; Fight with Equipment; Employ a Device; Manipulate Engagement Device; Aim Device; Fire Weapon; Throw Device; Detonate Explosive.]
Figure 21. Functional decomposition of engaging.
[Diagram: "Make Movement Decisions" decomposed into: Make Navigational Decisions (Determine Location; Choose Route; Choose Destination); Make General Movement Decisions (Choose Posture; Choose Movement Technique; Choose Movement Method; Choose Formation; Choose Timing).]
Figure 22. Functional decomposition of making movement decisions.
[Diagram: "Move" decomposed into: Change Physical Location (Walk; Run; Climb; Swim; Crawl; Jump; Mount / Dismount); Change Posture (Stand; Crouch; Kneel; Sit).]
Figure 23. Functional decomposition of moving.
[Diagram: "Make Communication Decisions" decomposed into: Make Transmission Decisions (Decide When to Transmit; Decide How to Transmit; Decide What to Transmit; Decide Whom to Transmit To); Make Reception Decisions (Decide to Receive; Decide How to Receive); Process Communication Information (Sort / Prioritize Communication Data).]
Figure 24. Functional decomposition of making communications decisions.
[Diagram: "Communicate" decomposed into: Interpret Communication Data; Manipulate Communications Equipment (Change Communications Equipment Status; Correct Communications Equipment Malfunction); Receive; Transmit; Signal Type.]
Figure 25. Functional decomposition of communicating.
[Diagram: "Make Enabling Decisions" decomposed into: Make Decisions to Alter Surroundings (Make Decisions to Alter Terrain: Decide to Dig, Decide to Build, Decide to Alter Vegetation; Make Decisions to Alter Objects: Decide to Move Objects, Decide to Change Objects); Make Decisions to Manipulate Load (Decide to Tailor Load; Decide to Carry Object; Decide to Add to Load; Decide to Alter Load; Decide to Reduce Load); Make Operation Decisions (Decide to Conduct Bodily Functions: Decide to Consume, Decide to Expel, Decide to Rest; Decide to Administer First Aid: Decide to Administer Self Aid, Decide to Administer Buddy Aid; Monitor Needs: Monitor Health, Monitor Equipment; Make Equipment Operating Decisions: Decide to Maintain, Decide to Repair, Decide to Sustain / Resupply / Redistribute).]
Figure 26. Functional decomposition of making enabling decisions.
[Diagram: "Enable" decomposed into: Alter Surroundings (Alter Terrain: Dig, Build, Alter Vegetation; Alter Objects: Move Objects, Change Objects); Manipulate Load (Tailor Load; Carry Object; Add to Load; Alter Load; Reduce Load); Operate (Conduct Bodily Functions: Consume, Expel Waste, Rest; Administer First Aid: Administer Self Aid, Administer Buddy Aid; Keep Equipment Operating: Maintain, Repair, Sustain / Resupply / Redistribute).]
Figure 27. Functional decomposition of enabling.
Appendix C: STMS Simulation Functional Requirements
Functional Requirements
for a
Soldier Tactical Mission System (STMS)
Simulation Capability
Version 0.3
Compiled by:
CPT Eric Tollefson
CPT Gregory Boylan
LTC Michael J. Kwinn, Jr.
Dr. Bobbie Foote
Dr. Paul West
Operations Research Center of Excellence
Department of Systems Engineering
United States Military Academy
West Point, New York
14 May 2004
I. INTRODUCTION.
This document outlines the key functional requirements for a simulation capability that a Soldier
tactical mission system (STMS) acquirer, particularly the Program Executive Office Soldier
(PEO Soldier), can use for analytical purposes. Specifically, these requirements are meant to
satisfy the following problem statement:
Identify or develop a tactical combat simulation capability for Light Infantry
missions at the level of Platoon and below with resolution down to the individual
Soldier. The simulation must accept, as input, scenarios and Soldier tactical
mission system (STMS) characteristics. It must model the functions of the
Soldier in a tactical environment, and provide, as output, the measures of
effectiveness (MOEs) used to evaluate STMS mission effectiveness. It must be
adaptable to changes in technology and operational environments. The
simulation(s) will provide the analytical capability to support Program Executive
Office (PEO) Soldier decision making.
This document is not intended to be a "requirements document" in the sense of an operational
requirements document (ORD) or a set of specifications for contracting. Instead, it delineates the
key characteristics of the simulation software and modeling aspects that would make a
simulation (or family of simulations) ideal for PEO Soldier decision-making.
We have based the information contained here upon interviews, visits to organizations
throughout the Army, and literature reviews. While we consider our research representative of
the domain, it is, by no means, completely exhaustive. Therefore, this document is, and should
remain, a working document that PEO Soldier, or that organization's designated representative,
updates when new information becomes available.
II. METHODOLOGY.
Our decomposition of simulation requirements is unique in that we decomposed by soldier
function. The normal approach for identifying simulation requirements for analytical purposes is
to categorize them by the characteristics used to evaluate the system of interest. For instance, the
effectiveness of an STMS is measured by its lethality, survivability, mobility, situational
awareness, etc. Although we initially chose to use this method, we switched to a functional
decomposition for a number of reasons. One is that the characteristic itself may not be well-defined or translate well into simulation requirements - for example, situational awareness. Not
only is the definition of this term not widely agreed upon, its broad implications make it hard to
decompose into requirements. A soldier's situational awareness directly affects, and is directly
affected by, other characteristics like mobility, lethality, and survivability, which themselves
overlap for many of the soldier's functions. That interdependence complicates the logical
decomposition into simulation requirements and is the primary reason we chose another method.
Even more, the diverse group of simulation stakeholders may not easily understand the
terminology of the resulting product. Therefore, we looked at the functions (and indirectly at
tasks) that an Infantry soldier must perform on the battlefield to accomplish his mission
successfully.
We identified the primary soldier functions and decomposed each of those. We decomposed
down to subfunctions that either affect the performance of the soldier system or component, or
allow for differentiation between alternative systems, making sure we considered current and
future systems. This technique allowed us to focus on representations that are important for
acquisition decision-making, versus training or mission rehearsal. For each of the lowest level
functions, we have identified the inputs and outputs critical for each of the subfunctions. The
simulation, then, must model those inputs and outputs to assess accurately the performance of an
STMS.
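The decomposition described above lends itself to a simple data representation. The following is an illustrative sketch only, not drawn from the report: the function names and input/output fields are hypothetical examples of how a functional hierarchy, with the critical inputs and outputs attached to the lowest-level functions, might be captured for requirements tracking.

```python
# Hypothetical sketch of a functional decomposition: each lowest-level
# soldier function records the inputs it transforms and the outputs it
# produces, which become candidate simulation requirements.
from dataclasses import dataclass, field

@dataclass
class Function:
    name: str
    inputs: list = field(default_factory=list)     # data the function consumes
    outputs: list = field(default_factory=list)    # data the function produces
    subfunctions: list = field(default_factory=list)

    def leaves(self):
        """Yield the lowest-level functions, which carry the I/O requirements."""
        if not self.subfunctions:
            yield self
        for sub in self.subfunctions:
            yield from sub.leaves()

# Example fragment (names are illustrative, not authoritative):
sense = Function("Sense", subfunctions=[
    Function("Search", inputs=["search method", "search location"],
             outputs=["detections"]),
    Function("Acquire", inputs=["detections"],
             outputs=["identified targets"]),
])

# Every leaf's inputs and outputs are things the simulation must model.
for leaf in sense.leaves():
    print(leaf.name, leaf.inputs, leaf.outputs)
```

A structure of this kind also makes it easy to check that each function's outputs feed another function's inputs, the cross-check noted in section IV.D below.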
III. OUTPUTS.
As a result of the approach discussed above, this document is organized first by simulation
function, and then by soldier function. Some areas warrant additional discussion and are
included as separate sections as well. For the general simulation functions, we identify the key
characteristics required for an ideal simulation. For each of the lowest-level Soldier functions,
those inputs transformed by, and outputs produced by, those functions most critical for
evaluating STMS are listed. As previously mentioned, the inputs and outputs are not exhaustive
and should be updated as needed.
IV. ADDITIONAL NOTES.
A. We use the term "simulation" to mean either a single simulation or a linkage of two or
more. The linkage may be direct, through architecture (HLA, MATREX), or through data
exchange. We are identifying requirements for a simulation capability without attempting to
define how that might be accomplished.
B. This document identifies the required capabilities (both current and projected for use with
future materiel acquisition) of the ideal simulation. We include information regardless of
whether or not the current technology, state of knowledge, and/or data exist to support its
development.
C. These requirements contain many lists of examples. Before conversion to simulation
specifications, these must be converted to definitive lists based upon the intent of the
sponsoring organization(s).
D. To the extent possible, we have tried to list functions in a logical order for flow purposes.
However, in some cases, outputs of functions listed may be inputs to functions already
discussed. We have tried to ensure that all functional outputs are inputs into other functions,
and vice versa. The exceptions, of course, are those inputs from, and outputs into, the
environment.
E. Even if valid algorithms and techniques can be created for all required capabilities, there
are significant gaps in the required data. Such gaps will prevent the attainment of a good
model, if not addressed. This document can be used to identify data shortcomings and to
focus future data collection efforts.
V. ACKNOWLEDGEMENTS. The information contained here has been obtained with the
help of numerous people. We would specifically like to thank Major Pedro Habic and
Lieutenant Colonel Larry Larimer from TRAC-WSMR; Dr. Dale Malabarba, Mr. Robert Auer,
and Mr. Paul Short from the Natick Soldier Center; COL(Ret) Pat Toffler, Director, Research
and Studies Partnership, United States Military Academy; and, Mr. Brad Bradley and Mr. Dean
Muscietta from AMSAA. Interviews with the above provided tremendous insights that were
critical in development of these requirements. We would also like to make special mention of
those that reviewed the drafts of this document and who provided comments and specific
recommendations that have been integrated into the content. They are Dr. Robin Buckelew,
SES, Director, Applied Sensors, Guidance, and Electronics, Directorate Aviation and Missile
RDEC; Mr. John D'Errico, Operations Research Analyst at TSM-Soldier; and, Mr. CharUe
Tamez, Systems Integration, PEO Soldier. Our mention of the above people does not imply their
approval of this document. The opinions contained herein are the opinions of the authors, and do
not necessarily reflect those of PEO Soldier, the United States Military Academy, the United
States Army, or the Department of Defense.
I. GENERAL SIMULATION FUNCTIONS
The model we used for the initial decomposition of the required simulation functions is from
Hatley and Pirbhai's architectural template in Figure 28. Their architecture is a generic template
for decomposing the functions of a system for purposes of system specification (Hatley and
Pirbhai, 1988). This model provides us with a logical functional grouping to organize the upper
level simulation requirements.
[Diagram: template blocks: User Interface Processing; Input Processing; Process Model; Control Model; Output Processing; Maintenance, Self-Test, and Redundancy Management Processing.]
Figure 28. Hatley and Pirbhai architecture template (1988).
A. Interface with User. This function has to do with obtaining information from the user,
giving the user feedback on the inputs received, presenting the outputs of the simulation to
the user, requesting information from the user, and providing a means for the user to request
information from the simulation.
The following are simulation requirements that relate to this function:
• Intuitive (Windows-like; menu driven)
• Understandable to subject matter experts, users, and analysts
• Allows the input of new equipment and weapons systems through changes to
parameters; parameters must be robust enough to encompass the largest possible
array of potential weapons and equipment
• Allows the user or analyst to alter behaviors/TTPs
• Intuitive scenario development (use of Army operational terms and symbols and other
recognizable icons)
• Input configuration reflects military decision making
• Allows for multiple input methods
• Allows the reuse of previous scenarios and annotated terrain as a starting point for
developing new scenarios
• Allows the extraction of stored data during any point of the scenario execution
• Visual output and playback capability to allow for face validation of models and
scenarios, as well as a qualitative handle on the results
B. Process Inputs. This function concerns the conversion of inputs from external interfaces
and other components of the simulation into data usable by the simulation for processing.
The following are simulation requirements that relate to this function:
• Can interface with other simulation systems through HLA or other federation
simulation architectures (e.g., MATREX)
• Identification of internal data errors relayed to the user through the user interface;
easy-to-follow audit trail
• Managed internal data conversion process
• Able to input data from other sources (external database, spreadsheet, engineering
simulation output, etc.)
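The error-identification and audit-trail requirements above can be sketched in code. This is a hedged illustration only, with hypothetical record fields, not a specification of how the input processor would actually be built.

```python
# Illustrative sketch (hypothetical field names): convert externally
# supplied records into internal form, keep an audit trail, and collect
# validation errors for relay back to the user interface.
def process_inputs(records):
    converted, errors, audit = [], [], []
    for i, rec in enumerate(records):
        audit.append(f"record {i}: {rec!r}")          # easy-to-follow audit trail
        try:
            weight = float(rec["weight_kg"])          # managed internal conversion
            if weight < 0:
                raise ValueError("negative weight")
            converted.append({"weight_kg": weight})
        except (KeyError, ValueError) as exc:
            errors.append(f"record {i}: {exc}")       # relayed to the user interface
    return converted, errors, audit

good, bad, trail = process_inputs([{"weight_kg": "31.5"},
                                   {"weight_kg": "-2"}])
```

The same pattern applies whether the source is an external database, a spreadsheet export, or engineering simulation output.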
C. Control Processes. This function has to do with the control of the simulation processing
itself and would manage such processes as timing and linkages to higher fidelity models.
The following are simulation requirements that relate to this function:
• HLA compliance
• Flexible; plug-and-play backbone that allows integration of additional capabilities
• Object-oriented coding
• Configuration managed by a plan
• Constructive simulation - non human-in-the-loop (HITL), but allows for HITL
execution, if desired
• Stochastic repeatability (same inputs with the same random seed produces the same
outputs)
• Fidelity control (ability to use higher fidelity models if necessary or to use aggregated
results in lieu of those models)
• Preprocessing capability to reduce runtime
• Minimize runtime
• Operate with unclassified and classified data
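The stochastic repeatability requirement above can be illustrated with a seeded random stream. The following is a minimal sketch; the scenario (shots against a fixed hit probability) and the value of `p_hit` are illustrative assumptions, not drawn from this document:

```python
import random

def run_engagement(seed, n_shots=10, p_hit=0.35):
    """Toy stochastic replication: the same seed reproduces the same outcomes.

    The scenario and p_hit are illustrative assumptions only.
    """
    rng = random.Random(seed)  # private, per-run random stream
    return [rng.random() < p_hit for _ in range(n_shots)]

# Repeatability: identical inputs with the identical seed give identical outputs.
assert run_engagement(seed=42) == run_engagement(seed=42)
```

Giving each run its own `Random` instance, rather than sharing a global stream, is what keeps replications independent and individually repeatable.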
D. Process Model. This function has to do with the actual simulation model itself and how it
processes the user inputs into outputs. The discussion of the simulation model will occur
later in the sections describing Non-Soldier Representation, Soldier Attributes, and Soldier
Representation.
The following are general simulation requirements that relate to this function:
• Has both high-fidelity models and aggregated results of those models to be drawn
upon as needed
• Verified, validated, and accredited for its intended uses
E. Process Outputs. This function concerns the conversion of simulation outputs into data
usable by external interfaces and other components of the simulation.
The following are simulation requirements that relate to this function:
• Exportable outputs to commercial statistical and graphical software
• Identification of internal data errors relayed to the user through the user interface
• Managed internal data conversion process
• Continuous or event-driven output capability
F. Maintain, Self-Test, and Manage Redundancy. These functions have to do with
diagnostic testing, internal simulation monitoring, error detection, and the ability to draw on
redundant resources.
The following are simulation requirements that relate to this function:
• Capable of debugging changes to code
• Clear identification and presentation of errors
• Traceability - audit trail to identify sources of error
• Data backup capability to avoid catastrophic loss of data
G. Other.
• Includes, at a minimum, the following types of documentation guides: user,
methodology, and analyst. They can be rolled up into one document, but should
address the topics unique to each type.
• Includes a complete verification, validation, and accreditation (VV&A) plan and
documentation.
II. SIMULATION SCENARIOS
We believe that the level of fidelity required by the simulation, as we delineate it in this
document, is sufficient to ensure that any potential scenario can be modeled. Therefore, we will
only briefly discuss the points below.
To best understand what future operations the Army may encounter, we can refer to FM 7-15,
The Army Universal Task List (AUTL). The four main types of Army operations are offensive,
defensive, stability, and support operations. Refer to Annex A for a complete breakdown of
these four operations. Refer to Annex B for the joint tactical tasks as found in CJCSM
3500.04C, Universal Joint Task List (UJTL). Both lists should serve as a basis for developing
scenarios or vignettes for simulation modeling and should be fully represented in order to
determine the effectiveness of the soldier systems in all potential operations.
The referenced task lists imply that we need to model our own and allied forces, enemy forces
and their allies, and neutral forces. Furthermore, the simulation should represent regular military
and guerilla forces, terrorists, and civilians, both armed (including police) and unarmed (to
include NGOs).
III. NON-SOLDIER REPRESENTATION
A. Represent the Environment. Representation of the environment is a complex issue.
The requirements and discussion below only touch upon the more important aspects of this
representation as they relate to the requirements of PEO Soldier.
1. General Environment Types. The following are the general types of the
environments in which the future soldier can be expected to operate. This is not
necessarily an all-inclusive list, but is representative of the spectrum of environments that
the soldier will experience. If the simulation will also be used for special operations
forces (SOF), it may have to include detailed models of special environments like the
ocean or the atmosphere.
The following are the primary environment types:
• Urban, or built up, areas
• Desert
• Jungle
• Swamp / Marshland
• Forest / Woodland
• Plains / Rural
• Mountains
• Arctic
• Littoral
2. Terrain. The following are terrain characteristics that must be modeled in the
simulation. Each of these characteristics may be different based upon the above
environment types. Even within a specific environment type, these characteristics may
vary.
a) Relief. The simulation should have a realistic mapping of the surface of the earth,
at a high enough resolution to affect the movement and line of sight (LOS) of an
individual soldier. This category also consists of what is sometimes called "micro-terrain," or the finely detailed, very high-resolution terrain, like rocks, gullies,
mounds, and vegetation. As with vegetation discussed below, it is important to model
the effects of this type of terrain, especially as it relates to the soldier's ability to seek
cover and concealment, as well as its impacts on sensors that may be part of the
soldier system under evaluation. However, this need not be represented explicitly,
but can be modeled probabilistically as discussed with vegetation below.
b) Vegetation. The vegetation should be represented in a way that will allow the
model to account for its effects on round, fragment, and shrapnel trajectories,
visibility, sound, and mobility. It is not necessary for the simulation to specifically
model a single bush, plant, or tree. Such vegetation could be accounted for
probabilistically based upon the type of environment. The simulation must model
how vegetation affects the soldier's ability to detect targets, as well as the soldier's
vulnerability to detection by other entities.
c) Soil Composition. This characteristic has a great impact on mobility and therefore
should be modeled. Soil composition also affects the soldier's ability to dig and to
get dirt or sand for sandbags. This characteristic also includes man-made surfaces
such as roads and hardstands.
d) Water. For PEO Soldier's needs, it may not be necessary to model the ocean or
large lakes (except for littoral operations - a Marine-oriented application). However,
the simulation must model rivers, streams, and swamps.
e) Subterranean Features. This includes man-made and natural tunnels and caves.
These features should affect the soldier's ability to move, communicate, and sense.
f) Built-up Areas. The accurate modeling of structures is critical for assessing the
effectiveness of any system in an urban environment. The structure models should be
able to represent structure interior and exterior, with multiple rooms, multiple floors,
construction material properties, windows, doors, "cat holes," and furniture. It should
also have attributes that will permit the assessment of weapons effects on the various
components of the structure. It should affect the soldier's ability to communicate
within the building and between buildings. Additionally, other urban features must
be modeled: vehicles, infrastructure (electric and phone cables; poles; gas, sewer, and
water lines; etc.), paved areas, businesses, gas stations, and general urban layout
(roads, alleys, blocks, industrial parks, yards and fences, etc.).
g) Other Man-made Features. This includes all types of man-made obstacles, to
include minefields (type, dimensions, density), improvised explosive devices (IEDs),
concertina and barbed wire, tank ditches, road and airstrip cratering, 55-gallon drums,
and any other man-made feature (except as discussed above in built-up areas).
h) Dynamic Terrain. This requirement reflects the ability of the simulation to alter
the environment during a run. This type of terrain would account for the effects of
the soldier and his weapons upon the terrain, such as blast craters, damage to
structures, fire damage, changes to vegetation from deliberate soldier action, etc.
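The probabilistic treatment of micro-terrain and vegetation described above can be sketched as a simple attenuation of detection probability along the line of sight. The environment coefficients and the exponential form are illustrative assumptions, not values from this document:

```python
import math
import random

# Hypothetical per-environment foliage attenuation coefficients (per meter
# of line of sight); the values are illustrative only.
FOLIAGE_ATTENUATION = {"jungle": 0.12, "forest": 0.05, "desert": 0.005}

def detection_probability(p_clear, range_m, environment):
    """Scale a clear-LOS detection probability for probabilistic vegetation.

    Uses an exponential attenuation, p = p_clear * exp(-k * range), a simple
    way to model concealment without representing individual plants.
    """
    k = FOLIAGE_ATTENUATION[environment]
    return p_clear * math.exp(-k * range_m)

def is_detected(p_clear, range_m, environment, rng=random):
    """Single Bernoulli draw against the attenuated detection probability."""
    return rng.random() < detection_probability(p_clear, range_m, environment)
```

The same structure applies symmetrically to the soldier's own vulnerability to detection by other entities.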
3. Climate. The following are aspects of the climate that must be modeled in the
simulation. Each of these characteristics may be different based upon the environment
type. Even within a specific environment type, these characteristics may vary both
spatially and temporally.
a) Weather Conditions. This includes characteristics such as humidity, barometric
pressure, ambient and radiant temperature, visibility, cloud cover, fog, precipitation,
ceiling, wind speed, wind direction, inversion factors, and attenuation and extinction
coefficients for various portions of the EM spectrum.
b) Light Conditions. This includes characteristics such as sunrise, sunset, BMNT,
EENT, angle of the sun, sky-to-ground contrast ratio, moonlight, atmospheric
obscurants (e.g., haze) and starlight.
c) Dynamic Climate. This allows for changes to the climate as the day progresses
(e.g., temperature, humidity, and barometric changes) and changes due to the soldier
and his weapons (e.g., smoke and heat from fires).
d) Man-made Conditions. These conditions include battlefield obscurants, such as
smoke and dust; biological and chemical agent contamination and nuclear effects; and
illumination and signaling pyrotechnics, such as flares, star clusters, and mortar and
artillery illumination rounds. The simulation should represent the different types of
smoke and its properties, such as buildup and dissipation characteristics and its
effects on different types of sensors. When modeling flares, the simulation should
consider dynamic lighting conditions as a result of the wind, trajectory, location with
respect to the ground, and burn rate of the flare and how it casts shadows.
Additionally, muzzle flashes from all types of weapons, and their effects on
observers, should be considered.
4. Other. This includes the effects of insects, wildlife, bacteria, and disease on soldier
performance and survivability.
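The attenuation and extinction coefficients noted above under weather conditions feed standard transmittance calculations, which also apply to battlefield obscurants such as smoke. A minimal Beer-Lambert sketch (the coefficient used in the example is an illustrative assumption, not measured data):

```python
import math

def transmittance(extinction_per_km, range_km):
    """Beer-Lambert transmittance, T = exp(-gamma * R), for a given EM band.

    gamma (the extinction coefficient) varies with spectral band and with
    conditions such as haze, fog, rain, or smoke.
    """
    return math.exp(-extinction_per_km * range_km)

# A sensor band with gamma = 0.5 per km viewing a target at 2 km:
T = transmittance(0.5, 2.0)  # exp(-1.0), roughly 0.37
```

Because gamma differs by band, the same smoke cloud can blind a visible-band sensor while leaving a thermal sensor largely unaffected, which is exactly the sensor-dependent effect the requirements call for.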
B. Represent Other Entities. This requirements document will not go into the same level of
detail for other types of entities as that provided for the representation of the individual soldier.
For this simulation, we are concerned primarily with the representation of the soldier. Therefore,
models need only represent other entities to the degree that the soldier will observe or
interact with them.
For instance, there may be no need to represent an artillery piece explicitly, only the fires
request process, the incoming rounds, and their effects. For a tank, on the other hand, the
simulation may be required to model its physical and vulnerability characteristics, its
capabilities, and a realistic portrayal of its behavior, depending upon how it will interact with
the soldier during the course of the scenario. The same is true of aircraft, personnel
carriers, trucks, and other systems the soldier may physically encounter on the battlefield.
C. Represent Higher and Lateral Headquarters. The simulation needs to represent higher
headquarters and lateral units (i.e. company and above elements and sister platoons) only to
the degree as necessary for communications and directives purposes. For instance, if the
platoon leader is attempting to communicate with his company commander on the company
net, then the traffic from all company elements on that net should be simulated to ensure a
realistic representation of delay. Other representations of higher headquarters might include
the ability to attach company mortars or an extra squad, for example.
D. Represent Weapons and Ammunition Effects. While this type of representation is
implied in the discussion of the functions below, it deserves special mention. The simulation
must represent all types of weapons and ammunition that the soldier may carry or encounter
on the battlefield.
For direct fire weapons, the simulation should be able to represent kinetic energy weapons,
non-lethal weapons, electromagnetic (EM) energy weapons, and other types of weapons (e.g.,
flamethrower) delivered via soldier-, vehicle-, or aircraft-mounted platforms. The model
should consider area and point firing, as well as the various firing modes (safe, single shot,
burst, fully automatic). It must include direct fire rocket and missile systems. All types of
direct fire ammunition should be modeled, to include armor-piercing, HE, sabot, HEAT,
saboted light armor penetration (SLAP), canister, anti-structure, etc. The non-lethal direct
fire ammunition types that should be represented are the 12 gauge point and crowd control
ammunition, 40mm sponge cartridge, 40mm crowd dispersal cartridge, etc.
Similarly, for indirect fire weapons, the simulation should be able to represent lethal and
non-lethal weapons delivered via soldier, towed, vehicle, or aircraft-mounted platforms. It should
model the full trajectory of the rounds. Representation of indirect ammunition should
include HE rounds (air burst, point detonated, and delay), white phosphorous (WP),
illumination, smoke, dual-purpose improved conventional munitions (DPICM), family of
scatterable mines (FASCAM), smart munitions, as well as grenades (fragmentary,
incendiary, WP, thermite, CS, stun [non-lethal], etc.). The model should account for the
entire call for fire process (request, time to fire, time of flight, round adjustment, etc.). Also
represented in this category should be chemical and biological weapons, their means of
delivery, and their effects on the environment and the soldier.
The simulation should also model key firing characteristics (either explicitly or implicitly).
Important direct fire parameters include max effective range, rates of fire, bias (variable and
fixed), random error, and probabilities of hit, kill, and incapacitation for all possible weapon-munition-target pairings; these must also account for weapon-sensor pairings, such as an M4
with CCO or an M203 with enhanced laser sights, etc., as these will affect the aforementioned
probabilities. Important indirect fire parameters include range, lethal radius, ballistic error,
dispersion, aim error, target location error, and probabilities of kill and incapacitation due to
fragments and blast effects.
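The direct fire parameters above (fixed bias, variable bias, random error) combine into shot dispersion, from which a probability of hit can be estimated. The following Monte Carlo sketch lumps variable bias and round-to-round error into one Gaussian, a simplifying assumption (accuracy models normally keep those components separate), and all numeric values are illustrative:

```python
import random

def monte_carlo_ph(n, target_w, target_h, fixed_bias, error_sd, rng=None):
    """Estimate probability of hit against a w x h target (meters at range).

    fixed_bias is a constant (x, y) offset of the mean impact point;
    error_sd lumps variable bias and random error into one Gaussian.
    """
    rng = rng or random.Random(1)
    bx, by = fixed_bias
    hits = 0
    for _ in range(n):
        x = bx + rng.gauss(0.0, error_sd)
        y = by + rng.gauss(0.0, error_sd)
        if abs(x) <= target_w / 2 and abs(y) <= target_h / 2:
            hits += 1
    return hits / n

# Illustrative parameters only: a 0.5 m x 1.0 m exposure, slight vertical bias.
ph = monte_carlo_ph(20000, 0.5, 1.0, (0.0, 0.05), 0.15)
```

A Pk-given-hit table, conditioned on the part of the target struck and its protection level, would then be applied to each hit to get kill and incapacitation probabilities.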
The representation of weapons, ammunition, and explosives must include their effects on
targets (humans with various levels and types of protection, structures, vehicles, vegetation,
terrain, other objects). Also modeled should be effects based on the part of the target struck
and what level of protection is in that area. Injuries should be affected by treatment, time,
and the environment. These effects include not only the effects of hitting the target, but also
suppressive effects on personnel nearby (varied suppression duration and level based on the
ammunition characteristics, the soldier's protection, and his state of mind).
E. Represent System Reliability and Power Requirements. As in the discussion for
weapons and ammunition effects above, system reliability and power requirements are
implied by the soldier functions described below. However, they are important enough to mention
briefly here. The advanced technological equipment being fielded for the soldier brings with it
power requirements, integration issues, and special maintenance and repair issues.
Therefore, it is important, in some way, to model the equipment's reliability and power
systems.
The modeling of reliability should account for the failure rates of each of the components and
how the various potential component failures would affect the system as a whole. This does
not need to be modeled explicitly, but could be associated with probability functions for each
potential type of system error and a probabilistic estimation of the repair time for each type
of failure. The system failures should affect the soldier's ability to perform certain functions
and require the soldier to switch to an alternate method of performing that function, if
available.
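The probabilistic failure treatment described above might be sketched as follows. The failure modes, distributions, and all numeric values are illustrative assumptions, not data from any fielded system:

```python
import random

# Hypothetical failure modes: mean time between failures (hours) and mean
# repair time (minutes). The modes and values are illustrative assumptions.
FAILURE_MODES = {
    "display_fault": {"mtbf_h": 400.0, "mean_repair_min": 10.0},
    "radio_fault": {"mtbf_h": 250.0, "mean_repair_min": 25.0},
}

def sample_failures(mission_hours, rng=None):
    """Draw an exponential time-to-failure per mode; failures inside the
    mission window degrade the affected function until a sampled repair
    completes (forcing the soldier to an alternate method meanwhile)."""
    rng = rng or random.Random()
    events = []
    for mode, p in FAILURE_MODES.items():
        t = rng.expovariate(1.0 / p["mtbf_h"])
        if t < mission_hours:
            repair_min = rng.expovariate(1.0 / p["mean_repair_min"])
            events.append((t, mode, repair_min))
    return sorted(events)
```

Each sampled event would flag the affected soldier function as degraded for the duration of the repair, exactly the behavior-switching effect the text requires.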
The power requirements should be modeled based on the mission requirements, power load,
and power source capacity, as well as the ability to resupply the power source.
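At its simplest, the power requirement reduces to a budget of load against source capacity. The component loads and battery capacity below are illustrative assumptions only:

```python
# Hypothetical continuous loads in watts and an assumed battery capacity in
# watt-hours; all values are illustrative, not from any fielded system.
LOADS_W = {"display": 2.0, "radio": 4.5, "gps": 0.8}
BATTERY_WH = 90.0

def endurance_hours(loads=LOADS_W, capacity_wh=BATTERY_WH):
    """Simplest power budget: endurance = source capacity / total load."""
    return capacity_wh / sum(loads.values())
```

A mission-level model would compare this endurance against the mission timeline and trigger resupply or load-shedding behavior when the budget falls short.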
Both reliability and power requirements modeling should consider the effects of the
environment on system attributes.
F. Represent Network-Centric Warfare. This is implied in the discussions of many of the
soldier functions below; however, it is beneficial to discuss network-centric warfare as one
integrated topic, especially its effects on the target engagement process. Clearly, this
characteristic of warfare must be represented in any simulation that might be used to evaluate
future soldier systems.
The target engagement process can be broken down into five distinct functions, called
battlefield information functions (BIF). These are search/detect, identify, track/target,
engage, and assess. The responsibility for the performance of these functions is shifting
away from the individual soldier to a whole host of systems distributed throughout the
battlefield. Thus, sensors may search/detect, identify, and track/target potential enemy
targets, engagement logic may trigger an unmanned weapons platform to engage, and sensors
may assess the effects of the engagement (Kwinn and Smith, 2001).
As part of this, the soldier may fill any of these roles as either a sensing or engagement
platform; however, he may no longer fill all of the roles. Thus, the discussion of soldier
functions below, on its own, does not capture the network-centric process. The soldier's role
in that process can be captured by his core functions, but the simulation must account for the
digital transfer of information between the soldier and the other network platforms and how
that information affects the functions of the soldier and other platforms.
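The distribution of the five battlefield information functions across networked platforms can be sketched as a pipeline whose stages may each be filled by a different platform. The platforms and the first-capable routing rule below are hypothetical illustrations:

```python
# The five battlefield information functions (BIFs) named in the text,
# treated as pipeline stages that different networked platforms may fill.
BIFS = ["search_detect", "identify", "track_target", "engage", "assess"]

def assign_bifs(platforms):
    """Map each BIF to the first platform advertising that capability.

    platforms: dict of platform name -> set of BIF names it can perform.
    A None assignment flags a gap in the network.
    """
    return {
        bif: next((name for name, caps in platforms.items() if bif in caps), None)
        for bif in BIFS
    }

net = assign_bifs({
    "uav_sensor": {"search_detect", "identify", "track_target", "assess"},
    "rifleman": {"search_detect", "identify", "engage"},
})
# Here the soldier fills only the engage role; sensors handle the rest.
```

A fuller model would also charge each hand-off between platforms with the communication delay and error discussed under the soldier's communication functions.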
IV. SOLDIER ATTRIBUTES
The soldier entity must be represented by a complete set of attributes that affect the entity's
performance of a function (or transformation of inputs to outputs). These attributes can act as
inputs or controls. For example, attributes acting as controls may be rule sets for making a
decision, a general knowledge base drawn upon by cognitive processes, physical constraints
affecting performance, etc. Additionally, these attributes can be affected or changed by the
process itself. For instance, movement can reduce the soldier's energy level, operations can
increase his experience level, and equipment damage can change its physical and performance
characteristics. Many of these attributes are measures of performance (MoPs) for the respective
sensor, weapon, or piece of equipment. Attributes can be entered directly or be fed by
engineering level simulations. The simulation should consist of a baseline entity, probably the
current Soldier, with a standard set of attributes that can serve as a baseline for comparisons.
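The attribute behavior described above (attributes acting as inputs or controls, and being changed by the processes they control) can be sketched as a minimal entity class. The field names and update rules are illustrative assumptions, not values from this document:

```python
from dataclasses import dataclass

@dataclass
class SoldierEntity:
    """Minimal sketch: attributes act as inputs/controls to functions and
    are themselves changed by those functions."""
    energy: float = 1.0        # physiological attribute, 0..1
    experience: float = 0.0    # readiness attribute; grows with operations
    equipment_ok: bool = True  # equipment attribute; degraded by damage

    def move(self, distance_km: float) -> None:
        # Movement consumes energy (the process changes the attribute).
        self.energy = max(0.0, self.energy - 0.05 * distance_km)

    def complete_operation(self) -> None:
        # Operations increase experience, as the text notes.
        self.experience += 0.1
```

A baseline instance with the current Soldier's standard attribute values would then serve as the comparison point for alternative soldier systems.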
A. Mission Attributes. These attributes reflect the soldier's knowledge about his mission
and how he is expected to accomplish that mission and generally apply to all soldiers in the
unit. The ability to model such attributes is critical if this simulation is to be used as part of
a DOTMLPF (Doctrine, Organization, Training, Materiel, Leadership, Personnel, and
Facilities) analysis, as required by the acquisition process. Such attributes include:
• Doctrine
• Tactics, techniques, and procedures (TTPs)
• Rules of engagement (ROE)
• Standing operating procedures (SOP)
• Soldier's role in the unit / duty position
• Current mission and intent (objectives, control measures, routes, checkpoints,
timeline, etc)
• Unit training, experience, and cohesion
B. Personal Attributes. These attributes reflect characteristics of the soldier himself, and
may, or may not, vary from soldier to soldier. The attributes must be dynamic and allow for
degraded status or changes based on the course of the scenario. Aside from the obvious
reasons for inclusion (realism), these attributes can also be adjusted to evaluate various
system impacts on performance, relative to soldier size, strength, etc. (e.g., a smaller soldier
may be incapable of performing certain tasks with a specific combination of system
components because the combination is too bulky, heavy, etc.). While the simulation may take such data
from engineering level models of human performance, these attributes should still factor into
the mission level simulation.
1. Physical Attributes.
• Height, weight, dimensions, shape
• Color, reflectance
• Posture, location
2. Physiological Attributes.
• Fitness, strength, energy, alertness, visual acuity, endurance, fatigue, circadian
rhythm
• Sensory motor skills, dexterity
• Health, injury/wound status
• Physical susceptibility to combat (i.e., effects on physiological attributes as time
spent in combat increases)
3. Psychological Attributes.
• Courage, initiative, aggressiveness, discipline, self-confidence, fear, trust
• Stress, excitement, motivation (will to fight)
• Obedience (likelihood for following orders, instructions, etc.)
• Effects of casualties (on state of mind, etc.)
4. Mental Attributes
• Memory (short term and long term), cognitive bandwidth, attentional resources
• Reflexive processing, reaction/decision capabilities
• Susceptibility to pressure (either to act or not act, based on actions of other
entities, such as peers or superiors)
5. Readiness Attributes. These, when considered with the type of equipment the soldier
is using, can be used to determine the impact of the equipment's "user-friendliness" on
soldier performance. For instance, a system would be considered user-friendly if a
soldier can operate it with minimal training and experience.
• Training
• Experience
C. Equipment Attributes. These attributes reflect characteristics of the equipment,
weapons, or clothing worn by the soldier. They may differ by soldier, depending on the type
of equipment he is carrying, but are normally constant for a particular piece of equipment.
The attributes must allow for degraded status or changes based on the situation. The actual
attributes greatly depend on the specific type and model of equipment being represented.
Below are some examples by category.
1. Weapons and ammunition attributes
• Maximum effective range, burst/lethal radius, rates of fire, rounds per trigger pull,
number of rounds in magazine, fuse time, firing modes (safe, single round, burst,
fully automatic)
• Probability of hit (Ph) and probability of a kill or incapacitation given a hit (Pk)
for various situations, environments, and postures (for lethal and non-lethal
weapons and ammunition)
• Errors
• Suppressive effects
• Reliability, failure rates, failure types, failure modes, durability
• Service (repair) times, maintenance times, ease of diagnosis
• Availability, start up time, standard configuration
• Power requirements, source type, load
• Optical, thermal, electronic and audio signature
• Physical dimensions, weight
2. Sensor attributes
• Cycles per milliradian (CPM) tables, contrast data, spectral band, extinction
coefficient
• Range, field of view (narrow and wide), narrow-to-wide factor, image
intensification, magnification
• Physical dimensions, weight
• Optical, thermal, electronic and audio signature
• Reliability, availability, repair time
• IFF capability
3. Communications attributes
• Frequency, range, type (analog, digital)
• Security (e.g., frequency-hopping [FH])
• Electronic and audio signatures
• Physical dimensions, weight
• Optical, thermal, electronic and audio signature
• Reliability, availability
4. Clothing attributes
• Optical, thermal, and audio detectability
• Ballistic protection (Ph, Pk based on equipment and clothing)
• EM wave protection
• Chemical protection
• Environmental protection (knee/elbow protection, cold weather gear, wet weather
gear)
5. Other equipment attributes
• Capacity, restrictiveness (effects on mobility, dexterity)
• ROM / RAM, processor speed
• Overall system state as a function of individual component failure/incapacitation
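The cycles per milliradian (CPM) tables listed under sensor attributes feed standard target-acquisition models of the Johnson-criteria family. One commonly published form is the target transfer probability function; treat the exponent fit below as an approximation rather than a definitive model:

```python
def ttpf(n_cycles, n50):
    """Target transfer probability function (Johnson-criteria family).

    n_cycles: resolvable cycles across the target at the given range;
    n50: cycle criterion for a 50% chance of the discrimination task
    (detect, recognize, or identify). The exponent follows the commonly
    published ACQUIRE fit and is an approximation.
    """
    ratio = n_cycles / n50
    e = 2.7 + 0.7 * ratio
    return ratio ** e / (1.0 + ratio ** e)
```

As range grows, the number of resolvable cycles across the target falls, so the same function yields the familiar decline of detection probability with range for a given sensor and atmosphere.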
V. SOLDIER REPRESENTATION (BY FUNCTION). We decomposed the soldier into two
primary functions: decide and act; we continued to decompose those functions to the lowest level
we believed necessary for evaluating alternative soldier systems. For the following discussion of
requirements by soldier function, we grouped together the decisions and actions that were
directly related for ease of understanding and input/output tracking. Therefore, to grasp the
actual functional hierarchy, it is best to refer to the hierarchy diagram itself, rather than try to
infer the structure from the following discussion.
It is important to note, when looking at the inputs and outputs, the distinction between actual and
perceived information. Generally, the soldier makes his decisions based upon his perception of
the situation, but his actions are affected by the actual situation. Therefore, the simulation must
make this distinction as well, especially since many of the soldier systems being fielded by PEO
Soldier enhance the soldier's ability to make decisions by providing more timely and accurate
(presumably) information.
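The actual-versus-perceived split described above can be sketched as two parallel pictures of the battlefield. The structure and field names below are illustrative assumptions:

```python
from dataclasses import dataclass, field
from typing import Dict, Tuple

@dataclass
class CommonPicture:
    """Decisions read from `perceived`; action outcomes resolve against
    `actual`. The structure is an illustrative sketch."""
    actual: Dict[str, Tuple[float, float]] = field(default_factory=dict)
    perceived: Dict[str, Tuple[float, float]] = field(default_factory=dict)

    def ground_truth_move(self, entity_id, true_pos):
        # The actual situation changes whether or not anyone perceives it.
        self.actual[entity_id] = true_pos

    def sense(self, entity_id, reported_pos):
        # Sensing updates only the perceived picture, which may carry
        # latency or error relative to ground truth.
        self.perceived[entity_id] = reported_pos
```

Keeping the two pictures separate lets the model credit improved soldier systems precisely where they help: by shrinking the gap between the perceived picture and ground truth.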
A. Assess the Current Situation. This function is a primary driver for most soldier decision
making and may be the hardest to model. This assessment is the same as that commonly
referred to as METT-TC (Mission, Enemy, Terrain [replaced here by "environment"], Troops
available [replaced here by "friendly situation"], Time available, and Civil considerations).
In fact, for the remainder of this document, the acronym METT-TC will refer to this
assessment of the current situation. In the case of PEO Soldier requirements, much of the
capability enhancements of Land Warrior provide the soldier with information critical to his
assessment of the current situation. Therefore, a simulation must model, in some way, how
that information affects the soldier's METT-TC assessment, and how that assessment affects
his decision-making. Key to this assessment is the simulation's requirement to track
both the information perceived by the soldier and the actual data. A soldier makes decisions
based upon perceived data, but his actions are affected by the actual data. Much of
the technology being fielded today is designed to improve the soldier's perception, or
assessment, of the situation. Therefore, this difference between perceived and actual data
must be modeled and tracked in the simulation.
1. Assess the Mission. To meet PEO Soldier's needs, a simulation does not need to
explicitly model the soldier's assessment of the mission. However, it must reflect the
soldier's knowledge of the mission in terms of the commander's intent, objective(s),
control measures, ROE, assigned tasks, etc. Much of this information can be entered
beforehand as entity attributes.
• Modeled Inputs: None
• Modeled Outputs: Perception of mission, to include intent, tasks/purposes,
endstates, role relative to other friendly elements (higher, adjacent, subordinate)
and vice versa, systems/units in support, commander's intent, objective(s), control
measures
2. Assess the Enemy Situation. This is key for Land Warrior. The Helmet-Mounted
Display (HMD) and Commander's Digital Assistant (CDA), as well as the increased
communication capability, in Land Warrior allow the sharing of knowledge about the
enemy situation. The requirement that any soldier system be conversant with Stryker
systems (FBCB2) and other digital systems also enhances this soldier function.
Therefore, this capability must be represented.
• Modeled Inputs: Perceived enemy information from HMD, CDA,
communications, other sensing equipment, soldier observation, and soldier
training and experience
• Modeled Outputs: Perception of enemy composition, disposition, strength,
capabilities, weaknesses, doctrine, and tactics
3. Assess the Friendly Situation. For the same reasons as above, this must be
represented in simulation.
• Modeled Inputs: Friendly information perceived from HMD, CDA,
communications, other sensing equipment, soldier observation, and soldier
training and experience
• Modeled Outputs: Perception of friendly composition, disposition, strength, and
capabilities
4. Assess the Environment. Current PEO Soldier programs do not yet directly aim to
increase the soldier's capability to assess the environment around him. Indirectly, shared
information between soldiers can improve the soldier's information. In order to make the
simulation realistic, however, it should represent the soldier's assessment of the
environment, which will affect subsequent decisions, especially those related to
movement, engagements, and communication. The commonly used acronym for this
soldier assessment is OAKOC (Obstacles, Avenues of approach, Key terrain, Observation
and fields of fire, and Cover and concealment).
• Modeled Inputs: Sensed information, communications
• Modeled Outputs: Perception of OAKOC, including terrain, vegetation, weather,
slope, soil type, and light conditions
5. Assess the Time Available. Many current PEO Soldier systems aim to reduce the
time to make a decision by providing more timely and accurate information (situational
awareness). Therefore, the simulation should model the soldier's assessment of time to
represent its effects on the soldier's action and decision cycles. Effects on his action
cycle might include speeding up a movement rate to meet a hit time when delayed by an
incidental enemy contact or moving off the planned route. An example involving the
soldier's decision cycle is shortening his engagement decision process based upon an
unexpected threat. He may have to fire reflexively or plan a hasty attack based on his
assessment of the time available. This is especially important in urban operations in
which targets appear at much shorter ranges and in more unexpected locations.
• Modeled Inputs: Sensed time, assessment of other METT-TC elements, sensed
information
• Modeled Outputs: Perception of the time remaining to accomplish tasks
6. Assess the Neutral/Civilian Situation. As with the soldier's assessment of friendly
and enemy situation, information about non-combat entities on the battlefield can be
shared via the HMD, CDA, or communication equipment. Therefore, this capability
must be explicitly modeled, especially given the wide array of missions the present-day
soldier must be prepared to accomplish.
• Modeled Inputs: Non-combat entity information perceived from HMD, CDA,
communications, other sensing equipment, and soldier observation
• Modeled Outputs: Perception of non-combat entity disposition (population
centers, personnel locations), composition (cultural, religious, ethnic entity mix,
sensitivities, traditions), capabilities (armament, equipment), and loyalties.
B. Sense and Make Sensing Decisions
1. Make Search Decisions. A soldier makes decisions to search based upon a
tremendous number of factors. PEO Soldier programs directly provide capabilities that
affect many of the inputs into search decisions. The capabilities of the sensing equipment
that the soldier carries, such as the Thermal Weapon Sights (TWS), Night Vision Devices
(NVDs), and Lightweight Video Reconnaissance System (LVRS), will affect these
decisions. Additionally, the soldier's assessment of METT-TC, discussed above, will be
an input into search decisions. The soldier's searching methods while moving should be
consistent with TTPs and not based, for instance, strictly upon direction of movement.
When stationary in hasty or deliberate fighting positions, sectors of responsibility will be
assigned to the soldier based on leader guidance or SOP. The simulation must represent this
in some way.
Specific search decisions include:
• Choose to Search
• Choose Search Method
• Choose Search Location
• Choose Search Timing
• Choose to Correct Sensing Equipment Malfunction
Inputs and outputs of these decisions include:
• Modeled Inputs: Assessment of METT-TC (especially assessment of enemy
threat), knowledge of the soldier's equipment capabilities, sensory cues (primarily
audio and visual), battlefield history (previous engagement in a certain area),
assigned sector, SOP
• Modeled Outputs: Search decisions listed above
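As a minimal sketch of how a simulation might combine these inputs into a choose-to-search decision, consider the following; the function name, threshold, and attentional-resource scale are illustrative assumptions, not doctrinal values:

```python
def choose_to_search(enemy_threat, attentional_resources, in_assigned_sector,
                     threat_threshold=0.3):
    """Toy search decision.

    enemy_threat: assessed enemy threat from METT-TC, 0..1
    attentional_resources: share of attention currently free, 0..1
    in_assigned_sector: True if SOP or leader guidance assigns a sector
    """
    # A soldier fully occupied by another task (e.g., an engagement)
    # defers searching.
    if attentional_resources < 0.2:
        return False
    # Otherwise search if a sector is assigned or the assessed threat
    # warrants it.
    return in_assigned_sector or enemy_threat >= threat_threshold
```

The same pattern extends to the other search decisions (method, location, timing) by adding inputs such as battlefield history and equipment capabilities.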
2. Search
a) Manipulate Sensing Equipment. A soldier will likely perform certain actions
with his equipment before, during, and after searching. These actions take time and
draw on the soldier's attentional resources. Usability issues are better studied using
prototypes; however, the time required to perform certain actions and the availability
of the equipment should be modeled in the simulation. Not all of the soldier's sensing
equipment is available all of the time. Therefore, to be realistic and to weigh the
effects of equipment configuration on unit effectiveness, these characteristics should
be modeled to some degree.
Specific sensing equipment manipulation functions include:
• Change Sensing Equipment Status
• Correct Sensing Equipment Malfunction
• Emplace Sensing Equipment
Inputs and outputs of these functions include:
• Modeled Inputs: Search decisions listed above, METT-TC assessment, terrain
conditions, weather conditions, equipment configuration
• Modeled Outputs: Change in equipment status, reduced attentional resources,
elapsed time, audio signature
b) Orient. Once the sensing equipment has been readied, if necessary, the soldier
must orient himself and/or his equipment in the required direction. This orientation
may take time, especially for more complicated types of sensors, and should be
modeled. This function includes continuous changes to orientation, as in scanning.
Specific orient functions include:
• Orient Sensing Equipment
• Orient Self
Inputs and outputs of these functions include:
• Modeled Inputs: Search decisions listed above, terrain conditions, weather
conditions, equipment characteristics, field of view, scan time and technique
• Modeled Outputs: Change in equipment orientation, change in soldier
orientation, elapsed time
c) Observe. Many PEO Soldier products are designed to improve the soldier's
capability to observe. Most, if not all, of these are designed specifically for
improving his ability to see (observe throughout the EM spectrum - visual, image
intensification, IR, thermal); however, improvements to the soldier's ability to use his
other senses are probably not far into the future, and should therefore be modeled.
We must consider both the soldier's natural and technologically-enhanced
capabilities. Special sensors (e.g., electronic, magnetic, seismic, radioactivity,
chemical) would be included under the soldier sense through which the soldier receives
the sensor's signal (e.g., a Geiger counter represents detections with an audio
signal, which would fall under hear, and with a visual meter, which would fall under see).
Additionally, for the sake of realism, the simulation should model the soldier's ability
to detect other cues, e.g., hearing movement, shifting equipment, or weapon reports, or
even observations based on fortuitous glances. Finally, line of sight (LOS) deserves
discussion. The simulation must consider not only symmetric LOS, but also
asymmetric LOS. For instance, a Soldier behind certain types of vegetation, or
behind a wall, can see a large portion of the battlefield through a relatively small hole.
Conversely, an observer looking in the direction of that Soldier has a very low
probability of seeing him. Another key example occurs inside of a building. A
Soldier who backs away from a window can still see outside relatively well; however,
he significantly reduces his own probability of being seen. A simulation that assumes
that if one entity can see another then the reverse is true is unacceptable when
considering the fidelity required of a Soldier model.
Specific observe functions include:
• See
• Hear
• Feel
• Smell
• Taste
Inputs and outputs of these functions include:
• Modeled Inputs: Weather, terrain, light, and wind conditions; symmetric and
asymmetric line of sight (LOS); battlefield obscurant properties; EM waves,
audio waves, vibrations, heat, and odors in the vicinity of the soldier's sensory devices;
material properties, device capability, physiological capability, optical
contrast, target detectability (e.g., camouflage)
• Modeled Outputs: Visual, auditory, olfactory, tactile, and taste sensed
information; reduced attentional resources; reduced power (if electronic
equipment is being used); recognized need for another observation point;
recognized need for more information
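The asymmetric-LOS requirement described above can be sketched with a detection probability that treats the observer's aperture and the target's exposure as separate factors, so that P(A sees B) need not equal P(B sees A). All function names and numeric values below are illustrative assumptions:

```python
def p_detect(observer_aperture, target_exposure, range_m, sensor_range_m):
    """Toy detection probability for one observer-target pair.

    observer_aperture: fraction of the scene the observer can scan
        through his opening (loophole, window, vegetation gap), 0..1
    target_exposure: fraction of the target presented to that observer
        (a soldier standing back from a window exposes little), 0..1
    """
    if range_m >= sensor_range_m:
        return 0.0
    range_factor = 1.0 - range_m / sensor_range_m
    return observer_aperture * target_exposure * range_factor

# A soldier back from a window sees the street well...
inside_sees_out = p_detect(0.8, 0.9, range_m=100, sensor_range_m=300)
# ...but presents only a small exposed area to an outside observer.
outside_sees_in = p_detect(0.9, 0.05, range_m=100, sensor_range_m=300)
```

Because the two directions use different aperture and exposure values, the model naturally produces the window and vegetation cases discussed above.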
3. Make Acquisition Decisions. PEO Soldier products directly provide capabilities that
affect many of the inputs into acquisition decisions. Again, the capabilities of the sensing
equipment that the soldier carries will affect his decisions, especially the method he
chooses to acquire a target, and his ability to assess target threat and characteristics.
Additionally, the soldier's assessment of METT-TC will be a critical input into
acquisition decisions, especially his choice of target on which to focus, given multiple
targets.
Specific acquisition decisions include:
• Choose Acquisition Method
• Choose Detection to Focus On
• Assess Threat
• Assess Target Characteristics
Inputs and outputs of these decisions include:
• Modeled Inputs: Observed visual, auditory, olfactory, tactile, and taste
information, assessment of METT-TC
• Modeled Outputs: Acquisition decisions listed above
4. Acquire. Many PEO Soldier products are designed to improve the soldier's capability
to acquire targets. Laser rangefinders and thermal imagers help the soldier locate targets,
and higher resolution sights give soldiers an improved capability to identify and classify
targets. Therefore, the simulation should be able to model these capabilities under all
potential environmental and mission conditions. In fact, the challenge is not so much
representing the soldier with technologically-advanced equipment as representing the
base case with a greater incidence of target location error, misidentification, and
misclassification, leading, in some cases, to fratricide.
Specific acquire functions include:
• Locate
• Recognize
• Identify
• Classify
Inputs and outputs of these functions include:
• Modeled Inputs: Acquisition decisions, combat ID / IFF devices, environmental
conditions (especially weather, light [day/night/transition, ambient levels,
artificial], and terrain conditions), actual target characteristics/signature,
assessment of target characteristics, target location, target activity
• Modeled Outputs: Perceived target location, perceived target identification
(friend, neutral, or foe), perceived target threat, reduced attentional resources,
reduced equipment power, combined effects of multiple sensors being used
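One common way to model the "combined effects of multiple sensors" output, assuming the sensors' detection opportunities are statistically independent (an assumption, not a validated model), is to combine their individual detection probabilities:

```python
def combined_p_detect(p_sensors):
    """Probability that at least one of several independent sensors
    (e.g., thermal sight plus image intensifier) detects the target:
    the target escapes detection only if every sensor misses."""
    p_miss_all = 1.0
    for p in p_sensors:
        p_miss_all *= (1.0 - p)
    return 1.0 - p_miss_all
```

This always yields a probability at least as high as the best single sensor, which is the qualitative effect the base-case comparison needs to show.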
5. Make Tracking/Designation Decisions. Soldier decisions in this area are based
primarily on his assessment of METT-TC and the results of his searching for and
acquiring of targets, objects, or locations. Because of this, PEO Soldier requires a
simulation that models these decisions. Tracking is the soldier function of keeping
observation on a target for a period of time. Designating is the function of using a device
to mark a target for another observer or weapon system.
Specific tracking/designation decisions include:
• Choose to Track/Designate Targets
• Choose Track/Designate Duration
• Choose Targets to Track/Designate
Inputs and outputs of these functions include:
• Modeled Inputs: Assessment of METT-TC; perceptions of target location; target
identification and threat; target choice (if choice is a moving target); choice to
observe a moving target
• Modeled Outputs: Tracking/designation decisions listed above
6. Track/Designate. PEO Soldier products include sensors and designators, like the
Multi-Function Laser System and the Lightweight Laser Designator Rangefinder
(LLDR), that directly affect the soldier's capability to track and/or designate a target, and
should therefore be modeled.
• Modeled Inputs: Tracking/designation decisions listed above, environmental
conditions (especially terrain, weather, and light conditions), target
characteristics, target location, target activity
• Modeled Outputs: Target being tracked, designated target, reduced attentional
resources, reduced equipment power
C. Engage and Making Engagement Decisions
1. Make Engagement Decisions. These decisions rely heavily upon the weapons and
equipment that the soldier is carrying, and upon the unit assets at his disposal, as well as
his means of bringing those assets to bear. PEO Soldier products, therefore, greatly
impact these decisions. Advanced communications and digital equipment should allow
the soldier to call upon networked fires with shorter response times. Advanced weapon
capabilities and the soldier's ability to fire with reduced exposure will affect his choice of
methods and timing. The simulation, then, should represent these engagement decisions.
Specific engagement decisions include:
• Choose Engagement Method
• Choose Target
• Assess Engagement Effects
• Select Engagement Timing
• Decide to Disengage
• Decide When to Correct Device Malfunction
Inputs and outputs of these decisions include:
• Modeled Inputs: Assessment of METT-TC, perceived target location, perceived
target identification (friend, neutral, or foe), perceived target threat, perceived
target range, perceived target attributes; perceived proximity of target to
structures, other entities, and objects; desired effects on target; visual, auditory,
olfactory, tactile, and taste sensed information; weapons available; mission
requirements; ROE
• Modeled Outputs: Engagement decisions listed above
2. Engage
a) Conduct Close Quarters Combat. Under certain circumstances, the soldier may
choose to conduct close quarters combat (CQB), instead of engaging the enemy with
a weapon. While PEO Soldier products may not be specifically designed for this
purpose, they may affect the performance of these functions. For example, shorter
weapons may change the way in which soldiers perform rifle bayonet engagements.
Equipment configurations may reduce the soldier's range of motion, thereby hindering
his ability to fight.
Specific close quarters combat functions include:
• Fight
• Fight with Equipment
Inputs and outputs of these functions include:
• Modeled Inputs: Engagement decisions listed above, actual target location
and range, target capabilities, equipment configuration
• Modeled Outputs: Effects on target and overall target status, reduced energy
level, entity reaction, changed equipment status, physiological effects,
psychological effects
b) Employ a Device
(1) Manipulate Engagement Device
(a) Prepare Device. A soldier will perform certain actions with his weapons
and equipment before, during, and after engaging. These actions take time
and draw on the soldier's attentional resources. As mentioned before,
usability issues are better studied using prototypes; however, the time required
to perform certain actions and the availability of the weapons and equipment
must be modeled in the simulation. To be realistic and to differentiate
between optimal weapons for particular system configurations, these
characteristics should be modeled in some way.
Specific manipulate engagement device functions include:
• Arm Engagement Device
• Load Engagement Device
• Emplace Engagement Device
• Correct Engagement Device Malfunction
Inputs and outputs of these functions include:
• Modeled Inputs: Engagement decisions listed above, METT-TC
assessment, device configuration, device status
• Modeled Outputs: Change in device status, change in device
characteristics, reduced attentional resources, elapsed time, audio
signature
(b) Aim Device. Products such as aiming lights, the Integrated Laser White
Light Pointer, Multi-Function Laser System, and the Sniper Night Sights
directly impact this function and require that the simulation represent the
aiming process for the various types of equipment.
• Modeled Inputs: Engagement decisions listed above, perceived target
location (range and direction), and target exposure (posture)
• Modeled Outputs: Change in device orientation, reduced attentional
resources, elapsed time, change in equipment status, reduced equipment
power, EM wave (in the case of laser and white light sights)
(2) Fire Weapon. All weapons and ammunition developed by PEO Soldier
require an accurate representation of this function. This should include all
weapons and ammunition discussed in Section O.D.
• Modeled Inputs: Engagement decisions listed above, actual target location
and range, weather conditions, wind conditions, terrain (including
vegetation), weapon status, equipment configuration, device orientation
(deflection, elevation), firer posture, target posture, laser designation,
attentional resources, probability of hit, probability of kill
• Modeled Outputs: EM wave (e.g., a visual signature or a laser or high
powered microwave (HPM) weapon), thermal effects, smoke, audio
signature, olfactory signature, ballistic trajectory, EM trajectory, terminal
effects on target (human, equipment, structure, environment - including
collateral damage), suppressive effects, entity reaction, effects on the
environment and its reaction, change in weapon status, reduction of
available ammunition, elapsed time, physiological effects on firer,
accuracy (fixed and variable biases and random errors), round dispersion,
adjusted Ph given a sensed miss, adjusted Ph given a miss not sensed,
targeting adjustments. These outputs must also generate some battle
damage assessment (BDA) for the soldier that provides him with either
confirmed or perceived effects of his engagement (feedback loop). The
level of confirmation or perception, relative to the desired effect of the
engagement, will serve as inputs into some other decide/act process (e.g.,
either a re-engagement, search for new targets, or continued movement).
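The "adjusted Ph given a sensed miss / miss not sensed" outputs above imply a shot-to-shot feedback loop, which might be sketched as follows; the correction factor is an illustrative assumption, not a measured value:

```python
def next_ph(ph, hit, miss_sensed, correction=0.15):
    """Single-shot hit probability for the next round.

    If the previous round missed and the firer sensed the miss (saw
    the impact or tracer), he applies a targeting adjustment and Ph
    rises toward 1. An unsensed miss gives no feedback, so Ph is
    unchanged; after a hit the aim solution is simply retained.
    """
    if hit or not miss_sensed:
        return ph
    return min(1.0, ph + correction * (1.0 - ph))
```

Iterating this function over a burst gives the rising hit probability a simulation needs when comparing sights that do and do not let the firer observe his misses.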
(3) Detonate Explosive. This function covers explosives that are emplaced and
then detonated. Although PEO Soldier programs do not directly deal with the use
of explosives, the individual soldier on the battlefield may still use them,
especially in an urban operations (UO) environment. Therefore, a simulation
should represent this capability and the impacts (positive or negative) that any
PEO Soldier system has on the soldier's ability to employ them. Additionally,
some weapons and ammunition under development in PEO Soldier programs may
replace the need for explosives in certain situations. Therefore, the simulation
must model the use of explosives as the base case against which to compare
advanced PEO Soldier capabilities.
• Modeled Inputs: Engagement decisions listed above; explosive properties,
preparation, and location; weather conditions, wind conditions, terrain
(including vegetation); structural properties
• Modeled Outputs: EM wave, thermal effects, smoke, audio signature,
olfactory signature, shrapnel and projectiles trajectories, terminal effects
on targets, objects, and structures (including collateral damage),
suppressive effects, entity reaction, effects on the environment and its
reaction, change in explosive status, reduction of available explosives,
elapsed time
(4) Throw Device. This function primarily deals with grenades, but applies also
to any object that can be thrown to affect a target, including explosives that are
designed to detonate on impact, other thrown weapons (e.g., Molotov cocktails),
and even rocks. Grenades are important to model in the simulation since they are
a PEO Soldier product. Other objects could be critical to model when
representing peacekeeping scenarios. Additionally, a soldier tactical mission
system (STMS) may impact the soldier's ability to throw the device.
• Modeled Inputs: Engagement decisions listed above; chemical and
explosive properties; weather conditions, wind conditions, terrain
(including vegetation); structural properties; equipment configuration (e.g.
restricted range of motion)
• Modeled Outputs: These vary tremendously depending on the type of
projectile thrown: EM wave, thermal effects, smoke and smoke
propagation, audio signature, olfactory signature, shrapnel and projectile
trajectories, terminal effects on targets, objects, and structures (including
collateral damage), suppressive effects, entity reaction, effects on the
environment and its reaction, change in projectile status, reduction of
available projectiles, elapsed time
D. Move and Making Movement Decisions
1. Make Navigational Decisions. This is another key capability gap that PEO Soldier
products aim to address. Each of the primary navigation decisions is aided by GPS
equipment, the HMD, and the CDA. Not only must a simulation that can be used for
analysis of alternatives represent the benefits of those types of equipment, but it must also
represent comparative cases, e.g., navigation errors caused by use of only a 1:50000 map
and a lensatic compass. Additionally, navigation representation must account for the
soldier's reaction to natural and man-made obstacles encountered during movement. On
a smaller scale, these decisions encompass soldier movements towards cover and
concealment. Special mention should also be made of the simulation's requirement to
track both perceived and actual data. In a navigational context, the comparison of actual
and perceived data and their effects on soldier decisions are critical for assessing the
effectiveness of GPS equipment.
Specific navigational decisions include:
• Determine Location
• Choose Destination
• Choose Route
Inputs and outputs of these decisions include:
• Modeled Inputs: Assessment of METT-TC, with particular focus on assessments
of the mission, enemy, and terrain (OAKOC); information from navigational aids
(e.g., maps, photos, compasses, digital compass, GPS, feeds from sensory
equipment, communicated information and directives); perception of terrain;
perceived location (output of one decision, input into another), recognized need
for another observation or firing point
• Modeled Outputs: Perceived location, perceived distance, perceived time,
navigational decisions listed above
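To track both perceived and actual position, a simulation can attach a navigation-aid-dependent error to the true location; the error radii below (tight for GPS, loose for a map and lensatic compass) and the uniform error model are illustrative assumptions:

```python
import math
import random

# Illustrative circular error radii, in meters, per navigation aid.
NAV_ERROR_RADIUS_M = {"gps": 10.0, "map_compass": 150.0}

def perceived_location(actual_xy, nav_aid, rng):
    """Perceived position = actual position plus a random offset whose
    magnitude depends on the navigation aid in use."""
    radius = NAV_ERROR_RADIUS_M[nav_aid]
    angle = rng.uniform(0.0, 2.0 * math.pi)
    r = rng.uniform(0.0, radius)
    x, y = actual_xy
    return (x + r * math.cos(angle), y + r * math.sin(angle))
```

Feeding the perceived location, rather than the actual one, into the route and destination decisions is what lets the simulation reproduce navigation errors in the base case.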
2. Making General Movement Decisions. Movement techniques include traveling,
traveling overwatch, and bounding overwatch. Movement methods include walking,
running, crawling, jumping, etc. The primary drivers for these decisions are the soldier's
perception of the enemy situation, the terrain, and the time remaining to complete the
mission. Again, the information shared via the CDA, HMD, and communication
equipment can aid the soldier in making these movement decisions and should be
represented in the simulation. As above, the mistakes made in the absence of these
devices should be modeled by the simulation as well. The simulation should also model
the choice of movement methods to avoid detection (slower, crouched, deliberate).
General movement decisions include:
• Choose Movement Technique
• Choose Movement Formation
• Choose Movement Method
• Choose Movement Timing
Inputs and outputs of these decisions include:
• Modeled Inputs: Assessment of METT-TC, with particular focus on assessments
of the enemy, terrain (OAKOC), and time remaining to complete the mission;
information from navigational aids (e.g., maps, photos, compasses, digital
compass, GPS, feeds from sensory equipment, communicated information and
directives); perception of terrain, perceived location, perceived time and distance
• Modeled Outputs: General movement decisions listed above
3. Change Physical Location. The actual physical functions of moving from one
location to another are not directly aided by current PEO Soldier programs, with the
exception of some climbing aids for urban operations (UO). However, future soldier
equipment could include exoskeletons and other muscular aids that would strive to
enhance basic human movements, strength, etc. Regardless, the weight of the equipment
carried and worn by the soldier affects these functions. In order to determine the
effectiveness of one soldier tactical mission system (STMS) over another, the simulation
should model the effects of soldier equipment on his movement rate, the range of motion
of his joints, limitations in his fine motor skills, etc. It is probably not necessary to
explicitly model each type of movement in detail; however, the differences in movement
rates, levels of exposure, and energy expenditure, as well as the situations and terrain
conditions in which these types of movement may be used should be modeled. The
simulation must also represent the dynamic changes in soldier attributes during
movement (e.g., high crawl, to a rush, to a high crawl, to a combat roll, to moving behind
cover, etc).
Specific functions of changing physical location include:
• Run
• Walk
• Crawl
• Roll
• Climb
• Swim
• Jump
• Mount/dismount
Inputs and outputs of these functions include:
• Modeled Inputs: General movement and navigational decisions listed above;
actual terrain (including vegetation, soil type, and grade), weather, and light
conditions; soldier physiological attributes, especially energy level and physical
fitness; equipment attributes (especially weight, bulkiness, and other restricting
attributes); use of equipment that can assist in these functions (e.g., life vest,
ladder, grappling hook, etc)
• Modeled Outputs: Change in soldier location, change in soldier posture,
exposure, audio waves, visual cues, physiological effects (especially reduced
energy level), change in equipment status, change in equipment configuration,
reduced attentional resources
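The load and terrain effects on movement described above might be captured by a simple rate model; the coefficients and floors below are illustrative assumptions, not validated human-performance data:

```python
def movement_rate(base_rate_mps, load_kg, body_mass_kg, grade_pct):
    """Movement speed (m/s) degraded by carried load, expressed as a
    fraction of body mass, and by terrain grade; floors keep the rate
    non-negative and bounded away from zero on steep grades."""
    load_factor = max(0.0, 1.0 - 0.5 * load_kg / body_mass_kg)
    grade_factor = max(0.1, 1.0 - 0.02 * max(0.0, grade_pct))
    return base_rate_mps * load_factor * grade_factor
```

A model of this shape is what allows an analysis of alternatives to show the cost of a heavier STMS configuration in movement rate alone.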
4. Choose Posture. Choice of posture has a great impact on the outcome of an
engagement, both in terms of the accuracy of the firer and the exposure of the target to
detection and engagement, as well as the exposure of the firer to returned fires.
Additionally, the choice to fire from a reduced exposure firing position is a key decision
provided by Land Warrior. Therefore, the posture decision should be represented in
simulation.
Inputs and outputs of this decision include:
• Modeled Inputs: Assessment of METT-TC, with particular focus on assessments
of the enemy and the terrain (OAKOC), communicated information and
directives, SOP, availability of cover and concealment
• Modeled Outputs: Choice of posture, choice of posture change timing
5. Change Posture. While the actual transition from one posture to another probably
does not need to be modeled explicitly, certain characteristics of each posture and the
transitions between those postures should be. For instance, the inability to transition
between certain postures due to equipment configuration is important to capture, as is the
target area presented to the enemy and the effects on firer accuracy from each position.
Specific change posture functions include:
• Stand
• Crouch
• Kneel
• Sit
• Lie Down (including the prone firing position)
Inputs and outputs include:
• Modeled Inputs: Decision to change posture, choice of posture change timing,
equipment configuration
• Modeled Outputs: Change in posture, reduced exposure, physiological effects,
firing and throwing accuracy (e.g., changes in the probabilities of a kill, hit, etc),
sound produced, visual cues, change in equipment state
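A posture-transition table is one way to capture both transition times and transitions blocked by equipment configuration; the postures, times, and blocking rule below are illustrative assumptions:

```python
# Transition times in seconds (illustrative values).
TRANSITIONS = {
    ("standing", "kneeling"): 1.0,
    ("kneeling", "standing"): 1.0,
    ("kneeling", "prone"): 1.5,
    ("prone", "kneeling"): 2.0,
    ("standing", "prone"): 2.0,
    ("prone", "standing"): 3.0,
}

def transition_time(frm, to, bulky_equipment=False):
    """Time to change posture, or None if the transition is blocked.
    With a bulky equipment configuration, the direct standing-to-prone
    drop is assumed unavailable and remaining transitions slow down."""
    if frm == to:
        return 0.0
    t = TRANSITIONS.get((frm, to))
    if t is None:
        return None
    if bulky_equipment:
        if (frm, to) == ("standing", "prone"):
            return None  # must pass through kneeling instead
        return t * 1.5
    return t
```

The same table can carry per-posture exposure and accuracy attributes without modeling the physical transition itself.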
E. Communicate and Making Communication Decisions. In general, the simulation
model must consider communication capabilities above, below, and lateral to the soldier. It
must consider issues like integration and interoperability with digital systems (FBCB2,
artillery systems), both mounted (especially in the case of Stryker) and dismounted. It should
model enhancements in range and non-line-of-sight (NLOS) capabilities. In the case of Land
Warrior, communications includes not only radios, but the HMD and CDA.
1. Make Transmission Decisions. Communication is critically important to the soldier
systems being fielded currently and in the future. In addition to the ability to transmit
voice data, soldiers can transmit digital data as well, allowing for a greater exchange of
information. Much of the information sent and received by personnel and devices (e.g.,
information transmitted via satellite or UAV) supports the decisions made by the soldier
on the battlefield. Therefore, it is important that these decisions and functions be
represented by the simulation. Finally, the simulation should model the soldier's choice to
transmit based upon the level of combat activity. For instance, soldiers are unlikely to
type in data during a firefight or other activity that requires significant attentional
resources.
Specific transmission decisions include:
• Decide When to Transmit
• Decide What to Transmit
• Decide How to Transmit
• Decide Whom to Transmit to
Inputs and outputs of these decisions include:
• Modeled Inputs: Assessment of METT-TC; perception of the terrain; perception
of the enemy situation; visual, auditory, olfactory, tactile, and taste sensed
information; recent engagements; entity attentional resources due to current
activity; SOP; communication received
• Modeled Outputs: Transmission decisions listed above
2. Transmit. This function includes all types of communication, including verbal, hand-signaling, writing, typing information into a CDA, etc. As more devices on the
battlefield transmit information, issues like bandwidth availability become important.
Therefore, the simulation must represent these transmissions and the effects of
information loss due to bandwidth issues. The simulation must include communication
with soldier-controlled systems like UAVs, robots, etc. The environment has a
significant effect on transmissions and must also be represented. Also, the time it takes
to type data into a transmission device is important to capture when comparing
alternative systems. Finally, as discussed above, the simulation should model the soldier's
ability to transmit based upon the level of combat activity.
Specific transmission functions include:
• Talk
• Type
• Write
• Signal
Inputs and outputs of these functions include:
• Modeled Inputs: Transmission decisions listed above, entity attentional resources
due to current activity, equipment configuration, equipment status, bandwidth
load, bandwidth capacity, light conditions, weather conditions, terrain conditions
(especially line-of-sight [LOS]), soldier clothing and equipment effects (e.g.,
effects of MOPP gear), length of the message, background noise and interference
• Modeled Outputs: Verbal, typed, written, and signaled communications; change
in equipment status; reduced attentional resources; elapsed time; audio signature;
visual signature; additional load to bandwidth; reduced equipment power;
pyrotechnic effects on the environment; transmission success or failure; accuracy
of transmitted information; transmission quality
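Information loss due to bandwidth can be modeled with a simple admission check; the units and the one-second window are illustrative assumptions:

```python
def transmit(message_bits, load_bps, capacity_bps, window_s=1.0):
    """Attempt a transmission on a shared channel.

    Returns (success, new_load_bps). The message is dropped, and its
    information lost, if it does not fit in the capacity remaining
    after the current load - the bandwidth effect noted above.
    """
    free_bps = capacity_bps - load_bps
    if free_bps <= 0 or message_bits > free_bps * window_s:
        return False, load_bps
    return True, load_bps + message_bits / window_s
```

Running every digital transmission, including those from soldier-controlled UAVs and robots, through such a check is what surfaces information loss as unit-level traffic grows.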
3. Make Reception Decisions. The decision to receive a communication is not just
restricted to the traditional "listening to the radio." It includes other related acts like
watching for signals and listening for verbal communication, as well as viewing the
HMD, CDA, or GPS to collect information from other sources. As discussed many times
previously, PEO Soldier products involve the transfer of this additional information (up,
down, and laterally). However, the soldier does not automatically get this information.
He must know when and where to seek the information he needs. Therefore, these
decisions should be modeled.
Specific reception decisions include:
• Decide to Receive
• Decide How to Receive
Inputs and outputs of these decisions include:
• Modeled Inputs: Assessment of METT-TC; perception of the terrain; perception
of the enemy situation; visual, auditory, olfactory, tactile, and taste sensed
information; recent engagements; SOP; communication received; choice to
determine location; choice to locate the enemy
• Modeled Outputs: Reception decisions listed above
4. Receive. For the reasons discussed above, receiving communications from other
soldiers or information from data devices is critical to the success of PEO Soldier
equipped entities. It cannot be assumed that all information available to the soldier will be
received. Therefore, the distinction between what is received by the soldier and what is
not is critical for simulation. Here, we must account for the strengths and limitations of
the STMS (e.g. one configuration communicates with FBCB2, the other does not).
• Modeled Inputs: Reception decisions listed above, equipment configuration,
equipment status, bandwidth load, light conditions, weather conditions, terrain
conditions
• Modeled Outputs: Change in equipment status; reduced attentional resources;
elapsed time; reduced equipment power; visual, auditory, and tactile sensed
information; perceived information
5. Manipulate Communications Equipment. A soldier will likely manipulate his
communications equipment before, during, and after communicating. These actions take time
and draw on the soldier's attentional resources. As mentioned with other types of
equipment, usability issues are better studied using prototypes; however, the time
required to perform certain actions and the availability of the equipment should be
modeled in the simulation. To be realistic and to weigh the effects of equipment
configuration on unit effectiveness, these characteristics should be modeled in some way.
Specific communication equipment manipulation functions include:
• Change Communications Equipment Status
• Correct Communications Equipment Malfunction
Inputs and outputs of these functions include:
• Modeled Inputs: Transmission and reception decisions listed above, METT-TC
assessment, terrain conditions, weather conditions, equipment configuration,
equipment characteristics
• Modeled Outputs: Change in equipment status, reduced attentional resources,
elapsed time, audio signature
6. Process Communication Information. Here we are talking about the process of
taking the words, signals, sounds, or data received and converting those cues into
information that can be used elsewhere. The specifics of how this is done are most likely
beyond the scope of explicit modeling since the processes of the human brain are not
completely understood. However, what should be captured in some way in the
simulation are the effects of information overload on the soldier. Expected results would
include information that is missed, confused, or lost. Here also we should mention the
requirement to account for the effects of conflicting or contradictory information (e.g.
soldier perception vs. information received from others) as an output of this function that
would be an input into other functions and decisions. For example, as information comes
in, the soldier processes it and then acts upon it. The subsequent action depends upon the
information; if it conflicts with the soldier's current perception, then he may attempt to
clarify that information by entering into another (1) communication decide-act cycle, or
(2) sense decide-act cycle.
Specific communications processing functions include:
• Sort / Prioritize Communication Data
• Interpret Communication Data
Inputs and outputs of these functions include:
• Modeled Inputs: Visual, auditory, olfactory, tactile, and taste sensed cues;
environmental conditions; physiological and psychological conditions; sorted
communications data (output of one function; input into the other)
• Modeled Outputs: Information, changes in cognitive bandwidth (stimulated or
reduced), physiological conditions
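The sort/prioritize and interpret functions, with overload and conflict effects, can be sketched as follows; the message structure, capacity parameter, and priority scheme are illustrative assumptions:

```python
def process_messages(messages, capacity, perception):
    """Sort/prioritize then interpret received communications.

    messages: list of (priority, key, value) tuples
    capacity: how many messages the soldier can process (cognitive
        bandwidth); lower-priority messages beyond it are missed
    perception: dict of the soldier's current perceived information

    Returns the updated perception, the keys whose received values
    conflict with current perception (candidates for a clarifying
    communicate or sense decide-act cycle), and the keys of missed
    messages (information overload).
    """
    ranked = sorted(messages, key=lambda m: m[0], reverse=True)
    processed, missed = ranked[:capacity], ranked[capacity:]
    conflicts = []
    for _, key, value in processed:
        if key in perception and perception[key] != value:
            conflicts.append(key)  # contradicts current perception
        else:
            perception[key] = value
    return perception, conflicts, [key for _, key, _ in missed]
```

Tying `capacity` to current activity level (e.g., reducing it during a firefight) reproduces the overload effects discussed above.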
F. Enable and Making Enabling Decisions. The enable function serves to capture those
functions that a soldier performs that can apply to two or more of the major categories, or
functions that the soldier performs to assist in the execution of other functions. Thus, these
actions 'enable' the soldier to engage, move, communicate, and sense, either directly or
indirectly, as well as "operate" in the basic human sense.
1. Alter Surroundings and Making Decisions to Alter Surroundings
a) Make Decisions to Alter Terrain. In order to realistically represent a soldier in
combat, these decisions should be represented. A soldier in a position for any length
of time will begin to prepare a position, whether it is scratching out a hasty fighting
position and clearing his fields of fire, or preparing deliberate fighting positions.
Specific decisions to alter terrain include:
• Decide to Dig
• Decide to Build
• Decide to Alter Vegetation
Inputs and outputs of these decisions include:
• Modeled Inputs: Assessment of METT-TC, perception of the terrain and
vegetation, equipment available, SOP, communications from leaders
• Modeled Outputs: Decisions listed above
b) Alter Terrain. The actual functions of altering terrain may not have to be
modeled explicitly in the simulation. However, given a certain soil type and other
terrain characteristics, as well as the equipment available to the soldier, the model
should represent changes to the soldier's exposure over time, his ability to sense in
his sector (changing LOS), and his energy level.
Specific terrain altering functions include:
• Dig
• Build
• Alter Vegetation
Inputs and outputs of these functions include:
• Modeled Inputs: Decisions to alter terrain listed above, equipment available,
terrain characteristics (e.g., soil type, vegetation, water table), soldier energy
level
• Modeled Outputs: Change in soldier exposure, audio signature, reduced
energy level, other physiological effects, elapsed time, reduced attentional
resources, changes in terrain and vegetation, changes in soldier cover and
concealment
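The time-dependent effects described above (exposure and energy changing as the soldier digs) could be sketched as a simple update rule; the rates and the soil factor below are notional assumptions, not validated data:

```python
def dig_step(exposure, energy, soil_factor, dt):
    """One time step of position preparation (illustrative rates).

    Exposure (0..1) decreases as the position deepens; energy
    (0..1) is spent faster in harder soil (soil_factor > 1).
    """
    dig_rate = 0.05 * dt / soil_factor   # fraction of exposure removed per step
    effort   = 0.02 * dt * soil_factor   # energy cost of the step
    exposure = max(0.0, exposure - dig_rate)
    energy   = max(0.0, energy - effort)
    return exposure, energy

# dig until the position offers adequate cover or the soldier tires
exposure, energy, t = 1.0, 1.0, 0.0
while exposure > 0.3 and energy > 0.2:
    exposure, energy = dig_step(exposure, energy, soil_factor=1.5, dt=1.0)
    t += 1.0
print(round(exposure, 2), round(energy, 2), t)
```

Elapsed time, audio signature, and LOS changes could be charged in the same loop as additional outputs.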
c) Make Decisions to Alter Objects. These encompass decisions to clear a path for
movement (e.g., move or breach obstacles, open doors), to clear a field of fire (except
for altering terrain captured above, such as opening a window or "punching" a hole in
a wall), or to alter objects for other reasons. These decisions are especially important
in urban operations (UO) environments, in which soldiers must open doors and
windows, move furniture out of their way, etc. In other environments, these functions
would most importantly involve the soldier's decision to breach or remove obstacles
to his movement.
Specific decisions to alter objects include:
• Decide to Move Objects
• Decide to Change Objects
Inputs and outputs of these decisions include:
• Modeled Inputs: Assessment of METT-TC, equipment available, perceived
environmental characteristics, perception of the terrain, energy level,
physiological attributes, SOP
• Modeled Outputs: Decisions to alter objects listed above
d) Alter Objects. PEO Soldier programs develop equipment that can perform these
functions. For instance, shotguns can be used to breach locked doors, 40mm rounds can
be used to breach thin walls, etc. In order to capture the effects of these weapons,
especially in the UO environment, their capabilities should be modeled.
Specific alter object functions include:
• Move Objects
• Change Objects
Inputs and outputs of these functions include:
• Modeled Inputs: Decisions to alter objects listed above, object attributes,
equipment available, ammunition available, energy level, physiological
attributes, terrain conditions, weather conditions, light conditions
• Modeled Outputs: Change in object attributes, location, orientation, or
configuration; elapsed time; reduced energy level; audio signature;
physiological effects; reduced attentional resources; soldier exposure as a
target
2. Manipulate Load and Making Decisions to Manipulate Load
a) Decide to Tailor Load. During the course of a mission, the soldier may make
decisions to change his load. For instance, upon making contact, the soldier will
probably drop his rucksack until the conclusion of the engagement. When a unit
moves into an objective rally point (ORP), they may leave their rucksacks and
unnecessary equipment behind, under guard, until the mission is complete. A soldier
may choose to pick up an enemy weapon if his ammunition level is low. The soldier
may decide to shift equipment around to make some available for use (e.g., pulling
equipment from the rucksack or putting unnecessary equipment into it). Therefore, at
a minimum, the most basic and common of these decisions should be modeled in a
simulation.
Specific decisions to tailor load include:
• Decide to Add to Load
• Decide to Alter Load
• Decide to Reduce Load
Inputs and outputs of these decisions include:
• Modeled Inputs: Assessment of METT-TC, soldier equipment availability,
ammunition level, SOP, energy level, physiological attributes
• Modeled Outputs: Decisions to tailor load listed above
b) Tailor Load. The modeling of these functions is important in that they alter the
soldier's capabilities based on the change in availability of his equipment. The actual
location of the equipment on his body is not critical for modeling; however, his
change in capability and availability is. The ease of tailoring the load should also be
modeled.
Specific tailor load functions include:
• Add to Load
• Alter Load
• Reduce Load
Inputs and outputs of these functions include:
• Modeled Inputs: Decisions to tailor load listed above, equipment availability,
equipment characteristics and configuration, non-soldier equipment available
to him, energy level, physiological attributes
• Modeled Outputs: Changes to equipment availability, changes to equipment
configuration, change in equipment attributes, elapsed time, audio signature,
reduced attentional resources
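A minimal sketch of how tailoring the load changes equipment availability and carried weight; the item names, locations, and weights below are illustrative only, not drawn from any STMS configuration:

```python
def drop_rucksack(load):
    """Illustrative load tailoring: dropping the rucksack makes its
    contents unavailable to the soldier but reduces carried weight."""
    dropped = {k: v for k, v in load.items() if v["location"] == "rucksack"}
    kept = {k: v for k, v in load.items() if v["location"] != "rucksack"}
    weight = sum(v["kg"] for v in kept.values())
    return kept, dropped, weight

# notional load: item -> where it is carried and its weight
load = {
    "rifle":   {"location": "hands",    "kg": 4.0},
    "rations": {"location": "rucksack", "kg": 3.0},
    "radio":   {"location": "vest",     "kg": 1.5},
}
kept, dropped, weight = drop_rucksack(load)
print(sorted(kept), weight)
```

A simulation would feed the reduced weight into movement-rate and energy models, while the dropped items become unavailable until recovered.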
c) Decide to Carry Object. This decision is similar to the decision to tailor the load;
however, the intent in this decision is to move an object from one point to another,
instead of adding to a load with the intent to use it. The modeling of this decision and
the subsequent function is not critical for comparing STMS, but will help make the
simulation more realistic for certain situations, especially that of casualty evacuation.
• Modeled Inputs: Assessment of METT-TC, soldier load, perceived
environmental characteristics, perception of the terrain, energy level,
physiological attributes, SOP
• Modeled Outputs: Decision to carry an object
d) Carry Object. The function of carrying an object (or person) is critical for
realism in some situations, most notably, as mentioned above, casualty evacuation.
This function could be an individual soldier or a team of soldiers carrying an object.
• Modeled Inputs: Decision to carry an object, soldier load, environmental
conditions, energy level, physiological attributes, object attributes
• Modeled Outputs: Change in object location, change in load, change in
equipment attributes, audio signature, reduced energy level, physiological
effects, reduced attentional resources
3. Operate and Making Operation Decisions
a) Decide to Conduct Bodily Functions. These decisions do not necessarily need to be
modeled explicitly by the simulation, unless the analyst is interested in comparing the
advantages of various types of water storage equipment. Their corresponding
functions, on the other hand, or rather their effects, should be represented in some
way if scenarios are of extended duration.
Specific bodily function decisions include:
• Monitor Needs
• Decide to Consume
• Decide to Expel Waste
• Decide to Rest
Inputs and outputs of these decisions include:
• Modeled Inputs: Assessment of METT-TC, energy level, perceived needs
(hunger, thirst, exhaustion, etc), food availability, drink availability, SOP
• Modeled Outputs: Bodily function decisions listed above
b) Conduct Bodily Functions. As noted above, the functions themselves do not
need to be explicitly modeled; however, the effects of performing, or not performing,
those functions should be represented if the scenario is of extended duration. This would
ensure that such considerations as the amount and type of food carried, the water
carrying capacity, etc., are modeled for evaluation purposes. Also to be considered is the
ease with which these functions are performed given a specific STMS configuration.
This is important, since this will be among the first tests the STMS undergoes with
soldiers. The last piece may be the role of an engineering-level simulation, but the
results of that analysis should feed the mission-level simulation.
Specific bodily functions include:
• Consume
• Expel Waste
• Rest
Inputs and outputs of these functions include:
• Modeled Inputs: Decisions to conduct bodily functions listed above,
physiological attributes, water bearing equipment capacity, food
characteristics, liquid characteristics
• Modeled Outputs: Changes to physiological attributes, reduction in the
amount of food and water available, reduced attentional resources (rest)
c) Decide to Administer First Aid. The whole process of casualty treatment and
evacuation must be modeled. To start with, the decisions made by soldiers to treat
themselves or others have a tremendous impact on any engagement. Likewise, the
decision of a soldier or team of soldiers to evacuate a unit member results in a
significant reduction in the available fighting force. Thus, the decision to administer
first aid must be modeled to make the simulation more realistic.
Specific decisions to administer first aid include:
• Monitor Health
• Decide to Administer Self Aid
• Decide to Administer Buddy Aid
Inputs and outputs of these decisions include:
• Modeled Inputs: Assessment of METT-TC, sensed information (either self
monitored or sensed from another soldier), communications received,
physiological attributes, equipment available, environmental effects on
wounds, elapsed time since wounding
• Modeled Outputs: Decisions to administer first aid
d) Administer First Aid. The simulation must model the effects of performing first
aid, since it is a function performed by soldiers on the battlefield; however, the
representation need not be explicit, as long as the probabilistic effects are modeled.
Equipment such as clothing with built-in toumiquets or other medically-based
uniform construction will need a simulation that can model, in some way, the
application of first aid in order to evaluate their effectiveness.
Specific first aid functions include:
• Administer Self Aid
• Administer Buddy Aid
Inputs and outputs of these functions include:
• Modeled Inputs: Decisions to administer first aid, physiological attributes
(especially type and location of injury), equipment characteristics, equipment
available, skill and training of the soldier
• Modeled Outputs: Changes to physiological attributes, reduction in available
equipment, reduced attentional resources, reduced energy level, elapsed time
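The probabilistic (non-explicit) representation of first aid suggested above might be sketched as follows; the functional form and all coefficients are assumptions for illustration, not medical data:

```python
import random

def apply_first_aid(severity, skill, elapsed_min, rng):
    """Illustrative probabilistic first-aid model.

    The chance of stabilizing a casualty falls with wound severity
    (0..1) and with delay since wounding, and rises with the
    treating soldier's skill (0..1). Coefficients are notional.
    """
    p = skill * (1.0 - severity) * max(0.0, 1.0 - elapsed_min / 60.0)
    return rng.random() < p

# estimate the stabilization rate for a moderate wound treated quickly
rng = random.Random(7)
outcomes = [apply_first_aid(0.4, 0.8, 10.0, rng) for _ in range(1000)]
rate = sum(outcomes) / len(outcomes)
print(round(rate, 2))
```

Only the draw's outcome (stabilized or not) and the elapsed time need to flow back into the mission-level simulation; the treatment steps themselves stay implicit.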
e) Make Equipment Operating Decisions. These decisions encompass the full
array of logistical decisions. With each new STMS or piece of equipment issued to
the soldier, the systems become more complex. This dependence upon technology
has many important implications for logistical support and maintenance. Therefore,
the decisions to perform these functions must be modeled, especially since it is
unlikely that a soldier will be able to repair equipment immediately after it fails.
Specific equipment operating decisions include:
• Monitor Equipment
• Decide to Maintain
• Decide to Sustain / Resupply / Redistribute
• Decide to Repair
Inputs and outputs of these decisions include:
• Modeled Inputs: Assessment of METT-TC, equipment characteristics,
equipment status, repair or maintenance equipment available, level of
maintenance/repair required, training of the soldier, SOP, availability of all
classes of supplies
• Modeled Outputs: Equipment operating decisions listed above
f) Keep Equipment Operating. Included in these soldier functions are the
representations of reliability, failure rates, types of failure, equipment integration
issues, durability, equipment damage, power source capacity, availability of
supplies from external sources, repair (service) times, maintenance times, etc.
Specific equipment operating functions include:
• Maintain
• Sustain / Resupply / Redistribute
• Repair
Inputs and outputs of these functions include:
• Modeled Inputs: Equipment operating decisions listed above, equipment
characteristics, equipment status, repair and maintenance equipment
available, training of the soldier, availability of all classes of supplies
• Modeled Outputs: Change in equipment status, change in level of supplies,
reduced attentional resources, physiological effects, elapsed time, change in
equipment characteristics
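A sketch of the reliability and repair-time representation described above, assuming memoryless (exponential) failures, the standard starting point in reliability modeling; the MTBF and MTTR values are notional:

```python
import random

def time_to_failure(mtbf_hours, rng):
    """Exponential time-to-failure under a memoryless failure assumption."""
    return rng.expovariate(1.0 / mtbf_hours)

def mission_availability(mtbf, mttr, mission_hours, reps, seed=0):
    """Fraction of replications in which the item either survives the
    mission without failing or fails early enough to be repaired
    (taking mttr hours) before the mission ends."""
    rng = random.Random(seed)
    ok = 0
    for _ in range(reps):
        t = time_to_failure(mtbf, rng)
        if t >= mission_hours or t + mttr <= mission_hours:
            ok += 1
    return ok / reps

print(round(mission_availability(mtbf=100.0, mttr=2.0,
                                 mission_hours=12.0, reps=10_000), 3))
```

More detailed failure-mode, damage, and supply models would replace the single exponential draw, but the replication structure stays the same.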
Annex A. US Army Tactical Mission Tasks and Operations from FM 7-15
ART 8.0: CONDUCT TACTICAL MISSION TASKS AND OPERATIONS
ART 8.1: Conduct Offensive Operations
ART 8.1.1 Conduct a Movement to Contact
ART 8.1.2 Conduct an Attack
ART 8.1.3 Conduct an Exploitation
ART 8.1.4 Conduct a Pursuit
ART 8.1.5 Conduct One of the Five Forms of Maneuver
ART 8.2: Conduct Defensive Operations
ART 8.2.1 Conduct an Area Defense
ART 8.2.2 Conduct a Mobile Defense
ART 8.2.3 Conduct a Retrograde
ART 8.3: Conduct Stability Operations
ART 8.3.1 Conduct Peace Operations
ART 8.3.2 Conduct Foreign Internal Defense Operations
ART 8.3.3 Conduct Security Assistance
ART 8.3.4 Conduct Humanitarian and Civic Assistance
ART 8.3.5 Provide Support to Insurgencies
ART 8.3.6 Support Counterdrug Operations
ART 8.3.7 Combat Terrorism
ART 8.3.8 Perform Noncombatant Evacuation Operations
ART 8.3.9 Conduct Arms Control Operations
ART 8.3.10 Conduct a Show of Force
ART 8.4: Conduct Support Operations
ART 8.4.1 Conduct Domestic Support Operations
ART 8.4.2 Conduct Foreign Humanitarian Assistance
ART 8.4.3 Conduct Forms of Support Operations
ART 8.5: Conduct Tactical Mission Tasks
ART 8.5.1 Attack by Fire an Enemy Force/Position
ART 8.5.2 Block an Enemy Force
ART 8.5.3 Breach Enemy Defensive Positions
ART 8.5.4 Bypass Enemy Obstacles/Forces/Positions
ART 8.5.5 Canalize Enemy Movement
ART 8.5.6 Clear Enemy Forces
ART 8.5.7 Conduct Counterreconnaissance
ART 8.5.8 Contain an Enemy Force
ART 8.5.9 Control an Area
ART 8.5.10 Defeat an Enemy Force
ART 8.5.11 Destroy a Designated Enemy Force/Position
ART 8.5.12 Disengage from a Designated Enemy Force
ART 8.5.13 Disrupt a Designated Enemy Force's Formation/Tempo/Timetable
ART 8.5.14 Conduct an Exfiltration
ART 8.5.15 Fix an Enemy Force
ART 8.5.16 Follow and Assume the Missions of a Friendly Force
ART 8.5.17 Follow and Support the Actions of a Friendly Force
ART 8.5.18 Interdict an Area/Route to Prevent/Disrupt/ Delay its Use by an Enemy Force
ART 8.5.19 Isolate an Enemy Force
ART 8.5.20 Neutralize an Enemy Force
ART 8.5.21 Occupy an Area
ART 8.5.22 Reduce an Encircled/Bypassed Enemy Force
ART 8.5.23 Retain a Terrain Feature
ART 8.5.24 Secure a Unit/Facility/Location
ART 8.5.25 Seize an Area
ART 8.5.26 Support By Fire the Maneuver of Another Friendly Force
ART 8.5.27 Suppress a Force/Weapon System
ART 8.5.28 Turn an Enemy Force
ART 8.5.29 Conduct Combat Search and Rescue
ART 8.5.30 Conduct Consolidation and Reorganization Activities
ART 8.5.31 Reconstitute Tactical Forces
Annex B. Joint Mission Tasks from CJCSM 3500.04C
Note: Many tasks have been deleted from earlier versions of CJCSM 3500.04C.
TA 1 Deploy/Conduct Maneuver
TA 1.1 Deleted
TA 1.1.1 Conduct Tactical Airlift
TA 1.1.2 Conduct Shipboard Deck Helicopter Landing Qualifications
TA 1.1.3 Deleted
TA 1.1.4 Conduct Sea and Air Deployment Operations
TA 1.2 Conduct Passage of Lines
TA 1.2.1 Conduct Air Assault Operations and Air Assault
TA 1.2.2 Conduct Airborne Operations
TA 1.2.3 Conduct Amphibious Assault and Raid Operations
TA 1.2.4 Conduct Counterdrug Operations
TA 1.3 Conduct Countermine Operations
TA 1.4 Conduct Mine Operations
TA 1.5 Deleted
TA 1.5.1 Deleted
TA 1.5.2 Deleted
TA 2 Develop Intelligence
TA 2.1 Deleted
TA 2.2 Deleted
TA 2.3 Deleted
TA 2.4 Disseminate Tactical Warning Information and Attack Assessment
TA 2.5 Deleted
TA 3 Employ Firepower
TA 3.1 Deleted
TA 3.1.1 Deleted
TA 3.2 Deleted
TA 3.2.1 Conduct Fire Support
TA 3.2.2 Conduct Close Air Support
TA 3.2.3 Conduct Interdiction Operations
TA 3.2.4 Conduct Joint Suppression of Enemy Air Defenses (JSEAD)
TA 3.2.5 Deleted
TA 3.2.6 Conduct Attacks Using Nonlethal Means
TA 3.2.7 Conduct Air and Missile Defense Operations
TA 3.2.8 Conduct Air To Air Operations
TA 3.3 Coordinate Battlespace Maneuver and Integrate With Firepower
TA 3.4 Deleted
TA 4 Perform Logistics and Combat Service Support
TA 4.1 Deleted
TA 4.2 Distribute Supplies and Provide Transport Services
TA 4.2.1 Deleted
TA 4.2.2 Deleted
TA 4.2.3 Conduct Air Refueling
TA 4.2.4 Deleted
TA 4.3 Deleted
TA 4.4 Conduct Joint Logistics Over-the-Shore Operations (JLOTS)
TA 4.5 Deleted
TA 4.6 Deleted
TA 5 Exercise Command and Control
TA 5.1 Deleted
TA 5.2 Deleted
TA 5.2.1 Establish, Operate and Maintain Baseline Information Exchange
TA 5.2.2 Deleted
TA 5.3 Deleted
TA 5.4 Deleted
TA 5.5 Deleted
TA 5.5.1 Conduct Force Link-Up
TA 5.6 Employ Tactical Information Operations
TA 6 Protect the Force
TA 6.1 Deleted
TA 6.2 Conduct Joint Personnel Recovery
TA 6.2.1 Deleted
TA 6.3 Conduct Rear Area Security
TA 6.4 Conduct Noncombatant Evacuation
TA 6.4.1 Deleted
TA 6.5 Provide For Combat Identification
TA 6.6 Deleted
TA 6.7 Deleted
TA 7 Operate In a CBRNE Environment
TA 7.1 Conduct Mission Operations in a CBRNE Environment
Appendix D: Engineering Problem Statement
Soldier Tactical Mission System (STMS) Simulation
Requirement
(Effective Need)
Identify and/or develop tactical combat simulation capability for Light Infantry missions
at the level of Platoon and below with resolution down to the individual Soldier. The simulation
capability must accept, as input, scenarios and Soldier STMS characteristics. It must model the
functions of the Soldier in a tactical environment, and provide, as output, the measures of
effectiveness (MOEs) used to evaluate STMS. The simulation(s) will provide the analytical
capability to support Program Executive Office (PEO) Soldier decision making.
Simulation Model
The Soldier will remain the key to victory on any Joint mission in which he is involved.
Soldiers will accomplish their assigned tasks within units operating as part of the combined arms
team in both Joint and Coalition task organizations. Missions will span the spectrum of Joint and
Army operations. Because of this, the simulation model must capture the full array of these
potential scenarios and evaluate the effectiveness of the STMS in Joint and Army missions. No
longer can we be Army-centric in our consideration of materiel solutions.
Soldier attributes and capabilities are integral to operational success in all circumstances.
The Soldier must be fully capable of accomplishing the mission and surviving to carry out future
missions in situations that will be complex, uncertain, high risk, and rapidly changing.
Environment types (urban, jungle, desert, plains/rural, forest/woodland, arctic, mountains,
littoral), dynamic terrain (relief, vegetation, soil composition, water, subterranean, built-up
areas, man-made features), and dynamic climate (weather, light conditions, man-made
conditions) will include all possibilities.
The simulation capability then must represent the Soldier in any combination of the
above situations and conditions. It must represent the combined arms team with which the
Soldier interacts during the conduct of his mission, including higher and lateral headquarters
elements. It must model the weapon systems and ammunition that the Soldier carries, brings to
bear, or faces on the battlefield, to include effects on all types of targets and the environment. It
must represent system reliability and power requirements. The simulation capability must
consider the network-centric characteristics of future warfare.
The simulation(s) must model the functions of the Soldier performed during the conduct
of a mission. On a macro level, the Soldier either acts or decides (a word used to describe all
mental processes). Soldier act functions include sense, engage, move, communicate, and enable.
The enable function includes those functions the Soldier performs to assist in the execution of
other functions. Such enabling functions include altering the Soldier's surroundings,
manipulating his load, and operating (conducting bodily functions, administering first aid, and
keeping equipment operating). Soldier decide functions include assessing the current situation
and making sensing, engagement, movement, communication, and enabling decisions.
In addition to the Soldier functions, the simulation(s) must represent Soldier attributes,
and how those attributes affect the functions he performs, i.e., how he transforms inputs to
outputs. Attribute types include mission (e.g., doctrine, ROE, SOP, and TTPs), personal (e.g.,
physical, physiological, psychological, and mental), and equipment (e.g., weapons and
ammunition, sensor, communications, and clothing).
STMS Characteristics
Given the simulation modeling capability, the simulation(s) must enable quantification
and analysis of the effects of differences in mission capability, survivability, and trustworthiness
of STMS as a function of variation in its attributes and characteristics.
The mission capability and survivability of the STMS is a function of lethality, mobility,
protection, communications, and situational awareness. A cardinal characteristic of the STMS is
the degree to which it is trustworthy, as manifested by reliability, availability, maintainability,
sustainability, and usability.
A more detailed discussion of STMS characteristics can be found later in this appendix.
Simulation Characteristics
The simulation(s) must interface with the user, process inputs, control processes,
transform inputs to outputs, process outputs, and maintain, self-test, and manage redundancy.
Interfacing with the user includes being intuitive (Windows®-like, menu-driven) and
understandable. It must allow the user to input scenarios and STMS characteristics, as well as
alter entity behaviors and TTPs. It must allow for the extraction of stored data at any time during
scenario execution and have a playback capability to view the scenario.
Processing inputs includes managing the simulation's internal data conversion (from
inputs) processes, interfacing with other simulations, and accepting data inputs from various
sources.
Controlling processes includes HLA compliance, flexibility that allows the integration of
additional capabilities, non human-in-the-loop (HITL) simulation execution, stochastic
repeatability, fidelity control, preprocessing capability, and optimized run-time.
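Stochastic repeatability, as required above, is typically achieved by recording the random seed of each replication so any run can be reproduced exactly; a minimal sketch with a notional scenario function:

```python
import random

def run_scenario(seed, steps=100):
    """Illustrative stochastic run: each replication is driven by its
    own seeded generator, so recording the seed makes the entire
    replication exactly reproducible (stochastic repeatability)."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(steps) if rng.random() < 0.3)
    return hits

a = run_scenario(seed=42)
b = run_scenario(seed=42)   # same seed -> identical replication
c = run_scenario(seed=43)   # new seed  -> independent replication
print(a == b, a, c)
```

The same mechanism supports batch (non-HITL) execution: a matrix of cases, each with a recorded seed, can be replayed or debugged without operator intervention.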
Transforming inputs to outputs includes the simulation model characteristics discussed
above.
Processing outputs includes management of the simulation's internal data conversion (to
outputs) processes, continuous or event-driven output capability, interface with external
simulations, and exportation of data to external sources.
Maintaining, self-testing, and managing redundancy includes the clear identification and
presentation of errors, traceability of errors, and data backup capability to avoid catastrophic data
loss.
Discussion of STMS Characteristics
Mission Capable and Survivable
Lethality denotes the ability to acquire potential targets, to assess their characteristics, to
engage them at ranges from close-in to extended (by direct or indirect fires), and to ensure they
are fully suppressed or defeated.
Mobility demands deployability at all levels (strategic, operational, tactical, & individual)
and by all means (air, land, & sea). Strategic mobility denotes deployment inter-theater aboard
USAF aircraft and USN ships. Operational mobility requires efficient transportation of STMS
within theater by air-transport (fixed or rotary wing), truck, and rail. Tactical mobility demands
quick, agile movement from company-level down to the individual Soldier (by air and on the
ground). The STMS must be amenable to tactical air assault; airborne operations; movement on
personnel carriers, fighting vehicles, trucks; and foot-movement (approach and assault). Soldiers
must remain agile (quickly find cover and concealment) and must be able to rapidly tailor their
load for the assault.
Protection must defeat: small-caliber ballistics, light shrapnel, lasers, chemical
munitions, biological attacks, fratricide (IFF), and weather-related operational challenges.
The STMS must communicate intra-squad and between immediate and adjacent units. It
must be able to transmit to and receive digitized information within the chain-of-command.
Situational awareness for Soldiers denotes knowledge, at the squad level, of approximate
friendly and enemy dispositions, environmental factors, and viable options.
Trustworthy
Reliable: The STMS will function without hardware or software failure for the duration
of the mission.
Available: The components of the STMS required for a mission will be ready for use at
the time they are needed.
Maintainable: Components of the STMS requiring servicing or repair can be returned to
operational status within a specified time.
Sustainable: The STMS when deployed on a mission can be resupplied so as to continue
a mission.
Usable: The STMS can be employed by Soldiers after a specified period of training and
experience.
Technologically Advanced
In order to meet the requirement for mission capability, the STMS must be interoperable
with present and anticipated related tactical systems. It must be competitive with other known
and anticipated systems available to adversaries, enabling our Soldiers to meet or exceed threat
capabilities.
Cost-Effective
The contribution of STMS to mission accomplishment and survivability must justify lifecycle costs.
Appendix E: Soldier MAWG Questionnaire
NOTE: The Soldier MAWG surveys were given to us courtesy of the Soldier MAWG team at
TRAC-WSMR. These are provided here to give an indication of the questions asked and
the type of information obtained. However, the results of and answers to these surveys
will not be published due to the sensitive nature of the content. Results in the aggregate
can be found in the Soldier MAWG Evaluation Report (Larimer, et al, 2004).
SURVEY QUESTIONS:
Model Characteristics Questions: These questions provide general information on the model.
1. Has the simulation been Verified, Validated, and Accredited for any U.S. Army, USMC, or
other DoD studies?
If so, who is the model's accrediting agency and what studies have used the simulation?
2. What level of unit resolution can be represented? What is this model's optimal level of
resolution?
3. Has the model been certified for any study related to soldier or small unit issues?
4. Who are your past and present users?
5. How lengthy is scenario development?
6. What scenarios has this simulation used?
7. How lengthy is scenario (replication) run time?
Given operating system/platform and size/scope of scenario?
Given mode of execution (single replicate, batch run, operator/analyst-in-the-loop)?
What is the range of synchronization rates/ratios used (1:1, 1000:1, ...)?
8. What are the federations or other simulations with which this model has participated /
partnered?
9. When is your next release? What improvements are planned?
10. Who has proprietary rights to this model?
11. How flexible is this model to user requirements (adjust source code)?
12. What is the source of terrain data? Where and how is it edited? How easy is it to import
NIMA products into the model?
13. How easily can equipment and their effects be generated? That is, can equipment that does
not currently exist in the model be easily created?
14. How easily can outputs be tailored to the user's needs?
15. Is the model constructive (stand-alone), or does it require man-in-the-loop?
16. Monte Carlo or deterministic?
17. If Monte Carlo, how many replications are typically run?
18. Are statistics produced for the output measures (e.g., confidence intervals)?
19. Can a matrix of cases be set up ahead of time and run without intervention (i.e., batch mode)?
20. Is the model event-sequenced or time-sequenced?
21. What language is it written in?
22. On what computer platforms does it run?
23. What operating system?
24. What are the software and hardware requirements for running the model?
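Questions 16-18 above concern Monte Carlo replications and output statistics; a minimal sketch (with a notional model and made-up parameters) of running replications and reporting a 95% confidence interval on the mean:

```python
import random
import statistics

def replicate(model, n, seed=0):
    """Run n independent replications of a stochastic model and report
    the sample mean with a 95% confidence interval (normal approximation)."""
    rng = random.Random(seed)
    xs = [model(rng) for _ in range(n)]
    mean = statistics.fmean(xs)
    half = 1.96 * statistics.stdev(xs) / n ** 0.5
    return mean, (mean - half, mean + half)

# notional output measure: targets hit out of 10 shots at p = 0.4
model = lambda rng: sum(rng.random() < 0.4 for _ in range(10))
mean, (lo, hi) = replicate(model, n=2000)
print(round(mean, 2), round(lo, 2), round(hi, 2))
```

Batch-mode execution (question 19) amounts to calling `replicate` over a pre-built matrix of cases, each with its own recorded seed.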
Model Evaluation Questionnaire for the Soldier System
The following pages contain a series of statements about key phenomena necessary to address
individual and small unit analysis issues. For each statement, we ask the model
proponent to answer the following questions:
1. Do you think we have adequate knowledge of the real-world phenomena to represent them in
M&S?
2. How does the model represent the phenomena? (algorithms) Do you think the model
adequately represents the phenomena? (If you do not directly represent the phenomena,
what is your reasoning/justification for implementing a model shortcut?)
3. Who developed the model's representation of the phenomena? (Organization and POC if
possible)
4. How would you like to represent the phenomena?
5. Does data exist that supports the model's representation of the phenomena? Where are data
gaps? What are the most important gaps? What are the sources of data?
Modeling Situational Awareness/ C4I
Situational Awareness
1. Information known by the soldier in terms of the following factors:
Knowledge of own location
Knowledge of friendly unit/personnel locations
Knowledge of enemy unit/personnel locations (ground truth versus effects of deception
operations/propaganda/PSYOPS, achievement of surprise).
Knowledge of terrain features (obstacles, roads, bodies of water, contours, etc.)
2. Quality of the soldier's knowledge (i.e. timeliness, error, confidence, etc).
3. The cognitive process whereby the soldier transforms information into awareness and
understanding (fusion, association, projection, etc.).
4. Soldier decision-making and behavior, and how they are influenced by situational
understanding (causal relationships between information and force effectiveness
outcomes). Can the soldier autonomously make decisions based on his SA?
5. Effects of conflicting or contradictory information (e.g., perceived by soldier entity versus
reported by others/portrayed in COP/CROP).
Communications
1. Soldier level communication means.
Voice, radio
Voice, face-to-face
Visual, pyrotechnics
Data, radio (wireless network)
Visual, hand and arm signals
2. Threat communications.
3. Communication degradation between soldiers/units affected by the following factors:
Weather (rain, fog, snow, etc.)
Terrain (mountains, buildings, etc.)
Soldier's MOPP level (i.e., communication affected by wearing a protective mask).
Radio types
Jamming (EW)
Weapons effects
4. Partial message reception or message misunderstanding.
5. The processes by which the soldier gains and disseminates information (e.g. pushing and
pulling information to/from the COP or other soldiers).
6. The process by which non-human sensors (UAVs, robots, etc.) provide information to the
soldier.
Command and Control
1. OPORD and FRAGO planning and preparation activities.
2. Issuing of orders at all levels (team, squad, platoon, etc).
3. C2 of robots and semi-autonomous robots.
4. Soldier's and unit's ability to recognize the need to commit reserves, alter plans to respond to
emerging threats.
Intelligence
1. Information requirements and prioritization (CCIR).
2. Execution of a reconnaissance and surveillance plan in support of tactical operations.
3. Redirecting (dynamically re-tasking) sensors based on suspected threat situation (confirm or
deny incomplete information provided by another sensor).
Modeling Lethality
Acquisition: (please consider humans, robots, and UAVs)
1. Changes in the probability of acquisition of enemy targets given the following factors:
Level of ambient light (i.e., day/night/transition)
Natural & man-made obscurants (rain, snow, fog, smoke, dust, etc.)
Terrain (sandy, mountainous, MOUT, etc.)
Artificial light (indoor engagements)
2. Availability and use of different acquisition systems based on the situation (night vision
devices, thermal sights, iron sights, video sight, etc.).
3. Combined effects of multiple sensors being used by a single soldier.
4. Change of acquisition ability based on the soldier's wear of MOPP equipment (i.e. protective
mask).
5. Target acquisition adjustment factors based on the acquisition system.
Different degrees of acquisition (i.e. detect, recognize, classify, identify)
Size of target (length, height, width)
Thermal signature of the target
Optical contrast of the target against the target's background.
Changing optical contrasts
Acquisition system field of view
Soldier's ability to scan a sector of fire with a limited field of view acquisition system
Changing target profiles
Artificial illumination
Foliage density and type
Viewer posture
Direction soldier is facing; likelihood soldier scans/looks to sides and rear versus
maintains attention on assigned sector
Range from soldier entity to sensor system being used to observe the target (if using
robotics, UAVs)
Target activity
6. Soldier entities passing target information to other soldier entities (i.e. can soldier use
information provided by another entity to orient his search/sector of fire).
7. Probability of soldier detection according to the following factors:
Type of personal camouflage worn
Electronic signature produced by the soldier's equipment
Thermal signature produced by the soldier and his equipment
8. The sound (audible) signature produced by the soldier and his equipment (unique to the
system so that one soldier can be "noisier" than another) (e.g., while digging hasty
defensive positions)
The olfactory signature (smell/odor/vapors) produced by the soldier and his equipment
(detection by domestic animals/guard dogs)
Presence of biologicals (bioluminescence flagging combat swimmers, craft; animals of
all types potentially reacting to entity's presence)
9. Seismic, acoustic, and magnetic sensors.
10. Soldier scanning.
Firing Decision and Engagement Process:
1. Acquisitions and responses based on knowledge of the battlefield, not just on seeing an
enemy entity. Example: soldiers firing into an area where they think the enemy is based
on such things as muzzle flashes or last known location rather than just at a specific, seen
target.
2. Acquire, decide, and engage in close quarters combat.
With or without employment of robotics (air, surface (urban/aboard large vessels),
subsurface/caves).
Quick reaction drill.
Are robots (if used) solely sensor platforms or are they armed (lethal or non-lethal)?
3. Acquire, decide, and engage with non-combatants.
4. ROE and its effect on the acquire, decide, and engage process.
5. Soldier selection of engagement type given the target and weapons and other assets available
(e.g. soldier decision to engage with direct fire, indirect fire, or precision munitions).
6. Target evaluation and selection (i.e. if soldier acquires three targets at the same time, he has to
engage one before the other two).
7. Pre-planned fires (direct and indirect) in support of friendly movement.
8. Sector of fire discipline.
9. Soldier's engagement procedure as input to networked fires.
Weapon Effects:
1. Direct fire weapon system effects based on the following factors:
Probability of a hit given engagement
SSPH
PH, given a sensed miss
PH, given miss not sensed
Probability of a kill given a hit
Probability of incapacitation given a hit
Range of weapon
Rate of fire
2. Indirect fire weapons system factors:
Range of the weapon
Lethal radius
Ballistic error
Aim error
Target location error
Probability of a kill given a hit
Probability of incapacitation given a hit
Probability of incapacitation given blast effects (hearing loss, organ failure)
Ability of selected soldier entities to adjust fire and reduce target location error
Suppression effects of indirect fire systems
3. Non-lethal weapons and their effects.
4. Different mechanisms for direct fire effects and how terrain influences them.
Fragmentation
Heat and flame
Pressure
Laser
Bursting munitions
5. Non-lethal wounds and timeliness of medical care. Environmental effects on wounds.
6. Chemical weapons and their effects.
7. Playing of non-traditional weapon/target pairings (e.g. RPG versus personnel or Hellfire
missile into a room).
8. Target designators and their effects on PH.
Tracer rounds
Laser pointers
Precision guided munitions and characteristics of lasing platform
9. Reduced exposure firing and its effect on acquisition and PH.
10. Secondary effects of weapons and collateral damage.
External and internal damage to buildings
APS
Weapon backblasts
Main gun round spalling
Structural damage (both internal and external)
Incidental weapons effects on dismounts (e.g. debris flying off of a building)
11. Laser range finders.
12. Battle Damage Assessment (i.e. how does the shooter know the effects of his rounds?).
13. Flyby effects (i.e., if a round misses its target, is there an effect on other entities/structures
located near the target?).
14. Weapons being destroyed or damaged by weapons effects.
15. How PH is affected by firer's posture.
16. Ability to pick up / transfer discarded equipment.
17. The following weapon/sensor pairings:
CCO with M4 (day and night)
M203 day/night sight
M4 with TWS (day and night)
Threat with iron sights at night
Hand grenade against all targets
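The direct-fire factors listed under item 1 above (probability of a hit, probability of a kill or incapacitation given a hit, rate of fire) combine in the standard single-shot engagement chain. A minimal sketch of that arithmetic, using invented illustrative values rather than figures from this report:

```python
def p_incapacitate(ph: float, pi_given_hit: float, shots: int = 1) -> float:
    """Probability that at least one of `shots` rounds incapacitates the
    target, assuming independent shots: each shot incapacitates with
    probability P(hit) * P(incapacitation | hit)."""
    p_single = ph * pi_given_hit            # per-shot incapacitation probability
    return 1.0 - (1.0 - p_single) ** shots  # at least one incapacitating hit

# Invented illustrative values (not from the report):
print(round(p_incapacitate(ph=0.3, pi_given_hit=0.5, shots=5), 4))  # -> 0.5563
```

The independence assumption is a simplification; factors such as a sensed miss (SSPH adjustments) would condition later shots on earlier outcomes.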
Suppression
1. The events that cause a soldier or unit to be suppressed.
2. The soldier's response (behavior) to being suppressed.
Modeling Mobility
1. Soldier's mobility and how it is affected by the following factors:
Weight of the soldier's load (tires quicker, moves slower with heavier load)
Restrictive equipment (bulk - limited range of motion for fine motor skills)
Environment (up-hill movement vs. down-hill, soil type)
Weather (icy, foggy, rainy, etc.)
Day/night differences
Altitude
Temperature
Obstacles
2. Soldier fatigue and its effects on movement and task performance.
For long duration exertions
For short duration exertions
3. Individual movement techniques.
Soldier's ability to recognize danger areas and unit's employment of proper danger area
crossing techniques to minimize exposure to observation/fires (including posting of
security) and use of rally points.
Soldier's and unit's ability to recognize the need for and employ alternate routes (e.g.,
bypass obstacles, danger areas, identify best covered and concealed approaches)
Soldier's and unit's ability to recognize need to move to alternate positions.
4. Water crossings / beach landings. Do they account for:
Water depth
Current
Water temperature
Embankment / beach characteristics
5. Robot mobility.
6. Special movements.
Climbing / descending a ladder
Climbing / descending a rope
Crawling through a tunnel
Mounting / dismounting a vehicle, aircraft or other platform
Soldier carrying another soldier
Skiing, ski-joring, snow-shoeing (including while pulling ahkios)
Combat swimming (surface and subsurface)
Paddling rubber inflatable boats (CRRCs)
Parachuting (free-fall, static-line; steerable or not), including use of weapon/equipment
containers and door bundles
7. Dynamic soldier's loads.
8. Non-enhanced land navigation (getting lost/misoriented/causing breaks in contact).
9. GPS usage in support of land navigation and jamming of GPS.
10. Unit formations
Modeling Individual Soldier Survivability (Protection)
1. The probability a soldier is killed/injured given he is hit varies with respect to the following
factors:
The type of round that strikes the soldier
The location of the round striking the soldier (head, torso, etc.)
The type & body coverage of body armor the soldier is wearing
The angle at which the round strikes the body armor or helmet
The distance from the weapon to the target soldier (i.e. accounts for the loss in bullet
velocity over distance)
The likelihood the round first passed through foliage/plaster/doors/windows/water
(causing projectile tumble, deflection, velocity reduction, etc).
2. Assessment of appropriate types of injuries, incapacitation, or death based on the factors in
phenomenon #1 above.
3. Fratricide.
4. Fratricide prevention systems (IFF).
5. Change of MOPP status during the operation.
6. Casualty assessment on individuals riding inside of destroyed vehicles (for both crew and
mounted passengers).
7. Casualty treatment and evacuation (including disposition/redistribution of critical items).
8. Body armor: surviving impact but still being incapacitated or degraded.
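The conditional casualty logic in items 1–2 above is often implemented as a lookup keyed on the listed factors (round type, hit location, armor coverage, and so on). A structural sketch; every probability below is an invented placeholder, not data from this report:

```python
# Hypothetical table: (hit location, armor covers location) -> P(incapacitation | hit).
# All values are invented placeholders purely to illustrate the structure.
P_INCAP_GIVEN_HIT = {
    ("head", True): 0.30,
    ("head", False): 0.90,
    ("torso", True): 0.10,
    ("torso", False): 0.70,
}

def assess_hit(location: str, armor_covers: bool) -> float:
    """Return the conditional incapacitation probability for a hit."""
    return P_INCAP_GIVEN_HIT[(location, armor_covers)]

print(assess_hit("torso", True))  # -> 0.1
```

A fuller model would extend the key with round type, strike angle, and range-dependent velocity loss, as the factor list suggests.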
Human Factors (for both red and blue forces)
1. Training (level of performance based on skill achieved by training prior to model execution; a model input).
2. Learning (changes in behavior based on events that occur during the course of a model run or
simulated operation).
3. Motivation (will to fight).
Soldier obedience (independent action, initiative, response to leader orders when other
soldier entities elect to follow orders or run, hide, surrender, disobey orders [e.g.,
willingness to accept decisive engagement versus withdraw before being decisively
engaged])
Soldier acceptance of pressure to act or not act, based upon perceptions/actions of fellow
soldier entities.
4. Experience.
5. Unit cohesion.
6. Fatigue.
Do you represent energy level use (fatigue) or build-up (nutrition intake)?
Do you model hydration levels?
Do you represent the effects of these levels on activity/task performance?
7. Effects of casualties on the surviving unit members.
Soldier Equipment Trustworthiness
Modeling Maintainability:
1. Repair/replacement of damaged equipment.
2. Different repair/replacement times for different soldier equipment components.
Modeling Sustainability
1. Logistical concerns:
Battery failure over time
Logistical re-supply operations (explicit portrayal by specific classes of supply
(e.g., munitions, Class V); impacts on continuity/potential compromise of small
unit/special operations)
Effects of the environment on re-supply requirements (batteries fail sooner in cold
weather, etc.)
Modeling Reliability:
1. Failures of different components of the soldier's personal equipment such as weapons,
protective gear, acquisition components, etc.
2. Mean Time to Failure (MTTF) used to generate component failures.
3. Reduced capability given an equipment failure (e.g., if a thermal sight fails, can the soldier
use iron sights until the thermal sight is repaired/replaced?).
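Item 2's use of MTTF to generate component failures is commonly realized by sampling a time-to-failure from an exponential (memoryless) distribution whose mean equals the MTTF. A sketch under that common assumption:

```python
import random

def sample_failure_time(mttf_hours: float, rng: random.Random) -> float:
    """Draw a time-to-failure from an exponential distribution with
    mean equal to the component's MTTF (a common modeling assumption)."""
    return rng.expovariate(1.0 / mttf_hours)

# The sample mean should sit near the assumed 500-hour MTTF:
rng = random.Random(42)
times = [sample_failure_time(500.0, rng) for _ in range(10_000)]
print(round(sum(times) / len(times), 1))
```

Other lifetime distributions (e.g., Weibull, to capture wear-out) fit the same interface if the memoryless assumption is too strong.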
Modeling the Physical Environment
1. The following types of terrain and the availability/effects of cover and concealment and
movement at the individual soldier level:
Urban (MOUT)
Jungle
Mountain
Desert
Plains
Woodland
2. Environmental characteristics and their effects:
Temperature (extreme cold/hot)
Winds (and the wind's effect on obscurants/chemical & biological agents)
Diurnal effects (changing shadows)
Precipitation (rain, snow, hail, sleet)
3. Battlefield obscurants:
Smoke (generated and natural)
Dust as a result of battlefield conditions
Atmospheric (fog, haze, light conditions)
4. Changes to the environment (dynamic terrain):
Rubble
Blowing open a door
Breaching a wall
5. Battlefield obstacles and their effects on terrain:
Minefields (anti-personnel and anti-tank)
Wire obstacles
Tank ditches
Abatis
Road crater
Bodies of water (rivers, streams, lakes, etc.)
6. Terrain resolution. What are the effects of higher terrain resolution on model operations?
7. Subterranean avenues of movement.
8. Shielding (cover and concealment) and its effects on acquisition and PK.
Modeling the Mission Environment
1. Other combat systems:
Fixed wing aircraft and munitions (Air Force/Navy/USMC attack and bomber aircraft)
Rotary wing aircraft
Tracked vehicles
Wheeled vehicles
Watercraft
Robots
2. Other weapon systems and their effects on the individual soldier:
Direct fire missile systems
Indirect fire systems
Laser weapons
Nuclear weapons
Biological weapons
Fixed wing delivered munitions (both conventional and precision)
Anti-personnel mines
Anti-tank mines
Thermobaric
3. Non-combatants and the enemies' tactical use of them.
4. Potential combatants.
5. Multiple sides.
6. Cultural environment and civilian affairs (CA) operations.
7. Accommodating and non-accommodating behavior of non-combatants.
8. The contemporary operating environment.
9. Joint sensors.
10. Non-standard threat weapons (booby traps, car bombs).
11. Enemy Prisoners of War.
12. Psychological Operations (PSYOPS - friendly and enemy).
Appendix F: Value Scales
Constructed scale: 0-10
Global weight: 0.15
Accurate modeling of the relevant combined arms and joint assets, to the degree required
for STMS comparative analysis (e.g., autonomous calls for fire, communications
interfaces with NGF, CAS, UAV)

Scale levels range from No CA/Joint Assets (0) through Few, Some, Many, and Most to All CA/Joint Assets (10).

Figure 29. CA/Joint modeling capability value scale.

Figure 30. CA/Joint modeling capability value curve.
Constructed scale: 0-10
Global weight: 0.245
Accurate modeling of STMS functions, including actions and decisions, required for
STMS comparative analysis

Scale levels range from No Functions (0) through Few, Some, Many, and Most to All Functions (10).

Figure 31. STMS modeling capability value scale.

Figure 32. STMS modeling capability value curve.
Constructed scale: 0-10
Global weight: 0.035
Measure of the amount of STMS equipment that can be changed by the user for
comparative analyses

Scale levels range from No STMS Equipment (0) through Little, Some, Many, and Most to All STMS Equipment (10).

Figure 33. Modifiable equipment types value scale.

Figure 34. Modifiable equipment types value curve.
Constructed scale: 0-10
Global weight: 0.035
Measure of the amount of control the user has over conditions to the degree required for
comparative analysis, specifically dynamic weather and terrain conditions

Scale levels range from No Control (0) through Little, Some, Moderate, and Much Control to Complete Control (10).

Figure 35. User control over conditions value scale.

Figure 36. User control over conditions value curve.
Constructed scale: 0-3
Global weight: 0.035
Measure of the user's ability to modify the tactics, techniques, and procedures of the
simulation entities

Scale levels: 0 = No Modification Capability; 1 = Modification Through Scripting Only; 2 = Modification Through Code Adjustments; 3 = Direct User Modification (i.e., via decision tables, behavior composition GUI).

Figure 37. TTP modifiability value scale.

Figure 38. TTP modifiability value curve.
Constructed scale: 0-10
Global weight: 0.15
Measure of the amount of control ("seat at the table") PEO Soldier will have for the given
alternative with regards to system development and future modifications

Scale levels range from No Control (0) through Little, Some, High, and Significant Control to Complete Control (10).

Figure 39. PEO Soldier control value scale.

Figure 40. PEO Soldier control value curve.
Constructed scale: 0-5
Global weight: 0.15
A measure of the risk that the simulation development will not be completed (e.g., due to
budget cuts, project failure, policy and regulation changes)

Scale levels range from No Risk (0) through Very Low, Low, Medium, and High Risk to Very High Risk (5).

Figure 41. Fielding risk value scale.

Figure 42. Fielding risk value curve.
Natural scale, measured in months
Global weight: 0.20
Defined as the number of months from May 2004 until a capability usable for AoA
purposes (to the degree that it will be developed) is available for use by PEO Soldier or
its designated representative

Figure 43. Time until available value scale.

Value decreases as the number of months increases; an annotation on the curve marks the point halfway through the LW AoA.

Figure 44. Time until available value curve.
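Each value curve in this appendix maps a scale level to a normalized value; such curves are typically evaluated by piecewise-linear interpolation between the assessed points. A sketch with hypothetical breakpoints (the report's actual curves appear only in the figures):

```python
def interpolate_value(level: float, breakpoints: list[tuple[float, float]]) -> float:
    """Piecewise-linear value function; `breakpoints` is a list of
    (scale level, value) pairs, clamped outside the assessed range."""
    pts = sorted(breakpoints)
    if level <= pts[0][0]:
        return pts[0][1]
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if level <= x1:
            return y0 + (y1 - y0) * (level - x0) / (x1 - x0)
    return pts[-1][1]

# Hypothetical breakpoints for a 0-10 constructed scale:
curve = [(0.0, 0.0), (5.0, 0.6), (10.0, 1.0)]
print(round(interpolate_value(7.5, curve), 3))  # -> 0.8
```

Each alternative's total score is then the weighted sum of these per-measure values using the global weights listed above.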
Appendix G: Sensitivity Graphs
Each sensitivity graph plots the total value of each alternative (Combination, Modified IWARS, Modified OOS, New Simulation, Modified Cbt XXI, IWARS, OOS, Cbt XXI, OTB, JCATS, and Janus) as the global weight of one evaluation measure varies from 0 to 100 percent.

Figure 45. Graph of the sensitivity of the global weight of Joint/CA modeling capability.

Figure 46. Graph of the sensitivity of the global weight of STMS modeling capability.

Figure 47. Graph of the sensitivity of the global weight of modifiable equipment types.

Figure 48. Graph of the sensitivity of the global weight of user control over conditions.

Figure 49. Graph of the sensitivity of the global weight of TTP modifiability.

Figure 50. Graph of the sensitivity of the global weight of PEO Soldier control.

Figure 51. Graph of the sensitivity of the global weight of fielding risk.

Figure 52. Graph of the sensitivity of the global weight of time until available.
Appendix H: Cost Estimation
The costs listed in the following table are very rough estimates based upon limited information,
as several of the models are currently under development. Therefore, these numbers were used
only for comparative purposes, not for planning. For most alternatives, we believe that our
estimates are on the high side. Our reasoning is described below the table.
Costs are in millions of dollars; the total assumes a 10-year horizon.

Col  Simulation    Modification  Full Development  Annual Maintenance  Total (10 yr)  Total Value Score
1    Janus         $0            $0                $0                  $0             48.2
2    JCATS         $0            $0                $0                  $0             54.0
3    OTB           $0            $0                $0                  $0             55.1
4    Cbt XXI       $0            $0                $0                  $0             55.1
5    IWARS         $0            $0                $0                  $0             62.6
6    OOS           $0            $0                $0                  $0             62.5
7    Mod Cbt XXI   $2            $0                $0.5                $7             71.9
8    Mod IWARS     $3.5          $0                $0.33               $6.8           77.1
9    Mod OOS       $3.5          $0                $0.5                $8.5           76.3
10   Combination   $4.5          $0                $1                  $14.5          84.9
11   New Sim       $0            $15               $1                  $25            75.3

Table 6. Cost estimation for cost-benefit analysis.
Costs were broken down into three components: 1) the cost required to modify the simulation,
2) the cost to develop a simulation from scratch, and 3) the annual maintenance cost. The total
cost was calculated by adding the modification and development costs to 10 times the annual
maintenance cost (based on a ten-year planning horizon).
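The total-cost rule just described (modification plus development plus ten years of annual maintenance) can be written directly; the spot-check values below come from Table 6, in $ millions:

```python
def total_cost(modification: float, development: float,
               annual_maintenance: float, horizon_years: int = 10) -> float:
    """Total cost over the planning horizon: one-time modification and
    development costs plus `horizon_years` of annual maintenance."""
    return modification + development + horizon_years * annual_maintenance

# Spot-checks against Table 6 (Columns 7, 8, and 11; $ millions):
print(round(total_cost(2.0, 0.0, 0.5), 1))    # Mod Cbt XXI -> 7.0
print(round(total_cost(3.5, 0.0, 0.33), 1))   # Mod IWARS   -> 6.8
print(round(total_cost(0.0, 15.0, 1.0), 1))   # New Sim     -> 25.0
```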
For Existing Alternatives (Columns 1-6): The assumption is that PEO Soldier would not have
to provide funding. This may not necessarily be the case if the proponent charges for use
(usually to off-set annual maintenance fees); however, these amounts were negligible compared
to the other alternatives and would be difficult to estimate because the fees would be based upon
usage.
For Modified Combat XXI (Column 7): The proponent estimated a yearly labor cost of
approximately $1 million. Based upon development cost information for other models, we
estimated another $1 million per year for other development costs, for a total of $2 million per
year. The model is currently scheduled to be available in a year; however, we estimate that it
would take at least another year to add in PEO Soldier requirements, for a total of 2 years. Thus,
the total expenditure both years is $4 million, half of which would be paid by PEO Soldier. The
proponent estimates that annual maintenance costs are $1 million per year, half of which
($500,000) might be paid annually by PEO Soldier.
For Modified IWARS (Column 8): Based upon the IWARS STO request, the total
expenditures through FY09 total $10.7 million. We assumed that PEO Soldier would assume 1/3
of the cost (with Natick and AMSAA splitting the remaining 2/3), thus totaling about $3.5
million. The proponent estimates that annual maintenance costs are $1 million per year, 1/3 of
which ($333,000) might be paid annually by PEO Soldier.
For Modified OOS (Column 9): The current amount expended under the OneSAF contract is
much higher than the other simulations considered, partly because the requirements for that
simulation are much broader. Therefore, we used the same costs as that of the next highest
software (IWARS) to estimate the costs to PEO Soldier for funding modifications to a small
subset of OOS requirements. We used the same annual maintenance costs estimated by the other
proponents ($1 million per year), with PEO Soldier responsible for half ($500,000).
For the Combination alternative (Column 10): We summed the amount to modify all three
simulations (for a total of $9 million) and halved that number based upon the assumption
that, by capitalizing on the synergy of the three simulations, PEO Soldier would only have to
fund half for each simulation that it would have had to fund if it depended upon only one
simulation to meet all of its requirements. We assumed that the maintenance fees of the three
simulations would total $1 million for PEO Soldier based upon similar reasoning to the above.
For the New Simulation Alternative (Column 11): We collected available information for the
development costs of CASTFOREM, Combat XXI, IWARS, and OOS. Specifically, the total
costs estimated by the Combat XXI/CASTFOREM developers were $6.5 million (labor only); the
costs estimated by IWARS/IUSS developers totaled $13.5 million; and OOS costs so greatly
exceeded those of the other two that we did not directly factor them into our estimated
development costs. We chose $15 million as a rough estimate for three reasons: 1)
Combat XXI/CASTFOREM included only labor costs, 2) OOS costs reflect a much larger set of
modeling requirements than that needed by PEO Soldier, and 3) the new simulation should cost
slightly more than IUSS/IWARS, since those models align more closely with PEO Soldier's
needs, but require some modification.
Distribution List

NAME/AGENCY                          ADDRESS                                      COPIES

Authors                              Department of Systems Engineering
  CPT Eric S. Tollefson              Mahan Hall
  CPT Gregory L. Boylan              West Point, NY 10996
  Dr. Bobbie L. Foote
  Dr. Paul D. West

Client                               PEO Soldier
  BG James Moran                     5901 Putnam Road, Bldg 328
  Mr. Charles Rash                   Fort Belvoir, VA 22060-5422
  Mr. Ross Guckert
  Mr. Charlie Tamez

Dean, USMA                           Office of the Dean
                                     Building 600
                                     West Point, NY 10996

Defense Technical Information        ATTN: DTIC-O
Center (DTIC)                        Defense Technical Information Center
                                     8725 John J. Kingman Rd, Suite 0944
                                     Fort Belvoir, VA 22060-6218

Department Head-DSE                  Department of Systems Engineering
  COL Michael McGinnis               Mahan Hall
                                     West Point, NY 10996

ORCEN                                Department of Systems Engineering
                                     Mahan Hall
                                     West Point, NY 10996

ORCEN Director                       Department of Systems Engineering
  LTC Michael Kwinn, Jr.             Mahan Hall
                                     West Point, NY 10996

USMA Library                         USMA Library                                 1
                                     Bldg 757
                                     West Point, NY 10996