

Interservice/Industry Training, Simulation, and Education Conference (I/ITSEC) 2015

2015 Paper No. 15114 Page 1 of 12

Measuring Virtual Simulation's Value in Training Exercises - USMC Use Case

Nathan Jones
MCSC PM TRASYS
Orlando, FL
[email protected]

Greg Seavers
MCSC PM TRASYS
Orlando, FL
[email protected]

Christin Capriglione
NAWCTSD
Orlando, FL
[email protected]

ABSTRACT

In 2013, Lieutenant General (LtGen.) John A. Toolan, former Commanding General (CG) of First Marine Expeditionary Force (I MEF), requested incorporating previously non-interoperable and 'stove-piped' virtual and constructive Training Aids, Devices, Simulators and Simulations (TADSS) at I MEF's First Marine Expeditionary Brigade's (1st MEB's) Large Scale Exercise 2014 (LSE-14) to demonstrate that Live, Virtual, Constructive (LVC) TADSS could collectively stimulate a Marine Air Ground Task Force (MAGTF) Commander's Common Operational Picture (COP). The expected outcome would be an operationally effective MEF with capabilities to conduct full-spectrum military operations with a COP stimulated by data feeds from LVC entities, while providing training to both the primary (battlestaff) and secondary (supporting unit) training audiences. The objective of this assessment was to measure the training value gained. Measurable training value from utilizing virtual TADSS in a live exercise has the potential to change the historical, traditionally biased paradigm within the Marine Corps of training everything live whenever possible. This paper presents the results of the training value assessment of augmenting the live training event with virtual TADSS. It describes the impacts of virtual integration on training efficacy achieved for primary and secondary training audiences. Included are the training value construct, the defined assessment approach, limitations, results (both immediate and post-event impacts), and efficiencies in terms of cost plus cost avoidance.
Recommendations and discussions focus on: (1) identifying needed improvements in exercise planning and tools to facilitate more efficient satisfaction of training objectives for primary and secondary training audiences, (2) developing training-related human performance measures in TADSS to measure performance against training objectives, and (3) defining an encompassing methodology for assessing the training value of training solutions to inform requirements and acquisition decision makers.

ABOUT THE AUTHORS

Nathan Jones is the Functional Lead for Instructional Systems at MCSC PM TRASYS. Mr. Jones has 15 years of experience in human performance/human-systems assessments and acquisition program support. He is responsible for overseeing PM TRASYS' training front end analyses, Verification, Validation & Accreditations (VV&As), training effectiveness evaluations, and training domain expertise for PM TRASYS. Mr. Jones was a featured panelist at the Current Trends in ISD and IPA within LVC Events special events at 2014 I/ITSEC.

Greg Seavers is Senior Operations Research Analyst at MCSC PM TRASYS. Mr. Seavers has been a DoD cost analyst for twenty-one years with industry and five years in Federal Service with both the U.S. Army and U.S. Marine Corps. Mr. Seavers' cost experience with the Army and Marine Corps includes Program Life Cycle Cost Estimates, Independent Government Cost Estimates, Proposal Evaluations, Analysis of Alternatives, Earned Value Management, and Business Case Analyses for multiple ACAT I through IV Programs.

Christin Capriglione is a lead Instructional Systems Specialist at Naval Air Warfare Center Training Systems Division. Ms. Capriglione has nearly 10 years of experience in human performance analysis, training development, and training device assessment. She is responsible for providing oversight of instructional analysis, design, and evaluation efforts related to the acquisition and assessment of various Navy and Marine Corps aviation and ground training systems.



DEMAND SIGNAL

In 2014, Program Manager Training Systems (PM TRASYS) was directed by Training and Education Command (TECOM) to conduct a training value assessment of virtual integration in the First Marine Expeditionary Force (I MEF) Large Scale Exercise (LSE-14). LtGen. John A. Toolan, former Commanding General (CG) of I MEF, tasked I MEF G7 with federating previously non-interoperable and 'stove-piped' virtual and constructive Training Aids, Devices, Simulators and Simulations (TADSS) in support of the I MEF/First Marine Expeditionary Brigade (1st MEB) Large Scale Exercise 2014 (LSE-14). LtGen. Toolan wanted Live, Virtual, Constructive (LVC) TADSS to collectively stimulate a Marine Air-Ground Task Force (MAGTF) Commander's Common Operational Picture (COP) in a notional Command Post Exercise (CPX). The expected outcome would be an operationally effective MEF with capabilities to conduct a full-spectrum military operation. The purpose of integrating a virtual simulation proof of concept into this training exercise was to demonstrate the value of virtual devices for providing training to both the primary (battlestaff) and secondary (supporting units) training audiences, in addition to the live-augmented-with-constructive training model (primarily the MAGTF Tactical Warfare Simulation (MTWS)) traditionally used in this type of exercise. To determine the impact of virtual integration on training efficacy (ability to complete tasks), PM TRASYS was also tasked to conduct an assessment of the training value during the LSE-14 Warm Start and Final Exercise (FINEX). This paper describes the outcome of the data collection and analysis and provides recommendations for the virtual integration path forward from a training design perspective. The discussion presents doctrinal updates, measures, and standardization needs for Marine Corps (USMC) LVC training.
MEASURING TRAINING VALUE

The goal of the assessment was to provide I MEF, TECOM, and Headquarters Marine Corps (HQMC) with an evaluation of the training value achieved through the addition of the virtual component of the LVC simulation-based training construct to LSE-14. USMC large-scale exercise execution has historically been comprised of live training components enhanced with constructive components, primarily via MTWS. The primary objective of this effort was to compare and contrast the relative training merits of integrated virtual simulation in LSE training versus a non-integrated (not connected with the larger LSE event) training environment for primary and secondary training audiences. The secondary objective was to assess the training value added for trainees operating each of the respective virtual devices (e.g., the Supporting Arms Virtual Trainer (SAVT), Virtual Battlespace 2 (VBS2), AH-1W Aircraft Procedures Trainer (APT), and Combat Convoy Simulator (CCS)). As Cermak & McGurk (2010) state, organizations typically measure training's value by surveying trainees or counting how many complete training rather than by assessing whether those trainees learned anything (learning outcomes) or whether the training improved the organization. Training value is italicized in this paper to emphasize the distinction between training value and training effectiveness evaluations (TEEs) or common return-on-investment (ROI) evaluations. While TEEs, as defined by the Kirkpatrick model, measure learning in a scientific way (Kirkpatrick, 1994), TEEs do not weigh learning measurements against other organizational variables such as costs.



The authors espouse that training value is a tradeoff or balance of elements of training effectiveness measurements against other factors; the cumulative effect is greater than the sum of its components. The USMC does not have a defined or standardized construct for defining and measuring training value. Therefore, for the purpose of this assessment, a training value construct (Figure 1) was defined according to four distinct value elements (described below), each possessing categorically different qualities and properties. As such, the assessment required the application of unique sets of metrics (quantitative and qualitative), data gathering techniques (method), and analytical procedures to assess each element's respective character (data analysis). The four training value elements defined for the purpose of this assessment were: (1) training task and performance capability, (2) training realism capability, (3) affective reaction level, and (4) training efficiencies. The first three elements combined create a TEE measure and are weighed against the fourth element, training efficiencies, as a tradeoff (Figure 1).

Approach

The team collected training value element data via worksheets. The assessment scope included observations and data collection across 29 virtual training events: 18 unclassified events and 11 classified events with air combat element (ACE), ground combat element (GCE), and logistics combat element (LCE) participants in nine training devices at four locations. The assessment team collected 270 task performance data sheets (measuring the task and performance capability element), 264 training environment attribute data sheets (measuring the training realism capability element), and 252 affective reaction data sheets (measuring the affective reaction level element). Discrepancies in data sheet collection totals resulted from fluid field assessment conditions and are not described in detail in this paper.
The team also conducted interviews with the Officers-in-Charge (OICs) and Staff Noncommissioned Officers (SNCOs) of the participating units and Combat Operations Centers (COCs) for which structured data collection was constrained, specifically the Direct Air Support Center (DASC), Combat Logistics Regiment-1 (CLR-1), and Marine Corps Logistics Operations Group (MCLOG). To ascertain post-event organizational impacts, interviews were conducted nine months after the training event with representatives from MCLOG and I MEF G7. The analysis compared integrated versus non-integrated training events for each value element. An integrated training event was defined as one that was part of the larger exercise with communications with higher COCs; a non-integrated training event was defined as simulation-specific (i.e., not networked with any other training situation).

Constraints & Limitations

The virtual integration into LSE-14 and the corresponding training value assessment were conducted as an adjunct proof-of-concept activity in which the instructors' first priority was training and the trainees' priority was learning. As such, data collection plans and procedures remained somewhat fluid to accommodate situational training priorities as they occurred and to reduce demands imposed on trainees. Analysts were restricted in their ability to exercise the various controls that normally accompany traditional study designs (i.e., experimental control: definitive structuring and isolation of independent and dependent variables, engagement of meaningful sample sizes, faithfulness to pre-established task scenarios, etc.). Despite these circumstances, the assessment focused on gathering data directly related to the assessment goals. Additional limitations included:



• Data collection was limited to non-interfering observation, trainee surveys/questionnaires, and limited unit leader, instructor, and simulator operator after-action interviews.

• A specific task list was not developed due to the "free play" nature of the training exercise; this was mitigated to a limited degree by conducting interviews with OICs and SNCOs within the COCs to identify task performance limitations with regard to the virtual integration.

• Some data collection depended on third party participation and coordination that suffered from competing priorities and responsibilities. Semi-structured interviews were used to capture the missing data points.

• Marine instructors did not collect performance evaluation data during scenario execution against Training and Readiness (T&R) training objectives, and analysts were not able to compare the tasks performed directly to a T&R measure of performance because the training exercise was not designed to fulfill specific T&R events for secondary audiences. This limited the ability to draw conclusions about effectiveness in achieving specific T&R objectives.

• Virtual simulation systems did not have the capability to collect performance measures that reflect task and/or T&R performance.

Training Value Elements

Training Task and Performance Capability

This element is a measure of the range of tasks (number of tasks) and skill execution that could be authentically performed with and without virtual simulation integration for both primary (battlestaff) and secondary (individuals in training devices) target groups. Training task and performance capability could be considered two separate elements; however, within the scope of this assessment they were considered together because the data collected were closely related. Data analysis for this element first examined whether virtual simulation integration increased the number and type of training tasks that were accomplished, and then examined the impact of virtual simulation integration on task performance levels. Overall, the results were positive with respect to both the number of tasks that trainees were able to perform and the trainees' assessment of how well the devices supported their performance. Figure 2 provides an overarching depiction of task performance across all virtual devices and scenarios. Trainees indicated whether or not they were able to perform identified tasks. They responded affirmatively for 70% of identified tasks during integrated training events versus only 60% in non-integrated training events: 1,087 of 1,547 identified integrated tasks and 871 of 1,440 identified non-integrated tasks were successfully performed. Based on the task completion results in Figure 2, there appears to be a higher capability to complete tasks in the integrated configuration than in the non-integrated configuration.

Figure 2: Task Completion Capability Integrated vs Non-Integrated

The results for individual virtual simulators were similar, lending further support to the overall indication that virtual integration allowed the trainees to successfully perform tasks required to fulfill their role in the mission scenario and achieve mission objectives. Figure 3 shows the difference for the convoy scenario operated in VBS2: the left chart shows 68% task completion while integrated, whereas the right chart shows only 36% task completion in the non-integrated/stand-alone configuration.
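The completion rates reported above follow directly from the counts of tasks performed versus tasks identified. A minimal sketch of that arithmetic, using the paper's counts (function and variable names are illustrative, not part of the study's tooling):

```python
# Quick check of the reported task-completion rates.
# Counts come from the assessment; names are illustrative only.

def completion_rate(performed: int, identified: int) -> float:
    """Percent of identified tasks that trainees reported performing."""
    return 100.0 * performed / identified

integrated = completion_rate(1087, 1547)     # integrated training events
non_integrated = completion_rate(871, 1440)  # non-integrated training events

print(f"Integrated:     {integrated:.0f}%")     # ~70%
print(f"Non-integrated: {non_integrated:.0f}%") # ~60%
```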



Figure 3: Task Completion VBS2 Convoy Scenario - Integrated vs Non-Integrated

With respect to ratings of the level at which trainees were able to perform the tasks in Figure 3, the integrated and non-integrated environments appear to provide similar task performance capability (Figure 4). Therefore, the integrated training environment offered the ability to perform more tasks at a level of performance equal to that of the tasks performed in the non-integrated environment.

Figure 4: Task Performance VBS2 Convoy Scenario - Integrated vs Non-Integrated

An exception to the positive trend in task performance support capability occurred within some of the aviation devices. Due to technical limitations that prevented full-fidelity simulators from being integrated, the part-task procedural trainers could not clearly demonstrate adequate task training value.

Training Realism Capability

This element is a measure of the range of conditions and levels of combat realism effectively accommodated within the context of each training environment (integrated vs. non-integrated) and for each training venue (COC and LVC devices). Data analysis for this element examined whether training realism would increase through the introduction of virtual simulation. Ratings of the importance of realism and of the realism achieved were synthesized to produce a Level 3, Level 2, or Level 1 training support level, with Level 3 being the highest level of training support. Definitions of the three training support levels are provided in Table 1 with the percentage of trainee responses supporting each level of realism.



Table 1: Training Realism Levels – Virtual Integrated Training Events

Level 3: System is capable of supporting complete operator performance to the expected training standard. (39% of responses)

Level 2: System provides attributes at a fidelity sufficient for beneficial training but not for full performance. (42% of responses)

Level 1: System is incapable of supporting task performance sufficiently for training. (16% of responses)
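A sketch of how such a distribution can be tallied from per-response support levels. The sample below is synthetic, constructed only to reproduce the reported shares, and the helper function is hypothetical, not part of the study's instrumentation:

```python
# Tally per-response training support levels into percentage shares.
# Synthetic sample; only the 39%/42%/16% shares come from the paper.
from collections import Counter

def level_distribution(responses):
    """Percent share of each training support level (3 = highest)."""
    counts = Counter(responses)
    total = len(responses)
    return {lvl: round(100 * counts[lvl] / total) for lvl in (3, 2, 1)}

# 100 synthetic responses; 0 marks a response with no usable rating
sample = [3] * 39 + [2] * 42 + [1] * 16 + [0] * 3
dist = level_distribution(sample)

print(dist)                    # {3: 39, 2: 42, 1: 16}
print(dist[3] + dist[2], "%")  # share rated sufficient for training value
```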

Level 2 ratings indicate that some device and/or integration network improvements are likely required to enable trainees to meet the full performance requirements. Level 1 ratings may require additional analysis to determine the root causes of the issues for a particular device or scenario in the integrated environment; such cases will require significant improvements to enable effective training performance. In sum, 81 percent of participants' ratings indicated a level of fidelity sufficient to provide training value. Overall, the integrated training environment successfully provided the training realism, in terms of the sensory cues and environmental attributes trainees required, to perform the mission tasks successfully and achieve some training benefit. Improving training realism by addressing attribute-specific capability deficiencies will further improve trainee performance across multiple devices and increase the training value obtained from virtual integration.

Affective Reaction Level

This element is a measure of the types and levels of trainees' emotive, intuitive, and/or visceral responses within each training environment (integrated vs. non-integrated) and each training venue (COC and each LVC device) as they may serve to enable or inhibit authentic task execution. Data analysis of this element examined whether inclusion of virtual simulation affects performance anxiety, confidence, attention, and workload. For each event, trainees were asked to rate two sets of statements (trainee reaction statements and task load index statements) on a 5-point Likert scale, with 5 being highest. Ratings for the affective reaction levels varied among raters, scenarios, and devices, resulting in average ratings that were similar for each indicator and hovered around the middle of each rating scale. Figure 5 and Figure 6 display the average ratings across all scenarios as well as for each individual scenario for which this data was collected.



Figure 5: Affective Reaction Levels by Scenario - Trainee Reaction Statements

Figure 6: Affective Reaction Levels by Scenario - Task Load Index
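The per-scenario averaging used for this element can be sketched as follows. All ratings below are synthetic and the scenario labels are illustrative; the study's actual averages appear in Figures 5 and 6:

```python
# Illustrative aggregation of 5-point Likert responses per scenario.
# Synthetic data only; not the study's collected ratings.

# scenario -> list of 1-5 ratings for a single indicator (e.g., workload)
ratings = {
    "VBS2 convoy": [3, 4, 3, 2, 3],
    "SAVT call-for-fire": [4, 3, 3, 3, 4],
}

averages = {scn: round(sum(vals) / len(vals), 2)
            for scn, vals in ratings.items()}

# Averages near the scale midpoint (3) would match the reported pattern of
# ratings hovering around the middle of each scale.
print(averages)
```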

The primary limitation on the interpretation of this data is the lack of a standalone trainer/training baseline specifying the level of affective characteristics required to achieve effective training. Coupled with observations and data from the other training value elements, the assessment team concluded that the affective reaction levels across the events were not dramatic enough to negatively impact training or the units' ability to perform their mission requirements. The affective reaction data was collected so that it may serve as a comparative baseline for future analyses.

Training Efficiencies

This element is a measurement of the training efficiencies that may be realized by augmenting training with virtual simulators. Data analysis of this element investigated whether inclusion of virtual simulation increases training efficiencies by quantifying the comprehensive costs associated with the virtual systems integration (including costs incurred before and after the final exercise training event) and quantifying the costs avoided through the use of virtual simulation. Cost avoidance is expressed in terms of utilizing the virtual simulators vice live range operations (for example, blue-tip rounds and platform hourly operating costs) to provide kinetic stimulation for the primary and secondary training audiences. The cost analysis captured the total cost of performing the virtual integration via a Cost Element Structure (CES) that summarized the costs (inclusive of the labor, materials, and Other Direct Costs (ODCs) related to the planning, engineering and development, and execution) against the cost avoidance of the virtual events. The cost of this one-time integration effort was calculated to be $2.4 million. Cost avoidance was calculated using a quantitative cost avoidance methodology for simulation, which quantifies what the simulated exercises would have cost had they been performed live (Gordon & Cooley, 2013). The cost avoidance of the simulations was calculated to be:

Category       Costs Avoided
Vehicle        $8,882.32
Aircraft       $283,717.50
Ammunition     $256,698.00
Munitions      $8,555,360.31
Total          $9,104,658.13

Based on the cost required to perform the training events with Virtual simulation and the cost avoidance, the net cost avoidance was: Virtual Sim Cost Avoided ($9.1M) – Virtual Integration Effort Cost ($2.4M) = $6.7M Net Cost Avoidance
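The arithmetic above can be restated as a simple check. Dollar figures are the paper's; the code itself is only a sketch:

```python
# Net cost avoidance check: costs avoided minus the integration effort.
# Figures come from the assessment's cost analysis.

costs_avoided = {
    "Vehicle":    8_882.32,
    "Aircraft":   283_717.50,
    "Ammunition": 256_698.00,
    "Munitions":  8_555_360.31,
}
integration_cost = 2_400_000.00              # one-time virtual integration effort

total_avoided = sum(costs_avoided.values())  # $9,104,658.13
net_avoidance = total_avoided - integration_cost

print(f"Total avoided: ${total_avoided:,.2f}")
print(f"Net avoidance: ${net_avoidance:,.2f}")  # ~$6.7M
```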

DISCUSSION

Training Value Summary

Overall, LSE-14 demonstrated that a Commander's battlestaff (primary audience) could be collectively stimulated by federating virtual TADSS into an existing live-constructive training exercise with data feeds from live, virtual, and constructive training participants. Additionally, LSE-14 stakeholders demonstrated that training audiences beyond the battlestaff (secondary audiences) could receive impactful training that improves their mission readiness and warfighting capabilities. The assessment results support the thesis that both the primary and secondary training audiences are able to realize training value through the integration of virtual devices. The greatest training value achieved included:

• There was great training value in Marines talking to Marines (vice instructor or simulator operator role players, often referred to as the white cell) as they performed the collective tasks they are trained to do (e.g., DASC to pilots, Joint Tactical Air Control (JTAC) to pilots, pilots to DASC, convoy Marines to CLR-1 COC, convoy to JTAC).

• The opportunity to train with other communities is not readily available in normal training evolutions. Virtual integration events allowed units to refine and practice their SOPs with Marines from other communities, an opportunity they would not have with traditional white cell role playing.

• Virtual training events made up 20% of the CLR-1 training events conducted during the LSE, an increase in training opportunities that would not have been possible during LSE-14 without virtual TADSS.

• Virtual integration event participation increased training audiences' awareness of simulator capabilities available at home stations to provide standalone training opportunities and further potential for distributed and integrated training if integrated infrastructure can be established.

• Assessment discussions with participants from multiple units helped identify exercise coordination and information flow considerations that will aid the refinement and stability of virtual integration and further increase the training value of exercises with virtual integration.



• While LSE-14 was a MEF-level exercise, it was evident that virtual integration may be best leveraged for Battalion/Company and below training, given the frequency with which smaller exercises are conducted.

Training Design Recommendations

With respect to efforts to integrate virtual devices into training exercises, the following training design recommendations will increase training value and assessment success while aiding resolution of technical issues:

• Identify required training tasks and objectives for both the primary and secondary training audiences up front to ensure that the scenarios and virtual training systems can be developed to meet training requirements effectively.

• Conduct a task development workshop with the scenario development team to ensure that a training assessment team has a complete, accurate task list to develop the data collection tools prior to the training event commencement.

• Establish a stable set of scripted scenarios in advance that can be used for the virtual integration in LVC events. The assessment team would benefit from having a narrative description of the scenario events to enable them to effectively determine what units to assess during LVC events.

• Identify SME observers for each organization in advance of the exercise to allow the assessment team opportunity to explain the data collection effort, train them on data collection strategies as necessary, and develop a plan to coordinate the data collection during the exercise.

• Identify the primary Training and Readiness (T&R) events for each unit participating in the event so the assessment team can accurately assess the T&R events completed. Supplement assessment teams with SMEs from the respective communities capable of assessing T&R performance.

• Provide supporting materials, such as maps and other artifacts needed for training, to secondary training audiences and supporting units well in advance of each training event to allow them time for review, planning, and briefing in accordance with their normal execution processes and procedures.

• Ensure simulator operators understand the training scenario and sequence of events and are well trained to facilitate the event toward the training performance objectives. Contracts for contractor simulator operators will need to incorporate these exercise execution requirements.

• Develop and implement human performance measurement and data capture mechanisms, hosted in the simulation devices and/or instructor operator stations (IOSs), to provide insight into performance proficiency and/or effectiveness.
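As a purely illustrative sketch of the kind of capture mechanism the last recommendation envisions, the record below shows one way an IOS-side tool might log a single performance observation. All field names, the unit designation, the T&R event code, and the four-point rating scale are hypothetical assumptions for illustration, not drawn from LSE-14 or any fielded USMC system.

```python
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class PerformanceRecord:
    """One observed performance measure captured at an IOS (illustrative only)."""
    unit: str        # unit being assessed (hypothetical designation below)
    tr_event: str    # Training and Readiness event code (placeholder value)
    cue: str         # information cue that triggered the observation
    rating: int      # assumed 1-4 proficiency scale, 4 = fully proficient
    timestamp: str   # UTC time of capture, ISO 8601

    def to_json(self) -> str:
        # Serialize for hand-off to an assessment team's collection tool.
        return json.dumps(asdict(self))

# Example: an operator station logs one observation during a scenario event.
rec = PerformanceRecord(
    unit="MWSS-X",
    tr_event="TR-PLACEHOLDER-01",
    cue="Virtual UAS feed displayed on COP",
    rating=3,
    timestamp=datetime.now(timezone.utc).isoformat(),
)
print(rec.to_json())
```

A standardized record of this shape, emitted by every device and IOS in the federation, is what would let assessment teams aggregate performance data across the primary and secondary training audiences.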

For more persistent and sustained LVC integration, three considerations will be vital to ensuring effective training: technical simulation interoperability requirements, human interaction fidelity requirements, and human performance measurement capabilities that support task performance assessments against T&R standards.

Post Event Training Impacts

In addition to the data collected during LSE-14, the realized training value of virtual integration has continued to grow and have a lasting impact. The following impacts were identified in post-event follow-up interviews conducted nine months after the exercise. These data are not reflected in the event evaluation above, nor included in the training efficiencies calculations, but they strengthen the case for the training value of virtual integration. They also show that a standardized method for assessing training value should incorporate post-event follow-ups.

LSE-14, and specifically its virtual integration, was the catalyst that allowed the Marine Corps Logistics Operations Group (MCLOG) to see LVC capabilities, take what was learned, and quickly apply it in a scaled-down solution. MCLOG has capitalized on the training capabilities realized during LSE-14 by utilizing the same connected virtual simulators and training scenarios in three additional exercises, improving training effectiveness for 450 Marines as of June 2015. The realized value of the LVC integrated training capability has resulted in its becoming a required training event prior to the Integrated Training Exercise (ITX) for the Marine Wing Support Squadron (MWSS). Continued limitations include setup time, the need for more correlated maps, and the need for a robust set of models shared between systems. (Jones, N. 2015, June 3. [Interview with Aaron Villaverde].)


At I MEF, staff and Marine training officers have become more aware of the potential LVC training offers their units, resulting in increased communication and coordination between the ground, air, and Command and Control (C2) communities. Marines are asking more detailed questions about simulation systems' capabilities and exploring ways to integrate systems to achieve more effective training. Some training capabilities realized during LSE-14 have since been utilized, such as the use of a virtual unmanned aircraft system (UAS) in live training exercises in 2015. Representatives from the I MEF G7 have voiced the need for greater integration of virtual UAS, integration of live training systems (e.g., Instrumented Tactical Engagement Simulation (ITESS) II), connecting live forces with virtual forces (e.g., VBS3 and ITESS II), and the ability to utilize these capabilities in blended classified and unclassified environments. (Jones, N. 2015, June 4. [Interview with Eugene Apicella].)

LVC Acquisition Impacts

These requirements need to be addressed in the near term to enable effective virtual integration and achieve the greatest training value: desired increases in training effectiveness while minimizing cost and creating new cost efficiencies for training exercises. The demand signal is not far in the future; the upcoming LSE-16 is anticipated to have a larger virtual TADSS footprint based on the training value demonstrated in LSE-14.

Future Vision

The Commandant of the Marine Corps stated in the 2015 36th Commandant's Planning Guidance (CPG), "Our investment in training systems will reflect the priority we place on preparing for combat and be fully integrated with training and readiness standards. I expect all elements of the MAGTF to make extensive use of simulators where appropriate" (Dunford 2015).
This vision will entail an enterprise-level effort to improve operational readiness by interoperating and federating the Live, Virtual, and Constructive (LVC) domains to enhance individual, collective, and battlestaff training. The challenge ahead lies in Officer-in-Charge (OIC) and Staff Noncommissioned Officer (SNCO) acceptance and perception of the training value of virtual simulation in live training exercises. Doctrine, education, and quantitative evidence are key to addressing this challenge.

Processes and Tools

As the integration of virtual devices with live training exercises becomes a more regular operation, establishing standardized processes and tools for communications will allow future integrations to proceed more smoothly from a procedural perspective, not just a technical one. Exercise design processes need to accommodate training objectives for multiple levels of audiences. Historically, training objectives have been identified only for primary audiences for the purpose of planning training events. Given the achievable benefit in training value for secondary audiences, their training objectives need to be accounted for in the planning process, which may be facilitated through USMC-wide standardized training exercise planning tools and processes.

Performance Measures

To assess Marine performance when utilizing virtual systems, measures of performance need to be integrated into the training systems, and those measures need to inform training objectives. To aid in this, tools should be developed that allow training objectives to be identified as part of the scenario planning process and that facilitate assessing the performance levels of Marines during training. The process of identifying measures, and the collection tools themselves, need to be standardized to facilitate collection by multiple audiences.
Assessments of performance will require the identification of information cues (discrete sources of information that must be monitored and/or processed) and a tool capable of collecting performance measures from those cues.

LVC Capabilities Validation

The training support communities that put together training exercises need to be informed of LVC training capabilities. Planners need to know which LVC configurations are capable of providing effective training that meets the objectives requested by the fleet. The USMC has been validating and accrediting training systems for their training capabilities, but only in stand-alone configurations. The USMC now needs to examine what training can be enabled by LVC, validate those configurations to properly inform the training support communities, and update the appropriate doctrine to reflect the capabilities. In addition, contracts for simulator operators will need to include support for LVC, with cross-training on the capabilities of multiple simulators as well as coordination duties during exercises.
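The cue-based collection idea above can be sketched as a toy completeness check: each training objective maps to the information cues an assessor must monitor, and coverage is the fraction of those cues actually observed. Objective names, cue names, and the observed set are all hypothetical illustrations, not measures from LSE-14.

```python
# Illustrative only: map training objectives to the information cues an
# assessor monitors, then compute how completely each objective's cues
# were captured during the event.
objectives = {
    "Stimulate COP with virtual UAS feed": ["UAS feed on COP", "C2 track update"],
    "Coordinate virtual air with live forces": ["9-line received", "BDA reported"],
}

# Cues the assessment team actually observed (hypothetical).
observed = {"UAS feed on COP", "9-line received", "BDA reported"}

coverage = {}
for objective, cues in objectives.items():
    hit = sum(1 for cue in cues if cue in observed)
    coverage[objective] = hit / len(cues)
    print(f"{objective}: {coverage[objective]:.0%} of cues observed")
```

Standardizing such an objective-to-cue mapping during scenario planning is one way multiple assessment teams could collect comparable data across primary and secondary audiences.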


Standardize Training Value Measurements

As the USMC increases its emphasis on LVC training, it needs to establish a standardized definition of training value and a method of analyzing its factors, including cost, training effectiveness, and efficacy, to enable "training as we fight." Training effectiveness evaluations (TEEs) and cost ROI analyses do not adequately address the cumulative value of training solutions; more research is needed to develop an adequate and holistic approach to measuring training value. As these value elements (and others yet to be identified, e.g., exercise design time and manpower) are clearly defined and articulated, appropriate assessment strategies, including metrics, data collection tactics and tools, and analysis methodologies, may be properly tailored to suit each element. The USMC will then be able to provide the information training acquisition decision makers need to understand the training value of training solutions when evaluating tradeoffs.

CONCLUSION

The LSE-14 training value assessment results clearly demonstrate that augmenting live training events with virtual TADSS can provide the same training value at reduced cost. These results apply both to large-scale exercises and to home-station training, where devices may be integrated to meet lower-level training objectives. More detailed analyses and technical device assessments are required to identify specific capability gaps in the virtually integrated training environment; however, the results of this assessment provide quantitative and qualitative support for virtual integration as a value-added training tool in USMC live training exercises. Future training events will offer even greater cost avoidance and ROI as virtual simulations become more interoperable with each other and scenarios become more seamlessly integrated across the LVC platforms.
As interconnectivity issues become less of an impediment to scenario planners, and the connections between systems become more enduring and easily repeatable, cost avoidance numbers can be expected to continue to climb. There is great potential for cost savings as virtual simulations play a larger role in USMC training exercises. Augmenting live training with virtual training in regularly scheduled exercises represents a trade space in which exercise planners can decide how to make the most effective use of training budgets and training assets to achieve more effective training solutions. Training exercise design and LVC development should focus on training objectives and on the fidelity required of LVC capabilities. To achieve highly effective training with LVC, the recommendations contained in this paper should be built upon to improve training planning, execution, and evaluation. "Train as you will fight" is a fundamental principle of USMC training (USMC 1996), and replacing white cell role players with live Marines in the training environment introduces real-world factors such as stress inoculation and multi-organizational friction (e.g., the "fog of war"). Communications are often at the center of that "fog"; therefore, integrating C2 systems with TADSS is just as important as integrating the TADSS themselves. The training value identified in this paper as a result of virtual integration, and the ability to assess that training value to enhance live training, are both enduring requirements: "We need it now; we need it right." (Tomko, Col. T.S., oral presentation, June 2, 2015.)

ACKNOWLEDGEMENTS

The authors wish to acknowledge the vital cooperation of all the organizations involved in the LSE-14 virtual integration efforts. Without the vision, leadership, and tireless hours exerted by LtCol Stephen H. Mount and MSgt Robert A. Sousa, I MEF G7, the event would not have been successful and beneficial for Marines.
Thanks to Maj Barron Mills, who was key to coordinating MARCORSYSCOM's support of I MEF training needs; to John Larson, NAWCTSD, for assistance with developing the training value elements; and to Mike Guest, NAWCTSD, for assistance with data collection. Credit and thanks to John Roth, PM TRASYS, who was the primary data analyst for the cost and cost avoidance analysis, and to Capt Jonathan Richardson, PM TRASYS, who was the primary author of the After-Action Documentation and Analysis Report.

REFERENCES

Cermak, J., & McGurk, M. (2010, July). Putting a Value on Training. McKinsey Quarterly. Retrieved June 10, 2015, from http://www.mckinsey.com/insights/organization/putting_a_value_on_training


Dunford, J. F. (2015). U.S. Marine Corps Commandant's Planning Guidance. Retrieved May 1, 2015, from http://www.hqmc.marines.mil/Portals/142/Docs/2015CPG_Color.pdf

Gordon, S., & Cooley, T. (2013). Phase One Final Report: Cost Avoidance Study of USMC Simulation Training Systems.

Kirkpatrick, D. L., & Kirkpatrick, J. D. (1994). Evaluating Training Programs. Berrett-Koehler Publishers.

Marine Corps Systems Command Program Manager Training Systems. (2014). Large Scale Exercise 2014 (LSE-14) After-Action Documentation and Analysis.

Marine Corps Systems Command Program Manager Training Systems. (2014). Large Scale Exercise 2014 (LSE-14) Virtual Integration Assessment Report.

United States Marine Corps. (1996). Marine Corps Reference Publication (MCRP) 3-0A: Unit Training Management Guide. Washington, DC: Department of the Navy.