Image designed by Diane Fleischer
CRITICAL SUCCESS FACTORS for Crowdsourcing with Virtual Environments TO UNLOCK INNOVATION
Glenn E. Romanczuk, Christopher Willy, and John E. Bischoff

Senior defense acquisition leadership is increasingly advocating new approaches that can enhance defense acquisition. Their constant refrain is increased innovation, collaboration, and experimentation. The then Under Secretary of Defense for Acquisition, Technology, and Logistics, Frank Kendall, in his 2014 Better Buying Power 3.0 White Paper, called to “Incentivize innovation … Increase the use of prototyping and experimentation.” This article explores a confluence of technologies holding the key to faster development time linked to real warfighter evaluations. Innovations in Model Based Systems Engineering (MBSE), crowdsourcing, and virtual environments can enhance collaboration. This study focused on finding critical success factors, using the Delphi method, that allow virtual environments and MBSE to produce needed feedback and enhance the process. The Department of Defense can use the emerging findings to ensure that systems developed reflect stakeholders’ requirements. Innovative use of virtual environments and crowdsourcing can decrease the cycle time required to produce advanced innovative systems tailored to meet warfighter needs.
DOI: https://doi.org/10.22594/dau.16-758.24.02 (Online only) Keywords: Delphi method; collaboration; innovation; expert judgment
336 Defense ARJ, April 2017, Vol. 24 No. 2 : 334–367
Crowdsourcing with Virtual Environments http://www.dau.mil
A host of technologies and concepts holds the key for reducing development time linked to real warfighter evaluation and need. Innovations in MBSE, networking, and virtual environment technology can enable collaboration among the designers, developers, and end users, and can increasingly be utilized for warfighter crowdsourcing (Smith & Vogt, 2014). The innovative process can link ideas generated by warfighters, using game-based virtual environments in combination with the ideas, ranking, and filtering of the greater engineering staff. The DoD, following industry’s lead in crowdsourcing, can utilize the critical success factors and methods developed in this research to reduce the time needed to develop and field critical defense systems. Innovative use of virtual environments and crowdsourcing can increase the usefulness of weapon systems to meet the real needs of the true stakeholders—the warfighters.
The DoD, as a whole, has begun looking for efficiency by employing innovation, crowdsourcing, MBSE, and virtual environments (Zimmerman, 2015). Industry has led the way with innovative use of crowdsourcing for design and idea generation. Many of these methods utilize the public at large. However, this study will focus on crowdsourcing that uses warfighters and the larger DoD engineering staff, along with MBSE methodologies. This study focuses on finding the critical success factors, or key elements, and developing a process (framework) to allow virtual environments and MBSE to continually produce feedback from key stakeholders throughout the design cycle, not just at the beginning and end of the process. The proposed process has been developed based on feedback from a panel of experts using the Delphi method. The Delphi method, created by RAND in the 1950s, allows for exploration of solutions based on expert opinion (Dalkey, 1967). This study utilized a panel of 20 experts in modeling and simulation (M&S). The panel was a cross section of Senior Executive Service, senior Army, Navy, and DoD engineering staff, and academics with experience across the range of virtual environments, M&S, MBSE, and human systems integration (HSI). The panel developed critical success factors in each of the five areas explored: MBSE, HSI, virtual environments, crowdsourcing, and the overall process. HSI is an important part of the study because virtual environments can enable earlier detailed evaluation of warfighter integration in the system design.
Many researchers have conducted studies that looked for methods to make military systems design and acquisition more fruitful. A multitude of studies conducted by the U.S. Government Accountability Office (GAO) has also investigated the failures of the DoD to move defense systems from the early stages of conceptualization to finished designs useful to warfighters. The
GAO offered this observation: “Systems engineering expertise is essential throughout the acquisition cycle, but especially early when the feasibility of requirements are [sic] being determined” (GAO, 2015, p. 8). The DoD process is linked to the systems engineering process through the mandated use of the DoD 5000-series documents (Ferrara, 1996). However, for many reasons, major defense systems design and development cycles continue to fail, major programs are canceled, systems take too long to finish, or costs are significantly expanded (Gould, 2015). The list of DoD acquisition projects either canceled or requiring significantly more money or time to complete is long. Numerous attempts to redefine the process have fallen short. The DoD has, however, learned valuable lessons as a result of past failures such as the Future Combat System, Comanche, Next Generation Cruiser CG(X), and the Crusader (Rodriguez, 2014). A partial list of those lessons includes: the need for enhanced requirements generation, detailed collaboration with stakeholders, and better systems engineering utilizing enhanced tradespace tools.
The DoD is now looking to follow the innovation process emerging in industry to kick-start the innovation cycle and utilize emerging technologies to minimize the time from initial concept to fielded system (Hagel, 2014). This is a challenging goal that may require significant review and restructuring of many aspects of the current process. In his article “Digital Pentagon,” Modigliani (2013) recommended a variety of changes, including changes to enhance collaboration and innovation. Process changes and initiatives have been a constant in DoD acquisition for the last 25 years. As weapons have become more complex, software-intensive, and interconnected, DoD has struggled to find the correct mix of process and innovation. DoD acquisition policy encourages and mandates the utilization of systems engineering methods to design and develop complex defense systems. It is hoped that the emergence of MBSE concepts may provide a solid foundation and useful techniques that can be applied to harness and focus the fruits of the rapidly expanding innovation pipeline.
The goal and desire to include more M&S in defense system design and development has continually increased as computer power and software tools have become more powerful. Over the past 25 years, many new efforts have been launched to focus the utilization of advanced M&S. The advances in M&S have led to success in small pockets and in selected design efforts, but have not diffused fully across the entire enterprise. Several different process initiatives have been attempted over the last 30 years. The acquisition enterprise is responsible for the process, which takes ideas for defense systems; initiates programs to design, develop, and test a system; and then manages the program until the defense system is in the warfighters’ hands. A few examples of noteworthy process initiatives are Simulation Based Acquisition (SBA); Simulation and Modeling for Acquisition, Requirements, and Training (SMART); Integrated Product and Process Development (IPPD); and now, Model Based Systems Engineering (MBSE) and Digital Engineering Design (DED) (Bianca, 2000; Murray, 2014; Sanders, 1997; Zimmerman, 2015). These process initiatives (SBA, SMART, and IPPD) helped create some great successes in DoD weapon systems; however, the record of defense acquisition and the amount of time required to develop more advanced and increasingly complex interoperable weapon systems has been mixed at best. The emerging MBSE and DED efforts are too new to fully evaluate their contribution.
The Army’s development of the Javelin (AAWS-M) missile system is an interesting case study of a successful program that demonstrated the ability to overcome significant cost, technical, and schedule risks. Building on design and trade studies conducted by the Defense Advanced Research Projects Agency (DARPA) during the 1970s and utilizing a competitive prototype approach, the Army selected an emerging (imaging infrared seeker) technology from the three technology choices proposed. The innovative Integrated Flight Simulation, originally developed by the Raytheon/Lockheed joint venture, also played a key role in Javelin’s success. The final selection was heavily weighted toward “fire-and-forget” technology that, although costly and immature at the time, provided a significant benefit
to the warfighter (David, 1995; Lyons, Long, & Chait, 2006). This is a rare example of warfighter input and unique M&S efforts leading to a successful program. In contrast to Javelin’s successful use of innovative modeling and simulation is the Army’s development of Military Operations on Urbanized Terrain (MOUT) weapons. In design for 20 years, and still under development, is a new urban shoulder-launched munition for MOUT application now called the Individual Assault Munition (IAM). The MOUT weapon acquisition failure was in part due to challenging requirements. However, the complex competing technical system requirements might benefit from the use of detailed virtual prototypes and innovative game-based warfighter and engineer collaboration. IAM follows development of the Army’s Multipurpose Individual Munition (MPIM), a program started by the Army around 1994 and canceled in 2001. Army Colonel Richard Hornstein indicates that currently, after many program changes and requirements updates, system development of IAM will now begin again in the 2018 timeframe. However, continuous science and technology efforts at both the U.S. Army Armament Research, Development, and Engineering Center (ARDEC) and the U.S. Army Aviation and Missile Research, Development, and Engineering Center (AMRDEC) have been maintained for this type of system. Many of our allies and other countries in the world are actively developing MOUT weapons (Gourley, 2015; Jane’s, 2014). It is hoped that by using the framework and success factors described in this article, DoD will accelerate bringing needed capabilities to the warfighter, using innovative ideas and constant soldier, sailor, and airman input. With the changing threat environment in the world, the U.S. military can no longer allow capability gaps to be unfilled for 20 years or just wait to purchase similar systems from our allies. The development of MOUT weapons is an application area that is ripe for the methods discussed in this article. This study and enhanced utilization of virtual environments cannot correct all of the problems in defense acquisition. However, it is hoped that enhanced utilization of virtual environments and crowdsourcing, as a part of the larger
effort into Engineered Resilient Systems (ERS) and expanded tradespace tools, can provide acquisition professionals innovative ways to accelerate successful systems development.
BACKGROUND

Literature Review
This article builds upon detailed research by Murray (2014); Smith and Vogt (2014); London (2012); Korfiatis, Cloutier, and Zigh (2015); Corns and Kande (2011); and Madni (2015) that covered elements of crowdsourcing, virtual environments, gaming, early systems engineering, and MBSE. The research study described in this article was intended to expand the work discussed in this section and determine the critical success factors for using MBSE and virtual environments to harvest crowdsourcing data from warfighters and stakeholders, and then provide that data to the overall Digital System Model (DSM). The works reviewed in this section address virtual environments and prototyping, MBSE, and crowdsourcing. The majority of these are focused on the conceptualization phase of product design. However, these tools can be used for early product design and integrated into the detailed development phase up to Milestone C, the production and deployment decision.
Many commercial firms and some government agencies have studied the use of virtual environments and gaming to create “serious games” that have a purpose beyond entertainment (National Research Council [NRC], 2010). Commercial firms and DARPA have produced studies and programs to utilize an open innovation paradigm. General Electric, for one, is committed to “crowdsourcing innovation—both internally and externally … [b]y sourcing and supporting innovative ideas, wherever they might come from…” (General Electric, 2017, p. 1).
Researchers from many academic institutions are also working with open innovation concepts and leveraging input from large groups for concept creation and research into specific topics. Dr. Stephen Mitroff of The George Washington University created a popular game while at Duke University that was artfully crafted not only to be entertaining, but also to provide researchers access to a large pool of research subjects. Figure 1 shows a sample game screen. The game allows players to detect dangerous items from images created to look like a modern airport X-ray scan. The research utilized the game results to test hypotheses related to how the human brain detects multiple items after finding similar items. In addition, the game allowed testing on how humans detect very rare and dangerous items. The
game platform allowed for a large cross section of the population to interact and assist in the research, all while having fun. One of the keys to the usefulness of this game as a research platform is the ability to “phone home” or telemeter the details of the player-game interactions (Drucker, 2014; Sheridan, 2015). This research showed the promise of generating design and evaluation data from a diverse crowd of participants using game-based methods.
FIGURE 1. AIRPORT SCANNER SCREENSHOT
Note. (Drucker, 2014). Used by permission Kedlin Company.
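The “phone home” telemetry that makes a game like this useful as a research instrument can be sketched in a few lines. The class below is purely illustrative (the class name, event fields, and JSON Lines wire format are assumptions for this sketch, not Kedlin Company’s actual implementation): it timestamps each player-game interaction and serializes the batch for upload to a research server.

```python
import json
import time

class TelemetryLogger:
    """Illustrative logger for the player-game interaction events a
    research game might 'phone home' to its analysts (hypothetical
    design, not the Airport Scanner implementation)."""

    def __init__(self):
        self.events = []

    def record(self, player_id, event_type, payload):
        # Each event is timestamped so analysts can later reconstruct
        # the full sequence of player interactions.
        self.events.append({
            "player": player_id,
            "type": event_type,
            "data": payload,
            "t": time.time(),
        })

    def export(self):
        # Serialize to JSON Lines, a convenient format for batched
        # upload of many small events.
        return "\n".join(json.dumps(e) for e in self.events)

log = TelemetryLogger()
log.record("p01", "target_detected", {"item": "water bottle", "ms": 412})
log.record("p01", "target_missed", {"item": "rare_threat"})
print(len(log.events))  # → 2
```

The design choice that matters for research use is that every interaction, not just final scores, is captured: hypotheses about rare-item detection require the per-event record.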
Process

Several examples of process-related research that illustrate beginning inquiry into the use of virtual environments and MBSE to enhance systems development are reviewed in this section. Marine Corps Major Kate Murray (2014) explored the data that can be gained by the use of a conceptual Early Synthetic Prototype (ESP) environment. The envisioned environment used game-based tools to explore requirements early in the design process. The focus of her study was “What feedback can be gleaned, and is it useful to decision makers?” (Murray, 2014, p. 4). This innovative thesis ties together major concepts needed to create an exploration of design within a game-based framework. The study concludes that ESP should be utilized for Pre-Milestone A efforts. The Pre-Milestone A efforts are dominated by concept development and materiel solutions analysis. Murray also discussed many of the barriers to fully enabling the conceptual vision that she described. Such an ambitious project would require the warfighters to be able to craft their own scenarios and add novel capabilities. An interesting viewpoint discussed in this research is that the environment must be able to interest the warfighters enough to have them volunteer their game-playing time to assist in the design efforts. The practical translation of this is that the environment created must look and feel like similar games played by the warfighters, both in graphic detail and in terms of game challenges, to “keep … players engaged” (Murray, 2014, p. 25).
Corns and Kande (2011) describe a virtual engineering tool from the University of Iowa, VE-Suite. This tool utilizes a novel architecture, including a virtual environment. Three main engines interact: an Xplorer, a Conductor, and a Computational engine. In this effort, Systems Modeling Language (SysML) and Unified Modeling Language (UML) diagrams are integrated into the overall process. A sample environment is depicted simulating a fermentor and displaying a virtual prototype of the fermentation process controlled by a user interface (Corns & Kande, 2011). The extent and timing of the creation of detailed MBSE artifacts, and the amount of integration achievable or even desirable among specific types of modeling languages—e.g., SysML and UML—are important areas of study.
In his 2012 thesis, Brian London described an approach to concept creation and evaluation. The framework described utilizes MBSE principles to assist in concept creation and review. The benefits of the approach are explored through examples of a notional Unmanned Aerial Vehicle design project. Various SysML diagrams are developed and discussed. This approach advocates utilization of use-case diagrams to support the Concept of Operations (CONOPS) review (London, 2012).
Carlini (2010), in the Director, Defense Research and Engineering Rapid Toolbox Study, called for accelerated concept engineering with an expanded use of both virtual and physical prototypes, and support for more innovative interdisciplinary red teams. In this article, the terms “virtual environment” and “virtual prototype” can be used interchangeably. Korfiatis, Cloutier, and Zigh (2015) authored a series of articles between 2011 and 2015 related to CONOPS development and early systems engineering design methods. The Integrated Concept Engineering Framework evolved out of numerous research projects and articles looking at the combination of gaming and MBSE methods related to the task of CONOPS creation. This innovative work shows promise for the early system design and ideation stages of the acquisition cycle. There is recognition in this work that the warfighter will need an easy and intuitive way to add content to the game and modify the parameters that control objects in the game environment (Korfiatis et al., 2015).
Madni (2015) explored the use of storytelling and a nontechnical narrative along with MBSE elements to enable more stakeholder interaction in the design process. He studied the conjunction of stakeholder inputs, nontraditional methods, and the innovative interaction between the game engine, the virtual world, and the creation of systems engineering artifacts. The virtual worlds created in this research also allowed the players to share common views of their evolving environment (Madni, 2015; Madni, Nance, Richey, Hubbard, & Hanneman, 2014). This section has shown that researchers are exploring virtual environments with game-based elements, sometimes mixed with MBSE, to enhance the defense acquisition process.
Crowdsourcing

Wired magazine editors Jeff Howe and Mark Robinson coined the term “crowdsourcing” in 2005. In his Wired article titled “The Rise of Crowdsourcing,” Howe (2006) described several types of crowdsourcing. The working definition for this effort is “… the practice of obtaining needed services, ideas, design, or content by soliciting contributions from a large group of people and especially from the system stakeholders and users rather than only from traditional employees, designers, or management” (Crowdsourcing, n.d.).
The best fit for crowdsourcing, conceptually, for this current research project is the description of research and development (R&D) firms utilizing the InnoCentive Website to gain insights from beyond their in-house R&D team. A vital feature in all of the approaches is the use of the Internet and modern computational environments to find needed solutions or content, using the
diversity and capability of “the crowd” at significant cost or time savings. The DoD, following this lead, is attempting to explore the capabilities and solutions provided by the utilization of crowdsourcing concepts. The DoD has numerous restrictions that can hinder a full utilization, but an artfully crafted application and a focus on components or larger strategic concepts can help to overcome these barriers (Howe, 2006).
In a Harvard Business Review article, “Using the Crowd as an Innovation Partner,” Boudreau and Lakhani (2013) discussed the approaches to crowdsourcing that have been utilized in very diverse areas. They wrote: “Over the past decade, we’ve studied dozens of company interactions with crowds on innovation projects, in areas as diverse as genomics, engineering, operations research, predictive analytics, enterprise software development, video games, mobile apps, and marketing” (Boudreau & Lakhani, 2013, p. 60).
Boudreau and Lakhani discussed four types of crowdsourcing: contests, collaborative communities, complementors, and crowd labor. A key enabler of the collaborative communities concept is the utilization of intrinsic motivational factors such as the desire to contribute, learn, or achieve. As evidenced in their article, many organizations are clearly taking note of, and are beginning to leverage the power of, diverse, geographically separated ad hoc groups to provide innovative concepts, engineering support, and a variety of inputs that traditional employees normally would have provided (Boudreau & Lakhani, 2013).
In 2015, the U.S. Navy launched “Hatch.” The Navy calls this portal a “crowdsourced, ideation platform” (Department of the Navy, 2015). Hatch is part of a broader concept called the Navy Innovation Network (Forrester, 2015; Roberts, 2015). With this effort, the Navy hopes to build a continuous
process of innovation and minimize the barriers for information flow to help overcome future challenges. Novel wargaming and innovation pathways are to become the norm, not the exception. The final tools that will fall under this portal are still being developed. However, it appears that the Navy has taken a significant step forward to establish structural changes that will simplify the ideation and innovation pipeline, and ensure that the Navy uses all of the strengths of the total workforce. “Crowdsourcing, in all of its forms, is emerging as a powerful tool…. Organizational leaders should take every opportunity to examine and use the various methods for crowdsourcing at every phase of their thinking” (Secretary of the Navy, 2015, p. 7).
The U.S. Air Force has also been exploring various crowdsourcing concepts. It has introduced the Air Force Collaboratory Website and held a number of challenges and projects centered on three different technology areas. Recently, the U.S. Air Force opened a challenge prize on its new Website, http://www.airforceprize.com, with the goal of crowdsourcing a design concept for novel turbine engines that meet established design requirements and can pass the validation tests designed by the Air Force (U.S. Air Force, n.d.; U.S. Air Force, 2015).
Model Based Systems Engineering

MBSE tools have emerged and are supported by many commercial firms.
The path outlined by the International Council on Systems Engineering (INCOSE) in their Systems Engineering Vision 2020 document (INCOSE, 2007) shows that INCOSE expects the MBSE environment to evolve into a robust, interconnected development environment that can serve all systems engineering design and development functions. It remains to be seen if MBSE can transcend the past transformation initiatives of SMART, SBA, and others on the DoD side. The intent of the MBSE section of questions is to identify the key or critical success factors needed for MBSE to integrate into, or be encompassed within, a crowdsourcing process in order to provide the benefits that proponents of MBSE promise (Bianca, 2000; Sanders, 1997).
The Air Force Institute of Technology discussed MBSE and platform-based engineering in the context of collaborative design for rapid/expedited systems engineering (Freeman, 2011). The process outlined is very similar to the INCOSE view of the future, with MBSE included in the design process. Freeman covered the creation of a virtual collaborative environment that utilizes “tools, methods, processes, and environments that allow engineers, warfighters, and other stakeholders to share and discuss choices. This spans human-system interaction, collaboration technology, visualization, virtual environments, and decision support” (Freeman, 2011, p. 8).
As the DoD looks to use MBSE concepts, new versions of DoD Instruction 5000.02 and new definitions have emerged. These concepts and definitions can assist in developing and providing the policy language to fully utilize an MBSE-based process. The Office of the Deputy Secretary of Defense, Systems Engineering is working to advance several new approaches related to MBSE. New definitions have been proposed for Digital Threads and DED, using a DSM. The challenges of training the workforce and finding the correct proof-of-principle programs are being addressed (Zimmerman, 2015). These emerging concepts can help enable evolutionary change in the way DoD systems are developed and designed.
The director of the AMRDEC is looking to MBSE as the “ultimate cool way” to capture the excitement and interest of emerging researchers and scientists to collaborate and think holistically to capture “a single evolving computer model” (Haduch, 2015, p. 28). This approach is seen as a unique method to capture the passion of a new generation of government engineers (Haduch, 2015).
Other agencies of the federal government are also working on programs based on MBSE. David Miller, National Aeronautics and Space Administration (NASA) chief technologist, indicates that NASA is trying to use the techniques to modernize and focus future engineering efforts across the system life cycle, and to enable young engineers to value MBSE as a primary method to accomplish system design (Miller, 2015).
The level of interaction required and utilization of MBSE artifacts, methods, and tools to create, control, and interact with future virtual environments and simulations is a fundamental challenge.
SELECTED VIRTUAL ENVIRONMENT ACTIVITIES
Army

Within the Army, several efforts are underway to work on various aspects of virtual environments/synthetic environments that are important to the Army and to this research. Currently, efforts are being funded by the DoD at the Army Capabilities Integration Center (ARCIC), the Institute for Creative Technologies (ICT) at the University of Southern California, the Naval Postgraduate School (NPS), and the AMRDEC. The ESP efforts managed by Army Lieutenant Colonel Vogt continue to look at building a persistent, game-based virtual environment that can involve warfighters voluntarily in design and ideation (Tadjdeh, 2014). Several prototype efforts are underway
at ICT and NPS to help evolve a system that can provide feedback from the warfighters, playing game-based virtual environments that answer real design and strategy questions. Key questions being looked at include: what metrics to utilize, how to distribute the games, and whether the needed data can be saved and transmitted to the design team. Initial prototype environments have been built and tested. The ongoing work also looks at technologies that could enable more insight into the HSI issues by attempting to gather warfighter intent from sensors or camera data relayed to the ICT team (Spicer et al., 2015).
The “Always ON-ON Demand” efforts, managed by Dr. Nancy Bucher (AMRDEC) and Dr. Christina Bouwens, form a larger effort looking to tie together multiple simulations and produce an “ON-Demand” enterprise repository. The persistent nature of the testbed and the utilization of virtual environment tools, including the Navy-developed Simulation Display System (SIMDIS™) tool, which utilizes the OpenSceneGraph capability, offer exploration of many needed elements required to utilize virtual environments in the acquisition process (Bucher & Bouwens, 2013; U.S. Naval Research Laboratory, n.d.).
Navy

Massive Multiplayer Online War Game Leveraging the Internet (MMOWGLI) is an online strategy and innovation game employed by the U.S. Navy to tap the power of the “crowd.” It was jointly developed by the NPS and the Institute for the Future. Navy researchers developed the message-based game in 2011 to explore issues critical to the U.S. Navy of the future. The game is played based on specific topics and scenarios. Some of the games are open to the public and some are more restrictive. The way to score points and “win” the game is to offer ideas that other players comment upon, build new ideas upon, or modify. Part of the premise of the approach is based on this statement: “The combined intelligence of our people is an unharnessed pool of potential, waiting to be tapped” (Moore, 2014, p. 3). Utilizing nontraditional sources of information and leveraging the rapidly expanding network and visualization environment are key elements that can transform the current traditional pace of design and acquisition. In the future, it might be possible to tie this tool to more highly detailed virtual environments and models that could expand the impact of the overall scenarios explored and the ideas generated.
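The scoring mechanic described above, where an idea earns points when other players comment upon, build upon, or modify it, can be illustrated with a toy function. This is a hedged sketch of the concept only, not MMOWGLI’s actual algorithm; the interaction tuple format and the rule ignoring an author’s actions on their own idea are assumptions made for illustration.

```python
from collections import defaultdict

def score_ideas(interactions):
    """Toy crowd-scoring rule: an idea earns one point each time a
    player other than its author comments on, builds on, or modifies
    it. Illustrative only (not the MMOWGLI implementation)."""
    scores = defaultdict(int)
    for idea_id, action, actor, author in interactions:
        if actor != author:  # self-interactions do not score
            scores[idea_id] += 1
    return dict(scores)

interactions = [
    ("idea-1", "comment", "alice", "bob"),
    ("idea-1", "build",   "carol", "bob"),
    ("idea-2", "modify",  "bob",   "alice"),
    ("idea-2", "comment", "alice", "alice"),  # author's own action, ignored
]
print(score_ideas(interactions))  # → {'idea-1': 2, 'idea-2': 1}
```

The point of such a rule is that it rewards ideas the crowd engages with, rather than ideas merely submitted, which is what drives the ideation behavior the Navy is after.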
RESEARCH QUESTIONS

The literature review demonstrates that active research is ongoing into
crowdsourcing, MBSE, and virtual environments. However, there is not a fully developed process model and an understanding of the key elements that will provide the DoD a method to fully apply these innovations to successful system design and development. The primary research questions that this study examined to meet this need are:
• What are the critical success factors that enable game-based virtual environments to crowdsource design and requirements information from warfighters (stakeholders)?
• What process and process elements should be created to inject warfighter-developed ideas, metrics, and feedback from game-based virtual environment data and use cases?
• What is the role of MBSE in this process?
METHODOLOGY AND DATA COLLECTION

The Delphi technique was selected for this study to identify the critical
success factors for the utilization of virtual environments to enable crowdsourced information in the system design and acquisition process. Delphi is an appropriate research technique to elicit expert judgment where complexity and uncertainty prevail and only limited information is available on a topic area (Gallop, 2015; Skutsch & Hall, 1973). A panel of M&S experts was selected based on a snowball sampling technique. Finding experts across the DoD and academia was an important step in this research. Expertise in M&S, as well as virtual environment use in design or acquisition, was the primary expertise sought. Panel members who met the primary requirement areas, but also had expertise in MBSE, crowdsourcing, or HSI, were asked to participate. The sampling started with experts identified from the literature search as well as Army experts with appropriate experience known by the researcher. Table 1 shows a simplified description of the panel members as well as their years of experience and degree attainment. Numerous additional academic, Air Force, and Navy experts were contacted; however, the acceptance rate was very low.
TABLE 1. EXPERT PANEL EXPERTISE

DESCRIPTION  EDUCATION  EXPERIENCE
Academic Researcher—Alabama  PhD  20-30 years
Navy—Academic Researcher—California  PhD  20-30 years
Army Officer—Requirements/Game-Based Environments—Virginia  Masters  15-20 years
Army SES—M&S—Retired—Maryland  PhD  30+ years
Navy M&S Expert—Virginia  Masters  10-15 years
M&S Expert—Army SES—Retired  Masters  30+ years
M&S Expert—Army—Virtual Environments  Masters  10-15 years
M&S Expert—Army—V&V  PhD  20-30 years
M&S Expert—Army—Virtual Environments  PhD  15-20 years
M&S Expert—Army—Simulation  Masters  20-30 years
M&S Expert—Virtual Environments/Gaming  BS  15-20 years
M&S Expert—Army—Serious Games—Colorado  PhD  10-15 years
Academic Researcher—Virtual Environments—CONOPS—New Jersey  PhD  <10 years
M&S Expert—Army—Visualization  Masters  20-30 years
M&S Expert—Army/MDA—System of Systems (SoS) Simulation  BS  20-30 years
Academic Researcher—Florida  PhD  20-30 years
M&S Expert—Army Virtual Environments—Michigan  PhD  15-20 years
M&S Expert—Army—Simulation  PhD  10-15 years
Army M&S—Simulation/SoS  Masters  20-30 years
Army—Simulation—SES—Maryland  PhD  30+ years
Note. CONOPS = Concept of Operations; M&S = Modeling and Simulation; MDA = Missile Defense Agency; SES = Senior Executive Service; SoS = System of Systems; V&V = Verification and Validation
An exploratory “interview-style” survey was conducted using SurveyMonkey to collect demographic data and answers to a set of 38 questions. This survey took the place of the more traditional semistructured interview due to numerous scheduling conflicts. In addition, each member of the expert panel was asked to provide three possible critical success factors in the primary research areas. Follow-up phone conversations were utilized to
seek additional input from members of the panel. A large number of possible critical success factors emerged for each focus area. Figure 2 shows the demographics of the expert panel (n=20). More than half (55 percent) of the panel have doctoral degrees, and an additional 35 percent hold Master’s degrees. Figure 2 also shows the self-ranked expertise of the panel. All have interacted with the defense acquisition community. The panel has the most experience in M&S, followed by expertise in virtual environments, MBSE, HSI, and crowdsourcing. Figure 3 depicts a word cloud created from the content provided by the experts in the interview survey; the large text items show the factors mentioned most often. The initial list of 181 possible critical success factors was collected from the survey, with redundant content grouped or restated for each major topic area when developing the Delphi Round 1 survey. The expert panel was asked to rank the factors using a 5-element Likert scale from Strongly Oppose to Strongly Agree. The experts were also asked to rank their own or their groups’ status in each research area, ranging from “innovators” to “laggards,” for later statistical analysis.
FIGURE 2. EXPERT PANEL DEMOGRAPHICS AND EXPERTISE

[Pie charts summarizing the panel: degrees (PhD 55%, Masters 35%, Bachelors 10%) and self-ranked expertise (High/Medium/Low) in M&S, virtual environments, HSI, crowdsourcing, and MBSE. M&S expertise was ranked High by 95 percent of the panel, and virtual environment expertise High by 75 percent.]
FIGURE 3. WORD CLOUD FROM INTERVIEW SURVEY
Fifteen experts participated in the Round 1 Delphi study. The data generated were coded, and summary statistics were computed. Figure 4 shows the top 10 factors in each of four areas developed in Round 1—virtual environments, crowdsourcing, MBSE, and HSI—along with the mean, Interquartile Range (IQR), and percent agreement for each factor.
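The Round 1 statistics reported in Figure 4 can be reproduced from raw Likert ratings with a short script. The sketch below is illustrative, not the authors' code: the ratings list is hypothetical, "agreement" is assumed to mean a rating of Agree (4) or Strongly Agree (5), and the quartile convention is Python's default exclusive method, since the study does not report which convention was used.

```python
from statistics import mean, quantiles

def likert_summary(ratings, agree_threshold=4):
    """Summarize one factor's Likert ratings as (mean, IQR, % agreement).

    "Agreement" counts ratings at or above agree_threshold
    (Agree/Strongly Agree on a 5-point scale) -- an assumption, as is
    the quartile convention (Python's default exclusive method).
    """
    q1, _, q3 = quantiles(ratings, n=4)  # quartiles of the ratings
    iqr = q3 - q1                        # interquartile range
    pct_agree = 100 * sum(r >= agree_threshold for r in ratings) / len(ratings)
    return round(mean(ratings), 2), iqr, round(pct_agree)

# 15 hypothetical expert ratings for one candidate factor
print(likert_summary([5, 5, 5, 4, 4, 4, 4, 4, 5, 5, 4, 3, 5, 5, 4]))  # → (4.4, 1.0, 93)
```

With 15 raters this yields values in the same range as the Figure 4 rows (mean near 4.4, IQR of 1, agreement above 90 percent).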
The Round 2 survey included bar graphs with the statistics summarizing Round 1. It contained the top 10 critical success factors in each of the five areas—with the exception of the overall process model area, which contained a few additional possible critical success factors due to a survey software error. The Round 2 survey used an expanded Likert scale with seven levels, ranging from Strongly Disagree to Strongly Agree. The additional choices were intended to minimize ties and to help show where the experts strongly ranked the factors.
Fifteen experts responded to the Round 2 survey, rating the critical success factors determined from Round 1. The Round 2 critical success factors continued to receive a large percentage of experts choosing survey values ranging from “Somewhat Agree” to “Strongly Agree,” which confirmed the Round 1 top selections. But the Round 2 data also suffered from an increase in “Neither Agree nor Disagree” responses for success factors past the middle of the survey.
FIGURE 4. CRITICAL SUCCESS FACTOR RESULTS ROUND 1

VIRTUAL ENVIRONMENTS
CRITICAL SUCCESS FACTOR  MEAN  IQR  % AGREE
Real Time Operation 4.67 1 93%
Utility to Stakeholders 4.47 1 93%
Fidelity of Modeling/Accuracy of Representation 4.40 1 87%
Usability/Ease of Use 4.40 1 93%
Data Recording 4.27 1 87%
Verification, Validation and Accreditation 4.20 1 87%
Realistic Physics 4.20 1 80%
Virtual Environment Link to Problem Space 4.20 1 80%
Flexibility/Customization/Modularity 4.07 1 80%
Return On Investment/Cost Savings 4.07 1 87%
CROWDSOURCING
CRITICAL SUCCESS FACTOR  MEAN  IQR  % AGREE
Accessibility/Availability 4.53 1 93%
Leadership Support/Commitment 4.53 1 80%
Ability to Measure Design Improvement 4.47 1 93%
Results Analysis by Class of Stakeholder 4.33 1 93%
Data Pedigree 4.20 1 87%
Timely Feedback 4.20 1 93%
Configuration Control 4.13 1 87%
Engaging 4.13 1 80%
Mission Space Characterization 4.13 1 87%
Portal/Web site/Collaboration Area 4.07 1 87%
MBSE
CRITICAL SUCCESS FACTOR  MEAN  IQR  % AGREE
Conceptual Model of the Systems 4.60 1 87%
Tied to Mission Tasks 4.43 1 93%
Leadership Commitment 4.40 1 80%
Reliability/Repeatability 4.33 1 93%
Senior Engineer Commitment 4.33 1 80%
Fidelity/Representation of True Systems 4.27 1 93%
Tied To Measures of Performance 4.27 1 87%
Validation 4.27 1 93%
Well Defined Metrics 4.27 1 80%
Adequate Funding of Tools 4.20 2 73%
FIGURE 4. CRITICAL SUCCESS FACTOR RESULTS ROUND 1—CONTINUED

HSI
CRITICAL SUCCESS FACTOR  MEAN  IQR  % AGREE
Ability to Capture Human Performance Behavior 4.64 1 100%
Adequate Funding 4.57 1 100%
Ability to Measure Design Improvement 4.43 1 93%
Ability to Analyze Mental Tasks 4.36 1 100%
Integration with Systems Engineering Process 4.33 1 87%
Leadership Support/Commitment 4.29 1.25 79%
Intuitive Interfaces 4.29 1.25 79%
Consistency with Operational Requirements 4.27 1 93%
Data Capture into Metrics 4.21 1 86%
Fidelity 4.14 1 86%
Note. IQR = Interquartile Range.
The Round 3 survey included the summary statistics from Round 2 and charts showing the experts’ agreement from Round 2. The Round 3 questions presented the top 10 critical success factors in each area and asked the experts to rank them. The objective of the Round 3 survey was to determine whether the experts had achieved a level of consensus regarding the ranking of the top 10 factors from the previous round.
PROCESS AND EMERGING CRITICAL SUCCESS FACTOR THEMES
In the early concept phase of the acquisition process, more game-like elements can be utilized, and the choices of technologies can be very wide. The graphical details can be minimized in favor of the overall application area. However, as this process is applied later in the design cycle, more detailed virtual prototypes can be utilized, and there can be a greater focus on the detailed and subtle design differences that are of concern to the warfighter. The next sections present the overall process model and the critical success factors developed.
Process (Framework)

“For any crowdsourcing endeavor to be successful, there has to be a good feedback loop,” said Maura Sullivan, chief of Strategy and Innovation, U.S. Navy (Versprille, 2015, p. 12). Figure 5 illustrates a top-level view of the framework generated by this research. Comments and discussion
from the interview phase have been combined with the literature review data and information to create this process. Key elements from the Delphi study and the critical success factors have been utilized to shape this process. The fidelity of the models utilized would need to be controlled by the visualization/modeling/prototyping centers. These centers would provide key services to the warfighters and engineers to artfully create new game elements representing future systems and concepts, and to pull information from the enterprise repositories to add customizable game elements.
FIGURE 5. CROWDSOURCE INNOVATION FRAMEWORK
Note. MBSE = Model Based Systems Engineering; S&T = Science and Technology; SysML/UML = Systems Modeling Language/Unified Modeling Language.
The expert panel was asked: “Is Model Based Systems Engineering necessary in this approach?” The breakdown of responses revealed that 63 percent responded “Strongly Agree,” another 18.5 percent selected “Somewhat Agree,” and the remaining 18.5 percent answered “Neutral.” These results show strong agreement with using MBSE methodologies and concepts as an essential backbone, using MBSE as the “glue” to manage the use cases, and subsequently providing the feedback loop to the DSM.
In the virtual environment results from Round 1, real-time operation and realistic physics were agreed upon by the panel as critical success factors. The appropriate selection of simulation tools would be required to support these factors. Scenegraphs and open-source game engines have been evolving and maturing over the past 10 years. Many of these tools were commercial products that had proprietary architectures or were expensive.
[Figure 5 diagram: warfighter ideation and S&T projects and ideas feed use cases in SysML/UML and graphical scenario development through visualization/modeling/prototype centers, which develop game models and physics and engage the modeling team to add game features. An innovation portal, game engines, ranking/polling engines, and decision engines form the collaborative crowdsource innovation environment in which warfighters and engineers and scientists play and compete. Vote/rank/comment feedback, voting/ranking/filter feedback, and deployed, captured, and telemetered metrics flow back as MBSE UML/SysML artifacts (some autogenerated) into the enterprise repository/digital system models.]
However, as the trend toward more open-source tools continues, game engines have followed the trend. Past research conducted by Romanczuk (2012) linked scenegraph tools such as Prospect, Panda3D, and Delta3D to high-fidelity human injury modeling and lethality application programming interfaces. Currently, the DoD has tools like VBS2 and VBS3 available, but newer commercial-level engines are also becoming free for use by DoD and the public at large. Premier game engines such as Source, Unity, and Unreal are now open-source engines (Heggen, 2015). The trend continues as WebGL and other novel architectures allow rapid development of high-end complex games and simulations.
In the MBSE results from Round 1, the panel indicated that both ties to mission tasks and ties to measures of performance were critical. The selection of metrics and the mechanisms to tie these factors into the process are very important. Game-based metrics are appropriate, but these should be tied to elemental capabilities. Army researchers have explored an area called Degraded States for use in armor lethality (Comstock, 1991). The early work in this area has not found wide application in the Army. However, the elemental capability methodology, which is used for personnel analysis, should be explored for this application. Data can be presented to the warfighter that aid gameplay by using basic physics. In later life-cycle stages, by capturing and recording detailed data points, engineering-level simulations can be run after the fact, rather than in real time, with more detailed high-fidelity simulations by the engineering staff. This allows a detailed design based on feedback telemetered from the warfighter. The combination of telemetry from the gameplay and follow-up ranking by warfighters and engineering staff can allow in-depth, high-fidelity information flow into the emerging systems model. Figure 6 shows the authors’ views of the interactions and fidelity changes over the system life cycle.
FIGURE 6. LIFE CYCLE
Note. EMD = Engineering and Manufacturing Development; Eng/Sci = Engineers/Scientists; S&T = Science and Technology.
[Figure 6 diagram: across the life cycle—ideation, S&T, early concept, prototype evaluation, and EMD—warfighter and engineer/scientist interactions move from broad, open-innovation collaboration with low-fidelity representations supporting strategic trade studies and Analysis of Alternatives, through competitive evaluation of medium-fidelity evolving representations, to focused, comparative evaluation of high-fidelity design features.]
Collaboration and Filtering

A discussion on collaboration and filtering arose during the interviews. The feedback process from a crowd using virtual environments needs voting and filtering. The voting techniques used in social media or on Reddit are reasonable and well studied. Utilizing techniques familiar to the young warfighters will help simplify the overall process. The ranking and filtering need to be done by both engineers and warfighters so that decisions can take both viewpoints into consideration. Table 2 shows the top 10 critical success factors from Round 2 for the overall process, including the mean, IQR, and percent agreement for each factor. A collaboration area, ranking and filtering by scientists and engineers, and collaboration between the warfighters and the engineering staff are critical success factors—with a large amount of agreement from the expert panel.
TABLE 2. TOP 10 CRITICAL SUCCESS FACTORS
OVERALL PROCESS—ROUND 2
CRITICAL SUCCESS FACTOR MEAN IQR % AGREE
Filtering by Scientists/Engineers 5.56 1 81%
Portal/Website/Collaboration Area 5.56 1 81%
Leadership Support 6 2.5 75%
Feedback of Game Data into Process 5.56 2.75 75%
Timely Feedback 5.75 2.75 75%
Recognition 5.13 1.75 75%
Data Security 5.5 2.75 75%
Collaboration between Eng/Scientist and Warfighters 6.06 2.5 75%
Engagement (Warfighters) 5.94 3 69%
Engagement (Scientists & Engineers) 5.75 3 69%
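The Reddit-style voting techniques mentioned above can be made concrete. One widely used, well-studied choice (an illustration here, not a technique the study prescribes) is ranking items by the lower bound of the Wilson score confidence interval on their approval rate, which keeps a handful of early upvotes from outranking a well-sampled submission:

```python
import math

def wilson_lower_bound(upvotes, total, z=1.96):
    """Lower bound of the 95% Wilson score interval on the true
    approval rate of an item, given upvotes out of total votes.
    Sorting by this bound ranks a well-sampled 90/100 item above a
    1/1 item, damping the effect of a few early votes."""
    if total == 0:
        return 0.0
    p = upvotes / total
    z2 = z * z
    centre = p + z2 / (2 * total)
    spread = z * math.sqrt((p * (1 - p) + z2 / (4 * total)) / total)
    return (centre - spread) / (1 + z2 / total)

# A broadly supported design idea outranks one enthusiastic early vote
print(wilson_lower_bound(90, 100) > wilson_lower_bound(1, 1))  # → True
```

A portal could apply the same scoring separately to warfighter and engineer votes, supporting the results-analysis-by-class-of-stakeholder factor from Round 1.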
Fidelity

Fidelity was ranked high in virtual environments, MBSE, and HSI. Fidelity and accuracy of the modeling and representations of the true system are critical success factors. For the virtual environment, early work would be done with low-facet-count models featuring texture maps for realism. However, as the system moves through the life cycle, higher fidelity models, and models that feed into detailed design simulations, will be required. These models must also undergo verification, validation, and accreditation as they enter the modeling repository or the DSM.
Leadership Commitment

Leadership commitment was ranked near the top in the MBSE, crowdsourcing, and HSI areas. Clearly, in these emerging areas the enterprise needs strong leadership and training to enable MBSE and crowdsourcing initiatives. The newness of MBSE and crowdsourcing may be related to the experts’ high ranking of the need for leadership and senior engineer commitment. Leadership support is also a critical success factor in Table 2—with 75 percent agreement from the panel. Leadership commitment and support, although somewhat obvious as a success factor, may have been lacking in previous initiatives. That commitment needs to be reflected in both policy and funding from DoD and Service leadership to encourage and spur these innovative approaches.
Critical Success Factors

Figure 7 details the critical success factors generated from the Delphi study, visualizing the top 10 factors in each area in a mind-mapping diagram. The main areas of study in this article are shown as major branches, with the critical success factors appearing on the limbs of the diagram. The previous sections have discussed some of the emerging themes and how some of the recurring critical success factors in each area can be utilized in the framework developed. The Round 3 rankings of the critical success factors were analyzed by computing Kendall’s W, the coefficient of concordance. Kendall’s W is a nonparametric statistic that measures the agreement of a group of raters. The experts’ rankings of the success factors showed moderate, but statistically significant, agreement or consensus.
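The Round 3 consensus check can be sketched directly from the definition of Kendall's W. The function below is an illustrative implementation assuming complete rankings with no tied ranks; the rankings shown are hypothetical, not the panel's data.

```python
def kendalls_w(rankings):
    """Kendall's coefficient of concordance W for m raters each
    ranking the same n items (ranks 1..n, no ties assumed).

    W = 1.0 means perfect agreement; W = 0.0 means no agreement.
    The statistic m*(n-1)*W is approximately chi-square with n-1
    degrees of freedom, supporting a significance test."""
    m, n = len(rankings), len(rankings[0])
    rank_sums = [sum(r[i] for r in rankings) for i in range(n)]
    mean_sum = m * (n + 1) / 2                         # expected rank sum per item
    s = sum((rs - mean_sum) ** 2 for rs in rank_sums)  # squared deviations
    w = 12 * s / (m * m * (n ** 3 - n))
    return w, m * (n - 1) * w                          # (W, chi-square statistic)

# Three hypothetical raters in perfect agreement on three factors
w, chi_sq = kendalls_w([[1, 2, 3], [1, 2, 3], [1, 2, 3]])
print(w)  # → 1.0
```

Moderate consensus, as reported for Round 3, would correspond to W values between these extremes, with significance judged from the chi-square statistic.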
FIGURE 7. CRITICAL SUCCESS FACTORS IN FIVE KEY AREAS

[Mind-mapping diagram with five major branches—Virtual Environment, MBSE, HSI, Overall Process, and Crowdsourcing—each listing its top 10 critical success factors as given in Figure 4 and Table 2.]
LIMITATIONS TO THE RESEARCH

The ideas presented here and the critical success factors were developed by a team of experts who hold advanced degrees and have, on average, 20 to 30 years of experience in the primary area of inquiry. However, the panel was more heavily weighted toward Army experts than individuals from the rest of the DoD. Neither time nor resources allowed for study of other important groups of experts, including warfighters, industry experts, and program managers. The Delphi method was selected for this study to generate the critical success factors based on the perceived ease of use of the method and the controlled feedback gathered. The critical success factors developed are ranked judgment, but judgment based on years of expertise. This study considered five important areas and identified critical success factors in those areas. The research is based on the viewpoint of experts in M&S; other types of expert viewpoints might generate additional factors. Several factor areas could not be covered by M&S experts, including security and information technology.
The surveys were constructed with 5- and 7-element Likert scales that allowed the experts to choose “Neutral” or “Neither Agree nor Disagree.” Not utilizing a forced-choice scale or a nonordinal data type in later Delphi rounds can limit data aggregation and statistical analysis approaches.
RECOMMENDATIONS AND CONCLUSIONS
In conclusion, innovation tied to virtual environments and linked to MBSE artifacts can help the DoD meet the significant challenges it faces in creating new complex, interconnected designs much faster than in the past decade. This study has explored key questions and has developed critical success factors in five areas. A general framework has also been developed. The DoD must look for equally innovative ways to meet numerous information technology (IT), security, and workforce challenges to implement the process successfully in the acquisition enterprise. The DoD should also explore interdisciplinary teams by hiring and funding teams of programmers and content creators co-located with systems engineers and subject matter experts.

Artfully crafted game-based scenarios that help explore design and usability issues can be provided to warfighters as a part of the process and help focus on needed system information. The challenge remains for the methods to harvest, filter, and convert the information gathered to
MBSE artifacts that result from this process. An overall process can be enacted that takes ideas, design alternatives, and data harvested—and then provides a path to feed back this data at many stages in the acquisition cycle. The extent to which MBSE tools such as SysML, UML, and emerging new standards are adopted or utilized in the process may depend upon the emerging training of acquisition professionals in MBSE and the leadership commitment to this approach.
This article has answered the three research questions posed in earlier discussion. Utilizing the expert panel, critical success factors have been developed using the Delphi method. An emerging process model has been described. Finally, the experts in this Delphi study have affirmed an essen-tial role of MBSE in this process.
FUTURE RESEARCH

The DoD is actively conducting research into the remaining challenges to bring many of the concepts discussed in this article into the acquisition process. The critical success factors developed here can be utilized to focus some of those efforts.
Key challenges in DoD remain, as the current IT environment attempts to study larger virtual environments and prototypes. The question of how to utilize the Secret Defense Engineering Research Network, High Performance Supercomputing, and Secret Internet Protocol Router Network, while simultaneously making the process continually available to warfighters, will need to be answered. The ability of deployed warfighters to engage in future system design efforts is also a risk item that needs to be investigated. Research is essential to identify the limitations and inertia associated with the DoD IT environment in relation to virtual environments and crowdsourcing. An expanded future research study that uses additional
inputs, including a warfighter expert panel and an industry expert panel, would provide useful data to compare and contrast with the results of this study.
Combining the process described in this research with tradespace methodologies and ERS approaches could also be explored. MBSE methods to link and provide feedback should likewise be studied.
The DoD should support studies that select systems in the early stages of development in each Service to apply the proposed framework and process. The studies should use real gaps and requirements, and real warfighters. In support of ARCIC, several studies are proposed at the ICT and the NPS that explore various aspects of the challenges involved in testing tools needed to advance key concepts discussed in this article. The Navy, Air Force, and Army have active programs under various names to determine how M&S can support future systems development as systems and designs become more complex, distributed, and interconnected (Spicer et al., 2015).
When fully developed, MBSE and DSM methods can leverage the emerging connected DoD enterprise and bring about a continuous-feedback design environment. Applying the concepts developed in this article to assessments conducted by developing concepts, Analysis of Alternatives, and trade studies conducted during early development through Milestone C can lead to more robust, resilient systems continuously reviewed and evaluated by the stakeholders who truly matter: the warfighters.
References

Bianca, D. P. (2000). Simulation and modeling for acquisition, requirements, and
training (SMART) (Report No. ADA376362). Retrieved from http://oai.dtic.mil/oai/oai?verb=getRecord&metadataPrefix=html&identifier=ADA376362
Boudreau, K. J., & Lakhani, K. R. (2013). Using the crowd as an innovation partner. Harvard Business Review, 91(4), 60–69.
Bucher, N., & Bouwens, C. (2013). Always on–on demand: Supporting the development, test, and training of operational networks & net-centric systems. Presentation to National Defense Industrial Association 16th Annual Systems Engineering Conference, October 28-31, Crystal City, VA. Retrieved from http://www.dtic.mil/ndia/2013system/W16126_Bucher.pdf
Carlini, J. (2010). Rapid capability fielding toolbox study (Report No. ADA528118). Retrieved from http://www.dtic.mil/dtic/tr/fulltext/u2/a528118.pdf
Comstock, G. R. (1991). The degraded states weapons research simulation: An investigation of the degraded states vulnerability methodology in a combat simulation (Report No. AMSAA-TR-495). Aberdeen Proving Ground, MD: U.S. Army Materiel Systems Analysis Activity.
Corns, S., & Kande, A. (2011). Applying virtual engineering to model-based systems engineering. Systems Research Forum, 5(2), 163–180.
Crowdsourcing (n.d.). In Merriam-Webster’s online dictionary. Retrieved from http://www.merriam-webster.com/dictionary/crowdsourcing
Dalkey, N. C. (1967). Delphi (Report No. P-3704). Santa Monica, CA: The RAND Corporation.
David, J. W. (1995). A comparative analysis of the acquisition strategies of Army Tactical Missile System (ATACMS) and Javelin Medium Anti-armor Weapon System (Master’s thesis), Naval Postgraduate School, Monterey, CA.
Department of the Navy. (2015, May 20). The Department of the Navy launches the “Hatch.” Navy News Service. Retrieved from http://www.navy.mil/submit/display.asp?story_id=87209
Drucker, C. (2014). Why airport scanners catch the water bottle but miss the dynamite [Duke Research Blog]. Retrieved from https://sites.duke.edu/dukeresearch/2014/11/24/why-airport-scanners-catch-the-water-bottle-but-miss-the-dynamite/
Ferrara, J. (1996). DoD's 5000 documents: Evolution and change in defense acquisition policy (Report No. ADA487769). Retrieved from http://oai.dtic.mil/oai/oai?verb=getRecord&metadataPrefix=html&identifier=ADA487769
Forrester, A. (2015). Ray Mabus: Navy’s ‘Hatch’ platform opens collaboration on innovation. Retrieved from http://www.executivegov.com/2015/05/ray-mabus-navys-hatch-platform-opens-collaboration-on-innovation/
Freeman, G. R. (2011). Rapid/expedited systems engineering (Report No. ADA589017). Wright-Patterson AFB. OH: Air Force Institute of Technology, Center for Systems Engineering.
Gallop, D. (2015). Delphi, dice, and dominos. Defense AT&L, 44(6), 32–35. Retrieved from http://dau.dodlive.mil/files/2015/10/Gallop.pdf
GAO. (2015). Defense acquisitions: Joint action needed by DOD and Congress to improve outcomes (Report No. GAO-16-187T). Retrieved from http://www.gao.gov/assets/680/673358.pdf
General Electric. (2017). GE open innovation. Retrieved from http://www.ge.com/about-us/openinnovation
Gould, J. (2015, March 19). McHugh: Army acquisitions 'tale of failure'. DefenseNews. Retrieved from http://www.defensenews.com/story/defense/land/army/2015/03/19/mchugh-army-acquisitions-failure-underperforming-canceled-/25036605/
Gourley, S. (2015). US Army looks to "full spectrum" shoulder-fired weapon. Retrieved from https://www.military1.com/army-training/article/572557-us-army-looks-to-full-spectrum-shoulder-fired-weapon/
Haduch, T. (2015). Model based systems engineering: The use of modeling enhances our analytical capabilities. Retrieved from http://www.army.mil/e2/c/downloads/401529.pdf
Hagel, C. (2014). Defense innovation days. Keynote presentation to Southeastern New England Defense Industry Alliance. Retrieved from http://www.defense.gov/News/Speeches/Speech-View/Article/605602
Heggen, E. S. (2015). In the age of free AAA game engines, are we still relevant? Retrieved from http://jmonkeyengine.org/301602/in-the-age-of-free-aaa-game-engines-are-we-still-relevant/
Howe, J. (2006). The rise of crowdsourcing. Wired, 14(6), 1–4. Retrieved from http://www.wired.com/2006/06/crowds/
INCOSE. (2007). Systems engineering vision 2020 (Report No. INCOSE-TP-2004-004-02). Retrieved from http://www.incose.org/ProductsPubs/pdf/SEVision2020_20071003_v2_03.pdf
Jane’s International Defence Review (2015). Lighten up: Shoulder-launched weapons come of age. Retrieved from http://www.janes360.com/images/assets/442/49442/shoulder-launched_weapon_systems_come_of_age.pdf
Kendall, F. (2014). Better buying power 3.0 [White Paper]. Retrieved from Office of the Under Secretary of Defense (Acquisition, Technology & Logistics) Website: http://www.defenseinnovationmarketplace.mil/resources/BetterBuyingPower3(19September2014).pdf
Korfiatis, P., Cloutier, R., & Zigh, T. (2015). Model-based concept of operations development using gaming simulation: Preliminary findings. Simulation & Gaming. Thousand Oaks, CA: Sage Publications. https://doi.org/10.1177/1046878115571290
London, B. (2012). A model-based systems engineering framework for concept development (Master’s thesis), Massachusetts Institute of Technology, Cambridge, MA. Retrieved from http://hdl.handle.net/1721.1/70822
Lyons, J. W., Long, D., & Chait, R. (2006). Critical technology events in the development of the Stinger and Javelin Missile Systems: Project hindsight revisited. Washington, DC: Center for Technology and National Security Policy.
Madni, A. M. (2015). Expanding stakeholder participation in upfront system engineering through storytelling in virtual worlds. Systems Engineering, 18(1), 16–27. https://doi.org/10.1002/sys.21284
Madni, A. M., Nance, M., Richey, M., Hubbard, W., & Hanneman, L. (2014). Toward an experiential design language: Augmenting model-based systems engineering with technical storytelling in virtual worlds. Procedia Computer Science, 28(2014), 848–856.
Miller, D. (2015). Update on OCT activities. Presentation to NASA Advisory Council, Technology, Innovation, and Engineering Committee. Retrieved from https://www.nasa.gov/sites/default/files/atoms/files/dmiller_oct.pdf
Modigliani, P. (2013, November–December). Digital Pentagon. Defense AT&L, 42(6), 40–43. Retrieved from http://dau.dodlive.mil/files/2013/11/Modigliani.pdf
Moore, D. (2014). NAWCAD 2030 strategic MMOWGLI data summary. Presentation to Naval Air Systems Command. Retrieved from https://portal.mmowgli.nps.edu/documents/10156/108601/COMMS+1_nscMMOWGLIOverview_post.pdf/4a937c44-68b8-4581-afd2-8965c02705cc
Murray, K. L. (2014). Early synthetic prototyping: Exploring designs and concepts within games (Master’s thesis), Naval Postgraduate School, Monterey, CA. Retrieved from http://calhoun.nps.edu/handle/10945/44627
National Research Council. (2010). The rise of games and high-performance computing for modeling and simulation. Committee on Modeling, Simulation, and Games. Washington, DC: National Academies Press. https://doi.org/10.17226/12816
Roberts, J. (2015). Building the Naval Innovation Network. Retrieved from http://www.secnav.navy.mil/innovation/Pages/2015/08/NIN.aspx
Rodriguez, S. (2014). Top 10 failed defense programs of the RMA era. War on the Rocks. Retrieved from http://warontherocks.com/2014/12/top-10-failed-defense-programs-of-the-rma-era/
Romanczuk, G. E. (2012). Visualization and analysis of arena data, wound ballistics data, and vulnerability/lethality (V/L) data (Report No. TR-RDMR-SS-11-35). Redstone Arsenal, AL: U.S. Army Armament Research, Development and Engineering Center.
Sanders, P. (1997). Simulation-based acquisition. Program Manager, 26(140), 72–76.
Secretary of the Navy. (2015). Characteristics of an innovative Department of the Navy. Retrieved from http://www.secnav.navy.mil/innovation/Documents/2015/07/Module_4.pdf
Sheridan, V. (2015). From former NASA researchers to LGBT activists – meet some faces new to GW. The GW Hatchet. Retrieved from http://www.gwhatchet.com/2015/08/31/from-former-nasa-researchers-to-lgbt-activists-meet-some-faces-new-to-gw/
Skutsch, M., & Hall, D. (1973). Delphi: Potential uses in educational planning. Project Simu-School: Chicago Component. Retrieved from https://eric.ed.gov/?id=ED084659
Smith, R. E., & Vogt, B. D. (2014, July). A proposed 2025 ground systems “Systems Engineering” process. Defense Acquisition Research Journal, 21(3), 752–774. Retrieved from http://www.dau.mil/publications/DefenseARJ/ARJ/ARJ70/ARJ-70_Smith.pdf
Spicer, R., Evangelista, E., Yahata, R., New, R., Campbell, J., Richmond, T., Vogt, B., & McGroarty, C. (2015). Innovation and rapid evolutionary design by virtual doing: Understanding early synthetic prototyping (ESP). Retrieved from http://ict.usc.edu/pubs/Innovation%20and%20Rapid%20Evolutionary%20Design%20by%20Virtual%20Doing-Understanding%20Early%20Synthetic.pdf
Tadjdeh, Y. (2014). New video game could speed up acquisition timelines. National Defense. Retrieved from http://www.nationaldefensemagazine.org/blog/lists/posts/post.aspx?ID=1687
U.S. Air Force. (n.d.). The Air Force collaboratory. Retrieved from https://collaboratory.airforce.com/
U.S. Air Force. (2015). Air Force prize. Retrieved from https://airforceprize.com/about
U.S. Naval Research Laboratory. (n.d.). SIMDIS™ presentation. Retrieved from https://simdis.nrl.navy.mil/SimdisPresentation.aspx
Versprille, A. (2015). Crowdsourcing to solve tough Navy problems. National Defense. Retrieved from http://www.nationaldefensemagazine.org/archive/2015/June/Pages/CrowdsourcingtoSolveToughNavyProblems.aspx
Zimmerman, P. (2015). MBSE in the Department of Defense. Seminar presentation to Goddard Space Flight Center. Retrieved from https://ses.gsfc.nasa.gov/ses_data_2015/150512_Zimmerman.pdf
Author Biographies
Mr. Glenn E. Romanczuk is a PhD candidate at The George Washington University. He is a member of the Defense Acquisition Corps, matrixed to the Operational Test Agency (OTA) evaluating the Ballistic Missile Defense System. He holds a BA in Political Science from DePauw University, a BSE from the University of Alabama in Huntsville (UAH), and an MSE in Engineering Management from UAH. His research interests include systems engineering, lethality, visualization, and virtual environments.
(E-mail address: [email protected])
Dr. Christopher Willy is currently a senior systems engineer and program manager with J. F. Taylor, Inc. Prior to joining J. F. Taylor in 1999, he completed a career in the U.S. Navy. Since 2009, he has taught courses as a professorial lecturer for the Engineering Management and Systems Engineering Department at The George Washington University (GWU). Dr. Willy holds a DSc degree in Systems Engineering from GWU. His research interests are in stochastic processes and systems engineering.
(E-mail address: [email protected])
Dr. John E. Bischoff is a professorial lecturer of Engineering Management at The George Washington University (GWU). He has held executive positions in several firms, including AOL/Time Warner and IBM Watson Research Labs. Dr. Bischoff holds a BBA from Pace University, an MBA in Finance from Long Island University, an MS in Telecommunications Management from the Polytechnic University, and a Doctor of Science in Engineering Management from GWU.
(E-mail address: [email protected])