NAFEMS World Congress 2019 – Paper Submission

Get the Data Right for Effective Multidisciplinary SPDM – Making the Case for a Tool-Independent Unified Data Model

Malcolm Panthaki
VP of Analysis Solutions, Aras Corp., USA

Marc Lind
SVP Strategy, Aras Corp., USA

Abstract

Increasing complexity in product architectures is placing a new level of emphasis on some traditional SPDM challenges. The complexity of effectively conducting 3-D simulations for various physics scenarios has been compounded by the necessity to create and manage mixed-fidelity and multidisciplinary models, and to rapidly conduct large numbers of simulations in the overall systems engineering process, including on-board software which often dynamically controls system behavior.

Today, much of the data required to drive multi-physics, multi-fidelity simulations are specified in the disparate data formats of each of the underlying multi-vendor tools. These siloed data must be “integrated” manually by the engineers, with a severe impact on accuracy and efficiency, limiting the number of simulations that can be performed. Also, when multiple disciplines such as mechanical, electronics, and software must be considered to simulate system behavior, the data and processes become exponentially more complex.

The authors contend that a tool-agnostic, unified, requirements-driven, systems-centric data model is required to best capture these data for simulation and SPDM, and that this approach has many advantages over a federated approach to integrating data. The advantages are further emphasized when one considers the need for intelligent simulation automation that works across significant design changes and large numbers of product variants in a product family, and the need to manage a fully associative Digital Thread across the entire product lifecycle with support for versioning and configuration management.

The authors will present two previously published case studies to illustrate the benefits of this approach for representing and managing simulation data and automating complex multi-fidelity, multidisciplinary simulations that include dynamic controls software.

Case Study 1: Analyzing the Structural Thermal Optical Performance (STOP) of an optical system is a complex, multidisciplinary, manual, and inefficient process that is prone to human error. In this session, we will review how optical system design teams at NASA and The Aerospace Corporation have used the Aras Comet SPDM Workspace to rapidly perform STOP analyses, including parametric, “what-if” trade studies of their system designs and active thermal controls software. The case study will compare the efficiency and robustness of this automated process to the prior manual process. [1], [2]

Case Study 2: Directed-energy laser weapons are complex precision optical systems being developed by the US Air Force. In this case study, the Air Force Research Labs wanted to simulate an early design that was being tested in the lab and was showing aberrant behavior. The simulation is a mixed-fidelity, multi-physics simulation. For computational efficiency, it was necessary to combine lumped-parameter systems models for most of the laser system with 3-D models of certain subsystems that required higher fidelity, using co-simulation techniques. The simulation process was automated, significantly increasing the efficiency of the transient co-simulation trade studies. [3]

1. The status quo cannot meet the exponentially-increasing need for simulation and effective SPDM

Recently, the bottom-line potential of a range of strategic initiatives, including Digital Thread traceability (PLM), Digital Twins for predictive maintenance and design improvements, multidisciplinary optimization (MDO) and uncertainty quantification, generative design, and the quality assessment of additive products, has galvanized the C-level at major global organizations. At its core, each of these initiatives requires mainstreaming simulation automation and data management so that they work robustly across significant and unpredictable design changes and across product families that share common functional architectures.

The status quo, with its inefficient, manual, error-prone and siloed SPDM, driven by a scarce population of simulation experts, cannot meet this exponentially increasing demand for rapid, timely, and accurate simulation. Current approaches lead to simulation lagging behind in the product development lifecycle, as simulation data is not available to the enterprise in a timely manner for rapid decision-making and predictive maintenance.

Furthermore, the closed landscape and approaches of legacy PLM vendors place additional constraints on the use of multi-vendor tools and the free flow of simulation data across the enterprise. These legacy PLM vendors sell their own CAD and CAE tools and have, understandably, created platforms and data models that best support their own tools, often providing limited or no support for competitive tools, in-house tools, and their related data. This is despite the widely accepted fact that their customers use a wide array of tools from a diverse set of vendors.

The desire to automate simulation processes has existed for decades. The status quo is often scripting- and programming-intensive, without the support of a unified data model, and yields unsatisfactory results, limited repeatability, and limited ROI. The ad hoc nature of this approach has resulted in fragmented solutions that do not work well across the entire design space, are difficult to comprehend and maintain, and are isolated from other enterprise product data and processes.

Since the 1990s, Process Integration and Design Optimization (PIDO) tools have provided “process integration” to automate simulation steps using a black-box approach to the model data and results. Design changes, essential for any design-space exploration, rely on automatically editing model files without semantic knowledge of their content. This limits the scope of design changes that can be explored, especially at higher (3-D) levels of model fidelity. When the geometric design of the product, defined in CAD, changes significantly, this “process integration” technique starts to break down.
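To make this concrete, the following is a minimal, hypothetical sketch (not any vendor's actual implementation) of the black-box file-editing approach: the automation layer patches parameter values in a text-based solver input deck by pattern matching, with no semantic model of the geometry, mesh, or physics behind those values. A design change that alters topology rather than a numeric value has no such textual handle, which is why the technique breaks down.

import re

# Hypothetical illustration of black-box "process integration": the PIDO layer
# treats the solver input deck as opaque text and patches parameter values by
# pattern matching. The parameter names and deck format are invented.
def patch_deck(deck_text: str, name: str, value: float) -> str:
    # Replace e.g. "PLATE_THICKNESS = 2.0" with a new value; changes that add
    # parts, contacts, or mesh regions cannot be expressed as a substitution.
    pattern = rf"^(\s*{name}\s*=\s*)\S+"
    return re.sub(pattern, rf"\g<1>{value}", deck_text, flags=re.MULTILINE)

deck = "PLATE_THICKNESS = 2.0\nBOLT_PRELOAD = 4500.0\n"
print(patch_deck(deck, "PLATE_THICKNESS", 2.5))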

Figure 1: Ramifications of continuing the status quo in SPDM

The ramifications (Figure 1) have been borne by end-user organizations and will intensify as the need for rapid and accurate simulation and SPDM exponentially increases.

2. SPDM Challenges

The challenges facing effective SPDM (Figure 2) have been driven by the need to move towards a predominantly virtual testing environment, the need to support Model Based Systems Engineering (MBSE) of increasingly complex and multidisciplinary products, and a changing business landscape that requires predictive maintenance. Organizations are struggling to overcome these challenges using current simulation and SPDM approaches and tools.

Figure 2: Challenges facing current simulation and SPDM approaches and tools

3. What’s Needed – Intelligent Simulation Automation and Data Management in an open, scalable, extensible PLM Platform

The authors contend that more effective enterprise-wide SPDM is foundational to achieving closed-loop traceability with requirements, test results, and design data, and to supporting Intelligent Simulation Automation (ISA).

ISA is a fundamentally different approach that works robustly across significant design changes and across an entire product family, while supporting the appropriate level of mixed-fidelity modelling, from 0-D through 3-D, across the various physics. Unlike the scripting-based PIDO approach, the introduction of a neutral, unified data model for SPDM in the PLM platform provides an abstract model that expands the design scope of the automation processes, enabling analysts to focus on the engineering of the product.

ISA, with its robust simulation automation technology, becomes a foundation for the successful implementation of various corporate strategic initiatives, including Digital Twin analysis for predictive maintenance and design improvements, multidisciplinary optimization (MDO) and uncertainty quantification, generative design, and the quality assessment of additive products.

We propose the following foundational requirements for effective SPDM.

Requirement 1: Support for Systems Engineering is central to PLM.

All products are systems: their engineering, from concept to deployment and maintenance, involves complex data and processes and multidisciplinary teams. The authors contend that requirements-driven system models must be the “connective tissue” that binds the multidisciplinary data, including simulation data and processes, making this a critical foundational element of SPDM (Figure 3). This approach facilitates and even encourages multidisciplinary collaboration, with potentially significant accuracy and efficiency gains across a global product development organization.

Figure 3: Systems Engineering as the “connective tissue” for all PLM data

Requirement 2: Comprehensive/extensible, vendor-neutral, unified data model.

The authors contend that a comprehensive, highly extensible, vendor- and tool-neutral unified PLM data model is required for effective SPDM across the entire product organization, including the various product disciplines. This contention is borne out by the case studies in this paper.

Such a data model captures the following aspects of product data and must be open and extensible to meet the needs of end-users and other tool vendors (an illustrative sketch follows the list):

• Product intent, through Functional Requirements that cascade to every level of the product data and drive design decisions, supported by on-demand “right-fidelity” simulation and testing.
• System Architecture, captured by functional and logical models and by System Parameters that control the product design.
• Physical system models (associated with the functional and logical system models) that support the following:
  o Multiple representations per component, to support simulation at multiple levels of fidelity and physics using a wide array of tools
  o Product variants
  o Performance, cost and manufacturing metrics
• Operating conditions, including System Constraints.
• Test data (critical for the verification and validation of simulation models).

• Simulation data (associated with the physical system models), abstracted away from the data associated with the CAE tools used for the simulations. The simulation data model supports the following:
  o Engineering objects (component representations that capture engineering functional abstractions such as joints, welds, contact conditions, etc.)
  o Abstract modelling (supports ISA across a product family by capturing rules based on the functional architecture rather than on the geometry or topology of the product design)
  o Simulation automation process abstractions
  o Operating environments
  o Simulation results, including reports
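As a concrete illustration of the structure the list above implies, here is a minimal, hypothetical sketch of a tool-neutral data model in Python. The class and field names are invented for illustration and are not the Aras schema; the point is that requirements, system parameters, multiple per-component representations, operating conditions, and results live in one neutral model, with tool-specific files referenced rather than embedded.

from dataclasses import dataclass, field
from enum import Enum
from typing import Dict, List, Optional

class Fidelity(Enum):
    LUMPED_0D = "0-D"          # lumped-parameter / systems model
    REDUCED_1D = "1-D"
    FULL_3D = "3-D"            # CAD/mesh-based model

@dataclass
class Requirement:
    req_id: str
    text: str
    parent_id: Optional[str] = None          # requirements cascade down the product structure

@dataclass
class Representation:
    fidelity: Fidelity
    tool: str                                 # e.g. "Nastran" or "Thermal Desktop" (neutral reference)
    source_file: str                          # tool-specific artifact kept outside the neutral model

@dataclass
class Component:
    name: str
    requirement_ids: List[str] = field(default_factory=list)       # requirements the component satisfies
    representations: List[Representation] = field(default_factory=list)
    parameters: Dict[str, float] = field(default_factory=dict)     # System Parameters driving the design

@dataclass
class SimulationStudy:
    name: str
    components: List[Component]
    operating_conditions: Dict[str, float]
    results: Dict[str, float] = field(default_factory=dict)

# Example: one component carrying both a systems-level and a 3-D representation.
mirror = Component(
    name="primary_mirror",
    requirement_ids=["REQ-OPT-012"],
    representations=[
        Representation(Fidelity.LUMPED_0D, "systems model", "mirror_lumped.json"),
        Representation(Fidelity.FULL_3D, "Nastran", "mirror.bdf"),
    ],
    parameters={"thickness_mm": 12.0},
)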

The open PLM Platform must support the following functions for all the data:

• Configuration management with multi-BOM associativity for variants.
• Change management and revision control.
• IP protection and access control defined across roles, organizational groups and projects.
• Extensibility of the data model and processes without adverse consequences for upgradability to newer versions of the PLM Platform.

Such a data model supports collaboration across the organization, breaking down silos, supporting the seamless flow and consistency of data between disciplines, and providing traceability, up-to-date simulation data, and discussion forums for rapid and effective decision-making. These are what the authors define as the necessary requirements for effective SPDM.

The key business benefits of this fully-integrated, unified data model approach, from requirements to system models to simulation and predictive maintenance (the full Digital Thread), are:

• Nimble and effective product development processes, including the reduction of physical testing.
• Traceability over the entire Digital Thread.
• Reduction of product defects and warranty issues.
• Reuse of old designs to rapidly create validated variants and new designs.
• Foundational support for predictive maintenance procedures.
• Foundational support for the Model Based Systems Engineering approach across the enterprise.

4. Case Study 1: Analyzing the Structural Thermal Optical Performance (STOP) of optical systems – The Aerospace Corporation

Analyzing the Structural Thermal Optical Performance (STOP) of an optical system is a complex, multidisciplinary, manual, and inefficient process that is prone to human error. In this section, we review how optical system design teams at NASA and The Aerospace Corporation have used the Aras SPDM Workspace to rapidly perform STOP analyses, including parametric, “what-if” trade studies of their system designs and active thermal controls software. The case study compares the efficiency and robustness of this automated process to the prior manual process.

In the traditional design process, many engineering disciplines (mechanical, structures, thermal, optics, electronics, software) are needed to design and build modern Electro-Optical (EO) sensors. Separate design models are normally constructed by each discipline engineer using the CAD/CAE tools and material properties familiar to that discipline. Design and analysis are conducted largely in parallel, subject to requirements that have been levied on each discipline, and technical interaction between the different engineering disciplines is limited and infrequent. Design reviews are also conducted in a serial manner, by discipline, using PowerPoint snapshots of design and analysis status. Access to engineering results is largely limited to discipline specialists because of the education and experience needed to understand the technical issues, terminology, and computer tools of each discipline.

Using this traditional method, the discovery of sensor-level design issues tends to occur late in the design process, often after the hardware has already been built. A more collaborative environment with a unified view of all the multidisciplinary data is required.

The STOP Team at The Aerospace Corporation chose Aras SPDM (then called Comet Workspace) to conduct integrated Structural Thermal Optical Performance (STOP) calculations on complex, space-borne sensor systems [1], [2].

In this case study, the STOP Team was selected by a satellite team at Sandia National Laboratories to better understand the behavior of a space-borne optical system being tested in a TVAC (thermal vacuum) chamber. The STOP automation process automates the entire analysis, from CAD to thermal, structural, and optics calculations, using multiple tools at different levels of model fidelity and across the different physics domains, while enforcing the simulation rules encoded by the experts in the simulation template.

Following all the rules of the experts, the CAD model of the multi-lens system is automatically meshed by the automation process (Figure 4). The thermal, structural and optical calculations are also performed by the automation process, with all the files required for analysis being created automatically. This allows the engineers to modify various parameters, including geometric CAD parameters, materials, operating conditions, etc., and rerun the STOP calculation automatically, in a fraction of the time it used to take.
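A minimal, hypothetical sketch of this kind of parameterized re-run is shown below. The solver steps are stand-in stubs (in the real workflow each step drives an external CAD/CAE tool through the automation template), but the structure illustrates how a single parameter change propagates through the thermal, structural, and optical steps without manual data transfer.

from dataclasses import dataclass

@dataclass
class StopInputs:
    lens_spacing_mm: float      # geometric CAD parameter
    heater_power_w: float       # operating condition

def solve_thermal(inputs: StopInputs) -> float:
    # Stand-in for the transient thermal solve; returns a bulk temperature rise [degC].
    return 2.0 * inputs.heater_power_w

def solve_structural(delta_t_c: float) -> float:
    # Stand-in for the thermo-elastic distortion solve; returns a defocus value [mm].
    return 1.5e-3 * delta_t_c

def optical_performance(defocus_mm: float, lens_spacing_mm: float) -> float:
    # Stand-in for the ray-trace step; returns a scalar image-quality metric.
    return abs(defocus_mm) / lens_spacing_mm

def run_stop(inputs: StopInputs) -> float:
    """One automated STOP iteration: thermal -> structural -> optics."""
    delta_t_c = solve_thermal(inputs)
    defocus_mm = solve_structural(delta_t_c)
    return optical_performance(defocus_mm, inputs.lens_spacing_mm)

# Parametric "what-if" trade study: rerun the whole chain for each design variant.
for spacing in (24.5, 25.0, 25.5):
    metric = run_stop(StopInputs(lens_spacing_mm=spacing, heater_power_w=5.0))
    print(f"spacing={spacing} mm -> image-quality metric={metric:.2e}")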

Figure 4: CAD and meshes of the optical system – satisfying all mesh quality metrics

With the traditional approach, in which the data are transferred manually, a single iteration of the STOP analysis could take weeks to complete. With this new method, engineers could perform multiple analyses in a day, constrained only by the amount of time it takes to run the numerical simulations. Furthermore, the process is enforced consistently each time, the data are accurately transferred from tool to tool (the workspace deals with unit and coordinate-system transformations automatically), and all the manual, error-prone steps are removed.
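As an illustration of what that automatic reconciliation involves, the sketch below shows a hypothetical unit conversion and rigid coordinate transform applied to mesh node coordinates when handing data from one tool to another. The unit table, frames, and numbers are invented examples, not the workspace's internal mechanism.

import numpy as np

# Convert lengths between the unit systems of two tools, then map node
# coordinates from the source tool's frame into the target tool's frame.
LENGTH_TO_M = {"mm": 1e-3, "in": 0.0254, "m": 1.0}

def convert_lengths(values: np.ndarray, src_unit: str, dst_unit: str) -> np.ndarray:
    return values * (LENGTH_TO_M[src_unit] / LENGTH_TO_M[dst_unit])

def to_target_frame(points: np.ndarray, rotation: np.ndarray, origin_offset: np.ndarray) -> np.ndarray:
    # Rigid transform: rotate each point, then translate by the frame offset.
    return points @ rotation.T + origin_offset

# Example: nodes authored in inches in a CAD frame, delivered to a solver that
# expects millimetres in a frame rotated 90 degrees about Z.
nodes_in = np.array([[1.0, 0.0, 0.0], [0.0, 2.0, 0.0]])
rot_z_90 = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
nodes_mm = to_target_frame(convert_lengths(nodes_in, "in", "mm"), rot_z_90, np.zeros(3))
print(nodes_mm)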

In this case study, the Aerospace STOP Team was able to quickly validate the simulation models using test data (Figure 5).

Figure 5: Validation of simulation models using TVAC Chamber test data

The validated automation process was then used to run various simulations to better understand the behavior of this optical system. Thermal soak tests confirmed that the models were behaving accurately, and these were followed by more complex calculations, subjecting the optical system to a series of periodically varying thermal cycles designed to simulate the transient environment that the system would see in orbit. This introduced the need to add active thermal controls algorithms to the automation process to control the temperatures at various set points within the optical system.
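For illustration only, the sketch below shows a simple saturated proportional set-point heater controller embedded in a transient loop, standing in for the kind of active thermal control logic that was added to the automation process. The lumped plant model, gains, and orbital period are invented and do not represent the flight controls algorithm.

import math

def simulate_controlled_node(t_end_s: float, dt_s: float, setpoint_c: float):
    temp_c = 20.0                 # initial node temperature [degC]
    k_p = 2.0                     # proportional gain [W/degC]
    max_heater_w = 10.0           # heater saturation limit
    history = []
    for step in range(int(t_end_s / dt_s)):
        t = step * dt_s
        # Periodically varying sink temperature, standing in for the orbital thermal cycle.
        env_c = 10.0 + 15.0 * math.sin(2.0 * math.pi * t / 5400.0)
        # Saturated proportional control toward the set point.
        heater_w = min(max(k_p * (setpoint_c - temp_c), 0.0), max_heater_w)
        # Single lumped-parameter node: heater input plus linear coupling to the environment.
        temp_c += dt_s * (0.01 * heater_w + 0.001 * (env_c - temp_c))
        history.append((t, temp_c, heater_w))
    return history

t, temp, _ = simulate_controlled_node(t_end_s=10800.0, dt_s=10.0, setpoint_c=25.0)[-1]
print(f"t={t:.0f} s  T={temp:.2f} degC")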

These integrated STOP analyses, using validated models, provided many insights into the physical behavior of the system, which in turn allowed the engineers to program the controls algorithms that are needed to keep the system working correctly over a wide range of transient thermal operating conditions during orbit. The engineers concluded [1] that this simulation automation workspace was used successfully and efficiently to “validate an unconventional thermal controls approach for maintaining the focus of the visible channel of a flight payload over its expected thermal environment” (Figure 6).

Figure 6: Integrated STOP analysis validated focus control effectiveness

Traditional approaches would have taken much longer (weeks for a single iteration, compared to less than a day with the new approach) and would not have resulted in the accuracy, consistency and deep physical insights seen using the new automation platform and approach. The team concluded that “the savings in cost and schedule were substantial, given that six different integrated STOP analyses were required to complete the work.”

The unified data model allowed the multidisciplinary team to view, collaboratively and within a single environment, all the data associated with the system, regardless of the tools used to perform the calculations, promoting “systems thinking” among the various discipline experts. The expert in a given discipline had a complete view of the system and of the effect of changes on overall system performance. Dr. David Thomas, a systems engineer and the team leader, concluded that this was perhaps the most beneficial aspect of working within this integrated environment: breaking down the silos between the experts, tools and data, and giving the entire team a much better understanding of the trade-offs.

5. Case Study 2: Understanding the aberrant behavior of Directed-Energy Laser Systems – Air Force Research Labs

Directed-Energy laser weapons are complex precision optical systems being developed by the US Air Force. In this case study, the Air Force Research Laboratories wanted to simulate an early design that was being tested in the lab and was showing aberrant behavior, in order to better understand the behavior and sensitivities of the system.

Modelling laser weapon systems involves combined interactions among structures, thermal, and optical effects, including ray optics and wave optics, software controls, atmospheric effects, target interaction, computational fluid dynamics, and spatiotemporal interactions between the laser light and the medium. A variety of general-purpose commercial and special-purpose in-house tools and techniques have been developed to address different parts of this problem, but these tools are not integrated, and each requires its own experts to operate accurately. Furthermore, working with all these tools to perform a single simulation results in siloed, disparate mounds of data that must then be “integrated” by the engineers to extract real knowledge about the system being simulated.

The simulation is a mixed-fidelity, multi-physics simulation. For computational efficiency, it was necessary to combine lumped-parameter systems models for most of the laser system with 3-D models of certain subsystems that required higher fidelity, using co-simulation techniques. The goal was to automate the simulation process, significantly increasing the efficiency of the transient co-simulation trade studies. Aras SPDM was chosen to conduct the multi-fidelity calculations on the complex laser system [3].

The simulation automation platform, with its single unified data model able to capture component definitions across all the required levels of fidelity (from lumped-parameter systems models to 3-D CAD/mesh-based representations) and its existing connectors to the various math and finite element tools that were needed, was ideally suited to address these challenging modelling requirements. The team at Comet Solutions worked closely with TimeLike Systems to connect their wave-optics systems engineering tool, WaveTrain™, with the automation platform. The goal of the project was to demonstrate an effective mixed-fidelity Model-Based Systems Engineering (MBSE) environment for the analysis and design of laser weapons systems. If that demonstration succeeded, the team would then analyse the system that was producing aberrant behavior in the test laboratory, to better understand the physics so that a more robust design could be achieved.

The laser system was not able to maintain a high-power focused beam on the target. Instead, the maximum intensity on the target quickly degraded, defeating the purpose of the device (Figure 7).

Figure 7: Laser System Degradation of the Target Max Intensity

The engineers suspected that an optical element (MATRIX ASE, Figure 8) was heating up and deforming, resulting in the diffusion of the primary laser beam.

Figure 8: MATRIX ASE optical element

The suspected optical element needed to be simulated in 3-D to compute the structural deformations due to thermal gradients, while it was adequate to simulate the rest of the laser system using lumped-parameter systems models. The results of the simulation were validated by the test observations, predicting the diffusion of the primary beam on the target surface (Figure 9).

Figure 9: Target Beam Intensity map diffusing over time

The automated simulation process (Figure 10), automatically running multiple tools such as Thermal Desktop (transient thermal), Nastran (deformation), CODE V (ray optics) and WaveTrain (wave optics), was able to execute the co-simulation process efficiently and accurately, capturing the component representations at the required levels of fidelity and automatically generating all the input files required to run the various tools.

Figure 10: Automation template for analysing the laser system
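The following is a minimal, hypothetical sketch of the structure of such a transient co-simulation loop. Each step stands in for an external tool run (thermal, structural, ray optics, wave optics); the hand-off quantities and the toy models are invented for illustration and are not the actual automation template.

def thermal_step(absorbed_power_w: float, temp_c: float, dt_s: float) -> float:
    # Stand-in for the transient thermal solve on the heated optical element.
    return temp_c + dt_s * (0.02 * absorbed_power_w - 0.005 * (temp_c - 20.0))

def structural_step(temp_c: float) -> float:
    # Stand-in for the thermo-elastic deformation solve; returns surface sag [um].
    return 0.05 * (temp_c - 20.0)

def ray_optics_step(sag_um: float) -> float:
    # Stand-in for the ray trace; converts deformation into wavefront error [waves].
    return sag_um / 0.633

def wave_optics_step(wavefront_error_waves: float) -> float:
    # Stand-in for wave-optics propagation; returns normalized on-target intensity.
    return 1.0 / (1.0 + wavefront_error_waves ** 2)

temp_c, dt_s = 20.0, 0.5
for step in range(10):                      # transient co-simulation loop
    temp_c = thermal_step(absorbed_power_w=40.0, temp_c=temp_c, dt_s=dt_s)
    sag_um = structural_step(temp_c)
    wfe = ray_optics_step(sag_um)
    intensity = wave_optics_step(wfe)
    print(f"t={step * dt_s:4.1f} s  T={temp_c:6.2f} degC  intensity={intensity:.3f}")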

This new approach, with its unified data model and intelligent automation processes, proved to be effective, rapidly giving the engineers a clear understanding of the underlying physics driving the complex multidisciplinary laser system. These insights were required to create a more effective design of the laser system.

6. References

[1] Jason Geis, David Thomas, et al. (2011), Concurrent Engineering of an Infrared Telescope System, SPIE Proceedings, August 2011.

[2] Jason Geis, David Thomas, et al. (2009), Collaborative Design and Analysis of Electro-Optical Sensors, SPIE Proceedings, August 2009.

[3] Malcolm Panthaki, Steve Coy (2011), Model-Based Engineering for Laser Weapons Systems, SPIE Proceedings, August 2011.