Smiths Aerospace
www.smiths-aerospace.com © 2006 by Smiths Aerospace: Proprietary Data
CAA HUMS Research Project Meeting
16th March 2006
Presentation by: Brian Larder, Rob Callan, Lorry Stoate
Agenda
1. Minutes of 20 October meeting (accuracy and actions)
2. Overview of activities since last meeting
3. Anomaly detection process development – model information extraction
4. Presentation and demonstration of the results of the off-line data analysis (phase 1)
5. Review of live trial system development
6. Additional fault data?
7. End of phase 1 report
8. Planning of live trial (phase 2)
9. HUMS for rotor systems
10. AOB
11. Date of next meeting
Overview of activities since last meeting
We have produced a software tool to manage the configuration for generating anomaly models
• Specifying the data source, model storage, model parameters, input indicators, etc. was very tedious and was holding back efficient progress
• The tool allows the analyst to define a set of tasks that are then stored and executed as a batch process
• Tasks can also be placed on different available machines, which is important because each task takes a lot of CPU time
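As an illustration only, a minimal sketch of how such a task batch might be defined and spread across machines. All names, fields and the round-robin scheme here are hypothetical, not the actual tool:

```python
from dataclasses import dataclass, field

# Hypothetical task definition: the fields mirror the configuration items
# listed above (data source, model storage, parameters, input indicators).
@dataclass
class ModelTask:
    data_source: str            # where the indicator data is read from
    model_store: str            # where the trained anomaly model is written
    indicators: list            # input condition indicators for this model
    params: dict = field(default_factory=dict)  # model-specific parameters

def assign_round_robin(tasks, machines):
    """Spread CPU-heavy tasks over the available machines."""
    return {m: tasks[i::len(machines)] for i, m in enumerate(machines)}

tasks = [
    ModelTask("hums_db", "models/main_gear", ["FSA_SO1", "FSA_GE22"]),
    ModelTask("hums_db", "models/tail_gear", ["FSA_SON", "FSA_MS_2"]),
    ModelTask("hums_db", "models/bevel_pinion", ["FSA_GE22"]),
]
batch = assign_round_robin(tasks, ["machine_a", "machine_b"])
# machine_a receives tasks 0 and 2; machine_b receives task 1
```

The batch can then be stored and executed unattended, which is the point of the tool: the analyst defines tasks once rather than re-entering the configuration for every model run.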
Model building
• Models have been built for all main, tail, intermediate and accessory gears
• Fitness score predictions have been produced for all data
• A lot of analysis has been undertaken to review the models (see later sections)
Alerting threshold
• The strategy for setting a threshold for anomaly alerts has been studied in some detail
• We have implemented an alerting strategy, but this will need reviewing based on feedback
Development of the live trial system
• To be demonstrated later
Anomaly detection process development – model information extraction
There are three key elements to anomaly detection
• Data pre-processing
• Used to emphasise data characteristics that are considered important, such as trends
• This can be difficult but it is a critical stage
• Pre-processing can be considered as tagging the data as interesting or not interesting
• Anomaly modelling
• Main function is to highlight data of interest and to emphasise its significance
• For HUMS data, model training needs to be robust to anomalous training data
• In other words, the training data will contain anomalies due to the sensor and general instrumentation problems that occur in practice
• Alerting strategy
• Defining the events that lead to flagging the operator’s attention
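The three elements above can be sketched end to end. The function bodies below are deliberately simplistic stand-ins (a first difference for trend pre-processing, a distance-based fitness score), not the actual ProDAPS processing:

```python
# Illustrative three-stage structure: pre-processing -> anomaly model -> alerting.
def preprocess(raw_series):
    """Emphasise trends: a simple first difference as a stand-in."""
    return [b - a for a, b in zip(raw_series, raw_series[1:])]

def fitness_score(model_mean, model_std, value):
    """Toy anomaly model: distance from a fleet model in standard deviations."""
    return abs(value - model_mean) / model_std

def alert(scores, threshold):
    """Alerting strategy: flag the indices whose score exceeds the threshold."""
    return [i for i, s in enumerate(scores) if s > threshold]

series = [1.0, 1.1, 1.0, 1.2, 3.5]     # last point steps up sharply
trend = preprocess(series)
scores = [fitness_score(0.1, 0.2, t) for t in trend]
print(alert(scores, threshold=3.0))    # prints [3] – only the step change is flagged
```

Even in this toy form, the separation matters: the pre-processing decides what counts as interesting, the model quantifies significance, and the alerting stage decides when the operator's attention is flagged.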
Alerting strategy
• The predicted fitness scores from the training data have been used to generate alerting thresholds (one per model)
• The threshold is not based on a mean value plus so many standard deviations
• The fitness scores do not follow a normal distribution
• The strategy has been to examine the fitness scores using a statistical plot that can assist with identifying different regions of density
• The idea is to set the threshold at a point where there is a change in density
• The point should also lie towards the end of the cumulative distribution (i.e. such that the majority of data will not be in alert)
• Whilst the modelling is robust to anomalies in the training data, the threshold will be affected by the quality of the training data
• This strategy will be used in the live trial but we need the live trial experience to refine it
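One way to realise this idea, as an illustrative sketch rather than the project's actual method: histogram the training fitness scores, scan the low-fitness tail for a sharp rise in bin density, and fall back to a fixed percentile if no clear change is found. The bin count, rise factor, and tail fraction below are arbitrary illustrative choices:

```python
import random

def density_change_threshold(scores, bins=20, rise=4.0, tail_fraction=0.1):
    """Place the alert threshold at a density change in the low-fitness tail."""
    lo, hi = min(scores), max(scores)
    width = (hi - lo) / bins
    counts = [0] * bins
    for s in scores:
        counts[min(int((s - lo) / width), bins - 1)] += 1
    # Only consider the tail of the cumulative distribution, so that the
    # majority of data will not be in alert.
    limit = sorted(scores)[int(len(scores) * tail_fraction)]
    for i in range(1, bins):
        edge = lo + i * width
        if edge > limit:
            break
        if counts[i] > rise * max(counts[i - 1], 1):
            return edge          # density rises sharply: threshold here
    return limit                 # no clear change: fall back to a percentile

random.seed(0)
fleet = [random.gauss(0.8, 0.05) for _ in range(1000)]    # bulk of the fleet
outliers = [random.uniform(0.0, 0.4) for _ in range(30)]  # sparse low scores
t = density_change_threshold(fleet + outliers)
```

Note the sketch inherits the limitation stated above: the resulting threshold depends directly on the quality of the training scores, which is why live-trial feedback is needed to refine it.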
Presentation and demonstration of the results of the off-line data analysis (phase 1)
Model Assessment
The anomaly modelling is the most complex element of the three-stage process
• The technology is independent of the other two stages of pre-processing and alerting
• In our view it should provide more than a mechanism to flag anomalous data
• It should provide an insight into the ‘large data’ picture
We have addressed the following questions
• Does it do what it is designed to do?
• What benefits does it bring?
• The answers to these questions will be illustrated throughout today’s presentation
• It will become apparent that HUMS will clearly benefit from the application of the anomaly processing
• In our view, there is a proven need for this type of technology
The overall objective for the anomaly detection technology can be summarized by the following requirement
• Anomaly models will be built from data that contain an undefined percentage of anomalies. The models should recognise outlying training data. The models' response to outliers should be adaptive via configurable tuning – humans have to attach value to the anomalous information provided (i.e., the ability to configure the alerting rate and trade off false positives against costs)
What has been done to test the success of this objective (and to answer the questions from the previous slide)?
• Models have been built for all 35 shafts (main, tail, intermediate and accessories)
• Resulted in 140 models
• Absolute and trend versions of two model types (8 indicators and M6): 35 × 2 × 2 = 140
• Fitness predictions have been produced against all of these models for
• Training, validation and current data
• The training data approximates to about half of the data available
• For example, the training data contained 60 main gearboxes but in total there are now 115
• The validation data contains those gearboxes whose data were acquired before April 2004 and were not used for training
• The current data consists of the current fleet and there will be some overlap with the validation data (where current components were in service before April 2004)
• The models have produced a huge amount of information and it is impossible to analyze all of this in the time available
• A selection of models has been explored in detail
• We have directed most of our attention at the current fleet
Before proceeding with a review of the findings we need to keep in mind a few points
• The models have been built in a one-off process
• There has been no tuning but we have always stated there will be a need for this following the review and during the live trial
• Initially we were struck by the quantity of low fitness scores being produced by the models and thought the model adaptation was probably too aggressive
• In some models this is true (hence the need for the ability to re-tune) but in most cases the modeling is pointing to genuinely outlying data – there happens to be a lot of it!
• We can only re-tune once we have discussed some representative cases and established an initial policy of how we want the system to respond
• Some low fitness scores are misleading
• This is not a modeling issue and it originates in the trend data pre-processing where there has been a step change because of maintenance to correct high vibration
Does the anomaly modeling fundamentally work?
• The ProDAPS modeling diagnostic tool has been used to explore model content and features
• Anomalous fitness scores have been reviewed against the condition indicators (mainly covered in a later section)
Model diagnostics
• The model diagnostics tool extracts a range of information and we shall see some of this
• We have taken all of the 8 indicator models for the main gearbox and extracted what we call coverage statistics
• Basically we ask the model for a fleet view on all of the indicators
• Sometimes the coverage shows that the tuning is too aggressive, but often it illustrates that the modelling does a far better job of representing the data than traditional parametric statistics can achieve
• An example follows
The top chart shows a plot of the absolute values for FSA_SO1 from the training data on the Stbd Aft Fw
• The bars show a histogram of the data and, whilst the bulk of the data approximates a normal distribution, it contains a very long tail
• The red line is the distribution computed from the data
• A single mode fleet distribution is a poor model of the data
• This effect is repeated over many indicators and models
The histogram shows what looks to be a central distribution that is extended with a lot of outlying data
• The chart at the bottom shows the output of a ProDAPS cluster model
• This illustrates the impact of the outlying data
• ProDAPS indicated that it would take on the order of 17 distributions to provide a good model of this data
The anomaly models are built using 8 indicators
• Ideally we would like the model to target the outlying data and find the distribution that characterises the main fleet
• This is a very difficult challenge but it is the core objective we have set for the anomaly modelling
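A toy numerical illustration (not ProDAPS output) of why a single-mode fleet statistic fails here: a long tail of outliers inflates the fleet-wide standard deviation, so genuinely unusual values in the central mode stop looking significant. The 950/50 split and the distributions are invented for the example:

```python
import random
import statistics

random.seed(1)
bulk = [random.gauss(1.0, 0.1) for _ in range(950)]    # main fleet mode
tail = [random.uniform(2.0, 10.0) for _ in range(50)]  # outlying data
data = bulk + tail

naive_sd = statistics.pstdev(data)     # single-mode fleet statistic
central = sorted(data)[:950]           # crude stand-in for targeting outliers
robust_sd = statistics.pstdev(central) # spread of the central fleet mode

# naive_sd is several times larger than robust_sd: against the single-mode
# fleet statistics, an unusual bulk value would sit well within one "sigma".
```

This is the sense in which the anomaly model must "target the outlying data": only after the tail is separated does the distribution characterising the main fleet become usable for significance judgements.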
The chart on the left is the fleet distribution that has been identified by the model
• It is clearly much more representative of the fleet data
• It has clearly targeted outlying data
The diagnostic tool can also take the fleet statistics and produce what we call a global fit
• For a component, the global fitness scores can look similar to the anomaly fitness scores but often they will contradict each other
• The global fits in conjunction with the anomaly fits can be very informative
We shall illustrate the global fit for FSA_SO1 on the Stbd Aft Fw and show what is driving the outliers
Revisit the Bevel Pinion fault case
The global fitness on the Bevel Pinion case reveals some interesting information
• If we examine the absolute indicators for all data
• FSA_GE22 is the highest for this fault component
• FSA_SON is suppressed
• FSA_MS_2 does not look significant
• The diagnostics show that, although FSA_GE22 reaches the highest value of all data seen to date, it is almost impossible to attach any significance to the trend when doing a simple fleet comparison
• On the other hand
• The anomaly model shows the significance on both the absolute and trend models
• In terms of the absolute model
• FSA_MS_2 and FSA_GE22 are significant (see next slide)
• In terms of the trend model
• FSA_GE22 and FSA_SON are significant
• The ProDAPS model diagnostics can indicate which indicators are driving the low fitness
• The plot to the right shows that FSA_MS_2 is more significant than FSA_GE22 in terms of absolute values
• This is lost in the fleet statistics
Review of selected cases based on current fleet data
Fan 15 blades
[Diagram: accelerometer locations – n°8 (12RK1), n°10 (12RK3), n°14 (12RK5), n°15 (18RK1), n°16 (18RK2), n°12 (14RK2), n°11 (14RK1), n°13 (12RK4)]
Recommendations for future research
The activities of research and development often highlight the need for further work
Pre-processing
• There is a great deal of variability in HUMS data (e.g. due to various maintenance actions)
• This variability could mask important trends because the trend pre-processing, as it currently exists, performs a form of differencing and does not distinguish between a developing trend, a step change, or the occurrence of noise in the data
Model tuning
• The application of this technology is very new and we need the trial experience to refine the models
Probabilistic alerting policy
• It is unrealistic to set a hard threshold to demarcate the interesting from the non-interesting
• We need a point at which an alert is triggered, but we also need a type of anomaly index
• There could be a tendency to interpret fitness scores in a manner similar to reading a linear temperature scale, but the distribution is not linear
• A probabilistic measure is more appropriate
• A probabilistic measure would also normalise the anomalies and assist with reasoning across shafts
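Under the simplest possible interpretation of such an index, the raw fitness score is replaced by the empirical probability of seeing a score at least that low in the training data. This is only a sketch of the concept; the shaft data below are invented:

```python
from bisect import bisect_right

def anomaly_index(training_scores, score):
    """Fraction of training fitness scores <= this score (lower = rarer)."""
    ref = sorted(training_scores)
    return bisect_right(ref, score) / len(ref)

# Two shafts whose fitness scores live on different scales:
shaft_a = [0.9, 0.85, 0.95, 0.8, 0.92, 0.88, 0.91, 0.87, 0.9, 0.1]
shaft_b = [0.4, 0.42, 0.45, 0.48, 0.5, 0.52, 0.55, 0.58, 0.6, 0.05]

# The same raw score 0.45 reads identically on a "linear" scale, but its
# rarity differs by shaft once converted to an empirical probability:
print(anomaly_index(shaft_a, 0.45))  # prints 0.1 – rare for shaft A
print(anomaly_index(shaft_b, 0.45))  # prints 0.4 – ordinary for shaft B
```

Because the index is a probability, it is directly comparable across shafts, which is what makes cross-shaft reasoning (next section) tractable.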
Data mine the features of anomalous trends to test theoretical models
• ProDAPS diagnostics can point to the indicators that are driving trends
• It would be informative to mine these features to test established diagnostic knowledge and develop this
Reasoning
• More directed information could be provided by reasoning with anomaly outputs
• Fuse information across shafts to identify instrumentation issues
• Reason about the nature of the anomaly – trends vs high variability vs step changes etc
• Reason about the indicators driving the trend to provide more detail on the significance of anomalies
• Case-based reasoning to search for any similar previous cases
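The instrumentation-fusion idea can be sketched as a simple rule, offered here as an assumption about how such reasoning might look rather than the project's design: if several shafts monitored via the same accelerometer go anomalous together, suspect the instrumentation rather than simultaneous mechanical faults. All identifiers below are invented:

```python
def classify_alerts(alerts_by_shaft, sensor_of_shaft, sensor_quorum=3):
    """Return sensors with at least sensor_quorum alerting shafts."""
    by_sensor = {}
    for shaft, alerted in alerts_by_shaft.items():
        if alerted:
            by_sensor.setdefault(sensor_of_shaft[shaft], []).append(shaft)
    return {s: shafts for s, shafts in by_sensor.items()
            if len(shafts) >= sensor_quorum}

sensors = {"main_1": "acc8", "main_2": "acc8", "main_3": "acc8",
           "tail_1": "acc10", "tail_2": "acc10"}
alerts = {"main_1": True, "main_2": True, "main_3": True,
          "tail_1": True, "tail_2": False}
suspect = classify_alerts(alerts, sensors)
# acc8 has three alerting shafts -> likely an instrumentation issue;
# acc10 has only one -> treat tail_1 as a possible component anomaly
```

Reasoning of this kind turns a list of raw alerts into directed information: which alerts to investigate as faults, and which to hand to maintenance as sensor problems.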
Review of live trial system development
Web-based anomaly detection system architecture
The anomaly detection system will operate as a secure web server, located at Smiths in Southampton.
• HUMS data is automatically transferred overnight from Bristow’s Web Portal to Southampton (Working).
• The HUMS data is imported into the anomaly detection system’s data warehouse and analysed overnight (Working).
• Bristow have a remote login to the system to view results at any time via a web browser (Tested).
[Architecture diagram: HUMS data on the Web Portal (BHL, Aberdeen) is transferred over the web to the HUMS data warehouse (Smiths, Southampton); results are viewed from the BHL HUMS type engineer's PC and other BHL engineers' PCs]
Summary of major activities and findings
Live trial system development…
Activities
• Development of software components for the live trial system
• Overnight data processing
• Complete data flow (including alerting) – has been working daily since 16th February 2006
• Management of documentary data (gearbox changes, recording of trial findings, etc.) – still to be implemented
• User interface (UI)
• Basic navigation is in place (drill down)
• An annotation strategy is demonstrable
• Acknowledging alerts is demonstrable
• A historical search strategy has been identified
• Speed of connection and practicality tested with Bristow
Findings
• We have addressed all potential areas of risk in the data processing and UI
• We still have some features of the UI to complete for the live trial
Additional fault data?
Sources of fault data for system testing
BHL AS332L IHUMS data
• Large quantity of historical data.
• With BHL investigation of IHUMS indicator trends related to anomaly detection results, it is possible to perform a good assessment of anomaly detection capabilities using the existing data set.
CHC Scotia AS332L IHUMS data
• Data for cracked AS332L bevel pinion.
WHL AS332L MGB data from CAA seeded defect test programme
• Documentary information on this received from WHL.
• Limited analysis coverage, but some fault related trends could be obtained.
• Difficulties obtaining data from WHL.
End of phase 1 report
Planning of live trial (phase 2)
Plan for 6 month trial
Tasks to be completed prior to start of 6-month trial
• Completion of user interface software
• Model tuning / Data re-modelling
• Definition of BHL operational procedures
• Daily review of anomaly detection
• Feedback from follow-up of review
• Reporting of faults, component changes and maintenance actions
• System problem reporting
• Definition of Smiths support procedures
• Database update following component changes
• Investigation of reported system problems and anomaly model behaviour
• Configuration control for any model tuning
Trial start date
• Forecast for mid-May
Trial performance assessment
• Project review meeting after first month of operations?
Decision on trial extension
HUMS for rotor systems
Contract amendment received from CAA on 18 January
Obtaining university support for the study
• Selection of the University of Glasgow (including the rotorcraft group from Imperial College London, led by Dr Richard Brown)
• Arrangement of Confidentiality Agreement
• Agreement of SOW (focus on literature review, mathematical modelling and laboratory test facilities)
• Signing of a Research Agreement
Project start
• Planned start date: 3rd April
• Planned Smiths resources
• BDL: Project management and liaison with UoG
• SS: Accident data review
• REC: Application of machine learning techniques
• DLG: Mathematician support
AOB and date of next meeting