
Multisensor Fusion and Integration
ENEL 417/517 Mechatronics – 2005

Introduction

• Multisensor fusion and integration refers to the synergistic combination of data from multiple sensors to provide more reliable and accurate information.
• Sensor data can be incomplete, erroneous and uncertain.

Three types of multisensor data fusion:

• Complementary Fusion:
   o E.g. fusion of several range sensors pointed in different directions.
   o Resolves incompleteness of sensor data.
• Competitive Fusion (a minimal sketch follows this list):
   o Fusion of uncertain sensor data from several sources, e.g. heading from odometry and a magnetic compass.
   o Reduces the effect of uncertain and erroneous measurements.
• Cooperative Fusion:
   o E.g. a touch sensor refines the estimated curvature of an object previously sensed by range sensors.
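As an illustration of competitive fusion, the sketch below combines odometry and compass headings by inverse-variance weighting; the variance values, and the weighting rule itself, are assumptions for this example rather than part of the lecture notes.

    import math

    def fuse_headings(theta_odo, var_odo, theta_mag, var_mag):
        # Weight each estimate by its inverse variance (more confident
        # sensors count for more), and average on the unit circle to
        # avoid wrap-around problems near +/-pi.
        w_odo, w_mag = 1.0 / var_odo, 1.0 / var_mag
        x = w_odo * math.cos(theta_odo) + w_mag * math.cos(theta_mag)
        y = w_odo * math.sin(theta_odo) + w_mag * math.sin(theta_mag)
        fused = math.atan2(y, x)
        fused_var = 1.0 / (w_odo + w_mag)   # linearised approximation
        return fused, fused_var

    # Drifting odometry (larger variance) vs. a magnetic compass.
    print(fuse_headings(math.radians(40.0), 0.09, math.radians(32.0), 0.03))

The fused heading lands closer to the compass reading because the compass is given the smaller variance; both sensors measure the same quantity, which is what makes the fusion competitive.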

Multisensor Integration and Multisensor Fusion

• Multisensor integration is the use of information from multiple sensors to assist a system in achieving its goal.
• Multisensor fusion is the combination of sensor information into one representational format during the integration process.

Architecture for a Multisensor Data Fusion System

• Fuse data from sensors of many different modalities.
• E.g. a mobile robot equipped with odometers, infrared measuring devices, acoustic devices, and cameras.

[Figure: Generic multisensor data fusion architecture]

The main characteristics:

• Data from each sensor is first converted to a common internal representation.
• The actual fusion of the data is performed in this common representation (see the sketch below).
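As a sketch of what such a conversion might look like (the sensors, the IR calibration constants, and the robot-frame point representation are all assumptions for illustration), both a sonar reading and an infrared reading can be registered as 2-D obstacle points:

    import math

    def sonar_to_point(range_m, beam_angle_rad):
        # A sonar range reading becomes an (x, y) obstacle point
        # in the robot frame -- the common representation here.
        return (range_m * math.cos(beam_angle_rad),
                range_m * math.sin(beam_angle_rad))

    def ir_to_point(adc_counts, beam_angle_rad):
        # A raw infrared ADC value is first converted to a range via
        # a made-up power-law calibration, then to the same (x, y) form.
        range_m = 27.0 * adc_counts ** -1.1
        return (range_m * math.cos(beam_angle_rad),
                range_m * math.sin(beam_angle_rad))

    # Both modalities now yield comparable points, so a single
    # fusion routine can consume either.
    points = [sonar_to_point(1.2, math.radians(15.0)),
              ir_to_point(512, math.radians(-10.0))]
    print(points)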

Low Level and High Level Fusion

• Low level fusion is often used for direct combination of sensory data, e.g. range sensors and odometry.
• High level fusion is used for indirect integration of sensory data in layered architectures through command arbitration, e.g. behaviour fusion.

Multisensor Integration

[Figure: Functional diagram of multisensor fusion and integration]

• The sensor model represents the uncertainty and error in the data from each sensor.

Integration with three different types of sensory processing:

• Fusion:
   o Sensor registration converts the sensor data to a common internal representation.
• Separate Operation:
   o Data provided by a sensor may be significantly different from that of the other sensors.
   o The sensor then influences the other sensors indirectly, via the system controller and the world model.
• Guiding or Cueing:
   o Data from one sensor is used to guide or cue the operation of the other sensors, e.g. tactile bump sensors and IR light sensors (see the sketch below).
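A hedged sketch of cueing under assumed sensor stubs (neither driver function is from the notes; both are hypothetical placeholders standing in for real hardware):

    def read_bump_sensors():
        # Hypothetical driver stub: bearing (degrees) of a triggered
        # bump sensor, or None if nothing was touched.
        return 30.0

    def ir_scan(bearing_deg, window_deg=20.0):
        # Hypothetical driver stub: a focused IR scan around a bearing,
        # returning canned range readings in metres.
        print("IR scan over %.0f..%.0f deg"
              % (bearing_deg - window_deg / 2, bearing_deg + window_deg / 2))
        return [0.18, 0.17, 0.19]

    # Cueing: the cheap, always-on bump sensor decides where the
    # slower IR sensor should look, instead of the IR scanning everywhere.
    hit = read_bump_sensors()
    if hit is not None:
        ranges = ir_scan(hit)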

• Sensor selection:
   o Used to select the most appropriate configuration of sensors to suit the environmental conditions.

Multisensor Fusion

The fusion of data from multiple sensors, or from a single sensor over time, can take place at different levels of representation:

• Signal level:
   o Real-time applications, time-sequential fusion; low level fusion.
• Pixel level:
   o Improves image processing tasks such as segmentation; low level fusion.
• Feature level:
   o Object recognition, feature extraction; mid level fusion.
• Symbol level:
   o Object recognition, evidential reasoning; high level fusion.

Example Implementation of a Fusion Algorithm for Mobile Robot Tracking

[Figure: Experimental setup for target tracking]

• The fusion algorithm has two major agents for local decisions:
   o Target-tracking agent (behaviour)
   o Collision-avoidance agent
• The final decision, calculated by fusing the two local decisions, is the driving velocity of the mobile robot (see the sketch below).
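A minimal sketch of such decision-level fusion, assuming a simple linear blend controlled by a "danger" level; the blending rule and all numbers are invented for illustration, not the algorithm from the experiment:

    def fuse_velocity(track_cmd, avoid_cmd, danger):
        # track_cmd and avoid_cmd are (v, omega) pairs in m/s and rad/s.
        # 'danger' in [0, 1] would come from proximity readings.
        w = max(0.0, min(1.0, danger))
        v = (1.0 - w) * track_cmd[0] + w * avoid_cmd[0]
        omega = (1.0 - w) * track_cmd[1] + w * avoid_cmd[1]
        return v, omega

    # Tracker wants full speed ahead; avoider wants to slow and turn.
    print(fuse_velocity((0.5, 0.0), (0.1, 0.8), danger=0.6))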

[Figure: Implementation of the target tracking system integrating visual detection and ultrasonic sensory data. Sensor inputs, after registration, pass through the signal & pixel level, the feature level, and the symbol level before output to the system controller.]

Multisensor Fusion Algorithms

Estimation methods (usage: signal level fusion):

• Non-recursive:
   o Weighted Average
   o Least Squares
• Recursive:
   o Kalman Filtering
   o Extended Kalman Filtering
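As an illustration of recursive estimation, below is a minimal one-dimensional Kalman filter that fuses a stream of noisy range readings over time; the noise parameters and data are assumed values:

    def kalman_1d(measurements, q=0.01, r=0.25, x=0.0, p=1.0):
        # q: process noise, r: measurement noise, x: state, p: variance.
        estimates = []
        for z in measurements:
            p += q                 # predict: uncertainty grows over time
            k = p / (p + r)        # gain: balance prediction vs. new data
            x += k * (z - x)       # update the estimate with the innovation
            p *= 1.0 - k           # uncertainty shrinks after the update
            estimates.append(x)
        return estimates

    print(kalman_1d([1.2, 0.9, 1.1, 1.0, 1.3]))

Promoting the state, models, and noise terms to matrices gives the standard multi-dimensional Kalman filter; linearising nonlinear models about the current estimate gives the extended Kalman filter.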

Classification methods (usage: extracting features & matching at pixel and feature level fusion):

• Parametric Templates
   o Match extracted features to classes in a multidimensional feature space
• Cluster Analysis
   o Similar to SOFM
   o Learns geometrical relationships between sample data sets
• Learning Vector Quantization (LVQ)
   o Another type of NN
• K-means Clustering (a minimal sketch follows this list)
   o Competitive NN
• Kohonen Feature Map (SOFM)
• ART, ARTMAP, Fuzzy-ART Networks
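A small k-means clustering sketch at the feature level, grouping 2-D feature vectors into k clusters; the data points are made up:

    import random

    def kmeans(points, k=2, iters=20):
        centres = random.sample(points, k)
        for _ in range(iters):
            # Assignment step: each point joins its nearest centre.
            groups = [[] for _ in range(k)]
            for px, py in points:
                j = min(range(k),
                        key=lambda i: (px - centres[i][0]) ** 2 +
                                      (py - centres[i][1]) ** 2)
                groups[j].append((px, py))
            # Update step: each centre moves to the mean of its group.
            for j, g in enumerate(groups):
                if g:
                    centres[j] = (sum(p[0] for p in g) / len(g),
                                  sum(p[1] for p in g) / len(g))
        return centres

    features = [(0.1, 0.2), (0.0, 0.1), (0.9, 1.0), (1.1, 0.8)]
    print(kmeans(features))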

Inference methods (usage: symbol level fusion – evidential reasoning):

• Bayesian Inference (a minimal sketch follows this list)
   o Information is combined according to the rules of probability theory
   o Bayes' formula
• Dempster-Shafer Method
   o Rectifies some instances where probabilities may become unstable in Bayesian inference
• Generalised Evidence Processing
   o Unifies the Bayesian and Dempster-Shafer methods
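A minimal sketch of Bayesian inference for fusing repeated sensor reports about whether a map cell is occupied; the likelihood values are assumed sensor characteristics:

    def bayes_update(prior, p_hit_given_occ=0.9, p_hit_given_free=0.2):
        # Bayes' formula for one "hit" report:
        # P(occ | hit) = P(hit | occ) P(occ) / P(hit)
        num = p_hit_given_occ * prior
        return num / (num + p_hit_given_free * (1.0 - prior))

    belief = 0.5                 # uninformed prior
    for _ in range(3):           # three independent "hit" reports
        belief = bayes_update(belief)
    print(round(belief, 3))      # belief in occupancy approaches 1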

Artificial intelligence methods (usage: can be used at different levels of fusion):

• Expert System
   o Performs inferences using a data set and a rule-based knowledge base
• Neural Networks
   o Adaptive
   o Backpropagation
• Fuzzy Logic (a minimal sketch follows this list)
   o Multiple-valued logic where variables are assigned degrees of membership between 0 and 1
   o “maybe” exists between “yes” and “no”
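A minimal fuzzy-logic sketch: a distance reading receives graded memberships in "near" and "far" rather than a hard yes/no, and two speed rules are blended accordingly; the breakpoints and speeds are assumed values:

    def mu_near(d):
        # Degree of membership of distance d (metres) in "near".
        if d <= 0.5:
            return 1.0
        if d >= 2.0:
            return 0.0
        return (2.0 - d) / 1.5   # linear ramp between the breakpoints

    def mu_far(d):
        return 1.0 - mu_near(d)

    d = 1.1                      # a "maybe near" reading
    # Weighted-average defuzzification of two speed rules:
    # IF near THEN slow (0.1 m/s); IF far THEN fast (0.6 m/s).
    speed = mu_near(d) * 0.1 + mu_far(d) * 0.6
    print(round(speed, 2))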

Applications in Robotics

A Voting Scheme for Off-road Navigation

The Distributed Architecture for Mobile Navigation (DAMN) is a behaviour-based system for off-road navigation.

[Figure: Behaviours sending votes to the arbitrator in the DAMN architecture]

Two levels of fusion are used in this architecture. Each module (behaviour) has its own sensory suite and uses low level fusion algorithms on its data. High level fusion is performed by a common arbitrator that carries out command fusion by weighting the decision (vote) of each behaviour.
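A sketch in the spirit of DAMN's command fusion (the behaviours, vote functions and weights are all invented for illustration): each behaviour votes over a set of candidate turn commands and the arbitrator selects the best weighted sum:

    CANDIDATE_TURNS = [-0.4, -0.2, 0.0, 0.2, 0.4]    # rad/s

    def seek_goal(turn):
        # Votes highest for turns toward the goal direction (0.2 rad/s).
        return 1.0 - abs(turn - 0.2)

    def avoid_obstacles(turn):
        # Votes highest for turning away from an obstacle on the right.
        return 1.0 - abs(turn + 0.3)

    BEHAVIOURS = [(seek_goal, 0.4), (avoid_obstacles, 0.6)]  # (vote fn, weight)

    def arbitrate():
        # Command fusion: pick the candidate with the best weighted vote sum.
        return max(CANDIDATE_TURNS,
                   key=lambda t: sum(w * vote(t) for vote, w in BEHAVIOURS))

    print(arbitrate())   # a compromise turn command for the robot

Voting over a discrete command set lets the behaviours disagree gracefully: no single behaviour dictates the command, yet none is ignored.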

Hierarchical Neural Network for Mobile Robot Control

[Figure: A hierarchical neural network for a mobile robot]

• The reason network translates sensory inputs into behaviours, e.g. “move forward when the infrared sensor on the head detects light.”

• The instinct network controls the behaviour patterns the robot should carry out over time, e.g. repetition of behaviour cycles.
• Interpolating is dangerous:
   o Neural networks must be trained properly to perform fusion.

The Kalman Filter and its Application in Sensor Fusion

See the sources below: “An Introduction to the Kalman Filter” (Welch and Bishop) and “Development of Multiple Sensor Fusion Experiments for Mechatronics Education” (Song and Chen).

Sources

R. C. Luo, C. Yih and K. L. Su, “Multisensor Fusion and Integration: Approaches, Applications, and Future Research Directions”, IEEE Sensors Journal, vol. 2, no. 2, pp. 107-119, 2002.

M. Kam, X. Zhu and P. Kalata, “Sensor Fusion for Mobile Robot Navigation”, Proceedings of the IEEE, vol. 85, no. 1, pp. 108-119, 1997.

T. Bak, “Lecture Notes - Estimation and Sensor Information Fusion”, Aalborg University, http://www.control.auc.dk/~tb/Teaching/Courses/Estimation/sensfusion.pdf, 2000.

D. Langer, J. K. Rosenblatt and M. Hebert, “A behavior-based system for off-road navigation”, International Journal of Robotics Research, vol. 10, no. 6, pp. 776-782, 1994.

S. Nagata, M. Sekiguchi and K. Asakawa, “Mobile robot control by a structured hierarchical neural network”, IEEE Control Systems Magazine, vol. 10, no. 3, pp. 69-76, 1990.

L. Reznik, “Fuzzy Controllers”, Newnes (Butterworth-Heinemann), Oxford, Great Britain, 1997.

“Chapter 9: Sensor Data Fusion”, course notes for Organisation and Design of Autonomous Systems, University of Amsterdam, http://www.science.uva.nl/~arnoud/education/OOAS/fwi/Chap9r.pdf.

G. Welch and G. Bishop, “An Introduction to the Kalman Filter”, University of North Carolina, http://www.cs.unc.edu/~welch/kalman/kalmanIntro.html, 2004.

K. Song and Y. Chen, “Development of Multiple Sensor Fusion Experiments for Mechatronics Education”, Proceedings of the National Science Council ROC, vol. 9, no. 2, pp. 56-64, 1999.

R. Siegwart and I. Nourbakhsh, “Introduction to Autonomous Mobile Robots”, The MIT Press, Massachusetts, USA, 2004.