
Master Thesis Electrical Engineering August 2012


Automatic Waterjet Positioning Vision System

Damian Dziak Bartosz Jachimczyk

Tomasz Jagusiak

This thesis is presented as part of Degree of

Master of Science in Electrical Engineering

Blekinge Institute of Technology

August 2012

Blekinge Institute of Technology School of Engineering Department of Signal Processing Supervisors Prof. Wlodek J. Kulesza, Dr. Anders Jönsson Examiner: Dr. Sven Johansson


This thesis is submitted to the School of Engineering at Blekinge Institute of Technology in partial fulfillment of the requirements for the degree of Master of Science in Electrical Engineering with Emphasis on Signal Processing. The thesis is equivalent to 20 weeks of full-time studies.

Contact Information:

Authors:
Damian Dziak, E-mail: [email protected]
Bartosz Jachimczyk, E-mail: [email protected]
Tomasz Jagusiak, E-mail: [email protected]

Advisors:
Prof. Wlodek J. Kulesza
School of Engineering, Blekinge Institute of Technology
Address: SE-371 79 Karlskrona, Sweden
Phone: +46 455 385898
Email: [email protected]

Dr. Anders Jönsson
School of Engineering, Blekinge Institute of Technology
Address: SE-371 79 Karlskrona, Sweden
Phone: +46 455 385582
Email: [email protected]

Dr. Johan Wall
School of Engineering, Blekinge Institute of Technology
Address: SE-371 79 Karlskrona, Sweden
Phone: +46 455 385509
Email: [email protected]


Abstract

The goals of this work are the design and implementation of a new vision system integrated with a waterjet machine. The system combines two commercial webcams mounted on a dedicated industrial platform. The main purpose of the vision system is to detect the position and rotation of a workpiece placed on the machine table. The object recognition algorithm consists of edge detection, standard mathematical processing functions and noise filters. The Hough transform technique is used to extract the lines of a workpiece and their intersections. A metric rectification method is used to obtain a top view of the workspace and to align the image coordinate system with the waterjet machine coordinates. In situ calibration procedures for both webcams are developed and implemented. Experimental results of the proposed vision system prototype confirm the required performance and precision of element detection.

Index Terms—Edge Detection, Hough Transformation, Object Detection, Vision System, Waterjet Machine, Web Camera.


Acknowledgment

This thesis was carried out at the Department of Electrical Engineering, School of Engineering, Blekinge Institute of Technology, Karlskrona, Sweden, under the supervision of Professor Wlodek Kulesza.

We would like to thank Professor Wlodek Kulesza for his support and guidance throughout the work.

We would like to thank Johan Fredin for the measurements performed at Finecut AB. We would like to express our appreciation to the staff at the Swedish Waterjet Lab. We are grateful to Dr. Anders Jönsson and Dr. Johan Wall for their support and technical supervision. Special personal thanks go to Anna Harding, Project Assistant, who always found the time to help us.


Table of Contents

ABSTRACT
ACKNOWLEDGMENT
TABLE OF CONTENTS
LIST OF FIGURES
LIST OF TABLES
LIST OF ABBREVIATIONS
1 INTRODUCTION
2 SURVEY OF RELATED WORKS
3 PROBLEM STATEMENT AND MAIN CONTRIBUTION
4 POSITIONING VISION SYSTEM MODEL
  4.1 Global camera calibration
  4.2 Background subtraction
  4.3 Noise filtering and edge detection
  4.4 Line and intersection estimation using Hough transformation
  4.5 Rectification and preliminary corner detection
  4.6 Local camera calibration
  4.7 Estimation of initial corner coordinates and workpiece angular deflection
  4.8 System boundaries and limitations
5 IMPLEMENTATION
  5.1 Hardware implementation
  5.2 Software implementation
    5.2.1 Global camera calibration
    5.2.2 Image processing
    5.2.3 Preliminary corner detection
    5.2.4 Local camera calibration
    5.2.5 Identification of the workpiece initial corner and its angular deflection
6 METHOD VALIDATION AND ACCURACY ANALYSIS
  6.1 Accuracy of element detection using GC
  6.2 Accuracy of element corner detection by LC
  6.3 Estimation of angular deflection uncertainty
7 CONCLUSION
8 FUTURE WORK
REFERENCES
APPENDIX A: A view of the global camera mounting plate
APPENDIX B: A view of the local camera mounting plate
APPENDIX C: Table of corner detection uncertainty from the first trial
APPENDIX D: Table of corner detection uncertainty from the second trial
APPENDIX E: Table of angular deflection uncertainty


List of Figures

Fig. 3.1. Graphical interpretation of the assumed angular deflection uncertainty requirement.
Fig. 3.2. PVS algorithm block diagram.
Fig. 4.1. Model of the WJ machine workspace with the machine coordinate system.
Fig. 4.2. Illustration of GC image coordinates (X', Y') and the machine coordinate system (X, Y, Z).
Fig. 4.3. (a) Background image, (b) image with workpiece, (c) result of subtraction.
Fig. 4.4. Noise removal and edge detection: (a) input image, (b) noise removed, (c) detected edges.
Fig. 4.5. A point represented in (a) the Cartesian coordinate system, (b) two-dimensional Hough space, (c) three-dimensional Hough space.
Fig. 4.6. A line represented in (a) the Cartesian coordinate system, (b) two-dimensional Hough space, (c) three-dimensional Hough space.
Fig. 4.7. Two lines represented in (a) the Cartesian coordinate system, (b) two-dimensional Hough space, (c) three-dimensional Hough space.
Fig. 4.8. Graphical interpretation of the ρ, θ parameters.
Fig. 4.9. (a) Real and initial camera locations, (b) perspective from the real camera's position, (c) perspective from the virtual initial camera position.
Fig. 4.10. Graphical interpretation of the estimation of initial corner coordinates.
Fig. 4.11. Usable area (red frame) in the workspace (all measures in mm).
Fig. 4.12. Illustration of the relationship between the FoV and the camera height.
Fig. 5.1. WJ machine layout and main components used in the vision system: (a) GC with mounting plate, (b) LC with mounting plate, (c) calibration marker.
Fig. 5.2. PVS connection diagram with the WJ machine.
Fig. 5.3. Location of the LC in relation to the WJ machine nozzle.
Fig. 5.4. Picture of the three test elements.
Fig. 5.5. PVS structural model, GC part.
Fig. 5.6. PVS structural model, LC part.
Fig. 5.7. Image captured by the GC.
Fig. 5.8. Image with wrong calibration points marked by asterisks.
Fig. 5.9. Image after calibration, calibration points marked by asterisks.
Fig. 5.10. (a) Cropped background image, (b) cropped workpiece image.
Fig. 5.11. (a) Extracted element, (b) noise removed.
Fig. 5.12. (a) Estimated edges of the workpiece, (b) estimated corners of the workpiece.
Fig. 5.13. Image after rectification; the initial corner is marked with a green point.
Fig. 5.14. (a) Image captured by the LC, (b) extracted marker.
Fig. 5.15. Interpretation of the calibration vector (cal_vec).
Fig. 5.16. (a) Image captured by the LC after the shift to the coordinates detected by the GC, (b) image after binarization.
Fig. 5.17. (a) Image with noise removed, (b) detected edges.
Fig. 5.18. Interpretation of vect1.
Fig. 5.19. (a) Original image, (b) cropped image.
Fig. 5.20. (a) Graphical interpretation of the relation between shift vectors vect1, vect2 and vect3, (b) graphical interpretation of vect3 in the real machine.
Fig. 6.1. Workspace divided into 6 sectors with the global camera's uncertainty vectors and their lengths [mm] for detecting the initial corner of the small element in the machine coordinate system.
Fig. 6.2. Workspace divided into 6 sectors with the global camera's uncertainty vectors and their lengths [mm] for detecting the initial corner of the medium element in the machine coordinate system.
Fig. 6.3. Workspace divided into 2 sectors with the global camera's uncertainty vectors and their lengths [mm] for detecting the initial corner of the big element in the machine coordinate system.
Fig. 6.4. Priced test element displayed by the profile projector.
Fig. 6.5. Uncertainty of corner detection, distribution on X-Y axes, first trial, with mean = 0.22 and standard deviation = 0.12.
Fig. 6.6. Uncertainty of corner detection, distribution on X-Y axes, second trial, with mean = 0.17 and standard deviation = 0.1.
Fig. 6.7. Uncertainty of corner detection, vector length, first trial; mean = 0.22 and standard deviation = 0.12.
Fig. 6.8. Uncertainty of corner detection, vector length, second trial; mean = 0.17 and standard deviation = 0.1.
Fig. 6.9. Angular deflection uncertainty, with mean = 0.23 and standard deviation = 0.09.


List of Tables

Table 3.1. Assumed system requirements.
Table 6.1. GC uncertainty vectors and their lengths for detecting the small element in each sector.
Table 6.2. GC uncertainty vectors and their lengths for detecting the medium element in each sector.
Table 6.3. GC uncertainty vectors and their lengths for detecting the big element in each sector.
Table 6.4. Mean and standard deviation of the performed tests.


List of Abbreviations

CNC Computer Numerically Controlled

FoV Field Of View

GC Global Camera

IP Ingress Protection

LC Local Camera

PVS Positioning Vision System

SWL Swedish Waterjet Lab

WJ Waterjet


1 Introduction

The versatility of the waterjet (WJ) cutting technique and its applications has generated growing interest among companies that have noted the significant advantages of cold cutting. Owing to its high tolerance and the natural character of the cutting process, which involves no artificial substances, chemicals or heat, WJ cutting can be used in most industrial sectors.

Most cutting techniques require an operator who manually determines the start point of the cutting process. This is time-consuming, yields low accuracy and causes unnecessary material losses. One solution to this problem is automation using a vision system. Such an automation system can shorten the positioning time and improve the accuracy of the cutting process.

This thesis contributes to the Swedish Waterjet Lab's (SWL) aim to develop support tools for end-users of the WJ technology. Following SWL's aim, the purpose of this thesis is to design and prototype the Positioning Vision System (PVS), which can be implemented at SWL. Two webcams form the base of the PVS and constitute the characteristic feature of the concept. The proposed identification algorithm, consisting of various image processing techniques, performs workpiece detection and determines the position and angular deflection of the workpiece. The integration of the vision and control units enables automatic positioning of the WJ machine.

In the thesis we show that the proposed vision system ensures the measurement and control accuracy required in industrial environments. The presented solution, due to its simple structure and use of common components, is easy to implement on different machine layouts.

The presented work consists of eight chapters. The first covers the introduction to the problem concerned in the thesis. The next chapter is devoted to a survey of related works and background studies. It is followed by the problem statement, system requirements and main contribution; this third chapter also contains the research questions and hypotheses of the thesis. The fourth chapter presents the PVS model and a description of the methods used. Chapter 5 describes both the hardware and software implementation processes. Chapter 6 presents the validation process and the obtained results. The last two chapters summarize the entire thesis and propose future work.


2 Survey of Related Works

Prior to the research concerning the PVS, a review of the theoretical background and related works was conducted. In [1], Spanish researchers proposed an automated WJ cutting technique using a vision system. A camera with a servo-controlled lens was located perpendicular to the WJ table. The purpose of that work was to apply an automatic system to optimize the cutting process and to support quality inspection.

Another attempt to apply a vision system to placing planar objects on the WJ machine is presented in [2]. The main feature of that system was the use of three parallel line-scan cameras to guide the cutting process.

Most object identification methods using industrial machine vision systems are based on CCD or CMOS sensors [3], line-scan cameras [4] or laser scanners [5].

A combination of several cameras in one system significantly increases the amount of acquired information. For instance, shape identification can be achieved using stereo cameras [6]. To determine object localization and tracking in three-dimensional space, a combination of two cameras is needed [7].

Vision systems can substitute for humans in quality inspection. For finding defects and foreign objects in empty bottles in real time, the generalized Hough transform can be used [8]. A vision system combined with a Kalman filter and a slip detector constitutes a mobile robot localization system for target tracking purposes [9]. Another application of machine vision systems is motion detection, used for safety and crime prevention [10]. The concept of a cellular vision sensor security system confirmed that such a solution may be used to identify human behavior. Furthermore, vision systems are commonly used for object positioning [11], detection [12] and recognition [13].

Depending on the application requirements, vision algorithms are based on various image processing techniques. One of them is background subtraction, which removes irrelevant data from images. Background removal in combination with the mean shift algorithm may decrease the disturbance in images with a non-uniform background [14]. Advances in computer engineering make it possible to apply shadow detection algorithms to grayscale video sequences; for this purpose a two-stage background subtraction method can be used [15]. Currently, color video background removal is used for tracking and recognition tasks. A background subtraction model in the hue plane with frame differencing is used to classify foreground and falsely detected objects [16].

There are many different methods for corner detection. Nowadays the most commonly developed methods are based on the Harris corner detection algorithm [17]. However, these methods are susceptible to interference and can detect false corners. Smith and Brady proposed the Smallest Univalue Segment Assimilating Nucleus (SUSAN) approach [18], which is faster and more resistant to noise than previous methods. In 2009, the Chinese researchers Yiping Tang et al. improved the Harris corner detection algorithm by implementing a


neighborhood-checking procedure [19]. This improvement reduces the number of false corner detections, which results in a significantly shortened computation time.
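The Harris measure mentioned above, R = det(M) − k·tr(M)², can be sketched in a few lines. The following is an illustrative Python/NumPy sketch, not the thesis's own (Matlab) code nor the improved variants cited; the box-filter window and the constant k = 0.05 are assumptions for the example.

```python
import numpy as np

def harris_response(img, k=0.05):
    """Harris corner response R = det(M) - k*trace(M)^2 for a grayscale image.

    Illustrative sketch only; the window weighting is a simple box filter
    rather than the Gaussian used in most real implementations.
    """
    img = img.astype(float)
    # Image gradients via central differences.
    Iy, Ix = np.gradient(img)
    Ixx, Iyy, Ixy = Ix * Ix, Iy * Iy, Ix * Iy

    # Sum the gradient products over a small window around each pixel.
    def box(a, r=2):
        out = np.zeros_like(a)
        for dy in range(-r, r + 1):
            for dx in range(-r, r + 1):
                out += np.roll(np.roll(a, dy, axis=0), dx, axis=1)
        return out

    Sxx, Syy, Sxy = box(Ixx), box(Iyy), box(Ixy)
    det = Sxx * Syy - Sxy ** 2
    trace = Sxx + Syy
    return det - k * trace ** 2
```

On a synthetic white square, R is strongly positive at the corners, negative along the edges and zero in flat regions, which is exactly the separation the Harris detector exploits.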

The Hough transform is the basis of various object detection techniques. This commonly used transform, presented by Richard Duda and Peter Hart in [20], is based on Paul Hough's patent from 1962 [21]. The Hough transform can be used not only to detect lines and curves but also any shape that can be described by a set of parameters. There are many applications based on this approach. One of them uses the windowed Hough transform to detect rectangles and can be applied to synthetic as well as real images [22]. Another application of the transform is detecting ellipses [23][24]. It is also used to detect corners [25], junctions and line intersections, which are useful for identifying the shape of an element [26].
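As a concrete illustration of the line-detecting Hough transform discussed above, the sketch below accumulates votes in (ρ, θ) space. It is a minimal Python teaching example under the standard parameterization ρ = x·cosθ + y·sinθ, not the thesis's Matlab implementation.

```python
import numpy as np

def hough_lines(edge_points, shape, n_theta=180):
    """Accumulate votes in (rho, theta) space for a set of edge pixels.

    Each edge point (x, y) votes for every line rho = x*cos(theta) +
    y*sin(theta) passing through it; collinear points pile up in one bin.
    """
    h, w = shape
    diag = int(np.ceil(np.hypot(h, w)))
    thetas = np.linspace(0, np.pi, n_theta, endpoint=False)
    # rho ranges over [-diag, diag]; offset indices so they are non-negative.
    acc = np.zeros((2 * diag + 1, n_theta), dtype=int)
    for x, y in edge_points:
        rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
        acc[rhos + diag, np.arange(n_theta)] += 1
    return acc, thetas, diag

# A horizontal row of edge pixels (y = 3): the votes concentrate at
# theta ~ 90 degrees, rho = 3.
pts = [(x, 3) for x in range(20)]
acc, thetas, diag = hough_lines(pts, (10, 30))
rho_idx, th_idx = np.unravel_index(np.argmax(acc), acc.shape)
```

Reading the strongest accumulator bin back gives the (ρ, θ) of the dominant line; intersecting two such lines yields the corner candidates used later in the algorithm.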

To obtain a top view from images taken at different angles, a rectification method can be used. To do so, the relations between the captured image and the real world have to be found. There are different techniques to accomplish this, for example by selecting characteristic points [27] or by finding relationships such as those between ellipses in the image and the corresponding circles in the real world [28]. There are many applications of this method, such as measuring ratios of lengths and angles from a perspective image [29]. Another application recovers a valid view of a document from a distorted picture [30].


3 Problem Statement and Main Contribution

The WJ cutting process, a modern technique for separating and cutting many types of materials, has not yet been fully automated. Among other things, there is a need to automate the identification of workpiece placement in the workspace. The accurate determination of the machine initial coordinates is also a critical problem in any Computer Numerically Controlled (CNC) cutting technology.

In order to properly define the problem, the system requirements have to be determined, see Table 3.1. The requirement on the maximum angular deflection uncertainty is that the gap between the real and the estimated initial corner of the element is smaller than 1 mm (see Fig. 3.1). Consequently, the maximum angular deflection uncertainty depends on the element edge length and can be described by:

β = 2 · arcsin(0.5 mm / l)    (3.1)

where: β – angular deflection uncertainty; l – edge length of the element.
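A quick numeric check of (3.1), as an illustrative Python sketch (the function name is ours, not from the thesis):

```python
import math

def max_angular_deflection_deg(edge_length_mm):
    """Maximum allowed angular deflection uncertainty (degrees) so that the
    corner displacement stays below 1 mm: beta = 2*arcsin(0.5 mm / l)."""
    return math.degrees(2 * math.asin(0.5 / edge_length_mm))

# For the minimal 100 mm x 100 mm workpiece, the allowed uncertainty is
# roughly 0.57 degrees; longer edges tighten the requirement further.
print(round(max_angular_deflection_deg(100), 2))
```

This makes the scaling explicit: doubling the edge length roughly halves the tolerated angular deflection.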

TABLE 3.1. ASSUMED SYSTEM REQUIREMENTS

Assumption                          System requirement
Precision of the vision system      ±0.5 mm
Shape of the workpiece              Rectangular
Workpiece color and transparency    Only white
Minimal size of the workpiece       100 mm × 100 mm
Dimension of the workpiece          2D
Workpiece rotation                  Up to 30°
Angular deflection uncertainty      Depends on the size of the workpiece, according to (3.1)
Lighting conditions                 Constant, i.e. the same when the background and element images are captured

Taking the stated problem under consideration, SWL has formulated two research questions. The first question is how to identify a specified corner and the angular deflection of a randomly located workpiece on an existing WJ machine workspace, using web cameras, with the required accuracy. The auxiliary question is how to calibrate that system in situ.

To answer these questions, it was assumed that for the given WJ machine it is possible to automatically identify the initial corner of a white rectangular element and its angular deflection, using two webcams and the image processing algorithm presented in Fig. 3.2. The global camera (GC) detects the corner of an element with a required accuracy of ±60 mm; then a local camera (LC) is used. The LC assures corner identification with a precision of up to


±0.5 mm, with an angular deflection accuracy according to (3.1). Four markers used as reference points are sufficient to calibrate both the global and the local camera in situ with the required accuracy.

The proposed processing algorithm presented in Fig. 3.2 consists of two main parts. The first part refers to the GC and begins with the calibration process. Subsequently, background and object images are captured during an acquisition step. To extract the element, background subtraction is applied in combination with noise filtering and edge detection techniques. The lines and intersections that determine the element features are obtained using the Hough transformation. In order to estimate the proper machine initial coordinates, the image is rectified. The second part of the algorithm refers to operations performed on the images captured by the LC. Before image acquisition, the LC is calibrated. Subsequently, the captured images are filtered and the workpiece edges are detected. The Hough transformation helps to identify lines and their intersections. Finally, the element's initial corner coordinates and angular deflection are determined.

The main contribution of the thesis is a conceptual solution which takes the existing machine layout into consideration. Following the general concept, a detailed design, including the selection of suitable components, was accomplished. A prototype of the vision system was implemented on the WJ machine. By running the algorithm, implemented in Matlab, the proposed solution was validated, and the system prototype was experimentally verified.

Fig. 3.1. Graphical interpretation of the assumed angular deflection uncertainty requirement (element edge length l, corner displacement < 1 mm, deflection angle β, half-angle ½β).


Fig. 3.2. PVS algorithm block diagram. GC branch: global camera calibration → acquisition of background and element images → background subtraction → noise filtering and edge detection → Hough transformation, detection of lines and intersections → image and coordinates rectification → machine coordinate determination. LC branch: local camera calibration → image acquisition → noise filtering and edge detection → fine detection of lines and intersections → final coordinates determination and angular deflection estimation.


4 Positioning Vision System Model

The PVS model is based on the algorithm presented in Fig. 3.2. The following subsections describe the methods and techniques used in the PVS model.

4.1 Global camera calibration

An essential algorithm step which initializes the detection procedure is the calibration of the GC. Color markers contrasting with the background are fixed at the marginal corners of the workspace and determine the machine's real operating range. An appropriate corner detection using the color markers ensures a valid reference to the machine coordinate system, see Fig. 4.1. As a result of the calibration, four points defining the machine operating range are identified. The image taken by the GC has to cover the entire workspace, as shown in Fig. 4.2. If this requirement is not fulfilled, an operator has to adjust the position of the camera.

4.2 Background subtraction

During the image acquisition step, N images of the background and one image of an element placed on the machine workspace are taken by the GC. Then a cropping function reduces the data by removing the outer parts of the images. For this purpose, the calibration points identified by the GC calibration procedure are used; these points define the area to be cut out. To reduce random noise, the background images are averaged [31]:

Fig. 4.1. Model of the WJ machine workspace with the machine coordinate system (axes X, Y, Z).


$$B_a = \frac{1}{N}\sum_{i=1}^{N} B_i \qquad (4.1)$$

where: $B_a$ – averaged background image; $B_i$ – the $i$-th background image; $N$ – number of background images.

Then both images, the averaged background and the one with the element, are converted from RGB to grayscale, and the element is extracted by removing the background. The result of this process is presented in Fig. 4.3c.
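The averaging of (4.1) and the grayscale conversion can be sketched as follows in NumPy (the thesis implements these steps in MATLAB; the function names and the BT.601 luminance weights here are illustrative):

```python
import numpy as np

def average_background(images):
    """Pixel-wise mean of N background images, suppressing random noise (Eq. 4.1)."""
    return np.stack([np.asarray(im, dtype=float) for im in images]).mean(axis=0)

def to_grayscale(rgb):
    """ITU-R BT.601 luminance conversion of an H x W x 3 RGB image."""
    return np.asarray(rgb, dtype=float) @ np.array([0.2989, 0.5870, 0.1140])

# Demo: averaging six noisy copies of a flat 100-level background; the mean of
# N images shrinks the noise standard deviation by a factor of sqrt(N).
rng = np.random.default_rng(0)
noisy = [100.0 + rng.normal(0.0, 5.0, (8, 8)) for _ in range(6)]
b_a = average_background(noisy)
```

The 1/√N noise reduction is why six background frames already give a usable subtraction mask.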

Fig. 4.2. Illustration of GC image coordinates (X′, Y′) and the machine coordinate system (X, Y, Z).

Fig. 4.3. (a) Background image, (b) image with workpiece, (c) result of subtraction.


4.3 Noise filtering and edge detection

Noise filtering starts with the monochrome conversion of the output image obtained from the background subtraction. Groups of pixels whose size is smaller than the smallest assumed test element are removed from the image. Additionally, if the detected element contains gaps as a result of the background subtraction, they are morphologically closed. The obtained binary image is used to detect the element contour, as shown in Fig. 4.4, by means of the Canny edge detection algorithm [32], which is a multi-stage process. At the beginning, the image is smoothed by a Gaussian filter. Next, the algorithm finds the image intensity gradient using a first-derivative operator. Then, through a non-maximal suppression process, the algorithm tracks the regions with high first spatial derivatives in order to estimate the output line. The tracking process is controlled by two parameters, which are relative to the highest value of the image gradient magnitude. These adjustable parameters affect the effectiveness of the algorithm and the computation time.
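The removal of pixel groups smaller than the smallest expected element is a connected-component filter. A minimal NumPy sketch (a stand-in for the MATLAB morphological tools the thesis uses; the 4-connectivity choice and function name are assumptions):

```python
from collections import deque
import numpy as np

def remove_small_blobs(mask, min_size):
    """Drop 4-connected foreground components smaller than min_size pixels."""
    mask = np.asarray(mask, dtype=bool)
    h, w = mask.shape
    visited = np.zeros_like(mask)
    out = np.zeros_like(mask)
    for sy in range(h):
        for sx in range(w):
            if not mask[sy, sx] or visited[sy, sx]:
                continue
            comp, q = [], deque([(sy, sx)])       # BFS over one component
            visited[sy, sx] = True
            while q:
                y, x = q.popleft()
                comp.append((y, x))
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not visited[ny, nx]:
                        visited[ny, nx] = True
                        q.append((ny, nx))
            if len(comp) >= min_size:             # keep only large-enough groups
                for y, x in comp:
                    out[y, x] = True
    return out

# Demo: a 3x3 element survives, a lone noise pixel does not.
demo = np.zeros((5, 5), dtype=bool)
demo[1:4, 1:4] = True      # element, 9 px
demo[0, 4] = True          # isolated noise pixel
cleaned = remove_small_blobs(demo, min_size=4)
```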

4.4 Line and intersection estimation using Hough transformation

The Hough transformation, as a feature extraction technique, is used to identify lines and their parameters in the image. This transformation is used because of its relatively high resistance to image noise and its effective handling of line discontinuities. It also provides an easy way to estimate a line's angular deflection. According to the transformation properties, each line can be described by the equation:

$$y = \left(-\frac{\cos\theta}{\sin\theta}\right)x + \frac{\rho}{\sin\theta} \qquad (4.2)$$

which can be rearranged as:

$$\rho = x\cos\theta + y\sin\theta \qquad (4.3)$$

where the graphical interpretation of the ρ and θ parameters is presented in Fig. 4.8. According to (4.3), each point in the Cartesian coordinate system is represented by a sinusoid in

Fig. 4.4. Noise removal and edge detection: (a) input image, (b) noise removed, (c) detected edges.


Hough space (which can be observed in Fig. 4.5), and each line is represented as a point (see Fig. 4.6). Hough space is a quantized matrix of accumulator cells. During the Hough transformation, for each point in the Cartesian coordinate system the corresponding discretized sinusoid is applied to Hough space, and every accumulator cell which lies beneath this sinusoid is incremented. Detecting peaks in Hough space is a way to estimate the lines by their ρ and θ parameters (an example is shown in Fig. 4.7). To find the intersection of two detected lines we use the equation set:

$$\begin{bmatrix} x \\ y \end{bmatrix} = \begin{bmatrix} \cos\theta_i & \sin\theta_i \\ \cos\theta_j & \sin\theta_j \end{bmatrix}^{-1} \begin{bmatrix} \rho_i \\ \rho_j \end{bmatrix} \qquad (4.4)$$

where i, j are the indexes of the detected lines and i ≠ j. The intersection points computed from (4.4) which lie outside the borders of the image are omitted. Intersections which are farther from the detected edge than a declared threshold are also ignored. The remaining intersections correspond to the approximate positions of the corners.
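The voting scheme of (4.3) and the intersection formula (4.4) can be sketched compactly in NumPy (the thesis uses MATLAB's Hough routines; the discretization steps chosen here are illustrative):

```python
import numpy as np

def hough_accumulator(points, thetas, rhos):
    """Vote: each (x, y) point increments every cell under its sinusoid
    rho = x*cos(theta) + y*sin(theta), cf. Eq. (4.3)."""
    acc = np.zeros((len(rhos), len(thetas)), dtype=int)
    for x, y in points:
        for ti, t in enumerate(thetas):
            r = x * np.cos(t) + y * np.sin(t)
            acc[np.argmin(np.abs(rhos - r)), ti] += 1
    return acc

def line_intersection(theta_i, rho_i, theta_j, rho_j):
    """Solve the 2x2 system of Eq. (4.4); np.linalg.solve raises for parallel lines."""
    A = np.array([[np.cos(theta_i), np.sin(theta_i)],
                  [np.cos(theta_j), np.sin(theta_j)]])
    return np.linalg.solve(A, np.array([rho_i, rho_j]))

# Collinear points on the horizontal line y = 3 vote for one common peak:
thetas = np.deg2rad(np.arange(0, 180, 1.0))
rhos = np.linspace(-10, 10, 201)
acc = hough_accumulator([(float(x), 3.0) for x in range(5)], thetas, rhos)
ri, ti = np.unravel_index(np.argmax(acc), acc.shape)
peak_rho, peak_theta = rhos[ri], thetas[ti]
```

The peak lands at ρ ≈ 3, θ ≈ 90°, and `line_intersection(0, 2, np.pi/2, 3)` recovers the crossing point (2, 3) of the lines x = 2 and y = 3.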

Fig. 4.5. A point represented in: (a) the Cartesian coordinate system, (b) two-dimensional Hough space, (c) three-dimensional Hough space.


Fig. 4.6. A line represented in: (a) the Cartesian coordinate system, (b) two-dimensional Hough space, (c) three-dimensional Hough space.

Fig. 4.7. Two lines represented in: (a) the Cartesian coordinate system, (b) two-dimensional Hough space, (c) three-dimensional Hough space.


In Fig. 4.8, ρ is the minimal distance between the line and the origin of the coordinate system, θ is the angle of the normal vector to the line, and α is the angle of the line.

4.5 Rectification and preliminary corner detection

Because of the existing machine layout, the perspective of the GC is not optimal from a field of view (FoV) standpoint. However, by using the workspace corner points identified during the GC calibration described in Section 4.1, a rectification from the GC coordinate system to the machine coordinates is possible. An illustrative picture of the result of the rectification is presented in Fig. 4.9; the detailed mathematical derivation can be found in [27]. The image after rectification is the machine top view. The identified corners of the workpiece are also transformed into the machine coordinate system. Subsequently, the initial corner is chosen by finding the minimal distance from each identified corner to the origin of the machine coordinate system.
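The rectification amounts to estimating a homography from the four calibration corner pairs and re-projecting the detected corners. A minimal direct-linear-transform (DLT) sketch in NumPy (the thesis delegates this to its own MATLAB homography/rectify2 routines; this standalone version is for illustration only):

```python
import numpy as np

def homography(src, dst):
    """Direct Linear Transform: 3x3 H with dst ~ H @ src from 4+ point pairs."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.array(A, dtype=float))
    return Vt[-1].reshape(3, 3)          # null vector of A = H up to scale

def apply_h(H, pt):
    """Map a 2-D point through H with perspective division."""
    p = H @ np.array([pt[0], pt[1], 1.0])
    return p[:2] / p[2]

# The four workspace corner pairs fix H; any detected workpiece corner can then
# be re-projected from GC image coordinates into machine coordinates.
```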

4.6 Local camera calibration

In order to accurately estimate the initial element corner coordinates, LC calibration is required. The LC is mounted behind the nozzle. The calibration algorithm estimates the vector between the center of the nozzle and the center of the LC's FoV, see Fig. 5.15. This information is needed during the fine element corner detection. The LC calibration process begins with taking a picture of a specified color marker, which provides reference coordinates. By performing color detection and then edge detection, the contour of the marker is obtained. Next, using the Hough transformation, the coordinates of the marker corner which indicates the upper right corner of the workspace are computed, and the LC is moved to this position. The LC is shifted until the center of its FoV is above the marker's corner. The vector between the marker corner's real position and the nozzle position after the LC calibration constitutes the calibration vector considered in the detection process.

Fig. 4.8. Graphical interpretation of the ρ and θ parameters.


4.7 Estimation of initial corner coordinates and workpiece angular deflection

The WJ machine coordinates used for positioning the LC above the element corner preliminarily estimated using the GC are obtained by subtracting the LC calibration vector from the coordinates computed in Section 4.5.

When the LC is moved to the preliminarily estimated initial corner position, it captures an image of the element corner, see Fig. 4.10. The image then undergoes binarization. To reduce noise, every element smaller than a threshold number of pixels is removed from the image. In order to get a contour of the element, the Canny edge detection method is used. Afterwards, the lines and their intersections are found through the Hough transformation. The intersection which is closest to the image center is considered to be the initial corner. Then, to increase the precision of the corner detection, the captured image is cropped around the detected corner. Subsequently, the corner identification process is repeated with higher precision using the Hough transformation.

If the distance between the preliminarily detected initial corner and the corner detected in the cropped image is bigger than the assumed threshold, the LC is shifted over the detected corner and the whole procedure of finding the precise corner position is repeated. Otherwise, the

Fig. 4.9. (a) Real and virtual initial camera locations. (b) Perspective from the real camera position. (c) Perspective from the virtual initial camera position.


machine nozzle is shifted to the position above the precisely detected initial corner. The angular deflection of the element is computed using the θ values from the former results of the Hough transformation.
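The fine-detection loop above is essentially a fixed-point iteration: re-detect, compare with the current position, move if needed. A schematic Python sketch (`detect` and `move_to` stand in for the vision and machine-motion interfaces, which are hypothetical here; the threshold value is illustrative):

```python
import numpy as np

def refine_corner(detect, move_to, start, threshold=0.2, max_iter=10):
    """Iteratively re-detect the corner and re-position the camera until the
    correction falls below `threshold` (units: mm)."""
    pos = np.asarray(start, dtype=float)
    for _ in range(max_iter):
        corner = np.asarray(detect(pos), dtype=float)
        if np.linalg.norm(corner - pos) <= threshold:
            return corner                 # converged: accept this estimate
        pos = corner                      # shift the LC over the new estimate
        move_to(pos)
    return pos                            # give up after max_iter moves
```

Each move halves-or-better the remaining error in practice, so the loop converges in a handful of iterations when the detector is consistent.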

4.8 System boundaries and limitations

During the system design, some problems and limitations related to the system adjustment were noticed and taken into account. The recognized problems and system limitations are presented in this section, together with proposals for how they can be solved or avoided.

• Harsh operating environment

During the cutting process, water under high pressure splashes all over the workspace; dust and other solid particles may also appear. If web cameras are used in this kind of environment, it is necessary to apply dedicated protective shields. The recommended minimal Ingress Protection (IP) rating is IP66 according to IEC 60529.

• Drawback of the existing machine layout

Because of the insufficient distance between the workspace and the ceiling, the FoV of a GC mounted above the center of the workspace would not cover the whole required area. To solve this problem, the global camera is mounted in the corner between the wall and the ceiling, see Fig. 5.1. Although this solution produces an inconvenient perspective, the FoV covers the whole workspace, and metric rectification is applied to obtain a valid top view.

Fig. 4.10. Graphical interpretation of the estimation of initial corner coordinates. The marked points are: the center of the image captured by the LC; the workpiece corner detected on the LC image (the center of the cropped image); and the workpiece corner detected on the cropped image (the best estimate of the workpiece corner).


• Lighting conditions

Because of the GC location, sunlight coming through the windows caused undesirable reflections, which resulted in detection uncertainty. The installation of window blinds solved the reflection problem.

Moreover, the existing lighting system provides a non-uniform distribution of light. The measured illuminance, in the range of 373–518 lux during daily work, was not sufficient for some elements. This problem has been solved by using color-contrasting workpieces: the test elements were covered with semi-matt white paint.

• Changing background arrangement

This problem refers to the background subtraction process. The arrangement of the workspace may evolve because of cutting residues or left-over elements; even during the separation process, some table ribs may be cut off. Taking a new background image every time before detecting the element solves the problem.

• Machine arrangement during acquisition process

To get a useful image from the GC, the machine must not occlude the view of the workspace. Therefore, during the image acquisition process the machine should be located in one established position; the proposed position in the machine coordinate system is X = 1402, Y = 1514.

To assure proper operation of the system, the protective curtain should be open while capturing both the background and the element images.

• Workspace boundaries

Due to the machine structure, the LC is mounted at a fixed distance from the nozzle. Therefore, not every point of the workspace can be covered by the center of the LC's FoV. The usable workspace is presented in Fig. 4.11. However, since the embedded machine limit switches do not allow cutting near the workspace boundaries, the introduced workspace limitations at the back and at both flanks of the workspace are not additional constraints. Also, because of the mechanical construction of the safety curtain and the distance between the nozzle and the LC, a workpiece cannot be placed closer than 90 mm to the front edge of the workspace.



• Cameras’ mounting height

The precision of the identification mostly depends on the camera resolution and the mounting height, which determines the FoV. Therefore, one way to improve the precision is to use a camera sensor with a higher resolution. Decreasing the installation height (see Fig. 4.12) also improves the precision, but on the other hand it reduces the FoV. The GC's mounting location is restricted by the constraint of capturing an image of the whole workspace. The LC does not have that kind of limitation, but when the sensor is positioned too low, the output image might be blurred for improper lenses. The LC is mounted at its minimal height of 235 mm, which assures a focused image. The camera's FoV at this height is 332×248 mm.
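The trade-off illustrated in Fig. 4.12 follows from pinhole geometry: the FoV side length grows linearly with height, s = 2·h·tan(α/2). From the reported LC figures one can back out the implied view angles and ground resolution (a worked check, not part of the thesis code):

```python
import math

h = 235.0                                  # LC mounting height [mm]
fov_x, fov_y = 332.0, 248.0                # reported FoV at that height [mm]

# Invert s = 2*h*tan(alpha/2) to recover the view angles:
alpha_x = 2 * math.degrees(math.atan(fov_x / (2 * h)))   # ~70 deg horizontal
alpha_y = 2 * math.degrees(math.atan(fov_y / (2 * h)))   # ~56 deg vertical

# Ground resolution, assuming the 2592-px sensor axis spans the 332 mm FoV:
mm_per_px = fov_x / 2592                   # ~0.13 mm per pixel
```

At roughly 0.13 mm per pixel, the sub-millimeter positioning requirement leaves only a few pixels of margin, which is why the LC sits at its minimal focused height.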

Fig. 4.11. Usable area (red frame) in the workspace (all measures in mm).

Fig. 4.12. Illustration of the relationship between FoV and the camera height.


5 Implementation

This chapter introduces the process of hardware and software implementation of the designed vision system. The WJ machine, the control system and all components of the PVS are described in the first subsection. The principle of operation and the structure of the implemented vision system are presented as well.

5.1 Hardware implementation

A picture of the implemented vision system, including its layout and main components, is presented in Fig. 5.1. A computer controls the PVS via its control unit. The control data are based on images acquired from the GC and LC. A Siemens Sinumerik 840D sl is the control unit of the Computer Numerically Controlled (CNC) machine, which directly controls the machine movements and cuts.

We used a Dell Optiplex 755 computer with an Intel® Core™ 2 Duo E6850 3.0 GHz processor, 3.23 GB RAM and the Microsoft Windows XP operating system with Service Pack 3.

The vision system consists of two Logitech B910 HD web cameras. Each web camera is equipped with a high-definition 5-megapixel sensor with 24-bit true color depth and a 1.52-meter hi-speed USB 2.0 certified cable. The sensor has a resolution of 2592×1944 pixels.

Because of the insufficient length of the cameras' cables, additional extension cables are used to connect the cameras with the PC control unit. The cable used is the Deltaco USB2-EX5M, a 5-meter active USB 2.0 extension cable; two extension cables are used for each camera. The PVS connection diagram is shown in Fig. 5.2.

To fix the cameras properly, special mounting plates were crafted. To avoid the influence of vibration, they were cut from 3 mm stainless steel plates. The mounting plates are presented in Fig. 5.1, and their technical specifications can be found in Appendix A and Appendix B for the global and local camera respectively. The GC was fixed on the left part of the ceiling at the position (X = −600; Y = 500; Z = 1570) of the machine coordinate system. The fastening for the LC is mounted in a way that ensures a fixed position about 5 mm in the X-axis, 850 mm in the Y-axis and 110 mm in the Z-axis behind the WJ machine nozzle, see Fig. 5.3. Because the LC mounting plate is designed for easy removal, the distance between the LC and the nozzle may vary slightly. This difference does not influence the system functionality, since it is compensated during the local camera calibration process.

To ensure calibration, the GC and LC markers are cut from a stainless steel plate and painted in a color contrasting with the background; after several tests, pink assured the required accuracy. The markers have a rectangular shape and are fixed to the ribs of the machine table in such a way that the marker corners closest to the workspace center indicate the workspace corners.


In order to perform the experiments, three different test elements were prepared, see Fig. 5.4. Each of them was cut from a stainless steel plate and painted semi-matt white.

Fig. 5.1. WJ machine layout and main components used in the vision system: (a) GC with mounting plate, (b) LC with mounting plate, (c) calibration marker.

Fig. 5.2. PVS connection diagram with the WJ machine (nodes: GC, LC, PC, Sinumerik 840D control unit, control panel, waterjet machine; links: USB, Ethernet, Profibus).


Dimensions of the test workpieces: Small – 100 mm × 100 mm; Medium – 465 mm × 465 mm; Big – 700 mm × 1000 mm.

5.2 Software implementation

This subchapter describes the software implementation of the vision system and its structure. The PVS model presented in Fig. 5.5 and Fig. 5.6 was implemented in MATLAB R2012a. All functions used in the program are commented, which helps to understand the program. The main functions are described in the following sections.

Fig. 5.3. Location of the LC in relation to the WJ machine nozzle (vector components: ~5 mm in X, ~850 mm in Y, ~110 mm in Z).

Fig. 5.4. Picture of the three test elements.


The main program calls all the vision application functions. The program initialization begins with preparation functions, such as clearing the command window, variables and functions from the Matlab memory, and with declarations of variables used later in other functions, such as thresholds, camera numbers, etc. An operator may manually change these variables in order to adjust the function parameters. All program variables and parameters are defined and described in the program code comments. The application user interface contains several elements, such as message boxes and dialog boxes, which make the vision application easy to operate and guide the user in handling it. The precise coordinates of the initial workpiece corner and the element's angular deflection are the program outputs.

Fig. 5.5. PVS structural model, GC part (main blocks: variable declaration; global camera calibration — camera, global_calib, calib_manual, calib_automatic, det_lin_cor; image processing — ld_im, bgd, wpc, bgd_rem, ns_rem; rough corner detection — det_lin_cor, rl_cor, rectify2, homography, normalise).


5.2.1 Global camera calibration

In this section the functions used in the calibration process of the global camera are described.

• camera

The function captures an image from the GC (see Fig. 5.7) and converts it into the RGB color model.

• global_calib

The procedure loads the stored set of calibration points and places them on the image captured by the camera function (see Fig. 5.8). The operator can zoom the picture manually and check whether the loaded calibration points match the marker corners which correspond to the workspace corners in the new picture. The procedure's input dialog box asks if the old calibration points match the new

Fig. 5.6. PVS structural model, LC part (main blocks: local camera calibration — local_calib, camera, det_lin_cor_loc, looped while the precision exceeds 0.2 mm; precise corner detection and angular deflection — camera, det_lin_cor_loc, image cropping, finalizing corner detection, angular_def, looped while the corner-to-center distance exceeds 10 px).


marker corners. If so, the program begins the image processing part. Otherwise, the calibration process queries which of the two available calibration methods, automatic or manual, should be run. If the program cannot find stored calibration coordinates, it starts the calibration procedure at once.

• calib_automatic

After the calib_automatic function is called, a color segmentation technique is applied to identify the markers on the workspace. Then, in order to assure correct marker detection, the image is processed by the Canny edge detector. The function det_lin_cor detects the lines and then estimates the corners of each marker. Four points, one from each marker, located closest to the center of the image constitute the new set of calibration points which define the workspace frame, see Fig. 5.9.

• det_lin_cor

This function applies the Hough transform to the image with the formerly detected edges. The transform peaks (an example is shown in Fig. 4.7) define the lines' parameters (ρ, θ) (see Fig. 4.8), which are used to find all intersections. Points located outside the image boundaries, as well as points located further than a declared threshold from the nearest detected edge, are omitted. In this way all virtual intersections are ignored. The outputs of this function are the edges of the identified elements and their intersections.

Fig. 5.7. Image captured by GC.


• calib_manual

In a case when the automatic calibration cannot determine the calibration points correctly, calib_manual allows the operator to mark them on the image with the mouse cursor.

Fig. 5.8. Image with wrong calibration points marked by asterisks.

Fig. 5.9. Image after calibration. Calibration points marked by asterisks.


5.2.2 Image processing

In the image processing part of the vision system algorithm, all data necessary to properly identify the initial corner are collected. All functions used in this procedure are presented in this section.

• bgd

This function initializes the image acquisition: the GC captures six background images and saves them to an external file.

• wpc

This function prompts the operator to place an element on the workspace. When the element is placed on the workspace, the GC captures an image with the element and saves it to an external file.

• ld_im

The formerly saved images of the background and the workpiece are loaded and cropped (see Fig. 5.10) using the set of global calibration points defined in the global camera calibration procedure. Finally, the background images are averaged using the method defined by (4.1). The number of images involved in the averaging was chosen empirically: with fewer than six background images, the noise level can be too high, while a large number of images increases the computation time. The chosen number of background samples is a compromise between identification uncertainty and computation complexity.

• bgd_rem

This function performs the background subtraction. The averaged background image and the

Fig. 5.10. (a) Cropped background image, (b) cropped workpiece image.


workpiece image are first converted into grayscale and compared according to:

$$\mathit{outImage} = \mathit{elImage} > (\mathit{bgImage} - \mathit{thres}) \qquad (5.1)$$

where: outImage – binary output image (background → "1", element → "0"); elImage – the grayscale image with the element; bgImage – the grayscale averaged background image; thres – a threshold reducing the noise.

The output of this function is an image of the extracted element, shown in Fig. 5.11(a).
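Equation (5.1) is a single vectorized comparison; in NumPy form (the thesis uses MATLAB, and the names here are illustrative):

```python
import numpy as np

def bgd_rem(el_gray, bg_gray, thres):
    """Eq. (5.1): True ("1") marks background, False ("0") marks the element."""
    return el_gray > (bg_gray - thres)

# Demo: a dark element region on a bright background.
bg = np.full((4, 4), 200.0)
el = bg.copy()
el[1:3, 1:3] = 50.0                 # element pixels are darker than bg - thres
mask = bgd_rem(el, bg, thres=30.0)
```

The threshold keeps small brightness fluctuations of the background from being misclassified as element pixels.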

• ns_rem

This function removes pixels considered to be artifacts. The number of white pixels in each adjacent group of pixels detected in the image is counted, and if this number is smaller than 1440 px, the group is considered an artifact and removed from the image. The threshold of 1440 px corresponds to 80% of the smallest test element located furthest from the GC.

Furthermore, when the workpiece contains gaps resulting from the background subtraction or bad image binarization, the gaps are morphologically closed to avoid false line detection later, see Fig. 5.11(b).

To identify the edges of the workpiece, the filtered image is processed by the Canny gradient method, see Fig. 5.12(a). This choice is based on the research presented in [32] and on a series of tests: in the presence of possible image distortion and different workpiece positions, the Canny edge detection method proved the most suitable for this task.

Fig. 5.11. (a) Extracted element, (b) noise removed.


5.2.3 Preliminary corner detection

The purpose of this part of the algorithm is to identify the initial corner of the workpiece and to provide the preliminary coordinates of the local camera's initial position.

• det_lin_cor

The function is the same as in section 5.2.1. The result of this function can be seen in Fig. 5.12(b).

• rectify2

The rectification process provides the coordinates of the detected corners in the machine coordinate system. To transform the GC coordinate system into the machine coordinate system, the rectification uses the workspace corners identified in the GC calibration procedure as reference points. Using this set of reference points, the rectify2 function computes the homography for the image transformation. The homography transforms the element image from the camera perspective to the top-view perspective and adjusts the image to the machine coordinate system, see Fig. 5.13.

• rl_cor_ld

The function chooses the initial corner (as shown in Fig. 5.13) from the corner set identified by det_lin_cor. It is found as the corner closest to the machine coordinate origin:

$$K = \left\{ i \in I : \sqrt{x_i^2 + y_i^2} = \min_{j \in I} \sqrt{x_j^2 + y_j^2} \right\} \qquad (5.2)$$

where: K – set of indexes of the points closest to the origin; I – set of indexes of all identified corners; x_i, y_i – the x- and y-coordinates of the i-th point.
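Equation (5.2) reduces to an argmin over Euclidean distances; a NumPy sketch (illustrative, not the thesis's MATLAB code):

```python
import numpy as np

def initial_corner(corners):
    """Eq. (5.2): return the identified corner closest to the machine origin."""
    c = np.asarray(corners, dtype=float)
    return c[np.argmin(np.hypot(c[:, 0], c[:, 1]))]

# Demo: of three candidate corners, (30, 40) lies closest to the origin.
picked = initial_corner([(100.0, 200.0), (30.0, 40.0), (500.0, 10.0)])
```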

Fig. 5.12. (a) Estimated edges of the workpiece. (b) Estimated corners of the workpiece.


5.2.4 Local Camera Calibration

In the case when LC’s position has been changed then, it is necessary to calibrate it. This process begins with searching for a file containing the former calibration vector. If the file does not exist, the local camera calibration is initialized. Otherwise an input dialog box whether to start the calibration is created and opened.

The first step of the calibration is to move the machine's nozzle to the default position, so that the center of the camera is located above the area of the already identified reference corner. Because of the inconvenient sensor fastening, the upper right marker is the most suitable for the calibration purpose.

• camera

The image is captured by the LC and converted into the RGB color model, see Fig. 5.14(a).

Fig. 5.13. Image after rectification. The initial corner is marked with a green point.

Fig. 5.14. (a) Image captured by the LC, (b) extracted marker.


• Color detection and image processing

First, the region containing the marker is sampled to detect the marker's color. Next, using the obtained information, color segmentation is performed, see Fig. 5.14(b). The image is binarized and filtered by removing small elements. The final step of this function is to identify the edges of the marker.

• det_lin_cor_loc

This function is similar to the det_lin_cor function described in section 5.2.1. It also returns the lines' parameters θ and ρ, which are required later to find the element's angular deflection. Moreover, another parameter, res, is introduced. This parameter defines the resolution of θ in the Hough transform and is used to obtain a more accurate estimate of the element's angular deflection. In this call, res equals 1°.
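To illustrate how res sets the θ grid of the Hough transform, a bare-bones accumulator over point coordinates can look as follows. This is a Python sketch, not the thesis's MATLAB code, and the point set is a toy example; res_deg plays the role of res.

```python
import numpy as np

def hough_lines(points, res_deg=1.0, rho_step=1.0, diag=800.0):
    """Minimal (theta, rho) Hough accumulator; res_deg is the theta resolution."""
    thetas = np.deg2rad(np.arange(-90.0, 90.0, res_deg))
    n_rho = int(2 * diag / rho_step) + 1
    acc = np.zeros((len(thetas), n_rho), dtype=int)
    for x, y in points:
        # Each point votes for one rho bin per theta: rho = x cos(t) + y sin(t)
        rhos = x * np.cos(thetas) + y * np.sin(thetas)
        idx = np.round((rhos + diag) / rho_step).astype(int)
        acc[np.arange(len(thetas)), idx] += 1
    t, r = np.unravel_index(np.argmax(acc), acc.shape)
    return np.rad2deg(thetas[t]), r * rho_step - diag

# Points on the horizontal line y = 40; on the [-90, 90) grid the peak
# appears at theta = -90 deg with rho = -40
pts = [(x, 40.0) for x in range(0, 100, 5)]
```

A finer res_deg sharpens the angle estimate but enlarges the accumulator, which is the computation/memory trade-off noted in the conclusion.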

• Finalizing the local camera calibration procedure

The marker corner closest to the origin of the machine coordinate system is chosen as a target. The vector from the chosen marker corner to the center of the LC's FoV is computed. If this vector's length is greater than a required threshold, the nozzle and camera are moved by this vector and the whole calibration procedure is repeated. Otherwise, the calibration vector (cal_vec), see Fig. 5.15, is found as the difference between the reference marker corner position and the actual machine position.

This calibration vector is saved into two files. One is designated by the current time, and the second, called cal_loc.mat, contains the current calibration vector and the marker's angular deflection. The file cal_loc.mat is overwritten after each camera calibration procedure.
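The two-file bookkeeping can be sketched as follows; the file names and the JSON format are stand-ins for the thesis's .mat files, and the function name is hypothetical.

```python
import json
import os
import time

def save_calibration(cal_vec, marker_angle_deg, out_dir="."):
    """Save the calibration twice: an archive copy named by the current time,
    and cal_loc.json (a stand-in for cal_loc.mat), overwritten on every run."""
    record = {"cal_vec": list(cal_vec), "marker_angle_deg": marker_angle_deg}
    stamped = os.path.join(out_dir, time.strftime("cal_%Y%m%d_%H%M%S.json"))
    with open(stamped, "w") as f:
        json.dump(record, f)
    current = os.path.join(out_dir, "cal_loc.json")
    with open(current, "w") as f:
        json.dump(record, f)
    return stamped, current
```

The timestamped copies preserve a calibration history, while the fixed-name file always holds the latest result for the main program to load.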


Fig. 5.15. Interpretation of calibration vector (cal_vec).


5.2.5 Identification of the workpiece initial corner and its angular deflection

The goal of the final part of the program is to precisely estimate the element's initial corner position and its angular deflection. The machine nozzle is shifted to the position computed by subtracting cal_vec from the corner coordinates identified by the GC. As a result, the center of the camera is placed above the preliminarily identified corner.

• camera

The image is captured by the LC and converted into the RGB color model; see Fig. 5.16(a).

• image processing

First, the captured image is converted into a monochromatic one, see Fig. 5.16(b). After binarization, artifacts are removed by erasing elements smaller than 1/32 of the image (~150 kpx), see Fig. 5.17(a). Finally, Canny edge detection is applied; its result is shown in Fig. 5.17(b).
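The artifact-removal step, dropping connected blobs below an area threshold as MATLAB's bwareaopen does, can be sketched with a plain BFS over the binary mask. The mask and the threshold below are toy values, not the thesis's actual 1/32-of-image limit.

```python
import numpy as np
from collections import deque

def remove_small(mask, min_area):
    """Keep only 4-connected components of at least min_area pixels."""
    h, w = mask.shape
    seen = np.zeros((h, w), dtype=bool)
    out = np.zeros_like(mask)
    for sy in range(h):
        for sx in range(w):
            if mask[sy, sx] and not seen[sy, sx]:
                # Flood-fill one component and collect its pixels
                comp, q = [], deque([(sy, sx)])
                seen[sy, sx] = True
                while q:
                    y, x = q.popleft()
                    comp.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            q.append((ny, nx))
                if len(comp) >= min_area:
                    for y, x in comp:
                        out[y, x] = 1
    return out

mask = np.zeros((8, 8), dtype=np.uint8)
mask[1:5, 1:5] = 1      # 16-px "workpiece" blob
mask[6, 6] = 1          # 1-px artifact to be erased
```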

• det_lin_cor_loc

This function was described in LC calibration, section 5.2.4.

• rl_cor_c

This function is similar to rl_cor_ld, presented in the preliminary corner detection section, but instead of searching for the corner closest to the machine origin, it looks for the one closest to the image center, according to:


Fig. 5.16. (a) Image captured by LC after shift to coordinates detected by GC, (b) image after binarization.


n = \{\, i \in K : \sqrt{{x'_i}^2 + {y'_i}^2} = \min_{k \in K} \sqrt{{x'_k}^2 + {y'_k}^2} \,\}    (5.3)

where:

x'_i = x_{image\ center} - x_i
y'_i = y_{image\ center} - y_i

and K is the set of indexes of the detected corner points; x_i, y_i are the coordinates of the i-th point on the x and y axes respectively.

Then the vector vect1, shown in Fig. 5.18, from the detected corner (red point) to the center of the image (yellow cross) is computed.


Fig. 5.17. (a) Image with noise removed, (b) detected edges.


Fig. 5.18. Interpretation of vect1.


• cropping the image

To improve the identification accuracy, the image with detected edges is cropped, see Fig. 5.19. The size of the new image is 501×501 px, centered at the formerly detected initial corner.

• det_lin_cor_loc

This function is the same as in the local camera calibration, section 5.2.4, but with finer res and thresh parameters.

• rl_cor_ld

This function is the same as in preliminary corner detection, section 5.2.3.

• finalizing the corner detection

Then the vector vect2, the distance from the detected corner to the center of the cropped image, is computed similarly to vect1 in rl_cor_c.

The adjusting vector vect3 is then found. It represents the precise distance between the center of the image captured by the LC and the workpiece corner detected in the cropped image. This vector is computed by adding vect1 and vect2, see Fig. 5.20(a). To improve the identification accuracy, if the vector's length is greater than 10 px, the machine is shifted by the adjusting vector and the whole precise corner detection procedure is repeated. When the adjusting vector is shorter than 10 px, the machine is moved by the sum of vect3 and cal_vec, see Fig. 5.20(b). As a result of the procedure, the machine's nozzle is placed exactly above the identified element corner.
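The adjust-and-repeat logic above can be sketched as a small control loop. Everything here is hypothetical: the pixel-per-millimeter scale, the toy offset "sensor" that stands in for the whole image pipeline, and all names.

```python
import math

PX_PER_MM = 10.0     # hypothetical local-camera scale
THRESH_PX = 10.0     # repeat detection while the correction is larger than this

def refine_corner(machine_pos, detect_corner_offset, cal_vec, max_iter=10):
    """Iteratively center the LC on the corner, then move the nozzle onto it.

    detect_corner_offset(pos) stands in for the image pipeline: it returns
    (vect1 + vect2), the corner-to-image-center offset in pixels."""
    for _ in range(max_iter):
        vect3 = detect_corner_offset(machine_pos)           # px
        if math.hypot(*vect3) <= THRESH_PX:
            # Final move: vect3 plus the camera-to-nozzle calibration vector
            return (machine_pos[0] + vect3[0] / PX_PER_MM + cal_vec[0],
                    machine_pos[1] + vect3[1] / PX_PER_MM + cal_vec[1])
        machine_pos = (machine_pos[0] + vect3[0] / PX_PER_MM,
                       machine_pos[1] + vect3[1] / PX_PER_MM)
    raise RuntimeError("corner detection did not converge")

# Toy sensor: the true corner is at (100, 200) mm, so the reported offset
# shrinks to zero as the camera approaches it
def fake_offset(pos):
    return ((100.0 - pos[0]) * PX_PER_MM, (200.0 - pos[1]) * PX_PER_MM)
```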

• angular_def

Using the data from the det_lin_cor_loc function, performed on the cropped image, this function estimates the angular deflection of the element. First, the function determines the parameters of the two lines which intersect at the initial corner position. Then the lines' θ values are converted from radians to degrees and used to estimate the element's angular deflection.

Fig. 5.19. (a) Original image, (b) cropped image.
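A sketch of converting two Hough line angles into one deflection estimate follows. The reduction of each angle to its deviation from the nearest axis, and the averaging of the two deviations, are assumptions; the thesis does not spell out its exact formula.

```python
import math

def angular_deflection(theta1_rad, theta2_rad):
    """Estimate the workpiece deflection from the Hough angles of the two
    edge lines meeting at the initial corner. An axis-aligned rectangle
    gives theta = 0 and 90 deg, so any common offset is the deflection."""
    devs = []
    for t in (theta1_rad, theta2_rad):
        t_deg = math.degrees(t)
        # deviation from the nearest multiple of 90 degrees, in (-45, 45]
        devs.append(((t_deg + 45.0) % 90.0) - 45.0)
    return sum(devs) / len(devs)
```

For a workpiece rotated by 3°, the two edges yield θ ≈ 3° and 93°, and both reduce to a deviation of 3°.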


Fig. 5.20. (a) Graphical interpretation of the relation between the shift vectors vect1, vect2 and vect3; (b) graphical interpretation of vect3 on the real machine.


6 Method Validation and Accuracy Analysis

Prototype validation consists of three stages. The accuracy of element detection by the GC is estimated in the first section; it constitutes the initial step of accurate coordinate localization. The accuracy of element corner detection by the LC is estimated in the second section. The third stage deals with the estimation of the angular deflection uncertainty.

6.1 Accuracy of element detection using GC

The precision of the preliminary detection of the element's initial corner by the GC was estimated experimentally. The purpose of the investigation was to verify whether the LC's FoV can cover the area of the element's corner location within the margin of the estimated detection uncertainty when the GC is used. The experiment shows that in each trial, the element corner detected by the GC was located within the LC's FoV.

To estimate the detection uncertainty caused by the side placement of the GC for different element sizes, the workspace was divided into sectors: into 6 sectors A1-A6 for small and medium size elements (see Fig. 6.1 and Fig. 6.2), and into 2 sectors B1 and B2 for large size elements (see Fig. 6.3). Fig. 6.1, Fig. 6.2 and Fig. 6.3 depict the detection uncertainties as vectors in each workspace sector. The tail of the arrow (•) indicates the global camera detection point, and the arrowhead marks the real corner position. Numbers next to the arrows give the vector lengths in mm. The numerical measurement data of the experiments are presented in Table 6.1, Table 6.2 and Table 6.3 respectively.

Fig. 6.1. Workspace divided into 6 sectors, showing the global camera's uncertainty vectors and their lengths [mm] for detecting the initial corner of a small element in the machine coordinate system.


TABLE 6.1. GC UNCERTAINTY VECTORS AND THEIR LENGTHS FOR DETECTING A SMALL ELEMENT IN EACH SECTOR.

Uncertainty    A1    A2    A3     A4    A5     A6
x [mm]         6.6   4.7   10.5   4.6   9.2    5.8
y [mm]         5.9   4.9   9.9    5.3   10.2   13.2
length [mm]    8.9   6.8   14.5   7.0   13.7   14.4

TABLE 6.2. GC UNCERTAINTY VECTORS AND THEIR LENGTHS FOR DETECTING A MEDIUM ELEMENT IN EACH SECTOR.

Uncertainty    A1    A2     A3     A4    A5     A6
x [mm]         2.2   -4.9   -2.6   1.7   -2.0   -4.2
y [mm]         0.0   4.2    9.4    4.2   5.6    12.8
length [mm]    2.2   6.4    9.7    4.5   5.9    13.5

TABLE 6.3. GC UNCERTAINTY VECTORS AND THEIR LENGTHS FOR DETECTING A BIG ELEMENT IN EACH SECTOR.

Uncertainty    B1    B2
x [mm]         3.0   -0.8
y [mm]         7.0   11.9
length [mm]    7.6   11.9
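The "length" rows in Tables 6.1-6.3 are simply the Euclidean norms of the x and y components, for example for the small element:

```python
import math

def vector_length(x_mm, y_mm):
    """Length of a GC uncertainty vector from its x and y components."""
    return math.hypot(x_mm, y_mm)

# Components from Table 6.1, sectors A1 and A5
length_a1 = vector_length(6.6, 5.9)    # about 8.9 mm
length_a5 = vector_length(9.2, 10.2)   # about 13.7 mm
```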

The accuracy of element identification by the GC depends mostly on the quality of the metric rectification process and on the distance of the element from the GC. For elements placed in sectors further from the GC, such as A3, A6 and B2, the estimation uncertainty is larger. In the sectors closer to the GC, the estimation of the workpiece position is more accurate. This is caused by the fact that the effective image resolution decreases in sectors further from the GC.

Fig. 6.2. Workspace divided into 6 sectors, showing the global camera's uncertainty vectors and their lengths [mm] for detecting the initial corner of a medium element in the machine coordinate system.


The identification accuracy is also affected by light quality, as was noticed during PVS development and testing. The non-uniform light distribution of the existing luminance system, shadows cast by the machine and light intensity changes during the day cause problems such as reflections, underexposure and overexposure. However, the lighting conditions were good enough for the GC to identify the elements with the accuracy needed for fine element identification.

It can be noticed that line detection for small elements is less accurate than for big elements. In particular, the effect of the element's size combined with lighting distortion can increase the identification uncertainty significantly.

Nevertheless, the experimental results show that the uncertainty of identification using the GC does not exceed the assumed level. The maximum detection uncertainty is 14.5 mm, which is less than 25% of the permissible uncertainty of 60 mm.

6.2 Accuracy of element corner detection by LC

The purpose of the second experiment was to find the shift between the element corner position estimated by the LC and the real one. To measure this difference, after PVS algorithm execution the machine pierced the workpiece at the formerly estimated position. A Mitutoyo PJ-A3000 profile projector was used to find the shift between the piercing center and the plate corner. The view from the profile projector screen is presented in Fig. 6.4.

Measurements were performed in two trials; each trial consists of four series, and each series contains several samples. The results are presented in Fig. 6.5 and Fig. 6.6. Between the trials there were periods of normal machine operation. Because of operation under a high-pressure mixture of water and abrasive, wear of the waterjet's focusing tube was noticeable. As a result, the diameter of the cutting beam increased. In order to estimate the accuracy properly, it was necessary to measure the piercing diameter using the profile projector.

Fig. 6.3. Workspace divided into 2 sectors, showing the global camera's uncertainty vectors and their lengths [mm] for detecting the initial corner of a big element in the machine coordinate system.

According to the results, a correction vector was updated after each series. The results from the first series of the first trial (see red dots in Fig. 6.5 and Fig. 6.7) indicate the primary accuracy of the system, without any adjustments. Here the mean value of the uncertainty vector length reached 0.39 mm.

Further series were performed in order to estimate the best accuracy of the PVS. Throughout the second trial (see Fig. 6.6 and Fig. 6.8), the correction vector was still applied in order to obtain minimal detection uncertainty. The results proved that, with a proper correction vector applied, the detection uncertainty decreased below 0.1 mm. The mean value of the uncertainty vector length in the 4th series is 0.075 mm.

Fig. 6.4. Pierced test element displayed by the profile projector.

Fig. 6.5. Uncertainty of corner detection, distribution on X-Y axes, first trial, with mean = 0.22 and standard deviation = 0.12.

Because of the very low uncertainty, below 0.1 mm, the last series of the second trial had to be measured using a QVI StarLite 150 microscope equipped with Gage-X™ metrology software. This instrumentation assures accuracy to two decimal places.

Tables with the numerical measurement data of corner detection accuracy and precision are presented in Appendix C and Appendix D. The mean values and standard deviations of the validation experiments are shown in Table 6.4.

Fig. 6.6. Uncertainty of corner detection, distribution on X-Y axes, second trial, with mean = 0.17 and standard deviation = 0.1.

Fig. 6.7. Uncertainty of corner detection, vector length, first trial; mean equals 0.22 and standard deviation equals 0.12.

6.3 Estimation of angular deflection uncertainty

The accuracy of the element angular deflection estimation was validated by comparing the PVS result with manual measurements, where the latter are treated as the real values. The real angular deflection was estimated by manually detecting the coordinates of two neighboring element corners. Knowing the coordinates of these points, the angle of the line passing through them was computed using:

K = \tan^{-1}\left( \frac{y_2 - y_1}{x_2 - x_1} \right)    (6.1)

where x_1, y_1 and x_2, y_2 are the coordinates of the two corner points on the X and Y axes respectively.
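Equation (6.1) sketched in code; atan2 is used instead of a bare arctangent to stay quadrant-safe, and the corner coordinates below are made up for illustration.

```python
import math

def deflection_deg(p1, p2):
    """Angle of the line through two neighboring corners, Eq. (6.1)."""
    (x1, y1), (x2, y2) = p1, p2
    return math.degrees(math.atan2(y2 - y1, x2 - x1))

# A workpiece edge dropping ~10.3 mm over 100 mm tilts by about -5.88 deg
kappa = deflection_deg((0.0, 0.0), (100.0, -10.299))
```

The "Uncertainty" column in Appendix E is then the designated value minus the value calculated this way.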

The experiment was performed to verify the accuracy of the proposed method. The estimated uncertainty is presented in Fig. 6.9 and Appendix E. Identification of angular deflection using the PVS method provides detection with a maximum uncertainty below 0.3°. This way of identifying angular deflection is sensitive to lighting disturbances, which cause detection of undesirable lines and may result in identifying a false angular deflection.

Fig. 6.8. Uncertainty of corner detection, vector length, second trial; mean equals 0.17 and standard deviation equals 0.1.

Fig. 6.9. Angular deflection uncertainty, with mean = 0.23 and standard deviation = 0.09.

Given the high accuracy of corner detection by the PVS, another way to estimate the angular deflection is to base it on the coordinates of two corners. However, this method is time consuming, because the PVS has to identify the positions of two corners, which involves more machine movements and additional computational complexity.

TABLE 6.4. MEAN AND STANDARD DEVIATION OF THE PERFORMED TESTS.

Trial   Series   Mean value [mm]   Standard deviation [mm]
1       1        0.40              0.09
1       2        0.25              0.06
1       3        0.16              0.09
1       4        0.11              0.06
2       1        0.30              0.05
2       2        0.18              0.05
2       3        0.15              0.10
2       4        0.07              0.03
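The Table 6.4 entries can be reproduced from the per-sample lengths in the appendices, assuming the sample (n-1) standard deviation was used, which matches the tabulated values. For example, trial 1, series 4 (Appendix C, samples 17-19):

```python
import statistics

# Uncertainty vector lengths [mm], trial 1, series 4 (Appendix C)
series4 = [0.100, 0.180, 0.061]
mean_mm = statistics.mean(series4)    # about 0.11 mm
sdev_mm = statistics.stdev(series4)   # about 0.06 mm (sample standard deviation)
```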


7 Conclusion

This work contributes to SWL's aim to develop supporting tools for end-users of WJ technology. The purpose of the thesis was to design and implement a prototype of the PVS, which automates the control of WJ devices and improves the accuracy of the cutting process. The integration of the vision system and the control units yields automatic positioning of the WJ machine.

Through the specific combination of two web cameras, the algorithm and the machine control unit, the PVS is able to identify the initial corner position and angular deflection of a randomly located workpiece.

The procedure using reference markers ensures the appropriate in situ camera calibration, which is necessary for the required precision of the identification process. Calibration of the PVS can be performed correctly even in a harsh environment where the calibration markers are dirty. However, to get reliable results, the markers should be kept clean.

During image acquisition by the GC, machine parts such as the nozzle and the protection curtain should be placed in their default positions to simplify the process.

It was proved that the Hough transform technique is sufficient to identify the workpiece lines and their intersections under the given conditions. However, a high resolution of the Hough transform significantly increases the computational complexity, which may cause memory overload.

In the case of a side installation of the GC, the metric rectification method gives a top view of the workspace and facilitates the transition from the image coordinate system into the WJ machine coordinate system.

The LC mounting height must be adapted to the required FoV and accuracy of the PVS, and should furthermore match the accuracy of the identification process using the GC.

The validation experiments and accuracy analysis carried out confirmed the functionality of both the GC and LC systems. The achieved mean identification uncertainty of the initial corner coordinates is better than ±0.1 mm, and its precision in terms of standard deviation is better than 0.03 mm.

The prototype of the PVS ensures estimation of the element angular deflection with an average uncertainty better than ±0.3° and a standard deviation of less than 0.09°. Since the uncertainty of the angular deflection estimation depends on the size of the workpiece, the achieved accuracy fulfils the requirement for elements whose longest edge is shorter than 200 mm. To obtain the required accuracy for bigger elements, the angular deflection must be determined from the identified coordinates of two corners.

Due to the changing sunlight exposure during a working day, background images have to be captured each time before detecting an element in order to reach the required identification accuracy. Moreover, when web cameras are used, it is recommended to provide uniformly distributed light.


Due to its simple structure and commonly used components, the proposed PVS is easy to implement on different machine layouts. The presented prototype is a cheap alternative to expensive dedicated industrial vision systems.


8 Future Work

In the case of further development of this PVS, there are some important aspects to address. This section presents the main points useful for future work.

Sub-pixel accuracy

A software method which can improve the precision of identification using web cameras is the sub-pixel accuracy technique. With this modification of the edge detection algorithm, the quality of calibration may be improved.

Detecting elements in other colors

The presented prototype is designed to detect white elements. However, the proposed algorithm can also manage elements in other colors that contrast with the background. A reliable method for detecting elements of arbitrary color, or made of transparent material, still needs to be developed.

Element shape and usage recognition

Despite the fact that rectangular workpieces are the most common in WJ cutting, elements of other shapes may occur. Based on the proposed PVS, a detection algorithm for freeform elements may be developed and implemented. Shape recognition can also be used to optimize workpiece usage and to detect damage, marking unusable areas of the element.

Element material identification

Given the shape and color recognition abilities and support from a workpiece database, the PVS may be complemented with other functions, for instance element material identification. After proper material identification, the system can automatically adjust cutting parameters such as pressure, cutting speed, feed rate, etc.

Cutting optimization

The proposed PVS may be used as a foundation for cutting optimization. It can compute the best cutting path using a suitable optimization method based on the identification of the workpiece and target shapes, the usable area, and the type of material.

Hardware implementation

A hardware implementation of the PVS, e.g. on a PLC, may provide a complete industrial solution suited to many CNC cutting purposes. Combining it with a proper Human-Machine Interface can provide a user-friendly tool.


Covers for the cameras

To adapt the prototype to an industrial environment, sufficient protection against dust and water is needed. Protective covers for both cameras will shield them during the cutting process.

The cover material should be highly transparent in order to assure proper image capture. Otherwise, the covers should be opened automatically for image acquisition.

Unification of lighting conditions

The existing machine layout does not provide an adequate illumination system. To unify the working conditions, the global lighting installation should be modernized. The implementation of an additional lighting system can improve the identification quality.




Appendix A

A view of the global camera mounting plate.


Appendix B

A view of the local camera mounting plate.


Appendix C

Table of corner detection uncertainty from the first trial.

ELEMENT CORNER DETECTION UNCERTAINTY, FIRST TRIAL.

Series   Sample   x [mm]   y [mm]   Length [mm]
1        1         0.50     0.10    0.510
1        2         0.10     0.35    0.364
1        3         0.27     0.09    0.285
1        4         0.29     0.32    0.432
2        5        -0.16     0.24    0.288
2        6         0.12     0.27    0.295
2        7        -0.17     0.01    0.170
2        8        -0.05     0.23    0.235
3        9        -0.07     0.00    0.070
3        10        0.01     0.04    0.041
3        11        0.09     0.03    0.095
3        12        0.08     0.28    0.291
3        13        0.20     0.13    0.239
3        14        0.05     0.16    0.168
3        15        0.10     0.12    0.156
3        16        0.17     0.13    0.214
4        17        0.08     0.06    0.100
4        18        0.17    -0.06    0.180
4        19        0.06    -0.01    0.061


Appendix D

Table of corner detection uncertainty from the second trial.

ELEMENT CORNER DETECTION UNCERTAINTY, SECOND TRIAL.

Series   Sample   x [mm]    y [mm]    Length [mm]
1        1         0.165    -0.197    0.25697
1        2         0.125    -0.2      0.23585
1        3         0.3       0.04     0.30265
1        4         0.34      0.03     0.34132
1        5         0.35      0.04     0.35228
2        6        -0.05     -0.16     0.16763
2        7        -0.06     -0.23     0.2377
2        8        -0.04     -0.13     0.13601
3        9        -0.04     -0.045    0.06021
3        10        0.025    -0.195    0.1966
3        11        0        -0.31     0.31
3        12       -0.01     -0.1      0.1005
3        13       -0.01     -0.1      0.1005
4        14        0.02     -0.085    0.087321
4        15       -0.005    -0.045    0.045277
4        16       -0.01     -0.13     0.130384
4        17        0.0098    0.0427   0.04381
4        18        0.057     0.0771   0.095882
4        19        0.054     0.0188   0.057179
4        20        0.0692    0.0108   0.070038


Appendix E

Table of angular deflection uncertainty.

ELEMENT ANGULAR DEFLECTION UNCERTAINTY.

Designated K [°]   Calculated K [°]   Uncertainty [°]
-5.88              -5.6005            -0.2795
 9.12               9.38965           -0.2697
12.4               12.0911             0.30889
-3.69              -3.9309             0.24092
-4.74              -4.5307            -0.2093
-7.79              -7.9906             0.20056
-4.85              -4.847             -0.003
 4.51               4.25625            0.25375
 2.58               2.90414           -0.3241