robot vision thesis

Upload: mada-sanjaya-ws

Post on 04-Jun-2018


  • 8/13/2019 robot vision thesis

    1/88

    OBJECT TRACKING OF MOBILE ROBOT USING IMAGE PROCESSING

    LIM TEIK YEE

This thesis is submitted in fulfilment of the requirement for the award of the degree of

Bachelor of Engineering (Electrical - Mechatronics)

FACULTY OF ELECTRICAL ENGINEERING

UNIVERSITI TEKNOLOGI MALAYSIA

    MAY 2011


I acknowledge that I have read this work and that, in my opinion, it is in accordance with the scope and quality requirements for the award of the Bachelor Degree in Electrical Engineering (Mechatronics)

    Signature : ..

    Name of Supervisor : MR. JOHARI HALIM SHAH BIN OSMAN

    Date : 1-6-2011


    DECLARATION

I declare that this thesis entitled Object Tracking of Mobile Robot with Image Processing is the result of my own research except as cited in the references. The thesis has not been accepted for any degree and is not concurrently submitted in candidature for any other degree.

    Signature : ......................................................................

    Name : LIM TEIK YEE

    Date : MAY 2011


ACKNOWLEDGEMENT

First of all, I would like to express my heartfelt appreciation to my project supervisor, Prof. Dr. Johari Halim Shah bin Osman, for his guidance throughout these two semesters. With his support and guidance, this project was finished on time.

I would also like to thank my course mates, who are also Robocon team members, for their technical knowledge of software and hardware. With their expertise, I solved hardware technical problems and learned more programming skills.

Lastly, I would like to thank my family, who always supported me morally. Thanks to their motivation, I managed to get through every difficulty I faced.


    ABSTRACT

This project is to develop a mobile robot with computer vision. The robot is a flat-based robot on which a laptop and a camera can be mounted. It uses two brushed motors, which are responsible for the robot's movement on the ground. Images are captured using a low-cost webcam. The system is expected to track a single object based on the object's color characteristics and keep it in the center of view. This is achieved by calculating the target's X coordinate, whose value is then used to control the motors' power output and direction.


ABSTRAK

This project aims to build a mobile robot with computer vision. The robot has a flat base and can carry a laptop. Images are captured using a low-cost webcam. The system is expected to follow a single object based on the object's color characteristics, always keeping the object in the center of view. This technique is achieved by calculating the target's X coordinate; this value then determines the motors' output power and direction.


TABLE OF CONTENTS

    CHAPTER TITLE PAGE

    DECLARATION i

    DEDICATION ii

    ACKNOWLEDGEMENTS iii

    ABSTRACT iv

    ABSTRAK v

    TABLE OF CONTENTS vi

    LIST OF FIGURES x

LIST OF TABLES xi

    LIST OF APPENDICES xiv

    1 INTRODUCTION 1

    1.1 Background 1

    1.2 Problem Statement 1

    1.3 Objective 2

    1.4 Scope 2

    1.5 Thesis Organization 2


    2 LITERATURE REVIEW

    2.1 Image Processing 3

    2.2 Object Tracking of Mobile Robot 4

    2.3 Kalman Filter 5

    2.4 Background Subtraction Methods 5

    2.5 Smoothing 7

    2.6 RGB (Red Green Blue) 8

    2.7 Lego Pan Tilt Camera and Objects Tracking 10

    2.8 Conclusion of Literature Review 13

3 METHODOLOGY AND APPROACH 14

    3.1 Mobile Robot System 14

    3.2 Methodology and Approach 16

    3.3 Hardware Design 17

3.4 Hardware Components 18

3.4.1.1 DC Motor with Encoder MO-SPG-30E-200K 18

    3.4.1.2 State Diagrams and Waveform 20

    3.4.1.3 Pin Description 21

    3.4.2 Camera 22

    3.4.3 Microcontroller 23

    3.4.4 Sensor 23

    3.4.5 Power Supply 25

    3.4.6 Motor driver 26

    3.4.7 UART 26

    3.5 Circuit Diagram 27

    3.5.1 Microcontroller Connection 28

    3.5.2 Motor Driver Connection 29


    3.5.3 Infra Red Sensor Connection 31

    3.5.4 Circuit Board 31

    3.6 Software Design 33

    3.6.1 MPLab 33

    3.6.2 Tiny Bootloader 33

    3.6.3 Microsoft Visual Studio Basic 2010 33

    3.6.4 Image Processing 34

3.6.5 Graphic User Interface (GUI) Development 34

    3.6.6 USB Camera Detection 35

    3.6.7 Mode Selection 36

    3.6.8 Color Filter Mode Selection 36

    3.6.10 Coordinate of Detected Object Center 37

    3.6.11 Video Source Player 38

    3.6.12 Picture Box 39

    3.7 Flow Diagram 39

    4 RESULT 41

    4.1 Hardware Result 41

    4.2 Software Result 43

    4.2.1 Color Tracking 43

    4.2.2 Edge Filter 44

    5 FUTURE WORK AND CONCLUSION 46

    5.1 Future Work 46

    5.1.1 Hardware Improvement 46

    5.1.2 Software Improvement 47

    5.2 Conclusion 47


    REFERENCES 49

    APPENDIX 51


LIST OF FIGURES

    FIGURE TITLE PAGE

    2.1 Flow Diagram of Image Processing Step 4

    2.2 Model underlying the Kalman Filter. 5

2.3 Flow Diagram of Motion Tracking 6

    2.4 Background Subtraction Techniques 6

    2.5 Smoothing 1 7

    2.6 Smoothing 2 7

    2.7 Smoothing 3 8

2.8 Representation of Additive Color Mixing 8

    2.9 Flow Diagram of Color Tracking 9

    2.10 Example of Color Based Tracking 10

    2.11 Stereo Vision Robot Top View 10

    2.12 Stereo Vision Robot Side View 11

    2.13 Stereo Vision Robot with Webcam Installed 11

    2.14 Graphic User Interface 12

    2.15 Color Filtering 12

    3.1 System Overview 15

    3.2 Flow Diagram of System Overview 1 15


    3.3 Flow Diagram of System Overview 2 16

    3.4 Project Flow 17

    3.5 Top View of the Robot Base 17

    3.6 Side View of the Robot Base 18

    3.7 Front View of the Robot Base 18

    3.8 DC Geared Motor with Encoder 29

    3.9 Square Quadrature Waveform 21

    3.10 Connector Pin Descriptions. 21

    3.11 Logitech Webcam C120 22

    3.12 Microcontroller 18F452 24

    3.13 LM324 24

    3.14 Sensors 24

    3.15 ATX Power Supply Unit 25

    3.16 Modified circuit for Output Voltage 5V and 12V 25

    3.17 Motor Driver L298 26

    3.18 Main components to Build a RS 232 27

    3.19 Microcontroller Connection 28

    3.20 Motor Driver Connection 30

    3.21 Infra red Sensors Connection 31

    3.22 Main Circuit Board 32

    3.23 Sensors Circuit Board 32

    3.24 Graphical User Interface 35

    3.25 Com Port Selection Panel 35

    3.26 Mode Selection Panel 36

    3.27 Color Filter Mode Selection Panel 36

    3.28 Webcam Device Detection Panel

    3.29 Coordinate Display Panel 36

    3.30 Source Code for Image Acquisition 37

    3.31 Video Source Player 38


    3.32 Picture Box 38

    3.33 Overall Flow Diagram 39

    4.1 Robot Front View 41

    4.2 Robot Side View 42

    4.3 Robot Rear View 42

    4.4 Color Tracking 44

    4.5 Edge Filter 45


LIST OF TABLES

    TABLE TITLE PAGE

    2.1 RGB Notation for Color Red 9

    3.1 State Diagram of DC Geared Motor 18

    3.2 Connector Pin Description 20

    3.3 Logic Level of RS 232 25

3.4 Pins Connection Description of the 18F452 Microcontroller 29

3.5 Pins Connection Description of the L298 Motor Driver 30

    3.6 Pins Connection Description of the LM324 OP AMP 31


LIST OF APPENDICES

    APPENDIX TITLE PAGE

    A Gantt Chart 51

    B Full Circuit Schematic 52

    C Microcontroller Programming 53

    D Image Processing and GUI 59


    CHAPTER 1

    INTRODUCTION

    1.1 Background

Object tracking has a variety of uses, such as security and surveillance, traffic control, and video communication and compression. If the amount of data is huge, video object tracking can be a time-consuming process.

Object tracking associates the target object across many consecutive video frames. If the object is moving, this association becomes difficult, especially if the object's speed is fast relative to the frame rate. Another difficulty arises if the object's orientation keeps changing over time.

Tracking an object based on its color properties is one of the quickest ways to follow it from one image frame to another. The speed of this technique makes it very attractive for near-real-time applications, but due to its simplicity there are many issues that can cause the tracking to fail.

    1.2 Problem Statement


Most of the cameras available on the market have a limited monitoring view because the cameras are stationary. If a single target needs to be monitored from every corner, too many cameras are required, which is not cost-effective.

Most of these cameras also need to be operated manually, or else they will only focus on one point.

    1.3 Objective

To design a mobile robot that can follow an object based on the object's color, and to create a GUI that can monitor the process.

    1.4 Scope

The scope of this project is to build a two-wheel mobile robot, install a camera (which might be an IP camera or a CMOS webcam), and apply infra-red sensors. The programming involves microcontroller programming using MPLAB C++, a graphical user interface built with Visual Basic, and image processing using the AForge.NET library.

1.5 Thesis Organization

Chapter 2 presents the literature review for this project. Chapter 3 describes the methodology and approach, covering the hardware and software design. Chapter 4 presents the results, and the last chapter, Chapter 5, discusses the conclusion and some recommendations for future work.


    CHAPTER 2

    LITERATURE REVIEW

This chapter presents the information that was studied in relation to this project, discussing similar projects and previous research. The previous works provided recommendations and suggestions for this project, and these references were consulted carefully as useful sources. Most of the sources were obtained from journals, articles, theses, books, and internet forums.

    2.1 Image Processing

Image processing is the process of converting an image signal (either digital or analog) into a processed image. The output can be an actual physical image or the characteristics of an image. For example, the most common type of image processing is photography.

In digital photography, the image is stored as a computer file. The file is translated using photographic software to generate the actual image. The color, shading, and nuances are all captured at the time the photograph is taken, and the software translates this information into an image. Figure 2.1 shows the general three steps of image processing.


    Figure 2.1 Flow Diagram of Image Processing Step

    2.2 Object Tracking of Mobile Robot

Object tracking is the process of tracing an object based on its properties, such as color, shape, brightness, or motion. It is usually performed in higher-level applications that require the location, shape, or color of the object in every frame.

For this project, object tracking is implemented on a mobile robot using image-processing tracking techniques.

By using a mobile robot, the camera can change its position to track the target, which is useful in surveillance. Object tracking on a mobile robot consists of two main features, motion tracking and color tracking, with several stages of algorithm such as object detection, object identification, and object tracking. [7]

Figure 2.1 (text of the flow diagram): (1) Import the image using an optical scanner or digital photography. (2) Manipulate and analyze the image in some way, for example image enhancement and data compression. (3) Output, which might be an altered image.
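The three steps in figure 2.1 can be sketched in Python. This is an illustrative sketch only; the 3x3 sample values and the contrast-stretch operation are assumptions, not the thesis's actual processing:

```python
# Sketch of the three-step pipeline in figure 2.1, using a plain
# nested-list grayscale "image" so no imaging library is required.

def import_image():
    # Step 1: import the image (a hard-coded 3x3 grayscale sample
    # standing in for a scanner or digital photograph).
    return [[10, 50, 90],
            [20, 60, 100],
            [30, 70, 110]]

def enhance(image):
    # Step 2: manipulate/analyze - a simple linear contrast stretch
    # mapping the darkest pixel to 0 and the brightest to 255.
    lo = min(min(row) for row in image)
    hi = max(max(row) for row in image)
    if hi == lo:                      # flat image: nothing to stretch
        return image
    scale = 255 / (hi - lo)
    return [[round((p - lo) * scale) for p in row] for row in image]

def output(image):
    # Step 3: output the altered image (here, simply return it).
    return image

result = output(enhance(import_image()))
print(result[0])  # darkest pixel (10) maps to 0, brightest (110) to 255
```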


    2.3 Kalman Filter

The Kalman filter was introduced by Rudolf E. Kalman in 1960. It is a set of mathematical equations that provides a computational means to estimate the state of a process while minimizing the squared error.

The Kalman filter estimates three states: past, present, and future. The filter is powerful in that these three states can be estimated even when the precise nature of the modeled system is unknown.

Figure 2.2 Model Underlying the Kalman Filter.

The Kalman filter equation evolving the state from time k-1 to k is given by

x_k = F x_(k-1) + B u_(k-1) + w_k

where

F is the state transition model, which is applied to the previous state x_(k-1),

B is the control-input model, which is applied to the control input u_(k-1), and

w_k is the noise produced in the process. [12]
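As an illustrative sketch of these equations, a scalar (one-dimensional) Kalman filter can be written as follows; the model constants F, B, Q, R and the sample measurements are assumed values for demonstration, not values from this project:

```python
# Minimal scalar Kalman filter sketch matching x_k = F*x_(k-1) + B*u_(k-1) + w_k.
# Q models the process noise w, R the measurement noise; all numbers are
# illustrative assumptions.

def kalman_step(x, P, z, F=1.0, B=0.0, u=0.0, Q=0.01, R=0.5):
    # Predict: propagate the state estimate and its error covariance.
    x_pred = F * x + B * u
    P_pred = F * P * F + Q
    # Update: blend the prediction with the measurement z via the Kalman gain.
    K = P_pred / (P_pred + R)
    x_new = x_pred + K * (z - x_pred)
    P_new = (1 - K) * P_pred
    return x_new, P_new

x, P = 0.0, 1.0                   # initial estimate and covariance
for z in [1.2, 0.9, 1.1, 1.0]:    # noisy measurements of a constant near 1.0
    x, P = kalman_step(x, P, z)
print(round(x, 2))                # the estimate moves toward 1.0
```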

    2.4 Background Subtraction Methods

The moving object is identified by comparing the current image frame to a background model. Figure 2.3 explains the process of this method: first, the background image is set as the reference image; then the current image is captured and compared to the reference image. If any differences are found, the differing pixels are set to white. Figure 2.4 visualizes how this method is performed. [6]

Figure 2.3 Flow Diagram of Motion Tracking

    Figure 2.4 Background Subtraction Techniques

However, there are three limitations to these methods:

First, the method must be robust against changes in illumination.

Second, it must avoid detecting insignificant moving background objects such as shadows (cast by the target) and weather effects (such as rain).

Third, the internal background model should be able to react quickly to changes in the background.
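The reference-image comparison described above can be sketched as a simple frame difference; the sample frames and the threshold below are assumptions for illustration:

```python
# Frame-differencing sketch of the background subtraction method:
# pixels that differ from the reference background by more than a
# threshold become white (255), everything else black (0).

def subtract_background(frame, background, threshold=30):
    return [[255 if abs(p - b) > threshold else 0
             for p, b in zip(frow, brow)]
            for frow, brow in zip(frame, background)]

background = [[100, 100, 100],
              [100, 100, 100]]
frame      = [[100, 180, 100],   # one bright "moving object" pixel
              [100, 100, 105]]   # small change below the threshold

mask = subtract_background(frame, background)
print(mask)  # only the 180 pixel is flagged as foreground
```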

    2.5 Smoothing

Smoothing is needed to improve the detection of objects; the figures below show how the smoothing is done. In figure 2.5, there are snowflakes in the left video frame, and the right frame shows that the flakes have been removed. [6]

Figure 2.5 Smoothing 1

In figure 2.6, the moving tree leaves were removed using morphological processing, as shown in figure 2.7.


    Figure 2.6 Smoothing 2

In figure 2.7, the right video frame is more robust against illumination changes compared with the left video frame.

    Figure 2.7 Smoothing 3
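One common smoothing operation that removes isolated specks such as the snowflakes above is the median filter. The sketch below (with an assumed sample image) illustrates the idea on a 3x3 neighborhood:

```python
# Smoothing sketch: a 3x3 median filter on a nested-list grayscale
# image. An isolated bright speck (a "snowflake") is replaced by the
# median of its neighborhood, while large uniform regions are preserved.

def median3x3(image):
    h, w = len(image), len(image[0])
    out = [row[:] for row in image]          # borders are left unchanged
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = [image[y + dy][x + dx]
                      for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
            out[y][x] = sorted(window)[4]    # median of 9 values
    return out

image = [[10, 10, 10],
         [10, 255, 10],   # isolated "snowflake" pixel
         [10, 10, 10]]
print(median3x3(image)[1][1])  # the speck is replaced by the median, 10
```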

    2.6 RGB (Red Green Blue)

In color tracking, the standard RGB model is used to determine the detected color. Every RGB color is formed by a different combination of red, green, and blue, as shown in figure 2.8. The model is based on the Young-Helmholtz theory of trichromatic color vision, developed by Thomas Young and Hermann von Helmholtz in the early to mid-19th century. [3]


Figure 2.8 Representation of Additive Color Mixing

During digital image processing, an RGB color can be represented as binary values, and it can be written in different notations, as shown in table 2.1.

Table 2.1 RGB Notation for the Color Red

Figure 2.9 shows the process of color filtering. First, when the image is captured, it is compared to the desired RGB value (in color tracking the value is normally not a single value but a range, for example (1.0, 0.0, 0.0) to (0.8, 0.1, 0.1)). After the desired color is detected, either the wanted or the unwanted colors (depending on the user) can be set to black (or another color). Figure 2.10 shows an example of how the color is filtered.
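The range comparison in figure 2.9 can be sketched as follows. The "red" range bounds and the sample pixels are assumed values, and 0-255 integers are used here instead of the 0.0-1.0 notation above:

```python
# Color-filtering sketch following figure 2.9: keep pixels whose RGB
# values fall inside the wanted range and set everything else to black.

LOW  = (200, 0, 0)      # assumed lower bound of the wanted "red" range
HIGH = (255, 60, 60)    # assumed upper bound

def in_range(pixel, low=LOW, high=HIGH):
    # True when every channel lies inside [low, high].
    return all(lo <= p <= hi for p, lo, hi in zip(pixel, low, high))

def color_filter(row_of_pixels):
    # Unwanted colors are set to black, as described in the text.
    return [p if in_range(p) else (0, 0, 0) for p in row_of_pixels]

row = [(230, 20, 20), (0, 200, 0), (210, 50, 10)]
print(color_filter(row))  # the green pixel becomes black
```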


    Figure 2.9 Flow Diagram of Color Tracking

    Figure 2.10 Example of Color Based Tracking

    2.7 Lego Pan Tilt Camera and Objects Tracking

The pan-tilt camera shown in figures 2.11, 2.12, and 2.13 is quite popular among people who like to build a tracking camera on their own. It requires only simple electronics, a stepper motor, and pan-tilt equipment. This homemade pan-tilt camera makes use of a regular USB webcam and Lego robotics kits. The pan-tilt camera does not need to move around; it stays on the same spot and moves its camera by a certain degree. As shown in figures 2.11 and 2.12, the pan module can be easily built from a rotating platform piece. The tilt module is a block-based one with a thread-manipulated platform.


    Figure 2.11 Stereo Vision Robot Top View

    Figure 2.12 Stereo Vision Robot Side View

In figure 2.13, a Logitech camera is attached, giving the robot two degrees of freedom with an interesting structure.

    Figure 2.13 Stereo Vision Robot with Webcam Installed


This robot is built on the Lego pan-tilt base, with an interesting structure and a two-degree-of-freedom camera.

Figure 2.14 shows the GUI of the robot. The GUI provides two special controls for the camera: one controls the pan device and the other controls the tilt device.

    Figure 2.14 Graphic User Interface

This robot's task is to track a simple object with a solid color. The object detection is done quite easily using the image processing routines provided by the AForge.NET framework. The result is shown in figure 2.15. [14]

    Figure 2.15 Color Filtering


2.8 Conclusion of Literature Review

For this project, color filtering and a robot with a graphical user interface are useful. For color filtering, the standard RGB model is implemented.


    CHAPTER 3

    METHODOLOGY AND APPROACH

This chapter introduces how the project was carried out, and then discusses the main components, mechanisms, and software used.

    3.1 Mobile Robot System

The object tracking mobile robot consists of three main components: the image acquisition unit, the computer, and the mobile robot. The image acquisition unit acts as the eye of the robot; it receives information (captures images) from the environment. The mobile robot acts as the body and muscles of the system; it receives commands from the computer and moves accordingly. The mobile robot consists of three main parts: the microcontroller, the motor driver, and the motors. The computer acts as the brain; it receives information from the eye, processes it, and then tells the muscles what to do. The overall process is as follows.

The mobile robot gathers information from its sensors, such as the infra-red sensors and the camera, and sends the information to the computer. The computer then detects the object's shape and color and calculates the distance between the target and the mobile robot. Finally, the computer sends a signal that determines the robot's next move. Figure 3.1 shows the communication between the hardware.
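The final step, turning the target's position into a motor command, can be sketched as a simple proportional steering rule. The gain and the power scale below are assumptions; only the 640-pixel frame width comes from the camera used in this project:

```python
# Sketch of the control idea in the overview: the computer finds the
# target's X coordinate in the frame and steers so the target stays
# centered. GAIN and BASE_POWER are illustrative assumptions.

FRAME_WIDTH = 640      # pixels, matching the C120 video resolution
BASE_POWER  = 50       # forward power on an assumed 0-100 scale
GAIN        = 0.2      # assumed steering gain

def motor_command(target_x):
    # Positive error: the target is right of center, so the left motor
    # speeds up and the right motor slows down to turn toward it.
    error = target_x - FRAME_WIDTH / 2
    left  = BASE_POWER + GAIN * error
    right = BASE_POWER - GAIN * error
    # Clamp both powers to the 0-100 range.
    return (max(0.0, min(100.0, left)), max(0.0, min(100.0, right)))

print(motor_command(320))  # target centered: equal power
print(motor_command(480))  # target to the right: left motor faster
```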


    Figure 3.1 System Overview

In this project, the computer and the mobile robot are connected by cable to prevent data loss. The speed of the motors is controlled using PWM (Pulse Width Modulation).

Figure 3.2 shows the communication between the main parts of this project. The camera is not directly connected to the microcontroller; its output needs to be processed by the computer first.

    Figure 3.2 Flow Diagram of System Overview 1


Figure 3.3 shows that extra sensors can be added to the system if needed. Taking the microcontroller's memory into consideration, the system in figure 3.2 can be improved into the system in figure 3.3. The sensors are used to maintain the distance to the target.

    Figure 3.3 Flow Diagram of System Overview 2

3.2 Approach

This project started with the design of the mechanical hardware, followed by the circuit design and then software development. The next step was hardware and circuit construction. The last phase is testing, tuning, and improvement, as shown in figure 3.4.


    Figure 3.4 Project Flow

    3.3 Hardware Design

The robot base is built with two hollow bars and two L-bars, each 40 cm long, together with a 34 cm x 38 cm Perspex sheet (0.5 cm thick), as shown in figure 3.5. In figure 3.6, two servo wheels (3.5 cm radius) are used, while figure 3.7 shows the two 2.5 cm castors used. The robot measures 40 cm x 40 cm x 2.5 cm (length x width x height).

    Figure 3.5 Top View of the Robot Base.


    Figure 3.6 Side View of the Robot Base.

Figure 3.7 Front View of the Robot Base.

    3.4 Hardware Components

This part discusses the main components used.

    3.4.1.1 DC Motor with Encoder MO-SPG-30E-200K

It was decided to use a DC geared motor with an encoder (17 revolutions per minute with 0.784 Nm torque), as shown in figure 3.8. This type of DC geared motor is typically used in a wide range of electrical appliances such as label printers, auto-shutter welding machines, grills, ovens, etc. It runs on 12 volts, gives out 1.1 watts of output power, produces a speed of 17 RPM, and has a rated current of 0.41 amperes. It is equipped with a 5 V quadrature Hall effect encoder that monitors the position and direction of the shaft. The resolution of the encoder output is 12 counts per rear shaft revolution, or 2400 counts per main shaft revolution. The motor was purchased from Cytron; please visit the following webpage for further information: http://alturl.com/u2juz (this is a shortened URL).

    Figure 3.8 DC Geared Motor with Encoder MO-SPG-30E-200K

The quadrature Hall effect encoder can operate from 4.5 V to 5.5 V, has two digital outputs (quadrature waveforms), and is small in size and light in weight. It has a high resolution, with 12 counts per rear shaft revolution, where:

- 240 counts per main shaft revolution for the 1:20 geared motor

- 360 counts per main shaft revolution for the 1:30 geared motor

- 720 counts per main shaft revolution for the 1:60 geared motor

- 1800 counts per main shaft revolution for the 1:150 geared motor

- 2400 counts per main shaft revolution for the 1:200 geared motor

- 3600 counts per main shaft revolution for the 1:300 geared motor
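With these figures, encoder counts can be converted into distance traveled. The sketch below combines the 2400 counts per main shaft revolution of the 1:200 motor used here with the 3.5 cm wheel radius from the hardware design; the function name is an illustrative assumption:

```python
# Sketch converting encoder counts to distance traveled, using
# 2400 counts per main shaft revolution (1:200 geared motor) and the
# 3.5 cm servo wheel radius given in the hardware design section.

import math

COUNTS_PER_REV = 2400        # 1:200 geared motor, main shaft
WHEEL_RADIUS_CM = 3.5        # servo wheel radius

def counts_to_distance_cm(counts):
    revolutions = counts / COUNTS_PER_REV
    return revolutions * 2 * math.pi * WHEEL_RADIUS_CM

# One full wheel revolution covers the wheel circumference, about 22 cm:
print(round(counts_to_distance_cm(2400), 2))
```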

Two DC motors will be used so that the motor directions can be controlled independently. For further information, please refer to the user manual PDF, which can be downloaded at http://alturl.com/u2juz.

    3.4.1.2 State Diagrams and Waveform

Table 3.1 shows the signals produced by channels A and B when the robot is moving forward or backward. Depending on how the motor is installed, clockwise rotation can correspond to either moving forward or moving backward. The phases of clockwise rotation are the reverse of the counter-clockwise rotation phases (phases 1, 2, 3, 4 of clockwise rotation equal phases 4, 3, 2, 1 of counter-clockwise rotation). So by reading the signals from channels A and B phase by phase, the distance traveled and the direction of motor rotation can be determined. Figure 3.9 shows how the phases in Table 3.1(a) are displayed as waveforms.

Table 3.1 State Diagram of DC Geared Motor: (a) Clockwise, (b) Counter-Clockwise
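The phase-by-phase reading described above can be sketched as follows. The particular Gray-code phase order used here is an assumption standing in for Table 3.1:

```python
# Quadrature decoding sketch: comparing the previous and current (A, B)
# phases tells the direction of rotation, and each valid transition
# adds or subtracts one count. The phase order is an assumed standard
# Gray-code sequence, not copied from Table 3.1.

CW_SEQUENCE = [(0, 0), (0, 1), (1, 1), (1, 0)]   # assumed clockwise order

def decode_step(prev, curr, position):
    i = CW_SEQUENCE.index(prev)
    if curr == CW_SEQUENCE[(i + 1) % 4]:     # next phase: clockwise
        return position + 1
    if curr == CW_SEQUENCE[(i - 1) % 4]:     # previous phase: counter-clockwise
        return position - 1
    return position                           # no change (or an invalid skip)

pos = 0
samples = [(0, 0), (0, 1), (1, 1), (1, 0), (0, 0)]   # one full CW cycle
for prev, curr in zip(samples, samples[1:]):
    pos = decode_step(prev, curr, pos)
print(pos)  # 4 counts after one full clockwise cycle
```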


    Figure 3.9 Square Quadrature Waveform for Channel A and B (Clockwise).

3.4.1.3 Pin Description

Figure 3.10 shows the configuration of the pins. Starting from the left, they are Motor - (motor voltage input), Motor + (motor voltage input), VCC (voltage supply for the encoder), GND (encoder ground), A (channel A), and B (channel B). Depending on the voltage input (from the motor driver), Motor - and Motor + determine the direction of motor rotation, while VCC and GND enable the encoder inside the motor to operate. Channels A and B are signal outputs that are received by the microcontroller. Table 3.2 lists the descriptions of the pins (taken from the manual).

    Figure 3.10 Connector Pin Descriptions.


    Table 3.2 Connector Pin Descriptions

    3.4.2 Camera

For this project, the Logitech Webcam C120 is used. It is a CMOS camera with a USB 2.0 UVC driverless interface. The camera captures still images at 1.3 megapixels and video at 640 x 480 pixels, with a frame rate of up to 30 frames per second. The focus of the camera can be adjusted by turning the ring located at the outer part of the lens. Figure 3.11 shows the camera used.

    Figure 3.11 Logitech Webcam C120


and the infra-red sensors. The operational amplifier used in this project is the LM324, a commonly used IC that consists of four operational amplifiers. Figure 3.13 shows the pin configuration of the LM324.

    Figure 3.13 LM324

About four infra-red sensors will be used in this project. Each infra-red sensor comes as a pair: a transmitter and a receiver. The transmitter emits infra-red light, and the receiver detects it through a change in its own resistance. The receiver of the IR sensor is an LDR (light-dependent resistor), also known as a photoresistor. Normally, this kind of receiver's resistance drops when it is exposed to light. Figure 3.14 shows the infra-red transmitters (the blue ones) and the receivers (the black ones).

    Figure 3.14 Infra Red Sensor


    3.4.5 Power Supply

For this project, the power supply to the microcontroller and motors comes from an ATX power supply unit (PSU), the type commonly found in old and discarded (single-core) computers.

Due to its built-in current protection feature, this power supply needs to be modified. The modified PSU is chosen because it has a high current output, short-circuit protection, and very tight voltage regulation. Figure 3.15 shows the PSU used, and figure 3.16 shows the modified circuit for the 5 V and 12 V output voltages, which is used together with the PSU.

    Figure 3.15 ATX Power Supply Unit

    Figure 3.16 Modified circuit for Output Voltage 5V and 12V.


    3.4.6 Motor Driver

The L298 is a monolithic integrated circuit in 15-lead Multiwatt and PowerSO20 packages. It is a high-voltage, high-current dual full-bridge driver designed to accept standard TTL logic levels and drive inductive loads such as relays, solenoids, and DC and stepping motors. Two enable inputs are provided to enable or disable the device independently of the input signals. The emitters of the lower transistors of each bridge are connected together, and the corresponding external terminal can be used for the connection of an external sensing resistor. An additional supply input is provided so that the logic works at a lower voltage. Figure 3.17 shows the pin configuration of the L298 motor driver.

Figure 3.17 Motor Driver L298

The specifications of this full-bridge motor driver are: an operating voltage of up to 46 V, a total DC current of up to 4 A, a low saturation voltage with over-temperature protection, and a logical 0 input voltage of up to 1.5 V.

3.4.7 UART (Universal Asynchronous Receiver/Transmitter)

The purpose of the UART is to act as the communication link between the computer and the mobile robot. The tricky part here is avoiding data loss. For this project, an RS232 receive protocol is built using the MAX232, an IC that converts signals from an RS232 serial port into signals suitable for TTL (transistor-transistor logic) compatible digital logic circuits. Table 3.3 shows the logic levels of RS232, while figure 3.18 shows the components needed to build the UART.

    Table 3.3 Logic Level of RS 232

    Figure 3.18 Main components to Build a RS 232: MAX232, Capacitor 104uF x 5,

    PC D89 Female
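One simple way to detect the data loss mentioned above is to wrap each command in a small frame with a checksum, so corrupted bytes can be discarded. The byte layout below is an assumption for illustration, not the protocol actually used in this project:

```python
# Sketch of a simple serial frame: a start byte, a command byte, a
# payload byte, and a checksum. A receiver drops any frame whose
# checksum does not match, instead of acting on corrupted data.

START = 0x7E

def make_frame(command, payload):
    checksum = (command + payload) & 0xFF
    return bytes([START, command, payload, checksum])

def parse_frame(frame):
    if len(frame) != 4 or frame[0] != START:
        return None
    command, payload, checksum = frame[1], frame[2], frame[3]
    if (command + payload) & 0xFF != checksum:
        return None                      # corrupted: drop the frame
    return command, payload

frame = make_frame(0x01, 0x64)           # hypothetical "set motor to 100"
print(parse_frame(frame))                # (1, 100)
corrupt = frame[:2] + bytes([0x63]) + frame[3:]
print(parse_frame(corrupt))              # None: checksum mismatch
```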

    3.5 Circuit Diagram

This part shows the circuit connections. The circuits are manually soldered on donut boards using a soldering gun, solder paste, and solder lead; no fabricated PCB is involved. Wrapping wires are used as jumpers to connect the electronic components and pins.


    3.5.1 Microcontroller Connection

Figure 3.19 below shows the connection of the microcontroller. The microcontroller is connected to a 10 MHz crystal. It receives signals from the IR sensors and from the computer (through the UART). All LEDs act as indicators (to show whether an output or input signal was successfully received or transmitted). The microcontroller receives a 5 V supply voltage from a regulator. The four outputs of the microcontroller are sent to the L298 motor driver. Table 3.4 shows the pin connection descriptions of the microcontroller.

    Figure 3.19 Microcontroller Connection

    Pin Description

    1 Master Clear

    2 Switch


Figure 3.20 L298 Motor Driver Connections

Pin Description

1 Ground

2 Motor 1 +

3 Motor 1 -

4 Receives 12 V voltage supply (for the motors)

5 Receives signal from microcontroller

6 Receives PWM from microcontroller

7 Receives signal from microcontroller

8 Ground

9 Receives 5 V voltage supply (for the motor driver)

10 Receives signal from microcontroller

11 Receives PWM from microcontroller

12 Receives signal from microcontroller

13 Motor 2 +

14 Motor 2 -

15 Ground

Table 3.5 Pins Connection Description of the L298 Motor Driver
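The pin roles above follow the L298's standard bidirectional drive scheme: two direction inputs and one enable/PWM input per channel. The sketch below illustrates that truth table; the function and its return strings are illustrative assumptions, not project code:

```python
# Sketch of how the microcontroller's signals map to motor behavior for
# one L298 full bridge, following the L298 datasheet's truth table:
# equal direction inputs brake, a disabled enable input lets the motor
# free-run, and otherwise the inputs select forward or reverse.

def l298_channel(in1, in2, enable_pwm):
    """Return the motor action for one L298 channel.

    in1/in2: direction inputs (0 or 1); enable_pwm: 0-100 duty cycle.
    """
    if enable_pwm == 0:
        return "free-running stop"
    if in1 == in2:
        return "fast stop (brake)"
    direction = "forward" if in1 == 1 else "reverse"
    return f"{direction} at {enable_pwm}% duty"

print(l298_channel(1, 0, 75))   # forward at 75% duty
print(l298_channel(0, 1, 75))   # reverse at 75% duty
print(l298_channel(1, 1, 75))   # fast stop (brake)
```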


    3.5.3 Infra Red Sensor Connection

Figure 3.21 shows the connection of the infra-red sensor. This is an active-low configuration: the resistance of the receiver drops when it is exposed to infra-red light, and the lower resistance gives a lower output voltage (the receiver symbol is the circled diode symbol). The LED D9 acts as an indicator, while D8 is the infra-red transmitter.

Figure 3.21 Infra Red Sensor Connection

Pin Description

1 Output signal sent to microcontroller

2 Motor 1 +

Table 3.6 Pins Connection Description of the LM324 OP AMP

3.5.4 Circuit Board

Figure 3.22 shows the main board. The main board mainly consists of the voltage regulator, the microcontroller, and the motor driver. Label A is the connector to the power supply (the PSU), Label B is the connector to the UART, Label C is the connector to the two motors, and Label D is the connector to the infra-red sensors.

    Figure 3.22 Main Circuit Board

Figure 3.23 shows the circuit board of the infrared sensor. Label A is connected to the power supply (from the main board) and Label B sends the output signal to the microcontroller.


    Figure 3.23 Sensors Circuit Board



    3.6 Software Design

This section discusses the software used to develop the GUI and the microcontroller program.

    3.6.1 MP Lab

For this project, MPLAB is used to write and compile the C code that is programmed into the microcontroller. MPLAB is free software which can be obtained from the Internet.

In this project, the microcontroller program has two main parts. One is the communication between the computer and the UART, and the other acts according to the received signal (the main program).

    3.6.2 Tiny Bootloader

Tiny Bootloader is software that loads a hex file and burns it into the microcontroller. It is normally used together with MPLAB.

3.6.3 Microsoft Visual Basic 2010

Microsoft Visual Basic (VB) is an integrated development environment (IDE) from Microsoft. It is commercial software available in seven languages: English, French, German, Italian, Japanese, Korean and Spanish. Visual Basic is a popular IDE due to its interface and the widely used Windows OS platform.


Visual Basic is derived from the BASIC language. Microsoft provides the VB Express edition at no cost. VB comes with the IntelliSense feature, which is very handy for beginners.

3.6.4 Image Processing

To detect the camera on USB, the Microsoft DirectShow library is used. For this library, COM object programming interfaces such as filter graphs are used.

In image processing there are many libraries, for example OpenCV, AForge.NET and Emgu CV. For this project, the AForge.NET framework is used. AForge.NET is an open-source, free library which can be downloaded from the Internet. The framework was developed by Andrew Kirillov. AForge.NET is an artificial intelligence and computer vision library.

The framework includes support for computer vision, artificial intelligence, neural networks, genetic programming, fuzzy logic, machine learning and image processing. For this project, only the imaging library is used.
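AForge.NET's ColorFiltering filter keeps a pixel only when each of its R, G and B components falls inside a configured range. A minimal per-pixel sketch of that idea in C (illustrative only; the actual library implementation differs):

```c
#include <stdint.h>

/* Keep a pixel only if every channel lies inside its [min, max] range,
   otherwise replace it with black -- the same per-pixel rule the
   AForge.NET ColorFiltering filter applies across the whole frame. */
typedef struct { uint8_t min, max; } Range;

static int in_range(uint8_t v, Range r)
{
    return v >= r.min && v <= r.max;
}

void filter_pixel(uint8_t rgb[3], Range red, Range green, Range blue)
{
    if (!(in_range(rgb[0], red) &&
          in_range(rgb[1], green) &&
          in_range(rgb[2], blue))) {
        rgb[0] = rgb[1] = rgb[2] = 0;  /* outside range: fill with black */
    }
}
```

In the GUI, the six text boxes of the filter setting panel supply exactly these six min/max values for the red, green and blue ranges.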

    3.6.5 Graphic User Interface (GUI) Development

This section discusses the graphical user interface developed using Visual Basic. Figure 3.24 shows the appearance of the GUI.


    Figure 3.24 Graphical User Interface

    3.6.6 USB Camera Detection

In the panel shown in Figure 3.25, the GUI detects all available and connected USB ports. The detection starts under two conditions: first, when the GUI is loaded; second, when the Detect Port button is pressed. Any port(s) connected to the UART are then listed in the combo box. Choose one and press the Connect button to start the connection. Only one port is allowed to be connected at a time. Information about the port is displayed in the rich text box. Click the Connect/Disconnect button to connect or disconnect. The panel also displays the baud rate.

    Figure 3.25 Com Port Selection Panel


    3.6.9 Webcam Device Detection

The panel shown in Figure 3.28 displays information about the USB webcam connected to the computer. It shows the connection status and the device name of the webcam. The GUI automatically detects the webcam when the GUI is loaded. Only one webcam can be detected at a time.

    Figure 3.28 Webcam Device Detection Panel

3.6.10 Coordinate of Detected Object Center

The panel shown in Figure 3.29 displays the coordinates of the target object's center. This information is used to determine which direction the robot should head. The robot tries to keep the object in the middle of its sight.

    Figure 3.29 Coordinate Display Panel

    3.6.11 Video Source Player


This video source player user interface (UI) is imported from the AForge.NET library into Visual Basic. It is capable of detecting the video source from the webcam (image acquisition) and displaying its images. Figure 3.30 shows the complete source code for image acquisition.

Figure 3.30 Source Code for Image Acquisition

Figure 3.31 shows the video source player user interface. Press the Load Image button to start the video source player and the Stop Loading button to stop it.

    Figure 3.31 Video Source Player


    3.6.12 Picture Box

The picture box shown in Figure 3.32 displays any filtered image or result. This is a picture box modified by AForge.NET; it is similar to the picture box available in Visual Basic, but it is much more compatible with video images.

    Figure 3.32 Picture Box

    3.7 Flow Diagram

The flow diagram in Figure 3.33 shows how the robot reacts. This logic is programmed into the microcontroller. Two factors influence the reaction of the robot: first, the horizontal coordinate of the target; second, the signal level of the infrared sensor. After the robot is started, it detects the wanted object. Once the target is found, the GUI calculates the position of the target and sends a string to the microcontroller (for example, if the target is on the right side, it sends the 'right' string). Whether the robot moves forward or backward depends on the signal from the infrared sensor.
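The decision rule described above can be sketched in C as follows (the dead-band width is an illustrative assumption; the command characters 'R', 'L', 'M' and 'S' are the ones handled by the firmware in Appendix C):

```c
/* Choose the command character sent over UART from the target's
   horizontal position in the frame.  The 30-pixel dead band around
   the center is an assumed value, not taken from the thesis code. */
char steering_command(int object_x, int frame_width)
{
    int center = frame_width / 2;
    int band   = 30;                          /* assumed dead band   */
    if (object_x < 0)             return 'S'; /* no target: stop     */
    if (object_x < center - band) return 'L'; /* target on the left  */
    if (object_x > center + band) return 'R'; /* target on the right */
    return 'M';                               /* roughly centered    */
}
```

With the 320-pixel-wide frames used in this project, a target at x = 160 would map to 'M', keeping the object in the middle of the robot's sight.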


Figure 3.33 Overall Flow Diagram


    CHAPTER 4

    RESULT

This chapter discusses the results obtained, the hardware implementation and the color tracking features.

    4.1 Hardware Result

Figure 4.1 shows the robot base installed with a laptop and the circuit. In order to mount the laptop, the base is made large and flat.

    Figure 4.1 Robot Front View


In Figure 4.2 and Figure 4.3, Label A is the sensor circuit, which is placed at the front of the robot. Label B is the main circuit. Label C is the webcam. The hardware was installed successfully, but there appears to be a communication problem between the UART and the microcontroller. As a result, the robot failed to move correctly.

    Figure 4.2 Robot Side View

    Figure 4.3 Robot Rear View.


The hardware failure most probably lies in the main circuit board and the microcontroller. The signal output sent from the computer was checked with an LCD monitor and gave the desired output. It was found that the problem resides in the microcontroller and poor circuitry.

    4.2 Software Result

    4.2.1 Color Tracking

The software implementation is successful and produces the desired result. When an object of the desired color is detected (in this example, red), it is highlighted with a green rectangle. The center coordinate of the object is obtained by halving the width and height of the green rectangle and adding them to its upper-left coordinate. The direction of the robot is displayed to the left of the object center coordinate panel. This direction is determined by comparing the horizontal center coordinate of the webcam frame with the X value of the object's center coordinate. As mentioned in the previous chapter, this color tracking is not limited to red; it can be adjusted to any color by changing the values in the filter setting panel. Figure 4.4 shows the result of color tracking.
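The center computation described above amounts to the following sketch (field names are illustrative; the GUI code in Appendix D performs the same arithmetic on the blob rectangle):

```c
/* Center of the detected object's bounding rectangle: the upper-left
   corner plus half the width and height, as described above. */
typedef struct { int x, y, width, height; } Rect;

int center_x(Rect r) { return r.x + r.width  / 2; }
int center_y(Rect r) { return r.y + r.height / 2; }
```

The GUI then compares center_x with half the frame width (160 for a 320-pixel frame) to decide whether the object lies to the left or to the right.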


    Figure 4.4 Color Tracking

    4.2.2 Edge Filter

The edge filter in Figure 4.5 below is an optional feature of this project. The filter implements a convolution operator, which calculates each pixel of the result image as a weighted sum of the corresponding pixel and its neighbors in the source image. The weights are set by the convolution kernel. The weighted sum is divided by the Divisor before being written into the result image, and it may also be thresholded using the Threshold value.


    Figure 4.5 Edge Filter

Convolution is a simple mathematical operation which is fundamental to many common image processing filters. Depending on the kernel provided, the filter may produce different results, such as blurring the image, sharpening it, or finding edges.
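For a single interior pixel of a grayscale image with a 3×3 kernel, the weighted-sum rule above can be sketched as follows (border handling and the optional Threshold step are omitted for brevity):

```c
/* Convolve one interior pixel: the weighted sum of the pixel and its
   3x3 neighborhood, divided by `divisor`, then clamped to 0..255.
   This mirrors the Divisor behavior described above. */
int convolve_pixel(const unsigned char *img, int width,
                   int x, int y, const int kernel[3][3], int divisor)
{
    int sum = 0;
    for (int ky = -1; ky <= 1; ky++)
        for (int kx = -1; kx <= 1; kx++)
            sum += kernel[ky + 1][kx + 1] * img[(y + ky) * width + (x + kx)];
    sum /= divisor;
    if (sum < 0)   sum = 0;    /* clamp into the valid pixel range */
    if (sum > 255) sum = 255;
    return sum;
}
```

An identity kernel (center weight 1, divisor 1) returns the pixel unchanged, while a box kernel of all ones with divisor 9 produces a blur; edge-finding kernels use negative weights so that uniform regions sum to zero.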


    CHAPTER 5

    FUTURE WORK AND CONCLUSION

    5.1 Future Work

    5.1.1 Hardware Improvement

For more advanced hardware, an IP camera can be installed in the system, replacing the current camera. An IP camera is a wireless system, enabling less circuitry and wiring.

The robot size is also a problem: a robot that is too big is not suitable for tracking and greatly reduces the movement and turning speed. The robot's movement is also inflexible and limited to certain spaces. This can be improved by applying wireless communication between the microcontroller and the computer; the computer would then not need to be placed on the robot, and the robot size could be greatly reduced. This would sharply enhance the mobility and quickness of the robot, making it a much better land-based tracking robot, capable of moving into narrow spaces, changing direction faster and moving at higher speed, while the user would not need to follow the robot in order to monitor the process. With this enhancement the robot would also be more suitable for military use, since a smaller size makes it harder to detect.


    5.1.2 Software Improvement

For further improvement, the color tracking can be equipped with an image recognition function. It could be used for simple tracking of human faces, vehicles, geometric objects, or handwritten or printed characters. The user interface can be further improved by enabling the user to select a target by clicking the object in the video source player. This requires a shape recognition technique and a more functional user interface (requiring another library). The current GUI has a limited target range, especially regarding target size (it automatically chooses the bigger target, and the ambiguity increases when there are objects of similar size), so enabling on-screen object selection would let the user avoid this problem and choose the target with fewer limitations and more accuracy. The GUI should also be equipped with a video recording function, so that the captured images can be saved in video or picture format (such as .avi and .bmp); this information is useful for surveillance purposes.

    5.2 Conclusion

This project contains three parts: the image acquisition unit, the inference unit and the positioning unit. The image acquisition unit involves the webcam in grabbing video frames. The inference unit handles the image processing and the graphical user interface, using Visual Basic programming; the creation of the GUI makes the device usable by everyone. Lastly, the positioning unit consists of the microcontroller (programmed in C); by substituting powerful servo motors for the current DC motors, the robot could be made more lightweight.

It is important that the robot is small in size, so that it costs less, occupies less space and is faster to assemble. The image processing and GUI should be multifunctional and user friendly. A camera that can perform multiple tasks, such as video capturing, photographing and face/fingerprint recognition, is better than three cameras with a single task each, especially in terms of cost and convenience.


Overall, this project is partially successful, due to the failure of communication between the microcontroller and the computer. However, in view of the camera capability and image processing, part of the objective scope is fulfilled. Further research in the image processing field will help humans better process the information in images.


    REFERENCES

[1] Chee Pei Song (2010). Object Tracking Camera. Bachelor's Degree Thesis, Universiti Teknologi Malaysia, Skudai. Pages 70-80.

[2] Lin Rui, Du Zhijiang, He Fujun, Kong Mingxiu and Sun Lining (2008). Tracking a Moving Object with Mobile Robot Based on Vision. Page 23.

[3] Sanghoon Kim, Sangmu Lee and Seungjong Kim (2008). Object Tracking of Mobile Robot using Moving Color and Shape Information for the Aged Walking. Pages 56-67.

[4] Kai-Tai Song and Wen-Jun Chen (2007). Face Recognition and Tracking for Human-Robot Interaction. Department of Electrical and Control Engineering, National Chiao Tung University, Hsinchu, Taiwan, R.O.C.

[5] Keita Itoh, Takashi Kikuchi, Hiroshi Takemura and Hiroshi Mizoguchi (2008). Development of a Person Following Mobile Robot in Complicated Background by Using Distance and Color Information. Tokyo University of Science, 2641 Yamazaki, Noda-shi, Chiba 278-8510, Japan.

[6] Mohammed Asief Brey (2007). The Segmentation and Tracking of Individuals in an Indoor Video Surveillance Environment.

[7] Christian Schlegel, Jorg Illmann, Heiko Jaberg, Matthias Schuster and Robert Worz (2003). Vision Based Person Tracking with a Mobile Robot. Research Institute for Applied Knowledge Processing (FAW), PO-Box 2060, D-89010 Ulm, Germany.

[8] Greg Welch and Gary Bishop (2006). An Introduction to the Kalman Filter. TR 95-041, Department of Computer Science, University of North Carolina at Chapel Hill, Chapel Hill, NC 27599-3175.

[9] J. Canny (1983). Finding Edges and Lines in Images. Technical Report AI-TR-720, MIT Artificial Intelligence Lab.

[10] M. Sullivan, C. Richards, C. Smith, O. Masoud and N. Papanikolopoulos (1995). Pedestrian Tracking from a Stationary Camera Using Active Deformable Models. In IEEE Industrial Electronics Society, editor, Proc. of Intelligent Vehicles.

[11] S.A. Brock-Gunn, G.R. Dowling and T.J. Ellis (1994). Tracking Using Colour Information. In 3rd ICCARV.

[12] http://en.wikipedia.org/wiki/Kalman_filter

[13] http://www.edaboard.com/

[14] http://www.aforgenet.com/forum/

[15] http://social.msdn.microsoft.com/Forums/en-US/Vsexpressvb/threads


    APPENDIX A

    Table 1 PSM 1 Gantt Chart


    APPENDIX B

    FULL CIRCUIT SCHEMATIC


    APPENDIX C

    MICROCONTROLLER PROGRAMMING

    //*********************************************/

    //* Include Header */

    //*********************************************/

    #include

    #include

    #include

    #pragma config OSC=HSPLL

    #pragma config OSCS=OFF

    #pragma config PWRT=OFF

    #pragma config BOR=OFF

    #pragma config WDT=OFF

#pragma config CCP2MUX=ON
#pragma config STVR=OFF

    #pragma config LVP=OFF

    #pragma config DEBUG=OFF

    //*********************************************/

    //*********************************************/

    //* Define */

    //*********************************************/

    #define ENB LATCbits.LATC1

    #define ENA LATCbits.LATC2

    #define IN1 LATAbits.LATA1

    #define IN2 LATAbits.LATA2

    #define IN3 LATAbits.LATA3

    #define IN4 LATEbits.LATE1

    #define Startled LATEbits.LATE0

    #define IR PORTAbits.RA5

    #define ChAR PORTBbits.RB1

    #define ChBR PORTBbits.RB0

    #define Start PORTAbits.RA0


    #define PWM1 CCPR1L

    #define PWM2 CCPR2L

    //*********************************************/

    //*********************************************/

    //* Function Prototype */

    //*********************************************/

    void Init(void);

    void Delay(unsigned long uldelay);

    void Goright(void);

    void Goleft(void);

    void Gomid(void);

    void Stop(void);

    //*********************************************/

    //*********************************************/

    //* Variable */

    //*********************************************/

char temp[1];

    char usart=0;

    void rx_handler (void);

    //*********************************************/

    //*********************************************/

    //setting interrupt vector

    //*********************************************/

    #pragma code rx_interrupt = 0x8

    void rx_int (void)

    {

    _asm goto rx_handler _endasm

    }

    //interrupt subroutine

    //=========================================================

    #pragma code

    #pragma interrupt rx_handler

    void rx_handler (void)

    {

    while (!DataRdyUSART());

    temp[0]=RCREG;


    switch(temp[0])

    {

    case 'R': Goright();

    break;

    case 'L': Goleft();

    break;

    case 'M': Gomid();

    break;

    case 'S': Stop();

    break;

    }

    usart=1;

    //clear the flag bit

    PIR1bits.RCIF = 0;

    }

    //*********************************************/

    //* Main Function */

    //*********************************************/

    void Init(void)

    {

    TRISA = 0b00100001;

    TRISB = 0b00000011;

    TRISC = 0b00000000;

    TRISD = 0b00110000;

    TRISE = 0b00000000;

    // PWM

    T2CON = 0b00000101; //timer2 used for pwm

PR2 = 0xFF; //set up PWM
CCP1CON = 0b00001100; //PWM

    CCP2CON = 0b00001100; //PWM

    // UART setting through library

    // OpenUSART( USART_TX_INT_OFF &

    // USART_RX_INT_OFF &

    // USART_ASYNCH_MODE &

    // USART_EIGHT_BIT &

    // USART_CONT_RX &

    // USART_BRGH_LOW, 1);


    //

    // Interrupt

    // RCONbits.IPEN = 1;

    // IPR1bits.RCIP = 1;

    // INTCONbits.GIEH = 1;

    }

    void main(void)

    {

    Init();

    Startled=1;

    while(1)

    {

    if (Start !=1){

    while(1)

    {

    while (!DataRdyUSART());

    temp[0]=RCREG;

    switch(temp[0])

    {

    case 'R': Goright();

    break;

    case 'L': Goleft();

    break;

    case 'M': Gomid();

    break;

    case 'S': Stop();

    break;

    }

    usart=1;

    if (Start ==0)

    Stop();

    }

    }

    }

    }

    void Delay(unsigned long uldelay)

    {

    for( ; uldelay > 0; uldelay--);

    }


    void Goleft()

    {

    PWM1=255;

    PWM2=150;

if (IR != 1)

    {

    IN1 = 0;

    IN2 = 1;

    IN3 = 0;

    IN4 = 1;

    }

    if (IR ==1)

    {

IN1 = 1;
IN2 = 0;

    IN3 = 1;

    IN4 = 0;

    }

    }

    void Goright()

    {

    PWM1=150;

    PWM2=255;

if (IR != 1)

    {

    IN1 = 0;

    IN2 = 1;

    IN3 = 0;

    IN4 = 1;

    }

    if (IR ==1){

    IN1 = 1;

    IN2 = 0;

    IN3 = 1;

    IN4 = 0;

    }

    }

    void Gomid()

    {


    PWM1=255;

    PWM2=255;

if (IR != 1)

    {

    IN1 = 0;

    IN2 = 1;

    IN3 = 0;

    IN4 = 1;

    }

    if (IR ==1)

    {

    IN1 = 1;

    IN2 = 0;

IN3 = 1;
IN4 = 0;

    }

    }

    void Stop()

    {

    IN1 = 0;

    IN2 = 0;

    IN3 = 0;

    IN4 = 0;

    PWM1=0;

    PWM2=0;

    }


    APPENDIX D

    GUI AND IMAGE PROCESSING

    GRAPHIC USER INTERFACE

    Imports AForge.Video.DirectShow

    Imports AForge.Imaging.Filters

    Public Class Form1

    ' create filter

    Dim colorFilter As New ColorFiltering()

    Dim WithEvents serialPort As New IO.Ports.SerialPort

    Dim image As Bitmap

    Private Sub Button1_Click(ByVal sender As System.Object, ByVal e As

System.EventArgs) Handles Button1.Click
load_device()

    GroupBox1.Enabled = False

    Label16.Visible = True

    Button3.Enabled = False

    GroupBox5.Enabled = False

    End Sub

    Private Sub Form1_HandleDestroyed(ByVal sender As Object, ByVal e As

    System.EventArgs) Handles Me.HandleDestroyed

    VideoSourcePlayer1.Stop()

    load_device2()

    VideoSourcePlayer1.Dispose()

    SerialPort1.Close()

    End Sub

    Private Sub Form1_KeyDown(ByVal sender As Object, ByVal e As

    System.Windows.Forms.KeyEventArgs) Handles Me.KeyDown

    If SerialPort1.IsOpen = True Then


    Select Case e.KeyCode

    Case Keys.Right

    ' Transmit data

    serialPort.Write(CChar("R"))

    serialPort.Write(CChar("R"))

    serialPort.Write(CChar("R"))

    e.Handled = True

    Exit Select

    Case Keys.Left

    ' Transmit data

    serialPort.Write(CChar("L"))

serialPort.Write(CChar("L"))
serialPort.Write(CChar("L"))

    e.Handled = True

    Exit Select

    Case Keys.Up

    ' Transmit data

    serialPort.Write(CChar("M"))

    serialPort.Write(CChar("M"))

    serialPort.Write(CChar("M"))

    e.Handled = True

    Exit Select

    Case Keys.Down

    ' Transmit data

    serialPort.Write(CChar("S"))

    serialPort.Write(CChar("S"))

    serialPort.Write(CChar("S"))

    e.Handled = True

    Exit Select

    End Select

    End If

    End Sub

    Private Sub Form1_Load(ByVal sender As System.Object, ByVal e As

    System.EventArgs) Handles MyBase.Load

    Label16.Visible = False

    load_device2()


    Button1.Enabled = False

    Button2.Enabled = False

    For i As Integer = 0 To _

    My.Computer.Ports.SerialPortNames.Count - 1

    ComboBox1.Items.Add( _

    My.Computer.Ports.SerialPortNames(i))

    Next

    End Sub

    Private Sub Button1_Click_1(ByVal sender As System.Object, ByVal e As

    System.EventArgs) Handles Button2.Click

    VideoSourcePlayer1.Stop()

    load_device2()

PictureBox1.Image = Nothing
GroupBox1.Enabled = True

    Label16.Visible = False

    Button3.Enabled = True

    GroupBox5.Enabled = True

    End Sub

    Private Sub VideoSourcePlayer1_NewFrame(ByVal sender As Object, ByRef

    image As System.Drawing.Bitmap) Handles VideoSourcePlayer1.NewFrame

    image = VideoSourcePlayer1.GetCurrentVideoFrame

    If RadioButton3.Checked = True Then

    If Me.InvokeRequired() Then

    Me.BeginInvoke(New MethodInvoker(AddressOf

    load_color_filtered_image))

    Else

    load_color_filtered_image()

    End If

ElseIf RadioButton4.Checked = True Then
If Me.InvokeRequired() Then

    Me.BeginInvoke(New MethodInvoker(AddressOf

    load_threshold_filtered_image))

    Else

    load_threshold_filtered_image()

    End If

    End If

    End Sub


Private Sub TextBox3_KeyPress(ByVal sender As Object, ByVal e As
System.Windows.Forms.KeyPressEventArgs) Handles TextBox3.KeyPress

    Dim num As Byte

    If Not Char.IsDigit(e.KeyChar) Then e.Handled = True

    If TextBox3.Text.Length = 3 Then e.Handled = True

    If e.KeyChar = Chr(8) Then e.Handled = False 'allow Backspace

    If Byte.TryParse(TextBox3.Text, num) = False Then

    TextBox3.Clear()

    TextBox3.Focus()

    Else

    If e.KeyChar = Chr(13) Then TextBox4.Focus() 'Enter key moves to

    specified control

    End If

    End Sub

    Private Sub TextBox1_KeyPress(ByVal sender As Object, ByVal e As

System.Windows.Forms.KeyPressEventArgs) Handles TextBox1.KeyPress
Dim num As Byte

    If Not Char.IsDigit(e.KeyChar) Then e.Handled = True

    If TextBox1.Text.Length = 3 Then e.Handled = True

    If e.KeyChar = Chr(8) Then e.Handled = False 'allow Backspace

    If Byte.TryParse(TextBox1.Text, num) = False Then

    TextBox1.Clear()

    TextBox1.Focus()

    Else

    If e.KeyChar = Chr(13) Then TextBox2.Focus() 'Enter key moves to

    specified control

    End If

    End Sub

    Private Sub TextBox2_KeyPress(ByVal sender As Object, ByVal e As

    System.Windows.Forms.KeyPressEventArgs) Handles TextBox2.KeyPress

    Dim num As Byte

    If Not Char.IsDigit(e.KeyChar) Then e.Handled = True

    If TextBox2.Text.Length = 3 Then e.Handled = True

    If e.KeyChar = Chr(8) Then e.Handled = False 'allow Backspace

If Byte.TryParse(TextBox2.Text, num) = False Then
TextBox2.Clear()

    TextBox2.Focus()

    Else

    If e.KeyChar = Chr(13) Then TextBox3.Focus() 'Enter key moves to

    specified control

    End If

    End Sub

    Private Sub TextBox1_Leave(ByVal sender As Object, ByVal e As

    System.EventArgs) Handles TextBox1.Leave


    Dim num As Byte

    If Byte.TryParse(TextBox1.Text, num) = False Then

    TextBox1.Clear()

    TextBox1.Focus()

    End If

    End Sub

    Private Sub TextBox2_Leave(ByVal sender As Object, ByVal e As

    System.EventArgs) Handles TextBox2.Leave

    Dim num As Byte

    If Byte.TryParse(TextBox2.Text, num) = False Then

    TextBox2.Clear()

    TextBox2.Focus()

    End If

    End Sub

Private Sub TextBox3_Leave(ByVal sender As Object, ByVal e As
System.EventArgs) Handles TextBox3.Leave

    Dim num As Byte

    If Byte.TryParse(TextBox3.Text, num) = False Then

    TextBox3.Clear()

    TextBox3.Focus()

    End If

    End Sub

    Private Sub TextBox4_Leave(ByVal sender As Object, ByVal e As

    System.EventArgs) Handles TextBox4.Leave

    Dim num As Byte

    If Byte.TryParse(TextBox4.Text, num) = False Then

    TextBox4.Clear()

    TextBox4.Focus()

    End If

    End Sub

    Private Sub TextBox5_Leave(ByVal sender As Object, ByVal e As

    System.EventArgs) Handles TextBox5.Leave

    Dim num As Byte

If Byte.TryParse(TextBox5.Text, num) = False Then
TextBox5.Clear()

    TextBox5.Focus()

    End If

    End Sub

    Private Sub TextBox6_Leave(ByVal sender As Object, ByVal e As

    System.EventArgs) Handles TextBox6.Leave

    Dim num As Byte

    If Byte.TryParse(TextBox6.Text, num) = False Then

    TextBox6.Clear()

    TextBox6.Focus()


    End If

    End Sub

    Private Sub RadioButton1_CheckedChanged(ByVal sender As System.Object,

    ByVal e As System.EventArgs) Handles RadioButton1.CheckedChanged

    colorFilter.FillOutsideRange = True

    End Sub

    Private Sub RadioButton2_CheckedChanged(ByVal sender As System.Object,

    ByVal e As System.EventArgs) Handles RadioButton2.CheckedChanged

    colorFilter.FillOutsideRange = False

    End Sub

    Private Sub Button3_Click(ByVal sender As System.Object, ByVal e As

    System.EventArgs) Handles Button3.Click

Button4.Enabled = False
Button1.Enabled = True

    Application.DoEvents()

    If Button3.Text = "Connect" Then

    'Check whether serial port is initially open or not

    If SerialPort1.IsOpen Then

    SerialPort1.Close()

    End If

    If ComboBox1.Text = Nothing Then

    Button4.Enabled = True

    MsgBox("Please Choose your Comm Port",

    MsgBoxStyle.Critical)

    Else

    Try

    With SerialPort1

    .PortName = ComboBox1.SelectedItem

    .BaudRate = 115200

    .Parity = IO.Ports.Parity.None

    .DataBits = 8

    .StopBits = IO.Ports.StopBits.One

    End With

    ' Set the read/write timeouts

    SerialPort1.ReadTimeout = 1000

    SerialPort1.WriteTimeout = 1000

    SerialPort1.Open()

    Button3.Text = "Connected /" & Environment.NewLine &


    "Disconnect"

    Label15.Visible = True

    Label15.Text = "Baud Rate:115200"

    RichTextBox1.AppendText(Environment.NewLine)

    RichTextBox1.AppendText("Port Connected :" &

    ComboBox1.SelectedItem)

    Catch ex As Exception

    MsgBox(ex.ToString)

    End Try

    End If

    ElseIf Button3.Text = "Connected /" & Environment.NewLine &

"Disconnect" Then
Button1.Enabled = False

    Button4.Enabled = True

    Button3.Enabled = True

    Button3.Text = "Connect"

    SerialPort1.Close()

    Label15.Visible = False

    RichTextBox1.AppendText(Environment.NewLine)

    RichTextBox1.AppendText("Port " & ComboBox1.SelectedItem & "

    disconnect ")

    End If

    End Sub

    Private Sub Button4_Click(ByVal sender As System.Object, ByVal e As

    System.EventArgs) Handles Button4.Click

    ComboBox1.ResetText()

    ComboBox1.Items.Clear()

    For i As Integer = 0 To _

My.Computer.Ports.SerialPortNames.Count - 1
ComboBox1.Items.Add( _

    My.Computer.Ports.SerialPortNames(i))

    Next

    End Sub

    Private Sub Button5_Click(ByVal sender As System.Object, ByVal e As

    System.EventArgs)


    SerialPort1.Close()

    SerialPort1.Open()

    Try

    SerialPort1.Write(CChar("L"))

    SerialPort1.Write(CChar("L"))

    SerialPort1.Write(CChar("L"))

    Label16.Text = "Moving to Left"

    Catch ex As Exception

    MsgBox(ex.ToString)

    End Try

    End Sub

    End Class

    ********************************************************************

    DESIGNER FORM

    ********************************************************************

    Public Sub load_device()

    Try

    Dim video_device = New

    FilterInfoCollection(FilterCategory.VideoInputDevice)

    If video_device.Count = 0 Then

    Throw New ApplicationException()

    Else

For Each device As FilterInfo In video_device
Label1.Text = "Device Connected"

    Label2.Text = "Device:" & device.Name

    Label3.Text = "Displaying Image"

    Button1.Enabled = False

    Button2.Enabled = True

    Next

    End If

    ' create video source

    Dim videoSource As New

    VideoCaptureDevice(video_device(0).MonikerString)


    VideoSourcePlayer1.VideoSource = videoSource

    videoSource.DesiredFrameSize = New Size(320, 240)

    VideoSourcePlayer1.AutoSizeControl = False

    'New Size(160, 120)

    ' start the video source

    VideoSourcePlayer1.Start()

    Catch e1 As ApplicationException

    Label1.Text = "No Local Device Detected"

    Label2.Text = "Device:(Not Available)"

    Label3.Text = "Please connect a Webcam."

    Button2.Enabled = False

    End Try

    End Sub

    Public Sub load_device2()

    Try

    Dim video_device = New

    FilterInfoCollection(FilterCategory.VideoInputDevice)

    If video_device.Count = 0 Then

    Throw New ApplicationException()

    Else

    For Each device As FilterInfo In video_device

    Label1.Text = "Device Connected"

    Label2.Text = "Device:" & device.Name

    Label3.Text = "Select Com Port"

Button2.Enabled = False
Button1.Enabled = True

    Next

    End If

    Catch e1 As ApplicationException

    Label1.Text = "No Local Device Detected"

    Label2.Text = "Device:(Not Available)"

    Label3.Text = "Please connect to a Webcam"

    Button2.Enabled = False

    End Try

    End Sub
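    The tracking routine later in this listing steers the robot by comparing the tracked blob's x-centre against a dead band of 160–190 pixels, roughly the middle of the 320-pixel-wide frame: to the right of the band it sends "R", to the left "L", and inside it "M" (forward). As a language-neutral illustration of that decision rule (the function names below are illustrative, not part of the thesis code), it can be sketched in Python:

    ```python
    def steering_command(object_x, left=160, right=190):
        """Map the blob's x-centre (0-319 for a 320x240 frame) to a
        serial command, mirroring the thresholds used in the listing:
        'R' = turn right, 'L' = turn left, 'M' = move forward."""
        if object_x > right:
            return "R"
        if object_x < left:
            return "L"
        return "M"

    def blob_centre(x, y, w, h):
        """Centre of a bounding rectangle, as computed in the listing
        from objectrect.X + Width/2 and objectrect.Y + Height/2."""
        return (x + w // 2, y + h // 2)
    ```

    For example, a blob whose rectangle is (100, 50, 40, 20) has centre x = 120, which falls left of the dead band and yields an "L" command.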


    Public Sub load_color_filtered_image()

    ' configure the filter

    colorFilter.Red = New IntRange(CInt(TextBox1.Text), CInt(TextBox2.Text))

    colorFilter.Green = New IntRange(CInt(TextBox3.Text), CInt(TextBox4.Text))

    colorFilter.Blue = New IntRange(CInt(TextBox5.Text), CInt(TextBox6.Text))

    ' apply the filter

    Dim img1 As Bitmap = Me.Invoke(Function() VideoSourcePlayer1.GetCurrentVideoFrame)

    Dim objectImage As Bitmap = Me.Invoke(Function() colorFilter.Apply(img1))

    ' create blob counter and configure it

    Dim blobCounter As New BlobCounter()

    blobCounter.MinWidth = 25 ' set minimum size of

    blobCounter.MinHeight = 25 ' objects we look for

    blobCounter.FilterBlobs = True ' filter blobs by size

    blobCounter.ObjectsOrder = ObjectsOrder.Size ' order found object by size

    ' grayscaling

    Dim grayFilter As New Grayscale(0.2125, 0.7154, 0.0721)

    Dim grayImage As Bitmap = grayFilter.Apply(objectImage)

    ' locate blobs

    blobCounter.ProcessImage(objectImage)

    Dim rects() As Rectangle = blobCounter.GetObjectsRectangles()

    ' draw rectangle around the biggest blob

    If rects.Length > 0 Then

    Dim objectRect As Rectangle = rects(0)

    Dim g As Graphics = Graphics.FromImage(objectImage)

    Using pen As New Pen(Color.FromArgb(160, 255, 160), 3)

    g.DrawRectangle(pen, objectRect)

    End Using

    g.Dispose()

    End If

    PictureBox1.Image = Me.Invoke(Function() objectImage)

    If (rects.Length <> 0) Then


    Dim objectrect As Rectangle = rects(0)

    Dim object_x As Integer = objectrect.X + objectrect.Width / 2

    Dim object_y As Integer = objectrect.Y + objectrect.Height / 2

    Me.Invoke(Function() colorFilter.Apply(img1))

    Label7.Text = "X:" & Me.Invoke(Function() object_x)

    Label8.Text = "Y:" & Me.Invoke(Function() object_y)

    If (190 < object_x) Then

    Try

    SerialPort1.Write(CChar("R"))

    SerialPort1.Write(CChar("R"))

    SerialPort1.Write(CChar("R"))

    Label16.Text = "Moving to Right"

    Catch ex As Exception

    MsgBox(ex.ToString)

    End Try

    ElseIf (160 > object_x) Then

    Try

    SerialPort1.Write(CChar("L"))

    SerialPort1.Write(CChar("L"))

    SerialPort1.Write(CChar("L"))

    Label16.Text = "Moving to Left"

    Catch ex As Exception

    MsgBox(ex.ToString)

    End Try

    ElseIf (object_x >= 160 AndAlso object_x <= 190) Then

    Try

    SerialPort1.Write(CChar("M"))

    SerialPort1.Write(CChar("M"))

    SerialPort1.Write(CChar("M"))

    Label16.Text = "Moving Forward"

    Catch ex As Exception

    MsgBox(ex.ToString)

    End Try

    End If

    Else

    Try

    SerialPort1.Write(CChar("S"))

    SerialPort1.Write(CChar("S"))

    SerialPort1.Write(CChar("S"))
