
IMAGE TRACING SYSTEM Proposal for ECSE-4460 Control Systems Design

Jason Duarte Azmat Latif

Stephen Sundell Tim Weidner

February 22, 2006

Rensselaer Polytechnic Institute


Abstract

This report describes the design approach for building a laser image-tracing system with two degrees of freedom for pan and tilt motion. The design model parameters are obtained through MATLAB and SolidWorks. The system is subdivided into three major components: image analysis, trajectory planning, and control design. The image analysis identifies the image to be traced, the trajectory planning returns the joint trajectories that follow the desired path, and the control design positions the links at the desired locations. Friction modeling and verification are also discussed. The design objectives are to traverse an image at a rate of 1 ft/sec with 0% overshoot and no steady-state error.


Table of Contents

1. Introduction
2. Specifications
3. Objectives
4. Design Strategy
   4.1 Image Processing
   4.2 Trajectory Planning
   4.3 Control System Design
5. Plan of Action
6. Verification and Tolerance
7. Cost Analysis
8. Schedule
9. Bibliography
Appendix
   A. Robotics Diagram
   B. Output generated by pantilt.m
   C. Pan Assembly (Body A)
   D. Tilt Assembly (Body B)
   E. Laser Mounting Assembly
   F. Pan Tilt Assembly
   G. Mass Properties for Pan Axis (Body A)
   H. Mass Properties for Tilt Axis (Body B)
   I. Motor Datasheet
Statement of Contribution


1. Introduction

The goal of this project is to create a shape-tracing laser system capable of tracing patterns consisting of curves and polygons. There are several similar designs that are used in the industry today. Predominant applications are robots used for spray-painting, arc welding, and laser cutting.

One clear advantage of using robotic manipulators is that the required task can typically be completed more quickly and efficiently. The combination of minimal fixturing and high execution speed can also be economically beneficial for companies. In addition, several applications of robots may also save lives. For example, a worker can develop respiratory irritation and metabolic toxicity from inhaling paint components and spray drift. It is therefore desirable to use a robot to create a safer work environment.

Most industrial robot applications use CAD drawings with a color-coded scheme to highlight the curves to be cut or etched. Our design will instead use a webcam to identify the image to be traced. We will then perform several image analysis functions to determine the robot's trajectory.

One similar design that we found was Team 5 of 2004's Signature machine. Their project used a black Sharpie marker to write words and trace basic shapes, such as a circle and a square, on a seven-by-seven-inch area. Our design, however, uses a laser pointer, which never contacts the surface.

We intend to mount the laser perpendicular to an initially vertical plate that rotates about two axes to control the direction of the beam. Each axis will be driven by an electric motor, and each motor will be controlled in software using both position and image feedback (see the Appendix for the system layout). By using two forms of feedback, we intend for our design to achieve high accuracy and repeatability for every image. The laser pointer will trace an image at a speed of 1 ft/sec.


2. Specifications

When beginning the design process, specifications must be established so that the requirements the system needs to meet are clearly understood. The pan-tilt motion range, speed, and accuracy are all vital specifications for the laser tracing system.

To keep the range-of-motion specification simple, the laser will be positioned directly opposite the center of the 3' x 3' image, exactly 3 feet away.

Figure 1: Laser range of motion on the tracing plane.

This ensures that the maximum range of motion needed in opposite directions is equal. Simple trigonometry gives the maximum angular excursion of the laser from the center of the image for either the pan or the tilt motion.
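As a quick check of this geometry (with the laser pivot opposite the image center, as specified above), the half-range of each axis is

$$\theta_{\max} = \tan^{-1}\!\left(\frac{1.5\ \text{ft}}{3\ \text{ft}}\right) \approx 26.565^{\circ}$$

which is the half-angle used in the speed calculation below.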


Figure 2: Pan and tilt angular range of motion.

Team 1 from the 2003 Control Systems Design course [1] produced a vision tracking system with a tracking speed of 1 ft/sec from 10 feet away. We plan to match that tracking speed at only 3 feet away. To reach a speed of 12 in/sec, the required angular velocity of the laser is calculated as:

$$\frac{12\ \text{in}}{1\ \text{sec}} \times \frac{26.565^{\circ}}{18\ \text{in}} = 17.71^{\circ}/\text{sec} \qquad (1)$$

or

$$17.71^{\circ}/\text{sec} \times \frac{\pi\ \text{rad}}{180^{\circ}} = 0.309\ \text{rad/sec} \qquad (2)$$

This angular speed must now be compared against the gear ratio of the motor/gear assembly originally installed on the system. Both the tilt and pan motors are Pittman GM8724S010 24 VDC motors with a 6.3:1 reduction ratio. The link gears have a radius of 1.2" while the motor gears have a radius of 0.45", so the gear ratio is 2.666. To convert the maximum angular velocity of the laser to the angular velocity of the motor, we use:

$$r_1\,\dot{\theta}_m = r_2\,\dot{\theta}_l \qquad (3)$$

The motor angular velocity is calculated as 0.824 rad/sec, or 7.869 rpm. This velocity is then used in the maximum current and torque equations:

$$i_{\max} = \left[\,e_{\max} - K_e\,N\,\dot{\theta}_{m}\,\right]/R \qquad (4)$$

$$\tau_{\max} = N\,K_t\,i_{\max} \qquad (5)$$

Constants can be found in the motor datasheet located in the Appendix. The calculated maximum current is 1.39 A, which corresponds to the motor speed-torque curve below. The calculated maximum torque is 54.45 oz-in, which lies only slightly beyond the curve. This verifies that the motor supplied with the system can support our design.
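The angular-velocity side of this calculation can be reproduced with a short MATLAB check using only the numbers quoted above; the current and torque step additionally requires the datasheet constants and is left out of this sketch.

```matlab
% Reproduce the angular-velocity numbers quoted above (inches, degrees, radians).
v_laser   = 12;                               % desired tracing speed on the plane [in/sec]
half_span = 18;                               % half-width of the 3' x 3' image [in]
half_ang  = 26.565;                           % corresponding half-angle [deg]

w_laser_deg = v_laser * half_ang / half_span; % ~17.71 deg/sec, eq (1)
w_laser     = w_laser_deg * pi / 180;         % ~0.309 rad/sec, eq (2)

r_link  = 1.2;                                % link gear radius [in]
r_motor = 0.45;                               % motor gear radius [in]
w_motor = (r_link / r_motor) * w_laser;       % ~0.824 rad/sec, from eq (3)
rpm     = w_motor * 60 / (2*pi);              % ~7.87 rpm

fprintf('laser: %.2f deg/s, motor: %.3f rad/s (%.2f rpm)\n', w_laser_deg, w_motor, rpm);
```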

Figure 3: Torque speed curve of the Pittman GM8724S010 motor.


3. Objectives

The goal of the image tracing system will be to initially control a laser to follow a simple path by tracing the boundaries of an image. From there, the goal will be to expand the capabilities of the system so it can trace more complex paths. The system should be able to utilize both position and image feedback in accomplishing these goals.

Initially, the team intended to design a control system that would use only one form of feedback. Encoder feedback from the motor was considered first, but this provides information relating only to the position and speed of the motor. Image analysis and feedback was then considered because it can account for disturbances in the overall system. Since each has its own advantages, careful planning will be needed to utilize both.

This project may lead to a new method of hybridizing multiple types of feedback. In turn, our system may be useful in manufacturing by identifying sources of disturbance and by extending the service life of manufacturing systems. Such a system could also be provided at relatively low cost by offering optimization and upgrade services.

Several challenges can be foreseen in the design and implementation of this project. There will be a tradeoff between speed and accuracy, and trajectory planning and generation will be critical to optimizing these parameters. Writing efficient trajectory-generation algorithms will be challenging.

Image processing provides another challenge. Currently, our team is becoming familiarized with image processing functions and techniques. This aspect of the project involves many tasks in the completion of our system. To simplify this subsystem, all image processing will be done using LabView; however, integrating the image processing with the control system provides a difficult challenge.

Hybridization of image and encoder feedback will also provide a challenge. It will be necessary to determine the strengths of each type of feedback. Most likely, each will be used separately during the testing of the system, and combining the two with our control system will be the final step.


4. Design Strategy

4.1 Image Processing

For our project, we need a way to extract a path around an image at a distance of three feet from our machine. This would require a source of visual feedback such as a webcam. The images retrieved from the camera will be processed using LabView’s Vision module.

The image processing will result in a set of coordinates, or path definitions, of all the lines in the image. One method would be listing all the pixels that are colored black, and then sorting them in the order to be traced. Another option would be to use edge detection to find all of the lines and arcs in the image. We will most likely use the edge detection tools in the IMAQ toolset. Once we have the edges, we will know the path that our laser must follow.
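As a rough illustration of the first option (listing the black pixels and ordering them into a traceable sequence), the sketch below uses a simple greedy nearest-neighbor pass. The function and variable names are ours, and the actual implementation would live in LabView rather than MATLAB.

```matlab
function ordered = orderPixels(pts)
    % Greedy nearest-neighbor ordering of an N-by-2 list of (row, col) pixel coordinates.
    ordered   = zeros(size(pts));
    remaining = pts;
    current   = remaining(1, :);                 % start from the first listed pixel
    for k = 1:size(pts, 1)
        % squared distance from the current pixel to every remaining pixel
        d = (remaining(:,1) - current(1)).^2 + (remaining(:,2) - current(2)).^2;
        [~, idx] = min(d);
        current  = remaining(idx, :);
        ordered(k, :) = current;                 % append the closest pixel to the path
        remaining(idx, :) = [];                  % mark it as visited
    end
end
```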

In order to obtain correct angles for the motors, we will have the camera image mapped to a coordinate frame. Each pixel will represent a certain distance. Using the viewing angle and the distance from the image we will find the correct mapping for our image to a coordinate system. We will then convert our set of lines or points into this coordinate frame.
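A minimal sketch of this mapping is shown below. It assumes the camera image spans the full 3' x 3' plane and that the laser pivot sits opposite the plane's center, 3 feet away (both from Section 2); the function name, variable names, and the pan-then-tilt geometry are our own simplifications.

```matlab
function [pan, tilt] = pixelToAngles(u, v, imgWidth, imgHeight)
    % Map a pixel (u, v) to pan/tilt angles in degrees.
    planeSize = 36;                                   % traced plane is 36 in square
    dist      = 36;                                   % laser pivot to plane, 3 ft [in]
    x = (u - imgWidth/2)  * planeSize / imgWidth;     % inches right of plane center
    y = (imgHeight/2 - v) * planeSize / imgHeight;    % inches above plane center
    pan  = atand(x / dist);                           % pan angle, |pan| <= ~26.6 deg
    tilt = atand(y / sqrt(dist^2 + x^2));             % tilt measured after panning
end
```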

We are also considering the use of image feedback to determine the position of the laser itself. We would look for a red dot overlaying an area that needs to be traced; if it is off target, we would correct the movement. Combining this with the position feedback from the encoders, we can control the path of the laser more accurately.


4.2 Trajectory Planning

According to Groover [2], if the speed of each joint can be controlled independently then the robotic manipulator is capable of continuous path motion control. Continuous path motion is used when there is a definite geometric path to follow such as spray painting. A more primitive method involves pick and place operations such as those used for spot welding.

The path of a robot describes the position and orientation of the manipulator. A trajectory contains information about the path plus the desired speed at which the robot traverses along that path. The velocity and acceleration profile must be kept within the actuator limits in order for the robot to follow a given path.

Trajectory planning can be conducted in joint variable space or in Cartesian space [3]. The advantage of planning in joint variable space is that the planning is done directly with the controlled variables (the joint angles), making it the simpler approach. However, most manipulation tasks are specified in terms of Cartesian world coordinates. For example, our design uses a webcam, which does not provide information in terms of joint variables; it specifies the desired position of the end-effector (the laser pointer).

The desired trajectory can be represented in terms of a vector $w \in \mathbb{R}^6$ known as the tool configuration vector. This vector is defined as:

$$w = \begin{bmatrix} P \\ R \end{bmatrix} \qquad (6)$$

where $P$ and $R$ represent the position and orientation of the end-effector relative to the base frame. For our design we can consider the vector as lying in $\mathbb{R}^3$, since we are only controlling the position of the laser pointer and not its orientation.

Given a desired position, we must determine the corresponding joint angles. The procedure for mapping from Cartesian space to joint space is known as inverse kinematics. The inverse kinematics, which must be performed in real time, can be computationally intensive, lead to longer control intervals, and often return multiple solutions [4]. For example, Figure 4 shows two configurations of joint angles that arrive at the same point on the block.


Figure 4: Non-unique inverse kinematics solutions (elbow-up and elbow-down configurations).

Use of inverse kinematics will enable us to obtain the joint trajectories and then differentiate them to determine the rate at which the individual joints should be driven. From this derivation, we will also be able to determine the maximum velocity and torque for each joint.
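For the two-axis pan-tilt head this mapping has a simple closed form. As an illustration only (assuming a pan-then-tilt joint ordering and the 3 ft standoff $d$ from Section 2, with $(x, y)$ measured on the tracing plane from its center), one plausible formulation is

$$\theta_1 = \tan^{-1}\!\left(\frac{x}{d}\right), \qquad \theta_2 = \tan^{-1}\!\left(\frac{y}{\sqrt{d^2 + x^2}}\right)$$

and differentiating gives the joint rates, for example $\dot{\theta}_1 = \dfrac{d\,\dot{x}}{d^2 + x^2}$. The exact expressions will follow from the SolidWorks kinematic model.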

Types of trajectory motion

A simple example of trajectory planning is straight-line motion control. There are a variety of applications in which a tool is required to follow a straight path; examples include tracking a conveyor belt and insertion operations [2]. In many of these applications the robot travels at a constant speed. Given an initial and a final tool configuration $\{w_0, w_f\}$, a general straight-line trajectory for the tool can be represented as:

$$w(t) = [\,1 - s(t)\,]\,w_0 + s(t)\,w_f, \qquad 0 \le t \le T \qquad (7)$$

Here $s(t)$ is a speed distribution function which specifies the speed at which the tool moves along the straight-line path. For uniform straight-line motion the speed function is simply:

$$s(t) = \frac{t}{T} \qquad (8)$$
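A minimal MATLAB sketch of eqs (7) and (8) for the pan-tilt case is shown below, treating $w_0$ and $w_f$ as position vectors on the tracing plane; the function and variable names are ours, not project code.

```matlab
function W = straightLine(w0, wf, T, nSteps)
    % Uniform straight-line trajectory between tool configurations w0 and wf.
    t = linspace(0, T, nSteps);          % time samples over [0, T]
    s = t / T;                           % uniform speed profile, eq (8)
    W = zeros(numel(w0), nSteps);
    for k = 1:nSteps
        W(:, k) = (1 - s(k)) * w0(:) + s(k) * wf(:);   % eq (7), one waypoint per sample
    end
end
```

Sampling $w(t)$ this way and passing each waypoint through the inverse kinematics yields the corresponding joint-space trajectory.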

For straight-line motion the path is explicitly and completely specified. In many instances this is not the case; instead, only knot points such as the end points and intermediate points are specified. The task then is to interpolate between the knot points and produce a smooth path. Several techniques are available for interpolated motion, including cubic polynomial paths, piecewise-linear interpolation, and spline interpolation. We will consider these trajectory planning methods for our design.

In addition, we will look at manual leadthrough programming [2], in which the laser is moved by hand through the desired motion pattern. The position of the laser will be sampled and recorded using the encoder values. This technique is a convenient and simpler method of "teaching" the robot to follow a desired trajectory.


4.3 Control System Design

The purpose of a controller is to compare the actual output to the desired input and provide a control signal that reduces the error to zero, or as close to zero as possible [4]. Initially we will test how accurately we can position each link. The input to our system will therefore be the desired joint angles $[\theta_1, \theta_2]$ calculated from the inverse kinematics.

A common approach to robot control, used in many commercial robots, is single-axis PID control [2]. To describe the PID control, let $r(t)$ be the desired joint angles and $q(t)$ the actual joint angles. The error is therefore

$$e(t) = r(t) - q(t) \qquad (9)$$

Ideally the error should be zero, but in practice it varies, particularly when the reference input $r(t)$ is changing. A common technique used to control a robot is to employ $n$ independent controllers, one for each joint [2]. The general control law is given by:

$$\tau(t) = K_P\,e(t) + K_I \int_0^t e(\tau)\,d\tau + K_D\,\dot{e}(t) \qquad (10)$$

where the gains $\{K_P, K_I, K_D\}$ are $n \times n$ diagonal matrices, indicating that each axis is controlled independently. For our design the gain matrices will be $2 \times 2$. Figure 5 shows a block diagram for a single-axis PID controller.

Figure 5: A one-dimensional PID controller.

A PID controller provides quick response, good control of system stability, and low steady-state error. These characteristics along with our familiarity with PID controllers make this type of controller an attractive initial approach.
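As an illustration of eq (10) in discrete time, the sketch below applies two independent PID loops to the pan and tilt errors. The function name, the state structure, and the use of a fixed sample period are our assumptions; gains would be replaced by the tuned values.

```matlab
function [tau, state] = pidUpdate(e, state, Kp, Ki, Kd, Ts)
    % One discrete-time update of two independent PID loops (pan and tilt).
    % e: 2-by-1 joint-angle error, e(t) = r(t) - q(t); Ts: sample period [sec].
    state.integral  = state.integral + e * Ts;                    % running integral term
    derivative      = (e - state.prevError) / Ts;                 % backward-difference derivative
    tau             = Kp*e + Ki*state.integral + Kd*derivative;   % eq (10), applied per axis
    state.prevError = e;                                          % store error for next step
end
```

With diagonal $2 \times 2$ gain matrices this reduces to two decoupled single-axis loops, matching the structure shown in Figure 5.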

The gains of the PID controller must be adjusted to meet our design specifications, and they can be determined and simulated using the root locus. A design model always contains some unmodeled dynamic terms; friction terms in particular are difficult to model and are always present in practical designs. Therefore, the gains from our simulation will be experimental, although we can model the friction and feed it forward to the controller in order to cancel its effect.

Upon successfully controlling the joint positions, we aim to look at more sophisticated controllers, such as an optimal controller. Using an optimal controller we will be able to assign weights to design parameters such as speed and accuracy. We also intend to explore an iterative learning algorithm to accurately control both the joint position and speed.

Model Development

Since the system is quite complex, a precise non-linear model needs to be developed. In this case a Lagrange-Euler dynamic model is considered, so the kinetic and potential energies of the system must be found.
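In the standard Lagrange-Euler formulation, the kinetic energy $T$ and potential energy $V$ define the Lagrangian, and the resulting joint-space model takes the familiar manipulator form:

$$L = T - V, \qquad \frac{d}{dt}\frac{\partial L}{\partial \dot{q}_i} - \frac{\partial L}{\partial q_i} = \tau_i$$

$$M(q)\,\ddot{q} + C(q,\dot{q})\,\dot{q} + g(q) = \tau$$

where, for the pan-tilt head, $q = [\theta_1, \theta_2]^T$, $M(q)$ is the $2 \times 2$ inertia matrix, $C(q,\dot{q})\,\dot{q}$ collects the velocity-coupling (Coriolis/centrifugal) terms, and $g(q)$ is the gravity loading vector; these are the quantities returned by pantilt.m below.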

A SolidWorks model of the system was created and used to provide the mass and inertia properties of both the pan and tilt bodies. The inertia tensors for body A (pan) and body B (tilt) are:

$$I_A = \begin{bmatrix} 0.0012 & 0.0000 & -0.0001 \\ 0.0000 & 0.0007 & -0.0001 \\ -0.0001 & -0.0001 & 0.0008 \end{bmatrix} \ \text{kg}\cdot\text{m}^2 \qquad I_B = \begin{bmatrix} 0.0003 & -0.0001 & -0.0000 \\ -0.0001 & 0.0006 & 0.0000 \\ -0.0000 & 0.0000 & 0.0007 \end{bmatrix} \ \text{kg}\cdot\text{m}^2$$

Masses for both bodies were found to be:

$$m_A = 0.5603\ \text{kg}, \qquad m_B = 0.3611\ \text{kg}$$

The total mass properties for both bodies can be found in the appendix.

Professor Wen's pantilt.m script is used to define the symbolic equations for the system. The script takes the body masses, inertia tensors, and centers of gravity as inputs; running it in MATLAB returns the mass/inertia matrix, velocity coupling matrix, gravity loading vector, and total energy.


5. Plan of Action

The development of the image tracing system can be broken down into three subsystems: image processing, trajectory generation, and control. Each subsystem requires various tasks to be completed before system integration. Figure 6 shows the tasks with respect to each subsystem.

[Figure content: task breakdown chart, organized by subsystem.
Image Processing: acquire image; shape recognition and calibration; resolution/angle correlation; obtain laser position at high sampling frequency; integrate with control system.
Trajectory Generation: research options; inverse kinematics; straight line path (continuous); interpolated motion (continuous); exact path (non-continuous); algorithms; implement in LabVIEW; integrate with control system.
Control System: PID tuning; friction modeling; consider alternative control methods; hybridize position and vision feedback.]

Figure 6: Task breakdown of the image tracing system.

The image processing subsystem will be implemented using the LabView Vision module; Stephen Sundell and Tim Weidner are currently working on this. Image acquisition often requires special cameras and data acquisition cards. To minimize cost, a simple USB camera will be used, which only requires drivers provided free by National Instruments. These drivers and their documentation have already been obtained.

Shape recognition will be the initial step to determine the complexity of the image to be traced, and will be implemented among other calibration steps. Ultimately, the goals of calibration are to store important image properties and to plan which form of trajectory generation will be necessary.


Other tasks involved in the development of the image processing subsystem are to determine the correlation between standard resolution and pan/tilt angles, and to find an acquisition rate that will perform sufficiently with our control system. It is critical to simplify the integration of the image processing with the trajectory planning and control systems.

The tasks involved with trajectory generation are researching available options and writing the algorithms to implement them. To get started, Azmat Latif has done some research on trajectory generation and Stephen Sundell has written some LabView scripts that can successfully trace a circle.

The trajectory planner will take the output of the image processing subsystem as its input. If we understand which options are available, the trajectory planner will be able to decide which type of robot motion is most efficient, although this decision can be quite time consuming. It is therefore our plan to implement straight-line and interpolated motion before attempting the exact path method.

At this point, the control system will be implemented using two PID controllers. There are other options to consider though, such as optimal and hybrid controllers. Stephen Sundell and Tim Weidner can provide information on these controllers since they are applying them in another course. The use of alternate control methods will be decided by the team after the kinematics and friction of the system are determined.

An important step in any design phase is the identification of physical parameters such as friction and gravity loading. Friction is created at the interface of two surfaces and is present in every mechanical system. The friction terms will be present in numerous locations such as the motor, gears, joints etc. The presence of friction can lead to poor system performance. It is therefore important to accurately model the friction terms so that we can design control laws that cancel or significantly reduce the effects of friction [5].
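A common starting point, as covered in [5], is a Coulomb-plus-viscous model for each joint; the form below is one candidate, with coefficients to be identified experimentally rather than values from our system:

$$\tau_f(\dot{\theta}) = f_c\,\mathrm{sgn}(\dot{\theta}) + f_v\,\dot{\theta}$$

where $f_c$ is the Coulomb friction level and $f_v$ the viscous coefficient. Once identified, $\tau_f$ can be fed forward by the controller as described in Section 4.3.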

Jason Duarte has modeled the physical system in SolidWorks and MATLAB. With his help and the team's background in electrical engineering, the control system will be systematically modeled and fine-tuned using MATLAB and LabView to provide the best performance.


6. Verification and Tolerance

In order to complete our design we must first verify the results of each design component.

Trajectory Planning

The trajectory planner will use the techniques discussed above (straight-line motion, interpolated motion, etc.) and inverse kinematics to obtain the joint-space trajectories. The joint trajectories will then be fed into the controller, and we will monitor how accurately the laser traces an image. Depending on the magnitude of deflection, we will be able to determine what alterations are needed in the trajectory planner to improve accuracy.

Control Design

The controller will allow us to position the two links. To verify this portion of the design, a pair of random joint angles $[\theta_1, \theta_2]$ will be fed directly into the controller. The location of the laser pointer corresponding to these joint angles will have been determined beforehand for comparison with its actual position. Based on the link settling time, overshoot, and steady-state error, we will determine which parameters of the controller need to be adjusted. For example, to reduce overshoot and steady-state error, the proportional and integral gains $\{K_P, K_I\}$ of the PID controller will have to be tuned accordingly.
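A minimal sketch of how these metrics could be extracted from a logged step response (time vector t, measured joint angle q, setpoint r) is given below; the 2% settling band and the assumption of a step from zero are ours, not project specifications.

```matlab
function [overshoot, ssError, settleTime] = stepMetrics(t, q, r)
    % Percent overshoot, steady-state error, and settling time from logged data.
    overshoot = max(0, (max(q) - r) / abs(r) * 100);   % percent overshoot (step from 0 to r)
    ssError   = abs(r - q(end));                       % steady-state error at the final sample
    band      = 0.02 * abs(r);                         % assumed 2% settling band
    outside   = find(abs(q - r) > band);               % samples outside the band
    if isempty(outside)
        settleTime = t(1);                             % never leaves the band
    else
        settleTime = t(min(outside(end) + 1, numel(t)));   % first sample after the last excursion
    end
end
```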

Simulation

The current control simulation uses numerical values obtained from running pantiltinit.m, provided by Prof. Wen, which contains our calculated mass, center of mass, and inertia values. The simulated system uses two PID controllers, one each for pan and tilt. A diagram of this raw nonlinear simulation is shown below.

Figure 7: SIMULINK diagram for the pan and tilt nonlinear simulation.


Analysis of the frictional components must still be made; however, scope readings of the step response and error show that minimal tuning will be needed for the controller to operate correctly.

Figure 8: Step response of the pan and tilt axes.

Figure 9: Error plot for the pan and tilt axes.

Image Processing

First, we will need to acquire an image from the camera, which connects to the computer through a USB port. We will initialize an image buffer in LabView and pass it to the snap function in the IMAQ toolset, which captures a frame from the selected device and stores it in the buffer. The image will then be sent on for processing, where various image processing functions will be used to minimize error.

Once we have this information, we will divide the path into multiple sections, depending on the complexity of the image, and test them separately. If it is a simple path with no corners, we will most likely keep it as a single path. If there are corners, it will be easiest to split the path at those points and start a new path there. This will simplify the trajectory planning of the laser.
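One way to split the path at corners is to look for large changes in heading between consecutive segments; the sketch below flags split points using an angle threshold that is our assumption, not a fixed design value.

```matlab
function segments = splitAtCorners(path, angleThreshDeg)
    % Split an ordered N-by-2 path into sub-paths wherever the heading
    % changes by more than angleThreshDeg between consecutive segments.
    d       = diff(path);                             % direction vector of each segment
    heading = atan2d(d(:, 2), d(:, 1));               % heading of each segment [deg]
    turn    = abs(diff(heading));                     % heading change at interior points
    turn    = min(turn, 360 - turn);                  % wrap to the range [0, 180]
    cuts    = find(turn > angleThreshDeg) + 1;        % point indices where corners occur
    bounds  = [1; cuts(:); size(path, 1)];            % boundaries of each sub-path
    segments = cell(numel(bounds) - 1, 1);
    for k = 1:numel(bounds) - 1
        segments{k} = path(bounds(k):bounds(k+1), :); % one sub-path per segment
    end
end
```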

7. Cost Analysis

The cost of developing the system can be broken down into the cost of labor, cost of equipment, and cost of additional items, as shown in Tables 1, 2, and 3.

TABLE 1: COST OF LABOR

Team                          Hours   Cost     Total
Duarte, Jason (engineer)      250     $30/hr   $7,500
Latif, Azmat (engineer)       250     $30/hr   $7,500
Sundell, Stephen (engineer)   250     $30/hr   $7,500
Weidner, Tim (engineer)       250     $30/hr   $7,500
Total                         1000             $30,000

TABLE 2: COST OF EQUIPMENT

Item                                        Qty   Cost        Total        Source
Pittman motor GM8724S010                    2     $100.00     $200.00      Supplied
Pan gear A                                  1     $15.00      $15.00       Supplied
Pan gear B                                  1     $7.50       $7.50        Supplied
Tilt gear A                                 1     $15.00      $15.00       Supplied
Tilt gear B                                 1     $7.50       $7.50        Supplied
Pan belt                                    1     $2.00       $2.00        Supplied
Tilt belt                                   1     $2.00       $2.00        Supplied
LabView 7.1 software                        1     $2,395.00   $2,395.00    Supplied
LabView Real-Time software                  1     $1,995.00   $1,995.00    Supplied
LabView FPGA software                       1     $1,995.00   $1,995.00    Supplied
LabView IMAQ software                       1     $2,995.00   $2,995.00    Supplied
NI cRIO-9004                                1     $1,495.00   $1,495.00    Supplied
NI 1 M gate reconfigurable I/O (RIO) FPGA   1     $1,195.00   $1,195.00    Supplied
NI cRIO-9411                                1     $100.00     $100.00      Supplied
NI cRIO-9263                                1     $100.00     $100.00      Supplied
NI cRIO-9215                                1     $100.00     $100.00      Supplied
NI-9401                                     1     $100.00     $100.00      Supplied
Total                                                         $12,719.00

TABLE 3: COST OF ADDITIONAL ITEMS

Additional Items   Qty   Cost     Total    Source
Laser Pointer      1     $11.00   $11.00   CircuitCity
USB camera         1     $35.00   $35.00   OfficeMax
Total                             $46.00


8. Schedule

A schedule for the remaining weeks of the spring 2006 semester is shown in Table 4. Although tentative, the schedule is realistic and aims for a fully functional design before the demonstration day. It includes specific tasks for developing each subsystem of the design on a week-by-week basis. Each task has a leader identified by their initials, although the entire team will participate in the development of each subsystem. Report, presentation, and demonstration dates are shown on the schedule as well.


TABLE 4: PROPOSED SCHEDULE

2/27-3/3 (Project Proposal Presentation due 3/1)
  Image Processing: Acquire image (SS, TW)
  Trajectory Generation: Research (AL)
  Control System: Become familiarized with system, identify friction (JD)

3/6-3/10
  Image Processing: Determine image analysis functions, begin writing algorithms (SS, AL)
  Trajectory Generation: Research (JD)
  Control System: Become familiarized with system (AL)

3/20-3/24
  Image Processing: Determine link between image resolution and pan-tilt angles, store electronically (JD)
  Trajectory Generation: Research, build model (AL, TW)
  Control System: Tuning and testing (SS)

3/27-3/31 (Progress Report and Presentation due 3/29)
  Image Processing: Modify system to take images at a given sampling rate (TW)
  Trajectory Generation: Build model, write algorithms (AL, JD)
  Control System: Tuning and testing (SS)

4/3-4/7
  Image Processing: Finalize algorithms, integrate with trajectory generation (AL)
  Trajectory Generation: Write algorithms (SS)
  Control System: Tuning and testing (JD, TW)

4/10-4/14
  Image Processing: Integrate with control system (SS)
  Trajectory Generation: Test (TW)
  Control System: Integrate with feedback and trajectory generation (AL, JD)

4/17-4/21 (Demonstration, using demo script, 4/19)
  Test (AL, TW); Integrate with feedback and trajectory generation (JD, SS)

4/24-4/26 (Final Presentation due 4/26)
  Integrate with feedback and trajectory generation (AL, JD, SS, TW)

Final Report and Lab Notebook due 5/3/2006


9. Bibliography

1. 2003 Control Systems Design Team 1, Preliminary Report. http://www.cats.rpi.edu/%7Ewenj/ECSE4962S03/prelimdesign/team1prelimreport.pdf
2. Schilling, Robert J. Fundamentals of Robotics: Analysis & Control. Upper Saddle River, NJ: Prentice Hall, 1990, pp. 140-147, 265-266.
3. Groover, Mikell P., Weiss, Mitchell, Nagel, Roger N., and Odrey, Nicholas G. Industrial Robotics: Technology, Programming, and Applications. McGraw-Hill, 1986, pp. 393-410.
4. Fu, K.S., Gonzalez, R.C., and Lee, C.S.G. Robotics: Control, Sensing, Vision, and Intelligence. McGraw-Hill, 1987, pp. 149-155.
5. Olsson, H., Astrom, K.J., Canudas de Wit, C., Gafvert, M., and Lischinsky, P. "Friction Models and Friction Compensation." Available at http://www.cats.rpi.edu/%7Ewenj/ECSE446S06/astrom_friction.pdf


Appendix

A. Robotics Diagram

[Figure content: robotics system block diagram. Blocks: Image Analyzer, Task Planner, Trajectory Planner, Robot Controller, Arm Dynamics, Arm Kinematics, Position/Velocity Sensor, Environment, Camera, Tool. Signals: I(k, j), reduced data, task, {wk}, motion type, r(t), tau(t), q(t), w(t), Ftool(t).]


B. Output generated by pantilt.m

*** Mass/Inerita Matrix ***
M_11  14482491428700703/9223372036854775808+Im1*N1^2+22353272161815393/9223372036854775808*c2^2+2298086153330699/9223372036854775808*s2*c2
M_12  2347097999413041/4611686018427387904*s2-3701450764297979/147573952589676412928*c2
M_22  1521180990828581/576460752303423488+Im2*N2^2

*** Coriolis/Centrifugal Term ***
1/147573952589676412928*(73538756906582368*c2^2-36769378453291184-715304709178092576*s2*c2)*td2*td1+1/147573952589676412928*(3701450764297979*td2*s2+75107135981217312*td2*c2)*td2
(22353272161815393/9223372036854775808*s2*c2-2298086153330699/9223372036854775808*c2^2+2298086153330699/18446744073709551616)*td1^2

*** Gravity Term ***
0
-10833/400000*g*c2-83053/50000000*g*s2


C. Pan Assembly (Body A)


D. Tilt Assembly (Body B)


E. Laser Mounting Assembly


F. Pan Tilt Assembly


G. Mass Properties for Pan Axis (Body A)

Output coordinate system: -- default --
Density = 2796.7634 kilograms per cubic meter
Mass = 0.5603 kilograms
Volume = 0.0002 cubic meters
Surface area = 0.0680 square meters

Center of mass (meters):
  X = -0.0126   Y = -0.0228   Z = -0.0003

Principal axes of inertia and principal moments of inertia (kilograms * square meters), taken at the center of mass:
  Ix = (0.1102, 0.7740, -0.6235)   Px = 0.0006
  Iy = (-0.0656, 0.6316, 0.7725)   Py = 0.0009
  Iz = (0.9917, -0.0442, 0.1204)   Pz = 0.0012

Moments of inertia (kilograms * square meters), taken at the center of mass and aligned with the output coordinate system:
  Lxx = 0.0012    Lxy = 0.0000    Lxz = -0.0001
  Lyx = 0.0000    Lyy = 0.0007    Lyz = -0.0001
  Lzx = -0.0001   Lzy = -0.0001   Lzz = 0.0008

Moments of inertia (kilograms * square meters), taken at the output coordinate system:
  Ixx = 0.0015    Ixy = 0.0002    Ixz = -0.0001
  Iyx = 0.0002    Iyy = 0.0008    Iyz = -0.0001
  Izx = -0.0001   Izy = -0.0001   Izz = 0.0012


H. Mass Properties for Tilt Axis (Body B)

Output coordinate system: -- default --
Density = 2797.6691 kilograms per cubic meter
Mass = 0.3611 kilograms
Volume = 0.0001 cubic meters
Surface area = 0.0390 square meters

Center of mass (meters):
  X = 0.0750   Y = 0.0151   Z = 0.0046

Principal axes of inertia and principal moments of inertia (kilograms * square meters), taken at the center of mass:
  Ix = (0.9452, -0.3132, -0.0920)   Px = 0.0002
  Iy = (0.3263, 0.9140, 0.2409)     Py = 0.0006
  Iz = (0.0087, -0.2578, 0.9662)    Pz = 0.0007

Moments of inertia (kilograms * square meters), taken at the center of mass and aligned with the output coordinate system:
  Lxx = 0.0003    Lxy = -0.0001   Lxz = -0.0000
  Lyx = -0.0001   Lyy = 0.0006    Lyz = 0.0000
  Lzx = -0.0000   Lzy = 0.0000    Lzz = 0.0007

Moments of inertia (kilograms * square meters), taken at the output coordinate system:
  Ixx = 0.0003    Ixy = 0.0003    Ixz = 0.0001
  Iyx = 0.0003    Iyy = 0.0026    Iyz = 0.0001
  Izx = 0.0001    Izy = 0.0001    Izz = 0.0029


I. Motor Datasheet


Statement of Contribution

Jason Duarte _____________________________________
  Specifications
  Control System Design (Model Development)
  Verification (Simulation)

Azmat Latif ______________________________________
  Abstract
  Introduction
  Design Strategy (Trajectory Planning, Control Design)
  Verification (Trajectory Planning, Control Design)
  Friction Identification (in Plan of Action)

Stephen Sundell ___________________________________
  Design Strategy (Image Processing)
  Verification (Image Processing)

Tim Weidner ______________________________________
  Objectives
  Plan of Action
  Cost
  Schedule