
Software Engineering Project Final Report

Blue Team [1]

Department of Aerospace Engineering, The Pennsylvania State University, University Park, PA 16802

This report serves as the culminating experience for the software engineering project in AERSP 440 Introduction to Software Engineering for Aerospace Engineers. Information about requirements, design, coding, testing, and verification and validation is presented and analyzed. Various reports and key data points are presented, such as UML sequence diagrams and testing reports. The Blue Team found this project to be beneficial to its learning in the course.

I. Introduction

This semester in AERSP 440 Introduction to Software Engineering for Aerospace Engineers, the class has experienced two parallel activities. The first of these activities was traditional instructional methods, namely lectures, homework, and examinations. The latter of these activities consisted of a software engineering project; contained here is the final report for that project. Specifically, we were directed by Dr. Long that:

“The project for this course will be the development of a software/hardware system using a small three-wheeled robot chassis and an onboard Arduino processor, sonar sensor, and wifi camera. Each team will build a robot that will:

- Be controlled thru a user interface on a laptop
- The person controlling the robot will not be in the same room as the competition
- Each robot will have a webcam on it and the person controlling it will see the image from the onboard camera
- Try to find and shoot the other robot using infrared sensor
- Try to evade the other robot
- Whichever team shoots the other team's robot the most wins”

In order to undertake this task, we needed a laptop and a C or C++ compiler. Additionally, Dr. Long provided:

- A robot platform
- An Arduino processor
- An Arduino WiFi board
- An Arduino motor control board
- A wifi camera (Foscam FI8909W)
- Infrared transmitters and receivers
- Batteries

The team noted early on that all of the above provided materials were to be returned at the end of the semester, or the team would have to reimburse Dr. Long for their cost. Our team elected Brad Sottile as its Chief Executive Officer (CEO), Tom Gempp as its Chief Financial Officer (CFO), and Brian Harrell as its Chief Information Officer (CIO).

[1] Brad Sottile, CEO, is currently a graduate student in aerospace engineering. The other members of the team are undergraduate students in aerospace engineering or engineering science. Various members of the team are student members of AIAA and/or IEEE.



II. Chiefs

A. Chief Executive Officer (CEO)

Brad Sottile is the Chief Executive Officer (CEO) for the Blue Team. The role of the CEO is to interact with the other groups to maintain the project's costs and time schedule. The CEO has also been the main point of contact for the customer. Every week, Brad has tried to stay in touch with Dr. Long, the TA, and the other groups in order to mentor the groups and help troubleshoot problems. The CEO has also acted as the Blue Team's designated liaison to the CEO of the White Team, Tim Double. For the sake of space, a detailed Gantt chart reflecting the Team's progress can be found attached to this report. All in all, some tasks were completed ahead of schedule, some tasks ran late, and some tasks were completed right on time. The Blue Team found trying to work ahead to be beneficial, since it gave us a little more time to fall back on when we did have schedule slips. Overall, the CEO is pleased that this project was delivered on time and under budget, a rarity for many software engineering projects.

B. Chief Financial Officer (CFO)

Tom Gempp is the Chief Financial Officer (CFO) for the Blue Team. The CFO is responsible for financial planning and record-keeping, as well as financial reporting to higher management. Typically, the CFO reports directly to the CEO and assists the Chief Operating Officer on all strategic and tactical matters as they relate to budget management, cost-benefit analysis, forecasting needs, and the securing of new funding. Throughout the course of this development, the CFO reported directly to the CEO and updated the teams and client regularly.

Overall, the project was planned to have cost $326,000.00; the final cost of the project was $206,150.00, approximately 37% less than what was expected.

1. Constructive Cost Model (COCOMO)

The Constructive Cost Model (COCOMO), an empirical model based on project experience, is a well-documented, publicly available model which acts independently from a specific software vendor. There are three distinct levels of project complexity that this model is able to represent: simple, moderate, and embedded. This project fell into the simple complexity category, in which the following model can be used:

PM = 2.4 (KDSI)^1.05 × M    (1)

where PM is person-months, KDSI is the thousands of delivered source instructions, and M is a multiplier based on product, project, and team characteristics, each rated on a scale from 1 to 6. The variable M is defined as:

M = PERS × RCPX × RUSE × PDIF × PREX × FCIL × SCED    (2)

where PERS is the personnel capability, RCPX is the product reliability and complexity, RUSE is the reuse requirement, PDIF is the platform difficulty, PREX is the personnel experience, FCIL is the team support facilities, and SCED is the required schedule. The following values were used for each:

PERS  3        PREX  2
RCPX  2        FCIL  1
RUSE  1        SCED  2
PDIF  2

With 1,000 lines of code written (KDSI = 1) and using the COCOMO equation above, the expected effort for this project was 115.2 person-months, which for a team of 35 people works out to 3.29 months per person. This appears high considering that no one was able to work on this project more than part time; however, this overestimate may be a function of our inexperience in software cost estimation.
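As an aside, the estimate above is easy to reproduce. The following short Python sketch (illustrative only; the delivered robot software itself was written in C/C++) plugs the team's driver ratings into Eqs. (1) and (2):

```python
# Reproduces the report's COCOMO estimate: PM = 2.4 * KDSI**1.05 * M,
# where M is the product of the seven cost-driver ratings.

def cocomo_person_months(kdsi, drivers):
    m = 1.0
    for rating in drivers.values():
        m *= rating                      # Eq. (2): M is a simple product
    return 2.4 * kdsi ** 1.05 * m        # Eq. (1)

drivers = {"PERS": 3, "RCPX": 2, "RUSE": 1, "PDIF": 2,
           "PREX": 2, "FCIL": 1, "SCED": 2}

pm = cocomo_person_months(1.0, drivers)  # 1,000 delivered lines -> KDSI = 1
print(round(pm, 1))       # 115.2 person-months
print(round(pm / 35, 2))  # 3.29 months per person for the 35-member team
```

With KDSI = 1 the exponent has no effect, so the 115.2 figure follows directly from 2.4 × 48.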

2. Initial Cost Estimation

The methodology used in conducting the initial cost estimation was the bottom-up approach. This method starts at the component level and estimates the effort required for each component; these values are then added to reach a final estimate. Table 1 shows the breakdown that each of the Chiefs and team leads submitted for estimation.
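The bottom-up arithmetic itself is just a sum over the component estimates. The sketch below (illustrative Python, using the group figures from Table 1) shows how the per-group estimates roll up to the project total:

```python
# Bottom-up estimation: cost each group from (hours/person, head count, rate),
# then sum the components to reach the project total.

estimates = [
    # (group, hours_per_person, members, hourly_rate_usd)
    ("CEO",          180, 1, 400.00),
    ("CFO",          160, 1, 400.00),
    ("CIO",          160, 1, 400.00),
    ("Requirements",  10, 6, 200.00),
    ("Design",        15, 6, 200.00),
    ("Coding",        42, 7, 200.00),
    ("Testing",       17, 6, 200.00),
    ("V&V",           12, 7, 200.00),
]

total_hours = sum(hours * members for _, hours, members, _ in estimates)
total_cost = sum(hours * members * rate for _, hours, members, rate in estimates)
print(total_hours)  # 1130 total man-hours
print(total_cost)   # 326000.0 dollars
```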


Table 1. Initial Blue Team Cost Estimation

Midway through the project, a voluntary reassessment was conducted so that the Chiefs and team leads could better understand the amount of work remaining to complete our mission objective. Table 2 shows the breakdown of the Chiefs' and team leads' estimates for the second half of the project.

Table 2. Midsemester Blue Team Cost Re-Evaluation

3. Operational Target (OPTAR)

Using the operational target (OPTAR) system used predominantly at the United States Naval War College, the budget was able to be monitored more efficiently. This required team leads to submit weekly hours reports for their groups, which were then compiled into a series of interconnected documents. Once the data was loaded into the system, the OPTAR was updated instantaneously. The complete Blue Team OPTAR is shown in Figure 1.

Figure 1. Blue Team OPTAR

Table 1 data (initial estimate, 15 weeks):

Group          Hours/Person   Team Members   Total Man-Hours (hrs)   Hourly Rate ($/hr)   Total Cost      $/wk (15 wks)
CEO            180            1              180                     400.00               $72,000.00      $4,800.00
CFO            160            1              160                     400.00               $64,000.00      $4,266.67
CIO            160            1              160                     400.00               $64,000.00      $4,266.67
Requirements   10             6              60                      200.00               $12,000.00      $800.00
Design         15             6              90                      200.00               $18,000.00      $1,200.00
Coding         42             7              294                     200.00               $58,800.00      $3,920.00
Testing        17             6              102                     200.00               $20,400.00      $1,360.00
V&V            12             7              84                      200.00               $16,800.00      $1,120.00
Total          596            35             1130                    -                    $326,000.00     $21,733.33

Table 2 data (midsemester re-evaluation, 7 weeks):

Group          Hours/Person   Team Members   Total Man-Hours (hrs)   Hourly Rate ($/hr)   Total Cost      $/wk (7 wks)
CEO            55             1              55                      400.00               $22,000.00      $3,142.86
CFO            30             1              30                      400.00               $12,000.00      $1,714.29
CIO            30             1              30                      400.00               $12,000.00      $1,714.29
Requirements   5              6              27                      200.00               $5,400.00       $771.43
Design         5              6              29.5                    200.00               $5,900.00       $842.86
Coding         18             7              127.5                   200.00               $25,500.00      $3,642.86
Testing        10             6              60                      200.00               $12,000.00      $1,714.29
V&V            8              7              54.25                   200.00               $10,850.00      $1,550.00
Total          160            35             413.25                  -                    $105,650.00     $15,092.86


As can be seen in the preceding figure, the overall project was significantly under budget. During the first five weeks of the project, all of the teams were working simultaneously in order to get ahead, led mainly by the Requirements and Design groups. This surge leveled off from weeks five through eight, which can be attributed to the majority of the work being done by the Coding group. There was a steady increase in workflow from weeks nine through eleven. There was a significant surge in fund expenditures from weeks eleven through thirteen as Testing and V&V began a significant increase in workflow; this trend continued up to the completion of the project.
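The weekly roll-up behind the OPTAR can be sketched in a few lines. In this illustrative Python fragment, each weekly hours report is costed at the group's rate and added to a running total; the rates match the budget tables, but the week data here are hypothetical examples rather than the team's actual reports:

```python
# OPTAR-style tracking: weekly hours reports are costed per group and
# accumulated into a running expenditure total.

RATES = {"CEO": 400.0, "Coding": 200.0, "Testing": 200.0}  # $/hr (subset)

def update_optar(running_total, weekly_report):
    """weekly_report maps group name -> hours worked that week."""
    week_cost = sum(hours * RATES[group] for group, hours in weekly_report.items())
    return running_total + week_cost

total = 0.0
total = update_optar(total, {"CEO": 4, "Coding": 20})       # week 1: $5,600
total = update_optar(total, {"Coding": 25, "Testing": 10})  # week 2: $7,000
print(total)  # 12600.0
```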

4. Blue Team Final Cost Breakdown

A complete analysis of the hours and fund expenditures by each of the groups was conducted. The breakdown of each team is shown in Table 3.

Table 3. End of Semester Blue Team Cost Breakdown

As can be seen in Table 3, the overall project was significantly under budget. This can be attributed to using the bottom-up method described above. All of the groups overestimated their number of project hours because of their unfamiliarity with the objective and tasks. The following table and figures show the distribution of the funds expended throughout the course of the project.

Table 4. Cost Breakdown by Expense

Group          Total Work Hours   Hours Estimated   Hours Remaining   Funds Est.      Funds Exp.      Funds Remaining
CEO            52                 180.0             128.00            $72,000.00      $20,800.00      $51,200.00
CFO            46                 160.0             114.00            $64,000.00      $18,400.00      $45,600.00
CIO            38                 160.0             122.00            $64,000.00      $15,200.00      $48,800.00
Requirements   58                 60.0              2.00              $12,000.00      $11,600.00      $400.00
Design         94                 90.0              -4.00             $18,000.00      $18,800.00      ($800.00)
Coding         324                294.0             -30.00            $58,800.00      $64,800.00      ($6,000.00)
Testing        108                102.0             -6.00             $20,400.00      $21,600.00      ($1,200.00)
V&V            174.75             84.0              -90.75            $16,800.00      $34,950.00      ($18,150.00)
Total          894.75             1130.0            235.25            $326,000.00     $206,150.00     $119,850.00

Group          Total Work Hours   Hourly Rate   Funds Exp      % Hours   % Funds
CEO            52                 $400.00       $20,800.00     5.81      10.09
CFO            46                 $400.00       $18,400.00     5.14      8.93
CIO            38                 $400.00       $15,200.00     4.25      7.37
Requirements   58                 $200.00       $11,600.00     6.48      5.63
Design         94                 $200.00       $18,800.00     10.51     9.12
Coding         324                $200.00       $64,800.00     36.21     31.43
Testing        108                $200.00       $21,600.00     12.07     10.48
V&V            174.75             $200.00       $34,950.00     19.53     16.95
Total          894.75             -             $206,150.00    100       100


Figure 2. Project Work Hours Distribution

Figure 3. Project Fund Expenditure Breakdown

C. Chief Information Officer (CIO)

Brian Harrell is the Chief Information Officer (CIO) for the Blue Team. The CIO had several responsibilities over the duration of this project. His primary responsibilities involved setting up a secure website for the Blue Team and monitoring all files and file revisions. He began the semester by setting up a website for the Blue Team, hosted by Google. The website contains a separate page for each step of the software engineering V-cycle (Requirements, Design, Coding, Testing, Validation & Verification), as well as separate pages for the chiefs, finances, Gantt chart, and hours reports. Each page of the website contains all of the files and file revisions for the corresponding V-cycle group. In addition, the chiefs page contains brief descriptions of each chief's role, and the Finances page contains information and graphs related to our projected cost for this project as well as reports on each week's progress and expenses. Finally, the Gantt chart page contains the original Gantt chart as well as updates to the chart as they have been provided by the CEO.

On top of maintaining the website and monitoring access to it, the CIO also spent time backing up and organizing all of our team's files on his own personal external hard drive. He also spent time discussing and monitoring the progress of the project at several meetings with the chiefs and group leads. Overall, throughout the course of this project, he was able to closely monitor access to, and the security of, our website, as well as ensure that all groups had the necessary files and documents needed to complete the project in a timely manner.

III. Requirements

Ty Druce is the lead of the Requirements group. This portion of the report covers all the aspects learned about requirements engineering, including the overall process, CONOPS, requirements elicitation and documentation, and useful charts. This section focuses on lessons learned for each of these aspects. Requirements engineering is a crucial part of the systems engineering process and vital to the success of any program.

A. Reflection and Lessons Learned

1. Requirements Documentation

The first step in requirements engineering was requirements elicitation. This is the process of “requirements gathering” and required collaboration between the customer, program management, the design team, V&V, and the user. It was learned that the coding team also played an integral part in determining requirements, since that team has direct knowledge of the capabilities of the software itself. In future programs, the coding team should be consulted sooner in the process to save time and money.

In requirements documentation, semantics played a crucial role. Many requirements were rewritten several times to obtain the most clear and concise language. It is important that requirements clearly convey what the system should do without imposing overbearing constraints. Finding this balance was difficult, and it took a couple of weeks of practice before good requirements could be written easily. Unclear language can result in misunderstandings and cause the program to deviate from the proper path.

2. Requirements Schematics

Several types of schematics were used to visually represent the requirements and convey a high-level picture of the system. This team utilized system flow charts, UML diagrams, scenarios, and sequence diagrams to help visually portray the system. Software engineering is uniquely difficult since its product is not tangible; it is hard to visualize and track progress. For this reason, it is imperative that visual aids be used to help team members understand the scope of the system. These visuals also help program management track progress and estimate hours and cost. They are also helpful in making sure the customer and the designers are on the same page about the product.

3. Requirements Engineering

One of the leading causes of software engineering failures is poor requirements. Requirements engineering is the first step in the engineering cycle, but also the most critical. Requirements are the cheapest item to fix at the start of a program but the most expensive to fix at the end; therefore, it is critical to invest time and skill into generating solid, clear, and practical requirements. This team learned that the requirements process is iterative in itself. Developing proper language and conveying system and user capabilities both take several iterations.

B. CONOPS and Requirements Document

A Concept of Operations (CONOPS) and formal requirements document was approved by the customer and then utilized by the Architecture and Design team to design the system. This document was also used by the Verification and Validation team to ensure the proper product was built and all the requirements were met. This document has been submitted with this report.

C. Traceability Matrix

A traceability matrix was used to describe the relationships between requirements, their sources, and the system design. It helps to link dependent and related requirements, making it easier to see how a requirement's change propagates to other requirements. Our traceability matrix may be found below in Figure 4.

Figure 4. Traceability Matrix
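To illustrate what the matrix buys us, the sketch below models trace links as a mapping and walks them to find every item a change touches. The requirement and design IDs are invented for illustration and do not correspond to specific entries in Figure 4:

```python
# Sketch of a traceability matrix: each requirement maps to the items that
# depend on it, so a change can be propagated to everything reachable from it.

from collections import deque

# traces[r] = items that depend on requirement r (hypothetical links)
traces = {
    "REQ-1": ["REQ-3", "DES-MOVE"],
    "REQ-2": ["DES-IR"],
    "REQ-3": ["DES-MOVE", "DES-IR"],
}

def impacted_by(requirement):
    """Return every item affected, directly or transitively, by a change."""
    seen, queue = set(), deque([requirement])
    while queue:
        for dep in traces.get(queue.popleft(), []):
            if dep not in seen:
                seen.add(dep)
                queue.append(dep)
    return sorted(seen)

print(impacted_by("REQ-1"))  # ['DES-IR', 'DES-MOVE', 'REQ-3']
```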

D. Viewpoint Charts

When developing a system, it is important to consider all actors who will be involved in using the system. These actors may act on the system directly or indirectly, and they may be tangible or intangible. For example, both the specific user and the manager of a system play separate and unique roles in how they will interact with it. Formal specifications, such as government guidelines, will also play a role in how the system is designed. Figure 5 shows the viewpoint chart for this program.

Figure 5. Viewpoint Chart

E. Scenarios

Scenarios are used to describe real-life examples of how a system can be used. They make it easier for people to understand how the system would react in certain situations. Below is an example scenario created for the program.

1. Initial Assumption

The robot seeks the white team's robot and shoots it with an infrared laser more times than the white team can hit the blue team's robot. Each team's operator has the control laptop in a separate room and can only view the robots through a wifi camera.

2. Normal Operation

The blue team's operator maneuvers the robot and quickly finds the white team's robot. Once found, it will begin firing an infrared laser at it and hit it as many times as possible. If the white team locks onto the blue team's robot, the blue robot will evade it immediately and continue to seek and destroy the white robot. The blue team will hit the white team's robot with the laser more times than the white team hits the blue robot.

3. What Can Go Wrong

The white team locates the blue robot first and begins firing upon it before the blue team can find the white robot, or the white team's robot shoots the blue robot more times than the blue team hits the white robot. In either case, the blue team must immediately perform evasive maneuvers and continue to seek and fire at the white robot. Finally, if the blue robot hits an obstacle or gets stuck, the operator must immediately proceed around the object or reverse out of the situation.

4. Other Activities

The white team may be evading the blue team's lasers remarkably well, or the blue team may keep missing the white robot.

5. System State on Conclusion

The blue team hits the white team’s robot with the laser more times than the white team hits the blue robot and the blue team receives three extra credit points. The robot and the control laptop are shut down.

F. Use Cases

Use cases were used to identify the actors in an interaction and to describe the interaction itself in the Unified Modeling Language (UML). The following is an example of a simple use case describing the task of our current system. The three actors in the diagram are the customer, who makes the demands; the user, who programs the system and controls the robot; and the robot itself, which performs the action. The customer creates instructions for the user. The user then edits the instructions for the robot in a language which the robot can understand. The robot then transmits information back to the user so that the user's instructions may be accurate to the current situation of the robot. The robot also takes the instructions from the user, interprets the commands, and performs the task. Our use case may be found in Figure 6.

Figure 6. Use Case

G. Sequence Diagram

A sequence diagram was used to add detail to use cases by showing the sequence of event processing in the system. A sequence diagram helps in understanding the interaction between the user and the system via the series of events that take place for a given condition. Figure 7 is a general representation of a task described as a sequence of events.

Figure 7. Sequence Diagram

The sequence diagram below shows the interactions between the user, the motherboard, and the robot. It is the responsibility of the team to ensure that the robot gives feedback to the user on whether or not a task can be completed, when it was completed, and what the user can do to correct a possible error. Here it shows whether the commands were accepted, whether the laser was fired, and whether the robot has taken damage from the other robot in the competition.
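The feedback loop in the diagram can be sketched as a small command handler. The command names and reply fields below are invented for illustration; the actual system was implemented in C/C++ on the Arduino:

```python
# Sketch of the command/feedback loop: the user sends a command, and the robot
# replies with whether it was accepted and reports events such as laser fire.

VALID_COMMANDS = {"forward", "reverse", "left", "right", "fire"}

def handle_command(command, robot_state):
    """Return a feedback message for the operator, updating robot_state."""
    if command not in VALID_COMMANDS:
        return {"accepted": False, "error": "unknown command"}
    if command == "fire":
        robot_state["shots_fired"] += 1
        return {"accepted": True, "laser_fired": True}
    robot_state["last_move"] = command
    return {"accepted": True, "laser_fired": False}

state = {"shots_fired": 0, "last_move": None}
print(handle_command("fire", state))  # accepted, laser fired
print(handle_command("spin", state))  # rejected with an error message
print(state["shots_fired"])           # 1
```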


H. Structure Chart

The robotic system that the Blue Team will be using is made up of two main parts. The vehicle system will be in the room of the competition, while the controller or operator will be in another room. As the competition commences, the camera mounted on the vehicle will continuously send a real-time feed through an internet browser on the operator's laptop in the other room. From this separate room, the operator will control the vehicle using a series of commands that will be sent to the processor and board on the vehicle system. These commands will funnel down from the processor to the IR gun and sensor as well as the vehicle motor. Figure 8 shows the robotic system breakdown.

Figure 8. High Level Structure Chart

I. Requirements Conclusions

Requirements engineering begins the development process and is one of the most critical aspects of a program. The difference between good and bad requirements can be the difference between a successful and an unsuccessful program, or a program with runaway costs at the end to fix poor requirements. Requirements elicitation should utilize the several charts explained in this document to help engineers consider all actors involved in a system. It is important that requirements engineers understand the capabilities and limitations of the systems they are creating so that infeasible requirements do not plague lower-level design engineers. Understanding requirements is necessary for all engineers.

IV. Design

Evan Masters is the lead of the Design group. The design team began with the information provided by the requirements engineering process. Requirements were reviewed, rated, and traced based on understanding and clarity for the system. Preliminary and architectural designs were created in the form of sequence diagrams, structure diagrams, state diagrams, and data flow models. The key goal of the design team was to create a design portfolio that could be easily understood and developed by the coding team and then passed to testing and verification and validation (V&V). It was of paramount importance that the design team work quickly and efficiently to produce a system design so that the coding team could get a quick start on code development. In addition, the preliminary budget dictated that the coding team would expend the most hours on the project, so the design team needed to work efficiently to produce preliminary and architectural designs within the budget guidelines. As each team progressed through the V-cycle, the design team used an iterative approach to the architectural design to meet the overall team goals. As requirements were updated, the corresponding design was tweaked and edited to meet these new requirements. During each iteration this information was passed on to coding, testing, and V&V to ensure that the overall team was making progress and conforming to each updated design. It is the belief of the design team that a robust and efficient design was created, giving the team the greatest chance of completing the objectives set forth in the competition and thus placing the team in position to win.


A. Reflection and Lessons Learned

This passage reviews and reflects on the design engineering portion of the software engineering project. It focuses on the lessons learned during this process.

1. Design Process

To prepare for the design process, the team familiarized itself with chapter 3 of the “Software Engineering Body of Knowledge” (SWEBOK) and section 2.3 of “A Gentle Introduction to Software Engineering” (GISE). These publications discussed the need for a robust design and ways in which it can be achieved. The next step in developing an architectural design for the robot was to obtain the requirements document from the requirements team. A preliminary design was critical in understanding the link between the product specifications from requirements and the detailed interactions between individual components. This architectural design is important in the verification and validation of the non-functional requirements of the system. For this project, the simplified V-cycle does not allow V&V to take place until some testing has begun. In a real software application, this V&V work would take place at each step of the software development cycle, which contributes to V&V often consuming 50% or more of the overall budget.

Detailed design of the system and its components revolved around creating diagrams that show the interaction between system components for each of the functional requirements. These models were created in multiple ways, through sequence, state, and data flow models, to give the coding team different angles on the same processes and a clearer vision of the system they were to develop. A very important portion of software design lies in giving the coding team enough information from which they can develop a system that not only completes the intended goal, but is also created in a way that the testing and V&V teams can test and verify the system requirements in a reasonable manner.

2. Design Models

The models created for the coding team's software development included sequence diagrams for both the movement and infrared systems. A structural model of all components and their main interactions was created to show the overall system, and a state transition model included all of these components for their various run cases. These systems were also modeled redundantly in data flow models. The goal of creating many diagrams for the coding team was to ensure that they were able to fully understand the system they were to create, while showing the component interactions from different viewpoints. Because of the extensive costs related to coding, a good design is critical in reducing the cost of meetings between design and coding team members to clarify models, when this time should be spent by the coding team writing code.

3. Design Engineering

A sound architectural and detailed design of a software system is critical in maintaining a schedule and meeting budget requirements for an entire project. By designing a system that can be readily developed into useful code, the V-cycle for software development can seamlessly transition from design to coding, testing and verification and validation with fewer changes to be made. In the case of requirements changing, design models can be updated and passed on to the coding team to make appropriate changes, while limiting the time required to implement these modifications. As with other areas of the software lifecycle, the design method became an iterative process between design, requirements and coding.

B. Requirements Ratings Table

The design team individually rated the requirements numerically on a scale from “the requirement is well understood and a good design can be developed from it” (1) to “not well understood; a good design cannot be developed” (5). The team then discussed as a group how well they understood the requirements, and comments were relayed to the requirements team for revisions. This was an iterative process, after which all of the requirements were understood by team members. The final ratings are shown in Figure 9, with averages taken; a blank rating indicates that the individual believed they could not create an acceptable design from the requirement.


Figure 9. Requirements Ratings Table

C. Traceability Matrix

Once the design team had determined that they fully understood these requirements, a traceability matrix was constructed and is presented as Figure 10. It links related and dependent requirements so that it is easier to see how one requirement can affect or change other requirements. Pertaining to design, this helped construct the data flow and sequence models to guide the commands throughout the system from start to finish. Understanding the related and dependent requirements also helped to construct the overall system structure diagram.

Figure 10. Traceability Matrix


D. Sequence Diagrams

Sequence diagrams were created for two distinct scenarios. One modeled the movement commands from the user to the robot, while the other showed how the infrared system should interact with the enemy robot. The diagrams illustrate a use case for movement and for the infrared gun, with pertinent non-functional requirements shown as system constraints. Figure 11 is the sequence diagram for the movement control system, and Figure 12 is the sequence diagram for the IR system.

Figure 11. Sequence Diagram for Movement Control System

Figure 12. Sequence Diagram for IR System


E. State Transition Model

A state transition model shows the various states of components onboard the robot at any instant during which the robot is active. The diagram below shows the movement flow for the required forward, backward and left/right rotations, as well as the infrared gun trigger and delay mechanism. The infrared sensor is continuously on, sensing for hits from the enemy IR gun. This model helps the coding team understand the various functions of the robot and how they are interconnected. It also shows some of the non-functional requirements, such as the two second delay on firing.

Figure 13. State Transition Model
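The transitions in Figure 13 can be approximated by a small state machine. The sketch below is purely illustrative: the state names, `Robot` struct, and `update()` logic are our own labels, with only the two second firing delay taken from the requirements.

```cpp
#include <cassert>
#include <string>

// Illustrative states drawn from the state transition model.
enum class State { Idle, Moving, Reloading };

struct Robot {
    State state = State::Idle;
    long last_fired_ms = -2000;  // allow an immediate first shot

    // Advance the state machine given a command and the current time.
    void update(const std::string& cmd, long now_ms) {
        // Leave the reloading state once the 2 s delay has elapsed.
        if (state == State::Reloading && now_ms - last_fired_ms >= 2000)
            state = State::Idle;
        if (cmd == "fire") {
            if (now_ms - last_fired_ms >= 2000) {
                state = State::Reloading;  // firing starts the delay
                last_fired_ms = now_ms;
            }
        } else if (cmd == "move") {
            state = State::Moving;
        } else if (cmd == "stop") {
            state = State::Idle;
        }
    }
};
```

The key point the diagram makes, and the sketch preserves, is that fire commands arriving inside the delay window are simply ignored rather than queued.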

F. Structural Model

A structural model displays all of the system components with arrows directing the ways in which information is passed between them. It integrates both hardware and software to show the full system. This is useful for the coding team to check where information is being passed, and to make sure that these are the correct destinations.

Figure 14. Structural Model


G. Data Flow Diagrams

A data flow diagram represents the flow of information through a system, with particular emphasis on its process aspects. It shows what kind of information will be input to and output from a system and its various subcomponents. Figure 15 illustrates the data flow model for the IR system, while Figure 16 illustrates the data flow model for movement.

Figure 15. Data Flow Model for IR System

Figure 16. Data Flow Model for Movement

H. Design Conclusions

To fully implement and confirm that the requirements have been met, a well understood design is necessary. Different diagrams are needed to explain and document the component interactions in the robotic system so that the coding team can develop robust code that will execute the mission successfully. This was accomplished by taking time to understand the functional and non-functional requirements set forth from the customer needs, after which these requirements were examined for their relations and dependencies. From this, preliminary structure and sequence diagrams were created, which were then revised after communicating with the coding team to determine their capability to develop the code. These final diagrams were then split into component and system models to show how the movement and infrared systems should interact with the robotic vehicle in the test arena. These detailed sequence diagrams provide the basis for movement and infrared shooting, with listed non-functional requirements that constrain the overall system. By using both data flow and sequence models, the coding team is given multiple angles on the final design to aid in their code development. There will be continued communication between all of the teams included in this project, ultimately using an iterative V-cycle process to develop a robust design capable of winning the final competition.

V. Coding

Kelvin Nguyen is the lead of the Coding group. Software construction is concerned with the creation of working, useful software. Desirable traits of this software include minimal complexity, anticipation of change, constructs for testing and verification, and the use of standards.

Figure 17. Client Code Flow Diagram

A. Client Side Code Summary

A high level flowchart for the client side code is shown in Figure 17. The main function begins by creating an instance of the ‘Game’ class, which is where the connection is initialized. Should the initialization fail, the program closes. Otherwise, the game object begins running and transitions into its main loop.

While the game is running, the code loops through many events. First, the event log and keyboard state are updated. If the controller is attached, the code will take input from the controller and check for any desired events. However, if the controller is not attached, the controller pointer is reset and input is taken from the keyboard. Once the code checks the state of the keyboard or controller, it then fills a 5 element array with the appropriate values to control the wheels and IR gun of the robot.


Finally it sends the array via the established Wi-Fi connection to the robot. If the IR gun has been fired during the past two seconds, a reload flag will be set in place to prevent subsequent firings too early, and an appropriately sized rectangle will be rendered on the GUI to act as a reload bar. Should the client code detect a disconnect from the Arduino server, a ‘Disconnected’ image will be rendered in place of the ‘Connected’ image on the GUI.
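The reload bar sizing described above might be computed as in the following sketch. The function name and full-width parameter are our own; only the two second reload interval comes from the requirements.

```cpp
#include <cassert>

// Hypothetical reload-bar sizing: the rectangle grows with time
// since the last shot and reaches full width after the 2 s reload.
int reload_bar_width(long now_ms, long time_fired_ms, int full_width) {
    long elapsed = now_ms - time_fired_ms;
    if (elapsed >= 2000) return full_width;  // fully reloaded
    if (elapsed < 0) return 0;               // guard against clock skew
    return static_cast<int>(full_width * elapsed / 2000);
}
```

Rendering such a rectangle each frame gives the "series of rectangles increasing in size" animation described later in the report.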

B. Detailed Source Code Components

Shown in Figure 18 is a detailed UML sequence diagram of the client side code. The major components of the code are detailed below.

Figure 18. UML Sequence Diagram

1. Key Variables

The UInt8 sendarray[] variable is a five element array that is sent to the Arduino Wi-fi board. The variable tells the Arduino to do two different things: move and fire. The first and second elements control the left wheel, while the third and fourth control the right wheel. The first and third elements control the speed of the wheels, with a higher value corresponding to a faster speed; these values range from 0 to 255. The second and fourth elements control the direction of the wheel: if the value is 1 the wheel will turn forward, and if it is 0 it will turn in the opposite direction. The last element determines if the IR gun fires; if the value is 1, voltage is sent through the IR gun. Each element is only one byte in size, which allows for simple transmission from the client to the Arduino.
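As an illustration, packing that five-byte layout might look like the sketch below. The function name, parameters, and optional left-wheel bias (a hardware correction mentioned elsewhere in the report) are our own labels; only the element layout is taken from the text.

```cpp
#include <algorithm>
#include <cassert>
#include <cstdint>

// Illustrative packing of the five-byte command array:
// [left speed, left direction, right speed, right direction, fire].
void pack_command(uint8_t out[5],
                  int left_speed, bool left_fwd,
                  int right_speed, bool right_fwd,
                  bool fire, int left_bias = 0) {
    // Clamp speeds to the 0-255 range expected by the motor shield.
    out[0] = static_cast<uint8_t>(std::min(255, std::max(0, left_speed + left_bias)));
    out[1] = left_fwd ? 1 : 0;   // 1 = forward, 0 = reverse
    out[2] = static_cast<uint8_t>(std::min(255, std::max(0, right_speed)));
    out[3] = right_fwd ? 1 : 0;
    out[4] = fire ? 1 : 0;       // 1 = send voltage to the IR gun
}
```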

The integer time_now is updated at the start of each loop iteration and is used to keep track of the clock, ensuring that events do not happen too often. Further, the integer time_fired is updated by a call to SDL_GetTicks() when a command to fire is sent; this variable is used to prevent firing more often than once every 2.0 seconds. Finally, the integer time_hit is updated by a call to SDL_GetTicks() when the robot detects a hit event; this variable is used to prevent a single hit from being registered more than once within 1.5 seconds.


2. Functions

The first function is OnInit(). This section of the code handles initialization of the SDL and Winsock components. In this function a connection is made between the user’s computer and the robot. If the connection succeeds, the program continues. If a connection cannot be established, the function will keep attempting to establish one until the user terminates the program. Before a connection is attempted, however, the function initializes SDL and creates a window in the upper left corner of the computer screen, which the client will use to visually communicate with the user later in the program.
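The retry-until-connected behavior of OnInit() might look like the following sketch, where try_connect() is a stand-in for the real Winsock connection attempt; the three-attempt success and all names here are purely for illustration.

```cpp
#include <cassert>

// Stand-in for the real Winsock connect() attempt; here we simply
// pretend the third attempt succeeds so the loop terminates.
int attempts_made = 0;
bool try_connect() {
    ++attempts_made;
    return attempts_made >= 3;
}

// Sketch of OnInit()'s behavior: keep attempting the connection
// until it succeeds or the user terminates the program.
bool on_init(bool& user_quit) {
    while (!user_quit) {
        if (try_connect()) return true;
    }
    return false;  // user gave up before a connection was made
}
```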

After passing through the OnInit() function, the client runs through one of two functions: GetControllerInput() and GetKeyboardInput(). In this section of the code the system takes the user inputs through whatever medium the user is using and turns them into values. During each cycle of the main loop the client checks for controller input through either the GetControllerInput() or GetKeyBoardInput() functions. If the client detects a controller, it uses the GetControllerInput() function, which reads in the state of all the buttons and joysticks and turns them into usable values that are stored in the UInt8 sendarray[] variable. When filling the first slot of the variable the function adds a bias value to fix a hardware problem present in the robot’s left wheel. Should a controller not be connected, the GetKeyBoardInput() function will be called to fill the UInt8 sendarray[] variable. Like the controller input function, this function maps the state of the keyboard into usable values and adds the bias to the first value in the array.

After the GetControllerInput() and GetKeyBoardInput() functions are called within the main loop, the client goes through the functions OnLoop() and OnRender(). This section of the client code contains the functions that transmit data to the robot and render the visual the user will use to determine whether or not they are able to fire the IR emitter at the given time. The function OnLoop() packages the array generated by the GetControllerInput() and GetKeyBoardInput() functions and determines if the IR gun is allowed to be fired. If the IR gun is allowed to fire, the function sends the array to the robot as is. At the start of the OnLoop() function, the time_now variable is set using the SDL_GetTicks() function, which returns the number of milliseconds since the program started. When a packet is sent, or the IR gun is fired, further calls to SDL_GetTicks() are used to set the values of time_sent and time_fired. Conditional if statements are then used to allow packets to be sent to the Arduino every 40 ms, so as to not overload the server. Another if statement is used to control firing of the IR gun: if the gun should not be fired, the value pertaining to the IR gun is changed so it will not fire (the fifth element of the sendarray[] variable is set to 0). After this, the loop runs through the OnRender() function, which handles rendering of the reload bar and GUI images to the window. Here the function uses time_now, time_fired and time_hit to generate SDL visuals that correspond to the robot’s IR gun and sensor. In particular, OnRender() uses the time_now and time_fired variables to generate a red box that the user can use to determine whether or not they are allowed to fire the IR gun.
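The two timing guards described above can be sketched with a plain timestamp in place of SDL_GetTicks(); the struct and method names below are our own, with only the 40 ms send interval and 2000 ms reload taken from the text.

```cpp
#include <cassert>

// Illustrative timing guards from OnLoop(): packets go out at most
// every 40 ms, and the IR gun fires at most every 2000 ms.
struct LoopTimers {
    long time_sent  = -40;    // initialized so the first check passes
    long time_fired = -2000;

    bool can_send(long now_ms) {
        if (now_ms - time_sent >= 40) { time_sent = now_ms; return true; }
        return false;
    }
    bool can_fire(long now_ms) {
        if (now_ms - time_fired >= 2000) { time_fired = now_ms; return true; }
        return false;
    }
};
```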

3. Third Party Libraries

The code utilizes two third party libraries for many of its primary functions. The first, SDL, or Simple DirectMedia Layer, allows for fluid, low-level input from various devices, as well as low-level graphical output. SDL is a cross-platform library written in C that is used in video playback software, emulators, and even in popular games from various developers, including Valve. The library has a lot of capabilities, but is used in this code mostly for keyboard and controller input, and for creating some simple graphical outputs. Because SDL is designed for use in video games, it was an attractive choice in terms of the desired capabilities of our software; one of the key goals was to make the control scheme as simple and sensible as possible, and casting it as a game seemed like the best way to do that.

SDL's primary capability revolves around the use of events. Any time an event is triggered, it is added to the event log, and the functions can be used to poll for any desired events, while ignoring any others, and resolve anything that the code should do upon those specific events. Things like key presses, controller button presses, mouse clicks, and many others are all distinct events that the library can recognize. Another useful capability of SDL is its ability to retrieve a keyboard or controller state, as opposed to polling for distinct key press or joystick motion events. That is, the library has functions that return whether or not a key is pressed, or how far a joystick is moved on either axis, so that input can be obtained based on the state of the device, to allow for a constant stream of the desired user command. The last capability of SDL our code utilizes is the graphical output. SDL's simplest graphical capabilities involve rendering simple shapes and image files to a generated window. The code utilizes these capabilities to display messages, saved as bitmap files, as well as the reloading bar, a simple animation that is just a series of rendered rectangles increasing in size.

The second third party library, WinSock, is a Windows platform-dependent networking library. There are two standards for networking in C/C++: Windows platforms use WinSock, while Unix platforms have their own standard networking libraries. The decision to use WinSock was made for two reasons. First, almost the entire coding group runs Windows, and getting the UNIX libraries installed and linked properly on a Windows platform would have been an undesirable amount of trouble. Second, the tutorials we found for WinSock were easier to follow, especially considering none of us had experience in networking. The WinSock library is used for all the networking functionality of the client code. The connect(), send(), and recv() functions form the basic networking capabilities required for the software. Winsock is used to create and bind a socket, change the settings so that communication works the way it needs to, and send and receive packets on that socket.
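That call sequence can be sketched with the portable BSD socket API, which Winsock mirrors (Winsock additionally requires WSAStartup()/WSACleanup() and uses closesocket() in place of close()). The function name, address, and port below are placeholders, not the robot’s real values.

```cpp
#include <arpa/inet.h>
#include <cassert>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>

// Hypothetical helper showing the create/connect sequence; on
// success the returned descriptor is ready for send()/recv().
int open_robot_socket(const char* ip, int port) {
    int fd = socket(AF_INET, SOCK_STREAM, 0);  // create a TCP socket
    if (fd < 0) return -1;

    sockaddr_in addr{};
    addr.sin_family = AF_INET;
    addr.sin_port = htons(port);
    inet_pton(AF_INET, ip, &addr.sin_addr);

    // connect() to the robot's server; clean up on failure.
    if (connect(fd, reinterpret_cast<sockaddr*>(&addr), sizeof(addr)) < 0) {
        close(fd);
        return -1;
    }
    return fd;
}
```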

C. Arduino Side Code

It is now possible to review the Arduino side code.

1. Key Global Variables

The char ssid[] array holds the Wifi network name, and the char pass[] array contains the Wifi password. The byte in[] variable is a 5 byte array used to receive and store incoming packets: the first four bytes control the direction and speed of each motor, while the 5th byte controls the state of the IR gun. Finally, the variables time_now, time_fired, and time_sent are used to keep track of the current time and of when sending and firing last occurred; all of these variables were initialized to 0.

2. setup()

In the setup() function, the interrupts for the IR receiver are declared. This section of code was provided by Vidullan, our graduate teaching aide, to ensure fair and symmetric hit detection between both robots. The middle part of setup() declares the motor pins and IR pin as outputs to control the motors and IR gun. The remainder of the function contains a while statement, which attempts to connect to the declared Wifi network using the defined password. Once a connection is established, the setup() function is complete.

3. loop()

At the start of each loop, the millis() function is called to set the value of time_now to the time, in milliseconds, since the board started running. As in the client code, this allows the Arduino to control when the board checks for new data and when the IR gun is allowed to fire. Whenever a new packet is received, or the IR gun is fired, calls to millis() are used to store the time the event occurred. Every 50 milliseconds the Arduino Wifi shield checks for a packet from the client code. Once a packet is detected, calls to client.read() are used to read and store control information into the in[] array. Once the array is filled, the MotorControl() function is called to command the wheels to move at the specified speed and direction. On each loop iteration, a hit flag is used to determine whether or not the robot is being hit by the opposing IR gun; the criteria for this flag returning true are built into the code provided by Vidullan. A conditional if statement is used to trigger an event, such as lighting an LED or freezing the robot, to acknowledge the hit.
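The 50 ms polling cadence of loop() can be sketched in plain C++ with a fake clock standing in for millis(); all names below are illustrative, not the team’s actual code.

```cpp
#include <cassert>

// Fake clock and counters standing in for millis() and the Wifi
// shield; packets_checked counts how often a read would occur.
long fake_millis = 0;
long time_received = -50;  // initialized so the first check fires
int packets_checked = 0;

// One pass of the Arduino loop(): poll for a packet every 50 ms.
void loop_once() {
    long time_now = fake_millis;           // millis() on real hardware
    if (time_now - time_received >= 50) {  // time to poll the shield
        time_received = time_now;
        ++packets_checked;                 // client.read() would go here
    }
}
```

Driving loop_once() with the fake clock from 0 to 200 ms in 10 ms steps yields exactly five polls (at 0, 50, 100, 150, and 200 ms), matching the intended cadence.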

Figure 19. Arduino Wiring Diagram


D. Wiring Diagram

Figure 19 details the various wiring connections on the robot. The Arduino UNO and Wifi Shield are connected directly on every pin. A few pins had to be rewired between the Wifi Shield and Motor Shield, via the blue and green wires, because both shields are designed to use them by default. The Arduino code is written for these specific pin connections, so any changes made in the hardware must be mirrored in the software for the system to work properly.

E. Coding Conclusions

In the end, the client code is shown to be effective at commanding the Arduino robot through a TCP Wi-Fi connection. At a high level, operation of the robot is very simple, as there are only five degrees of freedom associated with the robot: the direction of each wheel, the speed of each wheel, and the state of the IR gun. Commands (the 5 valued array) are simply sent to the robot at continuous, set intervals, and based on the values in the array, the controllable components of the robot (the two wheels and the IR gun) are commanded. The infrared sensor drives an interrupt, which flips a flag when triggered to perform the penalty action: turning on the LED and freezing the robot momentarily.

VI. Testing

Steve Stanek is the lead of the Testing group. The primary goal of testing should not only be to identify and correct errors but also to identify the root causes of errors and modify the software development process so that current error trends do not continue. It should be noted that testing can only show the presence of errors; it cannot show their absence. It is possible for the developed software to have minimal errors, yet still not fulfill the user’s needs. Organizations should also keep in mind that cutting the amount of time allowed for testing not only increases the number of remaining errors but also eliminates the chance to fix those errors.

A. Reflections and Lessons Learned

There are eight distinct sections of testing: component, static, dynamic, unit, integration, system, defect, and release testing. Though testing cannot show the absence of errors, through these different testing procedures the end product should perform as well as possible.

1. Component Testing

Component tests are designed to ensure that the individual components of the system are functioning properly in isolation from the rest of the system. During the process, the system is broken down into smaller components, which should be the smallest testable parts of an application. Components may be individual functions, object classes or composite components. The individual units are then tested to expose defects. An example of component testing in this project was the testing of the motors, which revealed unequal spin rates of the two wheels; to fix this, a gain was added to ensure the wheels spun at equal rates.

2. Static Testing

Static testing examines the software without executing it. It is mainly performed by visually scanning the code or by using software that helps debug the code. During static testing, hardware and integration aspects are ignored and the code is examined for errors. Whenever new pieces of code were added, multiple sets of eyes were utilized to ensure the code being added was correct. This included any time constraint, such as the two second delay between firings of the IR gun.

3. Dynamic Testing

Dynamic testing is a vital process in successfully delivering a product to the consumer. Dynamic testing is executed while a program is in operation. Typically it examines system memory, functional behavior, response time, and overall performance of the system. The main goal of these tests is to execute a program with the intent of finding where errors occur. This is carried out by running a certain case with a known outcome; the known outcome is then compared to the outcome from the system, and if the two outcomes differ, there is an error in the system. When planning dynamic testing, it is important to design tests in which errors are systematically discovered.
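As a minimal illustration of this run-a-case-with-a-known-outcome idea, the sketch below uses a hypothetical key-mapping function as the system under test; none of these names or bindings come from the team’s actual code.

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical system under test: map a key to a command code.
uint8_t command_for_key(char key) {
    switch (key) {
        case 'w': return 1;  // forward
        case 's': return 2;  // backward
        case ' ': return 3;  // fire
        default:  return 0;  // no binding: neutral command
    }
}

// A dynamic test case: compare the system's actual output against
// the known expected outcome; a mismatch signals an error.
bool run_case(char input, uint8_t expected) {
    return command_for_key(input) == expected;
}
```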

An example of a dynamic test is the testing of the IR gun and the reload bar found in the command window. The requirements state that the IR gun cannot fire more than once every two seconds; because of this, the IR gun and the reload bar should recharge at the same time. This was tested in two ways. First, each component was tested individually: both were examined visually, and the reload time or the flash of the IR gun, seen using a digital camera, was measured with a stopwatch. Once that test was completed, they were tested dynamically, together. To test this, the flash and recharging reload bar were put side by side and examined. Multiple shots were fired consecutively, showing that both were indeed on the same two second recharge.

4. Unit Testing

The purpose of unit testing is to divide the software into separately testable units and verify their functionality in isolation from one another. If possible, unit testing should not be done by the developers, who may not feel inclined to honestly identify errors in the software. There are two different types of unit testing, known as white-box testing and black-box testing. During white-box testing, the unit testers have access to the unit's code. This helps the testers plan their test cases and also allows them to verify that the unit is doing what it was designed to do. Black-box testing involves testing how the unit works when confronted by typical input. Due to the large number of possible inputs, complete black-box testing is impossible. It is also important to note that black-box testing requires the testing of invalid inputs. In this case, with the program running, multiple key strokes that should have no binding were pressed to see if they affected the system in any way.

5. Integration Testing

The purpose of integration testing is to verify that the individual units of the software work in conjunction with each other in the way they were designed. Integration testing focuses on the interfaces between the various units of the software, since these tend to be a problem area in development. Things such as parameters and global variables, as well as all possible interactions between units, must be examined and tested. When testing large and complex software, it is usually a bad idea to test the system by putting all the components together at once; in these cases it makes more sense to integrate the components incrementally. In this project, the motion was tested first, followed by the Wi-Fi camera feed, incrementally up to the IR gun and sensor.

6. System Testing

The purpose of system testing is to verify that the system is working as a whole. It is also useful for determining if the system meets its non-functional requirements. At this point in testing, the majority of errors should have already been identified and dealt with in previous stages. For our case, once assembled and fully integrated, the robot was tested as a whole to ensure the individual tests held up.

7. Defect Testing

The definition of defect testing inherently means that it is never truly finished: the absence of discovered errors does not mean that none exist, only that they have not yet been found. While testing the robot, expected error messages were displayed on the GUI with the specified actions, but no unforeseen errors occurred. The testing process for this class of defects will continue until the product is released, and it is the responsibility of all team members to track it. Proper communication between the different groups is vital in order to ensure that proper defect testing is administered.

8. Release Testing

Before a software product is released for public use, release testing is performed to ensure that the software is as error free as possible, safe for use, and has met the customer’s needs. Before the product is released to the customer, the robot’s functionality will be tested and the prior tests evaluated to ensure they have all been completed properly.

B. Testing Highlights

This section highlights each component’s description, purpose, characteristics, testing procedure and testing results.

1. The vehicle must be able to execute all commands given from the controller

Purpose of the Part
- The controller is what the user will use to control the robot and enable proper command input to enable victory in the final competition.


- Single inputs should be able to make the robot rotate left/right, move forward/backward, and fire an IR gun.

Characteristics of the Part

- Utilizing the keyboard on a laptop there are designated keys for left, right, forward, and backward motion of the robot. The Xbox 360 controller utilizes the left joystick for forwards and backwards motion with the right joystick utilized for rotation.

- The spacebar and right trigger are utilized to fire a single IR gun shot.
- The up and down arrow keys, as well as the right and left bumpers, are utilized to increase or decrease the speed of the robot, respectively.

Procedure
1. Start up the robot following the normal start up procedures.
2. Run the robot through a series of commands: left/right rotations, forward/backward movement, and firing the IR gun.
3. Make note of any discrepancies or malfunctions between the robot’s actions and the command window.

Results
1. Initially the forward motion of the robot was impeded due to the wheels rotating at different rates.
   a. Adjustments were made and the wheels were given different gains, and they now rotate at the same rate.
      i. The robot is now able to move forward or backward in a straight line.
2. On occasion the robot would become stuck in a rotating state.
   a. Adjustments were made in the code before secondary testing of the rotation.
      i. The robot now only gets stuck in a rotation if it becomes disconnected from the controller.
3. The IR gun fires once for each press of the spacebar.

2. Command Window

Purpose of the Part
- The command window will be used by the controller to send signals to the robot that will enable it to move, fire the IR gun, and transmit video.
- The user command window should utilize single keys for movement and fire commands.
- The command window will output necessary messages.

Characteristics of the Part

- The command window should appear in an aesthetically pleasing manner to the user on the operating laptop.

- The command window should show the timing it takes to fire the IR gun so that the user can see when they are able to fire.

Procedure

1. Start up the robot following the normal start up procedures.
2. Run the robot through a series of commands: left/right rotations, forward/backward movement, and firing the IR gun.
3. Make note of any discrepancies or malfunctions between the robot’s actions and the command window.

Results

1. There were no discrepancies between the input commands from the window and the robot’s actions.
2. When the robot is online the window will read ‘connected.’ If the robot disconnects, the connected message will be replaced by ‘disconnected.’ Furthermore, the command window displays a hit message when we are hit.


3. Wi-Fi Camera and the video feed that is provided to the user; video speed/lag and a basic level of clarity are the main focus of the testing

Purpose of the Part
- The primary purpose of the part is to provide a reliable source of visual data to the user so that proper commands can be given in order to complete a specified mission objective.

Characteristics of the Part
- The camera provides a basic video feed via the Wi-Fi board to the user’s computer.
- The frame rate and video quality can be varied, and inherently affect the performance of the robot.
- Required power vs. video quality is a major focus due to the limit of available amperage and voltage.

Procedure

1. The robot will be powered up and set to normal operating parameters to initialize the testing process.
2. The video feed will be started with the robot stationary.
3. Physical objects will be introduced into the field of view of the camera and quickly removed. Upon moving them, time will be kept to determine the lag time of the video feed.
4. In addition to the feed timing, any issues such as black-outs or feed drops will be noted.
5. The entire process will be repeated with the robot in motion to ensure that it can continue proper functions while in a dynamic state.
6. A frame rate count is determined using the computing functions of the robot, not from a direct test.

Results

1. The camera functioned properly and provided a basic visual feed that the user could use to direct the robot.

2. One issue that arose was a considerable video lag; an overall lag of 5.2 seconds occurred. It was determined that this was attributable to the high frame rate provided to the user. This was corrected by lowering the frame rate to 10 FPS.

3. Required power for the video unit was not a problem. The battery unit supplied enough power that the robot could continue dynamic functions and still provide the required video feed.

4. The vehicle should have a manual on/off switch for the batteries and the camera

Purpose of the Part

- The manual on/off switch will enable the robot to be disconnected and powered down separately from the controller.

- Having a backup option to power down the camera and robot will allow for a failsafe in case there are any issues with the controller during runtime.

Characteristics of the Part
- There is a manual toggle switch located on the robot that allows the power to be turned on and off.

Procedure

1. Start up the robot following the normal start up procedures.
2. Test the on/off switch by changing from the on position to the off position.
3. Make note of any discrepancies or malfunctions between the robot and the command window.

Results

1. The on/off switch properly controls the robot’s power as well as the camera’s power.

5. The vehicle must be able to operate on linoleum floors

Purpose of the Part

- The tires and power supplied to them must be enough so that the vehicle can navigate in a controlled and efficient manner on a linoleum floor.


Characteristics of the Part
- The tires are made up of a standard rubber material with standard all-purpose treading.

Procedure

1. Start up the robot following the normal start up procedures.
2. Run the robot through a series of commands: left/right rotations and forward/backward movement, all on a linoleum floor.
3. Make note of any discrepancies or malfunctions between the robot’s actions and the command window.

Results

1. The robot was able to execute all commands efficiently and in a controlled manner on the linoleum floor.

6. IR gun

Purpose of the Part

- The infrared gun will be utilized to hit the opposing target, which is equipped with IR sensors. The IR gun is not allowed to fire more than once every 2 seconds.

Characteristics of the Part

- An IR beam cannot be focused to pinpoint accuracy. Because of this, the part will be tested to show the range and cone of the IR shot; measuring these will give the user the knowledge needed to successfully hit the target.

- The beams can often become misaligned, so the IR beam will be tested at regular intervals to monitor alignment.

- Like hockey, the wall can be your friend. In hockey, players use the boards to deflect pucks around other players or to create a passing lane. An IR beam can act the same way, so the reliability of bouncing the IR beam off a wall to the opponent will be tested.

Procedure

1. To ensure the functionality of the IR gun, our own receiver will be used to detect whether hits register.
2. To give us the upper hand, the characteristics of the part will be tested and monitored without involving the other team. In the room that will host the competition, the IR gun will be fired from various locations to bounce shots off the walls. This test will either confirm or rule out the utilization of walls.
3. Twice per week the IR gun will be tested to ensure the alignment is correct. A more in-depth description of this procedure will become available at a later date.

Results

1. Prior to the update of the firmware, it was determined that the IR gun is functional.
   a. After the firmware update, the IR gun remains functional.

2. It was determined that the utilization of walls is not favorable.

7. IR receiver; the IR receiver is composed of three sensors, located at the front, left, and right sides of the robot

 Purpose of the Part

- The IR receiver will be utilized to register hits from the opposing team’s robot. Furthermore, each hit that is registered should display in the command window with a tally.

Characteristics of the Part

- IR sensors are very common; most notably, they are found on televisions. The IR sensors detect infrared (invisible) light but not visible light.

- IR sensors have a demodulator inside them. Because the demodulator will only register specific frequencies, the IR gun and sensor must be tested in unison.


Procedure

1. To ensure the functionality of the IR receiver, our own IR gun will be used to confirm whether or not hits can be registered on each of the three sensors.

2. The command window will be used to determine whether there was a hit by displaying a message.

Results

1. Prior to the update of the firmware, it was determined that the IR gun is functional.
   a. After the firmware update, the IR gun remains functional.
2. The command window outputs a ‘hit’ message. It does not, however, keep a running tally.

8. The vehicle’s battery should power the motor, camera, and all other electronics for at least 45 minutes

Purpose of the Part

- The battery will provide the power to the robot

Characteristics of the Part

- The battery pack is connected to the circuit boards via a connection link that is easily accessible. This makes swapping the battery pack out a simple task.

Procedure

1. Start up the robot following the normal start-up procedures.
2. When the robot is moving at full speed, the robot will be disconnected. Once disconnected, the robot's last received command runs in an infinite loop, so the motors keep running at full speed and the camera feed remains available.

3. The robot's battery life will be measured via a stopwatch and recorded.

Results

1. The robot’s battery life is well over 1.25 hours.

9. Structural Integrity at High Speeds

The part being tested is the structural integrity (main supporting component) of the robot. The main analysis is of any cracking, distortion, or stress fractures of the main structural unit when impacting an object at full speed.

Purpose of the Part

- The main purpose of this part is to provide a base structure for the required components of the robot.
- It must be able to properly contain and protect the secondary components of the robot so that they may fulfill their intended purpose.

Characteristics of the Part

- The main structural body is a very straightforward and simplistic setup consisting of screws, a main board, and connecting components.

- All secondary components are properly attached to the main body.

Procedure

1. The robot will be powered up and set to normal operating parameters to initialize the testing process.
2. Three obstacles will be used to determine if the robot can withstand harsh physical impact: one that fits below the main board and impacts the wheel base, one that is the same height as the main board, and one that is taller than the entire unit.
3. With the robot engaged at full speed, direct impact is observed with each of the obstacles.
4. After each impact, the robot is visually inspected for any of the aforementioned flaws.
5. The robot is tested at full-speed impact for both the front and rear portions of the system.

Results

1. The unit was able to withstand all forms of impact against a solid obstacle at maximum speed from both a front and rear approach.


2. No structural flaws were found via the testing procedure.

VII. Verification and Validation (V&V)

Mike Matas is the lead of the Verification and Validation group. The V&V group operated based on the information and findings from both the Requirements and Testing Teams, with a majority of the effort spent examining and verifying the requirements documents. Tests were designed and performed in order to verify whether each requirement had been met. The project website was also examined, and an internal verification process took place to make sure each team knew of and completed the deliverables they were responsible for. The validation process involved taking our findings and the current status of the robot software (and, in some cases, hardware) and making sure it was suitable for the customer.

A. Reflection and Lessons Learned

This passage reviews and reflects on the verification and validation portion of the software engineering project. It focuses on the lessons learned during this process.

1. V&V Process

To prepare for the V&V process, the team read through the material in the "Software Engineering Body of Knowledge" (SWEBOK) and "Gentle Introduction to Software Engineering" (GISE) related to requirements engineering and testing so that we could be prepared for interactions with those groups. We then familiarized ourselves with the V&V process and the steps we would need to take to complete it for this project. For this project, we divided into subgroups to distribute the workload. These groups included Website Verification (1 member), Requirements Verification (4 members), and Validation (2 members).

2. Website Verification

One member of the V&V team was tasked with going through the website and making sure that every group (Requirements, Design, Testing, Coding, and V&V) and the Chiefs (CEO, CFO, CIO) had all completed what was asked of them. This process involved breaking the website into necessary deliverables, communicating with the other groups, and keeping a checklist of what was actually being delivered via the final reports.

3. Requirements Verification

Requirements verification ensures that the product is built right. For each requirement, a test procedure is defined and the necessary resources are identified. These procedures are followed, and the results are recorded in order to determine whether or not each requirement has been met. In the Requirements Verification section of this report, the results of the verification of functional and nonfunctional requirements are discussed with the format [requirement number]: requirement description followed by the tests conducted and results.

4. Validation

Requirements validation ensures that the right product is built, i.e. the system functions to meet the needs of the customer. The validation team was responsible for first meeting with the customer to acquire an overview of how the system will be used and how it should function. The team then observed the verification process to make sure the product will operate the way the customer intends it to.

B. Website Verification

This is a listing of the website verification conducted by the V&V group.

1. Requirements Deliverables
   a. All members read chapter 2 of SWEBOK and Section 2.1 of GISE – confirmed by team lead on 4/8
   b. CONOPS – attached as a separate document.
   c. Use cases – page 6 of Requirements report.
   d. Viewpoint charts – page 4 of Requirements report.
   e. Scenarios – page 5 of Requirements report.
   f. Sequence diagrams – page 7 of Requirements report.
   g. Use the provided template for the Requirements Document – completed.
   h. Final report uploaded in AIAA Format – completed.


2. Design Deliverables
   a. All members read chapter 3 of SWEBOK and Section 2.3 of GISE – confirmed by team lead on 4/8
   b. Requirements rating table – page 3 of Design report.
   c. Traceability Matrix – page 4 of Design report.
   d. Sequence model – page 5 of Design report.
   e. State transition model – page 6 of Design report.
   f. Structural model – page 7 of Design report.
   g. Data flow models – page 8 of Design report.
   h. Final report uploaded in AIAA Format – completed.

3. Coding Deliverables

   a. All members read chapter 4 of SWEBOK and Section 2.4 of GISE – confirmed by team lead on 4/8
   b. Flowcharts – page 2 of Coding report.
   c. User manual – attached as a separate document.
   d. Documentation of each unit or component of code – pages 3-6 of Coding report.
   e. Source code – attached as a separate document.
   f. Guidance to testing group – confirmed by both team leads on 11/11
   g. UML diagrams – page 3 of Coding report.
   h. Final report uploaded in AIAA Format – completed.

4. Testing Deliverables

   a. All members read chapter 5 of SWEBOK and Section 2.5 of GISE – confirmed by team lead on 4/8
   b. Description of all tests done – pages 5-13 of Testing report.
   c. Results of all tests – pages 5-13 of Testing report.
   d. List of errors found and corrections – pages 5-13 of Testing report.
   e. Result of static testing – page 3 of Testing report.
   f. Show how you did a dynamic analysis – page 3 of Testing report.
   g. Unit testing results – page 3 of Testing report.
   h. Integration test results – page 3 of Testing report.
   i. Final report uploaded in AIAA Format – completed.

5. V&V Deliverables

   a. All members read chapters in SWEBOK and GISE related to V&V – confirmed by team lead on 4/2
   b. Designate one person to be responsible for reading the website and making sure the team addresses everything on it – done.
   c. Show (prove) which requirements have been met – pages 6-10 of V&V report.
   d. Show which requirements have NOT been met – pages 10-11 of V&V report.
   e. Keep a record of all tests – pages 6-10 of V&V report.
   f. Are all functional and non-functional requirements met – pages 6-10 of V&V report.
   g. Suggest fixes required to requirements, coding, and testing groups – confirmed by team leads on 4/11
   h. Final report uploaded in AIAA Format – completed.

6. CEO Deliverables

   a. Final Gantt chart – completed
   b. Integration of final reports – completed

7. CFO Deliverables

   a. COCOMO model – completed
   b. Team's financial projections versus current financial status – completed

8. CIO Deliverables

a. Working website that is secure to the Blue Team – confirmed by CIO on 4/11


C. Requirements Verification

1. Functional Requirements

The functional requirements, or capabilities, provide an outline for the functions that the software must be able to execute and how the system should react to different inputs. Upon verification of all functional requirements, it shall be confirmed that the product is capable of performing all basic operations.

a. [2.1.1]: The vehicle must be entirely remote controlled.

Once turned on, verify that no direct user interaction is necessary to operate all vehicle functions. After the power was switched on, the user was able to move the robot forward, backward, left, and right, fire the IR gun, view the camera feed through the GUI, and detect hits without any direct interaction of a team member with the robot.

b. [2.1.1.1]: The vehicle controller system must consist of only a laptop, wi-fi receiver, controller devices, and a single user.

Examine the system hardware during operation to verify the vehicle is controlled by using only a laptop, wi-fi receiver, any controller device and one user. Full control of the vehicle was achieved using only these materials and no external assistance.

c. [2.1.2]: The vehicle must transmit as close to real-time video feed as possible via a wi-fi camera to the controller.

Determine if the video feed viewed by the user has a small enough delay such that it does not interfere with effective operation of the vehicle. The server push mode was able to be used, and the video feed was very close to real time (delay was on the order of milliseconds), which provides acceptable competition performance.

d. [2.1.3]: The vehicle must be able to execute all commands given from the controller.

Test that all commands can be transmitted from the controller to the vehicle and verify that they are performed. Forward, backward, left, and right movement, all combinations of directional motion, and shooting were all executed when commanded by the user.

e. [2.1.3.1]: Vehicle must be able to move forward, move backward, rotate right, and rotate left.

Input each command or series of commands to confirm that each maneuver can be executed. While the Xbox controller is being used, the left joystick commands forward and backward motion and the right joystick commands left and right motion. When the keyboard is connected, the W, A, S, and D keys command forward, left, backward, and right motion respectively.

f. [2.1.3.2]: Vehicle must have a designated button to fire the IR gun

Verify that there is a designated command on the controller that fires the IR gun. For the Xbox controller, the right trigger fires the IR gun, and the space bar will fire the IR gun for the keyboard controller.

g. [2.1.3.3]: Vehicle must execute commands given wirelessly from the controller utilizing an Arduino wi-fi board and Arduino processor

The Arduino processor and wi-fi board allow the controller to wirelessly send commands that are executed by the robot.

h. [2.1.4]: Vehicle must have an IR gun and sensor

Examine the fully equipped vehicle and verify the presence of an IR gun and sensor. Visual inspection showed that there are both an IR gun and sensor on the vehicle.


i. [2.1.4.1]: The IR gun must be able to fire every 2 seconds, as specified by the customer as the minimum interval

Fire the IR gun multiple times within two seconds to verify that shots only register at two-second intervals. The IR gun was viewed through the camera while it was rapidly fired, which allowed the shots to be seen, and it was confirmed that shots fired in succession after the initial shot did not register until two seconds had passed.
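The two-second reload behavior verified here amounts to a small time gate. The sketch below is a hypothetical reconstruction (the names and structure are ours, not the team's source); on the robot, the timestamps would come from the Arduino millis() clock, but it is written as a pure function so it can be exercised with any timestamp source.

```cpp
// Hypothetical reload gate for the IR gun: a trigger press is accepted only
// if at least RELOAD_MS have elapsed since the last accepted shot.
const unsigned long RELOAD_MS = 2000;

struct GunState {
    unsigned long lastShotMs = 0;
    bool hasFired = false;  // allow the very first shot unconditionally
};

// Returns true (and records the shot) if the press at nowMs is accepted;
// presses inside the 2 s window are silently dropped.
bool tryFire(GunState& gun, unsigned long nowMs) {
    if (!gun.hasFired || nowMs - gun.lastShotMs >= RELOAD_MS) {
        gun.lastShotMs = nowMs;
        gun.hasFired = true;
        return true;
    }
    return false;
}
```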

j. [2.1.4.2]: The IR sensor must be continuously on

The IR gun was fired continuously at each of the sensors located on the vehicle. All hits were registered appropriately.

k. [2.1.5]: The vehicle must withstand ramming full speed into a rigid object with no critical structural damage

Drive the vehicle into rigid objects while at full speed and then examine the vehicle to determine the absence or existence of structural damage. The vehicle was driven into a wooden table leg and a classroom wall at full speed with no structural damage. The front metal bumper protected the vehicle and all accessories from impairment.

2. Nonfunctional Requirements

The nonfunctional requirements, or constraints, are used to determine the quality and level of performance desired by the user. Many of these are necessary to allow for a fair competition, but others are simply specifications preferred by the customer. Although they may not be essential to the overall functionality of the system, it is imperative that the nonfunctional requirements are met to ensure the best chance of success and eliminate the possibility of disqualification.

a. [2.2.1]: Software must be written using C or C++

The files were examined to determine what language was used to develop the code. There are 16 .cpp files, 3 header files, and 1 Arduino sketch.

b. [2.2.2]: Data link between the controller and vehicle must be wireless

Verify that data is transferred between the vehicle and controller wirelessly. All data was transferred without any hardware connecting the vehicle and controller.

c. [2.2.2.1]: Software must utilize and integrate an Arduino processor, Arduino wi-fi board, Arduino motor control board, a laptop, IR gun, and IR sensor

System components were analyzed to ensure that they carry out their functions once integrated together. The Arduino processor, Arduino wi-fi board, Arduino motor control board, laptop, IR gun, and IR sensor all performed appropriately.

d. [3.1.1]: All control commands should be executed using an Xbox controller (primary) or using the ASDW keys on the keyboard (secondary).

The vehicle was operated under two separate tests in which the robot was controlled using either the Xbox controller or keyboard individually.

e. [3.1.1.1]: The control commands should be mapped to the Xbox controller as follows: left joystick for forward/backwards, right joystick for left/right, right trigger to shoot

While using the Xbox controller to operate the vehicle, all joysticks and triggers were used to determine their functionality. Moving the left joystick up and down resulted in forward and backward motion, moving the right joystick left and right resulted in the respective rotation or turning, and the right trigger resulted in the firing of the IR gun.


f. [3.1.1.2]: The control commands should be mapped to the keyboard as follows: W-key for forward, S-key for backwards, A-key for left, D-key for right, space bar to shoot

While using the keyboard to operate the vehicle, the relevant keys were used to determine their functionality. The W-key resulted in forward motion, the S-key resulted in backward motion, the A-key resulted in counterclockwise rotation or left turning, the D-key resulted in clockwise rotation or right turning, and the space bar resulted in the firing of the IR gun.
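The key-to-command mapping verified above could be dispatched with a simple switch. This is an illustrative sketch, not the team's actual GUI code; the Command names are hypothetical.

```cpp
// Hypothetical keyboard dispatch matching the verified mapping: W/A/S/D
// steer the vehicle and the space bar fires the IR gun. The Command names
// are illustrative; the actual GUI code may be organized differently.
enum class Command { Forward, Left, Backward, Right, Fire, None };

Command mapKey(char key) {
    switch (key) {
        case 'w': case 'W': return Command::Forward;
        case 'a': case 'A': return Command::Left;
        case 's': case 'S': return Command::Backward;
        case 'd': case 'D': return Command::Right;
        case ' ':           return Command::Fire;
        default:            return Command::None;  // ignore all other keys
    }
}
```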

g. [3.1.2]: The system should utilize a separate window for video feed

Verify that the video feed is displayed in its own window. A separate internet explorer window is used to view the video feed.

h. [3.1.3]: Software should communicate applicable system errors to user

Verify that relevant errors are displayed to the user via the interface. There was one error found, and this was a wi-fi disconnection. When the system is disconnected, the system alerts the user via the GUI.

i. [3.1.4]: A reload bar should notify the user when reload time is done

The IR gun was fired to verify that there is a reload bar on the command window that displays when the IR gun is able to fire. The reload bar starts full; when the gun is fired, it drops to zero and then refills over a period of 2 seconds.

j. [3.1.5]: User should be notified when vehicle is tagged

The IR gun was fired at the sensor, and the GUI was observed to determine if the user will be notified when the vehicle is hit. A message is displayed and an alert noise sounds when the vehicle is tagged to visually and audibly notify the user.

k. [3.2.1]: The robotic system should be placed on a small, two-wheeled chassis

Upon observation of the vehicle, the robotic system was confirmed to be controlled via a two-wheeled chassis and an omnidirectional roller support.

l. [3.4.1]: The system should be user friendly to the point where a non-team member could be taught all aspects of how to operate the vehicle in 15 minutes or less with less than 5 operational errors

Introduce a non-team member to the system and show them how to fully operate the vehicle. The teaching session should be timed and stopped at 15 minutes, at which point the user will be asked to operate the vehicle and carry out all functions with fewer than 5 operational errors. This 15-minute teaching session was carried out, and the user successfully performed all tasks without exceeding 5 errors.

m. [4.1.1]: System should process as fast as hardware allows with no unnecessary time delays

Verify that there is no significant time delay in the overall functioning of the system. The system operates close to real time, and the code was examined to make sure there were no unnecessary interrupts or delays.

n. [4.2.1]: The system should have a manual on/off for the batteries and camera

Verify that there are manual power switches for the batteries and camera. There is a single main power switch on the robot chassis that controls the power to the batteries, boards, and camera.


o. [4.3.1]: Operator should be able to visually detect an obstacle 3 feet in front of the vehicle through the wi-fi camera

Place an object unknown to a team member in the view of the wi-fi camera and have the member detect the object through the video feed. A monster energy drink was placed 3 ft in front of the vehicle. A user with no knowledge of the item was able to identify the object and determine the approximate size.

p. [4.4.1]: System must be able to operate on a linoleum floor

Operate the vehicle on a linoleum floor and verify full functionality. The vehicle was operated on a linoleum floor and the system operated appropriately and consistently.

q. [4.4.2]: Battery life should exceed 50 minutes at full performance

Power on the vehicle and operate at full speed to determine the battery life. The robot was fully functional for more than 60 minutes without losing power.

r. [4.4.3]: Controller and camera range should be more than 50 feet

Test that the vehicle can operate remotely when placed 50 ft from the controller. The vehicle was driven across the length of the room (past 50 ft from the controller) and remained fully operational with slight video lag as the distance increased.

s. [5.2.2]: Command software should interface with the Arduino processor and board

The command software interfaces the user input with the Arduino hardware wirelessly, allowing the robot to operate.

t. [5.2.3]: Arduino processor and board should interface with wheel motors

After the user commands are sent to the Arduino processor, the board interfaces with the wheel motors and allows for movement.

u. [5.2.4]: Arduino processor and board should interface with IR gun and sensor

The user commands are sent to the Arduino processor, the board interfaces with the IR gun and allows for firing. The processor also interfaces with the IR sensor and keeps track of any “tags”.

v. [5.2.5]: Battery will interface with motors and hardware

The hardware was examined, and the batteries are directly connected. Upon activating the switch, the batteries power the motor and the Arduino boards used for operation.

w. [5.3.1]: System should be able to execute multiple commands at once

While operating the vehicle, various combinations of multiple, simultaneous inputs were tested. The user was able to move forward or backward while also turning left or right, and the IR gun was fully functional while the vehicle was in motion.
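One common way to honor simultaneous forward/turn inputs on a two-wheeled chassis is differential-drive mixing: the turn input adds to one wheel's speed and subtracts from the other's. The sketch below is a hypothetical illustration of that idea, not the team's actual motor-control implementation.

```cpp
#include <algorithm>

// Hypothetical differential-drive mixer: simultaneous inputs (throttle and
// turn, each in [-1, 1]) are blended into per-wheel speeds, so "forward
// while turning" maps naturally onto the two drive motors.
struct WheelSpeeds {
    double left;
    double right;
};

WheelSpeeds mix(double throttle, double turn) {
    auto clamp = [](double v) { return std::max(-1.0, std::min(1.0, v)); };
    // Turning adds speed to one wheel and removes it from the other.
    return { clamp(throttle + turn), clamp(throttle - turn) };
}
```

With pure turn input the wheels spin in opposite directions, producing the rotation-in-place behavior the report describes.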

x. [5.4.1]: The vehicle should weigh less than 15 pounds

Verify that the vehicle weighs less than 15 lbs on a scale or other measuring device. The vehicle with all accessories attached was weighed at a combined 2.3 lbs.

y. [5.4.2]: The vehicle dimensions should be less than 2 ft x 2 ft x 2 ft

A tape measure was used to determine the length, width, and height of the vehicle to ensure it does not exceed the required dimensions.


z. [6.1.1]: All information must be proprietary to team

All locations that host information regarding the project will be checked to ensure proper steps are taken to prevent access by any non-team member. It was confirmed that the CIO is the only person who controls who can view the official documents uploaded to the team website.

aa. [7.1.1]: All instruments must be returned to the customer at the end of the semester

The customer will be contacted upon completion of the semester and asked if all the instruments have been returned to them.

bb. [7.1.2]: Customer must be reimbursed for any broken instruments

The customer will be contacted upon completion of the project and asked if any instruments have been broken. In the event that there are any broken instruments, the customer will be asked if they have been reimbursed.

cc. [9.1.1]: All electronic devices and structural components should be off the shelf products

All of the electronic devices and structural components were recorded and research was conducted to determine that all items were able to be purchased off the shelf. However, this requirement was not fully met since the IR gun contains custom circuitry.

dd. [10.1.1]: Vehicle performance is limited by the specific instrumentation given by the customer

A list of all the instruments used on the vehicle was compiled and then compared to the list of instrumentation given by the customer to confirm that the vehicle's performance does not depend on devices not approved by the customer.

ee. [10.3.1]: No outside equipment or instrumentation may be used

A list of all the instrumentation used in the vehicle was compiled and then compared to the list of all the instruments given by the customer to confirm the vehicle performance does not depend on devices that were not approved by the customer.

ff. [11.1]: Customer reserves the right to reasonably change requirements based on evolving needs

3. Verification Failures

Throughout the process of requirements verification, several problems occurred which caused setbacks. The inability to verify certain requirements is a major issue, but one that must be addressed. In some cases it is best to either alter the requirement to make it more achievable or remove it entirely because of its difficulty or lack of reasonableness. When neither of these options is available, an alternate solution must be found.

a. [3.1.6]: Controller software should keep count of opposing tags

Unfortunately, this requirement could not be met. The software is capable of alerting the user that their vehicle has been tagged, but it cannot keep count of how many hits have occurred over a given period of time. However, tags can be counted manually by observing the visual and audible signals produced when the vehicle is hit.
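A running tally of the kind this requirement asks for would be a small addition to the controller software. The sketch below is a hypothetical illustration; the shipped software only printed a plain 'hit' message, and these names are ours.

```cpp
#include <string>

// Sketch of the missing running tally: each registered hit bumps a counter
// and the GUI message could carry the total. Names are hypothetical.
struct TagCounter {
    int tags = 0;

    std::string onHit() {
        ++tags;
        return "Hit! Total tags: " + std::to_string(tags);
    }
};
```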

b. [3.2.2]: Vehicle should not contain any instrumentation other than the given processor, wi-fi board, wi-fi camera, IR sensor/receiver, and battery

The vehicle and all accessories were examined to verify the possession of the necessary instrumentation. It was confirmed that the vehicle contained the given Arduino processor, wi-fi board, wi-fi camera, IR sensor, and battery. In addition, an LED light was used beyond this given instrumentation; it lights up when the IR sensor has been hit, serving as supplemental notification that the vehicle has been tagged. Although extra materials were used and the requirement was not fully verified, the LED light does not alter the functionality of the vehicle or provide any advantage in the competition.

D. Requirements Validation

1. Preliminary Customer Validation

This preliminary list was compiled after meeting with the customer to see what the ultimate goals were for the project. These are the guidelines that were used for the validation process while testing the robot as well as for the final customer validation. Any subsequent specification changes were accounted for appropriately.

a. The system should consist of a small three-wheeled robot chassis and an onboard Arduino processor, IR gun and sensors, and wi-fi camera.
b. The robot should be controlled through a user interface on a laptop.
c. The person controlling the robot will not be in the same room as the competition, and should be a trained team member from a subgroup other than coding.
d. The system should also be reliable – especially on the day of the competition, i.e., no IP address issues or Wi-Fi connection errors.
e. The vehicle will have a wi-fi camera attached to it, and the user will see the video feed on the GUI.
f. The vehicle should be able to shoot the other robot using the infrared gun.
g. The vehicle should be able to evade the other robot.
h. There must be a two-second reload time between shots.
i. The vehicle should be able to move through a series of obstacles to get to the other robot and shoot it.
j. Victory is achieved by tagging the opponent's vehicle more times than they tag yours.
k. Above all, the ultimate goal is to win the competition.

2. Internal Validation of Project Documentation

During the verification phase of the project, the above specifications were checked to see if they would meet the customer’s satisfaction. Along with the tests conducted by the verification team, the validation team used the same tests to see if the requirements were validated for the customer.

a. The robot chassis consisted of two rubber wheels with one roller ball as the third wheel support. This chassis also held the Arduino processor, IR gun and sensors, and the Wi-Fi camera.

b. The robot can be controlled through the keyboard or an Xbox controller. It is important to note that the controller takes higher priority than the keyboard option. But in the event that the controller is not able to be used, the keyboard will suffice. The user interface properly displays the camera feed and the two second reload bar.

c. The person controlling the robot is able to be within 50 ft of the robot with minimal lag. This was tested by placing the user and the laptop outside of the classroom where the competition will be held. This system works only when the router is in the same room as the robot.

d. Throughout the initial testing, the robot experienced random brownout errors, but it was able to display to the user that it had disconnected from the system. In these cases, the robot would have to be rebooted or reconnected. After a few fixes, this type of issue only happened while the robot was running at maximum speed, which would generate noise from the wheels that interfered with the signal. The brownout issue is the only remaining reliability concern for the robot.

e. The Wi-Fi camera is able to display its image on the webpage that is accessed through the camera’s IP address. This showed no errors unless there is a brownout issue which freezes the image display.

f. In the display that is shown to the user, there is a reload bar that refills over two seconds, preventing the user from firing while reloading.

g. The usability of the IR gun has been confirmed, as well as the detection of hits by the IR sensors. At this moment, the sensors are not working but will be fixed by the day of the competition.

h. The feedback from the camera is close to real time and is therefore a good source for navigating the robot to evade the enemy.

i. The robot is able to maneuver through obstacles rapidly and is therefore able to evade the enemy robot.

j. As the system is currently programmed, it is not able to count the number of hits. However, with the customer-provided LEDs, which light up when the robot is hit, a designated team member is able to count the number of hits. With the IR gun, sensors, and LEDs functioning properly and the counting in place, it will be possible to determine which robot scored the most hits.

Page 33: Software Engineering Final Report - Blue Team

k. The ultimate validation will come on the day of the competition, but if all components of the system work together with no brownout errors, the robot should be able to win the competition.

VIII. Conclusion

In summary, this team has undertaken traditional course activities this semester as well as participated in this software design process. Through the course of this project, many of the team members have remarked how beneficial it was to truly experience the software engineering process. The process was, at times, "messy"; however, this is the kind of real-world experience many members of the team believe we as students need to be successful.

Appendix

A few images from the day of the competition are provided below in Figure 20 and Figure 21.

Figure 20. Close-up of the Blue Team's Robot

Figure 21. The Robots Compete in the Competition

Acknowledgments

The Blue Team would like to thank Dr. Lyle N. Long, Distinguished Professor of Aerospace Engineering and Mathematics, for his mentorship through this undertaking. The Blue Team would also like to recognize the technical support efforts of Vidullan Surendran as we undertook this project. Finally, the Blue Team would like to recognize the White Team for being a cooperative partner in mutual testing endeavors, particularly as it related to sensor and receiver testing; they have been a very worthy adversary.

ID | Task Name | Deadline | Duration | Start | Finish
1 | Team and group choices | Wed 1/15/14 | 3 days | Fri 1/17/14 | Tue 1/21/14
2 | Team formations announced | Wed 1/22/14 | 1 day | Fri 1/17/14 | Fri 1/17/14
3 | CEO, CIO and CFO of each team selected by teams | Wed 1/22/14 | 1 day | Fri 1/17/14 | Fri 1/17/14
4 | Team documentation website in place | Mon 1/27/14 | 4 days | Mon 1/20/14 | Thu 1/23/14
5 | CEO Establish Gantt Chart | Fri 1/31/14 | 5 days | Mon 1/27/14 | Fri 1/31/14
6 | Cost Estimate | Mon 1/27/14 | 10 days | Mon 1/20/14 | Fri 1/31/14
7 | Receive estimates from group leads | Mon 1/27/14 | 7 days | Wed 1/22/14 | Thu 1/30/14
8 | Completed cost estimate | Fri 1/31/14 | 8 days | Wed 1/22/14 | Fri 1/31/14
9 | Establish and maintain hours records | Thu 1/23/14 | 3 days | Mon 1/20/14 | Wed 1/22/14
10 | Requirements Presentation | Mon 2/3/14 | 1 day | Wed 1/15/14 | Wed 1/15/14
11 | CONOPS | NA | 14 days | Mon 1/20/14 | Thu 2/6/14
12 | Contact Dr. Long for any additional requirements | NA | 14 days | Mon 1/20/14 | Thu 2/6/14
13 | Traceability Matrix, relevance & dependency to each other | NA | 14 days | Mon 1/20/14 | Thu 2/6/14
14 | Finalize requirements documentation | NA | 14 days | Mon 1/20/14 | Thu 2/6/14
15 | Create req. presentation | Sun 2/2/14 | 14 days | Mon 1/20/14 | Thu 2/6/14
16 | Design Presentation | Mon 2/17/14 | 1 day | Wed 1/15/14 | Wed 1/15/14
17 | Requirements ratings and traceability matrix breakdown | Mon 2/17/14 | 1 day | Mon 2/3/14 | Mon 2/3/14
18 | Preliminary high level sequence, structural and data flow diagrams | Mon 2/17/14 | 1 day | Mon 2/3/14 | Mon 2/3/14
19 | Detailed high level sequence, state transition and data flow diagrams | Mon 2/17/14 | 1 day | Mon 2/3/14 | Mon 2/3/14
20 | Final detailed low level sequence, structural, state transition and data flow diagrams | Mon 2/17/14 | 1 day | Wed 2/12/14 | Wed 2/12/14
21 | Coding Demonstration | Mon 2/24/14 | 1 day | Wed 1/15/14 | Wed 1/15/14
22 | Get Wi-Fi to Arduino communication operational | Mon 2/24/14 | 11 days | Fri 1/24/14 | Fri 2/7/14
23 | Command robot wirelessly by sending simple command line inputs | Mon 2/24/14 | 6 days | Fri 1/24/14 | Fri 1/31/14
24 | Coding Presentation | Wed 3/19/14 | 46 days | Wed 1/15/14 | Wed 3/19/14
25 | Interfacing gamepad controller inputs | Wed 3/19/14 | 17 days | Sat 2/15/14 | Sat 3/8/14
26 | Integration of WiFi and Robot wheels/motors | Wed 3/19/14 | 21 days | Sat 2/8/14 | Fri 3/7/14
27 | Functioning Camera | Mon 3/31/14 | 6 days | Fri 1/24/14 | Fri 1/31/14
28 | Power to camera | Fri 3/28/14 | 22 days | Sat 2/15/14 | Sat 3/15/14
29 | IR Gun firing | Mon 3/31/14 | 8 days | Fri 3/7/14 | Tue 3/18/14
30 | IR Receiver functioning | Mon 3/31/14 | | Fri 3/7/14 |
31 | Testing Presentation | Mon 4/7/14 | 1 day | Mon 4/7/14 | Mon 4/7/14
32 | Assemble Hardware | Mon 2/10/14 | 7 days | Sat 1/25/14 | Sat 2/1/14
33 | Create testing outline | Fri 2/14/14 | 11 days | Mon 1/27/14 | Mon 2/10/14
34 | Assign testing subgroups | Wed 2/19/14 | 6 days | Mon 2/10/14 | Mon 2/17/14
35 | Develop subgroup testing plans | Mon 3/3/14 | 8 days | Wed 2/19/14 | Fri 2/28/14
36 | Component Testing | Mon 3/31/14 | 22 days | Fri 2/28/14 | Mon 3/31/14
37 | Moving Capability | Mon 3/31/14 | 22 days | Fri 2/28/14 | Mon 3/31/14
38 | Camera Feed | Mon 3/31/14 | 22 days | Fri 2/28/14 | Mon 3/31/14
39 | Miscellaneous, including IR gun and collision testing | Mon 3/31/14 | 22 days | Fri 2/28/14 | Mon 3/31/14
40 | Integration Testing | Mon 3/31/14 | 22 days | Fri 2/28/14 | Mon 3/31/14
41 | Movement Control | Mon 3/31/14 | 22 days | Fri 2/28/14 | Mon 3/31/14
42 | IR Gun Functionality | Mon 3/31/14 | 22 days | Fri 2/28/14 | Mon 3/31/14
43 | Miscellaneous, including speed variability | Mon 3/31/14 | 22 days | Fri 2/28/14 | Mon 3/31/14
44 | Defect Testing | Mon 4/7/14 | 26 days | Fri 2/28/14 | Fri 4/4/14
45 | Use all features together simultaneously | Mon 4/7/14 | 26 days | Fri 2/28/14 | Fri 4/4/14
46 | Checking that "errors" properly occur | Mon 4/7/14 | 26 days | Fri 2/28/14 | Fri 4/4/14
47 | Stress and Release Testing | Mon 4/7/14 | 37 days | Fri 2/28/14 | Mon 4/21/14
48 | WiFi range and controls | Mon 4/7/14 | 37 days | Fri 2/28/14 | Mon 4/21/14
49 | Button mashing | Mon 4/7/14 | 37 days | Fri 2/28/14 | Mon 4/21/14
50 | Verifying test results | Mon 4/7/14 | 37 days | Fri 2/28/14 | Mon 4/21/14
51 | Testing Dry Run | Wed 4/23/14 | 1 day | Wed 4/23/14 | Wed 4/23/14
52 | V&V Presentation | Mon 4/14/14 | 56 days | Mon 1/27/14 | Mon 4/14/14
53 | Create V&V Project Outline | Mon 2/3/14 | 6 days | Mon 1/27/14 | Mon 2/3/14
54 | Develop subgroup V&V plans | Mon 2/10/14 | 6 days | Mon 2/3/14 | Mon 2/10/14
55 | Website Verification | Mon 4/14/14 | 21 days | Mon 3/17/14 | Mon 4/14/14
56 | Break website into deliverables by subgroup | Mon 3/24/14 | 6 days | Mon 3/17/14 | Mon 3/24/14
57 | Contact subgroups w/ their required deliverables and deadlines | Mon 3/31/14 | 6 days | Mon 3/24/14 | Mon 3/31/14
58 | Assemble list of completed deliverables | Mon 4/7/14 | 6 days | Mon 3/31/14 | Mon 4/7/14
59 | Requirements Verification | Mon 4/14/14 | 21 days | Mon 3/17/14 | Mon 4/14/14
60 | Assembly task list (formal wording) | Mon 4/7/14 | 16 days | Mon 3/17/14 | Mon 4/7/14
61 | Performed tests | Mon 4/14/14 | 6 days | Mon 4/7/14 | Mon 4/14/14
62 | Validation | Mon 4/28/14 | 0 days | Tue 3/11/14 | Tue 3/11/14
63 | Meet with Customer | Mon 3/17/14 | 1 day | Mon 3/17/14 | Mon 3/17/14
64 | Compile checklist of validations | Mon 3/24/14 | 5 days | Tue 3/18/14 | Mon 3/24/14
65 | Final product validation | Mon 4/28/14 | 18 days | Mon 4/7/14 | Wed 4/30/14
66 | Final Demonstration | Wed 4/30/14 | 1 day | Wed 4/30/14 | Wed 4/30/14
67 | Finalized Final Report | Fri 5/2/14 | 0 days | Mon 4/21/14 | Mon 4/21/14
68 | Executive Summary | Thu 5/1/14 | 2 days | Mon 4/21/14 | Tue 4/22/14
69 | Financial Report | Mon 4/28/14 | 2 days | Mon 4/21/14 | Tue 4/22/14
70 | Information Section | Mon 4/28/14 | 2 days | Mon 4/21/14 | Tue 4/22/14
71 | Requirements Report | Mon 4/28/14 | 6 days | Mon 4/21/14 | Mon 4/28/14
72 | Design Report | Mon 4/28/14 | 6 days | Mon 4/21/14 | Mon 4/28/14
73 | Coding Report | Wed 4/30/14 | 7 days | Mon 4/21/14 | Tue 4/29/14
74 | Testing Report | Wed 4/30/14 | 8 days | Mon 4/21/14 | Wed 4/30/14
75 | V&V Report | Wed 4/30/14 | 8 days | Mon 4/21/14 | Wed 4/30/14
76 | Editing/Review | Fri 5/2/14 | 2 days | Thu 5/1/14 | Fri 5/2/14
77 | Printing | Fri 5/2/14 | 1 day | Fri 5/2/14 | Fri 5/2/14
78 | Binding | Fri 5/2/14 | 1 day | Fri 5/2/14 | Fri 5/2/14

[Gantt chart graphic: "Project: Blue Team Gantt Chart, Date: Fri 5/2/14," spanning Jan 19, '14 through May 25, '14. Task bars are labeled with their assigned resources (AERSP 440, Dr. Long, Blue Team, CEO, CFO, CIO, Ty Druce, and the Requirements, Design, Coding, Testing, and V&V groups), with milestones marked at 3/11 and 4/21. The standard Microsoft Project legend (Task, Split, Milestone, Summary, Deadline, Progress, etc.) appears below the chart.]

Blue Team Concept of Operations & Requirements

Rev. G Page 1 of 8

Penn State Department of Aerospace Engineering

Aerospace 440 Spring 2014

Blue Team

Concept of Operations & Requirements

for

Controlled Robotic Vehicle

Revision G

Submitted: 05/02/2014

Signatures

Submitted by: Tyler Druce, Requirements Manager    Date: xx/xx/xxx
Submitted by: Brad Sottile, Chief Executive Officer    Date: xx/xx/xxx

Submitted to: Dr. Lyle N. Long

Review Notes

During the verification process, a couple of requirements were deemed either untestable, unnecessary, or not codeable. These have been removed but are documented by a strikethrough in this report.

Table of Contents

1 SYSTEM CONTEXT & SCOPE
    1.1 SCOPE OF THE SYSTEM
    1.2 WORKFLOW PARTITIONING
    1.3 PRODUCT BOUNDARY
    1.4 USE CASE LIST

2 FUNCTIONAL AND DATA REQUIREMENTS
    2.1 FUNCTIONAL REQUIREMENTS
    2.2 DATA REQUIREMENTS

3 LOOK, FEEL AND USE REQUIREMENTS
    3.1 USER INTERFACE REQUIREMENTS
    3.2 STYLE OF THE PRODUCT REQUIREMENTS
    3.3 EASE OF USE REQUIREMENTS
    3.4 EASE OF LEARNING REQUIREMENTS

4 PERFORMANCE REQUIREMENTS
    4.1 SPEED REQUIREMENTS
    4.2 SAFETY CRITICAL REQUIREMENTS
    4.3 PRECISION REQUIREMENTS
    4.4 RELIABILITY AND AVAILABILITY REQUIREMENTS
    4.5 CAPACITY AND SCALABILITY REQUIREMENTS

5 OPERATIONAL REQUIREMENTS
    5.1 EXPECTED TECHNOLOGICAL ENVIRONMENT REQUIREMENTS
    5.2 PARTNER APPLICATIONS & INTERFACES REQUIREMENTS
    5.3 SUPPORTABILITY REQUIREMENTS
    5.4 MAINTAINABILITY AND PORTABILITY REQUIREMENTS

6 SECURITY REQUIREMENTS
    6.1 SYSTEM CONFIDENTIALITY REQUIREMENTS
    6.2 DATA INTEGRITY REQUIREMENTS
    6.3 AUDIT REQUIREMENTS

7 LEGAL REQUIREMENTS

8 STANDARDS REQUIREMENTS

9 OFF-THE-SHELF SOLUTIONS REQUIREMENTS
    9.1 COTS SYSTEM SOLUTIONS REQUIREMENTS
    9.2 READY-MADE COMPONENTS REQUIREMENTS

10 CONSTRAINTS AND ASSUMPTIONS
    10.1 SOLUTION CONSTRAINTS
    10.2 EXTERNAL FACTORS
    10.3 ASSUMPTIONS

11 FUTURE REQUIREMENTS

1 SYSTEM CONTEXT & SCOPE

1.1 SCOPE OF THE SYSTEM

The system to be developed is a robotic vehicle that can execute commands given by a user at a separate location through a wi-fi connection. It will be driven through a laptop or game controller and use a wi-fi camera to provide visual data to the user. The vehicle must be able to avoid obstacles in order to tag the opposing vehicle with its IR gun, as well as receive tags from the opposing vehicle on its IR sensor. The vehicle will have a designated reload time as provided by the customer. The goal of the user is to tag the opposing vehicle as many times as possible while avoiding tags from the opposing vehicle.

1.2 WORKFLOW PARTITIONING

The workflow should follow the diagram below. A V cycle should be utilized. The coding team tests and verifies blocks of code in increments in order to keep progress flowing and allow time to address issues and changing customer needs.

Figure 1. Workflow Diagram

1.3 PRODUCT BOUNDARY

The team should assemble the vehicle using given off-the-shelf products and create the software necessary for its control. The vehicle must meet all the requirements designated by the customer and should meet all following requirements as determined by requirements engineering. Testing must confirm the vehicle is fully operational and meets design criteria before V&V may verify and validate that all requirements are met. A user should be trained to control the vehicle to compete in the competition.

1.4 USE CASE LIST

2 FUNCTIONAL AND DATA REQUIREMENTS

2.1 FUNCTIONAL REQUIREMENTS

2.1.1 The vehicle must be entirely remote controlled

2.1.1.1 The vehicle controller system must consist of only a laptop, wi-fi receiver, any controller devices, and a single user

2.1.2 The vehicle must transmit as close to real time video feed as possible via a given wi-fi camera to the controller

2.1.3 The vehicle must be able to execute all commands given from the controller

2.1.3.1 Vehicle must be able to move forwards, backwards, rotate right, and rotate left

2.1.3.2 Vehicle must have a designated button to fire the IR gun

2.1.3.3 Vehicle must execute commands given wirelessly from the controller utilizing an Arduino wi-fi board and an Arduino processor

2.1.4 Vehicle must have an IR gun and sensor

2.1.4.1 The IR gun must be able to fire every 2 seconds, as specified by the customer as the minimum interval

2.1.4.2 The IR sensor must be continuously on

2.1.5 The vehicle must withstand ramming full speed into a rigid object with no critical structural damage

2.2 DATA REQUIREMENTS

2.2.1 Software must be written using C or C++

2.2.2 Data link between the controller and vehicle must be wireless

2.2.2.1 Software must utilize and integrate an Arduino processor, Arduino wi-fi board, Arduino motor control board, a laptop, IR gun, and IR sensor

3 LOOK, FEEL AND USE REQUIREMENTS

3.1 USER INTERFACE REQUIREMENTS

3.1.1 All control commands should be executed using an Xbox controller (primary) or using the WASD keys on the keyboard (secondary).

3.1.1.1 The control commands should be mapped to the Xbox controller as follows: left joystick for forward/backwards, right joystick for left/right, right trigger to shoot

3.1.1.2 The control commands should be mapped to the keyboard as follows: W-key for forward, S-key for backwards, A-Key for left, D-Key for right, space bar to shoot

3.1.2 The system should utilize a separate window for video feed

3.1.3 Software should communicate applicable system errors to user

3.1.4 A reload bar should notify the user when reload time is done

3.1.5 User should be notified when vehicle is tagged

3.1.6 Controller software should keep count of opposing tags

3.2 STYLE OF THE PRODUCT REQUIREMENTS

3.2.1 The robotic system should be placed on a small, two-wheeled chassis.

3.2.2 Vehicle should not contain any other instrumentation than the given processor, wi-fi board, wi-fi camera, IR sensor/receiver, and battery

3.3 EASE OF USE REQUIREMENTS

N/A

3.4 EASE OF LEARNING REQUIREMENTS

3.4.1 The system should be user friendly to the point where a non-team member could be taught all aspects of how to operate the vehicle in 15 minutes or less with less than 5 operational errors

4 PERFORMANCE REQUIREMENTS

4.1 SPEED REQUIREMENTS

4.1.1 System should process as fast as hardware allows with no unnecessary time delays

4.2 SAFETY CRITICAL REQUIREMENTS

4.2.1 The system should have a manual on/off for the batteries and camera

4.2.2 The system should have a remote kill switch initiated by the user on the laptop

4.3 PRECISION REQUIREMENTS

4.3.1 Operator should be able to visually detect an obstacle 3 feet in front of the vehicle through the wi-fi camera

4.3.2 Camera feed should include a dot, circle, or crosshair as an aiming device

4.4 RELIABILITY AND AVAILABILITY REQUIREMENTS

4.4.1 System must be able to operate on a linoleum floor

4.4.2 Battery life should exceed 50 minutes at full performance

4.4.3 Controller and camera range should be more than 50 feet

4.5 CAPACITY AND SCALABILITY REQUIREMENTS

5 OPERATIONAL REQUIREMENTS

5.1 EXPECTED TECHNOLOGICAL ENVIRONMENT REQUIREMENTS

5.2 PARTNER APPLICATIONS & INTERFACES REQUIREMENTS

5.2.1 Wi-fi camera must interface with command software

5.2.2 Command software should interface with the Arduino processor and board

5.2.3 Arduino processor & board should interface with wheel motors

5.2.4 Arduino processor and board should interface with IR gun & sensor

5.2.5 Battery will interface with motors and hardware

5.3 SUPPORTABILITY REQUIREMENTS

5.3.1 System should be able to execute multiple commands at once

5.4 MAINTAINABILITY AND PORTABILITY REQUIREMENTS

5.4.1 The vehicle should weigh less than 15 pounds

5.4.2 The vehicle dimensions should be less than 2x2x2 feet

6 SECURITY REQUIREMENTS

6.1 SYSTEM CONFIDENTIALITY REQUIREMENTS

6.1.1 All information must be proprietary to team

6.2 DATA INTEGRITY REQUIREMENTS

6.3 AUDIT REQUIREMENTS

N/A

7 LEGAL REQUIREMENTS

7.1.1 All instruments must be returned to the customer at the end of the semester

7.1.2 Customer must be reimbursed for any broken instruments

8 STANDARDS REQUIREMENTS

N/A

9 OFF-THE-SHELF SOLUTIONS REQUIREMENTS

9.1 COTS SYSTEM SOLUTIONS REQUIREMENTS

9.1.1 All electronic devices and structural components should be off the shelf products

9.2 READY-MADE COMPONENTS REQUIREMENTS

N/A

10 CONSTRAINTS AND ASSUMPTIONS

10.1 SOLUTION CONSTRAINTS

10.1.1 Vehicle performance is limited by the specific instrumentation given by the customer

10.2 EXTERNAL FACTORS

N/A

10.3 ASSUMPTIONS

10.3.1 No outside equipment or instrumentation may be used

11 FUTURE REQUIREMENTS

11.1 Customer reserves the right to reasonably change requirements based on evolving needs

AERSP 440: Robot User Manual

(BLUE TEAM)

I. Robot Set-up

a. Connect laptop to router:

Username: Cisco14266 Password: abcd1234

b. Make sure the battery is charged before use

c. Make sure the battery is plugged into the Arduino board

d. Turn on the robot with the silver switch on the top (light should turn on)

e. Use USB printer cable to connect Arduino to computer

f. Run Arduino IDE serial monitor to obtain IP for robot

g. Open folder with executable inside and run the executable

h. Window should pop-up and prompt for IP address of the robot

i. Enter the robot IP address and hit enter

j. Window should display reload bar for IR gun; the IR gun can be fired again once the bar has been filled (2 seconds)

k. (optional) Connect USB controller to use controller scheme, otherwise default is keyboard controls

Notes: *Make sure the window for the executable is the active window in Windows

II. Camera Set-up

a. Run camera software ‘IPCamera’ to obtain camera IP address

b. Open Chrome and type in the IP address found

c. Input camera information:

Username: blue Password: kelvinisrad

d. Click on ‘Server Push Mode’ to access camera feed

Notes: *May need to wait a few seconds for camera IP to become available *If camera cannot connect to router, see camera user manual for setup

III. Controls

a. Xbox Controller

a.i. Left Joystick – Forward and Backward Motion

a.ii. Right Joystick – Left and Right Turning Motion

a.iii. Left Bumper – Speed Down

a.iv. Right Bumper – Speed Up

a.v. Right Trigger – Fire IR Gun

Figure 1: Xbox controller layout

Notes: *Both joysticks can be used for 8-way compass control

b. Keyboard

W – Forward Motion

S – Backward Motion

A – Turn Left

D – Turn Right

Up Arrow – Speed Up

Down Arrow – Speed Down

Space Bar – Fire IR Gun

Figure 2: Keyboard controller layout

Notes: *Keyboard can only be used when Xbox controller is not plugged in

IV. Troubleshooting

Arduino can’t connect:
- Wrong username and password for router

Client can’t connect:
- Wrong IP address for robot

Camera can’t connect:
- Check camera user manual to reconnect camera to router
- Run IPCamera application to verify IP address of camera

Keyboard control won’t work:
- Make sure executable window is active (click in it)
- Is the controller plugged in? If so, it takes priority over keyboard

If robot disconnects and doesn’t auto reconnect:
- Restart executable first, then robot.

Source Code Table of Contents:

Headers:
    WifiConnection.h
    Game.h
    header.h

C++ Files:
    Game.cpp
    GameCleanup.cpp
    GameGetControllerInput.cpp
    GameGetKEyboardInput.cpp
    GameLogSDLError.cpp
    GameOnInit.cpp
    GameOnLoop.cpp
    GameOnRender.cpp
    Main.cpp
    WifiConnection.cpp
    WifiConnectionConnect.cpp
    WifiConnectionInitialize.cpp
    WifiConnectionLogWSAError.cpp
    WifiConnectionQuit.cpp
    WifiConnectionSendData.cpp

/** WifiConnection Header File
    Contains the information about the WifiConnection class
*/

#ifndef WIFICONNECTION_H_INCLUDED
#define WIFICONNECTION_H_INCLUDED

#include "header.h"

class WifiConnection
{
private:
    friend class game;              //allows the game class to access all private data of the WifiConnection class
    u_short PORT;                   //port number
    const char* IP_ADDRESS;         //character array that holds IP address
    WSADATA wsa;                    //needed for WinSock initialization
    SOCKET s;                       //socket
    struct sockaddr_in server;      //server struct that holds IP and port information
    Uint8 sendarray[5];             //control data array
    Uint8 arraysent[5];             //stores data just sent
    char *ptr_sendarray;            //pointer to control data
    bool connected;
    char recvarray;
    char *ptr_recvarray;

public:
    WifiConnection();
    void logWSAError(ostream &os, const string &msg);
    bool Initialize();
    bool Connect();
    bool SendData();
    void Quit();
    bool ReceiveHit();
};

#endif

/** Game Header File
    Contains the information about the game class, and also funnels all the header libraries into one place
    Also contains a WifiConnection object that handles all the networking capabilities
*/

#ifndef GAME_H_INCLUDED
#define GAME_H_INCLUDED

#include "header.h"
#include "WifiConnection.h"

const int SCREEN_WIDTH = 400;   //width of the game window
const int SCREEN_HEIGHT = 100;  //height of the game window
const SDL_Rect CONRECT = {10, 50, 97, 24};
const SDL_Rect DISRECT = {10, 50, 121, 24};
const SDL_Rect HITRECT = {10, 10, 44, 24};

class game
{
private:
    bool Running;                       //defines if the game is running
    SDL_Window * window;                //pointer for the game window
    SDL_Renderer * renderer;            //pointer for the window renderer
    SDL_Surface * loadSurface;
    SDL_Texture * TextConnect;
    SDL_Texture * TextDisconnect;
    SDL_Texture * TextHit;
    SDL_Event Event;                    //variable to hold event data
    SDL_GameController *controller;     //pointer for the game controller
    const Uint8 *keystate;              //pointer (used as an array) for the keystate information
    Uint8 speed, max_speed;             //initial speed
    Uint8 left_bias, right_bias;
    Sint16 AxisState_Left, AxisState_Right, AxisState_Trigger;  //state of controller axes
    int time_fire, time_now, time_sent, time_hit;   //time of last firing and current time
    WifiConnection Connection;          //WifiConnection object
    int deadzone;                       //Deadzone for gamepads
    bool firing;
    bool reloading;
    bool hit;
    int time_nowms, time_firems;

public:
    game();
    void logSDLError(ostream &os, const string &msg);
    int OnExecute();
    bool OnInit();
    bool LoadContent();
    void GetKeyboardInput();
    void GetControllerInput();
    void OnEvent(SDL_Event * Event);
    void OnLoop();
    void OnRender();
    void Cleanup();
};

#endif // GAME_H_INCLUDED

/** Header File
    Contains all the libraries (and perhaps later, function prototypes, but we should try to keep those in a relevant class structure)
    that the program needs to run.  In order to compile and run this program, SDL2.0 and Winsock2 are also required, and their respective
    libraries must be linked in your IDE.  SDL also has a .dll file that must be in the project folder for the executable to run.
*/

#ifndef HEADER_H_INCLUDED
#define HEADER_H_INCLUDED

#include <iostream>
#include <SDL.h>
#include <winsock2.h>
#include <stdint.h>
#include <ctime>

using namespace std;

#endif // HEADER_H_INCLUDED

/** Game Constructor and Execution Loop
    Contains both the constructor for game, which initializes the required variables,
    and the simple loop that controls the order in which things occur and update during the game.
    UPDATE:  Eliminated delay.  Should be balanced by the fact that the code only sends a new array
    when it changes, but we'll have to keep an eye on this.
*/

#include "Game.h"

game::game()    //game constructor
{
    window = NULL;
    renderer = NULL;
    loadSurface = NULL;
    TextConnect = NULL;
    TextDisconnect = NULL;
    TextHit = NULL;
    controller = NULL;
    keystate = SDL_GetKeyboardState(NULL);      //initializes a keyboard state that can be updated later
    speed = 100;
    max_speed = 150;                            //initial speed for keyboard input
    left_bias = right_bias = 0;
    Running = true;                             //the game is now running
    firing = false;
    reloading = false;
    hit = false;
    time_fire = 0;
    time_sent = 0;
    time_hit = 0;
    deadzone = 6000;
}

int game::OnExecute()
{
    if (OnInit() == false)                      //initializes everything needed to run the game
    {
        return -1;                              //if initialization failed, close the program
    }
    while (Running)                             //loops as long as game is running
    {
        SDL_PumpEvents();

        if (SDL_GameControllerGetAttached(controller) == SDL_TRUE)      //get controller input if controller is attached
        {
            GetControllerInput();
        }
        else
        {
            controller = NULL;                  //otherwise, reset controller pointer (in case it's been disconnected)
            GetKeyboardInput();                 //and get keyboard input
        }

        while (SDL_PollEvent(&Event))           //check the event log for any desired events
        {
            OnEvent(&Event);
        }

        OnLoop();                               //sends data
        OnRender();
    }

    Cleanup();      //once game stops running, clean up everything
    return 0;
}

/** Game Cleanup
    Before the game quits, this function cleans all the data and processes that were initialized at the start, including the networking routines
*/

#include "Game.h"

void game::Cleanup()
{
    Connection.Quit();                      //quit the networking routines

    SDL_GameControllerClose(controller);    //close the game controller
    SDL_DestroyRenderer(renderer);          //close the renderer
    SDL_DestroyWindow(window);              //close the game window
    SDL_Quit();                             //quit SDL
}

/** Game Controller Input
    Checks the state of all desired controller axes and sets data values that correspond to current state
    In the future, controller buttons can also be mapped for certain purposes
*/

#include "Game.h"

void game::GetControllerInput()
{
    Connection.sendarray[0] = 0;
    Connection.sendarray[1] = 0;
    Connection.sendarray[2] = 0;
    Connection.sendarray[3] = 0;

    AxisState_Right = SDL_GameControllerGetAxis(controller, SDL_CONTROLLER_AXIS_RIGHTX);
    AxisState_Left = SDL_GameControllerGetAxis(controller, SDL_CONTROLLER_AXIS_LEFTY);

    if ((AxisState_Right > Sint16(deadzone)))
    {
        Connection.sendarray[0] = speed + (30/130.0)*speed;
        Connection.sendarray[1] = Uint8(0);
        Connection.sendarray[2] = speed;
        Connection.sendarray[3] = Uint8(1);
    }
    else if ((AxisState_Right < Sint16(-deadzone)))
    {
        Connection.sendarray[0] = speed + (30/130.0)*speed;
        Connection.sendarray[1] = Uint8(1);
        Connection.sendarray[2] = speed;
        Connection.sendarray[3] = Uint8(0);
    }

    //the left joystick y axis is positive down, negative up
    if ((AxisState_Left > Sint16(deadzone)))        //it controls the left motor
    {
        Connection.sendarray[0] = speed + (30/130.0)*speed;
        Connection.sendarray[1] = Uint8(1);
        Connection.sendarray[2] = speed;
        Connection.sendarray[3] = Uint8(1);
    }
    else if ((AxisState_Left < Sint16(-deadzone)))
    {
        Connection.sendarray[0] = speed + (30/130.0)*speed;
        Connection.sendarray[1] = Uint8(0);
        Connection.sendarray[2] = speed;
        Connection.sendarray[3] = Uint8(0);
    }

    if (AxisState_Left < Sint16(-deadzone) && AxisState_Right > Sint16(deadzone))
    {
        Connection.sendarray[0] = speed + (30/130.0)*speed;
        Connection.sendarray[1] = Uint8(0);
        Connection.sendarray[2] = (6.5/10.0)*speed;
        Connection.sendarray[3] = Uint8(0);
    }
    else if (AxisState_Left < Sint16(-deadzone) && AxisState_Right < Sint16(-deadzone))
    {
        Connection.sendarray[0] = (6.5/10.0)*(speed + (30/130.0)*speed);
        Connection.sendarray[1] = Uint8(0);
        Connection.sendarray[2] = speed;
        Connection.sendarray[3] = Uint8(0);
    }

    if (AxisState_Left > Sint16(deadzone) && AxisState_Right > Sint16(deadzone))
    {
        Connection.sendarray[0] = speed + (30/130.0)*speed;
        Connection.sendarray[1] = Uint8(1);
        Connection.sendarray[2] = (6.5/10.0)*speed;
        Connection.sendarray[3] = Uint8(1);
    }
    else if (AxisState_Left > Sint16(deadzone) && AxisState_Right < Sint16(-deadzone))
    {
        Connection.sendarray[0] = (6.5/10.0)*(speed + (30/130.0)*speed);
        Connection.sendarray[1] = Uint8(1);
        Connection.sendarray[2] = speed;
        Connection.sendarray[3] = Uint8(1);
    }

    AxisState_Trigger = SDL_GameControllerGetAxis(controller, SDL_CONTROLLER_AXIS_TRIGGERRIGHT);    //the right trigger controls the firing mechanism
    if ((AxisState_Trigger > Sint16(deadzone)))
    {
        Connection.sendarray[4] = Uint8(1);
    }
    else
    {
        Connection.sendarray[4] = Uint8(0);
    }

    for (int i=0; i<5; i++)                 //prints out control data for debugging
    {
        cout << int(Connection.sendarray[i]) << endl;
    }
}

/** Game Keyboard Input
    Checks the state of all desired keys and sets data values that correspond to keys pressed
*/

#include "Game.h"

void game::GetKeyboardInput()
{
    if (keystate[SDL_SCANCODE_UP])      //pressing the up arrow key increases the speed, up to a max of 130
    {
        if (speed < Uint8(max_speed))
        {
            speed++;
        }
    }
    else if (keystate[SDL_SCANCODE_DOWN])       //pressing the down arrow key decreases the speed, down to a min of 0
    {
        if (speed > Uint8(0))
        {
            speed--;
        }
    }
    else {}

    //of the WASD keys, when more than one is pressed, only one will determine the control data
    //it may be helpful in the future to define scenarios in which multiple keys are pressed
    if (keystate[SDL_SCANCODE_W])       //W moves forward and has highest priority
    {
        Connection.sendarray[0] = speed + left_bias;
        Connection.sendarray[1] = Uint8(0);
        Connection.sendarray[2] = speed + right_bias;
        Connection.sendarray[3] = Uint8(0);
    }
    else if (keystate[SDL_SCANCODE_S])      //S moves backward and has second priority
    {
        Connection.sendarray[0] = speed + left_bias;
        Connection.sendarray[1] = Uint8(1);
        Connection.sendarray[2] = speed + right_bias;
        Connection.sendarray[3] = Uint8(1);
    }
    else if (keystate[SDL_SCANCODE_A])      //A turns left and has third priority
    {
        Connection.sendarray[0] = speed + left_bias;
        Connection.sendarray[1] = Uint8(1);
        Connection.sendarray[2] = speed + right_bias;
        Connection.sendarray[3] = Uint8(0);
    }
    else if (keystate[SDL_SCANCODE_D])      //D turns right and has fourth priority
    {
        Connection.sendarray[0] = speed + left_bias;
        Connection.sendarray[1] = Uint8(0);
        Connection.sendarray[2] = speed + right_bias;
        Connection.sendarray[3] = Uint8(1);
    }
    else                                    //if none are pressed, zero-value movement commands are sent
    {
        Connection.sendarray[0] = Uint8(0);
        Connection.sendarray[1] = Uint8(0);
        Connection.sendarray[2] = Uint8(0);
        Connection.sendarray[3] = Uint8(0);
    }

    if (keystate[SDL_SCANCODE_SPACE])       //the spacebar fires the IR gun
    {
        Connection.sendarray[4] = Uint8(1);
    }
    else                                    //if not pressed, gun will not fire
    {
        Connection.sendarray[4] = Uint8(0);
    }

    for (int i=0; i<5; i++)                 //prints out control data for debugging
    {
        cout << int(Connection.sendarray[i]) << endl;
    }
}

/** Game Log SDL Error
    Log an SDL error with some error message to the output stream of our choice
    @param os The output stream to write the message to
    @param msg The error message to write, format will be "<msg> error: <SDL_GetError()>" */

#include "Game.h"

void game::logSDLError(ostream &os, const string &msg)
{
    os << msg << " error: " << SDL_GetError() << endl;
}

   


/** Game Event
    This function receives an event off the top of the event log and checks if it is one that we are interested in
    It also handles the connection of a USB game controller */

#include "Game.h"

void game::OnEvent(SDL_Event* Event)
{
    if (Event->type == SDL_QUIT)        //if the game window is closed, the game will stop running
    {
        Running = false;
    }
    if (Event->type == SDL_CONTROLLERDEVICEADDED)       //if a controller is connected, it will be mapped and used in later iterations
    {
        cout << "Controller connected.\n";
        for (int i=0; i<SDL_NumJoysticks(); i++)
        {
            if (SDL_IsGameController(i))
            {
                controller = SDL_GameControllerOpen(i);
                if (controller)
                {
                    cout << "Controller Name: " << SDL_GameControllerName(controller) << endl;
                    break;
                }
                else
                {
                    logSDLError(cout, "SDL_GameControllerOpen");
                }
            }
        }
    }
    if (Event->type == SDL_CONTROLLERBUTTONDOWN)
    {
        if (Event->cbutton.button == SDL_CONTROLLER_BUTTON_LEFTSHOULDER)
        {
            if (max_speed >= 5)
            {
                max_speed = max_speed - 5;
                if (speed > max_speed)
                {
                    speed = max_speed;
                }
            }
            cout << "LEFT SHOULDER\n";
        }
        if (Event->cbutton.button == SDL_CONTROLLER_BUTTON_RIGHTSHOULDER)
        {
            max_speed = max_speed + 5;
            speed = max_speed;  //change this later
            cout << "RIGHT SHOULDER\n";
        }
    }
}

/** Game Initialization
    Initializes all the required processes to run the game, including creating a window and renderer,
    and establishing a network connection */

#include "Game.h"

bool game::OnInit()
{
    if (SDL_Init(SDL_INIT_EVERYTHING) < 0)          //initialize SDL
    {
        logSDLError(cout, "SDL_Init");
        return false;
    }

    if ((window = SDL_CreateWindow("SDL Render Clear", 100, 100, SCREEN_WIDTH, SCREEN_HEIGHT, SDL_WINDOW_SHOWN)) == NULL)       //create a window near the top left of the screen
    {
        logSDLError(cout, "SDL_CreateWindow");
        return false;
    }

    //create a renderer for that window so that images can be displayed later
    //(-1 asks SDL for the first driver supporting the requested flags)
    renderer = SDL_CreateRenderer(window, -1, SDL_RENDERER_ACCELERATED | SDL_RENDERER_PRESENTVSYNC);
    if (renderer == NULL)
    {
        logSDLError(cout, "SDL_CreateRenderer");
        return false;
    }

    if (Connection.Initialize() == false)       //initialize networking functions
    {
        return false;
    }
    if (Connection.Connect() == false)          //connect to network
    {
        return false;
    }

    for (int i=0; i<SDL_NumJoysticks(); i++)
    {
        if (SDL_IsGameController(i))
        {
            controller = SDL_GameControllerOpen(i);
            if (controller)
            {
                cout << "Controller Name: " << SDL_GameControllerName(controller) << endl;
                break;
            }
            else
            {
                logSDLError(cout, "SDL_GameControllerOpen");
            }
        }
    }

    loadSurface = SDL_LoadBMP("Connected.bmp");
    if (loadSurface == NULL)
    {
        logSDLError(cout, "SDL_LoadBMP_Connect");
        return false;
    }
    TextConnect = SDL_CreateTextureFromSurface(renderer, loadSurface);
    if (TextConnect == NULL)
    {
        logSDLError(cout, "SDL_CreateTextureFromSurface_Connect");
        return false;
    }
    SDL_FreeSurface(loadSurface);
    loadSurface = NULL;

    loadSurface = SDL_LoadBMP("Disconnected.bmp");
    if (loadSurface == NULL)
    {
        logSDLError(cout, "SDL_LoadBMP_Disconnect");
        return false;
    }
    TextDisconnect = SDL_CreateTextureFromSurface(renderer, loadSurface);
    if (TextDisconnect == NULL)     //was TextConnect, which would mask a failed Disconnected.bmp texture
    {
        logSDLError(cout, "SDL_CreateTextureFromSurface_Disconnect");
        return false;
    }
    SDL_FreeSurface(loadSurface);
    loadSurface = NULL;

    loadSurface = SDL_LoadBMP("Hit.bmp");
    if (loadSurface == NULL)
    {
        logSDLError(cout, "SDL_LoadBMP_Hit");
        return false;
    }
    TextHit = SDL_CreateTextureFromSurface(renderer, loadSurface);
    if (TextHit == NULL)            //was TextConnect, which would mask a failed Hit.bmp texture
    {
        logSDLError(cout, "SDL_CreateTextureFromSurface_Hit");
        return false;
    }
    SDL_FreeSurface(loadSurface);
    loadSurface = NULL;

    return true;
}

/** Game Loop
    This function contains any processes that need to be updated each iteration, including checking the time since last fire,
    and sending the control data */

#include "Game.h"

void game::OnLoop()
{
    time_now = SDL_GetTicks();      //takes the current time in ms

    if (Connection.sendarray[4] == 1 && !reloading)     //fire only if the two-second reload period has elapsed
    {
        cout << "FIRING" << endl;

        firing = true;
        reloading = true;

        time_fire = SDL_GetTicks();
    }

    if (time_now - time_fire > 500)     //the fire signal is held for 500 ms
    {
        firing = false;
    }

    if (time_now - time_fire > 2000)    //reloading takes two seconds
    {
        reloading = false;
    }

    if (firing)
    {
        Connection.sendarray[4] = 1;
    }
    if (!firing)
    {
        Connection.sendarray[4] = 0;
    }

    if ((time_now - time_sent) > 100)   //sends control data every 100 ms
    {
        if (Connection.SendData() == false)
        {
            //if the send fails, mark the connection lost and retry on a fresh socket
            Connection.connected = false;
            cout << '\a';
            closesocket(Connection.s);
            Connection.s = socket(AF_INET, SOCK_STREAM, 0);
            Connection.Connect();
        }
        time_sent = SDL_GetTicks();
    }

    if (Connection.ReceiveHit())
    {
        if ((time_now - time_hit) > 1000)
        {
            hit = true;
            time_hit = SDL_GetTicks();
            cout << '\a';
        }
    }

    if ((time_now - time_hit) > 1000)
    {
        hit = false;
    }
}
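The firing logic above is a small timing state machine: a shot holds the fire signal for 500 ms and locks out further shots for 2000 ms. The standalone sketch below (not part of the project source) expresses that state machine as a pure function over millisecond timestamps so the timing can be tested without SDL; `FireState` and `updateFireState` are hypothetical names introduced here.

```cpp
#include <cstdint>

// Mirrors the firing/reloading flags and last-shot timestamp used in OnLoop.
struct FireState {
    bool firing = false;
    bool reloading = false;
    uint32_t time_fire = 0;     // ms timestamp of the last shot
};

// trigger: whether a shot is requested this iteration
// now: current time in ms (the project uses SDL_GetTicks())
void updateFireState(FireState& st, bool trigger, uint32_t now)
{
    if (trigger && !st.reloading)   //accept a shot only when reloaded
    {
        st.firing = true;
        st.reloading = true;
        st.time_fire = now;
    }
    if (now - st.time_fire > 500)   //fire signal held for 500 ms
    {
        st.firing = false;
    }
    if (now - st.time_fire > 2000)  //reload complete after two seconds
    {
        st.reloading = false;
    }
}
```

Factoring the timing out this way would let the cooldown lengths be tuned and regression-tested independently of the game loop.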

/** Game Render
    This function controls what, if anything, is rendered in the game window
    It is used to inform the user of being hit, as well as displaying the reload time */

#include "Game.h"

void game::OnRender()
{
    if (reloading)      //Draws and redraws a growing rectangle between 0 and 2000 ms of firing.
    {
        SDL_Rect r;
        SDL_SetRenderDrawColor(renderer, 0, 0, 0, 255);
        SDL_RenderClear(renderer);
        r.x = 75;
        r.y = 10;
        r.w = (time_now - time_fire)/10.0;
        r.h = 25;

        SDL_SetRenderDrawColor(renderer, 255, 0, 0, 255);
        SDL_RenderDrawRect(renderer, &r);
    }

    if (!reloading)
    {
        SDL_SetRenderDrawColor(renderer, 0, 0, 0, 255);
        SDL_RenderClear(renderer);
    }

    if (time_now - time_fire > 2200)
    {
        SDL_SetRenderDrawColor(renderer, 0, 0, 0, 255);
        SDL_RenderClear(renderer);
    }

    if (Connection.connected)
    {
        SDL_RenderCopy(renderer, TextConnect, NULL, &CONRECT);
    }

    if (!Connection.connected)
    {
        SDL_RenderCopy(renderer, TextDisconnect, NULL, &DISRECT);
    }
    if (hit)
    {
        SDL_RenderCopy(renderer, TextHit, NULL, &HITRECT);
    }

    SDL_RenderPresent(renderer);
}

/** Main Routine
    Creates a game object called theGame, which also runs its constructor, and then starts the game. */

#include "Game.h"

int main(int argc, char* argv[])    //SDL requires main() to take these parameters
{
    game theGame;
    return theGame.OnExecute();
}


/** WifiConnection Constructor
    Initializes necessary data for the WifiConnection object */

#include "Game.h"
#include <string.h>

WifiConnection::WifiConnection()
{
    string ip_addr;
    cout << "Input IP address: ";
    cin >> ip_addr;

    PORT = 5001;

    //note: ip_addr is local to this constructor, so the pointer returned by
    //c_str() is only valid here; IP_ADDRESS must not be dereferenced after
    //the constructor returns
    IP_ADDRESS = ip_addr.c_str();

    ptr_sendarray = (char*)(&sendarray);
    ptr_recvarray = (char*)(&recvarray);
    arraysent[0] = 0;
    arraysent[1] = 0;
    arraysent[2] = 0;
    arraysent[3] = 0;
    arraysent[4] = 0;
    connected = false;

    server.sin_addr.s_addr = inet_addr(IP_ADDRESS);
    server.sin_family = AF_INET;
    server.sin_port = htons(PORT);
}

/** WifiConnection Connect
    Connects to the network IP and port defined in the constructor */

#include "Game.h"

bool WifiConnection::Connect()
{
    if (connect(s, (struct sockaddr *)&server, sizeof(server)) < 0)
    {
        logWSAError(cout, "connect");
        return false;
    }

    u_long iMode = 1;
    ioctlsocket(s, FIONBIO, &iMode);    //Sets socket to non-blocking mode

    //Deactivates Nagle algorithm; the option value must be a nonzero int
    //(previously this passed ptr_sendarray with length 1, which is not a valid option value)
    int flag = 1;
    setsockopt(s, IPPROTO_TCP, TCP_NODELAY, (char*)&flag, sizeof(flag));

    cout << "Connected\n";
    connected = true;

    return true;
}

/** WifiConnection Initialize
    Initializes all the WinSock functions and processes necessary for networking, and creates a socket */

#include "Game.h"

bool WifiConnection::Initialize()
{
    cout << "Initializing Winsock_Version2...";
    if (WSAStartup(MAKEWORD(2,2), &wsa) != 0)
    {
        logWSAError(cout, "WSAStartup");
        return false;
    }
    cout << "Initialized.\n";

    if ((s = socket(AF_INET, SOCK_STREAM, 0)) == INVALID_SOCKET)
    {
        logWSAError(cout, "socket");
        return false;
    }
    cout << "Socket created.\n";

    return true;
}

   


/** WifiConnection Log WSA Error
    Log a WSA error with some error message to the output stream of our choice
    @param os The output stream to write the message to
    @param msg The error message to write, format will be "<msg> error: <WSAGetLastError()>" */

#include "Game.h"

void WifiConnection::logWSAError(ostream &os, const string &msg)
{
    os << msg << " error: " << WSAGetLastError() << endl;
}

/** WifiConnection Quit
    Ends the connection, closes the socket, and quits all the Winsock processes */

#include "Game.h"

void WifiConnection::Quit()
{
    ptr_sendarray = NULL;
    closesocket(s);
    WSACleanup();
}

/** WifiConnection Send Data
    Sends the control data over the created network connection */

#include "Game.h"

bool WifiConnection::SendData()
{
    if (send(s, ptr_sendarray, 5, 0) < 0)
    {
        logWSAError(cout, "send");
        return false;
    }
    arraysent[0] = sendarray[0];
    arraysent[1] = sendarray[1];
    arraysent[2] = sendarray[2];
    arraysent[3] = sendarray[3];
    arraysent[4] = sendarray[4];

    return true;
}