Design Report by IniTech for the 14th Annual Intelligent Ground Vehicle Competition

University of Maryland: Baltimore County

Erik Broman (Team Lead and Requirement Lead)
Albert Hsu (Design Lead)
Sean Wilson (Coding Lead)
Hussenia Ozigi Otaru (Testing Lead)
UMBC IEEE members

May 25, 2006

SQUINTY DESIGN REPORT

Table of Contents

1. INTRODUCTION
   1.1 PURPOSE OF THIS REPORT
   1.2 TEAM ORGANIZATION
2. DESIGN DEVELOPMENT
   2.1 INCREMENTAL DEVELOPMENT
3. DESIGN PROCESSES
   3.1 VEHICLE MECHANICAL SPECIFICATION AND DESIGN
       3.1.1 Software Program Design
       3.1.2 Primary Vision Sensory Component
       3.2.2 Secondary Hardware Components
       3.1.3 Vehicle Safety Component
   3.2 AUTONOMOUS CHALLENGE NAVIGATION: PATH PLANNING
   3.3 NAVIGATION CHALLENGE NAVIGATION: GPS ALGORITHM
4. EFFICIENCIES OF RESOURCES
   4.1 BATTERY POWER
   4.2 MATERIALS DATA
5. LAST REMARKS
APPENDIX A: FACULTY ADVISOR STATEMENT SIGN-OFF

1. Introduction

1.1 Purpose of This Report

This report describes the development of the Squinty system design and explains its components, highlighting the unique and innovative aspects of the design as well as the intelligence aspects of the system for the autonomous challenge. Since the team also plans to complete the navigation challenge, the system used for the navigation event is detailed as well.

1.2 Team Organization

The IniTech team is composed of undergraduate students from multiple disciplines, including Computer Engineering, Computer Science, and Mechanical Engineering, all enrolled at the University of Maryland: Baltimore County (UMBC) and all members of the UMBC IEEE student branch. Two members, Albert Hsu and Erik Broman, are returning from the previous Intelligent Ground Vehicle Competition entry. Phase leaders were elected to keep the project running smoothly: a team lead, requirement lead, design lead, coding lead, and testing lead.

Name                   Class and Department                    Phase Leadership                 Hours
Erik Broman            Undergraduate, Computer Engineering     Team Lead and Requirement Lead     120
Sean Wilson            Undergraduate, Computer Engineering     Coding Lead                        120
Albert Hsu             Undergraduate, Computer Engineering     Design Lead                         94
Hussenia Ozigi Otaru   Undergraduate, Computer Engineering     Testing Lead                        81
Steve Sapp             Undergraduate, Computer Engineering                                         20
Andrew Wilson          Graduated, Computer Engineering                                             30
Emmanuel Oduroja       Undergraduate, Computer Science                                             30
Andrew Stowell         Undergraduate, Mechanical Engineering                                       70
Total Hours                                                                                       565

2. Design Development

The design of Squinty took the approach of refitting, and adding features to, the previous design of the Should Be Trivial robot, which was entered in the 13th Intelligent Ground Vehicle Competition (IGVC). From September 2005 to December 2005 the IniTech team discussed improvements and additional requirements for the 14th IGVC vehicle, so that from January until the day of the competition the implementation and testing work could focus on the Squinty system.

2.1 Incremental Development

The previous entry, Should Be Trivial, was developed for the 13th IGVC with the navigation challenge in mind, but navigation was never implemented. After that competition the autonomous challenge code was improved, and the team is currently working on the navigation challenge. The development of the project therefore closely followed the incremental development approach shown in Figure 1.

Figure 1: Incremental development

An iterative process was used between the design and implementation phases because of the configuration work needed to integrate new components.

3. Design Processes

The Squinty vehicle will be entered in both the autonomous and navigation challenges of the IGVC. The current implementation of the vision detection algorithm performs significantly better than its predecessor. The highlight of the autonomous challenge entry is the image-processing path and obstacle detection algorithm; the highlight of the navigation challenge entry is the GPS navigation algorithm.

3.1 Vehicle Mechanical Specification and Design

Squinty is a vehicle with two front-wheel drive motors using differential steering. The vehicle's movement is driven by an assortment of inputs: camera images, sensors, motors, and safety components designed for the autonomous and navigation challenges of the IGVC.

3.1.1 Software Program Design

All control of the vehicle is performed by a Dell Inspiron 1150 laptop running the Ubuntu Linux 6.06 beta 2 operating system, a change from the Fedora Core 2 operating system used last year. Ubuntu is a free, stable, and well-designed operating system that is not too resource intensive. The program that runs Squinty was developed in C/C++ and is split into four essential processes that run in parallel: vision & sonar, navigation, execution, and emergency conditions. The detailed process is shown in Figure 3.

Figure 3: Detailed Process of Autonomous Program
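The report does not show how the four-process split is implemented; the following is a minimal sketch of one way to launch parallel tasks as POSIX processes. The task names and empty bodies are illustrative stand-ins, not code from the Squinty program.

```cpp
// Hypothetical sketch of a four-process split (vision & sonar,
// navigation, execution, emergency conditions). Task bodies are
// placeholders; the real program's internals are not given in this
// report.
#include <sys/types.h>
#include <sys/wait.h>
#include <unistd.h>

typedef void (*task_fn)(void);

static void vision_sonar_task(void) { /* capture frames, read sonar */ }
static void navigation_task(void)   { /* plan path, write motor command */ }
static void execution_task(void)    { /* read command, drive PWM */ }
static void emergency_task(void)    { /* watch E-stop conditions */ }

// Fork one child per task, wait for all of them, and return how many
// exited normally.
int run_parallel(task_fn tasks[], int n) {
    int ok = 0;
    for (int i = 0; i < n; ++i) {
        pid_t pid = fork();
        if (pid == 0) {          // child: run its task, then exit
            tasks[i]();
            _exit(0);
        }
    }
    for (int i = 0; i < n; ++i) {
        int status = 0;
        if (wait(&status) > 0 && WIFEXITED(status)) ++ok;
    }
    return ok;
}
```

In the real vehicle each process would loop until an emergency condition stops it; here the tasks return immediately so the launcher's behavior is easy to see.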

The navigation process differs between the autonomous and navigation challenges, since the inputs that determine path decisions change between the cameras and the GPS device; further details are given later in this report. The navigation process writes the drive command to shared memory, where the execution code reads it and sends a pulse-width modulation (PWM) signal through the Brainstem microcontroller to a RoboteQ motor controller, which powers the two motors. The PWM signal commands the motors to perform the basic driving instructions: turn left, turn right, and drive forward.

3.1.2 Primary Vision Sensory Component

Vision is the most important part of detecting obstacles on the course: finding distances to objects, their colors, and their shapes. How are those actions performed? One technique is to use a camera to capture images and process them to the specifications of the challenge. A Unibrain Fire-I Board FireWire camera is used for the vision portion of the system. We are using only one camera, hence the name Squinty. First, the camera is adjusted to the ambient light at the beginning of the program, looking for the obstacle colors orange, yellow, and white and specifying their ranges in terms of red, green, and blue values. The code uses the open source OpenCV image library for its vision functions. The program captures a low-resolution image (300x225 pixels) to decrease the number of pixels to process, and filters out non-obstacle colors before detection. When obstacle colors are found, the vision program groups them to find the boundaries of an object and passes these boundaries to the navigation process. This cycle repeats at an average of 10 frames per second, an improvement of 1 frame per second over last year's design.
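The color-range filtering step described above can be sketched as follows. Note that the RGB thresholds used here are made-up placeholders rather than Squinty's light-calibrated values, and the actual program uses OpenCV functions rather than this hand-rolled loop.

```cpp
// Hedged sketch of "filter out non-obstacle colors": keep only pixels
// whose RGB values fall inside one of the configured obstacle-color
// ranges (orange, yellow, white), blacking out everything else.
#include <cstddef>
#include <cstdint>
#include <vector>

struct Rgb { uint8_t r, g, b; };

// Inclusive RGB range for one obstacle color. The real ranges are
// calibrated to the ambient light at program start; any numbers used
// with this struct below are placeholders.
struct ColorRange {
    uint8_t r_lo, r_hi, g_lo, g_hi, b_lo, b_hi;
    bool contains(Rgb p) const {
        return p.r >= r_lo && p.r <= r_hi &&
               p.g >= g_lo && p.g <= g_hi &&
               p.b >= b_lo && p.b <= b_hi;
    }
};

// Return a copy of the image where non-obstacle pixels are black, so
// a later pass can group the surviving pixels into object boundaries.
std::vector<Rgb> filter_obstacle_colors(const std::vector<Rgb>& img,
                                        const std::vector<ColorRange>& ranges) {
    std::vector<Rgb> out(img.size(), Rgb{0, 0, 0});
    for (std::size_t i = 0; i < img.size(); ++i)
        for (const ColorRange& c : ranges)
            if (c.contains(img[i])) { out[i] = img[i]; break; }
    return out;
}
```

At 300x225 pixels a frame is only 67,500 pixels, which is why even a simple per-pixel pass like this can keep up with the 10 frames-per-second cycle.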

3.2.2 Secondary Hardware Components

3.2.2a Sonar Sensor Hardware

Squinty's camera can only see what is directly in front of the robot. If the robot needs to make a turn but there is an obstacle to its side, the camera alone cannot tell whether the turn can be made safely. Therefore, sonar sensors are placed on the left and right sides of the robot to ensure that it will not be impeded by obstacles while making a turn.

3.2.2b Navigation Hardware

Squinty reads GPS input from a Laipac TF30 global positioning system (GPS) module. The module provides the latitude and longitude Squinty needs to plan a path to a waypoint. The other navigation component is a Devantech CMPS03 IIC digital compass. The compass is used with the GPS to provide Squinty's current heading, which supplies the data needed to plot correct paths to waypoints.

3.2.2c Motors & Control Hardware

Two NPC-02446 motors are used, replacing the powerful E-tek motors from last year's competition. The new lightweight motors give better mobility and are more cost efficient. The motors connect to the RoboteQ motor controller, which receives commands from remote controls and microcomputers to drive the two motors.
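The compass heading described above is only useful once it is compared with the bearing to the next waypoint. A small sketch of that comparison follows; the function name and the choice of a [-180, 180) degree convention are our assumptions, not taken from the Squinty code.

```cpp
// Signed error between the compass heading and the desired bearing to
// a waypoint, both given in degrees [0, 360). The result is normalized
// to [-180, 180): negative means turn left, positive means turn right.
#include <cmath>

double heading_error(double heading_deg, double bearing_deg) {
    double err = std::fmod(bearing_deg - heading_deg, 360.0);
    if (err < -180.0) err += 360.0;   // wrap large left turns
    if (err >= 180.0) err -= 360.0;   // wrap large right turns
    return err;
}
```

The wrap-around handling matters near north: steering from a heading of 350 degrees toward a bearing of 10 degrees should be a small 20-degree right turn, not a 340-degree left turn.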

3.2.2d Micro-controllers

All secondary-device processing goes through the Acroname Brainstem microcontroller, which provides a variety of I/O functions for Squinty's systems. Combining programmability with a built-in command set, this module is both a stand-alone controller and a real-world interfacing tool. The Brainstem program is written in the TEA language, a subset of the C language, and it controls the compass and the motor controller.

3.1.3 Vehicle Safety Component

An emergency stop (E-stop) function is a required feature for an IGVC robot. Our E-stop system consists of a remote-control keyfob system and a large button on the back of the Squinty robot. Either method shuts off the robot's motors.

3.2 Autonomous Challenge Navigation: Path Planning

The obstacle boundaries passed from the vision process help the navigation process determine the collision path, that is, where not to go. This is done using five feeler coordinates. If an obstacle color shares the same x coordinate as one of these five feelers, a collision is registered. When a collision path is registered in the left half of the image, Squinty is commanded to go right, and vice versa. Squinty proceeds straight if there is no obstacle, expecting that an obstacle will eventually appear on either the left or the right. The process repeats every time an image is filtered.

3.3 Navigation Challenge Navigation: GPS Algorithm

At the start of the navigation challenge program, the system reads the waypoints from a file, queries the GPS (the Laipac TF30) for its current heading,

bearing, and coordinates, and places the waypoints on a virtual x-y coordinate map. The map is split into four even grids. The robot finds the most clustered waypoints on the map, and a priority list of the grids is generated based on the most clustered sectors. This accomplishes the goal of reaching the most waypoints in the shortest amount of time. Squinty heads to the first waypoint on the list using its current bearing and location, with the help of the vision portion to detect obstacles. The compass and GPS constantly re-map the robot's position, updating its location on the map.

4. Efficiencies of Resources

Resources in the real world are limited, and engineers are required to use them efficiently. The resources here were further limited by the budget, by how much each material costs, and by the amount of space it takes up on the vehicle.

4.1 Battery Power

Squinty has multiple power sources: one for the motors, one for the auxiliary components, and one for the laptop. To keep the vehicle's sensor electronics isolated from voltage transients generated by the motors, a separate battery was selected for the drive system; in an effort to save space, only one drive battery is used (a 12 V 33 Ah deep-cycle sealed lead-acid battery). The laptop draws power from its own battery, which lasts about two and a half hours. Lastly, the auxiliary components draw power from a set of two 12 V battery packs mounted in the vehicle.

4.2 Materials Data

The Squinty system is cost effective, reusing some materials from the previous intelligent ground vehicle competition entry. Table 1 below lists the total cost of the materials required to build Squinty, including materials carried over from last year; those reused items are marked in parentheses and their cost is deducted from the cost to the team.

Part                                               Unit Cost   Qty   Total Cost   Cost to Team
Locomotion:
(Briggs and Stratton E-Tek Motor)                  ($400.00)     2    ($800.00)      ($400.00)
NPC-02446 Motors                                     $72.00      2     $144.00        $144.00
NPC-Gearset 4:1                                      $95.00      2     $190.00        $190.00
RoboteQ AX2550 Motor Controller                     $495.00      1     $495.00        $495.00
E-Stop Circuit & High Power Relays                  $300.00      1     $300.00        $300.00
Dual Pro LS 2200 Deep Cycle Battery                 $133.00      1     $133.00        $133.00
Cherry GS100902 Geartooth Hall Effect Sensor         $40.00      2      $80.00          $0.00
60 Tooth Sprocket, #35                               $29.68      2      $59.36         $59.36
1" Axle Pillow Block Bearing                         $12.95      4      $51.80         $51.80
1"x1" Low-Carbon Steel Box Tubing (per foot)          $2.00     25      $50.00          $0.00
(4 x 10" Knobby Hand Truck Tires)                    $36.00      1     ($36.00)       ($36.00)
Golf Cart Tire 16.2                                  $48.00      4     $191.96        $191.96
(Drive Wheel Hub)                                    $14.99      2     ($29.98)       ($29.98)
Wheel Hub                                            $20.00      4      $79.96         $79.96
15 Tooth Sprocket #35                                 $7.43      2      $14.86         $14.86
#35 Riveted Roller Chain, 3/8" Pitch                  $2.21      6      $13.26         $13.26
1/4" Plywood (24" x 24" sheet)                        $3.49      2       $6.98          $6.98
Sensing:
Unibrain Fire-I Board Color Camera w/ 2.1mm Lens    $139.00      2     $278.00        $278.00
GPS Module                                          $200.00      1     $200.00        $200.00
Brainstem GP Microcontroller                         $81.00      2     $162.00        $162.00
Serial to USB Adapter                                $40.00      1      $40.00         $40.00
Devantech IIC Compass Module                         $51.00      1      $51.00         $51.00
Misc Wire and Wiring Hardware                        $50.00      1      $50.00         $50.00
Component Boards and Enclosures                      $50.00      1      $50.00         $50.00
Control:
Dell Inspiron 1150 Laptop                           $989.00      1     $989.00        $989.00
Total:                                                               $2,810.24      $3,034.00
Old Total:                                                           $3,890.00      $3,360.00

Items in parentheses are old items carried over from last year; all other items are new.

Table 1: Cost of materials for Squinty

As the table shows, the total cost of the vehicle decreased, mainly because less powerful motors were used. This is not the total cost spent developing the current Squinty vehicle, since some materials were reused or replaced with a similar type.

5. Last Remarks

Squinty was designed to be inexpensive yet still capable of competing in the autonomous and navigation challenges of the 14th Annual Intelligent Ground Vehicle Competition. While the IniTech team may be small compared to others, we still managed to acquire the skills, knowledge, tools, and funding required to complete the task at hand. We would like to extend our gratitude to the faculty of UMBC's CSEE department, as well as the IEEE Baltimore Branch, without whose help this project would never have gotten off the ground.

Appendix A: Faculty Advisor Statement Sign-off

By signing below, I am certifying that the engineering design in this vehicle by the University of Maryland: Baltimore County student team IniTech has been significant and equivalent to what might be awarded credit in a senior design course.

Comments:

Professor David Bourner                              Date
Computer Science and Electrical Engineering Department
University of Maryland: Baltimore County