A ROBOTIC MOBILITY AID FOR FRAIL VISUALLY IMPAIRED PEOPLE




Shane MacNamara, Gerard Lacey
Department of Computer Science, Trinity College Dublin, Ireland

ABSTRACT

This paper discusses the design of a smart mobility aid for frail, visually impaired people. The device is based on the concept of a walker or rollator (a walking frame with wheels). The device, called the PAMAID (Personal Adaptive Mobility Aid), has two modes of operation: manual and assistive. In manual mode the device behaves very much like a normal walker. In assistive mode, the PAMAID assumes control of the steering and navigates safely inside buildings, giving the user feedback on the immediate environment via a speech interface. The PAMAID was evaluated in a nursing home in Ireland and the results of these tests are briefly presented.

INTRODUCTION

Comprehensive statistics on dual disabilities are rare, but some studies provide compelling evidence that there is a substantial group of elderly people with both a visual impairment and mobility difficulties. Ficke [1] estimated that of the 1.5 million people in nursing homes in the United States, around 23% have some form of visual impairment and 71% require some form of mobility assistance. Both visual impairments and mobility impairments increase substantially with age, and Rubin and Salive [2] have shown that a strong correlation exists between sensory impairment and physical disability. People in this target group have difficulty using conventional navigational aids in conjunction with standard mobility aids. Their lifestyle can thus be severely curtailed by a heavy dependence on carers; increased mobility would lead to more independence and a more active, healthier lifestyle. A number of electronic travel aids for the visually impaired already exist; Farmer [3] provides a comprehensive overview.
A small number of devices have reached the stage of extensive user trials, notably the Laser Cane [4], the Pathsounder [5] and the Sonicguide [6]. None of these devices, however, provides any physical support for the user. A full review of assistive technology for the blind is provided in [7].

DESIGN CRITERIA

A number of considerations had to be taken into account when designing the device. The device has to be constructed such that the cognitive load on the user is kept to a minimum; the user interface therefore has to be very simple and intuitive. The device has to be safe and reliable to use, and the user must have immediate control over the speed. For this reason, it was decided that the device should not have motorised locomotion: only the steering is motor controlled. This also reduces the power requirements of the mobility aid substantially. The one disadvantage of giving the user control over the speed is that, from a control perspective, the system becomes under-determined. One of the two control parameters is lost and the system is more difficult to control. As a consequence, the control loops must be tight so that the system can react to unexpected changes, such as the user accelerating when close to an obstacle. To make the device as inexpensive as possible, most of the components are available off the shelf. Ultrasonic range sensors were chosen over a laser scanning rangefinder to further reduce the potential cost of the device.

MECHANICAL DESIGN

The mechanical design of the device is very similar to that of a conventional walker, with a few important differences. The two castor wheels at the front of the walker have been replaced by two wheels controlled by motors. The motors are solely for adjusting the steering angle of the device; they do not in any way propel it. Absolute encoders return the angular position of each of the front wheels. The device thus has kinematic constraints similar to those of an automobile.

Fig 1. Photograph of mobility device

Handlebars are used for steering the device in manual mode and for indicating an approximate desired direction in assistive mode. They can rotate approximately +/-15 degrees and are spring loaded to return to the central position. In manual mode, the handlebar rotation is converted to a steering angle and the device can be used in the same way as a conventional walker.
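The paper does not specify how the +/-15 degree handlebar rotation is converted to a steering command. Purely as an illustration, the sketch below assumes a linear mapping with a small deadband around the spring-centred position; the deadband width and the 45-degree steering limit are invented parameters, not figures from the paper.

```python
def handlebar_to_steering(handlebar_deg, deadband_deg=2.0,
                          max_handlebar_deg=15.0, max_steer_deg=45.0):
    """Map spring-centred handlebar rotation to a commanded steering angle.

    A small deadband around centre ignores spring jitter; beyond it the
    command scales linearly up to the steering limit.
    """
    if abs(handlebar_deg) < deadband_deg:
        return 0.0
    # Rescale the active range [deadband, max_handlebar] onto [0, max_steer].
    sign = 1.0 if handlebar_deg > 0 else -1.0
    magnitude = min(abs(handlebar_deg), max_handlebar_deg) - deadband_deg
    span = max_handlebar_deg - deadband_deg
    return sign * (magnitude / span) * max_steer_deg
```

With these assumed parameters, full handlebar deflection commands the full steering range, while rotations inside the deadband leave the wheels centred.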
The two wheels are controlled independently because of the highly non-linear relationship between them at larger steering angles. It is desirable to achieve these large steering angles for greater manoeuvrability; rotation on the spot can even be achieved, as shown in fig 2. To slow the vehicle down, the wheels are toed in by a few degrees from their current alignment. The exact misalignment angle used depends on the severity of the braking required.

Fig 2. The steered wheels can be positioned so that rotation on the spot is possible.

HARDWARE

Control of the device is distributed through a number of separate modules. An embedded PC (Ampro LittleBoard P5i, 233 MHz) is used for high-level reasoning. The motion control module is custom built around a single-board micro-controller (Motorola MC68332); communication between the PC and the motion controller is via a serial line. This motion control board also deals with general I/O. Optical absolute encoders (US Digital) are used for monitoring the steering angles of the two front wheels. A pair of incremental encoders, mounted on the rear wheels, are used for odometry. All the encoder information reaches the motion controller via a single serial bus (SEI Bus, US Digital). The handlebar steering angle is monitored by a linear Hall-effect sensor positioned between two magnets.

Fig 3. Sonar configuration in plan and elevation
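The non-linear relationship between the two independently steered wheels follows from car-like (Ackermann) kinematics: both wheel axes must point at a common turning centre on the extended rear axle, so the inner wheel must turn more sharply than the outer one, and the disparity grows with steering angle. The sketch below illustrates this geometry; the wheelbase and track values are invented for illustration, not the PAMAID's actual dimensions.

```python
import math

def ackermann_wheel_angles(steer_deg, wheelbase=0.6, track=0.55):
    """Return (left, right) front-wheel angles in degrees satisfying the
    Ackermann condition for a nominal central steering angle.

    Both wheel axes intersect the extended rear axle at one turning
    centre; at large steering angles the inner wheel turns noticeably
    more than the outer one, hence independent control of each wheel.
    """
    if abs(steer_deg) < 1e-9:
        return 0.0, 0.0
    steer = math.radians(abs(steer_deg))
    radius = wheelbase / math.tan(steer)   # turn radius at rear-axle centre
    inner = math.degrees(math.atan(wheelbase / (radius - track / 2)))
    outer = math.degrees(math.atan(wheelbase / (radius + track / 2)))
    # Turning left (positive): left wheel is the inner wheel, and vice versa.
    if steer_deg > 0:
        return inner, outer
    return -outer, -inner
```

For a nominal 30-degree command with these dimensions, the inner wheel angle exceeds the outer by over ten degrees, whereas near straight-ahead the two are almost equal: the relationship is only approximately linear at small angles.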

Ultrasonic sensors (Helpmate Robotics Inc.) are used for object detection and ranging. Fifteen sonar transducers are used in total, which provides a degree of sensor redundancy appropriate for the current application. The arrangement of the sonars around the mobility aid is shown in fig 3 and is very similar to that proposed by Nourbakhsh in [8]. There are seven groups of sonars in all. Four sonars point sideways (one group, composed of two sonars, on each side) and are used to determine the presence of any adjacent walls. Two groups point approximately straight ahead: one, at a height of approximately 40 cm, contains three sonars; the second, at a height of 25 cm, contains two sonars and is used for detecting obstacles closer to the ground. Two more groups are set at angles of approximately 45 degrees to either side. The final group comprises two sonars at a height of 30 cm from the ground, pointing upwards at an angle of approximately 60 degrees; this group is used predominantly for detecting head-height obstacles, tables, etc. The PC is equipped with a sound card so that audio feedback can be provided where appropriate. The sound samples are pre-recorded and contain messages such as "Object left", "Object ahead" and "Head-height obstacle".

SOFTWARE

Due to the high demands on reliability, the mobility aid uses the Linux operating system. Its extensive configurability also makes it possible to tailor the system to the requirements of the application. The Task Control Architecture [9] was used as a framework for the software design. TCA is essentially an operating system for task-level robot control. Control can be transparently distributed across multiple machines, as TCA handles all the interprocess communication. A central server is used to pass messages between individual software modules; other services provided include scheduling, resource management and error handling. Communication between modules is via UNIX sockets. Currently, there are five modules running on the device: motion control, sensing, feature extraction, audio output and high-level control (see fig. 4). All processes run on the same processor; if required, however, processes can be moved transparently to other processors connected together via a small hub.

The feature extraction module uses the sonar returns to determine simple features in the indoor environment such as corridors, junctions and dead ends. The four sideways-pointing sonars (see fig 3) are predominantly used for this feature extraction. Evidence for the existence of walls on either side of the device is accumulated in a histogram representation of feature evidence: if a particular feature is detected from one set of sonar returns, its evidence is incremented by one, otherwise its evidence is decremented. The feature with the highest histogram score is then the most probable feature in the local environment. For instance, the criterion for a positive corridor identification is that evidence of a wall on either side of the device is strong and that the measured angles to the left and right walls are parallel within a certain tolerance. Once a feature has been positively identified, the robot switches into the mode associated with that feature. For example, if the device detects that it is in a corridor, the follow_corridor mode will steer the device to the centre of the corridor. Similarly, if a left junction has been detected, the device will query the user on how to proceed. A rule-based obstacle avoidance routine is located within the high-level control module.
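The evidence-histogram scheme above can be sketched in a few lines. The paper specifies only the increment/decrement update and the arg-max selection; the declaration threshold, the clamping at zero, and the per-feature detection tests are assumptions added to make the sketch self-contained.

```python
# Illustrative sketch of the feature-evidence histogram described above.
EVIDENCE_THRESHOLD = 5   # score needed before a feature is declared (assumed)

class FeatureHistogram:
    def __init__(self, features):
        self.scores = {name: 0 for name in features}

    def update(self, detections):
        """detections maps feature name -> bool for one set of sonar returns.
        Detected features gain one unit of evidence; all others lose one
        (clamped at zero, an added assumption)."""
        for name in self.scores:
            if detections.get(name, False):
                self.scores[name] += 1
            else:
                self.scores[name] = max(0, self.scores[name] - 1)

    def best_feature(self):
        """Return the most probable local feature, or None below threshold."""
        name = max(self.scores, key=self.scores.get)
        return name if self.scores[name] >= EVIDENCE_THRESHOLD else None

hist = FeatureHistogram(["corridor", "left_junction", "dead_end"])
for _ in range(6):                  # six consecutive corridor detections
    hist.update({"corridor": True})
```

After several consistent detections the corridor score crosses the threshold and the high-level control would switch to the corresponding mode (follow_corridor in the paper's example); a run of contradictory returns decays the evidence again, which gives the scheme its robustness to single noisy sonar readings.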
The rule-based system is more suitable than a potential field algorithm for the sonar layout adopted.

RESULTS

The device was evaluated on-site with seven persons (all male) registered as visually impaired. The average age of the test participants was 82, and they suffered from a variety of other physical problems such as arthritis, balance problems, frailty, nervousness and general ill-health. After testing the device, the users were questioned on its performance. The results, compiled using a 5-point Likert scale, are summarised in the table below.

    User's sense of safety while using device    4.4 / 5
    Ease of use                                  4.2 / 5
    Usefulness                                   3.8 / 5

Table 1. User feedback on device performance

FUTURE WORK

Work is continuing on improving the autonomy of the device indoors. An inexpensive vision system is being developed for detecting features such as doors. Sensors that can reliably detect down-drops are also being developed.

ACKNOWLEDGEMENTS

The authors would like to acknowledge the assistance of Heather Hunter of the National Council for the Blind in Ireland in carrying out the user trial. We would also like to thank Magnus Frost and Jan Rundenschold of Euroflex, Sweden, for constructing the chassis. This research was funded in part by the European Union Telematics Application Programme 3210.

REFERENCES

[1] Ficke R.C. Digest of Data on Persons with Disabilities. National Institute on Disability and Rehabilitation Research, Washington DC 20202, USA, 1991.
[2] Rubin G.S., Salive M.E. The Women's Health and Aging Study: Health and Social Characteristics of Older Women with Disability, chapter: Vision and Hearing. Bethesda, MD: National Institute on Aging, 1995.
[3] Farmer L.W. Foundations of Orientation and Mobility, chapter: Mobility Devices, pages 537-401. American Foundation for the Blind, 15 West 16th Street, New York, N.Y. 10011, 1987.
[4] Benjamin J.M. The new C-5 laser cane for the blind. In Proceedings of the 1973 Carnahan Conference on Electronic Prosthetics, pages 77-82. University of Kentucky Bulletin 104, November 1973.
[5] Russell L. In Clark L.L., editor, Proceedings of the Rotterdam Mobility Conference, pages 73-78. American Foundation for the Blind, 15 West 16th Street, New York, N.Y. 10011, May 1965.
[6] Kay L. A sonar aid to enhance the spatial perception of the blind: engineering design and evaluation. Radio and Electronic Engineer, 44(11):605-627, November 1974.
[7] Lacey G. Adaptive Control of a Robot Mobility Aid for the Frail Visually Impaired. PhD thesis, Trinity College Dublin. To be published, 1999.
[8] Nourbakhsh I. The sonars of Dervish. The Robotics Practitioner, 1(4):15-19, 1995.

[9] Simmons R., Lin L., Fedor C. Autonomous task control for mobile robots. In Proceedings of the IEEE Symposium on Intelligent Control, Philadelphia, PA, September 1990.

ADDRESS

Shane MacNamara
Department of Computer Science
Trinity College Dublin
Ireland
Tel: +353-1-6081800
Fax: +353-1-6772204
Email: shane.macnamara@cs.tcd.ie