Robotic Control Using Six Degrees of Freedom Tracked Stylus and Virtual-Holographic 3D Display


Mark Jouppi¹
University of California, Berkeley

Mentors: Heather Justice, Victor Luo
Jet Propulsion Laboratory, California Institute of Technology

Controlling complex robots with many degrees of freedom (DoF), such as the All-Terrain Hex-Limbed Extra-Terrestrial Explorer (ATHLETE), can be difficult and unintuitive for operators using conventional input modalities. An application was built that allows operators to control the position and orientation of a simulated ATHLETE's end effector using zspace, a system incorporating a virtual-holographic 3D display and a tracked stylus providing 6-DoF input. Operators can command ATHLETE's end effector to match the position and orientation of the stylus in a natural manner. This application could be used by operators to plan, simulate, and control ATHLETE movements as it drills into the surface of an asteroid on future missions.

I. Introduction and Background

Simple robots with few degrees of freedom (DoF) can often be teleoperated via conventional input modalities, such as a joystick or a mouse and keyboard. However, this is often impractical for complex robots with many DoF, such as the All-Terrain Hex-Limbed Extra-Terrestrial Explorer (ATHLETE) (Fig. 1). ATHLETE has six limbs, and each limb has six DoF². The limbs can be used as legs, either rolling across a surface on wheels or walking with the wheels locked. Various end effectors, including augers and claws, can be attached at the wheel, and the limbs can then be used as arms. The robot was originally designed for moving and manipulating cargo on the Moon [1]. However, it is now being repurposed for a possible future mission to an asteroid, in which it would conduct tasks such as drilling into the surface and taking samples.
This paper describes a new interface, using a 6-DoF tracked stylus and a virtual-holographic 3D display, that operators could use to naturally and easily control the position and orientation of ATHLETE's end effector, such as a drill in asteroid surface operations scenarios.

Conventional input modalities are often insufficient for controlling complex, high-DoF robots such as ATHLETE. It would be time-consuming and inefficient for an operator to manually set joint angles for each limb with a keyboard and mouse. Furthermore, mappings between a joystick and a 6-DoF limb may be unintuitive. In previous years, work has been conducted at JPL to develop more natural ways to control robots with many DoF, such as ATHLETE. For example, the Tele-Robotic ATHLETE Controller for Kinematics (TRACK) is a physical one-eighth-scale model of an ATHLETE limb with sensors that determine each of its joint angles [2] (Fig. 2). The user can manipulate the model into a desired configuration, and the joint angles from the model can be sent to ATHLETE ground software to command limb movement. While this device may be useful for teleoperation, it is still fairly manual in that the user must physically move the joints on the model to the desired angles. Also, since it is a physical model, it can be hard to visualize in the environment in which ATHLETE is operating. For example, if ATHLETE were about to drill into a rock, the operator would have to imagine that rock while moving the TRACK device accordingly.

Currently, when ATHLETE engineers want to use an end effector such as a drill, a person stands next to the robot as it performs the action, calling out movements and alerting the operators if unsafe conditions are created. Operators can use pre-canned sequences to achieve certain leg configurations and movements, but those may not be completely precise due to sag in the leg, so operators may need to tweak the leg position in these situations. On an actual future mission to an asteroid, such human-involved and manual operation would be impractical. In an effort to make this unnecessary and improve upon previous work, a project was undertaken to develop a new method of controlling ATHLETE's end effector.

Figure 1: ATHLETE in the Microgravity Testbed, where it is being repurposed for a possible future asteroid exploration mission. (Tri-ATHLETE version with 6-DoF limb configuration.)

Figure 2: Tele-Robotic ATHLETE Controller for Kinematics (TRACK).

¹ mark.jouppi@berkeley.edu
² Multiple versions of ATHLETE exist. This paper focuses on the most recent version, which is called Tri-ATHLETE, in the 6-DoF limb configuration.
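The TRACK device forwards measured joint angles directly to the ground software. To preview the limb pose those angles imply, the software needs forward kinematics: chaining one homogeneous transform per joint. The sketch below illustrates this with made-up Denavit-Hartenberg parameters for a planar three-link chain, not ATHLETE's actual six-joint geometry.

```python
import numpy as np

def dh_transform(a, alpha, d, theta):
    """Standard Denavit-Hartenberg transform for one revolute joint."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(dh_rows, joint_angles):
    """Chain per-joint transforms; returns the 4x4 base-to-tool pose."""
    T = np.eye(4)
    for (a, alpha, d), theta in zip(dh_rows, joint_angles):
        T = T @ dh_transform(a, alpha, d, theta)
    return T

# Hypothetical planar 3-link limb (link lengths in metres).
DH = [(1.0, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0, 0.0, 0.0)]

pose = forward_kinematics(DH, [0.0, 0.0, 0.0])
print(pose[:3, 3])  # tool position with all joints at zero: [3. 0. 0.]
```

With all joints at zero the chain lies along the base x-axis, so the tool sits at (3, 0, 0); rotating the first joint 90 degrees swings the whole chain onto the y-axis.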

II. Developing a New Control Method

A new method of controlling the position and orientation of ATHLETE's end effector was developed using a device called zspace, which provides a virtual-holographic experience (Fig. 3). A virtual-holographic workstation is defined as a system including head tracking, stereoscopic rendering, a view that updates in real time (at or beyond the human detection threshold), and a six-degree-of-freedom input device [3]. The zspace device consists of three components: the main unit, a stylus, and passive polarized glasses. The main unit includes the 3D display as well as two infrared projectors and two infrared tracking cameras. The stylus and glasses are equipped with infrared-reflective markers that enable the zspace software to use the camera data to compute the position and orientation of the stylus and glasses with a high degree of accuracy. Knowing the position and orientation of the glasses enables the system to use head tracking; in other words, the scene is rendered differently as the user moves his head around the unit and looks from different angles, giving a realistic experience. Knowing the position and orientation of the stylus enables it to be used as a six-degree-of-freedom input device.

In creating an application for controlling ATHLETE's end effectors with zspace, it was determined that it would be most natural to have ATHLETE's end effector match the position and orientation of the stylus. That is, as the user moves or rotates the stylus in real life, a virtual stylus in the scene follows the same movement and rotation, and a simulated ATHLETE in the scene moves its end effector to track the virtual stylus (Fig. 4, 6). A central component of creating this application was the Unity game engine.

Figure 3: The zspace device, including the 3D display with built-in infrared projectors and infrared tracking cameras, as well as the stylus and passive polarized glasses.
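Having the end effector match the stylus amounts to mapping the stylus pose from the tracker's coordinate frame into the virtual scene each frame: a fixed rigid transform plus a scale, then the result is handed to the inverse kinematics as the target. The sketch below shows this mapping with hypothetical calibration values; the names and constants are illustrative, not the zspace plugin's actual API.

```python
import numpy as np

# Hypothetical calibration taking the tracker frame to the scene frame.
# Real values would come from the zspace plugin's calibration, not shown here.
TRACKER_TO_SCENE_ROT = np.eye(3)           # tracker axes aligned with scene axes
TRACKER_TO_SCENE_POS = np.array([0.0, 0.5, 0.0])
SCENE_SCALE = 10.0                         # 1 cm of stylus travel -> 10 cm in scene

def stylus_to_target(stylus_pos, stylus_rot):
    """Map a stylus pose (position vector, 3x3 rotation matrix) in the
    tracker frame to an end-effector target pose in the scene frame."""
    target_pos = SCENE_SCALE * (TRACKER_TO_SCENE_ROT @ stylus_pos) + TRACKER_TO_SCENE_POS
    target_rot = TRACKER_TO_SCENE_ROT @ stylus_rot
    return target_pos, target_rot

# Stylus 1 cm right and 2 cm forward of the tracker origin, unrotated:
pos, rot = stylus_to_target(np.array([0.01, 0.0, 0.02]), np.eye(3))
print(pos)  # [0.1 0.5 0.2]
```

Every frame, the resulting target pose drives both the virtual stylus in the scene and the IK solve for the limb, so the end effector tracks the physical stylus directly.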
Unity offers the ability to prototype and create applications quickly. Furthermore, Infinite Z, the company that produces the zspace device, released a plugin that integrates a Unity project with their system. An early release version of the zspace Unity plugin was eventually made available, and although it had some bugs and no documentation, it was used to create an application in which the user can control ATHLETE's end effector using the tracked stylus. A better, more comprehensive plugin is expected in the future, at which point the application could be upgraded.

To move ATHLETE's limbs correctly so that the end effector would match the virtual stylus, the author initially used a numerical inverse kinematics solution. However, it could only match a target position (not a target orientation as well) and had oscillation issues. Later, the author was given access to the ATHLETE flight software. The analytical inverse kinematics solution was taken from the flight software and integrated into the Unity project, allowing the simulated ATHLETE model to accurately match the target position and orientation. It was also used to compute a reachability map over the asteroid: points that ATHLETE can currently reach, given its base position and the commanded orientation of the end effector, are displayed in green (Fig. 4). The reachability map and the limb configuration are updated every frame. The system performs fairly well, although some optimization could still be done. Using head tracking, the user can move around the zspace device to get different perspectives on ATHLETE while positioning the end effector, stepping closer to the device to zoom in and away to zoom out.

Figure 4: Screenshot of the application with stereoscopic rendering turned off. Green points on the asteroid are reachable with the current stylus orientation and ATHLETE position. The magenta pointed cylinder in the scene represents the virtual stylus, which tracks the real stylus as the user moves and rotates it. Note how the ATHLETE inverse kinematics flight software has been used to configure the limb so that its end effector, a drill, matches the position and orientation of the virtual stylus.

Further work was done to allow users to freeze and unfreeze the configuration of the leg by pressing a key. A demo drilling procedure was also created in which the user must drill into a target rock at a certain angle while paying attention to gauges showing simulated data such as the drill motor current, revolutions per minute, and drill depth in the asteroid (Fig. 5).
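The reachability map amounts to running a reachability test for every sampled surface point and coloring the point green when the test succeeds. The sketch below substitutes a simple distance-band check for ATHLETE's analytical IK solve (a limb anchored at a base can reach any point between its minimum and maximum reach); the reach limits are made up for illustration.

```python
import numpy as np

# Stand-in reachability test. ATHLETE's flight software runs a full
# analytical IK solve per point instead; these reach limits are invented.
MIN_REACH, MAX_REACH = 0.5, 3.0

def reachable(base, point):
    """True when the point lies within the limb's reachable shell."""
    d = np.linalg.norm(np.asarray(point) - np.asarray(base))
    return MIN_REACH <= d <= MAX_REACH

def reachability_map(base, surface_points):
    """Label each sampled surface point green (True) if reachable."""
    return [reachable(base, p) for p in surface_points]

base = np.zeros(3)
samples = [(1.0, 0.0, 0.0), (4.0, 0.0, 0.0), (0.1, 0.0, 0.0)]
print(reachability_map(base, samples))  # [True, False, False]
```

Because the application recomputes this map every frame, the per-point test must be cheap, which is one reason the closed-form analytical solver from the flight software was preferable to an iterative numerical one.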

III. Conclusions

Overall, the application offers a natural and intuitive way to control the position and orientation of ATHLETE's end effector. Future work could extend the application beyond planning drilling motions; for example, it could also be used for planning claw motions and manipulations. As more development is completed on the zspace Unity plugin, upgrades to the application would also be beneficial. Evaluation and user testing are currently being conducted with ATHLETE operations personnel and rover planners.

Figure 5: Drilling demo with target rock (brown) and gauges showing motor current, RPM, and drill depth.

Figure 6: User positioning the end effector with the device as part of the drilling application. Note the stereoscopic rendering and stylus usage.

Acknowledgements

This research was carried out at the Jet Propulsion Laboratory, California Institute of Technology, and was sponsored by the JPL Summer Internship Program and the National Aeronautics and Space Administration. This work was done for the Human Robot Systems project. Heather Justice advised the project. Charles Goddard collaborated on the development. Sabrina Billinghurst managed the user testing.

References

1. Wilcox, B. H., Litwin, T., Biesiadecki, J., Matthews, J., Heverly, M., Morrison, J., Townsend, J., Ahmad, N., Sirota, A., and Cooper, B., "ATHLETE: A Cargo Handling and Manipulation Robot for the Moon," Journal of Field Robotics, Vol. 24, No. 5, May 2007, pp. 421-434. URL: http://www3.interscience.wiley.com/journal/114211098/issue
2. Mittman, D. S., Norris, J. S., Powell, M. W., Torres, R. J., McQuin, C., and Vona, M. A., "Lessons Learned from All-Terrain Hex-Limbed Extra-Terrestrial Explorer Robot Field Test Operations at Moses Lake Sand Dunes, Washington," Proceedings of the AIAA Space Conference, 2008.
3. Deering, M. F., "Explorations of Display Interfaces for Virtual Reality," IEEE Virtual Reality Annual International Symposium, 1993.