Task Directed Programming of Sensor Based Robots


B. Brunner, K. Arbter, G. Hirzinger
German Aerospace Research Establishment (DLR), Oberpfaffenhofen, Institute for Robotics and System Dynamics, Postfach 1116, D Wessling, Germany

Abstract

We propose the so-called TeleSensorProgramming concept, which uses sensory perception to achieve local autonomy in robotic manipulation. Sensor based robot tasks are used to define elemental moves within a high level programming environment. This approach is applicable in both the real robot's world and the simulated one. Besides graphical off-line programming, its range of application lies especially in the field of teleoperation with large time delays. A shared autonomy concept is proposed that distributes intelligence between man and machine. The feasibility of graphically simulating the robot within its environment is extended by emulating different sensor functions, such as distance, force-torque, and vision sensors, to copy the real system behaviour as faithfully as possible. These simulation features are embedded in a task driven high level robot programming approach. This implicit programming paradigm is supported by a sophisticated graphical man-machine interface. Programming by demonstration is realized by an analytical controller generation approach. Sensor fusion aspects with respect to autonomous sensor controlled task execution are discussed, as well as the interaction between the real and the simulated system.

1 Introduction

Programming a robot off-line has so far been restricted to showing a sequence of cartesian positions or joint angle configurations. But mechanical inaccuracies at the manipulator level and differences between the real and the simulated world lead to difficult problems in the execution of off-line defined robot activities.
Local adaptations at the path generation level have to be made via teach-in commands by a human operator, especially in the context of manipulation tasks such as gripping or handling objects. Even small deviations in position, orientation, and shape of the objects to be handled cannot be tolerated, because task execution will fail in an uncertain environment. Graphical simulation of robot tasks with downloading of the generated commands to the real robot is limited to the joint or cartesian motion level. This approach is only useful if geometrical consistency between the real environment and the simulated one can be guaranteed, a demand that cannot be met with available programming systems. The most important requirement for achieving autonomy is the ability to successfully execute a given task by intelligent sensor data processing. Actual sensory data are matched against a predefined reference pattern to provide the robot controller with the information needed to achieve the desired end position. Local deviations from the desired state are detected by the sensors and handled by the robot system in an autonomous way. Therefore sensor controlled movements have to be used to bring local autonomy onto the machine level. High level planning facilities for task scheduling as well as intelligent error handling mechanisms are required for full autonomy, but state-of-the-art techniques are insufficient to provide the adequate tools. Presently full autonomy is not reachable.

2 The TeleSensorProgramming approach

We therefore favour a shared autonomy concept that distributes intelligence between man and machine [1]. Presuming that sufficient information about the actual environment is available from the sensor systems, partial tasks can be executed independently on the machine level. Specifications and decisions on the high task planning level have to be made by human operators.
Local sensory feedback loops are executed by the robot system, while global task level jobs are specified interactively by a human operator. Coarse planning activities are done on a task oriented level by human intelligence; fine path planning on the manipulator level takes place on a sensor based control level [2]. This shared autonomy approach is the basis of our programming paradigm, for which we have coined the term TeleSensorProgramming (TSP). It constitutes a new way of programming robots on a task directed level. The teach-in of a robot system occurs not on the joint or cartesian manipulator level but on a high language level, i.e. the operator plans activities on a level which can be worked off by the robotic system without human intervention. Hence TSP means teaching by showing typical situations on a task directed level, including nominal sensory patterns, with the aid of sensory refinement in the real or a completely simulated world. All the taught sensor based actions can be activated in analogous situations.

Figure 1 The TeleSensorProgramming concept: a predictive (ground) simulation control loop (task planning and error handling, path planning and parameter determination, robot/sensor/environment models, local autonomy loop, virtual feedback) and the (remote) real system control loop (controller, robot, sensors, environment, local autonomy loop, real feedback), coupled via a common world model data base with sensor fusion algorithms (a priori knowledge used by the remote system, a posteriori knowledge for the world model update).

Figure 1 shows the structure of our TSP concept, consisting of two control cascades working in parallel. One of them is the real system control loop, containing internal control loops for local autonomy. The other one is a simulation environment which is structurally equivalent to the real system, with few exceptions. The most important one is the signal delay which may occur in the real (remote) system, e.g. in space applications, and which is not implemented in the simulation. This makes the simulation predictive with respect to the real system. Another exception is the display of internal variables in the simulation part which cannot be observed (measured) in the real system. This gives the operator or task planner more insight into what happens, or may happen, in the system as a consequence of his commands. Communication between the two loops runs via a common model data base which delivers a priori knowledge for the execution on the remote level and a posteriori knowledge for the model update of the simulated world.
Such an intelligent programming system requires several tools to implement the necessary functionality. First, a sophisticated simulation system has to be provided to emulate the real robot system, including the simulation of sensory perception within the real environment. Besides this sensor simulation, the shared autonomy concept has to provide an efficient operator interface to set up task descriptions, to configure the task control parameters, to decide what kinds of sensors and control algorithms should be used, and to debug an entire job execution phase. Two fields of application are evident for the proposed TSP approach. First, the graphical off-line programming concept is extended by the processing of simulated sensory data and by the introduction of local autonomy on the task description level. This means that not only the joint and cartesian information is gathered by graphically guiding the robot during the off-line programming task, but the simulated sensory information is also stored off-line as nominal patterns for subtask execution in a local feedback loop. The fine motion control loop for handling uncertainties runs without any human intervention, on the simulation side as well as on the real one. A main advantage of this off-line programming scheme is the feasibility of verifying all robot actions, including the perception and processing of sensory data and the activation of the sensor controlled elemental moves, before real execution is performed. Second, telerobotics with large time delays, especially in space and subsea applications, is a good field for this sensor based, task directed programming approach [1]. With a time delay of a few seconds inherent in the control loop, direct visual feedback does not allow the human operator to handle the robot movements in a suitable way.
Predictive simulation is the appropriate means for an operator to telemanipulate the remote system on-line [3]. Similar approaches are known which make use of force reflecting hand controllers to feed back force sensor signals for shared and teleoperated control modes [4]. Furthermore, an interactive supervisory user interface makes it possible to set up the environmental and control parameters. In our approach, however, the sensor based elemental move concept, applicable to all kinds of sensors and execution tasks, is central. The operator only has to guide the robot through the task space in a coarse way and to activate specific sensor control phases. Sending these gross commands to the remote system enables the real robot to execute the desired task with its own local autonomy after the delay time has elapsed. The main feature of our telerobotic concept is to replace the time-delayed visual feedback by predictive stereo graphics with sensor simulation, providing a supervisory control technique that allows more and more autonomy and intelligence to be shifted to the robot system. The main focus of this paper lies in the task directed programming approach, which facilitates robot programming through an intuitive task description level. Predefined complex sensor based tasks can be described on a task driven level, which allows their usage in varying contexts. To describe robot activities on a high language level we introduce the elemental move concept. This concept allows us to define subtasks that can be executed by the robot system in an autonomous way. To program a complex robot task it is only necessary to define a sequence of elemental moves on an intuitive planning level. An analytical approach is proposed to define sensor based elemental moves without an extensive controller design. Therefore we call our method programming by demonstration, which represents an easy variant of teaching by showing.

3 Elemental move concept

In the past many attempts have been made to describe robot assembly plans on a high language level. Command language approaches [5] and planning tools such as Petri nets [6] or assembly graphs [7] have been proposed. They work well in a structured, well-defined environment for a specific application field like block world assembly [8] or compliant motion tasks [9]. Our task directed programming approach is driven by an elemental move concept that enables us to program robot tasks in an intuitive, implicit manner. A complex robot task is composed of elemental moves (EMs) that can be divided into three categories: first, pose controlled EMs that are described by the goal pose (position and orientation) in cartesian space; second, sensor controlled EMs that are defined via nominal sensory patterns; and third, shared controlled EMs. Each class of EMs has a template of pre- and postconditions that describe the prerequisites to activate or stop the execution of an EM instance. An EM instance can be regarded as a node of a state machine. The transition between two nodes is defined by the matching of post- and preconditions of the corresponding EM nodes.

3.1 Pose controlled elemental moves

Pose controlled EMs are defined via the goal pose in cartesian space. The precondition template for the pose controlled EM class is therefore empty; the postcondition is met if the tool frame of the robot has reached the desired pose within predefined limits. The path generation considers the inverse kinematic problem, which is solved by an iterative approach. Currently an on-line collision avoidance algorithm is being implemented to find a collision free path from the current start to the defined end pose [10].
3.2 Sensor controlled elemental moves

Sensor controlled EMs are defined via the nominal sensory pattern in the reference pose relative to the environment. The precondition template for the sensor controlled EM class therefore includes the availability and correctness of the sensory data which is necessary to apply the control algorithm for achieving the desired goal state. This class of EMs represents full autonomy, in the sense that the sensor based control mechanism leads to proper system behaviour. An EM is fully sensor controlled if all 6 degrees of freedom (DOFs) in cartesian space are controlled by sensor values. With the aid of sensor based path refinement it is possible to act in an environment under the constraints of the sensor measurements. This class of sensor controlled EMs also gives us the possibility to handle similar tasks in analogous situations. For example, an approach move that aligns the robot gripper with the object to be picked up can be defined as one sensor controlled EM for a desired distance and/or vision sensory pattern; similar objects at different poses can then be gripped by the same EM.

3.3 Shared controlled elemental moves

The shared controlled EM class is a mixture of the previous two. For each instance of a shared controlled EM the constraint space configuration has to be defined. This description specifies which degree of freedom in the cartesian 3D-space is pose or sensor controlled. For instance, a contour following task in the xy-plane can be described by the pose controlled DOFs z_trans, x_rot, y_rot, z_rot and the sensor controlled DOFs x_trans and y_trans. Based on Mason's C-frame concept [11], sensor controlled subspaces are defined using nominal sensory patterns to control the selected DOFs. The free DOFs are controlled by the path planning algorithm or, in the case of teleoperation, by the human operator using a 6 DOF input device.
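The constraint space configuration of a shared controlled EM can be sketched as a binary DOF selection that routes each commanded increment either to the sensor controller or to the path planner/operator. The following Python sketch is an illustration under assumed names and values, not code from the paper:

```python
# A sketch of the constraint space configuration for a shared controlled EM.
# Each cartesian DOF (tx, ty, tz, rx, ry, rz) is marked sensor controlled (1)
# or pose/operator controlled (0); the commanded increment for each DOF is
# taken from the corresponding source. All numbers are illustrative.
def blend_command(selection, sensor_cmd, pose_cmd):
    return [s_c if sel else p_c
            for sel, s_c, p_c in zip(selection, sensor_cmd, pose_cmd)]

# Contour following in the xy-plane: x/y translation sensor controlled,
# z translation and all rotations pose controlled (the example in 3.3).
selection = [1, 1, 0, 0, 0, 0]
cmd = blend_command(selection,
                    sensor_cmd=[0.2, -0.1, 0.0, 0.0, 0.0, 0.0],
                    pose_cmd=[0.0, 0.0, 0.5, 0.0, 0.0, 0.1])
```

The selection vector plays the role of Mason's C-frame partition: it fixes, per DOF, which control source is authoritative.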
The techniques used for projecting gross move commands into the pose and sensor controlled subspaces have been discussed in a number of previous papers, e.g. [12].

4 High level task directed programming

Task directed robot programming has so far been discussed in the fields of generalized planning [13] and assembly plan generation as an AI paradigm [14],[15]. Especially for telerobotic applications, interactive planning facilities are of crucial importance, but only theoretical work has been done so far on integrating such planning tools into the framework of sensor based task execution [16]. We believe that the realization of task directed robot programming by the elemental move approach leads to an intuitive high level programming concept. Complex robot tasks are described by an action graph of elemental moves. The nodes of this graph are represented by the EM controllers. The transitions between the corresponding nodes are defined by the matching of the successor preconditions with the predecessor postconditions. A plausibility check has to be run to verify the congruence between the end conditions of one subtask and the start conditions of the following one, so that the consistency of the subtask sequence is ensured within the entire robot task. The feasibility of an EM, i.e. the question whether its preconditions can be met, depends on the availability of the required sensor values as well as on the determination of the constraint space configuration.
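The plausibility check on an EM sequence can be sketched as a simple set-containment test: a transition is admissible when the postconditions of an EM cover the preconditions of its successor. The EM names and condition sets below are illustrative assumptions, not taken from the paper:

```python
# Sketch of the consistency check on an EM action graph path: each
# successor's preconditions must be guaranteed by its predecessor's
# postconditions, otherwise the sequence is rejected.
def sequence_consistent(ems):
    """ems: list of (name, preconditions, postconditions) triples."""
    for (_, _, post), (name, pre, _) in zip(ems, ems[1:]):
        if not pre <= post:          # successor precondition not guaranteed
            return False, name
    return True, None

task = [
    ("ApproachByVision",            set(),               {"vision_in_range"}),
    ("ApproachByVisionAndDistance", {"vision_in_range"}, {"distance_zero"}),
    ("AttachOru",                   {"distance_zero"},   {"oru_attached"}),
]
ok, failed_at = sequence_consistent(task)
```

A full implementation would check this property for every path through the action graph, not just one linear sequence.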

To implement a real task directed programming tool we propose a hierarchical declaration of the EM framework. At the lowest level we define the pose, sensor, and shared controlled EMs as so-called atomic elemental moves that cannot be split into several control items. Each node in the action graph represents one instance of an atomic EM. A sequence of connected nodes forming part of the action graph can be joined into a so-called composed elemental move, which can be further used as a logical EM on a high task description level. We have to ensure that the uniqueness of the graph transitions is guaranteed by the alignment of the pre- and postconditions of successive EMs. A composed EM is put together from a non-empty sequence of atomic and/or composed EMs in a recursive way. One remark on the definition of the atomic EMs: in contrast to other approaches like [5] or [4], we do not have to predefine all the elemental operations which can be used to execute any task. We can interactively define and configure an atomic EM via programming by demonstration, which is the topic of the next chapter.

5 Programming by demonstration

Visual programming tools for the direct transfer of human manipulation skills to the robot have been developed, but they are constrained to location gathering using a marker-based vision system [17] or to trajectory generation from observing object features [18]. Compliance controller identification by human demonstration [19] is restricted to straight line motion in well-defined constraint configurations. Methods proposed in recent years [20],[21],[22] for programming by demonstration utilize the real robot system to collect the required demonstration data. The same system is used for further execution, so this class of direct demonstration approaches can be regarded as simple playback methods.
5.1 Remote demonstration

We have extended this methodology by separating the demonstration and execution systems, which gives us the ability to define task level robot actions off-line and to overcome the problems attendant on time-delayed telerobotic applications. Before the robot task is executed in the real system, we demonstrate it using sophisticated simulation tools, including the emulation of the interactions between the manipulator and the environment, covering contact sensors as well as non-contact sensor devices. Currently we have implemented the functionality of laser distance sensors, a feature-based simulation of stereo cameras, and the behaviour of a stiff force-torque sensor in contact with the environment [23]. We emphasize that the usage of simulated sensory behaviour is only feasible if the virtual system is properly calibrated. Especially in the emulation of computer vision, knowledge of the internal and external camera parameters is indispensable for finding the right mapping between the 3D world model and the 2D camera planes.

5.2 Defining the atomic elemental moves

It should be noted that the following concepts for teaching atomic elemental moves are applicable both to the simulation, with regard to graphical off-line programming or telerobotics, and to the real robot (including sensor equipment) without any simulation environment. Applying programming by demonstration to simple pose controlled EMs is straightforward. The actual cartesian pose of the robot's tool frame is stored together with current manipulator data like velocity or acceleration. On activation of such a pose controlled EM the robot moves from any pose to the desired one, taking into account the inverse kinematic problem as well as collision detection. To move the robot, a 6DOF input device is used, which gives the operator an easy way to generate such target points within the robot's workspace.
To define the various instances of the sensor or shared controlled EM classes we propose an intuitive programming by demonstration methodology. The key issue is to find an appropriate controller to perform the desired EM correctly.

5.3 Automatic controller generation

For the handling of the sensor controlled EMs we outline a method to estimate the motion parameters from the known sensor values in order to achieve the predefined goal state of the robot's pose (cartesian position in translation and orientation, or joint angle position). In other words, we have to find a control sequence that transfers the robot's end effector from a pose x_0 into the nominal pose x^*, which is described by its corresponding sensor values y^*; x^* itself is assumed to be unknown or uncertain. The motion parameters relative to the sensed environment are expressed by the vector increment \Delta x_k, the actual sensor values by the vector y_k. The nominal sensor values y^* are in general non-linear functions of the actual interaction between the robot, the sensor, and the environment in the robot's nominal pose x^*. In the execution phase we want to find a controller sequence that is able to reach the goal state depending on the nominal sensor value vector which the system was taught. Starting from a pose x_0 we calculate the motion parameters \Delta x_k stepwise from the actual and nominal sensor values, i.e. we apply a linear controller of the form

    \Delta x_k = \alpha \, C \, (y^* - y_k)    (1)
    x_{k+1} = x_k + \Delta x_k    (2)

where C is a constant N x M matrix which maps the displacement in the M-dimensional sensor space to the N-dimensional control space, and where \alpha is a scalar damping factor which determines the dynamic behaviour of the closed control loop. The optimal controller coefficients, in the sense of a least squares estimate, are expressed by the pseudoinverse

    C = J^+ = (J^T J)^{-1} J^T    (3)

where the elements of J are the partial derivatives of the sensor values y with respect to the motion parameters x:

    J_{ij} = \partial y_i / \partial x_j    (4)

The teach-in process is implemented in an efficient way, because the nominal sensor values y^* can easily be derived from either the simulation or the real system. The controller design, i.e. the construction of J, is also implemented in a simple manner by experimentally estimating the Jacobian J, i.e. moving the simulated or real robot by increments and using the difference quotients

    J_{ij} \approx \Delta y_i / \Delta x_j    (5)

The reason why we determine the Jacobian J by experimentally estimating the partial derivatives of the sensor values with respect to the motion parameters is the robustness of the control approach in the local working space, i.e. as long as \Delta x stays within a suitable bound. In this way we avoid the very high effort which would be necessary to determine the partial derivatives analytically from a large non-linear model. Furthermore, the estimates determined using the real system are expected to be better than analytical ones, because they are independent of (inevitable) model abstractions. To get the mapping from the actual sensory pattern to the desired reference pose of the robot, no sensor calibration has to be done, because the training process delivers the inverse relation between the robot's movements and the corresponding sensor values. In the case of programming by virtual demonstration within the graphical simulation environment, sufficiently correct world models with respect to sensor and object descriptions must be available for the generation of the controller matrix C. The method is plain and robust.
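The controller generation of eqs. (1)-(5) can be sketched end to end in a few lines. The sensor mapping y(x) is treated as a black box that can be probed by small test moves (on the simulated or real robot); here it is a made-up linear model with 4 sensor values and 2 motion DOFs, so all names and numbers are illustrative assumptions, not data from the paper:

```python
import numpy as np

# Estimate the Jacobian by one test move per controlled DOF (eq. 5).
def estimate_jacobian(sense, x_ref, dx=1e-3):
    y_ref = sense(x_ref)
    J = np.zeros((y_ref.size, x_ref.size))
    for j in range(x_ref.size):
        x = x_ref.copy()
        x[j] += dx                            # small increment in DOF j
        J[:, j] = (sense(x) - y_ref) / dx     # difference quotient
    return J

# Run the linear controller of eqs. (1) and (2) towards the taught pattern.
def control_to_goal(sense, C, x0, y_star, alpha=0.5, steps=100):
    x = x0.copy()
    for _ in range(steps):
        x = x + alpha * (C @ (y_star - sense(x)))
    return x

A = np.array([[1.0, 0.2], [0.1, 1.0], [0.5, -0.3], [-0.2, 0.4]])
sense = lambda x: A @ x                       # assumed sensor model y = A x
x_star = np.array([0.3, -0.1])                # nominal pose, known only via y*
y_star = sense(x_star)                        # taught nominal sensor pattern

J = estimate_jacobian(sense, np.zeros(2))
C = np.linalg.pinv(J)                         # least squares optimum (eq. 3)
x_final = control_to_goal(sense, C, np.zeros(2), y_star)
```

In this linear toy model the pose error contracts by the factor (1 - alpha) per step, so x_final converges to x_star although x_star never enters the controller, only y_star does, mirroring the calibration-free property described above.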
The simplicity of the programming by demonstration paradigm rests on an easy way to configure the sensor or shared controlled EM description. The reference sensor pattern, i.e. the goal state definition of the atomic EM, is directly shown by the current sensor value configuration. The controller for achieving this desired state is determined automatically by the generation of the Jacobian as described above. The human operator only has to specify the controller input variables, i.e. the sensor combination which should be used for the control task, and the controller output variables, i.e. the sensor controlled DOFs, to activate the controller generation. This procedure can be viewed as a form of training by doing. Currently an intelligent sensor planning system is being developed to find the appropriate sensor usage, in combination with the determination of the sensor controlled subspaces, in an autonomous way [24].

5.4 Sensor fusion aspects

The proposed sensorimotion approach allows a simple integration of different sensor values. An efficient sensor fusion algorithm is inherently implemented by assigning all the used sensor values to the elements of the measurement vector y. To get better estimation results it is often helpful to use redundant and normalized sensor information. We have implemented this approach, among other cases, to combine stereo vision (2 cameras) data with laser range data (4 distance values) and have observed fast (exponential) as well as wide (full sensor range) convergence. In the following, the main advantages of the proposed controller generation process are briefly discussed. The controller matrix C remains constant during the control process. The training of sensor controlled EMs, i.e. setting the linear controller coefficients via the Jacobian, has to be done only once for a specific sensor/DOF combination in the desired reference state.
The number of training steps for the determination of the controller is equal to the number of DOFs to be controlled. Any constraint space configuration can be selected to integrate the shared control concept into the controller generation process. Sensor failures can easily be handled by introducing a binary valued diagonal weight matrix W, extending the controller design to

    C_W = (J^T W J)^{-1} J^T W    (6)

The controller redesign only needs a failure detection, but does not need a new training process for the changed sensor configuration. If J^T J is regular, then J^T W J may be regular too, provided the original sensor configuration contains enough redundancy. The condition number of J^T W J may be used to qualify different sensor configurations. To realize such a sensorimotion coordination and sensor fusion concept we have already applied other methods such as neural networks [25]. Our ongoing work is to compare both approaches with regard to applicability, robustness, and efficiency.
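The sensor failure redesign of eq. (6) can be sketched as follows: a binary diagonal weight matrix W masks out the failed sensor rows and the controller matrix is recomputed from the already trained Jacobian, without any new training moves. The Jacobian values below are made up for illustration:

```python
import numpy as np

# Weighted controller redesign C_W = (J^T W J)^{-1} J^T W (eq. 6).
def weighted_controller(J, weights):
    W = np.diag(np.asarray(weights, dtype=float))
    JtWJ = J.T @ W @ J
    # the condition number of J^T W J qualifies the remaining configuration
    cond = np.linalg.cond(JtWJ)
    return np.linalg.inv(JtWJ) @ J.T @ W, cond

J = np.array([[1.0,  0.0],
              [0.0,  1.0],
              [1.0,  1.0],
              [1.0, -1.0]])        # 4 redundant sensors, 2 controlled DOFs
C_full, cond_full = weighted_controller(J, [1, 1, 1, 1])
C_fail, cond_fail = weighted_controller(J, [1, 1, 0, 1])  # third sensor failed
```

Because the four sensor rows are redundant for the two DOFs, J^T W J stays regular after the failure and C_fail still inverts the local sensor mapping exactly (C_W J = I).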

6 Experiments

The TSP concept has been verified within ROTEX, an advanced space RObot Technology EXperiment flown in May 93 on the D2 spacelab mission [12]. The experimental environment of the ROTEX ground station is used and currently extended to implement our task directed robot programming approach. A powerful man machine interface is one of the key issues for performing programming by demonstration in an efficient way. A 6DOF control-ball is used to move the robot's tool center point in the 3D-stereographics simulation. A user-friendly graphical interface based on X and OSF/Motif has been implemented to support the operator in the definition and composition of the various elemental moves. Three types of sensors are available in the real as well as the simulated gripper environment: an array of 8 laser distance sensors with a working range of 0-3 cm plus one with 0-30 cm, two 6 axis force-torque wrist sensors (a stiff strain gauge based one and a more compliant optical one), and a tiny pair of cameras to provide a stereo image out of the gripper. The image processing system used in our laboratory is able to extract features like contours or markers in real time (application dependent). In addition to this direct object feature extraction, more abstract measurements such as the Fourier coefficients of a contour can be delivered as an efficient tool for the determination of the pose, size, and shape of an object given by its contour line [26]. The corresponding simulation works analogously on a feature based level with respect to the internal and external camera parameters. Especially for the acquisition of stereo vision data this model based method offers the advantage of avoiding the correspondence problem: corresponding object features in the different camera views can easily be detected by using the same geometric world model interface.
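As an aside on the contour measurements: the basic idea behind such Fourier descriptors can be illustrated with a few lines of NumPy. This is a generic sketch of contour Fourier coefficients, not the specific method of [26]; the contour (a unit circle) and sample count are assumed values. Sampling a closed contour as complex points z = x + iy, the DFT coefficients separate pose from shape: translation changes only the DC coefficient c_0, while the remaining coefficients encode size and shape.

```python
import numpy as np

# Closed contour sampled as complex points; here a unit circle, 64 samples.
n = 64
pts = np.exp(2j * np.pi * np.arange(n) / n)

coeffs = np.fft.fft(pts) / n                 # Fourier coefficients c_k
shifted = np.fft.fft(pts + (3 + 4j)) / n     # same contour, translated
```

For the circle, c_1 carries the radius; after translation, c_0 becomes the new centroid (3 + 4i) while all other coefficients are unchanged, which is what makes the descriptors useful for pose determination.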
The experimental environment consists of a powerful Silicon Graphics workstation with a multi-processor architecture to enhance the simulation and graphical visualization of the workcell environment. The graphical user interface as well as the controller task run as independent processes on the same machine. The inter-process communication occurs via shared memories protected by semaphores to guarantee mutual exclusion on concurrent access. The exchange of an orbital replaceable unit (ORU) is a good example to explain the task directed programming approach in more detail (see figure 2). This manipulation task can be regarded as a sophisticated pick and place operation, where the pick includes a bayonet closure screwing action. The composed EM ExchangeOru is split into the EMs FreeMotion, ApproachToOru, AttachOru and the inverse EMs FreeMotionWithOru, ApproachWithOru, ReleaseOru. Let us focus on the EM ApproachToOru. This composed EM is further divided into three atomic EMs which differ in the choice of the used sensors and the constraint space configuration. At first, only the vision sensor is available within its measurement range. ApproachByVision uses four corner points lying on the ORU contour as the reference pattern to achieve the first goal state. After the execution of the EM ApproachByVision, the distance sensors can be applied together with the visible corner points (see figures 3 and 4). The reference state of the EM ApproachByVisionAndDistance is reached when the distance sensor values are zero or non-zero force-torque sensor values appear. The last atomic EM, ApproachByDistanceAndForce, ends in a state where the attaching of the ORU is possible and the next EM AttachOru can be activated.
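The recursive composed EM structure described in section 4 can be sketched as a flattening from the task description level down to the executable atomic sequence; the EM names match the ORU example, while the data representation is an illustrative assumption:

```python
# Hypothetical sketch of the recursive composed EM definition: a composed EM
# is a non-empty sequence of atomic and/or composed EMs, flattened to the
# atomic sequence for execution. Atomic EMs are plain name strings.
def flatten(em):
    if isinstance(em, str):          # atomic EM
        return [em]
    name, parts = em                 # composed EM: (name, sub-EM sequence)
    assert parts, "a composed EM must be a non-empty sequence"
    return [a for p in parts for a in flatten(p)]

approach = ("ApproachToOru", ["ApproachByVision",
                              "ApproachByVisionAndDistance",
                              "ApproachByDistanceAndForce"])
# Forward half of ExchangeOru (the inverse EMs are analogous).
exchange = ("ExchangeOru", ["FreeMotion", approach, "AttachOru"])
seq = flatten(exchange)
```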
Figure 2 The composed EM ExchangeOru, decomposed into FreeMotion, ApproachToOru, AttachOru, ApproachWithOru, FreeMotionWithOru, ReleaseOru; the composed EM ApproachToOru is further decomposed into ApproachByVision, ApproachByVisionAndDistance, ApproachByDistanceAndForce.

All three atomic EMs are completely sensor controlled, but for teleoperation the constraint space configuration can easily be reconfigured to allow operator supported shared control. The Jacobian (see figure 5) for the EM ApproachByVisionAndDistance, for example, integrates the vision data of two visible corner points of the ORU contour and four distance sensor values. These sensor readings are represented by the sensor beams approximately perpendicular to the desired contact surface between the ORU and the gripper in the attach phase (see figures 3 and 4).

Figure 3 Configuration of the EM ApproachByVisionAndDistance (stereo camera and distance sensors of the gripper relative to the ORU and the tool frame orientation)

Figure 4 ApproachByVisionAndDistance, viewed out of the hand camera (vision corner points in the image plane relative to the tool frame orientation)

We experimentally estimate the Jacobian J by moving the simulated or real robot by increments of 1 mm in each translational DOF and of 2 degrees in each rotational DOF, creating the difference quotients

    J_{ij} = \Delta y_i / \Delta x_j    (7)

Figure 5 The Jacobian for the EM ApproachByVisionAndDistance (rows: the x and y image coordinates of the corner points in the left and right camera, and the four distance sensor values; columns: the DOFs trans(x), trans(y), trans(z), rotate(x), rotate(y), rotate(z))

The incremental values of 1 mm and 2 degrees depend on the geometrical dimensions of the workcell and sensor description. We have to consider that the incremental values have to be large enough to avoid numerical error problems. The linear controller, automatically created and used by the EM ApproachByVisionAndDistance, generates a smooth behaviour of the tool center point movements to achieve the desired position. In the case of programming by demonstration and execution both on the real system, the experimental determination of the controller coefficients implicitly takes the sensor characteristics into account. In an analytical approach the particular sensor parameters would have to be known to get the Jacobian parameters. In the case of programming in the simulated environment and execution on the real one, the alignment of the real and virtual environments has to be guaranteed. Since the goal state of the sensor controlled EMs is expressed in terms of sensor values, the real sensor data must match the sensor values which are simulated in the virtual environment. For instance, the distance sensors have to deliver the correct distance value between the sensor origin and the measured object surface, and the vision system should be able to provide features such as points, lines, or contours in the image plane without lens distortion.

7 Conclusion

We have introduced the TeleSensorProgramming concept as an easy way to program a robot off-line in a robust way with sensor based control structures. The proposed task directed programming paradigm is also applicable in the field of telerobotics with large time delays, where local autonomy is indispensable. Teaching by showing in the real as well as in the simulated world, including all the interactions between the robot and its environment, leads to a programming by demonstration approach using an elemental move concept with local sensor based machine intelligence. The elemental moves are defined by a sensor data processing scheme whose controller is experimentally determined via the Jacobian in the respective working area. Our ongoing work is to integrate the virtual and real world by world model update facilities. As a first step we have to enhance the congruence between the virtual and real world using calibrated sensor models.

References

[1] G. Hirzinger, J. Heindl, K. Landzettel, and B. Brunner, Multisensory shared autonomy a key issue in the space robot technology experiment ROTEX, Proc. of the IEEE Conf. on Intelligent Robots and Systems (IROS), Raleigh, July 7-10.
[2] T. Sheridan, Human supervisory control of robot systems, Proc. of the IEEE Conf. on Robotics and Automation, San Francisco.
[3] A. Bejczy and W. Kim, Predictive displays and shared compliance control for time-delayed telemanipulation, IEEE Int. Workshop on Intelligent Robots and Systems, Ibaraki.
[4] P. Backes and K. Tso, UMI: An Interactive Supervisory and Shared Control System for Telerobotics, Proc. of the IEEE Int. Conf. on Robotics and Automation, Cincinnati.
[5] J. Funda and R. Paul, Remote Control of a Robotic System by Teleprogramming, Proc. of the IEEE Int. Conf. on Robotics and Automation, Sacramento, 1991.

8 [6] B. McCarragher, Task-Level Adaptation Using a Discrete Event Controller for Robotic Assembly, IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS), Yokohama, July 26 30, [7] A. Kapitanovsky and O. Maimon, Conceptual Graphbased Synthesis of Robotic Assembly Operations, Proc. of the IEEE Conf. on Robotics and Automation, Nice, France, May, [8] G. Schrott, An Experimental Environment for Task- Level Programming, Second Int. Symposium on Experimental Robotics (ISER), Toulouse, France, June 25 27, [9] P. Van de Poel, W. Witvrouw, H. Bruyninckx, and J. de Schutter, An Environment for Developing and Optimising Compliant Robot Motion Tasks, Int. Conf. on Advanced Robotics (ICAR), Tokyo, November, [10] E. Ralli and G. Hirzinger, Fast Path Planning for Robot Manipulators using Numerical Potential Fields in the Configuration Space, accepted for presentation on the IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS), Munich, Sept , [11] M. Mason, Compliance and force control for computer controlled manipulators, IEEE Trans. on Systems, Man and Cybernetics, Vol SMC-11, No. 6, p , [12] G. Hirzinger, B. Brunner, J. Dietrich, and J. Heindl, ROTEX The First Space Robot Technology Experiment, Sixth Int. Symposium on Robotics Research, Hidden Valley, PA, Oct 2 5, [13] W. Yared and T. Sheridan, Plan Recognition and Generalization in Command Languages with Application to Telerobotics, IEEE Trans. on Systems, Man and Cybernetics, Vol. 21, No. 2, p , [14] B. Frommherz and G. Werling, Generating Robot Action Plans by means of an Heuristic Search, Proc. of the IEEE Int. Conf. on Robotics and Automation, Cincinatti, p , [15] Y. Huang and C. Lee, An Automatic Assembly Planning System, Proc. of the IEEE Int. Conf. on Robotics and Automation, Cincinatti, p , [16] M. Gini, Integrating Planning and Execution for sensorbased Robots, in: Ravani B. (ed.), CAD based Programming for Sensory Robots, NATO ASI Series F50, Springer, Berlin, [17] S. Tso and K. 
Liu, Visual Programming for Capturing of Human Manipulation Skill, IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS), Yokohama, July 26 30, [18] A. Ude, Trajectory generation from noisy positions of object features for teaching robot paths, Robotics and Autonomous Systems 11, , [19] N. Delson and H. West, Robot Programming by Human Desmonstration: Subtask Compliance Controller Identification, IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS), Yokohama, July 26 30, [20] G. Hirzinger and J. Heindl, Sensor programming a new way for teaching a robot paths and force/torques simultaneously, Third Int. Conf. on Robot Vision and Sensory Controls, [21] H. Asada and S. Izumi, Direct teaching and Automatic Program Generation for the Hybrid Control of Robot Manipulators, Proc. of the IEEE Int. Conf. on Robotics and Automation, [22] K. Kosuge, T. Fukuda, and H. Asada, Acquisition of Human Skills for Robotic Systems, Proc. of the IEEE Symposium on Intelligent Control, Arlington Virginia, August, [23] B. Brunner, G. Hirzinger, K. Landzettel, and J. Heindl, Multisensory Shared Autonomy and Tele-Sensor- Programming Key Issues in the Space Robot Technology Experiments ROTEX, IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS), Yokohama, July 26 30, [24] G. Grunwald and G. Hager, Towards Task-Directed Planning of Cooperating Sensors, Proc. of SPIE conf. on Sensor Fusion V, Boston Mass., Nov., [25] G. Wei, G. Hirzinger, and B. Brunner, Sensorimotion Coordination and Sensor Fusion by Neural Networks, IEEE Int. Conf. on Neural Networks, San Francisco, [26] K. Arbter and H. Burkhardt, A Fourier-method for feature extraction and parameter estimation for planar curves in the 3 D space (in German), in: Informationstechnik it(33), Oldenbourg, Munich, 1991.


More information

INTELLIGENT SYSTEMS, CONTROL, AND AUTOMATION: SCIENCE AND ENGINEERING

INTELLIGENT SYSTEMS, CONTROL, AND AUTOMATION: SCIENCE AND ENGINEERING Robotics International Series on INTELLIGENT SYSTEMS, CONTROL, AND AUTOMATION: SCIENCE AND ENGINEERING VOLUME 43 Editor Professor S. G. Tzafestas, National Technical University of Athens, Greece Editorial

More information

Using NI Vision & Motion for Automated Inspection of Medical Devices and Pharmaceutical Processes. Morten Jensen 2004

Using NI Vision & Motion for Automated Inspection of Medical Devices and Pharmaceutical Processes. Morten Jensen 2004 Using NI Vision & Motion for Automated Inspection of Medical Devices and Pharmaceutical Processes. Morten Jensen, National Instruments Pittcon 2004 As more control and verification is needed in medical

More information

Computer Animation. Lecture 2. Basics of Character Animation

Computer Animation. Lecture 2. Basics of Character Animation Computer Animation Lecture 2. Basics of Character Animation Taku Komura Overview Character Animation Posture representation Hierarchical structure of the body Joint types Translational, hinge, universal,

More information

Part-Based Recognition

Part-Based Recognition Part-Based Recognition Benedict Brown CS597D, Fall 2003 Princeton University CS 597D, Part-Based Recognition p. 1/32 Introduction Many objects are made up of parts It s presumably easier to identify simple

More information

Application of Virtual Instrumentation for Sensor Network Monitoring

Application of Virtual Instrumentation for Sensor Network Monitoring Application of Virtual Instrumentation for Sensor etwor Monitoring COSTATI VOLOSECU VICTOR MALITA Department of Automatics and Applied Informatics Politehnica University of Timisoara Bd. V. Parvan nr.

More information

This week. CENG 732 Computer Animation. Challenges in Human Modeling. Basic Arm Model

This week. CENG 732 Computer Animation. Challenges in Human Modeling. Basic Arm Model CENG 732 Computer Animation Spring 2006-2007 Week 8 Modeling and Animating Articulated Figures: Modeling the Arm, Walking, Facial Animation This week Modeling the arm Different joint structures Walking

More information

Robotics. Lecture 3: Sensors. See course website http://www.doc.ic.ac.uk/~ajd/robotics/ for up to date information.

Robotics. Lecture 3: Sensors. See course website http://www.doc.ic.ac.uk/~ajd/robotics/ for up to date information. Robotics Lecture 3: Sensors See course website http://www.doc.ic.ac.uk/~ajd/robotics/ for up to date information. Andrew Davison Department of Computing Imperial College London Review: Locomotion Practical

More information

FUNDAMENTALS OF ROBOTICS

FUNDAMENTALS OF ROBOTICS FUNDAMENTALS OF ROBOTICS Lab exercise Stäubli AULINAS Josep (u1043469) GARCIA Frederic (u1038431) Introduction The aim of this tutorial is to give a brief overview on the Stäubli Robot System describing

More information

Introduction to Computer Graphics Marie-Paule Cani & Estelle Duveau

Introduction to Computer Graphics Marie-Paule Cani & Estelle Duveau Introduction to Computer Graphics Marie-Paule Cani & Estelle Duveau 04/02 Introduction & projective rendering 11/02 Prodedural modeling, Interactive modeling with parametric surfaces 25/02 Introduction

More information

Synthetic Aperture Radar: Principles and Applications of AI in Automatic Target Recognition

Synthetic Aperture Radar: Principles and Applications of AI in Automatic Target Recognition Synthetic Aperture Radar: Principles and Applications of AI in Automatic Target Recognition Paulo Marques 1 Instituto Superior de Engenharia de Lisboa / Instituto de Telecomunicações R. Conselheiro Emídio

More information

Classroom Tips and Techniques: The Student Precalculus Package - Commands and Tutors. Content of the Precalculus Subpackage

Classroom Tips and Techniques: The Student Precalculus Package - Commands and Tutors. Content of the Precalculus Subpackage Classroom Tips and Techniques: The Student Precalculus Package - Commands and Tutors Robert J. Lopez Emeritus Professor of Mathematics and Maple Fellow Maplesoft This article provides a systematic exposition

More information

Integration Services

Integration Services Integration Services EXPERIENCED TEAM ADVANCED TECHNOLOGY PROVEN SOLUTIONS Integrations for large scale metrology applications Metris metrology to streamline your CAPABILITIES Advanced systems design Engineering

More information

Development of Robotic End-Effector Using Sensors for Part Recognition and Grasping

Development of Robotic End-Effector Using Sensors for Part Recognition and Grasping International Journal of Materials Science and Engineering Vol. 3, No. 1 March 2015 Development of Robotic End-Effector Using Sensors for Part Recognition and Grasping Om Prakash Sahu, Bibhuti Bhusan Biswal,

More information

Masters in Information Technology

Masters in Information Technology Computer - Information Technology MSc & MPhil - 2015/6 - July 2015 Masters in Information Technology Programme Requirements Taught Element, and PG Diploma in Information Technology: 120 credits: IS5101

More information

An Approach for Utility Pole Recognition in Real Conditions

An Approach for Utility Pole Recognition in Real Conditions 6th Pacific-Rim Symposium on Image and Video Technology 1st PSIVT Workshop on Quality Assessment and Control by Image and Video Analysis An Approach for Utility Pole Recognition in Real Conditions Barranco

More information

Distributed Sensing for Cooperative Robotics

Distributed Sensing for Cooperative Robotics Distributed Sensing for Cooperative Robotics Guilherme Augusto Silva Pereira Advisor: Prof. Mário Fernando Montenegro Campos VERLab Vision and Robotics Laboratory/UFMG Co-Advisor: Prof. Vijay Kumar GRASP

More information

Integrated sensors for robotic laser welding

Integrated sensors for robotic laser welding Proceedings of the Third International WLT-Conference on Lasers in Manufacturing 2005,Munich, June 2005 Integrated sensors for robotic laser welding D. Iakovou *, R.G.K.M Aarts, J. Meijer University of

More information

DIEF, Department of Engineering Enzo Ferrari University of Modena e Reggio Emilia Italy Online Trajectory Planning for robotic systems

DIEF, Department of Engineering Enzo Ferrari University of Modena e Reggio Emilia Italy Online Trajectory Planning for robotic systems DIEF, Department of Engineering Enzo Ferrari University of Modena e Reggio Emilia Italy Online Trajectory Planning for robotic systems Luigi Biagiotti Luigi Biagiotti luigi.biagiotti@unimore.it Introduction

More information

CHAPTER 1 INTRODUCTION

CHAPTER 1 INTRODUCTION 1 CHAPTER 1 INTRODUCTION Exploration is a process of discovery. In the database exploration process, an analyst executes a sequence of transformations over a collection of data structures to discover useful

More information

Robotics. Chapter 25. Chapter 25 1

Robotics. Chapter 25. Chapter 25 1 Robotics Chapter 25 Chapter 25 1 Outline Robots, Effectors, and Sensors Localization and Mapping Motion Planning Motor Control Chapter 25 2 Mobile Robots Chapter 25 3 Manipulators P R R R R R Configuration

More information

3D Vision An enabling Technology for Advanced Driver Assistance and Autonomous Offroad Driving

3D Vision An enabling Technology for Advanced Driver Assistance and Autonomous Offroad Driving 3D Vision An enabling Technology for Advanced Driver Assistance and Autonomous Offroad Driving AIT Austrian Institute of Technology Safety & Security Department Manfred Gruber Safe and Autonomous Systems

More information