
Emergency stop for safe human-robot-coexistence


The coexistence of humans and industrial robots offers increased flexibility and system uptime in production. Currently, the industrial robots in a work-cell have to be shut down during maintenance operations, e.g. cleaning of the work-cell; if the robot can continue to work during much of this maintenance, the costs of maintenance operations are reduced. Moreover, because safety fences become unnecessary, a smaller work-cell layout is possible, decreasing workspace costs. Robots could also bring parts to human workers and hold them in a position convenient for the human.

As the position of the human is not fixed, sensors are required to determine the current position of the human and avoid a collision. While camera images are suitable for this task, standard cameras provide only 25 to 60 images per second. High-speed cameras are available, but they are expensive and also require fast communication links and powerful computers to analyze the image data.

A solution to this problem is the integration of the processing elements and the photo-detectors onto a single chip. In the Ishikawa-Namiki laboratory, a specialized tracking-vision-chip has been developed together with a commercial company. This chip achieves very fast image processing through pixel-parallel operations on the sensor array, with a frame rate of about 1 kHz depending on the lighting conditions. The vision-chip consists of an array of processing elements (PEs). Each processing element has one bit of memory that is used as a mask; all mask-bits together define the window in which the object is tracked. The vision-chip implements the self-windowing tracking algorithm on a binary image in hardware.
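The self-windowing idea can be sketched in plain Python on small binary frames. On the chip, each processing element's single mask bit holds the window; here a 2-D list of 0/1 values stands in for the PE array, and the function names are illustrative, not the chip's real interface. The sketch assumes the window for the next frame is the one-pixel dilation of the target region extracted in the current frame:

```python
def dilate1(img):
    """Grow every set pixel by one pixel (8-neighborhood)."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if img[y][x]:
                for yy in range(max(0, y - 1), min(h, y + 2)):
                    for xx in range(max(0, x - 1), min(w, x + 2)):
                        out[yy][xx] = 1
    return out

def self_window_step(window, frame):
    """One tracking step: extract the target as (frame AND window), then
    dilate the target by one pixel to form the window for the next frame."""
    h, w = len(frame), len(frame[0])
    target = [[frame[y][x] & window[y][x] for x in range(w)] for y in range(h)]
    return target, dilate1(target)
```

Because the window grows only one pixel per step, an object that moves at most one pixel between frames stays inside its window while unrelated objects elsewhere in the image are masked out; at a frame rate of about 1 kHz this still covers fast real-world motion.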

This research project developed an approach that uses this specialized tracking-vision-chip to realize a high-speed emergency-stop for safe human-robot-coexistence.

The realized approach uses the ability of the vision-chip to perform pixel-parallel masking and fast summation operations on binary images to detect whether the robot and the human are too close. To validate the developed algorithm, experiments were performed both in simulation and in an experimental robot-cell realized within the project.
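The closeness test can be sketched from the three operations the text relies on: pixel-parallel dilation, masking (AND), and global summation over a binary image. This is a minimal illustration under those assumptions, not the project's actual implementation; the helper names and image handling are invented for the example:

```python
def dilate(img, d):
    """Grow every set pixel by d pixels in each direction (square window)."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if img[y][x]:
                for yy in range(max(0, y - d), min(h, y + d + 1)):
                    for xx in range(max(0, x - d), min(w, x + d + 1)):
                        out[yy][xx] = 1
    return out

def too_close(robot_img, human_img, d):
    """Emergency-stop condition: the sum over (dilated robot AND human) is
    nonzero, i.e. some human pixel lies within d pixels of the robot."""
    grown = dilate(robot_img, d)
    h, w = len(grown), len(grown[0])
    overlap = sum(grown[y][x] & human_img[y][x]
                  for y in range(h) for x in range(w))
    return overlap > 0
```

The dilation parameter d acts as a safety margin: a larger d stops the robot earlier, at the cost of more false alarms when human and robot merely pass near each other.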

The approach presented above was implemented on a smart-camera using the tracking-vision-chip. The emergency-stop program ran on an 8-bit AVR microcontroller clocked at 3.6 MHz with 8 KB of flash memory and 512 bytes each of internal RAM and EEPROM. The support chips provided an additional 6 KB of external RAM, of which 4 KB were used as the framebuffer for a grayscale image. A PC communicated over serial RS-232 connections with both the robot and the camera.
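A quick consistency check on the stated memory budget, assuming one byte per pixel for the grayscale framebuffer and a square image (both assumptions; the text does not give the pixel format or resolution):

```python
FRAMEBUFFER_BYTES = 4 * 1024   # 4 KB of the 6 KB external RAM
BYTES_PER_PIXEL = 1            # assumed grayscale format

pixels = FRAMEBUFFER_BYTES // BYTES_PER_PIXEL
side = int(pixels ** 0.5)      # side length under the square-image assumption
# 4096 pixels: a 64 x 64 grayscale frame fits the buffer exactly
```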

During the semi-simulation experiments, the PC ran an animation showing a simple scene of a sketched robot and a sketched human together in the same workspace. During this animation, a collision between robot and human occurred. This animation was displayed on a screen monitored by the camera. If the camera detected a collision, it sent a message to the PC. No dilation was used during these experiments.

As the initial semi-simulation experiments showed that the algorithm worked as intended, a small experimental robot cell was built. The same camera used in the simulation experiments was placed to monitor the experimental area. The scenario models a robot helping a human worker with an assembly, e.g. by bringing parts; the robot should not collide with the hands of the human.

For an exposure time of 800 µs and a dilation parameter d of 10, around 500 messages per second were achieved. This rate was high enough to allow for a virtually immediate stop of the robot once a collision was detected.
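As a worked example of what 500 messages per second means for reaction distance: the message rate is taken from the experiment, while the tool speed below is an assumed illustrative value, not a figure from the project.

```python
MESSAGE_RATE_HZ = 500
TOOL_SPEED_M_PER_S = 1.0  # assumed robot tool speed, for illustration only

interval_s = 1.0 / MESSAGE_RATE_HZ                 # time between checks: 2 ms
travel_mm = TOOL_SPEED_M_PER_S * interval_s * 1e3  # motion between checks
# at 1 m/s the robot moves at most 2 mm between successive collision checks
```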




  1. Ebert, D., Komuro, T., Namiki, A., Ishikawa, M.: "Safe Human-Robot-Coexistence: Emergency Stop Using a High-speed Vision Chip." In: Proceedings of the Robotics Society of Japan, September 15-17, 2004.
Ishikawa Watanabe Laboratory, Department of Information Physics and Computing, Department of Creative Informatics,
Graduate School of Information Science and Technology, University of Tokyo
Copyright © 2008 Ishikawa Watanabe Laboratory. All rights reserved.