Robomecium - a paramecium avatar for fun and work

Naoko Ogawa, Kyouhei Kikuta, Hiromasa Oku, Takeshi Hasegawa, Alvaro Cassinelli and Masatoshi Ishikawa

Ishikawa Komuro Laboratory / Meta Perception & Dynamic Image Control Groups / University of Tokyo


Introduction

We present a system that enables humans to interact with a micro-organism through a (macroscopic) robotic avatar... of the micro-organism. This robotic avatar not only precisely mimics the motion of the microscopic living creature (here a Paramecium caudatum), but it also opens a two-way communication channel between humans in their macro-world and the paramecium in its micro-world.

In other words, what we are proposing here is a prototype zoom-existence system, in contrast with telexistence systems, which only "teleport" the user's perception and motor capacity through a more or less complex robotic avatar, and also in contrast with the microscope, which only opens a unidirectional channel. Again, in this work we aim at magnifying the object to act upon (i.e., the micro-world) to human dimensions, instead of miniaturizing humans as in Asimov's science-fiction novel "Fantastic Voyage".

A platform for Research ...

We look at this work both as a research project and as a Media Art installation. From the first point of view, our aim is to explore real-world-oriented interaction mechanisms bridging the gap between the micro- and macro-worlds. In this sense, we seek to go beyond the unidirectional, limiting paradigm of the microscope. It is interesting to note that while some robot-assisted surgery systems do provide haptic feedback from the micro-world, in this work we focus on reproducing (a scaled-up model of) a living micro-organism, one that not only passively reacts to the forces sent down to the micro-world by the human user, but can also move at will (and perhaps one day impose its own desires on the macro-world of humans).

Central to this approach is the microscopic visual feedback (MVF) technique developed in our lab [1]. MVF relies on high-speed vision as the sensing mechanism to observe and interact with microscopic objects. It has three advantages: high precision, high speed, and no physical contact.

... an interactive Media Art installation

On the other hand, as a Media Art installation, we believe this work has the potential to raise awareness of an invisible yet pervasive world, a world filled with microscopic living creatures whose behavior may be fairly complex and which are, at every moment, interacting with us.

Similarly to the Metazoa Ludens project [2], we seek to demonstrate the possibilities of interacting and playing with animals (or even microscopic living organisms) and to show people the level of their more or less simple "reactive" intelligence. Just as many people enjoy walking their dogs, perhaps one day we will also enjoy walking our paramecia.


Proposed interaction scenario

Interaction takes place in a playground area (about 2 x 2 meters) representing a scaled-up, Petri-dish-like space (actually, a liquid chamber where the paramecium can swim freely). Paramecia and humans can freely interact in this "mixed-scale" space. In this proposal, a single Robomecium will be on the floor. The robot has range-finder, light and contact sensors; when it detects an obstacle, it reports the finding to the real paramecium, which then "decides" what action to take. This is done by energizing properly arranged micro-electrodes on the glass slide where the paramecium lives and swims (for details, see [3]). The paramecium can detect these artificial electric fields and move towards or away from the electrodes, depending on their polarity (this phenomenon is called galvanotaxis).
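
The mapping from a detected obstacle to an electrode command is not spelled out above, so the following minimal Python sketch only illustrates the idea: the robot's sensor reading is encoded as a field on the two orthogonal electrode pairs, and the paramecium's galvanotactic response becomes its "decision". All names, the trigger distance and the sign convention are assumptions, not the actual system code.

```python
# Illustrative sketch only: turn an obstacle detected by the robot into a
# galvanotaxis stimulus.  ElectrodeCommand, the 0.3 m trigger distance and the
# "push the cell away from the obstacle" convention are all assumptions.
import math
from dataclasses import dataclass

@dataclass
class ElectrodeCommand:
    """Polarity (+1, -1 or 0) applied to each of the two orthogonal electrode pairs."""
    x_pair: int
    y_pair: int

def stimulus_for_obstacle(bearing_deg: float, distance_m: float,
                          trigger_m: float = 0.3) -> ElectrodeCommand:
    """Encode an obstacle (bearing relative to the robot heading, range in metres)
    as an electric field whose galvanotactic effect nudges the cell away from it."""
    if distance_m > trigger_m:                 # nothing close enough: no field applied
        return ElectrodeCommand(0, 0)
    away = math.radians(bearing_deg + 180.0)   # direction opposite to the obstacle
    return ElectrodeCommand(x_pair=round(math.cos(away)),
                            y_pair=round(math.sin(away)))

if __name__ == "__main__":
    print(stimulus_for_obstacle(bearing_deg=45.0, distance_m=0.1))
    # -> ElectrodeCommand(x_pair=-1, y_pair=-1)
```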

Since the robotic avatar and the paramecium are not on the same continent, the sensor data and the paramecium's response data are actually sent over the Internet - all the way from the exhibition space in Los Angeles to the laboratory basement at the University of Tokyo, where the paramecium lives and is observed through a modified microscope [4]. Thus, the proposed scenario will probably be the first demonstration of an intercontinental zoom-existence system.
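
The wire format of this intercontinental link is not given in the text; the snippet below merely sketches one plausible pair of plain-text datagram payloads (sensor events going to Tokyo, tracking samples coming back), which the later sketches reuse.

```python
# Hypothetical plain-text datagram payloads for the Los Angeles <-> Tokyo link
# (the actual wire format of the installation is not specified in this text).
#
#   Exhibition -> Laboratory, one sensor event per packet:
#       "SENSOR <bearing_deg> <distance_m>"        e.g. b"SENSOR 45.0 0.12"
#   Laboratory -> Exhibition, one tracking sample per packet:
#       "TRACK <x_um> <y_um> <vx_um_s> <vy_um_s>"  e.g. b"TRACK 310.5 122.0 -80.0 15.0"

def parse_track(payload: bytes) -> tuple[float, float, float, float]:
    """Decode a Tokyo -> Los Angeles tracking packet into (x, y, vx, vy)."""
    kind, *fields = payload.decode("ascii").split()
    if kind != "TRACK" or len(fields) != 4:
        raise ValueError(f"unexpected packet: {payload!r}")
    x, y, vx, vy = map(float, fields)
    return x, y, vx, vy
```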


Hardware and System Architecture

On the exhibition side:

A small personal computer running Processing receives data from the computer in Tokyo (TCP/IP or UDP over the Internet) and also communicates wirelessly with the robot (serial protocol over Bluetooth). The robot's micro-controller is an Arduino Bluetooth board. The board, motors and sensors (contact and light sensors) are powered by rechargeable lithium-polymer batteries.
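
The exhibition-side software is a Processing sketch; the hedged Python sketch below only illustrates the same bridging role - receive tracking packets from Tokyo over UDP and forward a scaled target to the robot over the Bluetooth serial link. The UDP port, Bluetooth device name, scale factor and "GOTO" command line are invented for illustration.

```python
# Sketch of the exhibition-side bridge (the installation itself uses Processing).
# Port, device name, scale factor and the "GOTO" command are assumptions; the
# packet format follows the hypothetical "TRACK" payload sketched earlier.
import socket
import serial  # pyserial

UDP_PORT = 9000              # assumed port for tracking packets from Tokyo
SCALE = 2.0 / 600.0          # assumed: ~600 um chamber mapped onto a ~2 m playground

def run_bridge(bt_port: str = "/dev/rfcomm0") -> None:
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", UDP_PORT))
    robot = serial.Serial(bt_port, 115200, timeout=1)   # Bluetooth serial link
    while True:
        payload, _addr = sock.recvfrom(1024)
        kind, *fields = payload.decode("ascii").split()
        if kind != "TRACK":
            continue
        x_um, y_um, _vx, _vy = map(float, fields)
        # Scale micro-world coordinates (um) to playground coordinates (m) and
        # hand them to the robot's micro-controller as a target waypoint.
        robot.write(f"GOTO {x_um * SCALE:.3f} {y_um * SCALE:.3f}\n".encode("ascii"))

if __name__ == "__main__":
    run_bridge()
```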

We have designed two Robomecium prototypes. One is based on an e-puck rover (image on the right). The other is a new design based on a three- or four-wheeled vehicle enclosed in a spherical shell (image on the left). The sphere will be made of diffusing plastic and will glow from inside, indicating different states of activity. It can also be covered with a soft fur representing the cilia (in the future, the sphere's volume could change too, using an inflatable rubber ball, a small compressed-air tank, and an escape valve somewhere on the surface of the ball).

Hardware back in Tokyo:

A "high-speed vision system" [1] sits on top of a conventional microscope. A windows computer captures and process images in real time (1kHz frame rate), and sends the data to a Linux computer in charge of controlling the microscope XY stage. This is an example of microscopic visual feedback: a closed-loop mechanism is used to track the paramecia in real time. The image on the left is a picture of the whole system, while the image on the right is a close up of the liquid chamber where the paramecia lives (the confinement is done with glass-slides). Four electrodes can be seen, used to generate two orthogonal electrical fields, stimulating the paramecia through galvanotaxis. The tracking data (position and speed) is also sent through UDP-IP to the personal computer running at the exhibition space.


Prototype demonstration (using the e-puck-based rover)

Click on the image below to launch a video of a functional prototype of the Robomecium system [WMV, 50 MB]:


References and related works

[1] For more information on this and related projects, visit the Microscopic Visual Feedback group webpage at the Ishikawa-Namiki-Komuro laboratory.

[2] Tan, R. T., Todorovic, V., Andrejin, G., Teh, J. K. and Cheok, A. D.: Metazoa Ludens, Proceedings of the 2006 ACM SIGCHI International Conference on Advances in Computer Entertainment Technology (ACE '06, Hollywood, California, June 14-16), ACM Press, New York, NY, p. 89 (2006).

[3] Ogawa, N., Oku, H., Hashimoto, K. and Ishikawa, M.: Microrobotic Visual Control of Motile Cells using High-Speed Tracking System, IEEE Trans. Robotics, Vol. 21, No. 4, pp. 704-712 (2005).

[4] Oku, H., Ogawa, N., Hashimoto, K. and Ishikawa, M.: Two-Dimensional Tracking of a Motile Micro-organism Allowing High-Resolution Observation with Various Imaging Techniques, Review of Scientific Instruments, Vol. 76, No. 3 (2005).

[5] Ogawa, N., Kikuta, K., Oku, H., Hasegawa, T., Cassinelli, A. and Ishikawa, M.: Proposal for Real-World-Oriented Interaction System with Microorganisms and Its Preliminary Study, IPSJ Journal, Vol. 49, No. 10 (Oct. 2008) [PDF, 628 KB].