Researchers focus on man-robot interaction

Industrial Robot

ISSN: 0143-991x

Article publication date: 1 March 2006

Citation: Kochan, A. (2006), "Researchers focus on man-robot interaction", Industrial Robot, Vol. 33 No. 2. https://doi.org/10.1108/ir.2006.04933bab.007

Publisher: Emerald Group Publishing Limited

Copyright © 2006, Emerald Group Publishing Limited


Researchers focus on man-robot interaction

Keywords: Robotics, Conference

The first e-symposium on advanced robotics was held in 2005 and is set to become an annual event, according to its organisers, the International Federation of Robotics (IFR). Taking place in cyberspace, the symposium enabled the faces and voices of speakers around the world to appear with their presentations in the offices of delegates, wherever they might be. Both speakers and delegates could, in fact, participate in the event from the comfort of their own homes or offices. No travel, no hotel rooms, no discomfort and, perhaps most importantly, no expense. But also no excitement: none of those chance meetings that from time to time result in new collaborative ventures, new business contracts or simply the exchange of ideas. However, the e-symposium was a well-run event that went almost without a hitch. And it is archived on the internet, so those who wish can experience the presentations as many times as they desire (see www.adrob.e-symposium.com).

The 2005 symposium featured an impressive line-up of international speakers who presented papers on the future directions of robotics, particularly personal and service robots. The round-table discussions also included eminent participants.

The opening keynote speaker at the e-symposium was Martin Haegele, chair of the service robotics group at the IFR and head of the robot systems department at the Fraunhofer Institute for Manufacturing Engineering and Automation (IPA).

He told the e-symposium how robots had now become a commodity, and spoke of a future that would bring robots and humans into a closely co-operating relationship.

By 2007, roughly one million robots will be in operation worldwide, Professor Haegele said. Looking back over the last few decades, he explained that the two industries that have most influenced robot development are car manufacturing and electronics assembly. Over the years, robots have not changed much in appearance, but applications in the automotive industry have made them more precise, more reliable and less costly: the unit price of a robot is 30 per cent of what it was 15 years ago. The electronics sector has had an impact on the design of robots and mostly uses SCARA machines.

Looking at the future, one preoccupation of industry today is how to bring even more flexibility to a robot or robot work cell, said Haegele. He suggested that the robot might become an aid to a manual work cell and be used like a tool. With only a very few parts needing changing to adapt to different product variants, the result would be a rapidly deployable robot work cell. Another possibility that is under discussion is to use the robot as a helper at a manual workplace to increase productivity, improve quality and raise cost-effectiveness. So the robot and human would have to work in close co-operation.

Already, industry is developing solutions where robots co-operate with each other. With one robot holding a workpiece while another applies glue, for example, it is easy to make adjustments in the event of a product change because there are no peripheral devices, said Haegele.

However, when a robot and human co-operate, additional issues have to be resolved. If, for example, a robot is used to hand parts to an operator so that he can complete precise work that a robot is unable to perform, the robot and human are working in a common work space and the safety of the operator has to be protected.

Sensors are vital for operator safety, said Haegele. With a sensor to measure the relative movement of the robot and the worker, potential collision can be avoided by slowing the robot down. Such sensors are still in the development stage. Researchers are also working on safety sensors that distinguish between a worker and a robot by analysing form, colour, structure and patterns. They would then be able to define the safety zone and adjust robot motion accordingly.
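The slow-down behaviour Haegele describes can be illustrated with a minimal sketch. The thresholds and function name here are illustrative assumptions, not from the talk; as noted above, the actual sensors are still in development.

```python
def speed_limit(distance_m, stop_dist=0.3, full_speed_dist=1.5, v_max=1.0):
    """Scale the robot's permitted speed (m/s) with the measured
    human-robot separation: stop inside stop_dist, run at full speed
    beyond full_speed_dist, and interpolate linearly in between."""
    if distance_m <= stop_dist:
        return 0.0
    if distance_m >= full_speed_dist:
        return v_max
    return v_max * (distance_m - stop_dist) / (full_speed_dist - stop_dist)
```

A real controller would feed this limit from a continuously updated separation measurement and would also account for stopping distance at the current speed.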

According to Haegele, all manufacturers are actively working on robot-robot partnership and on robot-worker co-operation. The safety issues are not resolved but the relevant ISO 10218 standard is in the process of being released. It indicates how robot motion must be monitored by a controller and how a robot must keep a safe distance from the human.

Another completely different approach that Haegele finds interesting is to use intrinsically safe kinematic devices, thus avoiding the need for costly safety sensors. Impact forces and inertia would be limited to take into account the best interests of the human. The robot would be used as a helper. It would be “taught” the workpiece geometry and the task to be performed, and would then be able to reproduce it as required.
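The idea of limiting impact forces and inertia can be made concrete with a back-of-the-envelope check (the numbers and function name are illustrative, not from the talk): given the robot's effective moving mass m and a permissible impact energy E, the maximum safe tool speed follows from E = ½mv².

```python
import math

def max_safe_speed(moving_mass_kg, energy_limit_j):
    """Maximum TCP speed (m/s) such that the kinetic energy
    0.5 * m * v**2 stays below the permissible impact energy."""
    return math.sqrt(2.0 * energy_limit_j / moving_mass_kg)

# e.g. a 10 kg effective mass with a 5 J energy limit gives 1.0 m/s
```

An intrinsically safe design keeps this bound low by reducing the moving mass itself, rather than by relying on external safety sensors.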

One topic that Haegele and his team are actively investigating is how to convey information to a robot. It is clear, he said, that for a robotic application to be flexible, the robot has to be instructed very intuitively. A European project targeting SMEs is focused on this area.

On the future of robot automation, Haegele said there is a basic requirement for robots that are portable or mobile and that can be used with very little workcell reconfiguration. Also, robot cost should be one-third of its current level. Robots in future should be able to handle high forces and heavy payloads when needed, and should support interactive instruction and problem handling instead of conventional programming. They should have force control for machining processes and 3D vision to adapt to variability in part geometry. They should withstand severe environments. Finally, immediate integration into existing environments should be possible.

The subject of robot-human co-operation was also discussed at the e-symposium by Henrik Christensen, Director of the Centre for Autonomous Systems at the Royal Institute of Technology in Stockholm, Sweden. He identified co-operation as one of the main challenges to robots moving out of their current "caged" sphere of application into new areas where man and robot have to share the same space. It is a need that arises in the manufacturing sector but even more so in the emerging markets of service and domestic robots.

When man and robot operate in a common work space, the environment has to be predictable, Christensen told the e-symposium. The robot systems have to be dependable and absolutely safe. People must feel comfortable around them. “You should not be scared about going to a supermarket because there's a robot”, he commented. Moreover, the robots have to be operated by people who are not educated in robotics, and not experienced.

As well as the technical issues of developing robot systems and sensors that are safe to work with humans, it is also necessary to consider interaction patterns, Henrik Christensen told the e-symposium. This is an area that he is investigating at the Centre for Autonomous Systems, where a normal living room has been assembled. Researchers are using it to study the interaction between a robot moving around the room and the people who use the room, to determine what is acceptable to them.

Other challenges the researchers are assessing are variability, visibility and predictability. The robot has to operate 24 hours a day, seven days a week, which involves totally different lighting patterns. It also has to handle different positions of the coffee table in the living room. And, for people to feel safe around the robot, they need to know what it is going to do next. The robot also has to be highly visible. Sensors will play a large part in the development of a solution, said Christensen.

He foresees many applications in the workplace, in the service industry and in the home. In the workplace, a robot could act as an assistant to fetch and carry, and to perform tasks involving hot or hazardous environments. In the service area, such as in a hospital, it could guide people to their destinations. Domestic uses are lawn mowing and vacuum cleaning. Commercial products for these applications already exist but they are not so easy to use, said Christensen.

Big business opportunities will arise as robots begin to emerge from their safety cages and start to share open space with humans, he believes. System integrators, in particular, should benefit as they will have a very important role to play in providing the comprehensive portfolio of competence needed to build the systems.

The interaction between human and robot was also discussed by Albert van Breemen, senior scientist at Philips Research in the Netherlands. He presented Philips' iCAT research platform to the e-symposium, indicating that this was now a platform that was available to the research community in general.

iCAT is a desktop user-interface robot with a mechanically rendered face. The face contains mechanical parts such as eyes, eyebrows, eyelids and a mouth, all controlled by RC servos. "In this way," said van Breemen, "we can create facial expressions. The face of the iCAT can appear happy, sad, angry, etc." (Plate 1). This, he added, was very important for creating social interaction between the robot and the user.

Plate 1 Philips' iCAT expresses sadness, anger, happiness, etc

What started out as an internal research project is now available to the academic community for use as a research platform. “Our goal is to help build an international research community,” commented van Breemen.

The research platform is composed of three parts: a user interface robot, a software environment for creating applications based on Open Platform for Personal Robotics (OPPR) and an infrastructure for supporting a research community.

The iCAT user interface robot has no processing on board. It has to be connected via a USB connection to a PC or laptop. The control of the face involves 11 RC servos. Body and neck movement are each controlled by DC motors.

The ears and feet feature four RGB LEDs. Four capacitive touch sensors are built into the head and the paws. A high quality webcam is integrated into the nose.

In the feet is a loudspeaker, and two on-board microphones record stereo sounds. Also included is a USB sound card and an IR proximity sensor.

The software for building applications is made up of four modules: architecture, animation, intelligence and connectivity. All are designed for ease of use. For example, the architecture module contains a dynamic module library (dml) enabling the user to build his own library of reusable software components. Each software component has an input and an output port, which makes it easy to connect them. As another example of ease of use, the scripting engine in the intelligence module enables the user to create software components without the need to program in C++ or compile source code.
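The port-based component model described above can be sketched as follows. The class and method names, and the example pipeline, are hypothetical illustrations; the actual OPPR API is not documented here.

```python
class Component:
    """A reusable software module with an input and an output port,
    in the spirit of the OPPR dynamic module library described above."""

    def __init__(self, name, func):
        self.name = name
        self.func = func          # transformation applied to incoming data
        self.downstream = None    # component wired to our output port

    def connect(self, other):
        """Wire this component's output port to another's input port."""
        self.downstream = other
        return other              # allows chained connect() calls

    def receive(self, data):
        """Process data on the input port and forward the result."""
        result = self.func(data)
        if self.downstream is not None:
            return self.downstream.receive(result)
        return result

# Hypothetical pipeline: camera -> face detector -> expression selector
camera = Component("camera", lambda frame: {"frame": frame})
detect = Component("detect", lambda d: {**d, "face": True})
express = Component("express", lambda d: "happy" if d["face"] else "neutral")
camera.connect(detect).connect(express)
```

The point of the one-input/one-output convention is that components compose without glue code: any component's output can feed any other's input.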

Van Breemen anticipates that a typical use of iCAT will be in a domestic environment. It is therefore important it should be able to receive information from the internet, he concluded.

Anna Kochan, Associate Editor, Industrial Robot