
iCub platform and simulation software (Deliverable 1.1) Page 1 of 19

Grant Agreement no. 214668

ITALK Integration and Transfer of Action and

Language Knowledge in robots

Seventh Framework Programme (FP7) Cognitive Systems, Interaction, Robotics (ICT-2007.2.1)

Deliverable 1.1 Testing of extended iCub platform and

simulation software for action and language learning experiments

Start Date: 2008-03-01 Duration: 48 months

Project coordinator name: Angelo Cangelosi
Project coordinator organization name: University of Plymouth
Tel: +44-1752-5886217 Fax: +44-1752-586229
E-mail: [email protected]
Project website address: http://italkproject.org/


1 Executive Summary

ITALK is a research project aiming to develop artificial embodied agents able to acquire complex behavioural, cognitive, and linguistic skills through individual and social learning. The main theoretical hypothesis behind the project is that the parallel development of action, conceptualisation and social interaction enhances language capabilities, which in turn enrich cognitive development.

In order to demonstrate the achievement of the project objectives and the validity of the underlying hypotheses, the project partners have committed to carrying out a number of experiments on simulated and physical cognitive robots. In particular, the iCub, an open-source humanoid robot, was chosen as the physical platform to be shared among partners in order to validate and test the project outcomes.

Deliverable D1.1 reports the project activities dedicated to the development and testing of simulated and physical robots suitable for pursuing the project aims, with specific attention to action and language learning experiments. The deliverable is divided into two parts: the first focuses on the physical robot (i.e. the iCub), the second on the simulator.

In the first part, activities have mainly been directed at designing, developing and building copies of an enhanced iCub for language learning and object manipulation experiments. A considerable effort has gone into the production of four iCub platforms to be distributed within the ITALK consortium. On the software side, speech recognition/understanding modules have been integrated and evaluated on the robot. In the following periods this WP will extend the platform with manipulation skills.

In the second part, an important contribution of the first year of the project was the development of the first prototype of the open-source iCub simulator. This simulator magnifies the value a research group can extract from the physical robot by making it more practical to share a single robot between several researchers. The iCub simulator is designed to be as similar as possible to the real physical robot, especially from a sensori-motor point of view, which is ideal for the ITALK research objectives; it is less faithful from a dynamical point of view.


2 Table of Contents

1 Executive Summary ...................................................................................................................................... 2

3 Enhanced ICUB for language learning and object manipulation experiments .................................. 4

3.1 ICUB hardware modules.................................................................................................. 4

3.2 YARP software modules.................................................................................................. 5

3.2.1 Acquiring, rendering and broadcasting audio data...................................................... 7

3.3 ITALK automatic speech recognition .............................................................................. 9

4 Production of robots....................................................................................................................................10

5 ITALK Simulator ...............................................................................................................................11

5.1 ITALK simulator control board interface ....................................................................... 17

References..............................................................................................................................................................18


3 Enhanced ICUB for language learning and object manipulation experiments

In consideration of the ITALK goals, objectives and methodologies, the iCub humanoid robot is probably the most suitable platform given its dexterity (53 degrees of freedom, 18 in the hands, 6 in the head), sensorization (2 cameras, 4 force/torque sensors) and open-source nature (GPL licence on both software and hardware). However, at the beginning of ITALK the iCub was missing a few hardware and software components necessary to work on speech and towards the goals of the project. These components have been developed during the first year of ITALK, both in hardware and software, with the overall objective of improving the iCub platform in compliance with the project goals. Most of the effort has been directed at the integration of speech recognition/understanding software modules and their evaluation on the physical robot. This activity has been articulated into the subparts listed hereafter:

• integration of two embedded microphones in the iCub head,
• development of custom hardware for amplifying/conditioning the microphone signal,
• development of suitable software modules for acquiring either embedded (i.e. iCub head) or external (e.g. experimenter headset) microphones,
• definition of suitable interfaces for integrating speech recognition/synthesis software with the iCub software infrastructure, i.e. YARP (http://eris.liralab.it/yarp/),
• integration of speech recognition software.

3.1 ICUB hardware modules

One of the first activities was devoted to the identification of suitable microphones to be mounted on the iCub head. This activity was conducted in parallel with the design of suitable custom hardware for amplifying/conditioning the microphone signals. Although we did not expect the embedded microphones to be useful for speech (the typical S/N ratio is too low), they can be used for spatial sound localization and human-robot interaction; this was already decided at design time. The embedded microphones will be primarily used in simple auditory tasks (e.g. attention sharing, gazing at a salient sound) and not during complex linguistic interactions where high-fidelity signals are preferable (e.g. ASR). The reason for this choice is mainly technical: obtaining sufficiently good signal-to-noise ratios with embedded microphones poses technological challenges which are outside the scope of the project.

The final design for the embedded microphones relies on two electret condenser microphones (AOM-4540L-R, see Figure 1) which can be directly plugged into a custom-made board, the CFW-1/1a card (http://eris.liralab.it/wiki/CFW_card), which amplifies and conditions the signals from the microphones. These boards contain additional electronics for acquiring two FireWire signals and 4 CAN bus ports. The description of these electronic devices is outside the scope of the current report and will therefore be omitted (the schematics and fabrication files of the custom-made card are available from the RobotCub CVS repository: http://www.robotcub.org/iCubPlatform). We describe instead the custom electronics designed to acquire the signals of the two integrated microphones. The microphone signals connect to the stereo line-in input available on the PC104 1 CPU board currently embedded in the iCub head (standard in most audio chipsets). This board mounts an audio chipset (ALC655 from RealTek) meeting the performance requirements for PC99/2001 systems. These performance specifications and, in particular, the

1 In the current design a PB945+ board is embedded in the iCub head. This board is distributed by embedded-logic (http://www.embedded-logic.de/).


associated line-in specifications (see Table 1) guarantee a fidelity level which has been considered compatible with the range of uses discussed above.

Figure 1. (Right panel) a picture of the microphones which have been integrated in the iCub head. (Left panel) a picture of the CFW custom board which integrates hardware for video and audio signals acquisition.

Table 1. Line-in performance requirements for PC99/2001 systems.

Feature (digital recording, A-D-PC, for line input)    Requirement
Frequency response, 44.1 kHz destination               20 Hz to 17.6 kHz (2)
Frequency response, 48.0 kHz destination               20 Hz to 19.2 kHz (2)
Pass-band ripple                                       < ±0.5 dB
Dynamic range                                          ≥ 70 dB FS A (1, 2)
THD+N (–3 dB FS)                                       ≤ –60 dB FS (2) (input-referenced)

(1) Decibels relative to full scale (FS), measured using “A weighting” filters.
(2) For mobile PCs, the dynamic range requirements are relaxed by 10 decibels (dB) FS. The total harmonic distortion plus noise (THD+N) requirements are relaxed by 10 dB FS. The required frequency response is 30 Hz to 15 kHz, measured using 3 dB corners. The cross-talk requirements are relaxed by 10 dB FS.

In order to feed the microphone signals into the stereo line-in input, a custom amplifier was designed and integrated in the CFW card. The current solution relies on a TS912 dual amplifier produced by STMicroelectronics (http://www.st.com). This device is supplied with 5 V by an additional DC-DC power supply (model IL1205S by XP Power, http://www.xppower.com), which was added after initial testing to improve the signal-to-noise ratio.

Finally, hardware for audio rendering has been integrated in the head. This hardware consists of two components: a power amplifier and a miniature speaker. The first component is a 1.1 W audio power amplifier (LM4861 from National Semiconductor), which has been successfully used to amplify the PCM output signal produced by the ALC655 audio chipset. The second component, the miniature speaker, is responsible for transducing the audio signal into actual sound. At the current stage, a few different speakers have been tested, but their integration in the iCub head is still to be realized due to difficulties in fitting an adequate speaker enclosure within the limited available space. The current plan is to use an external speaker and to proceed in parallel with the joint design of the speaker and the enclosure so as to achieve a good trade-off between rendered sound quality and available space.

3.2 YARP software modules

A second activity aimed at enhancing the iCub platform was the development of software modules for acquiring external sound sources and for synthesizing audio signals. These modules have been developed in agreement with the YARP standards [1], the iCub middleware. In particular, (1) all


software has been released with an Open Source license 2, (2) it can be compiled on different operating systems (tested so far on Linux and Windows) and (3) it uses the YARP APIs to distribute audio data on the Ethernet network for parallel processing. Based on these standards, a YARP device driver for audio synthesis and acquisition has been developed. The device accesses the low-level audio hardware using the portaudio APIs, a well-known cross-platform, Open Source audio I/O library (http://www.portaudio.com). In particular, a class named yarp::dev::PortAudioDeviceDriver was created in order to include the portaudio devices within the generic yarp::dev::DeviceDriver YARP class (refer to http://eris.liralab.it/yarpdoc/ for details). Functions for accessing audio devices have been subdivided into two subgroups defined by two different YARP interfaces: one for grabbing sounds (yarp::dev::IAudioGrabberSound) and one for rendering sounds (yarp::dev::IAudioRender). The corresponding inheritance diagram for the class yarp::dev::PortAudioDeviceDriver is shown in Figure 2, visualizing the specific interface classes and their relative dependencies.

Figure 2. A simplified structure of the YARP inheritance diagram for the yarp::dev::PortAudioDeviceDriver class and the relative dependencies with the interfaces yarp::dev::IAudioGrabberSound and yarp::dev::IAudioRender.

Both the grabber and render interfaces rely on a common data structure, called yarp::sig::Sound. This data structure represents audio signals and inherits important features from the yarp::os::Portable class, which encapsulates objects to be both “read from” and “written to” the YARP network. The corresponding inheritance diagram for the class yarp::sig::Sound is shown in Figure 3, visualizing the specific interface classes and their relative dependencies.

Figure 3. The YARP inheritance diagram for the yarp::sig::Sound class and the relative dependencies with the objects yarp::os::Portable.

2 All the YARP software is released under the GNU General Public License (GPL) version 2. The very same license applies to the software developed within the ITALK project. The copyright belongs to the ITALK consortium, while authorship can be associated with individual project partners.


Figure 4. Left: a simple audio acquisition example. Right: the console output of the audio acquisition example. Audio data are grabbed exploiting the yarp::dev::DeviceDriver class. A single sound block is acquired as a yarp::sig::Sound object. The frequency, number of samples and number of channels are then displayed.

3.2.1 Acquiring, rendering and broadcasting audio data

The overall software architecture allows acquiring sounds from an external device with a limited number of lines of code.

As a first example, the code in Figure 4 can be used to acquire a single sound buffer (stored in a yarp::sig::Sound variable) from the default portaudio device (opened as a yarp::dev::DeviceDriver) with the default acquisition settings.
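The figure's code listing is not reproduced in this transcript; the sketch below illustrates what such a minimal acquisition program could look like, using the classes named above. The exact device name and method signatures vary between YARP versions, so treat this as an illustrative reconstruction rather than the original listing.

```cpp
// Sketch of a minimal audio acquisition example (assumes a YARP build
// with portaudio support; method signatures may differ between versions).
#include <cstdio>
#include <yarp/os/Network.h>
#include <yarp/os/Property.h>
#include <yarp/dev/PolyDriver.h>
#include <yarp/dev/AudioGrabberInterfaces.h>
#include <yarp/sig/Sound.h>

int main() {
    yarp::os::Network yarp;                    // initialize the YARP network

    yarp::os::Property conf;
    conf.put("device", "portaudio");           // open the default portaudio device
    yarp::dev::PolyDriver poly(conf);
    if (!poly.isValid()) {
        fprintf(stderr, "could not open the portaudio device\n");
        return 1;
    }

    yarp::dev::IAudioGrabberSound *grabber = nullptr;
    poly.view(grabber);                        // access the sound-grabbing interface

    yarp::sig::Sound sound;
    grabber->getSound(sound);                  // acquire a single sound block

    // display frequency, number of samples and number of channels,
    // as in the console output of Figure 4
    printf("frequency=%d samples=%d channels=%d\n",
           sound.getFrequency(), (int)sound.getSamples(), (int)sound.getChannels());
    return 0;
}
```

Compiling and running the program requires a YARP name server and an audio capture device on the local machine.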

As a second example, the code in Figure 5 grabs sounds from the default audio grabber and renders the same sounds in the default audio render in a software loopback configuration.
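A loopback along these lines could be sketched as follows, again reconstructing the missing listing under the assumption that the portaudio device exposes both the grabber and the render interfaces (as the class description above suggests):

```cpp
// Sketch of the software loopback: blocks grabbed from the default
// portaudio grabber are immediately rendered back (illustrative only;
// signatures may differ between YARP versions).
#include <yarp/os/Network.h>
#include <yarp/os/Property.h>
#include <yarp/dev/PolyDriver.h>
#include <yarp/dev/AudioGrabberInterfaces.h>
#include <yarp/sig/Sound.h>

int main() {
    yarp::os::Network yarp;

    yarp::os::Property conf;
    conf.put("device", "portaudio");
    yarp::dev::PolyDriver poly(conf);
    if (!poly.isValid()) return 1;

    yarp::dev::IAudioGrabberSound *grabber = nullptr;
    yarp::dev::IAudioRender *render = nullptr;
    poly.view(grabber);                 // grabbing interface
    poly.view(render);                  // rendering interface

    yarp::sig::Sound sound;             // all audio data pass through a Sound object
    while (true) {
        grabber->getSound(sound);       // acquire one block...
        render->renderSound(sound);     // ...and play it back immediately
    }
    return 0;
}
```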

Interestingly, audio data can be transferred over an Ethernet network by simply exploiting the YARP network infrastructure. This feature is potentially very powerful when distributing computation over multiple connected machines. The code in Figure 6 (left) shows a simple sender which transmits across the network the audio data grabbed from the default portaudio grabber device. Audio data can then be received by a remote computer (see Figure 6, right) responsible for rendering the sound using the default portaudio render. All these examples can be found in the YARP software repository available at http://eris.liralab.it/yarp/specs/dox/dev/html/examples.html.
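Since Figure 6 is not reproduced here, the sender side of such a program could be sketched as below. The port name /sender follows the figure caption; everything else is an illustrative assumption about the exact API.

```cpp
// Sketch of the sender: audio blocks are grabbed from the default
// portaudio grabber and written to a yarp::os::Port named /sender.
#include <yarp/os/Network.h>
#include <yarp/os/Port.h>
#include <yarp/os/Property.h>
#include <yarp/dev/PolyDriver.h>
#include <yarp/dev/AudioGrabberInterfaces.h>
#include <yarp/sig/Sound.h>

int main() {
    yarp::os::Network yarp;

    yarp::os::Property conf;
    conf.put("device", "portaudio");
    yarp::dev::PolyDriver poly(conf);
    if (!poly.isValid()) return 1;

    yarp::dev::IAudioGrabberSound *grabber = nullptr;
    poly.view(grabber);

    yarp::os::Port port;
    port.open("/sender");               // audio data are published here

    yarp::sig::Sound sound;             // Sound is Portable, so it can be
    while (true) {                      // written to the network directly
        grabber->getSound(sound);
        port.write(sound);
    }
    return 0;
}
```

The receiver mirrors this: it opens a yarp::os::BufferedPort<yarp::sig::Sound> named /receiver, connects it to /sender via yarp::os::Network::connect, and passes each block it reads to the default portaudio render.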


Figure 5. A simple audio loopback example. Audio data are grabbed and rendered exploiting the yarp::dev::DeviceDriver class. Interestingly, all audio data are stored in a yarp::sig::Sound object.

Figure 6. Simple code for sending and receiving audio data across an Ethernet network. Left panel: the sender grabs data from the default portaudio grabber and sends it across


the network by exploiting the yarp::os::Port class (data are available at the yarp::os::Port named /sender). Right panel: the receiver reads audio data from the network exploiting the yarp::os::BufferedPort named /receiver and automatically connected to the /sender. Sounds are then rendered in the default portaudio render.

3.3 ITALK automatic speech recognition

The software architecture described in the previous section is the basis for another fundamental activity: the definition of suitable software interfaces for (easily) integrating speech recognition/synthesis modules in YARP. This task has been carried out in parallel with the integration of Sphinx (http://cmusphinx.sourceforge.net/), an Open Source, cross-platform speech recognition engine.

The main YARP software features that will have an important role in the ITALK software development are:

1. the yarp::dev::PortAudioDeviceDriver class and its integration within the generic yarp::dev::DeviceDriver class.

2. the yarp::os::Network class and the associated yarp::os::Portable.

3. the yarp::sig::Sound object and the associated yarp::os::Portable class.

All these classes facilitate software integration by defining a standard in data representation. In particular, they implicitly define a set of interfaces for:

1. audio data acquisition and software abstraction from the specific hardware;

2. data exchange across multiple machines distributed over an Ethernet network;

3. audio data exchange across modules.

In order to exploit all these potentialities, a first step has been the integration of an automatic speech recognition module. This integration has proceeded in parallel with the definition of a software interface which will simplify the future integration of different speech recognition libraries 3. The appropriateness of this interface has been tested by implementing an automatic speech recognition module based on Sphinx-3 (see the related module documentation at http://eris.liralab.it/italk/). Future plans include the implementation of a similar module based on Esmeralda [2], a speech recognition system developed within the ITALK consortium. Once again, the reason for implementing different recognition modules is the necessity of validating the abstraction and generalization capabilities of the proposed software interface.

At the current stage, the Sphinx module implements some important features which can be seen as a basic interface level. A first component is the definition of a suitable interface for the configuration parameters. In this respect, the main tool is the yarp::os::ResourceFinder class, designed to standardize module configuration. For the Sphinx recognition module, this class has been used to specify the sample rate, the acoustic model, the dictionary, the filler dictionary and the language model to be used by the Sphinx decoder.
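The configuration step described above could be sketched as follows. The parameter keys (rate, amodel, dict, fdict, lm) are illustrative assumptions, not necessarily the module's actual key names, and the ResourceFinder configuration call varies between YARP versions.

```cpp
// Sketch of reading the Sphinx module's parameters through
// yarp::os::ResourceFinder (key names are hypothetical).
#include <cstdio>
#include <string>
#include <yarp/os/ResourceFinder.h>
#include <yarp/os/Value.h>

int main(int argc, char *argv[]) {
    yarp::os::ResourceFinder rf;
    rf.configure(argc, argv);   // e.g. --from sphinx.ini --rate 16000 ...

    // look up each parameter, with a default for the sample rate
    int rate           = rf.check("rate", yarp::os::Value(16000)).asInt();
    std::string amodel = rf.find("amodel").asString();   // acoustic model
    std::string dict   = rf.find("dict").asString();     // dictionary
    std::string fdict  = rf.find("fdict").asString();    // filler dictionary
    std::string lm     = rf.find("lm").asString();       // language model

    printf("sample rate: %d Hz\n", rate);
    // ... the paths would then be passed to the Sphinx decoder initialization ...
    return 0;
}
```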

3 Open Source software is usually very dynamic and tends to change in terms of functions and functionalities. Quite often new software appears and old code stops being developed. For this reason, when developing software which integrates different modules, it is usually convenient to adopt a software structure which does not rely on specific implementations of the individual modules (e.g. a specific automatic speech recognition library). In this respect, library wrappers (i.e. software interfaces) are an efficient way to abstract from specific implementations.


A second component is the audio data format to be supplied to the speech recognition module. The current implementation uses the yarp::sig::Sound object, which allows the module to switch easily from locally supplied data (i.e. audio data grabbed from the microphone attached to the machine where the module is running) to remotely supplied data (i.e. audio data grabbed by a remote machine connected to the same network). Finally, a third interface level concerns the way recognized words are represented and communicated to the other software modules. Once again, the chosen solution exploits the YARP network infrastructure. In particular, recognized sequences of words are stored in a yarp::os::impl::String and sent across the network inside a yarp::os::Bottle object.
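Publishing a recognized word sequence in a Bottle could look roughly like this; the output port name is a hypothetical example, not the module's documented name.

```cpp
// Sketch of broadcasting a recognized sentence as a yarp::os::Bottle
// (the port name /asr/words:o is illustrative).
#include <yarp/os/Network.h>
#include <yarp/os/Bottle.h>
#include <yarp/os/BufferedPort.h>

int main() {
    yarp::os::Network yarp;

    yarp::os::BufferedPort<yarp::os::Bottle> out;
    out.open("/asr/words:o");              // hypothetical output port

    yarp::os::Bottle &b = out.prepare();
    b.clear();
    b.addString("grasp the red ball");     // the recognized word sequence
    out.write();                           // broadcast to connected readers

    out.close();
    return 0;
}
```

A consumer module would read the Bottle from a connected port and extract the string with get(0).asString().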

Remarkably, a complete interface definition should abstract from the specific library used for performing the speech recognition task. In this sense, we are planning to further modify the speech recognition module by defining a suitable library wrapper that abstracts from the specific function names, data types and data processing structure. This interface level has not been implemented yet; as a consequence, the Sphinx-based recognition module depends heavily on the Sphinx library structure. Future work will focus on developing a suitable wrapper to create a generic recognition module independent of the specific recognition library used for audio data processing.

4 Production of robots

A relevant part of the ITALK activities concerned the production of four new iCub platforms (Figure 7), to be delivered to and used by different partners in the ITALK consortium: PLYM, BIEL, CNR and UH respectively. These groups will have the possibility of implementing and testing their scientific hypotheses on the iCub, thus collaborating closely with IIT (currently owning two iCub robots) in the implementation of the ITALK results on physical robots. During this reporting period, a great deal of effort was directed to the realization of these four platforms. This activity is considered crucial, and potentially risky if not achieved on time.

Figure 7. Four iCubs in the IIT production laboratory. Three of these robots are the ITALK copies under construction.


Currently, three iCub platforms (PLYM, BIEL and HERTS) have been delivered, and the fourth robot is under final testing and expected to be at CNR in May 2009. This guarantees that the four platforms will be delivered well before M18 of the project. The schedule is slightly delayed with respect to the original plan but well within our contingency plan, with the availability of a fully compatible iCub software simulator (see the following sections) mitigating possible further delays. In particular, the partners had the opportunity to start their implementations on the simulator while waiting for the real robot, and to visit the IIT laboratories in Genoa for training. Students from the ITALK consortium attended the RobotCub summer school in 2008.

5 ITALK Simulator

An important contribution of the first year of the project was the development of the first prototype of the open-source iCub simulator, developed in close collaboration between PLYM, IIT and CNR. A simulator for the iCub robot magnifies the value a research group can extract from the physical robot by making it more practical to share a single robot between several researchers. A further important asset is that the simulator is free and open, which makes it accessible to people interested in the robot who want to begin learning about its capabilities and design. Combined with an easy “upgrade” path to the actual robot, due to the protocol-level compatibility of the simulator and the physical robot, this makes the tool an ideal platform for the ITALK research objectives. For those without the means to purchase or build a humanoid robot, such as small laboratories or hobbyists, the simulator opens a door to participation in this area of research. The simulator has been released in two versions: a basic version, which has already been tested by some of the project partners and by other laboratories, and an extended version with additional features (see below) which will be tested in the next part of the project.

Computer simulations play an important role in robotics research. Although a simulation might not provide a full model of the complexity present in the real environment, and might not assure a fully reliable transfer of the controller from the simulated environment to the real one, robotic simulations are of great interest for cognitive scientists [3]. There are several advantages of robotic simulations for researchers in the cognitive sciences. The first is that simulating robots with realistic physical interactions permits the study of the behaviour of several types of embodied agents without facing the problem of building in advance, and maintaining, a complex hardware device. The computer simulator can be used as a tool for testing algorithms in order to quickly check for any major problems prior to use of the physical robot. Moreover, simulators allow researchers to experiment with robots with varying morphological characteristics without the need to develop the corresponding features in hardware. This, in turn, permits the discovery of properties of an agent's behaviour that emerge from the interaction between the robot's controller, its body and the environment. Another advantage is that robotic simulations make it possible to apply particular algorithms for creating robot controllers, such as evolutionary or reinforcement learning algorithms. The use of robotic simulation drastically reduces the duration of experiments, for instance in evolutionary robotics. In addition, it makes it possible to explore research topics like the co-evolution of morphology and the control system.

The simulated iCub robot is composed of multiple rigid bodies connected via joint structures. It has been constructed using data collected directly from the robot design specifications in order to achieve an exact replication (e.g. height, mass, degrees of freedom (DoF)) of the first iCub prototype developed at the Italian Institute of Technology in Genoa. The environment parameters, such as gravity, object mass, friction and joints, are based on known environment conditions.


The current version of the iCub simulator is ideal for the purpose of the ITALK research plans as it incorporates a large number of sensory modalities and features, such as:

• stereo vision,
• torque/force sensors,
• touch sensors (both hands),
• an emotion interface,
• the ability to modify the environment as required, and
• the ability to project external media on any type of object (e.g. screen/wall) as a stream.

The iCub simulator is therefore designed to be as similar as possible to the real physical robot, especially from a sensori-motor point of view, and attempts to reproduce, as accurately as possible, the physics and the dynamics of the robot and its environment.

Open source libraries have been used in the implementation of the simulator in order to allow free distribution to any researcher (even outside ITALK), without the necessity of purchasing restricted or expensive proprietary licenses. The basic version of the iCub simulator uses ODE [7] for simulating rigid bodies and the collision detection algorithms to compute the physical interactions with objects. The same physics library is used by the Gazebo project and the Webots commercial package. ODE is a widely used physics engine in the open source community, whether for research, authoring tools or gaming. It consists of a high-performance library for simulating rigid body dynamics using a simple C/C++ API. ODE was selected as the preferred open source library for the iCub simulator because of the availability of many advanced joint types and rigid bodies (with many parameters such as mass, friction and sensors), and its accuracy in real-time performance. Although ODE is a good and reliable physics engine, computing all the physical interactions of a complex system can take a good deal of processing power. Since ODE uses a simple rendering engine based on OpenGL, it has limitations for the rendering of complex environments comprising many objects and bodies, which can significantly affect the simulation speed of complex robotic experiments. It was therefore decided to use OpenGL directly, combined with SDL [15], an open source cross-platform multimedia library. This makes it possible to render the scene with much more ease and to carry out computationally efficient simulation experiments.

As the aim was to create a reasonable replica of the physical iCub robot, the same software infrastructure and inter-process communication had to be used as those used to control the physical robot. The iCub uses YARP [1, 12] (Yet Another Robot Platform) as its software architecture. YARP is an open-source software tool for applications that are real-time, computation-intensive, and involve interfacing with diverse and changing hardware. The simulator and the actual robot have the same interface, whether viewed via the device API or across the network, and are interchangeable from a user perspective. The simulator, like the real robot, can be controlled directly via sockets and a simple text-mode protocol; use of the YARP library is not a requirement. This can provide a starting point for integrating the simulator with existing controllers in esoteric languages or complicated environments.
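As a concrete illustration of the text-mode protocol, the sketch below sends a world command to the simulator through a YARP RPC port. The port name /icubSim/world and the command syntax follow the simulator's world interface as we understand it, but should be treated as assumptions to be checked against the simulator documentation; the same text can in principle be sent over a raw socket without the YARP library.

```cpp
// Sketch: asking the simulated world to create an object via a text
// command (port name and command syntax are assumptions).
#include <cstdio>
#include <yarp/os/Network.h>
#include <yarp/os/RpcClient.h>
#include <yarp/os/Bottle.h>

int main() {
    yarp::os::Network yarp;

    yarp::os::RpcClient world;
    world.open("/demo/world:o");
    yarp::os::Network::connect("/demo/world:o", "/icubSim/world");

    // create a box (size, position, colour) the robot can look at
    yarp::os::Bottle cmd, reply;
    cmd.fromString("world mk box 0.1 0.1 0.1 0.0 0.9 0.5 1 0 0");
    world.write(cmd, reply);
    printf("simulator reply: %s\n", reply.toString().c_str());
    return 0;
}
```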


iCub platform and simulation software (Deliverable 1.1) Page 13 of 19

Figure 8. The architecture of the simulator with YARP support. User code can send information to, and receive information from, both the simulated robot itself (motors/sensors/cameras) and the world (to manipulate it). Network wrappers allow device remotisation: the network wrapper exports the YARP interface so that it can be accessed remotely from another machine.

The iCub simulator has been created using data from the physical robot in order to have an exact replica of it. Like the physical iCub, the simulated robot is around 105 cm tall, weighs approximately 20.3 kg and has a total of 53 degrees of freedom (DoF): 12 controlled DoFs for the legs, 3 for the torso, 32 for the arms and hands, and 6 for the head. The robot body model consists of multiple rigid bodies attached through a number of different joints. All the sensors present on the physical body, such as touch sensors and force/torque sensors, were implemented in the simulation. As many factors affect the torque values during manipulation, the simulator is not guaranteed to be perfectly accurate. However, the simulated robot's torque parameters, and their verification in static and dynamic conditions, provide a good basis and have been shown to be reliable [13].

(a) (b)


(c) (d)

Figure 9. Photo of the real iCub (a); the simulated iCub and its binocular view (b); the simulated iCub moving all four limbs as part of a demo (c); and the simulated iCub looking at and manipulating an object in its environment (d).

Figure 10. The iCub simulator attempting to reach a green bottle that was displayed on the virtual screen.

All the commands sent to and received from the robot are based on the YARP protocols. To simulate vision, we use virtual cameras located in the eyes of the robot; the resulting images can be sent to any computer in the robot network using YARP for parallel processing of visual data. The system can interact fully with the world/environment. The objects within this world can be dynamically created, modified and queried by simple instructions resembling those that YARP uses to control the robot. The current version of the basic iCub simulator has been used for preliminary testing by partners in the RobotCub and ITALK projects. For example, some preliminary experiments have looked at human-robot interaction through the use of the simulator [14]. These experiments exploit the simulated iCub's capability to track an object live (see Figure 10). In addition to being used for experiments on the development of controllers for the iCub robot, some groups have used the simulator to create a mental model [15] used by the robot to represent the current state of the environment.
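As an illustration, the world port of the basic simulator accepts simple text instructions of the following form. The commands below follow the syntax described for the basic iCub simulator [14]; the exact object names, parameter order and units may differ between versions and should be checked against the simulator documentation.

```
world mk box 0.1 0.1 0.1 0.0 0.6 0.5 1 0 0   create a red 10 cm box at (x, y, z)
world mk sph 0.04 0.0 0.9 0.5 0 1 0          create a green sphere of radius 4 cm
world set box 1 0.0 0.7 0.5                  move the first box to a new position
world get box 1                              query the current position of the box
world del all                                remove all user-created objects
```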

The extended version of the simulator provides the same features as the basic simulator described above, but is based on a modular software architecture that allows it to support different physics engines (e.g. ODE and Newton) and to handle the simulation of two or more iCubs located in the same environment (see Figure 11, left). This feature can be crucial for performing experiments on social learning.


The possibility of supporting different physics engines represents a potential advantage, since different dynamics engines use different algorithms to approximate rigid-body dynamics and thus tend to vary in terms of accuracy and speed. Moreover, different dynamics engines often differ with respect to the improvements introduced in successive releases and in the level of support provided. The current version of the software has been interfaced with the Newton physics engine, which represents one of the most promising alternatives to ODE. Indeed, Newton supports multi-core processors, which might provide significant speed improvements in the case of complex simulations (e.g. simulations including two or more iCubs). Newton also includes ellipsoid objects as primitive shapes and can simulate collisions among any kinds of shapes without restriction. Finally, Newton includes a very accurate algorithm for simulating static and dynamic friction. The extended version of the simulator also offers a more advanced graphical visualisation of the 3D scene, with the possibility of rendering the kinematic structures (including the frame and the current position of the joints) and of visualising the velocities and forces applied to selected objects (see Figure 11, right).

(a) (b)

Figure 11. (a) An example of how the software handles two or more simulated iCubs; both iCubs are controllable via their YARP interfaces over the network. (b) A screenshot of the joint visualisation tool: the cyan cylinders represent the joints, the red arcs represent the joints' limits, the green lines over the red arcs represent the current positions of the joints, and the arrows indicate the local joint frames.

To achieve a high level of modularity in the extended version of the iCub simulator, the source code has been split into two components: the worldsim library, consisting of C++ classes, and the iCubSimulator application, which provides basic functionality for viewing and interacting with the simulated iCub. The worldsim library has been designed to offer a common set of C++ classes for manipulating the simulated physical world which are independent of the actual physics engine used (ODE, Newton, etc.), and which are integrated with YARP.

In detail, the class World represents a simulated environment into which both physical and non-physical objects (WObject) can be inserted. Figure 12 shows the WObject hierarchy; all subclasses of PhyObject represent physical entities in the World, such as a cube or a sphere. PhyCompoundObject allows the creation of complex objects by combining basic ones.


Figure 12. The inheritance diagram of WObject, the basic class representing simulated objects. Classes in the left-hand hierarchy represent objects with physics properties, while those on the right are integrated with the YARP framework.

PhyObject and its subclasses are wrapper classes around the actual structures used by the physics engines to represent rigid bodies. Other classes in the worldsim library that complete the abstraction over the various engines are PhyJoint, PhyDOF and MaterialDB. The last of these represents materials with their friction, elasticity and softness coefficients, and configures the underlying physics engine to achieve what the user has defined. The PhyJoint and PhyDOF classes allow the construction of articulated structures, like the iCub joints. More specifically, the former represents a physical constraint between two objects (e.g. a hinge joint) and the latter represents a single degree of freedom of the joint. In this way, the motors can act on the PhyDOF information independently of the type of joint that has been defined. To add motors to the PhyDOFs of a joint, one can use the MultiMotorController class, which is interfaced with the YARP framework (Figure 13). The "multi" prefix means that this class (i) can be used to motorise more than one DOF (and more than one joint), and (ii) can be used to control the motor movements by specifying either the desired final position or the desired motor velocity. These capabilities are directly inherited by implementing the YARP interfaces for controlling motors.

(a) (b)

Figure 13. (a) The joint types present in the latest version of the worldsim library; (b) the MultiMotorController, which implements the YARP interfaces for controlling motors.

The YarpObject hierarchy (see Figure 12) is another example of classes integrated with the YARP framework in different parts of the library. The main functionality of YarpObject is to allow the easy creation of control boards (see the next subsection) and frame grabbers. Indeed, the subclasses of YarpObject exploit this integration for creating the YARP ports.


The iCub object creates the same YARP ports as the real iCub robot, while WCamera opens a YARP port that connects with the yarpviewer (which produces the 3D visualisation shown in Figure 11). Finally, the WorldController opens an RPC 4 YARP port for manipulating the simulated environment. Through this port, the user can add/remove objects (and iCub robots), move objects, apply forces to them, change object parameters, stop/pause the simulation, and query the status of any object in the world.

Future plans for simulator development mostly involve the design of functionality for modelling and interacting with the physical environment. For example, this will allow users to modify the objects in the world where the iCub resides, enabling different types of experiments. Finally, further work will focus on the systematic testing and replication of simulation studies with the physical robot.

5.1 ITALK simulator control board interface

The iCub simulator was designed to present the same user-robot (software) interfaces as the real robot. In this subsection we focus on the motor control interfaces, since they represent a sufficiently rich scenario 5 . Interfaces to motor control devices can be complex, especially in humanoid robots. Typically, control boards designed for industry come with a standard (basic) interface which provides a proportional-integral-derivative (PID) control system using position or velocity control modes. Unfortunately, many aspects become complicated when we consider programmable devices that can implement a large number of control modalities and functionalities. Furthermore, since hardware evolves quickly (especially robotic hardware, which is subject to changes in commercial and industrial environments), the software used to control it tends to become obsolete and requires re-development, which is time-consuming and laborious. In order to avoid this continuous software realignment, YARP implements a set of library wrappers designed for long-term software development, aiming at easy interfacing with different hardware components within the field of humanoid robotics. The approach YARP takes is to define interfaces for classes of devices that wrap native code APIs. Changes that occur in the hardware will therefore likely require only a change in the API calls. Hardware dependencies are thus encapsulated and can be removed from user source code by providing a "factory" for creating objects at run time. Among many other aspects, YARP defines interfaces to control families of devices and, in particular, motion control boards. An interface to a YARP device is to be understood as a software specification of the functionalities the associated hardware provides. A YARP device is a "wrapper" class which implements all the methods declared in its interface.
All details specific to the hardware are handled in the wrapper class and hidden by its interfaces. Interfaces to control boards have therefore been defined on the basis of the control paradigm they implement. Accordingly, YARP defines:

• yarp::dev::IEncoders: groups all methods providing access to the motor encoders, such as methods for reading the current position and velocity of each axis;

• yarp::dev::IPositionControl: methods to control each axis by specifying its position;

• yarp::dev::IVelocityControl: methods to control each axis by specifying its velocity;

4 RPC: Remote Procedure Call
5 All the simulator sensors and motors (including vision, the emotion interface and position sensors) have been interfaced in the same way as on the real robot. This makes it possible to run the very same code against either the simulator or the real robot, depending on need.


• yarp::dev::ITorqueControl: methods to control the amount of force/torque exerted by each axis.

These interfaces are independent of the particular algorithm the control board implements to realise the corresponding functionality. They capture the similarities among devices and permit the separation of device-dependent code from user code, including devices with different initialisation procedures and different APIs.

As noted above, the simulator and the actual robot have the same interface, whether accessed via the device API or across the network, and are interchangeable from a user's perspective; like the real robot, the simulator can be controlled directly via sockets and a simple text-mode protocol. This is achieved using the yarp::dev::RemoteControlBoard interface (see Figure 14), which acts as the client side of the control board and connects to a yarp::dev::ServerControlBoard on the server side.

Figure 14. The inheritance diagram for the YARP RemoteControlBoard.

References

[1] Metta, G., Fitzpatrick, P., & Natale, L. (2006). YARP: Yet Another Robot Platform. International Journal on Advanced Robotics Systems, Special Issue on Software Development and Integration in Robotics, March 2006.

[2] Fink, G. A. (1999). Developing HMM-based recognizers with ESMERALDA. In V. Matousek, P. Mautner, J. Ocelikova, & P. Sojka (Eds.), Lecture Notes in Artificial Intelligence, volume 1692 (pp. 229-234). Berlin Heidelberg: Springer.

[3] Ziemke, T. (2003). On the role of robot simulations in embodied cognitive science. AISB Journal, 1(4), 389-399.

[4] Kumar, S., & Bentley, P. (2003). On Growth, Form and Computers. Elsevier.

[5] Nolfi, S., & Floreano, D. (2000). Evolutionary Robotics: The Biology, Intelligence and Technology of Self-Organizing Machines. Cambridge: MIT Press/Bradford Books.

[6] Bongard, J., & Pfeifer, R. (2003). Evolving complete agents using artificial ontogeny. In F. Hara & R. Pfeifer (Eds.), Morpho-functional Machines: The New Species (Designing Embodied Intelligence) (pp. 237-258). Springer Verlag.

[7] ODE (n.d.). Open Dynamics Engine. Retrieved from http://opende.sourceforge.net/

[8] Koenig, N., & Howard, A. (2004). Design and use paradigms for Gazebo, an open-source multi-robot simulator. International Conference on Intelligent Robots and Systems.

[9] Michel, O. (2004). Webots: Professional mobile robot simulation. International Journal of Advanced Robotics Systems, 1(1), 39-42.

[10] OpenGL (n.d.). The Industry's Foundation for High Performance Graphics. Retrieved from http://www.opengl.org/

[11] SDL (n.d.). Simple DirectMedia Layer. Retrieved from http://www.libsdl.org/

[12] Fitzpatrick, P., Metta, G., & Natale, L. (2008). Towards long-lived robot genes. Robotics and Autonomous Systems, 56(1), 29-45.

[13] Nava, N., Tikhanoff, V., Metta, G., & Sandini, G. (2008). Kinematic and dynamic simulations for the design of the RobotCub upper-body structure. ESDA 2008.

[14] Tikhanoff, V., Fitzpatrick, P., Nori, F., Natale, L., Metta, G., & Cangelosi, A. (2008). The iCub humanoid robot simulator. International Conference on Intelligent Robots and Systems (IROS'08), Nice, France, pp. 22-26.

[15] Dominey, P. (2007). Sharing intentional plans for imitation and cooperation: Integrating clues from child development and neurophysiology into robotics. AISB Workshop on Imitation.

