
International Journal of Humanoid Robotics World Scientific Publishing Company

MULTI-SENSORY SYNERGIES IN HUMANOID ROBOTICS

R. Andrew Russell, Geoffrey Taylor, Lindsay Kleeman and Anies H Purnamadjaja

Intelligent Robotics Research Centre, Department of Electrical and Computer Systems Engineering

Monash University, Clayton, VIC 3800, AUSTRALIA
{Andy.Russell | Lindsay.Kleeman | Geoffrey.Taylor | Anies.Purnamadjaja}@eng.monash.edu.au

Received (Day Month Year)
Revised (Day Month Year)

Accepted (Day Month Year)

Sensing is a key element for any intelligent robotic system. This paper describes current progress of a project in the Intelligent Robotics Research Centre at Monash University that has the aim of developing a synergistic set of sensory systems for a humanoid robot. Currently sensing modes for colour vision, stereo vision, active range, smell and airflow are being developed in a size and form that is compatible with the humanoid appearance. Essential considerations are sensor calibration and the processing of sensor data to give reliable information about properties of the robot's environment. In order to demonstrate the synergistic use of all of the available sensory modes, a high level supervisory control scheme is being developed for the robot. All time-stamped sensor data together with derived information about the robot's environment are organized in a blackboard system. Control action sequences are then derived from the blackboard data based on a task description. The paper presents details of each of the robot's sensory systems, sensor calibration, and supervisory control. Results are also presented of a demonstration project that involves identifying and selecting mugs containing household chemicals. Proposals for future development of the humanoid robot are also presented.

Keywords: light stripe ranging, stereo vision, electronic nose, airflow sensing, multisensor integration.

1. Introduction

Most current robotic systems are sensor-poor and only contain sufficient sensory capabilities to complete their immediate task. Thus a museum guide robot may contain odometry and beacon sensors for navigation, and ultrasonic sensors, a laser rangefinder and tactile bumpers to avoid obstacles. However, human guides are expected to maintain a much more complete awareness of their surroundings and to act appropriately. If a member of the public is smoking in the museum they should be asked to extinguish the cigarette, whereas the fire services would be required if an electrical fire breaks out in a display. In this case a range of sensory information would be required together with a good level of judgment in order to formulate an appropriate response.

It is commonly stated that humans have the five senses of sight, hearing, touch, taste and smell. However, this assessment severely underestimates the number of distinct


sensing modalities available to humans. For instance, touch is not a single sense but a wide grouping of sensory modes that can detect:

(1) contact with an object,
(2) applied force,
(3) temperature of a touched object,
(4) a skin temperature rise of 0.01°C per second, which can be detected and used to infer the proximity of a warm object,
(5) thermal properties via contact (metal feels cold, cork feels warm),
(6) close-in proximity using body hairs, and
(7) air current strength and direction (also using body hairs or the skin's thermal sense).

By broadening the view to active sensing involving the guided search of an external object, additional properties can be sensed. For example:

(8) material compliance,
(9) object weight,
(10) object articulation,
(11) object shape,
(12) surface texture, and
(13) surface friction.

In addition, there are other sensing modes, such as gravity vector information provided by the vestibular system, that are completely outside the ambit of the five classical human senses. Robot designs may be inspired by humans but are not restricted to copying them, and so further sensory modes are available such as radar, sonar, infra-red imaging, laser range-finding, infra-sound detection, and seismometers. The aim of this project is to investigate the challenges and benefits of providing humanoid robots with simultaneous access to a variety of sensory modalities and exploiting the synergies that exist between them.

2. Metal Man, the humanoid robot

In the Intelligent Robotics Research Centre at Monash University an upper-torso humanoid robot is being developed. For historical reasons this robot has been named Metal Man. Metal Man represents 'work in progress', and the major effort has been applied to developing sensory and actuator systems together with software for coordinating these resources in a synergistic manner. While building all of the component parts into an integrated structure would look attractive, it is felt that this would be inflexible and premature at this stage in the project. Metal Man's arms are approximately anthropomorphic in configuration and scale. They consist of two 6-DOF Puma 260 robots, each carrying a 1-DOF Otto Bock prosthetic hand. All signal processing and supervisory control is implemented on a dual 2.2 GHz Intel Xeon PC.

Currently Metal Man is provided with a pair of PAL cameras mounted on a Biclops pan/tilt/verge robotic head. The cameras capture stereo images at 320×240 pixel


resolution and image processing is performed at PAL frame rate (25 Hz). A laser stripe generator is mounted on the Biclops head above the cameras. The generator consists of a 5 mW red laser diode module with a cylindrical lens to produce a vertical light plane. A DC motor drives the laser about a vertical axis to scan the stripe across the scene, and the rotation angle is measured via an optical encoder. Motor control is implemented on a PIC microcontroller that communicates with the host PC via a serial link.

Fig. 1. The experimental environment: Metal Man determines the pose of its left hand.

To provide a sense of smell Metal Man is equipped with an electronic nose containing four tin oxide gas sensors manufactured by Figaro Engineering Inc. In addition there are sensors for air temperature and humidity. Temperature and humidity are monitored because both of these quantities affect the response of the tin oxide sensors. Their response is also useful for assessing the temperature and constituents of beverages offered to the electronic nose. The electronic nose also incorporates a small fan that is used to control the induction of air into the sensor. Major component parts of Metal Man are identified in Fig. 1.

Measurement of airflow provides useful complementary information for a sense of smell. To sense air movements an airflow sensing whisker system has been designed. The sensing elements consist of a strip of aluminized plastic film, and vibrations of this strip are detected by an optical sensor. By measuring the frequency of vibration of a ring of 8 whisker sensors, airflow direction and intensity can be estimated. Each sensor system required its own specialized data processing software in order to derive information at a useful level of abstraction such as:

• there is a red cylindrical object at a location x, y, z with its axis of symmetry vertical, with a height of h and a radius r.
• an airflow of about 0.3 m/s is blowing from an angle of 45˚.
• there is a weak fluctuating smell of ethanol.


Sensor construction and data processing for each sensor system are described in more detail in the following section.

3. Metal Man's sensory systems

3.1. Vision

Vision is the primary sense used by the humanoid to locate and classify objects of interest, by acquiring dense color/range measurements of the workspace. Passive stereo is usually associated with humanoid sensing, but the accuracy and reliability of current techniques often depend on the contents of a scene. Light stripe ranging is a computationally efficient alternative, but also presents unique challenges when used on a humanoid robot; the sensor must operate in normal ambient light and must be capable of rejecting sensor noise, spurious reflections and cross talk from other robots. Conventional methods do not distinguish the light stripe from secondary reflections and cross talk, making them unsuitable for robots operating in a domestic environment. Robust stripe scanners have been proposed in previous work1,2,3, but suffer from issues including assumed scene structure, acquisition time and lack of error recovery. Taylor et al.4 have developed a robust stereoscopic light stripe scanner to address these issues, and this scanner provides color/range measurements for Metal Man.

The scanner we have developed uses two cameras to measure the stripe, and exploits redundancy to disambiguate the stripe from noisy measurements. A vertical light plane is generated by a laser diode and cylindrical lens, and scanned across a scene while an optical shaft encoder measures the rotation angle. Stereo images are captured at 40 ms intervals, and each image is processed using edge filters to determine candidate stripe locations. Measurements corresponding to the actual stripe are identified (using the technique described below) to recover the 3-D profile of the illuminated surface. The profiles are assembled into an array of 3-D points which we refer to as the range map. After each complete scan, a color image is captured and implicitly registered with the range data.

To understand the procedure for eliminating reflections and cross talk, consider the reconstruction of a point X on the light stripe from measurements on the image plane. The stereo rig is modeled using central projection (or pin-hole) cameras with focal length f and stereo baseline 2b, and we make the further assumption of rectilinear stereo (parallel camera axes and coplanar image planes). The camera models can be summarized by the homogeneous projection matrices PL and PR, which project X onto the left and right image planes according to xL = PLX and xR = PRX. Now, let xL = (xL, y)T and xR = (xR, y)T represent noisy candidate measurements of the light stripe on the same epipolar line, and let Ω represent the current light plane position measured from the encoder, such that X satisfies the plane equation ΩTX = 0. The noise is modeled as a constant variance in xL and xR over the entire image plane. We can then cast the reconstruction problem as a constrained optimization: find the point X on the light plane satisfying ΩTX = 0 that


minimizes the image plane error d² between the projection of X and the measurements xL and xR:

d² = |xL − PLX|² + |xR − PRX|²

Under the above assumptions, it can be shown that the minimum image plane error is given by

d² = (αxL + xR + βy + γf)² / (1 + α²)    (1)

where, writing the light plane parameters as Ω = (A, B, C, D)T,

α = (Ab + D) / (Ab − D)
β = 2Bb / (Ab − D)
γ = 2Cb / (Ab − D)

The corresponding optimal reconstruction X = (X, Y, Z)T is:

X = b[(1 − α)(xL − αxR) − (1 + α)(βy + γf)] / k
Y = 2by(1 + α²) / k
Z = 2bf(1 + α²) / k

where k = (α + 1)(xL − αxR) + (1 − α)(βy + γf).

The robust reconstruction problem can now be solved by evaluating the reconstruction error (Equation 1) for all possible candidate measurements xL and xR on each scanline, and choosing the pair with the minimum error. The result is validated by ensuring the minimum error is below a fixed threshold d < dth. The framework described above requires a calibrated model of the scanner to determine the light plane parameters Ω for a given encoder measurement. A simple self-calibration process using measurements of an arbitrary non-planar target is described in Taylor et al.4.
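The candidate-pairing procedure can be sketched in a few lines. The following is a minimal illustration rather than the authors' implementation: the function names, the light plane (A, B, C, D), baseline half-width b, focal length f, candidate stripe coordinates and threshold are all illustrative.

```python
def reconstruct(xL, xR, y, plane, b, f):
    """Optimal point on the light plane for one candidate pair (Equation 1)."""
    A, B, C, D = plane
    alpha = (A * b + D) / (A * b - D)
    beta = 2 * B * b / (A * b - D)
    gamma = 2 * C * b / (A * b - D)
    # Minimum image-plane error for this pair (Equation 1).
    d2 = (alpha * xL + xR + beta * y + gamma * f) ** 2 / (1 + alpha ** 2)
    # Optimal reconstruction X = (X, Y, Z).
    k = (alpha + 1) * (xL - alpha * xR) + (1 - alpha) * (beta * y + gamma * f)
    X = b * ((1 - alpha) * (xL - alpha * xR)
             - (1 + alpha) * (beta * y + gamma * f)) / k
    Y = 2 * b * y * (1 + alpha ** 2) / k
    Z = 2 * b * f * (1 + alpha ** 2) / k
    return d2, (X, Y, Z)

def robust_match(cands_L, cands_R, y, plane, b, f, d_th):
    """Try every (xL, xR) candidate pair on a scanline, keep the pair with
    the smallest image-plane error, and validate against the threshold."""
    best = min((reconstruct(xL, xR, y, plane, b, f)
                for xL in cands_L for xR in cands_R),
               key=lambda r: r[0])
    d2, point = best
    return point if d2 < d_th ** 2 else None
```

A candidate pair generated by a reflection lies off the light plane, so its minimum image-plane error is large and the pair is rejected by the threshold test.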


Fig. 2. Scan results for a scene with interference generated by a mirror. (a) Conventional scan resulting in phantom measurements. (b) Robust scan showing noise rejection.


Fig. 2 demonstrates the robustness of our stereoscopic stripe scanner. The scene contains common domestic objects, while a mirror creates a reflection of the objects and laser stripe to simulate the effect of cross-talk and secondary specular reflections. Fig. 2(a) shows the range data measured using a conventional single-camera scanner, while our robust scan is shown in Fig. 2(b). The inability of the conventional method to distinguish the laser from its reflection results in multiple phantom measurements, while our method provides dense, accurate range data suitable for high level processing (see for example Fig. 4).

3.1.1. Task Planning

Once the range/color measurements are acquired, the robot must localize and classify objects of interest. We apply a segmentation algorithm to divide the range data into smooth regions satisfying the constraints described below, and assume each connected region corresponds to a single object. For each object that is to be manipulated, a grasp planning algorithm then determines the optimal pose of the hand for a stable grasp. In this application we are particularly interested in locating mugs. Identifying a priori unknown mugs is challenging when particular instances can vary significantly in size and shape. We overcome this problem by representing mugs using data-driven geometric primitives. In fact, geometric primitives can be used to adequately model many common domestic objects.

Range data segmentation is typically based on iteratively growing selected seed regions according to a homogeneity constraint5. However, our approach is based on the notion that geometric primitives fit more robustly to large segments rather than small patches. Thus, we have developed a split-and-merge segmentation algorithm6 that splits the range map at depth discontinuities and creases, and fits each continuous region with a geometric primitive. If the initial segments cannot be accurately modeled, further splitting occurs at changes in local surface type (described below). A final merging step compensates for over-segmentation. The primitives fitted to each region can be used to directly model objects for classification, tracking and task planning.
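The split-and-merge idea can be illustrated in one dimension. The sketch below is a toy stand-in for the algorithm6, not a reimplementation of it: a scanline of depth values is split wherever neighbouring samples jump (a depth discontinuity), each piece is fitted with a line (standing in for a geometric primitive), and adjacent pieces whose joint fit remains good are merged back. The thresholds are illustrative values.

```python
def fit_line(xs, zs):
    """Least-squares line fit; returns (slope, intercept, max residual)."""
    n = len(xs)
    mx, mz = sum(xs) / n, sum(zs) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sum((x - mx) * (z - mz) for x, z in zip(xs, zs)) / sxx if sxx else 0.0
    icpt = mz - slope * mx
    err = max(abs(z - (slope * x + icpt)) for x, z in zip(xs, zs))
    return slope, icpt, err

def split_and_merge(depths, jump=0.1, fit_tol=0.02):
    # Split phase: cut the scanline at depth discontinuities.
    segs, start = [], 0
    for i in range(1, len(depths)):
        if abs(depths[i] - depths[i - 1]) > jump:
            segs.append((start, i))
            start = i
    segs.append((start, len(depths)))
    # Merge phase: rejoin adjacent segments whose combined line fit
    # stays within tolerance (compensates for over-segmentation).
    merged = [segs[0]]
    for s, e in segs[1:]:
        ps, _ = merged[-1]
        if fit_line(list(range(ps, e)), depths[ps:e])[2] < fit_tol:
            merged[-1] = (ps, e)
        else:
            merged.append((s, e))
    return merged
```

A steep ramp, for instance, is over-split because every neighbouring pair exceeds the jump threshold, and the merge phase then recovers it as a single segment.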

Fig. 3. Surface type classes for segmenting range data according to local shape.


Surface type classification is the process of identifying the local shape of each element in the range map according to the six classes shown in Fig. 3. Classification is typically based on local curvatures calculated by fitting analytic curves to the range data, but this process is costly and the result depends on the arbitrary selection of an approximating function. To avoid this problem, we developed a novel non-parametric surface type classifier based on analysis of the Gaussian image and surface convexity6.

Fig. 4 shows the result of our segmentation algorithm applied to a scene with typical domestic objects. Additional stereoscopic light stripe/segmentation experiments have been performed on a variety of scenes with objects such as bowls, bottles and funnels, and the results can be viewed at http://www.irrc.monash.edu.au/laserscans. When the reasoning system determines that a particular object should be manipulated, a grasp planner calculates the pose of the robot for a stable grasp. We adopt a typical approach to grasp planning7: the force applied by the fingers should be normal to the gripped surface to minimize the effect of unknown surface friction, and the object should be grasped near the center of mass to minimize load torque when lifted. These principles are easily applied to calculate a stable grasp for a mug. The hand is positioned so that the line between the thumb and forefinger is perpendicular to the axis of the mug, and the contact points are about 10 mm below the rim. The orientation of the hand is chosen to minimize the angle between the wrist and forearm of the robot. The wireframe model of the gripper in Fig. 5 shows a typical planned grasp.
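The contact-point rule translates directly into coordinates. As a hypothetical sketch (the function name and parameters are illustrative, not from the paper), given a vertical cylinder fitted to the mug, the thumb and forefinger contacts lie on opposite sides of the rim circle, about 10 mm below the rim, along a horizontal line through the axis:

```python
import math

def mug_grasp(cx, cy, cz_base, height, radius, approach_deg=0.0,
              rim_offset=0.010):
    """Contact points for a vertical cylinder (mug) fitted to range data.

    The grasp line is horizontal, hence perpendicular to the vertical mug
    axis; approach_deg is a free parameter (the paper chooses the hand
    orientation to minimise the wrist-to-forearm angle).
    """
    z = cz_base + height - rim_offset          # about 10 mm below the rim
    a = math.radians(approach_deg)
    dx, dy = math.cos(a), math.sin(a)          # direction of the grasp line
    thumb = (cx - radius * dx, cy - radius * dy, z)
    finger = (cx + radius * dx, cy + radius * dy, z)
    return thumb, finger
```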


Fig. 4. Segmentation and object modeling for a color/range scan of typical domestic objects. (a) Raw color/range scan. (b) Segmented regions. (c) Extracted objects.

Fig. 5. Grasp planning result (the hand is indicated by a red wireframe model).


3.1.2. Visual Servoing

The visual servoing component of the system is based on work first published by Taylor and Kleeman8. Visual servoing describes the feedback control of a robot using measurements from a camera, and allows the robot to accurately position the hand in the presence of calibration errors in the kinematic model and hand-eye transformation. Visual servoing techniques are usually classified as image-based, position-based or hybrid depending on how the control error is formulated9. In image-based visual servoing, control errors are measured on the image plane without any transformation to real space. This avoids complex 3D scene reconstruction, but also results in unpredictable robot trajectories in Cartesian space, although recent hybrid schemes aim to alleviate this problem10. However, recent studies in applying biological principles of reach-to-grasp to robotic systems suggest that humans use 3D structural cues rather than projected image features11, and that human motions are planned in Cartesian space rather than joint space12. Both properties are characteristic of position-based visual servoing, in which control errors are formed in a visually reconstructed Cartesian space. The authors have thus adopted position-based servoing as a more flexible framework for addressing various planning and control issues encountered in humanoid manipulation tasks.

Fig. 6 illustrates the basic visual servoing task. The target pose T of the gripper is planned with respect to the camera frame C using information from the light stripe sensor. The current pose of the gripper G with respect to the robot base R (not shown) is known from the Puma kinematic model. In an open loop control system, the robot would require accurate knowledge of the transformation between C and R to drive the gripper to the target position. However, visual servoing allows the transformation between G and T to be measured directly without explicit knowledge of the position of the Puma base with respect to the cameras. Thus, visual servoing provides the humanoid robot with online and continuous hand-eye calibration.

Fig. 6. Visual servoing framework. The task is to align the gripper frame G with the target frame T, based on visual reconstruction in the camera frame C.


Artificial cues in the form of red LEDs are attached to the gripper to simplify image processing and increase tracking robustness. The positions of the LEDs in the gripper frame are manually calibrated and form an internal model of the hand. During visual servoing, stereo images are captured at 40 ms intervals and a color filter allows the centroid of the visible LEDs to be measured on each image plane. A Kalman filter with a constant velocity dynamic model performs implicit 3D reconstruction of the stereo measurements to determine an optimal estimate of the current pose G in the camera frame. A simple proportional control law is used for visual servoing, based on the transformation between G and the target pose T, which is passed to the Puma controller to calculate appropriate joint motions.

An autonomous initialization procedure determines the pose of the gripper at the commencement of each new servoing task. First, all LEDs are activated and the cameras scan the workspace with a color filter to locate the hand. The LEDs are then flashed individually to provide unambiguous position measurements, and these are processed by the Kalman filter to estimate the initial pose of the gripper. During servoing, loss of visual feedback is minimized by actively tracking the motion of the gripper with the Biclops head. If tracking is lost, the initialization procedure provides an automatic recovery mechanism.
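A proportional law of this kind is simple to state in code. The sketch below is a minimal illustration of the idea, reduced to the translational part of the pose error (the real controller also regulates orientation and runs against the 40 ms Kalman filter estimates); the function names and gain value are assumptions.

```python
def servo_step(gripper_pos, target_pos, gain=0.5):
    """One proportional control update: the commanded displacement is the
    gripper-to-target error scaled by the gain."""
    return tuple(gain * (t - g) for g, t in zip(gripper_pos, target_pos))

def servo_until_converged(pos, target, gain=0.5, tol=1e-3, max_steps=100):
    """Iterate the control law until the residual error falls below tol."""
    for _ in range(max_steps):
        step = servo_step(pos, target, gain)
        pos = tuple(p + s for p, s in zip(pos, step))
        if max(abs(t - p) for p, t in zip(pos, target)) < tol:
            break
    return pos
```

Because the error is measured visually at every step, the loop converges to the visually defined target even when the kinematic model relating the arm to the cameras is inaccurate, which is the point of the hand-eye calibration argument above.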

3.2. Smell

Although not as refined as the sense of smell of some animals, particularly dogs, the human nose is thought to be able to distinguish about ten thousand different odors. Therefore, it is reasonable to require that an artificial sense of smell for a humanoid robot should also be capable of discriminating a number of different odors. Such a sensor is called an electronic nose and mimics the mammalian nose's ability to detect and identify a range of volatile chemicals. Electronic noses commonly consist of a number of sensors, each with a distinct but broad and overlapping sensitivity to a range of chemicals. The pattern of responses from the sensor array is characteristic of the applied chemical. When an unknown odor is detected by an electronic nose, a pattern recognition process is performed to compare the pattern of sensor responses with stored templates. The best match is used to classify the unknown odor13.

Tin oxide gas sensors are a popular choice for use in electronic noses and also in other robotics experiments. This is because tin oxide sensors are readily available in a range of sensor types with different peak sensitivity to a number of common chemicals. These sensors were chosen for the humanoid robot electronic nose. Tin oxide sensors contain a heater and usually operate at around 300˚C. The response of these sensors to different chemicals depends on the heater temperature14. In order to reduce the size and power consumption of the electronic nose, it was decided to make use of the temperature-variable response of tin oxide sensors and to gather additional information about sensed odors by varying the sensor temperature. In this way a small number of tin oxide sensors can produce enough data to discriminate a large number of chemicals15.


Fig. 7. Thermal response of a TGS2600 sensor.

By using the temperature dependent resistance of the heating element it was possible to measure the heating and cooling responses of the TGS26XX tin oxide sensors used in this project (Fig. 7). From the graph it can be seen that the sensor cools through 90% of the range between operating temperature and room temperature in 20 seconds. Therefore this period was chosen as the minimum practical cooling time. The sensor can be heated through the same temperature range in 8 seconds. However, the heating part of the cycle is the period during which data is recorded, and it was decided to extend this time to 10 seconds in case there is a delay between the sensor reaching a particular temperature and the corresponding response of the tin oxide sensor.

The prototype humanoid electronic nose contains four tin oxide gas sensors (TGS2600, TGS2610, TGS2611 and TGS2620). In addition there are sensors for air temperature (LM35) and humidity (SMTRH 05). Temperature and humidity are monitored because both of these quantities affect the response of the tin oxide sensors. Their response is also useful for assessing the temperature and constituents of beverages offered to the electronic nose. The electronic nose also incorporates a small fan that is used to control the induction of air into the sensor. A cross-section view of the electronic nose is shown in Fig. 8.

Fig. 8. Cross-section view of the electronic nose.

A sensing cycle was developed for the humanoid electronic nose that includes a heating and cooling period as well as turning the fan on and off to introduce new odor samples into the sensor. The full sensing cycle is as follows:


• 10 seconds - heater on, fan off; record sensor data every 0.5 sec.
(record sensor response while it is heating)
• 5 seconds - heater on, fan on.
(additional heating period to speed up sensor recovery when an odor is removed)
• 16 seconds - heater off, fan on.
(cool sensor and introduce a new air sample)
• 4 seconds - heater off, fan off.
(allow air movements to stop)
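The 35-second cycle lends itself to a simple table-driven sketch. The timings and actuator states are those listed above; the function itself is an illustration, not the robot's firmware:

```python
PHASES = [  # (duration in s, heater, fan, record data)
    (10, True,  False, True),   # record response while heating
    (5,  True,  True,  False),  # extra heating to speed odor recovery
    (16, False, True,  False),  # cool sensor, draw in a new air sample
    (4,  False, False, False),  # let air movements settle
]

CYCLE_LEN = sum(d for d, *_ in PHASES)  # 35 s in total

def cycle_state(t):
    """Heater/fan/record flags at time t (seconds) in the repeating cycle."""
    t %= CYCLE_LEN
    for duration, heater, fan, record in PHASES:
        if t < duration:
            return heater, fan, record
        t -= duration
```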

Fig. 9 shows the response of one of the tin oxide gas sensors in the humanoid nose (TGS2600) heated in the presence of four different chemicals. Although superficially similar, the patterns of the responses are significantly different for each chemical. As well as being dependent on temperature, the output of tin oxide sensors varies non-linearly with chemical concentration. For this reason families of sensor responses were recorded for a range of concentrations of each target chemical, and this stored template data was used to identify unknown chemicals. In order to classify an unknown chemical, its recorded response is compared with all of the members of each of the stored families of templates. The template that has the smallest Euclidean distance from the unknown response is assumed to originate from the same chemical measured at the same concentration.

Fig. 9. The response of a TGS2600 gas sensor exposed to acetone, ammonia, camphor and ethanol.

It is common practice to use an artificial neural network to match the response of an electronic nose to an unknown chemical with prerecorded responses from known chemicals. At least for initial investigation, it was decided to match data from an unknown chemical against the entire database of templates. This allowed us to quickly add and remove chemicals without the necessity of retraining the system. Matching was achieved using a simple nearest neighbor technique. One sample consisted of 20 readings taken at half-second intervals from each of 4 gas sensors (this was later reduced to 4 readings with little loss of accuracy in the classification process). Our full database of templates includes groups of 20 samples taken at different concentrations for acetone, ammonia, camphor, ethanol, ground coffee, espresso coffee, mocha coffee, cinnamon tea, peppermint tea, room fragrance oil, red musk oil, eucalyptus oil, Vegemite (a


concentrated yeast extract) and incense. At high concentrations recognition accuracy was between 98% and 100% for all of these odors, provided that the unknown sample was recorded at the same ambient temperature and humidity as the pre-recorded templates. As is to be expected, at lower concentrations there is increasing ambiguity between different chemicals and the reading for clean air.
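The nearest-neighbor matching step is easy to sketch: each sample is a flattened vector of readings (e.g. 20 readings × 4 sensors) and is labelled with the chemical of the closest stored template by Euclidean distance. The template vectors below are toy values for illustration, not measured sensor responses.

```python
import math

def classify(sample, templates):
    """templates: iterable of (label, vector); returns the nearest label
    by Euclidean distance between the sample and each template vector."""
    return min(templates, key=lambda t: math.dist(sample, t[1]))[0]
```

Because classification is a lookup over the whole template database, adding or removing a chemical is just adding or removing its templates, with no retraining step, which is the flexibility argued for above.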

3.3. Airflow

Monitoring airflow does not provide such a rich source of environmental information as vision. However, it does indicate environmental conditions that are difficult or impossible to monitor by other means. Familiar applications of airflow information include:

• combining it with odor information to help locate the source of a volatile chemical,
• sensing air currents caused by air vents, open doors or open windows, and
• detecting the air disturbance caused by a person walking past.

There are many methods of measuring airflow, such as wind turbines and hot-wire anemometers, but these do not work well at low velocities. For robotics applications a novel windvane has been developed by one of the authors16. However, this sensor contains a rotating paddle that would not be compatible with the form of a humanoid robot. In order to measure airflow in a non-intrusive manner it was decided to investigate one of the techniques that humans use for detecting airflow. At least two mechanisms provide information about airflow. The skin on the face can give a statistically significant sensation for changes of 0.001˚C/s17. Therefore the cooling effect of airflow on the skin can indicate wind direction based on the varying velocity of airflow around the head. The side of the head away from the wind will be sheltered and will therefore indicate a lower cooling effect associated with the lower velocity. An alternative effect is the disturbance of fine body hairs caused by fluctuations in the airflow passing over the skin. If this turbulence can be detected then it could be used to indicate wind direction, as it does in humans.

Fig. 10. The whisker airflow sensor.

To test this idea the airflow sensing whisker shown in Fig. 10 was designed. The sensing element is a 5 mm wide strip of aluminized plastic film, and vibrations of this strip are detected by an optical sensor. To get a suitable combination of stiffness and dimensional stability, 25 µm thick PVDF film was used (note that the piezoelectric properties of the film are not used in this application). A tab formed by putting a 90˚ bend in the end of the whisker interrupts the light beam in a slotted optical switch (Omron type EE-SX1109). The output of the optical switch was fed into a Schmitt inverter to give a signal with a full logic swing. A measured movement of 0.04 mm gave a change in logic output of the inverter. Positive-going logic transitions were counted by a microcontroller, and this count formed the output of the sensor.

Fig. 11. Whisker sensor output averaged over 10 seconds in an airflow of 0.25m/s.

Fig. 11 shows whisker sensor measurements spaced every 20˚ around the humanoid head. Measurements were made in an airflow of 0.25 m/s approaching the head on a bearing of 0˚. The airflow was created using a domestic cooling fan running from a variable transformer to give control of the fan velocity. There are three main regions where fluctuations in the airflow excite vibrations in the whisker sensors. If the airflow is incident on the head at 0˚, then at about 90˚ and 270˚ the air flowing around the head separates with the shedding of vortices. At these symmetrical points there is a great deal of fluctuation in the airflow. A stagnation point occurs in the area where the incident airflow meets the head (0˚). In laminar airflow there would be little air movement at this point. However, airflow from the fan is far from steady, and these fluctuations are detected by the whisker sensor at this point. The far side of the head (180˚) is sheltered from the airflow and little fluctuation in the airflow is detected. This interpretation agrees with the sensor data plotted in Fig. 11.

At low airflow velocities the magnitude of the whisker sensor output is related to the velocity of the airflow. The humanoid head and a turbine anemometer were positioned 2 m downwind of the fan. Fig. 12 shows the sensor reading in pulses per second for the sensor positioned at the stagnation point (0˚) versus the anemometer reading of airflow velocity. There is a consistent relationship between the anemometer reading and the output of the whisker sensor over the range 0.1 m/s to 1.0 m/s. The anemometer could not indicate below 0.1 m/s; however, the whisker sensor still gave readings at lower airflow velocities.


Fig. 12. Sensor output (averaged over 10 seconds) related to airflow velocity.

In order to determine the direction of airflow, 8 whisker sensors were mounted on a sweatband worn by the humanoid head. Two sets of reference readings were taken with the head rotated 22.5˚ to give sensor readings spaced 22.5˚ apart around the head. The airflow was set at 0.45 m/s during the reference measurements. In order to calculate the bearing of airflow incident on the head, a single measurement xi is taken from the 8 sensors. This set of readings is then correlated with a subset of the 16 reference measurements yi. Fig. 13 shows the correlation between sensor readings for an airflow incident at 300˚ and the reference readings. The best correlation corresponds to a heading of 315˚.
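The bearing estimation above can be sketched as a correlation search. This is an illustrative reconstruction, not the authors' code: it assumes the 16 reference counts are stored in bearing order 22.5˚ apart, so the subset for a candidate bearing k·22.5˚ is every second reference value starting at index k (matching the 8 sensors spaced 45˚ apart).

```python
import numpy as np

def estimate_bearing(readings, reference):
    """readings: 8 whisker counts from sensors spaced 45 deg around the head.
    reference: 16 counts spaced 22.5 deg, recorded in a known airflow.
    Returns the candidate bearing (degrees) whose reference subset
    correlates best with the current readings."""
    best_bearing, best_corr = None, -np.inf
    for k in range(16):
        subset = reference[(k + 2 * np.arange(8)) % 16]
        corr = np.corrcoef(readings, subset)[0, 1]
        if corr > best_corr:
            best_bearing, best_corr = k * 22.5, corr
    return best_bearing

# Synthetic check: readings sampled from the reference pattern at a 90 deg
# offset should correlate best at the 90 deg candidate.
reference = np.array([3.0, 1, 4, 1, 5, 9, 2, 6, 5, 3, 5, 8, 9, 7, 9, 3])
readings = reference[(4 + 2 * np.arange(8)) % 16]
print(estimate_bearing(readings, reference))  # 90.0
```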

Fig. 13. The correlation between a set of reference measurements and readings taken with airflow of 0.35 m/s incident at a bearing of 300˚.

In addition to measuring the relatively steady airflow from a cooling fan, the whisker sensors also respond to the current of air displaced by walking close to the head. Fig. 14 shows the sensor response to a person walking left to right in front of the humanoid head followed by walking right to left behind the head. In both cases the walker traveled at about 2 m/s and passed within one metre of the head. The response of the sensors to the wash did not give a strong correlation with the reference measurements taken in a steady airflow. However, in each of the sets of results shown in Fig. 14 there are regions that seem to be sheltered from the disturbed air. Walking left to right in front, this region is centered about 290˚, and walking right to left behind, 150˚. The regions are almost 180˚ apart and therefore seem to relate to the trajectory of the walker with respect to the head.

Fig. 14. Airflow sensor system response (sensor angle in degrees versus time in seconds) to the wash caused by walking left to right in front of the humanoid head and then right to left behind.

Thus the whisker sensor can be used to measure both the direction and velocity of steady airflows and also to register the transient airflows resulting from the 'wash' of air produced by moving objects.

4. Sensor coordination and activity focus

Interaction between a humanoid robot and a human being is an important consideration if the robot is to communicate or to provide assistance. In order to facilitate interaction, the robot must be equipped to provide and respond to human-like non-verbal cues. The humanoid robot requires a number of sensor systems, particularly vision, to perform this communication successfully. The robot Kismet18 at MIT and WE-3RV from Waseda University19 are recent examples of humanoid robots that are designed to study interaction with humans.

By contrast, this project focuses on developing a robot that can acquire and maintain an awareness of its immediate environment and act upon the resulting internal model of its surroundings. Metal Man's current range of sensors is almost completely orthogonal: quantities measured by one sensor cannot be detected by any other. Thus, the vision system does not respond to airflow or odor, the electronic nose cannot detect nearby solid objects or air movement, and the airflow sensing whiskers are not affected by odor and cannot sense solid objects. However, these senses do provide synergistic information that can be combined to give a more complete Gestalt of the robot's environment. For example, vision can be used to locate a mug, and detecting an odor with the electronic nose provides evidence of what the mug might contain. In order to investigate synergies between Metal Man's sensors we considered the scenario of finding a mug containing ethanol from amongst a number of mugs within the robot's reach. A small cooling fan was used to establish an airflow through Metal Man's working area. Rather than propose an elaborate but untried sensory architecture for this task, we are following the path of testing a relatively simple architecture. This will then be elaborated on an as-needs basis.

The raw sensor data, data processing and the resulting percepts are quite different in nature for each of Metal Man's sensor systems. They provide information in the form of positions, directions and magnitudes. The way in which information from individual sensors interacts will also vary. These differences will become more pronounced as further sensors are added to Metal Man. In order to accommodate these widely varying sensor systems, Metal Man's sensor coordination architecture is based around a blackboard-style data structure20. Raw sensor data is gathered and processed by sensor-specific software to provide percepts that would be understood by an untrained human observer. These percepts are stored in the common blackboard area. Typical percepts are:

• the ith object in view is a red cylinder with its axis vertical, diameter di, height hi and centre of volume located at xi, yi, zi,
• there is a strong smell of ethanol vapor,
• an airflow of v m/s is detected, and
• the airflow is arriving at a heading of θ˚ limited to ±90˚ (straight ahead for Metal Man is 0˚).
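A blackboard of time-stamped percepts of this kind might be sketched as follows. This is a minimal illustration under assumed names (`Percept`, `Blackboard`, the `kind` vocabulary), not the actual data structure used on Metal Man:

```python
import time
from dataclasses import dataclass, field

@dataclass
class Percept:
    source: str      # which sensor system produced it, e.g. "vision", "nose"
    kind: str        # category of percept, e.g. "object", "odor", "airflow"
    data: dict       # percept contents in human-readable terms
    timestamp: float = field(default_factory=time.time)

class Blackboard:
    """Shared store: sensor-specific software posts percepts,
    higher-level reasoning queries them."""
    def __init__(self):
        self.percepts = []

    def post(self, percept):
        self.percepts.append(percept)

    def query(self, kind):
        return [p for p in self.percepts if p.kind == kind]

bb = Blackboard()
bb.post(Percept("vision", "object",
                {"shape": "cylinder", "color": "red", "centre": (0.3, 0.1, 0.0)}))
bb.post(Percept("nose", "odor", {"chemical": "ethanol", "level": "strong"}))
print(len(bb.query("odor")))  # 1
```

Because every percept carries its source and timestamp, later processing stages can combine entries from different sensors without caring how each was derived, which is the point of the common blackboard area.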

Information in the blackboard data structure is further combined and processed to derive additional properties of the items in the blackboard. For instance: 'taking into account the airflow strength and direction, the level of alcohol vapor detected and the location of the green mug, it is very likely that it contains alcohol'. An activity generator maintains one or a number of 'motivations' that can be triggered by or modified by data contained in the blackboard. For the current scenario, examples of motivations might be:

• do nothing,
• monitor the immediate environment,
• move a red mug next to the green mug,
• position the red mug close to the nose to see if it contains alcohol, and
• hold up a mug that contains alcohol.

Each motivation would be associated with a short script describing how the actions are to be performed, taking into account information available in the blackboard. For instance, holding up a mug that contains alcohol would involve considering all of the mugs within reach. The first mug tested would be the one with the highest likelihood of containing alcohol. Holding the mug close to the nose could confirm its contents.


For these experiments, a likelihood function was developed to combine vision, odor and airflow information into a measure of the likelihood that a particular mug contains alcohol. This function is used to order all available mugs, so the relative value is important but absolute values are not. Fig. 15 shows typical graphs of chemical sensor output in response to mugs of ethanol placed within reach of Metal Man's grasp and at 0˚, 45˚ and 90˚ to the upwind direction.

Fig. 15. TGS2600 chemical sensor output sampled every 5 seconds in a slow airflow (fan voltage 120 V) with a mug of ethanol placed at three different headings.

Once the initial transient has died down, it seems that the peak-to-peak variation in chemical sensor response within a fixed sample window provides a good way of characterizing the sensor response. Thus, in the slow airflow the peak-to-peak variation between samples 30 and 59 was 24 for 0˚, 7 for 45˚ and 4 for 90˚.
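The peak-to-peak feature described here is simply the spread of readings within the sample window. A minimal sketch (the window indices follow the text; the function name is illustrative):

```python
def peak_to_peak(readings, start=30, end=59):
    """Peak-to-peak variation of chemical sensor output within a fixed
    sample window (indices inclusive, matching samples 30 to 59)."""
    window = readings[start:end + 1]
    return max(window) - min(window)

# Toy trace: a steadily rising signal over 100 samples
trace = list(range(100))
print(peak_to_peak(trace))  # 29
```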

Fig. 16. TGS2600 chemical sensor output sampled every 5 seconds in a fast airflow (fan voltage 160 V) with a mug of ethanol placed at three different headings.


For a fast airflow (Fig. 16) the peak-to-peak variation between samples 30 and 59 was 48 for 0˚, 7 for 45˚ and 14 for 90˚.

This information was used to develop the likelihood function. In order to produce a time-averaged result, the updating of the likelihood function was formulated as a first order, recurrent stochastic process. For each mug i the estimate at time step n of the likelihood L_n^i that it contains ethanol is:

L_n^i = L_{n-1}^i + (A_i − L_{n-1}^i)/m (if there is low odor and no airflow)

L_n^i = L_{n-1}^i + (A_i − A_i cos(θ_i) − L_{n-1}^i)/m (if there is low odor and airflow)

L_n^i = L_{n-1}^i + (A_i − L_{n-1}^i)/m (if there is intermediate odor and airflow)

L_n^i = L_{n-1}^i + (A_i + (1 − A_i) cos(θ_i) − L_{n-1}^i)/m (if there is high odor and airflow)

where:

A_i = a priori likelihood that mug i contains ethanol,
m = a constant governing the speed with which changes in likelihood are tracked, and
θ_i = the heading of airflow arrival relative to mug i.
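One step of this recursive update can be sketched as follows. The case structure and target values follow the four equations above; the value of m and the odor thresholds ('low'/'intermediate'/'high') are illustrative assumptions, not values from the paper:

```python
import math

def update_likelihood(L_prev, A, theta_deg, odor, airflow, m=5.0):
    """One step of the first-order recursive likelihood update for a mug.

    L_prev    -- likelihood at the previous time step, L_{n-1}
    A         -- a priori likelihood that the mug contains ethanol
    theta_deg -- heading of airflow arrival relative to the mug (degrees)
    odor      -- 'low', 'intermediate' or 'high'
    airflow   -- True if an airflow is detected
    m         -- tracking-speed constant (5.0 is an illustrative value)
    """
    c = math.cos(math.radians(theta_deg))
    if odor == "low" and not airflow:
        target = A                   # no evidence: drift back to the prior
    elif odor == "low" and airflow:
        target = A * (1.0 - c)       # aligned airflow but no odor: unlikely
    elif odor == "intermediate" and airflow:
        target = A                   # ambiguous evidence: fall back to prior
    else:                            # high odor and airflow
        target = A + (1.0 - A) * c   # aligned airflow and odor: likely
    return L_prev + (target - L_prev) / m

# With high odor and airflow arriving head-on (theta = 0) the likelihood
# climbs from the prior towards 1 over repeated updates:
L = 0.5
for _ in range(100):
    L = update_likelihood(L, A=0.5, theta_deg=0.0, odor="high", airflow=True)
```

Each case moves the estimate a fraction 1/m of the way toward its target, which produces the time-averaged behaviour described in the text.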

The overall architecture of Metal Man's control strategy is illustrated in Fig. 17.

Fig. 17. Metal Man's control architecture.


The results of a preliminary experiment using this architecture are given in the following section.

5. Find the mug containing ethanol

To provide a preliminary test of the control architecture, an experiment was set up that involved locating a mug containing ethanol. For the experiment, two mugs were placed within reach of Metal Man's left hand and a small cooling fan established an airflow through the robot's working area at a heading of 0˚. At the start of the experiment Metal Man performed its initialization procedure to determine the pose of its left hand. The photograph in Fig. 1 shows a front view of Metal Man, the two mugs and the illuminated LEDs on the left hand that were used for determining pose. Following initialization, Metal Man scans its working area, where it locates and recognizes the two mugs and a tennis ball. Fig. 18a gives the view through one of Metal Man's cameras while the scene is being swept by the laser stripe. Models of the mugs and ball are shown superimposed on the camera image in Fig. 18b.

(a) the laser stripe (b) superimposed models of the mugs and ball

Fig. 18. Metal Man scans its working area and identifies two mugs and a ball.

A small quantity of ethanol was then poured into one of the mugs. Fig. 19 shows the alcohol being introduced into the left-hand mug positioned at a heading of 36˚.


Fig. 19. Ethanol is poured into a mug.

One minute after the chemical sensor registered the ethanol vapor, updating of the likelihood function was started. Fig. 20 shows the chemical sensor reading together with the values of likelihood assigned to the two mugs.

Fig. 20. Chemical sensor reading and likelihood assigned to each of the two mugs (ethanol placed in the left-hand mug positioned at 36˚).

When the command was given to pick up a mug containing ethanol (210 seconds from the start of the experiment), the left-hand mug was selected because it had the highest likelihood value. Fig. 21a shows an observer's view of Metal Man's left hand grasping the mug and Fig. 21b gives Metal Man's view.


(a) grasping the mug. (b) Metal Man's view of the mug.

Fig. 21. Having detected the ethanol vapor and identified the most likely mug, Metal Man grasps the mug.

To ensure that the correct mug has been chosen, Metal Man holds the mug close to its chemical sensor (Fig. 22). As can be seen in Fig. 20, there is a substantial increase in the chemical sensor response that confirms the presence of ethanol. If this test had proven negative, Metal Man would have returned the mug to its original position and tested the second mug.

Fig. 22. A large increase in chemical intensity resulting from moving the mug towards the chemical sensor confirms that the grasped mug contains ethanol.

Fig. 23 gives results from an identical experiment where ethanol was poured into the other mug. In this case the likelihood function indicated the right-hand mug, and after lifting this mug towards the chemical sensor a similar increase in sensor response confirmed that the correct mug had been selected.

The likelihood function combined orthogonal information provided by the vision, airflow and chemical sensors to provide an improved model of Metal Man's surroundings. In the event that the likelihood function cannot provide information for choosing between two possible sources of ethanol vapor, the order of testing the mugs is based on how close they are to the active hand (the closer mug is tested first).


Fig. 23. Chemical sensor reading and likelihood assigned to each of the two mugs (ethanol placed in the right-hand mug positioned at 6˚).

6. Conclusions

The motivation for the project described in this paper is to develop sensors, sensor data processing algorithms and robot control techniques that exploit the synergies between sensing modes and allow robots, especially humanoid robots, to interact with their environment. Currently, a vision system has been developed that can determine the range, shape and color of objects in the robot's environment. Vision data has been used to locate and model objects and to perform self-calibrated, position-based visual servoing for grasping and manipulating the objects. An electronic nose has been built that uses the temperature-varying characteristics of tin oxide sensors to discriminate different odors using a small number of physical sensors. Information on airflow direction and velocity is provided by a novel whisker sensor array. In order to investigate synergies between these sensory modes, a simple experimental scenario has been developed involving the manipulation of mugs, some of which contain ethanol. This experiment was intended to provide guidance in the development of a more complete sensory control scheme for a humanoid robot and to identify improvements and additions that should be made to the current sensor suite.

Given the command to find a mug containing ethanol, Metal Man was able to establish the location of all mugs within reach and then, using additional odor and airflow information, prioritize the order in which they should be tested. These preliminary experiments have shown that synergistic sensory information can be employed to maintain an integrated view of a robot's environment. Future development plans for Metal Man will center around the provision of additional sensory modes and their integration into improved versions of the sensory control scheme.

Acknowledgements

This project was funded by the Strategic Monash University Research Fund for the Humanoid Robotics: Perception, Intelligence and Control project at the IRRC.

References

[1] M. Magee, R. Weniger and E. A. Franke, Location of features of known height in the presence of reflective and refractive noise using a stereoscopic light-striping approach, Optical Engineering, 33 (4), April 1994, 1092–1098.

[2] E. Trucco and R. B. Fisher, Acquisition of consistent range data using local calibration, Proc. of the IEEE Int. Conf. on Robotics and Automation, 4 (1994) 3410–3415.

[3] Kouichi Nakano, Yasuo Watanabe and Sukeyasu Kanno, Extraction and recognition of 3-dimensional information by projecting a pair of slit-ray beams, Proceedings of the 9th International Conference on Pattern Recognition (1988) 736–738.

[4] G. Taylor, L. Kleeman and Å. Wernersson, Robust colour and range sensing for robotic applications using a stereoscopic light stripe scanner, Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems (2002) 79–84.

[5] P. J. Besl and R. C. Jain, Segmentation through variable-order surface fitting, IEEE Transactions on Pattern Analysis and Machine Intelligence, 10 (2), March 1988, 167–192.

[6] G. Taylor and L. Kleeman, Robust range data segmentation using geometric primitives for robotic applications, accepted to International Conference on Signal and Image Processing (2003).

[7] G. Smith, E. Lee, K. Goldberg, K. Böringer and J. Craig, Computing parallel-jaw grips, Proc. International Conference on Robotics and Automation (1999) 1897–1903.

[8] G. Taylor and L. Kleeman, Flexible self-calibrated visual servoing for a humanoid robot, Proc. 2001 Australasian Conference on Robotics and Automation (2001) 79–84.

[9] P. I. Corke and S. A. Hutchinson, Real-time vision, tracking and control, Proc. IEEE Conf. on Robotics & Automation (2000) 622–629.

[10] F. Chaumette and E. Malis, 2 1/2 D visual servoing: a possible solution to improve image-based and position-based visual servoing, Proc. IEEE Int. Conf. on Robotics & Automation (2000) 630–635.

[11] Y. Hu, R. Eagleson and M. A. Goodale, Human visual servoing for reaching and grasping: the role of 3-D geometric features, Proc. IEEE Int. Conf. on Robotics & Automation (1999) 3209–3216.

[12] A. Hauck, M. Sorg, G. Faber and T. Schenk, What can be learned from human reach-to-grasp movements for the design of robotic hand-eye systems? Proc. IEEE Int. Conf. on Robotics & Automation (1999) 2521–2526.

[13] J. W. Gardner and P. N. Bartlett (eds.), Sensors and Sensory Systems for an Electronic Nose (Kluwer Academic Publishers, 1992).

[14] T. Nakamoto, T. Fukuda and T. Moriizumi, Gas identification system using plural sensors with characteristics of plasticity, Sensors and Actuators B 3 (1991) 1–6.

[15] A. H. Purnamadjaja and R. A. Russell, A sense of smell for a humanoid robot, International Conference on Artificial Intelligence in Science and Technology (Hobart, Tasmania, 17–20 December 2000) 312–316.

[16] R. A. Russell and S. Kennedy, A novel airflow sensor for miniature mobile robots, Mechatronics, Elsevier Science, 10 (8) (2000) 935–942.

[17] F. A. Geldard, The Human Senses, Second Edition (John Wiley and Sons, Inc., 1972).


[18] C. Breazeal and B. Scassellati, Context-dependent attention system for a social robot, Proceedings of the Sixteenth International Joint Conference on Artificial Intelligence (IJCAI99), Stockholm, Sweden (1999) 1146–1151.

[19] H. Miwa, A. Takanishi and H. Takanobu, Experimental study on robot personality for humanoid head robot, Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Maui, Hawaii, USA, Oct. 29–Nov. 3 (2001) 1183–1188.

[20] H. P. Nii, Blackboard systems: the blackboard model of problem solving and evaluation of blackboard architectures, AI Magazine, 8 (2) (1986) 38–53.

R. Andrew Russell received his B.Eng. and Ph.D. degrees in 1972 and 1976, respectively, from the University of Liverpool, U.K. He was formerly an engineer working on robot design and applications in the Department of Artificial Intelligence at the University of Edinburgh, Scotland. For the past 19 years he has been a member of academic staff at the University of Wollongong and now at Monash University, both in Australia. His research interests include robot tactile sensing, the design of robotic mechanisms and olfactory sensing for robots. He has written books on robot tactile sensing and odor sensing for mobile robots as well as authoring over 100 other publications.

Geoffrey Taylor received his BSc degree in Mathematics and Physics in 1998 and BE degree in Electrical and Computer Systems Engineering in 2000 from Monash University, Australia. He is currently pursuing a PhD degree in the field of humanoid robotics and computer vision at Monash University. His research interests include robotics, human-machine interaction, computer vision and computer graphics.

Lindsay Kleeman received his Bachelor degrees in Engineering and Mathematics, both with university medals, from the University of Newcastle, Australia, in 1982 and 1983, and his PhD from the same university in 1986. In 1986 he was appointed to the academic staff of the Department of Electrical and Computer Systems Engineering at Monash University, Australia, and is currently an Associate Professor. He has research interests in ultrasonic and other sensors for robots, mobile robotics and humanoid robots and has authored over 100 publications.

Anies H. Purnamadjaja received her M.Eng.Sc degree in Electrical and Computer Systems Engineering from Monash University, Australia, in 2001. From 2002 to 2003 she was a lecturer in the Electrical Engineering Department at Petra Christian University, Indonesia. She is currently continuing her Ph.D. in Electrical and Computer Systems Engineering at Monash University. Her research interests include robotics and sensory systems for humanoid robots, especially the sense of smell.

