Charlie Robotic Humanoid Head Description Document


Charlie: the head robot

Sriram Kumar, Devangini Patel, Sara Adela Abad Guaman, Suraj Padhy, Stabak Nandi

MSc. Artificial Intelligence, ECS,

University of Southampton, [email protected], [email protected], [email protected], [email protected], [email protected]

Abstract—We are surrounded by technological developments such as computers, cell phones, Facebook, and many more. As technology advances, it becomes harder for the average person to learn how to use it. We therefore aim to create an I/O device that makes it easy for any human to access the internet and communicate with others. We went through many designs and found that the human form is the one people are most comfortable communicating with. If we want people to communicate with machines, the interaction has to feel natural. So we decided to make a robot that is human-like and has the main features of a human face: camera eyes, microphone ears, a speaker voice, touch-sensor skin, motor-powered eyes and jaw, facial muscles and more. It is flexible, with around 17 degrees of freedom, and it has all the peripherals needed to support social interaction. These human-like features bring trust, reliability and emotional intelligence to any piece of artificially intelligent software.

Index Terms—Head robot, expressions, sensors, actuators

I. PREVIOUS PROJECTS

We have looked at some of the previously made head robots, which are as follows:

• Kismet: Kismet is an expressive robotic creature with perceptual and motor modalities tailored to natural human communication channels. To facilitate a natural infant-caretaker interaction, the robot is equipped with visual, auditory, and proprioceptive sensory inputs. The motor outputs include vocalizations, facial expressions, and motor capabilities to adjust the gaze direction of the eyes and the orientation of the head. [1]

• Professor Hiroshi Ishiguro's geminoids and telenoid robot: A geminoid is a robot that works as a duplicate of an existing person. It appears and behaves as that person and is connected to the person by a computer network. [2] All the geminoids are shown in Fig. 1.

Fig. 1. The geminoids made so far.

II. SPECIFICATIONS OF THE ROBOT

Dimensions (H x W x B)     21.5 cm x 23 cm x 15 cm

Degrees of freedom         Eye: 2 x 2 = 4
                           Mouth: 2 x 1 = 1
                           Eye LED: 2 x 2 = 4
                           Linear actuators: 10 x 2 = 20

Sensors                    Piezo: 13
                           QTC: 7
                           Capacitive sensors: 5

Peripheral devices         Camera, microphone and speakers

Processing                 Arduino Mega 2560
                           Arduino Uno
                           Microprocessor: computer
                           Webserver

Power                      Servo motors: 5 x 5 V, 1 A
                           Linear actuators: 10 x 4.4 V, 1 A
                           Sensors: 4.4 V, 1 A
                           Controllers and processors: 5 V, 1.5 A

III. ARCHITECTURE

The architecture of the head robot system is shown in Fig. 2. The head robot consists of the following components:

• Processors: Arduino Mega 2560, Arduino Uno, and a laptop


Fig. 2. The architecture of the system.

• Actuators: as facial muscles; servo motors for controlling the eyes, eyelids and jaw, and linear actuators for generating the facial expressions.

• Sensors: as skin; capacitive sensors for detecting the proximity of humans, QTC pills for detecting hard pressure in small areas like the nose and ears, and piezoelectric sensors for picking up sudden changes in pressure.

• Cameras: as eyes; used for stereo vision and for detecting people, so that the robot notices the humans around it and gives them attention; additional image processing can also be performed.

• Microphone: as ears; to listen to people's words and their tone, and to allow speech processing.

• Speaker: as mouth; to speak to humans so that the head robot can communicate verbally with them.

As shown, various inputs from the environment are considered, i.e. vision, touch and hearing; these are processed, and the output is given in terms of speech, facial expression and movement of different parts of the face.

Here, the Arduino Uno is connected to the robot's eyes, eyelids and jaw. This processor sends the angle values to the servo motors attached to these parts, and they move accordingly.

The Arduino Mega is connected to the sensors, the linear actuators, the Arduino Uno and the laptop. It is the master processor, and the angles for the Arduino Uno come from the Arduino Mega. The Arduino Mega reads the values from the sensors and concurrently reads the actuator values from the laptop. If a value is intended for the Arduino Uno, it is forwarded there; otherwise it is sent to the linear actuators.
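A minimal sketch of this master loop is given below, purely as an illustration: the pin numbers follow Table IV and the slave address follows the Appendix listing, but the two-byte serial command format (target, angle) is a hypothetical stand-in for the project's actual protocol.

// Rough sketch of the Arduino Mega master loop described above.
// Pin numbers follow Table IV; the 2-byte laptop command (target, angle)
// and the Uno's I2C address are assumptions for illustration only.
#include <Wire.h>
#include <Servo.h>

Servo jawActuator;                    // one linear actuator, on pin 41 (Table IV)

void setup() {
  Wire.begin();                       // Arduino Mega acts as the I2C master
  Serial.begin(9600);                 // serial link to the laptop
  jawActuator.attach(41);
}

void loop() {
  int pressure = analogRead(A0);      // read one piezo sensor (PRS0)
  Serial.println(pressure);           // report the sensor value to the laptop

  if (Serial.available() >= 2) {      // hypothetical command: target byte + angle byte
    byte target = Serial.read();
    byte angle  = Serial.read();
    if (target == 1) {                // value intended for the Arduino Uno (eye/jaw servos)
      Wire.beginTransmission(2);      // the Uno listens at address 2 (see Appendix Listing 1)
      Wire.write(angle);
      Wire.endTransmission();
    } else {
      jawActuator.write(angle);       // otherwise drive a linear actuator directly
    }
  }
}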

The cameras, microphone and speaker are connected to the laptop. Image processing is done on the camera input. The microphone input is, for now, simply captured and sent to the laptop. We have written code so that the sound at the laptop's microphone is passed over to the robot's speakers. There is also text-to-speech code, so that words can be stored in a database and, depending on the mood of the robot (the sensor values), it can say certain things.

Fig. 3. The skull base.

IV. MECHANICAL STRUCTURE

The basic idea for the mechanical structure was taken from the TJ Animatronic Puppet, the first open-source robotic expression platform. [3]

The whole aluminium skull is fixed on a blue perspex piece, which is shown in Fig. 3. The engineering drawings of these pieces are given in Fig. 4. We got these pieces milled at the mechanical labs.

The eye box, which controls the turning of the eyes and the movement of the eyelids, is shown in Fig. 6. The engineering drawing of the eye box is shown in Fig. 5.

The SolidWorks assemblies are given in Figs. 7 and 8, where a two-layered platform is attached behind. These platforms are used to hold the servo motors and to guide the movement of the eyeball. The more detailed structure of the eye box is shown in Figs. 9 and 12.

The eyeballs and the eyelids are designed in such a way that they project out of this platform and the centre of gravity is balanced on both sides.

This projection holds the eyeball, and the holes in the side of this holder guide the eyelids so that they can open and close. The eyeball and the eyelids are powered by micro servo motors; a detailed view of this is shown in Figs. 10 and 11. The eyelids are again guided by servo motors. In this way the eye box is made in parts and attached together.

This eye box is placed inside the large hole of the skull base and clamped to the top of the skull base, as shown in Fig. 3. The teeth were then fixed onto the jaw using polymorph, as shown in Fig. 13.

This structure is quite weak, and we cannot place the latex mask directly over it. So we decided to make a metal casing that would be screwed to the skull base and provide a platform for the polymorph face and latex mask.


Fig. 4. The skull base engineering drawing.

Fig. 5. The skull base engineering drawing.

The metal casing consists of a head piece that covers the top part, metal jaws (taking up the space that the actual jaws occupy) and cheeks that protect the area under the eyes and over the metal jaw, as shown in Fig. 14. The metal casing also includes the eye protection cases shown in Fig. 15; these are put on at the end (after the eye box is placed inside).

The main objective of the metal casing is to protect the servo motors and mechanics inside the skull. There is no space inside the metal casing, so the processors are kept behind the head robot.

Fig. 6. The eye box for the eyes and eyelids mechanism.

Fig. 7. The eye box solidworks assembly.

Fig. 8. The eye box solidworks assembly.


Fig. 9. The eye ball solidworks assembly.

Fig. 10. The eye ball.


After the metal casing, a layer of polymorph is added. This polymorph resembles the human face more than the metal casing does. This layer is used for fitting the sensors and actuators, and it keeps the gap between the polymorph face and the latex mask small, so that a person touching the robot does not have to press hard or feel a hollow space underneath.

The polymorph face is shown in Fig. 16.

The polymorph face is placed over the metal face, and then the latex mask is placed over the polymorph. The fitting of the latex mask over the polymorph face is discussed in the actuator section.

Fig. 11. The eye ball.

Fig. 12. The eye lid mechanism.

V. SENSORS

Touch sensation is the amount of stress over the contact area between two surfaces. [4] Touch has two submodalities: the kinaesthetic submodality receives inputs from receptors within the tendons, muscles and joints, while the cutaneous submodality receives its inputs from the skin. [5] The second submodality is the one replicated in the present project. Consequently, pressure sensors located in the face are needed in order to obtain the input data. Nevertheless, there are many constraints, such as cost, the physical dimensions of the face, repeatability, and accuracy, on the selection of these devices. Based on these constraints, capacitive sensors were chosen to sense a caress, quantum tunnelling pills are utilized in small areas such as the ears and nose, and piezoelectric sensors are employed to sense harder interactions in large areas such as the cheeks, chin, forehead, and eyebrows. The analysis made to select the sensors is described in the design subsection.


Fig. 13. The skull base with eyes and teeth.

Fig. 14. Partial metal casing over the skull base.

A. Justification of the sensors

The sensors have to be able to differentiate between a caress and a hard touch. Additionally, they have to cover most of the face area. However, each sensor has to be small enough to identify the location of the touch.

For the sensors, the main input is the variation of pressure when the face is touched.

Making capacitive sensors for Arduino is simple: it just requires a resistor and an electrically conductive material. Such a sensor can detect the electrical capacitance of a human through more than a quarter of an inch of insulating material. The sensing distance depends on the value of the resistor: the larger the resistance, the larger the range of the sensor. However, its time response also grows with the resistance, since the time constant is

Fig. 15. The full metal casing

Fig. 16. The polymorph face


Fig. 17. The capacitive sensor electric circuit.

Fig. 18. The piezo electric circuit diagram.

equal to R*C. Other advantages are the cost, dimensions, and shapes. [6]
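As a rough illustration of this principle, the sketch below reads one copper pad through the CapacitiveSensor library from the Arduino playground page cited in [6]. The send/receive pins follow the capacitive-sensor rows of Table IV, while the caress threshold is an assumed, uncalibrated value.

// Sketch of reading one capacitive pad with the CapacitiveSensor library [6].
// A high-value resistor sits between the send pin (22) and the sense pin (26,
// the eyebrow pad in Table IV); the threshold below is an assumption.
#include <CapacitiveSensor.h>

CapacitiveSensor eyebrowPad = CapacitiveSensor(22, 26);   // send pin, sense pin

void setup() {
  Serial.begin(9600);
}

void loop() {
  long reading = eyebrowPad.capacitiveSensor(30);  // average of 30 samples
  if (reading > 200) {                             // assumed caress threshold
    Serial.println("caress detected on the eyebrows");
  }
  delay(50);
}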

On the other hand, the piezoelectric sensors have a fixed shape. They are available in a limited variety of sizes, but their repeatability is better than that of the capacitive sensors. As a consequence, they are employed to measure the pressure applied to the skin. Their electrical connection requires just a resistor, because when pressure is applied to the sensor a voltage is generated in the device. However, they are too big to be placed in areas such as the nose and ears.

In order to sense pressure in small areas of the face, quantum tunnelling pills, whose size is 3 mm x 3 mm, are a good alternative. Their output signal is a change of resistance: the resistance is practically infinite when no pressure is applied, and drops towards zero depending on the amount of pressure applied.

B. Circuits of the sensors

The capacitive sensors are made of copper tape. Their size and shape are defined by the space available on the surface. The aim of using this sensor is to cover large areas in order to sense a caress. As a result, the design contemplates 7 sensors, covering the forehead, left cheek, right cheek, chin, eyebrows, nose, and lips. The conditioning circuit is illustrated in Fig. 17. The resistor R2 is employed to protect the pin of the controller.

The piezoelectric sensor circuit implemented is the one recommended by [7]. Based on the size of the sensor and of the face, 15 piezoelectric sensors are required.
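With this conditioning, the piezo element drives an analogue input directly, so a sudden press shows up as a voltage spike. A minimal reading loop, assuming the PRS0 pin from Table IV and an arbitrary threshold, could look like this:

// Sketch of reading one piezoelectric sensor (PRS0 on A0, see Table IV).
// The threshold separating a hard touch from noise is an assumed value.
const int piezoPin = A0;
const int touchThreshold = 100;         // assumed; tune against real readings

void setup() {
  Serial.begin(9600);
}

void loop() {
  int reading = analogRead(piezoPin);   // 0..1023
  if (reading > touchThreshold) {
    Serial.println("hard touch on PRS0");
    delay(100);                         // simple debounce after a detection
  }
}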

On the other hand, Fig. 19 illustrates the circuit for the QTC sensors. It is a voltage divider circuit: if the pressure is zero, the input at the pin is also zero, and if pressure is

Fig. 19. The QTC circuit diagram.

Fig. 20. The prototype of the capacitive sensors.

applied, the resistance decreases and the input at the pin rises above zero. Because of the limited area available in the ears and nose, these sensors are placed in those locations.
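A small reading sketch for these dividers, assuming the QTC pins of Table IV and an arbitrary pressure threshold, might look as follows:

// Sketch of reading the three QTC voltage dividers (pins per Table IV).
// The reading is 0 with no pressure and rises as the pill's resistance drops;
// the threshold is an assumption, not a calibrated value.
const int qtcPins[3]    = {A13, A14, A15};                // left ear, right ear, nose
const char* qtcNames[3] = {"left ear", "right ear", "nose"};
const int pressThreshold = 300;                           // assumed

void setup() {
  Serial.begin(9600);
}

void loop() {
  for (int i = 0; i < 3; i++) {
    int value = analogRead(qtcPins[i]);
    if (value > pressThreshold) {
      Serial.print("pressure on ");
      Serial.println(qtcNames[i]);
    }
  }
  delay(50);
}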

The full circuit of all the sensors is given in Fig. 40 in the Appendix.

C. Location of the sensors

In order to test the performance of the sensors, a prototype phase was implemented. Lacking a face-like surface, the sensor performance evaluation was made using the back of a chair; its simplicity, strength, size, shape, and accessibility were the reasons for selecting it as a test surface. 7 capacitive sensors were tested at the beginning, but because of the space available on the face they were reduced to 5. Fig. 20 shows the prototype of the capacitive sensors on the chair, and Fig. 21 illustrates the implementation phase for the capacitive sensors.

On the other hand, the piezoelectric sensors are attached to the latex. Every sensor is made of at least 2 QTC pills


Fig. 21. The capacitive sensors on the polymorph face.

Fig. 22. The piezo electric sensors on the polymorph face.

connected in parallel. Fig. 22 shows the distribution of these sensors.

The locations of the sensors on the polymorph face are given in Figs. 23 and 24.

VI. ACTUATORS

A. Justification for the actuators

Actuators are needed to show different emotions. Their size should be small because of the limited space available on the face. We need enough actuators on the face to pull the latex mask by 1-1.5 cm easily and to produce a range of facial expressions smoothly. Additionally, another type of actuator is required for moving the jaw, eyes, and eyelids. These can be the same size or bigger, because they can be placed inside the head, but their torque has to be sufficient to move these parts.

Fig. 23. The location of the capacitive sensors on the polymorph face.

Fig. 24. The location of the piezo electric sensors on the polymorph face.

B. Different actuators considered for producing facial expressions

The following table lists the materials and actuators that were researched as potential facial-muscle actuators, along with their advantages and drawbacks.


Fig. 25. The NiTi experiment.

Actuator Name                  Advantages                                Disadvantages
Pneumatic Muscles              light weight; uses low power;             needs huge and bulky air
                               high flow rates for rapid                 compressors and air reservoirs
                               contract-relax cycling [8]
Electro Active Polymer         very flexible; very natural behaviour     expensive; not easily available
Shape Memory Alloy             light; compact; strong [9], [10]          long cooling time
Ultra Micro Linear Actuators   tiny; cheap                               -

C. Experiments with Shape Memory Alloy

We carefully studied the experiments performed by researchers on SMA and the videos available online. We tried to replicate a simple experiment, shown in Fig. 25, based on the ones given in [9], to check the effectiveness of NiTi wires.

5 V was supplied by an external power supply and around 2.14 A flowed through the circuit. It took approximately 1.5 minutes for the wire to contract by 1 cm and another 1.5 minutes to expand back to its original size. The whole setup is also quite large, though we did consider putting these structures behind the head and having the SMA structure pull wires attached to the latex. The cooling time was unacceptable, so we moved on to other actuators.

D. Experiments with ultra micro linear actuators

We then considered ultra micro linear actuators because of their advantages and suitability for this project. An ultra micro linear actuator looks like the one shown in Fig. 26.

Such a fragile component cannot be placed unprotected under the polymorph face, because there is a risk of some

Fig. 26. The ultra micro linear actuators placed besides a finger and a penny.

Fig. 27. The ultra micro linear actuators placed inside a casing.

mechanical component breaking it, of its electrical circuit being disturbed by flux around it, and so on. So it becomes necessary to cover it with a mechanical protective casing, as shown in Fig. 27.

Here, pin connectors are placed through the legs of the actuator, and two PCB plates are placed above and below it. A solder wire is passed through the hole in the arm of the linear actuator. An aluminium piece, with a hole drilled through it, is kept on the PCB plate, and the solder wire is passed through this hole too. The two ends of the solder wire are placed on the aluminium piece and fastened there with velcro.

E. Actuator circuits

The circuits of the actuators are shown in Fig. 28. The difference between the two types of servo motors is the supply voltage: the actuators for showing expressions require a power supply from +3.3 V to +4.8 V, while the actuators used for moving the jaw, eyes and eyelids need +5 V.


Fig. 28. The actuator circuit.

Fig. 29. The location of the linear actuators below the polymorph face.

The full circuit design is shown in Fig. 41 in the Appendix.

F. Location of the actuators

We studied where the actuators need to be placed to produce the 6 basic expressions: happy, sad, anger, surprise, disgust and neutral. [11], [12] Looking at the size of one linear actuator unit, as shown in Fig. 27, we will only be able to place around 10 below the polymorph face. Considering the available space below the polymorph, as well as the space between the polymorph and the mechanical structure, we decided to place the linear actuators at the positions shown in Fig. 29. The actuators can be seen below the polymorph from the top view in Fig. 30 and from the bottom view in Fig. 31.

To attach the linear actuators to the latex mask, as shown in Fig. 32, velcro was stuck under the latex mask and both sides of the velcro were fastened.

VII. PROCESSORS

A. Microcontroller selection

Several parameters have to be defined in order to choose the microcontroller. These requirements are summarized in Table I.

Based on Table I, the Arduino Mega 2560 microcontroller board was selected. It is illustrated in Fig. 33 and its features are described in Table II.


Fig. 30. The location of the linear actuators below the polymorph face.

Fig. 31. The location of the linear actuators below the polymorph face.

Fig. 32. The latex mask.


TABLE I
MICROCONTROLLER REQUIREMENTS

Device                      Qty   Analogue/Digital   Input/Output   # Inputs   # Outputs
Capacitive sensors            7          D                 I            7          -
Piezoelectric sensors        15          A                 I           15          -
QTC sensors                   3          A                 I            3          -
Actuators +4.4 V             22          D                 O            -         22
Actuators +5 V                5          D                 O            -          5
I2C Comm. port                1          D                I/O           1          1
Serial Comm. port             1          D                I/O           1          1
Total of inputs / outputs                                              27         29

Fig. 33. The Arduino Mega 2560 microcontroller.

Although the Arduino Mega board has enough digital ports to handle the 5 servomotors whose power supply is +5 V, the Induino board was chosen to manage them, because it provides a parallel processor rather than concurrent software processing. During the testing phase, a single board could not perform the sensing and the control of both types of actuators in a way that appears simultaneous to human perception. As a result, the Induino board was selected to control these 5 servomotors. Fig. 34 shows the Induino board, Fig. 35 illustrates the shield used to control the motors, and Table III summarizes its features. [13], [14]

B. Pin connections

The connections of the Arduino Mega 2560 to the different sensors and the linear actuators are given in Table IV, and those for the Induino are given in Table V.

VIII. POWER SUPPLY

We needed 5 V and 3 A to drive our circuits, splitting the 3 A into two parts: 1 A for the Induino and 2 A for the Arduino

TABLE II
ARDUINO MEGA 2560 MICROCONTROLLER BOARD MAIN FEATURES

Feature                        Description
Microcontroller                ATmega2560
Operating voltage              5 V
DC current per I/O pin         40 mA
Digital I/O pins               54
Analogue inputs                16
Serial communication ports     4
TWI communication ports        1
Flash memory                   256 KB
SRAM                           8 KB
EEPROM                         4 KB
Clock speed                    16 MHz

Fig. 34. The Induino (Arduino Uno) microcontroller.

Mega 2560. First, we bought a 5 V, 10 A power supply unit, shown in Fig. 36, but it blew some of the servo motors.

We then switched to a 5 V USB power supply with two outputs of 2.1 A and 1 A.

Fig. 35. The Induino (Arduino Uno) microcontroller's shield.


TABLE III
INDUINO MICROCONTROLLER BOARD MAIN FEATURES

Feature                        Description
Microcontroller                ATmega328/168/88
Operating voltage              5 V
DC current per I/O pin         40 mA
Digital I/O pins               23
Analogue inputs                8
Serial communication ports     1
TWI communication ports        1
Flash memory                   32 KB
SRAM                           2 KB
EEPROM                         1 KB
Clock speed                    16 MHz

TABLE IV
ARDUINO MEGA 2560 PIN CONNECTIONS

Device                   Name               Pin
Capacitive sensors       Send               22
                         Eyebrows           26
                         Right cheek        30
                         Left cheek         34
                         Chin               38
                         Forehead           42
Piezoelectric sensors    PRS0               A0
                         PRS1               A1
                         PRS2               A2
                         PRS3               A3
                         PRS4               A4
                         PRS5               A5
                         PRS6               A6
                         PRS7               A7
                         PRS8               A12
                         PRS9               A8
                         PRS10              A9
                         PRS11              A10
                         PRS12              A11
QTC sensors              Left ear           A13
                         Right ear          A14
                         Nose               A15
Linear actuators         Left mouth         23
                         Left cheek         25
                         Left eye           27
                         Left eyebrow       29
                         Forehead           31
                         Right eyebrow      33
                         Right eye          35
                         Right cheek        37
                         Right mouth        39
                         Jaw                41

A. Voltage regulator

The linear actuators work at voltage levels between 3.3 V and 4.8 V [15]. To step the voltage down from 5 V to about 4.4 V, a rectifier diode (1N5400) is used; its forward voltage drop of roughly 0.6 V accounts for the reduction.

IX. SOFTWARE

A. Arduino code

1) Master-slave communication between the microcontrollers: I2C is a multi-master, single-ended serial computer bus invented by Philips. [16] For Charlie, we used I2C to communicate between the Arduino Mega and the Arduino Uno: the Arduino Mega acts as the master and the Arduino Uno acts as the slave. The code for communication between the master and the slave is in Appendix Listing 1. On the Arduino Mega, pins 20 and 21 are used for I2C communication,

TABLE V
INDUINO PIN CONNECTIONS

Device        Name                      Pin
Servomotors   Eyes horizontal motion    6
              Eyes vertical motion      9
              Left eyelid               11
              Right eyelid              12
              Jaw                       13

Fig. 36. The old power supply unit.

and on the Arduino Uno, analog pins 4 and 5 are used. Analog pin 4 (SDA) of the Arduino Uno is connected to pin 20 (SDA) of the Arduino Mega, and analog pin 5 (SCL) is connected to pin 21 (SCL); these two lines form the I2C connection between the two controllers.

The reason for adopting I2C communication is that we wanted Charlie to do multi-processing. The difference between multi-threading and multi-processing is that in multi-threading the processor time is shared and processes are switched between different states to obtain apparently simultaneous work, whereas in multi-processing separate processors handle separate processes, giving us double the speed and double the accuracy. Controllers are usually not very quick at switching between processes, so using multiple processors is the better option when building robots.

2) Expression code: So far, three basic expressions have been created. The code is given in Appendix Listing 2. The values used for the linear actuators to produce anger, surprise and happiness are given in Table VI. For illustration, the happy expression is shown in Fig. 37.

B. Python code

1) Image processing: We have used OpenCV for the image processing. [17] Stereo vision code has been implemented, face tracking is done using Haar cascades, and the iris is detected using the method described in [18].
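The project's vision code is written in Python; the fragment below sketches the same Haar-cascade face detection through OpenCV's C++ interface, purely to show the call sequence. The cascade file and camera index are the stock OpenCV defaults, not values taken from the project.

// Sketch of Haar-cascade face tracking with OpenCV (C++ API shown here;
// the project itself uses the equivalent Python bindings [17]).
#include <opencv2/opencv.hpp>
#include <vector>

int main() {
  cv::CascadeClassifier faceCascade;
  faceCascade.load("haarcascade_frontalface_default.xml"); // stock OpenCV cascade file

  cv::VideoCapture camera(0);            // one of the two eye cameras
  cv::Mat frame, gray;

  while (camera.read(frame)) {
    cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
    std::vector<cv::Rect> faces;
    faceCascade.detectMultiScale(gray, faces, 1.1, 3);      // detect frontal faces
    for (const cv::Rect& face : faces) {
      cv::rectangle(frame, face, cv::Scalar(0, 255, 0), 2); // mark each detection
    }
    cv::imshow("Charlie vision", frame);
    if (cv::waitKey(10) == 27) break;    // stop on Esc
  }
  return 0;
}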


TABLE VI
THE VALUES OF THE LINEAR ACTUATORS TO PRODUCE DIFFERENT EXPRESSIONS

Expression   Pin Number   Angle Value
Happy        23           180
             25           180
             37           180
             39           180
             41           0
Anger        27           0
             29           0
             31           0
             33           0
             35           180
Surprise     27           180
             29           180
             31           180
             33           180
             35           0

Fig. 37. The linear actuator values for the happy expression, with the pin numbers of all the linear actuators.

2) Speech processing: The PyAudio library has been used for transferring speech from the microphone to the speaker. The pyttsx engine has been used for producing speech from text.

C. Higher level communication between all the components

We have devised a higher-level protocol based on packets, as shown in Fig. 38. The first character is a control number, followed by the data payload and then a packet terminator character. Control number one is reserved for deciding the mode: if mode = 0 the robot operates in autonomous mode, and if mode = 1 it operates in Skype mode. Control numbers 2 to 4 are used for sending the actuator values and reading the sensors, as illustrated in Fig. 39.
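A sketch of how such packets might be parsed on the Arduino side is given below. The source does not spell out the terminator character or the payload encoding, so the newline terminator, the ASCII control characters and the single sensor reply used here are assumptions for illustration only.

// Illustrative parser for the higher-level packets described above.
// Assumed layout: [control character][payload][terminator '\n'].
int mode = 0;                            // 0 = autonomous mode, 1 = Skype mode

void setup() {
  Serial.begin(9600);
}

void loop() {
  if (Serial.available() == 0) return;

  char control = Serial.read();                    // first character: control number
  String payload = Serial.readStringUntil('\n');   // data up to the assumed terminator

  switch (control) {
    case '1':                            // mode packet
      mode = payload.toInt();            // 0 = autonomous, 1 = Skype
      break;
    case '2':                            // actuator values (encoding illustrative only)
    case '3':
      // forward the payload to the servo / linear-actuator handlers here
      break;
    case '4':                            // sensor read request
      Serial.print('4');
      Serial.println(analogRead(A0));    // reply with one sensor value as an example
      break;
  }
}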

Fig. 38. The communication unit for higher level communication.

Fig. 39. The communication unit for higher level communication.

X. POSSIBLE APPLICATIONS

Such a robot could have multiple uses:

• Telenoid: This could be the next step beyond Skype, where the emotions of the caller are captured and transferred onto the robotic head, making loved ones feel closer.

• Instead of a boring navigator, have a human-like head that sits next to you in the car, communicates the route, and with which you can share your personal feelings.

• A robot that can understand your feelings and emotions and help you get out of stress.

• Record the incidents around a person and learn from them to create a complete digital copy of that person, which can live forever and develop its own personality, thus extending human life.

• To take care of the elderly emotionally and physically.
• Be a personal assistant or even a companion.

XI. CONCLUSION

We made a robotic human head that can see, hear and talk. It can sense people touching it and respond. It is able to move its eyes, eyelids and jaw, and to make facial expressions such as anger, surprise and happiness. Thus, we built a basic human head with some basic functionality that can serve as a platform on which AI software can be built.

REFERENCES

[1] C. Breazeal, Sociable Machines: Expressive Social Exchange Between Humans and Robots. PhD thesis, Department of Electrical Engineering and Computer Science, MIT, 2000.

[2] S. Nishio, H. Ishiguro, and N. Hagita, Geminoid: Teleoperated Android of an Existing Person. Vienna, Austria: I-Tech Education and Publishing, 2007.

[3] "TJ animatronic puppet." http://tjrobot.weebly.com/.

[4] G. Robles-De-La-Torre and V. Hayward, "Force can overcome object geometry in the perception of shape through active touch," Nature, vol. 412, pp. 445–448, July 2001.

[5] R. S. Dahiya, G. Metta, M. Valle, and G. Sandini, "Tactile sensing from humans to humanoids," IEEE Transactions on Robotics, vol. 26, pp. 1–20, Feb. 2010.

[6] "Arduino capacitive sensing." http://playground.arduino.cc//Main/CapacitiveSensor?from=Main.CapSense.

[7] "Arduino piezo electric sensing." http://arduino.cc/en/Tutorial/Knock.


[8] D. Majoe et al., "Pneumatic air muscle and pneumatic sources for light weight autonomous robots," in 2011 IEEE International Conference on Robotics and Automation, pp. 3243–3250, May 2011.

[9] A. Nespoli, S. Besseghini, S. Pittaccio, E. Villa, and S. Viscuso, "The high potential of shape memory alloys in developing miniature mechanical devices: A review on shape memory alloy mini-actuators," vol. 158, pp. 149–160, May 2010.

[10] J. D. W. Madden, N. A. Vandesteeg, P. A. Anquetil, P. G. A. Madden, A. Takshi, R. Z. Pytel, S. R. Lafontaine, P. A. Wieringa, and I. W. Hunter, "Artificial muscle technology: Physical principles and naval prospects," vol. 29, pp. 706–728, July 2004.

[11] H. Kobayashi and F. Hara, "Study on face robot for active human interface: mechanisms of face robot and expression of 6 basic facial expressions," pp. 276–281, 1993.

[12] T. Wu, N. J. Butko, P. Ruvolo, M. S. Bartlett, and J. R. Movellan, "Learning to make facial expressions," in 2009 IEEE 8th International Conference on Development and Learning, pp. 1–6, 2009.

[13] "Induino site." http://induino.blogspot.co.uk/.

[14] "Induino specification site." http://www.atmel.com/devices/atmega328p.aspx?tab=parameters.

[15] "Ultra micro servo 1.7g for 3D flight." http://www.hobbyking.com/hobbyking/store/ 11737 HobbyKing Ultra Micro Servo 1 7g for 3D Flight Left .html.

[16] "I2C Wikipedia page." http://en.wikipedia.org/wiki/I

[17] "OpenCV official site." http://opencv.willowgarage.com/wiki/.

[18] J.-G. Wang, E. Sung, and R. Venkateswarlu, "Eye gaze estimation from a single image of one eye," in Proceedings of the Ninth IEEE International Conference on Computer Vision (ICCV 2003), 2003.


APPENDIX

A. I2C Code

Listing 1. I2C master-slave communication Arduino code

Master code:

// Wire Master Code
// by Sriram Kumar K
// Created 29 March 2013
// This example code is in the public domain.

#include <Wire.h>

void setup()
{
  Wire.begin();          // join the I2C bus as master
  Serial.begin(9600);
}

byte x = 0;

void loop()
{
  int slaveno = 2;
  int dataLength = 19;
  sendData(slaveno);
  slaveno = 3;
  sendData(slaveno);
}

void sendData(int slaveno)
{
  Wire.beginTransmission(slaveno); // transmit to the selected slave
  Wire.write("x is ");             // sends five bytes
  Wire.write(x);                   // sends one byte
  Wire.endTransmission();          // stop transmitting

  x++;
  delay(500);
}

void receiveData(int slaveno, int amountofData)
{
  Wire.requestFrom(slaveno, amountofData); // request bytes from the slave device

  while (Wire.available())  // slave may send less than requested
  {
    char c = Wire.read();   // receive a byte as a character
    Serial.print(c);        // print the character
  }
  Serial.println();
  delay(500);
}


Slave code:

// Wire Slave Code
// by Sriram Kumar K
// Created 29 March 2013

#include <Wire.h>

void setup()
{
  Wire.begin(2);                  // join the I2C bus with address #2
  Wire.onRequest(sendEvent);      // register the request event
  Wire.onReceive(receiveEvent);   // register the receive event
  Serial.begin(9600);
}

void loop()
{
  delay(100);
}

void sendEvent()
{
  Wire.write("message from slave 2"); // respond with a message, as expected by the master
}

void receiveEvent(int howMany)
{
  while (1 < Wire.available())  // loop through all but the last byte
  {
    char c = Wire.read();       // receive byte as a character
    Serial.print(c);            // print the character
  }
  int x = Wire.read();          // receive the last byte as an integer
  Serial.println(x);            // print the integer
}

B. Expression Code

Listing 2. Facial expressions Arduino code

// Linear actuator code to produce facial expressions
// by Devangini Patel

#include <Servo.h>

const int servoCount = 10;
const int dontMove = 200;        // sentinel value: leave this actuator where it is

Servo muscles[servoCount];
int pos = 0;                     // variable to store the servo position
int startPinNumber = 23;

boolean consider[servoCount] = {true, true, true, true, true, true, true, true, true, true};

int pinValues[servoCount]      = {23, 25, 27, 29, 31, 33, 35, 37, 39, 41};
int initialValues2[servoCount] = {0, 0, 0, 0, 0, 0, 0, 0, 0, 0};
int initialValues[servoCount]  = {180, 0, 90, 90, 90, 90, 90, 180, 180, 90};

int smileLength = 90;
int smileEndValues[servoCount]    = {0, 180, dontMove, dontMove, dontMove, dontMove, dontMove, 0, 0, dontMove};

int surpriseLength = 90;
int surpriseEndValues[servoCount] = {dontMove, dontMove, 0, 0, 0, 0, 180, dontMove, dontMove, dontMove};

int angerLength = 90;
int angerEndValues[servoCount]    = {dontMove, dontMove, 180, 180, 180, 180, 0, dontMove, dontMove, dontMove};

void initialSetup() {
  for (int i = 0; i < servoCount; i++) {
    muscles[i].write(initialValues[i]);    // move every actuator to its neutral position
  }
}

void doAction(int startValues[], int endValues[], int actionLength) {
  for (int j = 0; j < actionLength; j++) {
    for (int i = 0; i < servoCount; i++) {
      if (endValues[i] != dontMove) {
        int delta = endValues[i] - startValues[i];
        muscles[i].write(startValues[i] + j * delta / actionLength);  // interpolate towards the target
      }
    }
    delay(10);
  }
  delay(1000);
}

void simpleRun() {
  for (pos = 0; pos < 180; pos += 1)     // goes from 0 degrees to 180 degrees in steps of 1 degree
  {
    for (int j = 0; j < servoCount; j++)
      if (consider[j])
        muscles[j].write(pos);           // tell the servo to go to the position in variable 'pos'
    delay(10);                           // wait for the servo to reach the position
  }
  for (pos = 180; pos >= 0; pos -= 1)    // goes from 180 degrees back to 0 degrees
  {
    for (int j = 0; j < servoCount; j++)
      if (consider[j])
        muscles[j].write(pos);           // tell the servo to go to the position in variable 'pos'
    delay(10);                           // wait for the servo to reach the position
  }
}

void setup()
{
  for (int i = 0; i < servoCount; i++)
    muscles[i].attach(pinValues[i]);     // attach each actuator to its pin (see Table IV)
  initialSetup();
}

void loop()
{
  simpleRun();
  doAction(initialValues, smileEndValues, smileLength);
  doAction(initialValues, surpriseEndValues, surpriseLength);
  doAction(initialValues, angerEndValues, angerLength);
  initialSetup();
}

C. Circuit Designs


Fig. 40. The electric circuit design for the sensors.


Fig. 41. The electric circuit design for the actuators.

