GESTURE RECOGNITION

BY

AMREEN AKTHAR .J.

AKSHAYA .B.

III YEAR E.C.E

PANIMALAR ENGINEERING COLLEGE

INTERNATIONAL CONFERENCE ON CURRENT TRENDS IN ENGINEERING RESEARCH, ICCTER - 2014

INTERNATIONAL ASSOCIATION OF ENGINEERING & TECHNOLOGY FOR SKILL DEVELOPMENT www.iaetsd.in

ISBN: 378-26-138420-01


Abstract—The tongue and the ears play the major role in speaking and hearing for most people, but people who are deaf or mute cannot communicate in this way. They normally communicate with others using sign gestures, which are easily understood within their own community; communication with hearing people is difficult, however, because most hearing people cannot understand the sign symbols. To tackle this situation we design a system that converts sign symbols to text as well as voice output, and converts a hearing person's voice to the corresponding sign symbol, providing two-way communication. The system has flex sensors and an IMU (Inertial Measurement Unit) to recognize sign symbols, a speech synthesis chip for voice output, and a speech recognition module for converting voice to sign symbols. These are interfaced with a microcontroller, which is programmed to produce the corresponding output.

Keywords— flex sensor, IMU, SpeakJet IC, speech recognition module.

I. INTRODUCTION

Many researchers have worked to overcome the difficulties faced by physically challenged people. Several have developed prosthetic-hand systems that model the behavior of the human hand. This project is of a similar kind: it determines sign words from hand motion.

The main features of this system are that it is a hand-held, real-time embedded device; it is low cost and reliable; and it is battery operated.

In order to obtain accurate words or sentences, an IMU is used to find the exact position of the hand: the flex sensors alone give the same values regardless of where the hand is held, and the IMU overcomes this limitation. It is mounted on the hand along with the flex sensors, so its values change according to the hand's position, ensuring that a word is recognized only when the hand is held in the required position. These values are fed to the microcontroller, which is preprogrammed to display the corresponding words. At the same time, voice output for the words is produced with the help of the SpeakJet chip. In the other direction, a hearing person's voice is converted and displayed as the corresponding pre-stored sign symbol.

II. RELATED WORK

L. K. Simone introduced a low-cost method to measure finger flexion, using flex sensors and evaluating a custom glove for measuring finger motion; the parameters evaluated include donning, glove comfort, and durability [1]. Wald developed software for editing automatic speech recognition output in real time for deaf and hard-of-hearing people [2]. Syed Faiz Ahmed, Syed Baber Ali, and Saqib Qureshi developed an electronic speaking glove for speechless patients, which provides only one-way communication [3]. Jingdong Zhao developed a five-fingered underactuated prosthetic hand system [4].

III. SYSTEM LAYOUT

Figure 1 below shows the proposed system module. In this system, flex sensors are used to recognize the finger positions needed to obtain words, phrases, and sentences. The sensor values are signal conditioned using an LM324 IC and other components and then given as input to the microcontroller. To obtain the sign symbol, word, or phrase accurately, the microcontroller is also interfaced with an IMU consisting of a gyroscope, an accelerometer, and a magnetometer; this sensor is used to determine the tilt, rotation, and rate of movement of the hand. By comparing the flex sensor and IMU values, the microcontroller displays the corresponding word or phrase. As an option, the captured words can also be sent to a mobile phone over a Bluetooth module. The output of the SpeakJet IC is fed to a speaker, which speaks according to the phonemes stored in the controller for the captured flex sensor and IMU values.

Similarly, the voice of a hearing person is captured by a microphone and fed to the microcontroller through the speech recognition module. The preprogrammed controller then displays the corresponding sign symbol.

Figure 1. Block diagram.
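To make the flow of figure 1 concrete, the sketch below shows how the main loop of such a device could tie the blocks together, in the embedded C style used later in the paper (CCS for PIC). The type and function names are illustrative assumptions, not the authors' actual code; the individual helpers are sketched in the sections that follow.

/* Illustrative overall flow (an assumed structure, not the authors' code).
   All helper functions below are hypothetical placeholders.              */

typedef struct { signed int16 ax, ay, az, gx, gy, gz, mx, my, mz; } imu_sample_t;

void        read_flex_sensors(unsigned int16 flex[5]);
void        read_imu(imu_sample_t *s);
signed int8 match_gesture(unsigned int16 flex[5], imu_sample_t *s);
void        lcd_show(signed int8 word);
void        speakjet_say(signed int8 word);
void        bt_send(signed int8 word);
void        poll_speech_module(void);

void main(void)
{
    unsigned int16 flex[5];
    imu_sample_t   imu;
    signed int8    word;

    while (1)
    {
        read_flex_sensors(flex);           /* one ADC reading per finger      */
        read_imu(&imu);                    /* MinIMU-9 values over I2C        */
        word = match_gesture(flex, &imu);  /* compare against stored template */
        if (word >= 0)
        {
            lcd_show(word);                /* text output on the LCD          */
            speakjet_say(word);            /* phoneme codes to the SpeakJet   */
            bt_send(word);                 /* optional Bluetooth output       */
        }
        poll_speech_module();              /* VRbot: voice -> sign symbol     */
        delay_ms(50);
    }
}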

IV. IMPLEMENTATION

There are several methods for obtaining signals from the fingers for sign recognition, including [5]: EMG (electromyography), load cells, wearable conductive fiber, a sliding fiber-optic cable, and flex sensors.

In this system, flex sensors are used to recognize hand gestures because they are reliable and cost effective.

A. Flex sensor:

Flex sensor technology is based on resistive carbon elements. Acting as a variable printed resistor, the sensor achieves a compact form factor on a thin substrate. When the substrate is bent, the sensor produces a resistance correlated with the bend radius, as shown in figure 2: the smaller the radius, the higher the resistance. The resistance varies approximately from 10 kΩ to 50 kΩ.

It offers a good solution for applications that require accurate measurement and sensing of deflection. The sensors are placed inside gloves worn by the user. As a finger is bent for a given word, the resistance changes; this value is signal conditioned and then fed to the controller.

Figure 2. Flex sensor.

Signal conditioning circuit: The circuit is shown in figure 3. For a simple deflection-to-voltage conversion, the bend sensor is tied to a resistor Rm in a voltage divider configuration. The output is described by

Vout = (V+) / (1 + Rm / Rbend)

where Rbend is the bend-sensor resistance.

Figure 3. Flex sensor signal conditioning circuit.

The sensor output is buffered by an LM324 op-amp to boost the available current. For different values of the resistor Rm, different deflection-versus-voltage curves are obtained, as shown in figure 4.

Figure 4. Deflection vs Vout [6].
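As a concrete illustration of the voltage-divider relation above, the fragment below recovers the bend-sensor resistance from a single ADC sample. The 5 V supply, the 22 kΩ value for Rm, and the 10-bit ADC are assumptions for the sketch; the paper does not specify them.

/* Recover the bend-sensor resistance from one 10-bit ADC sample.
   V+ = 5.0 V and Rm = 22 kOhm are illustrative values only.        */

#define ADC_FULL_SCALE  1023.0     /* 10-bit ADC                    */
#define V_SUPPLY        5.0        /* assumed V+ in volts           */
#define R_M             22000.0    /* assumed divider resistor Rm   */

float flex_resistance(unsigned int16 adc_value)
{
    float vout = (adc_value / ADC_FULL_SCALE) * V_SUPPLY;

    /* From Vout = (V+) / (1 + Rm / Rbend):
       Rbend = Rm * Vout / (V+ - Vout)                              */
    if (vout >= V_SUPPLY)          /* guard against division by zero */
        return 1e9;
    return R_M * vout / (V_SUPPLY - vout);
}

In a CCS program the ADC sample itself would typically be obtained with set_adc_channel() and read_adc(), as in the compiler skeleton shown in section V.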

B. Inertial Measurement Unit:

The IMU used in this system is the MinIMU-9 v2 from Pololu. It consists of a 3-axis accelerometer, a 3-axis magnetometer, and a 3-axis gyroscope. An I2C interface gives access to the nine independent rotation, acceleration, and magnetic measurements, which can be used to calculate the sensor's absolute orientation.

L3GD20: This is a low-power 3-axis angular rate sensor that provides the measured angular rate to the outside world over an I2C interface. The direction of the angular rate axes is shown in figure 5.

Figure 5. Direction of angular rate.

LSM303DLHC: This combines a digital linear accelerometer and a magnetometer in a single package. It supports the standard- and fast-mode I2C serial interface. The directions of acceleration and magnetic field are shown in figures 6 and 7.

Figure 6. Direction of acceleration. Figure 7. Direction of magnetic field.
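The paper does not show the IMU driver code; a minimal sketch of reading the L3GD20 gyroscope over I2C with the CCS built-in I2C routines could look as follows. The register addresses follow the ST L3GD20 datasheet, and the 8-bit bus addresses assume the SA0 line is pulled high as on the Pololu board; both should be checked against the datasheet. The LSM303DLHC accelerometer and magnetometer are read in the same way at their own addresses.

/* Minimal L3GD20 read over I2C (CCS built-in routines).  Assumes a
   #use i2c(MASTER, SDA=PIN_C4, SCL=PIN_C3) directive in the project;
   pins are an assumption.                                            */

#define GYRO_W      0xD6       /* L3GD20 write address, SA0 = 1 (8-bit form) */
#define GYRO_R      0xD7       /* L3GD20 read address                        */
#define CTRL_REG1   0x20
#define OUT_X_L     0x28
#define AUTO_INC    0x80       /* MSB of sub-address enables auto-increment  */

void gyro_init(void)
{
    i2c_start();
    i2c_write(GYRO_W);
    i2c_write(CTRL_REG1);
    i2c_write(0x0F);           /* normal mode, X/Y/Z axes enabled            */
    i2c_stop();
}

void gyro_read(signed int16 *gx, signed int16 *gy, signed int16 *gz)
{
    unsigned int8 raw[6];
    int i;

    i2c_start();
    i2c_write(GYRO_W);
    i2c_write(OUT_X_L | AUTO_INC);
    i2c_start();                       /* repeated start for the read        */
    i2c_write(GYRO_R);
    for (i = 0; i < 6; i++)
        raw[i] = i2c_read(i < 5);      /* NACK the last byte                 */
    i2c_stop();

    *gx = make16(raw[1], raw[0]);      /* output registers are little-endian */
    *gy = make16(raw[3], raw[2]);
    *gz = make16(raw[5], raw[4]);
}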

Whenever the position of the hand changes for a particular word, the IMU values change accordingly. The IMU is interfaced with the microcontroller. By comparing the flex sensor and IMU values, the correct word or phrase, with the correct hand position, can be recognized and displayed on the LCD.

C. SpeakJet IC

The SpeakJet IC is a self-contained, single-chip sound synthesizer based on Mathematical Sound Architecture technology. It is preconfigured with 72 speech elements, 43 sound effects, and 12 DTMF touch tones. It is interfaced with the microcontroller, which is preprogrammed to send serial data to the SpeakJet so that it speaks the corresponding words or sentences by combining the speech elements. So that the output can be heard clearly, the SpeakJet output is amplified by an LM386 audio amplifier configured for a suitable gain and then fed to a speaker. The connection recommended by the manufacturer is shown in figure 8.

Figure 8. SpeakJet typical connection.
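A minimal sketch of the serial path to the SpeakJet is shown below. The 9600 baud rate is the chip's factory default; the transmit pin and the byte values in the example array are placeholders, since real phoneme codes are generated with the Phrase-A-Lator tool described in section V.

/* Sketch of driving the SpeakJet from a CCS software UART stream.
   The codes in speakjet_hello[] are placeholders; real allophone and
   control codes come from the Phrase-A-Lator view-code output.       */

#use rs232(baud=9600, xmit=PIN_C2, stream=SPEAKJET)   /* assumed TX pin */

const unsigned int8 speakjet_hello[] = {
    1, 2, 3,            /* placeholder code values, not real phonemes  */
    0                   /* 0 terminates this illustrative list         */
};

void speakjet_say_codes(const unsigned int8 *codes)
{
    while (*codes != 0)
        fputc(*codes++, SPEAKJET);   /* one allophone/control byte at a time */
}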

D. Speech recognizing module

The module used here is the VRbot. It recognizes speech from hearing people through its built-in microphone and communicates with the microcontroller over a UART interface, as shown in figure 9.

Figure 9. VRbot interfaced with the microcontroller.

It has built-in speaker-independent commands and also supports 32 user-defined commands. These are used to recognize the words spoken by a person, which are then displayed on the LCD so that they can be understood by physically challenged people.
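The VRbot exchanges single-byte commands and status codes with the host over the UART. The sketch below shows only the general polling structure; the actual command and status byte values must be taken from the VRbot protocol manual, and the pins, baud rate, and lcd_show_sign() helper are assumptions.

/* Polling structure for the VRbot module.  CMD_/STS_ values are
   placeholders; consult the VRbot protocol manual for the real bytes. */

#use rs232(baud=9600, xmit=PIN_C6, rcv=PIN_C7, stream=VRBOT)  /* assumed pins */

#define CMD_RECOGNIZE   'x'     /* placeholder: start word recognition  */
#define STS_RESULT      'y'     /* placeholder: a word was recognized   */

void lcd_show_sign(unsigned int8 index);   /* hypothetical: draws the   */
                                           /* stored sign symbol        */

void poll_speech_module(void)
{
    char status;

    fputc(CMD_RECOGNIZE, VRBOT);        /* ask the module to listen     */

    if (!kbhit(VRBOT))                  /* nothing received yet         */
        return;

    status = fgetc(VRBOT);
    if (status == STS_RESULT)
    {
        unsigned int8 index = fgetc(VRBOT);   /* index into command set */
        lcd_show_sign(index);
    }
}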

V. PROGRAMMING THE HARDWARE

The code for this system is written in embedded C and is compiled and debugged using an integrated development environment (IDE). The software tools used to program the hardware are:

A. MPLAB

MPLAB is an IDE with an integrated toolset for developing embedded applications on PIC microcontrollers. It includes a text editor, a simulator, and device support from Microchip. Code can be written in assembly or in C, and the target device and language can be selected as required.

B. CCS Compiler

A compiler translates source code into object code. The compiler used here is the CCS compiler, which is widely used for PIC controllers. It provides many built-in functions that can be called directly, which keeps the program simple, and it supports the normal C constructs, input/output operations, and so on.
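As an illustration of these built-in functions, a minimal CCS-style program that reads one flex sensor channel and prints it to the LCD might look as follows. The target device, fuse settings, clock, and LCD wiring are assumptions for the sketch, not the configuration used by the authors.

/* Minimal CCS-style skeleton (illustrative device and pin choices).  */
#include <16F877A.h>            /* assumed PIC device                  */
#device ADC=10
#fuses HS, NOWDT, NOLVP
#use delay(clock=20MHz)
#include <lcd.c>                /* CCS LCD driver (assumed wiring)     */

void main(void)
{
    unsigned int16 value;

    setup_adc(ADC_CLOCK_INTERNAL);      /* built-in ADC setup functions */
    setup_adc_ports(ALL_ANALOG);
    lcd_init();

    while (TRUE)
    {
        set_adc_channel(0);             /* flex sensor on channel AN0   */
        delay_us(20);
        value = read_adc();             /* 10-bit conversion            */
        printf(lcd_putc, "\fFlex: %lu", value);
        delay_ms(200);
    }
}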

C. Phrase-A-Lator

This is a demonstration program from Magnevation that drives the SpeakJet IC. It is used to set voice qualities such as pitch, volume, and speed, and it generates the codes for the words we need, which are then used in the main code to make the SpeakJet speak. The main menu of the software, shown in figure 10, is used to select communication settings and to open the editor menus. When the board is connected to the PC, the serial port check box turns green if the correct COM port is selected.

Figure 10. Magnevation Phrase-A-Lator main menu for the SpeakJet.

The phrase editor is used to develop words, phrases, and sound effects from the built-in phonemes and sound effects. The required words or phrases are entered in the Say Data area, and the corresponding code is obtained by pressing the View Code button, as shown in figure 11.

Figure 11. Phrase editor menu with words and code.

VI. RESULT

The hardware circuit of the module is shown in figure 12. It consists of the microcontroller interfaced with the flex sensors, the SpeakJet IC, and the other components.

Figure 12. Hardware circuit of the system.

The hex file obtained after compilation is downloaded into the PIC controller, and the corresponding words are displayed on the LCD. The words and conversations are obtained by taking Signed English as the reference. For each word, the flex sensor values are combined with the IMU values and fed to the microcontroller, which displays the word on the LCD; voice output is also produced through the SpeakJet IC. The output is additionally transmitted to a mobile phone using the Bluetooth module.

The accelerometer output from the IMU, for all three axes, is shown in figure 13. The magnetometer output for all three axes is shown in figure 14, and the gyroscope output for all three axes is shown in figure 15.


Figure 13. Accelerometer output on the display.

Figure 14. Magnetometer output on the display.

Figure 15. Gyroscope output on the display.

VII. RESULT COMPARISON

Outputs obtained in previous work simply capture finger flexion, or use speech recognition software running on a computer. Since these systems need a PC, they cannot be used in every situation. Another system used only flex sensors to recognize words, obtaining the output from the change in flex sensor values alone. Systems that use a computer are not portable, and they provide only one-way communication: they display a result that can be understood by hearing people, but speech from hearing people is not converted into sign symbols.

In this system, flex sensors are used along with an IMU to capture the words. The flex sensors capture the bending of the fingers. Even if the position of the hand changes, the flex sensor values do not change, because the flex sensors are placed on the fingers, whereas the conversations used by physically challenged people involve holding or rotating the hands and fingers in particular positions. To capture these positions of the fingers or hand, the IMU is used.

The IMU, which consists of an accelerometer, a magnetometer, and a gyroscope, handles this situation. It is placed along with the flex sensors on the hand, so that when the fingers move during a conversation, both the flex sensor and IMU values change; by comparing these two sets of values, the output is shown on the display.

VIII. OUTPUT

Sample outputs for some common phrases, such as "WELCOME" and "HOW ARE YOU", are shown in figures 16a and 16b. The left portion of each figure shows the exact finger position for the word, obtained from the reference site, and the right portion shows the output on the digital display when the gloves with the flex sensors and IMU are worn and the hands are held in the position shown in the picture. At the same time, the displayed word is heard through the speaker via the SpeakJet IC.

Figure 16a. Digital display showing WELCOME.


Figure 16b. Digital display showing HOW ARE YOU.

IX. CONCLUSION AND FUTURE ENHANCEMENT

This system will be useful for physically challenged people and will help bridge the gap between them and hearing people. Since it is a portable two-way communication system, it can be used at any time.

The system can be enhanced by adding extra flex sensors at the wrist and elbow, so that conversations that use these bending positions can be captured accurately. A storage device such as an SD card could be used to store more phrases as a dictionary for speech and display. The device could also be covered with a waterproof layer so that it can be used in any environment.

REFERENCES

[1] L. K. Simone, E. Elovic, U. Kalambur, and D. Kamper, "A Low Cost Method to Measure Finger Flexion in Individuals with Reduced Hand and Finger Range of Motion", 26th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (IEMBS '04), vol. 2, 2004, pp. 4791-4794.

[2] M. Wald, "Captioning for Deaf and Hard of Hearing People by Editing Automatic Speech Recognition in Real Time", Proceedings of the 10th International Conference on Computers Helping People with Special Needs (ICCHP 2006), LNCS 4061, pp. 683-690.

[3] Syed Faiz Ahmed, Syed Baber Ali, and Saqib Qureshi, "Electronic Speaking Glove for Speechless Patients: A Tongue to a Dumb", Proceedings of the 2010 IEEE Conference on Sustainable Utilization and Development in Engineering and Technology, University Tunku Abdul Rahman, 2010.

[4] Jingdong Zhao, Li Jiang, Shicai Shi, Hegao Cai, Hong Liu, and G. Hirzinger, "A Five-fingered Underactuated Prosthetic Hand System", Proceedings of the 2006 IEEE International Conference on Mechatronics and Automation, June 2006, pp. 1453-1458.

[5] N. P. Bhatti, A. Baqai, B. S. Chowdhry, and M. A. Umar, "Electronic Hand Glove for Speech Impaired and Paralyzed Patients", EIR Magazine, pp. 59-63, Karachi, Pakistan, May 2009.

[6] Flexpoint Inc., USA, http://www.flexpoint.com, last accessed September 06, 2010.

[7] Pololu MinIMU-9 v2, www.pololu.com/catalog/product/1268

[8] Magnevation SpeakJet, http://www.speakjet.com

[9] VRbot module, www.VeeaR.eu

[10] http://www.sign.com.au


