Embedded Systems Final Report


    EECE 4038C, Embedded Systems Design

    Final Project Report

    Andrew Albert - Thinh Nguyen

    12/11/15

    Project idea: This project is called Vision+ Glasses for blind people. The idea came from a time one of us witnessed a blind person on a Chicago street hit his head on a small truck door because his cane could not detect an object at eye level. To solve this problem, we decided to design a device that helps people avoid accidents caused by hitting an object at eye level, without interfering with day-to-day life. We wanted the device to be affordable and fashionable for every person, easy to integrate into their life, and minimally invasive.

    We decided to use an input-output system to alert the user when there is an object at head level that they could potentially run into. A single IR distance sensor was used as our input, converting the distance between an object and the user's head into an analog voltage. The IR sensor was mounted on a servo designed to sweep it back and forth across a 60 degree detection range in front of the user. The angle the servo is set to, together with the IR distance measurement, was used to determine the relative direction and distance of any object in front of the user.

    Lastly, a piezo element was placed on each side of the glasses and used to output this measurement information to the user as a pulsing sound. A pulsing sound from a single piezo element indicates that there is an object between 10 and 30 degrees to that side of the user; both piezo elements pulsing alternately indicates that the object is within 10 degrees and is therefore located straight ahead. The pulses start at 10 pulses per second when the object is two feet away and get linearly faster until they reach 20 pulses per second when the object is a foot away. Any object located within a foot of the user, regardless of direction, causes both piezos to produce a continuous sound to indicate this to the user.
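    This distance-to-pulse-rate mapping is a simple linear interpolation. The sketch below illustrates it; the function name and the handling of the two boundary cases are our own, and the full firmware listing appears later in this report.

    // Pulse rate as a function of distance, per the behavior described above:
    // 10 pulses per second at 2 ft, rising linearly to 20 pulses per second at 1 ft.
    // (Within 1 ft both piezos are driven continuously, and beyond 2 ft they stay
    // silent, so those cases are handled outside this function.)
    float pulsesPerSecond(float distanceFeet) {
      if (distanceFeet > 2.0 || distanceFeet < 1.0) return 0.0;  // handled separately
      return 10.0 + 10.0 * (2.0 - distanceFeet);
    }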

    While this type of design requires the user to go through an adjustment period to get used to interpreting these sounds, once the learning curve has been surpassed the device is relatively easy to use and remains discreet, producing sound only loud enough for the user to hear.

    Below: a sketch of our initial design of the product.


    Final user manual:

    1. Stand in an open area without any walls or objects in your way. There should be only clear space within a 3-foot radius of you.

    2. Put on the glasses. Make sure to secure the glasses as close to your face as possible, so that the plane of detection is parallel to the ground.

    3. Clip the control box on your belt, then switch on the power. If you don't have a belt to clip it on, you can put it in your pocket or hold it in your hand.

    4. Put your hand right in front of your nose and slowly extend it away from your nose until you reach your full arm's length.

    5. Continue to move your hand toward and away from your face and note the difference in sound.

    6. As your hand reaches the 2-foot mark away from your face, you should begin to hear a pulsing sound that rapidly gets faster until, once your hand is a foot from your face, the sound becomes continuous instead of pulsing. Become familiar with this sound and learn how to interpret the relative distance through trial and error.

    7. Repeat steps 5 and 6 for several different directions from your face to learn at what point an object is detected as being in front of you or to your side. If an object is on your right, only the right speaker will make a sound; an object on your left will use the left speaker; and for an object in front of you, both speakers will alternate.

    8. Within a foot of you, however, direction will not be distinguishable, as both speakers will be continuously on. This alerts you that a single step forward will cause you to hit your head against whatever object is there. Continue to practice with these distances and directions until you feel comfortable interpreting the sounds you hear.

    9. Now you can begin to walk about and use the device in your everyday life, able to detect a collision before it occurs. If the device stops working, try replacing the 9V battery before checking for other problems.


    List of components:

    Item                      | Part number        | Vendor   | Cost
    Infrared Proximity Sensor | Sharp GP2Y0A21YK   | Sparkfun | $13.95
    Piezo element             | 7NB-31R2DM-1R5 L10 | Sparkfun | $1.50 x 2
    Servo (sub-micro size)    | ROB-09065          | Sparkfun | $8.95

    Table 1. List of components


    Final Design:

    Shown below are the various diagrams explaining our final solution. The first is a block-level diagram indicating the different parameters passed between the various components of the Vision+ Glasses and how these components control or are controlled by the Arduino board. The diagram directly below it shows the different detection zones for objects in front of our user: there is an immediate detection zone, where both speakers are on constantly to let the user know the object is within 1 foot of their head, and three relative-direction zones, where the speakers pulse at 10 beats per second when the object is 2 feet away, rising to 20 beats per second when the object is just over 1 foot away. The piezo elements also indicate the relative direction, with each respective piezo used when the object is on that side and both pulsing when the object is in front of the user.

    Shown to the right is a graph of the analog output voltage of the IR sensor versus the distance of an object in front of the sensor. The graph shows that the reflectance of an object has little effect on the voltage, provided that the object maintains a minimum level of reflectivity (~10% or more). This analog voltage is recorded as a 10-bit input, where the maximum voltage of 5V corresponds to 1023 (the maximum value for a 10-bit reading).
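    As a concrete illustration, converting a raw reading back to a voltage is a simple linear scaling; the helper below is our own sketch of it, not part of the project firmware.

    // A raw analogRead() value maps linearly to the sensor's output voltage:
    // 0 -> 0 V and 1023 -> 5 V. For example, the firmware's blink threshold of 65
    // (see the program listing later in this report) corresponds to about
    // 65 * 5 / 1023 = 0.32 V.
    float readingToVolts(int reading) {
      return reading * 5.0 / 1023.0;
    }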

    Shown on the following page are our circuit diagram and program flowchart for the final implementation of the project. The components and their various input/output pins are shown as connected to the Arduino UNO R3 microcontroller. Read in full, the program flowchart shows how the two main sections of our program, servo angle incrementation and piezo element speaker output, are implemented as a series of conditional statements. The flowchart details the logic for incrementing the servo through its +/- 30 degrees of motion and how the IR sensor's input is recorded and used in conjunction with the servo angle to determine which speakers to play and how often to pulse them.
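    The sweep logic amounts to a small state machine: the angle is nudged by one degree each pass through the loop, and the direction flips at the ends of the range. The fragment below is a condensed sketch of that idea using the +/- 30 degree range described here; the limits are illustrative, and the full firmware listing later in this report uses its own bounds.

    // Sweep state: 0 = increasing angle, 1 = decreasing angle.
    int swingmode = 0;
    int angle = 90;                          // start at the center of the sweep

    void stepSweep() {
      if (swingmode == 0) {
        angle = angle + 1;
        if (angle >= 120) swingmode = 1;     // upper limit (illustrative bound)
      } else {
        angle = angle - 1;
        if (angle <= 60) swingmode = 0;      // lower limit (illustrative bound)
      }
    }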


    Below: Circuit Diagram showing the various components and their input/output pins on the

    Arduino Uno R3 board.

    Below: Program Flow Chart showing the logic used to take measurements from our IR sensor and actuate our servo and piezo elements.


    Shown below is a picture detailing the final implementation of our project's prototype. The picture is labeled to show the various components and how they were placed onto the glasses.


    Test Plan and Results:

    Our test plan involved having our user wear the device and determining whether they could accurately identify where an object was placed in front of them as well as the relative distance to the object. The object used for this test was an approximately 1 foot by 1 foot piece of cardboard, which could be reliably identified by the IR sensor regardless of its orientation relative to the sensor. We tested a variety of ranges and directions to ensure that the device would pick up all objects within a 2 foot range of the user and within a 30 degree offset of the user's facing direction while correctly outputting the object's relative distance and direction. Shown below are pictures of two successful tests as well as a table detailing all of the tests performed and their results.

    Two successful tests for an object in front of the user (left) and to the user's right side (right)

    Test                                                                 | Result  | Notes
    Object 2 feet in front of user (~5 degrees to the right)            | Success | Object detected, both speakers pulsing on and off
    Object 8 inches in front of user, right side (~15 degrees to right) | Success | Object detected, right speaker continuously on
    Object 3 feet in front of user                                      | Success | Object not detected, as expected
    Object 1.5 feet in front of user, left side (~20 degrees to left)   | Success | Object detected, left speaker pulsing on and off
    Object 2 feet in front of user, moving slowly closer to 6 inches    | Success | Object detected, both speakers start pulsing and get faster as the object moves closer, until both are continuously on when the object is approximately 1 foot away


    Bugs/Issues and Potential Remedies:

    One of our first major issues involved a mechanical failure of the micro servo. This forced us to use a replacement component roughly four times as large as originally designed. The larger servo was not necessary, but obtaining another copy of our original servo would have taken too long to be ready by the deadline.

    Our other issue was the detection of certain types of objects. The IR sensor requires a certain level of reflectivity and requires the object to be plainly visible in its cone of vision. People are harder for this device to detect, since the cone of vision is usually aimed toward their head, which is a smaller target and is not very reflective. Therefore, we have a harder time detecting people at two feet away than other objects; the best remedy is either to wait until the person is closer to our user, at which point they will be detected, or to simply upgrade our current sensor to a more powerful one.


    EECE 4038C, Embedded Systems Design

    Final Project Implementation Report

    Andrew Albert - Thinh Nguyen

    12/4/15

    Design Changes:

    Since our last report a couple of minor design changes have occurred. The most significant is our change in how we signal an object to our user. While we are still capable of telling our user the relative distance and direction of an object, we were unable to do this with a rumble motion. The buzzers we had selected did not produce enough vibration to be detectable by the user, so we have switched to using a piezoelectric motion detector as a speaker instead. The speakers selected emit a 90 dB sound, so we added a 50 kΩ resistor to the supply pin of each speaker to muffle the sound to the point where the speakers are clearly audible to the user but not to anyone near the user. This way, the device avoids irritating nearby people with a buzzing sound.

    The other design change made during our implementation was swapping our servo motor for one that is about 3 times larger. The reason for this swap was that our servo motor broke during testing of the design. We would have selected another similarly sized servo if time permitted, but it would not have been delivered by the deadline, so we opted for this substitute. The two servos operate identically, so nothing about our design needed to be changed for this substitution.

    Commented Program:

    //final project ESD
    //date: 12/4/15
    #include <Servo.h>

    Servo myservo;

    #define sensorPin A0    // select the input pin for the IR sensor
    int ledPin = 13;        // select the pin for the LED (unused)
    int sensorValue = 0;    // variable to store the value coming from the sensor
    int blinktime = 0;      // gap between each pulse
    int swingmode = 0;      // 0 = sweeping the servo up in angle, 1 = sweeping down
    boolean blinkmode = 0;  // 1 is blinking, 0 is not
    int angle = 90;         // we call it 90 degrees for this simulation

    void Blink();           // pulses the piezo speakers; definition not included in this excerpt

    void setup() {
      // put your setup code here, to run once:
      pinMode(A0, INPUT);
      //Serial.begin(9600);
      //pinMode(ledPin, OUTPUT);
      pinMode(5, OUTPUT);    // piezo elements on pins 5 and 6
      pinMode(6, OUTPUT);
      myservo.attach(3);     // servo signal on pin 3
      myservo.write(angle);
    }

    void loop() {
      while (true)
      {
        sensorValue = analogRead(sensorPin);
        if ((sensorValue > 65) || (blinkmode)) {  // 65 is the upper blink cutoff --> about 0.32V
          Blink();
          continue;
        }
        else {
          digitalWrite(5, LOW);
          digitalWrite(6, LOW);
        }
        // Sweep the servo one degree per pass, reversing direction at the limits.
        if (swingmode == 0)
        {
          angle = angle + 1;
          if (angle >= 135)
            swingmode = 1;
        }
        else
        {
          angle = angle - 1;
          if (angle <= 45)     // lower sweep limit (reconstructed; lost at a page break)
            swingmode = 0;
        }
        // The servo update and the speaker conditions below were partially lost at a
        // page break in the original listing and have been reconstructed.
        myservo.write(angle);
        if ((angle > 75) && (angle < 105)) {
          analogWrite(5, 50);
          analogWrite(6, 50);
        }
        else if (angle <= 75) {
          analogWrite(5, 50);
          analogWrite(6, 50);
        }
        if (sensorValue > 30)  // 30 is the lower blink cutoff --> about 0.15V
          blinkmode = 1;
        else
          blinkmode = 0;
      }
    }

    Circuit Setup:

    Shown below is our circuit and the corresponding components attached to the frame of the glasses. The wires have all been soldered end to end, and the components are attached with hot glue in such a way as to ensure the product is durable without preventing the device from functioning properly. The device is labeled below with all of our components and has been constructed to minimize the electronics interfering with normal day-to-day activities.

    System in Use:


    Shown below are two images of our system in use. On the left side is an object directly in front of our user; the user is lifting both hands to indicate that both buzzers are going off, alerting them that the object is in front of them. On the right side is the same object approximately 10 degrees to the left side of the user; the user is lifting their left hand to indicate that only the left buzzer is going off.

    Initial Test Results:

    Test                                                              | Result  | Notes
    Object 2 feet in front of user                                    | Success | Object detected, both speakers pulsing on and off
    Object 8 inches in front of user, right side                      | Success | Object detected, right speaker continuously on
    Object 3 feet in front of user                                    | Success | Object not detected, as expected
    Object 1.5 feet in front of user, left side                       | Success | Object detected, left speaker pulsing on and off
    Object 2 feet in front of user, moving slowly closer to 6 inches  | Success | Object detected, both speakers start pulsing and get faster as the object moves closer, until both are continuously on when the object is approximately 1 foot away


    EECE 4038C, Embedded Systems Design

    Final Project Design Report

    Andrew Albert - Thinh Nguyen

    11/23/15

    Design Decisions and Revisions:

    While creating the first design of this product we had to make several design decisions to ensure our product would work as intended. These decisions involved our choice of the two main components: the infrared proximity sensor and the piezoelectric motion sensors. The infrared proximity sensor was chosen over other types of distance sensors because it can sense objects that come in a variety of colors and reflectivities, and because it can sense these objects even when they are at sharp angles from the sensor (sonar-based sensors cannot do this). The sensor is capable of distinguishing distances between 24 and 80 cm and can determine whether there is an object within 80 cm of the sensor. This is perfect for our design, since we will be able to differentiate how close an object is between approximately 1 and 3 feet and will still be able to determine if an object is within 1 foot of the user.

    The piezoelectric motion sensors are devices that can be used as either a sensor or an actuator; they are capable of turning motion into a voltage and vice versa. We will use the two devices as actuators, applying a proportional voltage to indicate the direction and a frequency of pulses to indicate the relative distance of the detected object. Since there is a piezo element on each side of the glasses, it is possible to develop a stereo feature. Two variables will be initialized together to indicate the angle of the detected object. These two variables will control the volume of the two piezo elements so that when the angle is 90 degrees (object directly in front of the user), the left and right elements have the same value. The right side will have a larger volume when the angle is larger than 90 degrees, and vice versa. The change should be gradual to create the best user experience, with the two speakers varying from maximum volume when at 90 degrees or in their respective direction (> 90 for the right side, < 90 for the left side) down to zero when the object is completely to the right/left side respectively. The frequency of pulses to both devices will vary from a constant input at

    In order to create the pulse for the piezo elements, we will use a loop rather than a single delay function to translate the relative angle into a pulse strength. Each time the MCU goes through the loop, it will turn the piezo on for x microseconds and then off for y microseconds, effectively dimming the vibration strength of the piezo element. The relationship between x, y, and the volume variable follows the equations below:

    x + y = 1000 us,    volume = 100 * (x / 1000) = x / 10

    Therefore, we can calculate x = 10 * volume and y = 1000 - 10 * volume.
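    Below is a minimal sketch of one period of this dimming scheme, assuming the volume value (0 to 100) has already been computed from the detected angle as described above; the pin argument and function name are placeholders for illustration.

    // One 1000-microsecond period of the software dimming described above:
    // the piezo is driven for x = 10*volume microseconds and left off for the
    // remaining y = 1000 - 10*volume microseconds.
    void pulsePeriod(int piezoPin, int volume) {
      int x = 10 * volume;          // on-time in microseconds
      int y = 1000 - x;             // off-time in microseconds
      digitalWrite(piezoPin, HIGH);
      delayMicroseconds(x);
      digitalWrite(piezoPin, LOW);
      delayMicroseconds(y);
    }

    Repeating this period continuously produces the perceived volume level; the slower, audible pulsing described earlier would be layered on top of it.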


    The volume variable will vary from 100 when the object is at 90 degrees or in the direction of the

    selected piezo element to 0 when the object is completely in the opposite direction.

    Alternative Designs:

    While designing this product, several other design solutions were considered but eliminated in favor of our current design. The first was an alternative to the piezoelectric motion sensors: small piezoelectric speakers. While we could use these to vary the frequency and volume of a sound and accurately relay this information to the user, the device would be too noisy and defeat its purpose as a discreet safety device for the blind. It would also require the user to be able to distinguish different sound frequencies, which may be difficult depending on whether the user has their full range of hearing.

    Originally we also considered using a two-dimensional distance sensor to eliminate the need for a servo motor to sweep the sensor through its full field of vision. However, two-dimensional sensors were found to be more expensive and less accurate, as well as having a much shorter detection distance. Therefore we settled on a one-dimensional distance sensor and will use the servo to sweep this sensor across the user's field of vision. This will allow us to more accurately determine an object's location as well as pinpoint its angle from the user, which the two-dimensional distance scanners would be unable to do.

    Complete Parts List:

    1 x Arduino Uno (Rev3) ICSP

    1 x Infrared Proximity Sensor (GP2Y0A21YK)

    1 x Servo motor

    2 x Piezoelectric Motion Sensors (utilized as rumblers by applying a voltage to them)

    1 x Eyeglasses

    Circuit Diagram:


    Above is the circuit diagram for our device. We used an Arduino Uno R3 with an ATmega328P as the microcontroller. The board is supplied with 9V power from a regular Duracell 9V battery. Voltage regulators on the board provide 5V power for the servo motor and piezo elements and 3.3V power for the GP2Y0A21YK IR distance sensor. The signal from pin D3 is output as a PWM signal to control the rotation of the servo. PWM signals from D5 and D6 provide the vibration frequency for the two piezo elements. Pin A0 receives the analog signal from the IR distance sensor.
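    This pin mapping can be summarized in a few named constants; the sketch below is only a restatement of the wiring described above, and the constant names (and which piezo sits on which side) are our own placeholders.

    // Pin assignments from the circuit description above.
    const int SERVO_PIN   = 3;   // PWM output to the servo
    const int PIEZO_PIN_1 = 5;   // PWM output to one piezo element
    const int PIEZO_PIN_2 = 6;   // PWM output to the other piezo element
    const int IR_PIN      = A0;  // analog input from the GP2Y0A21YK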


    Flowchart:


    Above is the flowchart for the device's firmware. The MCU reads the IR distance sensor's analog signal from pin A0 and stores it in a variable called sensorValue. Using this variable, we can distinguish 3 different ranges of obstacle distance (the distance between the user's head and the object in their path). In the immediate range (less than 20 cm), the piezos make a solid vibration at the frequency of the C3 musical note. In the middle range (20 cm to 80 cm), the piezos make a rhythmic series of vibrations at the frequency of the C4 musical note, with the gap between each note (vibration) growing longer as the distance increases. When the object is out of range, the servo is activated to constantly scan for obstacles in the user's path.
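    As an illustration of this proposed behavior, the sketch below maps a measured distance onto the two notes using Arduino's tone() function (C3 is roughly 131 Hz and C4 roughly 262 Hz). This is only a sketch of the proposal: the pin name, note duration, and gap scaling are assumptions, and the final firmware listed earlier drives the piezos with analogWrite() rather than tone().

    // Proposed behavior: solid C3 tone inside 20 cm, pulsed C4 notes from
    // 20 cm to 80 cm with a gap that grows with distance, silence beyond 80 cm.
    const int PIEZO_PIN = 5;   // placeholder pin for illustration

    void signalDistance(float distanceCm) {
      if (distanceCm < 20.0) {
        tone(PIEZO_PIN, 131);                 // C3: continuous warning
      } else if (distanceCm <= 80.0) {
        tone(PIEZO_PIN, 262, 100);            // one C4 note lasting 100 ms
        delay((unsigned long)(100 + 10 * (distanceCm - 20.0)));  // longer gap when farther away
      } else {
        noTone(PIEZO_PIN);                    // out of range: stay silent and keep scanning
      }
    }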

    VISION PLUS GLASSES

    Project proposal

    TEAM INFO

    Thinh Nguyen

    (808) [email protected]

    Andrew Albert

    (513)444-7634

    [email protected]

    1. Description


    Vision Plus Glasses is a device that can add value to the quality of life of blind people, especially those who live in urban areas with many dangerous obstacles in their way. Normally, blind people use their walking sticks to detect obstacles while walking. However, they might accidentally hit an object which protrudes only at head height and is therefore completely undetectable until it is too late. There is thus a need for a device which can detect potential risks at head height and help blind people avoid them. The device has to be intuitive for users and should not interfere with their lives. Using those criteria as our design standard, we introduce our sunglasses, called Vision Plus.

    Vision Plus is a fashionable wearable device in the form of a pair of sunglasses. Vision Plus uses an IR proximity sensor to check for objects at head height and relay their approximate distance to the user. In order to make the device more intuitive for all users, we will integrate a piezoelectric element on each side of the sunglasses so that the user can feel small vibrations from it and determine the relative position of an object. To cause minimal interference, vibration is only sent when the user gets close enough (~3 feet) to the object. In this way Vision Plus will alert users only when a credible risk is present and tell them how to avoid it.

    Below is a sketch of the device's design.

    Figure 1. Design sketch of Vision Plus. (1) is the IR proximity sensor, (2) is the angle adjustment

    servo and (3) are the responding piezo elements.

    2. User manual

    The device includes a pair of glasses which are wired to a control box.

    The device is ready to be used out of the box.

    Setup:

    First, put on your glasses

    Clip the box on your belt or put it in your pocket


    Ask a friend to help hide the wire nicely and fashionably underneath your clothes

    Then, turn on the device using the switch on the control box.

    Test:

    Now, put your hand in front of your face and move it back and forth as well as left to right. When objects are close to your face, you will feel evenly spaced pulses of vibration close to your ears. The rate of the pulses corresponds to the distance between you and the object: a fast rate when close and a slow rate when far.

    Operation:

    Now you can confidently enjoy your walk outside

    As you walk, the buzzers will alert you to any object near your head. A single buzzer being active corresponds to an object on that side of your face, while both buzzers active indicates an object in front of your face.

    The buzzers will pulse at a speed proportional to the distance from your head when the object is 1 to 3 feet away, and will buzz continuously for objects within a foot of your head.

    You can move your head around until both buzzers are active in order to determine the actual location of the object.

    Notice: Replace the battery when you can't feel any vibration as you put your hand close to your face.

    3. Flowchart


    The flowchart shown above illustrates a high-level description of how our product will work. The system will continuously scan for an object within three feet of the user's view, within a 30 degree angle to both sides of the person's face, in order to detect any objects that could hit the user. Once an object has been detected, the sensor will freeze in position and the buzzers will be activated to indicate to the user where the object is. Each buzzer alerts the user that an object is on that respective side, while both buzzers activated at the same time indicate that the object is in front of the user. A pulsing buzzer indicates that an object is within three feet of the user, while a continuous buzzer indicates that the object is within a foot and the user is in imminent danger of being hit. If the IR sensor no longer detects an object, it will continue scanning in order to relocate the object and update the position information for the user.

    4. Block Diagram


    Above is a simplified block diagram of how our product is expected to function. We will use the Arduino UNO R3 board to interface with two piezoelectric buzzers, a servo, and an infrared distance sensor. The IR distance sensor supplies an analog voltage input which we can translate into a distance measurement indicating how far away an object is in front of the sensor. The servo is controlled by the Arduino board to angle the IR distance sensor so that it sweeps right to left in front of the user, scanning for any objects within an approximately 30-degree angle to either side of the user's face.

    The two piezoelectric sensors will be used to indicate to the user the relative direction of any objects as well as the relative distance to the object. Finally, the Arduino's SRAM will be used to store variable information about objects near the user as well as the different scanning states of the program, while the flash memory will be used to store and read the program instructions.

    5. Test Plan

    To test the operation of our device, we shall blindfold a test subject and have them use the Vision Plus Glasses to navigate a series of obstacles as well as determine the location of people in front of them. The product will be considered successful if the subject is comfortably able to avoid obstacles that could be encountered in the real world as well as accurately determine the direction and relative distance of these obstacles or other people.

    The tests will individually be broken down into three stages of increasing difficulty:

    1. Determining a relative distance for an object directly in front of the person,

    2. Determining a relative distance and direction for an object placed slightly to the side of

    the subject,


    3. A full course navigation around a minimum of three obstacles that the user could hit.

