Robot Name: DogBot
Author: Valerie Serluco
Date: December 08, 2015
Instructor(s):
Dr. Arroyo
Dr. Schwartz
TA(s):
Andrew Gray
Jacob Easterling
EEL 5666C
FALL 2015
INTRODUCTION
ABSTRACT
One of the fun things to do when you have a dog is to go outside and play a game of fetch.
Unless, like me, you have a dog that doesn't like to fetch. In any case, whether your dog is
living with your parents or just doesn't like to fetch, DogBot is your solution. DogBot
will fetch every ball you throw and bring it back, no matter where you are.
OBJECTIVE
DogBot was built to track the ball, from the moment it is thrown to where it lands on the
ground, using visual input. The visual input comes from a PlayStation Eye camera processed
with OpenCV and Python. Once DogBot picks up the ball, it will make its way back to the user
to return the ball; this is done by searching for the user's base with the PlayStation Eye
camera.
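The ball tracking described above amounts to a color threshold followed by a centroid computation. The real robot does this with OpenCV on the Raspberry Pi; the standard-library sketch below only illustrates the idea, and the hue/saturation thresholds are illustrative assumptions, not values measured from the actual ball.

```python
import colorsys

# Approximate HSV band for a yellow-green ball. These limits are
# illustrative assumptions, not calibrated values.
H_MIN, H_MAX = 0.10, 0.25   # hue band around yellow-green
S_MIN, V_MIN = 0.4, 0.3     # reject washed-out or dark pixels

def is_ball_pixel(rgb):
    """Return True if an (r, g, b) pixel (0-255 ints) falls in the ball's color band."""
    h, s, v = colorsys.rgb_to_hsv(rgb[0] / 255, rgb[1] / 255, rgb[2] / 255)
    return H_MIN <= h <= H_MAX and s >= S_MIN and v >= V_MIN

def find_ball(frame):
    """Return the (x, y) centroid of ball-colored pixels, or None if no match.

    `frame` is a list of rows, each a list of (r, g, b) tuples, standing in
    for a camera frame from the PlayStation Eye."""
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, px in enumerate(row):
            if is_ball_pixel(px):
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))
```

In the OpenCV version the same steps would be `cv2.cvtColor` to HSV, `cv2.inRange` for the mask, and image moments for the centroid.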
BEHAVIOR
Search
• User Search: DogBot will locate the user using ultrasonic pulses
• Ball Tracking: Once the user is located, the PlayStation Eye will locate the ball
Fetch
• Ball Tracking: The PlayStation Eye will track the ball as it is thrown
• Avoidance: While DogBot is in motion, it will autonomously look for obstacles and avoid contact
• Grab-n-Lift Arm: DogBot's arm will grab the ball and lift it up so it can be transported
Return
• User Search: DogBot will use the PlayStation Eye to find the user's location
• Grab-n-Lift Arm: DogBot will then place the ball at the user's feet
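The Search/Fetch/Return cycle above can be sketched as a small state machine. The state and event names below are hypothetical, chosen only to mirror the bullets; they are not taken from the robot's actual code.

```python
from enum import Enum, auto

class State(Enum):
    SEARCH_USER = auto()   # locate the user
    TRACK_BALL = auto()    # watch for the throw
    FETCH = auto()         # drive to the ball, avoiding obstacles
    RETURN = auto()        # carry the ball back to the user
    DROP = auto()          # place the ball at the user's feet

# One illustrative transition table for the cycle described above.
TRANSITIONS = {
    (State.SEARCH_USER, "user_found"): State.TRACK_BALL,
    (State.TRACK_BALL, "ball_thrown"): State.FETCH,
    (State.FETCH, "ball_grabbed"): State.RETURN,
    (State.RETURN, "user_reached"): State.DROP,
    (State.DROP, "ball_released"): State.SEARCH_USER,
}

def step(state, event):
    """Advance the behavior machine; unknown events keep the current state."""
    return TRANSITIONS.get((state, event), state)
```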
INTEGRATED SYSTEM
DogBot uses a Raspberry Pi 2 to fulfill the role of the main processor. The Raspberry Pi 2
controls an Arduino Mega 2560, which acts as a slave. Attached to the Arduino Mega is the
Bricktronics MegaShield, which is the controller for all the LEGO Mindstorms NXT components.
The infrared sensors are connected to the Arduino Mega, and their feedback is used as
conditions for the NXT motor controls. The PlayStation Eye, which is used to track the ball,
is connected to the Raspberry Pi 2 and mounted on the servo motors, allowing the robot a
movable view. The infrared sensors are used for obstacle avoidance, allowing DogBot to
maneuver away from walls and objects.
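With the Raspberry Pi as master and the Arduino Mega as slave, the Pi has to send motor and arm commands over a serial link. The framing below is a hypothetical sketch of such a protocol: the start byte, command IDs, and checksum are assumptions for illustration, not the robot's actual firmware protocol.

```python
import struct

# Hypothetical one-byte command IDs for the Pi -> Arduino link.
CMD_DRIVE = 0x01   # payload: left speed, right speed (signed bytes)
CMD_ARM   = 0x02   # payload: 1 = grab and lift, 0 = release

def frame_command(cmd, *payload):
    """Pack a command as <start byte, id, length, payload..., checksum>.

    The checksum is the low byte of the sum of id, length, and payload,
    so the Arduino side can reject corrupted frames."""
    body = bytes([cmd, len(payload)]) + struct.pack(f"{len(payload)}b", *payload)
    checksum = sum(body) & 0xFF
    return b"\xAA" + body + bytes([checksum])
```

In practice the resulting frame would be written to the Arduino with something like pyserial's `serial.Serial("/dev/ttyACM0", 9600).write(frame)`, with matching parsing code in the Arduino sketch.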
Raspberry Pi 2
Servo Motor (2x)
Arduino Mega 2560
Infrared Sensor (3x)
Bricktronics MegaShield
NXT Motor (3x)
NXT Touch Sensor
PlayStation Eye
DESIGN DIAGRAM
The Raspberry Pi 2 and the accessories connected directly to it handle the higher-level
thinking. The Arduino Mega is used for the lower-level tasks of motor control and obstacle
avoidance.
MOBILE PLATFORM
The DogBot platform is built using LEGO parts from a LEGO Mindstorms NXT kit. I chose LEGO
because it was cost-effective, since I already owned the kit. I then used the LEGO Digital
Designer to sketch a CAD drawing of the platform. I chose a two-wheel drive with a roller
wheel on the back.
PLATFORM ARM
The DogBot arm is powered by one NXT motor, which controls both the motion of clamping the
ball and the raising of the arm. The design is a mixture of online designs and my own. Once
the clamp grasps the ball, the clamp gears stall, and the continuing gear motion raises the
arm.
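The stall-based handoff from clamping to lifting could also be detected in software from the NXT motor's rotation encoder: if the shaft stops advancing while the motor is driven, the clamp has likely seized on the ball. The detector below is a hypothetical sketch; the threshold and window size are assumptions, not tuned values from the robot.

```python
# Assumed minimum shaft motion (degrees) per sample while free-running;
# the NXT encoder reports position with 1-degree resolution.
STALL_THRESHOLD_DEG = 2

def is_stalled(encoder_positions, window=3):
    """Return True if the last `window` encoder deltas are all below threshold.

    `encoder_positions` is a list of successive encoder readings in degrees,
    sampled at a fixed rate while the motor is being driven."""
    if len(encoder_positions) < window + 1:
        return False
    deltas = [abs(b - a) for a, b in
              zip(encoder_positions[-window - 1:-1], encoder_positions[-window:])]
    return all(d < STALL_THRESHOLD_DEG for d in deltas)
```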
ACTUATION
LEGO MindStorms NXT motors (3x)
Reason for Purchase
Cost-effective, since I already owned these motors
Motor Specifications
Rotation Encoder: returns the position of the shaft with 1° resolution
Nominal Voltage: 9 V
Nominal Current: 60 mA
Free RPM: 170
Stall Torque: 50 N·cm
Stall Current: 2 A
Weight: 80.179 g
MG 995 Servo Motors (2x)
Motor Specifications
Nominal Voltage: 4.8 V to 7.2 V
Stall Torque: 8.5 kgf·cm
Weight: 55 g
SENSOR
Analog Distance Sensor (3x)
Specifications
Vendor: Pololu
Model: Sharp GP2Y0A21YK0F
Range: 10 – 80 cm
Supply Current: 30 mA
Application
Detect obstacles in the robot's path
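The Sharp GP2Y0A21YK0F outputs an analog voltage that falls off nonlinearly with distance, so the raw reading has to be converted before it can be used for obstacle avoidance. A commonly used rough power-law fit is sketched below; the constants are an approximation, not taken from the datasheet curve, and readings outside the sensor's 10–80 cm band are treated as unreliable.

```python
def ir_distance_cm(voltage):
    """Convert the sensor's output voltage (V) to an approximate distance in cm.

    Uses the rough fit d ~ 27.86 * V**-1.15 (an assumed approximation of the
    GP2Y0A21YK0F response). Returns None when the implied distance falls
    outside the sensor's usable 10-80 cm range."""
    if voltage <= 0.4:          # below the sensor's usable output range
        return None
    d = 27.86 * voltage ** -1.15
    return d if 10 <= d <= 80 else None
```

On DogBot the voltage would come from one of the Arduino Mega's analog pins (`analogRead` scaled to volts); a `None` result means the reading should not be trusted for avoidance decisions.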
NXT Touch
Specifications Unknown
Application
Used to detect when the arm is fully lifted
PlayStation Eye
Specifications
Resolution: 640 x 480 at 60 FPS (320 x 240 at 120 FPS)
Dimensions: 2.56 x 3.15 x 2.17 in
Application
Used as a special sensor to track the ball being thrown
CONCLUSION
WORK ACCOMPLISHED
In summary, I successfully built and programmed a robot that picks up a ball and returns it
to the user.
LIMITATIONS
At the beginning of this project, I had planned for the robot to find the user using
ultrasonic sensors. I was not able to reduce the noise in the feedback when testing in the
New Engineering Building Rotunda, so instead I used the PlayStation Eye to find the user's
base.
IMPROVEMENTS
If I could do this project over again, I would redesign my mobile platform to be more compact
and more efficient at tracking the ball.
CODE
https://vserluco.wordpress.com/intelligent-machines-design-lab/source-code/