
UNIVERSITY OF TORONTO

ECE532 - DIGITAL SYSTEM DESIGN

ROBOTIC ARM CONTROLLED BY VIDEO INPUT

FINAL DESIGN REPORT

Izaz Ullah

Willian Rigon Silva

Yuka Kyushima Solano

April 9th, 2014


Table of Contents

1 Overview
1.1 Background and Motivation
1.2 Goals
1.3 Design Changes
1.3.1 Servo motor uses serial/UART communication protocol
1.3.2 MATLAB as interface between the FPGA and robotic arm
1.3.3 No Depth Motion Recognition
2 Outcome
2.1 Results
2.2 Suggested Improvements
3 Description of the Blocks
3.1 VmodCam IPcore
3.2 HDMI_Out IPcore
3.3 Color and Position Recognition Algorithm
3.3.1 Software Implementation
3.3.2 Hardware Implementation
3.3.3 MATLAB Interface
4 Description of Design Tree
5 Project Schedule
6 References


Table of Figures

Figure 1 – Overall view of the design
Figure 2 – AX-12A robotic arm, FPGA (Atlys Spartan-6), and VmodCAM
Figure 3 – First full system's test
Figure 4 – Ideal block diagram of the system – FSL not implemented in actual system
Figure 5 – Ideal design of the Color and Position Recognition state machine
Figure 6 – State machine for reading from the DDR memory
Figure 7 – State machine for reading from the FIFO and recognizing the pixel color
Figure 8 – Start of SEARCH_POSITIONS_PROC process
Figure 9 – End of the loop of SEARCH_POSITIONS_PROC process


1 OVERVIEW

1.1 BACKGROUND AND MOTIVATION

Currently, robots are used to execute a wide variety of tasks. In the manufacturing industry, the tasks executed by robots are repetitive and pre-programmed, so the robots work on their own. Their movements are calculated from a mathematical model of the robot, derived using Lagrangian Mechanics, and the motion plan is then computed using Forward and Inverse Kinematics. Moreover, to make sure that the joint angles are precisely the ones calculated by the Forward and Inverse Kinematics, a feedback control loop is also required. This approach requires a deep understanding and study of the task that the robot shall execute in a specific environment.

On the other hand, in the medical and defense industries robots are required to perform precise and dangerous tasks – surgery or bomb disarming, for instance. In these life-threatening tasks, a person usually controls the robot directly. Therefore, a highly precise and reliable human-machine interface is required; however, these interfaces are expensive, and this is our main project motivation.

The main goal of our project is to develop a cheap human-machine interface to control a robotic arm using human body movements. The system is composed of an FPGA – a Digilent Atlys Spartan-6 [1] – for video input/processing and control signal generation, and a robotic arm (figure 2). The robotic arm used was an AX-12A - USB [2]. This type of approach permits extending the project to a more complex robot controlled by the same interface.

The differences between this project and existing ones are the low cost of assembling the whole system and the lack of physical connection between the user and the robot (no wires or sensors, just video input). Another advantage is the portability of this low-cost methodology to control any kind of robotic arm, or even to expand the system to allow control of an entire robot using all the body movements (or a non-humanoid type of robot). The main market for this application could be the medical area – current surgical robots are expensive, and this type of system can assist in specific situations without any physical contact between the doctor and the machine – or the defense area – for bomb disarming or remotely controlling a robot to access dangerous places.


Once the technology is fully developed and accessible to the public – hence, cheap – this human-machine interface can be applied to a large variety of applications.

1.2 GOALS

The goal of our project is to develop a human-machine interface, using video input, to control a robotic arm. The main idea was to attach coloured points to the operator's hand/arm; a stereo camera [3] captures the image of the hand/arm, and an IPCore inside the FPGA recognizes the colour and relative position of each coloured point and converts this to a control signal for the robotic arm, in real-time and with good precision.

The control chain works as follows: for the video input, a VmodCam is connected directly to the FPGA; a custom IPcore processes the image and generates the control signals for the robotic arm servo motors, which change the robotic arm joint angles (figure 1).

Figure 1 – Overall view of the design

In order to control the robotic arm's rotation, we considered using a stereo (3D) camera to also compute the motion of the operator's hand/arm in depth (see next section). Since the majority of servo motors use a PWM signal as control input, we originally thought of using the FPGA's internal clock to generate this PWM and output it on a GPIO I/O port (Pmod connector). Alternatively, as a plan B, we considered adding an AVR ATMega microcontroller


as an interface between the FPGA and the robotic arm via UART/serial communication; however, this was not necessary, as discussed in the next section.

The IPcores used were:

VmodCam IPcore (from a previous group) – This IPcore manages the video input from the stereo camera and stores it in the DDR memory. Only small modifications were necessary to adapt it to our project.

HDMI_Out IPcore (from a previous group) – We used this IPcore mostly for debugging and for verifying that our Color and Position Recognition algorithm works. Its main purpose is to show on an HDMI monitor what the camera is capturing at that moment.

Color and Position Recognition IPcore (designed by our group) – This custom IPcore implements our state machine for recognizing and calculating the X,Y position on the screen of each coloured point on the operator's arm. These positions are returned to the MicroBlaze processor, through a Fast Simplex Link (FSL), to compute the control signals for the robotic arm.

See section 2.1 for the block diagram and section 3 for a detailed description of each IPcore.

Figure 2 – AX-12A robotic arm, FPGA (Atlys Spartan-6), and VmodCAM


1.3 DESIGN CHANGES

After the initial design, we had to make some changes in order to advance with the

project. The changes were:

1.3.1 SERVO MOTOR USES SERIAL/UART COMMUNICATION PROTOCOL

When we began testing all communications between the hardware parts, we discovered that the servo motors of the AX-12A robotic arm are a modern type that uses a serial/UART communication protocol [4] instead of a PWM signal as control input. This had both an advantage and a drawback. The advantage was that we did not have to be concerned about generating the PWM signal and outputting it from the FPGA board (or using the AVR ATMega microcontroller as an interface). The drawback is that the serial communication protocol of the servo motors is complex and poorly documented, so we spent considerable time learning how to properly send the correct commands.

1.3.2 MATLAB AS INTERFACE BETWEEN THE FPGA AND ROBOTIC ARM

With our color and position recognition algorithm fully operational in software, we attempted a full system test; however, we had difficulty getting the FPGA to communicate with the robotic arm through the serial/UART link (as stated in the previous section). We evaluated that to make it work properly we had two options: the first was to add more hardware to handle the USB/UART communication, and the second was to use MATLAB, running on a PC/Mac, as the interface. Since we had a deadline for the project, and with Prof. Paul Chow's advice, we opted to use MATLAB as the communication interface between the FPGA and the robotic arm (see section 3.3.3 for details).

1.3.3 NO DEPTH MOTION RECOGNITION

After the first software test of our custom color and position recognition algorithm, we evaluated the difficulty of implementing the depth recognition algorithm using input from the two cameras of the VmodCam board. Our conclusion was that this algorithm would not be trivial and would have some intricacies. Therefore, after asking Prof. Paul Chow for advice, we decided not to implement this algorithm


because of the limited time, and also because another group was developing it as a custom IPCore. For a future expansion of this project, that depth recognition IPCore could be integrated into ours.

2 OUTCOME

2.1 RESULTS

We achieved our main goal, which was to create a human-machine interface system where a robotic arm is controlled from video input (figure 3); however, our color and position recognition algorithm only fully worked in software, so the system's performance was not real-time. We created a custom IPcore that implements our Color and Position Recognition algorithm as a state machine (see section 3 for details), but the IPcore was not working properly (table 1): its performance is approximately real-time, but only the color recognition works correctly (the position recognition returns random values for the X,Y positions). Our custom IPCore needs improvement; due to the short time, we did not discover how to solve the position recognition error. Moreover, we did not have time to implement the Fast Simplex Link (FSL) between our IPcore and the MicroBlaze processor (figure 4).

In summary, we designed the algorithm that does the color and position recognition; it is tested in simulation and fully working in software, but only partially working in hardware (IPcore). In order to make our custom IPcore fully operational, more testing and debugging is needed to find what is wrong with the position recognition (as stated before, it returns random values for the X,Y position of the coloured points).

Table 1 - Project goals status

Goal                                           Status
Video input IPcore                             Working
HDMI Out IPcore                                Working
FPGA – AX-12A communication                    Working (through MATLAB)
Depth recognition using stereo camera          Changed
Color and Position Recognition custom IPcore   Created (needs more testing)


The Color and Position Recognition algorithm works as follows:

First, it recognizes the color of each point on the operator's arm/hand (based on a configurable color threshold) and writes the image back into the DDR memory, marking the points where the colours were recognized by the algorithm (we write back into the DDR just for debugging).

Secondly, it computes the (X,Y) position of each coloured point on the operator's arm/hand.

Finally, it computes the distance between the coloured points and generates a control signal for a specific joint in the robotic arm (based on a mapping to the servo's minimum and maximum input); a sketch of these last two steps is shown below.
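As an illustration of the last two steps, the sketch below computes the center of each coloured point from accumulated pixel coordinates and the distance between the two centers. It is a minimal C sketch with hypothetical names (ColorAcc, center_of, color_distance); the actual implementation lives in our 'search' function and in the custom IPcore.

#include <math.h>

/* Accumulated during the image scan: sum of coordinates and pixel
 * count for each colour (hypothetical structure, for illustration). */
typedef struct {
    long sum_x, sum_y; /* sums of x and y coordinates of matching pixels */
    long count;        /* number of matching pixels                      */
} ColorAcc;

/* Center of mass of one colour: average of the accumulated coordinates. */
static void center_of(const ColorAcc *a, int *cx, int *cy)
{
    *cx = (a->count > 0) ? (int)(a->sum_x / a->count) : 0;
    *cy = (a->count > 0) ? (int)(a->sum_y / a->count) : 0;
}

/* Euclidean distance between the red and green centers; this is the
 * value that is later mapped to a servo command for the claw. */
static double color_distance(const ColorAcc *red, const ColorAcc *green)
{
    int rx, ry, gx, gy;
    center_of(red, &rx, &ry);
    center_of(green, &gx, &gy);
    return sqrt((double)(rx - gx) * (rx - gx) + (double)(ry - gy) * (ry - gy));
}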

Figure 3 shows the HDMI output of the Color and Position Recognition algorithm; note the two black squares and the calculated distance (black line) between the centers of the squares. This distance is translated into a control signal for the robotic arm claw (end-effector).

Figure 3 – First full system's test


Figure 4 - Ideal block diagram of the system – FSL not implemented in actual system

In conclusion, our custom IPcore recognizes the colour of each point and writes the image back into the DDR memory; afterward the HDMI_Out IPcore reads from that address and displays the image on an HDMI monitor. Note that this operation is done only for debugging; for a fast system response, this write-back to the DDR and the HDMI_Out IPcore could be excluded from the system.

2.2 SUGGESTED IMPROVEMENTS

The first improvement that should be made is discovering what is wrong with the position recognition part of our custom IPcore; the second is the communication between the custom IPcore and the MicroBlaze processor through a Fast Simplex Link (FSL). These two improvements should enable real-time control of the robotic arm. Once real-time control is guaranteed, the next improvement could be optimizing the precision of the robotic arm movement relative to the operator's arm/hand movement. This could be done by changing the function that relates the servo limits to the computed distance between the coloured points; currently this curve is a proportional


relation, but it could be changed, for instance, to an exponential curve, as sketched below. Keep in mind, however, that the precision also depends on the actuator, in this case the AX-12 servo motor.
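The sketch below contrasts the current proportional mapping with the suggested exponential curve. It is a minimal C sketch under assumptions: the function names are hypothetical, and the SERVO_MIN/SERVO_MAX placeholders stand for the per-joint limits obtained in the limit-mapping step (the AX-12A goal position itself is a 10-bit value, 0 to 1023).

#include <math.h>

#define SERVO_MIN 0     /* placeholder: per-joint mapped servo limits */
#define SERVO_MAX 1023

/* Proportional mapping (the current behaviour): the servo command grows
 * linearly with the measured distance d over [d_min, d_max]. */
static int map_proportional(double d, double d_min, double d_max)
{
    double t = (d - d_min) / (d_max - d_min); /* normalize to [0,1] */
    if (t < 0) t = 0;
    if (t > 1) t = 1;
    return SERVO_MIN + (int)(t * (SERVO_MAX - SERVO_MIN));
}

/* Exponential alternative: small hand movements give fine servo motion,
 * large movements are amplified; k > 0 tunes the curvature. */
static int map_exponential(double d, double d_min, double d_max, double k)
{
    double t = (d - d_min) / (d_max - d_min);
    if (t < 0) t = 0;
    if (t > 1) t = 1;
    double e = (exp(k * t) - 1.0) / (exp(k) - 1.0); /* still in [0,1] */
    return SERVO_MIN + (int)(e * (SERVO_MAX - SERVO_MIN));
}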

Afterwards, with both real-time control and precise movement, the next step is to integrate a depth recognition IPCore. Integrating it into the system will give 3D control over the robotic arm; this human-machine control interface could then be applied to any other kind of system. In order to remove the MATLAB interface from the system, a USB to TTL converter is needed (for instance, the CP2102 [5]).

The variety of applications of this human-machine interface is virtually infinite, and

this approach is relatively simple and cheap compared to existing ones.

3 DESCRIPTION OF THE BLOCKS

This section describes the details of each IPcore used in this project.

3.1 VMODCAM IPCORE

This IPcore, designed by our classmates, receives the streams from the VmodCam board and stores them in the DDR memory. Since this is a stereo camera, the stream from one camera is stored at memory address 0xA0000000 and the other at 0xA0100000. The image resolution is 640x480 and the pixel size is 2 bytes (RGB565); therefore, the memory size needed for each camera is 640x480x2 = 614400 bytes (0x96000).
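For illustration, the sketch below shows how a pixel in one of these frames can be addressed from software. It is a minimal sketch assuming row-major pixel order (which is consistent with the frame size calculated above); the helper pixel_at is hypothetical.

/* Frame layout in DDR as described above: one 640x480 RGB565 frame
 * per camera, 2 bytes per pixel. */
#define CAM1_BASE 0xA0000000u /* first camera  */
#define CAM2_BASE 0xA0100000u /* second camera */
#define X_RES     640
#define Y_RES     480

/* Address of pixel (x, y) inside one frame: row-major order, 2 bytes
 * per pixel, so each frame occupies 640*480*2 = 0x96000 bytes. */
static volatile unsigned short *pixel_at(unsigned int base, int x, int y)
{
    return (volatile unsigned short *)(base + 2u * ((unsigned)y * X_RES + (unsigned)x));
}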

3.2 HDMI_OUT IPCORE

The HDMI_Out IPcore was designed by Ana Klimovic, Bryce Long and Victor Zhang, 2013 ECE532 students. This IPcore reads an image stored in the DDR and displays it on a monitor via an HDMI cable. In the XPS we can define the resolution of the image (640x480 or 1280x720) and the pixel size (2 bytes or 4 bytes). In the software application we need to set 3 parameters via slave registers: the image width, the base address of the frame in the DDR, and the go signal. An example is shown below.


volatile int *hdmi_addr = (volatile int *) XPAR_HDMI_OUT_0_BASEADDR;

hdmi_addr[0] = 640; // image width in pixels

hdmi_addr[1] = 0xA0000000; // base address of the frame

hdmi_addr[2] = 1; // go

3.3 COLOR AND POSITION RECOGNITION ALGORITHM

We implemented our project with two different approaches. The first traverses the whole image in software, searching for the red and green colors. The second is an IPcore that reads the frames from the DDR in packets of 64 bytes and searches for the red and green colors in these pixels.

In figure 5 we show the original design for the hardware implementation of our algorithm as a state machine; this figure helps to convey our main idea, but it was not implemented exactly as shown (see section 3.3.2).

3.3.1 SOFTWARE IMPLEMENTATION

In the software we use the display library, which has functions such as ‘gpDrawSquare’ and ‘gpFillLine’ that helped us debug. The main function of this library is ‘search’; it traverses the whole image and, for each pixel read from the DDR, converts it to RGB888 and checks whether it is red or green. At the end of the loop this function computes the red and green object positions and the distance between them, and returns this value.

To find the object positions, for each red or green pixel found we accumulate its coordinates and keep track of the number of red and green pixels found. At the end of the loop we divide the sum of the coordinates of the red pixels by the number of red pixels, and do the same for the green pixels. After that, we have the center of mass of the red and the green pixels in the image.
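The per-pixel test inside ‘search’ is not reproduced in this report, but the sketch below illustrates the idea: expand an RGB565 pixel to RGB888 and compare the channels against configurable thresholds. This is a minimal sketch; the exact threshold rule shown here (dominant channel above its threshold, other channels below theirs) is an assumption for illustration, as is the function naming.

/* Expand a 16-bit RGB565 pixel to 8-bit-per-channel RGB888. */
static void rgb565_to_rgb888(unsigned short p,
                             unsigned char *r, unsigned char *g, unsigned char *b)
{
    *r = (unsigned char)(((p >> 11) & 0x1F) << 3); /* 5 red bits   */
    *g = (unsigned char)(((p >> 5)  & 0x3F) << 2); /* 6 green bits */
    *b = (unsigned char)(( p        & 0x1F) << 3); /* 5 blue bits  */
}

/* Illustrative classification: a pixel counts as "red" if its red
 * channel exceeds the red threshold while the other channels stay
 * below theirs (the real rule in 'search' may differ). */
static int is_red(unsigned short p,
                  unsigned char t_r, unsigned char t_g, unsigned char t_b)
{
    unsigned char r, g, b;
    rgb565_to_rgb888(p, &r, &g, &b);
    return (r > t_r) && (g < t_g) && (b < t_b);
}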

Therefore, in the software we only need to set the base address where the frames from the camera are stored, set the size and base address where the new image will be stored, set the parameters of the HDMI_Out IPcore, and call our search function inside an infinite while loop.


Figure 5 – Ideal design of the Color and Position Recognition state machine


//set the base address of frames from the CAMERA (CAM1 = 0xA0000000)
volatile unsigned short *ddr_addrCAM = (volatile unsigned short *) CAM1;

//set the size and base address of the image (X = 640, Y = 480, MEM = 0xA0600000)
gpImg *img = gpCreateImage(X, Y, MEM);
gpSetImage(img, 0x00, 0x00, 0x00);

//set parameters of HDMI out
volatile int *hdmi_addr = (volatile int *) XPAR_HDMI_OUT_0_BASEADDR;
hdmi_addr[0] = img->xres;           // stride length in pixels
hdmi_addr[1] = (int)img->imageData; // set frame base address
hdmi_addr[2] = 1;                   // go

//d: distance between red and green positions
int d;

// if DEBUG = 1 it will show squares around the red and green positions
// and a line between them. It will also paint as pure red all the pixels
// it considered red, and paint as pure green all the pixels it considered green.
// if PRINT = 1 it will print the found positions (x,y) and the distance
// values on the terminal. It needs to be set to 0 when communicating with the robot.
while (1) {
    d = search(img, ddr_addrCAM, 'r', 'g', DEBUG, PRINT);
}

3.3.2 HARDWARE IMPLEMENTATION

We created an IPcore that implements our Color and Position Recognition algorithm, attached to an AXI4 bus, which is burst-capable for a high-throughput memory-mapped interface. This custom IPcore reads an image from a base address in the DDR and computes the red and green object positions.


In order to implement this functionality we added two processes: SEARCH_SM_RW_PROC (figure 6) and SEARCH_POSITIONS_PROC (figure 7). The first one requests reads from the DDR in packets of 64 bytes and stores them in the FIFO. The second one reads 4 bytes (2 pixels) at a time from the FIFO and checks whether they are red or green; if they are, it sums their coordinates and keeps track of the number of red or green pixels found. After traversing the whole image, it computes the geometric centers of the red and green objects and writes them to the slave registers, which can be accessed by the MicroBlaze processor. Then it resets all signals and restarts the loop.

Our custom IPcore has 8 slave registers:

slv_reg0 and slv_reg1 are for debug;

slv_reg2 holds the color thresholds (one byte each for the red, green and blue thresholds);

slv_reg3 and slv_reg4 hold the read and write base addresses;

slv_reg5 and slv_reg6 hold the X,Y positions (2 bytes for x and 2 bytes for y) of the red and green geometric centers;

slv_reg7 holds the ‘done’ value.

In the software, we only need to set the color thresholds, the read address and the write address; currently our custom IPcore does not write back to the DDR, so the write base address can be left unset.

volatile unsigned int *search_addr = (volatile unsigned int *) SEARCH_BASEADDR;

// pack the red, green and blue thresholds into one register (one byte each)
search_addr[2] = (T_RED << 24) | (T_GREEN << 16) | (T_BLUE << 8);
search_addr[3] = CAM1; // read base address
search_addr[4] = MEM;  // write base address

int x_red, y_red, x_green, y_green, done;
while (1) {
    x_red   = search_addr[5] >> 16;
    y_red   = search_addr[5] & 0xFFFF;
    x_green = search_addr[6] >> 16;
    y_green = search_addr[6] & 0xFFFF;
    done    = search_addr[7];
    printf("RED (%4d, %4d) GREEN (%4d, %4d) => done: %d\n\r",
           x_red, y_red, x_green, y_green, done);
}

Figure 6 – State machine for reading from the DDR memory

Figure 7 – State machine for reading from the FIFO and recognizing the pixel color


Labels of the signals:

f_full: FIFO is full;

cmdack: bus2ip_mst_cmdack -- bus to IP master command acknowledgment;

cmplt: bus2ip_mst_cmplt -- bus to IP master transfer complete;

f_rd_en: FIFO read enable.

In figures 8 and 9 we can view a simulation of our custom IPcore in ISE Project Navigator. This simulation tests the SEARCH_POSITIONS_PROC process.

Figure 8 – Start of SEARCH_POSITIONS_PROC process


Figure 9 - End of the loop of SEARCH_POSITIONS_PROC process

Figure 8 shows the start of the process. It stays in the search_idle state while f_rd_en is zero. When this signal is set, the process changes to the search_color state and starts counting the red and green pixels. Figure 9 shows the end of the process loop. When the end_loop signal is set, it changes to the search_end_loop state, computes the x and y values for the red and green objects, sets the search_done signal and goes to the search_reset state, where it resets all signals and restarts the loop.
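The process itself is written in VHDL; purely as a software illustration of the transitions just described, a minimal C model could look like the sketch below. The state and signal names are taken from the figures; everything else (the step function, the int flags) is assumed for illustration.

typedef enum {
    SEARCH_IDLE, SEARCH_COLOR, SEARCH_END_LOOP, SEARCH_RESET
} search_state_t;

/* One step of the SEARCH_POSITIONS_PROC transitions described above;
 * f_rd_en and end_loop model the input signals, search_done the output. */
static search_state_t step(search_state_t s, int f_rd_en, int end_loop,
                           int *search_done)
{
    switch (s) {
    case SEARCH_IDLE:     /* wait until there is data to read from the FIFO */
        return f_rd_en ? SEARCH_COLOR : SEARCH_IDLE;
    case SEARCH_COLOR:    /* count red/green pixels until the image ends    */
        return end_loop ? SEARCH_END_LOOP : SEARCH_COLOR;
    case SEARCH_END_LOOP: /* compute x,y of both objects, flag 'done'       */
        *search_done = 1;
        return SEARCH_RESET;
    case SEARCH_RESET:    /* clear all signals and restart the loop         */
        *search_done = 0;
        return SEARCH_IDLE;
    }
    return SEARCH_IDLE;
}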


3.3.3 MATLAB INTERFACE

In this section, we present some examples of operating the robot through MATLAB, for debugging/testing without the FPGA. In table 2, we can view a simple example that turns the LED ON in servo motor 1. The packet is initialized with the default header 0xFF 0xFF, and at the end a Check-Sum is calculated to guarantee the command's reliability (detailed information is available in [4]).

ID 0x01 is the identification of servo motor 1.

Instruction 0x03 is the WRITE instruction for the servo's control table.

Parameter 1 0x19 is the address of the LED parameter.

Parameter 2 0x01 commands ON (0x00 is OFF).

Table 2 – AX-12A Robotic Arm communication protocol example
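The same packet can be built in code. The sketch below (in C, for consistency with the other listings; the project itself does this in buildPacket.m) follows the AX-12 instruction-packet format from [4]: header 0xFF 0xFF, ID, a length byte equal to the number of parameters plus two, the instruction, the parameters, and a Check-Sum that is the inverted low byte of the sum of everything after the header. The function names are hypothetical.

#include <stdio.h>

/* Build an AX-12 instruction packet per [4]; returns the packet length. */
static int build_packet(unsigned char *out, unsigned char id,
                        unsigned char instr,
                        const unsigned char *params, int nparams)
{
    int i, n = 0;
    unsigned int sum;
    out[n++] = 0xFF;                         /* header            */
    out[n++] = 0xFF;
    out[n++] = id;                           /* servo ID          */
    out[n++] = (unsigned char)(nparams + 2); /* length byte       */
    out[n++] = instr;                        /* instruction       */
    for (i = 0; i < nparams; i++)
        out[n++] = params[i];
    sum = id + (unsigned int)(nparams + 2) + instr;
    for (i = 0; i < nparams; i++)
        sum += params[i];
    out[n++] = (unsigned char)(~sum);        /* Check-Sum         */
    return n;
}

int main(void)
{
    /* LED ON for servo 1: WRITE (0x03) value 0x01 at address 0x19. */
    unsigned char params[] = { 0x19, 0x01 };
    unsigned char pkt[8];
    int i, len = build_packet(pkt, 0x01, 0x03, params, 2);
    for (i = 0; i < len; i++)
        printf("0x%02X ", pkt[i]); /* 0xFF 0xFF 0x01 0x04 0x03 0x19 0x01 0xDD */
    printf("\n");
    return 0;
}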

For more examples of instructions, refer to our project's MATLAB files. There are 3 files, which implement the Check-Sum function (buildPacket.m), test the robotic arm alone (Robot_move_TEST.m), and initialize the communication with the FPGA (FPGA_to_Robot.m).


4 DESCRIPTION OF DESIGN TREE

Our design directory contains the following folders:

RoboticArm Project

This folder contains our two main XPS projects - Search_Hardware and Search_Software -, a Documentation folder with the project documentation, and a README with explanations about how to use it.

The Search_Hardware and Search_Software folders contain the following files:

System.mhs: a higher level description of the hardware modules in the system

System.xmp: the system project file used by XPS

SDK: contains an XML description of the hardware that is exported by XPS and then used by the SDK to build a board support package

Data: contains the user constraint file (.ucf)

HDL: contains the top-level system file and a wrapper for each of the peripherals

Implementation: contains the synthesis files, bit files and initialization files for the BRAMs

Synthesis: has the output from XST

Pcores: user-defined peripherals

Workspace: main SDK workspace with ready-to-use bitstream files

README: top-level description of the content of this directory

The Documentation folder contains the following files:

Robotic Arm Manual

VmodCam Reference Manual

VmodCam Schematic

Xilinx Device Driver Programmer Guide


AXI Slave and Master Burst Datasheets

Atlys Schematic

A copy of this design report

5 PROJECT SCHEDULE

The original project schedule was the following:

Atlys PWM I/O interface to the servo motors – 1 week.

o Adjusting the Atlys board to properly generate PWM signals and send them to board outputs (probably a Pmod connector).

o If the Atlys board has some limitation using PWM, we will use an AVR ATMega microcontroller to control the servo motors of the robotic arm.

Integrating IP (VmodCAM) – 1 week.

o Integrate the VmodCAM 3D camera with the Atlys FPGA board, for image capture.

Colour recognition algorithm – 1 week.

o Improvement of our IP project for colour recognition, adjustment of the color thresholds.

Position calculation algorithm – 2 weeks.

o Improvement of our IP project for position recognition.

o We will try a new approach to be able to calculate multiple points of the same color.

System integration + optimization – 1 week.

o General system integration, testing, movement dynamics evaluation and system time-response optimization.

As the project advanced, we had to face and solve several issues related to the robotic arm communication protocol, the Color and Position Recognition algorithm, and its implementation in hardware (custom IPCore). Therefore, we slightly changed our schedule


to accomplish the key points first and fully implement the algorithm in software before moving to the VHDL implementation. The schedule was changed to:

AX-12A Robotic Arm Communication Protocol – 1 week.

o We focused all our efforts on learning, as soon as possible, the communication protocol of the AX-12A servo motors.

o We used MATLAB to test the initial communication with the robotic arm.

Color and Position Recognition Algorithm - Software – 1 week.

o Full software implementation and testing of our color and position recognition algorithm, to be implemented in our custom IPcore afterwards.

Integrating VmodCam and HDMI_Out IPcores – 1 week.

o The integration of the VmodCam and HDMI_Out IPcores made it possible to test and tune our Color and Position Recognition algorithm running in software.

System integration + optimization – 1 week.

o We integrated our FPGA with the robotic arm; at this point we mapped the robotic arm's servo limits and related these limits as a proportional gain from the operator's hand movement.

Custom IPcore creation – implementation of our Color and Position Recognition algorithm in VHDL – 2 weeks.

o We modeled our algorithm as a state machine, tested it with ISE Project Navigator and implemented it as a custom IPCore.

o We focused on making the IPcore work correctly before trying to improve the FIFO/burst functions that read from the DDR, or creating the FSL between our IPcore and the MicroBlaze processor.

Our group achieved all our proposed goals, but we ran out of time and could not improve our custom IPcore to work in real-time. The changes in the schedule were necessary in order to achieve minimal functionality, and they also gave us the assurance that our approach works.


6 REFERENCES

Hold the Ctrl key (or Shift, depending on your reader) and click one of the following links to go to the related website.

[1] Atlys Spartan-6 FPGA board reference manual

[2] AX-12A USB robotic arm

[3] VmodCam for Atlys FPGA board

[4] AX-12A communication protocol

[5] CP2102 USB to TTL module converter

