
Design and Realization of an Autonomous Underwater Robotic Vehicle

Sabrina Varanelli, Clionadh Martin, Stefano Mafrica, Fabio Boi, Silvio Traversaro

Abstract

Cristoforo is the 2011 entry for the AUVSI RoboSub competition, built by a team of five students from the European Master on Advanced Robotics and the University of Genova Robotics Engineering program in the short time span of four months. The team's goal for their first year was to design and realize a working vehicle capable of holonomic underwater motion, consisting of a symmetrical, octagonal physical structure, a rotating camera and torpedo launcher, a marker dropper, a depth sensor, a Doppler velocity sensor, hydrophones, and an inertial measurement unit. The design features custom-built thrusters that are mounted in a modular manner to give optimum placement in all situations and provide both vertical and horizontal motion in a holonomic manner. The electronics are housed in a cylindrical, vertically placed central hull structure. This structure is equipped with a pneumatic system for end-effector actuation as well as custom-made underwater electrical connectors and pneumatic line ports. The vision system integrates information from both the rotary camera and a bottom-facing stationary camera to give the robot a full picture of its environment, and the control algorithm uses the numerous on-board sensors to control the heave, sway and surge of the vehicle.


1 Introduction

Cristoforo is our team's first robot entered into the AUVSI RoboSub competition. Named for the great ocean explorer Cristoforo Colombo, who was born and raised in the Italian port city of Genova and later had to overcome many obstacles in order to achieve his goals, the realization of our robot is an example of perseverance in the face of adversity. The team, formed just four months before the competition, is a multidisciplinary and multinational first-year team made up of five graduate students from the European Master on Advanced Robotics (EMARO) and the University of Genova Robotics Engineering master's program. Our team consists of three Italian students with backgrounds in programming, electronics and controls, one Irish student and one American student, both with backgrounds in robotics and mechanical engineering. The team works in the Genoa Robotics and Automation Laboratory (GRAAL) at the Department of Informatics, Systems and Telecommunications (DIST) at the University of Genova. The team is co-sponsored by the EMARO program and the Interuniversity Center for Integrated Systems for the Marine Environment (ISME), through the European project Trident. We also have a working partnership with the Italian Consiglio Nazionale delle Ricerche¹ (CNR). This year's AUVSI RoboSub challenge has a Valentine's Day theme. The robot's goal is to make its way to "sweetie's" house by following a predefined path and accomplishing a variety of tasks along the way. During all these tasks the robot has to operate autonomously, without any help from a person or an off-board computer. The challenge tasks are outlined below. Because of time constraints, the team decided to attempt all tasks except surfacing with the vase.

¹National Research Council

1. Flower Collection: The robot must touch two of three different colored buoys in the order stated by the judges at the beginning of the match.

2. Lovers Lane: The robot must pass within a specified rectangular region.

3. Delivery: The robot must differentiate between four black bins marked with small and large Xs and Os to determine where to drop its markers.

4. Cupid Task: The robot must navigate to, recognize, and fire a torpedo through a small or large heart-shaped cutout in a two-sided panel. The torpedoes fired through the cutouts must match the color of the panel.

5. Vase Collection and Surfacing Task: Two acoustic pingers are placed under cylindrical objects, with large octagons floating on the surface of the water directly above them. The robot must navigate to the active pinger, pick up the object and surface within the octagon.

2 Project Methodology

Like most engineering design projects, the team started out with an elaborate plan to realize their goal of building a robot that could accomplish these tasks in such a short time. And, like most engineering projects, this plan turned out to be merely a rough guideline, and nearly all of the deadlines were missed. Even though the deadlines were not always met, the team still followed the basic project plan, which was composed of the following steps:

1. Problem Exploration

2. Conceptual Design

3. Detail Design



4. Simultaneous Design, Integration Analysis and Testing

5. Hardware and Software Testing, Evaluation and Redesign Cycle

3 Design Decisions

After analyzing the game and thinking strategically about the best way to effectively complete the game tasks, the team decided that it would be very beneficial to give the robot the ability to navigate holonomically through the water. The team reasoned that having this ability would allow for more accurate motion in the water, because the robot would not have to turn in order to drive along a fixed heading. By necessity of the holonomic design, it was decided that the robot would be symmetrical and would consist of an open-framed octagonal structure with a cylindrical, vertically placed central watertight hull containing the electrical components. The vision system would facilitate the holonomic motion by consisting of two cameras, one stationary and fixed to the bottom of the robot for tracking path markers, and the other coming out of the central hull like a periscope and having the ability to rotate. The rotary camera would give the robot the ability to analyze its environment with only a small amount of motion, as opposed to a fixed camera that would require rotating the entire robot in order to get a full view of its environment. It was decided that this rotating camera head would also be the ideal place to launch the torpedoes from, because this part of the robot can be aimed at the targets more accurately than the entire robot can be moved. The marker-dropping mechanism would be mounted towards the bottom of the robot, since the bottom-facing camera would be the one used to locate the marker bins.

4 Structure

The robot’s frame consists of a PVC struc-ture that holds a central cylindrical electri-cal pod constructed from acrylic. PVC waschosen as the building material for the framebecause it is inexpensive, simple to use andfabricate (with respect to the original idea ofwelding the frame) and it has a high strengthto weight ratio. The initial idea for the mainvertical hull structure, or the “pod”, was tohave a series of short interlocking cylindersthat contained electrical components in or-der to prevent damage to the entire system.This idea was discarded in favor of a lightersingle-cylinder option. This also reduced thenumber of electrical connectors required forthe design. Because the team went with amore cost effective method for making elec-trical connectors, they are relatively perma-nent and cannot be attached and detachedquickly. For this reason, the connectors usedfor thrusters, hydrophones and other com-ponents that would always be fixed to therobot needed to be semi-permanently mount-ed in the pod. For this reason, it was decidedthat the cylinder would contain a removablerack to house the electronics that could betaken out of the robot for maintenance andthe more permanently attached componentswould remain fixed to the bottom of the maincylinder. The wiring for these more perma-nent components would be disconnected fromthe rest of the electrical rack using quick re-lease electrical connections inside the pod atthe top, thus enabling the entire rack to comeout of the robot. The lid of the pod is sealedusing an o ring in static radial squeeze, andis held on using a set of four quick releaselatches that are attached to the pod using o-ring seals that surround attaching hardwareto prevent leaks. The custom made thrustersare mounted to the frame using pipe clamps.This modular mounting method also allowsfor the placement of the thrusters with re-



The robot was brought to the required buoyancy level using foam that is added to the top part of the frame once the robot is placed in the water.

5 Custom Thrusters

Because of the high costs associated with purchasing premade thrusters that allow for closed-loop control, the team decided to manufacture their own thrusters. To help simplify this task, the engineers at CNR provided the part number of the motor they use for their robotics projects, along with their propeller design to use as a mold, which has been optimized for low-velocity motion (i.e. it has minimal cavitation at low velocities). They also provided the team with their experimental data, in graph form, giving the thrust/rpm relationship for this motor and propeller combination. A waterproof housing was created for the motor using an o-ring compressed in a face-seal configuration to create a watertight container. One of our custom electrical connectors is fastened to one end, and the other side features a sealed output shaft from the motor that attaches to the propeller. The thruster is shielded by a large ring of PVC held in place by a bayonet fitting. Two options were considered for sealing the motor's output shaft. The first option was a magnetic coupling, which the team decided would be the safest way to create a watertight seal, because there is physically no way for water to enter around the output shaft: the motor is coupled to the propeller through a magnet acting through the walls of the container. This method, however, was found to be costly and heavy. Instead, the team found an Italian company called Roten that produces mechanical seals that use a ceramic and graphite bearing surface, with springs providing constant pressure on an o-ring while the propeller is in motion.

The company generously donated the seals for this project. Incidental leakage past the mechanical seal is hindered by a modified stuffing box created in the grease-filled cavity between the bearing and the mechanical seal. Fig.1 shows the design of the thruster.


Figure 1: (a) Overall view of the thruster design, (b) cross-section view of the thruster design.

6 Custom Connectors

The reliability and cost of the electrical connectors used on the robot were the two most important factors when choosing which kind to use. The team initially sought to purchase the electrical connectors from a supplier, but these were significantly over our budget. Based on input from our partners at CNR, the team designed an inexpensive and reliable solution based on a concept already tested by CNR. The base of the connector resembles a bolt with a hole drilled through its center and a cylinder bored into its head. A cable with polyurethane jacketing is inserted into the hole, and then polyurethane rubber is poured into the bored-out cylinder, where it hardens, bonding with the polyurethane jacketing and the porous brass surface that the connector is made from. A groove for an o-ring is cut into the underside of the bolt head, and the bolt is passed through a hole in the hull and secured with a nut on the inside "dry" part of the robot.



These connectors were also adapted to work with pneumatic polyurethane tubing, and they are the means by which the pneumatic tubes pass through the hull. Fig.2 shows a 3D model of the design.


Figure 2: (a) Front view of the custom electrical connector, (b) back view.

7 End Effectors

The team chose to use a pneumatic system to actuate the robot's marker dropper and torpedo launcher. Using pneumatics allowed for powered actuation of all end effectors this year, and gives the flexibility to improve the robot's design and add additional end effectors in future years. The pneumatic system consists of an air reservoir, a pressure regulator and four two-way solenoid valves: one for each of the torpedoes, and two for the reverse-acting cylinder.

The torpedo launcher is placed below the camera to take advantage of the camera rotation for correct aiming of the torpedoes. The torpedoes fit onto tubes, which are fitted with one-way valves to stop the inflow of water into our pneumatic circuit. The torpedoes were molded from polyurethane rubber and a cigar case. Large marbles were chosen as simple and cost-effective markers to drop into the bins. The two markers sit in a tube, with a double-acting cylinder placed beside the tube. A fixture mounted to the cylinder acts as a hatch, releasing a marble when the cylinder is fully extended and keeping the marbles in place when it is retracted.

8 Modeling and Control

The mathematical model of an underwater vehicle can be very complex, depending on the structure of the vehicle, in particular on how the vehicle is actuated (the number of thrusters, their position and orientation) and on how it is controlled, based on the kinematic chain describing the motion of the vehicle. In this application, the robot's structure and the particular placement of the cameras allow the motion to be described by the dynamic equations of a rigid body projected onto the robot's body frame. In this case there is no kinematic chain to be computed in order to project the equations of motion onto a fixed frame; instead, the reference velocities described in the camera's frame have to be projected onto the body frame. To do this, a two-dimensional rotation matrix must be computed from the top camera frame to the body frame (the z axis of the camera frame is aligned with the z axis of the body frame). Taking these considerations into account, the following simplified mathematical model of an AUV can be written in matrix form:

M ẍ + C(ẋ)ẋ + D(ẋ)ẋ = τ − τ_ext        (1)

where x = [x, y, z, α, β, γ]^T is the 6-dimensional vector containing the linear and angular positions with respect to the x, y, z axes, M = diag(m, m, m, I_x, I_y, I_z), with m the mass of the robot and I_i the moment of inertia about the i-th axis, C(ẋ) and D(ẋ) = −diag(D_x ẋ, D_y ẏ, D_z ż, D_α α̇, D_β β̇, D_γ γ̇) are respectively the Coriolis and the drag matrices, and τ and τ_ext are the applied and the external torques. The equations in (1) can be simplified because the roll (α) and the pitch (β) do not need to be controlled, since their stability is ensured by the correct positioning of the COB and COG, so only the state variables x = [x, y, z, γ]^T need to be considered.



Also, the motors have a high torque relative to the external torque, so τ_ext is negligible. Additionally, the robot is holonomic, so the applied torque can be written as τ = [u_x, u_y, u_z, u_γ]^T = Bf, where f = [f_1, f_2, f_3, f_4, f_z], with f_z = ∑_{i=5}^{8} f_i (the sum of the four vertical thrusts), is the vector containing the thrusts f_i produced by each motor, and B is the matrix that projects these thrusts onto the x, y, z axes. The resulting equations of the simplified system are:

ẍ₁ = ẋ₂ẋ₃ + (D_x/m) ẋ₁² + u_x/m
ẍ₂ = −ẋ₁ẋ₃ + (D_y/m) ẋ₂² + u_y/m
ẍ₃ = (D_z/m) ẋ₃² + u_z/m
ẍ₄ = (D_γ/I_z) ẋ₄² + u_γ/I_z        (2)

where [x₁, x₂, x₃, x₄]^T = [x, y, z, γ]^T.
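As an illustration of how the simplified model (2) can be used to predict the vehicle's behavior, the following minimal C++ sketch forward-integrates it with Euler steps. It is not the team's code: the mass, inertia, drag coefficients and thrust command are placeholder values (the drag coefficients are taken negative so that the quadratic terms oppose the motion, consistent with the sign convention in (1)).

    // Minimal sketch: Euler integration of the simplified body-frame model (2).
    #include <array>
    #include <cstdio>

    struct Params {
        double m  = 40.0;                                       // robot mass [kg] (placeholder)
        double Iz = 5.0;                                        // yaw inertia [kg m^2] (placeholder)
        double Dx = -8.0, Dy = -8.0, Dz = -10.0, Dg = -2.0;     // drag coefficients (placeholders)
    };

    // v = body-frame velocities [vx, vy, vz, vgamma], u = applied forces/torque [ux, uy, uz, ugamma].
    std::array<double, 4> accel(const std::array<double, 4>& v,
                                const std::array<double, 4>& u, const Params& p) {
        return {
             v[1] * v[2] + (p.Dx / p.m)  * v[0] * v[0] + u[0] / p.m,   // surge, as written in (2)
            -v[0] * v[2] + (p.Dy / p.m)  * v[1] * v[1] + u[1] / p.m,   // sway
                           (p.Dz / p.m)  * v[2] * v[2] + u[2] / p.m,   // heave
                           (p.Dg / p.Iz) * v[3] * v[3] + u[3] / p.Iz   // yaw
        };
    }

    int main() {
        Params p;
        std::array<double, 4> v{0, 0, 0, 0};
        std::array<double, 4> u{20.0, 0.0, 5.0, 0.0};   // constant thrust command (placeholder)
        const double dt = 0.01;
        for (int k = 0; k < 1000; ++k) {                // 10 s of simulated motion
            auto a = accel(v, u, p);
            for (int i = 0; i < 4; ++i) v[i] += dt * a[i];
        }
        std::printf("v = [%.2f %.2f %.2f %.2f]\n", v[0], v[1], v[2], v[3]);
        return 0;
    }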

The control scheme for the motion can be summarized as shown in Fig.3.

Figure 3: Control scheme for the motion of the robot.

The augmented state of the system X = [x_p^T, x_v^T]^T = [z₀, γ₀, x, y, z, γ]^T is measured by different sensors (see section 9), providing a noisy measurement that is filtered by an extended Kalman filter (EKF) to obtain an optimal estimate of the state.²

The EKF and the controller that computes the reference control u* for the motors are implemented on the PC board, as is the computation of the desired velocities x_v* given to the controller. The reference control is sent to the servo controllers (see section 9), which directly drive the motors. The desired velocities x_v* are computed in the navigation algorithm by the vision system, for x and y, and by comparing the actual position x_p with the desired position x_p*, for z and γ. Two types of control are implemented: PI and computed torque. Only one of them can be used at a time, selected with an enable/disable flag; which control to use will be decided during the testing phase. In order to get an initial idea of the gains to be set in the controller, a model of the system was implemented in Simulink, including the saturation of the control at the maximum thrust the motors can produce, and several simulations were run varying the control gains. The plots in Fig.4 and Fig.5 show the behavior of the x and y velocities and of the z and γ positions for constant desired values, in Fig.4 for several values of the proportional gain (K_P) and the integral gain (K_I), and in Fig.5 for several values of the proportional gain (K_P) and the relative error on the C and D matrices (C_err, D_err).³

²With abuse of notation, z₀ and γ₀ in x_p represent respectively the depth and the heading of the robot with respect to a fixed frame set at the beginning of the match. These two absolute positions are needed because they remain almost constant during navigation, except while accomplishing the tasks, so a control loop based only on desired velocities would not be enough. Conversely, the absolute positions x₀ and y₀ cannot be obtained with any available sensor, since the pool is too big and is not square, and they are not really needed, since the relative positions are provided by the vision system.

³For the sake of simplicity, K_Pi = K_P and K_Ii = K_I for i = x, y, z, γ.




Figure 4: x and y velocities and z and γ positions, varying K_P in [1, 10], with: (a) K_I = 1, (b) K_I = 10.


Figure 5: x and y velocities and z and γ positions, varying K_P in [1, 10], with: (a) C_err = D_err = 0%, (b) C_err = D_err = 100%.

Comparing Fig.4 and Fig.5, it can easily be seen that the PI control is much more robust than the computed torque control, since it is not as sensitive to mismatch between the real and the estimated parameters of the system. On the other hand, if the parameters are well estimated (low percentage error), computed torque control generally has higher performance than PI. From the results of the different simulations using PI control, it can also be inferred that low values of K_P and K_I cause higher overshoot in the x and y velocities and lower overshoot in the z and γ positions, and vice versa for high values of K_P and K_I; for this reason, the best option can only be chosen during the testing phase. Note that for K_P, K_I ≥ 1 the times needed to reach equilibrium are almost equal; this is due to the saturation of the control.
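For reference, the structure of the PI velocity loop with thrust saturation that was simulated is sketched below. This is only an illustrative C++ fragment, not the team's implementation: the gains and the saturation limit are placeholders, and the simple anti-windup step is an addition of this sketch (the report does not say whether one was used).

    // PI control of one velocity channel with saturation at the maximum thrust.
    struct PiChannel {
        double kp, ki;        // placeholder gains, to be tuned in the testing phase
        double u_max;         // maximum thrust the motors can produce [N]
        double integral = 0.0;

        double step(double desired, double measured, double dt) {
            double error = desired - measured;
            integral += error * dt;
            double u = kp * error + ki * integral;
            if (u > u_max)       { u = u_max;  integral -= error * dt; }  // saturate and
            else if (u < -u_max) { u = -u_max; integral -= error * dt; }  // undo windup (illustrative)
            return u;
        }
    };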

9 Actuators and Sensors

The robot is actuated by eight brushless Maxon EC-4pole thrusters equipped with Hall sensors. The positions of the thrusters are symmetric with respect to the z axis of the robot. Four of them are mounted horizontally on the xy plane containing the COG, and the other four are mounted vertically, on the xy plane midway between the COG and the COB.

This positioning guarantees that the thrust produced by the horizontal thrusters does not affect the pitch and roll stability, so that the equations in (2) hold. The control for each motor is provided by a Maxon DECV 50/5 digital 4-Q-EC amplifier that takes an analog reference voltage from the Arduino and sends the control signals to the three motor windings, using the feedback from the Hall sensors in the motor. More precisely, the Arduino produces a PWM signal that is filtered by a second-order filter⁴, so that the desired speed of the motor (in rpm) is set by tuning the duty cycle of the PWM. The linear velocities of the robot along its three axes are measured both directly, by the Doppler Velocity Log (DVL), and indirectly, by integrating the accelerations provided by the accelerometers in the Inertial Measurement Unit (IMU). The angular positions and velocities are measured directly by the magnetometers and gyros in the IMU, while the depth (the absolute position z₀) is obtained indirectly by converting the pressure measured by the Setra 209 pressure sensor. The relative position of the robot with respect to the objects it encounters is provided by the two cameras, as explained in section 10.
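A minimal avr-libc style sketch of the duty-cycle encoding described above is given below (section 11 notes that the team programmed the ATmega2560 with avr-libc rather than the Arduino API). The PWM pin, frequency and full-scale speed are assumptions for illustration; the real scaling has to match the analog input range of the servo controller after the RC filter.

    #include <avr/io.h>

    static const uint16_t PWM_TOP = 799;       // 16 MHz / (799 + 1) = 20 kHz PWM (assumed frequency)
    static const float    MAX_RPM = 10000.0f;  // placeholder full-scale speed

    void pwm_init(void) {
        DDRB  |= _BV(PB5);                               // OC1A output (Arduino Mega pin 11, assumed)
        TCCR1A = _BV(COM1A1) | _BV(WGM11);               // fast PWM, non-inverting on OC1A
        TCCR1B = _BV(WGM13) | _BV(WGM12) | _BV(CS10);    // mode 14 (TOP = ICR1), no prescaler
        ICR1   = PWM_TOP;
        OCR1A  = 0;
    }

    // Encode the desired motor speed as a duty cycle; the second-order analog
    // filter turns this into the reference voltage for the servo controller.
    void pwm_set_rpm(float rpm) {
        if (rpm < 0.0f)     rpm = 0.0f;
        if (rpm > MAX_RPM)  rpm = MAX_RPM;
        OCR1A = (uint16_t)((rpm / MAX_RPM) * PWM_TOP);
    }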

10 Vision and Navigation

The rotating top camera is a SeaView Seamaster Supermini underwater video camera. Its rotation is achieved using a magnetic coupling between a rotary axle, housed in a support cone, and a Phidgets NEMA-23 bipolar stepper motor driven by a PhidgetStepper Bipolar 1-Motor controller that takes the desired rotation angle directly from the PC board.

⁴The filter is realized with an operational amplifier in inverting configuration, on a PCB designed in Cadence OrCAD 16.3.



The stationary camera is a Logitech HD C310 5Mpx webcam that acquires RGB images at resolutions up to 720p. This camera provides a constant view of the environment below the robot, so that the path markers and the bins in the marker-dropper challenge can be seen. To simplify the design and lighten the computations for the main processor, it was decided to use the knowledge of the real dimensions of the various objects to compute depth in a frame, as opposed to using stereo-vision techniques. To do this, a simple proportion is computed between the known real dimensions and the dimensions of the object in the image frame. This is possible because all of the dimensions of the objects of interest are given to the teams in the rules documents. The depth in the image, Z_im, can be found with formula (3):

Z_im = f · d_real / d_pixel        (3)

where Z_im corresponds to the z axis of the body frame for the bottom camera, or to the x axis for the top camera, f is the focal length of the camera, d_real is the real dimension of the object and d_pixel is its dimension in pixels. The code development was divided into two parts: the first was a feasibility study using MATLAB to determine which vision processing techniques would work best, and the second involved optimizing these methods in C++ with the OpenCV library.
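As a small illustration, formula (3) is a one-line function; the focal length below is expressed in pixels and would come from the camera calibration (the value passed in is only an assumption of the caller).

    // Monocular range estimate from a known object size, formula (3).
    double range_from_known_size(double d_real_m,        // real dimension of the object [m]
                                 double d_pixel,         // measured dimension in the image [px]
                                 double focal_length_px) // f, in pixels (from calibration)
    {
        return focal_length_px * d_real_m / d_pixel;     // Z_im = f * d_real / d_pixel
    }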

10.1 Path Following

To follow the competition course, the robot first detects the orange color of the path marker using the bottom-facing camera. To detect the orange, the images are first transformed into the HSV space and the hue histogram is analyzed.

To get an orange reference value to compare the images against, the team researched and found videos of past competitions in various lighting conditions and computed the mean and standard deviation of the orange values of the markers. Next, to determine the heading that the robot must achieve, the program creates a binary map representing the edges of the objects and then applies a linear Hough transform to the binary image to detect the direction of the lines, as shown in Fig.6.



Figure 6: Images representing the main steps of the path following: (a) orange buoy, (b) hue values of the orange, (c) edge detection, (d) Hough transform.
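A minimal OpenCV C++ sketch of these steps is shown below. The hue band, Canny thresholds and Hough parameters are placeholders, not the values the team derived from the competition footage.

    #include <opencv2/opencv.hpp>
    #include <vector>

    // Segment the assumed orange hue band, build an edge map and run a linear
    // Hough transform; returns the angle of the strongest line (radians).
    double path_heading(const cv::Mat& bgr) {
        cv::Mat hsv, mask, edges;
        cv::cvtColor(bgr, hsv, cv::COLOR_BGR2HSV);
        cv::inRange(hsv, cv::Scalar(5, 100, 100), cv::Scalar(25, 255, 255), mask);  // assumed orange band
        cv::Canny(mask, edges, 50, 150);                                            // binary edge map
        std::vector<cv::Vec2f> lines;
        cv::HoughLines(edges, lines, 1, CV_PI / 180, 80);                           // linear Hough transform
        if (lines.empty()) return 0.0;
        return lines[0][1];                                                         // theta of the strongest line
    }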

Next, the robot determines its distance from the orange marker using formula (3). Once the robot has the distance, it can choose one point of interest and, using that point's x_pixel and y_pixel coordinates (expressed in pixels), obtain the real-world x, y, z coordinates that the robot has to follow through the equations in (4):

x = x_pixel · z / f
y = y_pixel · z / f        (4)

Once the robot obtains these three x, y, z values, they are passed to an external function that computes the desired velocities x_v, which are given to the main control loop as shown in Fig.3.



10.2 Task Navigation

The vision processing approach for task identification and completion is generalized below; a sketch of one instance of this pipeline follows the list.

1. Acquire the relevant image from either the top camera or the bottom camera, depending on the task

2. Search, in the HSV space of the image, for the hue component corresponding to the dominant color in the current task

3. When a significant amount of the dominant color is detected, build an edge map and apply a Hough transform (either circular or linear) to find edges that describe the relevant shapes for each task

4. Combine information from both the image segmentation and the Hough transform to acquire all the information required to complete the task:

• Alignment of the robot with respect to task objects (determined by the angles of lines from Hough line transforms)

• Coordinates of regions of interest for the task (X's and O's, alignment points on the targets, buoy locations and distances from their centers)

• Pixel dimensions of objects in the scene (to compute the distance from the object)

• Real-time supplementary information for the main control algorithm while the robot is carrying out its task, obtained by continuously computing the positions of objects in the frame in 3D space with respect to the robot's current location
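As a concrete instance of the pipeline, the following OpenCV C++ sketch segments a task's dominant hue and applies a circular Hough transform to locate a buoy. All thresholds and parameters are placeholders rather than the team's tuned values.

    #include <opencv2/opencv.hpp>
    #include <vector>

    // Segment an assumed hue band for the buoy color, then find circles; the centre is
    // used for alignment and the radius gives the distance via formula (3).
    bool find_buoy(const cv::Mat& bgr, cv::Vec3f& circle_out) {
        cv::Mat hsv, mask;
        cv::cvtColor(bgr, hsv, cv::COLOR_BGR2HSV);
        cv::inRange(hsv, cv::Scalar(0, 120, 80), cv::Scalar(10, 255, 255), mask);   // assumed red band
        cv::GaussianBlur(mask, mask, cv::Size(9, 9), 2.0);                          // smooth the mask
        std::vector<cv::Vec3f> circles;
        cv::HoughCircles(mask, circles, cv::HOUGH_GRADIENT, 1, mask.rows / 4, 100, 20, 10, 0);
        if (circles.empty()) return false;
        circle_out = circles[0];   // (x, y, radius) in pixels
        return true;
    }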

11 Computing Environment

An ADVANTECH PCM-3362, equipped with an Intel Atom N450 processor, was chosen as the main processor for the robot in order to handle the intense vision processing needed to accomplish the challenge tasks. An Arduino MEGA 2560, equipped with an Atmel AVR ATmega2560 microcontroller, was used to gather data from the DVL, IMU and depth sensor and to command the motor controllers, mainly because the PC board did not have a sufficient number of I/O ports to handle all of these tasks. The PC board and the Arduino are connected via USB, emulating a serial interface. A custom packet format has been designed for the communication between the Arduino and the PC board. The function of the Arduino is twofold. The first task is gathering data packets from the IMU and DVL, and gathering analog values from the depth sensor. This data is then sent to the PC board for use in the control scheme. The packets from the IMU and DVL are specified by the manufacturer and therefore contain unnecessary data that we do not use in our application, so the program cleans the data before sending it to the PC board for processing. The second task is to receive the command packets from the board for commanding the motor controllers and the pneumatics controller. Instead of using the limited Arduino API, which did not have the frequency choices or serial communication protocol that were needed, the lower-level avr-libc library was used, together with the avr-gcc compiler, to create customized code for our application. On the PC board, one thread is tasked with reading the packets from the serial interface and filling the buffer containing the last available raw data from the sensors. A second thread contains the EKF code and uses the raw data from the sensors to compute an estimate of the variables needed for the control.



These values are then also placed in a buffer. This buffer is used by a control thread, which also uses a buffer with reference values provided by the planning routines. The control thread computes the input values for the motors and then sends the appropriate packets to the serial interface. This flow of information between the Arduino and the PC board can be seen in Fig.7.

Figure 7: Data flow of the main threads on the PC board.
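The report does not specify the custom packet format, so the structure below is purely hypothetical; it only illustrates the kind of fixed-size, checksummed frame that could carry the cleaned sensor values from the Arduino to the PC board.

    #include <cstdint>

    #pragma pack(push, 1)
    struct SensorPacket {                 // hypothetical layout, every field is an assumption
        uint8_t  header;                  // start-of-packet marker, e.g. 0xA5
        uint8_t  type;                    // 0 = IMU, 1 = DVL, 2 = depth, ...
        float    payload[3];              // only the cleaned values the EKF needs
        uint8_t  checksum;                // XOR of all preceding bytes
    };
    #pragma pack(pop)

    uint8_t xor_checksum(const uint8_t* data, int n) {
        uint8_t c = 0;
        for (int i = 0; i < n; ++i) c ^= data[i];
        return c;
    }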

12 Power

The power is supplied to the robot by three 24V 10Ah NiMH batteries connected in parallel, in order to achieve enough capacity for the robot to work for more than 15 minutes (the duration of the competition).⁵ In particular, the motors, as well as the DVL and the IMU, work at 24V, the Arduino, the top camera and the stepper motor at 12V, while the pressure sensor and the PC board work at 5V. The power is distributed to each component of the robot through the PCB (Printed Circuit Board) shown in Fig.8. The PCB is mainly composed of two DC/DC converters that convert respectively 24V to 12V and 12V to 5V, different terminals that fit the sizes of the wires coming to and from the board, and resettable fuses of different hold currents to protect all electronic components from the high currents that could be caused by faults. It also contains an additional area with four relays and four diodes, acting as an electrical interface to switch the pneumatic actuators for the torpedo shooters and the marker dropper on and off (see section 7). This electrical interface is needed because the current required by the pneumatic actuators is on the order of 1A, so the Arduino cannot drive them directly, since it can provide only 10mA on each port. Each corresponding digital port of the Arduino is therefore connected to the coil of a relay, with a diode in parallel to suppress the voltage peak produced by the coil during the on-off transient.⁶ The PCB was designed with Cadence OrCAD 16.3 and was built in the technical laboratory of DIBE.⁷

Figure 8: PCB for the power distribution.

⁵The estimated maximum power consumption is 1000W (each motor is rated at 120W), which corresponds to a current draw of about 84A; with three 10Ah batteries in parallel the capacity becomes 10Ah × 3 = 30Ah, which means a duration of about 20 minutes if the motors work at maximum power.

⁶An Axom DPDT relay is used, with a coil resistance of 500Ω, so that the Arduino can directly provide the current required to close the relay contact (5V / 500Ω = 10mA) without using a transistor to amplify the current.

⁷Department of Biophysics and Electronics
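A minimal Arduino-style fragment of the relay interface is sketched below. The pin number and pulse length are assumptions; the report only states that a digital port drives each relay coil (with a flyback diode) to switch a solenoid valve.

    // Drive one solenoid valve through its relay (hypothetical pin and timing).
    const int TORPEDO_RELAY_PIN = 22;          // assumed digital pin on the MEGA 2560

    void setup() {
        pinMode(TORPEDO_RELAY_PIN, OUTPUT);
        digitalWrite(TORPEDO_RELAY_PIN, LOW);  // relay open, valve closed
    }

    void fire_torpedo(unsigned long pulse_ms) {
        digitalWrite(TORPEDO_RELAY_PIN, HIGH); // energize the 500-ohm coil (about 10 mA from the port)
        delay(pulse_ms);                       // keep the valve open briefly
        digitalWrite(TORPEDO_RELAY_PIN, LOW);
    }

    void loop() {}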



13 Conclusion

Overall, this project was a great way for team members not only to gain the real-world experience that comes with completing a project of this magnitude, but also to learn about the challenges (the language barrier, for instance) and advantages of working as an international and multidisciplinary group. There are a few things that the team would do differently if they were to complete the project again, using all the experience gained so far, but all in all, we are proud of what we have been able to accomplish in such a short amount of time.


