
FEFU RoboSub 2016
Autonomous Underwater Vehicle

Anton Tolstonogov, Vladislav Goi, Maksim Sporyshev, Sergey Kulik, Roman Babayev, Artem Kostianko, Ivan Chemezov, Igor Blinov, Tatiana Ian, Kirill Podberezin, Oleg Snegirev, Olga Riabaia

Far Eastern Federal University
8, Sukhanova str., 690950, Vladivostok, Russia
Email: [email protected]

Abstract—This paper describes the development of the AUV which our team has prepared for RoboSub 2016. The major goals of the vehicle are the competition, the research and development of vision-based navigation and control methods, and the research of group control methods.

Keywords—Autonomous underwater vehicle, RoboSub 2016.

I. INTRODUCTION

Our team has been participating in RoboSub for the last 4 years, and each year we have consistently reached the final stage of the competition. Each year gives us new lessons in robotics development. Over the four years of participation we have twice started developing the vehicle from scratch, building on the experience of previous years. All components of the vehicle, both the electronics design and the software, have been upgraded and sometimes redesigned from scratch.

We prefer an evolutionary path of vehicle development, in which only one part of the AUV changes significantly at a time. For example, last year we substantially redesigned the frame of the vehicle while leaving the other parts almost untouched. This year we have kept the design of last year's vehicle and partially renovated the electronics. Our main achievement of the year, however, is a radically modified software architecture: we have completely migrated to ROS.

Figure 1 shows the implemented vehicle.

Fig. 1: The implemented vehicle.

II. HARDWARE

A. General construction

The approximate dimensions of the vehicle are 0.9 × 0.5 × 0.4 meters, and its weight is 30 kilograms.

The frame of the vehicle consists of two main vertical polypropylene plates, which hold all the equipment by means of special clamps.

The following housing layout is applied:

• the electronics unit, front camera, and depth sensor in the front of the AUV;

• the battery unit and side thruster in the center;

• the DVL unit, bottom camera, the air housing for the pneumatic actuators, and the two remaining horizontal thrusters in the back.

Two vertical thrusters are located at the front and back of the vehicle.

A compass, a Wi-Fi access point, and the LED of the control-emergency system are placed in a separate transparent plastic housing located in the upper section of the vehicle, as far as possible from the thrusters.

The hydrophones are spaced at the four corners of the vehicle and mounted on the inner side of the bottom.

A block diagram of the vehicle is shown in fig. 2.

B. Frame and housings

The vehicle construction consists of a rigid frame made of polypropylene sheets. This material suits our vehicle perfectly: it does not change its properties under water, it is easy to machine, and its density is lower than that of water while it is still strong enough. All the hardware is attached by clamps made of the same material as the frame. They hold the hardware firmly and can be easily detached together with the hardware for technical maintenance, repair, etc. Prepared sketches are forwarded to water-jet cutting, where all the frame components are cut from a flat polypropylene sheet.

A render of the vehicle frame with the new propulsion system is shown in fig. 3.

All electronics housings are custom-built. They are manufactured from aluminum and then oxidized for corrosion prevention. We put special focus on the waterproofing of all electronics housings; it is accomplished with O-rings sized according to each housing.

C. Sensors

The navigation system is represented by three sensors:

1) Depth sensor (model PD100). The accuracy of this sensor is about a couple of centimeters, which is sufficient to stabilize at a given depth during missions.

Fig. 3: Render of the vehicle frame with the new propulsion system.

2) Xsens MTi inertial measurement unit. It integrates a three-axis gyroscope, accelerometer, and magnetometer, and implements a Kalman filter, which provides sufficient accuracy for the stabilization of roll, pitch, and heading.

3) Doppler velocity log from RD Instruments. We integrate the measured velocities to obtain the current coordinates. These coordinates are then used to return to the vicinity of points of interest, which is very useful when executing a sequence of missions.

These sensors allow us to control the vehicle in 5 out of 6 degrees of freedom.

Video system. This system includes two professional cameras (model: Prosilica GC 1380). One of them is directed forward and the other one is directed down. The cameras work in 400 × 300 px mode (maximum resolution: 1360 × 1024) and allow us to adjust exposure, color correction, and frame resolution over a fairly wide range, in both manual and automatic modes. This greatly facilitates computer vision detection, helping to deal with the adverse conditions of open water. The cameras can operate at 30 frames per second at maximum resolution, but we work at a frequency of 5-7 frames per second.

Fig. 2: Block diagram of the vehicle.

Finder sonar. The AUV uses a digital signal processor based on an STM32F4 Discovery board with an integrated DSP unit and 4 hydrophones to perform the pinger task.

Safety system. Safety sensors are mounted in every waterproof housing. If one of them is activated, a warning message is immediately sent to the operator's GUI program. The control-emergency system of the vehicle signals a dangerous situation if the vehicle is under water and not connected to the operator.

D. Actuators

For successful task execution the AUV is equipped with a pneumatic system for torpedo shooting. It contains 2 tubes with aluminum torpedoes inside. The tubes are linked to an air bottle, which opens right after the supervisor's signal.

The AUV is also equipped with a system for dropping cargoes based on electromagnets.

Fig. 4: Render of the pneumatic actuator.

We have designed a pneumatic grabber to deal with the new buckets (fig. 4).

All actuators are placed near the corresponding cameras so that they work efficiently with computer vision.

There are 5 thrusters on the AUV: 2 vertical and 3 horizontal. The thrusters use 24-volt Faulhaber motors and polyurethane propellers.


E. Computers

Two identical computers are used for calculations: Cool RoadRunner 945GSE single-board PC/104-Plus computers with an Intel Atom N270 processor (1.6 GHz, 1 GB DDR2, 533 MHz). Most of our software modules (including the mission module) and the sensor data processing run on the mission computer. The second computer processes computer vision and acoustics.

F. Low-level devices

Supervisor. A supervisor board of our team's own design is placed in the electronics hull. The board is based on an STM32F207 microcontroller and has an Ethernet interface for communication. This board digitizes the data from the depth sensor and collects the data from all water sensors as well as the voltage and amperage of the battery. It transmits the thruster control signals and controls the power of the actuators. The low level of the control-emergency system is also implemented on the supervisor.

Digital signal processor. For processing the acoustic pinger signals we use a separate signal processor board. The first step is amplification of the signals from the 4 hydrophones, followed by passive filtration and voltage level conditioning for further digitizing. After that, the digitizing and further processing of the four channels is carried out by an STM32F407VGT6 microcontroller, which supports DSP instructions. The controller performs digital filtration with a narrow bandpass IIR filter, and if the signal is received in all four channels, the relative signal delay times are calculated. The calculated times are then sent over USB to the computer that handles the acoustic data processing.
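As an illustration of the filtration step, the following is a minimal sketch of one direct-form-I biquad stage of a narrow bandpass IIR filter, of the kind that could run per channel on such a microcontroller. The coefficients are placeholders: in practice they would be designed offline for the pinger frequency and the ADC sample rate.

```cpp
// One direct-form-I biquad stage of a bandpass IIR filter.
// Coefficients b0..a2 are placeholders (a0 is normalized to 1).
struct Biquad {
    float b0, b1, b2, a1, a2;  // designed offline for the pinger band
    float x1 = 0, x2 = 0;      // previous inputs
    float y1 = 0, y2 = 0;      // previous outputs

    float process(float x) {
        float y = b0 * x + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2;
        x2 = x1; x1 = x;
        y2 = y1; y1 = y;
        return y;
    }
};
```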

G. Power supply

The power to run our vehicle is provided by a 5 Ah lithium-polymer battery, which is located in a separate housing with a detachable connector. If necessary, we simply remove the discharged battery housing and replace it with a charged one. This approach saves a lot of time and is safe enough. Battery power is supplied to the main electronics unit, where it is distributed among all devices.

Fig. 5: Power scheme.

Our power system scheme is presented in fig. 5.

III. SOFTWARE

A. ROS structure

This year we decided to use the Robot Operating System as an interprocess communication framework, together with all the utilities that come with it. ROS provides us with many features, among them a message passing system compatible with a wide range of ROS packages. Recording and playback of messages, with later visualization of camera images and plotting of numerical values, helps us analyze the mission execution process.

In our system we decided to follow the convention-over-configuration principle. Our messages are divided into two groups: commands and data messages. Only one module can receive a command of a specific type, but any module may send it. Only one module can send a data message of a specific type, but any module may subscribe to it. Therefore, every command is kept in a topic named after the receiving node plus the message type, and every data message is kept in a topic named after the sending node plus the message type. This convention allowed us to build a simpler and more convenient messaging system with the help of ROS. An exception from this convention is still possible, but then the raw ROS functions must be used to subscribe to the topic. Our nodes, visualized by rqt, are shown in fig. 6.
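A minimal roscpp sketch of this naming convention follows. The node, topic, and message names are invented for illustration; the paper does not give the actual message definitions.

```cpp
#include <ros/ros.h>
#include <std_msgs/Float64.h>

// Sketch of the convention: a data topic is named after the SENDING
// node plus the message type; a command topic after the RECEIVING
// node plus the type.
int main(int argc, char** argv) {
    ros::init(argc, argv, "navig");
    ros::NodeHandle nh;

    // "navig" publishes depth data on its own behalf: /navig/MsgDepth.
    ros::Publisher depth_pub =
        nh.advertise<std_msgs::Float64>("/navig/MsgDepth", 10);

    // A command addressed to the "motion" node lives under its name,
    // so any node may publish to /motion/CmdFixDepth.
    ros::Publisher cmd_pub =
        nh.advertise<std_msgs::Float64>("/motion/CmdFixDepth", 10);

    ros::Rate rate(10);
    while (ros::ok()) {
        std_msgs::Float64 depth;
        depth.data = 1.5;  // placeholder depth in meters
        depth_pub.publish(depth);
        rate.sleep();
    }
    return 0;
}
```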


Fig. 6: ROS node graph of our system.

Fig. 7: Example of our webGUI.


B. WebGUI

This year we created a web-interfaced console to visualize numerical message data in real time. It uses the roswww package as a server running on a vehicle computer and roslib.js to implement the client side. We can plot numerical values over time and send commands to the vehicle through a web page, which can be accessed from any device, even one without ROS on it. We can also visualize navigation data on a map for objects detected by the vehicle. The web GUI console is one of our most useful tools. Figure 7 shows an example of the web interface.

C. Navigation node description

The navigation node is developed to solve two main problems:

1) Filtering and broadcasting data from the IMU, DVL, and other sensors.

2) Path calculation based on velocity and heading information.

The first problem concerns abstracting the consumers of navigation data from the sensor data providers. For example, the motion node uses a velocity calculated by the navig node during motion stabilization, without worrying about which sensor actually delivers this speed. In turn, the navigation node receives messages from the DVL and publishes the speed on its own behalf as soon as it is received. If the data become obsolete, the navigation node corrects the old DVL data using acceleration data provided by the compass.

The path calculation problem is solved by the classic method: integration of the vehicle's speed, taking the orientation data into account.
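A minimal sketch of this dead-reckoning step, assuming body-frame velocities from the DVL and a heading angle from the IMU (the function and variable names are illustrative):

```cpp
#include <cmath>

// Dead-reckoning update: rotate the body-frame velocity by the current
// heading and integrate it to advance the position estimate.
struct Position { double x = 0, y = 0; };  // pool-fixed frame, meters

void integrateStep(Position& p,
                   double vx_body, double vy_body,  // m/s from the DVL
                   double heading_rad,              // from the IMU
                   double dt) {                     // time step, seconds
    p.x += (vx_body * std::cos(heading_rad) - vy_body * std::sin(heading_rad)) * dt;
    p.y += (vx_body * std::sin(heading_rad) + vy_body * std::cos(heading_rad)) * dt;
}
```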

D. Mission node description

The mission node contains the motion and action logic of the vehicle.

This node consists of separate tasks, which are executed in a linear sequence. The order of execution is described in a set of configuration files. There is a global map that stores information about recently found orange stripes. This information is used for vehicle positioning before starting the next scheduled task.

Our task execution scheme, carried over from previous years, is quite similar to a state machine: the set of states for each task is explicitly formulated. We have strictly structured our code in accordance with that and developed a corresponding object-oriented structure. A typical action in one of the states is stabilization of the vehicle in front of an object in the video frame.

YAML is used for the mission configuration files. Each task reads parameters from three sources in the following order:

• task.yml

• <task name>.yml

• mission.yml

The task.yml file describes default parameters which are common to all tasks. A separate file for each kind of task gives parameters common to all instances of that kind. The mission.yml file contains the parameters of the whole mission. A task may have about 50 parameters, and this approach allows us to decrease the size of the main configuration file and makes preparation for a run easier. The mission execution time can be critical this year, so it is important to stabilize at objects of interest faster. To deal with that, we have added a differential component to the vision regulator and adjusted its coefficients using ROS tools.
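A hypothetical illustration of this layering, assuming that later sources override earlier ones (the task kind, parameter names, and values are invented for the example):

```yaml
# task.yml: defaults common to all tasks
timeout: 60.0
stabilization_tolerance: 0.05

# gate.yml: defaults for every instance of a "gate" task
timeout: 90.0              # overrides the global default

# mission.yml: parameters of one concrete mission run
gate:
  timeout: 120.0           # overrides the per-kind default for this run
```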

E. Motion control

Motion control is divided into two functional parts: the motion node and the TCU (thruster control unit, or propulsion system) node.

The motion node solves high-level motion problems, including stabilization. The vehicle motion control is performed in an event loop. The data from the navigation module are processed at each iteration of this loop, and a message containing thrusts in all six degrees of freedom is formed on the basis of these data. The vehicle is able to control only five degrees of freedom, so the roll thrust is always equal to zero.

One of the aims during module development was extensibility and the possibility of code reuse for different vehicles with different configurations. Hence, we use the following structure: a separate regulator, implemented as a module, is created for each control command (heading stabilization, position stabilization, etc.). Such a module subscribes to a message that describes the module command and activates the regulator. The regulator's lifetime is defined by the command execution time. Upon creation, the regulator reserves the necessary control degrees (e.g., the position regulator reserves the longitudinal, transverse, and heading axes). Conflicts over control channels are resolved in command creation order: a new command which uses these axes replaces the old one.

Each regulator is the means of solving a physical control problem. A PID controller is used on this vehicle for solving the mathematical control problem.
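As a sketch of the mathematical side, a PID step of the following standard form maps an error on a reserved axis (e.g., a heading error) to a thrust command; the structure is generic, and the gains and names are illustrative rather than the vehicle's actual values:

```cpp
// Generic PID regulator step: error on a reserved axis -> thrust command.
struct Pid {
    double kp, ki, kd;      // gains, tuned per regulator
    double integral = 0;
    double prev_error = 0;

    double step(double error, double dt) {
        integral += error * dt;
        double derivative = (error - prev_error) / dt;
        prev_error = error;
        return kp * error + ki * integral + kd * derivative;
    }
};
```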

The propulsion system node receives thrusts and torques from the motion module and calculates control signals for each thruster on the basis of the received values. Each thruster's orientation can be described by a 3-dimensional vector in the vehicle coordinate system. Thrusters that contribute to thrust along several axes contribute to each axis in proportion to the angle between the axis and the thruster vector. Given this information and each thruster's moment arm, we obtain a 5 x 5 matrix which describes the allocation of thrust (and moment) on each axis among the thrusters.

The thruster firmware receives control signals (codes) in the [-128, 127] interval, so the module has to convert values from the [-1, 1] interval to this interval. The vehicle uses symmetrical propellers, whose performance was measured during bench tests. Knowing a number of correspondence points between thrust and control signal, we can calculate the control signal for any thrust value by linear interpolation.
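A minimal sketch of this piecewise-linear conversion, assuming a small table of bench-test points (the table values here are invented, not the measured ones):

```cpp
#include <array>
#include <cstdint>

// One bench-test correspondence point: normalized thrust -> signal code.
struct CalPoint { double thrust; double code; };

// Invented example table; the real one comes from bench tests.
const std::array<CalPoint, 5> kTable = {{
    {-1.0, -128}, {-0.4, -60}, {0.0, 0}, {0.4, 60}, {1.0, 127}}};

// Piecewise-linear interpolation from thrust in [-1, 1] to a code in
// [-128, 127], clamped at the table ends.
int8_t thrustToCode(double thrust) {
    if (thrust <= kTable.front().thrust) return static_cast<int8_t>(kTable.front().code);
    if (thrust >= kTable.back().thrust)  return static_cast<int8_t>(kTable.back().code);
    for (std::size_t i = 1; i < kTable.size(); ++i) {
        if (thrust <= kTable[i].thrust) {
            const CalPoint& a = kTable[i - 1];
            const CalPoint& b = kTable[i];
            double t = (thrust - a.thrust) / (b.thrust - a.thrust);
            return static_cast<int8_t>(a.code + t * (b.code - a.code));
        }
    }
    return 0;  // unreachable
}
```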

F. Video module

The main goal of the video module is to analyze images taken from the cameras and pass information about detected objects to the mission module. The mission module sends the video module a list of required objects and the number of the camera to take images from.

The OpenCV library is used for image processing. We use version 2.4.8, because it is the most stable and reliable version at the current time.

Each detection algorithm usually consists of the following parts (a sketch of the pipeline follows the list):

• Preprocessing. This is done by applying filters or clustering algorithms to correct underwater colors and to remove noise and small useless details. Color correction is done by increasing the red channel of the RGB image. Then the image is smoothed with a median blur and a Gaussian blur; both filters are implemented in OpenCV.

• Color binarization. The source image is converted to the HSV color space, and binarization is performed by thresholding on hue and saturation.

• Contour analysis. It is based on the OpenCV implementations of the Suzuki-Abe algorithm and polygon approximation; a custom Hough transform implementation is also used.
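A minimal OpenCV 2.4 sketch of this pipeline; the target color and all threshold values are invented for illustration, not the vehicle's tuned parameters:

```cpp
#include <opencv2/opencv.hpp>
#include <vector>

// Illustrative detection pipeline: smooth, convert to HSV, threshold by
// hue/saturation, then extract contours (Suzuki-Abe via findContours).
std::vector<std::vector<cv::Point> > detectOrange(const cv::Mat& bgr) {
    cv::Mat smooth, hsv, mask;
    cv::medianBlur(bgr, smooth, 5);
    cv::GaussianBlur(smooth, smooth, cv::Size(5, 5), 0);
    cv::cvtColor(smooth, hsv, CV_BGR2HSV);
    // Hue/saturation band for an orange-like color (placeholder range).
    cv::inRange(hsv, cv::Scalar(5, 100, 0), cv::Scalar(25, 255, 255), mask);
    std::vector<std::vector<cv::Point> > contours;
    cv::findContours(mask, contours, CV_RETR_EXTERNAL, CV_CHAIN_APPROX_SIMPLE);
    return contours;
}
```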

We have calibrated our cameras using OpenCV calibration tools. A chessboard of 1 x 1 meter size was made. The chessboard is positioned so that it lies entirely within the field of view of a vehicle camera, and the camera takes images. The images are then given to a calibration program which detects the corners on the chessboard and calculates camera parameters such as the focal distance and the radial and tangential distortion factors. All these parameters are incorporated into our camera model, which helps us calculate the heading to a detected object and determine the distance to it more accurately.
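A condensed sketch of this calibration step using the standard OpenCV calls; the 9 x 6 inner-corner pattern and the square size are assumptions, since the paper does not state them:

```cpp
#include <opencv2/opencv.hpp>
#include <vector>

// Collect corner detections from several chessboard views, then solve
// for the intrinsics and distortion factors.
void calibrate(const std::vector<cv::Mat>& views) {
    cv::Size pattern(9, 6);  // inner corners (assumed)
    std::vector<std::vector<cv::Point2f> > imagePoints;
    std::vector<std::vector<cv::Point3f> > objectPoints;

    // Physical corner coordinates of the board (0.1 m squares, assumed).
    std::vector<cv::Point3f> board;
    for (int y = 0; y < pattern.height; ++y)
        for (int x = 0; x < pattern.width; ++x)
            board.push_back(cv::Point3f(0.1f * x, 0.1f * y, 0));

    for (std::size_t i = 0; i < views.size(); ++i) {
        std::vector<cv::Point2f> corners;
        if (cv::findChessboardCorners(views[i], pattern, corners)) {
            imagePoints.push_back(corners);
            objectPoints.push_back(board);
        }
    }

    cv::Mat K, dist;  // camera matrix and distortion factors
    std::vector<cv::Mat> rvecs, tvecs;
    cv::calibrateCamera(objectPoints, imagePoints, views[0].size(),
                        K, dist, rvecs, tvecs);
}
```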

G. Implementation

The C++ programming language is used in the implementation of our vehicle control system software. Some modules significantly depend on C++11 features, so we had to update the GNU compiler to version 4.8. The Boost libraries provide us with some useful extensions that we apply, for example, in our thread pool implementation; we also rely on them in the command line interfaces of our

Fig. 8: Gazebo GUI.

tools. The CMake build system is used to build our system, and Bash scripts are used to launch it. We have 6 software engineers in our team. The Git version control system and a central private repository on bitbucket.org were chosen to organize the team development process.

IV. TESTS AND TRIALS

A. Simulator

Our simulation system was implemented using the Gazebo simulation tools. One of the advantages of using the ROS core is native Gazebo interaction: ROS has built-in plugins for working with the Gazebo playground, so we only needed to build the competition scene for testing our mission algorithms. The simulation system GUI is shown in fig. 8.

B. Debugging

For data logging and playback, ROS uses the bag format. We use bag files for recording vehicle data and telemetry. Rosbag is a ROS tool for recording all messages between nodes in our system. We can easily inspect the states of all values in the messages at any point of the recording. Rosbag can play back selected messages and visualize their contents by plotting numerical values. These capabilities allow us to conveniently and quickly check the progress of a mission and look for bugs by analyzing its bag files. An example of the rosbag built-in viewer is shown in figure 9.
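Typical rosbag usage during such a debugging session looks like the following; the topic names are the hypothetical ones from the earlier sketch, not our actual topics:

```bash
# Record all topics during a mission run into a bag file.
rosbag record -a -O mission.bag

# Afterwards, play back only the topics of interest.
rosbag play mission.bag --topics /navig/MsgDepth /motion/CmdFixDepth
```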


Fig. 9: The rosbag built-in viewer.

C. Pool tests

One of the most essential parts of vehicle preparation is testing in the swimming pool. This process is required to detect hardware failures, set up some components, and find unexpected bugs that have not appeared earlier. We started testing our vehicle at the beginning of June in our campus swimming pool.

We used a new way to monitor the vehicle state: a real-time webGUI over a thin debugging wire. That gave us the ability to find bugs and failures in real time instead of spending too much time watching bag files after mission execution. Furthermore, the Gazebo simulation reduced the necessity of in-water vehicle testing to a minimum.

ACKNOWLEDGMENT

The development of the FEFU AUV is supported by Far Eastern Federal University and by the Institute for Marine Technology Problems of the Far Eastern Branch of the Russian Academy of Sciences. We would like to thank our mentor, Dr. Alexander Scherbatyuk. We also thank everyone at FEFU and IMTP FEB RAS who has supported us in this work.


