2013 International Conference on Unmanned Aircraft Systems (ICUAS) - Atlanta, GA, USA (2013.05.28-2013.05.31)

Abstract— In this paper, a new automatic system for near-real-time overhead power line inspection by Unmanned Aerial Vehicle (UAV) is presented. The method is focused on the prevention and maintenance of aerial power lines. The main contribution and novelty is the design and development of a new, fully automatic aerial inspection system that detects possible anomalies without requiring manual intervention at the ground station or control center. The main goal of the method is to process the consecutive images together with the telemetry data sent by the autopilot in order to identify, online, areas of vegetation, trees and buildings close to power lines, and to calculate the distance between the power lines and that vegetation, those trees and buildings. Simultaneously, the system processes the images captured by the infrared camera in order to detect bad conductivity and hotspots in power lines, transformers and electrical substations. The system response time is about one second for distance measurement and down to 0.3 seconds for hotspot detection.

In addition, the system automatically generates offline reports at the ground station once the flight is finished. These reports require more execution time and more CPU resources. The results show 3D information obtained by processing consecutive frames of the HD images together with the telemetry data sent by the autopilot.

This paper is structured as follows: the first section briefly introduces the current methods of power line inspection; the second section presents our proposed system, RELIFO; the third section describes the algorithms designed and developed for this system; the fourth section presents and discusses the experimental results; the next section offers the conclusions drawn. Finally, we cite the references and publications used in this paper.

I. INTRODUCTION

Overhead power lines of high and medium voltage are the means by which distribution networks transmit electricity, bringing it to our homes and businesses.

*Research supported by Deusto University - DeustoTech.

J. I. Larrauri is Professor in the Engineering Faculty at Deusto University and Researcher in DeustoTech. Avda. Universidades, 24. 48007 Bilbao, SPAIN (phone: +34 944139000; fax: +34 944139101; e-mail: [email protected]).

G. Sorrosal is a researcher in DeustoTech Energy, 48007 Bilbao, SPAIN (e-mail: [email protected]).

Mikel González is a researcher in DeustoTech Energy, 48007 Bilbao, SPAIN (e-mail: [email protected]).

Inspections of overhead power lines play an essential role in maintenance and electricity transmission. Surveillance and maintenance of electrical infrastructures is the key to ensuring and improving the quality of service. A system for overhead power line inspection can prevent and predict future anomalies and damage. This responsibility usually lies with the electricity network company.

On the other hand, trees and vegetation near power line corridors need to be cleared to avoid power outages, damage to the power lines, fires and risks to personal safety. Electricity network operators are legally required to maintain safety distances between vegetation and overhead power lines along the corridor.

In order to comply with the statutory requirements, companies use one or more inspection methods: manual inspection, robotics inspections, manned and unmanned aerial vehicles [1].

A. Inspection method based on manual inspection

This is the first generation of inspection methods used by electricity network operators. The company's maintenance personnel inspect the power line corridors on foot or with a vehicle, using binoculars and infrared thermal cameras. Operators have available an equipment-inspection and maintenance worksheet in electronic format. This form is used by the operators to record the results of the inspection, and these are sent electronically to the company control centre [2]. This inspection method has the following disadvantages: results are influenced by the subjectivity and mood of the operators; some inspection areas are difficult to access due to the topography; inspection mistakes are caused by personnel fatigue due to repetitive processes and tasks; and inspection times are too long.

B. Inspection method based on Manned Aerial Vehicles

An alternative to manual inspection, which is considered too slow, is inspection by a manned helicopter, which is able to inspect great distances of overhead power lines in a short time. Typically, the smallest team is made up of two persons from the power company on board the helicopter. This inspection method has the following disadvantages: accident risk; payment of substantial compensation to customers in case of accidents; high inspection costs; danger of collision when penetrating areas that may be dangerous due to their proximity to the surrounding power lines and vegetation; and the flight from the starting point and the return to the airport, which increase the cost of the inspection.

Automatic System for Overhead Power Line Inspection using an Unmanned Aerial Vehicle
- RELIFO Project -

Juan I. Larrauri, Gorka Sorrosal, and Mikel González

2013 International Conference on Unmanned Aircraft Systems (ICUAS), May 28-31, 2013, Grand Hyatt Atlanta, Atlanta, GA

978-1-4799-0817-2/13/$31.00 ©2013 IEEE

C. Inspection methods based on Unmanned Aerial Vehicles

The use of Unmanned Aerial Vehicles for overhead power line inspection has been identified as a viable alternative to manned aerial inspection: unmanned vehicles eliminate the risk of loss of life, are easily carried to the starting point, and can be picked up at the same point at the end of the inspection flight. The advantages of these systems are: significant cost savings; staff safety is guaranteed and improved; UAVs can hover as close as 10 meters from a medium-voltage power line; the inspection is documented by video, images and data; and the risk of having to pay compensation to customers is reduced.

This inspection method allows us to take advantage of the implementation of inspection systems based on preventive and predictive models [3].

As an alternative to the conventional inspection methods, we present a new system for automatic inspection that generates near-real-time reports [4] and alarms about damage detected during the inspection; these are sent by email and SMS to the company control centre.

The proposed system is fully automatic and does not require manual intervention at the ground station [5]. The vision algorithms designed and developed for this project process the HD and infrared images in order to detect anomalies and damage, and they automatically generate reports. These reports are sent online via email and SMS. In addition, other reports are generated after the inspection flight is completed.

II. DESCRIPTION OF THE PROPOSED SYSTEM - RELIFO

Since 2007, IBERDROLA and the University of Deusto (DeustoTech Energy) have been collaborating on the development of the RELIFO research project, a system for power line inspection. The purpose of this system is to detect and prevent possible anomalies along the power lines and in substations and transformers. The following figure illustrates the structure of the system.

Figure 1. Configuration of the proposed system. RELIFO Project

Initially, the flight plan is created and edited in the ground station software. This flight path can be modified during the flight mission. The flight is performed automatically without manual intervention. The UAV hovers as close as 10 meters over the power lines, and the inspection is carried out both on the outbound flight and on the return.

Figure 2. Flight plan set-up. RELIFO Project

The ground station has an antenna and three computers with video capture cards. The first computer handles the flight plan, telemetry and commands between the UAV and the ground station and vice versa. This computer is used to control the flight of the UAV.

A second computer simultaneously receives the video signal from the HD camera through a capture card and the telemetry through the RS232 serial port. The video signal is converted into consecutive frames (4 frames/s). Telemetry data and image acquisition are synchronized.

The aim of processing these images is to measure distances from the power lines to the surrounding vegetation and buildings [6]. The system responds in almost real time [7] and generates online reports that are sent via SMS and email.

Figure 3. HD Video Camera and Data Communications. Computer-2

The third computer captures the video signal from the infrared thermal camera, converting it into frames (4 frames/s), together with the telemetry received through the RS232 serial port. The purpose of this computer is to detect hotspots and bad conductivity [8][9]. Artificial vision algorithms have been developed to detect temperature differences of 30 °C. Once such a difference is detected, the hotspot pixels are enhanced to the maximum intensity value (255) and most of the remaining pixels of the image are set to zero. Processing an image with these features allows a system response time of less than one second, and the system also provides online reports, which are sent via SMS and email.

Figure 4. Infrared Camera Image. Computer-3
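The enhancement step described above (hotspot pixels set to 255, the rest to 0) can be sketched as follows. This is our illustrative reading of it, not the authors' code; the degrees-per-gray-level scale is a made-up assumption.

```python
# Sketch of the hotspot-enhancement step: pixels whose implied temperature
# exceeds the frame baseline by more than 30 degC become 255, all others 0.
DEG_PER_LEVEL = 0.5      # hypothetical radiometric scale: 0.5 degC per gray level
HOTSPOT_DELTA = 30.0     # temperature difference that flags a hotspot

def enhance_hotspots(frame, baseline_level):
    """Binarize a grayscale frame: 255 where the temperature rise over the
    baseline exceeds HOTSPOT_DELTA, 0 elsewhere."""
    threshold = baseline_level + HOTSPOT_DELTA / DEG_PER_LEVEL
    return [[255 if px > threshold else 0 for px in row] for row in frame]

frame = [[80, 82, 81],
         [79, 200, 83],   # 200 is about 60 degC above the ~80 baseline
         [81, 80, 82]]
mask = enhance_hotspots(frame, baseline_level=80)
```

Because almost all pixels collapse to 0, the subsequent processing is cheap, which is consistent with the sub-second response time reported above.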


Each computer is independent, and they are not interconnected, so that they perform independent processes. However, once the flight is finished, the data and images from computers 2 and 3 are integrated to generate new offline reports.

III. ARTIFICIAL VISION ALGORITHMS

In this research project we have developed and implemented robust artificial vision algorithms to calculate distances and detect hotspots [10].

Figure 5. Distance measurement and Hotspots detection

The following steps are used to calculate distances:

Step 1. Power line identification and recognition
Step 2. Calculation of the distance between the UAV and the power lines
Step 3. Calculation of the distance between the conductors and the ground
Step 4. Detection and identification of trees, vegetation and buildings
Step 5. Calculation of the distance from the conductors to trees, vegetation and buildings

The algorithms below describe the process of the proposed method for measuring distances.

Step 1. Power line identification and recognition

Image binarization is typically treated as a simple thresholding operation on a grayscale image, creating a binary image represented by black and white pixels. It is difficult to determine a good threshold value for binarization. We apply the Otsu method and find that the threshold value is similar in consecutive frames, so each frame provides relevant information to obtain the threshold value of the next frame, and so on.

Figure 6. Image Binarization
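Otsu's method picks the threshold that maximizes the between-class variance of the gray-level histogram. A minimal pure-Python sketch (illustrative only; a real pipeline would use an image-processing library):

```python
# Otsu's method: choose the threshold maximizing between-class variance.
def otsu_threshold(pixels):
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * hist[i] for i in range(256))
    w_bg = sum_bg = 0.0
    best_t, best_var = 0, -1.0
    for t in range(256):
        w_bg += hist[t]            # background weight grows as t rises
        if w_bg == 0:
            continue
        w_fg = total - w_bg
        if w_fg == 0:
            break
        sum_bg += t * hist[t]
        mean_bg = sum_bg / w_bg
        mean_fg = (sum_all - sum_bg) / w_fg
        var_between = w_bg * w_fg * (mean_bg - mean_fg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Two clearly separated populations: dark background, bright conductors.
pixels = [10] * 900 + [200] * 100
t = otsu_threshold(pixels)
binary = [255 if p > t else 0 for p in pixels]
```

Since consecutive frames of the same corridor have similar histograms, the threshold found in one frame is a good starting value for the next, as noted above.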

Once the image has been binarized, we suggest carrying out three fundamental morphological operations, dilation, erosion and mean filtering, to find the lines (conductors) in the binarized image. The resulting images are shown in Figure 7.

Figure 7. Dilation, Erosion and Filter Mean operations
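The three operations named above can be sketched on a binary grid with a 3x3 neighbourhood. This is a generic pure-Python illustration of the technique, not the authors' implementation:

```python
# Dilation, erosion and mean filtering on a binary image (3x3 neighbourhood).
def neighbours(img, r, c):
    h, w = len(img), len(img[0])
    return [img[i][j]
            for i in range(max(0, r - 1), min(h, r + 2))
            for j in range(max(0, c - 1), min(w, c + 2))]

def dilate(img):
    return [[255 if max(neighbours(img, r, c)) == 255 else 0
             for c in range(len(img[0]))] for r in range(len(img))]

def erode(img):
    return [[255 if min(neighbours(img, r, c)) == 255 else 0
             for c in range(len(img[0]))] for r in range(len(img))]

def mean_filter(img):
    out = []
    for r in range(len(img)):
        row = []
        for c in range(len(img[0])):
            n = neighbours(img, r, c)
            row.append(255 if sum(n) / len(n) >= 128 else 0)
        out.append(row)
    return out

# A one-pixel-wide "conductor" with a gap: dilation followed by erosion
# (morphological closing) bridges the gap so the line becomes continuous.
line = [[0, 0, 0, 0, 0],
        [255, 255, 0, 255, 255],
        [0, 0, 0, 0, 0]]
closed = erode(dilate(line))
```

Closing small gaps like this is what lets broken conductor segments be treated as continuous lines in the following steps.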

In order to identify the points of the conductor lines, we apply a robust edge-detection algorithm [11]. The aim of this algorithm is to identify points in a digital image at which the brightness has discontinuities.

Figure 8. Edge detection operation
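The core idea, marking points where brightness jumps between neighbours, can be conveyed with a toy gradient test. The paper cites a robust detector [11]; this simplified stand-in only illustrates the principle:

```python
# Mark pixels where the brightness changes abruptly between horizontal
# neighbours; a stand-in for a proper edge detector.
def edges(img, jump=100):
    h, w = len(img), len(img[0])
    return [[255 if 0 < c < w - 1 and
             max(abs(img[r][c + 1] - img[r][c]),
                 abs(img[r][c] - img[r][c - 1])) > jump else 0
             for c in range(w)] for r in range(h)]

img = [[10, 10, 10, 220, 220, 220]]   # a bright conductor edge at column 3
e = edges(img)
```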

At the end of this step, the conductor lines are digitally reconstructed by applying the proposed sequence of morphological operations [12][13].

Step 2. Calculation of the distance between the UAV and the power lines

The cable diameter is known beforehand; therefore, we can calculate the relation between the size of the real cable and its size in pixels in the CCD image, taking into account the camera calibration and the intrinsic parameters of the camera: focal distance (f), focal point (F), zoom, field of view, etc.

To determine the distance from the UAV camera to the conductors, the object distance (do), there are two ways: we can apply the thin lens equation, because the distance to the image (di) is known, or the proportional calculation:

1/do + 1/di = 1/f    (1)

Figure 9. Calculate the diameter size of the cables
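The proportional calculation mentioned above follows from the pinhole relation: the apparent size on the sensor over the focal length equals the real size over the object distance. A numerical sketch (focal length, pixel pitch and cable diameter are made-up example values):

```python
# Object distance from the known cable diameter and its apparent pixel width:
#   size_on_sensor / f = real_size / do   =>   do = real_size * f / size_on_sensor
F_MM = 50.0                 # focal length of the camera (assumed)
PIXEL_MM = 0.005            # CCD pixel pitch in mm (assumed)
CABLE_DIAMETER_MM = 20.0    # known conductor diameter

def object_distance_mm(cable_px):
    """Distance from camera to conductor, given its width in pixels."""
    size_on_ccd = cable_px * PIXEL_MM
    return CABLE_DIAMETER_MM * F_MM / size_on_ccd

d = object_distance_mm(cable_px=10)   # cable imaged 10 px wide
```

With these example values the cable spans 0.05 mm on the sensor, giving an object distance of 20 000 mm, i.e. 20 m.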


In addition, after calculating the size of each of the conductor lines in the CCD image, it is easy to calculate the following measurements: the distances between cables and the distance from the conductor lines to the UAV camera. Moreover, the proposed algorithm can detect power lines and automatically measure overhead power line joints.

Step 3. Calculation of the distance between the conductor lines and the ground

Laser altimeters for UAVs measure the height above the ground and above different surfaces: trees, vegetation, mountains, landing strips, grass, etc. They use a sensor that can measure long distances to surfaces of different reflectivity.

Figure 10. Automatic 3D power line reconstruction using the laser altimeter

The distance from the UAV camera to the overhead power line was calculated in Step 2, and the distance from the overhead power lines to the ground is calculated in this step. These values are used in the next steps to calculate distances to vegetation, trees and buildings.
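Combining the two measurements is, as we read it, a subtraction: the altimeter gives UAV height above ground, Step 2 gives the UAV-to-conductor distance, and the difference approximates the conductor-to-ground clearance (assuming a near-vertical viewing geometry; camera angles would refine this):

```python
# Approximate conductor-to-ground clearance from the two measured distances.
def conductor_clearance_m(altimeter_m, uav_to_line_m):
    """altimeter_m: UAV height above ground (laser altimeter).
    uav_to_line_m: UAV-to-conductor distance (Step 2)."""
    return altimeter_m - uav_to_line_m

clearance = conductor_clearance_m(altimeter_m=42.0, uav_to_line_m=12.0)
```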

Step 4. Detection and identification of trees, vegetation and buildings

In this step, we propose a fast algorithm for identifying vegetation, buildings and non-building objects in an aerial image. This algorithm identifies solid surfaces, i.e. groups of pixels that contain the same intensity value.

Figure 11. Detection of solid surfaces: trees, vegetation and buildings

Trees, vegetation and buildings close to the power lines are detected as solid surfaces in the binarized image, and they are identified by extracting their corresponding pixels in the source image.

Figure 12. Identification of vegetation
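Grouping neighbouring pixels of equal intensity into "solid surfaces" is a connected-component labelling problem, which can be sketched with a flood fill. A generic illustration of the idea, not the authors' algorithm:

```python
# Group 4-connected pixels of equal intensity into regions via flood fill;
# sufficiently large regions are candidate trees, vegetation or buildings.
from collections import deque

def solid_surfaces(img, min_size=3):
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    regions = []
    for r in range(h):
        for c in range(w):
            if seen[r][c]:
                continue
            value, comp, q = img[r][c], [], deque([(r, c)])
            seen[r][c] = True
            while q:
                i, j = q.popleft()
                comp.append((i, j))
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < h and 0 <= nj < w and not seen[ni][nj] \
                            and img[ni][nj] == value:
                        seen[ni][nj] = True
                        q.append((ni, nj))
            if len(comp) >= min_size:
                regions.append((value, comp))
    return regions

img = [[0, 0, 9, 9],
       [0, 0, 9, 9],
       [0, 0, 0, 0]]
regions = solid_surfaces(img)   # one background region and one bright surface
```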

Step 5. Calculation of the distance from the conductor lines to trees, vegetation and buildings

In order to calculate distances between the overhead power line and trees, vegetation or buildings, we apply stereoscopic vision methods using consecutive image frames. The system uses the intrinsic camera parameters, camera geometry, calibration, pan, tilt, angle and the relative orientation of the current camera frame with respect to the previous camera frame.

Image processing and the analysis of stereoscopic sequences include three stages: first, estimation of the disparity of every point through the matching of points in a pair of images; second, estimation of the disparity map; and third, detection and compensation of occlusions.
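The first stage, disparity via point matching, can be conveyed with a toy 1-D block matcher using the sum of absolute differences (SAD). Real stereo matching works on 2-D windows along epipolar lines; this sketch only shows the principle:

```python
# Toy 1-D block matching: find the horizontal shift (disparity) of a pixel
# between two consecutive frames by minimizing the sum of absolute differences.
def disparity(left, right, x, half=1, max_d=4):
    """Disparity of pixel x in `left`, matched against `right`."""
    patch = left[x - half:x + half + 1]
    best_d, best_cost = 0, float("inf")
    for d in range(0, max_d + 1):
        if x - d - half < 0:
            break
        cand = right[x - d - half:x - d + half + 1]
        cost = sum(abs(a - b) for a, b in zip(patch, cand))
        if cost < best_cost:
            best_cost, best_d = cost, d
    return best_d

left = [0, 0, 50, 90, 50, 0, 0, 0]
right = [0, 50, 90, 50, 0, 0, 0, 0]   # same scene shifted by 1 pixel
d = disparity(left, right, x=3)
```

Repeating this for every point yields the disparity map of the second stage; occlusions show up as points with no good match.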

Stereoscopic Analysis

Our objective is to analyze the possibility of obtaining real distances using the information that can be obtained from two consecutive digital images. In the stereoscopic analysis, once we have a pair of pictures we can obtain information about the depth.

Figure 13. Stereoscopic analysis

Let's consider a stereoscopic system with two cameras of focal length f and baseline distance b, as shown in Figure 13.

Analysis over the plane

To simplify the study of the problem, we will work with a flat trajectory of the camera over the plane, and we will not consider the third spatial coordinate.

Rectilinear trajectory

Let's suppose that the camera is in rectilinear movement over the point whose distance we wish to determine, placed above it. In this case there is no camera rotation, just a simple translation. The angle formed with the horizontal is kept constant.

We consider that the coordinates of the point P relative to the camera in the first position are P1(X1, Z1), at distance d1 from the point; in the second position (second plane) it will have other coordinates P2(X2, Z2), at distance d2. Of course, the position of the point is constant in time. The values in the image can easily be converted from pixels to lengths, knowing the resolution of the camera in pixels per inch:

x1 = (number of pixels of point P in the 1st image · 25.4 mm) / (resolution of the 1st image)
x2 = (number of pixels of point P in the 2nd image · 25.4 mm) / (resolution of the 2nd image)

Figure 14. Stereoscopic analysis

Figure 15. Stereoscopic points

Considering this, and working in mm, the projections give the following geometric relationships:

x1 / f = X1 / Z1  →  X1 = x1 · Z1 / f        (2)
x2 / f = X2 / Z2  →  X2 = x2 · Z2 / f        (3)

Between the two camera positions, separated by a translation T with the camera axis at an angle γ:

X1 − X2 = T · cos γ                          (4)
Z1 − X1 · tan γ = Z2 − X2 · tan γ            (5)

Operating with these expressions leads us to:

x1 · Z1 − x2 · Z2 = f · T · cos γ            (6)

We can work with these expressions to obtain Z1 and Z2, and in this way the values of X1 and X2. We now have the distance from each position of the camera to the point P:

Point 1:  Z1 = T · cos γ · (f − x2 · tan γ) / (x1 − x2),  X1 = x1 · Z1 / f,  d1 = √(X1² + Z1²)
Point 2:  Z2 = T · cos γ · (f − x1 · tan γ) / (x1 − x2),  X2 = x2 · Z2 / f,  d2 = √(X2² + Z2²)   (7)
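These Point 1 / Point 2 expressions can be checked numerically: the recovered coordinates must reproduce the camera translation. The input values below (sensor abscissas, focal length, translation, angle) are made-up examples:

```python
# Numerical check of the two-position stereo expressions: recover Z, X and
# the distances d1, d2, then verify X1 - X2 = T*cos(gamma) and
# Z1 - Z2 = T*sin(gamma).
import math

def stereo_point(x1, x2, f, T, gamma):
    Z1 = T * math.cos(gamma) * (f - x2 * math.tan(gamma)) / (x1 - x2)
    Z2 = T * math.cos(gamma) * (f - x1 * math.tan(gamma)) / (x1 - x2)
    X1, X2 = x1 * Z1 / f, x2 * Z2 / f
    return X1, Z1, math.hypot(X1, Z1), X2, Z2, math.hypot(X2, Z2)

X1, Z1, d1, X2, Z2, d2 = stereo_point(x1=0.9, x2=0.6, f=1.0,
                                      T=2.0, gamma=math.radians(10))
```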

As we can see, we get a different distance for each position of the camera.

Analysis in space

If we know the data of the desired point on the image, the following relationships can be established.

Figure 16. Stereoscopic analysis

In the images taken, the position of each point in one image is displaced, with respect to its position in the image captured from the other camera position, by the camera motion.

For a point projected at image coordinates (u, v):

Image 1:  x1 = z1 · u1 / f,  y1 = z1 · v1 / f        (8)
Image 2:  x2 = z2 · u2 / f,  y2 = z2 · v2 / f

To define the distance to the point P from the camera we have two values:

Position 1:  d1 = √(x1² + y1² + z1²)
Position 2:  d2 = √(x2² + y2² + z2²)                 (9)

We need to define the values of the coordinates of P1 (the point relative to the reference system in the first position) and P2 (the point relative to the reference system in the second


position). There are six unknown values but only four equations, so we need another relationship. We will look for a new relationship between both points, through a Transformation Matrix.

Transformation Matrix The Transformation Matrix can relate the coordinates of the same point in relation to two different coordinate systems that we consider to be the same one after a rotation and a translation.

We will first consider both transformations independently: first the rotation of the system, and afterwards its translation.

Rotational Matrix

Let's imagine two reference systems, centred at the same origin but with non-aligned axes. The values of the coordinates in relation to one system or the other will be:

Relative to system 1:  r1 = x·ix + y·jy + z·kz
Relative to system 2:  r2 = u·iu + v·jv + w·kw          (10)

As we are talking about the same point P, if we project r2 onto the axes of system 1 we obtain:

x = ix · r2 = ix · (u·iu + v·jv + w·kw)
y = jy · r2 = jy · (u·iu + v·jv + w·kw)
z = kz · r2 = kz · (u·iu + v·jv + w·kw)                 (11)

And in matrix form this can be expressed as:

[x]   [ix·iu  ix·jv  ix·kw]   [u]
[y] = [jy·iu  jy·jv  jy·kw] · [v]        (12)
[z]   [kz·iu  kz·jv  kz·kw]   [w]

Coordinates System 1 (left) in terms of Coordinates System 2 (right), or, in terms of direction cosines:

[x]   [cos(U,X)  cos(V,X)  cos(W,X)]   [u]
[y] = [cos(U,Y)  cos(V,Y)  cos(W,Y)] · [v]
[z]   [cos(U,Z)  cos(V,Z)  cos(W,Z)]   [w]

Here we use the properties of the scalar product: for example, cos(U,X) is the cosine of the angle formed by the director vectors ix and iu.

The matrix obtained is called the Rotational Matrix, and has this shape:

      [cos(U,X)  cos(V,X)  cos(W,X)]
R12 = [cos(U,Y)  cos(V,Y)  cos(W,Y)]        (13)
      [cos(U,Z)  cos(V,Z)  cos(W,Z)]

This is the rotational matrix of system 2 relative to system 1, i.e. the matrix for the change of coordinates from system 2 to system 1. As the scalar product is commutative:

R21 = R12ᵀ                                  (14)

The Rotational Matrix is an orthonormal matrix, so R12⁻¹ = R12ᵀ and:

r1 = R12 · r2
r2 = R21 · r1 = R12ᵀ · r1                   (15)

To define a Rotational Matrix able to express any orientation, we consider three rotations, one around each coordinate axis, the final matrix being a composition of them.

• Rotation around the X axis (Roll)

         [1    0       0   ]
R12(α) = [0  cos α  −sin α ]
         [0  sin α   cos α ]

Figure 17. Roll around the X axis

• Rotation around the Y axis (Pitch)

         [ cos φ  0  sin φ]
R12(φ) = [   0    1    0  ]
         [−sin φ  0  cos φ]

Figure 18. Roll around the Y axis

• Rotation around the Z axis (Yaw)

         [cos θ  −sin θ  0]
R12(θ) = [sin θ   cos θ  0]
         [  0       0    1]

Figure 19. Roll around the Z axis

The resulting matrix is obtained by multiplying these three matrices, and one should be careful with the order of the product, since the matrix product is not commutative:

R12(α, φ, θ) = RZ(θ) · RY(φ) · RX(α)        (16)

Writing

           [r11  r12  r13]
R(α,φ,θ) = [r21  r22  r23]
           [r31  r32  r33]

the equations that define the elements of the Rotational Matrix are:


r11 = cos θ · cos φ
r12 = cos θ · sin φ · sin α − sin θ · cos α
r13 = cos θ · sin φ · cos α + sin θ · sin α
r21 = sin θ · cos φ
r22 = sin θ · sin φ · sin α + cos θ · cos α
r23 = sin θ · sin φ · cos α − cos θ · sin α
r31 = −sin φ
r32 = cos φ · sin α
r33 = cos φ · cos α
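These element formulas can be sanity-checked numerically: a matrix built from them must be orthonormal (R·Rᵀ = I), as any composition of rotations is. A small pure-Python check with arbitrary example angles:

```python
# Build R = Rz(theta)*Ry(phi)*Rx(alpha) from the nine element formulas and
# verify orthonormality: R * R^T must equal the identity matrix.
import math

def rotation(alpha, phi, theta):
    ca, sa = math.cos(alpha), math.sin(alpha)
    cp, sp = math.cos(phi), math.sin(phi)
    ct, st = math.cos(theta), math.sin(theta)
    return [
        [ct * cp, ct * sp * sa - st * ca, ct * sp * ca + st * sa],
        [st * cp, st * sp * sa + ct * ca, st * sp * ca - ct * sa],
        [-sp,     cp * sa,                cp * ca],
    ]

def times_transpose(R):
    """Return R * R^T for a 3x3 matrix."""
    return [[sum(R[i][k] * R[j][k] for k in range(3)) for j in range(3)]
            for i in range(3)]

R = rotation(math.radians(30), math.radians(20), math.radians(10))
RRt = times_transpose(R)   # should be (numerically) the 3x3 identity
```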

Rotation and translation. Transformation Matrix

If the movement between the two reference systems is a pure translation (parallel axes), the relationship between coordinates in both systems would be:

Figure 20. Rotation and Translation

r1 = d12 + r2:

[x]   [dx]   [u]
[y] = [dy] + [v]        (17)
[z]   [dz]   [w]

For a general movement, the Rotational Matrix must be included:

r1 = R12 · r2 + d12        (18)

or, in compact (homogeneous) notation, r1 = T12 · r2, with

      [ R12   d12 ]
T12 = [           ]        (19)
      [ 0 0 0   1 ]

The Transformation Matrix is homogeneous:

      [ R(3x3)  d(3x1) ]
T12 = [                ]        (20)
      [ p(1x3)    w    ]

where:
R = Rotational Matrix (3x3)
d = Translation Vector (3x1)
p = Perspective Transformation Submatrix (1x3)
w = Scale Factor
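Applying such a homogeneous matrix can be sketched numerically. The 4x4 matrix below combines an example rotation of 90° about the X axis with an invented translation of 5 units along X:

```python
# Map a point from system 2 into system 1 with a 4x4 homogeneous matrix
# T12 = [[R, d], [0 0 0, 1]]: rotate 90 deg about X, translate 5 along X.
import math

def transform(T, p):
    x, y, z = p
    v = [x, y, z, 1.0]
    out = [sum(T[i][k] * v[k] for k in range(4)) for i in range(4)]
    return out[0] / out[3], out[1] / out[3], out[2] / out[3]

a = math.radians(90)
T12 = [[1, 0,           0,            5.0],
       [0, math.cos(a), -math.sin(a), 0.0],
       [0, math.sin(a),  math.cos(a), 0.0],
       [0, 0,           0,            1.0]]
p1 = transform(T12, (1.0, 2.0, 0.0))
```

The point (1, 2, 0) in system 2 lands at (6, 0, 2) in system 1: the rotation sends the y component to z, and the translation shifts x by 5.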

As an example, we will analyze the following situation:

Figure 21. Transformation Matrix

The equations that relate the coordinates in either reference system are:

x1 = z1 · v1 / f,  y1 = z1 · u1 / f        (Image 1)
x2 = z2 · v2 / f,  y2 = z2 · u2 / f        (Image 2)

[x1]   [1    0       0   ]   [x2]
[y1] = [0  cos α  −sin α ] · [y2] + d12
[z1]   [0  sin α   cos α ]   [z2]

As we know the values of the coordinates of the same point in two consecutive images, u1, v1, u2 and v2, we just have to solve a system of equations made up of six unknown values and seven equations.

Once these coordinates are obtained, the distances to the point will be:

d1 = √(x1² + y1² + z1²)
d2 = √(x2² + y2² + z2²)        (21)

In order to test these expressions, we need a fully known system like the following one. Let's suppose a two-dimensional system, because we intend to calculate the distances using geometry and we need a simpler system:


Figure 22. Calculation of distance

The two positions of the camera can be established by assigning example values:

f = 1,  d = 2,  θ = 10°
x1 = 3,  y1 = 4;  x2 = 3,  y2 = 2

And we can calculate the other parameters using geometry:

D1 = √(x1² + y1²) = 5
D2 = √(x2² + y2²) = 3.605551
τ = arctan(x1 / y1) = 36.8696°
d² = D1² + D2² − 2 · D1 · D2 · cos γ  →  γ = arccos((D1² + D2² − d²) / (2 · D1 · D2)) = 19.44°
σ = τ + γ − θ = 46.3096°
v1 = f · tan τ = 0.75
v2 = f · tan σ = 1.0468
X2 = D2 · cos σ = 2.49057
Y2 = D2 · sin σ = 2.60711
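The worked example above can be reproduced step by step (our Python reproduction of the geometry, with f = 1, d = 2, θ = 10° and P at (3, 4) and (3, 2) from the two positions):

```python
# Reproduce the fully known test system: D1, D2, the angles tau, gamma and
# sigma, the image ordinates v1, v2, and the coordinates X2, Y2.
import math

D1 = math.hypot(3, 4)                   # 5
D2 = math.hypot(3, 2)                   # 3.605551
tau = math.degrees(math.atan2(3, 4))    # 36.8699 deg
gamma = math.degrees(math.acos((D1**2 + D2**2 - 2**2) / (2 * D1 * D2)))
sigma = tau + gamma - 10.0              # theta = 10 deg
v1 = math.tan(math.radians(tau))        # 0.75
v2 = math.tan(math.radians(sigma))      # ~1.0468
X2 = D2 * math.cos(math.radians(sigma))
Y2 = D2 * math.sin(math.radians(sigma))
```

The values match those in the text to the rounding shown (small discrepancies come from intermediate rounding in the paper).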

We developed a program using LabWindows/CVI to solve the general equations, which form a linear system of six equations; replacing the known values, we obtain:

Figure 23. Distance calculation software

There is a small difference between the values due to the numerical resolution used by the LabWindows/CVI software to solve the linear system.

IV. EXPERIMENTAL RESULTS

This section shows the experimental results obtained by the proposed system for two types of inspection application: first, the calculation of distances to vegetation, trees and buildings using the images captured by the HD camera; and second, hotspot detection using the infrared thermal camera.

The first application detects, in almost real time, when the distance from the power line to an obstacle is less than 5 meters. The system response time between the processing of two frames and the calculation of the distance is less than 1 second.

Once a distance of less than 5 meters is detected, the proposed system creates an alarm with the image and the telemetry data received. The following figures illustrate the application screens in the event of an anomaly.

Figure 24. Online report and alarm generation. Computer-2

The alarm displays the following information: image ID, image, GPS position, altitude of the UAV above sea level, laser-measured height, obstacles identified, minimum distance to the obstacle, obstacle height, date and time, and camera parameters (angle, tilt, pan, zoom, ...).


The second application uses infrared thermal imaging to detect hotspots on power lines. The overhead power line temperature can be obtained by mapping the colour information onto the corresponding temperature scale. The video signal from the infrared thermal camera is converted into grayscale frames (4 frames/second). Simultaneously, the telemetry sent by the autopilot is received through the RS232 serial port. In order to detect temperature changes, we have developed artificial vision algorithms to identify differences of more than 30 °C.

Figure 25. Hotspot detection in Electrical Substation. Computer-2

The detection of a hotspot displays the following information: identification of the image, the image itself, GPS coordinates, and the date and time of the hotspot detection.

Figure 26. Notification by SMS and E-mail. Computer 1&2

The application automatically generates online reports that are sent by email and SMS. The difference between the two reports is that the SMS format contains the information in a short alphanumeric message without images.

V. CONCLUSIONS

The major contributions of the new automatic system for power lines inspection using a UAV are as follows:

• Distances and hotspots can be detected and identified by means of image processing.

• Automatic inspection system

• Detection in almost real time

• ONLINE system response

• ONLINE reports via SMS and e-mail

• OFFLINE reports with 3D information

Overhead power line inspection in rural areas is an ideal civil application for Unmanned Aerial Vehicles. In addition, this system can operate under adverse weather conditions. Moreover, a UAV can access areas into which it would be extremely dangerous for a manned helicopter to go. UAVs can fly relatively close to the power line corridor while avoiding the adjacent trees.

Finally, the authors would like to point out their interest in exploring future work and challenges on Geodetic Reference Systems applied to the design of the initial flight plan.

ACKNOWLEDGMENT

The work reported in this paper has been funded by the Spanish Centre for Industrial Technological Development (CDTI) and IBERDROLA. The authors would like to thank the DeustoTech Energy staff for granting us access to their facilities.

REFERENCES

[1] J. I. Larrauri, G. Sorrosal and M. González, "A Predictive Model based on a Cartographic and Geodetic System for Overhead Power Line Inspection by UAV", Proceedings of the 8th International Conference on Intelligent Unmanned Systems, ISBN: 978-981-07-4225-6, October 22-24, 2012.

[2] K. Beck and R. Mathieu, "Can Power Companies use Space Patrols to Monitor Transmission Corridors?", ESRI User Group Conference, San Diego, 2004.

[3] K. Valavanis, "Advances in Unmanned Aerial Vehicles: State of the Art and the Road to Autonomy", Intelligent Systems, Control and Automation: Science and Engineering 33, 2007.

[4] L. A. F. Fernandes and M. M. Oliveira, "Real-time line detection through an improved Hough transform voting scheme", Pattern Recognition, 41: pp. 299-314, 2008.

[5] G. Conte and P. Doherty, "An integrated UAV navigation system based on aerial image matching", IEEE Aerospace Conference, pp. 1-10, 2008.

[6] P. J. Appelt and J. W. Goodfellow, "Research on How Trees Cause Interruptions - Applications to Vegetation Management", IEEE Rural Electric Power Conference, Scottsdale, Arizona, 2004.

[7] S. Clode and F. Rottensteiner, "Classification of trees and powerlines from medium resolution airborne laser scanner data in urban environments", APRS Workshop on Digital Image Computing, Brisbane, Australia, pp. 191-196, 2005.

[8] K. Yamamoto and K. Yamada, "Analysis of the infrared images to detect power lines", IEEE Annual International Conference, Proceedings/TENCON, Brisbane, Australia, pp. 343-346, 1997.

[9] J. A. Berni, "Thermal and Narrowband Multispectral Remote Sensing for Vegetation Monitoring From an Unmanned Aerial Vehicle", IEEE Transactions on Geoscience and Remote Sensing, 47(3): pp. 722-738, 2009.

[10] I. Golightly and D. Jones, "Visual Control of an Unmanned Aerial Vehicle for Power Line Inspection", Proceedings of the 12th International Conference on Advanced Robotics (ICAR '05), 2005.

[11] N. Aggarwal and W. C. Karl, "Line Detection in Images Through Regularized Hough Transform", International Conference on Image Processing, Vancouver, BC, 2000.

[12] G. Yan, "Automatic extraction of power lines from aerial images", IEEE Geoscience and Remote Sensing Letters, 4(3): pp. 387-391, 2007.

[13] N. Aggarwal and W. C. Karl, "Line Detection in Images Through Regularized Hough Transform", IEEE Transactions on Image Processing, 15(3): pp. 582-591, 2006.
