Vision-based Landing of an Unmanned Air Vehicle Omid Shakernia Department of EECS, UC Berkeley
Transcript
Page 1:

Vision-based Landing of an Unmanned Air Vehicle

Omid Shakernia, Department of EECS, UC Berkeley

Page 2:

Applications of Vision-based Control

Fire Scout

Global Hawk

Predator

SR-71

UCAV X-45

Page 3:

Goal: Autonomous landing on a ship deck

Challenges
Hostile environments
Ground effect
Pitching deck
High winds, etc.

Why vision?
Passive sensor
Observes relative motion

Page 4:

Simulation: Vision in the loop

Page 5:

Vision-Based Landing of a UAV

Motion estimation algorithms
Linear, nonlinear, multiple-view
Error: 5 cm translation, 4° rotation

Real-time vision system
Customized software
Off-the-shelf hardware

Vision in the control loop
Landing on a stationary deck
Tracking of a pitching deck

Page 6:

Vision-based Motion Estimation

(Figure: pinhole camera geometry showing the landing target, its feature points, the image plane, and the current pose)
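The projection equation itself is not recoverable from this transcript. For reference, a calibrated pinhole camera model of the kind the figure depicts can be written in the following standard form; the symbols are generic rather than the slide's own.

```latex
% Calibrated pinhole projection (generic notation, not the slide's own):
%   X        homogeneous coordinates of a feature point on the landing target
%   x        homogeneous image coordinates of that point
%   (R, T)   current pose of the camera relative to the landing target
%   lambda   unknown depth of the point
\[
  \lambda\,\mathbf{x} \;=\; \begin{bmatrix} R & T \end{bmatrix}\mathbf{X},
  \qquad R \in SO(3),\quad T \in \mathbb{R}^{3}.
\]
```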

Page 7:

Pose Estimation: Linear Optimization

Pinhole camera model, epipolar constraint, and planar constraint (see the standard forms below)

More than 4 feature points: solve linearly for the planar constraint matrix, then project onto the space of valid solutions to recover the pose
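The constraint equations were graphics on the original slide and did not survive extraction. For a calibrated camera observing the planar landing target, the standard two-view forms are sketched below; the notation (relative motion (R, T), skew-symmetric matrix of T written as T-hat, plane normal N at distance d) is generic and may differ from the slide's own symbols.

```latex
% Epipolar constraint between calibrated images x_1, x_2 of the same feature point:
\[
  \mathbf{x}_2^{\top}\,\widehat{T}R\,\mathbf{x}_1 = 0.
\]
% Planar constraint (planar homography) for features on the landing plane with
% unit normal N at distance d; "~" denotes equality up to scale:
\[
  \mathbf{x}_2 \;\sim\; \Bigl(R + \tfrac{1}{d}\,T N^{\top}\Bigr)\mathbf{x}_1.
\]
```

With four or more coplanar feature points, the matrix in the planar constraint can be estimated linearly up to scale from the correspondences and then projected back onto the set of matrices of that form, which recovers R, T up to the plane distance d, and the plane normal N: the "solve linearly, then project" step outlined on the slide.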

Page 8:

Pose Estimation: Nonlinear Refinement

Objective: minimize error

Parameterize rotation by Euler angles

Minimize by Newton-Raphson iteration

Initialize with linear algorithm
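The slide does not spell out the error being minimized. A minimal sketch is given below assuming it is the image reprojection error, with rotation parameterized by Euler angles as on the slide (a ZYX convention chosen here arbitrarily) and the Newton-Raphson iteration approximated by Gauss-Newton steps with numerical Jacobians. All function and variable names are illustrative, not from the original C++ system.

```python
import numpy as np

def euler_to_R(rx, ry, rz):
    """Rotation matrix from Euler angles (roll rx, pitch ry, yaw rz), ZYX convention."""
    cx, sx, cy, sy, cz, sz = np.cos(rx), np.sin(rx), np.cos(ry), np.sin(ry), np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def residuals(p, X, x_obs):
    """Reprojection error: p = (rx, ry, rz, tx, ty, tz), X = Nx3 target points,
    x_obs = Nx2 observed (calibrated) image points."""
    R, T = euler_to_R(*p[:3]), p[3:]
    Xc = X @ R.T + T                      # feature points in the camera frame
    proj = Xc[:, :2] / Xc[:, 2:3]         # perspective division
    return (proj - x_obs).ravel()

def refine_pose(p0, X, x_obs, iters=10, eps=1e-6):
    """Gauss-Newton refinement, initialized from the linear algorithm's estimate p0."""
    p = np.asarray(p0, dtype=float)
    for _ in range(iters):
        r = residuals(p, X, x_obs)
        # Numerical Jacobian by forward differences
        J = np.zeros((r.size, p.size))
        for j in range(p.size):
            dp = np.zeros_like(p); dp[j] = eps
            J[:, j] = (residuals(p + dp, X, x_obs) - r) / eps
        step, *_ = np.linalg.lstsq(J, -r, rcond=None)
        p = p + step
        if np.linalg.norm(step) < 1e-10:
            break
    return p
```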

Page 9:

Multiple-View Motion Estimation

Multiple View Matrix

Rank deficiency constraint

Pinhole Camera
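The matrix and constraint were graphics on the slide. One common form of the multiple-view matrix for a point feature, in the style of the multiple-view geometry literature this work builds on (generic notation, not necessarily the slide's), is:

```latex
% x_1, ..., x_m: calibrated images of one point in m views;
% (R_i, T_i): relative motion from view 1 to view i;
% x_i-hat: skew-symmetric matrix of x_i.
\[
  M_p \;=\;
  \begin{bmatrix}
    \widehat{\mathbf{x}}_2 R_2 \mathbf{x}_1 & \widehat{\mathbf{x}}_2 T_2 \\
    \vdots & \vdots \\
    \widehat{\mathbf{x}}_m R_m \mathbf{x}_1 & \widehat{\mathbf{x}}_m T_m
  \end{bmatrix}
  \in \mathbb{R}^{3(m-1)\times 2},
  \qquad \operatorname{rank}(M_p) \le 1,
\]
```

the rank deficiency holding because M_p [lambda_1, 1]^T = 0, where lambda_1 is the depth of the point in the first view.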

Page 10:

Multiple-view motion estimation: n points in m views

Equivalent to finding the motion and point depths such that the rank constraint holds

Initialize with the two-view linear solution

Least-squares solution for the depths given the motion (see the sketch below)

Use the depths to linearly solve for the motion; iterate until convergence
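A sketch of the alternation in the same generic notation as above (the slide's own formulas are not in the transcript): with the motions (R_i, T_i) fixed, the least-squares depth of point p in the first view solves M_p [lambda_1^p, 1]^T ≈ 0:

```latex
\[
  \lambda_1^{p} \;=\;
  -\,\frac{\displaystyle\sum_{i=2}^{m}
      \bigl(\widehat{\mathbf{x}}_i^{p} R_i \mathbf{x}_1^{p}\bigr)^{\!\top}
      \bigl(\widehat{\mathbf{x}}_i^{p} T_i\bigr)}
     {\displaystyle\sum_{i=2}^{m}
      \bigl\lVert \widehat{\mathbf{x}}_i^{p} R_i \mathbf{x}_1^{p} \bigr\rVert^{2}}.
\]
```

With the depths fixed, lambda_i^p x_i^p = R_i lambda_1^p x_1^p + T_i is linear in the motion parameters, so the motions can be re-estimated linearly; the two steps alternate until convergence, starting from the two-view linear solution.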

Page 11:

Real-time Vision System

Ampro embedded Little Board PC
Pentium 233 MHz running Linux
440 MB flash-disk hard drive, robust to vibration
Runs the motion estimation algorithm
Controls the pan/tilt/zoom camera

Motion estimation algorithms
Written and optimized in C++ using LAPACK
Estimate relative position and orientation at 30 Hz

(Photos: UAV, pan/tilt camera, onboard computer)

Page 12:

Hardware Configuration

(Diagram: on-board UAV hardware. The vision system consists of the camera and frame grabber feeding the vision algorithm on the vision computer, with an RS232 link to the navigation system and a WaveLAN link to the ground. The navigation system consists of the INS/GPS and the control & navigation software on the navigation computer, with RS232 links and its own WaveLAN link to the ground.)

Page 13:

Feature Extraction

Acquire image → Threshold → Histogram segmentation → Target detection → Corner detection → Correspondence
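A minimal sketch of the same pipeline, assuming OpenCV and a bright, high-contrast planar target; the original system used customized C++ software, and the thresholding method, blob-selection rule, and corner-ordering step here are illustrative only.

```python
import cv2
import numpy as np

def extract_features(frame, num_corners=8):
    """Illustrative feature-extraction pipeline: acquire -> threshold -> segment ->
    detect target -> detect corners -> order them for correspondence."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Threshold: pick a cutoff from the intensity histogram (Otsu's method here;
    # assumes the target is brighter than the background).
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    # Segmentation / target detection: keep the largest connected blob as the landing target.
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    target = max(contours, key=cv2.contourArea)
    mask = np.zeros_like(gray)
    cv2.drawContours(mask, [target], -1, 255, thickness=-1)

    # Corner detection restricted to the target region.
    corners = cv2.goodFeaturesToTrack(gray, maxCorners=num_corners,
                                      qualityLevel=0.05, minDistance=10, mask=mask)
    if corners is None:
        return None
    corners = corners.reshape(-1, 2)

    # Correspondence: order corners by angle around the target centroid so that
    # image features match the known layout of the target's feature points.
    centroid = corners.mean(axis=0)
    angles = np.arctan2(corners[:, 1] - centroid[1], corners[:, 0] - centroid[0])
    return corners[np.argsort(angles)]
```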

Page 14:

Camera Control

Pan/tilt to keep features in the image center
Prevents features from leaving the field of view
Increased field of view
Increased range of motion of the UAV
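A minimal sketch of the idea: a proportional law that drives the centroid of the tracked features toward the image center. The gains, image size, sign conventions, and the notion of returning pan/tilt rate commands are assumptions, not the original camera interface.

```python
import numpy as np

def pan_tilt_command(feature_points, image_size=(640, 480), k_pan=0.002, k_tilt=0.002):
    """Proportional pan/tilt law (illustrative gains): return rate commands
    that move the feature centroid toward the image center, keeping the
    landing target in the camera's field of view."""
    cx, cy = image_size[0] / 2.0, image_size[1] / 2.0
    u, v = np.mean(np.asarray(feature_points, dtype=float), axis=0)

    # Pixel error of the feature centroid relative to the image center.
    err_u, err_v = u - cx, v - cy

    # Sign conventions are assumptions and depend on the actual camera mount:
    # positive pan is taken to move the view right, positive tilt to move it up.
    pan_rate = k_pan * err_u
    tilt_rate = -k_tilt * err_v
    return pan_rate, tilt_rate
```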

Page 15:

Ground Station

Comparing Vision with INS/GPS

Page 16:

Motion Estimation in Real Flight Tests

Page 17:

Landing on Stationary Target

Page 18:

Tracking Pitching Target

Page 19:

Conclusions

Contributions
Vision-based motion estimation (5 cm accuracy)
Real-time vision system in the control loop
Demonstrated proof-of-concept prototype: the first vision-based UAV landing

Extensions
Dynamic vision: filtering motion estimates
Symmetry-based motion estimation
Fixed-wing UAVs: vision-based landing on runways
Modeling and prediction of ship deck motion
Landing gear that grabs the ship deck
Unstructured environments: recognizing good landing spots (grassy field, rooftop, etc.)

