Page 1

Computer Vision and Robotics Research Group

Dept. of Computing Science, University of Alberta

http://webdocs.cs.ualberta.ca/~vis/research.htm


Towards Practical Visual Servoing in Robotics

R. Tatsambon Fomena

Page 2

Manus ARM and iARM by Exact Dynamics. Joysticks and keypads are the common user interfaces.

Image from http://www.exactdynamics.nl

Example of a fully integrated system

Page 3

Click-grasp application with the intelligent Manus Arm

Examples of a fully integrated system

http://www.youtube.com/watch?feature=player_embedded&v=LBUyiaAPCcY

Page 4

1. The user selects the object. The position of the object is computed using stereo vision from the shoulder camera.

2. The robot arm moves to that position, expressed in its base frame.

3. Having the object in view, the robot arm computes a more precise target and adjusts its orientation.

4. Using the left gripper camera, the robot searches a database for the best object match.

5. Using the object template, the robot arm moves to align the feature points.

6. Once aligned, the arm moves forward and closes its gripper.

7. The robot returns the object to the user. (Tsui et al., JABB, 2011)

Visual servoing: Example of a fully integrated system

Page 5

Joystick control. 3 control modes:

1. The user moves the robot's hand in three-dimensional space while maintaining the orientation of the hand.

2. The user modifies the orientation of the hand while keeping the hand centered at the same point in space.

3. The user grasps and releases using either two or three fingers.

http://www.youtube.com/watch?feature=player_embedded&v=O0nr8NdV6-M

http://www.youtube.com/watch?feature=player_embedded&v=vV4tbS7WTL0

Kinova Robotics also aims for a similar system

Image from http://kinovarobotics.com

Page 6

Visual servoing: The control concept

[Block diagram: HRI specifies the goal S*; the error between S* and the measured features S drives the ACTION of the Robot + Camera(s) on the World, and PERCEPTION closes the loop: perception for action.]

(Espiau et al., TRA, 1992) (Hutchinson et al., TRA, 1996)

Page 7

Visual servoing: Why visual sensing?

How to control the position of the end-effector of a robot with respect to an object of unknown location in the robot base frame?

How to track a moving target?

A visual sensor provides relative position information

Page 8

Visual servoing: How can you use visual data in control?

Look then move

Visual feedback control loop

[Diagrams: in the visual feedback control loop, the measured features S are continuously compared with the goal S* and the error drives the ACTION of the Robot + Camera(s); in look-then-move, PERCEPTION computes S − S* once and the ACTION is then executed open loop.]

Page 9

Quiz

What are the advantages of closed-loop control over an open-loop control approach?

Page 10

Visual servoing: Ingredients for a fully integrated system

HRI

Visual tracking method

Motion control algorithm

[Block diagram: HRI specifies the goal S*; ACTION and PERCEPTION connect the Robot + Camera(s) to the features S and the goal S*.]

Page 11

Visual servoing: Visual tracking

Crucial as it provides the necessary visual feedback:
• coordinates of image points or lines

Should give reliable and accurate target position in the image

The CAMShift color tracker provides the 2D (x, y) coordinates of the tracked objects

[Diagram: PERCEPTION. The tracker searches the current image for the end-effector and outputs S = (x, y); the block selects the set of measurements to use for control.]
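For concreteness, a minimal CAMShift tracking sketch using OpenCV (illustrative only; the initial window coordinates are assumed here, where a real system would take them from the user's selection):

```python
import cv2

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
x, y, w, h = 200, 150, 80, 80                      # assumed initial window
hsv_roi = cv2.cvtColor(frame[y:y+h, x:x+w], cv2.COLOR_BGR2HSV)
hist = cv2.calcHist([hsv_roi], [0], None, [180], [0, 180])
cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)
term = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)
window = (x, y, w, h)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    backproj = cv2.calcBackProject([hsv], [0], hist, [0, 180], 1)
    box, window = cv2.CamShift(backproj, window, term)
    S = box[0]                                     # tracked center S = (x, y)
```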

Page 12

Visual Tracking Applications: Watching a moving target

Camera + computer can determine how things move in a scene over time.

Uses:

Security: e.g. monitoring people moving in a subway station or store

Measurement: speed, alerts on colliding trajectories, etc.


Page 13

Visual Tracking Applications: Human-Computer Interfaces

Camera + computer tracks the motions of a human user and interprets them in an online interaction.

The user can interact with menus, buttons, and, e.g., drawing programs, using hand movements as mouse movements and gestures as clicks.

Furthermore, the system can interpret physical interactions.


Page 14

Visual Tracking Applications: Human-Machine Interfaces

Camera + computer tracks the motions of a human user and interprets them, and the machine/robot carries out the task.

Remote manipulation

Service robotics for the handicapped and elderly


Page 15

Visual servoing: Example of visual tracking

Registration-based tracking: Nearest Neighbour (NN) tracker vs. Efficient Second-order Minimization (ESM)

http://www.youtube.com/watch?v=do5EQGMpv50

Page 16

Visual servoing: Motion control algorithm

3 possible control methods depending on the selection of S: 2D, 3D, and 2 ½ D

(Corke, PhD, 1994)

High bandwidth requires precise calibration: camera calibration and robot-camera (hand-eye) calibration


Page 17

Visual servoing: Motion control algorithm

The key element is the model of the system

[Diagram: 2D VS, 2 ½ D VS, and 3D VS compared along two axes: abstraction for control, and robustness to image noise and calibration errors / suitability for unstructured environments. (Corke, PhD, 1994)]

Page 18

3D Visual servoing


How to sense the position and orientation of an object?

(Wilson et al., TRA, 1996)

Page 19

2 ½ D = homography-based visual servoing


Euclidean Homography?

(Malis et al., TRA, 1999)

Page 20

2D Visual servoing


Example of 2D features?

http://www.youtube.com/watch?v=Np1XFuDFcXc

(Espiau et al., TRA, 1992)

(Jagersand et al., ICRA, 1997)

Page 21

Quiz

What are the pros and cons of each approach?

1-a) 2D 1-b) 3D 1-c) 2 ½ D

Page 22

Visual servoing: Motion control algorithm

The key element is the model of the system: how do the image measurements S change with respect to changes in the robot configuration q?

The Jacobian J = ∂S/∂q can be seen as a sensitivity matrix

Page 23

Visual servoing: Motion control algorithm

How to obtain J?

1) Machine learning technique: estimation using numerical methods, for example Broyden's update

2) Model-based approach: analytical expression using the robot and the camera projection model, for example S = (x, y)

How to derive the Jacobian or interaction matrix L?

(Jagersand et al., ICRA, 1997)
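To make the two routes concrete, here is a minimal Python sketch (an illustration, not the system's actual code; the function names are mine): the standard analytic interaction matrix of a normalized image point S = (x, y) at depth Z for the model-based route, and a Broyden rank-one update for the learning route.

```python
import numpy as np

def interaction_matrix_point(x, y, Z):
    """Analytic interaction matrix L of a normalized image point (x, y) at
    depth Z, relating feature velocity to camera velocity (v, omega)."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x],
    ])

def broyden_update(J, ds, dq, alpha=1.0):
    """Broyden rank-one update: correct the Jacobian estimate J so that
    J @ dq better predicts the observed feature change ds."""
    dq = dq.reshape(-1, 1)
    ds = ds.reshape(-1, 1)
    denom = float(dq.T @ dq)
    if denom < 1e-12:       # no motion, nothing to learn
        return J
    return J + alpha * (ds - J @ dq) @ dq.T / denom
```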

Page 24

Visual servoing: Motion stability

How to move the robot knowing e = S − S* and J?

Classical control: the control law imposes an exponential decay of the error.
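In standard visual servoing notation (a reconstruction, since the slide's equations were lost in extraction), imposing exponential decay of the error gives the classical law:

```latex
\dot{e} = -\lambda e, \qquad \dot{S} = J\,\dot{q}
\quad\Rightarrow\quad \dot{q} = -\lambda\, J^{+} (S - S^{*})
```

where J⁺ is the Moore-Penrose pseudoinverse of the estimated Jacobian.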

Page 25

Visual servoing: Motion control algorithm

S := VisualTracker(InitImage)
e := S − S*

While ( ||e|| > T ) {
    CurrentImage := GrabImage(camera)
    S := VisualTracker(CurrentImage)
    Compute e = S − S*
    Estimate J
    Compute Δq = −λ J⁺ e
    Change robot configuration with Δq
}
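A hedged, runnable rendering of this loop in Python (a sketch under assumed interfaces: `tracker`, `camera`, and `robot` are hypothetical objects standing in for the real system):

```python
import numpy as np

def visual_servo(tracker, camera, robot, S_star, J, lam=0.1, T=1e-2):
    """Classical visual servoing loop: drive e = S - S* to zero with the
    control law dq = -lam * pinv(J) @ e (J assumed constant here)."""
    S = tracker.track(camera.grab())
    e = S - S_star
    while np.linalg.norm(e) > T:
        S = tracker.track(camera.grab())       # PERCEPTION
        e = S - S_star
        dq = -lam * np.linalg.pinv(J) @ e      # control law
        robot.move_by(dq)                      # ACTION
    return e
```

In practice J would be re-estimated inside the loop, e.g. with the Broyden update sketched earlier.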

Page 26

Visual servoing: HRI

Important for task specification:
• point-to-point alignment for gross motions
• point-to-line alignment for fine motions

Should be easy and intuitive

Is user dependent

Page 27

(Hager, TRA, 1997)

Page 28

(Kragic and Christensen, 2002)

What does the error function look like?

Page 29

(Kragic and Christensen, 2002)

What does the error function look like?

Page 30

(Kragic and Christensen, 2002)

What does the error function look like?

Page 31

Visual Servoing: HRI - point-to-point task error

Point-to-point task "error":

E = [ y* − y0 ],  e.g.  E = [y1 … y16]* − [y1 … y16]0

Why 16 elements?

Page 32

Visual Servoing: HRI - point-to-line task error

Point to Line

E_pl(y; l) = [ yl · ll ; yr · lr ]

Line: ll = y3 × y1 and lr = y4 × y2, the left- and right-image lines through the stereo point pairs (y1, y2) and (y3, y4); point: yl = y5, yr = y6.

Note: y in homogeneous coordinates.
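A small numeric sketch of this point-to-line error for one camera (illustrative only; the line is normalized so the value is a signed pixel distance):

```python
import numpy as np

def homog(p):
    """Lift a 2D pixel coordinate to homogeneous form (x, y, 1)."""
    return np.array([p[0], p[1], 1.0])

def point_to_line_error(p_line1, p_line2, p):
    """Signed distance of image point p to the line through p_line1 and
    p_line2 (all 2D pixel coordinates), via the homogeneous cross product."""
    l = np.cross(homog(p_line1), homog(p_line2))   # line through the two points
    l = l / np.linalg.norm(l[:2])                  # normalize to pixel units
    return float(homog(p) @ l)                     # zero when p lies on the line
```

Stacking one such term per camera gives E_pl above; stacking several point-to-point and point-to-line terms gives composite task errors like the wrench example on the next slide.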

Page 33

Visual Servoing: HRI - parallel composition example

E_wrench(y) = [ y4 − y7 ; y2 − y5 ; y8 · (y3 × y4) ; y6 · (y1 × y2) ]

(plus e.p. checks)

Page 34

(Li, PhD thesis, 2013)

http://www.youtube.com/watch?feature=player_embedded&v=QQJIVh0WICM

Maintaining visibility

Page 35

Visual Servoing: HRI with virtual visual fixtures

Motivation

Virtual fixtures can be used for motion constraints

Potential Applications

Improvements in vision-based power line or pipeline inspection:
• Flying over a power line or pipeline by keeping a constant yaw angle relative to the line (line tracking from the top)
• Hovering over a power pole and moving towards its top for a closer inspection

https://www.youtube.com/watch?v=5W3HiuOYuhg

Page 36

Where can virtual fixtures be useful?

Robot assistant method for microsurgery (steady-hand eye robot):
• “Here the extreme challenge of physical scale accentuate the need for dexterity enhancement, but the unstructured nature of the task dictates that the human be directly “in the loop””

EyeRobot1

Page 37

Where can virtual fixtures be useful?

How to assist the surgeon?

Cooperative control with the robot:
• Incorporate virtual fixtures to help protect the patient and eliminate the surgeon's hand tremors during surgery

Page 38

Where can virtual fixtures be useful?

Central retinal vein occlusion; solution: retinal vein cannulation

Free-hand vein cannulation:
http://www.youtube.com/watch?v=MiKVFwuFybc&feature=player_embedded

Robot-assisted vein cannulation:
http://www.youtube.com/watch?v=s5c9XuKtJaY&feature=player_embedded

What to prove? “robot can increase success rate of cannulation and increase the time the micropipette is maintained in the retinal vein during infusion”

Page 39

Virtual fixture: example

JHU VRCM (Virtual Remote Center of Motion):
http://www.youtube.com/watch?v=qQEJEM7YeXY&feature=player_embedded

Page 40

What is a virtual fixture?

“Like a real fixture, provides surface that confines and/or guides a motion” (Hager, IROS, 2002)

Its role is typically to enhance physical dexterity


(Bettini et al., TRO, 2004)

Page 41

What is a virtual fixture?

“Software helper control routine” (A. Hernandez Herdocia, Master thesis, 2012)

Line constraint, plane constraint


Page 42

What is a virtual visual fixture?

Vision-based motion constraints

Geometric virtual linkage between the sensor and the target, for one-camera visual servoing (Chaumette et al., WS, 1994); an extension of the basic kinematics of contacts

Image-based task specification for two-camera visual servoing (Dodds et al., ICRA, 1999)

The task's geometric constraint defines a virtual fixture

Page 43

What is a virtual visual fixture?

Vision-based motion constraints

Geometric virtual linkage (Chaumette et al., WS, 1994)

Page 44

What is a virtual visual fixture?

Vision-based motion constraints

Image-based task specification (Dodds et al., ICRA, 1999)

Page 45

Mathematical insight into virtual fixtures

As a control law: filtered motion in a preferred direction

As a geometric constraint: virtual linkage

As a condition for observer design: persistency of excitation

Prove it? (Hager, IROS, 1997)

Page 46

Mathematical insight into virtual visual fixtures

Take-home message: the kernel (null space) of the Jacobian or interaction matrix in visual servoing

(Tatsambon, PhD, 2008)
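A minimal sketch of that message (illustrative, not code from the cited thesis): joint velocities in the kernel of the task Jacobian leave the constrained visual features unchanged, so restricting commanded motion to that kernel realizes a virtual visual fixture.

```python
import numpy as np

def fixture_filter(J, q_dot):
    """Project a desired joint velocity onto the null space of the task
    Jacobian J; the returned motion leaves the fixtured features S fixed."""
    n = J.shape[1]
    N = np.eye(n) - np.linalg.pinv(J) @ J   # null-space projector of J
    return N @ q_dot
```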

Page 47

Summary: what are the next steps to move forward?

The conclusion is clear:
• So far there are only a handful of existing fully integrated and tested visual servoing systems
– More mechatronics & theoretical developments than actual practical software development
– Our natural environment is complex: hard to design an adequate representation for robot navigation

It is time to free visual servoing from its restrictions so it can solve real-world problems:
• Tracking issue: reliability and robustness (light variation, occlusions, …)
• HRI problem: image-based task specification; new sensing modalities should be exploited

Page 48

Short-term research goal: new virtual visual fixtures

Virtual fixtures:
• Line constraint: to keep the tool on the line; can be done with point-to-line alignment
• Ellipse constraint: to keep the tool on the projected image of a circle; has never been done
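One plausible error term for such an ellipse constraint (an illustrative assumption, since the slide notes this has not been done): the algebraic distance of a homogeneous image point to a conic.

```python
import numpy as np

def point_on_conic_error(p, C):
    """Algebraic error p^T C p of a homogeneous image point p = (x, y, 1)
    with respect to a conic C (3x3 symmetric matrix, e.g. an ellipse);
    it is zero exactly when p lies on the conic."""
    p = np.asarray(p, dtype=float)
    return float(p @ C @ p)
```

Driving this scalar to zero, like the point-to-line term earlier, would keep the tool on the projected circle.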

Page 49

Short-term research goal: grasping using visual servoing


Instead of having predefined grasping points in a database of objects: where to grasp?

