The Visual Object Tracking VOT-TIR2016 Challenge Results Michael Felsberg, Matej Kristan, Jiří Matas, Aleš Leonardis, Roman Pflugfelder, Gustav Häger, Amanda Berg, Abdelrahman Eldesokey, Jörgen Ahlberg, Luka Čehovin, Tomáš Vojíř, Alan Lukežič, Gustavo Fernández, et al.
Page 1: The Visual Object Tracking VOT-TIR2016 Challenge Results
(slides: data.votchallenge.net/vot2016/presentations/vot_tir_2016_presentation.pdf)


Page 2

Outline

1. Scope of the VOT-TIR challenge

– Thermal infrared imaging

2. VOT-TIR2016 challenge overview

– Evaluation system

– Dataset

– Performance evaluation measures

3. VOT-TIR2016 results overview

4. Summary and outlook

Felsberg et al., VOT-TIR2016 results

Page 3

Scope of the VOT-TIR challenge

• Single-object, single thermal infrared (TIR) camera, model-free, short-term, causal trackers

• Model-free:

– Nothing but a single training example is provided by the bounding box in the first frame

• Short-term:

– Tracker does not perform re-detection

– Once it drifts off the target, we consider that a failure

• Causality:

– Tracker does not use any future frames for pose estimation

• Object state defined as an upright bounding box
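The model-free, short-term, and causal constraints can be summarized as a minimal interface sketch (hypothetical Python, not the VOT toolkit API; the class and function names are illustrative):

```python
class ShortTermTracker:
    """Illustrative model-free, causal, short-term tracker skeleton."""

    def initialize(self, frame, bbox):
        # Model-free: the first-frame bounding box is the only supervision.
        self.bbox = bbox

    def track(self, frame):
        # Causal: only the current and past frames are available here.
        # A real tracker would update self.bbox from `frame`.
        return self.bbox


def evaluate(tracker, frames, groundtruth, overlap):
    """Supervised evaluation loop: the evaluator re-initializes the tracker
    after each failure, since short-term trackers do not re-detect."""
    failures = 0
    tracker.initialize(frames[0], groundtruth[0])
    for frame, gt in zip(frames[1:], groundtruth[1:]):
        predicted = tracker.track(frame)
        if overlap(predicted, gt) == 0.0:   # drifted off the target: a failure
            failures += 1
            tracker.initialize(frame, gt)   # evaluator resets the tracker
    return failures
```

In supervised VOT-style evaluation it is the evaluator, not the tracker, that re-initializes after a failure, which is what makes failure counts a meaningful robustness measure.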


Page 4

Thermal Infrared

(image credit: http://www.theses.ulaval.ca/2005/23016/apb.html)

Page 5

Applications of TIR

• Scientific research

• Security

• Fire monitoring

• Search and rescue

• Automotive safety

• Personal use

• Military

Page 6

Why a separate challenge?

Is tracking in TIR different from tracking in low-resolution grayscale visual imagery?

Many similarities but also interesting differences

• 16-bit

• Constant values if radiometric

• Less structure/edges/texture

• No shadows

• Noise: blooming, resolution, dead pixels

Page 7

Evaluation system from VOT 2016

• Matlab-based kit to automatically perform a battery of standard experiments

• Download from our homepage

– https://github.com/vicoslab/vot-toolkit

– select the vottir2016 experiment stack

• Plug and play!

– Supports multiple platforms and programming languages (C/C++/Matlab/Python, etc.)

• Easy to evaluate your tracker on our benchmarks

• Deep integration with tracker - Fast execution of experiments

• OTB-like evaluation omitted

Page 8

VOT-TIR2016 Dataset: LTIR2016

• Follows VOT 2013 selection and annotation approach:

– Keep it sufficiently small, diverse and well annotated

– Follow the VOT dataset construction methodology

• Modification of the Linköping Thermal InfraRed (LTIR) dataset
A. Berg, J. Ahlberg, M. Felsberg: A Thermal Object Tracking Benchmark. AVSS 2015.

Page 9

• Different sources

• Different applications

• Different sensors

• Moving + stationary sensors

• Radiometric + non-radiometric

• 8/16 bits


Page 13

Sequence details


ASL-TID

Page 14

Page 15

Will it be different? Test against VOT2014


VOT2014 LTIR

Page 16

Problem 2015: Sequence ranking

• A_f: average number of trackers that failed per frame

• M_f: maximum number of trackers that failed at a single frame

Sequence            Score     Sequence              Score
crowd               2         jacket                4
quadrocopter        2.5       hiding                4.5
quadrocopter2       2.5       car                   5
garden              3         crossing              5
mixed_distractors   3         depthwise_crossing    5
saturated           3.5       horse                 5
selma               3.5       rhino_behind_tree     5
street              3.5       running_rhino         5
birds               4         soccer                5
crouching           4         trees                 5

challenging:  0.06 <= A_f <= 0.2,   14 <= M_f <= 22
intermediate: 0.04 <= A_f <= 0.1,    6 <= M_f <= 11
easiest:      0    <= A_f <= 0.04,   0 <= M_f <= 7
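Both scores are simple statistics of per-frame failure counts; a small sketch (the input format, one aggregate failure count per frame, is an assumption):

```python
def sequence_difficulty(failures_per_frame):
    """Per-sequence difficulty scores used for ranking:
    A_f = average number of trackers that failed per frame,
    M_f = maximum number of trackers that failed at a single frame."""
    a_f = sum(failures_per_frame) / len(failures_per_frame)
    m_f = max(failures_per_frame)
    return a_f, m_f
```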


Page 17

Modifications of LTIR

• VOT-TIR2015 was already saturated

• Call for sequences – limited success (3 new sources, too easy)

• Easiest sequences have been removed: Crossing, Horse, and Rhino behind tree

• New, more difficult sequences have been added: Bird, Boat1, Boat2, Car2, Dog, Excavator, Ragged, and Trees2

Beihang University


Page 19

Properties

• 25 Sequences

• Average sequence length: 740 frames

• Annotations in accordance with VOT

– Bounding-box

– 11 global attributes (per-sequence)

– 6 local attributes (per-frame)

Local attributes (per-frame): occlusion, dynamics change, motion change, size change, camera motion, neutral

Global attributes (per-sequence): blur, dynamics change, temperature change, object motion, size change, camera motion, background clutter, aspect ratio change, object deformation, scene complexity, neutral

Page 20

Performance evaluation measures

• Basically the same as VOT2016 (based on 8-bit)

– accuracy

– robustness

• Evaluated pooled and normalized per-attribute

– raw value

– rank

• Overall: expected average overlap

• Speed in EFO units
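A minimal sketch of the two raw measures (assuming the common definitions: per-frame bounding-box overlap for accuracy, zero-overlap frames counted as failures; the actual toolkit additionally uses repetitions and burn-in windows after re-initialization):

```python
def overlap(a, b):
    """Intersection-over-union of two axis-aligned boxes (x, y, w, h)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    iw = max(0.0, min(ax + aw, bx + bw) - max(ax, bx))
    ih = max(0.0, min(ay + ah, by + bh) - max(ay, by))
    inter = iw * ih
    union = aw * ah + bw * bh - inter
    return inter / union if union > 0.0 else 0.0


def accuracy_robustness(predicted, groundtruth):
    """Accuracy: mean overlap on frames where the tracker is on target.
    Robustness: number of frames with zero overlap (failures)."""
    overlaps = [overlap(p, g) for p, g in zip(predicted, groundtruth)]
    failures = sum(1 for o in overlaps if o == 0.0)
    on_target = [o for o in overlaps if o > 0.0]
    accuracy = sum(on_target) / len(on_target) if on_target else 0.0
    return accuracy, failures
```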


Page 21

Results

• 21 submitted trackers

• 3 added by VOT committee (NCC, DSST2014, SRDCFir)

• All 24 trackers in both challenges

• Various classes of trackers

– 8 part-based trackers: BDF, BST, DPCF, DPT, FCT, GGTv2, LT-FLO, and SHCT

– 7 trackers based on DCFs: DSST2014, MvCF, NSAMF, sKCF, SRDCFir, Staple-TIR, and STAPLE+

– 3 trackers based on deep features/learning: deepMKCF, TCNN, and MDNet-N

– 2 fusion based trackers: MAD and LOFT-Lite

– 4 other: EBT, PKLTF, DAT, and NCC

Page 22

Results (sequence pooling)

Page 23

Results (attribute normalization)

Page 24

VOT’14

Page 25

Page 26

Page 27

Sequence ranking

• A_f: average number of trackers that failed per frame

• M_f: maximum number of trackers that failed at a single frame


Sequence             Score     Sequence               Score
Bird                 1.5       Mixed_distractors      3.5
Quadrocopter2        1.5       Selma                  3.5
Trees2               1.5       Street                 3.5
Car2                 2         Trees1                 3.5
Crowd                2         Boat1                  4
Garden               2         Jacket                 4
Quadrocopter         2.5       Birds                  4.5
Ragged               2.5       Car1                   4.5
Excavator            3         Saturated              4.5
Boat2                3.5       Soccer                 4.5
Crouching            3.5       Depthwise_crossing     5
Dog                  3.5       Hiding                 5
                               Running_rhino          5

challenging:  0.08 <= A_f,          17 <= M_f
intermediate: 0.03 <= A_f <= 0.08,   4 <= M_f <= 11
easiest:      0.01 <= A_f <= 0.05,   2 <= M_f <= 7

Page 28

Problems with Overlap Measure

• Systematic overestimation of the bounding box

– avoids failures

– at the cost of moderate accuracy degradation

• Paper suggests a new, quantization-based criterion


Page 29

Bias in Overlap Precision Measures

• assume:

– 1D case

– position error uniformly distributed

• expectation

– GT: 0.19

– 0.5xGT: 0.11

– 2xGT: 0.21
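The slide's expectations (0.19 / 0.11 / 0.21) come from the paper's own setup, which is not reproduced here; the Monte Carlo sketch below uses an assumed setup (1D, unit-width ground truth, position error uniform within one width, precision threshold 0.5) and only illustrates the direction of the bias:

```python
import random

def iou_1d(x, w, gt_w=1.0):
    """1D intersection-over-union of ground truth [0, gt_w] and prediction [x, x + w]."""
    inter = max(0.0, min(gt_w, x + w) - max(0.0, x))
    return inter / (gt_w + w - inter)

def overlap_precision(scale, n=100_000, thresh=0.5, seed=0):
    """Fraction of frames whose overlap reaches `thresh`, with the predicted box
    at `scale` times the ground-truth width and its position error assumed
    uniform within one ground-truth width (a hypothetical error model)."""
    rng = random.Random(seed)
    hits = sum(iou_1d(rng.uniform(-1.0, 1.0), scale) >= thresh for _ in range(n))
    return hits / n
```

Under these assumptions the doubled box reaches the threshold roughly half the time, the correctly sized box about a third of the time, and the halved box least often, mirroring the overestimation incentive described above.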


Page 30

Summary

• More difficult challenge with LTIR2016 dataset

• 1/3 of the trackers show a different ranking than in VOT2016

– in contrast to VOT-TIR2015

• Top-performing triple: SRDCFir, EBT, TCNN

• Best real-time method: MvCF

• Issues with the overlap measure

• Available at http://www.votchallenge.net/vot2016


Page 31

Winners of the VOT-TIR2016 Challenge:

Gao Zhu, Fatih Porikli, and Hongdong Li:

Edge Box Tracker (EBT)

Presentation at VOT2016 right after this talk

Award sponsored by


Page 32

Thanks

• The VOT2016 Committee

• Everyone who participated:


Alfredo Petrosino, Alvaro Garcia-Martin, Andres Solis Montero, Anton Varfolomieiev, Aykut Erdem, Bohyung Han, Chang-Ming Chang, Dawei Du, Erkut Erdem, Fahad Shahbaz Khan, Fatih Porikli, Fei Zhao, Filiz Bunyak, Francesco Battistone, Gao Zhu, Guna Seetharaman, Hongdong Li, Honggang Qi, Horst Bischof, Horst Possegger, Hyeonseob Nam, Jack Valmadre, Jianke Zhu, Jiayi Feng, Jochen Lang, Jose M. Martinez, Kannappan Palaniappan, Karel Lebeda, Ke Gao, Krystian Mikolajczyk, Longyin Wen, Luca Bertinetto, Mahdieh Poostchi, Mario Maresca, Martin Danelljan, Michael Arens, Ming Tang, Mooyeol Baek, Nana Fan, Noor Al-Shakarji, Ondrej Miksik, Osman Akin, Philip H. S. Torr, Qingming Huang, Rafael Martin-Nieto, Rengarajan Pelapur, Richard Bowden, Robert Laganiere, Sebastian B. Krah, Shengkun Li, Shizeng Yao, Simon Hadfield, Siwei Lyu, Stefan Becker, Stuart Golodetz, Tao Hu, Thomas Mauthner, Vincenzo Santopietro, Wenbo Li, Wolfgang Hubner, Xin Li, Yang Li, Zhan Xu, and Zhenyu He

