Date posted: 11-Feb-2017
Category: Engineering
Uploaded by: somar-boubou
Visual Impression Localization of Autonomous Robots
Somar Boubou (1,2), A.H. Abdul Hafez (3), Einoshin Suzuki (1)
1. Dept. of Informatics, Kyushu University, Japan.
2. Control Systems Laboratory, Toyota Technological Institute, Japan.
3. Dept. of Computer Engineering, Hasan Kalyoncu University, Turkey.
Topological visual localization:
• Appearance-based methods [Pronobis 06]
• Landmark-based methods (e.g., the Dyson 360 Eye, https://www.dyson360eye.com)
Previous localization methods are precise: every node in the topological map represents a (relatively) precise position of the robot [Abdul-Hafez 13].
Precision: around 1 m in outdoor applications, and on the order of mm in indoor applications when geometric features are available [Badino 12][BK Kim 15].
We achieved a rough but fast localization with BIRCH.
Background and Objective
Base work: an autonomous mobile robot that models HSV color information of the environment [Suzuki 12].
While navigating indoors, the robot clusters its observations online with BIRCH [Zhang 97] and detects peculiar colors.
Proposed extension to our localization problem
• The robot in [Suzuki 12] signals an observation that is sufficiently far from similar past observations.
• Our robot inherits most of [Suzuki 12] but solves a localization problem by comparing a pair of CF trees based on the All Common Subsequences (ACS) measure [Wang 97].
[Figure: observed data are fed into a CF tree held on RAM, incrementally constructing the model. A leaf compresses similar observations; an outlier is an observation very different from its corresponding leaf.]
Localization problem
[Figure: the navigation CF tree (Nav) is matched against reference CF trees Ref1–Ref4 stored on ROM.]
The robot localizes itself by comparing its tree with several reference trees, each of which represents one area of interest.
BIRCH [Zhang 97]
BIRCH (Balanced Iterative Reducing and Clustering using Hierarchies):
• groups similar examples by building a data index structure called a CF tree (Clustering Feature tree);
• is an efficient and scalable clustering method for huge data sets [Zhang 97].
Applications: peculiar data discovery [Suzuki 12] and intrusion detection [Horng 11].
CF tree [Zhang 97]
Consider a cluster C of N d-dimensional data points (feature vectors) x_1, x_2, …, x_N. Its Clustering Feature CF is the triple

CF = (N, LS, SS), where LS = Σ_{i=1}^{N} x_i and SS = Σ_{i=1}^{N} x_i².
CF vector [Zhang 97]
To insert a new observation with CF vector CF_x = (N_x, LS_x, SS_x) into a leaf entry CF_i = (N_i, LS_i, SS_i), test whether D(CF_x, CF_i) < τ:
• Yes: absorb it by CF additivity, CF_i ⊕ CF_x = (N_i + N_x, LS_i + LS_x, SS_i + SS_x).
• No: try again in a new location.
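The CF triple and its insertion test can be sketched in a few lines. This is a minimal illustration of the mechanism described above, not the authors' implementation; the class and function names, the centroid-based distance, and the use of plain Python lists are all assumptions made for the sketch.

```python
class CF:
    """Clustering Feature CF = (N, LS, SS) of a cluster."""

    def __init__(self, x):
        self.n = 1                      # number of points N
        self.ls = list(x)               # linear sum  LS = sum of x_i
        self.ss = [v * v for v in x]    # squared sum SS = sum of x_i^2

    def merge(self, other):
        # CF additivity: CF_i (+) CF_x = (N_i+N_x, LS_i+LS_x, SS_i+SS_x)
        self.n += other.n
        self.ls = [a + b for a, b in zip(self.ls, other.ls)]
        self.ss = [a + b for a, b in zip(self.ss, other.ss)]

    def centroid(self):
        return [v / self.n for v in self.ls]


def distance(cf_a, cf_b):
    # Euclidean distance between centroids, one of BIRCH's distance options
    return sum((a - b) ** 2
               for a, b in zip(cf_a.centroid(), cf_b.centroid())) ** 0.5


def try_insert(leaf_cf, x_cf, tau):
    # Absorb the new observation only if D(CF_x, CF_i) < tau; otherwise
    # the caller "tries again in a new location" (creates a new entry).
    if distance(leaf_cf, x_cf) < tau:
        leaf_cf.merge(x_cf)
        return True
    return False
```

Because CF triples are additive, a leaf never stores its raw points; merging two CFs is enough to keep N, LS, and SS exact for the combined cluster.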
CF vector for an HSV color histogram [Suzuki 12]
The generic CF = (N, LS, SS) is replaced by CF = (h, σ, num, key), with
key = [B; G; W; r_{0:3}; o_{0:3}; y_{0:3}; g_{0:3}; c_{0:3}; b_{0:3}; p_{0:3}].
Our extension: introduction of weights [Lei 99]
Flow chart for CF-tree comparison (robot navigation)
1. The navigation tree 𝒯 has P paths; the reference tree 𝒮 has Q paths.
2. Compare every pair of paths: δ(1,1), δ(1,2), …, δ(P,Q).
3. Sort the results in descending order: δ(1) ≥ δ(2) ≥ … ≥ δ(P·Q).
4. Sum the best matches: S(𝒮,𝒯) = Σ_{i=1}^{E} δ(i), where E ≤ P·Q is a design variable.
5. Normalize: Similarity(𝒮,𝒯) = S(𝒮,𝒯) / S(𝒯,𝒯).
A path is a sequence of nodes, s = (s_1, s_2, ⋯, s_m) from 𝒮 and t = (t_1, t_2, ⋯, t_n) from 𝒯, and each pair score is the weighted ACS distance δ(p,q) = acs(s_p, t_q) / 2^{(m+n)/2} · ω_sim.
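The steps above can be sketched as a small ranking routine. Here `path_similarity` is a placeholder for the weighted ACS score δ(p,q); the function names and the toy Jaccard scorer used in the usage note are illustrative assumptions, not the authors' code.

```python
from itertools import product


def tree_score(paths_s, paths_t, path_similarity, e):
    # delta(p,q) for all P*Q path pairs, sorted in descending order
    deltas = sorted((path_similarity(s, t)
                     for s, t in product(paths_s, paths_t)),
                    reverse=True)
    return sum(deltas[:e])              # S(S,T): sum of the E best matches


def similarity(paths_s, paths_t, path_similarity, e):
    # Similarity(S,T) = S(S,T) / S(T,T), so a tree matched against
    # itself scores 1.0 and weaker matches score below 1.0
    return (tree_score(paths_s, paths_t, path_similarity, e) /
            tree_score(paths_t, paths_t, path_similarity, e))
```

For example, with a toy Jaccard scorer over node labels, `similarity(t_paths, t_paths, jac, e=2)` returns 1.0 for identical path sets, and less for partially overlapping trees.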
Node weighting
Consider two paths s = s_p and t = t_q of lengths m and n.
[Wang 97]: Dist = acs(s, t) / √(acs(s, s) · acs(t, t)), which with acs(s, s) = 2^m and acs(t, t) = 2^n becomes our weighted version

Dist = acs(s, t) / 2^{(m+n)/2} · ω_sim.

Example: for s = {a, b, c} and t = {a, b}, acs(s, t) = |{∅, a, b, ab}| = 4.
The weight factor is

ω_sim = 1 − (max(ω_s, ω_t) − min(ω_s, ω_t)) / max(ω_s, ω_t),

where ω_s = Σ_{i=1}^{m} ω(i), and the weight of node i combines two terms:

ω(i) = (α·ω_N + β·ω_num) / (α + β), with ω_N(i) = N_i / N_root and ω_num = num / 3.

Weights are used to define the compression type (favor of root, favor of leaves, or neutral) and to eliminate noise.
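The acs count itself has a simple dynamic program. The sketch below is one standard way to count common subsequences (the empty one included); it is our reading of the ACS measure, not the authors' implementation, and the normalized `acs_distance` helper is named here for illustration.

```python
def acs(s, t):
    """Count the common subsequences of s and t, including the empty one."""
    m, n = len(s), len(t)
    # dp[i][j] = number of common subsequences of s[:i] and t[:j];
    # row/column 0 is 1 because the empty subsequence is always shared
    dp = [[1] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            dp[i][j] = dp[i - 1][j] + dp[i][j - 1]
            if s[i - 1] != t[j - 1]:
                # no new match: subtract the double-counted overlap
                dp[i][j] -= dp[i - 1][j - 1]
    return dp[m][n]


def acs_distance(s, t):
    # Normalization from the slides (before the weight factor w_sim):
    # acs(s,t) / sqrt(acs(s,s) * acs(t,t)), which is 1.0 for s == t
    return acs(s, t) / (acs(s, s) * acs(t, t)) ** 0.5
```

This reproduces the slide's example: acs("abc", "ab") = 4, and for a sequence of m distinct symbols acs(s, s) = 2^m, which is where the 2^{(m+n)/2} denominator comes from.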
Three types of comparison (favor of the root)
key = [B; G; W; r_{0:3}; o_{0:3}; y_{0:3}; g_{0:3}; c_{0:3}; b_{0:3}; p_{0:3}]
Tree 1: 49_key_root = [W: 57%; B: 43%], 28_key_ch1 = [W: 100%], 21_key_ch2 = [B: 100%]
Tree 2: 49_key_root = [W: 57%; r0: 43%], 28_key_ch1 = [W: 100%], 21_key_ch2 = [r0: 100%]
Three types of comparison (favor of the leaves)
key = [B; G; W; r_{0:3}; o_{0:3}; y_{0:3}; g_{0:3}; c_{0:3}; b_{0:3}; p_{0:3}]
49_key_root = [W: 90%]; in key_root, the remaining components (r0, o0, y0, g0, c0, b0, p0, each 2% < 5%) fall below the threshold.
Children: 44_key_ch1 = [W: 100%], 1_key_ch2 = [B: 100%], 1_key_ch3 = [G: 100%], 1_key_ch4 = [y0: 100%], 1_key_ch5 = [b0: 100%], 1_key_ch6 = [r0: 100%]
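The pruning behind the favor-of-the-leaves example above can be sketched in one function. Representing a node's key as a dict of percentage shares, and the function name, are assumptions made purely for illustration.

```python
def prune_key(key, threshold=0.05):
    """Keep only the key components whose share meets the threshold (5% on the slide)."""
    return {comp: share for comp, share in key.items() if share >= threshold}
```

With this rule, minor components such as the 2% hue bins in key_root above are dropped, while children whose keys are dominated by a single component keep it at 100%.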
Experiments (1)
- Six areas, with one reference tree for each area.
- Five navigation trials in each area.
- Three types of comparison were introduced: favor of the root, favor of the leaves, and neutral.
Experiments (2): KTH-IDOL2 dataset [Pronobis 06]
- Five rooms and three illumination conditions: cloudy, night, and sunny.
- Four navigation trials under each condition: three were used to create the reference CF trees, and the fourth was used to create the navigation trees.
KTH-IDOL2 Results (2)
[Figure: three bar charts (accuracy 0–80%), one per training condition (cloudy, night, sunny), each evaluated under cloudy, night, and sunny test conditions; legend entries are CAMML (a Bayesian network [Rubio 14]), NBM (Naive Bayes Method), and Filter.]
Computation time (our platform)
- PC running 32-bit Ubuntu 12.04.
- Intel Core i7 920 CPU at 2.67 GHz.
- 11.8 GB RAM.
- Feature extraction: t_f = 29 ms per frame.
Comparing the navigation tree 𝒯 (P paths) with the reference tree 𝒮 (Q paths) costs t_c = t_δ·P·Q.
With t_δ = 0.031 ms and P = Q = 60: t_c = 111.56 ms.
Processing frames of size 320×240 at 15 fps costs t_n = t_f × 15 = 435 ms.
Total: t_tot = t_n + t_c = 546.56 ms.
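The totals above are plain arithmetic, which the short check below reproduces. It assumes t_c = t_δ·P·Q for the tree comparison and 15 frames of feature extraction per second; with the rounded t_δ = 0.031 ms the product is 111.6 ms, very close to the 111.56 ms on the slide.

```python
t_delta, p, q = 0.031, 60, 60      # ms per path pair, number of paths
t_c = t_delta * p * q              # tree comparison: ~111.6 ms
t_n = 29 * 15                      # 29 ms/frame at 15 fps: 435 ms
t_tot = t_n + t_c                  # ~546.6 ms in total
```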
Contributions
• Extended the discovery robot of [Suzuki 12] to our localization problem.
• Introduced a new measure of CF-tree similarity based on ACS.
• Found that color-based features were not stable under different illumination conditions.
Future work
• We plan to investigate features that are more robust to changes in the environment due to illumination etc. (e.g., SIFT, SURF, WI-SURF, HOG).