Atlas Encoding by Randomized Forests for Efficient Label Propagation
Darko Zikic, Ben Glocker, Antonio Criminisi
Microsoft Research Cambridge
Atlas Forests – Main Idea
Within the context of multi-atlas label propagation (MALP): encode a single atlas by training a corresponding atlas-specific classifier.
→ A lightweight MALP framework capable of efficient label propagation with promising accuracy.
Background: Multi-atlas Label Propagation (MALP)
Multi-atlas Label Propagation (MALP): a special class of segmentation methods, standard for brain labelling, which uses a database of atlases.
[Figure: atlases 1…N, target image, and target labelling]
[Figure: MALP pipeline with atlases 1…N, target image, and target labelling]
1. Registration of all N atlases to the target image
2. Fusion of the warped atlas labels into a final labelling, and post-processing
Multi‐atlas Label Propagation (MALP)
MALP:
• deformable registration
• one-to-one correspondence
• encoding: single point/patch
[Rohlfing et al., NeuroImage, 2004] [Warfield et al., TMI, 2004] [Heckemann et al., NeuroImage, 2006] [Aljabar et al., NeuroImage, 2009] and many others…

Patch-based MALP:
• affine registration
• one-to-many correspondence
• encoding: collection of patches
[Rousseau et al., TMI, 2011] [Coupe et al., NeuroImage, 2011] …
Method
Atlas Forests Framework
Encoding: train an individual classification forest per atlas
• atlas 1 … atlas N → forest training → Atlas Forest 1 … Atlas Forest N
• probabilistic atlas (mean image, aggregate label probs., e.g. left/right, inner/outer, upper/lower) registered to the individual atlases and used for spatial context

Target Labelling: testing
• probabilistic atlas registered to the target
• testing of the target on the individual AFs → AF-1 probs. … AF-N probs.
• fusion (currently by averaging of probabilities) → average probs. → target labelling

Note 1: only one registration per target labelling
Note 2: a standard forest scheme would learn one forest from all data
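The framework above can be sketched in a few lines with scikit-learn forests standing in for the paper's context-aware forests. All names (`AtlasForests`, `add_atlas`) and the toy feature vectors are illustrative, not from the paper's implementation; it assumes all atlases share the same label set so that the probability columns align for averaging:

```python
# Sketch: one randomized forest per atlas, fused by averaging probabilities.
# Samples play the role of voxels; features stand in for the paper's
# context-aware features. Assumes every atlas contains the same label set,
# so predict_proba columns align across forests.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

class AtlasForests:
    def __init__(self, n_trees=5, max_depth=36):
        self.forests = []              # one trained forest per atlas
        self.n_trees = n_trees
        self.max_depth = max_depth

    def add_atlas(self, features, labels):
        """Encode one atlas by training an atlas-specific forest."""
        f = RandomForestClassifier(n_estimators=self.n_trees,
                                   max_depth=self.max_depth, random_state=0)
        f.fit(features, labels)
        self.forests.append(f)

    def predict(self, features):
        """Label a target: test on every atlas forest, average probabilities."""
        probs = np.mean([f.predict_proba(features) for f in self.forests], axis=0)
        return probs.argmax(axis=1)
```

Note that fusion happens only at test time; each forest is trained independently, which is what makes atlas selection and addition cheap later.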
Method Properties
1. Using a classifier as an encoder
2. Data representation
3. Relation to:
   • other MALP frameworks
   • standard forest scheme
Using a Classifier as an Atlas Encoder
Training: Atlas Forest 1 is trained on the intensity image and label map of atlas 1.

Testing on the intensity image approximates the label map: testing Atlas Forest 1 on the intensity image of atlas 1 yields AF-1 probs. ≈ the atlas label map. Specializing the classifier to its training scan (i.e. overtraining) improves the approximation.

→ Overtraining the classifier can be used for encoding.
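The encoding-by-overtraining idea can be demonstrated on toy data: a forest deliberately specialized to a single "scan" reproduces that scan's label map almost exactly when tested on its own intensities. The data and feature choices below (intensity plus coordinates, three label bands) are illustrative assumptions, not the paper's setup:

```python
# Sketch: overtraining a forest on a single atlas makes its predictions on
# that atlas's own intensity image approximate the label map (classifier as
# encoder). Toy 2-D "scan" with three tissue bands.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)
coords = np.stack(np.meshgrid(np.arange(32), np.arange(32)), -1).reshape(-1, 2)
labels = (coords[:, 0] // 11) % 3                 # three label bands
intensity = labels + rng.normal(0, 0.2, len(labels))
features = np.column_stack([intensity, coords])   # intensity + position

forest = RandomForestClassifier(n_estimators=5, max_depth=36, random_state=0)
forest.fit(features, labels)          # deliberately specialized to this scan

approx = forest.predict(features)     # "testing on the training scan"
print((approx == labels).mean())      # close to 1.0: the label map is encoded
```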
Trees with Context‐aware Features→ Learned Variable Encoding across Image Domain
[Figure: target image and aligned probabilistic atlas]
Each point is described by a different chain of features, depending on its appearance.
Relation to Other MALP Frameworks
Properties                  MALP                         Patch-based MALP             Atlas Forests
Atlas encoding              local intensity              collection of patches        variable context-aware features, implicit spatial context
Registrations per target    N (all atlases to target)    N (all atlases to target)    1 (probabilistic atlas to target)
Registration type           deformable                   affine                       affine/deformable
Training required           no                           no                           yes
Prob. atlas required        no                           no                           yes

Why is this good?
• more efficient training possible (several small forests instead of one big forest)
• simple atlas selection
• simple atlas addition
Compared to “Standard” Forests for MALP
Standard Forest: sampling from all scans
Atlas Forests: scan-specific sampling
1. Atlas Selection
Standard Forest: principled modification of a trained forest is not obvious → requires retraining.
Atlas Forests: atlas selection is trivial: keep only the corresponding forests.
2. Addition of New Atlases
Standard Forest: principled treatment requires re-training.
Atlas Forests: atlas addition is trivial: train a single new atlas forest.
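Because each atlas is encoded by its own forest, both operations reduce to list manipulation. A minimal sketch, with `train_forest`, `atlas_db` and the toy data as illustrative assumptions:

```python
# Sketch: atlas selection = keeping a subset of forests (no retraining);
# atlas addition = training one new forest, leaving the rest untouched.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def train_forest(X, y):
    """Encode one atlas as its own forest."""
    return RandomForestClassifier(n_estimators=5, max_depth=36,
                                  random_state=0).fit(X, y)

rng = np.random.default_rng(0)
atlas_db = [(rng.normal(size=(50, 3)), rng.integers(0, 2, 50)) for _ in range(3)]

forests = [train_forest(X, y) for X, y in atlas_db]   # one forest per atlas

# Atlas selection: keep only the forests of the selected atlases.
selected = [forests[i] for i in (0, 2)]

# Atlas addition: train a single new forest; existing forests are untouched.
X_new, y_new = rng.normal(size=(50, 3)), rng.integers(0, 2, 50)
forests.append(train_forest(X_new, y_new))
```

A standard forest trained on pooled samples from all atlases offers neither shortcut: changing the atlas set means retraining from scratch.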
Evaluation
Datasets & Settings
Pre-processing steps:
• skull-stripping
• inhomogeneity correction
• histogram matching

Relevant settings (only a few deep trees per AF):
• trees per atlas forest = 5
• max. depth = 36
• min. samples per leaf = 8

Global timings:
• training of 1 tree: up to 36 minutes
• registration of prob. atlas: 30 seconds

Datasets:
• IBSR (used for development): 18 atlases, 32 labels
• LPBA40: 40 atlases, 54 labels
• MICCAI 2012 Multi-Atlas Labeling Challenge: OASIS data, 15 training / 20 testing, 134 labels (98 cortical, 36 non-cortical)
Results on IBSR (18 atlases, 32 labels, leave-1-out validation)

              [Rousseau] Fast Multi-point    [Rousseau] Group-wise MP    Atlas Forests
DSC           82.25%                         83.5%                       84.60%
Time [min]    22                             130                         3

AF variation: no use of probabilistic atlas                   77.38%
AF variation: affine registration (instead of non-rigid)      82.71%
Standard forest bagging                                       84.08%

[Figure: manual reference segmentation vs. Atlas Forests vs. Atlas Forests without priors; contribution of prob. atlas ≈ 7% DSC]

[Rousseau] A Supervised Patch-Based Approach for Human Brain Labeling. Rousseau, Habas, Studholme. IEEE TMI 2011
Results on LPBA40 (40 atlases, 54 labels, leave-1-out cross-validation)

              [Wu] PBL          [Wu] SPBL         [Wu] SCPBL        Atlas Forests
DSC           75.06%            76.46%            78.04%            77.46%
Time [min]    10 (per label)    28 (per label)    45 (per label)    8 (for all labels)

[Wu] Robust patch-based multi-atlas labelling by joint sparsity regularization. Wu, G., Wang, Q., Zhang, D., Shen, D. In: MICCAI Workshop STMI (2012)
• PBL = Patch-Based Labelling
• SPBL = Sparse-only PBL
• SCPBL = Spatially Consistent PBL
Results on Data from the MICCAI 2012 Multi-Atlas Labeling Challenge (MALC)
OASIS data, 134 labels (98 cortical, 36 non-cortical), 15/20 train/test

                   PICSL-BC (1st at MALC)    Atlas Forests
DSC                76.54%                    73.66% (-2.88%)
DSC cortical       73.88%                    71.04% (-2.94%)
DSC non-cortical   83.77%                    80.81% (-2.96%)
Time               "computation time for registering each pair of images is about 20 hours"; the fusion "finishes processing one brain image in about three hours"    4 min

PICSL-BC: clearly higher accuracy, more sophisticated pipeline. Atlas Forests: high efficiency.

[PICSL-BC] Wang, H., Avants, B., Yushkevich, P. A combined joint label fusion and corrective learning approach. In: MICCAI Workshop on Multi-Atlas Labeling (2012)
Summary
Use of random forest classifiers to encode atlases in MALP:
• high efficiency
• good accuracy
• interesting use of classifiers as encoders

Compared to other MALP frameworks:
• efficient labelling and experimentation
• variable data representation
• requires training and a probabilistic atlas

Compared to the standard forest scheme:
• keeps the benefits of MALP for atlas selection and addition
• efficient training and experimentation

Future work:
• framework-specific fusion and atlas selection
• explore applicability to other problems
thank you
Context‐aware Features
• Deterministic features: local intensity of the image and of the priors at the point
• Randomized, context-aware feature types, with parameters s, r, u, v:
  1. Local cuboid mean intensity
  2. Difference of local intensity and an offset cuboid intensity mean
  3. Difference of local and offset cuboid intensity means
  4. Difference of two offset cuboid intensity means
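The cuboid-mean feature family can be evaluated in constant time per query with an integral image. A minimal 2-D sketch (the paper works with 3-D cuboids; the function names and parameters `offset`, `size` are illustrative):

```python
# Sketch: cuboid (here: box) mean-intensity features via an integral image.
# 2-D toy version of the 3-D cuboid features; O(1) per query.
import numpy as np

def integral_image(img):
    """ii[x, y] = sum of img[0..x, 0..y]."""
    return img.cumsum(0).cumsum(1)

def cuboid_mean(ii, x, y, offset=(0, 0), size=(3, 3)):
    """Mean intensity of the box of `size` centred at (x, y) + `offset`."""
    cx, cy = x + offset[0], y + offset[1]
    x0, x1 = cx - size[0] // 2, cx + size[0] // 2 + 1
    y0, y1 = cy - size[1] // 2, cy + size[1] // 2 + 1
    # inclusion-exclusion on the integral image
    s = ii[x1 - 1, y1 - 1]
    if x0 > 0: s -= ii[x0 - 1, y1 - 1]
    if y0 > 0: s -= ii[x1 - 1, y0 - 1]
    if x0 > 0 and y0 > 0: s += ii[x0 - 1, y0 - 1]
    return s / (size[0] * size[1])

img = np.arange(100.0).reshape(10, 10)
ii = integral_image(img)
f1 = cuboid_mean(ii, 5, 5)                                     # type 1: local mean
f2 = cuboid_mean(ii, 5, 5) - cuboid_mean(ii, 5, 5, (2, 1))     # type 3/4 style
```

Randomizing `offset` and `size` per split node is what produces a different chain of features per image point.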
Comparison: Problem Size at Training
Standard Forest (sampling from all datasets): all data required at training time.
Atlas Forests (scan-specific sampling): only one scan required for each AF
→ less data for each training
→ trivial parallelization
Efficiency for large data sets
Patch-based MALP

for each point x in the target:
    for each y within the search region S(x) in a registered atlas intensity image:
        compute the similarity between patches P(x) and P(y)
        assign to each label at y a probability according to this similarity
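The loop above can be sketched as a toy 1-D implementation, in which each target point accumulates label votes from nearby atlas points weighted by Gaussian patch similarity. The function name and the parameters `radius` (patch half-width) and `search` (search-region half-width) are illustrative assumptions:

```python
# Toy 1-D sketch of patch-based label propagation: for every target point,
# nearby atlas points vote for their label with a weight given by the
# similarity of their intensity patches.
import numpy as np

def patch_based_labels(target, atlas_img, atlas_lab, radius=2, search=3):
    n = len(target)
    labels_out = np.empty(n, dtype=atlas_lab.dtype)
    pad_t = np.pad(target, radius, mode='edge')
    pad_a = np.pad(atlas_img, radius, mode='edge')
    for x in range(n):
        Px = pad_t[x:x + 2 * radius + 1]          # patch P(x) in the target
        weights = {}
        for y in range(max(0, x - search), min(n, x + search + 1)):
            Py = pad_a[y:y + 2 * radius + 1]      # patch P(y) in the atlas
            w = np.exp(-np.sum((Px - Py) ** 2))   # similarity -> vote weight
            weights[atlas_lab[y]] = weights.get(atlas_lab[y], 0.0) + w
        labels_out[x] = max(weights, key=weights.get)   # most probable label
    return labels_out
```

With multiple atlases, the per-atlas votes would be pooled before taking the argmax; the one-to-many correspondence comes from the search region S(x).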
Why is this strategy applicable to MALP? Intuition: since the scans are similar, a meaningful random sample set drawn from all scans will be similar to one drawn from a single scan.

Why is this advantageous?
• simple atlas selection
• simple atlas addition
• more efficient training possible
Compared to “Standard” Forests for MALP
Standard Forest: sampling from all scans
Atlas Forests: scan-specific sampling
Comparison: Atlas Selection
Standard Forest: principled modification of a trained forest is not obvious.
Atlas Forests: atlas selection is trivial: keep only the corresponding forests.

Note: validation with K folds equals K atlas selections.
→ Standard Forests: training of K forests
→ Atlas Forests: each AF trained only once
→ efficient experimentation
Comparison: Addition of New Atlases
Standard Forest: principled treatment requires re-training.
Atlas Forests: atlas addition is trivial: train a single new atlas forest.
©2013 Microsoft Corporation. All rights reserved.