M1 gp用 20120511ver2

Date post: 16-Dec-2014
Category:
Upload: keijro
View: 387 times
Download: 1 times
Share this document with a friend
Description:
 
Popular Tags:
52
PBN: Towards Practical Activity Recognition Using Smartphone- based Body Sensor Networks The real authors …Matt Keally, Gang Zhou, Guoliang Xing 1 , Jianxin Wu 2 , and Andrew Pyles College of William and Mary, 1 Michigan State University, 2 Nanyang Technological University Nakagawa Keijiro The University of Tokyo, Sezaki Lab Sensys2011
Transcript
Page 1: M1 gp用 20120511ver2

PBN: Towards Practical Activity Recognition Using Smartphone-based Body Sensor Networks

Original authors: Matt Keally, Gang Zhou, Guoliang Xing1, Jianxin Wu2, and Andrew Pyles

College of William and Mary, 1Michigan State University, 2Nanyang Technological University

Presenter: Nakagawa Keijiro, The University of Tokyo, Sezaki Lab

SenSys 2011

Page 2: M1 gp用 20120511ver2

Outline

2

- Related Work
- Proposal
- Abstract
- Experimentation
- The most important point
  - AdaBoost
  - Sensor Selection
- Results
- Future

Page 3: M1 gp用 20120511ver2

Related Work

Body Sensor Networks
- Athletic Performance
- Health Care
- Activity Recognition

Page 4: M1 gp用 20120511ver2

Related Work
- No mobile or on-body aggregator: (Ganti, MobiSys ‘06), (Lorincz, SenSys ‘09), (Zappi, EWSN ‘08)
- Use of backend servers: (Miluzzo, MobiSys ‘10), (Miluzzo, SenSys ‘08)
- Single sensor modality or separate classifier per modality: (Azizyan, MobiCom ‘09), (Kim, SenSys ‘10), (Lu, SenSys ‘10)
- Do not provide online training: (Wang, MobiSys ‘09), (Rachuri, UbiComp ‘10)

Page 5: M1 gp用 20120511ver2

Proposal

PBN: Practical Body Networking

Page 6: M1 gp用 20120511ver2

Abstract

Hardware: TinyOS motes + Android phone (connected via USB)
Software: Android application for activity recognition

Page 7: M1 gp用 20120511ver2

Abstract

[PBN architecture diagram: on-body sensor nodes (sensors, sample controller, local aggregator) communicate over 802.15.4 with a base station node, which is attached to the phone over USB. On the phone, the TinyOS comm. stack and GUI feed aggregated data to Sensor Selection, Ground Truth Management, Activity Classification, and Retraining Detection; these components exchange labeled training data, input sensor lists, activity probabilities, activity decisions, and ground-truth requests.]

Page 8: M1 gp用 20120511ver2

Experimentation
- 2 subjects, 2 weeks
- Android phone: 3-axis accelerometer, WiFi/GPS localization
- 5 IRIS sensor motes: 2-axis accelerometer, light, temperature, acoustic, RSSI

Node ID  Location
0        BS/Phone
1        L. Wrist
2        R. Wrist
3        L. Ankle
4        R. Ankle
5        Head

Page 9: M1 gp用 20120511ver2

Abstract

[PBN architecture diagram, as on Page 7, with the Sensor Selection component highlighted.]

Page 10: M1 gp用 20120511ver2

The most important point

AdaBoost (Activity Classification)

Sensor Selection

Page 11: M1 gp用 20120511ver2

AdaBoost

Ensemble learning: AdaBoost.M2 (Freund, JCSS ‘97)
- Lightweight and accurate
- Maximizes training accuracy across all activities
- Many other classifiers (GMM, HMM) are more computationally demanding

Iteratively train an ensemble of weak classifiers
- Training observations are weighted by misclassifications
- At each iteration:
  - Train a Naïve Bayes weak classifier for each sensor
  - Choose the classifier with the least weighted error
  - Update the observation weights

The ensemble makes decisions from the weighted votes of its weak classifiers (a minimal sketch follows).
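A Python sketch of this loop, assuming `obs[sensor]` holds an (n_obs, n_features) array per sensor and `labels` the activity ids (illustrative names, not the paper's code). It uses Gaussian Naïve Bayes weak learners and a simplified SAMME-style weight update rather than the exact AdaBoost.M2 pseudo-loss:

```python
# Simplified boosting over per-sensor weak classifiers (SAMME-style update,
# not the full AdaBoost.M2 pseudo-loss used in the paper).
import numpy as np
from sklearn.naive_bayes import GaussianNB

def train_ensemble(obs, labels, n_iters=50):
    n = len(labels)
    n_classes = len(set(labels))
    w = np.full(n, 1.0 / n)                  # observation weights D_t
    ensemble = []                            # (alpha, sensor, classifier)
    for _ in range(n_iters):
        best = None
        for sensor, X in obs.items():        # one weak learner per sensor
            clf = GaussianNB().fit(X, labels, sample_weight=w)
            miss = clf.predict(X) != labels
            err = float(np.dot(w, miss))
            if best is None or err < best[0]:
                best = (err, sensor, clf, miss)
        err, sensor, clf, miss = best
        err = min(max(err, 1e-10), 1 - 1e-10)
        alpha = np.log((1 - err) / err) + np.log(n_classes - 1)
        w *= np.exp(alpha * miss)            # up-weight misclassified observations
        w /= w.sum()
        ensemble.append((alpha, sensor, clf))
    return ensemble

def predict(ensemble, obs_one):
    # obs_one[sensor]: a single (1, n_features) observation per sensor
    votes = {}
    for alpha, sensor, clf in ensemble:
        y = clf.predict(obs_one[sensor])[0]
        votes[y] = votes.get(y, 0.0) + alpha  # weighted vote
    return max(votes, key=votes.get)
```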

Page 12: M1 gp用 20120511ver2

Sensor Selection
- Identify both helpful and redundant sensors
- Train fewer weak classifiers per AdaBoost iteration
- Bonus: use even fewer sensors

[Figure: raw data correlation across sensors.]

Page 13: M1 gp用 20120511ver2

Sensor Selection
Goal: determine the sensors that AdaBoost chooses, using correlation.
- Find the correlation of each pair of sensors selected by AdaBoost
- Use the average correlation as a threshold for choosing sensors

[Figure: the AdaBoost-selected sensors (2,MIC; 1,ACC; 3,MIC; 3,TEMP; 2,TEMP; 1,LIGHT) shown against the full sensor set, each marked Selected or Unused.]

Page 14: M1 gp用 20120511ver2

Sensor Selection (cont.)

[Figure: compute correlation(2,TEMP; 1,LIGHT) for a pair of AdaBoost-selected sensors.]

Page 15: M1 gp用 20120511ver2

Sensor Selection (cont.)

[Figure: compute correlation(3,MIC; 3,TEMP), and likewise for every pair of AdaBoost-selected sensors.]

Page 16: M1 gp用 20120511ver2

Sensor Selection (cont.)
Set the threshold α from the average pairwise correlation of the AdaBoost-selected sensors:

α = μ_corr + σ_corr

[Figure: correlation(2,TEMP; 1,LIGHT), correlation(3,MIC; 3,TEMP), … feeding into the threshold.]

Page 17: M1 gp用 20120511ver2

Sensor Selection (cont.)

[Figure: correlation(1,ACC; 1,LIGHT) ≤ α, so both sensors are kept in the selected set.]

Page 18: M1 gp用 20120511ver2

Sensor Selection (cont.)

[Figure: correlation(2,TEMP; 1,ACC) > α and acc(2,TEMP) > acc(1,ACC), so 2,TEMP is kept and 1,ACC is excluded as redundant.]

Page 19: M1 gp用 20120511ver2

Sensor Selection (cont.)

[Figure: correlation(1,ACC; 3,TEMP) ≤ α, so 3,TEMP is added to the selected set (1,ACC remains excluded).]

Page 20: M1 gp用 20120511ver2

Sensor Selection (cont.)

[Figure: final result — selected sensors: 2,MIC; 3,MIC; 2,TEMP; 3,TEMP; 1,LIGHT; excluded as redundant: 1,ACC.]

Page 21: M1 gp用 20120511ver2

Results
By excluding redundant sensors, the proposed "AdaBoost+SS" approach improves accuracy by roughly 15% over the other approaches.

Page 22: M1 gp用 20120511ver2

Conclusion

- PBN: towards practical BSN daily activity recognition
- PBN provides:
  - Strong classification performance
  - Retraining detection to reduce invasiveness
  - Identification of redundant resources
- Future Work:
  - Extensive usability study
  - Improve phone energy usage

Page 23: M1 gp用 20120511ver2

23

Thank you for listening

Page 24: M1 gp用 20120511ver2

Appendix

Page 25: M1 gp用 20120511ver2

Kullback–Leibler divergence

A measure, in probability theory and information theory, of how one probability distribution differs from another.

For discrete probability distributions P and Q, the Kullback–Leibler divergence of Q from P is defined as

D_KL(P || Q) = Σ_i P(i) log( P(i) / Q(i) )

where P(i) and Q(i) are the probabilities that a value drawn from P or Q, respectively, equals i.
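A minimal Python sketch of this definition for two discrete distributions (e.g., histograms of sensor readings); the small epsilon that avoids log-of-zero on empty bins is an implementation choice, not part of the definition:

```python
# Discrete K-L divergence between two distributions given as probability
# (or count) arrays over the same bins.
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

# Example: two histograms over the same three bins
print(kl_divergence([0.6, 0.3, 0.1], [0.4, 0.4, 0.2]))
```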

Page 26: M1 gp用 20120511ver2

AdaBoost Training Algorithm

Input: max iterations T, training observation vector O_j for each sensor s_j ∈ S, observation ground truth labels
Output: set of weak classifiers H
Initialize observation weights D_1 to 1/n for all observations
for t = 1 to T do
    for sensor s_j ∈ S do
        Train weak classifier h_t,j using observations O_j and weights D_t
        Get weighted error ε_t,j for h_t,j using the labels [8]
    end for
    Add the h_t,j with the least error ε_t to H
    Set D_t+1 from D_t using the misclassifications made by h_t [8]
end for

Page 27: M1 gp用 20120511ver2

Raw Correlation Threshold for Sensor Selection

Input: set of sensors S selected by AdaBoost, training observations O for all sensors, multiplier n
Output: sensor selection threshold α
R = ∅   // set of correlation coefficients
for all combinations of sensors s_i and s_j in S do
    Compute correlation coefficient r = |r_{O_i,O_j}|
    R = R ∪ {r}
end for
// compute threshold as avg + (n * std. dev.) of R
α = μ_R + n·σ_R
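A Python sketch of this threshold computation, assuming `obs[sensor]` is a 1-D array of that sensor's training samples, all of equal length (illustrative names; the slide on Page 16 corresponds to n = 1):

```python
# alpha = mean + n * std of |pairwise correlation| over the
# AdaBoost-selected sensors.
import itertools
import numpy as np

def correlation_threshold(obs, selected, n=1.0):
    coeffs = [abs(np.corrcoef(obs[si], obs[sj])[0, 1])
              for si, sj in itertools.combinations(selected, 2)]
    coeffs = np.array(coeffs)
    return coeffs.mean() + n * coeffs.std()
```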

Page 28: M1 gp用 20120511ver2

Sensor Selection Using Raw Correlation
Input: set of all sensors S, training observations O for all sensors, threshold α
Output: selected sensors S* to give as input to AdaBoost
S* = ∅
E = ∅   // set of sensors we exclude
for all combinations of sensors s_i and s_j in S do
    Compute correlation coefficient r = |r_{O_i,O_j}|
    if r < α then
        if s_i ∉ E then S* = S* ∪ {s_i}
        if s_j ∉ E then S* = S* ∪ {s_j}
    else if r ≥ α and acc(s_i) > acc(s_j) then
        // use accuracy to decide which to add to S*
        if s_i ∉ E then S* = S* ∪ {s_i}
        E = E ∪ {s_j}; S* = S* \ {s_j}
    else
        if s_j ∉ E then S* = S* ∪ {s_j}
        E = E ∪ {s_i}; S* = S* \ {s_i}
    end if
end for
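A matching Python sketch of the selection step, assuming `acc[sensor]` holds each sensor's individual training accuracy (illustrative names, not the paper's code); the resulting set would then be handed to AdaBoost as its input sensors:

```python
# Greedy correlation-based selection: keep weakly correlated sensors,
# and among strongly correlated pairs keep only the more accurate one.
import itertools
import numpy as np

def select_sensors(obs, acc, sensors, alpha):
    selected, excluded = set(), set()
    for si, sj in itertools.combinations(sensors, 2):
        r = abs(np.corrcoef(obs[si], obs[sj])[0, 1])
        if r < alpha:
            if si not in excluded:
                selected.add(si)
            if sj not in excluded:
                selected.add(sj)
        else:
            keep, drop = (si, sj) if acc[si] > acc[sj] else (sj, si)
            if keep not in excluded:
                selected.add(keep)
            excluded.add(drop)
            selected.discard(drop)
    return selected
```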

Page 29: M1 gp用 20120511ver2

Personal Sensing Applications
- Body Sensor Networks
  - Athletic Performance
  - Health Care
  - Activity Recognition

[Figure: example body sensor network with a pulse oximeter, a heart rate monitor, and a mobile phone aggregator.]

29

Page 30: M1 gp用 20120511ver2

A Practical Solution to Activity Recognition

- Portable
- Entirely user controlled
- Computationally lightweight
- Accurate

30

On-body sensors: + sensing accuracy, + energy efficiency
Phone: + user interface, + computational power, + additional sensors

Page 31: M1 gp用 20120511ver2

Challenges to Practical Activity Recognition

31

- User-friendly
  - Hardware configuration
  - Software configuration
- Accurate classification
  - Classify difficult activities in the presence of dynamics
- Efficient classification
  - Computation and energy efficiency
- Less reliance on ground truth
  - Labeling sensor data is invasive

Page 32: M1 gp用 20120511ver2

PBN: Practical Body Networking

32

- TinyOS-based motes + Android phone
- Lightweight activity recognition appropriate for motes and phones
- Retraining detection to reduce invasiveness
- Identify redundant sensors to reduce training costs
- Classify difficult activities with nearly 90% accuracy

Page 33: M1 gp用 20120511ver2

Hardware: TinyOS + Android

33

- IRIS on-body motes, TelosB base station, G1 phone
- Enable USB host mode support in the Android kernel
- Android device manager modifications
- TinyOS JNI compiled for Android

Page 34: M1 gp用 20120511ver2

Software: Android Application

34

[Screenshots: sensor configuration, runtime control and feedback, ground truth logging.]

Page 35: M1 gp用 20120511ver2

Data Collection Setup

35

- Classify typical daily activities, postures, and environment
- Previous work (Lester et al.) identifies some activities as hard to classify
- Classification categories:

Environment: Indoors, Outdoors
Posture: Cycling, Lying Down, Sitting, Standing, Walking
Activity: Cleaning, Cycling, Driving, Eating, Meeting, Reading, Walking, Watching TV, Working

Page 36: M1 gp用 20120511ver2

PBN Architecture

[PBN architecture diagram, as on Page 7.]

Page 37: M1 gp用 20120511ver2

Retraining Detection

37

- Body sensor network dynamics affect accuracy at runtime:
  - Changing physical location
  - User biomechanics
  - Variable sensor orientation
  - Background noise
- How can we detect that retraining is needed without asking for ground truth?
  - Constantly nagging the user for ground truth is annoying
  - Perform with limited initial training data
  - Maintain high accuracy

Page 38: M1 gp用 20120511ver2

Retraining Detection

38

- Measure the discriminative power of each sensor: K-L divergence
  - Quantify the difference between sensor reading distributions
- Retraining detection with K-L divergence:
  - Compare training data to runtime data for each sensor

[Figure: per-sensor comparison of training and runtime reading distributions.]

Page 39: M1 gp用 20120511ver2

Retraining Detection

39

- Training
  - Compute a "one vs. rest" K-L divergence for each sensor and activity

For each sensor, partition the labeled training data by activity and compute, e.g.,

D_KL(T_walking, T_other)

i.e. the divergence of the Walking training-data distribution from the training-data distribution of all other activities ({Driving, Working} in the example), as sketched below.

[Figure: training data for sensors 1,LIGHT; 1,ACC; 2,MIC with ground truth Walking / Driving / Working, partitioned into Walking vs. the rest.]
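A sketch of this training-side computation, reusing `kl_divergence` from the Page 25 sketch; the layout `train_sensor[activity]` as a 1-D array of readings is an assumption for illustration:

```python
# "One vs. rest" divergence for a single sensor: this activity's training
# readings vs. the pooled readings of all other activities.
import numpy as np

def one_vs_rest_divergence(train_sensor, activity, bins=20):
    target = np.asarray(train_sensor[activity])
    rest = np.concatenate([np.asarray(v) for a, v in train_sensor.items()
                           if a != activity])
    lo = min(target.min(), rest.min())
    hi = max(target.max(), rest.max())
    if hi <= lo:                      # degenerate case: constant readings
        hi = lo + 1.0
    p, _ = np.histogram(target, bins=bins, range=(lo, hi))
    q, _ = np.histogram(rest, bins=bins, range=(lo, hi))
    return kl_divergence(p, q)        # D_KL(T_activity, T_other)
```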

Page 40: M1 gp用 20120511ver2

Retraining Detection

40

- Runtime
  - At each interval, each sensor compares its runtime data to its training data for the currently classified activity

For each sensor, with Walking as the current AdaBoost-classified activity, compute

D_KL(R_walking, T_walking)

i.e. the divergence of the Walking runtime-data distribution from the Walking training-data distribution.

[Figure: runtime data for sensors 1,LIGHT; 1,ACC; 2,MIC compared against the Walking training data.]

Page 41: M1 gp用 20120511ver2

Retraining Detection

41

- Runtime
  - At each interval, each sensor compares its runtime data to its training data for the currently classified activity
  - An individual sensor determines that retraining is needed when

D_KL(R_walking, T_walking)  >  D_KL(T_walking, T_other)

i.e. when the intra-activity divergence (Walking runtime data vs. Walking training data) exceeds the inter-activity divergence (Walking training data vs. the {Driving, Working} training data).

Page 42: M1 gp用 20120511ver2

Retraining Detection

42

- Runtime
  - At each interval, each sensor compares its runtime data to its training data for the currently classified activity
  - Each individual sensor determines whether retraining is needed
  - The ensemble retrains when a weighted majority of sensors demand retraining (see the sketch below)
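A sketch of this decision rule, again reusing `kl_divergence` from the Page 25 sketch; the per-sensor histograms and the use of each sensor's ensemble weight for the majority vote are assumptions for illustration:

```python
# Per-sensor retraining check and weighted-majority ensemble decision.
def sensor_wants_retraining(runtime_hist, train_hist, train_other_hist):
    intra = kl_divergence(runtime_hist, train_hist)        # runtime vs. training
    inter = kl_divergence(train_hist, train_other_hist)    # activity vs. the rest
    return intra > inter

def ensemble_needs_retraining(sensors, runtime, train, train_other, weight):
    votes = sum(weight[s] for s in sensors
                if sensor_wants_retraining(runtime[s], train[s], train_other[s]))
    return votes > 0.5 * sum(weight[s] for s in sensors)   # weighted majority
```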

Page 43: M1 gp用 20120511ver2

Ground Truth Management

43

- Retraining: how much new labeled data to collect?
  - Capture changes in BSN dynamics
  - Too much labeling is intrusive
- Balance the number of observations per activity
  - Loose balance hurts classification accuracy
  - Restrictive balance prevents adding new data
- Balance multiplier
  - Each activity has no more than δ times the average number of observations
  - Balance enforcement: random replacement (see the sketch below)
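A sketch of the balance rule, with `data[activity]` as a list of labeled observations and `delta` standing in for the balance multiplier δ (the default value here is illustrative, not the paper's setting):

```python
# Keep each activity's labeled data within delta times the average count;
# once at the cap, new observations randomly replace old ones.
import random

def add_with_balance(data, activity, new_obs, delta=1.5):
    data.setdefault(activity, [])
    avg = sum(len(v) for v in data.values()) / len(data)
    if not data[activity] or len(data[activity]) < delta * avg:
        data[activity].append(new_obs)
    else:
        idx = random.randrange(len(data[activity]))
        data[activity][idx] = new_obs     # random replacement
```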

Page 44: M1 gp用 20120511ver2

Sensor Selection

44

- AdaBoost training can be computationally demanding
  - Train a weak classifier for each sensor at each iteration
  - > 100 iterations to achieve maximum accuracy
- Can we give only the most helpful sensors to AdaBoost?
  - Identify both helpful and redundant sensors
  - Train fewer weak classifiers per AdaBoost iteration
  - Bonus: use even fewer sensors

Page 45: M1 gp用 20120511ver2

Sensor Selection

45

Choosing sensors with slight correlation yields the highest accuracy

Page 46: M1 gp用 20120511ver2

Evaluation Setup

46

- Classify typical daily activities, postures, and environment
- 2 subjects over 2 weeks
- Classification categories:

Environment: Indoors, Outdoors
Posture: Cycling, Lying Down, Sitting, Standing, Walking
Activity: Cleaning, Cycling, Driving, Eating, Meeting, Reading, Walking, Watching TV, Working

Page 47: M1 gp用 20120511ver2

Classification Performance

47

Page 48: M1 gp用 20120511ver2

Classification Performance

48

Page 49: M1 gp用 20120511ver2

Classification Performance

49

Page 50: M1 gp用 20120511ver2

Retraining Performance

50

Page 51: M1 gp用 20120511ver2

Sensor Selection Performance

51

Page 52: M1 gp用 20120511ver2

Application Performance

52

- Power Benchmarks
- Training Overhead
- Battery Life

Mode                        CPU    Memory   Power
Idle (No PBN)               <1%    4.30 MB  360.59 mW
Sampling (WiFi)             19%    8.16 MB  517.74 mW
Sampling (GPS)              21%    8.47 MB  711.74 mW
Sampling (WiFi) + Train     100%   9.48 MB  601.02 mW
Sampling (WiFi) + Classify  21%    9.45 MB  513.57 mW

