Machine Learning for Cities CUSP-GX 5003.002, Spring 2020


Machine Learning for Cities CUSP-GX 5003.002, Spring 2020

Lecture 10: Trajectory Prediction with Hidden Markov Model

Instructor: Sheng Wang (swang@nyu.edu)

Website: https://wp.nyu.edu/ml4c2020/week-12/

1

Outline

• Trajectory prediction
  – For humans
  – For vehicles
  – Common definition
  – kNN as a baseline

• Sequential data models
  – Hidden Markov Model
  – Learning and training
  – Applying to trajectory prediction

2

Motivation

• Trajectory prediction: predict the next location
  – Self-driving
  – Human trajectory prediction in crowded spaces

[Figures: "Path planning", "Central station"]

3

Problem Formulation

• Given a sequence of points, predict the future point

4

kNN-based Method

• Find the most similar trajectories in the dataset

• See where the next point is at the corresponding timestamp

Zaiben Chen, Heng Tao Shen, Xiaofang Zhou, Yu Zheng, Xing Xie: Searching trajectories by locations: an efficiency study. SIGMOD Conference 2010: 255-266

5
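A minimal sketch of this kNN baseline in Python, assuming trajectories are point arrays aligned by timestamp and compared with mean point-wise Euclidean distance (an illustrative helper, not the lecture's code; the cited paper studies more efficient location-based search):

```python
import numpy as np

def knn_predict_next(query, history, k=3):
    """Predict the next point of `query` from the k most similar
    historical trajectories."""
    m = len(query)
    # Only trajectories long enough to have a point after time step m.
    candidates = [t for t in history if len(t) > m]
    # Similarity: mean Euclidean distance over the first m aligned points.
    dists = [np.linalg.norm(t[:m] - query, axis=1).mean() for t in candidates]
    nearest = np.argsort(dists)[:k]
    # Predict the average "next point" of the k nearest trajectories.
    return np.mean([candidates[i][m] for i in nearest], axis=0)
```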

Sequential Data Models

• Given a set of historical observations, predict the future
  – Time-series: stock market, speech, video analysis
  – Ordered: text, genes

• Fully observable formulation: the data is a sequence of coin selections – AAAABBBBAABBBBBBBAAAAABBBBB

6

Hidden Markov Models

• A hidden Markov model (HMM) is a statistical model in which the system being modeled is assumed to be a Markov process (a memoryless process: its future is independent of its past given the present state) with hidden states.

7

Hidden Markov Models

• Has a set of states, each of which has a limited number of transitions and emissions
• Each transition between states has an assigned probability
• Each model starts from a start state and ends in an end state

8

Markov Models

• Let's talk about the weather

• Assume there are three types of weather:

• Sunny

• Rainy

• Foggy

• Weather prediction asks what the weather will be tomorrow
  – Based on observations of the past

9

Markov Models

• Weather on day n is qn ∈ {sunny, rainy, foggy}

• qn depends on the known weather of the past days (qn-1, qn-2, …)

• We want to find:

P(qn | qn-1, qn-2, …, q1)

– i.e., given the past weather, what is the probability of each possible weather today?

10

Markov Models

For example:
• If we knew the weather for the last three days, the probability that tomorrow would be a given weather is:

P(q4 = · | q3 = ·, q2 = ·, q1 = ·)

(the specific weather values were shown as icons on the slide)

11

Markov Models

• Markov Models and Assumption:
  – Therefore, we make a simplifying assumption, the Markov assumption:
  – For a sequence: P(qn | qn-1, qn-2, …, q1) ≈ P(qn | qn-1)
  – The weather of tomorrow only depends on today (first-order Markov model)

12

Markov Models

13

Markov Models

Markov Models and Assumption:

Examples:

14

Markov Models

• Markov Models and Assumption:
• Examples:

• If the weather yesterday was rainy and today is foggy, what is the probability that tomorrow it will be sunny?

P(q3 = sunny | q2 = foggy, q1 = rainy) = P(q3 = sunny | q2 = foggy)   (Markov assumption)

15
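As a quick sketch, this lookup in Python with a hypothetical transition matrix (the actual probabilities were given in a table on an earlier slide and are not captured in this transcript):

```python
import numpy as np

states = ["sunny", "rainy", "foggy"]

# Hypothetical transition matrix: A[i][j] = P(tomorrow = j | today = i).
A = np.array([
    [0.8, 0.05, 0.15],  # from sunny
    [0.2, 0.6,  0.2 ],  # from rainy
    [0.2, 0.3,  0.5 ],  # from foggy
])

# By the Markov assumption, yesterday's rain is irrelevant:
# P(tomorrow = sunny | today = foggy, yesterday = rainy)
#   = P(tomorrow = sunny | today = foggy)
p = A[states.index("foggy"), states.index("sunny")]
print(p)  # 0.2 under the assumed matrix
```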

Hidden Markov Models

• Hidden Markov Models (HMMs):

• What is HMM:

• Suppose that you are locked in a room for several days, and you try to predict the weather outside.

• The only piece of evidence you have is whether the person who comes into the room bringing your daily meal is carrying an umbrella or not.

16

Hidden Markov Models

• Hidden Markov Models (HMMs):
• What is an HMM: assume the probabilities as given in the table:

[table of observation probabilities shown on the slide]

17

Hidden Markov Models

• Hidden Markov Models (HMMs):
• What is an HMM:
• Finding the probability of a certain weather qn ∈ {sunny, rainy, foggy} is based on the observations xi (umbrella or not)

18

Hidden Markov Models

• Hidden Markov Models (HMMs):
• What is an HMM:
• Using Bayes rule:

P(qn | xn) = P(xn | qn) P(qn) / P(xn)

• For n days:

P(q1, …, qn | x1, …, xn) = P(x1, …, xn | q1, …, qn) P(q1, …, qn) / P(x1, …, xn)

19

Discrete Markov Processes (Markov Chains)

20

Hidden Markov Models

21

Hidden Markov Models

Probabilistic parameters of a hidden Markov model:
X — states
y — possible observations
a — state transition probabilities
b — output probabilities

https://en.wikipedia.org/wiki/Hidden_Markov_model

22
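To make these parameters concrete, here is a minimal Python sketch for the umbrella example; the numbers are assumed for illustration, not taken from the lecture:

```python
import numpy as np

states = ["sunny", "rainy", "foggy"]          # hidden states X
observations = ["umbrella", "no umbrella"]    # possible observations y

# a: state transition probabilities, A[i][j] = P(q_{t+1} = j | q_t = i)
A = np.array([
    [0.8, 0.05, 0.15],
    [0.2, 0.6,  0.2 ],
    [0.2, 0.3,  0.5 ],
])

# b: output (emission) probabilities, B[i][k] = P(x_t = k | q_t = i)
B = np.array([
    [0.1, 0.9],  # sunny: umbrella is rare
    [0.8, 0.2],  # rainy: umbrella is likely
    [0.3, 0.7],  # foggy
])

# initial state distribution
pi = np.array([1/3, 1/3, 1/3])
```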

Hidden Markov Models

23

Hidden Markov Models

24

Hidden Markov Models

25

Hidden Markov Models

26

Three Fundamental Problems

• Evaluation: Given HMM parameters and an observation sequence, find the probability (likelihood) of the observed sequence
  – Forward algorithm

• Decoding: Given HMM parameters and an observation sequence, find the most probable sequence of hidden states
  – Viterbi algorithm

• Learning: Given an HMM with unknown parameters and an observation sequence, find the parameters that maximize the likelihood of the data
  – Forward-Backward (Baum-Welch) algorithm

27

HMM Evaluation Problem

28

HMM Evaluation Problem

29

HMM Evaluation Problem

30

HMM Evaluation Problem

31

HMM Evaluation Problem

32
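A minimal sketch of the forward algorithm for the evaluation problem, reusing the pi, A, B arrays from the parameter sketch above (no numerical scaling, so it is only suitable for short sequences):

```python
import numpy as np

def forward_likelihood(obs, pi, A, B):
    """Forward algorithm: likelihood P(x1..xT) of an observation
    index sequence. alpha[t, i] = P(x1..xt, q_t = i)."""
    T, N = len(obs), len(pi)
    alpha = np.zeros((T, N))
    alpha[0] = pi * B[:, obs[0]]               # initialization
    for t in range(1, T):
        # Sum over predecessor states, then weight by the emission.
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
    return alpha[-1].sum()                     # marginalize the last state

# e.g. umbrella, umbrella, no umbrella (indices into `observations`)
print(forward_likelihood([0, 0, 1], pi, A, B))
```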

HMM Decoding Problem

33

HMM Decoding Problem

34
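A matching sketch of the Viterbi algorithm for the decoding problem, again assuming the pi, A, B arrays from the earlier sketch:

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Most probable hidden-state sequence for observation indices `obs`."""
    T, N = len(obs), len(pi)
    delta = np.zeros((T, N))            # best path score ending in each state
    psi = np.zeros((T, N), dtype=int)   # backpointers to the best predecessor
    delta[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] * A   # scores[i, j]: move from i to j
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) * B[:, obs[t]]
    path = [int(delta[-1].argmax())]         # backtrack from the best end state
    for t in range(T - 1, 0, -1):
        path.append(int(psi[t][path[-1]]))
    return path[::-1]

print(viterbi([0, 0, 1], pi, A, B))  # [1, 1, 0] -> rainy, rainy, sunny (under the assumed matrices)
```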

HMM Learning Problem

35

HMM Learning Problem

36
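For the learning problem, one EM (Baum-Welch) iteration can be sketched as below; this is an unscaled single-sequence version for illustration, while real implementations work in log space or rescale alpha and beta:

```python
import numpy as np

def baum_welch_step(obs, pi, A, B):
    """One Baum-Welch (EM) re-estimation step for a single sequence."""
    obs = np.asarray(obs)
    T, N = len(obs), len(pi)
    # E-step: forward (alpha) and backward (beta) passes.
    alpha = np.zeros((T, N)); beta = np.zeros((T, N))
    alpha[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
    gamma = alpha * beta                  # proportional to P(q_t = i | obs)
    gamma /= gamma.sum(axis=1, keepdims=True)
    xi = np.zeros((T - 1, N, N))          # P(q_t = i, q_{t+1} = j | obs)
    for t in range(T - 1):
        xi[t] = alpha[t][:, None] * A * (B[:, obs[t + 1]] * beta[t + 1])
        xi[t] /= xi[t].sum()
    # M-step: re-estimate parameters from expected counts.
    new_pi = gamma[0]
    new_A = xi.sum(axis=0) / gamma[:-1].sum(axis=0)[:, None]
    new_B = np.stack([gamma[obs == k].sum(axis=0) for k in range(B.shape[1])],
                     axis=1) / gamma.sum(axis=0)[:, None]
    return new_pi, new_A, new_B
```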

Return to Trajectory Prediction

• Challenges of using the HMM
  – Too many distinct observation symbols if raw coordinates are used
  – Preprocessing trajectories is necessary

37

Trajectory Preprocessing

• Dividing space into grids
  – More grids, more observations

Map of Beijing with 30 × 30 grid overlay: each cell ≈ 1.78 km²

38
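A minimal sketch of turning a GPS point into a grid-cell observation symbol; the bounding-box values here are placeholders, not the actual extent used for the Beijing map:

```python
def point_to_cell(lat, lon, n=30,
                  lat_min=39.6, lat_max=40.4,    # hypothetical bounding box
                  lon_min=116.0, lon_max=116.9):
    """Map a GPS point to an integer cell id in an n x n grid."""
    row = min(max(int((lat - lat_min) / (lat_max - lat_min) * n), 0), n - 1)
    col = min(max(int((lon - lon_min) / (lon_max - lon_min) * n), 0), n - 1)
    return row * n + col    # one discrete observation symbol per cell

# A raw trajectory becomes a short sequence of cell ids:
traj = [(39.91, 116.40), (39.92, 116.42), (39.95, 116.45)]
cells = [point_to_cell(lat, lon) for lat, lon in traj]
```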

Applying to Trajectory Prediction

• Computing state transition probabilities

Xue, Andy Yuan, et al. "Destination prediction by sub-trajectory synthesis and privacy protection against such prediction." ICDE 2013.

39
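A simple way to compute these state transition probabilities from historical cell sequences is by normalized counts, as sketched below (a maximum-likelihood estimate; the cited paper's sub-trajectory synthesis goes further):

```python
import numpy as np

def estimate_transitions(sequences, n_cells):
    """MLE of P(next cell | current cell) from observed cell-id sequences."""
    counts = np.zeros((n_cells, n_cells))
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):   # consecutive cell pairs
            counts[a, b] += 1
    totals = counts.sum(axis=1, keepdims=True)
    # Normalize each row; rows with no outgoing transitions stay all-zero.
    return np.divide(counts, totals, out=np.zeros_like(counts),
                     where=totals > 0)
```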

Dividing into Triangles

• Dividing into equilateral triangles

Wesley Mathew, Ruben Raposo, Bruno Martins: Predicting future locations with hidden Markov models. UbiComp 2012: 911-918

40

Aircraft Trajectory

• 3D cubes

Samet Ayhan, Hanan Samet: Aircraft Trajectory Prediction Made Easy with Predictive Analytics. KDD 2016: 21-30

41

Model Training with Library

• Fit the sequence data into the model
  – The library will infer the hidden states using the algorithms above

42

Case study

• https://hmmlearn.readthedocs.io/en/latest/
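A minimal usage sketch with hmmlearn for discrete grid-cell observations; the class is CategoricalHMM in recent releases (older versions call it MultinomialHMM), and the numbers of states and symbols here are illustrative:

```python
import numpy as np
from hmmlearn import hmm

# Two training trajectories as cell-id sequences, concatenated for hmmlearn.
X = np.concatenate([[0, 1, 1, 2, 3], [0, 0, 1, 2, 2]]).reshape(-1, 1)
lengths = [5, 5]                      # length of each trajectory in X

model = hmm.CategoricalHMM(n_components=4, n_iter=100, random_state=0)
model.fit(X, lengths)                 # Baum-Welch (EM) training

seq = np.array([[0], [1], [2]])
print(model.predict(seq))             # Viterbi: most likely hidden states
print(model.score(seq))               # forward algorithm: log-likelihood
```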

43

Lab Quiz

• Given two mapped trajectories, compute the EBD with O(n log n) complexity (see our last lecture).

44

