Page 1: Bounding Preemption Delay within Data Cache Reference Patterns for Real-Time Tasks

Harini Ramaprasad, Frank Mueller

North Carolina State University

Center for Embedded Systems Research

Bounding Preemption Delay within Data Cache Reference Patterns for Real-Time Tasks

Page 2: Motivation

Timing Analysis
- Calculation of Worst-Case Execution Times (WCETs) of tasks
- Required for scheduling of real-time tasks
  - Schedulability theory requires a-priori knowledge of WCETs
- Estimates need to be safe
- Static timing analysis: an efficient method to calculate the WCET of a program
- Data caches introduce unpredictability into timing analysis

Data caches:
- Improve performance significantly
- Complicate static timing analysis for a task

Page 3: Preemptive scheduling

Practical real-time systems
- Multiple tasks with varying priorities
- A higher-priority task may preempt a lower-priority task at any time
- Additional data cache (D$) misses occur when the lower-priority task is restarted
- WCET with preemption delay is required

Static timing analysis becomes even more complicated!

Page 4: Data Cache Reference Patterns (Prior Work)

Data cache analyzer added to the static timing analysis framework
- Enhanced Cache Miss Equations framework (Ghosh et al.) yields D$ miss/hit patterns for memory references
- Used for loop-nest-oriented code; scalar and array references are analyzed
- Considers only a single task with no preemptions
- Patterns are fed to the timing analyzer to tighten the WCET estimate

Necessary terminology:
- Iteration point: represents one iteration of a loop nest
- Iteration space: the set of all iteration points
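A small illustration of this terminology for a hypothetical two-level loop nest (the loop bounds are made up for the example):

    # For "for i in 0..2: for j in 0..3: access A[i][j]", each tuple (i, j) is one
    # iteration point; the list of all of them is the iteration space.
    iteration_space = [(i, j) for i in range(3) for j in range(4)]
    print(len(iteration_space))    # 12 iteration points
    print(iteration_space[:4])     # [(0, 0), (0, 1), (0, 2), (0, 3)]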

Page 5: Static Timing Analyzer Framework

Page 6: Methodology

Task schedulability: Response Time Analysis is used (see the sketch below)

Steps involved in the calculation of WCET with preemption delay:
- Calculate the maximum number of preemptions possible for a task
- Identify the placement of preemption points in the iteration space
- Calculate the preemption delay at a given point
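A minimal sketch of the standard response-time recurrence, assuming a fixed-priority task model with deadlines equal to periods; how the preemption delay is folded into the WCET term is simplified here, and the function name and parameters are illustrative, not from the paper:

    import math

    def response_time(wcet, hp_tasks, deadline):
        # wcet: WCET of the task under analysis (with any preemption delay already added)
        # hp_tasks: list of (period, wcet) pairs for all higher-priority tasks
        # Standard fixed-point iteration: R = C + sum(ceil(R / Tj) * Cj)
        r = wcet
        while r <= deadline:
            r_next = wcet + sum(math.ceil(r / period) * cost for period, cost in hp_tasks)
            if r_next == r:
                return r          # converged: worst-case response time
            r = r_next
        return None               # exceeds the deadline: task is unschedulable

    # Example with the task set used on a later slide: T3 (WCET 35) under
    # T1 (period 50, WCET 8) and T2 (period 100, WCET 10).
    print(response_time(35, [(50, 8), (100, 10)], deadline=200))   # 61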

Page 7: Methodology: Analysis Phases

Phase 1: Single-Task Analysis
- For every task:
  - Calculate the base execution time
  - Build D$ reference patterns assuming NO preemptions
- Performed once for every task

Phase 2: Preemption Delay Calculation (in task-set context)
- Step 1: Identification of preemption points
- Step 2: Calculation of WCET with preemption delay
- Performed once for a task in the context of a task set

Page 8: Phase 2: Preemption Delay Calculation

Max # of preemptions for task Ti
- For every higher-priority task Tj:
  - Find the maximum number of times Tj can preempt Ti
  - Subtract Tj's execution demand from the time remaining before Ti's deadline
- Termination (whichever occurs first):
  - No more higher-priority tasks
  - No time left before the deadline
- The sum of these counts gives the max # of preemptions

Task  Period  WCET
T1    50      8
T2    100     10
T3    200     35

For T1: no higher-priority tasks, so # preemptions = 0.
For T2: Trem = 100 - (2 * 8) = 84; no more higher-priority tasks; # preemptions = 2.
For T3: Trem = 200 - (4 * 8) = 168; Trem = 168 - (2 * 10) = 148; no more higher-priority tasks; # preemptions = 4 + 2 = 6.
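A minimal sketch of the bound computed above, under one reading of the slide's procedure (tasks given as (period, WCET) pairs in priority order, highest first, with deadline equal to period; names are illustrative):

    import math

    def max_preemptions(tasks, i):
        # tasks: list of (period, wcet) in priority order; task i is the one analyzed.
        t_rem = tasks[i][0]                    # time remaining before Ti's deadline
        total = 0
        for period_j, wcet_j in tasks[:i]:     # every higher-priority task Tj
            if t_rem <= 0:                     # no time left before the deadline
                break
            n_j = math.ceil(t_rem / period_j)  # max # of times Tj can preempt Ti
            total += n_j
            t_rem -= n_j * wcet_j              # subtract Tj's demand from Trem
        return total

    tasks = [(50, 8), (100, 10), (200, 35)]    # T1, T2, T3 from the table above
    print([max_preemptions(tasks, i) for i in range(3)])   # [0, 2, 6]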

Page 9: Phase 2: Preemption Delay Calculation

Identification of preemption points
- Access chain building:
  - Build a time-ordered list of all memory references in the task
  - Connect all references accessing the same D$ set to form a chain
  - Different cache sets are shown with different colors
- Assign a weight to every access point:
  - Weight = # of distinctly colored chains that cross the point
  - Indicates the # of misses if a preemption occurs at that point
  - Count only chains for D$ sets used by a higher-priority task
  - Count a chain only if the next point in the chain is a HIT
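A minimal sketch of chain building and weighting over a flat, time-ordered reference list; the real analysis works on iteration points inside the timing-analysis framework, so the data layout here is an assumption made for illustration:

    from collections import defaultdict

    def preemption_weights(refs, hp_sets):
        # refs: time-ordered (cache_set, is_hit) pairs for one task's memory references.
        # hp_sets: D$ sets that a higher-priority task may touch.
        # Returns the cost (# of extra misses) if a preemption happens right after each reference.
        chains = defaultdict(list)             # per-set access chains ("colors")
        for idx, (cset, _) in enumerate(refs):
            chains[cset].append(idx)

        weights = [0] * len(refs)
        for p in range(len(refs)):
            for cset, idxs in chains.items():
                if cset not in hp_sets:        # only count sets used by a higher-prio task
                    continue
                before = [i for i in idxs if i <= p]
                after = [i for i in idxs if i > p]
                # Chain crosses point p and its next access would have been a hit:
                if before and after and refs[after[0]][1]:
                    weights[p] += 1            # that hit turns into a miss after preemption
        return weights

    # Two cache sets; only set 0 is shared with a higher-priority task.
    refs = [(0, False), (1, False), (0, True), (1, True), (0, True)]
    print(preemption_weights(refs, hp_sets={0}))   # [1, 1, 1, 1, 0]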

Page 10: Phase 2: Preemption Delay Calculation

Calculation of WCET with preemption delay
- Identification of the worst-case preemption scenario:
  - General observation: a large chunk of iteration points has the maximum preemption delay
  - Reason: high temporal/spatial reuse in the code
  - Considering the n highest costs gives an upper bound on the delay, where n = max # of preemptions for the task
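A minimal sketch of the upper bound described above; the per-miss penalty is a made-up constant, not a figure from the presentation:

    MISS_PENALTY = 100                          # cycles per extra D$ miss (hypothetical)

    def delay_bound(point_costs, n):
        # point_costs: # of extra misses at each possible preemption point (chain weights)
        # n: max # of preemptions for the task (from the response-time bound)
        worst = sorted(point_costs, reverse=True)[:n]
        return sum(worst) * MISS_PENALTY        # upper bound on total preemption delay

    print(delay_bound([3, 1, 4, 4, 2, 0, 4], n=2))   # 2 costliest points: (4 + 4) * 100 = 800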

Page 11: Distribution of preemption costs

Page 12: Experimental Results – Task Set 1

Benchmark        Period    Stand-alone WCET    R w/o delay    WCET w/ delay    R w/ delay
dot-product      50000     750                 750            750              750
convolution      62500     7491                8241           12491            13241
fir              125000    9537                17778          22037            35278
lms              125000    14536               32314          29136            77655
n-real-updates   250000    16738               48692          79138            235198
matrix1          250000    54168               111851         104568           > period

- Without delay, the task set seems schedulable
- Adding the preemption delay to the response time is safe
- The task set above is actually unschedulable!

Page 13: Ratios – Task Set 1

Benchmark        R w/o delay / stand-alone WCET    R w/ delay / WCET w/ delay    WCET w/ delay / stand-alone WCET
dot-product      1                                 1                             1
convolution      1.1                               1.06                          1.67
fir              1.87                              1.6                           2.31
lms              2.22                              2.6                           2
n-real-updates   2.91                              2.97                          4.73
matrix1          2.06                              -                             1.93

Preemption delay calculation:
- No significant change in the R/WCET factor
- The increase in the WCET itself is significant: pessimistic analysis

Page 14: Spreading preemption points – key idea

Find n most expensive points

Spread them out in the iteration space
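A minimal sketch of one possible spreading strategy, an assumption rather than the presentation's exact algorithm: split the iteration space into n windows and keep the costliest point in each, so the chosen points cannot all cluster in one region of high reuse:

    def spread_points(point_costs, n):
        # point_costs: preemption cost at each iteration point, in iteration order.
        # Returns (index, cost) of one representative point per window.
        window = max(1, len(point_costs) // n)
        picks = []
        for start in range(0, len(point_costs), window):
            chunk = point_costs[start:start + window]
            best = max(range(len(chunk)), key=chunk.__getitem__)
            picks.append((start + best, chunk[best]))
        return picks[:n]

    print(spread_points([3, 1, 4, 4, 2, 0, 4, 1], n=2))   # [(2, 4), (6, 4)]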

Page 15: Related Work

S. Basumallick and K. Nilsen. Cache issues in real-time systems. In ACM SIGPLAN Workshop on Language, Compiler, and Tool Support for Real-Time Systems, 1994.

C.-G. Lee, J. Hahn, Y.-M. Seo, S. L. Min, R. Ha, S. Hong, C. Y. Park, M. Lee, and C. S. Kim. Analysis of cache-related preemption delay in fixed-priority preemptive scheduling. IEEE Transactions on Computers, 47(6):700-713, 1998.

C.-G. Lee, K. Lee, J. Hahn, Y.-M. Seo, S. L. Min, R. Ha, S. Hong, C. Y. Park, M. Lee, and C. S. Kim. Bounding cache-related preemption delay for real-time systems. IEEE Transactions on Software Engineering, 27(9):805-826, Nov. 2001.

J. Staschulat and R. Ernst. Multiple process execution in cache related preemption delay analysis. In ACM International Conference on Embedded Software, 2004.

J. Staschulat, S. Schliecker, and R. Ernst. Scheduling analysis of real-time systems with precise modeling of cache related preemption delay. In Euromicro Conference on Real-Time Systems, 2005.

Page 16: Conclusions

Derivation of data cache reference patterns for every task

Construction of data cache access chains from these patterns
- Used to calculate the preemption delay at a point

Determination of the max # of preemptions, n, for a given task
- In the context of a task set

Identification of the worst-case scenario of preemptions
- Current work: choose the n most expensive points

First work addressing data-cache-related preemption delay

Page 17: Thank you!

Questions?

