Design Cost Modeling and Data Collection Infrastructure Andrew B. Kahng and Stefanus Mantik* UCSD CSE and ECE Departments (*) Cadence Design Systems, Inc. http://vlsicad.ucsd.edu/
Page 1:

Design Cost Modeling and Data Collection Infrastructure

Andrew B. Kahng and Stefanus Mantik*

UCSD CSE and ECE Departments

(*) Cadence Design Systems, Inc.

http://vlsicad.ucsd.edu/

Page 2:

ITRS Design Cost Model

Engineer cost/year increases 5% / year ($181,568 in 1990)

EDA tool cost/year (per engineer) increases 3.9% / year

Productivity gains due to 8 major Design Technology innovations: RTL methodology ... Large-block reuse, IC implementation suite, Intelligent testbench, Electronic System-level methodology

Matched up against SOC-LP PDA content: SOC-LP PDA design cost = $20M in 2003; would have been $630M without EDA innovations
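As a rough illustration of how such a cost model composes (a sketch, not the actual ITRS model): engineer cost compounds at 5%/year from the 1990 base, and each Design Technology innovation acts as a multiplicative productivity factor dividing total effort. The staff-years and the per-innovation factors below are hypothetical placeholders; only the 1990 base and growth rate come from the slide.

```python
# Sketch of an ITRS-style design cost trend (illustrative only).
# The $181,568 base and 5%/yr growth are from the slide; the staff-years
# and productivity factors are made-up placeholders.

def engineer_cost(year, base=181_568, base_year=1990, growth=0.05):
    """Engineer cost/year, compounding 5% annually from the 1990 base."""
    return base * (1 + growth) ** (year - base_year)

def design_cost(year, staff_years, productivity_factors=()):
    """Total cost = staff-years x cost/engineer, divided by productivity gains."""
    cost = staff_years * engineer_cost(year)
    for f in productivity_factors:
        cost /= f  # each Design Technology innovation divides required effort
    return cost

baseline = design_cost(2003, staff_years=1000)
improved = design_cost(2003, staff_years=1000,
                       productivity_factors=[2.0, 1.5, 1.8])  # hypothetical
print(f"baseline ${baseline:,.0f}, with innovations ${improved:,.0f}")
```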

Page 3:

SOC Design Cost

[Chart: total SOC design cost (log scale, $10M to $100B) vs. year, 1985-2020, comparing "RTL Methodology Only" against "With All Future Improvements". Design Technology innovations marked along the curve: Tall Thin Engineer, Small Block Reuse, IC Implementation tools, Large Block Reuse, Intelligent Testbench, ES Level Methodology, Very Large Block Reuse, In-house P&R. The 2003 design cost is $629,769,273 with RTL methodology only, versus $20,152,617 with all improvements.]

Page 4:

Outline

Introduction and motivations

METRICS system architecture

Design quality metrics and tool quality metrics

Applications of the METRICS system

Issues and conclusions

Page 5:

Motivations

How do we improve design productivity?

Is our design technology / capability better than last year's?

How do we formally capture best known methods, and how do we identify them in the first place?

Does our design environment support continuous improvement, exploratory what-if design, early predictions of success / failure, ...?

Currently there are no standards or infrastructure for measuring and recording the semiconductor design process

Can benefit project management: accurate resource prediction at any point in the design cycle; accurate project post-mortems

Can benefit tool R&D: feedback on tool usage and parameters used; improved benchmarking

Page 6:

Fundamental Gaps

Data to be measured is not available: data is only available through tool log files, and metrics naming and semantics are not consistent among different tools

We do not always know what data should be measured: some metrics are less obviously useful; other metrics are almost impossible to discern

Page 7:

Purpose of METRICS

Standard infrastructure for the collection and the storage of design process information

Standard list of design metrics and process metrics

Analyses and reports that are useful for design process optimization

METRICS allows: Collect, Data-Mine, Measure, Diagnose, then Improve

Page 8:

Outline

Introduction and motivations

METRICS system architecture: components of the METRICS system; flow tracking; the METRICS Standard

Design quality metrics and tool quality metrics

Applications of the METRICS system

Issues and conclusions

Page 9:

METRICS System Architecture

[Diagram (DAC00): tools report metrics through a transmitter, via either a wrapper or an embedded transmitter API, as XML over the inter/intranet to the metrics data warehouse (DB); a web server with Java applets provides data mining and reporting on top of the warehouse.]

Page 10:

METRICS Server

[Diagram: metrics arrive over the Internet/intranet at an Apache + servlet front end hosting a receiver servlet (input form, receiver, decryptor, XML parser) and a reporting servlet (reporting, external interface), built on Java Beans; a data miner and a data translator sit alongside, with the database accessed via JDBC.]

Page 11:

Example Reports

% aborted per machine: nexus4 95%, nexus10 1%, nexus11 2%, nexus12 2%

% aborted per task: ATPG 22%, synthesis 20%, physical 18%, postSyntTA 13%, BA 8%, placedTA 7%, funcSim 7%, LVS 5%

CPU_TIME = 12 + 0.027 NUM_CELLS (correlation = 0.93)
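A report like CPU_TIME = 12 + 0.027 NUM_CELLS is an ordinary least-squares line fitted to collected (NUM_CELLS, CPU_TIME) pairs. A minimal sketch of producing such a fit; the data points here are synthetic and lie exactly on the reported line:

```python
# Least-squares fit of CPU_TIME vs. NUM_CELLS, as in the report above.
# The data points are synthetic; only the model form comes from the slide.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return my - slope * mx, slope  # (intercept, slope)

cells = [1000, 5000, 20000, 80000]
cpu = [12 + 0.027 * c for c in cells]  # exact line, for illustration
b0, b1 = fit_line(cells, cpu)
print(f"CPU_TIME = {b0:.0f} + {b1:.3f} NUM_CELLS")
```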

Page 12:

Flow Tracking

Task sequence: T1, T2, T1, T2, T3, T3, T3, T4, T2, T1, T2, T4

Run  Task  T1  T2  T3  T4  Current FLOW_SEQUENCE
 1   T1    1   -   -   -   1
 2   T2    1   1   -   -   1/1
 3   T1    2   -   -   -   2
 4   T2    2   1   -   -   2/1
 5   T3    2   1   1   -   2/1/1
 6   T3    2   1   2   -   2/1/2
 7   T3    2   1   3   -   2/1/3
 8   T4    2   1   3   1   2/1/3/1
 9   T2    2   2   -   -   2/2
10   T1    3   -   -   -   3
11   T2    3   1   -   -   3/1
12   T4    3   1   -   1   3/1/0/1

(The T1-T4 columns give each task's TASK_NO.)

[Diagram: S → T1 → T2 → T1 → T2 → T3 → T3 → T3 → T4 → T2 → T1 → T2 → T4 → F]
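One consistent reading of the flow-tracking table: running task Ti increments Ti's counter and resets the counters of all later tasks, and FLOW_SEQUENCE is the counters up through Ti joined by "/". A sketch that reproduces the table's FLOW_SEQUENCE column exactly:

```python
# Flow-tracking sketch: reconstructs the FLOW_SEQUENCE strings shown in
# the table. Running task Ti increments its counter and resets all later
# tasks' counters (later stages restart after an earlier stage reruns).
def flow_sequences(tasks, num_tasks=4):
    counts = [0] * num_tasks
    out = []
    for t in tasks:              # t is a 1-based task index (Ti)
        counts[t - 1] += 1
        for j in range(t, num_tasks):
            counts[j] = 0
        out.append("/".join(str(c) for c in counts[:t]))
    return out

seq = [1, 2, 1, 2, 3, 3, 3, 4, 2, 1, 2, 4]
print(flow_sequences(seq))
```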

Page 13:

Testbeds: Metricized P&R Flow

[Diagram: three metricized place-and-route flows, each reporting to METRICS.
UCLA + Cadence flow: LEF/DEF in; Capo Placer (Placed DEF), QP ECO (Legal DEF), Congestion Map, WRoute (Routed DEF), Congestion Analysis, Incr WRoute; Final DEF out.
Cadence SLC flow: LEF, GCF/TLF, and constraints in; QP (Placed DEF), QP Opt (Optimized DEF), CTGen (Clocked DEF), Pearl, WRoute (Routed DEF), Incr. WRoute.
Cadence PKS flow: Ambit PKS synthesis & tech map, pre-placement opt, GRoute, QP, post-placement opt, WRoute.]

Page 14:

METRICS Standards

Standard metrics naming across tools: same name → same meaning, independent of tool supplier; generic metrics plus tool-specific metrics; no more ad hoc, incomparable log files

Standard schema for metrics database

Standard middleware for database interface

Page 15:

Generic and Specific Tool Metrics

Generic Tool Metrics:
  tool_name      string
  tool_version   string
  tool_vendor    string
  compiled_date  mm/dd/yyyy
  start_time     hh:mm:ss
  end_time       hh:mm:ss
  tool_user      string
  host_name      string
  host_id        string
  cpu_type       string
  os_name        string
  os_version     string
  cpu_time       hh:mm:ss

Placement Tool Metrics:
  num_cells        integer
  num_nets         integer
  layout_size      double
  row_utilization  double
  wirelength       double
  weighted_wl      double

Routing Tool Metrics:
  num_layers       integer
  num_violations   integer
  num_vias         integer
  wirelength       double
  wrong-way_wl     double
  max_congestion   double

Partial list of metrics now being collected in Oracle8i
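Metrics like these travel from the transmitter to the receiver as XML. A minimal sketch of such a payload; the element and attribute names here are illustrative assumptions, not the METRICS standard schema:

```python
# Sketch of a METRICS transmitter payload: tool metrics serialized as XML
# for the receiver servlet. Field names follow the metric list above; the
# XML element names ("toolRun", "metric") are assumptions for illustration.
import xml.etree.ElementTree as ET

def metrics_to_xml(metrics: dict) -> str:
    root = ET.Element("toolRun")
    for name, value in metrics.items():
        child = ET.SubElement(root, "metric", name=name)
        child.text = str(value)
    return ET.tostring(root, encoding="unicode")

payload = metrics_to_xml({
    "tool_name": "QPlace",
    "start_time": "14:02:31",
    "num_cells": 182000,
    "row_utilization": 76.15,
})
print(payload)
```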

Page 16:

Open Source Architecture

METRICS components are industry standards, e.g., Oracle 8i, Java servlets, XML, Apache web server, PERL/TCL scripts, etc.

Custom-generated code for wrappers and APIs is publicly available: collaboration in development of wrappers and APIs; porting to different operating systems

Code is available at: http://www.gigascale.org/metrics

Page 17:

Outline

Introduction and motivations

METRICS system architecture

Design quality metrics and tool quality metrics

Applications of the METRICS system

Issues and conclusions

Page 18:

Tool Quality Metric: Behavior in the Presence of Input Noise [ISQED02]

Goal: tool predictability. Ideal scenario: final solution quality can be predicted even before running the tool; this requires an understanding of tool behavior

Heuristic nature of tools: predicting results is difficult

Lower bound on prediction accuracy: inherent tool noise

Input noise: "insignificant" variations in input data (sorting, scaling, naming, ...) that can nevertheless affect solution quality

Goal: understand how tools behave in the presence of noise, and possibly exploit inherent tool noise

Page 19:

Monotone Behavior

Monotonicity: monotone solution quality with respect to inputs

[Two plots: solution Quality vs. Parameter, illustrating monotone and non-monotone behavior]

Page 20:

Monotonicity Studies

OptimizationLevel: 1 (fast/worst) ... 10 (slow/best)

Opt Level     1      2      3      4      5      6      7      8      9
QP WL       2.50   0.97  -0.20  -0.11   1.43   0.58   1.29   0.64   1.70
QP CPU     -59.7  -51.6  -40.4  -39.3  -31.5  -31.3  -17.3  -11.9  -6.73
WR WL       2.95   1.52  -0.29   0.07   1.59   0.92   0.89   0.94   1.52
Total CPU   4.19  -6.77  -16.2  -15.2  -7.23  -10.6  -6.99  -3.75  -0.51

[Plots: QP CPU and Total CPU vs. OptimizationLevel; QP WL and WR WL vs. OptimizationLevel]

Note: OptimizationLevel is the tool's own knob for "effort"; it may or may not be well-conceived with respect to the underlying heuristics. The bottom line is that tool behavior is non-monotone from the user's viewpoint.

Page 21:

Noise Studies: Random Seeds

200 runs with different random seeds; a half-percent spread in solution quality due to the random seed alone

[Histogram: number of runs vs. % quality loss, roughly -0.24 to 0.12, with a marker at -0.05%]
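The shape of this experiment, sketched with a toy stand-in heuristic (not an actual placer): hold everything fixed except the random seed, run many times, and report the relative spread in solution quality.

```python
# Seed-noise experiment sketch. The "heuristic" is a stand-in random-restart
# minimizer of a toy objective, standing in for a seeded placement tool.
import random

def noisy_quality(seed, restarts=50):
    rng = random.Random(seed)
    # stand-in solver: best of several random samples of a toy objective
    return min(abs(rng.gauss(0, 1)) for _ in range(restarts))

qualities = [noisy_quality(seed) for seed in range(200)]
mean = sum(qualities) / len(qualities)
spread = (max(qualities) - min(qualities)) / mean
print(f"relative spread across 200 seeds: {spread:.2%}")
```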

Page 22:

Noise: Random Ordering & Naming

Data sorting: reordering the input has no effect

Five naming perturbations:
- random cell names without hierarchy (CR), e.g., AFDX|CTRL|AX239 → CELL00134
- random net names without hierarchy (NR)
- random cell names with hierarchy (CH), e.g., AFDX|CTRL|AX129 → ID012|ID79|ID216
- random net names with hierarchy (NH)
- random master cell names (MC), e.g., NAND3X4 → MCELL0123

Page 23:

Noise: Random Naming (contd.)

Wide range of variations (±3%); hierarchy matters

[Histograms: number of runs vs. % quality loss (-4 to 6) for the CR, NR, CH, NH, and MC perturbations, with NR and NH also compared separately]

Page 24:

Noise: Hierarchy

Swap hierarchy prefixes: AA|BB|C03 → XX|YY|C03 and XX|YY|Z12 → AA|BB|Z12

[Histogram: number of runs vs. % quality loss, -1 to 13]
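A sketch of two of the naming perturbations above, applied to hypothetical names; the renaming scheme is illustrative, not the exact scheme used in the study.

```python
# Naming-perturbation sketch: CR-style flat renaming (hierarchy discarded)
# and the hierarchy-prefix swap. Names and id format are illustrative.
import random

def rename_flat(names, prefix="CELL"):
    """CR/NR-style: replace each hierarchical name with a flat random id."""
    ids = random.sample(range(10 ** 5), len(names))
    return {n: f"{prefix}{i:05d}" for n, i in zip(names, ids)}

def swap_hierarchy(name, old_prefix, new_prefix):
    """Hierarchy swap: AA|BB|C03 -> XX|YY|C03 when the prefix matches."""
    head, leaf = name.rsplit("|", 1)
    return f"{new_prefix}|{leaf}" if head == old_prefix else name

mapping = rename_flat(["AFDX|CTRL|AX239", "AFDX|CTRL|AX129"])
print(mapping)
print(swap_hierarchy("AA|BB|C03", "AA|BB", "XX|YY"))
```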

Page 25:

Outline

Introduction and motivations

METRICS system architecture

Design quality and tool quality

Applications of the METRICS system

Issues and conclusions

Page 26:

Categories of Collected Data

Design instances and design parameters: attributes and metrics of the design instances, e.g., number of gates, target clock frequency, number of metal layers, etc.

CAD tools and invocation options: list of tools and user options that are available, e.g., tool version, optimism level, timing-driven option, etc.

Design solutions and result qualities: qualities of the solutions obtained from given tools and design instances, e.g., number of timing violations, total tool runtime, layout area, etc.

Page 27:

Three Basic Application Types

Let D = design instances and design parameters, T = CAD tools and invocation options, S = design solutions and result qualities

Given D and T, estimate the expected quality of S, e.g., runtime predictions, wirelength estimations, etc.

Given D and S, find the appropriate setting of T, e.g., best value for a specific option, etc.

Given T and S, identify the subspace of D that is "doable" for the tools, e.g., the category of designs that are suitable for the given tools, etc.

Page 28:

Estimation of QP CPU and Wirelength

Goals: estimate QPlace runtime for CPU budgeting and block partitioning; estimate placement quality (total wirelength)

Collect QPlace metrics from 2000+ regression logfiles

Use data mining (Cubist 1.07) to classify and predict, e.g.:

Rule 1: [101 cases, mean 334.3, range 64 to 3881, est err 276.3]
  if ROW_UTILIZATION <= 76.15
  then CPU_TIME = -249 + 6.7 ROW_UTILIZATION + 55 NUM_ROUTING_LAYER - 14 NUM_LAYER

Rule 2: [168 cases, mean 365.7, range 20 to 5352, est err 281.6]
  if NUM_ROUTING_LAYER <= 4
  then CPU_TIME = -1153 + 192 NUM_ROUTING_LAYER + 12.9 ROW_UTILIZATION - 49 NUM_LAYER

Rule 3: [161 cases, mean 795.8, range 126 to 1509, est err 1069.4]
  if NUM_ROUTING_LAYER > 4 and ROW_UTILIZATION > 76.15
  then CPU_TIME = -33 + 8.2 ROW_UTILIZATION + 55 NUM_ROUTING_LAYER - 14 NUM_LAYER

Data mining limitation: sparseness of data
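The three rules can be encoded directly as a predictor. Cubist averages the predictions of all rules whose conditions match; that averaging detail is an assumption here, but the rule bodies are taken verbatim from the slide.

```python
# The three Cubist rules above, encoded directly. When several rules'
# conditions match, their predictions are averaged (assumed behavior).
def predict_cpu_time(row_util, num_routing_layer, num_layer):
    preds = []
    if row_util <= 76.15:                           # Rule 1
        preds.append(-249 + 6.7 * row_util
                     + 55 * num_routing_layer - 14 * num_layer)
    if num_routing_layer <= 4:                      # Rule 2
        preds.append(-1153 + 192 * num_routing_layer
                     + 12.9 * row_util - 49 * num_layer)
    if num_routing_layer > 4 and row_util > 76.15:  # Rule 3
        preds.append(-33 + 8.2 * row_util
                     + 55 * num_routing_layer - 14 * num_layer)
    return sum(preds) / len(preds) if preds else None

print(predict_cpu_time(row_util=80.0, num_routing_layer=5, num_layer=6))
# Rule 3 only: approximately 814
```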

Page 29:

Cubist 1.07 Predictor for Total Wirelength

Page 30:

Optimization of Incremental Multilevel FM Partitioning

Motivation: incremental netlist partitioning

Scenario: design changes (netlist ECOs) are made, but we want the top-down placement result to remain similar to the previous result

[Diagram: multilevel partitioning V-cycle of clustering and refinement]

Page 31:

Optimization of Incremental Multilevel FM Partitioning

Motivation: incremental netlist partitioning

Scenario: design changes (netlist ECOs) are made, but we want the top-down placement result to remain similar to the previous result

Good approach [CaldwellKM00]: "V-cycling" based multilevel Fiduccia-Mattheyses

Our goal: what is the best tuning of the approach for a given instance? Break up the ECO perturbation into multiple smaller perturbations? How many starts of the partitioner? All within a specified CPU budget?

Page 32:

Optimization of Incremental Multilevel FM Partitioning (contd.)

Given: initial partitioning solution, CPU budget, and instance perturbation (ΔI)

Find: number of stages of incremental partitioning (i.e., how to break up ΔI) and number of starts

Ti = incremental multilevel FM partitioning, with a self-loop for multistart; n = number of breakups (ΔI = Δ1 + Δ2 + Δ3 + ... + Δn)

[Diagram: S → T1 → T2 → T3 → ... → Tn → F]

Page 33:

Flow Optimization Results

If (27401 < num_edges <= 34826) and (143.09 < cpu_time <= 165.28) and (perturbation_delta <= 0.1) then num_inc_stages = 4 and num_starts = 3

If (27401 < num_edges <= 34826) and (85.27 < cpu_time <= 143.09) and (perturbation_delta <= 0.1) then num_inc_stages = 2 and num_starts = 1

...

Num Nets Cut:

Design  Optimized  Regular
ibm01     215        217
ibm05    1685       1723
ibm02     249        269
ibm03     618        669
ibm06     363        371
ibm04     444        488
ibm08    1127       1219
ibm10     752        773

Up to 10% cutsize reduction with the same CPU budget, using tuned #starts, #stages, etc. in multilevel FM
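The learned rules above can be encoded as a tuning lookup. A sketch, with one assumption: the comparison operators dropped by the original formatting are taken to be "<=".

```python
# The two learned tuning rules, encoded directly. The "<=" bounds are
# assumed, since the original slide formatting dropped the operators.
def tune_incremental_fm(num_edges, cpu_time, perturbation_delta):
    """Return (num_inc_stages, num_starts), or None if no rule fires."""
    if 27401 < num_edges <= 34826 and perturbation_delta <= 0.1:
        if 143.09 < cpu_time <= 165.28:
            return 4, 3
        if 85.27 < cpu_time <= 143.09:
            return 2, 1
    return None

print(tune_incremental_fm(30000, 150.0, 0.05))  # -> (4, 3)
```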

Page 34:

Outline

Introduction and motivations

METRICS system architecture

Design quality and tool quality

Applications of the METRICS system

Issues and conclusions

Page 35:

METRICS Deployment and Adoption

Security: proprietary and confidential information cannot pass across the company firewall; this may make it difficult to develop metrics and predictors across multiple companies

Standardization: flow, terminology, data management

Social: “big brother”, collection of social metrics

Data cleanup: obsolete designs, old methodology, old tools

Data availability with standards: log files, API, or somewhere in between?

“Design Factories” are using METRICS

Page 36:

Conclusions

METRICS System: automatic data collection and real-time reporting

New design and process metrics with standard naming

Analysis of EDA tool quality in the presence of input noise

Applications of METRICS: tool solution quality estimation (e.g., placement) and instance-specific tool parameter tuning (e.g., incremental partitioning)

Ongoing work: construct active feedback from METRICS to the design process for automated process improvement; expand the current metrics list to include enterprise metrics (e.g., number of engineers, number of spec revisions, etc.)

Page 37:

Thank You

