www.GSBLaw.com
Anchorage | Beijing | New York | Portland | Seattle | Washington, D.C.
Summary of ASCE Article: Quantifying Performance for the Integrated Project Delivery System as Compared to Established Delivery Systems¹
Julia Holden-Davis
December 17, 2013
Definition of IPD
• Multiparty agreement
• Very early involvement of key participants (pre-design, or 0% design)
• Key participants include entities such as the owner, GC, architect, consultants, subcontractors, and suppliers
• "IPD-ish" – uses the concepts behind IPD without a multiparty agreement
Contradictions in Definition of IPD
• Kent and Becerik-Gerber (2010): three principles – multiparty agreement, early involvement, shared risks and rewards
• AIA (2010): also includes liability waivers between key participants, fiscal transparency, and shared project goals (including their development)
Why IPD
• Believed to foster early involvement and collaboration
  – Shared project leadership
  – Shared risk and reward
  – Liability waivers
• Believed to lead to performance benefits
  – More value and reduced energy costs
  – Reduced design documentation time and cost
  – Less rework and more buildable facilities
Literature Review
• Helpful list of studies, predominantly comparing Design-Bid-Build and Design-Build, but also contrasting one or the other with Construction Management at Risk
• Identifies a prior study re: IPD.² That study determined that the Last Planner System improved project performance but did not find significant differences in performance between IPD and non-IPD projects
  – Focused on comparison of time and cost
Goal of Study
• “Evaluate the performance of IPD projects by comparing them to projects delivered using other systems, such as CMR, DB and DBB. The focus extends beyond the commonly analyzed metrics of cost and time to include safety and quality, and less commonly studied metrics, such as changes, process inefficiencies, communication, and profit.”
Study Methodology
• Stage A – Assess literature and industry practice
  – Determine current knowledge base
  – Identify key variables
• Stage B
  – Survey development
  – Pilot testing
  – Data collection
Study Methodology
• Stage C
  – Statistical analysis (univariate)
  – For each metric, normality tests and then:
    • T-tests where the data set is assumed to be normally distributed
    • Nonparametric Mann-Whitney-Wilcoxon tests when the normality test is not met
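The Stage C decision rule (normality test, then parametric or nonparametric comparison) can be sketched in Python with scipy. The sample values below are invented purely for illustration; they are not data from the study.

```python
# Sketch of the per-metric univariate procedure described in Stage C:
# run a Shapiro-Wilk normality test on each group, then use a t-test if
# both groups look normal, otherwise a Mann-Whitney U test.
from scipy import stats

# Hypothetical metric values for the two project groups (not study data).
ipd = [4.2, 3.9, 4.5, 4.1, 3.8, 4.4]
non_ipd = [3.5, 3.2, 3.9, 3.0, 3.6, 3.3]

def compare(a, b, alpha=0.05):
    # Both groups must pass the normality test to justify the t-test.
    normal = (stats.shapiro(a).pvalue > alpha
              and stats.shapiro(b).pvalue > alpha)
    if normal:
        return "t-test", stats.ttest_ind(a, b).pvalue
    return "Mann-Whitney", stats.mannwhitneyu(a, b).pvalue

test_used, p = compare(ipd, non_ipd)
print(test_used, round(p, 4))
```

With these (invented) clearly separated samples, either branch reports a significant difference at the 0.05 level.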
Data Characteristics
• 35 projects
  – 12 IPD
  – 23 non-IPD
• Data primarily received from contractors or construction managers
• 304 variables
• Two primary geographic areas: Midwest and California
Data Characteristics
• Typical project was complex institutional vertical construction (with some commercial facilities)
  – 50% healthcare
  – 25% university research labs
• Projects completed between 2005 and 2012
• Cost distribution between $5M and $400M
Nine Performance Areas
• Cost Performance Metrics
• Quality Performance Metrics
• Schedule Performance Metrics
• Safety Performance Metrics
• Project Change Performance Metrics
• Communication Performance Metrics
• Labor Performance Metrics
• Environmental Performance Metrics
• Business Performance Metrics
Cost Performance Metrics
• Data basis
  – Unit cost ($/ft2)
  – Construction cost growth (final construction cost / original estimated construction cost)
• Result
  – No significant differences in cost performance
  – Confirms prior literature
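For concreteness, the two cost metrics reduce to simple ratios. The dollar and area figures below are hypothetical, chosen only to show the arithmetic.

```python
# Illustrative computation of the two cost metrics (made-up figures).
final_cost = 42_000_000          # final construction cost, $
original_estimate = 40_000_000   # original estimated construction cost, $
area_ft2 = 150_000               # gross floor area, ft2

unit_cost = final_cost / area_ft2              # $/ft2
cost_growth = final_cost / original_estimate   # ratio; > 1.0 means growth

print(f"unit cost: ${unit_cost:.0f}/ft2, cost growth: {cost_growth:.3f}")
# → unit cost: $280/ft2, cost growth: 1.050
```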
Quality Performance Metrics
• Metrics included:
  – As-built quality of major building systems
  – Number of deficiency issues
  – Number and cost of punchlist items
  – Costs of warranty and latent defects
• Major building systems included finishes, structure, and mechanical
Quality Performance Metrics, cont.
• Scaled metric (1-5) representing economy, standard, high quality, premium, or high-efficiency premium
• Data normalized
  – E.g., # of deficiencies per $M of project cost
Quality Performance Metrics
• Systems quality
  – IPD best, then IPD-ish, then non-IPD
• Punchlist items per million
  – Non-IPD highest, then IPD, then IPD-ish
Quality Performance Metrics
• Deficiency issues per million
  – Non-IPD highest, then IPD, then IPD-ish (based on median; top of range greater for IPD-ish than IPD)
• Warranty costs and latent defects
  – Latent defects not significantly different; warranty costs slightly better for IPD
Schedule Performance Metrics
• Data in three metrics:
  – Delivery speed (ft2/day from design start through occupancy)
  – Construction speed (ft2/day from construction notice to proceed through substantial completion)
  – Construction schedule growth (% based on final construction schedule compared to original estimated construction schedule)
• Supplementary metric of intensity (avg. $ value of work completed per day)
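A minimal sketch of how the three schedule metrics and the supplementary intensity metric are computed; all dates and dollar figures below are invented, not taken from the study.

```python
# Hypothetical project milestones and figures for the schedule metrics.
from datetime import date

design_start = date(2020, 1, 1)
ntp = date(2020, 9, 1)                      # construction notice to proceed
substantial_completion = date(2022, 3, 1)
occupancy = date(2022, 5, 1)

area_ft2 = 150_000                          # gross floor area
final_cost = 42_000_000                     # final construction cost, $
original_schedule_days = 500                # original estimated duration
final_schedule_days = (substantial_completion - ntp).days  # actual duration

# ft2/day, design start through occupancy
delivery_speed = area_ft2 / (occupancy - design_start).days
# ft2/day, NTP through substantial completion
construction_speed = area_ft2 / final_schedule_days
# fractional growth vs. the original schedule
schedule_growth = final_schedule_days / original_schedule_days - 1
# avg. $ value of work completed per construction day
intensity = final_cost / final_schedule_days

print(f"delivery {delivery_speed:.1f} ft2/day, "
      f"construction {construction_speed:.1f} ft2/day, "
      f"growth {schedule_growth:.1%}, intensity ${intensity:,.0f}/day")
```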
Schedule Performance Metrics
• Results
  – IPD projects had slightly higher results for delivery speed and construction speed
  – IPD projects had higher intensity
  – IPD projects had greater schedule growth
  – IPD projects deemed to deliver better, based predominantly on delivery speed
Safety Performance Metrics
• Data set:
  – Number of OSHA recordables
  – Number of lost-time injuries (LTI)
  – Number of fatalities
• No fatalities on any of the projects, so that metric was not used
• Normalized based on hours worked or per $M
• No major differences, except that the distribution was wider for non-IPD (more extreme values)
Project Change Performance Metrics
• Three types
  – Total percent of change
  – Reason for change
    • Project additions
    • Design-related changes (design changes, design coordination, design errors)
  – Average change order processing time (initiation to approval)
Project Change Performance Metrics
• Overall percent of change
  – Non-IPD greatest range and median
  – Then IPD, then IPD-ish
• Program changes
  – IPD greatest range but lower median than non-IPD; IPD-ish moderate range but lowest median
Project Change Performance Metrics
• Design-related changes
  – IPD-ish greatest distribution and median; IPD least (significantly smaller distribution)
• Change order processing time
  – Non-IPD highest, then IPD-ish, then IPD
Communication Performance Metrics
• Direct means of communication and process inefficiencies
  – RFIs
    • Number (# RFIs / project cost)
    • Processing time
  – Rework
  – Resubmittals
Communication Performance Metrics
• RFIs per million and RFI processing time
  – Non-IPD greatest distribution and highest median; IPD-ish smallest distribution but middle median; IPD smallest median
• Rework
  – Not statistically significant
• Resubmittals per million
  – IPD and IPD-ish less than non-IPD
Labor Performance Metrics
• Three metrics
  – Extent to which additional labor was used (OT, 2nd shift, over-manning)
  – Trend of percent plan complete (measure of workflow reliability based on # of actual task completions / # of planned tasks)
  – Labor factor (ratio of total cost of self-performed work / labor cost of self-performed work)
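The two ratio metrics above can be illustrated with invented weekly task counts (for PPC) and cost figures (for the labor factor); none of these numbers come from the study.

```python
# Percent plan complete (PPC): actual task completions / planned tasks,
# tracked per week so the trend over time can be observed.
completed_tasks = [18, 22, 25, 27]   # hypothetical actual completions per week
planned_tasks = [25, 26, 28, 28]     # hypothetical planned tasks per week
ppc = [c / p for c, p in zip(completed_tasks, planned_tasks)]

# Labor factor: total cost of self-performed work / labor cost of that work.
total_cost_self_performed = 3_200_000   # $, hypothetical
labor_cost_self_performed = 1_000_000   # $, hypothetical
labor_factor = total_cost_self_performed / labor_cost_self_performed

print([round(x, 2) for x in ppc], labor_factor)
# → [0.72, 0.85, 0.89, 0.96] 3.2
```

A rising PPC series like this one is what the study describes as a "more positive trend" in work-flow reliability.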
Labor Performance Metrics
• IPD projects used less extra labor than non-IPD
• IPD projects had a higher labor factor (potentially suggesting increased inefficiency)
• IPD projects showed a more positive trend in PPC (percent plan complete)
Environmental Performance Metrics
• Two metrics
  – Total value of construction waste (tons/$M)
  – Percent of waste recycled versus waste sent to landfills
• Results
  – Non-IPD projects produced almost 2x as much waste as IPD (median)
  – Recycling on IPD projects only slightly higher
Business Performance Metrics
• Two metrics
  – Overhead and profit (combined)
  – Return business
• Results:
  – A few IPD projects had OH&P between 11% and 15%; no non-IPD project exceeded 10%
  – Even the lowest return-business feedback for IPD was positive; non-IPD included some very negative feedback
Conclusion
• IPD projects displayed superior performance
  – 14 different metrics
  – Metrics across six of the nine performance areas
  – If the p-value strictness were relaxed, the advantage would have been even greater
• "IPD delivers higher quality projects faster at no significant cost premium"
End Notes
• 1. El Asmar, Mounir, Ph.D., M.ASCE; Hanna, Awad S., Ph.D., F.ASCE; and Loh, Wei-Yin, Ph.D. "Quantifying Performance for the Integrated Project Delivery System as Compared to Established Delivery Systems." ASCE J. Constr. Eng. Manage., 2012, vol. 139.
• 2. Cho, S., and Ballard, G. (2011). "Last Planner and Integrated Project Delivery." Lean Constr. J., 67-78.