NCAR Supercomputing ‘Data Center’ Project



An NCAR-led computing ‘facility’ for the study of the Earth system

Outline

• A Problem
• An Opportunity
  – NSF’s Petascale Roadmap
• A Solution
  – Facility Proposal: Site and Cost
  – Partners
• The Scientific Payoff
• Next Steps
  – Schedule and Panels

NCAR Leadership in Supercomputing…

• One of the founding missions of NCAR was: “… to provide, or arrange for provision of, facilities for the scientific community as a whole whose initial cost and upkeep lie beyond the capability of individual universities or research groups.” – Preliminary Plans for a National Institute for Atmospheric Research, 1959 (NCAR Blue Book)

• Note: the wording does not imply physical collocation.
• This mission does confer a responsibility that cannot be delegated - namely, maintaining a complete, integrated cyberinfrastructure (CI) system for modeling and data analysis that meets our scientific community’s needs.

Examples of NCAR simulation science today

• Global change climate ensembles

• Weather Research and Forecasting (WRF) model

• Geophysical Turbulence

• Fire storm front modeling

• Space weather

• More…

A problem

• NCAR Mesa Lab computer facility is quickly becoming obsolete

• Power, cooling and floor space will be inadequate beyond the next procurement

• Science is being restricted by focusing on capacity ahead of capability

CMOS Trends Continue …

Chips: Faster, Cheaper but Hotter

SCD Computer Facility Equipment Power Consumption (kW)

[Chart: equipment power consumption, Jan 1997 - Jan 2006, vertical axis 0-1,000 kW. Systems shown include: Cray C90, Cray T3D, Cray J90s, HP SPP2000, SGI Origin2000 (ute, dataproc), SGI Origin3800, Compaq ES40, IBM POWER3 (blackforest & babyblue), IBM POWER4 (bluesky, thunder & bluedawn), IBM Linux cluster, IBM BlueGene/L, IBM POWER5 (bluevista), plus network & enterprise systems.]
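The power trend in the chart is roughly exponential. As a purely illustrative sketch (the 100 kW and 1,000 kW endpoints below are assumptions taken from the chart's axis range, not NCAR figures), a constant doubling time can be fitted to two endpoints and extrapolated:

```python
import math

# Illustrative extrapolation of machine-room power draw, assuming growth
# from ~100 kW (Jan 1997) to ~1,000 kW (Jan 2006); both values are
# assumptions for the sake of the sketch.
kw_1997, kw_2006 = 100.0, 1000.0

# Doubling time implied by exponential growth between the two endpoints.
doubling_years = (2006 - 1997) / math.log2(kw_2006 / kw_1997)

def projected_kw(year):
    """Extrapolate the fitted exponential to a given year."""
    return kw_1997 * 2 ** ((year - 1997) / doubling_years)

print(f"doubling time: {doubling_years:.1f} years")        # ~2.7 years
print(f"projected 2010 load: {projected_kw(2010):.0f} kW") # ~2.8 MW
```

Under these assumptions the load would outgrow the Mesa Lab's electrical envelope within one more procurement cycle, which is the problem the slides describe.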

An Opportunity

NSF’s Petascale Roadmap

“Overarching Recommendation: Establish a Petascale Collaboratory for the Geosciences with the mission to provide leadership-class computational resources that will make it possible to address, and minimize the time to solution of, the most challenging problems facing the geosciences.”

www.joss.ucar.edu/joss_psg/meetings/petascale/

Strategic Plan for High Performance Computing (2006-2010)

[Diagram: NSF HPC ecosystem. The Science & Engineering Community is served by HPC Resource Providers (science-driven HPC systems: compute engines, local storage, visualization facilities) and by Software Service Providers (SSPs) delivering portable, scalable applications software & services, with the Private Sector and Agency Partners contributing.]

NSF Conclusions

• NSF is committed to developing and implementing a strategic plan for cyberinfrastructure
  – Broad-based plan involving the universities, Federal agencies, vendors, and international partners
• ATM, OCE, and EAR take different approaches to the realization of CI for their disciplines
  – Dependent on the readiness of the community
• A petascale facility is an integrating theme for the Geosciences community
  – High potential for the generation of new knowledge and paradigms for the conduct of research
  – Building and sustaining a petascale facility will be a significant challenge to budgets and technology
  – Consistent with NSF’s strategic vision for CI

A solution for NCAR

• A new computing facility (not at the Mesa Lab)
• Extensive investigations, working with consultants and assessing internal needs, resulted in a detailed set of options
• Provides for 5-20 years of computing (capacity and capability) diversity, based on current and predicted future trends in CMOS technology
• Allows NCAR to reach beyond its current research scope

The facility needed

• Data Center Expansion Report from NCAR’s Computing and Information Systems Lab
• 20,000 sq. ft. initially, expandable to 60,000 sq. ft.
• 4 MW power (to 13 MW) + generators
• Cooling, etc.
• On 13 acres (20-year lifetime)
• Accommodates computers, staff, open space, and initial and future requirements
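The MW and square-footage figures above imply a particular machine-room power density. A quick back-of-envelope check (the helper function is illustrative; only the 4/13 MW and 20,000/60,000 sq. ft. figures come from the slide):

```python
# Sanity check of the facility sizing: average electrical load per
# square foot at initial and full build-out. Figures are from the slide;
# the function itself is just illustrative arithmetic.

def power_density_w_per_sqft(power_mw, floor_sqft):
    """Average electrical load (W) per square foot of machine-room floor."""
    return power_mw * 1e6 / floor_sqft

initial = power_density_w_per_sqft(4, 20_000)    # initial build-out
full = power_density_w_per_sqft(13, 60_000)      # full build-out

print(f"initial: {initial:.0f} W/sq ft")  # 200 W/sq ft
print(f"full:    {full:.0f} W/sq ft")     # ~217 W/sq ft
```

Both figures sit well above the ~50-100 W/sq ft typical of mid-2000s office-grade server rooms, which is why a purpose-built facility, rather than further Mesa Lab retrofits, is the proposed answer.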

Bird’s Eye View

Architectural View

Phase 2 Addition

Phase 3 Addition

Importance of Site Selection

• Limited selection of sites that meet the criteria
  – Size (10-15 acres)
  – Electrical capacity (up to 24 MW)
  – Fiber optic route (dark fiber)
• Investigated: Marshall, Louisville, Longmont, Westminster
  – Constraints encountered: water, political complications, fiber optics, electrical capacity
• New partners and options are now being sought
  – IBM
  – Colorado School of Mines
  – Colorado State University
  – University of Colorado
  – University of Wyoming

Cost Drivers

• Primary Drivers
  – Tier III Reliability
    • Mechanical Systems
    • Electrical Systems
  – Engineering
• Secondary Drivers
  – Building Size
  – Land Site
• Facility - up to $75M (one time)
• Operations - $15M/year? (2X)
• Computing increments - $15M/year (2X)
• Computing infrastructure - $5M/year

The Scientific Payoff…

A petascale computer will enable scientists to …

• Do credible regional climate modeling for decision support. Requires resolving individual mountain ranges and ocean western boundary currents.

• Model climate and weather in a fully coupled mode.
• Better understand the marine biogeochemical cycles. Requires resolving ocean mesoscale eddies.
• Accurately simulate the dynamical, microphysical and radiative cloud processes.
• Improve seismic predictions and understand the structure of the inner core as well as the fine structure of the lower mantle.
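The resolution requirements behind these goals drive cost steeply. A common rule of thumb for 3-D atmosphere/ocean models (not stated in the slides, and ignoring vertical refinement) is that halving the horizontal grid spacing multiplies the work by roughly eight - twice the points in each horizontal dimension, plus a halved timestep under the CFL stability condition:

```python
# Rule-of-thumb cost scaling for refining a 3-D geoscience model's
# horizontal resolution. Illustrative only - not an NCAR benchmark.

def relative_cost(dx_old_km, dx_new_km):
    """Approximate cost multiplier when refining grid spacing dx_old -> dx_new."""
    refine = dx_old_km / dx_new_km
    # (x points) * (y points) grow as refine**2; the CFL condition
    # shrinks the stable timestep by 1/refine, so timestep count grows as refine.
    return refine ** 2 * refine

# e.g. going from a ~100 km climate grid to a ~10 km eddy- and
# mountain-resolving grid:
print(relative_cost(100, 10))  # 1000.0
```

That three-orders-of-magnitude jump from terascale-era climate grids is why the payoffs listed here are tied specifically to a petascale machine.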

A petascale computer will enable scientists to

• Perform new research in solid earth and environmental engineering

• Assimilate thousands of earthquakes, bringing the fine structure of the Earth’s mantle and inner core into focus.

• Study the physical basis of land surface parameterizations by modeling soils, topography and vegetation at sub-meter scales.

• More accurately predict the damaging effects of solar flares on satellites and power distribution systems by resolving the fine structure of the corona magnetic field.

• Investigate energy management applications

Science examples …

2005 Hurricane Katrina Track Forecast

[Figure: 4 km, 62 h and 12 km, 86 h WRF track forecasts compared with the observed track, the official forecast, and mobile radar positions; landfall on 8/29 14Z at the Louisiana/Mississippi border.]

Hurricane Katrina Reflectivity at Landfall

[Figure: 4 km WRF 62 h forecast vs. radar composite reflectivity, 29 Aug 2005 14Z.]

WRF 4 km Hurricane Katrina 72 h Forecast

[Animation: WRF maximum reflectivity, initialized 27 Aug 2005 00Z.]

Coupled Climate System Model


Integrated Space Weather Modeling


Thus …

Main Points

• Huge scientific discoveries await geoscience modelers at 1 PFLOPS and beyond.
• CMOS continues to get hotter and cheaper. The most recent acquisition tracks this trend.
• Every center is (or will be) facing facility challenges in the race to these discoveries. This situation is NOT unique to NCAR.
• NCAR now has a facility plan that, if successful, uniquely positions it as a world leader in geoscience simulation.

• The new facility is not a crisis: it is an opportunity.

The Opportunity

• Understanding of fundamental physical processes in the Sun-Earth system

• Environmental and Energy applications not yet possible

• NCAR and partners will scope/define these options

– Such a facility would be a computational equivalent of the Hubble Telescope for geoscience simulation.

Next Steps

The Schedule

• Formed NCAR project committee
• Form Blue Ribbon Panel; hold teleconference mid-Oct. 2005, meet mid-Nov.
• Project plan development - Oct-Dec
• Community engagement - Nov-Jan
• Formalize partnerships - Oct-Dec
• Present initial plan to National Science Foundation - mid-October 2005
• Forge international collaborations - Nov. 2005
• Complete project plan - Feb. 2006
• Initiate facility - June 2006?
• First electrons - June 2008 to March 2009?

Contacts at NCAR

• Tim Killeen (killeen@ucar.edu) - NCAR Director

• Lawrence Buja (southern@ucar.edu) and Peter Fox (pfox@ucar.edu) are co-chairs of the NCAR project team

• Aaron Anderson (aaron@ucar.edu) is the computing facilities contact

• Jeff Reaves (jreaves@ucar.edu) is the financial/contracts contact

Concluding remarks …