Apr 18, 2003, Meeting with the DOE. Information Technology and Computing Infrastructure for U.S. LHC Physics. Lothar A.T. Bauerdick, Fermilab, Project Manager, U.S. CMS Software and Computing.
Transcript
Page 1

Information Technology and Computing Infrastructure for U.S. LHC Physics

Lothar A.T. Bauerdick, Fermilab
Project Manager, U.S. CMS Software and Computing

Page 2

LHC Physics Discovery through Information Technology and Computing Infrastructure

LHC Computing Unprecedented in Scale and Complexity (and Costs)

Advanced Coherent Global “Information-Infrastructure”

Partnerships: International, Interdisciplinary, Inter-agency!


Page 3

LHC Physics Discoveries by Researchers at U.S. Universities and Labs

U.S. LHC is Committed to Empower the U.S. Scientists to do Research on LHC Physics Data

This is why we are interested in the Grid as an Enabling Technology

Page 4

Distributed Grid Computing and Data Model

Adopted as the LHC Baseline Model: Distributed Communities, Distributed Resources

Coherent Global Data Access, Analysis, Management

Major Successes and Advances of Grid Infrastructure in the U.S.:

R&D and Testbeds: Prototyping and Hardening of Grid Infrastructure, Deploying Grid Tools, Developing and Integrating Grid Applications

Example: US Atlas Grid Testbed

Page 5

Tiered System of Regional Centers

[Diagram: the LHC Experiment's Online System feeds the CERN Computer Center (Tier 0, >20 TIPS) at 100-200 MBytes/s; national Tier-1 centers (USA, Japan, France, UK); Tier-2 centers; institutes (Tier 3); physics caches on PCs and other portals (Tier 4); wide-area links of ~0.6, 1, 2.5, and 2.5-10 Gbits/s connect the tiers.]

Developing further the hierarchically organized fabric of “Grid Nodes” …
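The tiered fan-out above lends itself to a simple tree model. The sketch below is a toy illustration: the class and site names are invented here, only the link speeds come from the diagram, and `transfer_hours` merely converts a dataset size and a link bandwidth into an idealized full-rate transfer time.

```python
from dataclasses import dataclass, field

@dataclass
class Center:
    """One node in a tiered hierarchy of regional centers (illustrative)."""
    name: str
    tier: int
    uplink_gbps: float              # bandwidth of the link toward the parent tier
    children: list = field(default_factory=list)

def transfer_hours(size_tb: float, link_gbps: float) -> float:
    """Hours to move size_tb terabytes over a link_gbps link at full rate."""
    bits = size_tb * 8e12           # 1 TB = 8e12 bits
    return bits / (link_gbps * 1e9) / 3600.0

# Hypothetical fragment of the hierarchy, using link speeds from the diagram.
cern = Center("CERN Computer Center", 0, 0.0)
tier1 = Center("U.S. Tier-1", 1, 2.5)       # 2.5 Gbits/s toward Tier 0
tier2 = Center("Tier2 Center", 2, 0.6)      # ~0.6 Gbits/s toward Tier 1
cern.children.append(tier1)
tier1.children.append(tier2)
```

At 2.5 Gbits/s, shipping a single terabyte between tiers already takes close to an hour at full rate, which is one reason sustained wide-area throughput was a first-order design concern for the tiered model.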

Page 6

Transition to Production-Quality Grid

Centers taking part in LHC Grid 2003 Production Service: Around the World, Around the Clock!

Page 7

…towards Dynamic Workspaces for Scientists

Communities of Scientists Working Locally within a Global Context
Infrastructure for sharing, consistency of physics and calibration data, software

New IT Needed!

Page 8

LHC Research Program Has Started Strongly!

Sizable R&D efforts and major investments in Tier-1 centers started

First Grid Infrastructure in place, in collaboration with the LHC Computing Grid Project at CERN and elsewhere

U.S. LHC Scientists will profit in major ways: develop the strong U.S. LHC environment for Physics Analysis

Address the core issues in U.S. LHC S&C: developing and implementing the distributed computing model, central for the success of U.S. Universities' participation. Focus on end-to-end services; focus on distributed data access and management

Work with Grid Projects like PPDG, NSF projects, DOE Science Grid etc.; work with CERN and other centers around the world to set up a global information infrastructure (“info-structure”) to enable the U.S. for scientific discovery at the LHC

Page 9

The Goal:

Provide capabilities to individual physicists and communities of scientists that allow them:
To participate as an equal in the LHC research
To be fully represented in the Global Experiment Enterprise
To receive on-demand whatever resources and information they need to explore their science interest while respecting the collaboration-wide priorities and needs

Provide massive computing, storage, networking resources, including “opportunistic” use of resources that are not LHC-owned!

Provide full access to dauntingly complex “meta-data” that need to be kept consistent to make sense of the event data

Collaborative Environment and Info-systems

Page 10

These Goals Require Substantial R&D

Global Access and Global Management of Massive and Complex Data
Location Transparency of Complex Processing Environments and of Vast Data Collections
Virtual Data, Workflow, Knowledge Management Technologies
Monitoring, Simulation, Scheduling and Optimization on a Heterogeneous Grid of Computing Facilities and Networks
End-to-End Networking Performance, Application Integration
Management of Virtual Organizations across the Grid; Technologies and Services for Security, Privacy, Accounting
Scientific Collaboration over the distance
Etc …

Grids are the Enabling Technology; LHC Needs are Pushing the Limits

Technology and Architecture still evolving: New Research and Development in IT is required

Page 11

U.S. LHC Grid Technology Cycles

“Rolling Prototypes”: evolution of the facility and data systems; prototyping and early roll-out to production-quality services

Participation in Computing and Physics Data Challenges

Emphasis on Quality, Documentation, Dissemination, Tracking of external “practices”

Page 12

Grid Testbeds And Production Grids

Expressions of interest: MIT, Rice, Minnesota, Iowa, Princeton; Brazil, South Korea

Grid Testbeds: Research, Development and Dissemination!
LHC Grid Testbeds: first real-life large Grid installations, becoming production quality
Strong Partnership between Labs and Universities, with Grid (iVDGL, GriPhyN, PPDG) and Middleware Projects (Condor, Globus)
Strong dissemination component, together with Grid Projects

E.g. U.S. CMS Testbed: Caltech, UCSD, U.Florida, UW Madison, Fermilab, CERN

Page 13

Example: Grid Monitoring and Information Services

MonALISA Monitoring System (Caltech) deployed in U.S. & CERN Grid Testbed

Dynamic information services and Grid resource discovery mechanisms using “intelligent agents”

Use and further develop novel Grid Technologies and Grid Interfaces

“Grid Control Room” for the LHC Grid
Technology driver for other projects
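The dynamic-discovery idea can be illustrated with a toy registry in which sites publish load reports and a client picks the least-loaded site among reports that are still fresh. This is a sketch of the pattern only; the class and method names are invented here and are not the MonALISA API.

```python
class MonitorRegistry:
    """Toy grid-monitoring registry (illustrative, not the MonALISA interface).

    Sites publish load metrics with a timestamp; discovery returns the
    least-loaded site among reports no older than max_age_s seconds."""

    def __init__(self, max_age_s: float = 60.0):
        self.max_age_s = max_age_s
        self._reports = {}          # site -> (load, publish_time)

    def publish(self, site: str, load: float, now: float) -> None:
        """Record (or overwrite) a site's latest load report."""
        self._reports[site] = (load, now)

    def discover(self, now: float) -> str:
        """Return the least-loaded site among fresh reports."""
        fresh = {site: load for site, (load, t) in self._reports.items()
                 if now - t <= self.max_age_s}
        if not fresh:
            raise LookupError("no fresh monitoring reports")
        return min(fresh, key=fresh.get)
```

Expiring stale reports is the essential ingredient: a scheduler steering jobs by last week's load would do worse than no monitoring at all, which is why dynamic, agent-refreshed information services mattered for the Grid.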

Page 14

Distributed Collaborative Engineering

“Projectization” essential for a Software and Computing Effort of this complexity
Requires expert manpower and engineering
Physics and Detector Groups at Universities are the first to profit from this

Example: Atlas Detector Geometry Description Databases

Idea and Concept: Geometry Modeller based on CDF [U. Pittsburgh]

Massive Development Effort:

NOVA MySQL Database [BNL] – Repository of persistent configuration information

NOVA Service [ANL] – Retrieval of transient C++ objects from NOVA Database

Conditions Database Service [ANL/Lisbon] – Access to time-varying information based on type, time, version and key; used in conjunction with other persistency services (e.g. NOVA Service)

Interval Of Validity Service [LBNL] – Registration of clients; retrieval of updated information when validity expires; caching policy management

Release as scheduled to Detector and Physics Groups
Prototype at Silicon alignment workshop in December 2002
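The Conditions Database and Interval-of-Validity services above follow a common pattern: each payload is stored with a validity interval and a version, and retrieval by time returns the highest version whose interval covers that time. The sketch below illustrates that pattern with invented names; it is not the actual ANL/Lisbon or LBNL interface.

```python
from dataclasses import dataclass
from typing import Any

@dataclass
class CondRecord:
    """One conditions payload with its interval of validity (illustrative)."""
    start: float        # validity start (e.g. run number or timestamp)
    end: float          # validity end, exclusive
    version: int
    payload: Any

class ConditionsStore:
    """Toy conditions database: records are filed under a folder name (the
    'type' and key), and lookup by time picks the highest-version record
    whose interval of validity covers that time."""

    def __init__(self):
        self._folders = {}          # folder name -> list of CondRecord

    def add(self, folder: str, rec: CondRecord) -> None:
        self._folders.setdefault(folder, []).append(rec)

    def get(self, folder: str, t: float) -> CondRecord:
        hits = [r for r in self._folders.get(folder, []) if r.start <= t < r.end]
        if not hits:
            raise KeyError(f"no conditions for {folder!r} at t={t}")
        return max(hits, key=lambda r: r.version)
```

A client-side Interval-of-Validity service would sit in front of such a store, caching the current record and refetching only when its interval expires, which is the caching-policy role the slide assigns to the LBNL service.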

Page 15

Example: Detector Description

[Figure: detail from the TRT; detail from the Barrel Liquid Argon calorimeter (parameterized, 40 kB in memory)]

Geometry Modeller, Database, Visualization, Optimization

Page 16

“Work Packages” for LHC Computing

Facilities and Fabric Infrastructure: U.S. Tier-1 and Tier-2 centers, U.S. University infrastructure

Distributed Computing Infrastructure: Networks, throughput, servers, catalogs

Grid Services: Middleware, “Virtual Organizations” support, end-to-end and higher-level services, troubleshooting and fault tolerance, distributed science environment

Experiment-Specific Software: Core software, frameworks, architectures, applications, physics and detector support

Collaboratory Tools and Support: Communication, conferencing, sharing, Virtual Control Room

Support Services: Training, info services, help desk

Page 17

“Map” of Grid Projects Directly Related to LHC, e.g. the U.S. Particle Physics Data Grid (PPDG), funded by SciDAC

Grid Projects are a Large International Effort

Page 18

Partnership for Global Infostructure

Physics + Computer Science/Information Technology Funding Agencies

I. Gaines, 4-Agency meeting at CERN, March 21st, 2003

Page 19

DOE Labs have great impact on U.S. LHC Science

Inter-agency partnership between the DOE-funded Tier-1 and NSF-funded Tier-2 efforts

Tier-1s at Fermilab and BNL provide the major Grid Infrastructure expertise and 24x7 Support

Information Technology and Computing Infostructure for LHC Physics Discovery requires Research, Development, Deployment, Dissemination, and Sustained Reliable Running of Facilities and Support Services!

Page 20

The U.S. LHC Mission is Physics Discovery at the Energy Frontier!

This partnership takes advantage of the significant strengths of U.S. labs and universities in the area of CS and IT, and exploits the synergy between U.S. Universities and National Labs, Software Professionals and Physicists, Computer Scientists and High Energy Physicists

LHC is amongst the first to put a truly distributed “Info-Structure” in place, spearheading important innovations in how we do science

