Computing for LHC. Dr. Wolfgang von Rüden, CERN, Geneva. ISEF students visit CERN, 28th June - 1st July 2009

Transcript
Transcript
Page 1: Computing for LHC. Dr. Wolfgang von Rüden, CERN, Geneva. ISEF students visit CERN, 28th June - 1st July 2009.

Computing for LHC

Dr. Wolfgang von Rüden, CERN, Geneva

ISEF students visit CERN, 28th June - 1st July 2009

Page 2:

LHC Computing

Page 3:

The LHC Computing Challenge

• Signal/Noise: 10^-9
• Data volume: high rate * large number of channels * 4 experiments → 15 PetaBytes of new data each year
• Compute power: event complexity * number of events * thousands of users → 100 k of (today's) fastest CPUs, 45 PB of disk storage
• Worldwide analysis & funding: computing funded locally in major regions & countries, efficient analysis everywhere → GRID technology
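A quick back-of-envelope check of these figures (an illustrative sketch, not from the slides; the decimal byte units are my assumption):

```python
# Back-of-envelope check of the slide's figures (assumption: 1 PB = 10**15
# bytes, 1 MB = 10**6 bytes; these decimal units are my choice, not the slide's).
PETABYTE = 10**15
SECONDS_PER_YEAR = 365 * 24 * 3600

data_per_year = 15 * PETABYTE                      # new data per year, all experiments
avg_rate_mb_s = data_per_year / SECONDS_PER_YEAR / 10**6

print(f"average sustained rate: {avg_rate_mb_s:.0f} MB/s")  # roughly 476 MB/s
```

That average sits just below the ~650 MB/s transfer requirement quoted later in the deck, which leaves some headroom above the plain yearly average.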

Page 4:

Wolfgang von Rüden, CERN 4

Particle collisions in the centre of a detector

June 2009

Page 5:


Massive Online Data Reduction


Page 6:


Tier 0 at CERN: Acquisition, First-pass processing, Storage & Distribution


Page 7:

Tier 0 – Tier 1 – Tier 2


Tier-0 (CERN):
• Data recording
• Initial data reconstruction
• Data distribution

Tier-1 (11 centres):
• Permanent storage
• Re-processing
• Analysis

Tier-2 (~130 centres):
• Simulation
• End-user analysis
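The tier roles above can be captured in a small illustrative mapping (a sketch for orientation only, not CERN software):

```python
# Illustrative only: the WLCG tier model as a plain dictionary,
# mirroring the roles listed on this slide.
TIER_ROLES = {
    "Tier-0 (CERN)": ["data recording", "initial data reconstruction", "data distribution"],
    "Tier-1 (11 centres)": ["permanent storage", "re-processing", "analysis"],
    "Tier-2 (~130 centres)": ["simulation", "end-user analysis"],
}

for tier, roles in TIER_ROLES.items():
    print(f"{tier}: {', '.join(roles)}")
```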


Page 8:

Recent grid activity

These workloads are at the level anticipated for 2009 data. In readiness testing, WLCG ran more than 10 million jobs per month (one job is roughly 8 hours of use of a single processor), peaking at about 350k jobs per day.
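Those job counts are consistent with the "100 k CPUs" figure earlier in the deck; a sketch of the arithmetic (my assumptions: a 30-day month, and each job fully occupying one processor for its 8 hours):

```python
# Sanity check of the workload figures (assumptions: 30-day month,
# one job = 8 hours of exclusive use of one processor).
jobs_per_month = 10_000_000
hours_per_job = 8
hours_per_month = 30 * 24          # 720

busy_cpus = jobs_per_month * hours_per_job / hours_per_month
jobs_per_day = jobs_per_month / 30

print(f"~{busy_cpus:,.0f} CPUs continuously busy, ~{jobs_per_day:,.0f} jobs/day")
```

About 111,000 processors continuously busy and about 333,000 jobs per day, in line with the slide's 100 k CPUs and 350k/day.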


Page 9:

Data transfer out of Tier-0

• The full experiment rate needed is 650 MB/s
• The goal is the capability to sustain twice that, to allow Tier-1 sites to shut down and recover
• Far in excess of that has been demonstrated
• All experiments exceeded the required rates for extended periods, and simultaneously
• All Tier-1s achieved (or exceeded) their target acceptance rates

Tier-2s and Tier-1s are inter-connected by the general-purpose research networks. Any Tier-2 may access data at any Tier-1.
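The factor of two in the target rate has a simple consequence, sketched below (illustrative arithmetic under my own assumptions of decimal units and a full day's outage):

```python
# Why 2x the nominal 650 MB/s matters (assumption: 1 TB = 10**12 bytes).
# If a Tier-1 is down for a day, the spare capacity above the nominal
# rate is what drains the accumulated backlog.
nominal = 650e6                      # bytes/s out of Tier-0
target = 2 * nominal                 # desired sustainable rate
backlog = nominal * 86400            # one day's data, in bytes

catchup_h = backlog / (target - nominal) / 3600
print(f"backlog {backlog/1e12:.1f} TB, cleared in {catchup_h:.0f} h at 2x rate")
```

At exactly twice the nominal rate, a one-day outage takes one further day to recover while keeping up with new data, which is why the headroom is worth designing in.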

[Diagram: the Tier-1 centres (IN2P3, TRIUMF, ASCC, FNAL, BNL, Nordic, CNAF, SARA, PIC, RAL, GridKa), each connected to multiple Tier-2 sites]


Page 10:


WLCG depends on two major science grid infrastructures…

EGEE (Enabling Grids for E-sciencE) and OSG (the US Open Science Grid), as well as many national grid projects.

Interoperability and interoperation are vital; significant effort has gone into building the procedures to support them.


Page 11:

Enabling Grids for E-sciencE

EGEE-II INFSO-RI-031688

• 240 sites
• 45 countries
• 45,000 CPUs
• 12 PetaBytes
• > 5,000 users
• > 100 VOs
• > 100,000 jobs/day

Archeology, Astronomy, Astrophysics, Civil Protection, Comp. Chemistry, Earth Sciences, Finance, Fusion, Geophysics, High Energy Physics, Life Sciences, Multimedia, Material Sciences, …

[Charts: EGEE growth from Apr-04 to Jul-07: No. CPU rising on an axis up to 50,000, and No. Sites rising on an axis up to 300]

A grid infrastructure project co-funded by the European Commission, now in its 3rd phase with over 100 partners.

