Page 1: LCG Applications Area

LCG Applications Area

Torre Wenaus, BNL/CERN

LCG Applications Area Manager

http://cern.ch/lcg/peb/applications

DOE/NSF Review of US LHC Physics and Computing Projects

January 14, 2003

Page 2: LCG Applications Area


The LHC Computing Grid Project Structure

[Organization chart] Project Overview Board; Software and Computing Committee (SC2) — requirements, work plan, monitoring via RTAGs; Project Execution Board (PEB) and Project Leader; Project Work Packages (WPs); Grid Projects.

Page 3: LCG Applications Area


LCG Areas of Work

Fabric (Computing System)
  Physics Data Management
  Fabric Management
  Physics Data Storage
  LAN Management
  Wide-area Networking
  Security
  Internet Services

Grid Technology
  Grid middleware
  Standard application services layer
  Inter-project coherence/compatibility

Physics Applications Software
  Application Software Infrastructure – libraries, tools
  Object persistency, data management tools
  Common Frameworks – Simulation, Analysis, ..
  Adaptation of Physics Applications to Grid environment
  Grid tools, Portals

Grid Deployment
  Data Challenges
  Grid Operations
  Network Planning
  Regional Centre Coordination
  Security & access policy

Page 4: LCG Applications Area


Applications Area Organization

[Organization diagram] Apps Area Leader (overall management, coordination, architecture); Project Leaders; Work Package Leaders; Architects Forum; direct technical collaboration between experiment participants, IT, EP, ROOT, LCG personnel.

Page 5: LCG Applications Area


Focus on Experiment Need

Project structured and managed to ensure a focus on real experiment needs

SC2/RTAG process to identify, define (need-driven requirements), initiate and monitor common project activities in a way guided by the experiments themselves

Architects Forum to involve experiment architects in day to day project management and execution

Openness of information flow and decision making
Direct participation of experiment developers in the projects
Tight, iterative feedback loop to gather user feedback from frequent releases
Early deployment and evaluation of LCG software in experiment contexts
Success defined by experiment adoption and production deployment

Page 6: LCG Applications Area


Applications Area Projects

Software Process and Infrastructure (SPI) (operating – A.Aimar)
  Librarian, QA, testing, developer tools, documentation, training, …
Persistency Framework (POOL) (operating – D.Duellmann)
  POOL hybrid ROOT/relational data store
Mathematical libraries (operating – F.James)
  Math and statistics libraries; GSL etc. as NAGC replacement
  Group in India will work on this (workplan in development)
Core Tools and Services (SEAL) (operating – P.Mato)
  Foundation and utility libraries, basic framework services, system services, object dictionary and whiteboard, grid enabled services
Physics Interfaces (PI) (launched – V.Innocente)
  Interfaces and tools by which physicists directly use the software. Interactive (distributed) analysis, visualization, grid portals
Simulation (launch planning in progress)
  Geant4, FLUKA, simulation framework, geometry model, …
Generator Services (launch as part of simu)
  Generator librarian, support, tool development

Bold: Recent developments (last 3 months)

Page 7: LCG Applications Area


Project Relationships

[Diagram] LCG Applications Area projects — Software Process & Infrastructure (SPI), Core Libraries & Services (SEAL), Persistency (POOL), Physicists Interface (PI), Math Libraries, … — shown in relation to the LHC experiments and to other LCG projects in other areas.

Page 8: LCG Applications Area


Candidate RTAG timeline from March (quarters 02Q1–04Q2); blue: RTAG/activity launched or (light blue) imminent.

Candidate topics: simulation tools, detector description & model, conditions database, data dictionary, interactive frameworks, statistical analysis, detector & event visualization, physics packages, framework services, C++ class libraries, event processing framework, distributed analysis interfaces, distributed production systems, small scale persistency, software testing, software distribution, OO language usage, LCG benchmarking suite, online notebooks.

Page 9: LCG Applications Area


LCG Applications Area Timeline Highlights

[Timeline chart, 2002–2005 by quarter; rows: LCG, Applications]

Milestones shown: LCG launch week; Architectural blueprint complete; POOL V0.1 internal release; Hybrid Event Store available for general users; First Global Grid Service (LCG-1) available; Distributed production using grid services; LCG-1 reliability and performance targets; Distributed end-user interactive analysis; Full Persistency Framework; “50% prototype” (LCG-3); LCG TDR.

Page 10: LCG Applications Area


Architecture Blueprint

Executive summary
Response of the RTAG to the mandate
Blueprint scope
Requirements
Use of ROOT
Blueprint architecture design precepts
  High level architectural issues, approaches
Blueprint architectural elements
  Specific architectural elements, suggested patterns, examples
Domain decomposition
Schedule and resources
Recommendations

RTAG established in June
After 14 meetings, much email...
A 36-page final report
Accepted by SC2 October 11

http://lcgapp.cern.ch/project/blueprint/

Page 11: LCG Applications Area


Component Model

Granularity driven by component replacement criteria; development team organization; dependency minimization

Communication via public interfaces
Plug-ins
  Logical module encapsulating a service that can be loaded, activated and unloaded at run time
APIs targeted not only to end-users but to embedding frameworks and internal plug-ins
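For illustration only, a minimal C++ sketch of this component/plug-in pattern; the names (IService, PluginManager, MessageService) are invented here and are not the actual SEAL interfaces.

    // Illustrative sketch of the component/plug-in model described above.
    // Names (IService, PluginManager, MessageService) are invented, not SEAL APIs.
    #include <functional>
    #include <iostream>
    #include <map>
    #include <memory>
    #include <string>

    struct IService {                       // public, implementation-neutral interface
        virtual ~IService() = default;
        virtual std::string name() const = 0;
    };

    class PluginManager {                   // loads and activates components at run time
    public:
        using Factory = std::function<std::unique_ptr<IService>()>;
        void registerFactory(const std::string& id, Factory f) { factories_[id] = std::move(f); }
        std::unique_ptr<IService> create(const std::string& id) const {
            auto it = factories_.find(id);
            return it == factories_.end() ? nullptr : it->second();
        }
    private:
        std::map<std::string, Factory> factories_;
    };

    struct MessageService : IService {      // a concrete plug-in, seen only through IService
        std::string name() const override { return "MessageService"; }
    };

    int main() {
        PluginManager mgr;
        mgr.registerFactory("MessageService", [] { return std::make_unique<MessageService>(); });
        std::unique_ptr<IService> svc = mgr.create("MessageService");  // load + activate
        std::cout << svc->name() << '\n';
    }                                       // svc destroyed here: component "unloaded"

Replacing a component then amounts to registering a different factory behind the same public interface, which is the replacement criterion the granularity discussion refers to.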

Page 12: LCG Applications Area


Software Structure

[Layer diagram] Applications sit on the Simulation, Reconstruction, Visualization and other frameworks; these rest on the Basic Framework (implementation-neutral services), which draws on the Foundation Libraries (STL, ROOT libs, CLHEP, Boost, …) and on Optional Libraries (Grid middleware, ROOT, Qt, …).

Page 13: LCG Applications Area


Distributed Operation

Architecture should enable but not require the use of distributed resources via the Grid

Configuration and control of Grid-based operation via dedicated services

Making use of optional grid middleware services at the foundation level of the software structure
  Insulating higher level software from the middleware
  Supporting replaceability
Apart from these services, Grid-based operation should be largely transparent
Services should gracefully adapt to ‘unplugged’ environments
  Transition to ‘local operation’ modes, or fail informatively
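As a rough sketch of the ‘unplugged’ behaviour described above (class names invented, not LCG code):

    // Rough sketch only: insulating higher-level code from grid middleware and
    // falling back to a local mode when the grid is 'unplugged'.
    #include <iostream>
    #include <stdexcept>
    #include <string>

    struct IFileCatalog {                               // implementation-neutral interface
        virtual ~IFileCatalog() = default;
        virtual std::string lookup(const std::string& lfn) = 0;
    };

    struct GridFileCatalog : IFileCatalog {             // would wrap grid middleware
        std::string lookup(const std::string&) override {
            throw std::runtime_error("grid middleware unreachable");
        }
    };

    struct LocalFileCatalog : IFileCatalog {            // purely local operation
        std::string lookup(const std::string& lfn) override { return "/local/data/" + lfn; }
    };

    std::string resolve(IFileCatalog& grid, IFileCatalog& local, const std::string& lfn) {
        try {
            return grid.lookup(lfn);                    // prefer grid-based operation
        } catch (const std::exception& e) {
            std::cerr << "falling back to local mode: " << e.what() << '\n';
            return local.lookup(lfn);                   // adapt gracefully, or fail informatively
        }
    }

    int main() {
        GridFileCatalog grid;
        LocalFileCatalog local;
        std::cout << resolve(grid, local, "run123.events") << '\n';
    }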

Page 14: LCG Applications Area


Managing Objects

Object Dictionary
  To query a class about its internal structure
  Essential for persistency, data browsing, etc.
  The ROOT team and LCG plan to develop and converge on a common dictionary (common interface and implementation) with an interface anticipating a C++ standard (XTI) (Timescale ~1yr?)
    Will contact Stroustrup, who has started implementation
Object Whiteboard
  Uniform access to application-defined transient objects, including in the ROOT environment
  What this will be (how similar to Gaudi, StoreGate?) not yet defined
Object definition based on C++ header files
  Now that ATLAS as well as CMS will use this approach, it is being addressed in a common way via the LCG AA
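A toy C++ sketch of what “querying a class about its internal structure” can look like; the real LCG/ROOT dictionary is generated from header files rather than filled by hand, and these type names are invented.

    // Toy reflection sketch; a real dictionary would be generated from C++ headers
    // (e.g. via a GCC-XML style parser), not filled by hand as here.
    #include <iostream>
    #include <string>
    #include <vector>

    struct Member { std::string type, name; };
    struct ClassInfo { std::string name; std::vector<Member> members; };

    ClassInfo describeTrack() {                          // stands in for a dictionary lookup
        return {"Track", {{"double", "pt"}, {"double", "eta"}, {"int", "charge"}}};
    }

    int main() {
        ClassInfo c = describeTrack();
        std::cout << "class " << c.name << " {\n";
        for (const Member& m : c.members)                // iteration a persistency service
            std::cout << "  " << m.type << ' ' << m.name << ";\n";   // or a browser could use
        std::cout << "};\n";
    }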

Page 15: LCG Applications Area


Dictionary: Reflection / Population / Conversion

[Diagram] The LCG Dictionary is populated from C++ header files (.h, parsed with GCC-XML — in progress) and from .adl/.xml descriptions (ADL/GOD); an LCG-to-CINT dictionary gateway (new in POOL 0.3) feeds the CINT dictionary, whose generated code and streamers serve ROOT I/O; other clients use the dictionary for reflection, population and conversion.

Page 16: LCG Applications Area


Other Architectural Elements

Python-based Component Bus
  Plug-in integration of components providing a wide variety of functionality
  Component interfaces to bus derived from their C++ interfaces
Scripting Languages
  Python and CINT (ROOT) to both be available
  Access to objects via object whiteboard in these environments
Interface to the Grid
  Must support convenient, efficient configuration of computing elements with all needed components
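One way such a bus can be hosted is by embedding the Python interpreter in a C++ application; the following minimal sketch uses the standard CPython embedding API and is not taken from any LCG code.

    // Minimal CPython embedding sketch (assumption for illustration, not LCG code).
    // Build (Python >= 3.8):  g++ bus.cpp $(python3-config --cflags --ldflags --embed)
    #include <Python.h>

    int main() {
        Py_Initialize();                                  // start the scripting engine
        // In the blueprint picture, C++ component interfaces would be exported to
        // this interpreter; here we only run a trivial script.
        PyRun_SimpleString("print('hello from the component bus')");
        Py_Finalize();
        return 0;
    }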

Page 17: LCG Applications Area


Domain Decomposition

[Diagram] Domains: Event Generation, Detector Simulation, Reconstruction, Geometry, Event Model, Persistency, Core Services, Grid Services, Interactive Services — all resting on Foundation and Utility Libraries. Component examples shown: EvtGen, Engine, Algorithms, Calibration, Fitter, StoreMgr, FileCatalog, Dictionary, Whiteboard, PluginMgr, Monitor, Scheduler, Scripting, NTuple, Modeler, GUI, Analysis. External products: ROOT, GEANT4, FLUKA, DataGrid, MySQL, Python, Qt, …

Products mentioned are examples; not a comprehensive list

Grey (in the original figure): not in common project scope (also event processing framework, TDAQ)

Page 18: LCG Applications Area


Use of ROOT in LCG Software

Among the LHC experiments, ALICE has based its applications directly on ROOT
The 3 others base their applications on components with implementation-independent interfaces
  Look for software that can be encapsulated into these components
All experiments agree that ROOT is an important element of LHC software
  Leverage existing software effectively and do not unnecessarily reinvent wheels
Therefore the blueprint establishes a user/provider relationship between the LCG applications area and ROOT
  LCG AA software will make use of ROOT as an external product
Draws on a great ROOT strength: users are listened to very carefully!
So far so good: the ROOT team has been very responsive to needs for new and extended functionality coming from POOL

Page 19: LCG Applications Area


Blueprint RTAG Outcomes

SC2 decided in October…

Blueprint is accepted
RTAG recommendations accepted:
  Start common project on core tools and services
  Start common project on physics interfaces

Page 20: LCG Applications Area


Applications Area Personnel Status

18 LCG apps hires in place and working; +2 in Jan, Feb
Manpower ramp is on target (expected to reach 20-23)
Contributions from UK, Spain, Switzerland, Germany, Sweden, Israel, Portugal, US
~10 FTEs from IT (DB and API groups) also participating
~8 FTEs from experiments (CERN EP and outside CERN) also participating in (mainly) POOL, SEAL, SPI
CERN established a new software group as the EP home of the LCG applications area (EP/SFT)
  Led by John Harvey. Taking shape well. Localized in B.32
Fraction of experiment contribution which is US-supported (CERN or US resident) is currently ~30%
US fraction of total effort is <10%

Page 21: LCG Applications Area


LHC Manpower needs for Core Software

From LHC Computing (‘Hoffman’) Review (FTEs); only computing professionals counted

          2000 Have (miss)   2001    2002    2003    2004    2005
  ALICE   12 (5)             17.5    16.5    17      17.5    16.5
  ATLAS   23 (8)             36      35      30      28      29
  CMS     15 (10)            27      31      33      33      33
  LHCb    14 (5)             25      24      23      22      21
  Total   64 (28)            105.5   106.5   103     100.5   99.5

Page 22: LCG Applications Area


Personnel Resources – Required and Available

[Chart] Estimate of required effort (FTEs) by quarter, Sep-02 through Mar-05, broken down by activity: SPI, Math libraries, Physics interfaces, Generator services, Simulation, Core Tools & Services, POOL. Blue = available effort.

FTEs today: 18 LCG, 10 CERN IT, 8 CERN EP + experiments
Future estimate: 20-23 LCG, 13 IT, 28 EP + experiments

Page 23: LCG Applications Area


Current Personnel Distribution

[Chart] FTE assignments by activity: POOL, SPI, CLS, PI, Simu, Gen, Math, ROOT, Grid, Arch, Mgmt.

Page 24: LCG Applications Area

Summary of LCG-funded Resources Used - Estimate to end 2002

Experience-Weighted FTEs

CERN Special Funding:
  Applications 8.9
    Persistency 1.8
    Software Process Support 2.0
    Simulation 1.8
    Root 1.4
    Architecture 0.0
    Grid Interfacing 1.1
    Training 0.7
  Fabric 5.4
    System Management & Operations 1.6
    Development (e.g. Monitoring) 1.8
    Data Storage Management 0.8
    Grid Security 1.1
  Grid Technology 3.0
    Data Management 2.3
    Grid Gatekeeper 0.7
  Grid Deployment 4.6
    Int 2.1
    OPS 2.5
  Management 2.1
    LCG 2.1

EU Funding:
  DataGrid 6.9

Total Weighted FTEs in calendar 2002: 30.8

Page 25: LCG Applications Area


U.S. Leadership

Direct leadership and financial contribution: T.Wenaus as AA manager

In addition to contributions via ATLAS and CMS
A .75 FTE job requiring CERN residence
  Salary support from the BNL base program (is this fair?)
  CERN residency and US travel costs borne by CERN
Together with the strong U.S. presence in CMS and ATLAS computing leadership, this role gives the U.S. a strong voice in the LCG applications area
  Not a dominating influence of course; e.g. at this point all the applications area project leaders are Europeans
Presence at CERN is very important, like it or not
  Importance is increased because of the utterly deplorable state of the CERN infrastructure for both audio and video conferencing
  The U.S. should put up the money to fix this, if no one else will; it is in our own vital interest

Page 26: LCG Applications Area


Schedule and Resource Tracking (example)

Page 27: LCG Applications Area


Apps area planning materials

Planning page linked from applications area page
Applications area plan spreadsheet: overall project plan
  http://lcgapp.cern.ch/project/mgmt/AppPlan.xls
  High level schedule, personnel resource requirements
Applications area plan document: overall project plan
  http://lcgapp.cern.ch/project/mgmt/AppPlan.doc
  Incomplete draft
Personnel spreadsheet
  http://lcgapp.cern.ch/project/mgmt/AppManpower.xls
  Currently active/foreseen apps area personnel, activities
WBS, milestones, assigned personnel resources
  http://atlassw1.phy.bnl.gov/Planning/lcgPlanning.html
Follow Applications Area planning link on the review web page

Page 28: LCG Applications Area


Core Libraries and Services (SEAL) Project

Launched in Oct, led by Pere Mato (CERN/LHCb)
6-member (~3 FTE) team initially; M.Marino from ATLAS
Scope:
  Foundation, utility libraries
  Basic framework services
  Object dictionary
  Grid enabled services
Many areas of immediate relevance to POOL; these are given priority
Users of this project are software developers in other projects and the experiments
Establishing initial plan, reviewing existing libraries and services
  Process for adopting third party code will be addressed in this project
Initial workplan will be presented to SC2 on Jan 10
2003/3/31: SEAL V1 essentials in alpha

Page 29: LCG Applications Area


SEAL Work Packages

Foundation and utility libraries
  Boost, CLHEP, …, complementary in-house development
Component model and plug-in manager
  The core expression in code of the component architecture described in the blueprint. Mainly in-house development.
LCG object dictionary
  Already active project in POOL; being moved to SEAL (wider scope than persistency). Will include filling dictionary from C++ header files.
Basic framework services
  Object whiteboard, message reporting, component configuration, ‘event’ management
Scripting services
Grid services: common interface to middleware
Education and documentation
  Assisting experiments with integration

Page 30: LCG Applications Area


Physicist Interface (PI) Project

Led by Vincenzo Innocente (CERN/CMS)
Covers the interfaces and tools by which physicists will directly use the software
Planned scope:
  Interactive environment: physicist’s desktop
  Analysis tools
  Visualization
  Distributed analysis, grid portals
    Very poorly defined and understood
Currently surveying experiments on their needs and interests
In more of an ‘RTAG mode’ than project mode initially, to flesh out plans and try to clarify the grid area
Will present initial plans (and possibly an analysis RTAG proposal) to SC2 on Jan 29

Page 31: LCG Applications Area


Software Process and Infrastructure (SPI)

Components available:
  Code documentation, browsing: Doxygen, LXR, ViewCVS
  Testing framework: CppUnit, Oval
  Memory leaks: Valgrind
  Automatic builds: probably the ATLAS system
  Coding and design guidelines: RuleChecker
  CVS organization
  Configuration/release mgmt: SCRAM
  Software documentation templates

http://spi.cern.ch
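As an illustration of the testing-framework entry, a minimal CppUnit test could look like the following (the fixture and test names are invented for this sketch):

    // Minimal CppUnit example (illustrative only; fixture and test names invented).
    #include <cppunit/extensions/HelperMacros.h>
    #include <cppunit/ui/text/TestRunner.h>

    class AdderTest : public CppUnit::TestFixture {
        CPPUNIT_TEST_SUITE(AdderTest);
        CPPUNIT_TEST(testAdd);
        CPPUNIT_TEST_SUITE_END();
    public:
        void testAdd() { CPPUNIT_ASSERT_EQUAL(4, 2 + 2); }   // the assertion under test
    };

    int main() {
        CppUnit::TextUi::TestRunner runner;
        runner.addTest(AdderTest::suite());
        return runner.run() ? 0 : 1;                          // non-zero exit on failure
    }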

Page 32: LCG Applications Area


SPI Services

CVS repositories
  One repository per project
  Standard repository structure and #include conventions being finalized this week
  Will eventually move to IT CVS service when it is proven
AFS delivery area, Software Library
  /afs/cern.ch/sw/lcg
  Installations of LCG-developed and external software
  Installation kits for offsite installation
  LCG Software Library ‘toolsmith’ started in December
Build servers
  Machines with various Linux, Solaris configurations available for use
Project portal (similar to SourceForge): http://lcgappdev.cern.ch
  Very nice new system using Savannah (savannah.gnu.org)
  Used by CMS as well as LCG; ATLAS will probably be using it soon
  Bug tracking, project news, FAQ, mailing lists, download area, CVS access, …

Page 33: LCG Applications Area


Page 34: LCG Applications Area


POOL

Pool of persistent objects for LHC, currently in prototype
Targeted at event data but not excluding other data
Hybrid technology approach
  Object level data storage using file-based object store (ROOT)
  RDBMS for meta data: file catalogs, object collections, etc (MySQL)
Leverages existing ROOT I/O technology and adds value
  Transparent cross-file and cross-technology object navigation
  RDBMS integration
  Integration with Grid technology (eg EDG/Globus replica catalog)
  Network and grid decoupled working modes
Follows and exemplifies the LCG blueprint approach
  Components with well defined responsibilities
  Communicating via public component interfaces
  Implementation technology neutral
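For illustration, a self-contained C++ sketch of the hybrid-store idea — a catalog mapping logical to physical files plus tokens that allow cross-file navigation. This is conceptual only; the class names are invented and it is not the POOL API.

    // Conceptual sketch (not the POOL API): relational-style file catalog plus
    // object tokens enabling cross-file, cross-technology navigation.
    #include <iostream>
    #include <map>
    #include <string>

    struct FileCatalog {                                   // stand-in for the RDBMS catalog (MySQL in the prototype)
        std::map<std::string, std::string> lfnToPfn;       // logical -> physical file name
        std::string resolve(const std::string& lfn) const { return lfnToPfn.at(lfn); }
    };

    struct Token {                                         // identifies an object by (file, container, entry)
        std::string lfn;
        std::string container;
        long entry;
    };

    int main() {
        FileCatalog catalog;
        catalog.lfnToPfn["run123.events"] = "/data/run123.root";   // ROOT file holds the objects

        Token ref{"run123.events", "Tracks", 42};
        std::cout << "object lives in " << catalog.resolve(ref.lfn)
                  << ", container " << ref.container
                  << ", entry " << ref.entry << '\n';
    }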

Page 35: LCG Applications Area


Pool Release Schedule

End September - V0.1 (Released Oct 2)
  All core components for navigation exist and interoperate
  Assumes ROOT object (TObject) on read and write
End October - V0.2 (Released Nov 15)
  First collection implementation
End November - V0.3 (Released Dec 18)
  First public release
  EDG/Globus FileCatalog integrated
  Persistency for general C++ classes (not instrumented by ROOT), but very limited: elementary types only
  Event metadata annotation and query
June 2003 – Production release

Page 36: LCG Applications Area


POOL Milestones

Page 37: LCG Applications Area


Simulation Project

Mandated by SC2 to initiate simulation project following the RTAG
Project being organized now
Expected to cover:
  Generic simulation framework
    Multiple simulation engine support, geometry model, generator interface, MC truth, user actions, user interfaces, average tracking, utilities
    ALICE virtual MC as starting point if it meets requirements
  Geant4 development and integration
  FLUKA (development and) integration
  Physics validation
  Simulation test and benchmark suite
  Fast (shower) parameterisation
  Generator services

Page 38: LCG Applications Area


Comment on Grid Technology Area (GTA)

Quote from slide of Les:

LCG expects to obtain Grid Technology from projects funded by national and regional e-science initiatives -- and from industry

concentrating ourselves on deploying a global grid service

All true, but there is a real role for the GTA, not just deployment, in LCG:

Ensuring that the needed middleware is/will be there, tested, selected and of production grade

(Re)organization in progress to create an active GTA along these lines

Important for the Applications Area: AA distributed software will be robust and usable only if the grid middleware it uses is so

Page 39: LCG Applications Area


Concluding Remarks

Essentially the full expected AA scope is covered by the anticipated activities of the projects now defined
Manpower is in quite good shape
Buy-in by the experiments, apart from ALICE, is good
  Substantial direct participation in leadership, development, prompt testing and evaluation, RTAGs
U.S. CMS represented well because of strong presence in computing management and in CERN-based personnel
U.S. ATLAS representation will improve with D.Quarrie’s relocation to CERN as Software Leader; further increases in CERN presence being sought
Groups remote from CERN are contributing, but it isn’t always easy
  Have pushed to lower the barriers, but still it isn’t easy
New CERN EP/SFT group is taking shape well as a CERN hub for applications area activities
POOL and SPI are delivering, and the other projects are ramping up
  First persistency prototype released in 2002, as targeted in March 2002

