State of Readiness of the LHC experiments’ software
Outline
- The startup
- Status at last CHEP (CHEP04)
- Today's picture: common software; individual experiments
- Software deployment
- What is left to do / what is being done
- Summary/Outlook
P. Sphicas, CERN/UoA
Computing in High Energy Physics, Mumbai, Feb 2006
The startup (LHC and experiments)
LHC startup plan
- Stage 1, initial commissioning: 43x43 to 156x156 bunches, N = 3x10^10 protons/bunch, zero to partial squeeze, L = 3x10^28 - 2x10^31 cm^-2 s^-1
- Stage 2, 75 ns operation: 936x936 bunches, N = 3-4x10^10, partial squeeze, L = 10^32 - 4x10^32 cm^-2 s^-1
- Stage 3, 25 ns operation: 2808x2808 bunches, N = 3-5x10^10, partial to near full squeeze, L = 7x10^32 - 2x10^33 cm^-2 s^-1
LHC startup: CMS/ATLAS
Integrated luminosity with the current LHC plans
[Plot, "Run 2008": luminosity (10^30 cm^-2 s^-1), integrated luminosity (pb^-1) and events/crossing vs week (1-20); luminosity climbing from ~10^31 to ~10^33 cm^-2 s^-1; physics milestones marked along the way: top re-discovery, Higgs (?), Z' to muons, SUSY]
Assuming LHC efficiency = 30% (optimistic!): ~1.9 fb^-1; 1 fb^-1 (optimistic?)
Pilot run: luminosity
30 days, maybe less (?); 43x43 bunches, then 156x156 bunches
[Plot: luminosity (10^30 cm^-2 s^-1), integrated luminosity (pb^-1) and events/crossing (pile-up) vs day of the pilot run (1-30); luminosity rising from ~10^28 to ~10^31 cm^-2 s^-1, integrated luminosity from ~0.1 to ~10 pb^-1]
Assuming LHC efficiency = 20% (optimistic!)
Startup physics (ALICE)
Multiplicity paper:
- Introduction
- Detector system: Pixel (& TPC)
- Analysis method
- Presentation of data: dN/dη and multiplicity distribution (√s dependence)
- Theoretical interpretation: ln²(s) scaling?, saturation, multi-parton interactions…
- Summary
pT paper outline:
- Introduction
- Detector system: TPC, ITS
- Analysis method
- Presentation of data: pT spectra and pT-multiplicity correlation
- Theoretical interpretation: soft vs hard, mini-jet production…
- Summary
Can publish two papers 1-2 weeks after LHC startup
Startup plan
Physics rush:
- ALICE: minimum-bias proton-proton interactions; the standard candle for the heavy-ion runs
- LHCb: B_s mixing, sin2β; repeat it if the Tevatron has not done it already
- ATLAS-CMS: measure jet and IVB production; in 15 pb^-1 there will be 30K W's and 4K Z's into leptons. Measure cross sections and the W and Z charge asymmetry (pdfs; IVB+jet production; top!) — the usual asymmetry definition is recalled below
- Luminosity?
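For reference (not from the slides), the W charge asymmetry commonly used in such pdf studies is

\[ A_W(\eta) \;=\; \frac{d\sigma(W^+)/d\eta \;-\; d\sigma(W^-)/d\eta}{d\sigma(W^+)/d\eta \;+\; d\sigma(W^-)/d\eta} \]

Being a ratio, it is largely insensitive to the luminosity normalization while remaining sensitive to the u/d parton distributions.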
Startup plan and software
Turn-on is fast:
- Pile-up increasing rapidly
- Timing evolution (43x43, to 75 ns, to 25 ns)
- LOTS of physics
For all detectors:
- Commission detector and readout
- Commission trigger systems
- Calibrate/align detector(s)
- Commission computing and software systems
- Rediscover the Standard Model
Software domains involved: simulation; reconstruction; trigger; monitoring; calibration/alignment (calculation and application); user-level data objects; selection; analysis; documentation
Status at last CHEP
My very rough estimate … average over 4 experiments
[Plot: "path accomplished (%)" vs time, extrapolated to 2007]
F. Gianotti @ CHEP04:
- Realistic detectors (HV problems, dead channels, mis-alignments, …) not yet implemented
- Calibration strategy: where (Event Filter, Tier0)? which streams, which data size? how often, how many reprocessings of part of the raw data? CPU? Not fully developed in most cases (implications for EDM and Computing Model?)
- Software for experiment monitoring and for commissioning with cosmic and beam-halo muons (the first real data to be collected …) not developed yet (reconstruction must cope with atypical events …)
Today’s picture
Common Software
LCG Application Area
Deliver the common physics applications software for the LHC experiments
Organized to ensure focus on real experiment needs:
- Experiment-driven requirements and monitoring
- Architects in management and execution
- Open information flow and decision making
- Participation of experiment developers
- Frequent releases enabling iterative feedback
Success is defined by adoption and validation of the products by the experiments: integration, evaluation, successful deployment
AA Projects
- SPI – Software process infrastructure: software and development services (external libraries, Savannah, software distribution, support for build, test, QA, etc.)
- ROOT – Core libraries and services: foundation class libraries, math libraries, framework services, dictionaries, scripting, GUI, graphics, SEAL libraries, etc.
- POOL – Persistency framework: storage manager, file catalogs, event collections, relational access layer, conditions database, etc.
- SIMU – Simulation project: simulation framework, physics validation studies, MC event generators, Garfield, participation in Geant4 and Fluka
AA Highlights
SPI is concentrating on the following areas:
- Savannah service (bug tracking, task management, etc.): >160 hosted projects, >1350 registered users (doubled in one year)
- Software services (installation and distribution of software): >90 external packages installed in the external service
- Software development service: tools for development, testing, profiling, QA
- Web and documentation
The ROOT activity at CERN is fully integrated in the LCG organization (planning, milestones, reviews, resources, etc.)
The main change during the last year has been the merge of the SEAL and ROOT projects:
- Single development team
- Adiabatic migration of the software products into a single set of core software libraries
- 50% of the SEAL functionality has been migrated into ROOT (mathlib, reflection, Python scripting, etc.)
ROOT is now at the "root" of the software for all the LHC experiments
AA Highlights (2)
POOL (object storage and references) has been consolidated:
- Adapted to the new Reflex dictionaries, 64-bit support, new file catalog interfaces, etc.
- CORAL is a major re-design of the generic relational database access interface, focusing on the deployment of databases in the grid environment
The COOL conditions database is being validated:
- Many significant performance and functionality improvements
- Currently being validated by ATLAS and LHCb
Consolidation of the simulation activities:
- Major release Fluka-2005.6 in July 2005
- Garfield (simulation of gaseous detectors) added to the project scope
- New developments and improvements of the Geant4 toolkit
- New results in the physics validation of Geant4 and Fluka
Today’s picture
Individual experiments
Frameworks: essentially done
- ALICE: AliROOT; ATLAS+LHCb: Athena/Gaudi
- CMS: moved to a new framework; in progress
[Diagram: Gaudi architecture — the Application Manager drives Algorithms, which exchange data through the transient event, detector and histogram stores; the Event Data, Detector Data and Histogram services are backed by Persistency Services, Converters and data files; Message, JobOptions, Particle Properties and other services complete the picture]
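To make the framework model concrete, here is a minimal sketch of a Gaudi/Athena-style algorithm. The class name and its contents are hypothetical; the pattern (an Algorithm base class with initialize/execute/finalize returning StatusCode, driven by the Application Manager) is the standard Gaudi one, but this is only a sketch, not tied to any particular release:

```cpp
// Minimal Gaudi-style algorithm sketch; "TrackCounter" and its logic are invented.
#include "GaudiKernel/Algorithm.h"

class TrackCounter : public Algorithm {
public:
  TrackCounter(const std::string& name, ISvcLocator* svcLoc)
    : Algorithm(name, svcLoc), m_nEvents(0) {}

  StatusCode initialize() { m_nEvents = 0; return StatusCode::SUCCESS; } // before event loop

  StatusCode execute() {            // once per event: fetch input from the transient
    ++m_nEvents;                    // event store via eventSvc(), produce output objects
    return StatusCode::SUCCESS;
  }

  StatusCode finalize() { return StatusCode::SUCCESS; }                  // after event loop

private:
  long m_nEvents;                   // trivial example of algorithm state
};
```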
Simulation (I)
Geant4: a success story; deployed by all experiments
- Functionality essentially complete; detailed physics studies performed by all experiments
- Very reliable in production (better than 1:10^4)
- Good collaboration between the experiments and the Geant4 team; lots of feedback on physics (e.g. from testbeams)
- LoH (Level of Happiness): very high
[Geometry visualizations: LHCb ~18 million volumes; ALICE ~3 million volumes]
Simulation (II)
Tuning to data: ongoing; very good progress made
[Plots: Geant4/data for e/… — CMS HCAL (brass/scintillator) and ATLAS Tilecal (Fe/scintillator) test beams]
Fast simulation (I)
Different levels of "fast" simulation at the four experiments:
- CMS extreme: swimming particles through the detector, including material effects, radiation, etc.; imitates the full simulation, but much faster (~1 Hz)
- ATLAS: particle-level smearing; VERY fast (kHz)
- LHCb: generator output directly accessible by the physics application programs
But: ongoing work on bridging the gap, for example shower parametrization in the G4 full simulation (ATLAS)
Common goal of all: output data at AOD level
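As an illustration of the "particle-level smearing" end of this spectrum, a toy resolution function might look like the following; the parameterization and the numbers are invented, not any experiment's actual resolution:

```cpp
// Toy particle-level smearing: replace the true pT by a Gaussian-smeared value.
// The resolution model (1% constant term plus a pT-dependent term) is illustrative only.
#include "TRandom3.h"
#include <cmath>

double smearPt(double truePt, TRandom3& rng) {
  double relSigma = std::sqrt(0.01 * 0.01 + std::pow(0.0005 * truePt, 2));
  return truePt * (1.0 + rng.Gaus(0.0, relSigma));  // smeared pT, same units as input
}
```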
Fast simulation (II)
[Figure: CMS comparison of the simplified (FAMOS) geometry — nested cylinders, fast propagation, fast material-effect simulation — with the detailed geometry — complicated geometry, propagation in short steps, full and slow simulation; validation plot of the pT of the 2nd jet in t tbar events]
Reconstruction, Trigger, Monitoring
General feature: all based on the corresponding framework (AliRoot, Athena, Gaudi, CMSSW)
- Multi-threading is necessary for the online environment
- Most algorithms & tools are common with offline
Two big versions:
- Full reconstruction
- "Seeded", or "partial", or "reconstruction inside a region of interest" — this one is used in the HLT
Online monitoring and event displays:
- "Spying" on Trigger/DAQ data online, but also later in express analysis
- Online calibrations
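A minimal sketch of what "reconstruction inside a region of interest" means in practice: keep only the hits within a cone around a Level-1 seed direction before running the expensive local reconstruction. The Hit structure and cone size below are hypothetical; the deltaR matching itself is the standard idea:

```cpp
// Toy regional ("seeded") reconstruction: select hits inside a cone of radius dRmax
// around the Level-1 seed direction; only these are passed to full reconstruction.
#include <vector>
#include <cmath>

struct Hit { double eta, phi; };                       // hypothetical hit record

double deltaPhi(double a, double b) {
  double d = a - b;
  while (d >  M_PI) d -= 2.0 * M_PI;
  while (d < -M_PI) d += 2.0 * M_PI;
  return d;
}

std::vector<Hit> hitsInRegion(const std::vector<Hit>& hits,
                              double seedEta, double seedPhi, double dRmax = 0.5) {
  std::vector<Hit> out;
  for (const Hit& h : hits) {
    double dEta = h.eta - seedEta;
    double dPhi = deltaPhi(h.phi, seedPhi);
    if (std::sqrt(dEta * dEta + dPhi * dPhi) < dRmax)
      out.push_back(h);                                // keep hits inside the region
  }
  return out;
}
```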
Online selection
[Diagram: online selection chain — 10^9 ev/s at the input; Level-1 keeps 0.01% (rejects 99.99%), giving 10^5 ev/s; the HLT keeps 0.1% (rejects 99.9%), giving 10^2 ev/s to storage]
Same hardware (Filter Subfarms), same software (CARF-ORCA), but different situations
High-Level Trigger
A huge challenge: large rejection factor (small accept factor)
- In practice, startup will use smaller rates; CMS example: 12.5 kHz (pilot run) and 50 kHz (at 10^33 cm^-2 s^-1)
- Real startup conditions (beam, backgrounds, experiment) unknown
- Startup trigger tables: in progress; ATLAS/CMS have prototypes; real values when the beam comes…

                         ATLAS/CMS     LHCb      ALICE
  Interaction rate       10^9 Hz       10^7 Hz   10^4 Hz
  HLT input (Lvl-1, HW)  100 kHz       1 MHz     1 kHz
  HLT accept (HLT, SW)   100-200 Hz    2000 Hz   ~50 Hz
Regional reco example: CMS HLT electrons (I)
- "Lvl-2" electron: recluster inside an extended Lvl-1 trigger area
- Brem recovery: "supercluster" — take a seed, open a road around the seed, collect all clusters in the road
- Add pixel information: very fast; pre-brem
Regional reco example: CMS HLT electrons (II)
"Level-3" selection:
- Full tracking, with loose track-finding (to maintain high efficiency)
- Cut on E/p everywhere, plus a matching requirement (barrel) and H/E (endcap) — a toy version is sketched below
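For illustration only, such a selection might be coded along these lines; the struct, the choice of matching variable and all threshold values are placeholders, not the actual CMS HLT cuts:

```cpp
// Toy Level-3 electron selection: an E/p cut everywhere, a track-cluster matching
// requirement in the barrel, H/E in the endcap. All numbers are placeholders.
struct HLTElectronCandidate {
  double clusterE, trackP;   // supercluster energy, track momentum
  double matchDistance;      // some track-cluster matching variable
  double hOverE;             // hadronic over electromagnetic energy
  bool   isBarrel;
};

bool passLevel3(const HLTElectronCandidate& c) {
  if (c.trackP <= 0.0) return false;
  double eOverP = c.clusterE / c.trackP;
  if (eOverP < 0.8 || eOverP > 1.5) return false;   // placeholder E/p window
  if (c.isBarrel) return c.matchDistance < 0.01;    // placeholder matching cut (barrel)
  return c.hOverE < 0.05;                           // placeholder H/E cut (endcap)
}
```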
Another full-tracking example: LHCb
- Track stages: RZ Velo → Space Velo → Velo-TT → Long track
- Timing: decoding 4.6 ms; Velo RZ 1.3 ms; Velo space 6.0 ms; Velo-TT 3.7 ms; Long 30.0 ms
Calibration/Alignment
Key part of commissioning activities
- Dedicated calibration streams are part of the HLT output (e.g. calibration stream in ATLAS, express-line in CMS; different names/groupings, same content)
What needs to be put in place:
- Calibration procedure: what, in which order, when, how
- Calibration "closed loop" (reconstruct, calibrate, re-reconstruct, re-calibrate…)
- Conditions data reading / writing / iteration
- Reconstruction using the conditions database
What is happening:
- Procedures defined in many cases; still not "final", but understanding improving
- Exercising conditions database access and the distribution infrastructure, with the COOL conditions database, realistic data volumes and routine use in reconstruction, in a distributed environment with a true distributed conditions DB infrastructure
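To illustrate the central idea behind a conditions database — constants valid for an interval of validity (IOV), looked up by run or time — here is a toy container. It mimics the concept only; it is not the COOL API:

```cpp
// Toy interval-of-validity (IOV) store: calibration payloads keyed by the first run
// for which they are valid; get(run) returns the payload whose interval covers 'run'.
#include <map>
#include <vector>
#include <stdexcept>

class ToyIOVStore {
public:
  void store(unsigned firstValidRun, const std::vector<double>& constants) {
    payloads_[firstValidRun] = constants;
  }
  const std::vector<double>& get(unsigned run) const {
    auto it = payloads_.upper_bound(run);      // first payload starting after 'run'
    if (it == payloads_.begin()) throw std::runtime_error("no IOV covers this run");
    --it;                                      // last payload starting at or before 'run'
    return it->second;
  }
private:
  std::map<unsigned, std::vector<double>> payloads_;
};
```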
Calibration/Alignment (II)
Many open questions still:
- Simulation: inclusion (to what extent?); geometry description and use of the conditions DB in distributed simulation and digitisation
- Management: organisation and bookkeeping (run number ranges, production system, …); how do we ensure all the conditions data for simulation is available with the right IOVs? What about defaults for 'private' simulations?
- Reconstruction: ability to handle time-varying calibration; asymptotically, dynamic replication (rapidly propagate new constants) to support the closed loop and 'limited time' exercises; Tier-0 delays of at most ~4-5 days (!)
- Calibration algorithms: introduction of realism — misbehaving and dead channels, global calibrations (E/p), full data size, ESD/RECO input vs RAW
Documentation
Everyone says it's important; nobody usually does it
A really nice example from ATLAS: the ATLAS Workbook. Worth copying…
Analysis (introduction)
Common understanding: early analysis will run off the RECO/ESD format
- RECO/ESD (ATLAS/CMS) ~0.25-0.5 MB/event; ALICE/LHCb ~0.04 MB/event
- The reconstructed quantities, with frequent reference to RAW data
- At least until a basic understanding of the detector, its response and the software is in place
Asymptotically, work off the Analysis Object Data (AOD) — the "MiniDST", for the youngsters in the audience
- Reduction of a factor ~5 wrt the RECO/ESD format
- Crucial: definition of the AOD (what's in it) and its functionality; prototypes exist in most cases
- Sizes and functionality not within spec yet
One ~open issue: is there a need for a TAG format (1 kB summary)? E.g. ATLAS has one, in a database; CMS does not. (A back-of-envelope volume estimate follows.)
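A rough, order-of-magnitude sketch of what these per-event sizes imply, assuming the canonical ~10^7 s of effective running per year (an assumption, not stated in the slides) and the 100-200 Hz ATLAS/CMS HLT output rate from the trigger table above; AOD taken as RECO/5:

\[ N_{\mathrm{evt}} \approx 200\ \mathrm{Hz} \times 10^{7}\ \mathrm{s} = 2\times10^{9}\ \mathrm{events/year} \]
\[ \mathrm{RECO/ESD\ at\ }0.5\ \mathrm{MB} \approx 1\ \mathrm{PB/year};\quad \mathrm{AOD\ at\ }0.1\ \mathrm{MB} \approx 200\ \mathrm{TB/year};\quad \mathrm{TAG\ at\ }1\ \mathrm{kB} \approx 2\ \mathrm{TB/year} \]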
Analysis "flow": an example
[Diagram: example analysis flow — from RECO/AOD datasets (signal dataset and background dataset(s)) at Tier 0/Tier 1, through pre-selection into AOD + candidates + user data at Tier 1/Tier 2, then a further selection to AOD + candidates at Tier 2; example numbers of ~500 GB and ~50 GB at successive steps, ending at "laptop?" scale]
User analysis: a brief history
- 1980s: mainframes, batch jobs, histograms back. Painful.
- Late 1980s, early 1990s: PAW arrives. NTUPLEs bring physics to the masses. Workstations with "large" disks (holding data locally) arrive; looping over data and remaking plots becomes easy.
- Firmly in the 1990s: laptops arrive; physics-in-flight; interactive physics, in fact.
- Late 1990s: ROOT arrives. All you could do before and more, in C++ this time. FORTRAN is still around. The "ROOT-TUPLE" is born. Side promise: if one inherits all one owns from TObject, reconstruction and analysis form a continuum.
- 2000s: two categories of analysis physicists: those who can only work off the ROOT-tuple and those who can create/modify it.
- Mid-2000s: WiFi arrives; physics-in-meeting; CPA effect — to be recognized as a syndrome.
Analysis (I)
All-ROOT: ALICE
- Event model has been improving; event-level Tag DB deployed
- Collaboration with ROOT and STAR
- New analysis classes developed by the PWGs
- Batch distributed analysis being deployed
- Interactive analysis prototype
- New prototype for visualization
Analysis a la ALICE
Analysis a la LHCb: PYTHON!
[Diagram: LHCb application chain — Gauss (simulation: GenParts, MCParts, MCHits) → Boole (digitisation: Digits, RawData) → Brunel (reconstruction: DST) → DaVinci (analysis: stripped DST, AOD), with Bender as the Python analysis interface; all applications share the detector description and conditions database]
A la CMS
Goal: one format, one program for all (reconstruction, analysis)
- Store "simple" structures that are browsable by plain ROOT
- And then: load the CMSSW classes and act on the data as in a "batch"/"reconstruction" job
- Same jet-finding, muon-matching code, cluster corrections
- Issue is what data is available (RAW, RECO, AOD)
Example ROOT session:
  gSystem->Load("libPhysicsToolsFWLite")
  AutoLibraryLoader::enable()
  TFile f("reco.root")
  Events.Draw("Tracks.phi()-TrackExtra.outerPhi():Tracks.pt()", "Tracks.pt()<10", "box")
How will analysis actually be done?
It is not possible to enforce an analysis model
- TAGs may turn out to be very useful and widely utilized; they may also turn out to be used by only a few people
- Many physicists will try to use what their experience naturally dictates to them
- At a given stage, users may want to dump ntuples anyway; for sure *some* users will do this
The success of any model will depend on the advantages perceived by the analyzers
Extremely important:
- Communication: explain the advantages of modularity
- Help users: make the transition process smooth
Event Display (I) ATLAS
Event Display (II)
Interactive analysis; LHCb example: via a PYTHON script
- Add the options of your analysis to Panoramix.opts
Software Deployment
Issues not covered in this talk
Code management; ATLAS example:
- Approximately 1124 CVS modules (packages)
  - ~152 containers (container hierarchy for commit and tag management)
  - ~900 leaf packages (contain source code or act as glue to external software)
  - ~70 glue/interface packages (act as proxies for external packages)
Code distribution:
- Different layers of builds (nightly, weekly, developers', major releases…)
Testing and validation:
- Very complex process; ultimate test: the "challenges"
ATLAS integrated testbeam
All ATLAS sub-detectors (and the LVL1 trigger) integrated and run together with common DAQ and monitoring, "final" electronics, slow control, etc. Gained a lot of global operation experience during the ~6-month run.
[Figure: Geant4 simulation of the test-beam set-up, with x/y/z axes shown]
Cosmics: ATLAS and CMS
[Event displays of cosmic muons; tower energies ~2.5 GeV]
What’s left to do
Injecting additional realism
Impact on detector performance/physics; e.g. ATLAS:
- Cables and services from the latest engineering drawings, barrel/end-cap cracks from installation
- Realistic B-field map taking into account non-symmetric coil placements in the cavern (5-10 mm from survey)
- Include detector "egg-shapes" if relevant (e.g. Tilecal elliptical shape, if it has an impact on the B-field …)
- Displace detector (macro-)pieces to describe their actual position after integration and installation (e.g. ECAL barrel axis 2 mm below the solenoid axis inside the common cryostat): break symmetries and degeneracy in Detector Description and Simulation
- Mis-align detector modules/chambers inside macro-pieces (a toy sketch follows below)
- Include chamber deformations, sagging of wires and calorimeter plates, HV problems, etc. (likely at digitization/reconstruction level)
Technically very challenging for the Software …
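A toy illustration of the module mis-alignment item above; the module record, the magnitudes and the fixed seed are invented, and it only shows the kind of transformation a realistic geometry layer would apply:

```cpp
// Toy software mis-alignment: apply small random translations and a rotation about z
// to each module's nominal placement before simulation/reconstruction uses it.
#include "TRandom3.h"
#include "TVector3.h"
#include <vector>

struct Module { TVector3 position; double rotZ; };   // hypothetical module record (mm, rad)

void misalign(std::vector<Module>& modules, double shiftSigma, double rotSigma) {
  TRandom3 rng(12345);                                // fixed seed: reproducible "scenario"
  for (std::size_t i = 0; i < modules.size(); ++i) {
    modules[i].position += TVector3(rng.Gaus(0, shiftSigma),
                                    rng.Gaus(0, shiftSigma),
                                    rng.Gaus(0, shiftSigma));
    modules[i].rotZ     += rng.Gaus(0, rotSigma);
  }
}
```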
Real commissioning
Learning a lot from testbeams (e.g. the ATLAS integrated test) and integrated tests (e.g. the CMS Magnet Test / Cosmic Challenge)
- But nothing like the real thing
- Calibration challenges a crucial step forward; all experiments have some kind of system-wide test planned for mid- and end-2006
- Detector synchronization: procedures (taking into account the LHC beam structure and luminosity) being put in place; still a lot to do
- Preparing for real analysis: currently far from hundreds of users accessing (or trying to access) data samples
Summary/Outlook
Summary
Overall shape: OK
- Common software in place
- Much of the experiments' software is either complete or nearly so; fully-functional prototypes in place
- Difference between theory and practice: working on it, but still difficult to predict conditions at the time
- A number of important tests/milestones on the way, e.g. the calibration challenges; in parallel with Grid-related milestones: major sanity checks
- Deployment has begun in earnest; first pictures from detectors read out and reconstructed… at least locally
- Performance (sizes, CPU, etc.): in progress
Outlook
Still a long way to go before some of the more complicated analyses are possible. Example from SUSY (IF sparticles are produced with high cross section): gauginos produced in their decays, e.g.
$\tilde{q}_L \to \tilde{\chi}_2^0\, q_L$ (SUGRA P5)
$\tilde{g} \to \tilde{q}\, q \to \tilde{\chi}_2^0\, q\, q$ (GMSB G1a)
Complex signatures/cascades:
(1) $\tilde{\chi}_2^0 \to \tilde{\chi}_1^0\, h$ (~dominates if allowed)
(2) $\tilde{\chi}_2^0 \to \tilde{\chi}_1^0\, \ell^+\ell^-$ or $\tilde{\chi}_2^0 \to \tilde{\ell}^\pm\, \ell^\mp$
Has it all: (multi-)leptons, jets, missing ET, bb…
This kind of study: in numerous yellow reports; complex signal; decomposition…
In between: readout, calib/align, HLT, reconstruction, AOD, measurement of the Standard Model…
But we're getting ever closer!