Member of the Helmholtz Association
Thomas Lippert
Institute for Advanced Simulation
Jülich Supercomputing Centre
Far more than Petaflops: The Jülich Supercomputing Centre
ScicomP 15 & SP-XXL
Barcelona Supercomputing Centre
May 20, 2009
Supercomputing Drives Basic Sciences
Geophysics
Solid State Physics
Chemistry
Particle Physics
Structure of Matter
Plasma Physics
Nuclear Physics
Astrophysics
Cosmology
New Physics
Supercomputing Drives Applied Science
Environment
Weather/Climatology
Pollution / Ozone Hole
Ageing Society
Medicine
Biology
Energy
Plasma Physics
Fuel Cells
Materials
Spintronics
Nano-Science
Supercomputing Drives Engineering and Business
Competitiveness
Reducing design costs by virtual prototyping: faster time to market
Allowing investigations where economics or ethics preclude experimentation
The imperative of supercomputing
20.5.2009, SP-XXL, Barcelona — Thomas Lippert, IAS/JSC
FROM JÜLICH TO EUROPE
Jülich in Brief
Largest civilian research centre in Europe
360 million euro annual budget
4,300 staff members
1,200 scientists
700 guest scientists from 50 countries
9 departments (institutes), including the Institute for Advanced Simulation
You might have heard of ….
Jülich Supercomputing Centre (JSC)
IAS Organisation
Institute for Advanced Simulation (IAS):
Jülich Supercomputing Centre (JSC)
Soft Matter
Biophysics
Hadron Physics
Nano/Materials Science
Milestones
1961 Zentralinstitut für Angewandte Mathematik (ZAM)
1987 Höchstleistungsrechenzentrum (HLRZ)
1998 HLRZ becomes the John von Neumann Institute for Computing (NIC)
2007 ZAM becomes the Jülich Supercomputing Centre (JSC); member of the Gauss Centre for Supercomputing
2008 Institute for Advanced Simulation; coordinator of the PRACE project
2010 European Supercomputing Centre
German Research School for Simulation Sciences
Co-funded by NRW, BMBF and the Helmholtz Association
PhD students and Master's students in a two-year course
Organization JSC (current staff assignment)
Director; Secretaries, Administration
Large-Scale Facility / Grid & Infrastructures: Distributed Systems & Grid Computing; European HPC Infrastructure; UNICORE Development; Grid Research; D-Grid Operation
Computational Science / Mathematical Methods: Computational Science; SimLab Biology; SimLab Plasma Physics; SimLab Molecular Systems; Complex Systems; NIC Research Group
HPC Systems: HPC Operations; HPC Data Management; HPC System Development; File & Archive Systems
HPC Application Support: Applied Visualization; Program Optimization; Programming Environments; SL Operation; Performance Analysis; Helmholtz Young Investigators Group; Research Group Quantum Information
Mathematics & Education: Modeling & Methods; Numerical Algorithms; Mathematical Software; Education (MATSE)
Communication Systems: HPC Networking; JuNet & External Networks; Security; Network Technologies; Distributed Systems
Technology: Technology Development; Technical Infrastructure
NIC Coordination; Public Relations; User/Project Management
HPC Systems
Supercomputers
1956 First computer in Jülich
1989 Cray YMP: 0.003 Teraflop/s
1996 Cray T3E: 0.8 Teraflop/s
2003 IBM p690: 9 Teraflop/s
2006 Blue Gene/L (JUBL): 46 Teraflop/s
2008 Blue Gene/P (JUGENE): 223 Teraflop/s
2009 JuRoPA: 200 Teraflop/s; HPC-FF: 100 Teraflop/s; Blue Gene/P (JUGENE): 1000 Teraflop/s
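A quick sanity check on the pace implied by this list: the jump from the Cray YMP (0.003 Teraflop/s, 1989) to the petaflop JUGENE (1000 Teraflop/s, 2009) corresponds to a doubling time of roughly 13 months. A minimal sketch:

```python
import math

# Performance growth implied by the machine list above:
# Cray YMP (1989): 0.003 TFlop/s -> JUGENE (2009): 1000 TFlop/s.
factor = 1000 / 0.003                # ~333,000x over 20 years
doublings = math.log2(factor)        # ~18.3 doublings
years_per_doubling = 20 / doublings
print(round(years_per_doubling, 2))  # ~1.09 years per doubling
```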
Developing Supercomputers @ JSC: two lines, general-purpose and highly scalable
2004: IBM Power 4+ (JUMP, 9 TFlop/s); file server: GPFS
2005/6: IBM Blue Gene/L (JUBL, 45 TFlop/s)
2007/8: IBM Blue Gene/P (JUGENE, 223 TFlop/s)
2009: IBM Power 6 (JUMP, 9 TFlop/s); IBM Blue Gene/P (JUGENE, 1 PFlop/s); Intel Nehalem clusters (JUROPA, 200 TFlop/s; HPC-FF, 100 TFlop/s); file server: GPFS, Lustre
JUGENE: Jülich’s Scalable Petaflop System
IBM Blue Gene/P
JUGENE
32-bit PowerPC 450 cores, 850 MHz, 4-way SMP
72 racks, 294,912 cores
1 Petaflop/s peak
144 TByte main memory
Connected to a Global Parallel File System (GPFS) with 5 PByte online disk capacity and up to 25 PByte offline tape capacity
Torus network
First Petaflop system in Europe
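The quoted peak follows directly from the slide's own numbers, assuming 4 floating-point operations per core per cycle (the PowerPC 450's dual-pipeline FPU with fused multiply-add):

```python
# JUGENE peak-performance check from the figures above.
# Assumption: 4 flops/cycle per PowerPC 450 core (dual FPU, fused multiply-add).
racks = 72
cores_per_rack = 4096            # 1024 quad-core nodes per rack
clock_hz = 850e6
flops_per_cycle = 4

cores = racks * cores_per_rack   # 294,912 cores
peak_pflops = cores * clock_hz * flops_per_cycle / 1e15
print(cores, round(peak_pflops, 2))  # 294912, ~1.0 PFlop/s
```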
Juropa
2208 compute nodes
2 Intel Nehalem-EP quad-core processors
2.93 GHz
SMT (Simultaneous Multithreading)
24 GB memory (DDR3, 1066 MHz)
IB QDR HCA (via Network Express Module)
17664 cores, 207 TF peak
Sun Microsystems Blade SB6048
Infiniband QDR with non-blocking Fat Tree topology
ParaStation Cluster-OS
HPC-FF
1080 compute nodes
2 Intel Nehalem-EP quad-core processors
2.93 GHz
SMT (Simultaneous Multithreading)
24 GB memory (DDR3, 1066 MHz)
8640 cores, 101 TF peak
Bull NovaScale R422-E2
Infiniband QDR with non-blocking Fat Tree topology
ParaStation Cluster-OS
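Both Nehalem clusters' peak figures can be reproduced from the node counts above, assuming 4 double-precision flops per core per cycle (separate SSE add and multiply pipelines):

```python
# Peak-performance check for JuRoPA and HPC-FF from the specs above.
# Assumption: 4 DP flops/cycle per Nehalem-EP core.
def peak_tflops(nodes, sockets=2, cores_per_socket=4,
                clock_ghz=2.93, flops_per_cycle=4):
    cores = nodes * sockets * cores_per_socket
    return cores, cores * clock_ghz * flops_per_cycle / 1000.0

print(peak_tflops(2208))  # JuRoPA: (17664, ~207 TFlop/s)
print(peak_tflops(1080))  # HPC-FF: (8640, ~101 TFlop/s)
```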
Infiniband Topology
23 x 4 QNEM modules, 24 ports each
6 x M9 switches, max. 648 ports each, 468/276 links used
Mellanox MTS3600 switches (Shark), 36 ports, for service nodes
4 Compute Sets (CS) with 15 Compute Cells (CC) each
Each CC with 18 Compute Nodes (CN) and 1 Mellanox MTS3600 (Shark) switch
Virtual 648-port switches constructed from 54x/44x Mellanox MTS3600
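The virtual 648-port figure is consistent with a two-level fat tree built from 36-port MTS3600s: 36 leaf switches each expose 18 ports downward and 18 upward, terminated by 18 spine switches, 54 switches in all. (That the 44-switch variant is a thinner, not fully non-blocking build is an assumption, not stated on the slide.)

```python
# Port arithmetic for a non-blocking two-level fat tree of 36-port switches,
# matching the "virtual 648-port switches from 54x MTS3600" figure above.
radix = 36
down = up = radix // 2            # 18 ports down, 18 up per leaf switch
leaves = 648 // down              # 36 leaf switches for 648 end ports
spines = leaves * up // radix     # 18 spine switches terminate all uplinks
print(leaves * down, leaves + spines)  # 648 ports, 54 switches
```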
JUST – Jülich Storage Cluster
GPFS Storage Cluster for all our Supercomputers
Supercomputers are Remote Clusters for GPFS
1 PB capacity today, expansion to 6 PB in Q4 2009
20 GB/s bandwidth, expansion to 66 GB/s in Q4 2009
Tivoli Storage Manager (TSM) for backup, archive and HSM
2 SUN tape libraries used with TSM
16 PB capacity today
Can be expanded to 32 PB next year
Information and Technology
Deputy Director JSC
Preparing Infrastructure for ….
Emerging multi- & many-core Architectures
Accelerators promise exciting performance at low power (peak single / double precision):
Cell Broadband Engine: 200 / 100 GFlop/s (100 W)
nVIDIA Tesla T10: 1000 / 80 GFlop/s (200 W)
AMD FireStream 9270: 1200 / 240 GFlop/s (220 W)
Programming paradigm
CUDA, Brook, Cell-SDK, CellSS, RapidMind, OpenCL, ...
Application kernels have to be adapted by hand
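Dividing the quoted single-precision peaks by the quoted power figures gives the efficiency argument behind this slide:

```python
# GFlop/s per watt implied by the accelerator figures above (single precision).
accelerators = {
    "Cell Broadband Engine": (200, 100),   # (GFlop/s SP, watts)
    "nVIDIA Tesla T10":      (1000, 200),
    "AMD FireStream 9270":   (1200, 220),
}
for name, (gflops, watts) in accelerators.items():
    print(f"{name}: {gflops / watts:.1f} GFlop/s per watt")
# Cell: 2.0, Tesla T10: 5.0, FireStream 9270: 5.5
```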
Many-core prototypes @ JSC
QPACE / eQPACE
Special purpose computer for lattice QCD
Main design goal: energy- and cost-efficiency
Developed by SFB/TR “Hadron Physics”
3-D torus network based on FPGAs, SPE-to-SPE communication
Ultra-dense packaging: 25.6 TFlop/s per rack
Explore broader purpose capabilities within PRACE WP8
Enhanced communication: beyond nearest neighbour and MEM-to-MEM
Support for standard communication layers (MPI)
JUICEnext QS22 cluster
Cell based computational platform and test facility
Future Developments around JuRoPA
Cluster Management
ParaStation (incl. MPI)
GridMonitor
Operating System
SUSE SLES 11
Fighting Operating System Jitter
Building the D-Grid
(Diagram: Internet clients pass a firewall and DMZ to reach the JUGGLE cluster; Globus, LCG/gLite, UNICORE 5 and UNICORE 6 middleware are deployed, with a gateway to the other UNICORE sites, a UNICORE 6 registry, and SoftComp.)
Communication Systems
PRACE Project Manager
High-speed Supercomputer Connectivity
Pan-European Supercomputer Network Research and Provisioning
• DEISA: design and operation of a pan-European 10 Gbit/s network
• LOFAR: planning and operation of German-Dutch peering
• Phosphorus: R&D in on-demand optical networking
Data Communication – JuNet by Numbers
• JSC: overall responsibility for the campus network JuNet & external connections
• JuNet
94 Ethernet switches, 1.5 Tbit/s
9,500 ports in 60 buildings
300 WLAN access points
• Supercomputing centre
95 Ethernet switches, >8 Tbit/s
6,000 ports in 2 buildings
Infiniband, proprietary networks
• External connectivity
5 Gbit/s X-WiN (redundant)
Dark fibres to RWTH, TZJ, FhG-Birlinghoven
Project network operation: DEISA, LOFAR, Phosphorus
VPN and dial-in services
Thomas Eickermann, PRACE Project Coordination@FZ-Jülich
Towards the High-End HPC Service for
European Science
Computational science infrastructure in Europe
The European Roadmap for Research Infrastructures is the first comprehensive definition at the European level.
Research Infrastructures are one of the crucial pillars of the European Research Area.
A European HPC service: horizontal, attractive for research communities, supporting industrial development.
PRACE Project
• Prepare the contracts to establish the PRACE permanent Research Infrastructure as a single legal entity from 2010 on, including governance, funding, procurement, and usage strategies.
• Perform the technical work to prepare operation of the Tier-0 systems in 2009/2010, including deployment and benchmarking of prototypes for Petaflop/s systems and porting, optimising, and Peta-scaling of applications.
PRACE – Initiative
New Partners - since May 2008
General Partners
Principal Partners
General Partners
tier 1
tier 0
GENCI
HET: The Scientific Case
Weather, Climatology, Earth Science
– degree of warming, scenarios for our future climate
– understand and predict ocean properties and variations
– weather and flood events
Astrophysics, Elementary Particle Physics, Plasma Physics
– systems and structures spanning a large range of different length and time scales
– quantum field theories like QCD; experiments: LHC, FAIR
– ITER
Material Science, Chemistry, Nanoscience
– understanding complex materials, complex chemistry, nanoscience
– the determination of electronic and transport properties
Life Science
– system biology, chromatin dynamics, large scale protein dynamics, protein association and aggregation, supramolecular systems, medicine
Engineering
– complex helicopter simulation, biomedical flows, gas turbines and internal combustion engines, forest fires, green aircraft
PRACE Initiative: PRACE Project and further PRACE activities
Principal and General Partners include BSC, GENCI, EPSRC, NCF, GCS, RIS
HPC Application Support
User Research Fields
JUMP: ~200 projects
JUGENE: ~40 projects
Levels of User Support and Training
Simulation Laboratories:
Community-oriented research and support units
SL Plasma Physics
SL Biology
SL Earth and Environment
SL Molecular Systems
(arranged around the Supercomputing Centre)
Example of Various Support Activities
Blue Gene/L Scaling Week, May 2006
Blue Gene Scaling Workshop, Dec 2006
Jointly with IBM, and Blue Gene/P Consortium (Argonne
National Lab.)
Scaling to 16k cores on Blue Gene/L
Usage of Performance Tool SCALASCA
JUGENE Usage Snapshot
Scientific Visualization
Simulation of Blood Flow in a Ventricular Assist Device (Prof. Marek Behr, RWTH Aachen)
Coordination Office
John von Neumann-Institute
Project categories: Soft Matter Composites, DEISA, I3HP, Jülich Initiative, Other
Proposals for computer time accepted from Germany and Europe
Peer review by international referees
Allocated by an independent Scientific Council (NIC)
National and European User Groups
Chemistry
Many Particle Physics
Elementary Particle Physics
Biology/Biophysics
Material Science
Soft Matter
Other
GCS: Gauss Centre for Supercomputing
Germany's Tier-0/1 supercomputing complex
Association of Jülich, Garching and Stuttgart
A single joint scientific governance
Germany's representative in PRACE
More information: http://www.gauss-centre.de
National HPC Pyramid
European HPC Centre
3 National HPC Centres (Gauß Centre for Supercomputing): Garching, Jülich, Stuttgart
~10 Topical HPC Centres / centres with regional tasks (Gauß Alliance): Aachen, DKRZ, Dresden, DWD, Erlangen, G-CSC Frankfurt, HLRN (Hannover, Berlin), Karlsruhe, MPG/RZG, Paderborn
~100 HPC servers at universities/institutes
Computational Science
Methods & Algorithms; Parallel Performance
Simulation Laboratories (new): Earth & Environment, Plasma Physics, Energy, Biology, Molecular Systems, Nano/Micro
Cross-Sectional Teams
Research Groups: Astro-Particle, Quantum Information, NIC Group, Distributed Computing
Education & Training Programmes
Example 1: Simulation Lab Biology
Research
Protein folding & interaction
Structure prediction
Systems biology
Support
Libraries, Bio databases LSDF (Topic 2)
Benchmarking
Monte Carlo, FFT docking, Machine learning
Codes
PROFASI, SMMP
Outreach
FZJ: Biological institutes (ISB, INM), Helmholtz groups
Regional: ABC of Life Science Informatics
International: UC Berkeley, Michigan Tech
Protein 1LQ7
Example 2: Simulation Lab Plasma Physics
Research
Kinetic methods: Particle-in-Cell, Vlasov, MD
Fluid + MHD models
Transport: Monte Carlo
Support
Plasma model porting & scaling
Code benchmarking, e.g. 3D PIC
Codes
PSC, ILLUMINATION, PEPC, racoon, EIRENE
Outreach
FZJ groups: IEF (Plasma), IKP (Nuclear)
Regional: Univs. Aachen, Bochum, Düsseldorf
National: GSI (HA-EMMI), Garching, FZ-Rossendorf
Laser-ion acceleration
Solar flare modelling
Petawatt Laser on Thin Foil (Dr. Paul Gibbon, FZJ)
Example 3: Research Group Quantum Information
Frontier research on quantum effects in computing
Mitigation of quantum effects in integrated circuits
Exploit the 'qubit' paradigm for algorithm acceleration
Massively parallel QC simulation
Research
Robustness of quantum algorithms
(Gate imperfections, decoherence, error correction)
First-principles simulation of real ion-trap quantum computers
Cooperations
New W2 professorship with RWTH Aachen (from June 2009)
FZJ, U. Groningen, U. Innsbruck
8-bit ion trap
Mathematical Methods
Cross-Sectional Group
Mathematical Methods and Algorithms
Numerical Algorithms: large-scale linear systems, eigenvalue problems; error-controlled Fast Multipole Method; kernels for new architectures (Cell engine)
Modelling: pedestrian dynamics (simulation faster than real time); 3D soil-root water transfer
Software & Service: parallel eigenvalue library, FMM program, toolboxes; mathematical software, benchmarking
Cooperations: FZJ research groups; Univs. Wuppertal, Bonn, Cologne; FH Aachen, MPIKS Dresden; industrial partners
FMM
Evacuation modelling
Soil-root interface
Fundamental Diagram
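The fundamental diagram referenced here relates pedestrian density, speed and specific flow, J = rho * v(rho). A minimal illustration with a hypothetical linear speed model (the parameters below are illustrative examples, not the group's measured values):

```python
# Illustrative pedestrian fundamental diagram: flow = density * speed.
# v_free and rho_max are hypothetical example values, not measured data.
v_free = 1.34    # free walking speed, m/s
rho_max = 5.4    # jam density, persons/m^2

def speed(rho):
    return max(0.0, v_free * (1.0 - rho / rho_max))

def flow(rho):
    return rho * speed(rho)

# Flow vanishes at zero density and at jam density, peaking in between:
print(round(flow(0.0), 2), round(flow(rho_max / 2), 2), round(flow(rho_max), 2))
```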
Simulation of the 2nd Bosphorus Bridge
Motivation
The earthquake safety analysis of the 2nd Bosphorus Bridge in Istanbul is of great practical interest.
Strategy
A high-resolution FEM model of the bridge has to be developed
Empirical knowledge of typical earthquake loads at Istanbul provides input for the simulation
The resulting FEM calculations require supercomputer resources
Cooperation
Kandilli Observatory and Earthquake Research Institute, Bogazici University, Istanbul
Higher Education
German Research School for Simulation Sciences (GRS)
Joint foundation of FZJ and RWTH Aachen
Master courses, doctoral programme in simulation sciences
Cooperations with regional universities
JSC scientists with professorships at Aachen, Wuppertal and Bonn
Bachelor & Master programme in Technomathematics (Aachen U. Appl. Sci.)
Biennial graduate schools in Scientific Computing
Grid Technology and Infrastructures
Development and Usage of UNICORE
More than a decade (1999-2011) of German and European research & development and infrastructure projects; among many others, e.g.:
UNICORE
UNICORE Plus
EUROGRID
GRIP
GRIDSTART
OpenMolGRID
UniGrids
VIOLA
DEISA
NextGRID
CoreGRID
D-Grid IP
EGEE-II
OMII-Europe
A-WARE
Chemomentum
eDEISA
PHOSPHORUS
D-Grid IP 2
SmartLM
PRACE
D-MON
DEISA2
ETICS2
SLA4D-Grid
WisNetGrid
Eclipse-based UNICORE Rich Client (URC)
UNICORE architecture (overview)
Clients (scientific clients and applications): UCC command-line client; URC Eclipse-based rich client; portals, e.g. GridSphere; HiLA programming API
Central services (running in WS-RF hosting environments): Workflow Engine, Service Orchestrator, Service Registry, CIS Info Service (OGSA-RUS, UR, GLUE 2.0), UVOS VO Service
Per site (Site 1, Site 2, ...):
Gateway: authentication (X.509, proxies; SOAP, WS-RF, WS-I, JSDL)
UNICORE WS-RF hosting environment: UNICORE Atomic Services and OGSA-* interfaces (OGSA-ByteIO, OGSA-BES, JSDL, HPC-P)
XNJS with IDB: job incarnation
Target System Interface: local RMS (e.g. Torque, LL, LSF, etc.) via DRMAA
XUUDB and XACML entity: authorization (X.509, XACML, SAML, proxies)
USpace and external storage: data transfer to external storages (GridFTP, proxies)
UNICORE usage in D-Grid
Core D-Grid sites committing parts of their existing resources to D-Grid:
Approx. 700 CPUs
Approx. 1 PByte of storage
UNICORE is installed and used
Additional sites received extra money from the BMBF for buying compute clusters and data storage:
Approx. 2000 CPUs
Approx. 2 PByte of storage
Sites shown include LRZ and DLR-DFD
Usage in DEISA
Consortium of leading national HPC centres in Europe
Deploy and operate a persistent, production-quality, distributed, heterogeneous HPC environment
UNICORE as Grid middleware, on top of DEISA's core services:
Dedicated network
Shared file system
Common production environment at all sites
Used e.g. for workflow applications
Partners: IDRIS-CNRS (Paris, France), FZJ (Jülich, Germany), RZG (Garching, Germany), CINECA (Bologna, Italy), EPCC (Edinburgh, UK), CSC (Helsinki, Finland), SARA (Amsterdam, NL), HLRS (Stuttgart, Germany), BSC (Barcelona, Spain), LRZ (Munich, Germany), ECMWF (Reading, UK)
www.deisa.eu
Performance Analysis
Cross-Sectional Group Parallel Performance
Objective
Optimization tools for parallel codes with highest scalability
Research
Scalasca: performance analysis tool for large-scale systems
Current projects
ParMA, SILC (BMBF); VI-HPS (Helmholtz)
Scalasca-Cube screenshot
Blood Pump Code after Improvement
Some Fundamental Stuff
Approaching the Femto-Dimension
(Length scale: nano, pico, femto, atto, zepto, yocto)
Well understood:
Era of BBN
(Nucleosynthesis)
To be confirmed:
Era of Quark-Hadron
Transition: QCD
Unknown:
Dark Matter
Some theory…
10 Breakthroughs of the Year 2008 SCIENCE VOL 322:
Proton's Mass 'Predicted'
STARTING FROM A THEORETICAL DESCRIPTION OF ITS INNARDS, physicists precisely calculated the mass of the proton and other particles made of quarks and gluons. The numbers aren't new; experimenters have been able to weigh the proton for nearly a century. But the new results show that physicists can at last make accurate calculations of the ultracomplex strong force that binds quarks….
Many Thanks for Your Attention!