LHC Control System: Tutorial for the FNAL LARP group
Jim Patrick, November 9, 2005
General Overview
Taken from a number of talks, mostly: the ICALEPCS conference, October 2005 (33 CERN presentations!), the December 2004 AB/CO "Controls Day", plus other miscellaneous talks.
Goals: Give some practical flavor of the system, but not prepare one to write actual code. Relate concepts to the FNAL system.
I am not at all an expert.
Outline
• Overview
• Timing
• Device Model
• Front-End Software
• Application Software and Development
• Services
• Security
• Summary
CERN – AB Department
b. frammery -10.10.2005
The LHC Control System
B. Frammery, for the CERN AB/CO Group
CERN machines
[Diagram] Until 2003 the SL Division ran the SPS (and LEP) and the PS Division ran the PS complex; since 2003 both are merged into the AB Department, with the LHC to come.
Strategy
• Develop new software and hardware infrastructures
  o For LHC
  o To be used & tested on all the new developments
  o To be spread over all the CERN accelerators at a later stage
• Integrate industrial solutions as much as possible
Meaning that, meanwhile, the "legacy" controls for LINAC2, the PSB, the PS and the SPS are to be maintained.
[LHC controls architecture diagram] Three tiers linked by TCP/IP communication services over the CERN Gigabit Ethernet Technical Network (plus the public Ethernet network for central and local operator consoles and fixed displays): file servers (Linux/HP ProLiant), application servers (Linux PC) and PVSS SCADA servers; below them, RT/LynxOS VME front ends, Linux/LynxOS PC front ends, cPCI front ends and PLCs. The front-end tier connects to the LHC machine through WorldFIP segments (1 and 2.5 Mbits/s), Profibus, FIP/IO, optical fibers and an analogue signal system, with timing generation distributed to all tiers. The equipment covered includes actuators and sensors (cryogenics, vacuum, etc.), quench protection agents, power converter function generators, beam position monitors, beam loss monitors, beam interlocks and RF systems.
Software frameworks
The software frameworks (1)
• Front-End Software Architecture (FESA): complete environment for real-time, model-driven control software, implemented in C++ for the LynxOS and Linux platforms
• Java framework for accelerator controls:
  o Uses J2EE application servers with lightweight containers
  o Plain Java objects (no EJB beans)
  o Applications can run (for test) in a 2-tier setup
  o Unified Java API for Parameter Control (JAPC) to access any kind of parameter
  o Runs on the Linux platform
The software frameworks (2)
• UNified Industrial Control System (UNICOS)
  o Complete environment for designing, building and programming industrial control systems for the LHC
  o For cryogenics, vacuum, environmental controls etc.
  o Supervision layer: PVSS II (SCADA from ETM)
  o Cross communication with the accelerator controls framework possible
UNICOS and the Java framework for accelerator controls use the same graphical symbols and color codes.
Machine Timing & Sequencing
Overview I: Central Timing. What is the CBCM?
The Central Beam and Cycle Manager (CBCM) is a collection of hardware and software systems responsible for coordinating and piloting the timing systems of CERN's accelerators.
In the LHC era, the CBCM will control Linac-II, Linac-III, the PSB, CPS, ADE, LEIR, SPS and the LHC timing systems.
The CBCM will also drive the Beam Synchronous Timing (BST) for LHC. There will be 3 distributions R1, R2, Experiments.
Hardware II: The LHC MTG
[Diagram] The Main MTG and the LHC MTG are linked by a 2.2 Gbit/s optical link with 64 Mb reflective memories. The LHC MTG receives CPS and SPS telegrams and timings (and MTG synchronization when filling), preloaded injector and LHC sequences, external conditions and events, and the safe parameters (energy and intensity per ring, safe-beam, beam-present, extraction-permit and BIC beam-permit flags). It drives the LHC general machine timing (GMT) and three BST cards for the Beam Synchronous Timing. Clocks: 40.08 MHz bunch clock, Frev ticks at 89 µs, 40.00 MHz GPS clock, 1PPS (1 Hz) clock, and the basic period clock.
The LHC telegram (event frame 0x14xxyyyy) will contain at least the following information:
• USER: the cycle ID; values like PILOT, NOMINAL, DUMP, MD…
• PARTY1: the particle type in Ring-1 (protons/ions from LEIR)
• PARTY2: the particle type in Ring-2
• ENERGY1 / ENERGY2: the beam energy in Ring-1 / Ring-2
• INTEN1 / INTEN2: the beam intensity in Ring-1 / Ring-2
• RING: the next ring to be injected (Ring-1, Ring-2, NONE)
• BUNCH: the next target bunch position in the ring (0..35640)
• BATCH: the actual batch number in the ring (1..12)
• BATCHES: the number of CPS batches
• MODE: the machine mode (pre-injection, injection, ramping, physics, dump, etc.)
• BPNM: the basic period number from the start of the cycle
• COMLN: timing trigger bit patterns calculated by the CBCM to trigger specific actions
• STATUS: machine status bits (OK, ABORT, QUENCH, …)
• BEAMID: identifies the next beam in all injectors
Also distributed on the timing cable:
• the SPS telegram and the CPS telegram
• the UTC time each second
• the LHC 1 kHz events (0x0100xxxx)
• the LHC machine events (CTIM X: 0x13xx0000)
• some CPS & SPS events, such as the SPS extraction kicker warning pre-pulse
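The telegram fields above could be modeled in code roughly as follows. This is an illustrative sketch only: the field names follow the telegram description, but the Map-based representation and all class and method names are invented here, not the real CBCM/timing API.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical in-memory model of one LHC telegram snapshot.
public class TelegramSketch {
    public static final int RING_NONE = 0, RING_1 = 1, RING_2 = 2;

    private final Map<String, Integer> fields = new HashMap<>();

    public void put(String name, int value) { fields.put(name, value); }

    public int get(String name) { return fields.getOrDefault(name, 0); }

    // The next target bunch position must lie in 0..35640 per the slide above.
    public boolean bunchValid() {
        int b = get("BUNCH");
        return b >= 0 && b <= 35640;
    }

    public static void main(String[] args) {
        TelegramSketch t = new TelegramSketch();
        t.put("RING", RING_1);
        t.put("BUNCH", 1200);
        t.put("BATCH", 3);
        System.out.println("next ring=" + t.get("RING")
                + " bunch ok=" + t.bunchValid());
    }
}
```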
LHC Timing cable event format: LHC events can have no "next" lines; payloads are 0000 for LHC events; CTIM X = F (code), LHC machine = 1, event type = 3.
Overview IV: Timing Reception
The CTRx receiver (V/VME, I/PCI, P/PMC):
• Down to 1 ns UTC time stamping if the HPTDC is installed, else 25 ns
• 50 MHz external clocks; 1PPS, 1 kHz, 10 MHz and 40 MHz internal clocks
• Counters are 24-bit; 2048 actions supporting MP and PPM
• Telegram and payload handling; full counter remote control
• Fully integrated into FESA and the alarms monitor
(The Tim/Tgm library layers FESA over the CTRx and the SPS/PS Tg8 modules.)
Data Management
Databases : the 4 domains of data
[Diagram: the four domains] Physical equipment (serial number, equipment catalogue); machine layout (installed equipment, type, optics, powering); controls configuration (computer address); operational data (settings, measurements, alarms, logging, post-mortem). A consistent naming and identification scheme is defined in the Quality Assurance Plan.
Device/Property Model
A device is a named entity within the control system, which corresponds to a physical device (Beam Position Monitor, Power Converter) or to a virtual controls entity (e.g. transfer line).
The state of a device is accessed via properties and can be read or modified by the get and set access methods (synchronous and asynchronous).
– Uses CORBA, hidden from the user by the "Controls Middleware"
A property can be monitored (publish/subscribe):
– A cycleSelector or a polling period can be specified
– Optional on-change mode: the client will be notified only when the property has changed (server criteria)
– Uses Java Message Service (JMS) publish/subscribe technology
Device classes can implement many properties of simple type or a few properties of composite type.
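A minimal sketch of the get/set and on-change monitoring semantics described above. The real system goes through CORBA and the Controls Middleware; this in-memory Device class and all its method names are invented here purely to illustrate the model.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Consumer;

// Hypothetical stand-in for a controls device with named properties.
public class DeviceSketch {
    private final String name;
    private final Map<String, Object> properties = new HashMap<>();
    private final Map<String, List<Consumer<Object>>> listeners = new HashMap<>();

    public DeviceSketch(String name) { this.name = name; }

    public Object get(String property) { return properties.get(property); }

    // set() updates the property and notifies subscribers only on change,
    // mirroring the optional on-change mode described above.
    public void set(String property, Object value) {
        Object old = properties.put(property, value);
        if (value != null && !value.equals(old)) {
            for (Consumer<Object> l
                    : listeners.getOrDefault(property, Collections.emptyList())) {
                l.accept(value);
            }
        }
    }

    public void monitor(String property, Consumer<Object> listener) {
        listeners.computeIfAbsent(property, k -> new ArrayList<>()).add(listener);
    }

    public static void main(String[] args) {
        DeviceSketch bpm = new DeviceSketch("BPM.UA23");   // name is made up
        bpm.monitor("Position", v -> System.out.println("Position -> " + v));
        bpm.set("Position", 0.42);   // notifies: value changed
        bpm.set("Position", 0.42);   // no notification: unchanged
    }
}
```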
Device Model
• Devices refer to higher level constructs than in ACNET
• Devices have properties; may have more than one
• All properties are like C structures, not confined to READING, SETTING etc. as in ACNET
  o You name the elements ("parameters")
  o These have associated datatype, units, dimension, minimum and maximum value etc.
  o Can be atomic (single element) or composite (multiple elements)
• Scaling is assumed to be done by the front-end
• Device structure is defined in "MetaProperty" classes for each general type
• Formal hierarchical naming scheme
• A "Working Set" device is a collection of devices
Beam Current Transformer Acquisition as an example of a composite property

tag                   | value type
cycleId               | Long
timeNano              | Long
numberOfBunch         | Integer
maximumBunchIntensity | Float
bunchIntensity        | Float[NbOfBunch]
minimumBunchIntensity | Float
bunchSpreadSigma      | Float
statusTag             | Long
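Reading such a composite property might look like the sketch below. The real JAPC ParameterValue API differs; the CompositeValue class and accessor names here are hypothetical, used only to show how tagged entries of mixed type (scalars plus the bunchIntensity array) travel together in one acquisition.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical holder for one composite acquisition.
public class CompositeValueSketch {
    private final Map<String, Object> entries = new HashMap<>();

    public void put(String tag, Object value) { entries.put(tag, value); }
    public long getLong(String tag) { return (Long) entries.get(tag); }
    public int getInt(String tag) { return (Integer) entries.get(tag); }
    public float[] getFloatArray(String tag) { return (float[]) entries.get(tag); }

    public static void main(String[] args) {
        CompositeValueSketch acq = new CompositeValueSketch();
        acq.put("cycleId", 12345L);
        acq.put("numberOfBunch", 3);
        acq.put("bunchIntensity", new float[] {1.1f, 0.9f, 1.0f});

        // All entries stem from the same acquisition, so the array length
        // matches the numberOfBunch scalar.
        float[] intensity = acq.getFloatArray("bunchIntensity");
        System.out.println("bunches=" + acq.getInt("numberOfBunch")
                + " first=" + intensity[0]);
    }
}
```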
StandardEntries
Input Form for metadata
Generic Workset Display Program
Generic Control Knob Component
Initialisation Parameters
Any status or value control parameter for a device can be stored as a REFERENCE. This includes arrays for function generators.
This can be done for any of up to 64 virtual machines which configure our accelerators for a particular operation like injecting protons in the LHC.
Particular sets of values can be stored in named ARCHIVES for a virtual machine. Used to set up the machines for a particular operation.
The Directory Service provides interfaces for storing and retrieving REFERENCES and ARCHIVES.
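A hypothetical sketch of the Directory Service interface described above: one REFERENCE value set per virtual machine, and any number of named ARCHIVES. All class and method names are invented for illustration; the real service's API is not shown in this talk.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical store keyed by (virtual machine, set name, parameter).
public class DirectorySketch {
    private final Map<String, double[]> store = new HashMap<>();

    private static String key(int vm, String setName, String param) {
        return vm + "/" + setName + "/" + param;
    }

    // REFERENCE: the single reference value set per virtual machine.
    public void storeReference(int vm, String param, double[] values) {
        store.put(key(vm, "REFERENCE", param), values);
    }
    public double[] getReference(int vm, String param) {
        return store.get(key(vm, "REFERENCE", param));
    }

    // ARCHIVE: any number of named value sets per virtual machine.
    public void storeArchive(int vm, String name, String param, double[] values) {
        store.put(key(vm, name, param), values);
    }
    public double[] getArchive(int vm, String name, String param) {
        return store.get(key(vm, name, param));
    }

    public static void main(String[] args) {
        DirectorySketch dir = new DirectorySketch();
        // Virtual machine 7 might configure "inject protons into the LHC";
        // arrays cover function-generator settings, as noted above.
        dir.storeReference(7, "RF.Voltage", new double[] {1.0, 2.0, 3.5});
        dir.storeArchive(7, "startup-2005", "RF.Voltage",
                new double[] {0.8, 1.9, 3.4});
        System.out.println(dir.getReference(7, "RF.Voltage").length + " points");
    }
}
```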
15/12/2004 Common Application Infrastructure – AB/CO Day - Lionel Mestre 25
JAPC
• "Java API for Parameter Control"
• Single API for all Java applications to access devices (physical / virtual)
• Based on the concept of parameter (device/property)
• Unified and simple access to various systems:
  o Hardware (via Controls Middleware, CMW; including PVSS devices)
  o Directory Service (descriptions)
  o SDDS logged data, simulation
  o Virtual parameters in the middle tier
• Provides further services to applications: metadata, descriptions, groups, caching, transactions
ICALEPCS 2005 – Vito Baggiolini, CERN
JAPC code example
• Counter device named Counter11 with one property with one parameter named Measurement that is an int:

Parameter par = Factory.createParameter("Counter11", "Measurement");
CycleSelector sel = Factory.createCycleSelector(END_OF_CYCLE);
ParameterValue value = par.get(sel);
int counts = value.getInt();

• Code generation facility to make a pseudo-"wide" API
  – e.g. par.getMeasurement();
  – Compile-time check
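A sketch of what such a generated "wide" wrapper might look like for the Counter example above: a typed getMeasurement() replaces the generic get()/getInt() pair, so a misspelled parameter name fails at compile time instead of run time. The GenericParameter stub below stands in for the real JAPC classes, which are not reproduced here.

```java
public class WideApiSketch {
    // Stand-in for the generic JAPC parameter (string-keyed, run-time typed).
    static class GenericParameter {
        int value = 42;
        int getInt(String parameterName) { return value; }  // name ignored in stub
    }

    // What a generated class for Counter11 could resemble.
    static class CounterParameter {
        private final GenericParameter delegate;
        CounterParameter(GenericParameter delegate) { this.delegate = delegate; }
        // Typed accessor: the name "Measurement" is baked in at generation time.
        int getMeasurement() { return delegate.getInt("Measurement"); }
    }

    public static void main(String[] args) {
        CounterParameter par = new CounterParameter(new GenericParameter());
        System.out.println("counts = " + par.getMeasurement());
    }
}
```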
FESA generic services
0. Outline
Our offering to the equipment specialist: "A comprehensive offering consisting of a model, method, framework, suite of tools and set of utility packages and support services."
Our progress at a glance: "FESA switches from being a project to being an open-ended activity."
2. Service offering
• Model: formal generic model and customization language
• Method: workflow formalized as a step-by-step method
• Tools: one tool dedicated to each step of the above
• Framework: reusable C++ package which can be tailored
• Utility packages: "à la carte" interfacing with PLC and timing
• Documentation: essentials, plus the tools' on-line documentation
• Training: mostly ad hoc; on-line tutorial
• Support: mostly ad hoc
• Consulting: long-term goal
• Requirements: issue management system
3. Client needs coverage as of today
The tools cover the full workflow: (1) design the equipment software, (2) configure alarms, (3) implement in C++, (4) deploy on the FEC, (5) configure timings, (6) instantiate (a: hardware configuration, b: field-bus standards), (7) test.
http://project-fesa.web.cern.ch/project-fesa/
Arruat et al., ICALEPCS 2005
FESA Development
• The framework attempts to automate development
• Minimize the code that must be written by the developer via automatic generation of code and configuration information
• Library support for timer cards and common devices
• Generic GUI tools guide one through 4 main phases:
  1. Designing the class structures (internal variables, real-time scheduling, external API etc.)
  2. Deploying the class on a front-end computer
  3. Instantiating 1 or more instances of a deployed class (defining configuration values for internal variables, real-time scheduling etc.)
  4. Testing over the accelerator middleware
LHC Software Architecture
• All accelerators share common characteristics
• Create a model that captures those characteristics important for control
• Have a common domain model
• Have common software components to work with this model
• Rationalize software development to reuse and extend the common parts to control all accelerators and transfer lines
11/10/2005 Architecture for LHC Controls – iCALEPCS 2005 - Lionel Mestre 35
[3-tier architecture diagram] A client tier of applications talks via the LSA Client API (Spring HTTP remoting / proxies and a JAPC RemoteClient over JMS) to the business tier running in a web container: the LSA Core (settings, trim, trim history, generation, optics, exploitation, reference), parameter concentration, and Data Access Objects (Hibernate / Spring JDBC) against the datastore. Device access goes through JAPC and CMW/RDA over CORBA IIOP.
[Example: the ~4000 Beam Loss Monitors] The BLM readings (BLM1 … BLM99 per crate) are concentrated in the middle tier, and the concentrated publication is broadcast to the clients: operator consoles, logging and fixed displays.
[Diagram] The complex business logic (settings, trim, trim history, generation, optics, exploitation, reference) sits in the middle tier between the datastore and devices on one side and the many applications on each operator console on the other.
Extrapolation to Beamline Settings
[Diagram] A beamline control GUI talks through the middleware to an application server (container) holding per-device-type data modules (motor and magnet data modules), plus the hardware configuration, beamline settings and beamline layout. Example: beamline H2 with Layout = [Tax1, Bend1, Coll3, …], H2 = [Tax1, Bend1, Coll3, …], motors Mot3/Mot4/Mot5 on the devices, and a named setting such as "150 GeV e-".
Complexity must be handled
• Need for standard services:
  o Service discovery (find where services are)
  o Remoting (split the application among tiers)
  o Transaction handling (multiple device "sets")
  o Database access (object-relational mapping)
  o Security (who/what/where can access)
One Answer: J2EE + EJB
• Infrastructure provides standard services
• Widely used in industry
• In-house experience
• But it changes the programming model:
  o Intrusive: forces the use of the container and of components
  o Ties the persistency to the container
  o Debugging requires an application server on the local PC
  o Deployment hell
Another Answer: J2EE without EJB, plus the Spring Framework
• Design for 3 logical tiers; run 2 or 3 physical tiers
• Developers write plain Java
• No change in the programming model
• Focus on our domain; no time for doing infrastructure
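The "plain Java" point above can be sketched as follows: business logic is an ordinary class with constructor injection, so it runs identically inside the 3-tier server (Spring-wired) or directly in a 2-tier test setup. The TrimService and SettingsDao names are invented for illustration; they are not the real LSA classes.

```java
import java.util.HashMap;
import java.util.Map;

public class PojoWiringSketch {
    interface SettingsDao {                  // persistence behind an interface
        double load(String param);
        void save(String param, double v);
    }

    static class TrimService {               // plain object: no EJB, no container API
        private final SettingsDao dao;
        TrimService(SettingsDao dao) { this.dao = dao; }
        double trim(String param, double delta) {
            double v = dao.load(param) + delta;   // apply an incremental trim
            dao.save(param, v);
            return v;
        }
    }

    public static void main(String[] args) {
        // 2-tier test wiring: an in-memory DAO instead of the database-backed one.
        Map<String, Double> mem = new HashMap<>();
        SettingsDao dao = new SettingsDao() {
            public double load(String p) { return mem.getOrDefault(p, 0.0); }
            public void save(String p, double v) { mem.put(p, v); }
        };
        TrimService svc = new TrimService(dao);
        System.out.println("new value = " + svc.trim("QF.current", 0.5));
    }
}
```

In the 3-tier deployment the same TrimService would simply be handed a database-backed DAO by the container configuration; the class itself does not change.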
Applications
• Trim (perform + history browser/revert)
• Orbit Steering
• Generic Equipment Control
• Fixed Displays
• SDDS Logger and Viewer
• Optics Twiss viewer
• Settings Generation
• All using the Application Frame
27/01/2005 LSA for SACEC 43
Generic Equipment Control
Generic Measurement
Trim
Trim history
Visualization of the settings
Orbit Steering
Fixed Displays
Optics Display
Results and Future Targets
Done: control of TI8 (October 2004), steering of the SPS ring orbit, LEIR controls.
Upcoming: SPS start-up, extraction sequencing TI2/TI8, LHC sector test.
CERN – AB Department G. Kruk – 14.10.2005
Development Process Issues
• Projects and code organization (guidelines, naming conventions, directory structure)
• Source versioning management (CVS)
• Build services: automation of common tasks
  o Compilation, JAR
  o Documentation generation
• Dependencies management
• Release management: releasing new versions of software in a dedicated repository
• Applications deployment
• Issues & bugs tracking (JIRA)
Common-Build constraints (directory structure)
equipstate/
  build.xml
  product.properties
  product.xml
  people
  src/
    java/
    test/
Target examples
Based on Apache Ant, a Java-based open source build tool (like Make):
• Compiling sources: ant compile
• Building a distribution of the product: ant dist
• Releasing a new version of the product: ant release
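The targets above might be declared along these lines. This is a sketch only: the real Common-Build files are not shown in the talk, the paths merely follow the directory structure above, and the release logic (CVS extraction, multi-versioned repository install) is omitted.

```xml
<!-- Hypothetical Ant build file in the Common-Build layout sketched above. -->
<project name="equipstate" default="compile">
  <!-- ant compile : compile the Java sources -->
  <target name="compile">
    <mkdir dir="build/classes"/>
    <javac srcdir="src/java" destdir="build/classes"/>
  </target>
  <!-- ant dist : package the compiled classes into a JAR -->
  <target name="dist" depends="compile">
    <jar destfile="build/equipstate.jar" basedir="build/classes"/>
  </target>
</project>
```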
What Release does
• Extracts the product from CVS to the dedicated production repository
• Builds the product (calling Common-Build)
• Installs it in a multi-versioned repository: a new version is added without modifying the old ones, so old versions can always be used
• Updates the product aliases (symbolic links)
Eclipse IDE
GUI Applications deployment
We use the Java Web Start deployment technology:
• uses a special XML descriptor (JNLP file) to deploy and run applications
• ensures that all required libraries (cached locally) are up to date
The repository contains all libraries and JNLP files; a Console Manager holds the directory of applications and starts them on request.
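A JNLP descriptor of the kind mentioned above looks roughly like this. The codebase URL, jar name and main class are placeholders invented for illustration, not the real CERN repository layout.

```xml
<!-- Sketch of a Java Web Start descriptor; all names are placeholders. -->
<jnlp spec="1.0+" codebase="http://example.cern.ch/apps" href="trim.jnlp">
  <information>
    <title>Trim</title>
    <vendor>AB/CO</vendor>
  </information>
  <resources>
    <j2se version="1.4+"/>
    <!-- cached locally; re-fetched automatically when the repository updates -->
    <jar href="trim.jar"/>
  </resources>
  <application-desc main-class="example.trim.Main"/>
</jnlp>
```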
GENERAL SERVICES
The Alarm System
• LHC Alarm SERvice (LASER)
[Diagram] Alarm sources feed the LASER service through JMS brokers: LEIR (FESA), new SPS alarms (FESA) and LHC (FESA) directly; legacy PS alarms through a PS alarm gateway; legacy CAS alarms (SPS, TCR, CSAM) through a CAS alarm gateway, during the transition from the current to the new system.
• "Standard" 3-tier architecture
• Java Message Service (JMS)
• Subscription mechanism
LASER project – CO Day Workshop
Architecture
• Resource tier: the alarm sources (accelerator devices, technical services, control software)
• Business tier: distribution, gathering, definition, reduction and archiving services, exposed through the Laser Source API and the Laser Client API
• Presentation tier: the alarm clients (alarm consoles, definition consoles, admin consoles, external clients)
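The source-side flow above, raising and terminating alarms through the business tier, can be sketched as follows. This is entirely hypothetical: the AlarmService class and its methods are invented for illustration and are not the real Laser Source API.

```java
import java.util.HashSet;
import java.util.Set;

public class AlarmSketch {
    // Hypothetical service tracking the set of currently active alarms,
    // each identified by a string id pushed by a source.
    static class AlarmService {
        private final Set<String> active = new HashSet<>();

        void push(String id, boolean activeNow) {
            if (activeNow) active.add(id); else active.remove(id);
        }

        int activeCount() { return active.size(); }
    }

    public static void main(String[] args) {
        AlarmService laser = new AlarmService();
        laser.push("CRYO:QRL_A12:1", true);   // fault appears (ids are made up)
        laser.push("VAC:VGP_21:3", true);
        laser.push("CRYO:QRL_A12:1", false);  // fault terminates
        System.out.println("active alarms: " + laser.activeCount());
    }
}
```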
Logging
• Several 105 parameters will be logged• Every data or setting is timestamped (UTC)• Parameters are logged
o on regular intervals (down to 100 ms)o on requesto on-change
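The on-change mode listed above is typically implemented with a dead-band filter: a sample is forwarded only when it differs from the last logged value by more than a threshold. The class below is illustrative only; the slides say the actual change criteria are defined server-side.

```java
public class OnChangeFilterSketch {
    private final double deadBand;
    private Double last = null;   // no sample logged yet

    public OnChangeFilterSketch(double deadBand) { this.deadBand = deadBand; }

    /** Returns true if this sample should be logged. */
    public boolean accept(double value) {
        if (last == null || Math.abs(value - last) > deadBand) {
            last = value;         // remember the last *logged* value
            return true;
        }
        return false;             // within the dead band: suppress
    }

    public static void main(String[] args) {
        OnChangeFilterSketch f = new OnChangeFilterSketch(0.1);
        double[] samples = {1.00, 1.02, 1.25, 1.26, 0.90};
        for (double s : samples) {
            if (f.accept(s)) System.out.println("log " + s);
        }
        // logs 1.00, 1.25 and 0.90; the two small moves are suppressed
    }
}
```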
LHC Logging Service
[Diagram] A 3-tier service: Oracle database servers (LHCLOGDB, with DEVDB9 for development and data migration); Oracle application servers (ABJAS4 on PC, ABOFS1 on HP) hosting production, test and backup instances (LHCLogging / LHCLoggingTest) with loader and GUI deployments; and a client tier with data input from established clients (TT40, vacuum, cryo) and new clients, and data output through a Java/XML API. (R. Billen, 10.08.2004)
http://lhc-logging.web.cern.ch/lhc-logging/
Timber
• Define standard logging parameters, analogous to D43
• Java client API available so programs may log as they wish
• Access to data via GUI applications, web interface, API
• All logging goes to a single Oracle database: lots of disk allocated, but there are scaling issues
• All data is first logged into a "Measurement DB" cache with a lifetime of about a week; periodically (~15 min) selected data is inserted into Oracle
• Shot-by-shot logging???
Timber Web Interface
SDDS Logging & Monitoring
SDDS Browser & Viewer
Post Mortem
• Automatic (typically when an interlock appears) or manual trigger
• No beam allowed if PM not ready
• Capture of:
  o Logged data
  o Alarms (LASER)
  o Transient recorder signals (OASIS)
  o Fixed displays
• Analysis:
  o A few gigabytes per Post Mortem capture
  o Structured sorting of causes & effects
  o Needed from October 2005 for hardware commissioning
  o Continuous development effort for the years to come
The TI8 extraction test in October 2004 already proved the importance of a PM system: to take a snapshot of the LHC vital systems.
Analogue signals
Open Analogue Signals Information System (OASIS):
o To visualize and correlate, in real time, time-critical signals in the control room
o ~500 signals for LHC at 50 MHz bandwidth (plus ~1000 in PS/SPS)
o Distributed cPCI system using analogue multiplexers and oscilloscope modules (Acqiris or other types) close to the equipment
o Triggers through the timing network for precise time correlations
o Standard 3-tier architecture
The ancestor
Real-Time Feedback systems
• LHC orbit feedback:
  o 2000 beam position parameters
  o 1000 steering dipoles
  o 10 Hz frequency
• LHC tune feedback: a modest system, 4 parameters and some 30 PCs (up to 50 Hz?)
• LHC chromaticity feedback: considered, but it is difficult to obtain reliable measurements
Orbit Feedback system
• Centralized architecture
• More than 100 VME crates involved
• Runs over the Technical Network
• Tests on the SPS in 2004 were successful
• Simulations show 25 Hz capability
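One cycle of a centralized orbit feedback of the kind described above can be caricatured as: read the positions, compute corrector kicks, write them back. A real feedback uses a response matrix (often via SVD); this scalar-gain loop is only a placeholder for the data flow, and all names are invented.

```java
public class OrbitFeedbackSketch {
    // One feedback iteration: proportional correction toward zero orbit.
    public static double[] correctionStep(double[] orbit, double gain) {
        double[] kick = new double[orbit.length];
        for (int i = 0; i < orbit.length; i++) {
            kick[i] = -gain * orbit[i];   // steer each position back toward 0
        }
        return kick;
    }

    public static void main(String[] args) {
        double[] orbit = {0.4, -0.2, 0.1};       // mm, as read from the BPMs
        double[] kick = correctionStep(orbit, 0.5);
        System.out.printf("first kick = %.2f%n", kick[0]);  // prints -0.20
    }
}
```

At 10 Hz, a step like this would run 10 times per second against the ~2000 position parameters and ~1000 steering dipoles quoted above.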
Networking
• General Purpose Network (GPN)– Desktop Computing, testing, access from outside, …
• Technical and Experiment Network (TN and EN)– Only operational devices
– Authorization procedure
• Inter domain communications– Application Gateways + Trusted services
• Network monitoring and intrusion detection– Performance and statistics
– Disconnection on “breakpoints”
• Testing– TOCSSiC (hostile network environment)
Security Policy
• Network Domains– Physical network segregation + Functional Sub-Domains (FSD)
• Hardware Devices– No USB, modems, CDs, wireless …
• Operating System
• Software– Development guidelines, installation and test procedures
• Logins and passwords– Traceability, no generic accounts, strong passwords
• Training
• Security Incidents and Reporting
Cheat Sheet

FNAL                          | CERN
TLG                           | CBCM/MTGs
MDAT/TCLK                     | LHC Telegram / Machine Cycle; Events
ACNET protocol                | Controls Middleware (CMW)
MOOC                          | FESA
VxWorks                       | LynxOS/Linux
Data Acquisition Engine (DAE) | Application Server
MECCA                         | ant
VMS/Windows/Solaris/Linux     | Linux
C/Java                        | Java
Sybase                        | Oracle
Lumberjack                    | Timber
NUMI                          | CNGS
Behind the firewall           | On the Technical Network
Summary
A new control system is being developed for the LHC that will eventually be used by all CERN accelerators. This is highly ambitious, particularly on the required time scale.
There is some qualitative, philosophical similarity to the FNAL system, but a totally different implementation.
More sophisticated device model. Attempts at better information management and more reuse of machine software among the various accelerators and transfer lines.
The infrastructure is new and will likely have growing pains. There is an enormous amount of hardware to install and software to write and make work in less than 2 years…