Lunarc history


1986 - 1988  IBM 3090 150/VF
1988 - 1991  IBM 3090 170S/VF
1991 - 1997  Workstations, IBM RS/6000
1994 - 1997  IBM SP2, 8 processors
1998  Origin 2000, 46 processors, R10000
1999  Origin 2000, 100 processors, R12000, 300 MHz
2000  Origin 2000, 116 processors, R12000, 300 MHz
2000  Beowulf cluster with 40 AMD 1.1 GHz CPUs
2001  64 of the Origin 2000 processors were relocated to NSC
2002  A 64-processor cluster, AMD Athlon 1900+ (WhenIm64)
2003  128 processors added (Toto7), Intel P4 2.53 GHz

Current hardware

• Husmodern, cluster
  – 32 nodes, 1.1 GHz AMD Athlon, installed 2000-12-01

• WhenIm64/Toto7, clusters
  – 65 nodes, AMD 1900+, installed 2002-04-08
  – 128 nodes, P4 2.53 GHz, installed 2003-02-18
  – File server, login nodes, etc.

• Ask, SGI Origin 2000
  – 48 nodes, R12000, 300 MHz, 12 GB


About Lunarc

• Current staff
  – 1.3 FTE

• Future administration
  – 2.5 FTE (minimum, depending on contract formulations)

Current users

• Core groups
  – Theoretical Chemistry, Physical Chemistry 2, Structural Mechanics

• Other large users
  – Fluid Mechanics, Fire Safety Engineering, Physics

• New groups
  – Inflammation Research, Biophysical Chemistry, Astronomy


Total usage by categories

• Engineering: 16%
• Chemistry: 79%
• Computer Sciences: 0%
• Physics: 3%
• Biology: 2%

Lunarc web

• User registration
• System information
• System usage
• Job submission?

Using clusters

• Log in
  – Use ssh, unix tools, etc.

• mkdir proj
• sftp/scp user@...
• vi/joe submit script
  – Submit script documentation

• Queue management
  – qsub script

• Transfer result files back
  – sftp/scp
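As a sketch of this workflow: the hostname, file names, program name, and resource limits below are hypothetical examples, and the exact submit script directives depend on the local queue system configuration. A minimal PBS-style session might look like:

    # Log in to the cluster and create a project directory
    # (cluster.lunarc.lu.se is a hypothetical hostname)
    ssh user@cluster.lunarc.lu.se
    mkdir proj

    # From the local machine: copy input data over
    scp input.dat user@cluster.lunarc.lu.se:proj/

    # proj/run.pbs - a minimal example submit script
    #!/bin/sh
    #PBS -N myjob                # job name
    #PBS -l nodes=1:ppn=1        # one processor on one node
    #PBS -l walltime=01:00:00    # one hour wall-clock limit
    cd $PBS_O_WORKDIR            # start in the submission directory
    ./myprogram input.dat > output.dat

    # On the cluster: submit the job and watch the queue
    qsub run.pbs
    qstat

    # Back on the local machine: fetch the results
    scp user@cluster.lunarc.lu.se:proj/output.dat .

The same steps work with sftp in place of scp; it is mainly the submit script directives that vary between queue systems.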

For many users this is a straightforward process, so why do we get so many questions?

Web portal for our clusters

• Good knowledge of local circumstances

• Traditional users -> clusters -> grids

• User interface
• Grid of clusters