
Issue 71, Summer 2012

2 APOS: fluid dynamics and petascale systems

3 New metajournal for research software

4 EUDAT: managing research data

5 Nu-FuSE: fusion reactor modelling

6 CRESTA: reaching for exascale

7 MAUS: cooling muons

8 HPC training at EPCC

9 New Bursary for MSc students; Software Boot Camp

10 NESS: next generation sound synthesis

11 Crucible: cross-disciplinary success for EPCC; The value of visitor programmes

12 Meet the new staff at EPCC

In this issue...

The newsletter of EPCC, the supercomputing centre at the University of Edinburgh

Sound synthesis gets real

Good vibrations

Editorial
Lawrence Mitchell

Welcome to the summer 2012 edition of EPCC News. For a long time now, EPCC has been about more than just parallel computing and the articles in this edition highlight the broad range of our activities.

On the HPC side, we have a number of projects involved in exascale computing. CRESTA is using co-design approaches to develop exascale software tooling. Nu-FUSE is tackling the Grand Challenge of fusion energy and looking at exascale computing. APOS is working to shorter timescales, helping important scientific applications to reach petascale performance. Finally, we also have a new project, NESS, which is looking at real-time sound synthesis using HPC hardware.

As well as simulating larger problems ever faster, we need a way to handle all the data that science (computational or not) generates. EPCC is taking a lead role in EUDAT: designing a pan-European data infrastructure.

Our involvement in training the future generations of computational scientists continues. We are a PRACE Advanced Training Centre and, funded by the PRACE project, we will be running a number of HPC short courses in the coming year. As ever, our MSc in HPC continues to attract keen students, ten of whom are making the journey to ISC this June as volunteers. And from next year an annual bursary in memory of our former colleague John Fisher will be awarded to an MSc student. In addition to these events, we’re actively involved in the Software Carpentry scheme. Our visitor programmes remain as popular as ever, and in this edition we hear of their long-lasting legacies.

Also in the vein of training computational scientists, the Software Sustainability Institute continues its work to increase the long-term viability of academic software: for example, we’ve recently finished a collaboration with neutrino physicists at Rutherford Appleton Laboratory to improve the maintainability of their data analysis software. We’re also excited to announce the launch of a new metajournal for research software. It aims to make research software easier to cite, reuse and validate while rewarding collaborative efforts.

Last but not least, we welcome eight new staff members to EPCC, all of whom have joined us in the last year.

Waving not drowning
Lawrence Mitchell

APOS is a joint Russian and European collaboration to study and develop productive programming methodologies for modern HPC systems.

Over the past nine months, EPCC has been working with the Applied Modelling and Computation Group at Imperial College London on the fluid dynamics code, Fluidity. Computational fluid dynamics, in which the behaviour of fluids is studied through computer simulation, is a large and active research area. It encompasses scientific Grand Challenges such as climate and ocean modelling.

We are working on Fluidity as an exemplar application studying programming methodologies for petascale (and future exascale) HPC systems. It is widely acknowledged that to exploit modern multicore systems fully, scientific applications need to move away from an MPI-only programming model to mixed-mode parallelisation strategies that exploit intra-node infrastructure in a more efficient manner.

Our work on Fluidity to date has been in this area. We have successfully added OpenMP threading to the two main computational kernels: assembly of the global finite element problem; and solution of the resulting set of equations. Our recent work in this area has shown that significant performance gains are possible[1]. Performance of the whole simulation is still being investigated, but initial results are very promising.
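
To give a flavour of what mixed-mode parallelisation means in practice, the sketch below combines MPI between processes with OpenMP threading inside each process. It is purely illustrative and is not code from Fluidity: a simple array-summing loop stands in for the finite element assembly and solve kernels mentioned above.

```c
/* Minimal mixed-mode MPI + OpenMP sketch (illustration only, not Fluidity):
 * each MPI process owns a block of a hypothetical array and threads its
 * local loop with OpenMP, while communication stays with MPI. */
#include <mpi.h>
#include <omp.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    int provided, rank, size;
    /* Request a threading level that lets one thread per process call MPI. */
    MPI_Init_thread(&argc, &argv, MPI_THREAD_FUNNELED, &provided);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    enum { NLOCAL = 1000000 };
    static double local[NLOCAL];
    double local_sum = 0.0, global_sum = 0.0;

    /* Thread the per-process work: in a finite element code this loop would
     * assemble element contributions into the global system. */
    #pragma omp parallel for reduction(+:local_sum)
    for (int i = 0; i < NLOCAL; i++) {
        local[i] = (double)(rank * NLOCAL + i);
        local_sum += local[i];
    }

    /* Inter-process communication remains the responsibility of MPI. */
    MPI_Reduce(&local_sum, &global_sum, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);
    if (rank == 0)
        printf("global sum = %f (ranks = %d, threads per rank = %d)\n",
               global_sum, size, omp_get_max_threads());

    MPI_Finalize();
    return 0;
}
```

Built with an MPI compiler wrapper and OpenMP enabled (for example, mpicc -fopenmp), each rank spawns a team of threads for its local loop while message passing is funnelled through a single thread per process: the MPI_THREAD_FUNNELED model.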

For more information on the APOS project, see http://apos-project.eu/

[1] M. Weiland et al., “Mixed-mode implementation of PETSc for scalable linear algebra on multi-core processors”, arXiv:1205.2005v1 [cs.DC].

Direct numerical simulation (DNS) of a turbidity current, generated with Fluidity. Image courtesy AMCG, ICL.

The Software Sustainability Institute, headquartered at EPCC, has partnered with Ubiquity Press to launch the Journal of Open Research Software, a “metajournal” for research software. This will enable all authors of research software to create a permanent, citable, open access record of their software. Additionally, it will enable all researchers to access, reuse and cite the software published in this way. Reuse not only rewards the author, but leads to more efficient, higher quality science by making existing science reproducible and enabling new science to build on the work of others.

What is a metajournal?
Metajournals provide a fully open-access way to discover research resources that are spread across multiple locations and which are usually hard to find. They use mechanisms familiar to anyone who has published or retrieved a “regular” paper. Metapapers reward authors for openly archiving and making accessible their research datasets, software and reports by making them citable and tracking this for indicators of impact. In addition, metajournals provide information to maximise reuse potential of the software or data.

Why a software metajournal?
Traditionally, it has been hard to cite software. This is an issue because it discourages publication of software (there is no incentive), and it discourages reproducible research and the reuse of code (because it is hard to find software and reward reuse). In the long run we require a reevaluation of how we measure and rate the “success” of research, but a software metajournal is a pragmatic approach to using existing, understood mechanisms to help raise the profile of software as a research output on a par with publications, datasets and methods. Without the ability to give credit for both the reuse of software and making it available for reuse, we inhibit reproducible research and disincentivise those for whom developing software is an inherent part of their research. By following open-access principles, software metajournals also ensure that software is made available for the long term.

Aren’t there already journals which publish software?
Unfortunately, there are fewer than there should be. Some disciplines and journals have been forward-thinking enough to ask for the source code to be submitted at the same time as the paper describing the scientific results (eg Computer Physics Communications). Others have opened a special software track alongside their main research tracks (eg Journal of Machine Learning Research and Bioinformatics). However, in both these cases there is a requirement for there to be novelty in both the research and the software. Finally there are journals like Open Research Computation which represent the pinnacle of software journals, with strict criteria and a focus on code quality.

The Journal of Open Research Software fills the gap that isn’t covered by these journals – an open access journal for software that has a low effort barrier for submission. It acts as a journal of record that allows significant versions of a piece of software to be cited and made available, eg in connection with a research paper. Whilst the aim is to make it easy to submit a software metapaper to the journal, it retains a strict peer review process that ensures the accessibility of the software and correctness of the metadata associated with the software.

How do I submit my software to the Journal of Open Research Software?
During the beta phase of JORS, publishing a software metapaper is completely free. An Article Processing Charge of £25 will be introduced once the journal formally launches.

Links
To start the submission process, visit: http://openresearchsoftware.metajnl.com/submit-a-software-paper/

If you’re interested in helping to shape the metajournal as it progresses, or think you would make a good reviewer, send a short summary of your experience to: [email protected]

http://www.software.ac.uk/

http://www.ubiquitypress.com/

http://openresearchsoftware.metajnl.com/

A new journal for software
Neil Chue Hong

Share. Reuse. Validate. Cite. Reward. Collaborate.

The EUDAT project, launched on October 1st 2011, aims to design a pan-European solution to the challenge of data proliferation in Europe’s scientific and research communities: a Collaborative Data Infrastructure (CDI) driven by research communities’ needs.

Europe and its people are faced with an increasing number of economic, demographic, social and environmental issues whose resolution will determine the prosperity of future generations. An ageing population, pollution, the loss of biodiversity, the need for a rapid transport system, shortages of water, energy and fuel, untreated diseases: these are the Grand Challenges for European society in the twenty-first century. From a research perspective, collaboration across national borders presents the best way to tackle these challenges. This requires new, federated approaches to research data management.

Data has become an essential commodity in any large scale research project. Maintaining open access to data should become a priority as researchers seek to use an increasing amount of information within collaborative projects. This phenomenon – huge quantities of data present in many areas of our life – is often referred to as “the data deluge”, or even “the data tsunami”. The collection, curation, storage, archiving, integration and deployment of this deluge is an immense challenge that can no longer be handled by a single organisation or even a single country. Over recent years European countries have invested heavily in research that produces this data. It is now recognised that there is an urgent need for a pan-European infrastructure that will allow us both to extract best value from current and planned investments in this area, and enable collaborative, cross-disciplinary data use.

This infrastructure needs to enhance the existing collective data capacity to meet rising demand while building on existing platforms, ensuring that the collective, expanding capacity across the continent is used optimally. It must also establish universal principles for optimising the use of existing data capacity, such as assessing what data should be stored and made available to users. Ideally it will present a transparent common interface to the world: a “one-stop shop” for research data. This is EUDAT.

To date, EUDAT has carried out a comprehensive review of research communities’ approaches and requirements in the deployment and use of a common and persistent data e-infrastructure. It is now beginning to design the appropriate services and technologies to match these requirements.

To keep track of emerging patterns, and to try to ensure that we cover as broad a spectrum of potential users as possible, we have adopted user personas to exemplify certain important research scenarios. Kathrin Kirsch, for instance, is a German philologist working as a research assistant at the University of Marburg. Kathrin never learned much about computers other than using search engines, Microsoft Word and email programs. She wants to access and use corpora and other linguistic data collections that other users have already created. She is interested in different varieties and genres of corpora and needs extensive metadata on authors, genres, topics, creation dates of the source texts, annotation levels, data formats, access rights and so forth.

In contrast Evelien Erkens is a computational physicist, and a competent C programmer. She is more than happy hacking Perl, Python and shell scripts at the Linux command line. Very often she may be found late in her department, re-analysing results, tuning analysis codes and tidying up the simulation data store. Having a very tidy mind, she has often been known to annotate other people’s data with relevant (and correct!) metadata they may have neglected.

Each of these “archetypal users” helps us identify a particular set of CDI services needed to carry out their research. Some are drawn from the core EUDAT communities, others from the wider fields of computational and data science. So far they point to a wide range of necessary services, but with a clear emphasis on a top five: single sign-on; a comprehensive set of metadata catalogues; persistent identifiers for data; efficient data movement; and safe replication. These will be the priority areas for the first year of EUDAT.

EUDAT is the largest project in Europe looking at harmonising access to research data across disciplines. This is not a trivial challenge: it is as much about policy and people as technology. But it is surely worth the effort. As more and more research data is “born digital” we need to get better at storing, managing and preserving it for the long-term.

“If I have seen further it is by standing on the shoulders of giants,” said Newton. As it heads towards its second year, EUDAT has this very much in mind.

www.eudat.eu

EUDAT: meeting the Grand Challenges... and the importance of data
Rob Baxter

www.flickr.com/photos/seriykotik/195406053

One possible answer to the current energy crisis is fusion power. It is safer than fission-based nuclear power and generates minimal pollution. Fusion power is the subject of Nu-FuSE, a G8-funded project led by Graeme Ackland at the University of Edinburgh and involving seven international partners.

One active area of research in the field of fusion power focuses on how to build the reactor. Of particular interest is the construction of the shell to contain the fusion reaction. Research at the University of Edinburgh, led by Professor Graeme Ackland in the School of Physics and in collaboration with EPCC, is focusing on possible materials for fusion reactors. It uses modelling techniques to understand radiation damage and defects in materials. The material needs to be strong and to maintain its strength over the lifetime of the reactor. This is not a straightforward problem and there are many factors that are unique to fusion reactors. The shell of the reactor will be exposed to high levels of radiation affecting the integrity of the material. The reactor will also need to withstand prolonged exposure to high temperatures as well as the presence of a water-lithium mix that promotes fast corrosion of most metals. This rules out many materials as the main component of the shell. Existing metal alloys do not provide the combination of resistance to radiation, temperature, corrosion and bombardment by particles necessary in a fusion reactor. Instead, the reactor shell is likely to be constructed from a unique new class of material based on a complex metal alloy containing many different elements.

Designing this new material is difficult, if not impossible, with current experimental techniques. Experiments need to expose any material to the conditions inside the reactor, conditions which can currently only be recreated in the reactor itself! The material needs to survive for the lifetime of a commercial fusion reactor (expected to be decades). To avoid testing the materials for decades before they even reach a reactor, shorter observations must be extrapolated, which is at best difficult and unreliable, at worst impossible.

Computational modelling of the reactor material provides a clear avenue of research into construction of the reactor. The challenges this approach must overcome are formidable, involving multiple techniques on multiple scales. Quantum mechanical simulations are needed to understand the energies that bind the atoms and how they react in the presence of radiation, but only work on the nanosecond timescale. Classical molecular dynamics focuses on the atomic scale and considers individual defects in a material which can cause a metal to become brittle and weaken the reactor. Larger systems are treated using continuum-scale modelling, simulating the geometries of the components in the reactor casings on much longer timescales.
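
As a flavour of the classical molecular dynamics level of that hierarchy, the toy sketch below advances a small block of atoms with the velocity Verlet time-stepping scheme, using a generic Lennard-Jones pair potential. All parameters are invented for illustration and this is not code from the project: production codes such as Moldy use specialised interatomic potentials for metals and far larger systems.

```c
/* Toy classical molecular dynamics sketch (illustration only): a 4x4x4
 * block of atoms interacting through a generic Lennard-Jones potential,
 * advanced in time with the velocity Verlet integrator in reduced units. */
#include <stdio.h>

#define N  64        /* number of atoms */
#define DT 1.0e-3    /* time-step in reduced units */

static double x[N][3], v[N][3], f[N][3];

/* Lennard-Jones forces between all pairs of atoms. */
static void compute_forces(void)
{
    for (int i = 0; i < N; i++)
        for (int d = 0; d < 3; d++)
            f[i][d] = 0.0;

    for (int i = 0; i < N; i++)
        for (int j = i + 1; j < N; j++) {
            double dr[3], r2 = 0.0;
            for (int d = 0; d < 3; d++) {
                dr[d] = x[i][d] - x[j][d];
                r2 += dr[d] * dr[d];
            }
            double inv2 = 1.0 / r2;
            double inv6 = inv2 * inv2 * inv2;
            double fr = 24.0 * inv2 * inv6 * (2.0 * inv6 - 1.0);
            for (int d = 0; d < 3; d++) {
                f[i][d] += fr * dr[d];
                f[j][d] -= fr * dr[d];
            }
        }
}

/* One velocity Verlet step: half-kick, drift, new forces, half-kick. */
static void step(void)
{
    for (int i = 0; i < N; i++)
        for (int d = 0; d < 3; d++) {
            v[i][d] += 0.5 * DT * f[i][d];   /* unit mass assumed */
            x[i][d] += DT * v[i][d];
        }
    compute_forces();
    for (int i = 0; i < N; i++)
        for (int d = 0; d < 3; d++)
            v[i][d] += 0.5 * DT * f[i][d];
}

int main(void)
{
    /* Start from a perfect cubic lattice at rest... */
    int n = 0;
    for (int a = 0; a < 4; a++)
        for (int b = 0; b < 4; b++)
            for (int c = 0; c < 4; c++, n++) {
                x[n][0] = 1.2 * a;
                x[n][1] = 1.2 * b;
                x[n][2] = 1.2 * c;
            }
    /* ...then knock one atom, loosely mimicking an energetic impact, and
     * let the disturbance propagate through its neighbours. */
    v[0][0] = 5.0;

    compute_forces();
    for (int s = 0; s < 1000; s++)
        step();

    printf("atom 0 finished at (%.3f, %.3f, %.3f)\n", x[0][0], x[0][1], x[0][2]);
    return 0;
}
```

The step() routine is the method in miniature: compute forces, advance positions and velocities, and repeat for as many time-steps as the physics of interest requires.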

Our work in Edinburgh is currently focusing on examining the properties of the materials, in particular their ability to resist damage from radiation and the effects of helium from the fusion reaction hitting the shell and becoming embedded in it. Both radiation and embedded foreign atoms can cause cascade effects where defects propagate through the material, causing it to weaken and eventually break apart.

As part of our work studying possible reactor materials, the Nu-FuSE project also aims to deliver new functionality and further optimisation to Moldy, the parallel classical molecular dynamics software. Moldy was developed by Graeme Ackland in the 1980s to focus on the simulation of metals, and is therefore particularly suited to modelling fusion reactor materials. We aim to add new methods, including the ability to simulate new metallic potentials and magnetic iron-based systems. We will also incorporate temperature-assisted dynamics to allow the sampling of possible material configurations with significantly less computational effort.

Meeting the challenges of simulating the components of a fusion reactor requires long-term vision and huge amounts of computational resources. As well as our work on simulating the materials that make the reactor, Nu-FuSE will investigate exascale solutions to molecular modelling. We are exploring possible attributes for exascale hardware that would suit molecular simulation as well as techniques to improve the performance of simulations at larger process counts.

We also plan to use the diversity of the project partners to train a new cohort of ‘computational fusion scientists’. We will draw on the international collaboration offered by this project to create joint meetings, seminars and training opportunities.

www.nu-fuse.com

Materials research for fusion reactors and the Nu-FuSE project
Toni Collis and Adrian Jackson

A detailed cutaway of the ITER Tokamak, with the hot plasma (pink) in the centre. Image: ITER Organization

Preparing applications for exascale through co-design
Lorna Smith, Katie Urquhart, Mark Parsons

The need for exascale platforms is being driven by a set of important scientific challenges. These are problems of global significance that cannot be solved on current petascale hardware, but require exascale systems. Example grand challenge problems originate from energy, climate, nanotechnology and medicine and have a strong societal focus. Meeting these challenges requires associated application codes to utilise developing exascale systems appropriately. Achieving this needs a close interaction between software and application developers. EPCC coordinates the EC funded Collaborative Research into Exascale Systemware, Tools and Applications (CRESTA) project. This project focuses on just such a close interaction: bringing together application scientists, software developers and numerical experts to address the exascale challenge through co-design.

The concept of co-design dates from the late 18th century, recognising the importance of a priori knowledge. In terms of modern software, co-design recognises the need to include all relevant perspectives and stakeholders in the design process. The International Exascale Software Project (IESP) recognised the importance of co-design vehicles within the exascale software design and development process[1]. They noted that, “Co-Design Vehicles (CDVs) are applications that provide targets for, and feedback to, the software research, design and development efforts in the IESP”. The IESP concluded it is more important than ever that “we continue to discuss, share and develop our joint knowledge around the exascale challenge. CDVs are required because there are several possible paths to exascale with many associated design choices along the way.”

One of the most successful examples of co-design comes from Quantum Chromodynamics, with projects and systems such as QCDSP, QCDOC (with the resulting influence on IBM’s BlueGene), APE and CP-PACS demonstrating this success. However the breadth of communities engaged in co-design activities has increased with the move towards exascale. It now includes areas such as fusion, computational fluid dynamics, and climate science.

CRESTA firmly believes that the exascale challenge requires application scientists, software developers, hardware designers, and numerical experts to engage in a co-design process. The project has focused on developing a novel, integrated co-design process. Six applications with exascale potential are being used as co-design vehicles to develop the appropriate underlying exascale software. The applications have been chosen as a representative sample from across the supercomputing domain: biomolecular systems; fusion energy; the virtual physiological human; numerical weather prediction; and engineering.

Over the duration of the project, the aim is to deliver key, exploitable technologies that will allow the co-design applications to successfully execute on multi-petaflop systems in preparation for the first exascale systems arriving at the end of this decade. It is still early days for the project, and we will keep you posted on our progress!

[1] J. Dongarra, P. Beckman et al., International Journal of High Performance Computing Applications, Volume 25, Number 1, 2011, ISSN 1094-3420.

http://cresta-project.eu

Neutrinos are particles produced in highly energetic processes, occurring for example in the Sun, nuclear reactors or the Big Bang. They are of great interest to physicists since there is evidence they have mass, despite the Standard Model of Particle Physics assuming otherwise. Neutrinos are also thought to hold the answer as to why there is more matter than anti-matter in the Universe.

MICE (Muon Ionization Cooling Experiment) is a UK-based international collaboration of around 100 particle and accelerator physicists based at Rutherford Appleton Laboratory. MICE is prototyping a muon cooling machine to improve the efficiency of neutrino creation. The software part of MICE is MAUS, the MICE Analysis User Software. MAUS is a modular data-analysis package written in Python and C++ that provides a framework for the analysis of data from MICE detectors. It supports both online analysis of live data and detailed offline data analysis as well as simulation and accelerator design. Online analysis of live data, or online reconstruction, is particularly important since it allows the MICE hardware to be tuned during operation.

MAUS has a large team of developers of varying degrees of availability. Chris Tunnell, the MICE Offline Detector Software Coordinator, approached the Software Sustainability Institute (SSI) to discuss managing the development of MAUS more effectively as well as requesting assistance in completing pressing development tasks relating to online reconstruction.

EPCC’s Rob Baxter, SSI’s software development manager, proposed various ways in which the MAUS team could be managed more effectively. Based on a review of the MAUS wiki and issue tracker – a key medium in managing distributed software teams – Rob’s recommendations centred on being able to readily identify and track the overall picture of MAUS development. Contributing to this were:

• Creating a Gantt chart to understand dependencies between tasks and the impact of any slippage.

• Maintaining a risks and issues log and regularly reviewing this.

• Sending short highlight reports “up the line”, summarising progress, changes in risks and issues, and plans for the next period.

From a developer’s perspective, I undertook an evaluation of MAUS’s software and online resources. MAUS scored very highly in terms of sustainability. They are taking steps to resolve the issues I identified (primarily to do with the structure of their wiki).

I was then embedded within the MAUS development team for 6 months and developed a number of online reconstruction components. This contributed to the sustainability of MAUS by allowing the functionality to be available in the MICE control room as the hardware evolves. These components included:

• Extensions to allow for data analysis jobs to be run in parallel using the Celery open source distributed task queue and RabbitMQ task broker.

• Extensions to allow MAUS to cache data in a MongoDB document-oriented database, rather than in temporary data files.

• Convertors to create histograms or tables from data collected from simulated or live runs.

• A web front-end, written in Django, to present these histograms and tables via a browser.

The components were first run in the MICE control room in December 2011. They are now in regular use and a researcher has been identified to take over and maintain them. In addition, MAUS developers have successfully created their own histogram convertors based upon the ones I developed. The web front-end will be updated in response to experiences in running MICE and understanding exactly what data control room staff need and how this should be presented.

Links
MICE: http://mice.iit.edu

MAUS: http://micewww.pp.rl.ac.uk/projects/maus

Software Sustainability Institute/MAUS collaboration overview: http://micewww.pp.rl.ac.uk/projects/maus/wiki/MAUSSSI

MICE: cooling muons in particle physics
Mike Jackson

“The collaboration was a very useful experience... I would work with SSI again and would recommend them to others.”

Chris Rogers, MICE Physics Software Manager at Rutherford Appleton Laboratory

The first run of MAUS in the control room.

PRACE
PRACE, the Partnership for Advanced Computing in Europe, has selected six of its member sites as the first PRACE Advanced Training Centres (PATCs). EPCC hosts the PATC for the UK, the other five centres being in Finland, France, Germany, Italy and Spain.

The mission of the PATCs is to carry out training and education activities that will enable the European research community to utilise the computational infrastructure available through PRACE. The long-term vision is that such centres will become the hubs of European high-performance computing education.

Funded by the PRACE project, EPCC will be offering around ten courses per year both in Edinburgh and elsewhere in the UK – see opposite for a list of past and future courses.

EPCC course schedule for 2012

• Message-Passing Programming with MPI (EPCC, 24–26 April)

• Shared-Memory Programming using OpenMP (EPCC, 21–22 May)

• Performance Optimisation for the AMD Interlagos Processor (EPCC, 11–12 July)

• GPU Programming with CUDA and OpenACC (EPCC, 28–30 August)

• Cray XE6 Performance Workshop (London, September)

• Software Carpentry (EPCC, 4–5 December)

As I hope you can see from the picture above, the initial courses have proved very popular and been much appreciated by the attendees!

HPC training at EPCC
David Henty

EPCC has a long history of providing training and education in a wide range of HPC topics such as parallel programming, HPC software development and performance optimisation. Some recent initiatives have enabled us to offer our courses more widely than before, so it seems timely to give a brief overview of all our training activities.

EPSRC HPC Short Courses
EPCC is a member of the HPC-SC Consortium, a group of thirteen universities and research institutes formed to ensure that appropriate, high-quality training in advanced aspects of HPC is available to researchers in the UK.

As well as running a programme of HPC courses around the UK, we can also offer some funding to support the attendance of selected EPSRC PhD students.

MSc in HPC
EPCC continues to run its one-year postgraduate Masters course, the MSc in High Performance Computing. Highlights of this year’s programme include all students having accounts on the UK national supercomputer HECToR, the new HPC Cluster Challenge where students build a parallel cluster from spare parts, and ten of the class being selected as student volunteers for ISC 2012 in Hamburg in June.

PRACE training activities www.training.prace-ri.eu

EPCC course list www.epcc.ed.ac.uk/training-education/course-programme

EPSRC HPC Short Courses www.hpc-sc.ac.uk

MSc in HPC www.epcc.ed.ac.uk/msc

Attendees at the recent PRACE OpenMP course enjoy the Edinburgh sunshine.

Students tackle the MSc in HPC’s new Cluster Challenge.

On Monday 14th and Tuesday 15th May, Newcastle University played host to a Software Carpentry boot camp run by the Digital Institute at Newcastle University, SoundSoftware and The Software Sustainability Institute. This was the first boot camp to be delivered entirely by UK tutors, independent of Greg Wilson’s Software Carpentry team in Canada.

Software Carpentry aims to teach scientists how to quickly build the high-quality software they need, and so maximise the impact of their research. The format is a workshop, or boot camp, followed by 4-8 weeks of self-paced online instruction. Boot camps do not solely teach specific products; rather, they focus on practices to help scientists rapidly develop high-quality software. Instead of just stating that “these are good because software developers say they are”, each practice is justified with reference to individual and group psychology, and empirical studies into how software development works.

Two weeks earlier, I and my Software Sustainability Institute colleagues Neil Chue Hong and Steve Crouch had joined the Digital Institute’s Steve McGough and SoundSoftware’s Chris Cannam at a boot camp run by Greg at University College London. Newcastle saw us reunited for the daunting task of running a boot camp ourselves. To our relief, the attendees’ comments were very positive and there were a number of valuable suggestions for future improvements: courses, like software, benefit from iterative development and user feedback.

The Software Sustainability Institute will help to set up further boot camps for later in the year at locations across the UK.

Links
The Software Sustainability Institute: http://software.ac.uk/

The Software Sustainability Institute blog: http://software.ac.uk/blog/

Software Carpentry: http://software-carpentry.org/

Newcastle boot camp: http://software-carpentry.org/boot-camps/newcastle-university-may-2012/

Newcastle attendees’ comments: http://www.software.ac.uk/blog/2012-05-02-uks-first-software-carpentry-boot-camp

Digital Institute: http://digitalinstitute.ncl.ac.uk/

SoundSoftware: http://soundsoftware.ac.uk/

From recruits to instructors – the first UK-run Software Carpentry boot camp
Mike Jackson

EPCC has established a bursary in memory of our colleague John Fisher who died in 2007.

John joined EPCC as head of user support in 1994, having already enjoyed a successful career in computing spanning almost a quarter of a century. His first role at EPCC was supporting the UK’s first parallel supercomputing service on a Cray T3D and he subsequently took on a similar role in the HPCx service. John was also a major contributor to EPCC’s successful bid for HECToR.

In all of these supercomputing services, John’s primary role was communicating with the many hundreds of users throughout the UK. His comforting presence and sympathetic ear made him the friendly face and voice of supercomputing support in the UK for more than a dozen years.

The John Fisher Bursary will be funded by the School of Physics & Astronomy for a student undertaking the Masters course in High Performance Computing (MSc in HPC) taught by EPCC. The bursary will cover tuition fees and additional programme costs up to a value of £6,750. UK and EU students who have been accepted for admission to the MSc in HPC are eligible to apply.

The bursary will be in place for the 2012-2013 session and will be awarded competitively. It is EPCC’s intention that the bursary will be awarded on an annual basis.

John’s contribution to EPCC was considerable; he had a great influence on his many colleagues and was much respected, admired and liked. He is greatly missed.

The John Fisher Bursary
Maureen Simpson

Mike Jackson takes the floor at the boot camp.

NESS (Next Generation Sound Synthesis through Simulation) is using HPC to advance the state of the art in sound synthesis. It is a collaborative effort between EPCC and the Acoustics Group, which is jointly run through the Reid School of Music and the School of Physics at the University of Edinburgh.

Various approaches to digital sound synthesis have emerged since the inception of the field in the 1950s. Among these are abstract signal-based methods such as additive synthesis, employing sums of sinusoidal signals; subtractive synthesis, where a single spectrally rich waveform is shaped by various filters; and frequency modulation (a closely related technique, phase modulation, was used in the immensely popular Yamaha DX7) where a base frequency is distorted by a modulator frequency in order to produce a wide variety of synthetic tones. A different approach is to use fragments of recorded audio, or sampled tones, as is the case with many modern digital pianos. The most general design criterion underlying such methods has been, to date, computational efficiency – crucial in the case of audio processing, where sample rates are necessarily high (at least 40kHz).
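
As a rough illustration of two of these abstract methods, the sketch below generates one second of samples by additive synthesis (a weighted sum of sinusoids) and by a simple frequency/phase modulation of a carrier by a modulator. The frequencies, amplitudes and modulation index are invented for the example; none of this is code from NESS.

```c
/* Toy sketch of two abstract synthesis methods: additive synthesis and
 * simple frequency/phase modulation. All parameters are illustrative. */
#include <math.h>
#include <stdio.h>

#define SR 44100          /* sample rate in Hz (above the ~40 kHz minimum) */
#define TWO_PI 6.283185307179586

int main(void)
{
    static double additive[SR], fm[SR];

    for (int n = 0; n < SR; n++) {       /* one second of audio */
        double t = (double)n / SR;

        /* Additive: a 220 Hz fundamental plus two weaker harmonics. */
        additive[n] = 0.6 * sin(TWO_PI * 220.0 * t)
                    + 0.3 * sin(TWO_PI * 440.0 * t)
                    + 0.1 * sin(TWO_PI * 660.0 * t);

        /* FM-style: a 220 Hz carrier whose phase is distorted by a 110 Hz
         * modulator, giving a spectrally rich synthetic tone. */
        fm[n] = sin(TWO_PI * 220.0 * t + 3.0 * sin(TWO_PI * 110.0 * t));
    }

    /* Dump both signals as text for plotting or later conversion to audio;
     * a real synthesiser would stream the samples to a sound device. */
    FILE *out = fopen("tones.txt", "w");
    if (!out) return 1;
    for (int n = 0; n < SR; n++)
        fprintf(out, "%f %f\n", additive[n], fm[n]);
    fclose(out);
    return 0;
}
```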

In the more general context of building new virtual musical instruments, however, such designs can be problematic. In the case of abstract methods, sound quality can be somewhat artificial (think of 1980s synth bands such as Depeche Mode, New Order and a-ha). When using sampling techniques – although individual recorded sounds do indeed possess a natural quality – it can be difficult to escape the character of the sampled material. This can lead to repetitive patterns, to which the ear is acutely sensitive and unforgiving!

Newer approaches, geared towards producing synthetic sound with a natural acoustic character, are based around simulations of the physics behind real-world instruments: physical modelling synthesis. Though several techniques have been proposed, in NESS the focus is on direct time-stepping methods such as “finite difference time domain” (FDTD). In FDTD a mathematical model of a musical instrument is discretized over a grid and the effect of an input (eg the striking of a drum-stick) is realised by progressing the simulation through a series of time-steps. The input parameters to the simulation include instrument design specifications such as the thickness and tension of the material as well as control data such as the timing and force of strikes or bowing. In this project, modelling will be complete, in the sense that the 3D acoustic field will also be simulated. The ultimate goal is to create new instruments that are not feasible in real life (imagine a trumpet with 20 valves).
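
A minimal one-dimensional example gives the flavour of the FDTD approach just described: an ideal string is discretised on a grid, given an initial “strike”, and then marched forward one time-step per output sample, with the sound read off at a fixed “pickup” point. The grid size, Courant number and strike shape are invented for illustration, and the toy is far simpler than the 3D instrument and acoustic-field models being built in NESS.

```c
/* Minimal FDTD sketch: a 1D ideal string, struck near one end and advanced
 * with an explicit time-stepping scheme. Parameters are illustrative only. */
#include <math.h>
#include <stdio.h>

#define SR 44100            /* audio sample rate (time-step = 1/SR) */
#define NX 200              /* number of grid points along the string */

int main(void)
{
    static double u_prev[NX], u_now[NX], u_next[NX];
    const double c = 0.95;  /* Courant number; must be <= 1 for stability */

    /* Input: a smooth "strike" displacement near one end of the string. */
    for (int i = 1; i < NX - 1; i++) {
        double d = (i - 30) / 8.0;
        u_now[i] = u_prev[i] = exp(-d * d);
    }

    /* March the explicit update through one second of output, reading the
     * synthesised sound at a fixed "pickup" point on the grid. */
    for (int n = 0; n < SR; n++) {
        for (int i = 1; i < NX - 1; i++)
            u_next[i] = 2.0 * u_now[i] - u_prev[i]
                      + c * c * (u_now[i + 1] - 2.0 * u_now[i] + u_now[i - 1]);

        /* Fixed (clamped) string ends, then rotate the time levels. */
        u_next[0] = u_next[NX - 1] = 0.0;
        for (int i = 0; i < NX; i++) {
            u_prev[i] = u_now[i];
            u_now[i]  = u_next[i];
        }
        printf("%f\n", u_now[120]);   /* one output sample per time-step */
    }
    return 0;
}
```

Each pass through the outer loop produces one audio sample, which is why the cost scales with both the grid size and the sample rate; the inner grid update is the part that parallelises well on GPUs.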

The image above shows a simulation of a timpani drum, created as part of NESS. The effects of a drum strike are modelled over time, clearly highlighting the reverberation of the soundwave inside the drum chamber (leading to audible “cavity modes”), as well as the full spatialized radiation pattern of the virtual instrument.

Although such methods can result in a realistic sound, the drawback is computational cost. Even small-scale simulations can easily require 10-100 seconds of simulation time for each second of sound produced on typical desktop hardware. However, FDTD methods are inherently parallelisable due to the high level of data independence at each time-step. By adapting the simulations to run on GPUs the computation time can be dramatically reduced. Initial work on the project has involved porting Matlab simulation code to the CUDA language used by NVidia GPUs. This has resulted in reductions to the simulation time of up to 40x on some codes.

In later stages, NESS will collaborate with composers of electronic music to test and prototype the work, leading to performances of original and entirely synthetic music.

NESS is a 60-month, ERC funded project.

www.epcc.ed.ac.uk/projects/ness

Next generation sound synthesis through simulation
Adrian Mouat & Stefan Bilbao

No, not the Snooker World Championship at the Crucible Theatre. Rather, the Heriot-Watt Crucible III: an interdisciplinary collaboration programme for early career academics from Heriot-Watt University, the University of Edinburgh, the Moredun Research Institute and SELEX Galileo.

This, the third running of the programme, was held over six days in February and March 2012 and attended by Iain Bethune and Adam Carter of EPCC. It featured workshops hosted by the Royal Society of Edinburgh, the Scottish Parliament, the British Council, Heriot-Watt’s School of Textiles and Design, the Royal Botanic Gardens Edinburgh and Our Dynamic Earth. The 30 participants from diverse fields engaged with numerous experts from academia, industry, media and government. The aim was to build research collaborations that maximised their interdisciplinarity, innovation and impact.

As a final challenge, participants designed novel proposals for interdisciplinary projects to pitch to a panel of high-level research experts. Chaired by Quentin Cooper, host of BBC Radio 4’s Material World, the panel comprised Heriot-Watt University Principal, Steve Chapman; Deputy Principal for Research & KT, Alan Miller; Professor of Autonomous Systems Engineering, David Lane; and Professor of Environmental Science, Teresa Fernandes. They were joined by Ian Underwood, head of the Institute for Integrated Micro and Nano Systems, University of Edinburgh.

The winning proposal was the ‘Chip Shop’: a project to develop an in-silico design toolkit for micro-fluidic chips. It exploited the power of HPC with computational fluid dynamics to optimise designs for a range of interesting biological applications. The project team (pictured) comprised Iain Bethune (EPCC), Prashant Valluri and Helen Bridle (University of Edinburgh School of Engineering), and Tom Apsray, Birgit Gaiser and Maiwenn Kersaudy-Kerhoas (Heriot-Watt).

In addition to bringing together researchers, the Heriot-Watt Crucible also provides seed funding to develop collaborative projects with the aim of bidding for funding from an external source.

As well as taking part in the winning project, EPCC is also involved in two other proposals: using HECToR to predict the ecological impact of potential oil leakages off the coast of Belize, and optimising the design of key components in the chemical processing industry. We are currently awaiting the results and look forward to promising new research collaborations.

Contact Research Futures to find out more: [email protected]; www.hw.ac.uk/researchfutures

Further news from the HW Crucible:

www.news.hw.ac.uk/news/7105-Cross_institute_research_collaborations_forged_at_Heriot_Watt_Crucible_III

EPCC success at the Crucible
Iain Bethune

EPCC visitor programmes: a long-lasting legacy
Frédéric Magoulès, Ecole Centrale Paris, France (TRACS visitor: 1999 and 2002; HPC-Europa visitor: 2004)

My participation in the TRACS and HPC-Europa programmes was a great scientific experience for me. I visited EPCC to carry out work on computational mechanics, hosted by Mechanical and Chemical Engineering at Heriot-Watt University.

With a background in applied mathematics and computer science, I was working at the algorithmic interface between parallel computing and the numerical analysis of partial differential equations. My aims were to design, analyze, develop, and validate mathematical models and computational methods for the high-performance simulation of multidisciplinary scientific and engineering problems. I had the opportunity to work on a real parallel computer with a large number of nodes – and without a long batch queue. Add to that a useful version of MPI, and the interactive computer facilities at the host department, and you have all the ingredients for successful programming and debugging.

The code was used on the SGI and the Cray T3E, then the largest computer in Europe. Using novel domain decomposition methods with innovative preconditioning techniques, the code was distributed on more than 128 processors (an impressive number ten years ago) with excellent efficiency and scalability on real engineering problems arising in industry. Helpful and efficient staff at both EPCC and the host department helped me take full advantage of the computing facilities.

Managing the large volumes of data produced by my code led me to grid computing. As grid technologies mature, their relevance to business increases dramatically, and together with EPCC I have been involved in the EGEE and BEinGRID projects, developing novel parallel algorithms and innovative frameworks for grid computing, followed by validation in a series of business experiments to demonstrate their benefits in geographically distributed systems.

Frédéric has written two books on grid computing and edited another: Fundamentals of Grid Computing: Theory, Algorithms and Technologies (edited by F Magoulès); Introduction to Grid Computing (F Magoulès, J Pan, KA Tan, A Kumar); Grid Resource Management: Toward Virtual and Services Compliant Grid Computing (F Magoulès, TMH Nguyen, L Yu).

HPC-Europa transnational access visitor programme: www.hpc-europa.eu

Ruyman Reyes Castro
Applications Developer
I am from Tenerife in the Canary Islands. Prior to joining EPCC, I worked on the European-funded TEXT project for the Universitat Jaume I (Castellon de la Plana, Spain), implementing part of the ScaLAPACK library in SMPSs/OMPSs. I am also finishing a PhD dissertation on directive-based accelerators at the University of La Laguna. My work at EPCC is focused on improving the performance of CP2K, an open source program for performing atomistic and molecular simulations of solid state, liquid, molecular and biological systems. www.cp2k.org

Mark Filipiak
Applications Developer
For the previous 20 years I worked at the University in Meteorology and then GeoSciences, mostly in satellite remote sensing. Initially I was involved in the development and data analysis for a microwave stratospheric sounder; more recently I analysed data from an infra-red imager that measures sea surface temperature. At EPCC I am working with the Fluidity CFD/ocean modelling code, developing a mesh decomposition that will improve its performance on NUMA architectures. www.epcc.ed.ac.uk/research-collaborations/hector-dcse

George Graham
Business Development Manager
Prior to joining EPCC I worked with a number of companies in a business development and sales capacity. My motivation for working for EPCC is driven by a continued interest in high performance computer architectures, stretching way back to my days as a computer science undergraduate. I focus predominantly on selling cycles on HECToR. My approach to date has been fairly broad, targeting potential users of HPC across a range of industry sectors from aerospace and automotive through to life sciences and biotechnology. www.hector.ac.uk

Iakovos Panourgias
Applications Developer
After completing a degree in Information Technology, I worked as a software engineer and systems architect in the telecommunications sector. I moved to Edinburgh in 2010 to join the MSc in High Performance Computing. After graduating, I worked as a software engineer for the financial sector, before joining EPCC. I am currently working on the APOS project, which will investigate technologies and develop tools that will help migrate existing applications to high-performance computing architectures. www.apos-project.eu

Luis Cebamanos
Applications Developer
After completing a degree in Telecommunication Engineering in Spain, I went to Manchester for a year as a visiting researcher, where I got involved in the SKA (Square Kilometre Array) project. Back in Spain, I worked for Telefonica and finally at Barcelona Supercomputing Centre. I moved to Edinburgh in 2009 to work at Heriot-Watt University on computational chemistry projects. At EPCC I concentrate on SPRINT (Simple Parallel R INTerface), working on the parallelisation of the Support Vector Machine algorithm available for R. I have a strong interest in any HPC technology or compiler. www.epcc.ed.ac.uk/software-products/sprint

Ronnie Galloway
Business Development Manager
I hold an MA honours degree in Scottish History from Edinburgh and an MBA from Cass Business School. I am experienced in developing higher education business streams and have previously, inter alia, held business-facing roles at Brunel University, London Business School, Cass Business School, the ICAEW and with Edinburgh Research & Innovation at the University of Edinburgh. At EPCC I am mainly involved with Supercomputing Scotland. www.supercomputingscotland.org

Daniel Holmes
Applications Developer
I’m currently working on the BioColloids project, which aims to add functionality to HemeLB (a bloodflow simulation program) so it can simulate the movement of biologically significant particles in the bloodstream. I contribute to Supercomputing Scotland as a technical expert with a focus on parallel programming for Windows machines. I am concluding my PhD, attempting to achieve technology transfer between academia and industry by creating a high-performance communication library in C#, a popular object-oriented language. www.2020science.net/software/hemelb www.supercomputingscotland.org

Katie Urquhart
CRESTA Administration and Dissemination Officer
I have an arts rather than science background. I graduated from the University of Edinburgh with a degree in English Literature and since then I’ve worked for a couple of Higher Education Institutes (Central St Martins and Edinburgh) and Higher Education agencies (Higher Education Academy, SFC, QAA). http://cresta-project.eu

Meet the new staff at EPCC

www.epcc.ed.ac.uk

