
bcs.org/itnow
AUTUMN 2013
THE MAGAZINE FOR THE IT PROFESSIONAL


BIG DATA
06 WHERE ARE WE WITH BIG DATA?
08 BIG DATA, IP AND PRIVACY
10 SECURING BIG DATA
12 WHAT IS BIG DATA?
14 BIG DATA VISION
16 BIG DATA AND PATIENT CARE
20 IDENTITY AND BIG DATA
22 YOUR RESOURCES: BIG DATA

HEALTH
40 TAKING CARE

SECURITY
26 CLOUD SURFING
28 AVOIDING CYBERWASH
30 ROLE WITH IT
32 DATA, DATA EVERYWHERE
35 TAKE CONTROL
36 SECURITY UPDATE
38 DATA PROTECTION

LEARNING AND DEVELOPMENT
44 EDUCATION VS SKILLS

...THE REST
42 3D TECHNOLOGY
52 BCS AND BROADCASTING
54 SCRUM & KANBAN
56 CIO VS CDO
58 COMPUTER ARTS
64 YOUR RESOURCES

EDITORIAL TEAM
Henry Tucker Editor-in-Chief
Justin Richards Multimedia Editor
Grant Powell Social Media Editor
Brian Runciman Publisher

PRODUCTION
Florence Leroy Production Manager

Advertising
Barry Davidson
E [email protected]
+44 (0) 203 177 1167

Keep in touch
Contributions are welcome for consideration. Please email: [email protected]

ITNOW is the membership magazine of BCS, The Chartered Institute for IT. It is sent to a wide variety of IT professionals, from systems developers to directors, consultants to training and education specialists. A subscription to ITNOW comprises four issues. All prices include postage. For subscribers outside the UK, delivery is by Standard Air.

Annual subscription rates Institutional: print edition and site-wide online access: £158/US$299/€236; print edition only: £148/US$281/€254; site-wide online access only: £148/US$281/€223. Personal: print edition and individual online access: £148/US$297/€223.

ITNOW, ISSN 1746-5702, is published quarterly (March, June, September, December) by BCS, The Chartered Institute for IT, North Star House, Swindon, UK. The US annual subscription price is $299. Airfreight and mailing in the USA by agent named Air Business Ltd, c/o Worldnet Shipping Inc., 156-15, 146th Avenue, 2nd Floor, Jamaica, NY 11434, USA. Periodicals postage paid at Jamaica NY 11431. US Postmaster: Send address changes to ITNOW, Air Business Ltd, c/o Worldnet Shipping Inc., 156-15, 146th Avenue, 2nd Floor, Jamaica, NY 11434, USA. Subscription records are maintained at BCS, The Chartered Institute for IT, North Star House, North Star Avenue, Swindon, SN2 1FA UK.

For payment details and terms and conditions, please see: www.oxfordjournals.org/our_journals/combul/access_purchases/price_list.htm
The current year and two previous years' issues are available from Oxford University Press. Previous volumes can be obtained from the Periodicals Service Company, 11 Main Street, Germantown, NY 12526, USA.
E [email protected]
T +1 518 537 4700, F +1 518 537 5899

For further information, please contact: Journals Customer Service Department, Oxford University Press, Great Clarendon Street, Oxford OX2 6DP, UK.
E [email protected]
T (and answerphone) +44 (0)1865 353 907
F +44 (0)1865 353 485

The opinions expressed herein are not necessarily those of BCS or the organisations employing the authors. © 2013 The British Computer Society. Registered Charity No 292786.

Copying: Permission to copy for educational purposes only without fee all or part of this material is granted provided that the copies are not made or distributed for direct commercial advantage; BCS copyright notice and the title of the publication and its date appear; and notice is given that copying is by permission of BCS.

To copy otherwise, or to republish, requires specific permission from the publications manager at the address below and may require a fee.

Printed by the Wyndeham Group, UK. ISSN 1746-5702. Volume 55, Part 3.

BCS The Chartered Institute for IT
First Floor, Block D, North Star House,
North Star Avenue, Swindon, SN2 1FA, UK.
T +44 (0)1793 417 424
F +44 (0)1793 417 444
www.bcs.org/contact
Incorporated by Royal Charter 1984.

Roger Marshall BCS President
David Clarke Chief Executive

Feedback
email: [email protected]

Image: iStockphoto/173390168

BIG DATA

Aim high

Make a meaningful impact

Strategy | Insight | Innovation | Transformation | Training
Agile | Lean | ITIL® | ISO/IEC 20000 | IT Governance

We are ConnectSphere. We are service management specialists.

Our mission is to unite, empower, and build the capability of our clients to deliver meaning and value to their clients.

We develop cultures of change and innovation to help our clients to succeed.

email: [email protected]

ITIL® is a registered trade mark of the Cabinet Office

www.connectsphere.com

Call: UK +44 (0)845 838 2345 US +1 (919) 313 4558


MEMBER NEWS

David Evans, Director of Membership at the Institute, explained further: ‘Our experience of working with employers, small and large, is that they want to hire and retain individuals who are committed to exceeding professional standards and that they are willing to pay a premium for such individuals.’

This is borne out by results from a recent survey conducted by the Institute that revealed that a Chartered IT Professional with a Certificate of Current Competency earns on average approximately £92,000 pa.

‘They can command such salaries because of their competence, their commitment to keeping their skills current and up-to-date and to professional ethics and values, all of which are valued by employers,’ David added.

Competence
CITP provides a recognisable sign of professional integrity and dedication. IT professionals achieve CITP status through a combination of peer assessment and formal testing and are awarded a certificate of

‘What are we doing to raise the profile of BCS membership with employers?’ is a question the Membership team is often asked. There are a number of ways that the Institute actively promotes the importance of membership and professionalism to employers, but there are things we can all do that will help with this objective.

We have launched a micro-site for recruiters at www.bcs.org/recruiters.

This, and its accompanying guide, is designed to inform human resources professionals, recruitment agencies and hiring managers of the importance of looking for IT professionals with BCS membership. It highlights aspects of BCS membership that cannot always be ascertained from a CV alone.

These include a commitment to the values in the BCS Code of Conduct and what that means in day-to-day working life, a commitment to keeping up-to-date in their field of expertise and to professional development on a continuing basis, and the resources supporting BCS members to help them in their careers.

For these reasons we recommend that hirers:
• Add 'Member of BCS' to their list of requirements when looking to recruit IT professionals;
• Look for FBCS, MBCS or AMBCS letters after candidates' names;
• Look for CITP letters after candidates' names to show a defined level of competence;
• Ask for evidence of CPD.

If you hire IT professionals, do you ask for ‘Member of BCS’ in your list of requirements or ask for evidence of CPD?

We can all play our part in raising awareness of the importance of BCS membership to employers, and centrally we are working hard through initiatives such as this to ensure the word is widely communicated and understood.

www.bcs.org/recruiters

Membership and recruiters


doi:10.1093/itnow/bwt033 ©2013 The British Computer Society

May saw BCS Council elect a new chair and vice-chair – one a past President and the other a new council member who was first elected to the body by the professional membership last year. Dr Roger Johnson, FBCS CITP (BCS President 1992/3) returned to the Chair's seat at May's Council meeting, and Kevin Streater, FBCS CITP took up his seat as Vice-Chair.

BCS Council is the only body in the governance structure of the Institute where all professional members are represented and the only body where any member in good standing can gain a seat through election rather than through merit or industry seniority.

It therefore provides an essential link with the entire membership and the constituent parts of branches, specialist groups, international and the young professionals group (YPG) – all of whom have a fair representation on Council.

‘Through the coming year, it is essential to strengthen the legitimacy of Council by demonstrating it does speak for the membership. The priority is to ensure that there is continuous and transparent communication between all layers in the governance structure,’ said Roger.

‘It is vital that Council acts as the facilitator that provides greater assurance that the BCS strategic priorities command membership support,’ he added.

Kevin added: ‘The challenge Council faces is how to maintain a dialogue with the wider membership, without which it has no reason to exist. It is vital that Council is effective at representing the views of the wide, diverse membership as part of the institutional governance processes and that the wider membership is aware of the issues being discussed at Council on their behalf.’

To raise Council’s profile, and the profile of individual elected members of Council, a number of activities are being planned as a starting point for a wider dialogue with the membership. These include having a Council article in all future editions of ITNOW, dissemination of Council meeting summaries through the secure area (My Council) of the BCS website, regular updates on Council business through social media and the consideration of webcasting key Council discussions.

To support communications to Council it is proposed that a number of activities are arranged, each of which will provide members with opportunities to engage directly with their representative Council members. These include: Council leadership presence at all Member Group conventions and similar events, regional roadshows and communication from Council leadership encouraging member groups to invite their Council representatives to events and committee meetings.

Through the coming year Council will be considering a number of important matters such as reviewing the BCS Strategic Plan and budget as well as electoral matters such as electing the next year’s Deputy President, elections to Trustee Board and of

Employers value individuals who are committed to professional standards and development according to BCS, The Chartered Institute for IT.

current competence that remains valid for five years, after which revalidation is required.

They also commit to a code of conduct that outlines professional behaviours, and are accountable for meeting it.

Bad hires
With UK companies reporting the cost of a bad hire to be in the region of £50,000, more organisations are looking for ways of trying to ensure that they get it right first time and hire people who fit their values and can deliver for their customers.

Richard Atkinson MBCS CITP, CIO Just Giving, said of CITP: ‘It’s very important for the progression of the IT industry that we embrace standards. If we don’t, we will fail to earn the respect of our customers.

‘CITP holds value for employers in developing employees to provide a greater service at every level, and in helping to retain those employees through recognition and fostering a sense of accomplishment.’

www.bcs.org

MAINTAIN DIALOGUE

Professionals Wanted

Image: iStockphoto/162947253

vice-presidents as vacancies arise.

‘This is just a flavour of the higher profile we would like to give Council amongst the membership. Please join in the Member-Council discussion group on the Member Network and on LinkedIn - and please consider standing for Council yourself, especially if you are in our significantly under-represented groups of under-40s or female members,’ they both added.

Any member can put themselves forward for nomination which opens shortly. If you would like more information about Council and its work please contact the Vice-Chair Kevin Streater ([email protected]) or Roger Johnson ([email protected]).

Nomination forms for 2014 elections to Council are to be received for the following by noon on Monday 4 November 2013:

• Professional Membership
• Regional Constituency
• International Constituency
• Specialist Group Constituency

For nomination forms and more information go to: www.bcs.org/elections.

Watch Kevin talking about his role at www.bcs.org/videos/kevinstreater.

Image: Stockbyte/stk128227rke


doi:10.1093/itnow/bwt034 ©2013 The British Computer Society

Image: iStockphoto/148232706

WHERE ARE WE WITH BIG DATA?

compound annual growth rate of 31.7 percent – about seven times the rate of the overall information and communications technology market.‘

The same article reports further investment in the perceived future of big data with the recent announcement by Dell, Intel Corporation and Revolution Analytics of the Big Data Innovation Centre in Singapore.

The new centre brings together expertise from all three organisations to provide training programmes, proof-of-concept capabilities and solution development support on big data and predictive analytic innovations catering to the Asian market.

How and whenThe ‘when’ of embracing any new technology is massively variable depending on your organisation’s aims, business sector and so on.

Some of the things that could affect your timing are neatly summed up by Redmond magazine in a recent article, simply by listing some of the possible motivators. They mention that you could utilise ‘CRM systems and data feeds to tweets mentioning their organisations that can alert them to a sudden problem with a product.’ If this kind of real-time feedback is of benefit then dipping a toe into the deluge of the big data waters is best done sooner rather than later.

Another area mentioned is ‘potential market opportunities spawned by an event’. Not as business-critical as product feedback, but important in a time of global austerity.

Redmond also mentions things such as online and big-box retailers using big data to automate their supply chains on the fly and law enforcement analysing huge amounts of data to thwart potential crime and terror attacks. The scope and motivations vary widely, but potential benefits are both long- and short-term.

As to how to go about it, some of the tools are mentioned above, often oriented around Hadoop. Microsoft recently launched Windows Azure HDInsight and Redmond also cited VMware’s key application infrastructure and big data and analytics portfolio called Pivotal.

www.bcs.org

Further reading
Microsoft’s special report on using clusters for analytics:
http://research.microsoft.com/apps/pubs/default.aspx?id=179615
Viktor Mayer-Schönberger and Kenneth Cukier, ‘Big Data’ review:
http://www.bostonglobe.com/arts/books/2013/03/05/book-review-big-data-viktor-mayer-schonberger-and-kenneth-cukier/T6YC7rNqXHgWowaE1oD8vO/story.html
IBM on big data:
www-01.ibm.com/software/data/bigdata
Wired on Cloudera:
www.wired.com/wiredenterprise/2013/06/cloudera-search
The hardware perspective:
www.techrepublic.com/blog/big-data-analytics/are-we-headed-for-a-platform-change-for-bigdata/445?tag=content;blog-list-river
Big data sources:
www.zdnet.com/top-10-categories-for-big-data-sources-and-mining-technologies-7000000926/
Hadoop:
http://en.wikipedia.org/wiki/Hadoop
Things you should know about implementing big data:
http://redmondmag.com/articles/2013/05/01/buried-in-big-data.aspx

There is absolutely no question that there is an awful lot more data around now than there was. BCS’s data migration blogger recently commented on the fact that some telcos are ‘de-tuning or switching off entirely some of their monitoring platforms due to the sheer volume of data and their inability to store it, never mind process it meaningfully.’

IBM says that every day, we create 2.5 quintillion bytes of data – so much that 90 per cent of the data in the world today has been created in the last two years alone.

Clearly there is enough data around that if we can get any meaningful analyses from it then the ‘marketing puff’ concern can be allayed. Where is all this data coming from?

Sources
Social media platforms produce huge quantities of data, both from individual network profiles and the content that influencers and the less influential alike produce. Short form blogging, link-sharing, expert blog comments, user forums, ‘likes’ and more all contain potentially useful information.

There is also data produced through sheer activity: machine-generated content in the form of device log files, which could be characterised as the internet of things. This would include output from such things as geo-tagging.

Yet more data can be mined from software as a service and cloud applications, data that’s already in the cloud but mostly divorced from internal enterprise data.

Another large, but at this stage largely untapped, area is the data languishing in legacy systems, which includes things like medical records and customer correspondence.

Caveats
A recent post from BCS’s future blogger called into question some of the behind-the-scenes story: ‘For the big data commercial advocates, there must be algorithms that can trawl the data and create outcomes better, that is to say more cost effective, than traditional advertising. Where is the evidence that such algorithms exist? How will these algorithms be created and evaluated and improved upon if they do exist? One problem is that in a huge dataset, there may be many spurious correlations and the difference between causation and correlation hard to prove.’
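The spurious-correlation point is easy to demonstrate. As a minimal sketch (Python with NumPy, synthetic data only): generate thousands of random ‘features’ with no relationship to a target, and some will still correlate strongly with it by pure chance.

```python
import numpy as np

# Toy demonstration: in a wide dataset, some variables correlate
# with any target purely by chance (all data here is random noise).
rng = np.random.default_rng(42)

n_rows, n_cols = 100, 5000                     # 100 observations, 5,000 'features'
target = rng.normal(size=n_rows)               # quantity we pretend to explain
features = rng.normal(size=(n_rows, n_cols))   # pure noise, unrelated to target

# Pearson correlation of each feature with the target.
corrs = np.array([np.corrcoef(features[:, i], target)[0, 1]
                  for i in range(n_cols)])

print(f"strongest chance correlation: {np.abs(corrs).max():.2f}")
print(f"features with |r| > 0.3: {(np.abs(corrs) > 0.3).sum()}")
# Several features typically clear |r| > 0.3 despite zero real
# relationship - correlation alone does not establish causation.
```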

As we would perhaps expect, the likes of IBM say this goes beyond hype: ‘While there is a lot of buzz about big data in the market, it isn’t hype. Plenty of customers are seeing tangible ROI using IBM solutions to address their big data challenges.’

Big Blue goes on to quote a 20 per cent decrease in patient mortality by analysing streaming patient data in the healthcare arena; a telco that enjoyed a 92 per cent decrease in processing time by analysing networking and call data and a whopping 99 per cent improved accuracy in placing power generation resources by analysing 2.8 petabytes of untapped data for a utilities organisation.

Tools
To handle large datasets in times gone by, enterprises used relational databases and warehouses from proprietary suppliers. But these just can’t handle the volumes of data being produced. This has seen a trend towards some open source alternatives such as Hadoop, which Wikipedia defines as ‘an open-source software framework that supports data-intensive distributed applications, licensed under the Apache v2 license. It supports the running of applications on large clusters.’
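Hadoop’s programming model is MapReduce: a map step emits key-value pairs in parallel across the cluster, and a reduce step aggregates them by key. A minimal single-process sketch of the idea in Python (illustrative only; Hadoop’s real API is Java and runs across many machines):

```python
from collections import defaultdict
from itertools import chain

def map_phase(document):
    # Hadoop runs this step on each input split in parallel.
    return [(word.lower(), 1) for word in document.split()]

def reduce_phase(pairs):
    # Hadoop shuffles pairs to reducers by key; each reducer sums its key.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

documents = ["big data is big", "data about data"]
pairs = chain.from_iterable(map_phase(d) for d in documents)
print(reduce_phase(pairs))  # {'big': 2, 'data': 3, 'is': 1, 'about': 1}
```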

Wired recently reported on Cloudera - one of several companies that help build and use Hadoop applications - which is offering a Google-style search engine for Hadoop called, uninspiringly, Cloudera Search.

Interestingly, Wired pointed to a recent Microsoft paper on whether customers really need to put all their data in Hadoop. It argued, says Wired, that ‘most companies don’t (have) data problems that justify the use of big clusters of servers. Even Yahoo and Facebook, two of the companies most associated with big data, are using clusters to solve problems that could actually be done on a single server.’

Despite that, interest is on the up and big organisations are taking advantage. A recent piece from The Sun Daily mentions that ‘analyst firm International Data Corp projects the global big data technology and services market will grow at a

There have been many descriptions of big data of late – mostly metaphors or similes for ‘big’ – deluge, flood, explosion… but not only is there a lot of talk about big data, there is also a lot of data. What can we do with structured and unstructured data? Can we extract insights from it? Is ‘big data’ just a marketing puff term? Brian Runciman MBCS introduces the ITNOW big data focus.



average of 125TB of data, but will actually utilise only 12 per cent of it. This shocking statistic brings home the key attribute and challenge of big data: the sheer volume, velocity and variety of data that reside and travel across multiple channels and platforms within and between organisations.

Think of all the personal information that is stored and transmitted through ISPs, mobile network operators, supermarkets, local councils, medical and financial service organisations (e.g. hospitals, banks, insurers, and credit card agencies).

Also, not forgetting information shared and stored on social networks, by religious organisations, educational institutions and employers. Each organisation has the headache of organising, securing and exploiting their business, operational and customer data.

Incidentally, the information increasingly comprises unstructured data, such as video, audio, image and written content, which requires a lot more effort and intelligence to process.

As a result, many organisations have turned to ever more advanced analytics and business intelligence (including big data and social media) solutions to extract value from this sea of information, in order to create and deliver better and more personalised services to the right customer, at the right time.

Personal privacy
Given such powerful tools, and the large amount of replicated information spread across various sources, it is much easier to obtain a clear picture of any individual’s situation, strengths and limitations. Furthermore, the explosion in speed, types and channels of interaction, enabled by components of Gartner’s nexus of forces, may have brought about a certain degree (perhaps even an expectation and acceptance) of reduction in personal privacy.

However, people do still care about what and how their personal information is used, especially if it could become disadvantageous or harmful to them. There is a certain class of data which can easily become ‘toxic’ should a company suffer any loss of control, and it includes personal information, strategic IP information and corporate sensitive data (e.g. KPIs and results).

The situation is further complicated by differing world views on personal privacy as a constitutional or fundamental human right. The UK’s Data Protection Act is not applicable to personal information stored outside of the UK, yet we deal daily with organisations, processes and technologies that are global in scale and reach. On the other hand, some users are happy to share personal data in exchange for financial gain.

According to a recent SSRN paper, data protection and privacy entrepreneurship

likely finish off the job.

On the contrary, there are real opportunities for organisations to get their houses in order, by putting in place the right policies and principles for big data governance, in order to reap the immense benefits that big data insights can bring.

doi:10.1093/itnow/bwt035 ©2013 The British Computer Society

Jude Umeh FBCS CITP, the Institute’s DRM blogger, looks at the relationship between big data, privacy and intellectual property.

BD, P, IP


may have their place, but ‘people should not have to pay to protect their privacy or receive coupons as compensation’, especially as this might further disadvantage the poor.

Intellectual property
In addition to the above points, organisations also have to deal with the drama of IP rights and masses of unstructured data. Simply put, every last piece of the aforementioned 125TB of big data held in your average organisation will have some associated IP rights that must be taken into consideration when collecting, storing or processing all that information. According to legal experts, companies need to think through fundamental legal aspects of IP rights, e.g. ‘who owns the input data companies are using in their analysis, and who owns the output?’

An extreme scenario: Imagine how that corporate promotional video, shot on location with paid models and real people (sans model release), plus uncleared samples in the background music, which just went viral on a number of social networks, could end up costing a lot more than was ever intended. Oh, by the way, the ad was made with unlicensed video editing software, and is freely available to stream or download on the corporate website and on YouTube. Well, such an organisation will most likely get sued, and perhaps should just hang a sign showing where the lawyers can queue up. Every challenge brings an opportunity, but not always to the same person.

Now imagine all that content, and tons more like it (including employees’ ‘personal’ content), just sloshing around in every organisation, and you might begin to perceive the scale of the problem. In fact, this creates a lucrative opportunity for big data mining and analysis algorithms, specifically designed perhaps for the computer audit and forensic investigations market.

Pointing the way forward
Here are three key things that organisations should bear in mind when seeking to deal with issues and problems posed by big data, privacy and IP:

1. Information is the lifeblood of business – therefore treat it with due respect and implement the right policies for big data governance. The right information, at the right time, and for the right user, is the holy grail for business, and it demands capabilities in data science, and increasingly in data art (visualising data in meaningful, actionable ways).

2. Soon it may not even matter who owns personal data - personal information is becoming another currency with which the customer can obtain value. There is a growing push to focus big data governance and controls on data usage rather than data collection.

3. It’s not the tool, but how you use it – technology is not really that much of a differentiator; rather it is the architecture and infrastructure approach that makes all the difference, e.g. Forrester recommends the ‘hub and spoke’ model for decentralised big data capability.

It would seem that the heady combination of big data, privacy and IP could be lethal for any organisation; basically, if privacy issues don’t get you, then IP issues will

Every last piece of the aforementioned 125TB of big data held within your average organisation will have some associated IP rights.

REFERENCES
Gartner - ‘Information and the Nexus of Forces: Delivering and Analyzing Data’ - Analyst: Yvonne Genovese

BCS TWENTY:13 ENHANCE YOUR IT STRATEGY - ‘Intellectual property in the era of big and open data.’

Forrester - ‘Deliver On Big Data Potential With A Hub-And-Spoke Architecture’ – Analyst: Brian Hopkins

SSRN - ‘Buying and Selling Privacy: Big Data’s Different Burdens and Benefits’ by Joseph Jerome (Future of Privacy Forum) http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2294996

Out-law.com - ‘Big data: privacy concerns stealing the headlines but IP issues of equal importance to businesses’ – http://www.out-law.com/en/articles/2013/march/big-data-privacy-concerns-stealing-the-headlines-but-ip-issues-of-equal-importance-to-businesses-says-expert/

BCS Edspace Blog ‘Big data: manage the chaos, reap the benefits’ Marc Vael - www.bcs.org/blogs/edspace/bigdata

Capping IT Off – ‘Forget Data Science, Data Art is Next!’ - Simon Gratton www.capgemini.com/blog/capping-it-off/2013/07/forget-data-science-data-art-is-next

Along with cloud, social and mobility, big data (aka information) is one of four key technology forces which, according to Gartner’s Nexus of Forces, have combined to create a paradigm shift in the way we do business.

In a previous article (see refs) on this topic, I discussed how the nexus of forces impacts the rather more fundamental concept of intellectual property. In this article, we shall dive a little deeper into the key issues that impact and influence big data.

A little web research will bring up vast amounts of information and links to articles on the topic of big data. On closer inspection, however, only two or three main issues appear capable of making or breaking the promise of big data, and these are related to: solution approach, personal privacy and intellectual property (IP).

The first issue deals with technology, deployment and the organisational context, whereas the latter two big-ticket items raise concerns about the nature and applicable use of information or big data.

For the purpose of this article we’ll pay more attention to the latter issues, mainly because sparks tend to fly whenever the commercial exploitation of information and content enters into the realm of personal privacy and IP rights.

Big data
According to a recent Forrester Research paper, typical firms tend to have an


sources, like government databases, social media, as well as data that organisations would be willing to share.

In addition, the inbuilt instrumentation of smart systems generates a massive amount of, as yet, untapped data. To realise its potential value, big data needs to be transformed into smart information, which can then be used to improve planning and increase efficiency as well as to create new kinds of products.

Information security challenges
The underlying information security challenges of malice, misuse and mistake apply equally to big data. Big data techniques can also be used by criminals to improve their exploits, provide insights that facilitate security breaches and aggregate data to assist with identity theft.

Big data can be misused through abuse of privilege by those with access to the data and analysis tools; curiosity may lead to unauthorised access and information may be deliberately leaked. Mistakes can also cause problems where corner cutting could lead to disclosure or incorrect analysis. The Cloud Security Alliance has published a report on the top ten big data security and privacy challenges.

There are three major risk areas that need to be considered:

Information life cycle: big data turns the classical information life cycle on its head. There may be no obvious owner for the data to ensure its security. What will be

cloud is being used to process the big data, understand how to verify that this is secured.

Access management
Access to the analysis infrastructure, data being analysed and the results should be subject to proper IAM controls.

Audit
There should be logging and monitoring of activities on the analysis infrastructure to allow proper auditing.

The key risk areas for big data are the changed information life cycle, the provenance of the big data and technology unknowns.
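Picking up the audit point: a minimal sketch of such an audit trail using Python’s standard logging module (the user, dataset and query names are hypothetical):

```python
import logging

# Record who ran what against which dataset, and when.
logging.basicConfig(
    filename="analysis_audit.log",
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(message)s",
)
audit = logging.getLogger("bigdata.audit")

def run_query(user, dataset, query):
    # Write the audit record before the query executes.
    audit.info("user=%s dataset=%s query=%r", user, dataset, query)
    # ... submit the query to the analysis infrastructure here ...

run_query("analyst01", "call_records_2013", "SELECT region, COUNT(*) ...")
```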

The basic objectives of information security for big data are the same as for normal data; however, special attention is needed to ensure controls for these key risk areas.

doi:10.1093/itnow/bwt036 ©2013 The British Computer Society

Big data can create business value by solving emerging business challenges. However, big data also creates security challenges that need to be considered by organisations adopting or using big data techniques and technologies, says Mike Small FBCS CITP.

SECURING BIG DATA

Image: Thinkstock/iStockphoto


discovered by analysis may not be known at the beginning. The provenance of the data may be doubtful, the ownership of the data may be subject to dispute and the classification of the information discovered may not be feasible until after analysis. For all of these reasons the compliance requirements and controls needed cannot easily be predetermined.

Data provenance: big data involves absorbing and analysing large amounts of data that may have originated outside of the organisation that is using it. If you don’t control the data creation and collection process, how can you be sure of the data source and the integrity of the data? How do you know that you have the right to use the data in the way that is being planned? These points are brought out very clearly in a UK report on the use of smart metering of power consumption by utility companies.

Technology unknowns: the technology that underlies the processing of big data was conceived to provide massively scalable processing rather than to enforce security controls. While this is not a new phenomenon in the IT industry, there has not been sufficient time for the inherent vulnerabilities and security weaknesses to become manifest.

Information stewardship for big data
Taking care to look after property that is not your own is called stewardship. Information stewardship is not a new term; it has been in use since the 1990s and covers the wide range of challenges involved in managing information as a key organisational asset. These include the management of the whole information life cycle from ownership to deletion as well as aspects like business value, data architecture, information quality, compliance and security.

The basic objectives of information security for big data are the same as for normal data, being to ensure its confidentiality, availability and integrity. To achieve these objectives certain processes and security elements must be in place. There is a large overlap with the normal information security management processes; however, specific attention is needed in the following areas:

Everyone is responsible
The unstructured nature of big data means that it is difficult to assign the responsibility to a single person. Everyone in an organisation needs to understand their responsibility for the security of all of the data they create or handle.

Verification of data source
Technical mechanisms are needed to verify the source of external data used; for example digital signatures.
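As a sketch of the verify-before-use pattern this implies, the fragment below uses an HMAC from the Python standard library. An HMAC is a shared-secret message authentication code rather than a public-key digital signature, but the ingestion-side check is analogous; the key and payload are illustrative.

```python
import hashlib
import hmac

SHARED_KEY = b"key-agreed-with-data-provider"   # placeholder secret

def sign(payload: bytes) -> str:
    return hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()

def verify(payload: bytes, tag: str) -> bool:
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(sign(payload), tag)

feed = b'{"meter_id": 42, "kwh": 3.7}'
tag = sign(feed)                         # attached by the data provider
assert verify(feed, tag)                 # check before ingesting
assert not verify(feed + b"x", tag)      # tampered data is rejected
```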

Systems integrity
There needs to be good control over the integrity of the systems used for analysis, including privilege management and change control. Be careful to validate conclusions: if you can’t explain why the results make sense, they probably don’t. Always build in a way to check; don’t let big data lead you to stupid conclusions.

Secure processing
Measures to secure the data within the analysis infrastructure are needed to mitigate potential vulnerabilities and to secure against leakage. These could include disk level encryption and a high level of network isolation. Big data should be secured in transit, preferably using encryption - at least using SSL/TLS. If the

The underlying information security challenges of malice, misuse and mistake apply equally to big data.

Mike will present an in-depth look at this subject at the BCS IRMA (Information Risk Management and Assurance) Specialist Group meeting in London in October 2013.

REFERENCES
www.kuppingercole.com/report/advisorynote_bigdatasmartdata70750140513

www.enisa.europa.eu/activities/risk-management/evolving-threat-environment/ENISA_Threat_Landscape/at_download/fullReport

https://downloads.cloudsecurityalliance.org/initiatives/bdwg/Big_Data_Top_Ten_v1.pdf

www.energynetworks.org/modx/assets/files/electricity/futures/smart_meters/ENACR009_002_1.1-Control%20Points.pdf

There is now an enormous quantity of data in a wide variety of forms that is being generated very quickly. However, the term big data is as much a reflection of the limitations of the current technology as it is a statement on the quantity, speed or variety of data.

The term big data needs to be understood as data that has greater volume, variety or velocity than can be comfortably processed using the technology that you already have.

Big data comes from a number of sources both internal and external. Many organisations have accumulated large amounts of data that they are not exploiting. There is an even larger amount of data that is held in publicly available


previous web searches and tying that together with that person’s current location so that, for instance, they can be pinged with an advert for a nearby Chinese restaurant before they have walked past it, if their searches have indicated they like Chinese food. Here ‘big’ principally means ‘very fast’.

• Trying to gain business intelligence for the mass of unstructured or semi-structured data an organisation has in its documents, emails, etc. Here ‘big’ equates to ‘complex’.

So, although there is no commonly accepted definition of big data, we can say that it is data that can be defined by some combination of the following five characteristics:

• Volume – where the amount of data to be stored and analysed is sufficiently large so as to require special considerations.

• Variety – where the data consists of multiple types of data potentially from multiple sources; here we need to consider structured data held in tables or objects for which the metadata is well defined, semi-structured data held as documents or similar where the metadata is contained internally (for example XML documents), or unstructured data which can be photographs, video, or any other form of binary data.

• Velocity – where the data is produced at high rates and operating on ‘stale’ data is not valuable.

• Value – where the data has perceived or quantifiable benefit to the enterprise or organisation using it.

• Veracity – where the correctness of the data can be assessed.

Interestingly, I saw an article from The New York Times about a group that works for the council in New York. They were faced with the problem of finding the culprits who were polluting the sewers with old cooking fats.

One department had details of where the sewers ran and where they were getting blocked, another department had maps of the city with details of all the restaurants and a third department had details of which restaurants had contracts

a mix of ‘attributes’, similar to key-value stores. The most common NoSQL databases, such as Hadoop, are extensible record stores.

Graph databases consist of interconnected elements with an undetermined number of interconnections and are used to store data representing concepts such as social relationships, public transport links, road maps or network topologies.
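A toy sketch of the graph model (Python, illustrative only; real graph databases add indexing, persistence and query languages): nodes linked by an arbitrary number of edges, answered by traversal rather than table joins.

```python
from collections import defaultdict

edges = defaultdict(set)   # node -> set of connected nodes

def relate(a, b):
    # Undirected edge, e.g. a social 'knows' relationship.
    edges[a].add(b)
    edges[b].add(a)

relate("alice", "bob")
relate("bob", "carol")
relate("carol", "dave")

def friends_of_friends(person):
    # Two-hop traversal, excluding direct friends and the person.
    direct = edges[person]
    return {f for friend in direct for f in edges[friend]} - direct - {person}

print(friends_of_friends("alice"))  # {'carol'}
```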

Storing the data is, of course, just part of the story. For the data to be of use it must be analysed and for this a whole new range of sophisticated techniques are required, including machine learning, natural language processing, predictive modelling, neural networks and social network mapping. Sitting alongside these techniques are a complementary range of data visualisation tools.

Big data has always been with us, whether you consider it as a volume issue, a variety issue, a velocity issue, a value issue or a veracity issue, or a combination of any of these. What is different is that we now have the technologies to store and analyse large quantities of structured, semi-structured and unstructured data.

For some this is technically challenging. Others see the emergence of big data technologies as a threat and the arrival of the true big brother society.

doi:10.1093/itnow/bwt037 ©2013 The British Computer Society

Keith Gordon MBCS CITP, former Secretary of the BCS Data Management Specialist Group, looks at definitions of big data and the database models that have grown up around it.

WHAT IS BIG DATA?

Image: Thinkstock/Stockbyte


with disposal companies for the removal of old cooking fats.

Putting that together produced details of the restaurants that did not have disposal contracts and were close to the blockages and which were, therefore, possible culprits. That was described as an application of big data, but there was no mention of any specific big data technologies. Was it just an application of common sense and good detective work?
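Whatever the label, the New York case is essentially a join across three departmental datasets. A toy sketch in Python with entirely hypothetical records:

```python
# Department 1: streets where sewers are getting blocked.
blocked_sewer_streets = {"Mott St", "Grand St"}

# Department 2: city map of restaurants.
restaurants = [
    {"name": "Golden Wok", "street": "Mott St"},
    {"name": "Cafe Roma", "street": "Grand St"},
    {"name": "Blue Diner", "street": "Canal St"},
]

# Department 3: restaurants holding grease-disposal contracts.
disposal_contracts = {"Cafe Roma"}

# Possible culprits: near a blockage but with no disposal contract.
suspects = [r["name"] for r in restaurants
            if r["street"] in blocked_sewer_streets
            and r["name"] not in disposal_contracts]

print(suspects)  # ['Golden Wok']
```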

The technologies
More recently, following the revelations from Edward Snowden, the American whistle-blower, the Washington Post had an article explaining how the National Security Agency is able to store and analyse the massive quantities of data it is collecting about the telephone, text and online conversations that are going on around the world. This was put down to the arrival, within the last few years, of big data technologies.

But it is not just government agencies that are interested in big data. Large data-intensive companies, such as Amazon and Google, are taking the lead in some of the developments of the technologies to handle big data.

Our beloved SQL databases, based on the relational model of data, do not scale easily to handle the growing quantities of structured data and have only limited facilities for handling semi-structured and unstructured data. There is, therefore, a need for alternative storage models for data.

Collectively, databases built around these alternative storage models have become known as NoSQL databases, where this can mean ‘NotOnlySQL’ or ‘No,NeverSQL’ depending on the alternative storage model being considered (or, indeed, your perception of SQL as a database language).

There are over 150 different NoSQL databases available on the market. They all achieve performance gains by doing away with some (or all) of the restrictions traditionally associated with conventional databases in exchange for scalability and distributed processing. The principal categories of NoSQL databases are key-value stores, document stores, extensible record (or wide-column) stores and graph databases, although there are many other types of NoSQL databases.

A key-value store is where the data can be stored in a schema-less way, with the ‘key-value’ relationship consisting of a key, normally a string, and a value, which is the actual data of interest. The value itself can be stored using a datatype of a programming language or as an object.

A document store is a key-value store where the values are specifically the native documents, such as Microsoft Office (MS Word and MS Excel, etc), PDF, XML or similar documents. Whilst every row in a table in an SQL database will have the same sequence of columns, each document could have data items that are completely different.
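A toy Python sketch of these two models (illustrative only): a key-value store behaves like a dictionary whose values are opaque, and a document store is a key-value store whose values are schema-less documents.

```python
# Key-value store: key -> opaque value.
kv_store = {}
kv_store["session:9f2a"] = b"serialized-session-state"

# Document store: key -> document; no fixed set of columns.
doc_store = {}
doc_store["cust:1001"] = {"name": "A. Smith", "orders": [17, 42]}
doc_store["cust:1002"] = {"name": "B. Jones", "vip": True}   # different fields

# Unlike rows in an SQL table, the two documents share no fixed
# sequence of columns; each carries its own structure.
print(doc_store["cust:1002"].get("orders", []))   # [] - field simply absent
```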

Like SQL databases, extensible record stores, or wide-column stores, have ‘tables’ (called ‘super column families’) which contain columns (called ‘super columns’). However, each of the columns contains

Our beloved SQL databases, based on the relational model of data, do not scale easily to handle the growing quantities of structured data.

For the data to be of use it must be analysed and for this a whole new range of sophisticated techniques are required, including machine learning, natural language processing, predictive modelling, neural networks and social network mapping.

The BCS Data Management Specialist Group web pages are at:www.bcs.org/category/17607

Whether you live in an ‘IT bubble’ or not, it is very difficult to miss hearing of something called big data nowadays. Many of the emails hitting my inbox go further and talk about ‘big data technologies’. These fall into two camps: the technologies to store the data and the technologies required to analyse and make sense of the data.

So, what is big data? In an attempt to find out I attended a seminar put on by The Institution of Engineering and Technology (IET) late last year. After listening to five speakers I was even more confused than I had been at the beginning of the day. Amongst the interpretations of the term ‘big data’ I heard on that day were:

• Making the vast quantities of data that is held by the government publicly available, the ‘Open Data’ initiative. I am really not sure what ‘big’ means in this scenario!

• For a future project, storing, in a ‘hostile’ environment with no readily-available power supply, and then analysing in slow time large quantities of very structured data of limited complexity. Here ‘big’ means ‘a lot of’.

• For a telecoms company, analysing data available about a person’s


temptation to dismiss the results or massage the figures. What policies and processes are needed to ensure that this doesn’t happen?

Another important governance issue is around how to protect the valuable data. The information security threat is constantly evolving and as big data becomes the critical driving force for many organisations, the risk of having their data asset compromised or corrupted becomes acute. Great clarity on who is responsible for managing this issue and how it is managed will be critical.

So, when starting to consider all these issues, the most fundamental question is; where should responsibility for these issues lie?

Generally speaking, four options tend to present themselves:

• The CIO as the person responsible for managing the data asset;

• The person or people who get the benefit from the data asset;

• With a neutral third party;
• A mixture of the above.

As things stand, in many organisations, the CIO is the default answer. After all, the ‘I’ in CIO stands for information, so surely this should be a core responsibility? This approach does have some justification. CIOs are often the only people who have

naturally be reluctant to create yet another C-level role.

Finally there is the hybrid approach, for example sharing governance responsibility between the CIO and the users or putting a CDO in place to report to the CIO or a senior user figure such as a COO. It is certainly true that all significant stakeholder groups will need to be involved at some level in ensuring good governance around data. However, this again brings in the issues around governance by committee and unclear overall responsibilities.

Any of the above models could work, but ultimately, which of them will work is most likely to be highly influenced by the nature of the organisation. In general terms, therefore, the pictured model might apply.

However, this model does not take account of some further vital factors. For example, corporate culture is a key issue. In an organisation with a very strong cooperative culture, the hybrid approach might be the one to choose.

Last but not least, giving this important responsibility to an individual with the right experience and personality can be seen as being at least as important as their job title. Give the job to the right person and the chances are it will get done; give the job to the wrong person and the chances are it won’t. What remains true in all cases, however, is that this issue will become more and more important and addressing it successfully is going to be of vital importance for all organisations.

doi:10.1093/itnow/bwt038 ©2013 The British Computer Society

Adam Davison MBCS CITP asks whether big data means big governance.

Image: iStockPhoto/DigitalVision/Ryan McVay


an overall understanding of what data, in total, the organisation owns and what it is used for. Also, the CIO tends to have practical responsibility for many of the issues listed above such as IT security (not quite the same as information security, however) and data cleansing (not quite the same as data quality).

However, the CIO typically has responsibility for managing the data. Is it therefore appropriate that he/she should also own the governance framework under which this data is managed? Furthermore, CIOs tend to have a wide range of responsibilities, so their ability to give sufficient focus to data/information governance could be limited. Finally, CIOs may not be ideally positioned when it comes to influencing behaviours across the organisation as a whole.

Responsibility with the user?
For many, having overall responsibility for data governance resting with the users, the people who gain benefit from the data, is an appealing concept. They are, after all, the people who have most to lose if good governance isn’t applied. Again, however, there are downsides to this. Only in the

For the average undergraduate student in the 1980s, attempting to research a topic was a time-consuming and often frustrating experience. Some original research and data collection might be possible, but to a great extent, research consisted of visits to a library to trawl through text books and periodicals.

Today the situation is very different. Huge volumes of data from which useful information can be derived are readily available - both in structured and unstructured formats - and that volume is growing exponentially. The researcher has many options. They can still generate their own data, but they can also obtain original data from other sources or draw on the analysis of others. Most powerfully of all, they can combine these approaches, allowing great potential to examine correlations and differences. In addition to all this, researchers have powerful tools and technologies to analyse this data and present the results.

In the world of work the situation is similar, with huge potential for organisations to make truly informed management decisions. The day of ‘seat of the pants’ management is generally believed to be on the way out, with future success for most organisations driven by two factors: what data you have or can obtain and how you use it.

However, in all this excitement, there is an aspect that is easy to overlook: governance. What structures and processes should organisations put in place to ensure that they can realise all these possibilities?

Equally importantly, how can the minefield of potential traps waiting to ensnare the unwary be avoided? Can organisations continue to address this area in the way they always have, or, in this new world of big data, is a whole new approach to governance needed?

What is clear is that big data presents numerous challenges to the organisation, which can only be addressed by robust governance.

Most of these aren’t entirely new, but the increasing emphasis on data and data modelling as the main driver of organisational decisions and competitive advantage means that getting the governance right is likely to become far more important than has been the case in the past.

Adam Davison MBCS CITP writes the Strategy Perspective Blog for BCS. www.bcs.org/blogs/itstrategy

[Figure: suggested ownership model. Axes: level of information dependency (low to high) against diversity of organisational activities (low to high); quadrant labels: CIO or User, User, CIO, CDO.]

relatively small organisation will it be practical for the user side to be represented by a single individual. More frequently, one runs the risk of ending up with a sort of governance by committee, with a range of stakeholders each with their own viewpoints. In this scenario, the chances of a consistent and appropriate governance model being created and such a model being successfully applied are very limited.

Faced with these issues, some organisations have chosen to take a third way and create the post of chief data officer (CDO): someone who has overall responsibility for organisational data but who sits outside of either (usually) IT or the end-user communities. This approach is in many ways attractive. It means that overall governance responsibility rests with someone who is able to focus themselves entirely on the issues related to data (not the case with either the CIO or the user community) and who can take an entirely neutral viewpoint when setting rules on how such data is managed and used. However, issues again emerge.

The CDO concept can be undermined by the question of organisational authority to ensure that the decisions they make are binding, particularly as CEOs, already under pressure from multiple directions for increased senior-level representation, will

Questions, questions
To start with, there is the question of the overall organisational vision for big data and who has the responsibility of setting it. What projects will be carried out with what priority? Also one has to consider practicalities – how will the management of organisational data be optimised?

Next we come to the critical question of quality. Garbage in, garbage out is an old adage and IT departments have been running data cleansing initiatives since time immemorial. But in the world of big data, is this enough? What about the role of the wider organisation, the people who really get the benefit from having good quality data? There is also the issue that a lot of the anticipated value of big data comes not just from using the data you own, but from combining your data with external data sets. But how do you guarantee the quality of these externally derived data sets and who takes responsibility for the consequences of decisions made based on poor quality, externally derived data?

Although garbage in more or less guarantees garbage out, the opposite is not necessarily true. There are two elements involved in turning a data asset into something useful to the organisation: good quality data and good quality models to analyse that data. As was clearly demonstrated in the banking crisis, however, predictive models rarely give perfect results.

How, therefore, can organisations ensure that the results of modelling are properly tested against historic data and then re-tested and analysed against real results, so the models and the data sets required to feed the models can be refined and improved? Above all, how can organisations ensure that the results of analysis are treated with an appropriate degree of scepticism when used as a basis for decision-making?
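A minimal sketch of testing a model against held-back historic data (Python with NumPy, synthetic records only): fit on one portion, then measure error on records the model has never seen before trusting its output.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=200)
y = 2.0 * x + rng.normal(scale=3.0, size=200)   # noisy 'historic' records

train, test = slice(0, 150), slice(150, 200)    # hold back the last 50

# Fit a simple linear model on the training portion only.
slope, intercept = np.polyfit(x[train], y[train], deg=1)

# Re-test on unseen data and compare against training error.
def rmse(sl):
    return np.sqrt(np.mean((slope * x[sl] + intercept - y[sl]) ** 2))

print(f"training RMSE: {rmse(train):.2f}, out-of-sample RMSE: {rmse(test):.2f}")
# A large gap between the two is grounds for scepticism about
# any decision based on the model's output.
```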

Confirmation bias
Also, when considering how such models are used, the psychological phenomenon of confirmation bias needs to be considered: the human tendency to look for or favour the results that are expected or desired. Inevitably analysis of data will sometimes give results that are counterintuitive or just not what was looked for, leading to the age-old

Is it appropriate that the CIO owns the governance framework under which data is managed?

BIG DATA VISION


arguably, gone further. Originally designed in 2003 as an early detection mechanism for bioterrorism, the BioSense project now uses pooled data to track emerging public health problems in real time. The cloud-hosted system is a model for how the NHS could integrate data to actively identify and respond to health challenges.

Other secondary benefits of big data for the NHS would include capturing performance data that could benchmark the performance of hospital departments and highlight pockets of poor care. Such a benchmarking system was recommended by the Francis Report into failings at Mid Staffordshire NHS Trust, and would allow for interventions before, rather than after, problems compromised patient safety.

The patient perspectiveThe more ambitious target, however, is for the NHS to realise the direct or primary benefits of improved data that can enhance quality and safety at the level of an individual patient.

This means improving the flow of information to clinicians to enable them to make quicker, more informed and ultimately better decisions. One example is sepsis, which kills close to a third of affected patients. Survival hinges on detecting sepsis in the initial six hours of onset, meaning every hour counts. Applied intelligently and in real time, data can directly increase the chance of survival. Cerner’s St John’s Sepsis Agent technology continuously monitors key clinical indicators and where sepsis

radically change how it cares for Britain. By embracing data, the NHS has the possibility to improve the quality, safety and affordability of care.

Ambitions
The question, however, is how ambitious the NHS is prepared to be. If it restricts itself to the secondary benefits of smarter data, the NHS will be a more informed purchaser, with accurate pictures of health trends, hospital performance and at-risk patients. To truly transform care, however, the NHS needs to go further, directly applying data in clinical practice to improve outcomes, strengthen patient safety and prevent hospital admissions.

The challenge is not unique; it is the same as that faced by companies such as Google, Facebook and Amazon. In harvesting aggregated data, online companies build a valuable asset, but success hinges on analysing and applying that data at the level of an individual consumer. That allows for tailored interventions, such as targeted advertising on Google or suggested purchases on Amazon, that change consumer behaviour.

The context for the NHS is different, but the objective is the same: using data to better understand a patient’s health and target specific interventions to improve it.

It will require ambition to make it a reality, but the prize of a better, safer, affordable NHS makes it worth striving for.

Many thanks to Iain Wood at Cerner for his assistance with this article.

doi:1

0.10

93/i

tnow

/bw

t039

©20

13 T

he B

ritis

h Co

mpu

ter

Soc

iety

Big data needs to meet small data to deliver on healthcare’s challenges says Dr Justin Whatling FBCS, Chair BCS Health and Senior Director of Population Health at Cerner.

Imag

e: iS

tock

Pho

to

September 2013 ITNOW 17

patterns are detected an automatic alert is sent to care teams who can intervene to avoid unnecessary deaths.

By using big data to structure and process data directly in clinical practice, the NHS can directly improve the quality and safety of care. Real time application of big data can directly change patient care and save lives.
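As an illustration of the shape such decision support can take, the sketch below flags a patient when two or more simplified, SIRS-style screening criteria are met. The thresholds and field names are assumptions made for the example; this is emphatically not Cerner's actual St John's Sepsis Agent algorithm.

    # Illustrative only: a toy screening rule in the spirit of the
    # alerting described above; the thresholds are simplified
    # SIRS-style criteria chosen for the example.
    def sepsis_alert(obs):
        criteria = [
            obs["temp_c"] > 38.0 or obs["temp_c"] < 36.0,  # abnormal temperature
            obs["heart_rate"] > 90,                        # tachycardia
            obs["resp_rate"] > 20,                         # tachypnoea
            obs["wbc"] > 12.0 or obs["wbc"] < 4.0,         # abnormal white cell count
        ]
        return sum(criteria) >= 2       # two or more criteria -> raise the alert

    # Each new set of observations is evaluated as it arrives:
    if sepsis_alert({"temp_c": 38.6, "heart_rate": 104, "resp_rate": 18, "wbc": 13.1}):
        print("Sepsis pattern detected - alerting care team")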

Cloud and real-time data analytics will enable the proliferation of algorithms and decision support to direct patient care and the care of patient cohorts. This will expand massively with the availability of genomics data mapped to patients' electronic medical records. Decision support using this linked information enables targeted, personalised care.

Also, the progressive genetic re-classification of diseases will change existing handfuls of histopathological cancer types into hundreds of types per cancer. We will require huge patient databases to understand how to treat smaller numbers of each genetic type, if not down to the individual patient level. By merging the scale of big data with the detail of 'small' data, information can be applied to individual patients at the point of care. This is no longer science fiction but an achievable technological reality.

In advocating the transformation of care, we have to be mindful of the financial realities facing the NHS. In the short term, the NHS needs to realise £20 billion of efficiency savings by 2015. Longer term, a financial time-bomb caused by the rise of chronic diseases in an ageing population makes the current model unsustainable.

Data offers a solution and should underpin the effort to make care affordable. By mapping health trends across a population, at-risk groups can be identified and resources targeted more effectively. At an individual level, enabling patients to manage their own health and avoid hospital admissions dramatically reduces the cost of care.

In the US, a pilot to join up patient data between the Department of Veterans Affairs and Kaiser Permanente, a leading Californian care consortium, reduced patient visits by over 26 per cent. If the NHS is to remain affordable, it has to transform how it delivers care. Harnessing big data to better target resources and prevent hospital admissions should be at the heart of this. Far from undermining the case for investment in big data technology, budget constraint reinforces it.

The NHS has much to be proud of as it celebrates its 65th birthday. It remains a model of what an ambitious, progressive society can achieve, even in times of austerity. But if the NHS is to celebrate another sixty-five years, it needs to radically change how it cares for Britain. By embracing data, the NHS has the possibility to improve the quality, safety and affordability of care.

Ambitions
The question, however, is how ambitious the NHS is prepared to be. If it restricts itself to the secondary benefits of smarter data, the NHS will be a more informed purchaser, with accurate pictures of health trends, hospital performance and at-risk patients. To truly transform care, however, the NHS needs to go further, directly applying data in clinical practice to improve outcomes, strengthen patient safety and prevent hospital admissions.

The challenge is not unique; it is the same as that faced by companies such as Google, Facebook and Amazon. In harvesting aggregated data, online companies build a valuable asset, but success hinges on analysing and applying that data at the level of an individual consumer. That allows for tailored interventions, such as targeted advertising on Google or suggested purchases on Amazon, that change consumer behaviour.

The context for the NHS is different, but the objective is the same: using data to better understand a patient's health and target specific interventions to improve it.

It will require ambition to make it a reality, but the prize of a better, safer, affordable NHS makes it worth striving for.

Many thanks to Iain Wood at Cerner for his assistance with this article.


DATA MOUNTAIN

In our ever-connected world we are relying more and more on data centres, but they use a lot of power. With this in mind, Henry Tucker MBCS went to see the self-proclaimed greenest data centre in the world.

The dark blue water laps gently on the hard granite shoreline. Take just one step into the cold water and the drop is 70m straight down. Go a little further out and it can get as deep as 150m. These Norwegian fjords have been used for many things ever since man first laid eyes on them. Now they are being used to cool a data centre.

Green Mountain is no ordinary data centre though, even before it started using 8°C fjord water to cool its servers. That's because not only is it quite green, literally and figuratively, but it is also a mountain. Well, inside one.

Smedvig, the company that owns the data centre, isn't the first to operate inside the mountain though. The tunnels that run up to 260m into the granite were drilled by NATO in the early 1960s after the Cuban missile crisis. Initially they were used to store field hospitals and then later to house and repair torpedoes and mines; now NATO has gone and the mountain is almost empty.

In many ways it makes the perfect location for a data centre. They aren't places that you want a lot of people going to, they need to be secure and they need to be cooled efficiently and effectively. With only one way in it is secure, the fjord isn't at risk from tsunami or earthquakes and, with all the mountains and lakes around it, it isn't short of cheaper electricity from the network of hydroelectric plants in the area.

The fact that it has hydroelectric power nearby would always be in its favour when it came to being environmentally friendly. In addition to this, however, the company running it has designed an efficient way to cool it using the other thing that it has on tap: the fjord water.

As the water is so deep, when you get down to 100m it is a constant 8°C all year round. This water is then drawn into a large concrete tank without using pumps, because it is at sea level. As it is sea water it can't be used to directly cool anything inside the data centre, as it would be too corrosive.

So what they do is use a closed fresh-water pipeline that draws the heat away from the racks; this is then cooled using the sea water and titanium heat exchangers. After this the sea water goes out into the fjord again at a temperature of 18°C.

As water is a far more efficient and effective coolant than air, Green Mountain claims that it gets 100kW of cooling from just 1kW of power.
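The arithmetic behind that headline figure, and behind the 8°C-in, 18°C-out temperatures quoted above, is straightforward. The sketch below is illustrative only; the 100:1 ratio is Green Mountain's own claim, not a measured value.

    # Rough arithmetic, illustrative only.
    cooling_kw = 100.0                  # heat removed from the racks
    pumping_kw = 1.0                    # electrical power spent on cooling
    print(f"COP = {cooling_kw / pumping_kw:.0f}")   # 100, versus ~3-4 for typical chillers

    # How much sea water does 100kW of heat rejection imply? The water
    # enters at 8 degrees C and returns to the fjord at 18 degrees C.
    c_p = 4.18                          # kJ/(kg*K), specific heat of water (approx.)
    delta_t = 18.0 - 8.0                # temperature rise in kelvin
    flow = cooling_kw / (c_p * delta_t) # kW = kJ/s, so this is kg/s
    print(f"~{flow:.1f} kg/s of sea water")         # about 2.4 litres per second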

A potential downside of its remote location, of course, could be the time data takes to get to and from the servers that reside there. Stavanger, though, is the hub of the Norwegian oil industry, so it already has fast connections to other parts of Northern Europe and to London. Green Mountain claims a latency of only 6.5ms to the UK.

Green Mountain CEO Knut Molaug explained the company's claim to being the greenest data centre in the world.

‘It is because of the low CO2 emissions, we have virtually no CO2 emissions. As you know the data centre industry is very energy hungry and we, in Norway, have close to 100 per cent renewable energy.

'We use 100 per cent hydro power here. So that's number one. Number two is that we built the system to be extremely efficient, utilising the fjord outside the site for cooling. This means that we have one of the world's most efficient data centres, in combination with using green energy and using former buildings in our efforts, and in everything we have built, we have put a green element in all the designs.'

Going green
The company's rationale for building the data centre came from wanting to build the facility, but also to try and do it in as environmentally friendly a way as possible.

'It was a combination of both (wanting to build a data centre and wanting it to be green). We were discussing the possibility of building a data centre in other places around us, because the owners of Green Mountain are already owners of another data centre in Stavanger, and during the process of evaluating the possibility of building another data centre using water for cooling, this site came up for sale. So it was a combination of we were looking to build a green data centre and an opportunity that came along.'

By creating what Green Mountain likes to claim is the greenest data centre, they are hoping that other companies take their lead and make potentially greener data centres. 'It's the beauty of competition that when somebody stands out, someone will want to level you or pass. We hope that this spurs further development within green data centres.'

As to what is driving companies to choose to use the data centre, although it has excellent green credentials, Knut doesn't think that green is the main reason companies choose it.

'I think that the main driver for any business is money. The fact that we have green energy available, at low cost, is the main driver for almost all of them. Everyone would like to be green, but they don't want to pay for it. We can offer a cheaper alternative that, in addition, is green.'

Green Mountain is a good example of making the best use of the things you have around you in order to be as efficient as possible. With the data centre industry growing, hopefully more will take on some of the features of Green Mountain to reduce their CO2 footprint.

[Diagram: 8°C cooling from the fjord - sea water is drawn from 100m down to a cooling station, feeding in-row coolers in the data room; depth markers at 30m and 100m.]


WHO ARE YOU?

Louise Bennett FBCS, Chair of BCS Security, looks at the opportunities and dangers of one of the implications of big data: identity discovery through data aggregation.

Think for a moment about all the data that you have given to organisations when you signed up for a subscription or purchased a ticket. Add to that your loyalty card data and what you have posted to social networks, your browsing history and email, your medical and education records. Then add in your bank records, things friends and others have posted about you, memberships and even CVs posted to job sites. Pictorially, it will look something like the accompanying diagram.

Are you happy about people joining all this data together into an aggregated view of your life and mining it? If they do so, what are the implications for privacy, and will it benefit you or 'them' more?

There are many commercial models on the internet. Some services are free or below cost because there is value in the data that customers give up when they use those sites or services. The quid pro quo is usually targeted advertising. As Viviane Reding of the European Commission said on 22 January 2012, 'Personal data is the currency of today's digital market'. It is widely said that if you are not paying the full cost of a service you are a product, not a customer. Most young people either do not think about this or they accept it, and it can be a win-win situation: you can apparently get something for nothing, or almost nothing, if you pay for it with your identity attributes.

Do you need to get offline?
However, you may not want your identity attributes to be used, and privacy may really matter to you. If that is the case, do you need to get offline and lose out on some deals you might be offered? What does big data mean for your privacy? Can you retain online privacy, or is identity discovery through the aggregation of your personal data attributes inevitable?

Personal information disseminates over time into many different areas and, once published on the internet, it is improbable that it can ever all be deleted. There are also powerful commercial tools available to mine information about an individual or organisation. The next time you use a social media site or search engine, consider what adverts or suggestions are made to you. They will often be tied to your habits.

For this reason, many people will want to use different identities for different activities on the internet to frustrate potential data aggregation. Many of us will feel there has been an invasion of our privacy if, out of the blue, a connection we deliberately withheld is made about us. For example, you may wonder: 'How on earth did the organisation my husband has just bought something from know my mobile phone number? We did not give it to them and it is in another name. So how could they text my smart phone to tell me his purchase will be delivered to our home tomorrow?'


Increasing regulation
Concerns about data aggregation and data mining on the internet are likely to increase rather than decrease in the coming years. There is also likely to be pressure for regulation because of the potential privacy implications. One example of this is the proposed new EU Regulation on Data Protection, which includes a section on 'the right to be forgotten'. However, if the Regulation ever gets agreed (which is unlikely, with about 4,000 amendments tabled and a 2014 deadline before the EU elections), the right to be forgotten is one thing that will probably be removed.

Such a right is certainly technically challenging, if not impossible, in the internet age. The best privacy activists can hope for is a right to relative obscurity.

The online world increasingly uses a network of attributes to determine identity. If these attributes are just matched for a one-off identity check, that is one thing; if they are stored and aggregated in big databases, it raises more concerns. When we think about privacy, particularly in relation to commercialisation of the internet, government surveillance and data collection, it is revealing to consider the outrage at Edward Snowden's revelations about elected governments engaging in lawful espionage, compared to the absence of concern that businesses (accountable only to their shareholders) have all this data in the first place.

Many individuals object to identity discovery through data aggregation, whether by governments or business. This is especially true where it is used to find out about a person's preferences and life, using data that the individual regards as sensitive, personal data. It is of even more concern when it is used for cyber-stalking and cyber-bullying, or transfers into the real world as stalking or other criminal activities.

This in turn can lead to people feeling it is legitimate to withhold information about themselves, or to provide incorrect information in responding to requests they feel are unjustified (e.g. mandatory fields on their age, ethnicity or religion being requested before they receive their goods or services). This is especially important where identity discovery is looking for attributes that are not actually identity attributes, but give information about a person's preferences or life choices (such as sexuality or membership of organisations).

Attributes of identity
The 'attributes' aspect of identity is key to the responsible use of big data. Everything is context dependent. We rarely engage completely online. Often the trust context is developed offline (through our friends or trusted brands) and carried through to the online experience. It is vital to determine what attributes are required in a particular interaction, and how trustworthy attributes can be conveyed in a manner that maximises the benefits of the availability of those attributes while minimising the disbenefits of revealing more attributes than are strictly needed. This requires detailed analysis, not broad generalisations.

While technology solutions may exist, the social and economic aspects of implementation are very complex. They are also very personal and will change for an individual over time, even in identical contexts. What was a playful prank at school could have implications when applying for jobs years later if it can be linked to your identity.

The opportunities
The pace of innovation in online commerce and delivery of government services is accelerating. By making everything digital, exploiting the power of big data and the ubiquity of mobile communications, there are huge opportunities to improve productivity, enhance the value to individuals and manage risks effectively.

While the potential upsides are great, the downsides are also stark. The downsides lie mainly in the potential loss of privacy (both real and perceived) and the erosion of trust, if those online cannot provide evidence of their trustworthiness in the context of the transactions they wish to make.

Context and demonstrable trustworthiness are key to the use of personal data attributes in the online world. They are blended with our experiences in the offline world. The success of 'bricks and clicks' commercial models is testament to this. Those who mine big data need to think very hard about how they monetise personal data attributes. They need to be transparent about what they are doing and provide evidence that they are trustworthy if they are to handle our attributes in an acceptable manner, and be successful in an online world.
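The distinction drawn earlier between one-off matching and stored aggregation can even be enforced technically. A minimal sketch, illustrative rather than a production identity scheme: a service keeps only a salted hash of an attribute, so it can verify a claim without ever retaining or aggregating the raw value.

    # One-off matching versus storage: verify an attribute against a
    # salted hash without keeping the attribute itself. Illustrative only.
    import hashlib, os

    salt = os.urandom(16)
    stored = hashlib.sha256(salt + b"1985-04-12").digest()   # kept instead of the attribute

    def matches(claimed):
        return hashlib.sha256(salt + claimed).digest() == stored

    print(matches(b"1985-04-12"))   # True - verified, yet no profile is built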

[Diagram: the many sources of aggregated personal data. Categories: public profile, education, birth and citizenship, government records, medical, financial, photographs, memberships, social media, social interactions, email, purchases, preferences, professional qualifications, CV/resume, career. Sources include: social media sites, news groups, chat groups, instant messaging; professional bodies, groups, organisations; online albums, other people's photos, picture sharing; banks, credit references, credit cards; car/driving licence, electoral roll, address sites, directories; school, training, university; news sites, TV sites, conferences, books; publications, articles; job sites, career sites, personal websites; online shops, review sites, ratings sites; phone records, SMS; search engines.]


YOUR RESOURCES

BRIEFING: BIG DATA

Helen Wilcox outlines some of the resources in the member library on big data.

Managing big data privacy concerns
The article examines the key privacy concerns facing any organisation that uses big data analytics, notably the feasibility of obtaining consent from individuals for using their personal information gathered from social media for data aggregation purposes. The writer suggests several practices that both public and private sector organisations can use to deal with these issues, such as offering their consumers and stakeholders a process to review and correct information that has been collected about them.
Lynn Goodendorf, VerSprite, USA
Source: Information Security, March 2013

An automation tool for single-node and multi-node Hadoop clusters
Setting up a Hadoop cluster to handle large data sets is not a difficult job but requires a lot of human time and effort. This increases as the number of nodes goes up. The time and effort could be reduced by using a tool to automate the Hadoop cluster set-up procedure. Such an automated tool is the subject of this study, which shows that it would allow a single operator to install a single-node or multi-node cluster.
Vaibhav N Keskar, Amit A Kathade, Gopal S Sarda and Amit D Joshi, College of Engineering Pune, India
Source: Journal of Artificial Intelligence

The value of data
This article examines businesses' use of big data, as of March 2013. The writer says that the large data sets available to corporations represent a valuable asset, with some companies making significant use of them and including them on their balance sheets, while others have not begun to recognise their value. He cites examples of the use of data in business management, including an analysis of comments on Twitter to predict movements in stock prices and an analysis of customer transaction data.
Peter Bartram, author of The Perfect Project Manager, UK
Source: Financial Management, UK

Scalable storage solutions for applied big data
The storage solutions for applied big data are examined in this article. The author reports that there is a plethora of scalable storage products, ranging from those for use with home entertainment to the Oak Ridge National Laboratory Titan supercomputer. The challenge is for them to work reliably at the scale required, and to allow for future expansion. The article looks at the Common Internet File System for end users, and highlights the features and drawbacks of the General Parallel File System (GPFS) and Lustre high-performance computing file systems.
Rob Farber, Editor, Scientific Computing
Source: Scientific Computing, USA

Big data privacy standards
The development of big data security and privacy standards for companies in the USA is discussed in this article. It looks at the 10 technical and organisational security and privacy challenges posed by big data identified by the Cloud Security Alliance (CSA) Big Data Working Group. They fall into four main categories: technical challenges, legal and privacy challenges, data analytics, and ethical.
Jaclyn Jaeger, Compliance Week, USA
Source: Compliance Week, USA, February 2013

Data science, predictive analytics, and big data: a revolution that will transform supply chain design and management
The writers illuminate the myriad opportunities for research where supply chain management (SCM) intersects with data science, predictive analytics and big data, collectively referred to as DPB, and show the relevance of these terms to supply chain research and education. The authors call for research on the skills that SCM data scientists need, and propose definitions of data science and predictive analytics. They examine possible applications of DPB and provide examples of related research questions, as well as examples of research questions employing DPB that stem from management theories.
Matthew A Waller and Stanley E Fawcett
Source: Journal of Business Logistics, June 2013

Big data raises big questions
The obstacles faced by governments that want to harness the power of big data are discussed in this article. Governments collect and store data in various formats and completely unrelated systems that may not be able to communicate or transmit data effectively between them. Furthermore, big data projects raise extra issues as they use personal data gathered for an entirely different purpose. As they have been slow to embrace big data, few governments have built data warehouses, which means they could invest instead in the latest-generation systems, such as logical data warehouses.
Merill Douglas
Source: Government Technology, USA, April 2013

Internal auditors' input to big data projects
On big data projects, internal auditors need to have a seat at the table and ask hard questions about risks and rewards, says this writer. Big data has become a significant development for internal auditors as corporations adopt its use, following its rise to prominence during the 2012 US elections. Internal audit must be in the forefront in classifying data sets, according to the article.
Russell A Jackson, freelance writer, USA
Source: Internal Auditor, February 2013

Assisting developers of big data analytics applications when deploying on Hadoop clouds
Big data analytics applications are a new type of software application which analyse big data using massively parallel processing frameworks (e.g. Hadoop). Developers of such applications typically develop them using a small sample of data in a pseudo-cloud environment and then deploy the applications in a large-scale cloud environment with considerably more processing power and larger input data. The authors noticed that the runtime analysis and debugging of such applications in the deployment phase cannot be easily addressed by traditional monitoring and debugging approaches. In this paper, they propose a lightweight approach for uncovering differences between pseudo and large-scale cloud deployments, using execution logs from these platforms.
Weiyi Shang, Zhen Ming Jiang, Hadi Hemmati, Ahmed E Hassan, and Patrick Martin, Queen's University, Canada, and Bram Adams, Polytechnique Montréal
Source: ICSE: International Conference on Software Engineering, February 2013

Finding the needle in the big data systems haystack
With the increasing importance of big data, many new systems have been developed to solve the big data challenge. At the same time, some famous database researchers argue that there is nothing new about these systems and that they are actually a step backward. This article sheds some light on this discussion.
Tim Kraska, Brown University, USA
Source: IEEE Internet Computing, January 2013

Big data: The next big thing in innovation
The rise of big data is connected to the advent of web 3.0 and the proliferation of sensors increasing the amount of automated data collection, according to this writer. However, she says that putting big data to work, whether to drive innovation or to reshape innovation processes, will not be so easy.
MaryAnne M Gobble, Research Technology Management
Source: Research Technology Management, USA, January/February 2013

The rise of big data
The writers look at the effect of increasing quantities of digital information, or big data, on the way humans interact, communicate and learn. Topics include the determination of correlative rather than causative relationships in statistical research using large quantities of data, the lack of accuracy and precision of data created through resources such as the internet, and the ability of technology to produce larger statistical samples.
Kenneth Cukier, The Economist, and Viktor Mayer-Schoenberger, Oxford Internet Institute, UK
Source: Foreign Affairs, May/June 2013

Big data in digital media research
This paper discusses the methodological aspects of big data analyses with regard to their applicability and usefulness in digital media research. The authors examine the consequences of using big data at different stages of the research process, based on a review of a diverse selection of literature about online methodology. They argue that researchers need to consider whether the analysis of huge quantities of data is justified, given that it may be limited in validity and scope, and that small-scale analyses of communication content or user behaviour can provide equally meaningful inferences when using proper sampling, measurement and analytical procedures.
Merja Mahrt, Heinrich Heine University, Germany, and Michael Scharkow, University of Hohenheim, Germany
Source: Journal of Broadcasting & Electronic Media

Looking back at big data
This writer discusses the use of big data in historical research, looking at the relationship between historians and computer scientists in terms of digital historical archives, digital publishing and online research resources. She describes computational history research conducted by Adam Jatowt from the University of Kyoto, as well as collective memory, data mining and probability distributions. She covers a study on fame conducted by Google, research into journalism trends, the British Old Bailey Online law archive, and historian William Turkel.
Leah Hoffmann, technology writer
Source: Communications of the ACM, USA, April 2013

Big data: What's your plan?
This article describes how to make plans for using big data to gain a market advantage in 2013. It focuses on the elements of a successful plan, including making data accessible, using an advanced analytic model, and tools which translate analysis results into actual business actions. It covers corporate planning challenges, including identifying investment priorities, balancing the cost and speed of the strategy, and ensuring acceptance by those who use the strategies.
Stefan Biesdorf, David Court and Paul Willmott of McKinsey, from the Munich, Dallas and London offices respectively
Source: McKinsey Quarterly, 2013



Hotel management must interpret and apply big data to gain the competitive advantage
How the hotel industry collects and measures big data to achieve competitive advantage is described in this article. The first step in getting to grips with the proliferation of information and its use is to establish a hotel-specific big data model, says the writer. Data collection and evaluation are key components of any hotel's success, but so are the intelligence and the intuition of the management team in interpreting vast amounts of data.
Mark Lynn, HVS, USA
Source: HVS Global Hospitality Report, March 2013

Keep up with your quants
Analytics are a competitive necessity nowadays, but hiring quantitative experts (quants) who can manipulate big data successfully is not enough, say these authors. They show how to become an intelligent consumer of analytics in order to make effective data-driven decisions. Learning basic statistical principles and methods is essential, but the quants have the detailed know-how, and working closely with the right ones is key. They should communicate their work effectively, while the consumer should ask lots of questions and avoid hunting for evidence to fit preconceived notions. Instead, the quants should establish a culture of inquiry that focuses on learning the real truth behind the numbers.
Thomas H Davenport, Babson College, Deloitte Analytics, and International Institute for Analytics
Source: Harvard Business Review, USA, July/August 2013


To get these articles and more login to the BCS secure area, go to ‘My Knowledge’ then ‘EBSCO Databases’


Members: login to 'My Knowledge' in the secure area to see further listings with direct links.


Information assurance (IA) is what information security people do to try and manage risks associated with information and data.

This covers the people, processes and systems that might access, store, process, and transmit it. It should be holistic, and focus on more than just technical security controls, taking on board strategic and organisational issues too.

IA should consider governance and compliance issues alongside the risk ones, paying due regard to legal, regulatory and contractual compliance.

It is not simply an IT or technical discipline where techies can work in isolation from the real world; often it requires a delicate balance where people and cultural conflicts are possible, e.g. with BYOD.

Other balances must be struck when considering aspects of privacy and transparency, weighing obligations against benefits and risks.

A good IA professional rarely says no outright, preferring to understand what the business is trying to achieve and then working collaboratively with it to arrive at a suitable method of getting the desired result.

Those working in IA must continue to stay on top of standards and good practice, advances in technologies and emerging issues that may impact particular approaches and change risk profiles (e.g. online communications and cloud computing being targeted by foreign governments).


Most of all, they must engage positively with their business.

Working in this space is both challenging, with everything continually developing, and rewarding, especially when playing a part in defending your organisation, client or country.

www.bcs.org/security

When it comes to information assurance you need to take a wide view of the issues, says Gareth Niblett, Chairman of the BCS Information Security Specialist Group.

FURTHER INFORMATION

Information Security Specialist Group (ISSG): www.bcs-issg.org.uk

Information Risk Management and Assurance Specialist Group: www.bcs.org/groups/irma

BCS Security Community of Expertise (SCoE): www.bcs.org/securitycommunity

HOLISTIC SECURITY





Even if the agreement with the cloud supplier satisfies the organisation's legal and regulatory department and complies with data protection legislation, how sure can we really be about who has the ability to read our information and, what is more, what could we do about it even if we knew?

Organisations like the NSA and GCHQ are actually doing what they are supposed to do - keeping us safe - so we should not be surprised to hear that interception takes place; after all, that is what the Regulation of Investigatory Powers Act (RIPA) 2000 was designed to control. What should really concern us is what information we are losing control of.

In May 2013, the Cabinet Office mandated that 'Purchases through the cloud should be the first option considered by public sector buyers of IT products and services', and use of the G-Cloud will provide cost savings to the taxpayer as a result.

This is potentially both good news and bad. As taxpayers, we should be delighted that the government is trying to spend our money in a more effective way. However, since government often stores vital sensitive personal information about us, does placing it in the cloud put it (and us) at greater risk?

The interception of traffic through network routers and switches is not technically demanding - the challenge comes in the decision as to what to intercept, and this is where the skills of the security service analysts come into their own. Semantic and heuristic analysis of data streams provides a first cut of data to be saved, and further analysis yields a result - either negative, in which case the information is discarded, since the costs of indefinite storage would be astronomic; or positive, in which case it is retained for possible further exploitation.

Many organisations use more than one cloud service, and keeping track of what is stored with which supplier is becoming an increasing challenge. Subscription-based software is now available to consolidate all of an individual's or organisation's cloud credentials, sign in to each, and display all the cloud services in a single, consistent view. Great - information management has just become much easier. However, now all the keys to the kingdom are in one place, and all the security agencies have to do is 'persuade' the consolidation service suppliers to hand over the keys and off you go.

Certainly, in these days of economic downturn, the pressure is on finance directors to save money wherever they can, and the organisation's IT infrastructure is a very good place to begin; but the potential savings from using the cloud should always be balanced against the potential losses - financial and non-financial - that might arise if interception orders are served on cloud service suppliers.

Maybe it’s time to take a step back and look at clouds from both sides now.

www.bcs.org/security

For many organisations, information underpins their very existence. Take pharmaceutical companies for instance – if the composition of their latest cancer-curing drug, which has taken many years and hundreds of millions of pounds to develop is suddenly copied, their business is very definitely placed at risk. The same applies to any organisation that has invested time and money in research and development.

Let’s consider for a moment what happens to our information when we send it off into the cloud. It leaves our network and then what? We simply don’t know.

We don't know what network equipment carries our information, where it is stored or how it gets there. Given that our information is now totally outside our control and that the possibility of its interception is limitless, should we really abandon our ability to control it?

Shouldn't we therefore consider what information is stored where, rather than simply trusting the cloud supplier to secure our most vital assets?

Regardless of this, should we be concerned? I believe that we should. Data protection legislation aside, organisations have a fiduciary responsibility to protect their sensitive information.

There have recently been questions as to whether the Chinese government has the capability of political and commercial espionage through data interception from the high-end routers and switches that are used widely in the networks of national and international communications service providers. Whether or not this were true, why would any other national government not exert the same influence over similar equipment manufacturers within their own jurisdiction?

When an organisation's information passes through the jurisdiction of any government, it takes little more than an executive order to allow them to intercept it, and quite possibly to impound it if they wish.


It has been several years since cloud services became a viable and cost-effective means of managing our information and IT infrastructure. There have been numerous articles and books written about the technicalities of how organisations can make best use of the cloud and of the security issues that arise. David Sutton FBCS CITP, co-author of Information Security Management Principles says that we should turn our attention back to focus on the ‘what’ and the ‘where’, rather than on the ‘how’.

Data protection legislation aside, organisations have a fiduciary responsibility to protect their sensitive information.

Certainly, using the cloud does (or at least should) solve two key business problems:
• A quantifiable reduction in costs. Organisations using the cloud require less IT infrastructure on their own premises; they don't have to spend money on staff to look after these increasingly complex systems and they don't feel the need to upgrade them whenever new hardware, operating systems or application software appear.
• A reduction in security worries. The cloud provider takes care of securing the organisation's outsourced infrastructure and the information - well, in theory at least.

Whilst the first benefit is undoubtedly true, can we be sure about the second? Recently, there has been much discussion in the media about interception of personal information including emails, fixed and mobile phone call records, text messages, instant messages, Facebook and Twitter accounts... the list seems endless.

Media reporting about the PRISM programme has highlighted the active participation of Apple, Facebook, Google and Microsoft; therefore why should our own organisation's information stored in the cloud be any different?

Shouldn’t we consider what information is stored where, rather than simply trusting the cloud supplier to secure our most vital assets?

CLOUD SURFING


For the IT department, protecting personal data is an on-going concern requiring constant review. The technology provided by employers has changed dramatically over the last decade so that there are more opportunities for data leakage, for instance by way of portable devices.

Employers are faced not only with the threat of financial penalties, but with the immeasurable consequences of loss of reputation. Only recently it was reported that the personal details of about six million people have been inadvertently exposed by a bug in Facebook's data archive, whilst over in South Korea, a cyber alert has been issued after an apparent hacking attack on government websites, including the presidential office.

Where do the threats come from?
GCHQ, in its Ten Steps to Cyber Security1,


Organisations erroneously believe that security policies will eliminate human error, but a common method of orchestrating a cyber-attack is to take advantage of a weakness in human nature by sending an email engineered with a link to an infected site or to malicious software. These days it’s easier to hack users than computers.

Plymouth University has conducted some interesting research looking at the people and processes affecting the information security behaviour of employees, and has created a model that serves as a tool for predicting which factors are likely to influence that behaviour3.

There will be a ‘spectrum of commitment’ in most organisations where users will display different behaviours along a scale from total rejection of advice at one end to total acceptance and adherence at the other. Most employees fit somewhere in the middle. The researchers assert that personality affects intention, in that it will act as a filter through which various other influences are passed in order to inform and affect an individual’s ultimate security behaviour.

The ability to carry out that intention is then affected by the security safeguards put in place by the organisation, such as training, monitoring, disciplinary procedures and proactive security role models.

Cormac Herley of Microsoft Research looks at the issue of IT security behaviours from the user's perspective, tackling it from a cost versus benefit perspective4.

• misuse of web access or email access, unauthorised access to systems or data using someone else's ID;
• breach of data protection laws or regulations;
• misuse of confidential information;
• loss or leakage of confidential information.

So what is the diligent IT department to do about this significant challenge from an unexpected quarter?

For the smaller organisation without a dedicated security professional this is challenging, as a wealth of information from diverse sources has to be continuously collated to stay on top of things. There is no single, comprehensive source of knowledge. Instead we inform ourselves by reading, attending conferences and taking advice from experts and our peers. Following this process we review what we have learnt and carry out a risk analysis to see how the threats could actually affect us.

Once we have carried out a risk analysis, we get together with management and create a hybrid plan of written policies coupled with actions restricting employee behaviour. There are hard choices to be made about whether to prohibit, monitor or allow on trust and rely on the users’ cooperation and understanding.

But even with this framework in place, staff-related security incidents still occur, which leads us to ask whether users are taking security seriously. For the IT department it is all too easy to blame the staff for not following instructions, whereas in reality staff-related incidents are often due to multiple failures in technology, processes and people.

breaks the risk down into the following categories:
• cyber criminals making money through fraud or the sale of valuable data;
• industrial competitors trying to steal secrets;
• foreign intelligence services;
• hackers in it for the laughs;
• hacktivists with ideological motives;
• employees or those with legitimate access, either by accident or deliberate misuse.

However, the Department for Business, Innovation and Skills reported that, for the year to April 2013, the proportion of 'other incidents caused by staff' was actually fractionally higher than that attributed to cyber attackers, at 72 per cent2.

These incidents were broken down into subcategories:

IT security: a source of perennial concern for the IT department; ultimately a source of tedium for many employees. Juliet Flavell says that companies need to present users with workable security guidelines.


AVOIDING CYBERWASH

He suggests that the majority of the advice given is good. However, for the user this type of advice comes at a cost, which is weighed up against the perceived benefits.

He argues that users go through a subtle thought process when deciding whether or not to follow good practice, carrying out a cost/benefit calculation, where the cost is the time and effort involved in following the guideline, and the benefit is subsequently avoiding the harm an attack might bring. So it comes down to direct and indirect costs.
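Herley's calculation can be made concrete with a toy example. Every number below is invented for illustration; the point is the shape of the comparison, not the values.

    # Herley's cost/benefit argument in miniature (illustrative numbers).
    minutes_per_day = 2.0                            # effort to follow the guideline
    annual_cost_h = minutes_per_day * 365 / 60       # ~12.2 hours of effort per year

    p_attack = 0.005                                 # assumed annual chance of the attack
    hours_lost_if_hit = 10.0                         # assumed clean-up cost of one incident
    expected_benefit_h = p_attack * hours_lost_if_hit  # 0.05 hours saved per year

    print(f"cost {annual_cost_h:.1f}h vs expected benefit {expected_benefit_h:.2f}h")
    # Sound advice is still rationally ignored when it costs 12 hours to
    # avoid an expected three minutes of harm.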

Security guidelines do help to reduce exposure to the direct costs of an attack, but at the same time the indirect costs, though hard to measure, are increased in terms of time and effort spent implementing the guidelines.

So users ignore new advice because they are overwhelmed and the cumulative effort required is a burden. The benefit is perceived to be debatable, and there is a lack of data on the frequency and severity of attacks; to the user, therefore, it appears to be mere speculation that following security advice will reduce the risk of attack.

What this means is that we need more detailed research into the success rates of various types of attack to help us to tailor our guidance and avoid ‘worst-case risk analysis’, allowing us to offer usable advice on the most common exploits such as phishing.

We must prioritise our advice, avoiding cyberwash; otherwise users will have to select which advice they follow and which they ignore. Finally, we must realise that if we exaggerate all the risks, many users will simply ignore us.

www.bcs.org/security


References
1. http://www.gchq.gov.uk/Press/Pages/10-Steps-to-Cyber-Security.aspx
2. https://dm.pwc.com/HMG2013breachessurvey/
3. Understanding the influences on information security behaviour, Professor Steven Furnell and Anish Rajendran, Plymouth University
4. So Long, And No Thanks for the Externalities: The Rational Rejection of Security Advice by Users, Cormac Herley, Microsoft Research


a third of both the vulnerability space and protection, but they are not neatly segregated into silos that can be managed independently.

Clearly, people management and business processes overlap strongly, but it is less well recognised that both overlap to a considerable extent with technologies as well. Many technological vulnerabilities are exploitable only via human intervention, and protection technologies must not interfere with business operations.

Equally, business demands must not conflict with technical security or compliance requirements. Consequently, around 75 per cent of IS does not directly relate to technologies, but to change control, policy and procedure management, business process review, analysis, audit and the security-related components of statutory and regulatory compliances.

The remit of ITS is the security of the technological infrastructure over which business information flows and the remit of IS is to ensure business information is managed and used securely over that infrastructure.

The components of information security

Our definition of IS seems pretty all-embracing, so how does information assurance (IA) differ from it? IA has a monitoring responsibility for the entirety of IS, but its primary obligations are oversight of the accuracy, authenticity, completeness and accessibility of business information assets,

business requirements and identify potential vulnerabilities in both processes and technical implementations, for example, to ensure that data backup and DR procedures will perform effectively for the business.

The CIO must have a broad understanding of everything within the remit of the CTO and CISO, but should concentrate primarily on those aspects of IA that are outside those remits; ensuring that information-related legal and contractual obligations are fulfilled, that business information can be trusted as a basis for decision-making, and that the required data can be found and assembled into useful information on demand.

The CIO role therefore requires detailed knowledge of the business and its expectations, excellent communication skills, and broad understanding of both technologies and regulatory requirements sufficient to enable reliable communication with the CISO and CTO.

So we see that the security-related roles of CIO, CISO and CTO are ideally nested in the same way that we established the disciplines of IA, IS and ITS are. In fact they map pretty much one-to-one.

Each tier has a wider remit than the tier below, relying on the specialist knowledge of that tier to implement specific components of broader security requirements ultimately driven by corporate governance requirements.

However, this must not be a command and control hierarchy of silos, but a consortium based on complementary expertise. Nested silos perform no better than parallel silos, and implementing ‘security’ from within silos of any kind cannot deliver real information assurance.

The free flow of information across the structures outlined here is essential to the performance of the whole.

www.bcs.org/security

of principle five are the responsibility of IA. Principles one, six and the non-technical aspects of principle eight are matters for the legal department, which comes under corporate governance.

Therefore the relationships resolve to a nest of functions with ITS at the centre, surrounded in turn by IS, IA and, finally, the information-related components of corporate governance.

From this it is clear that the common practice of positioning IA or IS (or even just data protection) within the IT department is a recipe for failure. Both the range of business and soft skills - commercial acumen, risk expertise, legal understanding and psychology - and the authority and reporting pathways required for proper performance of IS and IA are rarely required of, or used as hiring criteria in, the pure technological security domain.

So, from the security perspective, where should the recognised roles of CTO, CISO and CIO ideally sit, and what are the ideal skill sets for fulfilling them?

The CTO is responsible for the technological architecture and infrastructure, including IT security - firewalling, network segregation, intrusion detection and prevention, event monitoring and the provision of adequate backup and disaster recovery (DR) technologies.

The role requires a deep understanding of information technologies at the level of first principles, knowledge of and the ability to evaluate current vendor offerings, and awareness of the nature of and changes to the technical threat landscape.

The CISO requires conceptual understanding of technologies with the same scope as the CTO's, but not at such a detailed level. But the CISO must maintain detailed current knowledge of the threat and vulnerability spaces and be able to assess current risks to business information with confidence.

In addition, the CISO must be able to review

and responsibility for compliance with the business-related components of relevant statutory and regulatory compliances.

Let's examine a simple example to see where the boundaries lie. The UK Data Protection Act invokes eight principles:

1. processing must be fair and lawful;
2. processing must be restricted to specific purposes;
3. information must be adequate, relevant and not excessive;
4. information must be accurate and up-to-date;
5. information must not be kept for longer than necessary;
6. processing must be in accordance with subjects' rights;
7. technical and procedural security must be adequate;
8. data transfers abroad must comply with the above requirements.

In this case, ITS would be responsible for the technological components of principle seven, and that’s all. The procedural components of principle seven, technical reviews in support of principle eight, and the fulfilment of principle five are the remit of IS. Principles two, three, four and the direction


Mike Barwise MBCS CITP looks at the role of information assurance and how it fits in with other roles within business.


Information assurance is one of those popular terms, like risk, that is widely used without a clear understanding of its real meaning.

To some extent it has been the victim of grade inflation. We started out doing IT security, which gradually became referred to as information security and ultimately as information assurance - ever grander-sounding titles, despite the actual nature of what most of us were doing hardly changing.

As a result, the security remit of most Chief Information Officer (CIO) and Chief Information Security Officer (CISO) roles is today still restricted largely to technologies, often essentially replicating the security remit of the Chief Technology Officer (CTO). Much corporate information assurance therefore exists in name alone.

So what could be done differently? Let’s look first at information security (IS). Although this is commonly considered a technological discipline, technological security (ITS) is really only one of its components.

Technologies, business processes and people management each contribute roughly

ROLE WITH IT

[Diagram: corporate information governance - nested tiers with ITS at the centre, surrounded in turn by IS, IA and the information-related components of corporate governance.]


23, 54 or 46 are perfectly good instances of data, but they are not particularly informative.

In order for data points to become information there must be a known relationship between the data and what it encodes that makes it meaningful. In contrast, information is always data, because semantic meaning is always encodable as data. All information is data, but not all data is information.

It is vital that, as information security professionals, we enrich our understanding of the ways data becomes meaningful.

To do this we need to consider in turn issues related to order, truth, association, size, brevity, resolution and causal efficacy.

Ordering the data
Knowing the meaning of data, i.e. what information it is, is the critical step in establishing its true value. So how does data become meaningful? The first way in which data becomes meaningful is order. I give you a list of four-digit numbers from 0000 to 9999.

As a dataset I’ve just expressed all the credit card EMV PIN numbers in use in the UK, and as information it has little meaning, and even less value. I now reorder that list, putting all the numbers from 1930-1995 at the beginning and values like 0000, 1111 etc. at the end, and give the list the title ‘most popular EMV PIN numbers in the UK’. Ordering changes the meaning, and by extension the value, of that list.
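To make the thought experiment concrete, here is a minimal sketch in Python; the popularity figures are invented for illustration (the article gives no real distribution). It shows that reordering alone - no new data points - changes what the dataset tells you:

    # The exhaustive PIN space: as a plain enumeration it carries almost no meaning.
    pins = [f"{n:04d}" for n in range(10000)]

    # Hypothetical popularity counts - illustrative values only.
    observed_frequency = {"1234": 813, "1111": 442, "0000": 391, "1985": 57}

    # Reordered by assumed popularity, the same values become a guessing strategy.
    most_likely_first = sorted(pins, key=lambda p: observed_frequency.get(p, 0), reverse=True)
    print(most_likely_first[:4])  # ['1234', '1111', '0000', '1985']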

Truth
If we consider another order of our EMV PIN number list, this time titled ‘the most popular EMV PIN numbers in the UK as reported in a secret credit card brands report’ and received from a legitimate source, we could say that the ordered data is now more accurate. With complete accuracy the information becomes true. True data becomes more meaningful and highly valuable information.

Accuracy and truth are not synonymous. But we can assume that without accuracy, the truth of information would be hard to verify.


Resolution
The greater the amount of data captured about a ‘thing’, the more informative it is likely to be.

Consider the difference in two CCTV cameras looking at the same scene from the same perspective, where one has an image resolution that can allow people to be identified from the footage, the other not.

One critical point to make in relation to resolution is linked to brevity. Maintaining meaning when aggregating, summarising or otherwise reducing the resolution of a data set is often a subtler and more difficult problem than people imagine.
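A tiny illustration of that subtlety, using invented figures: two very different datasets can share the same summary, so the aggregation is accurate and yet silently discards meaning.

    # Hypothetical patient ages on two wards; both summaries are 'correct'.
    ward_a = [30, 30, 30, 30]
    ward_b = [10, 20, 40, 50]

    mean = lambda ages: sum(ages) / len(ages)
    print(mean(ward_a), mean(ward_b))  # 30.0 30.0 - the reduced resolution hides the difference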

What does it all mean?
Exploring data and information, and the critical role of meaning in influencing their value, will enhance how we manage the confidentiality, integrity and availability risks to them as assets.

Perhaps it is only as information that data has any inherent value worth protecting at all.

Likewise, assuming that neither data nor information has an intuitive, self-evident definition will help to reduce the subtle dangers of taking part in a dialogue or process that fails to see the risks in the uncritical use of a common language.

And a final word should go to big data, as we move into datasets so vast that they may well blur the line between data and information.

Perhaps there is a notional critical mass after which a dataset, regardless of the content, is de facto meaningful.

www.bcs.org/security


Brevity
‘If I had more time, I would have written a shorter letter’ is commonly accepted as true. When bandwidth was costly, short meaningful messages were more valuable. If we had a list of default PINs and some of the associated PANs it would be a big data set.

That information could be re-expressed in a condensed format as the algorithm for generating a default PIN from a PAN (in truth this isn’t actually how it works, but it does help to illustrate the point). Brevity condenses the content of data without loss of meaning and in so doing it becomes more valuable.
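As a sketch of that re-expression - and, like the article’s own example, deliberately not how real card schemes work, since the rule below is invented - a long table collapses into one short function:

    # A long table of (PAN, default PIN) pairs condensed into a single made-up rule.
    def hypothetical_default_pin(pan: str) -> str:
        # Invented rule for illustration only: 'the default PIN is the last four digits'.
        return pan[-4:]

    print(hypothetical_default_pin("4929123456781842"))  # '1842'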

Causal efficacy
What can the data you hold let you do? In order to dig into this question we need to change our perspective a little. Consider the question of what data you would need to supply to make an online payment when the card is not present.

Typically one needs a credit card number, a name, a card verification value and an expiry date associated together to make a valid transaction (assuming the sites you used didn’t require an additional verification step).

Whilst payments require a number of data points to have a level of causal efficacy, consider how many data points you need to identify yourself to get access to online services. Typically only two data points are required - an email and a password. Knowing what data enables makes it valuable as meaningful information.


Association
A list of EMV PIN numbers, however well and truly ordered, has limited meaning.

A criminal with some stolen credit cards now has better guesses about the appropriate EMV PIN to use with each, but with only three attempts per card the value is still limited.

As part of our thought experiment, let us consider what if, on each row of the list, a valid primary account number (PAN) were written alongside.

The association of a PIN and a valid PAN has given our data much more value. Association of different data elements is another route for adding valuable meaning to data, and the correlations indicated by the associations of data elements in different data sets are a fundamental deliverable of big data.

It is also important to note that when two individually benign data elements are brought together their meaning and value can be increased hugely.
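A minimal sketch of that association step (all values fabricated): two individually weak datasets, joined on a shared key, yield the dangerous combination.

    # Each dataset alone is of limited use to an attacker.
    pins_by_customer = {"cust-17": "1985", "cust-42": "2580"}
    pans_by_customer = {"cust-17": "4929123456781842"}

    # Joining them associates a PAN with its PIN - meaning, and value, jump sharply.
    associated = {
        cust: (pans_by_customer[cust], pin)
        for cust, pin in pins_by_customer.items()
        if cust in pans_by_customer
    }
    print(associated)  # {'cust-17': ('4929123456781842', '1985')}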

Size
Size always matters. Imagine the increase in meaning if, instead of one valid PAN number, 50 valid PAN numbers were listed with their corresponding valid PIN. The more data there is, the more meaningful the information can become. Even if you don’t see the actual record, the size of a data set changes the risk profile - a news report indicating a breach of five records has a very different negative value to a news report of a breach of 5 million records.

As individuals living in a rich technology and communication ecosystem we capture, encode and publish more data than ever before. This trend toward greater amounts of data is set to increase as technology is woven ever more into the fabric of our everyday lives, says Ben Banks MBCS, European Information Security Manager, RR Donnelley.

As information security and privacy professionals we are in the vanguard of navigating this new landscape. Our challenge is enabling commerce whilst ensuring our stewardship for these new assets remains strong.

This article explores one aspect of this challenging new world - when does data become information and what does that change mean for our assurance work?

From data to information
Data and information are not synonymous. Although the terms data and information are often used interchangeably, adopting a more rigorous understanding of them has important implications.

It is fairly intuitive that an instance of data, a data point, when considered in isolation is not information. For example


DATA, DATA EVERYWHERE



IT governance comprises the organisation and provision of IT services and the performance measurement and enhancement of these services. It is in relation to this latter element that the concept of assurance arises.

Assurance is an important component of IT governance. How can the CIO show that the IT service is meeting its value for money and service objectives? Usually this is through the provision of performance metrics, but how can they prove that these metrics and associated analysis are reliable?

I once attended a meeting with a Chief Executive and his six direct reports, two of whom were the CIO and the Chief Internal Auditor (CIA). I asked each head of department, in turn, who was responsible for internal control in their company. Without hesitation each one pointed to the CIA. When I then asked them how frequently the CIA audited their controls they responded ‘every three years’.

When I then asked them who was responsible in between the three-year periods, they shuffled their feet, avoided my eyes and remained silent. I then pressed them on risk management. They accepted that this was their responsibility, but when I pointed out that risk is managed by controls they started to realise that control was their responsibility too. Assurance is primarily achieved by measuring the effectiveness of controls in managing risks.

Unfortunately, most auditors and managers cannot define what a control is and how it operates, so it is not too surprising that our IT assurance processes are somewhat suspect. I would go further by asserting that our current control paradigm is not fit for purpose.

Our technology has changed beyond recognition in the last 40 years - from mainframe computers running single batch programmes to cloud computing. Apart from the hardware and the people, most of IT is invisible to the eye. We cannot see the software, data or transactions that comprise our IT systems. Even the bits we can see may be operated by a remote third party. We are attempting to control 21st century technologies with 18th century controls, without even knowing what a control is.

A control works by comparing something against a known answer. It is simply a test. As an example, let us consider a gender field with a single allowable entry of either ‘M’ or ‘F’. The monitoring mechanism checks for an allowable entry. If the entry meets the ‘M’ or ‘F’ criterion it is allowed to pass through to the next process. If it fails the test, however, the process is modified so that the transaction is returned to the initiator. So all controls are processes, but not all processes are controls.
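A minimal sketch of that example in Python (illustrative only; the field name and codes are taken from the example above):

    ALLOWED_ENTRIES = {"M", "F"}  # the known answer the control tests against

    def gender_field_control(transaction: dict) -> bool:
        # A control is simply a test: compare the field against the known answer.
        if transaction.get("gender") in ALLOWED_ENTRIES:
            return True   # allowed through to the next process
        return False      # failed the test: route the transaction back to the initiator

    print(gender_field_control({"gender": "F"}))  # True - passes on
    print(gender_field_control({"gender": "X"}))  # False - returned to the initiator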

Applying this to risk management we now need to consider the movement from inherent (gross) risk to residual (net) risk. The risk equation has two components: reducing the likelihood and reducing the consequence.

To do both you need a minimum of two controls, and this assumes that each control is one hundred per cent effective. I can prove with some pseudo-mathematics that this is not the case: by deconstructing a control into its four elements - design, implementation, monitoring and evaluation - and then assigning a maximum value to each element, I can assess the actual value of each element for a particular control and mathematically calculate the overall effectiveness of that control.
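The article does not publish its actual scoring model, but a plausible sketch of the kind of calculation described might look like this. The weights, the scores and the choice to multiply the fractional scores - on the basis that a weak element undermines the whole control - are all assumptions:

    # Deconstruct a control into its four elements, each scored against a maximum.
    MAX_SCORE = {"design": 25, "implementation": 25, "monitoring": 25, "evaluation": 25}
    assessed  = {"design": 20, "implementation": 15, "monitoring": 10, "evaluation": 20}  # hypothetical

    effectiveness = 1.0
    for element, maximum in MAX_SCORE.items():
        effectiveness *= assessed[element] / maximum

    print(f"Overall effectiveness: {effectiveness:.0%}")  # ~15% - far from the 100% risk registers assume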

This shows that few, if any, controls actually reduce the inherent risk to an acceptable residual level. In many cases we can only manage one side of the risk equation: either we reduce the likelihood, or we reduce the consequence, but we may not be able to do both. So all those risk registers that show a movement from inherent red risk to residual green risk are basically wishful thinking.

In most cases the best we can achieve is a movement from red to yellow. Nowhere is this better illustrated than in our change management process, which, in most entities, relies on trust as the control mechanism. Trust the programmer to not add any unauthorised code and trust the tester to find it if they did.

We can only manage humans, not control them, because they have free will, so trust is not a control and we should therefore acknowledge that anything which relies on it is flawed.

So if IT governance effectiveness is assured by controls and our control paradigm is flawed, then where does that leave us? My answer is: in a quagmire. We are truly up the creek without the proverbial paddle. The only solace that I can offer is that we now have a more ‘scientific’ method of measuring control effectiveness, which can at least provide a more accurate picture of where we really are in our risk management process. Assurance, or lack of it, truly has business value, because it can show senior management where they really are regarding their residual IT risks. Unfortunately, this is likely to be a most uncomfortable experience.

www.bcs.org/security

According to John Mitchell, IT governance can be defined as ‘a structure of relationships and processes to direct and control the IT function in order to achieve the enterprise’s goals by sustaining and extending the enterprise’s strategies and objectives’.



TAKE CONTROL




In June I was on two panels at the SC Conference: one on consumerisation and security, the other on standards. The consumerisation panel was very interesting, mainly covering the security implications of bring your own device (BYOD), but also touching on near field communication (NFC) and Bluetooth.

The presentations from the conferences and talks can be found on the website at: www.bcs.org/scoe/workshops

Work on BYOD by the SCoE has resulted in a research report, and a workshop on NFC was held in February, for which the report is available on the website.

An Institute position paper should be published soon; if you would like to express a viewpoint or provide input to this, please drop me a line at [email protected].

The Institute will be talking at the UK Internet Governance Forum and the UN Internet Governance Forum. The UN IGF in October is a prestigious event and it is quite an achievement to have the Institute proposal accepted for the third year running. Members of the Identity Assurance Working Group, in collaboration with Open Net Korea and Beijing University, are presenting a workshop on identity assurance.

This looks set to be a lively debate; the main subject is the balance between security and anonymity. The key question being asked is: is there a place for anonymity online, and is it needed or even desirable?

We are also looking at other areas of identity assurance, including protecting the naïve from themselves, how lack of identity assurance can result in digital exclusion and the whole area of online identity governance. The workshop should prove very informative to our work in this area. Full details, including a background paper, can be found at: www.intgovforum.org/cms.

Everyone is welcome to join the workshop via remote participation, and details of this will be sent out nearer the time of the conference.

This is an ideal certification for anyone wishing to demonstrate their experience and expertise in the areas of information assurance and security. The scheme is open to anyone, including the complete BCS membership.

The scheme offers three levels of practitioner, from those in more junior roles (SFIA level 2) at Practitioner level, to the senior roles (SFIA level 4) at Senior Practitioner level and the real experts and industry leaders (SFIA level 6) at Lead Practitioner level.

Though initially targeted at improving professionalisation in the public sector, we hope that it will appeal to the private sector as well. We expect to see this become the de facto certification for security professionals over the next few years. We are really pleased that the Institute has been involved in the development of this scheme.

There are also a number of related activities. The Institute offers a number of professional certifications, training courses and exams in the area of information security, which it has been developing. More details can be found here: http://certifications.bcs.org/category/15733.

The SCoE has quite a full calendar for 2013 with a number of areas of research and we hope that the publications prove useful. Updates will continue to be provided via the specialist groups, ITNOW and the various mailing lists that the Institute operates.

More details of these can be found on the Institute website at: www.bcs.org/eventscalendar and www.bcs.org/membergroups

www.bcs.org/security

IoT is a massive area. The SCoE is looking at it from a security perspective, but there are many other aspects which require BCS expertise.

In May the SCoE had presentations from the National Fraud Authority and Get Safe Online. The Institute is a supporter of both areas of work and remains involved in their ongoing development. It is definitely worth having a look at the Get Safe Online website; there is a lot of useful information there, including lots of good stuff for our children.

The Institute is also involved in the work of the IET and education partners to create a Cyber Security Skills Alliance. The idea is to get cyber security built in to more degree courses and improve the quality and quantity of security experts in this area. Improving the academic options in the area is crucial to upskilling the UK’s internet generation. This feeds neatly into the other main area of work.

The Institute obviously feels that professionalisation of the IT industry is really important. There are millions of people working in IT, many claiming to be experts or professionals, so how can you be sure?

The Institute, IET, IEEE and other professional bodies have done much to help in this area. Just being a full member of the Institute shows a level of professionalism and competence. However, until recently there has been nothing for information security.

Since last year there has been a scheme that provides such recognition for information security and information assurance professionals - the CESG Certified Professional Scheme.

As mentioned in the Spring issue of ITNOW, the Institute is one of the three bodies chosen to operate the scheme on behalf of CESG, the UK National Technical Authority.

We are still trying to get the Identity Dynamic Coalition formed and running and will pursue this at IGF again this year. If you would like more information or to be involved please email [email protected].

Members of the SCoE represent the Institute on various standards panels, including those of the British Standards Institution and the International Organization for Standardization (ISO). In April the ISO SC27 conference took place in France. One important outcome of this is that new versions of ISO/IEC 27001 and ISO/IEC 27002 are likely to be published by the end of the year.

Various other standards including a number in the 270xx series and a number on privacy and identity management have also been published or are due in the next few months. A report on all of this will be published on the website soon.

The Institute has also been providing input to a number of UK HMG standards and policies, and to work on the Government Digital Agenda. It is an exciting time for the UK government with the move to ‘digital by default’, and the Institute will remain actively involved to ensure the membership’s views are taken into account as these strategies move forward.

If you want to become involved there are various ways to do so. The best way initially is via the policy hub: www.bcs.org/policyhub.

Our work with the Digital Policy Alliance (http://dpalliance.org.uk) has been quite interesting. Over the last few months work has been going on around the new EU Data Protection legislation and Digital Single Market. The other big area is the internet of things (IoT), which is now becoming more important as consumer devices from cars to toasters connect to the internet.

The Institute has a specialist interest group for IoT; if you would like to join, more details can be found here: www.bcs.org/scoe/internetofthings.

BCS, The Chartered Institute for IT, remains actively involved in information security. Andy Smith FBCS CITP details what the various groups have been working on, primarily in two key areas.



The first is general information security, the other is professionalisation of the information security industry. The full version of this update can be found at: www.bcs.org/securitycommunity.

The Security Community of Expertise (SCoE) and its various sub-groups deal with information security and IT security on behalf of the Institute. Over the last few months the SCoE has been busy talking at various conferences and representing the Institute on various national and international bodies. We have also been busy writing position papers and providing feedback on various legislative proposals.

Our latest report on aspects of identity that covers the work of the Identity Assurance Working Group (IAWG) for 2012/13 has been published and is available for download on the website at: www.bcs.org/identity.

The Institute held a workshop at the InfoSec Europe 2013 conference in April. This was on the subject of identity assurance, which the Institute feels is a very important area. We concentrated on preventing identity theft and what organisations can do to protect themselves and ensure their staff are who they claim to be.

At the EEMA (www.eema.org) annual conference in the Netherlands the team presented on the same subject, this time concentrating more on commercialisation of the internet and use of identity attributes as currency to buy things online.


SECURITY UPDATE



So what? This decision highlights the ICO’s increased willingness to issue monetary penalties to reinforce its message that portable devices containing personal data should be encrypted. The decision is notable as a significant penalty was issued despite no sensitive personal data being contained on the laptops. However, given that the council had been the subject of an enforcement notice two years earlier for similarly losing unencrypted portable devices, the ICO has demonstrated that it will not tolerate recurring data protection breaches. Any business handling personal data can take away some key lessons from this recent decision:

• ensure that all portable devices are encrypted;

• keep all portable devices securely stored when not in use;

• maintain an asset register detailing who is responsible for each portable device and identifying where such devices are located; and

• it is not enough to have compliant policies in place - you must follow them.

ICO focuses on online content
The facts: The ICO is raising its game on enforcement, getting to grips with newer technologies and increasing its focus on compliance – even on social networking sites and online forums.

In the last 12 months the ICO, responsible for data protection policy and enforcement, has issued new guidance on cloud computing, BYOD, IT asset disposal and social networking and online forums.


Organisations and businesses do not benefit from the domestic purpose exemption and in many cases, even groups of individuals acting together are likely to fall outside it and be subject to data protection legislation.

Those running a social networking site or online forum must take special care as many aspects of their use of personal data are likely to be regulated. For example:

• using details of the site’s users and subscribers will be regulated; and

• posted content, where moderated, will be regulated.

The distinction between having data protection liability for moderated content and having no such liability where posts are made directly by users and not first moderated is not clear-cut. Having the ability to moderate posts (a sensible precaution in the light of defamation and other risks), such as through site terms on acceptable use and policies giving the site power to remove posts in breach of site rules, is likely to tip the balance and impose data protection liability for posted content.

Consequent risks to those running such sites and forums include:

• the need to take reasonable steps to ensure personal data facts are accurate;

• the need to ensure personal data is adequate and not excessive to need;

• the ability to deal with data subject rights, such as correction of details;

• the restriction on indefinite retention of personal data;

• the need to ensure that there is a lawful ground for such processing in the first place; and

• the restriction on personal data export unless adequate safeguards are in place.


So what? The guidance issued and details published in respect to enforcement action make clear that the ICO is keen to keep up with technological advances and uses of them. The regulator is also clearly taking a stronger line with those who would argue that they are not subject to the data protection legislation, arguing that they do not control the use of personal details sufficiently to be under ICO jurisdiction.

The ICO’s stance on social networking sites or other online forums is a good example of this. According to the ICO:

• posting content on a website will often involve processing personal data;
• downloading and using personal data from third party sites will be processing; and
• those running the social network site or online forum are likely to be processing personal data.

This processing will normally be regulated by data protection legislation unless it is by a private individual who is acting within the narrow constraints of the domestic purposes exemption.

Charlotte Walker-Osborn, Head of the TMT Sector and Partner, Liz Fitzsimons, Legal Director, and James Ruane, Associate, from international law firm Eversheds LLP, take us through a whistle-stop tour of a couple of recent developments in the field of data protection.


DATA PROTECTION

ICO fines Glasgow City Council £150,000 over theft of unencrypted laptops

The facts: The Information Commissioner’s Office (ICO) has issued a monetary penalty of £150,000 against Glasgow City Council after two of its laptops were stolen during refurbishment works at its offices.

On 28 May 2012, two unencrypted laptop computers were stolen from the offices of Glasgow City Council during refurbishment works. At the time they were stolen, the first laptop was locked in a storage drawer with the key placed inside. The second laptop was placed alongside the key in another drawer, but the user had forgotten to lock the drawer.

In finding that the council had seriously breached its obligation to take appropriate technical measures against the loss of personal data, the ICO considered the following aggravating factors:

• one of the laptops contained the personal information of more than 20,143 individuals, including bank account details of 6,069 individuals;

• the council did not prevent its IT supplier from issuing unencrypted laptops despite having a policy intended to prevent this from happening. In this case, the laptops were unencrypted due to problems with the council’s encryption software;
• the council had not encrypted the laptops despite requests from both employees;
• 68 other unencrypted laptops are unaccounted for, with a further six known to have been stolen; and
• this enforcement action follows a previous enforcement action by the ICO against the council following the loss of unencrypted memory sticks in 2010.

That said, the ICO has issued some pragmatic guidance in respect of data accuracy to help those affected understand what reasonable steps will involve (checking every individual post for accuracy is not required):

• have user policies for acceptable and non-acceptable posts;

• have procedures to deal with take-down requests;

• have procedures to remove or suspend disputed content pending resolution; and

• have suitable complaints policies and procedures to deal with complaints about site content, accuracy and use of personal data.

The operator of the site or forum must then comply with and follow such procedures and policies, implementing them promptly and thoroughly to minimise the risk of ICO action.

It should also be noted that although traditionally the ICO has been a reactive body, increasingly its approach to compliance is proactive, such as checking sites, their terms, policies and use of personal data without warning.

Those involved in social networking sites or online forums may be concerned about what content can now be posted. The ICO is due to release further guidance on using personal data for the purposes of journalism, including where posts are reasonably believed to be in the public interest.

This guidance should provide more clarity as to what exactly operators should be doing to avoid ICO action.

Please note that the information provided above is for general information purposes only and should not be relied upon as a detailed legal source.

www.bcs.org/security



What is BCS’s involvement with the Professional Record Standards Body?
I, along with BCS, have been supporting this over a number of years; it’s been three years in the making, but it takes time to get to that level of buy-in and it’s all structured correctly now. I’ve been involved, personally, over the last couple of years and have helped to shape that. And then, more recently, when it was coming to the stage when they wanted to move forward as an organisation, BCS was asked if it would be a founding member.

Prior to that it had been asked for some funding to support it; it provided a little bit of seed-corn funding, through BCS Health, to help get them started during the interim phase, since they had an interim team to build it and, although it was a small amount of money, it helped them to become operational and to enable them to get all the other organisations joined up, so they were very grateful for that.

As a result of that, a volunteer member, Philip Scott MBCS, joined them for their steering committee meetings over that period and then the Institute was asked to be a founding member as a professional body and, of course, it is the informatics professional body, whereas many of the others are clinical professional bodies.

However, they do need the informatics involvement as well, and that’s the structure that we got approved through the Policy and Public Affairs Board. And I think, like all these organisations, it’s not just a matter of having your name against it, it’s about what value or contribution those bodies are going to provide to the PRSB, to help shape the organisation and the clinical colleges, to be looking at the standards and to be signing them off.

From BCS’s point of view, we need to be helping them to understand how to best work with suppliers and to make sure the standards are then adopted, or ask: do they need accreditation mechanisms and if so, how can you have a light-weight accreditation mechanism; how do we help them find a business model that’s going to work so they’ve got a sustainable revenue, above and beyond the initial funding they’ll get for the next couple of years from the Health and Social Care Information Centre.

In the longer term, as with many independent organisations, the question will be: how does it create a sustainable business model to continue to exist?

Although a lot of the initial project funding will come through the Information Centre, health systems always get rearranged at certain times, and what you don’t want with such an important thing that’s actually working bottom-up with the clinical professions is for it to become the casualty of any restructuring or changes in budgets, for example. So they’ll need to work on establishing the mature business model that they can operate to make sure that it’s a long-term part of the way we do things, and I think there are options for that and they’ll be wanting to explore those quite quickly.

It’s not going to be government-led?
No, it’s definitely an independent body; it’s been constituted that way, servicing and supporting the four nations of the UK, and it might do slightly different things for each of the nations. It needs requirements from the marketplace, about what needs to be done for both clinicians and patients, but it will also need requirements that come from the centre as well.

Is it a good time to be in health informatics?
Yes, in many ways it really is a good time, because of the pent-up demand to transform the way healthcare is being delivered and because there are not many tools that can be called upon; as we know, health informatics is under-exploited and it all becomes about the data and the information and what you do with that, to change the way in which we deliver things and to help to improve services for the patients. I like health informatics’ focus at the moment: being centred on patients, coming out of the back room, and being much more focused on supporting the business of healthcare and making it work.

I think there are challenge areas too, mainly because of the financial constraints that a lot of the healthcare organisations are operating under and the changes that are going on centrally as well. We need to make sure that we don’t lose our health informaticians in all of that, but actually we’re protecting it and growing it because it’s a strategic capability we have, so I think there’s challenges in all of that as well.

• advising statutory voluntary professional advisory bodies and associations on records and structure of content as well.

I think there’s also a lot of hope that the PRSB will be able to go on and do lots of other things as well. But that’s how it is setting out its overall mandate for this first period.

What do you think its biggest challenges will be in the short to medium term?
The short term challenge is how to get it working in practice. It’s substantiated as a body now; it’s got a lot of support around it and it’s just putting in those processes and mechanisms that mean it actually works and it can deliver on the work that they’re doing, making practical decisions around how much they do within the organisation, how much they broker in the standards market for the development and so on; so some of the pragmatic aspects of it.


There’s momentum that we can see here at the healthcare conference, passion to do something and to do something substantive that’s focused on the patient. And also in the open market, by encouraging a flourishing market with the engagement of a lot more vendors solving problems at a more local level. I think it will help to unlock the market so that there are more health informatics opportunities for people within organisations, within suppliers and so on. Globally, it is a great sector to be working in.

A couple of years ago Professor Heinz Wolff gave one of the plenary talks here at HC, and he said that the only conceivable way for our healthcare system to survive, particularly with respect to our growing ageing population, was if we had some sort of points system whereby people would do things for their neighbours, performing some sort of social care function, and, in return, they’d gain points which they could ‘cash in’ for their own future care when they most needed it. Was that a ‘pie in the sky’ idea or possibly an inevitable one?

It would be nice to think that we could get there, and in some ways it already does happen. There are some six million informal carers already and they all do it because they care about something. But how do you help bring them together into networks that support each other? It’s quite an interesting idea.

I think there are some business models out there that are quite interested in similar schemes, whereby people aren’t charging money for goods and services exchanged, but are interested in recycling or reusing goods, and I think social networking has really helped in such cases, where people just try to help each other out.

In a healthcare or health informatics setting there are online patient registries where people can join - typically in areas around neurological conditions and niche groups, but not necessarily - who come together and share problems about experiences in those networks. And that can be quite self-treating in a way, as someone may say: ‘I’ve got this condition, I’ve just started taking this for it’ and they might get a response from 10 others in their social network who say ‘don’t worry,

that’s the main side effect, I had that, it did go away after two weeks’; I think those are massively supportive for the individuals, knowing there are people out there like them that they can get help from. So it should take some of the burden away from the system because they’re actually becoming ‘intelligent’ patients and they’re using their own asset – and some would say that patients are often the biggest asset of the healthcare system - to actually change their own health and well-being and that of those around them.

I think Professor Wolff was referring, in a way, to David Cameron’s proposed idea of the ‘big society’, but this time it becomes mandatory for people to help others in order to guarantee their own future healthcare. I’m not sure that could ever work for everyone, as some people just physically can’t do a lot.

I don’t see how you can sensibly mandate it, but I think people reach stages in their lives where they happen to want to give back more, where people have had a medical disease that’s been awful - they might have got over cancer or whatever it is - and they want to give back to those communities to help others going through similar issues. A lot of people run marathons or take part in charity events, raising money for good causes, so it’s kind of there in many ways already. But, personally, I don’t see how you could mandate it.

How do you think HC2013 has gone?
I’m really pleased with the way it’s gone. The content of the sessions that I’ve been to and have heard feedback from has been excellent. Working with all the partners that have been involved in the event has come together very well to form a consolidated programme.

There’s been a really good buzz and I’m particularly pleased with all the networking that’s been going on - people working on problems and looking to collaborate with each other more in future; there’s been an awful lot of that going on. And talking to colleagues on the exhibition stands, they’ve had some really good conversations with customers and partners to collaborate with as well, so overall, yes, I’ve been very pleased with the event.

www.bcs.org/health

TAKING CARE

So Philip’s fully engaged with that agenda, linked into our Standards Lead, Stephen Kay. They’re valuing our support and we’re really pleased to be involved with such an important and new part of the fabric, to have an influence and bring our expertise and experience to bear.

What are PRSB’s key aims or goals?
At the highest level:

• to show the requirements of the people that use and receive care;

• with regard to the patients and clinicians, to demonstrate that their requirements can be supported and expressed in a way that the clinical information is structured and stored within electronic health record systems for health and social care, and it needs to do that by reflecting on the way in which different clinicians work together;

• looking at what’s best practice;
• making sure that it takes good care of recipients and that it’s being done in a safe and secure way, embedded into systems.

They’re going to do that by a range of different things that they’ve outlined today, namely:

• providing overall governance of the structure for content standards;

• providing professional assurance standards that are being proposed and implemented;

• prioritisation of standards as well, because some things will be more important than others to get done, and doing it in a very agile way will be important, rather than ‘boiling the ocean’;

BCS Multimedia Editor, Justin Richards, recently caught up with Justin Whatling, Chair of BCS Health, at HC2013, to talk about the Institute’s involvement with the newly launched Professional Record Standards Body (PRSB) and the current state of health informatics.


The BBC’s recent decision to put its 3D TV venture on hold is yet another indication that all is not well with television’s foray into the third dimension. A number of factors have contributed to its current demise and these include a failure to properly accommodate the ways people behave when ‘watching’ TV - from the child who regularly switches attention between toys and screen, to the adult who multitasks. In every case those glasses get in the way and all too often gravitate to that dark space beneath the sofa.

Cinema audiences are more single-minded and are generally intent on a truly immersive experience. They are therefore more willing to tolerate viewing glasses as an interim solution but look forward to the development of alternative technologies that will support the convenience of glasses-free (autostereoscopic) 3D.

In fact, glasses-free 3D cinema is not a futuristic vision – in Moscow back in 1941 it was reality playing on a 5x3m screen:

‘The auditorium is plunged in darkness, except for a little lamp suspended from the ceiling by a long cord. But wait – an actor suddenly reaches out from the screen and draws the lamp towards him. How did he do it? As a matter of fact, there was no lamp left burning in the auditorium. It was simply an effect produced by the stereocinema… A juggler flings a ball straight at the audience, and those who happen to come within his line of vision blink and duck involuntarily…’ Ivanov [1941]

On show was the 40 minute 3D film, ‘Konsert’ (Fig. 1) and during a four month period some 500,000 people took the opportunity to enjoy autostereoscopic 3D. Unfortunately the venture could not have been more ill-timed, and it came to an abrupt halt in June when Germany and Russia became embroiled in total warfare.

On 20 February 1947 glasses-free 3D re-opened in Moscow. Significant developments in display technology were complemented by advances in the art and science of stereo photography.

Back in 1940, Dennis Gabor (inventor of holography) eloquently summarised two key functions of the display hardware. Firstly a ‘multiplication’ function is needed, and this in effect makes multiple copies of the stereo output from the projection system. These are then distributed to each audience member by means of a ‘spreading’ function in such a way that the left and right stereo views are directed to the intended eyes.

The technology
The oldest method of implementing a projection-based glasses-free 3D display uses a parallax barrier comprising a set of interleaved transparent and opaque bands aligned vertically (akin to a picket fence - although the palings and gaps must be so narrow that they cannot be individually resolved by the eyes, and the palings must be non-reflective). The barrier is located in front of a light diffusing screen and operates in such a way that the two images generated by the projector(s) are segmented into vertical strips and interleaved on the screen. Light returning from the screen must also pass through the barrier, which now operates so as to make each set of strips visible from specific locations and gives rise to multiple viewing positions at a certain distance from the screen. Clearly for cinema the display technique must support viewing across the length and breadth of an auditorium, and in this respect the parallax barrier fails.
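The standard textbook geometry for a flat two-view barrier (not taken from the article; the symbols are mine) makes the limitation concrete. With pixel-column pitch p, eye separation e and design viewing distance d from the display, the barrier must sit at a gap g in front of the screen and have pitch b given by:

\[
g = \frac{p\,d}{e + p}, \qquad b = \frac{2p\,(d - g)}{d}
\]

Both relations depend on a single design distance d, which is why correct left/right separation is only achieved at a fixed set of viewing positions - workable for one seated viewer, but not across the length and breadth of an auditorium.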

In the 1920s Edmond Noaillon undertook extensive research into glasses-free cinema. He recognised the limitations of the parallax barrier and developed the radial raster barrier, which was subsequently used by Semyon Pavlovich Ivanov and coworkers in the implementation of Russian glasses-free 3D cinema. In this scenario, the transparent and opaque bands of the barrier are no longer parallel but form a fan-like structure and if extended meet at a common point. Both multiplication and spreading functions are carried out effectively - viewing being supported across both dimensions of the auditorium (Fig. 4 and 5).

Russian researchers recognised that the barrier is inherently inefficient in its transmission of light (all light falling onto the opaque bands is lost). Consequently, when autostereoscopic cinema reopened in 1947 the barrier was superseded by a radial lenticular approach – thereby enabling the formation of much brighter images.

Our everyday perception of the 3D world is strongly influenced by small differences in the images presented to the two eyes. These arise because each eye sees the world from a slightly different vantage point and this is used by the visual system to give a vivid impression of depth. In the case of stereo photography, a scene is photographed from two locations approximately separated by the distance between our eyes. If we then present these photos to the eyes in such a way that the left-hand photo can only be seen by the left eye and the right-hand photo only by the right eye, then the visual system fuses content and we perceive 3D.

In the case of glasses-based cinema and TV, the left and right views are encoded in some way and are simultaneously presented to all members of the audience. Viewing glasses serve a decoding function and ensure that the left and right stereo images are mapped to the intended eye. This is a simple and cost-effective method of delivering 3D – provided that viewing glasses are deemed acceptable.

Glasses-free 3D cinema poses some interesting challenges.


Within a few years, glasses-free 3D cinemas opened in Kiev, Leningrad and Astrakhan and continued successfully through to the 1960s.

Success or failure
Glasses-based 3D cinema continues to represent the simpler solution and is able to accommodate more closely packed audiences. Furthermore, the challenges associated with glasses-free approaches become more taxing as screen width is increased. These would have been key factors that eventually caused the Russians to migrate to the glasses-based approach. Our continued use of glasses-based technology is primarily driven by commercial considerations: it represents the most cost-effective approach.

In the case of TV it is important to recognise that 3D works well with only some forms of content. This suggests the need for display technologies that can seamlessly transition between 2D and 3D modes of operation – thereby supporting the infusion of 3D into 2D delivery. In turn this implicitly necessitates a glasses-free approach able to support appropriate freedom in viewing positions. Furthermore, since TV audiences are usually in quite close proximity to the screen, accommodation and convergence issues (which can cause visual strain) cannot be cast to one side.

Such requirements - coupled with the creation of appropriate content - are essential to the success of 3D TV and may perhaps be most readily achieved through the use of techniques that better capitalise on the remarkable capabilities of our sense of sight. In this respect, technologists have sometimes incorrectly assumed that binocular vision is the sole basis for 3D perception.

Certainly the early pioneers of 3D cinema have demonstrated the practicality of delivering autostereoscopic 3D to quite large audiences. Armed with today’s materials, technologies, simulation tools and know-how, we are far better placed to implement viable and highly innovative glasses-free solutions.

Fig. 3: The parallax barrier with a light-diffusing screen. Three exemplar viewing locations shown.

Fig. 1: Stereo frames from ‘Konsert’ – black and white with the occasional inclusion of colour. This stereopair may be easily fused by slightly crossing the eyes.


The 3D feature film ‘Robinson Crusoe’ (Fig. 2) was a sterling success:

‘…It was only when Crusoe in his shipwreck throws a rope to a drowning sailor that you get the first shock. The rope comes hurtling and curling right out of the screen into the darkness…. You duck. We all did. After that we were ready for anything…this luminous effect resembles the unearthly atmosphere of the Insect or Tropical Bird Houses in London Zoo. As if light were liquid. As if the heat were tangible… The depth is used cleverly to increase our sense of Crusoe’s loneliness…When he [Crusoe] goes down with fever, therefore, though little dialogue has been possible, he is a real person to us. Sensation in place of speech has placed us inside his head. We fight every inch of the way with him towards survival and recovery…A strikingly beautiful shot of a ship in full sail close inshore - the effect of stereoscopic photography on the canvas, rope and wood of a sailing ship cannot be described except in terms of goldsmith’s work….’ Macleod [1947]


Barry G Blundell FBCS looks at 3D, referring back to glasses-free 3D Cinema 70 years ago in Russia.

Fig. 4: The radial raster barrier invented by Edmond Noaillon in the 1920s, which was successfully used by Russian pioneers.

Fig. 2: Stereo frames from Robinson Crusoe (1947).

Fig. 5: Constructing the radial raster barrier circa 1940. The barrier weighed around six tons. The opaque regions were groups of copper wire. Given the required accuracy and the lack of lightweight plastics, the development was a remarkable achievement.

Further reading
Blundell, B.G., ‘On Aspects of Glasses-Free 3D Cinema ~70 Years Ago’, www.barrygblundell.com
Funk, W., ‘History of Autostereoscopic Cinema’, Proc. SPIE, Vol. 8288, (2012).

Ivanov, S.P., ‘Russia’s Third Dimensional Movies’, American Cinematographer, pp. 212-213, (May 1941).

Macleod, J., ‘Stereoscopic Film. An Eyewitness Account’, Monthly Film Bulletin, pp. 118-119, (1st October 1947).

Valyus, N.A., Stereoscopy, The Focal Press, (1966).

GLASSES-FREE 3D


utilitarian and moral, and for a lifetime, not just for the first job. It also isn’t able to produce a real professional on its own, as training and development in the work environment can deliver different things. We should see academic education as foundational and serving a wider purpose, rather than trying to shoe-horn it into something it can’t and shouldn’t do - which is provide a shrink-wrapped ready to go worker.’

Dr Bill Mitchell countered the arguments put forward by his colleague David Evans by arguing that academic education should start earlier and last throughout our lives: ‘The purpose of an academic education is to equip a person with the intellectual skills needed to succeed in life. The right kind of computer science academic education would give someone the right intellectual skills to succeed in the IT profession. We should not confuse formal education with academic education, which does not stop the second you leave university. To really fix our current formal education system we have to start with a genuine academic education starting from primary school.’

Professor Kevin Jones continued the argument in support of the motion by contrasting the role of education versus workplace training: ‘Whilst an academic education is absolutely vital to providing the intellectual basis that all skills will be based on over a lifelong IT career, industry itself is better suited to providing training in the complementary skills that have a huge short-term effect on the success of their employees; a rational long-term, forward thinking partnership between an academic education and specific skills development provided in the workplace leads to the best overall result.’

Professor Jones used the magnificent and ancient surroundings of Armourers’ Hall to draw an analogy: academic education provides the lasting foundations and structure that have held the building up over centuries, whereas workplace training provides the short-term and changing skills in interior design and decoration that change quickly with fashion.

Closing the arguments in opposition was Dr David Bowers who expanded upon the consequences of equipping a generation of IT and computing science graduates with short-term employability skills:

‘Academic education can and does meet the skills needs of the IT profession because it equips graduates with essential cognitive skills that are transferable across time: the intellectual skills necessary for ongoing learning in the face of novelty and change. Technical skills and competencies result from workplace experience building on academic education.’

Following the debate there was a brief opportunity for the audience to ask questions and raise points about the arguments for and against the motion. There was much consensus between both the supporting and opposing camps and ultimately the debate came down to an argument about the degree to which academic education, in its own right, can meet the on-going learning needs of any profession.

The outcome
Following the debate and points from the audience a second vote was conducted, and a substantially increased majority of 71 per cent supported the motion, with some of the audience abstaining.

Clearly the arguments presented by David Evans and Professor Kevin Jones had swayed the audience and increased the margin in favour of the motion, notwithstanding the fact that both support and opposition found a great deal of common ground in their arguments.

Post debate discussion
The Oxford Union style debate provided a forum to explore the motion from two perspectives and it raised a number of questions that were subsequently opened to wider discussion on the BCS Learning

On the 12 June the BCS Learning and Development Specialist Group, in partnership with the Worshipful Company of Information Technologists’ Education and Training committee, hosted an Oxford Union style debate on the motion: ‘This house believes that academic education will never meet the skills needs of the IT profession’. Paul Jagger FBCS, Secretary of the BCS Learning and Development Specialist Group, reports on the debate itself and on the post debate discussions.

‘Universities are failing to educate graduates with the skills we need’ - this is the complaint often made by employers of IT graduates. Does the problem start in school with the state of IT and computing science teaching and assessment at GCSE and A Level, or is it reflective of a fundamental misunderstanding among employers as to the role of an academic education?

Whilst certainly a contentious motion, it was also one that is open to much debate and discussion, and a subject that is critical to the future economic success of the UK economy and IT profession.

Those supporting the motion, proposed at the Armourers’ Hall in London in June, were:

Professor Kevin Jones, Head of Computing at City University and David Evans, BCS Director of Membership.

Opposing the motion were: Dr Bill Mitchell, Director of the BCS Academy of Computing and Dr David Bowers, Programme Director for Computing and IT

at the Open University.

Prior to the debate an invited audience of approximately 90 industry leaders, academics and representatives from professional and trade associations were invited to vote on the motion. The initial vote showed that 53 per cent supported the motion.

The debate was opened by Connor Ford, a sixth form computing student at St John’s Academy, Marlborough, who shared some of his experience at the lack of computing and IT skills among teachers.

The debate was chaired by Dr Leslie Spencer, also of St John’s Academy, with guests of honour BCS President, Roger Marshall and WCIT Master, Michael SK Grant presiding.

Following Connor Ford’s outstanding introduction to the motion, and speaking first in support of the motion, David Evans argued from his own experience in commerce and more recently as a senior officer of BCS that: ‘Academic education serves two different purposes, both

EDUCATION VERSUS SKILLS



One theme that came across in numerous discussions before, during and after the debate is the need for academic qualifications to map to the Skills Framework for the Information Age (SFIA) in a manner that allows prospective employers to identify the knowledge and skills developed at undergraduate or postgraduate level, and how these align with a specific skill set and job role in the IT industry.

Several universities have made that leap, notably the Open University and the University of Northampton, although it must be recognised that an undergraduate qualification in itself is not sufficient to build competence in a particular job role – that is the role of employers, who must recognise their responsibility to continue the learning journey with both formal and informal learning in the workplace.

The conventional university route is not the only option that employers should consider in order to develop skilled IT professionals.

The advent of the higher apprenticeship and associated work-based learning that develops vocational skills leading to a foundation degree or other qualification is just as valuable a route to employment, and one that lends itself to alignment with the SFIA model.

When BCS launches the professional qualification at SFIA level 3 or 4, which will act as a stepping stone to Chartered IT Professional status, the IT profession will offer a compelling vocational training path to professional qualification for those who either choose not to follow the university route or want to achieve a milestone on the road to CITP status later in their career.

Conclusions
Whilst the debate motion certainly highlighted the alignment gap between academia and the IT profession (real or perceived), the motion was essentially fallacious in that an academic education does not exist exclusively, or perhaps even primarily, to produce graduates that have all the skills and experience necessary to be competent and productive during their first week of employment, as many employers might desire. Who would want to undergo a heart operation conducted by a surgeon whose only experience has been in the classroom?

The idea that universities will be able to produce ‘production ready’ IT graduates capable of being sent out on billable assignments in their first week of employment is unrealistic and misleading, no matter how desirable that might be to employers.

Employers and employees must jointly shoulder responsibility for developing and sustaining the specific short-term skills that are required to remain competitive in the IT profession.

Academic education must do more to show how its role in developing long-term foundation knowledge and skills maps to the industry’s skills framework (SFIA), whilst recognising that undergraduates are not going to achieve Chartered IT Professional status straight out of university – there is no value in raising expectations to a level that cannot be achieved by academic education alone.

The government, schools, colleges, universities, employers and professional bodies all have a role to play in ensuring that the UK has the skills it needs to compete on the global stage and none of these stakeholders will bridge the gap between the outputs of academia and the needs of the market in isolation.

Recommendations
The post-debate discussion led to some simple yet compelling recommendations that will help to close the gap.

1. Sandwich degree courses are an excellent way to build employability skills into an undergraduate degree whilst forging links with prospective employers.

2. Universities should show a clear alignment between their IT and computing qualification and the Skills Framework for the Information Age.

3. The role of the higher apprenticeship should be promoted as an attractive alternative to the conventional university route, blending workplace learning and academic education that leads to a career in IT.

4. BCS should accelerate the development of the entry level IT professional qualification aimed at SFIA level 3 or 4 as a realistic milestone for both graduates and apprentices to achieve within a few years of qualification.

LEARNING AND DEVELOPMENT

The LinkedIn forum discussion ran until 2 August.

A number of themes arose from both the debate and the subsequent online discussion, including:

• The very different yet complementary roles of academic education and workplace learning in providing the skilled resources the IT profession requires.

• The perceived lack of alignment between academic programmes and the reality of the working world.

• Limited opportunities for placement years or integrated education/workplace learning in higher education.

• The focus on university being the right (or only) path to a career in the IT profession.

• Difficulty in mapping the outcomes of academic education to the skills required by employers of IT skilled graduates.

A selection of comments from the discussion highlight these issues:

‘[Academic education] is not a preparation so much as a constant companion; we draw on what academic education gives us through our whole career, and academic education should be something we turn to for updates, refreshers and life changes throughout our life so shouldn’t end with our first employment contract.’

‘What seems to have gone missing is the importance of work placements and sandwich years as part of degree programmes.’

‘I think a return to the thin sandwich (six months academic, six months work) would be a good idea and an ideal way of integrating work experience, certification etc. and reflection on that experience into an academic qualification.’

5. The BCS Academy should invite IT industry leaders, especially among major employers of IT graduates, to have a voice in the development of IT undergraduate and postgraduate qualifications throughout the UK.

6. The IT profession would benefit from a closer working relationship between BCS and e-Skills UK, especially in respect of developing a joint policy to align academia with the future skills needs of the IT profession – based upon SFIA.

Readers may identify other recommendations or have alternative points of view, and with that in mind you are welcome to follow up using the resources listed below.

Resources

The debate was video recorded and may be viewed online, along with a selection of photographs and a copy of the ‘Book of the Night’, at:
www.bcs.org/content/ConWebDoc/50013

The post-debate discussion will continue online via the BCS Learning and Development Specialist Group’s forum on LinkedIn. Readers of this article are welcome to contribute to that discussion online:
http://www.linkedin.com/groupItem?view=&gid=2430056&type=member&item=258516388&qid=624610c2-9d93-4dac-a74b-7bc6441fdab0&trk=group_most_recent_rich-0-b-ttl&goback=%2Egmr_2430056


June 2013
IT analytics that help with big data
Your business is complex. Big data promises to help manage this complexity so you can make better decisions. But the technology services that run your business are also complex. Many are too complex to manage easily, fuelling more complexity, delays and downtime.

In ‘Turn Big Data Inward With IT Analytics: The Future Of Service Monitoring And Management,’ Forrester predicts this will inevitably get worse. To combat this onslaught, you can no longer just accelerate current practices or rely on human intelligence. You need machines to analyse conditions to invoke the appropriate actions.

These actions themselves can be automated. To perform adaptive, full-service automation, you need IT analytics, a disruption to your existing monitoring and management strategy. This report helps IT infrastructure and operations leaders prepare for IT analytics that turn big data efforts inward to manage the technology services that run your business. Key points (a toy sketch of the monitoring idea follows the list):

• If you can’t manage today’s complexity, you stand no chance managing tomorrow’s: Your company’s data is growing at exponential rates, and without systems in place to manage and organise growth, you will drown in it. Virtualisation, consumerisation and cloud are guaranteed to follow. The answer cannot simply be to accelerate the same processes and methods.

• One size does not fit all: No single-algorithm IT analytics solution will work, and it’s unlikely that any one vendor will be able to offer all of the solutions needed. Forrester expects the emerging and existing management software vendors to consolidate many of these capabilities.

• Work on improving processes before deploying IT analytics tools: IT analytics is an exciting field because it represents breakthrough innovations that can bring substantial value and lasting competitive advantage to the businesses that adopt them and the vendors that create the right solutions. Service management process improvements are a prerequisite to any other effort, including the tools.

• Let the robots do the hard, meaningless work: The value of IT is not to just process data and perform repetitive ‘intellectual grunt work’. All of us in IT need to turn our intelligence and creativity toward making our business more successful. Deciphering the cryptic language of technology data is not the skill set of value in the future.
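As a concrete, if toy, illustration of machines analysing conditions and invoking actions, the Python sketch below watches a latency stream and triggers an automated response when a sample strays far from the recent baseline. The class name, the metric and the remediation are invented for illustration; none of this is taken from the Forrester report.

from collections import deque
from statistics import mean, stdev

class MetricWatcher:
    """Toy service-monitoring loop: flag anomalous metrics and invoke an
    automated action rather than waiting for a human to read dashboards."""

    def __init__(self, window=60, threshold=3.0):
        self.history = deque(maxlen=window)  # recent samples only
        self.threshold = threshold           # z-score that triggers action

    def observe(self, value):
        if len(self.history) >= 10:          # need a baseline first
            mu, sigma = mean(self.history), stdev(self.history)
            if sigma > 0 and abs(value - mu) / sigma > self.threshold:
                self.remediate(value, mu)
        self.history.append(value)

    def remediate(self, value, baseline):
        # Hypothetical automated action; a real system might restart a
        # service, scale out, or raise an incident automatically.
        print(f"anomaly: {value:.1f}ms vs baseline {baseline:.1f}ms")

watcher = MetricWatcher()
for latency_ms in [20, 22, 19, 21, 20, 23, 18, 21, 20, 22, 250]:
    watcher.observe(latency_ms)              # final sample trips the alarm

The point of the sketch is the shape of the loop, not the statistics: the analysis and the action both happen in software, with no human in the path.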

July 2013
Deliver on big data potential
‘Deliver On Big Data Potential With A Hub-And-Spoke Architecture’ says that data management has become as crucial as financial management to leading firms, but businesses grapple with data management platforms that can’t respond fast enough to fickle customers and fluid markets. Big data has emerged as the new industry buzzword promising to help do more with data and break down silos.

Forrester examines this phenomenon and discovers that firms are taking a pragmatic approach that focuses first on wringing value from internal data at a lower cost and implementing a new architecture they call ‘hub-and-spoke’.

This report explores this architecture and provides recommendations for exploiting the big data phenomenon using it. Enterprise architects and other technology strategists need this information to be equipped with options for flexibility and speed as they earn seats at the business strategy table. Key points:

• Firms have a lot of data, want more, and struggle to afford it: Firms are collecting a lot of data, but their data platforms struggle to remain affordable as scale and demands increase. While the pack struggles, leaders have figured out an approach that breaks down silos, enables agile analytics and creates cost-effective data management.

• Hub-and-spoke data architecture meets the need (see the sketch after these key points): Yesterday’s correct data architecture involved centralised warehouses, marts, operational data stores and a lot of ETL. Hub-and-spoke takes a different approach; it features rapid analytics and extreme-scale operations on raw data in an affordable distributed data hub. Firms that get this concept realise all data does not need first-class seating.

• Understand technology patterns to deliver flexible options: Forrester has interviewed firms with practical experience to identify best practices. It found seven common technology patterns that reuse the building blocks of hub-and-spoke.

• Make data management part of your business strategy: Enterprise architects must earn their way to the strategy table, then come prepared with a host of options for speed and flexibility using the hub-and-spoke concept. Most importantly, stop talking about big data and start talking about business outcomes that big data can help deliver.
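To make the hub-and-spoke idea concrete, here is a deliberately small Python sketch. The class and record fields are invented, not Forrester’s: raw records land once in a cheap, schema-light hub, and each ‘spoke’ is just a purpose-built projection over that raw data.

from collections import defaultdict

class DataHub:
    def __init__(self):
        self.raw = []                      # append-only store of raw events

    def ingest(self, record: dict):
        self.raw.append(record)            # no up-front ETL; keep everything

    def spoke(self, transform):
        # A spoke is a purpose-built view derived from the raw hub.
        return transform(self.raw)

hub = DataHub()
hub.ingest({"customer": "acme", "amount": 120, "channel": "web"})
hub.ingest({"customer": "acme", "amount": 80, "channel": "store"})
hub.ingest({"customer": "globex", "amount": 200, "channel": "web"})

def revenue_by_customer(raw):
    totals = defaultdict(int)
    for r in raw:
        totals[r["customer"]] += r["amount"]
    return dict(totals)

print(hub.spoke(revenue_by_customer))      # {'acme': 200, 'globex': 200}

The design choice the sketch illustrates is that new spokes can be added at any time without reshaping the hub, which is what gives the architecture its speed and flexibility.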

The full Forrester reports are available for BCS members to read in the secure area (login required):
www.bcs.org/login

The Institute publishes new reports from analyst Forrester every month. Members have access to the full text. This is an overview of recent reports.

++ANALYTICS FOR BIG DATA ++BIG DATA POTENTIAL


While studying for the MSc Computing, students’ learning will have a direct impact on their employment and career development.

With opportunities to reflect on the activities and processes with which they’re already familiar, they’ll be able to apply their new knowledge and skills in ways that make an immediate difference.

The new programme defines structured study pathways, intended to help students to achieve particular career goals, while also allowing the inclusion of option modules from supporting disciplines. With a simplified structure the new qualifications use fewer assessment points, reducing the internal hurdles to student progression.

The brief to the module teams in producing the new taught content was to get the students to reflect on their particular work experiences. Instead of using case studies, or carefully developed artificial examples they are getting the students to reflect on experiences they have had in the workplace.

The module authors aim to get the students to engage with the external content they have in their own environment and with which they are already familiar – white papers, specifications, real working practices and such like. The intention is to get the students to recognise and apply what they are learning from the academic knowledge and skills in their own context.

The Open University has offered a postgraduate computing programme in various forms for about 20 years. Previously, the MSc consisted of eight taught 15-credit modules, each requiring around 150 hours of study, plus a capstone project.

Flexibility
However, although the students who studied the eight taught modules were getting the flexibility they wanted, their choices did not always build into a coherent qualification. In response to this, over the past couple of years, the OU has taken a look at what it offered and taken input from its marketing team, from external proposition testing, and from existing and prospective students in order to refresh the programme.

What it has come up with is a new qualification structure that still includes some student choice, but now also has more emphasis on guided study paths. This gives students a structured engagement with larger parts of the computing curriculum and so a better introduction to subjects in their chosen field.

The team at the OU decided that the 15-credit, 150-hours-of-study taught modules were too small for the guided path approach and also introduced unnecessary assessment points, so it restructured around 30-credit modules. This means that a student would take four of those to cover the taught component of the MSc and then take a capstone project at the end. The study pathways consist of two core modules, two option modules and a choice of capstone project.

With the focus on the student’s own employment the team have also changed the requirements for the capstone project. Traditionally this final stage of the MSc is a 60 credit project. Students learn research-focussed skills, and the assessment demonstrates the student’s ability to define and investigate a research problem, and then design and implement appropriate solutions.

This style of project has been retained in the new qualifications but there is now an option to study a 30 credit project along with another taught module. This 30 credit project is an opportunity to investigate a topic of the student’s choice in a ‘professional’ employment-related setting. The student will act as an ‘informed investigator’ designing, conducting, analysing and reporting on their chosen project. The additional taught module allows students to bring content from other areas of the OU, such as law, finance or business.

www.openuniversity.co.uk/itnow

With an increased emphasis on employability and the skills needed to excel in the online, connected workplace, the Open University is changing its Masters computing programme. Kevin Waugh, Postgraduate Programme Director for Technologies and Computing at the Open University, explains the drivers for change and goes on to outline what these changes will look like.

MASTER CHANGE

ADVERTISEMENT


OTHER JOURNALS


The July issue of Interacting with Computers is a special commentary on scale derivation, looking at the thorny question of measuring games engagement and any potential effects on players. Brian Runciman MBCS reports.

VIDEO GAME VIOLENCE EFFECTS

BCS Members can get a reduced subscription rate to Interacting with Computers and Formal Aspects of Computing:
www.bcs.org/category/17544

Formal Aspects of Computing
Published by Springer, this is a journal of the BCS Formal Aspects of Computing Science Specialist Group. It presents research and development results at the junction of computing theory and practice. Its principal aim is to promote the growth of computing science, to show its relation to practice and to stimulate applications of apposite formalisms to practical problems.

The scope of the journal includes: well-founded notations for the description of systems; verifiable design methods; elucidation of fundamental computational concepts; approaches to fault-tolerant design; theorem-proving support; state-exploration tools; formal underpinning of widely used notations and methods; approaches to requirements analysis.
www.bcs.org/fac

Interacting with Computers
This is the interdisciplinary journal of human-computer interaction, published in cooperation with OUP, and is an official publication of BCS and the Interaction Specialist Group. Publishing accessible, interdisciplinary research papers and cutting-edge thematic special issues on human-computer interaction, the journal has a strong and growing impact factor, a high ranking and excellent indices.

Topics covered include: HCI and design theory; new research paradigms; interaction process and methodology; user interface, usability and UX design; development tools and techniques; empirical evaluations and assessment strategies; new and emerging technologies; ubiquitous, ambient and mobile interaction; accessibility, user modelling and intelligent systems; organisational and societal issues.
www.bcs.org/category/17104

Formal Aspects of Computing - Applicable Formal Methods

Volume 25, Number 2 2013 contains the following papers:

• ‘An inductive approach to strand spaces’

• ‘HYPE: Hybrid modelling by composition of flows’

• ‘Threaded behaviour protocols’
• ‘The mechanical generation of fault trees for reactive systems via retrenchment I: combinational circuits’

• ‘The mechanical generation of fault trees for reactive systems via retrenchment II: clocked and feedback circuits’


The following article ‘Comments in the Article “Characterising and Measuring User Experiences in Digital Games”’ is a response to a study by the Game Experience Research Lab in Eindhoven University of Technology.

The study aimed to describe the challenge of adequately characterising and measuring experiences with playing digital games; discuss the applicability of usability metrics to user-centred game design; highlight the concepts of flow and immersion as candidates for evaluating game play; and to describe the multi-measure approached employed.

Interacting with Computers - The Interdisciplinary Journal of Human-Computer Interaction

The July 2013 Special Issue: Commentary on Scale Derivation.

• ‘The Tricky Landscape of Developing Rating Scales in HCI’

• ‘Game Engagement/Experience Questionnaire’

• ‘Gauging Engagement in Video Games: Does Game Violence Relate to Player Behaviour? Report on a Study’

• ’The Psychometric Approach to User Satisfaction Measurement’

• ‘Critical Review of “The Intranet Satisfaction Questionnaire”’

• ‘Website Interaction Satisfaction: A Reassessment’

• ‘Creating a Short Usability Metric for User Experience (UMUX) Scale’

The first article introduces a paper by J H Brockmyer on the development of a game engagement questionnaire (GEQ) to ‘reliably and separately measure a hypothetical one-dimensional construct of “deep engagement” in video games.’

Combined with a second study to demonstrate that higher scorers in the GEQ would be more engaged in game play, the authors contend that the approach could assess the potential impact of playing violent video games.

The paper analyses three areas it finds ‘problematic’ in these assertions: the unidimensionality of the series of semi-discrete psychological states that make up the hypothetical ‘deep engagement’ trait; the complexity of the theoretical basis of the construct; and the lack of attribution of video game quality or content on the games’ power to engage players, and thereby impact on their behaviours.

LATEST JOURNAL CONTENTS

5 December 2013 | Central London, UK

Data Analytics
Deriving intelligence and value from big data

Chairman: Professor David Crawford, University of Essex and Director, Telecom Technologies Ltd.

Confirmed speakers include:
■ Francine Bennett, CEO and Co-Founder, Mastodon C
■ Alex Hamilton, Principal & Owner, radiant.law
■ Chris Nott, CTO Big Data Analytics, IBM UK & Ireland

For more information and to register to attend, visit:

www.theiet.org/data-analytics

Big Data ought to mean big business opportunities. Get hands-on experience of cutting edge data analytics technologies, and meet those individuals who are realising the Return on Investment to equip you with the intelligence you need to inform your Big Data strategy.

The IET is the Professional Home for Life® for engineers and technicians, and a trusted source of Essential Engineering Intelligence®. The Institution of Engineering and Technology is registered as a Charity in England and Wales (No. 211014) and Scotland (No. SCO38698).

Membership Discount Available!



MEMBER GROUPS


Across the BCS’s 105 Branches, Specialist Groups and International Sections, about 15 per cent use recording technologies on a regular basis. The reasons are not hard to find: some of the technologies used are expensive and only available in specific locations, while others require high levels of skill for successful use. But new technologies can now dramatically reduce both cost and skill requirements.

In late 2011, the Member Board Best Practice Committee set up the Recording and Broadcasting Working Group (RBWG) with a mission to create good practice guidance for recording, broadcasting and archiving events, to:

• enable the majority of BCS member groups to record and broadcast their events as a normal practice, by recommending technologies and procedures;

• develop activities and materials to be presented at BCS member group conventions;

• review existing BCS facilities for recording, including the Tricaster system, and recommend improvements.

A number of important lessons have been learnt from member groups’ and BCS Publications’ use of audiovisual technologies. For example, with video, in some cases the number of views did not justify the investment whilst, in others, simple methods had been used to create content with significant numbers of views.

Setting about the task
The group purchased and analysed various pieces of equipment and software after a careful selection process, and with a view towards retaining it for longer-term use. Members of the RBWG also contributed the use of their personal laptops, equipment, broadband connections and software licences, plus their time, knowledge and skills.

Some examples: recording solutions
In video cameras the RBWG found that a 3.5mm microphone socket, allowing attachment of a range of external microphones, was particularly valuable, although this is only found on camcorder models from around £400 upwards. A camcorder and microphone combination, sufficient for high quality offline recording, can be obtained for around £500.

Online recording techniques require the use of cameras and microphones attached to a laptop, which is usually placed at the back of the lecture room where a volunteer can manage a number of items of equipment. Webcams are ideally suited to online recording but have limitations such as fixed focal length and wide-angle lenses, which mean they must be placed near to the speaker. The webcam must then be attached to the laptop at the back of the room via a USB extension cable; the RBWG successfully used a 20 metre cable which included a repeater. Certain recording techniques which rely solely on the use of a microphone or a webcam can also give good results at a price of around £50, an order of magnitude less than the cost of a camcorder configuration.

More flexibility is offered by attaching a camcorder to the laptop, but the RBWG found that the technology to enable this is not well developed. A simpler and cheaper solution than HDMI is to use the analogue output from the camcorder with a video grabber device, which converts the analogue signal to a digital input via a USB port. This reduces quality to that of an SD video input but is nevertheless adequate for some purposes. A number of these video grabber devices are available at prices around £20, but they must be chosen with care since there are also non-functional devices on the market.

Post event
Most recordings can be improved by post-event editing to remove unwanted material such as introductions, gaps in recording, fluffs, interruptions and so on. Sometimes, an audio track can also be improved by using noise reduction software. Where multiple tracks have been recorded from a number of cameras and/or microphones, it may be possible to select the best material for inclusion in the final edited recording, although it is usually best to use a single audio track as a basis for synchronisation. It is also possible to add direct copies of the speaker’s slides via a process of screen capture and to synchronise slide changes with the audio.

The RBWG evaluated a range of software for editing audio and video material. However, whilst appropriate editing can produce a polished and attractive final recording, it can also be a very time consuming process. Experience has suggested that a ratio of five or ten hours editing for one hour of recorded material may be quite common.

Broadcasting solutions
The web is the major delivery system for member group generated content, and the action of posting a recording file on a web page provides an effective, on-demand, time-shifted broadcasting solution. The most popular video hosting service is, of course, YouTube.

For a live broadcast, the organiser must adopt one of the online recording solutions described in the report and use software to forward the recorded data stream to a broadcast server (a sketch of such a pipeline follows the social media discussion below). The RBWG evaluated the use of Adobe Flash Live Media Encoder, a free upload and recording utility, and found it to be sophisticated, semi-professional software which could be used successfully with a server running Adobe Flash Live Media Server.

Online meeting services
Web-based online meeting services have been available for the past decade and have been widely used. Many of these services are modelled on earlier bespoke video conferencing services, but use a web browser or plugin as the presentation software and the internet for communication. Most are offered as cloud-based services rather than software.

Although support for committee meetings was not part of the RBWG’s terms of reference, they decided to investigate online meeting services


because they appeared to offer convenient end-to-end solutions which might enable remote users to participate in typical BCS events. The RBWG evaluated a number of solutions, notably Google+, which is particularly valuable for its unique capability to host online meetings and link these to other forms of content. It seems clear that it could offer benefits to any member group as a support mechanism for committee meetings, with few barriers to its adoption. Any group which adopts this practice would then find it easy to extend its usage to lecture style events, and then to recording and broadcasting via YouTube.

It’s worth noting that YouTube, SlideShare, and Google+ are themselves social media with similar functionality for building personal networks, posting messages, and for hosting content, enabling searching, endorsement, recommendation, and so on. They are also well adapted for video and audio content, in contrast with other social media offerings which emphasise documents and still photographs. LinkedIn has the capability to host SlideShare presentations.

Social media provides many opportunities for supporting member group activities and potentially developing new ways of working, but there is a need to develop effective ways of exploiting them.
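As promised above, here is a hedged sketch of the capture-encode-forward broadcasting pipeline. The RBWG used Adobe Flash Live Media Encoder; this sketch instead assumes a Linux laptop with the free ffmpeg tool installed, a webcam on /dev/video0, the default ALSA microphone and a placeholder RTMP broadcast server address.

import subprocess

# Capture from webcam and microphone, encode for live streaming, and
# forward the stream to a broadcast server over RTMP. The server URL is
# a placeholder, not a real BCS endpoint.
cmd = [
    "ffmpeg",
    "-f", "v4l2", "-i", "/dev/video0",          # webcam capture (Linux)
    "-f", "alsa", "-i", "default",              # microphone capture
    "-c:v", "libx264", "-preset", "veryfast",   # live-friendly video encode
    "-c:a", "aac",                              # audio encode
    "-f", "flv", "rtmp://example.org/live/bcs-event",  # forward to server
]
subprocess.run(cmd, check=True)

The same pipeline shape applies whatever the tooling: capture devices feed an encoder, and the encoder pushes a single stream to the server that viewers connect to.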

Hygiene factors
Some people have expressed concerns over privacy, security and intellectual property rights in relation to recording and broadcasting, and especially in relation to electronic meetings and public broadcasts. It’s therefore worth outlining the main factors that member groups should consider when setting up an event.

Intellectual property rights
A public meeting, and any content delivered during it, becomes part of the public domain. The presenter should not divulge any proprietary, confidential or secure information nor seek to gain any such information from others present. In addition, the presenter should not make statements defamatory or damaging to BCS.

Further information
The full RBWG report, with details of the software and hardware evaluated, is in the Members’ Secure Area.

Member groups can also use the dedicated YouTube channel at: www.youtube.com/user/BCSgroupsandbranches

In the 21st century, what role can internet-based communication and social media technologies play in our activities as a professional institute? Can they support virtual meetings, or make the content generated by specialist groups and branches accessible to a greater range of members? Dr. Geoff Sharman FBCS CITP chaired the Recording and Broadcasting Working Group, which was tasked with improving audiovisual approaches for member groups.

Assuming this is the case, the organiser should also ensure that the speaker owns the copyright of any material he or she presents and is willing to release it, in order to protect BCS against subsequent charges of copyright infringement.

Privacy
Many member group events are open to non-members and are therefore public meetings. In the course of such a meeting, personal information about the speaker or other attendees (such as their appearance, experience, attitudes, ethnic background, etc.) may become known. When a meeting is recorded or broadcast this information may become widely known, potentially infringing the data protection rights of these attendees. In the course of recording an event, members of the audience may also be recorded, for example when asking questions. Therefore, to protect their privacy, the organiser should inform attendees that they may be recorded and what action they should take if they wish to avoid this, for example to leave the meeting or to sit in a reserved area of the room.

Security
The technologies and methods described here are unsuitable for communication which will include proprietary business or personal information, or information relating to national defence and security.

Any member group that needs to communicate this kind of information should consider carefully how it is managed and use appropriate technologies.

Recommendations

Each member group should be encouraged to develop a recording and broadcasting strategy, describing what events it plans to record and what level of investment is appropriate to enable this.

Each member group should assess which technique, or combination of techniques, best suits its style of operation and is the most effective for each event.

Each member group should be encouraged to appoint a committee member as Recording Officer, to act as a focal point for recording activities and the development of skills.

Best Practice Committee should establish an initial inventory of equipment at Southampton Street.

Best Practice Committee should document techniques and scenarios for using recording and broadcasting equipment.

Equipment, software and techniques should be reviewed after two years.

Member Board should work with BCS HQ to develop cooperative working practices so that the activities of volunteer members are supported by appropriate staff functions.

Best Practice Committee should set up the appropriate social media accounts and establish procedures for promoting/migrating material from member group repositories to the central repository and tagging it with appropriate terms.

Member Board should work with the BCS Publications Department to identify the best recordings produced by member groups and showcase them in BCS publications.

Best Practice Committee has initiated a study of social media and should aim to identify the best ways of exploiting their synergy with recording and broadcasting techniques.

COMMUNICATION AND ENGAGEMENT


MEMBER OPINION


While Scrum and Kanban (and Waterfall before them) may be good management practices, they are not sufficient by themselves to result in agile development.

Software development has never been easy; the computer industry is only about 50 years old. Until recently, most applications were small and could be readily broken into smaller segments for incremental delivery.

The software industry, even now, has few pre-built and tested components (apart from the dreaded DLL files) that can be slotted together and used over and over again, like spare parts in the automobile industry.

Development strategy
Can clever management systems like Scrum or Kanban come to the rescue? I don’t play rugby, so I wouldn’t know much about Scrum, and Kanban is just a clever system of billboards used in JIT (just in time) to manage inventory – nothing to do directly with software design and agile development.

Some 20 years ago, I was asked to advise on the development of robust, low-cost solutions for the SME sector. Agile development at the time was not the buzzword it is now; however, we soon ran into difficulties.

We discovered that the software development process itself leaves much to be desired. Most programming languages and software tools available at the time, including Visual Basic and SQL, expected you to ‘mix code with data’. You could not read even a simple random access file without using the TYPE statement in VB6. SQL statements like CREATE and UPDATE required column names to follow the statement.

As these tools had many good aspects, we decided to use their best bits and develop our own variations, replacing the offending statements, to remain faithful to our chosen principle: ‘never mix code with data’. We discovered that record locking and printing solutions, too, were not satisfactory and developed our own versions.

The two guiding principles
Firstly, we agreed to ‘keep it simple’ whenever possible. If necessary, split the underlying logic into several small steps (rather than a single brilliant one-liner) that most programmers could understand even years afterwards. Secondly, ‘never mix code with data’: only use truly re-useable code that we could use over and over again.

Many programmers see nothing wrong in creating tables using SQL’s CREATE statement, followed by column properties. This, of course, means that each table has a dedicated piece of code, with embedded ‘data’ about columns within it. Similarly, languages like Visual Basic use TYPE and STRUCTURE statements, which require column properties to be embedded within the code. When I got involved in software development, I forbade my programmers to use SQL’s CREATE and VB6’s TYPE statements!

Keep data away from code at all cost
Therefore, all tables, forms, grid display, menus, relational links and reports were generated by reading ‘data’ held in external text files, leaving the underlying code unchanged. As far as we were concerned, any development system that required ‘data’ (e.g. column names, screen or report layouts) to be embedded in the ‘code’ was a thoroughly bad idea.
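A minimal Python sketch of this principle follows. The file format and column names are invented for illustration and are not the actual Progen4GL parameter format: the table definition lives in an external text file, and one generic routine turns any such file into a CREATE statement at run time, so the engine code never names a column.

import sqlite3

# Hypothetical external 'parameter' file describing a customer table;
# in practice this would be loaded from disk rather than embedded here.
CULIST_DEF = """\
CULIST
CUCOD TEXT
CUNAME TEXT
CUCP REAL
CUTB REAL
CUTO REAL
"""

def create_table_sql(definition: str) -> str:
    # Generic, re-useable code: first line names the table, the rest
    # are column definitions. No table or column is hard-coded.
    lines = definition.strip().splitlines()
    table, columns = lines[0], lines[1:]
    return f"CREATE TABLE {table} ({', '.join(columns)})"

conn = sqlite3.connect(":memory:")
conn.execute(create_table_sql(CULIST_DEF))

Changing the schema now means editing a text file, not recompiling code, which is the agility the article is claiming.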

The programming model we eventually developed does indeed permit rapid changes at a modest cost as required by ‘agile’ principles. The resulting solutions are robust and almost self-documenting, as their structural properties can be read (and modified) by referring to the corresponding ‘parameter’ files.

As the same engine could now be used on all sites (because it has no embedded data), few bugs remain for long, resulting in robust solutions requiring little subsequent support.

Quick prototypes
Progen4GL was born, which we later renamed the Progen4GL/Agile System, and it spread to many industries. One of the unforeseen advantages was our ability to produce prototype systems rapidly at low cost. As customers often didn’t really know what they wanted, creating quick prototypes became a marketing tool, a vehicle for incremental delivery and an invaluable precursor to the final system.

Is it cheap being agile?
Fig. 1 shows the conventional development model. It usually results in expensive solutions, requiring extensive de-bugging – with or without Scrum or Kanban.


Ravi Raizada FBCS was an Assistant Professor at Penn State University and Visiting Professor at Purdue University. He is currently Managing Director of Limrose Group (UK).

Kanban is just a clever system of billboards used in JIT (just in time) to manage inventory.

Ravi Raizada FBCS responds to the last issue’s agile special with another view, asking whether Scrum and Kanban are enough to make you agile...

ARE SCRUM AND KANBAN ENOUGH?

For example, development of a full-function library management system in the UK would leave little change from £200,000, and an ERP/MRP system could cost considerably more. Fig. 2 shows how the Progen4GL/Agile development process works. It follows the principle of ‘never mix code with data’ at all times.

The ‘data’ element that forms the interface between the Progen4GL engine and the application can be edited at any time at minimal cost. Fig. 3 shows a small text file used to UPDATE the Customer (CULIST) and Stock (STOCK) tables from the Sales Invoice (SINV), and the internally generated SQL-type statement. The text file shown is part of the data element and is not included anywhere within the re-useable Progen4GL engine.

Re-useable code doesn’t seem very revolutionary until you realise that it can not only save thousands of pounds in development costs, but can also lead us to full-fledged agile development. No Scrum or Kanban used here.

Where next?
Smart management techniques like Scrum and Kanban have a role to play in developing good software; however, major software companies like Microsoft also have a responsibility. They should concentrate on developing more suitable language structures to enable agile development. This can often be achieved by making subtle changes without needing a drastic re-write.

Fig. 1 Bug-ridden conventional programming systems: separate accounts, ERP/MRP (bill of material) and library programmes, each built directly in tools such as Visual Basic, SQL Server, Oracle, C#, Java or PHP for manufacturing, accounts and library systems, with development costs of £200k+ to £500k+ per system, and suffering bugs and poor documentation.

Fig. 3 UPDATE - customer and stock tables from invoicing

Progen4GL UPDATE file:

CULIST, CUCOD, SICUS 3
CUCP + SITD
CUTB + SITD
CUTO + SITGN

STOCK, STCOD, SIPROD 7
STQTY - SIQTY
STWRK - SIQTY
STEM - SIQTY
STVAL - SIRSV
STTOT + SIDLV
STTO1 + SIDLV
STLU = SIDAT

Internally generated SQL:

UPDATE CULIST where CUCOD = SICUS
Set CUCP = CUCP + SITD
Set CUTB = CUTB + SITD
Set CUTO = CUTO + SITGN

UPDATE STOCK where STCOD = SIPROD
Set STQTY = STQTY - SIQTY
Set STWRK = STWRK - SIQTY
Set STEM = STEM - SIQTY
Set STVAL = STVAL - STRSV
Set STTOT = STTOT + SIDLV
Set STTO1 = STTO1 + SIDLV
Set STLU = SIDAT

Fig. 2 Progen4GL/Agile programming method: a single re-useable Progen4GL/Agile engine (Visual Basic/C# code with RDBMS/SQL, a report writer, a search engine and a forms generator, able to read/write Excel, CSV, Access and MySQL) sits on the ‘code only’ side; small (<10K) text files holding client tables, business rules and links sit on the ‘data only’ side, yielding a self-documenting system that drives library, care home and manufacturing applications.


INTERVIEW


What do you think are the biggest challenges for CIOs in the near future?
Well I think right now the explosion of mobile computing is at the centre for most IT departments and CIOs around the business in corporations. It’s an interesting technical challenge, but I think, putting that to one side, it’s also a challenge to the way people are using technology in their business lives; I think it’s that part that is the main challenge. Yes, there are lots of things to do with security and the cost of ownership and the variation of different technologies that are available to us, but for me the real challenge is around the way in which we are using that technology in multiple different ways - the creative and innovative - that we probably wouldn’t have thought about even a couple of years ago. For me, getting in step with the business and understanding the opportunities there is the real challenge for CIOs and IT departments in general.

What’s your take on CIOs getting the respect of their C-level peers?
You’ve got to get the basics right for sure. It’s a little bit like taking things for granted: when you walk into a room and it’s dark you press the button for the light to come on, you expect the light to come on. I think there’s a lot about IT support and IT services that you need to be able to take for granted. So if you’re in an organisation that is struggling to get basic devices and basic simple services like email and office products working, then you’re not really going to be very influential at senior level, or any other level in the organisation frankly.

So getting the basics right - getting what some people call the ‘hygiene factors’ nice and clean - I think that has to be taken as read. Once you’ve got that, I think the important aspect is finding out how the business ticks. Focusing on information, rather than information technology, can support the business. It’s often through integration, connecting up the various divisions and functions of a company; it is about being a bit of a catalyst for some innovation and providing data in a way that can be used in a competitive way.

I think that getting in tune with business leaders on those sorts of things, having got the hygiene factors all sorted out, is the way to influence not just senior people but actually all colleagues through the organisation.

What about communication between techy and business people?
The first challenge is just the multinational languages used; for many people I’ve worked with, English is not their first language. It’s our business language, but it’s not our first language, so we need to take that into consideration. We need to take into consideration people’s cultural backgrounds and cultural identities as well, and saying the right thing to get our point across and to influence somebody is as much about getting it in literally the right language and the right cultural position.

Probably where I need to concentrate quite a lot on the sort of language of influence is around describing how the fruits of technology can be used; so rather than talking about technology itself, it’s what opportunities does it open up? As I mentioned earlier, what sort of catalyst can this create? Finding the language to do that I think is quite important. Let’s face it, most of us in IT are not renowned for our communication skills or our engagement skills; we often find it difficult to build partnerships, but it is those elements that we need to concentrate on going forwards.

There’s been quite a lot of discussion about whether the CIO title is very helpful; some people think it should be chief digital officer or chief innovation officer. What’s your view on that?
Well I personally believe if we’re spending a lot of time worrying about our job title and what we’re called then we’re probably barking up the wrong tree frankly. I think we should be concentrating on the value that we bring to the business and to our colleagues generally rather than what we’re called.

I think CIO is a well-recognised, understood title. Some people say we should change the ‘I’ to integration or something like that but, personally, I don’t give it much thought at all, and I don’t think we should do; I’m quite happy to use the title that’s well known and widely understood.

I’d prefer, and I’d encourage people, to think about what their role enables them to do in terms of adding value and contribution to a business rather than what they’re called.

What should someone who wants to be a CIO be doing?
Well I think being a CIO is quite a long career to look at - it’s not something you can achieve very, very quickly. We do of course have some quite young CIOs, but it is something which you need to put time and effort into, so you need to plan your career.

The sorts of things I would say from my personal perspective would be having a good grounding in the various aspects of IT, so you should plan your career to include different aspects. Personally I’m a great believer in understanding the infrastructure of IT, but also having insight into the various business applications that are used in, for example, purchasing or engineering or supply chain or HR.

I think getting as many of those under your belt as possible is very useful.


Looking at the future of the CIO role, I would certainly, early on in my career, start to understand how the business works, and an MBA qualification could be incredibly valuable as part of your overall career development. Understanding how business functions work together, how the business that you work in operates, how its marketplace is driven - that sort of insight, I think, is very, very useful, and certainly skills like product management and finance management can all be useful as well.

So it is a pretty mixed role going forwards, quite a broad role, and I would certainly steer clear of trying to become a future CIO purely on technical areas alone, as I don’t think that’s likely to succeed in the future.

What other business challenges do you foresee?
Well I think the IT industry has a certain responsibility to the planet, actually. We depend upon huge data-centres, tens of megawatts if not hundreds of megawatts of data-centres - many of them around the world - and as an industry we generate about the same amount of carbon dioxide as the airline industry, which frankly we need to think about very carefully and find innovative ways of reducing, which is about being low powered. Obviously our mobile devices are very low powered; we need to think about our back office as well, getting our data-centres to be much cheaper to cool in terms of the power that they drain.

More CIO interviews and resources are available via the BCS Enterprise Hub:
enterprise.bcs.org
and the Digital Leaders section:
www.bcs.org/digitalleaders

Brian Runciman MBCS spoke to Richard Harris, former CIO at ARM UK and Rolls Royce.


So for me I think, as an industry, we really need to start being very creative and very innovative about that and that would bring great rewards for society at large and people will thank us for that.

Do you think it’s the CIO’s role to actually raise the profile of these more ethical issues?
I think so, yes. We certainly have a role to play at the executive table; it’s not always uppermost on everybody’s mind as a business divisional leader. It might be fairly far down your list depending what sort of business you’re in, but I think we should encourage it.

It’s the same amount of polluting gas that we’re putting out there as the airline industry and most people are aware about how much travel they do and how much they go on an aeroplane and that sort of thing. So somehow, when we’re using computer devices, connecting to the web, the whole internet of things out there, that needs to be low powered as well.

That needs to be green and sustainable and we as CIOs and IT departments can really drive creativity in this area.

‘I would steer clear of trying to become a future CIO purely on technical areas, I don’t think that’s likely to succeed in the future.’

‘I think we should be concentrating on the value that we bring to the business and to our colleagues generally rather than what we’re called.’

CIO vs. CDO


Paul Coldwell, one of the finest printmakers working in the digital medium in Britain today, describes his way of working as ‘a layered practice’. Since 1995, Paul has been concerned with the question of how to integrate the computer without relinquishing the knowledge he has gained from twenty-plus years of traditional practice.

Our image this month is from a series that combines the digital with analogue techniques - here, lino printing is employed over an inkjet print, thus creating, in the artist’s own words, a truly ‘personalised surface’.

This series was commissioned last year by the Awagami factory, Japanese producers of specialist papers (including for inkjet) and exhibited at the Bumpodo Gallery, Tokyo.

Still Life with Keys, and other artwork made by the artist since 1993, was on show at the Stephen Lawrence Gallery, University of Greenwich, until July 2013.

Paul writes, ‘These prints continue my preoccupation with simple objects and the way in which they can open up associations for the viewer. By linking them, I want to suggest the interconnectivity of things and through this suggest the complex set of relationships that we each have with the world.’

Although no people are present, the prints have a figurativeness that reminds us, through their artfully constructed objects, of human presence. Like the prints of the Italian master Giorgio Morandi, an artist he has a particular affection for and the subject of his 2006 exhibition, Giorgio Morandi: Influences on British Art, Paul has built up a cache of everyday objects, and in place of Morandi’s bottles we see coats, briefcases, a chair, an envelope, flowers, books and so on.

Such various everyday household objects create a sense of memory and of time passing. The viewer feels compelled to puzzle out the relationships between them and no doubt each of us has a different answer.

In the view of David Waterworth, Curator at the Stephen Lawrence Gallery, this ‘create[s] a layered and differentiated surface within which histories can be drawn out.’

Over the past decade or so, the computer has revolutionised the art of printmaking. A print no longer needs to be a piece of paper physically pressed against a template.

The difference with working digitally is that the image is printed in a single sweep through a printer, whereas in traditional printmaking the image is only complete at the end, after all the colours have been printed. As Paul says, ‘With photography there is always the air between the lens and the object, whereas in the scan there is nothing.’

Additionally, the use of a program affords the ability to layer image upon image. What we are left with is a surface that may appear to be flat but one that holds the key to a depth of layers that remain hidden in the computer. As the artist points out, ‘Photoshop is predicated on the concept of layering and the final inkjet print is a flattened image produced from a single printing.’

Engaging with the surface
Paul has been considering how to work within this flattened space, how to work across the single surface, and even how to disrupt this unified surface. Such questions of how to engage meaningfully with the surface within a digital environment, and the implications of working with a physical surface compared to the digital, were researched in the FADE Project 2007-9 at the University of the Arts London, led by Paul as Professor of Fine Art, Chelsea.

Various early problems with digital printmaking of a decade or so ago have been overcome. These included fugitive dye-based inks and a very limited range of substrates. Now inkjet prints use pigment-based light-fast inks and there is an almost infinite range of substrates to choose from, including various papers, fabrics, even plastic. The scale can range from the size of a postage stamp through to covering an entire building.

Paul’s prints are composites of imagery, collaged and layered together in grey scale, sizes adjusted on to a grid and manipulated on the computer before being taken into colour. The keys are relief-printed, ‘In order to reassert the surface.’ This is a reference to the traditional art of Japanese woodcuts and contrasts the virtual layering of inkjet with the physical layering of ink through relief.

A notable aspect of Paul’s work is the pattern of small dots covering much of the surface, as seen in the image above. Paul makes use of the half-tone dot - a reference, of course, to old-fashioned printing press photography and the reproducible image. (Dots are also a feature of Pop Art, most notably Roy Lichtenstein’s work, which the artist admires.)

However, in Paul’s case the use of this as a pictorial device gives him ultimate flexibility as each individual dot can be modified in the computer if required, for example enlarged, the colour altered or removed entirely.
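For readers who want to experiment, a toy Python/Pillow sketch of the half-tone dot as a computable device follows. It is emphatically not Paul Coldwell’s workflow, and the input filename is hypothetical: it simply samples an image on a grid and draws one dot per cell, sized by local darkness, so any individual dot could later be enlarged, recoloured or removed.

from PIL import Image, ImageDraw

src = Image.open("still_life.jpg").convert("L")   # hypothetical input file
out = Image.new("RGB", src.size, "white")
draw = ImageDraw.Draw(out)
cell = 8                                          # grid pitch in pixels

for y in range(0, src.height, cell):
    for x in range(0, src.width, cell):
        darkness = 1 - src.getpixel((x, y)) / 255
        r = darkness * cell / 2                   # dot radius from tone
        cx, cy = x + cell / 2, y + cell / 2
        draw.ellipse([cx - r, cy - r, cx + r, cy + r], fill="black")

out.save("still_life_halftone.png")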

Paul says, ‘I intentionally wanted to make these images visually demanding on the viewer, to engage them in the act of looking. The prints operate quite differently when viewed either close up or from a distance, so hopefully they prevent the viewer from forming a fixed reading.’


COMPUTER ARTS

Credit: Paul Coldwell, Still Life with Keys, Inkjet + laser cut relief, 2012. Image size 47x64 cms, paper size 59x84 cms.

Copyright the artist. Reproduced with permission.

A LAYERED PRACTICE


Catherine Mason MBCS is the author of A Computer in the Art Room: the origins of British computer arts 1950-80.
For more information on the computer arts please see the Computer Arts Specialist Group website: www.computer-arts-society.com

Paul Coldwell’s prints are subtle, yet engaging, and reward deeper contemplation. If you didn’t get to the show, the useful, well-illustrated catalogue is particularly recommended and includes insightful essays by the curator Ben Thomas and Christian Rümelin (Keeper of Prints & Drawings of the Cabinet d’arts Graphiques in Geneva).


This special issue was guest edited by John Lloyd, School of Computing Science, Newcastle University, and was in Volume 56 Issue 6 June 2013.

The papers in this special issue are dedicated to Brian Randell, Newcastle University, and his lifetime’s, and ongoing, work on dependable software.

Following the two seminal NATO Software Engineering Conferences of 1968 and 1969, in which Brian Randell was a participant and co-editor of the conference proceedings, a popular goal then, as now, was to prove the correctness of programs, an approach notably pursued by Hoare, Dijkstra and colleagues.

In contrast, Brian advocated an alternative and complementary approach: the development of software that could tolerate faults arising from a variety of sources, including design, implementation, hardware and ultimately user error.

The papers in this volume address a range of issues in the design and implementation of dependable software systems.

This includes the following.

1. The design of efficient error detection mechanisms for transient data errors based on a data-mining approach.

2. Situations where faults arise as a result of malicious intrusion, a topic introduced in the EU MAFTIA project that Brian Randell led, showing that the introduction of a single, trusted component can reduce the degree of replication required for fault recovery.

3. A low latency technique for providing fault tolerance in distributed applications in a local area network that can be used in a common industrial case where there is a primary system and only one backup replica.

4. Approaches to the semantics of non-deterministic expression evaluation, highlighting their commonalities, differences and the relationships between them, and a new notation used to reason about interference.
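A minimal sketch of the primary/one-backup pattern from point 3 (illustrative only, not the paper’s protocol): the primary replicates each update synchronously to its single backup, which promotes itself once heartbeats stop arriving.

```python
# Illustrative sketch, not the paper's protocol: one primary, one
# backup, synchronous replication, heartbeat-based failover.
import time

class Backup:
    TIMEOUT = 2.0                        # seconds of silence before takeover

    def __init__(self):
        self.state = {}
        self.last_heard = time.time()    # last message from the primary
        self.active = False              # becomes True after failover

    def replicate(self, key, value):     # called synchronously by the primary
        self.state[key] = value
        self.last_heard = time.time()

    def heartbeat(self):
        self.last_heard = time.time()

    def check_failover(self):
        if time.time() - self.last_heard > self.TIMEOUT:
            self.active = True           # promote: start serving requests
        return self.active

class Primary:
    def __init__(self, backup):
        self.state, self.backup = {}, backup

    def write(self, key, value):
        self.state[key] = value            # commit locally...
        self.backup.replicate(key, value)  # ...then to the single backup,
                                           # which is enough for recovery
```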

Efficient Adaptive and Dynamic Mesh Refinement Based on a Non-recursive Strategy
Tessellation through triangular meshes involves a subdivision of surfaces for manipulation and rendering of objects in graphics. It is used to model smooth surfaces of arbitrary topology with complex details in a wide range of applications, such as 3D games, movie production and graphical modelling.

This work uses a locally adaptive, patch-wise approach. The key aspect of the approach is to organise and represent tessellation triangle vertices in strips.

The data management employed permits the reconstruction of the generated triangles on the fly by connecting the inserted vertices. The simplicity of the proposal makes it suitable for a hardware implementation in a graphics card. In this work, the geometry shader of current GPUs generates primitives dynamically.
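One way to picture the strip idea (a sketch under assumed details, not the authors’ implementation): two adjacent rows of inserted vertices are zipped into triangles on the fly, exactly as a GPU consumes a triangle strip, so no explicit triangle list is ever stored.

```python
# Sketch: rebuild triangles on the fly from two adjacent vertex strips,
# the way a GPU consumes a triangle strip. Details are assumed.
def triangles_from_strips(upper, lower):
    """upper/lower: vertex indices of two adjacent strips of equal length."""
    strip = [v for pair in zip(upper, lower) for v in pair]
    for i in range(len(strip) - 2):
        a, b, c = strip[i], strip[i + 1], strip[i + 2]
        # alternate the winding so every triangle faces the same way
        yield (a, b, c) if i % 2 == 0 else (b, a, c)

# Two strips of three vertices yield four triangles:
print(list(triangles_from_strips([0, 1, 2], [3, 4, 5])))
# [(0, 3, 1), (1, 3, 4), (1, 4, 2), (2, 4, 5)]
```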

Terrain models are used to exemplify the quality of results obtained, using images of mountain scenes including Mount St. Helens in Washington.

This article appeared in Volume 56 Issue 7 July 2013. The authors are with the University of Santiago de Compostela, Spain; the University of A Coruña, Spain; and Lund University, Sweden.

Novel Compression Algorithm Based on Sparse Sampling of 3D Laser Range Scans
Three-dimensional models of environments are commonly employed in areas such as robotics, art and architecture, facility management, water management, environmental/industrial/urban planning and documentation.

A 3D model is typically composed of a large number of measurements. When 3D models of environments need to be transmitted or stored, they should be compressed efficiently to use the capacity of the communication channel or the storage medium effectively.

In this work a novel compression technique based on compressive sampling is applied to sparse representations of 3D laser range measurements: at the data scan stage a far smaller signal is captured, using sparse representations of laser range measurement sequences.
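The core compressive-sampling idea can be shown in miniature (illustrative only, not the paper’s algorithm; all sizes below are invented): capture far fewer random measurements than samples, then recover the signal from its sparsity, here via orthogonal matching pursuit.

```python
# Toy compressive-sampling round trip: y = Phi @ x has 4x fewer entries
# than x, and the sparse x is recovered by orthogonal matching pursuit.
import numpy as np

def omp(Phi, y, sparsity):
    """Recover a `sparsity`-sparse x from measurements y = Phi @ x."""
    residual, support = y.copy(), []
    for _ in range(sparsity):
        # greedily pick the column best correlated with the residual
        support.append(int(np.argmax(np.abs(Phi.T @ residual))))
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x_hat = np.zeros(Phi.shape[1])
    x_hat[support] = coef
    return x_hat

rng = np.random.default_rng(0)
x = np.zeros(256)
x[rng.choice(256, size=8, replace=False)] = rng.normal(size=8)
Phi = rng.normal(size=(64, 256)) / np.sqrt(64)   # 64 measurements, 256 samples
print(np.allclose(omp(Phi, Phi @ x, 8), x, atol=1e-6))   # usually True
```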

This work is of relevance to sensor systems, robot sensing systems and intelligent sensors deployed in 3D mapping and 3D modelling. 3D laser range finder data from Schloß Dagstuhl and from Bremen city centre are used in this work.

This article appeared in Volume 56 Issue 7 July 2013. The authors are from Bilkent University, Ankara, Turkey.

RE-UML: A Component-Based System Requirements Analysis Language
An extension to the Unified Modelling Language (UML) named RE-UML is presented, with formal semantics utilising the Prolog programming language, to support component-based system (CBS) requirements analysis. RE-UML extends UML sequence diagrams with a satisfaction interaction frame and mapping operators to model matching criteria between stakeholder demands and candidate component features.

Furthermore, associations between requirements and candidate components are introduced to model risk assessment and conflict resolution during CBS requirements analysis. To demonstrate the use of RE-UML, its application is presented to the software system of Seven-Eleven Japan, relating to stores and franchises.
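The matching step has a simple flavour that can be sketched in code (this is not RE-UML, and every name below is invented): score each candidate component by the stakeholder demands its features satisfy, and flag unmet demands for the risk-assessment and conflict-resolution step.

```python
# Toy rendering of the matching idea; all names are invented.
def match(demands, candidates):
    report = {}
    for name, features in candidates.items():
        satisfied = demands & features
        report[name] = {"score": len(satisfied) / len(demands),
                        "unmet": demands - features}   # feeds risk analysis
    return report

demands = {"barcode_scanning", "stock_sync", "offline_mode"}
candidates = {
    "ComponentA": {"barcode_scanning", "stock_sync"},
    "ComponentB": {"barcode_scanning", "offline_mode", "reporting"},
}
for name, result in match(demands, candidates).items():
    print(name, result)   # e.g. ComponentA scores 0.67, missing offline_mode
```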

This article appeared in Volume 56 Issue 7 July 2013. The authors are from King Fahd University of Petroleum and Minerals, Dhahran, Saudi Arabia, and La Trobe University, Victoria, Australia.

The Computer Journal
The Computer Journal has published advances in the field of computer science for over 55 years. Members can get heavily discounted subscription rates: www.bcs.org/cjournal/subscribe

doi:10.1093/itnow/bwt060 © 2013 The British Computer Society

www.bcs.org/bookshop – also available from all good booksellers

SMART IT DECISIONS

£29.99 (£23.99 with BCS member discount). ISBN 978-1-78017-154-8

Practical guidance for implementing and maintaining an effective and robust governance framework.

‘An essential reference for anyone in governance or executive management.’ Rob England, The IT Skeptic

Includes links to useful tools, templates and other resources.


Unblinded by science
Our technology lawyers are passionate about their chosen specialisation. They understand the challenges of developing, marketing and exploiting technology.

Help with IP, R&D, e-commerce or software licensing? Call 01732 224000 or visit vertexlaw.co.uk and talk to someone that’s on your wavelength.

Testing Solutions Group Ltd (TSG), 117-119 Houndsditch, London, EC3A 7BT. T: +44 207 469 1500 E: [email protected] W: www.testing-solutions.com

For us, it’s about business assurance and outcomes, not just testing.

We are Testing Solutions Group.

Our Services

➢ Strategic Consultancy, Risk
➢ Quality Assurance
➢ Agile Consulting
➢ Test Management
➢ Vendor Management
➢ Functional Testing
➢ Non-Functional Testing
➢ Test Automation
➢ ISO 29119 Consulting
➢ Certified Training
➢ Agile Training

Specializing in the Banking, Finance and Legal sectors

We specialise in assuring successful outcomes for business-critical programmes, providing businesses with confidence in their testing programmes.

Talk to us and we’ll help put together the right level of assurance you need to mitigate risk, gain confidence in what you do and generate a return on your investment.

Download the Linklaters Managed Services Challenge case study.



ADVERTISEMENT

A LITTLE MORE CONVERSATION

ShoreTel is an acknowledged challenger brand in the UK that is now looking to grow its market share and revenue streams in continental Europe, particularly France and Germany.

ShoreTel is a US$250 million company conceived in California back in 1996 and, since its first installation in 1998, the company has grown to become a worldwide leader in the IP-based business communication solutions field.

It fully expects to become a key player not only in the telephony market in the US but also in APAC and EMEA, allowing our business to grow consistently year on year as our worldwide market share continues to increase. International business represented 12 per cent of our premise revenues in Q3, up 1 per cent over the third quarter of 2012.

ShoreTel saw another quarter of solid growth in our EMEA region, which grew 28 per cent over Q3 of the previous year.

The company offers a wide variety of business communication solutions, all of which, whether on-site or in the cloud, exploit the power of IP with unified communication tools that anyone can use.

Among its offerings is the ShoreTel Unified Communications (UC) solution, which collects all of a company’s internal and external interactions in one place: emails, faxes and voicemails are stored in one central repository across devices and locations and, with the on-premise business phone system, lets both on-site and remote users see who is available at any given moment and decide how best to reach them.

We also offer ShoreTel Mobility – a simple, cost-effective way for your enterprise to embrace the mobile workforce. ShoreTel Mobility enables users to access a full suite of mobile unified communications tools from any location (office, home, hotspots), and on any network (voice over Wi-Fi, voice over 3G/4G, or cellular), using their mobile phones or tablets.

It offers several other business communication solutions, such as contact centre, which easily connects internal and external customers to the right agent at the right time, and conferencing, which leverages ShoreTel’s VoIP phone system to deliver full-featured collaboration tools, including audio conferencing and desktop sharing.

The company’s technologies can also help companies save money. Companies struggle to adapt to the issues brought about by the rise of the mobile workforce, such as soaring international roaming costs, poor in-building coverage, and integration into corporate PBX and unified communications systems.

ShoreTel Mobility allows workers to just connect their iOS device (iPad or iPhone) into the ShoreTel Dock and they are off and running – addressing the above problems and offering simple, cost-effective mobility.

The scalability and simple design of ShoreTel’s solutions means that they can be used in conjunction with existing systems.

Whilst the simplicity of ShoreTel’s mobility system ensures a reduction in roaming costs for the business traveller, expenditure on hardware will also be minimised.

ShoreTel’s ease of use and cost-effectiveness differentiate it from competitors. ShoreTel is able to offer companies of any size, from SMEs to larger organisations, the ability to adapt to the latest trends, such as the mobile workforce and BYOD, and maintain excellent customer communications at low prices. Brilliantly simple is ShoreTel’s business ethos, and this helps to separate us from the crowd.

We have a huge number of delighted customers. Our net promoter score (NPS) is 63 – a score of 50 is deemed to be world class – and this is only bettered by Google and Harley-Davidson.

On top of this fantastic measure of customer satisfaction, we have a large number of customer advocates here in EMEA. Our advocates have given us video references and have been known to speak on ShoreTel’s behalf at industry events.

ShoreTel is one of the only IP phone system manufacturers in the industry to deliver 100 per cent of its business through channel partners; in other words, we operate an indirect model.

This model is replicated in the US and APAC and works extremely well in every country in which the company operates. ShoreTel has a four-tiered Champion Partner Program for resellers, offering targeted support depending on the reseller’s needs. This allows resellers to provide exemplary management, customer service, solution experience, product features and performance to any customer.

ShoreTel’s solutions are both brilliantly simple and cost-effective, which should already appeal to the UK SME. More specifically, though, ShoreTel recently announced the Small Business Edition (SBE) 100 – a new bundled offering that provides communication and collaboration capabilities to businesses with fewer than 100 users.

This scalable solution allows the enterprise to adapt to its own growth – by simply applying a software licence, the system can be easily enabled to support up to thousands of users. By using ShoreTel’s SBE 100, SMEs in the UK can provide communication capabilities on a par with larger organisations, at a fraction of the price.

The scalability also means that if an SME grows, its communication capabilities can grow with it.

Docking station
The ShoreTel Docking Station is the first and only business-grade device that transforms the Apple iPhone and iPad into desk phones for the mobile generation.

The user can conveniently sync their mobile device to their desk phone. Designed from the ground up to support Apple’s iPhone and iPad in both orientations, it connects through Bluetooth and directly through the Apple 30-pin connector to deliver the best audio quality whilst simultaneously charging the mobile device.

In the age of bring your own device, the ShoreTel Dock turns the iPhone and iPad (and even the iPod touch) into a desk phone, giving users the comfort, battery life and call quality they are missing from smartphones alone. This will allow employees to use the device they love optimally in all personal and work environments.

The company has recently announced a partnership with Williams, the Formula 1 racing team. This technology partnership means that the Williams F1 team will benefit from our unified communications and collaboration platform across their business.

It was imperative for Williams to address the need for a secure, easy-to-use communication system that can be utilised by members of the F1 team, as 50 to 60 employees travel around the globe every two weeks on race weekends.

This solution offers mobility and can satisfy Williams F1’s security requirements – unlike most companies, Williams F1 can often find its competitors metres away, in the next paddock, highlighting the need for a secure platform. ShoreTel’s business communications solution will also extend to Williams’ Technology Centre in Qatar, ensuring a truly international collaboration. As for ShoreTel, the deal helps the company benefit from the extended global reach of this highly respected and fast-growing brand.

Over the next five years the company aims to establish itself more strongly in continental Europe, primarily in ShoreTel’s three investment regions – the UK, DACH and France.

We are investing in growing the sales teams in each of these regions, and we also have a presence in Benelux, Italy, Spain and Scandinavia.

In one sentence, ShoreTel’s ethos is to say no to the status quo and yes to simpler business communications for our customers.

For more information contact Ami Glass on 01628 826 336 or [email protected]



YOUR RESOURCES

BRIEFING: SOFTWARE TESTING

Helen Wilcox outlines some of the resources in the member library on software testing.

Magazine and Journal database

The role of the tester’s knowledge in exploratory software testing
The authors present a field study on how testers use knowledge while performing exploratory software testing in industrial settings. They video-recorded 12 testing sessions in four industrial organisations, with subjects thinking aloud while performing their usual functional testing work. They analysed how the subjects performed tests, what type of knowledge they used, and how they recognise failures based on their personal knowledge without detailed test case descriptions. By Juha Itkonen, Mika V Mäntylä and Casper Lassenius, Aalto University School of Science, Espoo, Finland. Source: IEEE Transactions on Software Engineering, May 2013

C2-style architecture testing and metrics using dependency analysis
Traditional software testing methods cannot be used directly to solve the test issues of software architecture, according to this paper’s authors. They suggest that either some techniques are needed to improve the traditional methods or new software architecture testing techniques need to be developed to solve the test issues related to software architecture. This paper proposes a new method to analyse dependencies for C2-style architecture. By Lijun Lun and Xin Chi, College of Computer Science and Information Engineering, Harbin Normal University, China; and Xuemei Ding, Fujian Normal University, Fuzhou, China. Source: Journal of Software, February 2013

Using mutation to enhance GUI testing coverage
Mutation testing is often bypassed because it consumes extra resources. Here the author presents an automatic technique to generate valid and mutant test cases. In traditional mutation testing, one or more parameters in the specification or the code are changed, and the technique finds the test cases that can detect those mutations. In this approach, the test cases generated by a GUI model are mutated and the mutants are then applied to the model to test its capability to kill the mutant test cases by rejecting them. By Izzat Mahmoud Alsmadi, Yarmouk University, Jordan. Source: IEEE Software, January 2013
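The loop the author describes can be caricatured in a few lines (everything below is invented for illustration): mutate valid event sequences, then count how many mutants the GUI model kills by rejecting them.

```python
# Invented for illustration: mutate valid GUI event sequences and count
# the mutants that the GUI model kills by rejecting them.
import random

def model_accepts(case):
    """Stand-in GUI model: only known events, and a file must be opened
    before it can be saved or closed."""
    opened = False
    for event in case:
        if event not in {"open", "type", "save", "close"}:
            return False
        if event == "open":
            opened = True
        elif event in ("save", "close") and not opened:
            return False
    return True

def mutate(case, rng):
    mutant = list(case)
    mutant[rng.randrange(len(mutant))] = rng.choice(["quit", "save", "open"])
    return mutant

rng = random.Random(1)
valid_case = ["open", "type", "save", "close"]
mutants = [mutate(valid_case, rng) for _ in range(100)]
killed = sum(not model_accepts(m) for m in mutants)
print(f"{killed}/100 mutants killed by the model")   # the rest survive
```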

Particularities of verification processes for distributed informatics applications
This paper presents distributed informatics applications and characteristics of their development cycle. It defines the concept of verification and identifies the differences between it and software testing. It also defines the particularities of the software testing and software verification processes. The authors present the verification steps and necessary conditions and establish factors that influence quality verification. They analyse software optimality verification and define some metrics for the verification process. By Ion Ivan and Cristian Ciurea, Academy of Economic Studies, Romania; Bogdan Vintilă, Ixia, Romania; and Gheorghe Noşca, Association for Development through Science and Education, Romania. Source: Informatica Economica, 2013

TMSTAF – The extended use of STAF on test automation
As software packages become increasingly large and complex, testing time has also risen. A test platform was needed to speed up test cycles without any decrease in test result accuracy. The authors report here on the use of the Python 2.6 scripting language to create a faster, automated testing platform. Trend Micro Software Testing Automation Framework (TMSTAF) used Python 2.6, based on the Software Testing Automation Framework (STAF), to improve automated testing. They found that TMSTAF decreased testing time, provided faster process integration and test feedback, improved overall software quality, and integrated seamlessly into the build release process. By Lee Bevis, Trend Micro Incorporation, Taipei, Taiwan. Source: Python Papers, March 2013

Software needs seatbelts and airbags
The article discusses the challenges in correcting post-deployment computer software code bugs written in computer programming languages with vulnerabilities to memory errors. The author explores the efficacy of software testing tools such as static analysers and randomised fuzz testing to detect bugs. Other topics include detecting bugs in software installed on computer desktops or mobile devices, server error log files, and computer garbage collection. Several error prevention, detection, and repair programs developed by researchers are included, such as the DieHard memory management system, the Exterminator error repair software, and the self-repairing Grace runtime system. By Emery D Berger, University of Massachusetts, USA. Source: Communications of the ACM, September 2012
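Randomised fuzz testing, one of the tools surveyed, fits in a dozen lines (a toy harness; the fragile parser is invented): throw random inputs at a target and record the ones that crash it.

```python
# Toy fuzz harness in the spirit of the tools surveyed above; the
# fragile parser is invented for illustration.
import random

def fragile_parse(data: bytes):
    if data[:4] == b"LEN:":
        length = int(data[4:6])      # blows up on non-digit bytes
        return data[6:6 + length]
    return b""

def fuzz(target, runs=1000, seed=42):
    rng, crashes = random.Random(seed), []
    for _ in range(runs):
        prefix = b"LEN:" if rng.random() < 0.5 else b""   # bias toward the format
        blob = prefix + bytes(rng.randrange(256)
                              for _ in range(rng.randrange(0, 8)))
        try:
            target(blob)
        except Exception as exc:     # a real harness also watches hangs/memory
            crashes.append((blob, repr(exc)))
    return crashes

found = fuzz(fragile_parse)
print(f"{len(found)} crashing inputs; first: {found[0] if found else None}")
```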

Validating second-order mutation at system level
Several works have approached techniques for cost reduction in mutation testing. There are two approaches to n-order mutation. This paper is focused on one of them: decreasing the costs of mutation testing by reducing the mutant set through the combination of first-order mutants into n-order mutants. This use entails a risk: the possibility of leaving undiscovered faults in the system under test, which may distort the perception of the test suite quality. This paper describes an empirical study of different combination strategies to compose second-order mutants at system level, as well as a cost-risk analysis of n-order mutation at system level. By Pedro Reales Mateo and Macario Polo Usaola, University of Castilla-La Mancha, Spain, and José Luis Fernández Alemán, University of Murcia, Spain. Source: IEEE Transactions on Software Engineering, April 2013
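The combination idea can be shown in miniature (an invented stand-in; real tools fuse mutations of source code): pairing first-order mutants halves the mutant set to execute, at the risk, which the paper analyses, that fused faults mask each other.

```python
# Invented stand-in for second-order mutation: each first-order mutant
# is one fault (a parameter tweak); fusing two tweaks gives a
# second-order mutant, so only half as many programs are executed.
def program(x, y, a=1, b=1, c=0):            # original: f(x, y) = x + y
    return a * x + b * y + c

first_order = [{"a": -1}, {"b": 2}, {"c": 7}, {"b": 0}]   # one fault each

def to_second_order(mutants):
    """Greedy first-with-last pairing into two-fault mutants."""
    return [{**mutants[i], **mutants[-1 - i]} for i in range(len(mutants) // 2)]

tests = [(0, 0), (1, 1), (2, 3)]
for faults in to_second_order(first_order):
    # killed if some test output differs from the original program; the
    # risk: two fused faults can mask each other on every test
    killed = any(program(x, y, **faults) != program(x, y) for x, y in tests)
    print(faults, "killed" if killed else "survived")
```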

A systematic state-based approach to regression testing of component software
Component-based software engineering is now widely used, with regression testing used to assure system quality, because changes made to one component could affect it or the entire system. The paper’s authors identify diverse changes made to components and the system based on models, then perform change impact analysis, and finally refresh the regression test suite using a state-based testing practice. A case study shows the approach is feasible and effective. By Chuanqi Tao, Southeast University, Nanjing, China, and San Jose State University, USA; Bixin Li, Southeast University, Nanjing, China; and Jerry Gao, San Jose State University, USA. Source: Journal of Software, March 2013

Web software systems testing, supported by model-based direct guidance of the tester
The common approach to software testing based on manual test design and manual execution of test cases can be made more efficient by suitable automation. This paper proposes a new approach to the testing process using automated test case generation for direct tester guidance and a system for tester guidance based on the model of the system. The proposal increases the efficiency of manual software testing in terms of effort spent on test design and the reliability and accuracy of performed tests. By Karel Frajták, Miroslav Bures and Ivan Jelinek, Czech Technical University, Czech Republic. Source: Proceedings of the International Conference on Information Technologies, September 2012

Handling giant programs: strategies for testing programs
The writer presents his views on, and suggestions for, testing software. He mentions that regression testing is the common method used for software testing analysis. It uses unified modelling language, which marks user workflow and helps in developing general test cases, which are then elaborated using Microsoft Excel. Testing is performed using different testing tools. By Steve Kilner, vLegaci. Source: System iNEWS, April 2013


Centroidal Voronoi tessellations – a new approach to random testing
To increase the effectiveness of random testing (RT), researchers have developed adaptive random testing (ART) and quasi-random testing (QRT) methods, which attempt to maximise the test case coverage of the input domain. This paper proposes the use of centroidal Voronoi tessellations (CVT), and a test case generation method, random border CVT (RBCVT), which can enhance the previous RT methods to improve their coverage of the input space.

An extensive simulation study and a mutant-based software testing investigation have been performed to demonstrate the effectiveness of RBCVT. By Ali Shahbazi and James Miller, University of Alberta, Canada, and Andrew F Tappenden, The King’s University College, Canada. Source: IEEE Transactions on Software Engineering, February 2013
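The underlying construction is easy to sketch (a simplified illustration; RBCVT additionally pushes points toward the domain border): start from plain random test cases and apply Lloyd’s algorithm, moving each point to the centroid of its Voronoi cell, estimated here by Monte Carlo sampling.

```python
# Simplified CVT sketch: evenly spread n test cases over a 2D input
# domain by Lloyd's algorithm with Monte Carlo Voronoi cells.
import numpy as np

def cvt_test_cases(n=20, iterations=30, samples=20_000, seed=0):
    rng = np.random.default_rng(seed)
    points = rng.random((n, 2))              # plain random testing to start
    for _ in range(iterations):
        probes = rng.random((samples, 2))
        d2 = ((probes[:, None, :] - points[None, :, :]) ** 2).sum(axis=2)
        owner = d2.argmin(axis=1)            # nearest generator per probe
        for i in range(n):
            cell = probes[owner == i]
            if len(cell):
                points[i] = cell.mean(axis=0)   # move to cell centroid
    return points                            # evenly spread over the domain

print(cvt_test_cases(n=5, iterations=10, samples=2_000))
```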

To get these articles and more login to the BCS secure area, go to ‘My Knowledge’ then ‘EBSCO Databases’.

Books 24/7

Practical Software Project Estimation: A Toolkit for Estimating Software Development Effort & Duration
This guide explains the tools and methods necessary to extract conclusive business intelligence from disparate corporate data. It aims to help in the deployment of high-performance data transformation solutions in the enterprise. Peter R. Hill (ed), McGraw-Hill/Osborne

.NET Performance Testing and Optimization: The Complete Guide
This book for .NET code describes: why you should test; the steps of setting up your test environment; how to actually run and record tests; and what you should be looking for. Paul Glavich and Chris Farrell, Red Gate

Software Testing and Continuous Quality Improvement, Third Edition
This book discusses software testing as part of the project management process. The author’s approach emphasises testing and quality goals early on in development and explains how compliance testing helps an IT organisation meet Sarbanes-Oxley and Basel II requirements. William E. Lewis, Auerbach Publications

Software Testing and Quality Assurance: Theory and Practice
This book sets out to balance theory with practice, complemented with an abundance of pedagogical tools, so that it can be used both by professionals and as an introductory text for courses in software testing, quality assurance and software engineering. Sagar Naik and Piyu Tripathy, Wiley

Manage Software Testing
This book guides you through issues that you are confronted with on a daily basis, and explains what you need to focus on strategically, tactically and operationally. It covers all aspects of test management. Peter Farrell-Vinay, Auerbach

Foundations of Software Testing: ISTQB Certification
Covering the fundamental principles that every system and software tester should know, this guide is designed to help you pass the ISTQB exam and qualify at foundation level, and includes learning aids. Dorothy Graham et al, Cengage



Members: login to ‘My Knowledge’ in the secure area to see further listings with direct links.

To get these books and more login to the BCS secure area, go to ‘My Knowledge’ then ‘Books 24/7’

Image: iStockPhoto/135161171

doi:10.1093/itnow/bwt061 © 2013 The British Computer Society



LEFT OF THE INSIDE BACK COVER

doi:10.1093/itnow/bwt062 © 2013 The British Computer Society

DAYS PAST

Whimsical birth announcement
In September 1963 The Computer Bulletin announced offspring for an IBM 1401 in the form of one million new Swedish surnames.

‘All the new surnames were born at the Central Bureau of Statistics in a night,’ it reports, with ‘the mother’ – the IBM 1401 – feeling ‘good in the six hours it took to deliver them all.’

The Bureau in question was in Stockholm and was tasked with producing new surnames to supplement the Anderssons and Johanssons so popular in Sweden.

It ‘began a little tentatively by suggesting the name Abbeback. After A came B as in Bvarling – a wonderful name n’est-ce pas?’ asked the Bulletin, going truly multi-lingual.

A 1967 book on Scandinavian Studies mentions the Swedish names law of 1964, and a further search prompted by this odd little news item taught me that, with the limited pool of Swedish names available, this is rather a fraught topic.

I landed up on www.behindthename.com to search for any sign of these computer-generated monikers being adopted – to carry on the maternal metaphor. So I looked for Abbeback, Bvarling or Oxorn, but in vain. Perhaps this isn’t a surprise, as the 1963 Bulletin mentions that only about 50,000 of the IBM 1401’s offspring would be saved for posterity, and there is no mention of them going into actual usage.

Despite the early BCS’s reputation as being a bit dour, the news item had a couple of nice ironic asides. Noting the name Oxorn, it editorialised that ‘even a robot’s fancy can run short.’ And the piece concludes with this observation: ‘The IBM 1401 at least did what it could to satisfy every taste.’

Computing in Schools Part ‘x’
The Institute’s recent success in getting the concept of ‘proper’ computing disciplines taught in schools is far from a new subject. (See two other examples in June ITNOW p 66 alone.)

In September 1963 The Computer Bulletin ran a full feature on the subject, with many points that would have been excellent, had they been introduced then...

For example, it mentions that the ‘changing school syllabuses arises from a need to develop skills suitable to modern living.’ It also says that ‘it is important that nothing should be done to prejudice the continuance of the general education of students or to cramp their mental development simply for the sake of including some form of specialised proficiency’.

(The conflation of computing and certain software packages, anyone?)

The benefits were suggested as helping students to think logically, helping inculcate accuracy and neatness, being a stimulating subject and, finally, that ‘it will be essential to many of them later.’

It also encouraged following the algorithmic approach as much as possible. A laudable goal focusing on principles of computing rather than the aforementioned ‘form of specialised proficiency.’

The article gave an example: ‘When the class is being taught how to factorise a quadratic expression or solve a quadratic equation or a pair of simultaneous linear equations, they could be given a logical flow diagram for the procedures.

‘The introduction of programming for electronic computers should be encouraged as an extra-mural subject for those particularly interested, but programming languages should not form part of the curriculum for schools.’

Sensible policies for a happier Britain. Eventually.

Have you seen something that belongs on this page?
If anything in IT has recently struck you as amusing please let us know – anecdotes, pictures, cartoons are welcome – as long as no copyright is infringed, naturally. (We love XKCD here, hence the cartoon below.) Drop us a line via [email protected]. We look forward to hearing from you.

Reproduced from the excellent xkcd.com


REGISTER FREE TODAY and save £35 †

www.ipexpo.co.uk/BCS

† Visitors who have not pre-registered online by 19:00 on Tuesday 15th October 2013 will be charged an admission fee of £35.

Broader coverage. Greater focus.

IP EXPO, 16-17 October 2013, Earls Court 2, London

Focusing on a broader range of technologies, with 5 new themes and 5 new theatres, this year’s IP EXPO will once again prove to be the ONE ‘must-attend’ end-to-end IT event.

The ONE place where technology works together.

Join us for two fact-filled days in October:
– Attend thought-provoking keynotes and seminars
– Gain valuable insight from industry experts
– Meet over 250 industry-leading exhibitors
– Network with key players and IT professionals

Themes: Networks, Cloud, Security, Communications, Governance & Risk, Storage, Applications, Mobility & Devices, Data & Analytics, Virtualization.

5 new themes: Applications; Communications; Mobility & Devices; Data & Analytics; Governance & Risk.

5 new theatres: Communication & Collaboration; Data Centre; Data Insight & Analytics; Enterprise Mobility Management; Network Security.

10% new exhibitors.


Achieve Breakthrough Results with Your SAM and Licence Optimisation Program

The success of every software asset management and licence optimisation programme hinges on delivering business value. But there are many challenges, such as poor Software Asset Management (SAM) processes, lack of entitlement data, licence complexity, and de-centralised software procurement.

Use the market-leading solutions from Flexera Software:

• Maintain licence compliance and avoid software audits
• Assess the ROI for a SAM and Licence Optimisation programme – building the business case
• SAM process maturity modelling and improvement – developing best-practice processes
• Business and technical implementation
• Rationalisation, reconciliation and optimisation – ‘snapshot’ services for audit readiness and more
• Value realisation – how to achieve the fastest ROI
• Empower users with an enterprise app store

Learn more: http://www.flexerasoftware.com

© 2013 Flexera Software LLC. All other brand and product names mentioned herein may be the trademarks and registered trademarks of their respective owners.


Recommended