The Times Efficient IT Supplement

Page 1: The Times Efficient IT Supplement

Growing pains As digital information explodes, how are smart companies using information management to get the most out of storage capacity? page 4

A cloud of our own Organisations reluctant to release private data onto public IT infrastructures can still reap the benefits of cloud computing. page 8

Walls come tumbling down When data lives in the cloud, traditional approaches to information security no longer offer adequate protection from threats. page 14

EMC, Cisco and VMware come together to shape the future of computing - centre pages

Three’s a cloud

EFFICIENT IT

This supplement is an independent publication from Raconteur Media

July 2009

Where next for efficient IT? It’s a question plenty of CIOs are asking right now.

After all, those who have done their jobs well in the last few years have already done much to drive out cost from bloated IT infrastructures and deliver more efficient, high-quality IT services to the business.

For many, that has involved fundamental shifts in their approaches to architecture, to sourcing and to organising available talent to best effect.

As a result, many CIOs may now feel that they have little room left to move on the cost side of the equation.

What the situation calls for is a whole new, ‘super-charged’ approach to efficient IT. It’s not enough for CIOs to simply maintain their current focus on driving costs out of the existing infrastructure. They need to radically rethink how IT resources are sourced and utilised, in order to make a quantum leap in efficiency.

For many, that will mean delving deeper into three technologies. First, virtualisation needs to become an overarching data centre design principle, rather than a handy way to address immediate tactical issues. Second, automation needs to become

The journey continues

For smart CIOs, the current economic climate offers an opportunity to shine, writes Steve O’Donnell

continued on page three

Page 2: The Times Efficient IT Supplement

ENTERPRISE-CLASS FILE SHARING

Dell™ NX4 breaks down traditional barriers, allowing users to share files between Windows®, Linux®, and UNIX® environments

LEARN MORE AT DELL.CO.UK/EMC

Page 3: The Times Efficient IT Supplement


the default option when it comes to handling day-to-day administrative tasks, freeing up IT staff to focus on more strategic projects. Third, the full potential of cloud computing (and in particular, private cloud architectures) needs to be explored, so that CIOs and their teams can be sure that key business services are underpinned by the most appropriate IT resources, whether these are the company’s own or those of a third-party provider.

The aim of this report is to help CIOs rise to the challenge. Those that do so successfully have “a tremendous opportunity to establish themselves as board-level influencers who deliver a very visible difference to the organisation’s bottom line,” says a recent report from management consultancy firm PA Consulting.

Such measures can greatly contribute to an organisation’s agility – a top priority in today’s uncertain times. In a recent briefing paper from the Economist Intelligence Unit, ‘Organisational agility: how businesses can survive and thrive in turbulent times’, sponsored by information management company EMC, 90 per cent of the CIOs and CEOs surveyed said they view organisational agility as critical for business success. Yet most admitted that their organisations are not yet flexible enough to truly compete successfully, with more than one-quarter (27 per cent) admitting that their organisation is at a competitive disadvantage because it is not agile enough to anticipate fundamental marketplace shifts.

Better information management, supported by more efficient IT, lies at the heart of achieving the kind of responsiveness required, according to the report’s authors. “In today’s knowledge age, the ability to transform information into insight in response to market movements is core to sustainability,” they say.

That means that, in the quest for greater efficiency, CIOs must continue to manage their IT portfolio ruthlessly, rationalise IT assets still further and become ever more adventurous in their approach to sourcing. It’s time for CIOs to assess what their internal IT department does best and to concentrate on the projects that deliver the most value and competitive advantage to the business. That should enable them to identify the products and services that might be less costly and more efficient in the hands of a third-party provider, using today’s huge range of cloud computing models as a delivery channel to business users.

And they must work hard to develop an outstanding pool of IT talent, ensuring that skilled in-house IT professionals lead the projects with the highest business impact and developing the IT leaders who will help the organisation find new efficiencies in future.

The danger for those CIOs who struggle in the face of these challenges is clear: they will increasingly be circumvented by business process owners, whose influence in determining where and how IT investments are made will grow. The benefit of cloud computing – but also one of its potential risks – is that it makes sourcing IT products and services far easier than ever.

If the sales and marketing director needs a new salesforce management system, so that their team can make the most of every business prospect they identify, a slow response from the CIO might mean that the director simply goes out and procures a software-as-a-service solution to perform the function.

This is no time to sanction any kind of maverick procurement. While that SaaS solution may indeed provide a good fit for the sales department’s immediate need, it’s vital that the CIO remains in control of the technology products and services – both internal and external – that are used to support the business and how they interrelate with one another.

“The CIOs that we work with passionately believe that IT in the right hands – their hands and those of equally talented colleagues – is crucial to the delivery of innovative and important business change, that in turn contributes to their organisation’s success,” say consultants at PA Consulting. “Now is the time to stand up and convince the leadership to share this passion.”


The information contained in this publication has been obtained from sources the proprietors believe to be correct. However, no legal liability can be accepted for any errors. No part of this publication may be reproduced without the prior consent of the Publisher. © RACONTEUR MEDIA

Publisher: Dominic Rodgers Editor: Jessica Twentyman Contributors: Guy Clapperton, Gareth Kershaw, Guy Kewney, Sally Whittle Design: Hervé Boinay

For more information about Raconteur Media publications in The Times and The Sunday Times, please contact Freddie Ossberg T: 020 7033 2100 E: [email protected] W: www.raconteurmedia.co.uk

FURTHER READING

Organisational agility: how businesses can survive and thrive in turbulent times. Briefing paper from the Economist Intelligence Unit, sponsored by EMC, March 2009. http://tinyurl.com/dyovv7

CIO Agenda 2009: Board position or bored disposition? Briefing paper from PA Consulting Group. http://tinyurl.com/r63s9l

continued from page one

CIOs are aiming to get people, information and technology working together to beat the economic slowdown

THE AUTHOR

Steve O’Donnell is an internationally recognised leader in data centre operations with 30 years’ experience running some of the largest IT organisations in the world. His blog, “The Hot Aisle”, is a globally renowned source of information on the IT industry. He is managing director for Enterprise Strategy Group’s EMEA operations and heads up the global IT operations practice.

Previously, O’Donnell ran IT internationally for First Data, and was global head of data centres at BT, running the largest data centre operation in Europe. O’Donnell has a worldwide reputation as a thought leader in green IT, having won six industry awards for his 21st century data centre vision.

www.thehotaisle.com

Page 4: The Times Efficient IT Supplement


In 2007, the ‘digital world’, consisting of all the data produced and replicated across the globe, was estimated at 281 exabytes, or 281 billion gigabytes (GB), in size by analysts at IT market research company IDC. That’s around 45GB of information for every person on the planet. By 2011, they predict, it will have grown to over ten times that size.

The technology industry is no stranger to unrest and upheaval – but for many organisations, keeping pace with that kind of growth has become more onerous in recent years than ever before, putting skills in information and storage management to the test and pushing demand for storage capacity to new heights.

In fact, the explosion of digital information is just one of the problems with which IT professionals are having to contend. At the same time, they are also wrestling with new legal and regulatory demands that dictate what data must be kept, for how long and how quickly it must be retrieved. And they must find smarter, more innovative tools and technologies to address these challenges, against a backdrop of widespread IT spending cuts.

As a result, current pressures demand a complete reassessment of information management strategies at many organisations. And that will be no easy task – particularly because it will require a complete break from recent practice when it comes to making storage and information management investments.

The decade preceding the recent economic slowdown was defined by untrammelled spending on storage. During those years, the cheaper storage capacity became, the more companies seemed compelled to buy. Now that the financial outlook is less positive, it’s time for businesses to be more realistic.

TIME FOR A RETHINK

In fact, it’s a case of “back to the future” for storage and information management, says Dr Graham Oakes, an independent technology consultant who has provided advice on storage strategies to organisations including Oxfam, government-owned savings bank National Savings & Investments (NS&I) and the Office of the Deputy Prime Minister. Organisations must revert to the more stringent levels of scrutiny that were applied to purchases back when storage media and systems were still relatively costly, he says.

In fact, those organisations that don’t take a step back from rampant storage acquisition and haphazard information management practices may flounder sooner than they expect. Many data centres are rapidly running out of space and power, so while UK firms need to grow their storage capacity, many can’t expand beyond their current physical or energy footprint.

As a result, they will be forced to get more not only from existing storage systems, by boosting utilisation rates and jettisoning redundant and duplicate information, but also from available data centre space, by consolidating storage capacity into fewer, more efficiently utilised systems.

A THREE-PRONGED ATTACK

In the drive for greater efficiency, say industry watchers, organisations will battle that storm on three fronts: complexity, cost and automation.

In terms of complexity, for example, there is a notion that high-profile initiatives like storage consolidation and virtualisation have greatly simplified and demystified storage infrastructures.

Growing pains

HITTING THE MAIL ON THE HEAD

One of the most visible symbols of the explosion in digital information, email can also be one of the trickiest to manage. As a communications medium, its availability is now taken entirely for granted by users at most companies; but for hard-pressed IT departments, the storage and management of emails pose a number of significant challenges.

For a start, emails need to be auditable, searchable and easily retrievable for operational and compliance purposes. Email systems, meanwhile, are expected to be up and running 24 hours a day, 7 days a week, 365 days a year. Business users now find it almost impossible to do their jobs without email, so stringent business continuity plans are paramount.

That’s perhaps why many organisations’ email management strategies have moved beyond basic back-up to a more holistic approach, based on wider information management principles.

For a start, strong archiving practices are a must-have, says David Parkin, director of sales for EMEA at security specialist Sunbelt Software. “Approximately 80 per cent of businesses now use email for closing orders and performing transactions, making them subject to statutory records retention requirements. But exactly what should be stored, and for how long, is poorly understood by most businesses,” he says.

In fact, robust information management policies should be applied long before the archiving stage. It’s particularly important, for example, to eliminate duplicate copies of emails before they are archived, which is why organisations are increasingly deploying de-duplication technology, which scans each email, assigns it a unique tag (much like a fingerprint), indexes it and retains it. Redundant or duplicate copies are simply deleted.
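The fingerprint-and-index approach described above can be illustrated in a few lines of code. Below is a minimal sketch in Python, assuming a content hash serves as the “tag”; the class and method names are hypothetical, and a real archiving product would add retention policies, full-text indexing and tamper-evident storage.

import hashlib

class EmailArchive:
    """Toy single-instance store: each unique message body is kept once,
    under a content-derived fingerprint; duplicates become references."""

    def __init__(self):
        self._bodies = {}   # fingerprint -> message body
        self._index = {}    # message id -> fingerprint

    def ingest(self, message_id, body):
        """Archive a message; returns False when the body was a duplicate."""
        fingerprint = hashlib.sha256(body).hexdigest()  # the unique 'tag'
        is_new = fingerprint not in self._bodies
        if is_new:
            self._bodies[fingerprint] = body            # retain one copy
        self._index[message_id] = fingerprint           # every id stays retrievable
        return is_new

    def retrieve(self, message_id):
        return self._bodies[self._index[message_id]]

Ten thousand copies of the same attachment-laden memo would then occupy the space of one, while every recipient’s copy remains individually retrievable.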

“Now, more than ever, cost containment is a key concern. Data reduction technologies for primary data and secondary copies of data (backup and disaster recovery copies) can help drive down costs by using less storage, and can, perhaps, extend the useful life of currently deployed solutions,” says a recent Gartner report.

Just as vital is the ability to retrieve email rapidly – especially important if a regulator or potential opponent in a law case comes knocking. That’s why many companies, particularly those working on e-discovery projects, are prioritising access and availability of stored emails, says Gareth Meatyard (pictured above), product specialist for EMC’s recently announced SourceOne products.

“Users need seamless access to archived content and proactive information management to help with litigation readiness, including a central archive to accelerate large-volume discovery searches and enable secure legal hold,” he explains.

The EMC SourceOne product family is a suite of information governance and integrated content archiving solutions that share a common goal: to help organisations manage their information resources intelligently – for the highest return on investment, at the lowest risk, and for maximum competitive advantage.

Within this family, EMC SourceOne Email Management aims to support proactive e-discovery, email retention policies and cost-efficient tiered storage in high-volume email environments. It provides all core email archiving capabilities for Microsoft Exchange and IBM Lotus Notes/Domino environments, as well as instant messaging.

“Many of the email archiving solutions written ten or more years ago have been challenged to meet very large mailbox environments (that is, 50,000 or more mailboxes),” says Laura DuBois, an analyst at IT market research company IDC. “The larger the environment, the more strain the system architecture faces from ingestion performance, indexing speed, database scalability, index integrity, as well as search and policy management.” EMC SourceOne Email Management, she adds, “offers a next-generation archiving architecture to meet these challenges.”

As digital information explodes, how are smart companies preparing to take the strain? Gareth Kershaw reports

Gareth Meatyard (EMC): “Users need seamless access to archived content and proactive information management”

The growth in corporate information is putting both skills and storage capacity to the test at many companies

Page 5: The Times Efficient IT Supplement


Chris Gabriel, director of solutions marketing at systems integration company Logicalis, notes that, without the right “mindset”, such technologies can spawn more information management problems than they solve. “The thing is, virtualisation ain’t new and – shock, horror – it’s actually not that clever,” he says. “Yes, it allows you to put more [data] onto less [storage], and in today’s increasingly data-based lifestyle, that’s undoubtedly a benefit. However, in having access to seemingly endless storage capacity, it’s easy to get lazy and slip back into bad habits.” The point, he says, is that it isn’t how a company stores its data, it’s how it manages it that makes the difference.

If companies wish to tackle the second issue – cost – then it’s time they stopped treating their storage systems like a garage, “a place they chuck things because they don’t want to throw anything away”, says Darren Thomas, global vice president and general manager for enterprise storage at systems company Dell. Instead of old bicycles and boxes, he says, enterprise storage systems are full of junk data that will never be needed again.

Here, data deduplication technology can be a big help, according to Dennis Ryan, partner sales development manager at EMC in EMEA. This works to detect and eliminate information that is already stored elsewhere in an organisation’s storage infrastructure.

Today’s deduplication can work at a number of different levels, he says.

With file-level deduplication, for example, one copy of a file is retained as a reference and all other copies of the file are replaced with a unique identifier, or ‘pointer’, to the file. “This approach lends itself well to data retention policies, where retention requirements are applied to the reference copy and adopted by all applications using the file,” says Ryan.

Object-level de-duplication, meanwhile, can be applied not just to a single file, but also to collections of files. “This type of de-duplication is usually associated with compliance projects,” he says.

Finally, block-level de-duplication breaks data into small blocks, or ‘chunks’, and assigns a unique identifier to each chunk, says Ryan. This kind of de-duplication, he says, is largely relevant in back-up and restore environments today.
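A rough sketch of the block-level variant, in Python, may make the mechanics clearer. It assumes fixed-size chunks and a shared chunk store (real systems often use variable-size chunking and more compact identifiers); file-level de-duplication works the same way, but treats the whole file as a single chunk.

import hashlib

BLOCK_SIZE = 4096  # assumed fixed chunk size, for illustration only

def dedupe_blocks(data, chunk_store):
    """Split data into chunks, store each unique chunk once, and return
    the list of chunk identifiers ('pointers') needed to rebuild it."""
    recipe = []
    for offset in range(0, len(data), BLOCK_SIZE):
        chunk = data[offset:offset + BLOCK_SIZE]
        chunk_id = hashlib.sha256(chunk).hexdigest()
        chunk_store.setdefault(chunk_id, chunk)  # keep only the first copy
        recipe.append(chunk_id)
    return recipe

def reassemble(recipe, chunk_store):
    """Rebuild the original data by following the pointers."""
    return b"".join(chunk_store[c] for c in recipe)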

The cost issue can also be tackled by using storage disks of varying performance and capacity to store information according to its business value. In ‘storage tiering’ scenarios, says Ryan, “current data can be stored on high-performance disk drives and older data can be archived to very large, lower-performing drives.”

When it comes to implementing storage tiering, organisations can choose a static approach or an active approach – or a combination of the two.

In a static scenario, different types (or ‘volumes’) of data, relating to the same application, are stored on different disk drive types. So the log files, index files and tables that make up a database – but which are accessed with different degrees of frequency – all reside in different places. “We refer to this as ‘static’, because the different volume types will always reside on their respective disk types, rather than move between tiers,” he explains.

In active scenarios, by contrast, data regularly moves between tiers of storage, depending on a number of factors, with “age being the most common”, says Ryan.

“Many users implement the static approach and combine this with the most common element of active tiering – archiving,” says Ryan. “Traditionally, data was archived to tape and optical media, but current legislative requirements make these unsatisfactory for rapid information retrieval, e-discovery and other modern booby-traps,” he says.
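The age-based rule Ryan describes can be expressed as a simple policy function. The sketch below, in Python, uses hypothetical tier names and thresholds purely for illustration; in a real system the rules would be set by administrators and evaluated by the array or tiering software.

from datetime import timedelta

def target_tier(age):
    """Route data to a storage tier by age, the most common active-tiering rule."""
    if age > timedelta(days=365):
        return "archive"           # very large, lower-performing drives
    if age > timedelta(days=30):
        return "capacity"          # mid-tier disk
    return "high_performance"      # current data on fast drives

# Example: target_tier(timedelta(days=400)) -> "archive"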

Which brings us to the third key target that CIOs are looking to address: automation. Darren Thomas of Dell calls it “the factor that’s truly driving today’s market”, taking its place alongside the more established information management drivers of “scale, capacity and performance”.

Today’s storage automation technology aims to take day-to-day storage decisions and tasks out of the hands of hard-pressed IT staff and automatically allocate data and information to different storage tiers, according to pre-defined rules relating to their business value. Automation tools for storage virtualisation, thin provisioning and tiered storage are three hot tickets in this space.

A more intelligent approach to information management and storage needs to take all three ‘storm fronts’ into account. But wider economic conditions notwithstanding, it’s a great time for companies to be thinking about re-architecting their storage infrastructures, because the efficiency gains and increased value that can be achieved will have maximum impact on efficiency-focused businesses.

Want to make your business sharper?

Contact us for a no obligation assessment to see how our industry knowledge and skilled consultancy could make your business sharper.

Tel: 0845 604 5151 Visit: www.computacenter.com/virtualisation

Virtualisation makes commercial sense... It’s just simple mathematics. But choosing the right independent expert can ensure it delivers what it says on the tin.

For over a decade, Computacenter has delivered complex virtualisation projects across server, storage, application and desktop platforms. We understand that each client is different and that one solution doesn’t necessarily fit all. That’s why we choose to work with industry-leading vendors with proven pedigrees – vendors like EMC and VMware – to ensure we deliver pragmatic solutions that make your business sharper.

Realise the benefits… Whatever stage of the virtualisation journey you’re at – if you’re looking to analyse the impact of virtualisation, de-risk the project, or validate the efficiencies and ROI, we have the tools and skills to help you realise the benefits of virtualisation.

SOLUTION CENTER

We do this for some of the UK’s best-known organisations – and we can do it for you too.

“Enterprise storage systems are full of junk data that will never be needed again”

Page 6: The Times Efficient IT Supplement

Efficient IT = Virtualized IT

130,000 businesses choose to virtualize with VMware. Why? Because today’s IT mantra is all about doing more with less.

More efficiency. More control. More choice. Less capital expenditure. Less operating expense. Less energy cost.

VMware virtualization solutions transform your IT and build the foundation for the next generation of IT efficiency gains that cloud computing will bring.

To find out more, contact us now on 0800 032 7597 or visit www.vmware.com

Page 7: The Times Efficient IT Supplement


It is often claimed that only a relatively small part of the human brain is used during the average lifetime; nobody knows what purpose the rest is meant to serve. It would be wrong to say the same rule applies to computers, but it’s true that the tasks of an individual PC – or an array of servers, or any other combination – may be better served by ‘virtualising’ a system of software onto another physical system.

So, on a corporate level, a massive amount of storage might sit on physical systems at a remote location that are shared with another enterprise, making one set of servers work as two ‘virtual’ sets.

On a much smaller level – the smallest possible – your correspondent is writing on an Apple computer that has a virtual PC running in one window, so one computer is acting as two.

That idea is catching on at many businesses, but the main benefits have been noted at the higher end so far, says Rene Millman, senior analyst with Gartner. “This has made a beachhead in the large enterprise, where server utilisation has traditionally been low and organisations are looking to extract maximum usage from their infrastructure,” he says. “As usual, it is the financial sector that has embraced this model of computing [first].”

The IT industry is now eyeing the mid-market as the next likely area of massive growth. “Virtualisation has come to the fore over the last five years as computing power far outstrips what the operating systems and the applications that run atop them are capable of,” says Millman.

BOOSTING UTILISATION RATES

It’s happened because the technology to multitask without harming core tasks has become available, making IT systems far more efficient. “Servers run with many more processing cores than before and thus utilisation of these resources has been low. The need to run one application per operating system so it doesn’t interfere with other apps means that, for the most part, the server is idling. Virtualisation solves that problem, with many operating systems running on the same server without each operating system instance impacting on the other.”

Vendors of virtualised solutions concur. Serguei Beloussov, chief executive of Parallels, which also works in the related cloud computing arena, points to the many downsides of having a dedicated server for every mission-critical application in every department: “This approach has led to organisations accumulating a number of servers, each needing power to run and cool them, while most are significantly under-utilised. Virtualisation addresses this problem by enabling businesses to simultaneously run multiple, isolated workloads on one physical server, so several applications can be safely hosted on one box. This leads to better utilisation of the hardware, reducing the number of boxes needed, resulting in far greater energy efficiency. The more virtual servers you can run on one box, the more energy saved, so density and efficiency is key.”

Solid business benefits start to accrue quickly when this sort of technology model is in use. Take, for example, the City of Rotterdam, which employs some 1,000 people in its Dienst Stedenbouw en Volkshuisvesting (DS+V) department and is responsible for town planning, housing and traffic.

In 2004, the department implemented the open-source operating system, Red Hat Enterprise Linux 3, for a few applications, but stuck with a combination of Windows and Unix for the main platform, using 40 servers to run the combination of Microsoft, Unix and Linux. When it came to putting a new administration and registration application in for the City’s real estate activities, however, the Council started looking at a virtualised system based exclusively on Linux.

After a period trying the system on a pilot basis, the organisation migrated its servers to Red Hat Enterprise Linux 5 and was able to use virtualisation to run 10 virtual machines across just six servers.

The council used Red Hat Global File System for storage virtualisation and Red Hat Satellite Server for faster deployment of both physical and virtual systems on the network.

“One of the key benefits of Red Hat Enterprise Linux 5 with virtualisation is that we can install and roll out a new application in 60 minutes to all our systems, compared to four hours per system previously,” says Hennie Stam, senior system administrator at DS+V.

WHO OWNS THE CLOUD?

Others see further potential benefits. Fredrik Sjostedt, director of product marketing EMEA at virtualisation company VMware, believes this is a staging point on the way to cloud computing. “One of the big issues with cloud computing so far has been the use of proprietary cloud platforms, which make it very difficult for an IT department to move their workloads into the cloud,” he says. “Then you have the problem of vendor lock-in, whereby a customer that has gone to the effort of re-writing an application for the cloud finds it too difficult to change providers.”

Because virtual machines are hardware independent and portable, he says, virtualisation can help customers to move their applications between their own data centres – or the internal cloud – and external clouds. “This idea of federation between internal and external clouds based on virtualisation is where we are focusing a great deal of our development efforts,” he adds.

Inevitably, there are a handful of caveats. Chris Coulter is a partner at City lawyers Morrison & Foerster and, although he recognises the considerable benefits of virtualisation, he has concerns. “Depending upon how [virtualised systems] are used, there may be issues for either the user or the vendor or both regarding data security, privacy and other legal compliance,” he says. “Since virtualisation depends upon moving data around the world, perhaps splitting it up and sending it to different locations, depending on capacity, use and bandwidth, it’s much more difficult for the user to know where the data is held.”

It gets even more complicated when the data holder is regulated by the financial authorities, he adds, in which case, different regulations will apply in different territories.

In a period when increased efficiency is a major corporate goal across the board, however, the benefits of virtualisation remain clear. On the face of it, it’s a no-brainer. Does your enterprise want to buy hundreds of servers and systems – or tens that can behave like hundreds, with all the savings in time and energy that go with that? For many CIOs, it’s not an issue that requires much thought.

From virtualisation to the Cloud

Having embraced virtualisation, some organisations are using early wins in the area to start exploring cloud computing in more depth. One example is the Pensions Regulator, the UK government body charged with overseeing work-based pension schemes. There, virtualisation technology from VMware has allowed IT staff to decommission over 40 physical servers and cut power and cooling costs by 30 per cent.

It’s a great start, but the journey is far from over, says Ray Heffer, technical infrastructure manager at the Pensions Regulator. “Perhaps the most critical service we support is a pensions web portal, ‘Exchange’, for pensions scheme administrators across the country, and we have already taken steps towards a cloud approach to support this, using VMware.”

The physical infrastructure and virtualisation technology needed to support the 24x7 portal is provided by a hosting provider, he explains, “but crucially, we can monitor and maintain the virtual machines running on that infrastructure centrally, as if they were within our own data centre.”

“This has been such a success that we are now looking at using this hosting facility for offsite disaster recovery purposes in the future,” he adds.

Virtualise to capitalise

The benefits of virtualisation are something that no organisation with an eye on efficiency can afford to ignore, says Guy Clapperton

THE VIRTUALISATION PROMISE

With a virtualised environment, organisations have the opportunity to get their entire IT infrastructure running as a single pool of highly efficient computing resources.

Source: VMware

One application per operating system no longer applies

Rotterdam, where city officials are keen advocates of virtualisation

Page 8: The Times Efficient IT Supplement


There’s no doubt that business executives the world over are sold on the efficiency benefits promised by cloud computing. What’s not to like about an infrastructure model that, through smarter use of hardware resources, promises to deliver a hefty boost to server and storage utilisation rates and to cut bloated energy bills?

By using a cloud service provider, organisations don’t even have to own the IT infrastructure. They can just rent it on an on-demand basis.

Attractive as that promise may be in capital expenditure terms – think of the money saved on servers, storage and networking equipment – many companies still have understandable concerns about releasing valuable corporate information onto public IT infrastructures that they will probably be sharing with other companies.

At the very least, those infrastructures may be sited in other countries, even other continents. What will that mean for these companies in terms of data protection compliance?

There’s also the issue of service level agreements. While public cloud providers certainly offer their customers a comprehensive range of SLAs in areas such as application uptime and capacity/performance management, there are still some systems that companies consider so vital to business performance that each requires its own set of deep-grained SLAs, specific to that system.

And that’s without mentioning the many compliance standards that companies must adhere to, both regulatory and legal. These dictate that application deployments may be subject to rigorous rules regarding the way they are developed and run. “Public clouds that have successfully completed various audits provide these protections as part of their services, but their assurances don’t cover how you use their services, and ensuring use of these clouds in a compliant fashion can be tricky,” says James Staten, an analyst with IT market research company Forrester Research.

For these reasons, he says, many organisations are “taking [the cloud computing] concept in-house and building their own internal clouds.” His company defines an internal cloud as a “multi-tenant, dynamically provisioned and optimised infrastructure with self-service developer deployment, hosted within the safe confines of your own data centre.”

It’s early days for this kind of model, but already, it has captured the imaginations of the technology industry and its customers alike. In a survey conducted by Forrester among enterprises and small to medium-sized businesses [SMBs] in North America and Europe in the third quarter of 2008, four per cent of respondents said they had implemented an internal cloud, while 17 per cent said they were interested and were either implementing or budgeting to deploy one.

Private clouds are an even newer take on established versions of cloud computing. These offer the benefits of smaller, cloud-like IT systems that operate within a closed internal network, but which can also be opened up, where appropriate, to external services or to the internal systems of partners and suppliers, without deviating from internal control standards.

“Public and private clouds will eventually all be the same thing,” predicts Marc Silvester, global chief technology officer at Fujitsu.

Smart companies that are not ready or willing to release corporate information onto public IT infrastructures are exploring a new way to bring all the benefits of cloud computing under private ownership and control. Jessica Twentyman reports

A cloud of our own

Private clouds: All the benefits of cloud computing, but with greater internal control over corporate information

PERSPECTIVES ON THE PRIVATE CLOUD

A private cloud, by necessity, is constructed from a range of technologies from different IT suppliers. So when Chuck Hollis, vice president and CTO at EMC, took the stage in May 2009 at the company’s EMC World user conference, it was no surprise that he was joined by executives from networking giant Cisco and virtualisation vendor VMware.

Together, said Hollis, the combined forces of EMC, VMware and Cisco offer the full range of required technologies to create private clouds, in a package that offers full integration between components, as well as a compelling future product roadmap.

The partnership between VMware, Cisco and EMC is based on key technologies from each company: the vSphere virtual data centre operating system from VMware; Cisco’s Unified Computing System (UCS) and Unified Fabric offerings; and EMC’s virtual information infrastructure, based on its new Virtual Matrix Architecture and the Symmetrix V-Max storage systems.

We spoke to leading executives at these three companies, to get their perspective on what the private cloud means to their company and what it could offer in return.

Bernadette Wightman, channel director for Cisco UK and Ireland

“Cloud infrastructures, whether they’re internal, external or a federation of the two – which is where most organisations want to get – need to be underpinned by dynamic and scalable networking technology, which is where Cisco excels. It simply makes sense for companies to access certain services from external third parties. Just look at the popularity of WebEx conferencing, web-based email and software-as-a-service applications such as Salesforce.com. From the customer’s perspective, it’s a matter of where to deploy their resources. Where will they get the best returns on their efforts? There are some services that will remain in house and some that will come from third-party providers. What’s important is being able to tie them into an infrastructure that can deliver on demand and at scale, one that is fully virtualised and integrated according to industry standards.”

Adrian McDonald, vice president of EMC UK & Ireland

“From a CIO’s perspective, embarking upon a cloud computing initiative can be a gamble between efficiency and control. The private cloud changes that, by bringing together both internal and external services in a way that complies with internal controls. From there, CIOs can start to offer these services to the business as a whole in the form of an internal market for IT resources, where services can be offered to departments and functions at a cost based on the resources used and the quality of the service delivered. But to do that, you’ve got to make the connection between price and quality, in the form of meaningful service level agreements. From companies that we speak to, we’ve learnt that they’re not prepared to go to anyone other than tried and tested names to give them the infrastructure to underpin this kind of environment. We’ve been close partners with Cisco and VMware for many years – and, with this initiative, this is a partnership with real purpose. There’s a natural fit here. An organisation looking to create a private cloud needs a cloud operating system, and that’s where VMware comes in. It needs robust and scalable cloud networking, and that’s where Cisco comes in. And it absolutely needs virtual information management, and that’s EMC’s main focus.”

Chris Hammans, regional director for VMware UK & Ireland

“For us, the fundamentals of cloud computing start and end with the thing that organisations know and love today – their data centre. But it’s a love/hate relationship: they know they have inefficient servers and services within that data centre. Companies have already started on the private cloud journey by virtualising systems to give themselves a data centre that offers more efficiency and control and choice. That element of choice is vital in a private cloud environment, because if organisations are going to tie in external services too, then the industry has to work together to ensure that public and private clouds can work together and are built on industry standards. Nobody wants to move to a cloud environment that locks them into a particular infrastructure or provider, or forces them to re-write applications. That defeats the object and purpose of cloud computing.”

WEBINAR

Private Cloud – The future shape of cloud computing. Wednesday 29 July 2009

Join VMware, Cisco and EMC for an overview and discussion of the “Private Cloud” vision – including a full review of how the technologies of today provide the building blocks for the cloud computing of tomorrow. Registration and full agenda at www.emc.co.uk/vce-webinar

WHAT IS A PRIVATE CLOUD?

A private cloud is where internal and external cloud computing models meet, enabling an organisation to tap into the services of third-party providers, without relinquishing control over valuable corporate information.

© Copyright 2009 EMC Corporation. All rights reserved.

Cloud computing: a shared pool of infrastructure resources that can flexibly accommodate business services

Page 9: The Times Efficient IT Supplement


“We’re currently seeing companies taking their first steps, experimenting with both and then looking to tie the two together, so that content and information that is commercially sensitive stays private, but commodity services can be purchased from a provider and delivered at a changeable rate, according to need.”

Virtualisation lies at the heart of any private cloud architecture. Many organisations have this already and are achieving huge efficiency gains as a result (for more on this, see article on page 7, ‘Virtualise to capitalise’).

“Architecturally, an internal cloud isn’t that different from a virtualised scale-out infrastructure in today’s enterprise. Both are composed of a collection of servers, topped with either a grid engine or a virtual infrastructure based on hypervisors,” says Staten of Forrester Research.

But private clouds differ in two key respects. First, in a private cloud, developers deploy new applications to the cloud via a self-service portal, without needing the help of systems administrators to configure a server for them. They simply configure a ‘virtual machine’ themselves.
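As an illustration of what self-service deployment might look like behind such a portal, here is a minimal sketch in Python. The catalogue sizes, quota scheme and function names are all assumptions, not any particular vendor’s API; the point is that policy, not an administrator, approves the request.

# Hypothetical template catalogue offered by the self-service portal.
VM_CATALOGUE = {
    "small":  {"vcpus": 1, "ram_gb": 2,  "disk_gb": 40},
    "medium": {"vcpus": 2, "ram_gb": 8,  "disk_gb": 100},
    "large":  {"vcpus": 8, "ram_gb": 32, "disk_gb": 500},
}

def request_vm(owner, size, quota_remaining):
    """Validate a developer's request against quota and return an
    approved specification, with no administrator in the loop."""
    spec = VM_CATALOGUE[size]
    if quota_remaining[owner] < spec["vcpus"]:
        raise PermissionError(owner + " has insufficient vCPU quota")
    quota_remaining[owner] -= spec["vcpus"]
    # A real portal would now hand the spec to the virtualisation layer.
    return {"owner": owner, "status": "provisioning", **spec}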

Second, and arguably more importantly, private clouds offer a hefty dose of automation that frees systems administrators from manual administrative tasks, “such as determining the best placement of new workloads and optimising the virtual pool to make room for more applications,” says Staten. (For more on data centre automation, see article on page 13, ‘Just keeping the lights on’.)

What this amounts to is an IT infrastructure primed to manage and store burgeoning volumes of corporate information in a more efficient way, says Adrian McDonald, vice president of EMC’s UK & Ireland operations. “What we’re ultimately talking about is freeing up information from its physical infrastructure,” he says.

That has two important benefits.

First, it enables organisations to dramatically increase the utilisation of that physical infrastructure. “In many companies, utilisation rates for servers and storage systems hover at around 10 per cent. The virtualisation capabilities of a private cloud infrastructure can push these up to 70 per cent plus. So straight away, you’re reducing the costs associated with information management and storage.”

Second, with the appropriate information management tools in place, he says, organisations can move that information around the infrastructure, according to that information’s overall value to the business. “In a private cloud, data can be moved and manipulated more freely – that could involve, for example, the migration of data from operational systems to a data warehouse. An organisation that can achieve more agility with its information is in a better position to analyse and interpret it. In other words, they can more easily turn data into information.” (For more on information management and storage challenges, see article on page 4, ‘Growing pains’.)

Naturally, any talk of cloud computing raises inevitable questions about data security. How do you lock down data when it resides not behind a traditional firewall, subject to standard network security approaches, but somewhere out there in the cloud? Approaches are emerging that aim to tackle this issue head-on, but for now, it’s sufficient to say that organisations need to move to a model whereby the security of data and information takes priority over infrastructure-centric approaches. (For more on security in virtualised environments, see article on page 14, ‘Walls come tumbling down’.)

Either way, the private cloud trend is clearly one that no organisation can afford to ignore. The advice from experts is to start small. “In this economy, few companies can afford to invest in a massive internal cloud. You will likely limit your cloud to a small set of systems, since the cloud’s frequency of use and total capacity needed won’t entirely be known and every IT investment needs a clear business case today,” says Staten of Forrester Research.

Many organisations will need help with that, says Aad Dekkers, chief marketing officer at MTI Europe. “Virtualisation that covers servers, storage, networking and desktops involves a range of skills that can test the in-house resources at even large organisations, which is where a trusted partner can help.”

The most important thing, however, is that organisations get that start under their belts as soon as possible. After all, says Staten, the economic value of an internal cloud “rises with its use, which normally means inviting as many applications as possible.”

At his company, analysts are increasingly seeing a cross-over between data centre virtualisation and automation, and the multi-tenant, scale-out infrastructures of cloud computing. “There’s a good reason there’s so much hype around cloud computing right now – it’s the fulfilment of an architecture we have all been seeking for many years, a shared pool of infrastructure resources that flexibly accommodate business services.”


WHAT THE ANALYSTS SAY

“The business wants cloud computing, because it wants fast time-to-market and to pay only for what it consumes; that requires IT resources to organically adapt to the business and deliver com-mensurate economics. An internal cloud provides businesses with the same assurance that the specific safeguards and processes that govern the business are being applied. Before this, you could get one or the other, but not both. An internal cloud accelerates the evolution of your virtual infrastructure to a true utility model and your IT department to an internal service provider. So embrace this trend and leverage it to transform your organisation.”

James Staten, Forrester Research

“I believe that enterprises will spend more money building private cloud computing services over the next three years than buying services from cloud computing providers. But those investments will also make them better cloud computing customers in future. Building a private cloud computing environment is not just a technology thing – it also changes management processes, organisational culture and relationships with business customers. And these changes will make it easier for an IT organisation and its customers to make good ‘cloudsourcing’ decisions and transitions in future.”

Thomas Bittman, Gartner

“The cloud is at its core nothing more than flexible hosting. It has three core attributes: cost, control and performance. If it doesn’t have cost advantages, there is no point in doing it. If control isn’t adequate, it can’t be secured (creating an inexpensive way to get folks fired). And if performance drops, the cost savings can’t be justified. Given that the cloud is based on dynamically shifting loads across wide distances and locations, it would seem that the network is, in fact, the central critical path. You can’t forget the servers, any more than you can forget the structure in a new house, but you focus on optimising the network so that your cost, control and performance needs are met. Other parts come to mind, too: the virtualisation and storage layers. Information from all three – the virtualisation platform, the storage platform and the network – needs to be optimised to assure that the resulting cloud system performs to specification.”

Rob Enderle, The Enderle Group

For insight from IDC market analyst Chris Ingle on how cloud computing and virtualisation can boost business continuity, please see article on page 11, ‘A better way to defeat downtime’.

Page 10: The Times Efficient IT Supplement

Fujitsu is helping private and public sector organisations find new ways to reduce their cost base and enable greater operational flexibility.

With our standardised services, we’re substantially reducing the cost, complexity and lead time commonly involved in implementing and managing IT services.

Instead, our clients benefit from lower cost, enterprise-class IT services, pre-built to perform. It offers them a viable alternative to owning IT infrastructure, reduces capital expenditure and provides real flexibility moving forward.

Find out more. Tel: +44 (0) 870 242 7998. Email: [email protected]

WORKING WITH LEADING ORGANISATIONS TO REDUCE THE IT COST BURDEN

uk.fujitsu.com


Page 11: The Times Efficient IT Supplement


If one was asked to summarise the benefits of virtualisation in a single sentence, the increasingly popular mantra of “do more with less” would get pretty close. In fact, in the early stages of virtualisation at least, most organisations seem happy to “do the same with less” – content with the apparent “immediate benefits” of significant cost reductions through server consolidation.

However, as we look beyond server consolidation, and consider the impact of virtualisation on wider business processes, the promise of “doing more with less” starts to become clearer. And as many organisations are starting to realise, business continuity is a critical IT discipline where the opportunity to actually achieve the feat is particularly strong.

Many observers still equate business continuity with disaster recovery – ensuring operations keep going when the entire data centre burns down. Essential as it is to plan for extremes, almost all the cost and risk addressed by business continuity is in fact the result of more common instances of application downtime. Mundane as it may sound, anyone who has experienced a prolonged shutdown of their email system, or simply slow network performance, will immediately understand the damage to productivity (and morale) that even short episodes of downtime can cause.

Virtualisation holds out the promise of big reductions in the cost of minimising downtime, whilst simultaneously enabling very significant increases in levels of application performance and availability. (For more on virtualisation, see article on page 7, ‘Virtualise to capitalise’.)

Let’s have a look at four different business continuity situations which illustrate how virtualisation can deliver on this “breakthrough” promise when compared with traditional, physical IT environments.

SERVER MAINTENANCE

Most downtime is planned. It’s a simple requirement to shut down many operating environments when maintenance or the implementation of new features is planned.

In a virtualised environment, virtual machines with applications running can be moved in real time to other servers, without any disruption to service levels. This means that server maintenance tasks can be performed at any time of day (or night), with zero scheduled downtime.

SERVER FAILURE

Unplanned downtime arising from hardware or software failure can be very costly to a business.

In a physical environment, an operating system and the applications it supports are highly dependent on the server on which they are hosted. If the server fails, and there is no failover provision, the application will be taken out of operation until the server can be restored.

In a virtual environment, virtual machines (VMs) can be created in pairs that run in lockstep, but on different physical servers – a passive machine essentially mirroring the active one.

In the event of an unexpected hardware failure that causes the active, primary VM to fail, the secondary, formerly passive VM immediately picks up where the primary left off, and continues to run, uninterrupted, and without loss of network connections or transactions.
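One common way to implement that takeover is a heartbeat: the passive copy promotes itself when signals from the active copy stop arriving. The toy model below, in Python, is a sketch of the idea under assumed names and timings, not how any particular hypervisor does it.

import time

class MirroredPair:
    """Toy active/passive VM pair: the passive mirror takes over when
    heartbeats from the active machine stop arriving."""

    HEARTBEAT_TIMEOUT = 3.0  # assumed seconds of silence before failover

    def __init__(self):
        self.active = "vm_primary"
        self.last_heartbeat = time.monotonic()

    def heartbeat(self):
        """Called regularly while the primary is healthy."""
        self.last_heartbeat = time.monotonic()

    def check(self):
        """Promote the mirror if the primary has gone quiet."""
        if time.monotonic() - self.last_heartbeat > self.HEARTBEAT_TIMEOUT:
            self.active = "vm_secondary"
        return self.active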

WORKLOAD MANAGEMENT

Less visible than server failure, but often just as costly over time, is the issue of workload management. High levels of utilisation can significantly affect the ability of an application to perform, with costly consequences where the application is performing a mission-critical function.

In a physical environment, the application is entirely dependent on the resources of the server on which it operates. Either server capacity has to be provisioned at all times to cope with peaks in activity – meaning wasted resources at all other times – or the business has to incur the cost of degraded performance or even failure at times of peak activity.

In a virtual environment, VMs that require extra processing capacity can be redeployed, with no interruption to other hosts, affording them greater performance. This process is typically automated, with pre-set business rules dictating at what point and to which destinations overloaded VMs will be redeployed. This reduces both the cost of outage itself and the cost of staff required to manage the rebalancing of workloads.
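Those pre-set business rules might look something like the sketch below, in Python, with an assumed utilisation threshold and a naive least-loaded placement choice; production schedulers weigh many more factors.

OVERLOAD_THRESHOLD = 0.85  # assumed: redeploy when host utilisation passes this

def rebalance(hosts):
    """hosts maps host name -> (utilisation, list of VM names).
    Returns (vm, source, destination) moves, shifting one VM from each
    overloaded host to the currently least-loaded host."""
    moves = []
    for src, (util, vms) in hosts.items():
        if util > OVERLOAD_THRESHOLD and vms:
            dst = min(hosts, key=lambda h: hosts[h][0])
            if dst != src:
                moves.append((vms[0], src, dst))
    return moves

# Example: rebalance({"host_a": (0.95, ["crm"]), "host_b": (0.30, [])})
# -> [("crm", "host_a", "host_b")]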

LARGE-SCALE DISASTER

Although large-scale disaster recovery is rarely invoked, the potential impact of such a disaster may be enough to risk putting the company out of business.

Virtualisation can dramatically reduce the costs associated with provisioning for, and executing, disaster recovery processes in two main areas.

The first is the cost of the recovery environment. In a physical environment, a full mirror site has to be maintained in the event of partial or total failure of the main production site. This can be prohibitively expensive. In a virtual environment, the seamless portability of VMs means that different (physically separate) parts of the production environment can be used to host mirror copies of VMs from other parts. This makes the provisioning of the recovery environment more cost-effective in terms of both cost of infrastructure and cost of management. This is obviously best practised across multiple sites.

The second area is the cost of the recovery process: by providing a fully automated framework for disaster recovery, the DR process in a virtualised environment is far faster and easier than in a physical environment, where installation, reconfiguration and testing of the restored OS and applications is a laborious process.

A virtual environment can potentially recover a system in hours rather than days. A traditional system can take many hours to rebuild the operating system and application configuration. In the case of virtualised systems, the back-up of the complete virtual machine can be directly restored without operating system or application reinstall, requiring much less testing.
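The difference in recovery effort shows up clearly in code. In this sketch the whole machine is a single image file, so “restore” is a copy plus a power-on; the `power_on` hook is hypothetical, standing in for whatever registration-and-boot call the hypervisor’s management API provides.

```python
import pathlib
import shutil

def restore_vm(backup_image, datastore, power_on):
    """Image-level DR restore: no OS reinstall, no application reconfiguration.

    The backed-up virtual machine *is* the system -- operating system,
    application and configuration in one file -- so bringing it back is a
    copy followed by a boot, rather than a rebuild measured in days.
    """
    src = pathlib.Path(backup_image)
    dst = pathlib.Path(datastore) / src.name
    shutil.copy2(src, dst)  # pull the complete machine state back
    power_on(dst)           # hypothetical hypervisor call to register and boot
    return dst
```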

Adequate preparations
From the discussion so far, it will hopefully be clear why virtualisation can bring about significant improvements to business continuity performance, optimising levels of assurance to the business within the limits of acceptable spend.

However, the benefits of server virtualisation with regard to high availability and data protection may be severely compromised if the wider information infrastructure is not adequately prepared.

Data back-up offers a prime example. A defining benefit of virtualisation is the significant increase in server utilisation rates. But this means that if, for example, a virtualised server increases its average utilisation from 20% to 70%, its spare processing capacity is no longer available as a performance buffer. Back-up and replication processes for databases and application files still demand time and processing bandwidth, which must not be allowed to impede the performance of a highly utilised server.

In a virtualised environment, therefore, critical back-up processes will need to be re-examined and changed where necessary. Back-up to disk and data deduplication are increasingly coming to be perceived as essential components of a high-performance information infrastructure. The processes themselves will be highly automated and intelligent: scheduling and provisioning will be driven dynamically in real time by predefined business rules, and applications will be restored in an order that reflects their importance to the business.
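Deduplication itself is easy to sketch: split the back-up stream into chunks, hash each one, and store a chunk only the first time its hash is seen. The fixed-size chunking below is the simplest possible form; shipping products typically use variable-size chunking and far more robust indexing.

```python
import hashlib
import io

def dedup_backup(stream, index, chunk_size=4096):
    """Store each unique chunk once; return the recipe to rebuild the file."""
    recipe = []
    while True:
        chunk = stream.read(chunk_size)
        if not chunk:
            break
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in index:
            index[digest] = chunk  # only genuinely new data costs disk
        recipe.append(digest)
    return recipe

# Two near-identical nightly back-ups share almost every chunk:
index = {}
night1 = dedup_backup(io.BytesIO(b"A" * 16384), index)
night2 = dedup_backup(io.BytesIO(b"A" * 12288 + b"B" * 4096), index)
print(len(index), "chunks stored for", len(night1) + len(night2), "referenced")
```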

The overall end-game of combining server virtualisation with a highly efficient and automated data management infrastructure is to enable the IT function to move beyond the ineffective and wasteful policy of equal business continuity provisioning for all applications.

Under that one-size-fits-all policy, provisioning can fall short of real requirements for highly critical applications (creating risk) and, conversely, can be unnecessarily high for non-critical applications (creating waste).

It is this fundamental shift from static, “highest common denominator” provisioning to dynamic, intelligent provisioning that enables virtualisation to truly deliver on its promise to help organisations “do more with less”.

ABOUT THE AUTHOR

Chris Ingle is consulting director, systems research with IDC, and co-author of the 2009 white paper, “Virtualisation and Business Continuity”. A full copy of this document is available from www.emc.co.uk/bc

A better way to defeat downtime
Virtualisation is helping companies to achieve the goal of ‘business as usual’ more efficiently than ever before. Chris Ingle of IDC explains

SEAMLESS TRANSITION

In the event of unexpected hardware failure that causes an active, primary virtual machine (VM) to fail, the secondary, formerly passive VM picks up where the primary left off.

Source: VMware


EMC2, EMC, and where information lives are registered trademarks of EMC Corporation. © Copyright 2009 EMC Corporation. All rights reserved.

Deliver the cost savings and productivity improvements your business demands.

MTI and EMC are uniquely positioned to help you take a holistic approach that enables you to address IT challenges one at a time or across your entire information infrastructure. Our unmatched expertise and experience combined with the broadest range of industry-leading solutions have enabled customers to:

• Realise a 25% reduction in storage TCO in less than a year
• Reduce backup data and time by 90%
• Reduce data centre space, power, and cooling costs by 70-80%
• Reduce e-mail operational costs by 50%
• Lower cost of compliance by 30-70%
• Decrease security spending by 150%
• Manage 3-4 times more servers, storage, and network devices without adding headcount

MTI invite you to ‘touch and feel’ these savings in our newly launched Solutions Centre.

Call 01483 520227 or email [email protected] to book your appointment.

For more information visit http://www.mti.com

Virtualise Consolidate Deduplicate Automate Protect Comply

Thrive

Information Infrastructure, Insight


The typical IT department spends 70 per cent of each day ‘just keeping the lights on’. That means that the vast majority of staff time and resources are dedicated to routine, day-to-day administrative and troubleshooting tasks, rather than investigating opportunities to support overall company strategy.

That’s clearly not efficient, and in recent years, the situation has prompted widespread investment in data centre automation tools. These are designed to take on tasks that would otherwise need to be performed manually, automating the processes of managing enterprise applications based on policy and priority and maximising the use of available hardware resources.

We asked four IT industry executives for their advice on using automation to relieve the data centre management burden. Our commentators are:

Chris Ingle, a consulting director, systems research with IDC; Andy Waterhouse, technology services director for EMEA at information management company EMC; Robert Schwegler, chief technical architect at Betwin, the world’s largest listed gaming company; Luca Lazzaron, vice president and general manager for EMEA at systems and service management specialist BMC.

In your experience, how much time do IT departments spend ‘just keeping the lights on’?
CI: Our research suggests that large UK companies are only devoting 13 per cent of their information and communication technology (ICT) resources to new developments, a figure that hasn’t changed in five years. Operational issues are important, of course, but it’s an even bigger problem if you can’t launch new projects.

AW: Absolutely. If you want to improve business performance, you’ve got to devote more time to strategic projects. Saving time in the data centre means you’ve got time to start looking at things like virtualisation and cloud computing, which can really drive value.

What are the key technologies available for data centre automation and which do you think are most interesting? Why?
AW: There are four key stages in data centre automation. First, there’s discovery - finding out what you’ve got and the relationships between those components. Next, you have IT service management, which governs how services are delivered. Then, there’s root cause analysis, which is about understanding the causes of problems. Finally, there’s change and configuration management. Any or all of those processes can be improved and monitored with automation tools.

CI: There are lots of automation tools available, and what you need will vary depending on your company’s IT infrastructure. But automation will only be effective if you invest time in rationalising and virtualising the data centre first – that’s essential or you’re simply automating chaos.

Can any data centre task be automated, or are some tasks more difficult than others?
CI: Personally, I’d say you can automate pretty much any administrative task by either having software perform the task, or having the user request a service, and automatically provisioning that service.

LL: Repeatable, standardised processes are the most suitable candidates for automation. Tasks that involve subjective decisions make less sense, but you can still partially automate these processes, if you can supply the decision-maker with the necessary information in a single, automated view. For example, we have an incident management tool that automatically collects and presents the data needed for operators to identify changes that may be causing a problem, thereby reducing time to diagnose and repair.

RS: At Betwin, there’s no single answer about what we should and shouldn’t automate – it’s all about return on investment. For high-volume ecommerce applications, it makes sense to automate by building up a blank server with the target application stack and configuring it automatically. This isn’t hard to do and can be done with relatively simple tools. The less convincing arguments are around complex work such as orchestration and analysing different sequences of changes to minimise downtime.

When selecting a data centre automation product or supplier, what are the most important questions to ask?
AW: I think it’s key to ask [suppliers] not just what they’re doing today, but also how they will support future data centres. Whether you like it or not, the world is going virtual, and from an automation point of view, you want to be sure you’re looking at tools which can automate that kind of infrastructure. So you should ask if they can manage both physical and virtual environments and integrate the two.

RS: It’s critical to ask how an upgrade of an automation tool is likely to impact the test and production environment. I know that […] this can easily become horribly complex.

How will data centre automation help as organisations increasingly look to adopt newer infrastructure models, such as utility computing, service oriented architectures (SOAs), software as a service (SaaS), virtualisation and so on?
LL: These new architectures bring increased complexity. Automation will be critical in freeing up resources to manage this increased complexity. As we’ve seen within many organisations, virtualisation offers dramatic savings on capital expenditure, at the cost of increased operational expense. With a data centre automation solution, companies can achieve the best of both worlds, with lowered capital and operational costs.

RS: The great thing about automation is you don’t get the high-pressure, very risky changes rolled out manually overnight. It’s often easier to produce a version of a virtual computer, make a snapshot, deploy a new version and, if it doesn’t work, fall back to the last known ‘good state’, instead of defining the delta change and saving only that before updating. Without a doubt, the tools are getting there to help to support complex tasks in a data centre, but one needs clear support contracts [from suppliers]. If something doesn’t work, the experts are sitting outside of your company and may be hard to reach.

TOP CANDIDATES FOR AUTOMATION

Which data centre tasks are ripe for automation, taking them out of the hands of qualified IT professionals, who then get to spend their time on more strategic work?

Discovery and dependency mapping
Many organisations have no real knowledge of how many servers and other hardware resources they have in their environment, what operating system version and patches are running on them, and how these assets depend on each other to run smoothly. Data centre automation tools can gather this information by searching corporate networks, so that systems are maintained at necessary operating levels.
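A first, crude step in discovery can be done with nothing more than the Python standard library, as in this sketch: probe a handful of well-known ports and record what answers. Real tools go much further – agents, SNMP, WMI, traffic analysis – and infer dependencies from who talks to whom; the addresses and port list here are purely illustrative.

```python
import socket

COMMON_PORTS = {22: "ssh", 80: "http", 443: "https", 3306: "mysql"}

def probe(address, timeout=0.5):
    """Return the recognisable services answering at `address`."""
    found = []
    for port, service in COMMON_PORTS.items():
        with socket.socket() as sock:
            sock.settimeout(timeout)
            if sock.connect_ex((address, port)) == 0:  # 0 means the port is open
                found.append(service)
    return found

# Build a rough inventory of a (hypothetical) subnet:
inventory = {addr: probe(addr) for addr in ("10.0.0.5", "10.0.0.6")}
print(inventory)
```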

Service provisioning
If you need to get a new service up and running, it can easily take six weeks to provision a new server to run a new application. Using automation, data centre staff can automate processes such as configuring a new server and applying patches and security policies to it. Combined with virtualisation technologies, these tools can help IT staff to provision a new server – in the form of a ‘virtual machine’ – in less than one hour.
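Expressed as code, automated provisioning is essentially a fixed pipeline of steps that a human would otherwise perform by hand. The hook names in this sketch are invented; the point is the shape – every step scripted, repeatable and logged – not any particular tool’s API.

```python
PROVISION_STEPS = (
    "clone_vm_template",     # minutes, versus weeks waiting for hardware
    "configure_network",
    "apply_security_policy",
    "install_patches",
    "register_monitoring",
)

def provision_server(name, hooks):
    """Run every provisioning step for `name`, failing loudly if one is missing."""
    for step in PROVISION_STEPS:
        hooks[step](name)
        print(f"{name}: {step} complete")

# Wiring in trivial stand-in hooks to show the flow end to end:
hooks = {step: (lambda n: None) for step in PROVISION_STEPS}
provision_server("web-042", hooks)
```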

Change management
Changes to systems and the applications that run on them need to be managed and monitored. With 80% of data centre downtime caused by changes to the infrastructure, why not remove human error from this process by automating change management?

Resource management
Automation tools can be used to continually manage physical (and sometimes virtual) infrastructure in the data centre to reduce operational risk and improve efficiency, by monitoring and controlling equipment, power, cooling, network infrastructure and storage.

Patch management
Compared to a manual approach, automated patch management can reduce the annual cost of installing software patches from £150 per computer to £25 per computer, according to researchers within Novell’s Zenworks business.
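Scaled up, those per-machine figures are worth making explicit. A rough calculation, assuming for illustration a 1,000-PC estate:

```python
fleet = 1_000                # assumed estate size, for illustration only
manual, automated = 150, 25  # annual patch cost per computer (GBP), per Novell
saving = fleet * (manual - automated)
print(f"annual saving: £{saving:,}")  # annual saving: £125,000
```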

Just keeping the lights on
If IT staff are too busy working on routine, day-to-day tasks, there’s no time left for them to explore the strategic projects that could deliver real business value. That’s where data centre automation comes in, industry experts tell Sally Whittle

“Automation will be critical in freeing up resources to manage this increased complexity”

“If you want to improve business performance, you’ve got to devote more time to strategic projects”

Luca Lazzaron

Andy Waterhouse


Any management team concerned about the security of their corporate data might imagine that the best way to stop strangers gaining access would be to keep it on a protected corporate server, in a top-security building - preferably one with bars on the windows. On that basis, they might be horrified by the concept of ‘virtualising’ that server and moving it out into the cloud.

Some security specialists say they’d be wrong; the opposite, in fact, is true. Their argument goes like this: it can be more secure to ‘de-perimeterise’ data security and move an IT infrastructure to a truly virtual one. The way to do this successfully is to make the data itself secure, not the box on which it runs.

As Eric Baize, senior director of the product security office at EMC, puts it: “If you look at history, the whole security industry has been chasing infrastructure innovations; as a consequence, we came to a situation where security was not manageable. But with virtualisation, we are making security inherently part of the infrastructure.”

The message that data should take precedence over infrastructure in security terms is one that the Jericho Forum, an industry think-tank, has been preaching for four years. But many security professionals find this new, “walls come tumbling down” theory alarming, and continue to insist on robust firewalls and intrusion detection and prevention technology to guard their companies’ perimeters.

Real or not?
Virtual servers, however, don’t work like ‘real’ computers. They are imaginary machines.

The first time people hear this, their reaction is incredulity; but in fact, the concept of virtualisation is pretty straightforward. You take a standard personal computer or server, but you don’t load the normal Word or Excel programme. Instead, you load a virtualisation package, which sits in a corner of the main computer and runs various operating partitions. One partition might be Windows, another might be Mac OS or Linux; and each copy of the operating system runs different programmes.

A virtual desktop is simpler still: you sit down at a computer, but the software is running elsewhere. All you see is a copy of what would appear on the remote machine’s display. None of the data is on the client device.

Today’s virtual desktop technology includes tools for hiding the data from malware. Today’s virtual network can lock down data, even if the hardware is somewhere outside the company campus.

Well, that’s the concept – but many inside the data centre still see dangers in the approach and are unsure about the solution. For example, a virtual machine that shares a physical system with four other virtual machines could (at least in theory) still hold sensitive data in its memory – data left over when a different virtual machine is closed down. Can intruders get at that data? Nobody can give a firm answer until hackers have tried and failed.

There’s also the ‘multiple vulnerability’ problem. The virtual servers all run on one real machine. That machine has an Internet address. A denial of service attack launched at that IP address could lock the machine up; instead of losing just one server, the corporation loses all the virtual machines together. Is it a real threat? Possibly not; but it’s still a worry for many organisations.

Playing catch-up
The fact is that a lot of corporations still rely heavily on what EMC and the Jericho Forum might regard as obsolete security. The usual way of protecting data is to monitor traffic on the local area network (LAN); if one machine sends data to another, it has to give it to a router, which handles security. If it spots a threat, the data isn’t transmitted. End of story.

But on a virtual server, the virtual machine receiving that data might be on the same piece of hardware. The sending virtual machine need not talk to the LAN at all; it just transfers the data internally. Network security isn’t being used. Yes, there are virtual routers, but many CIOs doubt that they are as rigorous as network-based intrusion detection and prevention devices.
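The virtual-router idea those CIOs are weighing can be reduced to a toy sketch: a hook inside the host that sees every frame passing between co-resident VMs, restoring the inspection point that the physical LAN no longer provides. The rule table and frame format here are invented for illustration.

```python
# Segmentation rules enforced *inside* the host, where VM-to-VM traffic flows:
BLOCKED_PAIRS = {("web-vm", "hr-db-vm")}  # hypothetical policy

def vswitch_forward(frame):
    """Inspect an inter-VM frame before delivering it within the host."""
    pair = (frame["src"], frame["dst"])
    if pair in BLOCKED_PAIRS:
        print(f"dropped: {pair[0]} may not reach {pair[1]}")
        return None      # the frame never reaches the destination guest
    return frame         # delivered internally, but now inspected

vswitch_forward({"src": "web-vm", "dst": "hr-db-vm", "payload": b"SELECT *"})
```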

Those same CIOs, however, are under pressure to adopt virtualisation, if they haven’t already, and to explore its potential further. Already, companies that provide wide area networking services are hoping to persuade them to move whole data centres. Why run a huge battery of power-hungry servers in the heart of the City of London, where electricity supply is at its limits, they say, when you can move them to cheaper premises in Slough? And beyond that, CIOs must consider cloud computing, where server hardware might not even be in the same country. It might not even belong to them.

Jericho Forum board member Paul Simmonds reckons security professionals just have to catch up. To assist them in this, the think-tank has published a set of principles that describe how best to handle security in a de-perimeterised world, its own version of the Ten Commandments.

“My favourite tenet is Commandment No 2: Unnecessary complexity is a threat to security,” he says. “I’ve yet to find a security person who understands this in a virtualisation context.”

HOW SAFE IS THE CLOUD?

When an organisation operates a secure corporate network environment, its data stays on that network. When it migrates corporate systems to the cloud, it’s a different story.

In the cloud scenario, the organisation may not own the computers or the network. All it is given are login details to access the data, which resides on the cloud provider’s infrastructure. There’s almost no way of knowing where the data might be – it may not even be in the same country. And if the data isn’t self-secured, then there’s no security other than that offered by the cloud provider itself.

So how can managers at an organisation that uses cloud computing convince their customers and partners that data is safe?

The Jericho Forum approach is designed to make cloud computing easier for data security specialists. If data isn’t important, you don’t bother securing it at all. If it’s important that nobody changes it, then you impose digital rights management (DRM), or simply store it as a read-only document. And if it is really sensitive, you lock it down with new ‘de-perimeterisation tools’ that the Jericho Forum has been developing.

That doesn’t cover all security threats, of course. Simon Young, general manager for server security in EMEA at Trend Micro, argues that not all problems are data-centric. Some are hardware-centric, he points out: “A credit card approval server should never be compromised.”

Young claims that cloud providers like Trend Micro, who understand virtualisation, are further ahead in data security terms than end-user organisations, who focus on securing perimeters. “We live in the virtual network, working from inside virtualisation, and monitor traffic, deciding whether this is a rogue or a good guy. This is a new boundary.”

And according to Eric Baize, senior director of the product security office at EMC, even thorny issues such as data privacy compliance have been figured out by cloud operators – some of them, at least.

European regulations state that certain types of personal data may not be stored outside the country in which the people concerned live. “So the EMC cloud storage offering can decide where in the cloud data can go, based on security you define for the data,” he says. That means, for example, that companies in countries where privacy rules forbid them to send employee data abroad can decide which server is used in the private cloud (either external or internal), or they can directly define policies so that data on employees is only stored in its country of origin.
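Policy-driven placement of that kind is simple to sketch. The country-to-location table below is invented; the pattern – classify the data, then let policy choose where it may physically live – is what matters.

```python
# Where may personal data from each country be stored? (illustrative table)
RESIDENCY_POLICY = {
    "DE": {"eu-frankfurt"},                # German employee data stays in Germany
    "FR": {"eu-paris", "eu-frankfurt"},
}

def place(record_country, available_locations):
    """Pick a compliant storage location, or refuse outright."""
    allowed = RESIDENCY_POLICY.get(record_country)
    if allowed is None:
        return available_locations[0]      # unrestricted data: anywhere will do
    for location in available_locations:
        if location in allowed:
            return location
    raise RuntimeError(f"no compliant location for {record_country} data")

print(place("DE", ["us-east", "eu-frankfurt"]))  # -> eu-frankfurt
```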

Walls come tumbling down
When data lives in a virtual environment, perimeter security is no longer enough to safeguard it. New approaches are needed, as Guy Kewney reports

Paul Simmonds from the Jericho Forum believes security professionals need to catch up with current thinking on de-perimeterisation

Eric Baize of EMC says it’s time that security was made an inherent part of the virtual infrastructure


It creates understanding, where once there were walls.
It connects a kid to a scientist to a CEO to save a glacier.
It brings ideas together. Passions together. And people together.
It’s the human network effect. The effect that is changing the world.
When technology meets humanity on the human network, the way we work changes.
The way we live changes. Everything changes.
That’s the human network effect

www.cisco.com/uk/effect


EMC2, EMC, and where information lives are registered trademarks of EMC Corporation. © Copyright 2009 EMC Corporation. All rights reserved.

Build Efficient IT with EMC

EMC is uniquely positioned to help you take a holistic approach that enables you to address IT challenges one at a time or across your entire information infrastructure. Our unmatched expertise and experience combined with the broadest range of industry-leading solutions will enable your business to thrive and to emerge from today’s economy stronger than ever.

www.emc.co.uk/efficiency

Virtualise Consolidate Deduplicate Automate Protect Comply

Thrive

