June 15 2006

Technology, Business, Leadership

Page 2: June 15 2006

From the Editor

Don't Cry Wolf

Business gain and not loss should drive IT spend.

“Just threaten to delete all the company’s e-mail and see the response.” That was

a CIO’s response to my query on how to get managements serious about dealing with

unstructured data.

Take the case of another CIO in the service sector. To get business buy-in for beefing up

security, he highlighted how vulnerable the organization was by actually hacking into a

director’s laptop. Funds for infosecurity? It was more a case of how much and when after

that. Pertinently, he stated that he’d tried and failed with the usual route before going in for

more strong-arm tactics. He justified his approach since his organization’s IT spend was a

mere half percent of revenue (other back-end operations took in close to 20 percent).

He's not alone in this. I know of a few CIOs who've used SOX or other regulatory compliance

to get managements to fall in line, adopt processes and loosen their purse-strings.

So, is potential business loss or risk a more persuasive argument for IT investment

than possible business gain? Is scaring the pants off the managing council the way

forward? The answer is yes if you consider the opinions of a few CIOs who’ve been in

touch with me recently.

This is an approach that leaves me very

disturbed. Whatever happened to business-

IT alignment and making a good business

case for IT spend?

Call me Gandhian, but IT projects,

I feel, are as much about the means as they

are about ends. Using scare tactics, I’m

convinced, will neither increase management respect for CIOs and their teams, nor are

they a long-term solution. It stands to reason that a 'cry wolf' strategy can't be employed

repeatedly, since managements are bound to wise up.

Just when I was getting all worked up about this, I interacted with a CIO from a

venerable brick-and-mortar company. He feels that the proper way to show the value

of any IT investment to the management is to avoid attempting anything theoretical. He

also recommended winning over one or more business stakeholders and running pilot

projects as proof-of-concepts to demonstrate the impact of technology on productivity. It

then becomes much easier to build a business case to scale them up enterprisewide, he

observed. It’s a method that’s really worked out well for him.

What do you feel about this? Which approach works for you? Send in your thoughts to me.


Vijay Ramachandran, Editor [email protected]



Page 3: June 15 2006


Piecing Together Storage Management | 32
Managing islands of storage across the enterprise is no mean task, especially at a time when the need for centralized storage management has never been greater.

Feature by Logan G. Harbaugh

Take Charge, Run ILM | 38
Not all information is important. Information Lifecycle Management can organize your enterprise's information based on its value and make your systems less bloated and more stable, apart from bringing overall costs down.
Feature by Leon Erlanger

Virtual Storage | 44
Virtualization isn't taking off like it was supposed to. But that's because companies don't have all the tools, not because anyone doubts its benefits. Start virtualization with point solutions.
Feature by Galen Gruman

Beyond the Rear-View Mirror | 18
For decades, the storage industry has thrived on finding recovery and backup solutions. It's high time it began to create value for itself by looking forward and anticipating issues that enterprises will have to grapple with — tomorrow.
Column by John Webster

Cover Story | Storage: Whipping IT Data Into Shape | 26
While making the transition to service-oriented architecture, how does the modern enterprise deal with the challenges of data integration?
Feature by Galen Gruman


Content

What Disaster Recovery Plan? | 20
A majority of companies don't have disaster recovery plans, and those that do don't test them. If you want the business guys to listen to your DR plans, stop talking about them.
Column by Jon William Toigo

Essential Technology | 62
Data Encryption: Safe and Sound. By Stacy Collett

Pundit | Vendors Rewrite the Rules. By Mario Apicella


Page 4: June 15 2006

Executive Expectations | View From the Top | 50
In five years, ICICI OneSource made a place for itself among India's top-ten BPOs. With President & COO Raju Venkatraman, the company is taking its excellence to its delivery centers and new geographies. He talks about the best practices that will ensure it remains a cut above the competition.
Interview by Ravi Menon

Govern | Creating Champions for Tomorrow | 58
J. Satyanarayana, CEO, National Institute for Smart Government (NISG), realizes just how acute the lack of leaders is among e-governance projects and is determined to do something about it.
Interview by Balaji Narasimhan

Content (cont.)

Trendlines | 13
Security | Mouse Trap
Research | Boards Still Ignorant of IT: Survey
Network Security | Best Practices for the Worst WLAN Security
Telephony | An 802.11 Hoop for Villages
Technology | The Glue Gun and More Sticky Stories
Security | No More Bullets in Bank Heists
By the Numbers | Regulation's Silver Lining
Internet | Broadband for Everyone in Singapore

From the Editor | 3
Don't Cry Wolf: Business gain and not loss should drive IT spend. By Vijay Ramachandran

Inbox | 12


Departments

NOW ONLINE

For more opinions, features, analyses and updates, log on to our companion website and discover content designed to help you and your organization deploy IT strategically. Go to www.cio.in


Raju Venkatraman, President & COO, ICICI OneSource, on the policies that have made it one of India's leading BPOs.

Cover: Illustration by Binesh Sreedharan


Page 5: June 15 2006

Advertiser Index

Avaya 4, 5

Canon 67

IBM India 24, 25

Interface 21

Lenovo 68

Microsoft Cover Gatefold

McAfee 9

Raritan 2

SAS 17

Seagate 35

Symantec 41

StorageTek 47

Wipro 6, 7

All rights reserved. No part of this publication may be reproduced by any means without prior written permission from the publisher. Address requests for customized reprints to IDG Media Private Limited, 10th Floor, Vayudooth Chambers, 15–16, Mahatma Gandhi Road, Bangalore 560 001, India. IDG Media Private Limited is an IDG (International Data Group) company.

Printed and Published by N Bringi Dev on behalf of IDG Media Private Limited, 10th Floor, Vayudooth Chambers, 15–16, Mahatma Gandhi Road, Bangalore 560 001, India. Editor: Vijay Ramachandran. Printed at Rajhans Enterprises, No. 134, 4th Main Road, Industrial Town, Rajajinagar, Bangalore 560 044, India

Advisory Board

Anil Nadkarni, Head IT, Thomas Cook, [email protected]
Arindam Bose, Head IT, LG Electronics India, [email protected]
Arun Gupta, Director – Philips Global Infrastructure Services
Arvind Tawde, VP & CIO, Mahindra & Mahindra, [email protected]
Ashish Kumar Chauhan, Advisor, Reliance Industries Ltd, [email protected]
M. D. Agarwal, Chief Manager – IT, BPCL, [email protected]
Mani Mulki, VP – IS, Godrej Consumer Products Ltd, [email protected]
Manish Choksi, VP – IT, Asian Paints, [email protected]
Neel Ratan, Executive Director – Business Solutions, PricewaterhouseCoopers, [email protected]
Rajesh Uppal, General Manager – IT, Maruti Udyog, [email protected]
Prof. R. T. Krishnan, Professor, IIM-Bangalore, [email protected]
S. B. Patankar, Director – IS, Bombay Stock Exchange, [email protected]
S. Gopalakrishnan, COO & Head Technology, Infosys Technologies, [email protected]
S. R. Balasubramanian, Sr. VP, ISG Novasoft, [email protected]
Prof. S. Sadagopan, Director, IIIT-Bangalore, [email protected]
Sanjay Sharma, Corporate Head Technology Officer, IDBI, [email protected]
Dr. Sridhar Mitta, Managing Director & CTO, e4e Labs, [email protected]
Sunil Gujral, Former VP – Technologies, Wipro Spectramind, [email protected]
Unni Krishnan T. M., CTO, Shopper's Stop Ltd, [email protected]
V. Balakrishnan, CIO, Polaris Software Ltd, [email protected]

Management

President: N. Bringi Dev
COO: Louis D'Mello

Editorial

Editor: Vijay Ramachandran
Bureau Head – North: Rahul Neel Mani
Assistant Editors: Ravi Menon; Harichandan Arakali
Special Correspondent: Balaji Narasimhan
Senior Correspondents: Nagesh Joshi; Gunjan Trivedi
Chief Copy Editor: Kunal N. Talgeri
Copy Editor: Sunil Shah

www.cio.in

Editorial Director – Online: R. Giridhar

Design & Production

Creative Director: Jayan K Narayanan
Designers: Binesh Sreedharan, Vikas Kapoor, Anil V.K., Jinan K. Vijayan, Unnikrishnan A.V., Sasi Bhaskar, Vishwanath Vanjire, Sani Mani
Photography: Srivatsa Shandilya
Production: T.K. Karunakaran

Marketing and Sales

General Manager, Sales: Naveen Chand Singh
Brand Manager: Alok Anand
Marketing: Siddharth Singh
Bangalore: Mahantesh Godi, Santosh Malleswara, Ashish Kumar
Delhi: Nitin Walia; Aveek Bhose
Mumbai: Rupesh Sreedharan, Nagesh Pai, Swatantra Tiwari
Japan: Tomoko Fujikawa
USA: Larry Arthur; Jo Ben-Atar
Singapore: Michael Mullaney
UK: Shane Hannam


Page 6: June 15 2006

Reader Feedback

Competitive Advantage

I have been meaning to give CIO my feedback for some time. The problem is that every time I finish an article and pass the magazine to the rest of my department, it hardly ever comes back. This time, however, I made it a point to read the magazine from end to end.

I agree with CIO's editorial (Champion Growth, June 1, 2006). As globalization catches up with India and Indian companies become more ambitious, there is going to be increasing demand on every key process in an organization. Therefore, streamlining and improving key processes through effective use of IT will make a major difference to an organization's effectiveness in meeting customer demands. The time has come for CIOs and the IT function to take centerstage, but that is easier said than done.

Geoffrey Moore's column (How to Find Your Competitive Advantage?, June 1, 2006) was excellent, but I must add that senior management in most organizations must already know where their competitive advantage lies. Therefore, CIOs don't need to start from scratch, but can work on key strategies and develop support IT applications around these. Even Susan Cramm's column (The Worst Job in IT, June 1, 2006) was well written. In India, however, IT outsourcing is yet to gather significant momentum.

The feature on Google (The Enterprise Gets Googled, June 1, 2006) is also very good, and gave insights into a company whose products we use every day but know so little of. The cover story (Step on IT, June 1, 2006) also made for interesting reading.

In terms of layout and quality of printing, CIO is as good as any international magazine.

N. Kailasnathan, CIO, Titan Industries

Target CXOs

I have received CIO from its first issue and always look forward to reading it. In fact, I also have an online subscription for CIO international and CIO news updates.

The content of the magazine is appropriate to the current-day challenges of the CIO. The articles provide necessary inputs for CIOs, who have evolved from the EDP days to the present time, when they need to respond to complex business demands. CIOs in India, as a breed, still report to the CFO and rarely find themselves on management teams. CIOs in the West lived through similar circumstances a decade ago. It is, therefore, important that CIO also highlights success stories in IT-business alignment, for instance, which showcase a CIO's dream of being able to participate actively rather than merely keeping the lights on.

A few suggestions, if you have not already thought of them: in the Indian context, we need to address the education of chief executives using articles around

technological success stories. This will also serve to improve readership among chief executives. Maybe, events are also a good way to achieve this.

Vendors and consultants make sure that large organizations and MNCs are updated with most best practices. However, small and medium businesses, which have no compulsions to engage vendors and consultants, do not see the value in IT investments. The CEOs of these companies (mostly promoters themselves) also need to be included in the reader group.

S. Sridhar, Head IT, Hutchison Essar South

Practicing What We Preach

The April 1 cover story (Turning IT Doubters into True Believers) made for interesting reading. In MNCs, IT has certainly become a necessity for business. But among Indian organizations, there are still challenges for CIOs in turning doubters into believers.

I also happen to use the principles behind the 'Top Perks of Positive Perception'; 'Top Consequences of Negative Perception'; and 'The Best Ways to Change Perception' in areas where no IT systems were in place.

I would like to be part of a forum of CIOs to share views and exchange ideas.

Arun Phadke, VP – IT, Nicholas Piramal

What Do You Think?

We welcome your feedback on our articles, apart from your thoughts and suggestions. Write in to [email protected]. Letters may be edited for length or clarity.




Page 7: June 15 2006

Trendlines | new * hot * unexpected

Boards Still Ignorant of IT: Survey

Research | Most company boards operate without any of the directors having a working knowledge of what impact IT can have on their business. And few CIOs find a seat on the board, according to new research by a recruitment firm.

The survey, by Talent2 International, covered the top 200 Australian companies and found that fewer than 5 percent of directors displayed any level of knowledge enabling them to provide support and guidance in IT strategy, and fewer than 3 percent have CIOs on the board.

"The figures are indicative of an industry problem," says Con Colovos, Executive Director of the CIO Executive Council, adding that council members estimate about 20 percent of CIOs are sitting on boards.

Michael Axelsen, director-IS at BDO Kendalls, says that IT knowledge is not a prerequisite for a director's role. However, BDO Kendalls has a dedicated IT strategic planning board. “There is a great deal of communication between IT and the board,” says Axelsen.

Paul Rush, CIO practice lead at Talent2, says businesses are putting their balance sheet and strategic direction at risk by failing to involve IT executives in boardroom discussions. "IT bosses in Australia say this is symptomatic of the view of the IT function as a cost center rather than something that can provide competitive advantage," says Rush.

This is in contrast to companies in the Asia-Pacific region, including India, which are 10 times more likely to have a CIO on the board, according to Talent2.

Rush added that Harvard Business Review’s research showed that 95 percent of senior management do not know what their IT budget is spent on.

— By Rodney Gedda and Darren Pauli

Mouse Trap

Security | Online fraudsters might want to try some method acting classes before they attempt to log in to an online banking session using a stolen user name and password. A new technology from Fair Isaac claims to be able to spot fishy Web sessions by, among other things, comparing mouse movements and typing mannerisms with those of the account holder.

The new multi-factor authentication product, Falcon One for Online Access, uses neural network technology to monitor online transactions and learn customer behavior patterns. The product is targeted at US banks, which are under pressure to find alternatives to simple username and password security for online banking.

Falcon One works with other Fair Isaac anti-fraud technology. Like other anti-fraud companies, Fair Isaac notes the IP address an account holder typically uses for online banking and raises flags when a session is initiated from a new address. But it digs deeper into the remote host, noting details like the system clock setting and screen resolution to determine suspicious activity, said Ted Crooks, VP, Global Fraud Solutions, Fair Isaac.

The software also monitors other characteristics of account holders, such as their style of typing and mouse movements, to determine whether the user attempting a transaction is the actual account holder. Characteristics such as the speed and character pattern that account owners type, as well as whether they are a jittery or staid mouse user, are individual and nearly impossible to mimic, Crooks said.

The company also monitors traffic on outbound communications channels, noting how a customer links to an online banking session and whether there are delays in online session traffic that could signal a ‘man in the middle’ attack, he said.

Falcon One combines back-end analysis with a Web browser plug-in that collects data without breaking the browser security model, or “sandbox”, he said.

None of the data collected necessarily signals fraud. Instead, the company weighs the data to calculate risk measurement for online sessions. Banks can then decide whether to change the course of a session. For example, users could be asked to enter an additional one-time password that is sent to their cell phone or a pre-approved e-mail address, Crooks said.
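The description above maps naturally onto a weighted scoring model. What follows is a minimal sketch of how behavioral signals of this kind might be combined into a session risk score that triggers step-up authentication; the feature names, weights and threshold are illustrative assumptions, not Fair Isaac's actual model.

```python
# Illustrative sketch of behavioral session-risk scoring in the spirit of
# products like Falcon One. Feature names, weights and the step-up threshold
# are assumptions for illustration -- not Fair Isaac's actual model.
from dataclasses import dataclass

@dataclass
class SessionSignals:
    new_ip_address: bool        # session from an address not seen before
    typing_speed_delta: float   # deviation from the account's typical cadence (0..1)
    mouse_jitter_delta: float   # deviation from typical pointer behavior (0..1)
    clock_skew_suspicious: bool # system clock far from the expected timezone
    session_lag: float          # unusual delays that could signal a relay (0..1)

def risk_score(s: SessionSignals) -> float:
    """Weighted sum of anomaly signals, scaled to 0..1."""
    score = 0.0
    score += 0.25 if s.new_ip_address else 0.0
    score += 0.20 * s.typing_speed_delta
    score += 0.20 * s.mouse_jitter_delta
    score += 0.15 if s.clock_skew_suspicious else 0.0
    score += 0.20 * s.session_lag
    return min(score, 1.0)

def requires_step_up(s: SessionSignals, threshold: float = 0.5) -> bool:
    """Above the threshold, ask for a one-time password out of band."""
    return risk_score(s) >= threshold

# Example: familiar machine, slightly odd typing -> no step-up challenge.
session = SessionSignals(False, 0.3, 0.1, False, 0.0)
print(risk_score(session), requires_step_up(session))
```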

— By Paul F. Roberts


Illustration by Binesh Sreedharan

Page 8: June 15 2006

Imaging by Binesh Sreedharan

An 802.11 Hoop for Villages

Rural Telephony | Is an 802.11 (wireless) mesh network the way forward for the most affordable telephony in villages? Bell Labs, the Lucent Technologies company which does innovative research in telecom and IP technologies, seems to think so.

Bell Labs has developed a wireless-based solution called VillageNet that it hopes will be able to get around the deficient telecom infrastructure in rural India. Most of India's 6.5 lakh villages are afflicted by low broadband penetration rates, and have low average revenue per user (ARPU).

VillageNet will connect kiosks in rural areas to a district headquarters on an 802.11 link, according to Dr. Krishan Sabnani, Senior VP (Network Research), Bell Labs. Being an inexpensive technology, it should encourage instant adoption by service providers in villages, he added.

Bell Labs claims that it has been able to come out with price points that are far cheaper than other commercially usable technologies. “WiMax and other 3G technologies are great but when it comes to providing services in villages, they are not the most attractive technologies because of the cost,” he added.

“ARPUs in villages are extremely low, so this would match the expectations of millions of people who are not able to spend too much to access both voice and data,” said Sabnani. The technology will soon be out of the lab and ready for commercial use.

VillageNet is in line with Bell Labs’s goal of leveraging WiFi and WiMax to provide broadband wireless access to remote areas and rural villages.
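To see why a mesh is attractive at village price points, consider that each kiosk only needs an inexpensive 802.11 radio; traffic is relayed hop by hop until it reaches the one backhaul link at the district headquarters. Below is a toy sketch of fewest-hop routing over an invented topology: an illustration of the general mesh idea, not Bell Labs' VillageNet design.

```python
# Toy sketch of why an 802.11 mesh is cheap: village kiosks relay traffic
# hop by hop toward a single gateway at the district headquarters, so only
# one backhaul link is needed. The topology below is invented.
from collections import deque

LINKS = {  # radio adjacency between kiosks and the gateway
    "kiosk-A": ["kiosk-B"],
    "kiosk-B": ["kiosk-A", "kiosk-C", "gateway"],
    "kiosk-C": ["kiosk-B"],
    "gateway": ["kiosk-B"],
}

def route_to_gateway(start: str):
    """Breadth-first search for the fewest-hop path to the gateway."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == "gateway":
            return path
        for neighbour in LINKS[path[-1]]:
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append(path + [neighbour])
    return None

print(route_to_gateway("kiosk-C"))   # ['kiosk-C', 'kiosk-B', 'gateway']
```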

— By Rahul Neel Mani

Best Practices for the Worst WLAN Security

Network Security | Security-conscious attendees at Computerworld's Mobile & Wireless World conference heard a new twist on an old theme: the 10 things not to do when implementing a wireless LAN.

The hard-learned lessons came at the expense of an unnamed national retail chain with more than 1,000 stores whose sorry story was detailed by John Stehman, a consultant at Robert Frances Group. “We’re going to talk about ineptness,” Stehman said. And then he delivered.

The retailer, from one seemingly innocuous misstep to another, ended up victimized by security breaches that included fake store websites set up by hackers to capture customer account information. “The goals were good,” Stehman said. A year after the project was implemented, frustrated users who didn’t know how to configure security options couldn’t access the network, denial-of-service attacks had crippled communications, and “untold sums of money were lost,” Stehman said. “We’re talking big bucks.”

Further, the company’s director of IT was looking for a new job after being fired, Stehman noted. With that in mind, he offered a list of the top 10 practices for poor security exhibited by the retailer:

1. Implement a mixed WLAN environment — multiple vendors and technologies.
2. Do not provide centralized network management and/or administration.
3. Ensure that there is no documented corporate policy for WLAN usage.
4. Verify that rogue access points have easy access to the network.
5. Do not provide users and executives with education on using WLANs.
6. Take the vendor's advice for product selection, training, implementation and upgrades.
7. Do not monitor the network and test security for effectiveness before actual rollout.
8. Keep the network in a constant state of flux — add new products, services and users as fast as possible!
9. Do not roll out a corporate plan for implementing and using WLANs.
10. Different locations should use whatever products they want.

He had more advice about how to ensure poor WLAN security: never consult corporate officials; don't ask senior executives to back a strategic or tactical plan for WLAN use; do not worry about upgrades and future needs; give vendors the final say on what products to use and how they should be configured; and be sure to outsource WLAN technical support from the start.
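Several of the retailer's failures come down to never comparing what is on the air against a central inventory. Below is a minimal sketch of one such check, flagging rogue access points against a managed allowlist; the scan data, BSSIDs and SSIDs are invented, and in practice the scan would come from a wireless IDS or site-survey tool.

```python
# Minimal sketch: flag rogue access points by comparing a wireless scan
# against a centrally managed inventory. The scan data and SSIDs here are
# illustrative; in practice the scan would come from a wireless IDS/sensor.
MANAGED_APS = {
    "aa:bb:cc:00:00:01": "corp-retail",
    "aa:bb:cc:00:00:02": "corp-retail",
}

def find_rogues(scan_results):
    """scan_results: iterable of (bssid, ssid) pairs seen on the air."""
    rogues = []
    for bssid, ssid in scan_results:
        if bssid not in MANAGED_APS:
            rogues.append((bssid, ssid, "unknown hardware"))
        elif MANAGED_APS[bssid] != ssid:
            rogues.append((bssid, ssid, "managed radio broadcasting wrong SSID"))
    return rogues

scan = [("aa:bb:cc:00:00:01", "corp-retail"),
        ("de:ad:be:ef:00:99", "corp-retail")]   # spoofed SSID, unknown radio
for bssid, ssid, reason in find_rogues(scan):
    print(f"ROGUE {bssid} ({ssid}): {reason}")
```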

— By David Ramel


Page 9: June 15 2006

The Glue Gun and Other Sticky Stories

Technology | Tracking technology is getting cheaper and easier to implement. As a result, separating the truth from science fiction is getting more difficult. See if you can tell which of these stories are the real deals, and which are gags. Answers below.

1. Suspicious wives and girlfriends in Korea can use GPS-enabled cell phones to keep a watchful eye on their husbands and boyfriends. And to avoid being caught at the local bar rather than at the office, our source in the cell phone industry says, some men have begun paying people to carry their phones to less risky places during their after-work carousing.

2. Tiny, wealthy Manalapan, Florida, has installed infrared security cameras that record every car that drives through town while software checks the plate numbers against law enforcement databases. "Courts have ruled that in a public area, you have no expectation of privacy," said police chief Clay Walker.

3. To avoid being tracked by a state-mandated GPS system, a Massachusetts snowplow operator allegedly left his GPS device in a paper bag by the side of the road while he ran off to work a private job. Another time, he reportedly handed his transmitter to a fellow snowplow operator.

4. In order to cut down on the number of dangerous, high-speed chases, Los Angeles police officers are testing a 'glue gun' that can fire a sticky GPS transmitter at a fleeing vehicle. That way, the officers can track the suspect's vehicle without chasing it and putting lives at risk. (There's been no word yet on whether sales of Goo-Off adhesive remover have increased in high-crime areas.)

5. Security camera network operator CityWatcher.com has asked its employees to get RFID chips implanted in their arms to facilitate entry into the company's secure data centers. CityWatcher CEO Sean Darks says that the program is voluntary, and employees can easily have the chip removed if they desire. "The joke here is that we make them leave their arm," he says. Ha, ha. Ouch.

Answer: all these stories are straight out of the news. It’s not paranoia if it’s true.

— By Chris Lindquist


No More Bullets in Bank Heists

Security | Australia's banking industry is under threat due to a heavy reliance on Secure Sockets Layer (SSL) encryption that hackers increasingly find their way around.

There are no ‘stick-em-up’ dramatics in today’s bank heists, but SSL-evading Trojans and refined phishing techniques.

Banks are reluctant to quantify losses, but research by Australia’s Computer Emergency Response Team (AusCert) proves attacks are on the rise. AusCert GM Graham Ingram says a false sense of security surrounds SSL encryption, which is used across financial services.

This reliance on Internet browser encryption means banking sessions can be hijacked by Trojans and key-logging programs, especially if users engage in lax security protocols and don’t use current anti-virus signatures. The bottom line is that social engineering tricks are circumventing Internet banking encryption.

Ingram said there is a belief that customers are safe and privacy is protected through the use of SSL but “this is not the truth”. Neal Wise, director of security firm Assurance Pty, says SSL does serve a good purpose but leaves users prone to a ‘man in the middle’ attack.

“Unfortunately, the only controls a bank can rely on is SSL encryption; it leaves them in an interesting situation of having to cover related security issues that they have not created,” Wise said. “We will see financial institutions, as part of shoring up their own risks, providing cut-price antivirus and content checking tools for their clients, because right now if someone manages to put a keystroke logger on a client computer, and a banking session gets recorded, banks have to cover that risk, and it is not their fault.”
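One well-known partial defence against man-in-the-middle sessions of the kind Wise describes is to verify more than the browser's default chain check, for example by pinning the server certificate's fingerprint. A minimal sketch, assuming a known-good SHA-256 fingerprint has been distributed out of band (the hostname and pinned value below are placeholders):

```python
# Minimal sketch of certificate pinning: compare the fingerprint of the
# certificate a server actually presents against a known-good value
# distributed out of band. The host and pinned hash are placeholders.
import hashlib
import ssl

PINNED_SHA256 = "0000000000000000000000000000000000000000000000000000000000000000"

def cert_matches_pin(host: str, port: int = 443) -> bool:
    pem = ssl.get_server_certificate((host, port))   # PEM text from the server
    der = ssl.PEM_cert_to_DER_cert(pem)              # raw certificate bytes
    fingerprint = hashlib.sha256(der).hexdigest()
    return fingerprint == PINNED_SHA256

if __name__ == "__main__":
    host = "bank.example.com"   # placeholder hostname
    print("certificate matches pin:", cert_matches_pin(host))
```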

While security experts claim Internet banking fraud drains as much as two to five percent of revenue, the financial services industry isn't as forthcoming when it comes to discussing online threats. The Australian Bankers Association refuses to comment.

— By Michael Crawford


Illustration by Binesh Sreedharan


Page 10: June 15 2006

Regulation's Silver Lining

Sarbanes-Oxley helps improve business processes.

By Diann Daniel

Companies that have deployed new systems to comply with government regulations such as the Sarbanes-Oxley Act are finding that these investments can do double duty by helping them improve business processes, according to a survey of 332 companies by AMR Research. Nearly 75 percent of respondents plan to use their compliance investments to support other activities, such as streamlining business processes.

John Hagerty, vice president (research) with AMR, says regulatory mandates have put a new spotlight on IT as a means to mitigate business risk. Prior to these mandates, risk management didn’t get executive attention. CEOs and boards were reluctant to invest in technology to combat risk, he says. But Sarbanes-Oxley, in particular, has made them more attuned to the technology underpinnings that compliance requires. “So, you see the board open its wallet to fund some of these programs,” says Hagerty.

One area where compliance mandates have prompted support is for security and identity management. Sarbanes-Oxley, for example, requires appropriate access controls to corporate systems so that an employee cannot change data unless he is authorized to do so. And so, CIOs have permission to deploy access management systems and procedures that they may not have been able to justify previously.
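As a toy illustration of the kind of control involved, the check below authorizes a change against a role table and records every attempt, permitted or not, so reviews can catch both unauthorized activity and failing controls. The roles, permissions and log format are assumptions for illustration, not anything prescribed by the Act.

```python
# Toy illustration of an auditable change-authorization check of the kind
# Sarbanes-Oxley encourages. Roles, permissions and log format are
# illustrative assumptions, not prescribed by the Act.
import logging

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")
audit = logging.getLogger("audit")

ROLE_PERMISSIONS = {
    "controller": {"post_journal_entry", "approve_journal_entry"},
    "clerk": {"post_journal_entry"},
}

def change_record(user: str, role: str, action: str, record_id: str) -> bool:
    allowed = action in ROLE_PERMISSIONS.get(role, set())
    # Every attempt is logged, permitted or not, so reviews can detect
    # failing controls as well as unauthorized attempts.
    audit.info("user=%s role=%s action=%s record=%s allowed=%s",
               user, role, action, record_id, allowed)
    return allowed

change_record("asmith", "clerk", "approve_journal_entry", "JE-1042")  # denied
change_record("asmith", "clerk", "post_journal_entry", "JE-1043")     # permitted
```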

Meanwhile, Hagerty adds, the emphasis of Sarbanes-Oxley on process controls fosters greater awareness of quality assurance. The required reviews prompt companies to examine processes that are not working well or controls that are failing.

Sarbanes-Oxley has also provided support for business process management (BPM). Sarbanes-Oxley compliance depends on standardizing processes, so that points of failure are minimized. BPM technology supports this standardization across a company.


Best Practices

Compliance comes first. Focus on what is absolutely required to satisfy the regulatory mandate.

Determine what types of training, new business processes and software you will need to meet the requirements.

Identify double-duty investments. CIOs have a global view of the company that other business leaders do not, says Hagerty. Use that knowledge to target compliance investments toward processes that need improvement. Other business leaders are likely to view compliance spending as an expense. Therefore, CIOs need to explain the business benefits to help make a case for investment in new technology.

Watch for overlap. Many regulations have common business requirements, such as managing documents and records, standardizing business processes, creating reports, managing risks, and implementing security and audit controls. You may be able to use the same applications to comply with multiple regulations.

Double-Duty IT: Technology for regulatory compliance supports multiple business uses.

Companies re-purpose compliance investments for:

Streamlining business processes: 36%
Quality assurance: 28%
Security: 14%
Support for globalization: 11%
Visibility into operations: 10%
Other: 2%

(Adds up to 101% due to rounding.)

67% of companies have automated their compliance processes using IT. 33% think they don't need new IT for compliance.

Source: AMR Research

Page 11: June 15 2006


Broadband for Everyone in Singapore

Internet | Singapore is planning a national broadband network that would provide ultra-fast Internet access at speeds up to 1Gbps for every home and business. "Today, a high-speed broadband network is an essential infrastructure for economic development, investment, talent attraction, education and a host of other activities," says Lee Boon Yang, Singapore's Minister for Information, Communications and the Arts.

Lee said the country’s current broadband infrastructure will not be able to cope with surging demand for data access in the years ahead, making a network upgrade necessary.

The planned Next-Generation National Broadband Network, which was first outlined in Prime Minister Lee Hsien Loong’s budget speech in February, will connect all homes, schools and businesses in Singapore. A wireless broadband network will also blanket the city-state.

Government officials view broadband Internet access as a priority for Singapore’s economic development. The country was among the first in Asia to embrace the Internet during the 1990s, but since then other countries in the region, such as Japan, have built faster, more advanced networks.

The upgraded broadband network will allow Singaporeans to make video calls to stay in touch with relatives and friends overseas, according to Lee, the information minister. In addition, the faster connections will make new consumer services possible, such as high-definition Internet protocol TV. “It will sharpen our business efficiency and spark off many new opportunities for entrepreneurs,” he says.

The government plans to work with private companies to build the network, and is prepared to provide funding to kick-start the project, Lee says. As a first step, Singapore’s Infocomm Development Authority has invited service providers to submit proposals for offering wireless broadband. The operating model specified requires service providers to offer a basic service for the lowest possible cost, which may include a year of free access. Service providers also must offer a premium service.

— By Sumner Lemon


Page 12: June 15 2006

The Vision Beyond Your Rear-view Mirror

The storage industry would generate tremendous value by coming up with solutions to potential threats such as ID theft, instead of only resolving backup and recovery issues.

Recently, my wife asked me what Hollywood's most often quoted movie phrase was. I was stumped. She, of course, was ready with the answer: "Frankly my dear, I don't give a damn." That's the quote, not my response.

I then started wondering if there was a computing equivalent to Clark Gable's most memorable line. Scott McNealy's "The network is the computer" would certainly qualify, I thought.

Scott is good with one-liners. Here's another: "You can't drive forward by looking in the rear-view mirror." While it may be true that Scott is doing a bit more gazing into the rear-view mirror than he'd like to, if only to better see what happened to Sun in the wake of the dot-bomb era, I think his observation is right on when applied to the storage industry. My personal observation is that the industry has been doing far too much rear-view mirror gazing.

Don’t get me wrong. I’m not blaming the industry for pursuing solutions to problems that rise to the top of end-user pain-point surveys. For example, backup and recovery are the most often quoted storage problems by a long shot. It’s just that the backup and recovery situation is to me symbolic of an industry that is focusing more on managing risk than any other single thing. Value creation needs to get equal time too if we are to move forward.

To me, managing risk requires storage professionals to look backward at data that’s already been created. It requires that storage managers become responsible for covering their enterprises’ risk exposure, which relates directly to data loss, legal inquiry, regulatory compliance in many forms, and data theft. All these are critical issues, but they deal with managing data that’s already been created and stored. The road ahead is where we’re going, and it’s dotted with signposts that should be studied as well.

John Webster | Storage Insider

Illustration by Sasi Bhaskar


STORAGE SPECIAL


Page 13: June 15 2006

As a self-proclaimed analyst, part of my job description includes making predictions and doing the ‘vision thing’, so I’m going to point to what I see on the road ahead. But, if at the end of this ride, you find yourself saying, “Heck, I could have predicted that,” then welcome to the fold fellow Storage Visionary. If not, I suggest you cover your rear-view mirror with duct tape for a day or two.

Son of Sarbanes

Think of Sarbanes-Oxley as a shot across the bow of the storage industry. In the wake of Enron, WorldCom, Tyco, etcetera, two senators got together and drafted legislation that shook the storage industry. Who among us saw that one coming? For those of you who raised your hand, you can probably see the next one. Call it Son of Sarbanes.

Identity theft is now front-page news. My bet is that Congress will see ID theft as a far bigger crime than Bernie Ebbers or Dennis Kozlowski could ever commit because it touches everyone with a credit card — virtually every registered voter.

See where I'm going with this? Do an Internet search using the Senate Commerce Committee and ID theft as subjects. You'll discover that there are no fewer than 20 bills pending before Congress that deal with this problem. Here's my first prediction: one of them will impact the storage industry in some way.

Are we ready for that? I don’t think so. Do we have a storage lobbyist in Washington? I don’t know of one. I’m anything but keen on lobbyists, but when I hear talk of the federal government licensing ‘data brokers’, I have to ask: just what is a data broker? Is it some entity that only sells personal data? Or is it an entity that merely shares personal data with other entities to improve, for example, a consumer’s buying experience?

If you thought that senators Sarbanes and Oxley need a storage education before they sat down with pens in hand, grab the next flight to Washington and stay a while. Our industry can make the case that storage is the last line of defense for ID theft. The storage industry would also generate tremendous value for itself by coming up with solutions to ID theft.

What’s my second prediction? Simply this: we will see some form of ID theft legislation coming out of the US Congress before December 2006 that will impact the storage industry.

To Block or Not to Block

Think of how toilets are constructed. You have a bowl, a cistern, and a handle. Only the positions of these three essentials have changed since the toilet was invented. Next, think of how computers are constructed. You have a processor, some memory, and some storage. The biggest change to that basic structure has been the introduction of the network and, up to now, that network has consisted mostly of wires.

So here’s another prediction: the next major change to computing will be the extension into the wireless realm — in the form of wired and wireless network convergence.

OK, that was an easy one. Think about it, and you'll realize that we're already there. RFID, for example, takes wired retailers to the world of 'consumer-aware' stores. There are already three retailers that I'm aware of that have at least pilot RFID projects going, to say nothing of WalMart's RFID mandate. Wireless digital cameras put observing — and identifying — eyes in discreet places. Check out the security measures implemented for the two national political conventions in 2004 if you doubt that this technology works. Our Big Brothers were truly watching us.

Prediction four: wireless data sources (cell phones, GPS transceivers, RFID tags, etcetera) will produce massive amounts of new data in real time. Question: is the storage industry ready to turn that data into real-time information? Answer: not really.

The people creating the applications that will take multiple data streams and turn them into real-time rivers of information see storage as something of a bottleneck. Consider this statement from Dr. Michael Stonebraker, 2005 recipient of the IEEE's John von Neumann medal, founder of Streambase, and developer of a new process designed to handle real-time streaming data: "Traditional systems bog down because they first store data on hard drives or in main memory, and then query it." There are also those in the storage industry who see block-based storage as too granular to match the performance needs of real-time information applications. However, there is no consensus regarding what would be better. Files? Objects? Blobs? From my vantage point, the jury hasn't even started to deliberate.
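Stonebraker's point is the inversion of store-then-query: the query runs over the stream as it flows past, rather than after the data has landed. The sketch below computes a sliding-window average incrementally, without writing the stream anywhere first; the event shape and window size are illustrative assumptions.

```python
# Minimal sketch of query-over-the-stream rather than store-then-query:
# a sliding-window average computed as events arrive, never written to disk.
# The event shape (a plain float reading) is an illustrative assumption.
from collections import deque

def sliding_average(events, window_size=5):
    """Yield the mean of the last `window_size` readings as each one arrives."""
    window = deque(maxlen=window_size)  # old readings fall out automatically
    for reading in events:
        window.append(reading)
        yield sum(window) / len(window)

# Example: an RFID or sensor feed, consumed incrementally.
feed = iter([10.0, 12.0, 11.0, 30.0, 12.0, 11.0])
for avg in sliding_average(feed, window_size=3):
    print(f"{avg:.2f}")
```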

Storage is About Value Creation Too

A few years back, I wrote a column in this space entitled "Storage is now about managing risk." Hmm. I'm regretting that I didn't add a line or two about value creation. Wall Street doesn't hold the storage industry in particularly high esteem because it sees the industry chasing new capacity schemes and forever beating itself up in price battles, rather than creating new value opportunities. I've outlined my top two value creation candidates. I invite you to ignore the rear-view mirror and look down the road ahead. You will see more. CIO

Reprinted with permission. Copyright 2006. Computerworld SNW Online.

Send feedback about this column to [email protected].



Page 14: June 15 2006

What Disaster Recovery Plan?

A majority of companies don't have disaster recovery plans, and those that do don't test them. If you want the business guys to listen to your DR plans, stop talking about them.

At the Fall Storage Networking World (SNW) in Orlando, I had the privilege of moderating a panel discussion featuring four guys responsible for disaster recovery and business continuity planning in their respective business IT environments. Meanwhile, a few hundred miles south, Hurricane Wilma had cut a swath of destruction from the Florida Keys to Fort Lauderdale and Miami in what had become almost a weekly occurrence of ill-tempered weather. It had the effect of making the topic very timely indeed.

That said, the panel proved to be less than satisfying in some respects, partly because of time constraints that precluded a deeper dive into certain subjects and partly because one of the participants was distracted by the damage that Wilma had wreaked on several of his storefront operations in South Florida. But mainly, my dissatisfaction stemmed from the fact that panels of this type are always a bit stilted. Participants tend to come from the enlightened side of the business IT community: those who are actually doing something about preparedness.

Who I would really like to chat with are those folks that are doing absolutely nothing. Survey after survey shows they’re in the majority: over 50 percent of companies make no effort whatsoever to prevent avoidable disasters or put in place strategies for recovering from outage events that can’t be avoided. Of those companies that do plan, fewer than 50 percent actually test the strategies they develop — which is like having no strategy at all. I would like to ask those guys why.

The recent spate of natural disasters and terrorism incidents has provided an in-your-face demonstration of both the potential threat and the consequences of the failure to prepare.

Jon William Toigo | User Advocate

Illustration by Vikas Kapoor



Page 15: June 15 2006

Companies denied access to mission-critical data for longer than 48 hours tend not to exist after one year. Those who plan have a four-times greater chance of survival than those who don't. These aren't casual observations; they are statistical averages based on empirical data, and validated in the microcosm of the World Trade Center bombings in 1993 and 2001.

Moreover, regulatory requirements in many industries have driven home the legal consequences for failing to protect data even for those who can't see the obvious pragmatic value. Penalties range from embarrassment — ask Bank of America, Citicorp and a growing list of others how they feel when data is accidentally disclosed and Gramm-Leach-Bliley requires them to out themselves publicly — to fines and imprisonment. (Under SEC rules, Sarbanes-Oxley and HIPAA, data must be protected — or else.)

Unadulterated Ambiguity

So, what's there to discuss? Disaster recovery planning must be done. It's not just about common sense; it's the law, right?

Wrong. The various laws and rules we have come to know and love over the past few years require that companies make their ‘best effort’ to protect their data assets. That means organizations can relegate the entire disaster recovery planning effort to ‘under scrutiny’ status for quite a while, while side-stepping most of the legal consequences. You must do something to protect data, but the law doesn’t specifically state what, or more importantly, when.

Unfortunately, common sense rarely provides a sufficiently compelling business case on its own to obtain Front Office funding. It’s like the old cartoon that shows a business guy yelling at his IT director, “You mean we spent all that money on Y2K, and nothing happened?” It is precisely because a successful disaster recovery capability is often measured in non-events that makes it so difficult to cost-justify in executive suites.

Add to this the fact that we have absolutely no idea how frequently disasters will strike the company, no actuarial table for disasters, and you have a more or less complete explanation for why management won't cough up bucks for building a capability. The result: no bucks, no Buck Rogers.

So, that helps illuminate the issue of ‘lack of management backing/funding’ cited by a significant percentage of the attendees of the SNW panel discussion in an impromptu poll I conducted at the outset. But it doesn’t suggest how we solve it.

In a later discussion, we tried to tackle this question, and here, in case you weren’t in Orlando, is what we started to say.

First, disaster recovery planning is often contextualized quite narrowly in terms of risk reduction. It’s like an insurance policy against a worst-case scenario that we hope we will never need to cash. Anyone who is trying to get funding for just about any IT initiative today will probably agree that a business case based on risk reduction isn’t going to fly.

The Front Office is looking for more. Risk reduction (construed as rapid recovery and regulatory compliance) is important, but they also want to know how the proposed initiative will save money and improve operational efficiency. Only proposals that can make a full business value case in terms of cost savings, risk reduction and process improvement are likely to get ‘the nod from the BoD’ (Board of Directors, that is).

Data Management is the Key

So, how do you realize cost savings and process improvement from disaster recovery planning? Simple. You can't.

Disaster recovery planning provides no value in these categories whatsoever. Trying to build a full-fledged business value proposition for disaster recovery planning is a no-win scenario. It can’t be done, which isn’t to say that you stop trying and simply update your resume.

The solution to this no-win scenario is to change the rules. Stop talking about disaster recovery and start thinking about data management.

If you think about it, there are considerable procedural overlaps between the techniques we must use to characterize data for lifecycle management, regulatory compliance and disaster recovery. In every case, we must look at the business process, find out what applications support each business process, and then identify the data flows to and from each of those applications.

We do this in disaster recovery planning to identify what data is critical and must be restored in a hurry, post-disaster. We do this against the backdrop of compliance planning to determine what data needs to be retained, deleted or otherwise managed via policy in order to comply with the law. And we do it in data lifecycle management [if we are doing ILM/DLM (Information Lifecycle Management/Data Lifecycle Management) at all correctly] in order to ensure that the right data is placed on the right media across its useful life — with ‘right media’ being a function of data access requirements and update frequencies, as well as the costs and performance characteristics of the media itself.
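The overlap is easy to see in code. A single classification pass over a data inventory can emit the inputs all three disciplines need: a recovery priority for DR, a retention horizon for compliance, and a media tier for ILM. The sketch below is a minimal illustration; the thresholds, tiers and fields are invented, not drawn from any particular methodology.

```python
# Sketch of the shared classification step the column describes: one pass
# over the data inventory yields the inputs for DR priority, retention
# policy and ILM tier placement. Thresholds and tiers are assumptions.
from dataclasses import dataclass

@dataclass
class DataSet:
    name: str
    supports_critical_process: bool
    days_since_last_access: int
    updates_per_day: float
    retention_years: int        # driven by the applicable regulation

def classify(d: DataSet) -> dict:
    dr_priority = "restore-first" if d.supports_critical_process else "deferred"
    if d.updates_per_day > 10:
        tier = "primary disk"            # hot: fast, expensive media
    elif d.days_since_last_access < 90:
        tier = "nearline disk"           # warm
    else:
        tier = "tape/archive"            # cold: cheap, slower media
    return {"dataset": d.name, "dr": dr_priority, "tier": tier,
            "retain_years": d.retention_years}

print(classify(DataSet("orders", True, 2, 400.0, 7)))
print(classify(DataSet("2001_marketing_scans", False, 900, 0.0, 3)))
```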



Page 16: June 15 2006


From a cost-savings perspective, managing data will enable us to improve the utilization efficiency of our existing storage investments, forestall new purchases, breathe new life into our backup processes, and otherwise cull the dupes and dreck from our most irreplaceable asset — data itself.

From a risk reduction standpoint, data management will help us identify exactly what data we need to recover from the unthinkable, so we can provide it with appropriate data protection services. From a process improvement perspective — and this one will be of particular interest to the business planners in the Front Office — effective data management planning will produce a data model of the organization.

Befriending the Business GuysWhat good is a data model? Well, mapped to the infrastructure that supports the data (and to the cost of ownership details of that infrastructure), such a model could have enormous value. Imagine the business office calling down to IT with the following problem: “We’re thinking about buying a line of business from another company. What do you think the IT support costs would be for that?” With a data model, you could noodle out a pretty good answer in very short order, based on your status quo costs for supporting comparable lines of business. You would be a hero (for a few minutes, at least) in the eyes of the business guys.

The bottom line is this: a disaster can be defined as an unplanned interruption in normal access to business-critical data from any cause for an unacceptable period of time. The measure by which the efficacy of disaster recovery strategies is judged is ‘Time to Data’ — how quickly the strategy returns mission-critical data access to applications and end users that need it. So, in the final analysis, data management and not disaster recovery planning is the goal toward which we must strive.

Maybe protecting ourselves from unplanned interruptions ultimately comes down to managing data better. CIO

Reprinted with permission. Copyright 2006. Computerworld SNW Online. Send feedback about this column to [email protected].

Web Exclusive

Resources

CIO Focus: Security Black Book
With security issues and risk mitigation increasingly dominating technology management, we at CIO felt this was an opportune moment to create the Security Blackbook - a compendium of the most essential reading on infosecurity, corporate security, business continuity, and related topics.

Complete Data Protection Strategy
Building a robust data protection strategy is now a business requirement.

IT Consolidation Drivers and Benefits
Organizations are finding themselves in a position where consolidation does not necessarily

Download more web exclusive whitepapers from www.cio.in/resource

Features

A Fine Balance
Suave and focused, Microsoft's Corporate Vice-President, Services & IT, Richard Devenuti almost fits the ideal of an IT manager who would waltz with grace from managing internal IT systems to becoming an active stakeholder in guiding business imperatives. Devenuti, who joined Microsoft in 1999, talks about balancing his experience leading Microsoft's IT management functions against a strong customer services thrust.

Read more of such web exclusive features at www.cio.in/features

Columns

Master Data Management is Key to Compliance
Expect to see ongoing consolidation in the MDM market as vendors assemble more comprehensive product suites.

How to Find Your Competitive Advantage
If a product or process allows you to differentiate from your competitors, it's 'core'. For Domino's Pizza, delivery is core.

Read more of such web exclusive columns at www.cio.in/columns

News | Features | Columns | Top View | Govern | Essential Technology | Resources

Log In Now! CIO.in




Page 19: June 15 2006

Cover Story | Data Integration

Illustration by Binesh Sreedharan

Whipping IT Data Into Shape

Enterprises are tackling the ugly problem of reconciling widely distributed data, driven in part by the move to service-oriented architecture.

By Galen Gruman

No one likes data integration. It's painstaking, hard to automate, and hard to measure in terms of business ROI. Yet it's required for making systems work together, whether as a result of an acquisition, as part of a migration to new tools, or in an effort to consolidate existing assets.

"The first question is always, 'What database are we going to use as our customer source?'" notes John Kolodziejczyk, IT Director at Carlson Hotels Worldwide. Rather than keep addressing that question, the hospitality company devised a common data architecture, and a platform for managing it, for all its applications as part of the migration to a service-oriented architecture. Similarly, ball-bearing manufacturer GGB needed a central product information hub to ensure consistent data mapping among its Oracle e-Business Suite and three aging ERP systems, rather than try to maintain a raft of point-to-point connectors, says Matthias Kenngott, IT director at GGB.
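The appeal of a hub like GGB's is arithmetic: with a canonical product record, each system needs one adapter in and one out, on the order of N adapters, instead of a point-to-point connector to every other system, on the order of N-squared. A minimal sketch of two such adapters, with invented field names rather than GGB's or Oracle's actual schemas:

```python
# Sketch of hub-and-spoke data mapping: each system translates to and from
# one canonical record (N adapters) instead of maintaining point-to-point
# connectors (on the order of N^2). Field names are invented.
CANONICAL_FIELDS = ("sku", "description", "unit_of_measure")

def from_legacy_erp(row: dict) -> dict:
    """Adapter: one legacy ERP's layout -> canonical product record."""
    return {"sku": row["PARTNO"].strip(),
            "description": row["DESC"].title(),
            "unit_of_measure": row["UOM"].lower()}

def to_oracle_ebs(record: dict) -> dict:
    """Adapter: canonical record -> a hypothetical e-Business Suite layout."""
    return {"ITEM_NUMBER": record["sku"],
            "ITEM_DESCRIPTION": record["description"],
            "PRIMARY_UOM": record["unit_of_measure"].upper()}

legacy_row = {"PARTNO": " GGB-7731 ", "DESC": "FLANGED BEARING", "UOM": "EA"}
print(to_oracle_ebs(from_legacy_erp(legacy_row)))
```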

Much enterprise data is either locked away in data stores or encapsulated within applications. Much enterprise data is either locked away in data stores or encapsulated within applications. Much enterprise data is either locked away in data stores or encapsulated within applications. Traditionally, applications ‘know’ what the data means and what the results of their manipulations Traditionally, applications ‘know’ what the data means and what the results of their manipulations Traditionally, applications ‘know’ what the data means and what the results of their manipulations

Whipping ITWhipping ITWhipping ITWhipping ITdaTa

By Galen Gruman

Shape

Feature - 01 Cover story.indd 27Feature - 01 Cover story.indd 27

Page 20: June 15 2006

Cover Story | Data Integration

mean, in essence creating a consistent data model, at least locally. As modern enterprises mix and match functions across a variety of applications, however, the data models get mixed together as well.

“The more you distribute the data, the more likely there will be problems,” says Song Park, Director (pricing and availability technology) at Starwood Hotels. The result could be what Don DePalma, president of the Common Sense Advisory consultancy, calls ‘frankendata,’ calling into question the accuracy of the results generated by the services and applications. “There’s always a context to data. Even when a field is blank, different applications impose different assumptions,” notes Ron Schmelzer, Senior Analyst at SOA research company ZapThink.

Ultimately, ‘frankendata’ can make a set of integrated applications or a vast web of services both unreliable and hard to repair. Many relationships must be traversed to understand not only the original data components but how they were transformed along the way. The antidote to frankendata is to provision data needed for multiple applications as a service — incorporating contextual metadata where needed and reconciling discrepancies among distributed data sources.
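To make the antidote concrete, here is a minimal sketch, in Python, of what provisioning data as a service can look like. Everything in it (the service, the field names, the last-writer-wins reconciliation rule) is invented for illustration, not drawn from any vendor's product:

from dataclasses import dataclass, field

@dataclass
class DataRecord:
    """A parcel of data plus the contextual metadata that travels with it."""
    payload: dict
    context: dict = field(default_factory=dict)

class CustomerDataService:
    """Hypothetical facade over two distributed customer stores."""

    def __init__(self, sources):
        self.sources = sources  # mapping of system name -> list of row dicts

    def get_customer(self, customer_id):
        candidates = [
            (system, row)
            for system, rows in self.sources.items()
            for row in rows
            if row["id"] == customer_id
        ]
        if not candidates:
            raise KeyError(customer_id)
        # Reconciliation rule (invented): trust the most recently updated source.
        system, row = max(candidates, key=lambda c: c[1]["updated"])
        return DataRecord(
            payload={"id": row["id"], "name": row["name"]},
            context={"source": system, "updated": row["updated"],
                     "conflicting_copies": len(candidates) - 1},
        )

crm = [{"id": 7, "name": "Acme Corp", "updated": "2006-05-01"}]
erp = [{"id": 7, "name": "ACME Corporation", "updated": "2006-03-15"}]
service = CustomerDataService({"crm": crm, "erp": erp})
print(service.get_customer(7))

The point is the shape: the payload never travels without its context, and conflicts are resolved in one place rather than in every consuming application.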

The SOA Imperative

A two-fold advantage of SOA is that creating services which perform oft-used functions reduces redundant development — and increases agility by making application functionality available across a variety of systems using standardized interfaces and wrappers. The loosely coupled, abstracted nature of SOA has profound implications for the data that the services use, manipulate, and create.

“Do you divvy it up, or do you provide a central service?” asked Starwood's Park when the company began its SOA effort. That question led it down a path many enterprises must travel en route to SOA: a services approach to data based on knowing what data means no matter where it comes from. “SOA raises the fact that data is heterogeneous,” Schmelzer says.

As services exchange data, the potential for mismatches and unmapped transformations grows considerably. “SOA propels this problem into the stratosphere,” Common Sense’s DePalma says. “Put together your first three- or four-way data service,” and you’ll quickly discover the pain of data management. Without an initial data-architecture effort, an SOA won’t scale across the enterprise, says Judith Hurwitz, President of Hurwitz Group.


[Infographic: Master Data Architecture (MDM components and services). Master data governance functions (organize, validate, cleanse and enhance, map and move, secure), workflows, stewardship and business rules, and a taxonomy of relationships surround the master data repositories. Internal sources and destinations include employees, customers, assets and suppliers; ERP, CRM, PLM and SCM services; transaction apps, analytics, content and Web content; and reporting, scoreboards and dashboards. External sources and destinations include global data registries, industry data repositories, trading partners and data cleansing services. At its heart, master data management relies on a set of metadata encapsulated as rules; services and applications call on those rules when working with the data. Infographics: Vikas Kapoor]


The solution, according to analysts and consultants, is to develop a data services layer that catalogs the correct data to use and exposes its context to other services. This approach de-couples the data logic from the business logic and treats data access and manipulation as a separate set of services invoked by the business processes. Without such a scheme, enterprises will find themselves with loosely-coupled business processes that rely on tight data dependencies, eliminating SOA’s core benefit of loose coupling.

This effort is a change from past data integration approaches. “We used to solve data integration by imposing controls at critical choke points,” recalls Schmelzer. “SOA eliminates these choke points, so I have a data integration problem everywhere. That means every data access point has to be able to transform and manage data,” he says.

“Data integration and process integration are inexorably linked,” says Henry Morris, Group VP (integration systems) at IDC. “You need to think of services to manage data. Think about the processes that affect the master data wherever it lives,” he advises.

SOA also raises concurrency issues, notes Nikhil Shah, Lead Architect at the Kanbay International consultancy. For example, data that changes during the process may affect the results, especially in a composite application, as old data is propagated through the process, or when multiple services access the data at different times. Shah recommends that IT implement monitoring services, so that they can determine whether to restart the process or adjust their computations.
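A minimal sketch of the kind of check Shah describes, with every name invented, might wrap each step in an optimistic version test:

class StaleDataError(Exception):
    """Raised when data changed underneath a running step."""

def run_step(fetch, compute, version_of):
    # Snapshot the data's version, do the work, then re-check the version.
    before = version_of()
    result = compute(fetch())
    if version_of() != before:
        raise StaleDataError("data changed during computation")
    return result

db = {"version": 1, "price": 100.0}  # toy stand-in for a shared store
try:
    new_price = run_step(lambda: db["price"],
                         lambda p: round(p * 1.1, 2),
                         lambda: db["version"])
except StaleDataError:
    new_price = None  # an orchestrator would restart or recompute here
print(new_price)

An orchestration layer would catch the exception and decide whether to restart the step or adjust its computation.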

Moreover, the more granular the data services, the greater the impact orchestration overhead has on processes, which could slow response time and create synchronization issues, Shah says. He advises IT to model data management requirements before a service can consume that data. Generally speaking, the more transactional the service, the more the specific data manipulation should be hard-coded into the business logic, he says.

Another SOA data issue is the ‘snowplow effect,’ which occurs when services pass on the context about their data manipulations to subsequent services in a composite application, says Ken Rugg, vice president of data management at Progress Software, which provides caching technology for data management in SOA environments.

Publishing those transformations can help later services understand the context of the data they are working with, says IDC’s Morris. But that can also flood the system with very large data files and slow down each service. IT needs to consider carefully how much context is passed through as aggregated parameters versus limiting that metadata and having the service interface look for exceptions, says Shah.
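As a rough illustration of that trade-off, a service might carry a bounded lineage with each record, trimming the oldest context rather than snowplowing all of it forward. The sketch below is hypothetical; real platforms make this call in their own ways:

def transform(record, fn, note, lineage_limit=5):
    """Apply a transformation, appending a note to the record's lineage
    but capping how much context snowplows forward."""
    out = dict(record)
    out["value"] = fn(record["value"])
    out["lineage"] = (record.get("lineage", []) + [note])[-lineage_limit:]
    return out

r = {"value": "  acme corp "}
r = transform(r, str.strip, "trimmed whitespace")
r = transform(r, str.title, "title-cased name")
print(r)  # the value plus a capped history of what was done to it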

Return of Master Data

The rise of SOA has given vendors reason to revisit their tools to simplify data management, for both SOA and non-SOA environments. Many are now promoting MDM (master data management) tools to help ensure that applications or services use only correct, current data in the right context. ‘Master data’ incorporates not only the data itself but attributes, semantics, and context (metadata) needed to understand its meaning for proper use by a variety of systems. (Some vendors call these systems enterprise information integration, or EII, tools.)
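A toy example may help fix the idea. In the sketch below (all names and rules invented), the ‘master’ is not just values but the rules and metadata that travel with them:

MASTER_RULES = {
    "customer.country": {"type": "ISO 3166 alpha-2 code", "required": True},
    "customer.revenue": {"unit": "USD thousands", "required": False},
}

def resolve(field_name, value):
    """Return the value together with its governing rule, so the metadata
    travels with the data instead of living in each application's head."""
    rule = MASTER_RULES.get(field_name)
    if rule is None:
        raise KeyError(f"no master-data rule for {field_name!r}")
    if rule.get("required") and value in (None, ""):
        raise ValueError(f"{field_name} is required by the master-data rules")
    return {"value": value, "metadata": rule}

print(resolve("customer.revenue", 1250))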

Although not new, the concept was largely relegated to after-the-fact data systems such as data warehouses and business intelligence, notes Bill Swanton, research director at AMR Research. Before SOA, enterprises could largely get away without worrying about master data because most information resided in application suites, where the vendors had at least an implicit, internal master data architecture in place. IT could thus focus just on transmitting processed or raw data between application suites — by creating connectors — and allowing the applications to handle most of the contextual issues, he notes.

SOA’s many-to-many architecture no longer allows IT to leave the problem to application vendors and to limited integration conduits. Even non-SOA environments, though, benefit from moving from the one-off approach of creating connectors to a more rationalized data architecture that makes integration simpler, Swanton says.

Some providers, including IBM, Informatica, Oracle and Siperian, approach the issue as an operational data warehouse, providing one or more data hubs that services access both from stores of cleansed data and from services that generate validated data from other applications as a trusted broker. These emulate the hub-and-spoke architecture common to traditional enterprise environments. Others, such as BEA Systems and Xcalia, approach the issue at a more federated level to better mirror the loosely-coupled, abstracted nature of an SOA.

Analysts and consultants warn that today’s technology is very immature and at best can help only specific data management processes. “There is no silver bullet,” says Shawn McIntosh, Senior Manager at consultancy AGSI. For example, Starwood’s Park notes that his IT group is hopeful that IBM’s planned Systems Integration Bus will provide a way to manage the data services in the hotelier’s SOA. “But we can’t wait for the tools to come out,” he says.

Many of the data hubs offered are geared to one data subject, such as customer or product information. That’s fine as an initial building block; later, however, IT will have to generalize the hub or work with a federation of specific data hubs, says Satish Krishnaswamy, Senior Director of MDM business at i2. “We won’t ever get to one single hub, so IT should instead work toward a standard canonical [hierarchical] view of data across its sources,” says Morris.

To make the scope manageable, IT organizations generally define the rules and context for one subject area and then extend the system out to other subject areas over a period of time. That’s what Carlson Hotels is doing, starting with the customer-oriented hub IBM acquired from DWL. But, according to Kolodziejczyk, the hospitality company is not yet sure if it will extend that hub to include product data or use the product hub IBM acquired from Trigo.

Deciding whether to start with a subject-specific system — such as product information within SCM — or a generalized system depends on how targeted the integration efforts are to specific application suites. It may make more sense to start with a subject-specific hub if your focus is on interactions with an ERP or SCM system, whereas a generic hub makes more sense if your focus is on an SOA in which services interact with a wide variety of applications.

The Data Architecture

MDM tools can help, but they do little good if the enterprise doesn’t understand its data. “I see a fair amount of hype around the concept of master data management,” says Fred Cummins, a Fellow at EDS. Because centralized data stores typically deal with after-the-fact results, not with states and transactions, the more an MDM system looks like a traditional data warehouse or master database, the less likely it is to meet the needs of a transactional system, whether in a traditional or SOA environment, Cummins says. “It’s unrealistic to expect that there is one master database that everything reads or feeds. Some of the data is transactional,” concurs Paul Hernacki, CTO of consultancy Definition 6.

For an SOA, MDM tools that simply re-package EAI tools are not very helpful, Cummins says. That’s because an SOA should be driven by business processes, whereas EAI typically focuses on connecting applications together without worrying about the underlying data context for each. Even for traditional integration efforts, “you can’t just put in middleware and off you go,” adds Brian Babineau, an Information Integration Analyst at Enterprise Strategy Group.

“Primarily, it’s a design issue,” echoes ZapThink’s Schmelzer. “We have great tools for databases, messaging, transformation, etcetera,” to implement the design, he adds. Designing the architecture and the specific services correctly requires that developers understand all the data used and generated by services and the applications they interact with — a labor-intensive process.

That’s why IT needs a commonly accessible set of data services or at least data mappings. “At some point this will have to be formalized as a repository,” Common Sense’s DePalma says. Critical for an SOA, this approach is also very useful in traditional environments, he adds.

With those mappings created, IT can then focus on building the connectors or services that implement them. IT must understand which mappings should be available to multiple services and applications — and thus implemented as separate processes — and which are endemic to specific business logic and should be encapsulated with that business logic, consultant Hurwitz says.

Many enterprises have avoided such data architecture efforts because there’s no obvious ROI, notes DePalma. Some remember earlier-generation efforts such as custom data dictionary creation, which also involved understanding the organization’s data architecture; by the time they were completed, they were outdated. Fortunately, IT can approach the data understanding incrementally, creating the rules and metadata around the information used for specific applications’ or services’ needs, says Marcia Kaufman, Partner at Hurwitz Group. Over time, the enterprise will build up a complete data architecture. “It’s a long-term journey,” says Hurwitz.

That data architecture will typically include multiple data models, each oriented to a specific type of subject or process, notes Paul Patrick, chief architect at BEA Systems. That actually helps IT by allowing the data architecture to be developed in stages, plus it limits the mapping required between data models. (A unified data model must account for all possible mappings, whereas a federated model does not.) Furthermore, IT should concentrate on dealing with exceptions, says ZapThink’s Schmelzer. For example, IT should develop services that check for data that are out of normal bounds, rather than try to develop an enterprisewide ontology that maps out every possible state or relationship, he says.
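An exception-checking service of the sort Schmelzer describes can be surprisingly small. This sketch is illustrative only; the fields and bounds are made up:

BOUNDS = {
    "order.quantity": (1, 10000),
    "order.unit_price": (0.01, 1000000.0),
}

def check_exceptions(record):
    """Flag out-of-bounds values instead of modeling every valid state."""
    problems = []
    for field_name, (lo, hi) in BOUNDS.items():
        value = record.get(field_name)
        if value is not None and not lo <= value <= hi:
            problems.append(f"{field_name}={value} outside [{lo}, {hi}]")
    return problems

print(check_exceptions({"order.quantity": 0, "order.unit_price": 19.99}))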

Ultimately, the enterprise should build up layers of data services in which master data is distributed, says William McKnight, Senior VP (data warehousing) at consultancy Conversion Services International, although the infrastructure and tools to deliver on this goal aren’t yet mature.

Getting It Right

Provisioning data sources as services across an organization is a monster undertaking. For a traditional integration effort, it means understanding the context within each application and how data is transformed for delivery to other apps. For an SOA, it requires understanding the multiple relationships and dependencies data can have with various business processes. “There are so many variables here,” notes Common Sense’s DePalma.

Analysts and consultants agree that this complexity requires both an upfront investment in modeling data architecture and an ongoing effort to systematically think through data dependencies and context. Discovering the data models and relationships among systems to create the mappings is about 70 percent of the effort in an SOA’s data architecture, says IDC’s Morris. At GGB, IT director Kenngott said the modeling and discovery effort was about 30 percent of the data-integration effort within its ERP consolidation project.

That initial push is well worth it, argues Park. “Otherwise, you can get pretty far along with your project and discover that you have 10 fields that you don’t need, 10 that you do but didn’t know when you designed the service, and five that are different than you thought. Complex systems with hundreds of services need to have these interfaces nailed down.”

In most organizations, the tough slog of codifying interfaces and reconciling distributed data models is long overdue. But today, with the majority of large organizations pushing ahead with some sort of SOA initiative, the natural inclination to avoid this ugliest of hairballs can no longer be sustained. “The problem is too big to sweep under the rug any more,” says Conversion Services’ McKnight. CIO

Reprinted with permission. Copyright 2006, InfoWorld. Send feedback on this feature to editor@cio.in.


Finding a Home for Metadata

Working with data across an enterprise — especially in an SOA environment — requires understanding its context and semantics, not just its format and field attributes. And that means metadata. For developers as well as services to track that metadata, a repository would be useful. Theoretically, repositories would provide the intermediary services, but with today’s technology, “this is just too ... hard to do,” says Paul Patrick, chief architect at BEA Systems. “No one has assembled the pieces yet.”

The metadata repositories in use today tend to be part of ETL (extract, transform, load) and BI systems, says William McKnight, Senior Vice President (data warehousing) at consultancy Conversion Services International. “Standalone repositories are complex, mainframe-oriented, very expensive, and not integrated with modern tools,” he says.

“Previous efforts at a metadata repository were a debacle,” says Don DePalma, president of the Common Sense Advisory consultancy. In addition to high licensing costs, “the work to create an encyclopedia of all applications, with its indeterminate benefit, was too high,” he says. Not only were the tools “too academic”, they asked developers to adhere to very formal processes at a time when “all of this formalization went out the window with the move to HTML” and the quick-and-dirty development of the early Web period, DePalma says.

But vendors are now revisiting the metadata repository concept. Some are incorporating the technology in their information management tools. For example, Xcalia uses an XML table-based metadatabase in its Intermediation platform, which allows IT to create metadata-based transformation rules, so services can interact with data sources in a consistent way that is mindful of the data’s context and semantics. The company hopes to develop a standalone metadata repository that allows these rules to be used by multiple applications, says Eric Samson, Xcalia’s CTO. And Informatica uses a metadata repository in its PowerCenter data federation data-integration platform, notes Ashutosh Kulkarni, its principal product manager.

Other vendors, including BEA Systems and IBM, are also working on less expensive, easier-to-implement metadata repositories. “The master data management has to be in a repository, whether the architecture is federated, distributed, or centralized,” says Dan Drucker, director (enterprise master data solutions) at IBM.

— G.G.


Piecing Together Storage Management

In a world where compliance encourages companies to save just about everything, emerging standards promise flexibility and control for heterogeneous storage environments.

By Logan G. Harbaugh

Storage spawns where it’s needed, from sensibly architected SANs serving transaction-intensive systems to storage appliances bought impulsively to fill a departmental need. That leaves IT to manage many islands of storage strewn across the enterprise at a time when the need for centralized storage management has never been greater. Compliance requirements, multimedia-rich applications, and a proliferation of databases are pushing IT departments to increase the size and complexity of storage networks across the enterprise.

“I tell our senior management that we grow our storage at a rate of 40 to 50 percent per year and they can’t believe it,” says Lev Katz, Datacenter Operations Manager for EMC storage customer MidAmerica Bank. “But then, if our business grew 30 percent last year, it makes sense for storage to grow the same amount, if not more. You have that many more people, you have that much more e-mail, you have that many more files.”


Point solutions and hands-on labor are no longer enough. To help IT wrap its arms around the storage problem, major storage vendors offer a range of storage-management software that enables administrators to find and manage all the components of a storage area network.

“Automated storage management is an easy means to create operational efficiencies and help reduce IT costs,” says Matt Fairbanks, Director of Product Marketing for storage management company Veritas. “The IT workforce is more productive and they are able to deploy more assets, increase storage capacity, and reduce complexity.”

Each vendor’s applications vary in the number and type of SAN devices they support. If you’re lucky enough to have standardized on a single server platform and single storage vendor on a single Fibre Channel SAN, your environment will be relatively easy to install and manage. However, distributed, heterogeneous, wide-area SANs can be tough.

“As coincidence would have it, 90 percent of our storage is EMC-provided,” says Scott Roemmele, SAN Engineer Team Leader for online mortgage lender Quicken Loans. “But we do design most of our platforms to be open vendor — they don’t really have to be used with one particular thing. EMC Control Center actually has a lot of open-endedness to where it will actually recognize other vendors’ storage as well.”

A Standard Solution

Currently, most would-be SAN and storage-management applications have to rely on published APIs from other vendors to enable communication.


Towards a Universal Standard

The Storage Networking Industry Association (SNIA) was formed with the aim of developing standards for storage hardware and software. One of its most prominent efforts to date has been SMI-S, the Storage Management Interface Specification. SNIA ratified SMI-S 1.0 in July 2003 and it was approved as an ANSI standard in October 2004 (and should soon be approved by ISO).

This initial version of the specification covers communication between hardware and management applications, allowing management of any compliant HBA, switch, RAID array, and so on. Last year, SAN-management applications began SNIA conformance testing with SMI-S for discovery of arrays, switches, fabrics, tape libraries, and HBAs, including asset management and reporting on the properties of SAN devices.

Driven by the SNIA’s Storage Management Forum, version 1.1 of SMI-S, which was created to develop and standardize interoperable storage management technologies, has achieved many milestones. The newest version of the standard is on track to be submitted to the International Committee for Information Technology Standards (INCITS) for ratification as an ANSI standard.

Currently, SMI-S v1.1 is being put through the paces of the SNIA Conformance Testing Program, which helps enable interoperability by validating a vendor’s implementation of the specification. To date, 214 products from eleven vendors — including ADIC, Brocade, CA, Cisco, EMC, Fujitsu, HP, Hitachi Data Systems, IBM, LSI Logic and McData — have passed. This builds upon the 200-plus products that have already passed conformance testing for the previous SMI-S v1.0.3.

According to Ray Dunn, industry standards marketing manager at SNIA, the 1.1 version adds a variety of services, including tracking replication, mirroring and other data-duplication functions at the block level, support for the iSCSI and FCIP (Fibre Channel over IP) protocols, and management of NAS. The revision will include more file- and data-management functions, including enabling file shares, change management, host-volume management, creation of storage pools in SAN arrays or NAS devices, and provisioning of volumes.

Additional features, including performance monitoring to track I/O across heterogeneous SANs, health and fault management, normalized alerts, state changes, and error messages, will make it easier to find and fix problems across a heterogeneous SAN. Policy management with rules-based automated operations [to create new LUNs (logical unit numbers) as needed, for example, with the appropriate RAID level and security] and security enhancements such as role-based authentication, identity management, and provisions for encryption of management traffic and data streams will also be added.

In addition, Dunn says that by the 1.2 version of the specification, we’ll see services added to SMI-S that will enable ILM (information lifecycle management). SMI-S is also extensible, so vendors can develop new features independently, and then work to add support for those capabilities into future versions of the specification.

— L.G.H.


This is changing, however, with the widespread adoption of SMI-S 1.0 (Storage Management Interface Specification) for communication between SAN devices. So far, most management applications and the SMI-S specification cover only management of SAN hardware. Managing data is trickier, particularly ILM (information lifecycle management), which involves controlling data retention and the migration of data between storage tiers.

Although most vendors make a considerable effort to develop or license the technology for communicating with other vendors’ products, the wide variety of products on the market, the speed at which these new products are appearing, and the continuous development of new technologies make it extremely difficult for any platform to support everything. On the other hand, customers are telling the vendors in no uncertain terms that they won’t buy products that can’t manage a heterogeneous environment.

“Many customers today are already doing some form of storage management and virtualization,” says Veritas’s Fairbanks. “In talking to our customers about the future, we’ve found that they are looking for robust storage management and virtualization features that enable common storage IT practices across multiple OS and hardware storage platforms.”

That’s why almost every vendor in the industry, major and minor, is a member of the Storage Networking Industry Association (SNIA), which works to improve interoperability by drafting the SMI-S specification (see “Towards a Universal Standard”). The first version covers communications between devices on a SAN, and communications between a management application and the devices. Forthcoming versions in development address topics beyond the SAN hardware, including management of data services such as backups, replication, snapshots and ILM.

SMI-S is not a device specification. Instead, it covers how devices and applications on the SAN communicate with each other. It uses two existing technologies: CIM (the Common Information Model) — originally developed for LAN technologies — and XML to pass data between devices. Because SMI-S is extensible and continues to evolve, it will be able to address future needs of SAN management as necessary.
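Because SMI-S rides on CIM, a generic CIM client can talk to a conformant provider. The fragment below uses the open-source pywbem library to list the profiles a device advertises; the host, credentials, and namespace are placeholders, and vendors vary in where they expose their interop classes:

import pywbem

# Placeholder endpoint and credentials; 5989 is the customary CIM-XML
# HTTPS port. The interop namespace name varies by vendor.
conn = pywbem.WBEMConnection(
    "https://array.example.com:5989",
    creds=("admin", "secret"),
    default_namespace="interop",
)

# SMI-S providers advertise the profiles they implement via
# CIM_RegisteredProfile instances.
for profile in conn.EnumerateInstances("CIM_RegisteredProfile"):
    print(profile["RegisteredName"], profile["RegisteredVersion"])

CIM_RegisteredProfile is the standard CIM class providers use to advertise which profiles (Array, Fabric, and so on) they implement.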

Struggling With SMI-S

SMI-S won’t solve storage-management problems overnight, says Jeff Hornung, Vice President of the Gateway Business Unit at Network Appliance, “but it should eventually allow a broad scope of storage management from a single platform. The more things that are adopted into the SMI-S spec, the better things will get for everyone — including us — though we may have to work harder to innovate.”

Tom Rose, Vice President of Marketing for AppIQ, agrees. “SMI-S is like SNMP in the early LAN days — it will take time to get everyone on board, but we’re seeing more companies get on board all the time. SMI-S has won in the sense that 100 percent of vendors have committed to it, though less than 70 percent are currently supporting it.”

It will be a while before SMI-S lives up to its potential. As Jack McDonnell, chairman and CEO of Crosswalk, says, “The lack of maturity of the SMI-S spec and the dearth of available SMI-S capability to date makes integrating with most applications a challenge, due to the necessity to use published APIs to communicate with each different application.”

McDonnell says that so far, even when SMI-S is supported by a device or application, all the needed information isn’t necessarily available. Not every field may be fully populated, or data may be in the wrong fields. He estimates that currently half of the information needed to successfully manage storage devices comes from SMI-S and CIM. The rest of the data is gathered using device and application APIs or SNMP.

The immaturity of the specifications is enough to keep SMI-S off the radar for many customers. “I wouldn’t say [SMI-S] is one of our top priorities to have,” says Quicken Loans’ Roemmele. “We really just haven’t found a specific need where it has to be at this point.”

But, to help fill the void for customers who do require SMI-S support, AppIQ has developed ‘wrapper’ technology, which translates the APIs for storage hardware from many vendors into SMI-S. Crosswalk’s McDonnell is optimistic that things will continue to improve as long as end-users and resellers push for SMI-S. “There’s no excuse for vendors with new storage products to create proprietary management interfaces,” he says. “They should support CIM and SMI-S.”
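The wrapper idea is ordinary adapter code. A hypothetical sketch, with the vendor API and numbers invented rather than taken from AppIQ:

class VendorXArray:
    """Stand-in for a proprietary array-management API (invented)."""
    def capacity_kb(self):
        return 4294967296  # capacity reported in this vendor's units (KB)

class CommonModelAdapter:
    """Translate the proprietary call into the common model a
    management platform expects, in the spirit of AppIQ's wrappers."""
    def __init__(self, device):
        self.device = device

    def total_capacity_bytes(self):
        return self.device.capacity_kb() * 1024

print(CommonModelAdapter(VendorXArray()).total_capacity_bytes())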


Plan of Attack

Of course, organizations are already using inventory and management capabilities in existing SAN management tools. Having a clear picture of where your storage is and how much it's being utilized can produce significant benefits (see “Creating Order From Chaos”).

“It’s really geared toward having information at hand as quick as possible, without having to make an excuse — that we’ve got to get it from tape, or we have a down drive on one system and we have to get information restored from tape, or anything like that,” says Quicken Loans’ Roemmele.

At the moment, available management platforms allow administrators to perform basic management of a fairly wide variety of SAN hardware. Will it be possible to manage data services such as replication, virtualization, and ILM as well? As long as everyone in the industry continues to support SMI-S and implement the newer versions of the specification as they are ratified, the answer should be yes.

For now, managing a SAN is a matter of matching the components you have with the components that the management platforms support, adding in point solutions where necessary, and working to integrate the whole. More ambitious capabilities, such as ILM, will continue to be difficult to implement until the SMI-S 1.2 spec is ratified and integrated into management platforms — probably in two or three years.

Until then, you’ll see a lot of parallel development from all vendors, with storage hardware and applications based on proprietary management protocols but including hooks for SMI-S support. For the long term, increased standardization will simplify integration, which in turn should drive down prices for management components.

As storage components become increasingly standardized, you might expect industry consolidation to follow — but don’t count on it. In a market that offers commonplace, widely understood APIs, even small companies will be able to produce innovative storage products that easily integrate into the overall data-storage network.

One consistent theme you’ll hear from storage vendors is that they support interoperability standards such as SMI-S because their customers are demanding it. As long as that demand continues, the vendors will continue to move toward a truly interoperable SAN environment. CIO

Reprinted with permission. Copyright 2006, InfoWorld. Send feedback about this feature to editor@cio.in.

Creating Order From Chaos

You don’t have to be a large enterprise to take advantage of storage-management technologies. Vanasse Hangen Brustlin (VHB), a 700-person engineering consulting firm specializing in transportation, environmental, and land-development services, has up to 3,000 projects in development at any given time. The records for these projects represent a few months to several years of work and are stored on servers in the firm’s 17 offices located throughout the Northeast. For Greg Bosworth, Director of IT at VHB, data management for these projects involved a series of manual processes that had become increasingly complex and labor-intensive as the volume of stored records reached approximately 10TB.

During the past 18 months, Bosworth has been using VisualSRM, an SRM (storage resource management) application from EMC designed for multi-tier, multivendor environments. The offering has helped make management of VHB’s storage environment — including a central EMC CX400 networked storage system, 20-plus servers, and some AX100 storage subsystems at remote offices — more proactive and less labor-intensive across the enterprise. It has also reduced eight to ten hours of work per week to half an hour or less.

According to Bosworth, before VHB deployed VisualSRM, managing and projecting storage growth was difficult. Offices would dump new projects onto the network, causing it to run out of space and resulting in last-minute fire drills. Tracking project life spans — which could last anywhere from six months to five years — was also an issue, because archiving completed projects and recovering the space was a manual process.

Within a couple of weeks of installing VisualSRM, VHB was able to analyze space requirements and identify files that could be removed. For example, VHB engineers frequently make AutoCAD plots during project development. Although the plot files are typically never used again, they were not being automatically deleted. VisualSRM allows VHB to automatically purge old plot files as well as temp files, old backup files, and other disposable items, freeing up considerable space. It can also correlate with VHB’s accounting application to identify all files associated with finished projects. The next step will be to automate moving finished projects to second-tier storage.

Bosworth says it’s now easier to manage growth and keep on top of the need for storage expansion. VHB is now at 79 percent utilization of storage capacity for the whole network, whereas they were often at 50 percent utilization or less. With a much clearer view of utilization, there’s no need to keep too much extra capacity online, which results in substantial cost savings, as does the reduction in the time needed to manage data.

— L.G.H.


Taking Charge of the Enterprise Information Lifecycle

New, more intelligent storage tools promise better information management from start to finish.

By Leon Erlanger

There’s a stage in the life of a new technology in which half the world thinks it’s a whole new paradigm and the other half thinks it’s all hype. Half says it will never happen and the other half says, “We’re doing it now.” And even the most improbable vendor claims to have strategies and products to support it. So it is with ILM (information lifecycle management).

The current darling of the storage industry, ILM is based on two simple concepts. First, not all information has the same value to the organization. Second, whatever value information has tends to change over time.

If these assumptions are true, then why apply the same level of expensive storage, management, and protection to all information in an enterprise? By moving less-valuable information to less-expensive storage and applying appropriate levels of protection to each storage tier, companies save money and reserve high-end resources for the information that demands them.

The result: mission-critical systems are less bloated, more stable, and better performing. Backup windows shrink, storage runs out less often, upgrades are less frequent, and the overall cost of storage and storage management drops.

That’s the idea, anyway. Given such a tall order, it makes sense to be skeptical about what, if anything, ILM can do for you on an enterprise scale. But once we stopped worrying about the ‘grand vision’ of ILM and focused on the reality, we found that a number of nascent, policy-based point solutions are already providing real benefit to organizations challenged by exploding storage and complex compliance requirements.

What It Is and Is Not

Superficially, at least, the ILM concept resembles earlier storage technologies, including HSM (hierarchical storage management) and DLM (data lifecycle management). Whereas DLM focused on data as the unit of storage and HSM tended to associate data with applications and moved that data based on a single criterion — time — ILM sets policies based on the value of the information that data carries, regardless of the application or time.

“In terms of information, HSM is brain-dead,” says Jeremy Burton, Executive Vice President of Veritas’ Data Management Group. For example, he says, one e-mail might require a different storage policy from the next, depending on its subject, sender, or relationship to a particular lawsuit. Similarly, health records don’t always decrease in value; they may have to be quickly accessed if a patient has a recurrence. In these cases, it’s the information contained in each parcel of data that’s important, not the data itself.

The other difference is protection. “In some ways ILM is like HSM, but you protect each tier differently,” says Nancy Hurley, a senior analyst at Enterprise Strategy Group. “So you may snap tier 1 every few hours and do incremental backups every day. Tier 2 only gets backed up once a week. Tier 3 never gets backed up; you replicate it and that becomes your store.”

Finally, in most cases, ILM assumes that despite migration or archiving, data will continue to be accessible for a long time, either as an identical archive instance or as a searchable repository.
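Hurley’s tiering scheme is easy to picture as configuration rather than code. A hypothetical rendering in Python, paraphrasing the frequencies from her example:

TIER_POLICY = {
    1: {"snapshot": "every few hours", "backup": "incremental daily"},
    2: {"snapshot": None, "backup": "full weekly"},
    3: {"snapshot": None, "backup": "replicate only; no tape backup"},
}

def protection_for(tier):
    """Look up the protection service level attached to a storage tier."""
    return TIER_POLICY[tier]

print(protection_for(3))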

Smarter Storage Now

The two principal drivers behind ILM are exploding storage management costs and compliance. Which one is more important depends on whom you talk to. “Many people assume it’s compliance that’s driving ILM,” Hurley says, “but only two out of 10 users I interviewed cite compliance as the main reason they are interested. Most of the rest cite cost savings.”

Take the North Bronx Healthcare Network, which oversees several New York City public health facilities. “We did some analysis and found that 84 percent of our data is stagnant,” says CIO Daniel Morreale. “So using EMC’s DiskXtender software, we applied some business rules to move the data automatically from our EMC Symmetrix DMX (direct matrix architecture) storage to a less-expensive NAS (network-attached storage), if [the data] isn’t used for six months, and then to our EMC Centera CAS (content addressed storage) fixed content storage six months after that.”

All of the files, however, are easily accessible to users. “The difference between accessing files on the SAN and NAS is imperceptible,” Morreale says, “and getting files off the CAS takes maybe an extra one and a half seconds.” Morreale says this tiered storage model lowered his storage and staffing costs significantly and enhanced business continuity, in addition to aiding HIPAA compliance.

Michael Howard, CEO of ILM vendor OuterBay, says compliance issues account for much of his business. For example, when Tektronix consolidated its Oracle systems from 27 countries to two locations in Beaverton, Oregon, its storage requirements exploded and compliance issues became much more complex.

“In the US, a customer invoice has to be retained for five years,” says Lois Hughes, Senior Manager of Business Application Systems at Tektronix. “But in Italy, it’s 12 years and in China, 15.”

Tektronix deployed OuterBay’s Live Archive to move transactions from its production environment to a less-expensive read-only archive storage tier after two years. Different levels of protection are applied to each tier because stable data doesn’t need to be replicated or backed up as often as live data. And data on the archive tier is readily available to users. “It looks just like the production environment; no user training required,” Hughes says.

The next step will be to move data after six years to a third tier: OuterBay’s Encapsulated Archive, a self-describing XML archive store. “This brings us from the huge, demanding Oracle application environment to compact, Oracle-release-independent XML storage. We can still run queries and reports. SQL code identifies the owner of the transaction, so the system will know that if the legal owner is Tektronix Germany, it should purge after 10 years and a day.”
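The jurisdiction-by-jurisdiction retention rules Hughes describes reduce to a small lookup. A sketch using the periods cited in the article (the code itself is invented, not Tektronix's):

from datetime import date, timedelta

# Retention periods cited in the article: US 5 years, Italy 12, China 15,
# and, per the Tektronix quote, Germany 10 years and a day.
RETENTION = {"US": (5, 0), "IT": (12, 0), "CN": (15, 0), "DE": (10, 1)}

def purge_date(invoice_date, owner_country):
    years, extra_days = RETENTION[owner_country]
    # Naive year arithmetic; a real system would also handle Feb. 29
    # and any fiscal-calendar rules properly.
    kept_until = invoice_date.replace(year=invoice_date.year + years)
    return kept_until + timedelta(days=extra_days)

print(purge_date(date(2006, 6, 15), "DE"))  # 2016-06-16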

Frank Harbist, vice president and general manager (storage software and ILM) at Hewlett-Packard, sees yet a third driver: information leverage. “We see more and more companies wanting to use information as a way to help run their business more effectively,” he says. “They want more of it accessible, so they can take advantage of data mining, business decision support, and analytics tools to gain competitive advantage.”

Vision vs. Reality

How long will it take to achieve the full ILM vision? Experts agree only that it is at least a few years away. Missing from today’s ILM offerings is the enterprisewide, single-console ideal — tools that would allow an enterprise to classify all its information according to value, set up a single system of storage tiers, and apply migration and protection policies across it all using a single management tool. Much more common are point solutions, each with different emphases and capabilities.

For example, companies such as OuterBay and Princeton Softech concentrate on structured data found in Oracle databases, as well as CRM, ERP and supply-chain-management systems. Other solutions from EMC/Documentum, HP and Ixos target unstructured data such as files and images. E-mail archiving solutions from iLumin, Ixos, Veritas, and Zantaz focus almost solely on messaging. StorageTek has separate point solutions for structured data and e-mail.


SNIA Works Toward ILM Standards

Running an ILM strategy is neither simple nor straightforward for any organization. For one thing, although the point solutions offered by storage vendors today address parts of the problem, true ILM must encompass the whole datacenter.

What’s worse, although nearly every vendor seems eager to talk about ILM, it’s often difficult to get a clear picture of what they really mean when they say it, leaving too many questions unanswered. How effectively can ILM be applied to an existing data infrastructure? What are the best practices? What long-term goals are vendors working toward? What are the stages of adoption? In the absence of generally accepted, industrywide standards, charting a course of action within an enterprise IT department can seem like an exercise in blind faith.

To address this problem, the vendors, IT professionals, and systems integrators comprising the Data Management Forum of the Storage Networking Industry Association (SNIA) have joined together to form the ILM Initiative, the goal of which is to unify current efforts around ILM in a single, cohesive approach — a vision, if you will. Like any good vision, the ILM Initiative’s begins with a definition of terms. According to the group’s members, ILM is “a new set of management practices based on aligning the business value of information to the most appropriate and cost-effective infrastructure.”

What’s significant about this definition is that it recognizes that implementing ILM is not as simple as plugging a software solution in to an existing infrastructure. Rather, ILM relies heavily on processes and the people who execute them. The SNIA ILM Initiative will continue to develop further definitions, reference models, and educational materials that will help customers and vendors implement ILM without being tied to specific solutions or technologies.

Working alongside the ILM Initiative is another group within the SNIA, the ILM Technical Working Group, whose goal is to develop best-practice guidance for specific parts of the overall ILM vision. For example, if classifying enterprise data is essential to an ILM strategy, how should an organization go about defining and applying classifications to its data? Or, how should an organization go about moving data between different tiers of storage?

The SNIA’s work on ILM is now well underway and more activities are scheduled. Still, SNIA members are quick to point out that standardizing ILM will probably take several years. As yet, the industry is still in the early stages of defining this important new direction for enterprise storage.

— Neil McAllister


Various other point solutions are available from such vendors as Hitachi Data Systems, IBM, Network Appliance, and Sun.

Second, most current solutions calling themselves ILM only move and archive data. Protection at each tier — in the form of mirroring, replication and backup — is generally left to the storage manager for implementation using other solutions. The long-term vision of ILM assumes that a management architecture will tie the two together and manage them as one, possibly as just another feature of an overall storage management platform.

For information awareness to truly become a reality, the applications that capture and use information will inevitably have to be involved as well. Currently, products such as EMC’s Xtender line and OuterBay’s Application Data Management suite act as a kind of application-aware middleware sitting between individual applications and storage, providing information awareness and policy-based data movement and disposal. Also, Oracle Database 10g provides some of its own data partitioning and storage tiering and management capabilities.

“There has to be a marriage between the storage and applications,” Hurley says. “Once I’ve [assigned a value to] my information, all the applications that will use, move, migrate, protect and retain it should understand that valuation from the moment it’s created. It’s going to come from the vendors providing open APIs to work with each other and a level of integration such that policy engines understand each other.”

Legacy applications are often critical roadblocks to enterprisewide ILM. “We thought we could use our ILM solution to move information from our 12-year-old clinical information system to [EMC’s Centera] CAS,” Morreale says, “but in fact we can’t because the application doesn’t allow the freedom to move data dynamically.”

Metadata will also be a key provider of information awareness. “There are all sorts of things people will want to trap,” says Ken Steinhardt, Director of Technology Analysis at EMC. “They’ll need multiple metadata views — not just how frequently something was accessed, but also tagging things that might be associated with a sensitive project with a 20-year retention requirement.”

Eventually, fulfilling the ILM vision may require standards as well. SNIA’s Data Management Forum is in the early stages of crafting an ILM model, but most experts agree that ILM standards are many years away (see “SNIA Works Toward ILM Standards”).

Forget the Vision

The reality may be that enterprise ILM is too huge a project for many companies to take on. “ILM has expanded to mean everything in storage hardware, software, and services,” says Jeremy Burton of Veritas. “Customers don’t know where to start because ILM sounds like some kind of ERP project that will grow out of control and take 10 years.”

One way companies can cope is to stop worrying about the vision. Instead, start with the areas that are giving you the most pain. For many organizations, e-mail is a major source of pain and a great place to start, particularly with its compliance challenges. Others may find that ERP or CRM data hurt the most.

Wherever the pain is, the first step is a careful process of information discovery, analysis and classification. Enterprise Strategy Group’s Hurley recommends taking advantage of SRM (storage resource management) applications. “They’ll tell you quickly what you can get rid of and what’s taking up the most data, and you’ll be able to see access patterns clearly,” she says. “You’ll probably be surprised at what you find.”

The result of this analysis should be a system that puts your information into categories based on performance, protection, and retention requirements during its lifecycle. Then, based on the storage needs identified by each classification, decide on a series of storage tiers, each with its own appropriate performance, availability and protection service levels. Finally, investigate policy-based automated data-moving solutions, such as those from EMC, HP, Veritas and others, which address your requirements.
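To make that concrete, here is a minimal sketch of the kind of policy engine such data-moving solutions implement. The tier names, age thresholds and the use of last-access time as the classification signal are illustrative assumptions for this sketch, not the policy model of any particular product.

import os
import time

# Illustrative tier policy: maximum days since last access -> target tier.
# Tiers and thresholds are assumptions for this sketch.
TIER_POLICY = [
    (30, "tier1-high-performance-array"),
    (180, "tier2-low-cost-disk"),
    (7 * 365, "tier3-tape-archive"),
]

def classify(path):
    """Return the storage tier a file belongs to, based on last-access age."""
    age_days = (time.time() - os.stat(path).st_atime) / 86400
    for max_age_days, tier in TIER_POLICY:
        if age_days <= max_age_days:
            return tier
    return "review-for-disposal"  # past every retention window

def scan(root):
    """Walk a directory tree and report where each file should live."""
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            print(f"{classify(path):32s} {path}")

scan("/data/shared")  # hypothetical mount point

A real policy engine would act on the classification, migrating the file and updating its metadata rather than just reporting it, and would fold in the protection and retention service levels discussed above.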

Many companies start ILM with one application or department, or to solve a particular problem, such as compliance. The key is to get familiar with the process and see what it can do for your organization. Then, you can argue about the ILM vision over lunch. CIO

Reprinted with permission. Copyright 2006, Infoworld. Send feedback about this feature to editor@cio.in.

ompanies don’t know where to start because IlM sounds like an ErP project that will grow out of control. you can cope you can cope yby not worrying about the vision. Start with the areas that are giving you the most pain.

ompanies don’t know where to start because Ilike an Eby not worrying about the vision. are giving you the most pain. C


Virtues of Virtual Storage

So, virtualization seems hard. But there can be no second-guessing its benefits. The way for companies to get on board is to take baby steps with point solutions. Here are three approaches.

By Galen Gruman

As senior director of enterprise technology operations at Corrections Corporation of America (CCA), a prison management firm that handles more than 60 facilities, Brad Wood faces several challenges. His group manages approximately 100TB of data — including inmate medical records, operational records, and e-mail — across four Hitachi Data Systems (HDS) storage arrays in two datacenters. Because of federal and state rules, much of the company's data is mirrored three or four times to keep it accessible in case of failure. Adding to the complexity, Wood buys his hardware based on price and performance, so he has a mix of suppliers.

With escalating costs, Wood needed a way to slow deployment of new storage hardware and make better use of existing disparate hardware. He chose to implement storage virtualization.

The idea behind virtualization sounds deceptively simple. It aggregates storage systems such as arrays from multiple providers into a networked environment that can be managed as a single pool. In Wood's case, his storage engineer can now manage the company's various hardware from one console, using Symantec Veritas Storage Foundation and HDS HiCommand.
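Under the hood, that single pool rests on a layer of indirection: metadata mapping each logical extent of a virtual volume to a physical extent on whichever array holds it. Here is a minimal sketch of that mapping idea; the device names and the one-megabyte extent size are invented for illustration.

# Sketch of the indirection at the heart of storage virtualization:
# a virtual volume is a table mapping logical extents to (device,
# physical extent) pairs. Device names here are invented.
EXTENT = 1 << 20  # 1 MiB per extent, an arbitrary choice for the sketch

class VirtualVolume:
    def __init__(self):
        self.map = []  # logical extent index -> (device, physical extent)

    def allocate(self, device, phys_extent):
        """Grow the volume by one extent carved from any pooled device."""
        self.map.append((device, phys_extent))

    def locate(self, byte_offset):
        """Translate a logical byte offset to a physical location."""
        device, phys = self.map[byte_offset // EXTENT]
        return device, phys * EXTENT + byte_offset % EXTENT

    def migrate(self, index, device, phys_extent):
        """Remap one extent to a new device: data moves, hosts don't notice."""
        self.map[index] = (device, phys_extent)

vol = VirtualVolume()
vol.allocate("hds-array-a", 17)    # extents can come from different vendors
vol.allocate("emc-array-b", 4)
print(vol.locate(1_500_000))       # resolves to a spot on emc-array-b
vol.migrate(0, "sata-array-c", 9)  # transparent migration to a cheaper tier

The three approaches described below differ mainly in where this map lives and who maintains it.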

Virtualization has several key benefits other than improved resource utilization. For one, it allows data to be moved to any storage device in the pool when subsystems fail. Easier data migration also makes it possible to implement a tiered architecture, in which data is moved to less expensive, lower performance systems as it becomes less business-critical.

Another benefit is easier replication, because virtualization removes the need for full redundancy. With virtualization, it’s easier to copy partial data — such as snapshots or delta files — and keep it linked to an entire data set across physical devices.

And maintenance costs are always a factor. "[Virtualization] also reduces the storage administration burden over time by getting storage folks out of the server business," notes Rick Villars, VP (Storage Systems Research) at IDC.

Three Paths

Vendors offer three approaches to true storage virtualization: host-client (via software), in-fabric (mainly through appliances but soon also through switches) and in-array (embedded functionality). Analysts agree that ultimately they all deliver. The determining factor is how well a particular approach fits into your existing storage infrastructure, says Gartner research director Stan Zaffos.

The in-fabric approach is the most common method, offered in products such as DataCore Software’s SANsymphony, EMC’s InVista, FalconStor’s IPStor, IBM’s SVC, NetApp’s V-Series, and StoreAge’s SVM. These products, which have been around for just a few years, use dedicated appliances or software running on a server to discover storage resources and then build metadata that lets IT manage them as a virtual pool. Of these, IBM and NetApp have the largest installed bases (about 1,000 each), notes Zaffos.

Coming soon are switch-based products that essentially do the same thing as a separate appliance. These will be from companies like Brocade, Cisco Systems, MaXXan Systems, McData and QLogic. By putting the virtualization functionality in the switch, the theory is that operations are more efficient because data travels through one fewer device than if it also went through an appliance, notes Brian Garrett, Lab Technical Director at Enterprise Strategy Group (ESG), a market research firm. He expects most of these ultimately to run a version of Symantec's Veritas Storage Foundation host-client software, although Symantec says it has no immediate product plans to port its software to run on switches.

Storage Foundation has been around in various versions for a decade, running on file and application servers to detect storage resources and maintain the metadata used to manage them. Until recently, Veritas (now a division of Symantec) did not release its Unix and Windows versions in sync, so it was hard to use Storage Foundation in heterogeneous environments, says Zaffos. Still, he says, the technology is easy to use for many purposes, including data migration, load balancing, and flexible provisioning.

The Three Approaches

In-Fabric (appliance): An appliance monitors the SAN for all available storage and builds metadata about stored data and its location. Pros: cross-platform; allows multiple logical pools; fairly low cost; can scale incrementally. Cons: uses additional network resources; adds additional devices; metadata can be scattered across multiple appliances.

In-Array (HDS TagmaStore): A network controller pools all SAN-attached Fibre Channel storage and presents it as a single pool. Pros: handles all processing off the network; maintains metadata in internal storage; centralized management. Cons: designed for large-scale deployments; single point of failure.

Host-Client (Symantec Veritas Storage Foundation): Software at the file and application servers monitors data traffic and storage, building metadata on the fly. Pros: can be deployed and scaled incrementally; mature technology. Cons: server software not always in sync; large operations can be difficult to manage; metadata-exchange traffic can be burdensome.

A third type of storage virtualization is exemplified by the TagmaStore network controller, from Hitachi Data Systems, which lets HDS’s management software work with multiple vendors’ storage systems as if they were one pool. Approximately 45 percent of the roughly 1,700 current TagmaStore customers implement its virtualization technology, says Claus Mikkelsen, HDS’s Chief Scientist. Its key benefit, according to Zaffos, is that “you’re not adding another element in the I/O path, so you’re not buying another asset”. Because it’s usually cheaper to replace storage arrays than to pay for their annual maintenance, Zaffos expects TagmaStores to be used mainly to ease migration from old arrays.

Pricing for all three strategies is fairly equivalent, though that’s not immediately evident when comparing, say, a TagmaStore controller with a NetApp appliance or a Symantec software license. “The pricing variables are driven more by scale,” says IDC’s Villars.

One potential gotcha is that while virtualization promotes the idea of cross-vendor storage utilization, all three strategies also enforce vendor lock-in. In-array products obviously lock you into a specific vendor's array hardware, says Mark Lewis, EMC's Chief Development Officer, but in-fabric and host-client products lock you into the virtualization software or the appliance that embeds that software.

Virtualization Works

ESG’s Garrett says his research shows that storage virtualization applied to storage environments with at least six storage fabrics reduces costs in several areas:

hardware costs drop 23.8 percent, on average; software costs drop 16.2 percent; and administration costs drop 19.3 percent.

Once an enterprise has deployed storage virtualization, the technology is “relatively easy to use”, says Garrett. So, he recommends that IT focus on a specific tactical issue, such as getting non-disruptive data migration in place. If you apply storage virtualization to that specific issue, he says, “then you can extend into the other stuff as you get more experienced”.

That’s exactly what the Baylor College of Medicine in Houston, Texas, did. Two years ago, it the college decided to integrate dozens of file servers and ERP stores attached to Unix

and Windows servers to reduce unused storage capacity and lower administration costs. Despite the initial expense, Baylor decided to replace its storage devices with a FC (Fibre Channel) storage fabric and a set of HDS arrays, recalls Mike Layton, director of IT for enterprise services and mainframe systems. Not having a heterogeneous environment to support “a luxury” made the decision to deploy virtualization fairly safe.

Today, the Baylor system manages 200TB of data, including patient records and university operations data. HDS hadn’t yet released its TagmaStore array, so Layton deployed NetApp V-Series appliances instead. Baylor’s use of storage virtualization is mainly to pool storage resources, although the college is also considering how to use the technology to implement data lifecycle management, where patient data can be highly available during treatment but later moved to lesser systems for analysis, auditing, or other needs.

Dallas-Fort Worth International Airport had a different problem. It stored flight data such as passenger lists, baggage tracking, and gate information in two SANs (Storage Area Networks) using Oracle RAC (Real Application Clusters). Oracle RAC could treat one storage target as the primary target and then replicate to secondary systems, but this process simply took too long, recalls John Parrish, associate VP (terminal technology). If one terminal's SAN goes down, the other SAN has to step in immediately so flight boarding and baggage handling isn't delayed. DataCore's SANsymphony appliance made Oracle RAC think it was working with just one SAN, and Parrish has seen no latency issues crop up in this deployment.

Replication issues were also a problem for Freeze.com. The online retailer needed to keep its 400GB Microsoft SQL Server transaction databases in sync with its reporting databases but SQL’s resource requirements prevented the reporting tools from working on the same database as the transaction management, recalls Freeze.com IT director Kyle Ohme. He mirrored the database periodically, but replication took so long that the reporting database was hours behind, preventing the kind of analysis needed to manage supplies properly. Ohme deployed tools from FalconStor to pool the storage into a virtual volume, so both sets of applications could access it in real time. That way, he could send snapshots of the transaction database to the reporting tools, rather than replicate the entire thing.
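Snapshots of this kind are typically copy-on-write: the snapshot stores only the blocks that change after it is taken, while unchanged blocks are read from the live volume. A minimal sketch of that mechanism, with block-level detail simplified for illustration:

# Minimal copy-on-write snapshot sketch: a snapshot starts as a view
# onto the live volume and only stores blocks that change afterward,
# which is why shipping snapshots is far cheaper than full mirroring.
class Volume:
    def __init__(self, nblocks):
        self.blocks = [b"\0"] * nblocks
        self.snapshots = []

    def snapshot(self):
        snap = {}  # block index -> pre-change contents (the delta only)
        self.snapshots.append(snap)
        return snap

    def write(self, index, data):
        # Preserve the old block in any snapshot that hasn't seen it yet.
        for snap in self.snapshots:
            snap.setdefault(index, self.blocks[index])
        self.blocks[index] = data

    def read_snapshot(self, snap, index):
        # Unchanged blocks are read from the live volume; no full copy exists.
        return snap.get(index, self.blocks[index])

vol = Volume(1024)
vol.write(7, b"order-1001")
snap = vol.snapshot()              # reporting tools attach here
vol.write(7, b"order-1002")        # transactions keep flowing
print(vol.read_snapshot(snap, 7))  # b'order-1001': a consistent view

Shipping only the delta blocks is what lets a reporting database stay current without hours-long full copies.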


The real effort lies in getting virtualization up and running. Apply storage virtualization to a specific issue and then extend it into other areas as you get more experienced.


A Long-Term Effort

Although the storage virtualization promises touted in 2000 turned out to be premature, today's technology does deliver at least the first step toward an automated, self-managing storage infrastructure that functions more as an IT utility, notes Gartner's Zaffos. But that version of storage virtualization is many years away.

One reason is economics. Storage vendors can't afford to suddenly lose profits from their hardware businesses if storage hardware becomes a commodity managed by software tools. As a result, they're only likely to take measured steps to support the independent standards needed by third-party management tools, Zaffos says.

Operating system vendors may help force the issue if they adopt some of the Storage Networking Industry Association standards, such as Volume Shadow Copy Service, and start moving storage virtualization from network/storage middleware to the OS, says George Teixeira, CEO of DataCore. In fact, IDC’s Villars says that in five years, Microsoft Windows could implement storage virtualization for mid-tier enterprise deployments.

Another impediment to the grander storage-virtualization vision is that the large enterprises that benefit most from storage virtualization are the most conservative in deploying new technology, since the risk of failure is greater. "We're not seeing deployments pushing to the technical extreme," says Villars.

The third reason is that the tools are still immature. Current tools focus on giving IT a common console for managing storage. Over time, vendors will begin to add automation and policy-based intelligence for provisioning storage and managing data migration and replication. But because of the complexity of the infrastructure that such tools must manage, “I don’t see it for years and years,” says Tom Clark, Director (Solutions and Technologies) at McData and author of Storage Virtualization (Addison-Wesley, 2005). In the interim, IT should expect to see point solutions such as heterogeneous replication and snapshots, he says.

But none of these factors prevent enterprises from benefiting from storage virtualization today. In the immediate term, companies should apply virtualization to solve specific problems, such as easing data migration. As time goes on, IT can incrementally broaden its use of the technology, taking advantage of the continued improvements vendors will make over the coming years. More is on the way. CIO

Reprinted with permission. Copyright 2006, Infoworld. Send feedback about this feature to editor@cio.in.

The iSCSI Issue

As more and more products enter the market, iSCSI is becoming an increasingly attractive alternative to FC (Fibre Channel) SAN (storage area network) technology. Not only is iSCSI cheaper than Fibre Channel, but the technology is less complex to implement. Because it uses the familiar IP network protocols, it simplifies the IT skill set needed to maintain the SAN. Thus, though it's not as fast and has a lower maximum capacity than FC systems, iSCSI meets the needs of many small businesses and non-mission-critical enterprise storage applications, such as departmental file sharing and near-line data storage.

However, if you’re considering iSCSI and virtualization is on your storage road map, you may want to think again — at least for the time being. the management tools that take advantage of storage virtualization to aggregate storage subsystems across a network are not yet mature enough to

handle iSCSI Sans, says brian Garrett, lab director at the Enterprise Strategy Group, a market research firm.

that’s why Corrections Corporation of america has held back deploying cheap iSCSI Sans to handle lower-priority storage, according to brad Wood, the company’s senior director of enterprise technology operations. Wood says today’s storage-management software can’t handle iSCSI Sans well. For example, iSCSI doesn’t have a single facility to provide global names, as FC does.

Kyle Ohme, IT director at online retailer Freeze.com, also struggled to deploy storage virtualization on iSCSI SANs. Even with help from vendors BlueArc and FalconStor, the difficult effort has led him to scale back his iSCSI plans.

One reason for the lack of tools is iSCSI's comparatively small market share. Garrett estimates that there are maybe 10,000 iSCSI SANs in place, compared with hundreds of thousands of FC SANs. Plus, he says, most iSCSI deployments are in smaller enterprises that don't use the tiered storage or storage lifecycle management techniques that these tools address.

"iSCSI SANs are not yet well supported," says Mark Lewis, Chief Development Officer at EMC, but he expects that to change as IP-based storage gains market share. The Storage Networking Industry Association's SMI-S standard may help bring to the iSCSI and IP world the same storage-management capabilities that Fibre Channel and SCSI enjoy — and a basis on which to translate among them, notes Roger Wofford, IBM's storage virtualization marketing manager. Until then, even in environments where iSCSI and FC SANs are interconnected, the management tools will remain separate.

— G. G.



Dialling IT Right

ICICI OneSource President & COO Raju Venkatraman is calling on technology and its best practices as he rings in efficiency and profit.

ICICI OneSource, which is among India's top-ten BPO providers, is weathering the winds of change. Managing over 8,500 employees, ICICI OneSource President & COO Raju Venkatraman has been piloting the consolidation of technological and operational excellence across the BPO's 10 services delivery centers. As the company moves into new geographies, Venkatraman explains how its best practices have placed its technology benchmarking and training a cut above the norm.

By Ravi Menon

View from the Top is a series of interviews with CEOs and other C-level executives about the role of IT in their companies and what they expect from their CIOs.

Raju Venkatraman expects IT to:
- Facilitate smoother integration of ICICI OneSource core processes with the diverse platforms and software used by clients.
- Make call agent training much more evolved through voice recognition technologies.
- Provide stronger and automated quality and productivity control feedback.

CIO: ICICI OneSource was the first in India to implement a VoIP networking platform and ‘adaptive’ intelligent call routing and switching. How has this evolved?

Raju Venkatraman: It has worked wonders. On the floor, it gives us the ability to provide total redundancy to a call. Signal drops in the middle of calls are not a problem and we are able to recover immediately without losing the customer midway.

As we add new centers, bandwidth redundancy is being built in and a significant amount of planning has been done to tackle traffic. We believe that we have the best-in-class technologies that will give us an edge over the competition. During customer audits, it is a good endorsement for me, as head of operations, to hear clients say, "Wow, we don't have these kinds of facilities in the US."

How much customer traction have you gained from this?

In addition to VoIP, we have excellent data management technologies in place. We also have strong workflow management software, reporting structures for managing individual work units, a productivity and quality tracker — our proprietary product — which sits on top of our customer platforms, and even customer middleware. These technologies allow us to perform parallel tasks in certain cases, sequentially manage workflow and move work units around whenever we engage in provisioning work for customers. Provisioning work often arises when we get orders from US clients or clients in healthcare, where it is very important not to miss even a single call order. Here, real-time inventory tracking and control for each unit of work helps us.

On the voice side, we are achieving accent neutralization for our call center trainees through voice recognition applications.

Accent neutralization is still a challenge...

It is. But at least we can now concentrate on the keywords in a conversation. Vendors will have even greater play in fine-tuning and coming up with better versions of accent neutralization software. Character recognition on the data side and voice recognition capabilities on the voice side, for example, are still inadequate.

Voice recognition is a powerful tool with strong implications in training and accent neutralization where we can differentiate ourselves further.

What are the quality monitoring systems you have in place to maximize agent performance?

The most important one is a statistical tool we have built, in which we take an agent's call — maybe two or four calls per week depending on the agent and his tenure — and rate them against different parameters as good, bad or ugly. Statistical models can be generated based on a call lasting a couple of minutes, and the need for higher training, voice modulation deficiencies, gaps in systems knowledge, etcetera, can be pointed out. Often, a banking or telecom client's CRM or transaction systems are used for voice services, and we are able to use this to analyse an agent's performance using the Pareto chart and Six Sigma.

Constant monitoring is possible through our Business Process Management System (BPMS) created last year. BPMS has a dashboard, quality and productivity metrics and other analytical tools which help us work on our weak areas. Actual acquisition of data and ensuring that reports are running and readily accessible are a CIO's responsibility.

How do you empower your internal IT staff to keep service levels high?

We are continuously empowering internal IT staff to help our customers. Network uptime is not an issue with us, but, at an operational level, we need to have high uptime for applications on the client’s side. We now have reverse SLAs placed on our customers in which we tell them to guarantee 99.95 percent uptime on their applications, while we guarantee network uptime.

Other challenges are when upgrades happen or new versions are introduced. Since clients don't do maintenance over the weekend, we have to get our testing right and control how their different software versions and security layers are integrated into our infrastructure. The IT focus is on predicting catastrophes, not reacting to them.

Apart from VoIP, which platforms do you depend on the most? CTI (Computer Telephony Integration) or IVR and Speech Recognition Systems?

We have a long way to go on speech recognition. IVR depends on the kind of work we are asked to do. Our IVR is often managed by our customers in the US or UK. But as we move towards managing particular clients 100 percent, we will have to come up with ideas on increasing the number of IVR calls, and even manage the client’s IVR systems. Our Six Sigma staff’s involvement will be crucial as IVR usage increases, especially when many customers want problems resolved on the first call.

We are seeing a first-call resolution rate of 62 percent, which is much higher than the industry average. We use a lot of tools to understand what our agents do while fielding 7 lakh calls a month.

CIO challenges are growing. SOX, for example, specifies that CIOs should track every call to comply with corporate governance ethics.

Business has always driven technology by pushing the envelope with new challenges. I feel technology will always find a way to solve a business problem, no matter how long it takes. I have been in this industry for 22 years. And, when we started using many imaging technologies, for example, we did not even know how to send a 500KB or a 2MB image, because compression or TCP/IP did not exist. We often came up with proprietary ideas and such problems were solved with time.


With SOX, the pressure on CIOs will be very high; but they will no doubt deliver on these demands.

How has your IT infrastructure evolved to match your customer base? What causes spurts in IT spend?

Fortunately, we are already five years old, so our core technology spends aren't huge. But, in the kind of high-capacity expansion mode we are in, our IT spend is higher than most others. Looking ahead, we will be seeing more and more investments in voice recognition technologies on a per server/seat level.

On the technology side alone, we should see IT spends increasing by 10 percent to 12 percent on a per seat basis. An increase of 25 percent wouldn’t be viable.

In the dotcom days, we saw over-procurement. What procurement predicaments could BPOs face?

The real challenges will lie more in the area of data processing for voice and non-voice, where our capabilities will increase with demand. For example, we will need more specialized voice processing applications to help us measure agent turnover against the onshore-offshore blend of services we provide.

As for non-voice revenues, over 50 percent of our business is now non-voice based. New challenges on this front relate to image processing, forms recognition, symbol recognition and pattern technology among others, which are all very CPU-intensive. Right now, we are carrying out these functions higher up in our architecture, but will do it at the client level soon. I see processing capabilities going up and the prices of core CPUs falling. Yet, there is room for us to justify a 10 percent to 12 percent increase in per seat IT spend if it will achieve better quality and productivity.

What explains the increase in non-voice revenues in the last two years?

The board and CEO realized that we couldn't adopt a fishing net strategy by just building capacity. The team was good and the market was right, as was our timing in taking on that market with key differentiators.

The acquisition of RevIT, a Chennai-based transaction processing company, brought in new domain capabilities like workflow imaging and processing. We tried to increase the transaction-based pricing component of our voice work over the last one-and-a-half years or so. We acquired a collections company which gave us domain capabilities in areas across the BFSI vertical. This kind of work, though voice-based, is not billed on a per-hour-per-seat basis, but on transaction quality. The idea was to be able to guarantee the output and charge the customer on quality of output, not on effort involved.

Our recent partnership with Metavante for payments and mortgage processing represents a strong technology play for us. While the BFSI segment continues to see client-specific growth — with each client bringing in his own platform — Metavante will provide us intermediary platforms to bring in our process excellence differentiators. The client should see a net reduction in their overall costs; it’s not just pure labor arbitrage.

How have you empowered your CIO to deal with scaling issues? Inorganically, you’ve tied up with Metavante Corp while, at the organic level, new training needs will have to be met...

When RevIT was acquired, it was a question of getting both systems to run the same way. With Metavante, the challenge lies in integrating the core components of our IT infrastructure with their platforms, and our IT staff has been doing a great job here.

Questions abound, like how do we provide content-centric security for one class of users as opposed to another? How do we align data compression rates and tools used in both companies? Or, how do we get platforms and standards to integrate and process data under a similar framework? There is a difference in the level of integration of data and voice security, which has been addressed by a mature IT team. They have also dealt with platform, tools, data, security, voice layers and switch/router integration challenges.

We have a technology workshop created whenever an acquisition is done or a new client is signed up where we sort out various integration challenges and design tailormade strategies. Apps migration is a very tough handshake. Once the integration plan is laid out, both sides can create a win-win situation while consolidating our IT resources.

On the call center side, the idea is to make agents listen to their own calls from Day One. We actually play back calls from our extensive call libraries, which categorize and differentiate between good and bad calls. Besides accent neutralization training, there is also the question of how an agent can comprehend a query asked in 20 different ways and zero in on what the customer really wants. In this way, the systems help agents develop rigor and discipline. CIO

Assistant Editor Ravi Menon can be reached at ravi_menon@cio.in

SNAPSHOT: ICICI OneSource
Primary business: Business process outsourcing in banking, financial services, healthcare and telecom sectors
Revenues: Rs 558 crore (as of March 2006)
IT budget: Rs 54 crore
Total employees: 8,500
IT staff: 200+
Customers: 55
Branch offices: 3
Delivery centers: 10
CIO: Yogi Parikh


The Case of Unstructured Data

The CIO needs to provide the infrastructure and tools to manage unstructured data. But the larger challenge posed by such information is not one just for the CIO — it is a challenge for the enterprise itself.

Amoebic in its form and gargantuan in size, 'unstructured data' is a mine of information for enterprises just waiting to be tapped — but yet to be formatted or indexed.

Merrill Lynch estimates that more than 85 percent of all business information exists as unstructured data. Information in this 'form' typically consists of thousands of files created by users in the course of business: text documents, e-mails, spreadsheets, presentation slides and even data such as images, CAD drawings and multimedia files. Gartner's research on the subject has shown that white-collar workers spend 30 percent to 40 percent of their time simply managing unstructured data, i.e., searching for and sorting through e-mail and similar content.

Vijay Pradhan, Country Manager-DMG, Sun Microsystems; Praveen Sahai, Product Marketing Manager-South Asia, Sun Microsystems; Anand Naik, Director-System Engineering, Symantec

Are Indian organizations and their managements aware of the challenges posed by this data, apart from the value it represents?

To get a sense of Indian enterprises’ understanding of unstructured data and the challenges posed by it, CIO recently organized panel discussions, as part of the CIO Focus-Storage series of events, across Bangalore, Delhi and Mumbai.

Vinod Sadavarte, CIO of Patni Computer Systems, highlighted the need for better management of unstructured data to improve the efficiency of IT investment: "A look at fundamentals shows that 50 percent of storage is consumed by unstructured data, 20 percent by semi-structured data and the rest by structured data. Is management aware of the business impact of this? Building awareness at the business layer is vital [and] the task primarily lies with the CIO."

While the CIO would need to argue the case for unstructured data to management, at the ground level, the onus of managing unstructured data lies with the user, emphasized Avinash Arora, Director-IS, New Holland Tractors, during the panel discussion in Delhi. "A company can begin by identifying the key users in the business. What information is generated by them, and how much storage would they require? I think this is the approach an organization must take today, though an understanding of unstructured data is still at a nascent stage in the country," he said.

For business, the pertinent question would be: how much value surrounds unstructured information? Tangibly put, according to market research firm Outsell Inc., office workers spend, on an average, 9.5 hours each week in searching, gathering and analyzing information — an effort which keeps them from their normal tasks. The manhour loss should, on the face of it, be enough to make any managing council stand up.

Still, most Indian companies view the subject differently. "In India, we haven't reached that stage yet where you can put a value to the utilization of unstructured data and business return [thereof]," said N. Kailasnathan, CIO and Vice President (IT), Titan Industries. "To directly put a value to the data is a difficult task," echoed Manish Choksi, VP-Strategic Planning & IT, Asian Paints.

With the people generating the data best placed to figure out how to classify it, the role of the CIO becomes more that of a facilitator, putting systems and processes in place to give the data structure.

For an innately IT-driven company like Ericsson, a business process to serve unstructured data has been useful, noted Tamal Chakraborty, its CIO. "A control mechanism has to be put into place," he asserted, citing the process in his organization. "Every document [to be stored] has a document number, which a user will tag in a given form, and store in a document register. This data is audited every year. It has been made into a business process… So, if a user doesn't go through that process, the documents generated will not be part of the structured data, and won't be retrievable in the future." It is what Rajesh Uppal, GM-IT of Maruti Udyog, called "a structured process around unstructured data".
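A document register of the sort Chakraborty describes can be sketched very simply; the field names, numbering scheme and CSV backing store below are assumptions for illustration, not Ericsson's actual system.

# Minimal sketch of a document register that makes unstructured files
# retrievable: nothing enters storage without a document number and tags.
import csv
import itertools

_counter = itertools.count(1)
FIELDS = ["doc_number", "path", "owner", "department", "doc_type"]

def register_document(registry_path, path, owner, department, doc_type):
    """Assign a document number and record the tags in the register."""
    doc_number = f"DOC-2006-{next(_counter):05d}"
    with open(registry_path, "a", newline="") as f:
        csv.writer(f).writerow([doc_number, path, owner, department, doc_type])
    return doc_number

def find(registry_path, **criteria):
    """Retrieve register entries matching all given metadata fields."""
    with open(registry_path, newline="") as f:
        for row in csv.reader(f):
            entry = dict(zip(FIELDS, row))
            if all(entry.get(k) == v for k, v in criteria.items()):
                yield entry

n = register_document("register.csv", "/docs/q2_forecast.xls",
                      owner="rsharma", department="finance",
                      doc_type="forecast")
print(n, list(find("register.csv", department="finance")))

The point of the exercise is the constraint, not the code: if registration is mandatory, retrieval becomes a query instead of a search.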

Choksi agreed. “First, we need to get the people interested in contributing to classify the data and in identifying how it should be interpreted and searched accordingly. People who use this data are second in line, and they can potentially also indicate as to why they are looking at the data.”

However, unstructured data acquires a more complex shape and form across verticals, such as manufacturing and telecom, because of inputs through the customer interface. This forms the basis of more and vaster forms of unstructured data. S. Sridhar, Head-IT of Hutch, spelt out the issue: "It is much like the six blind men examining the same elephant. Inferences can't be structured – if I bring in a structure to capture such unstructured data, will it have value when it is retrieved in future? This might actually hamper innovation."

As such, unstructured data can have many faces, based on the parameters under which they need to be classified. In fact, Unni Krishnan T.K., CTO (solutions & technology team), Shopper's Stop, citing the auto industry, noted how the meaning of unstructured information would depend on its user. "How a CAD designer uses his own crafted CAD designs to make business decisions — which vendor to choose, which part has changed, what's the version number of the part, what volume should be ordered — is different from how a showroom salesperson uses the same CAD designs to make a customer understand the features and functionalities of the automobile and make sales… Interpretation of the data varies within the departments across situations and geographies of the same organization."

Office workers spend 9.5 hours each week in searching, gathering and analyzing information. Nearly 60 percent of that time is spent on the Net.
— Outsell Inc., market research firm

Across the three discussions, there was consensus among CIOs to tag or label the information — better known as metadata. This would form the beginning of the process of bringing unstructured data into the structured world. "If you want structured output, you need to have structured inputs as well," said Atul Kumar, Chief Manager-IT, Syndicate Bank. Interestingly, he stressed the fact that by limiting storage capacity to users in an organization, they would naturally begin to manage unstructured data in order to use individual storage more efficiently. C.R. Narayanan, CIO of Alstom, concurred: "You need to have metadata — at least in a database, so that you are able to trace it and store it in a form that is retrievable over a period of time."

But even metadata throws up challenges, as Kailasnathan pointed out in Bangalore. "The meaning attached to a tag or metadata can change over time. It would depend on the meaning associated by the user who labelled it." Even Arun Gupta, Director-IT, Philips Electronics, pointed out: "The real challenge is tagging data to say this is confidential and this is not. Here, tagging is perspective, based on the person who owns the data."

Even as the end-user’s role in the process became apparent, the case for unstructured data in all three cities came back to the nature of technology at the disposal of the CIO. “There are hardly any tools today that I can apply to unstructured data, unlike those that are available for business intelligence … There is nothing that is generic to unstructured data, as it is in the case of structured information,” said narayanan.

Gupta mooted the same point at the discussion in Mumbai: "With key-word searches or heuristics tools, we can analyze trends of the unstructured data. Standard BI tools have not been very useful there, and we are hoping that the software industry will come up with something useful for it."

The Mumbai panel, which stressed on tagging data, included (from left) Arun Gupta, Director of Philips Electronics India; Manish Choksi, VP (strategic planning & IT) of Asian Paints; Unni Krishnan T.K., CTO of Shopper's Stop; and Vinod Sadavarte, CIO of Patni Computer Systems.

"Knowledge management is needed for information to be retained in an organization even when people leave. You need a mechanism to capture that, and you need capable people to classify that information."
— Avinash Arora, Director-IS, New Holland Tractors

THE WAY FORWARD

As of now, for enterprises in India dealing with foreign corporations, managing unstructured data is a matter of compliance — or working in accordance with legal and regulatory guidelines. There is no way around the task of transforming unstructured data into structured information. In fact, as Sridhar of Hutch said, "Compliance can be the CIO's passport to walk up to the CEO with the issue of unstructured data, and also sensitize his team about it." For other companies in India, it is a matter of understanding unstructured data at two levels:

- How to leverage the enormous value posed by unstructured data?

- How to conserve time that would otherwise be spent in searching through unstructured data?

Alstom’s narayanan reiterated the latter point during the panel discussion in Delhi, from the standpoint of a multi-location business. “you need to service the learnings and information for future projects — they need to be documented and kept. you don’t want to reinvent the wheel, especially if you have projects that are similar to ones you have done in the recent past. With more competition, storage and retrieval of information is crucial because cycle time is coming down. And that is a business case for unstructured data.”

He also pretty much summed up the need of the hour, vis-à-vis unstructured information. "Data redundancy is multiplying manifold. How do you take care of this? If it is a multi-locational enterprise [like ours], we need a system with the bandwidth and a data centre able to cope with the data requirement of a single place on such a scale. We need to move towards technology that has very good compression or a good cache."

Beyond technology, Choksi revealed one approach to secure business buy-in to tackle unstructured data. "CIOs should go to one or two business stakeholders who believe that there is a significant amount of unstructured data in their business transactions, and try to run pilot projects and build a content management or a knowledge management system. This will help them demonstrate to their managements that they have managed to improve productivity and thus demonstrate the value of the data to business."

The onerous task of organizing unstructured data might be likened to an artist obsessing about bringing form to content. Seen another way, the case of such data is simply a testament to how vast and accessible communication and technology have become in the modern enterprise. CIO


Storage and retrieval is absolutely crucial, agreed the Delhi panel that featured (from left) Avinash Arora, Director-IS of New Holland Tractors; C.R. Narayanan, CIO of Alstom; Rajesh Uppal, GM-IT of Maruti Udyog; and Tamal Chakraborty, CIO of Ericsson India.

"In the wake of regulatory compliance, how do you ensure that unstructured data is both accessible and easily retrievable?”

— Vijay Ramachandran, Editor, CIO

The panel in Bangalore noted that the CIO must sensitize management about the need to process unstructured data. It comprised (from left) N. Kailasnathan, CIO & Vice President (IT), Titan Industries; Atul Kumar, Chief Manager (IT), Syndicate Bank; and S. Sridhar, Head-IT, Hutch.


J. Satyanarayana, CEO, National Institute for Smart Government (NISG), is training the country's future e-governance leaders.


Creating Champions for Tomorrow

Like most projects, e-governance ventures are vulnerable to failure because nobody champions them. To work around a lack of leadership, J. Satyanarayana, CEO, National Institute for Smart Government (NISG), has embarked on several workshops to train key people who will take charge and drive e-governance projects.

By Balaji Narasimhan

CIO: Why do only 15 percent of e-governance projects succeed?

Satyanarayana: The figure of 60 percent to 85 percent of partial-to-total failure is a global estimate, and India is no exception. The low 15 percent success rate is due to various reasons. These include poor project conceptualization, a lack of process reengineering and hardware-driven efforts, a shortfall in professional project management, a lack of institutional and individual capacities, and a lack of change management, among other reasons.

The National eGovernance Plan (NeGP) launched by the Central Government seeks to address these gaps and enhance the success rate substantially. NISG has also been laying emphasis on these aspects.

How can government CIOs ensure greater success of e-governance projects?

One of the ways CIOs can ensure more success is by studying and replicating successful projects, with modifications to suit local requirements. They also need to plan and conceptualize their projects properly in association with a team of professionals and department employees. Process reengineering, legal reforms, user training, capacity building and professional project management are other areas that need to be tackled to ensure successful e-governance projects.

Project development and management call for diversified skill sets. How does NISG help identify these and then train bureaucrats accordingly?

Project development and project management demand two different skill sets. Project development calls for individuals who can think out of the box and suggest innovative solutions. Project management requires the ability to repeat a set of activities with the expertise that manages development with time and experience.

In short, project development requires people with innovation, and project management requires people with experience. Within NISG, both sets of individuals are given opportunities in their areas of specialization. We have run a few project management programs in project-specific settings.

We also train people in project development skills with the e-Champions program. The program entails training 100 senior administrators in e-governance skills for 14 weeks. Currently, 13 administrators from different states are undergoing training with us in the first batch of the program. It is too early to comment on its outcome. However, a mid-term assessment of the program has been highly encouraging.

Do participants in the NeGP need specialized management training?

As I mentioned earlier, the participants in the NeGP need not have expertise in the above areas — but they need to understand all the areas of management, governance, and technology and information management. The e-champions program is designed to cater to such requirements. The Department of IT is piloting a major initiative on capacity building across all states, with assistance from the Planning Commission.

In order to ensure successful e-governance projects, the bureaucracy needs to overcome internal resistance and learn to take risks. Have practitioners imbibed these critical capabilities?

Given the high degree of failure in e-governance projects, taking a decision to start an e-governance project is itself tantamount to taking a risk. However, taking a risk and taking a calculated risk are two different things. Recognizing the risks inherent in a project is the first big step to risk mitigation. It comes with the experience of working in an environment where small failures are seen as stepping stones to larger successes.

Internal resistance can be overcome only by dialogue, by explaining to employees the benefits of an initiative and its rewards, and by training. Change management is an important aspect in e-governance and must not be overlooked. Employees in a department must feel ownership of a project; it is only then that they will participate.

Is there a case for creating a distinct cadre of government CIOs? Or, for increasing the tenure of IT leaders in the government?

The need of the hour today is individuals who are equipped with skills in management, information technology and governance. It is very difficult to create such a combination through a separate cadre. Insofar as e-governance is as ubiquitous and pervasive as governance itself, the right answer is to sensitize all civil servants to the benefits of e-governance and to promote a few among them as e-Champions.

The need for increasing the tenure is felt not only with respect to IT leaders, but for all civil servants working in other important sectors. Civil service reform is the long-term solution.

How should a government CIO look at ROI?

The return on investment in an e-governance project is difficult to measure in the short-term. It is only in the long run that true return on investment can be measured. In projects where a PPP (public-private partnership) model is used, the financial return is one parameter to measure the ROI of a project. But, in order to measure real ROI, a CIO has to return to the original vision of his department and then measure how e-governance has facilitated that vision. Various measures of ROI can include customer satisfaction, cost and time savings, service improvement, political returns, employee satisfaction, and more.

I consider the reservation system of the Indian Railways one of the most successful e-governance projects across the world.


What do you think of public-private partnership in e-governance projects?

The adoption of a PPP model in e-governance projects combines the accountability of the public sector with the efficiency, cost-effectiveness and customer-centricity of the private sector.

NISG has developed specific service level agreements for use in e-government projects. How well have they stood up to the test of time?

The concept of service level agreements is recent in the e-governance sector. Bangalore One and the MCA21 projects are two cases in point. I believe that this is the right direction to take. More time will answer this question better.

The institute has prepared roadmaps for 12 states and designed e-procurement for three states. What are the learnings from this?

The biggest learning for NISG in working on e-governance roadmaps is that adopting a consultative approach while developing vision, strategy and a blueprint serves the powerful purpose of change management. It creates the right environment for an implementation that is to follow.

With regards to e-procurement, there is definitely a business case for public-private-partnership, and NISG’s engagements with various states have proved that it has a sustainable business model that can be easily replicated.

What has the CADS methodology for e-governance achieved?

The adoption of CADS (conceptualization, architecture, design and support) has enabled NISG to streamline its engagements with client-departments. It has also helped clients to see the different activities and deliverables, which are needed in various development stages, in the right perspective. For instance, the need for process reengineering, change management and best-practice surveys has gained the right importance with policy-makers now, something that was needed long ago.

Can you shed light on other initiatives like the National e-Government Gateway and National Mission Mode project for Municipalities?

The Gateway is designed to act as a standards-based routing and messaging switch. It seeks to provide five distinct benefits. These are: enabling the establishment of joined-up government (various government departments working together to tackle issues that no single department can); promoting inter-operability between disparate e-governance applications; facilitating the delivery of same or similar e-services through multiple service providers and providing choice to departments and consumers; promoting standards-based implementation of e-governance projects; and, finally, working to integrate services across centre, state and local governments.

The mission mode on municipalities has brought to the fore different PPP models applicable to e-governance in urban local bodies (ULBs) and has provided a comprehensive plan for capacity building. The Ministry of Urban Development is taking up a comprehensive scheme to transform over 400 ULBs in the next five years.

Which is the best e-governance project you have seen in India and elsewhere?

There are several examples of successful e-governance projects globally. The e-Citizen and Tradenet projects of Singapore are among the best I have seen. I consider the Passenger Reservation System of the Indian Railways one of the most successful e-governance projects, not only in India but across the world because of the number of citizens it has impacted.

Bhoomi and e-Seva are also projects that I should mention. Their success is in terms of the convenience they provide to large sections of the population.

NISG was created in 2002 to develop architectures and standards, provide high-level consultancy services and capacity building at the national level. Looking back, how many of your objectives have been fulfilled?

NISG has gone several steps in the direction of realizing its vision. As far as providing high-end consultancy is concerned, it is assisting many government projects like MCA21, Envision, ILIS, Bangalore One, eBiz, e-Procurement and the ICTD project. With regard to capacity building at the national level, NISG has already trained over 200 policy-makers. We have conducted training programs for political leaders in Manipur, Chhattisgarh and Madhya Pradesh. We have recently established a program management unit for the department of IT as part of our initiative towards institutional capacity building.

To drive architecture and standards, we are working on projects like the National Service Delivery Gateway with the department of IT. NISG is contributing through its representative on four committees on standards recently formed by the Government of India. CIO

Special Correspondent Balaji Narasimhan can be reached at [email protected]

SNAPSHOT: NISG

Stakeholders: NASSCOM: 51%; Govts. of India and Andhra Pradesh: rest

Key objectives: e-governance architecture, consultancy, capacity building

Major projects: Envision, ICTD, eBiz, NSDG, NeGP, Bangalore One, eGovWorld, MCA21, ILIS

People trained: 2004-05: 140+; 2005-06: 300+


Safe and Sound
By Stacy Collett

Data Encryption | Vincent Fusca trusts his staff. But he can't take any chances. It's all about the money.

As operations director at Dartmouth Medical School’s Center for Evaluative Clinical Studies in Hanover, New Hampshire, Fusca oversees the handling of nearly 7TB of raw medical data from the Centers for Medicare and Medicaid Services. Programmers aggregate and refine the data into data-analysis sets that researchers use to publish some of the most comprehensive comparative medical research in the US.

Fusca isn’t aware of any attempted or successful security breach involving personal medical information during his tenure at the center. But the Health Insurance Portability and Accountability Act (HIPAA) requires the center to safeguard patients’ personal data, and ignoring the regulation could mean losing millions of dollars in research grants.

So two years ago, the center purchased two encryption appliances that keep data encrypted until researchers request the information on their secure desktops. The data then goes to backup tapes still in encrypted form.
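In outline, what such an appliance does is simple to picture. Here is a minimal sketch in Python (using the third-party cryptography package) of the write-encrypt, read-decrypt flow; the key handling and record data are illustrative assumptions, not Decru's actual interface.

from cryptography.fernet import Fernet

# Hypothetical sketch of an inline encryption appliance's data path.
key = Fernet.generate_key()      # in a real deployment, held by the appliance or key server
box = Fernet(key)

stored = box.encrypt(b"patient-record-4711")   # write path: only ciphertext reaches disk
# ... backup software copies `stored` to tape still encrypted ...
plaintext = box.decrypt(stored)                # read path: decrypted for an authorized client
assert plaintext == b"patient-record-4711"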

“We want to ensure that we exceed the levels of security required by HIPAA, so we never place our funding sources in jeopardy,” Fusca explains.

Data encryption is gathering a following as even mid-size companies see the sense in it. And the reason is simple: few of the old rules apply.



On The Radar
Like it or not, encryption will become part of most data at rest.

Companies of all sizes are exploring encryption because of a real threat of losing data or having it stolen, and because of government regulations such as the Sarbanes-Oxley Act, the Gramm-Leach-Bliley Act and HIPAA, which require protection of Social Security numbers, credit card data and other sensitive information. While encryption isn’t required, it can provide an easy, blanket solution.

“First, we had the market leaders. Now, we’re getting the mid-size companies realizing that personal confidential information regulation is there to stay,” says Eric Ouellet, a privacy and security analyst at Gartner. Ouellet says he saw a 10-fold increase in customer calls about encryption technology beginning in January 2005.

Security threats aren’t confined to the backup tapes stored at off-site facilities anymore, though last year’s highly publicized losses of tapes belonging to Bank of America, Time Warner and Citigroup put a spotlight on the need for encryption. Laptops and databases need encryption too.

Still, organizations are reluctant to use encryption. In the Ponemon Institute’s 2005 National Encryption Survey, only 4.2 percent of the nearly 800 companies polled said they have enterprisewide encryption plans. The primary reasons were concerns about system performance (69 percent), complexity (44 percent) and cost (25 percent).

It’s true that encrypting tapes using some types of backup software increases backup times, consumes more storage space and costs more money. But those arguments may be losing steam. A dizzying assortment of products was introduced last year, promising to make encryption better, smarter and faster. The bad news: in most cases, a single encryption method can’t be used to move data all the way from a laptop to off-site storage. The good news: decryption has become simpler, and backup times have improved significantly, especially when using encryption appliances.

A successful encryption plan involves identifying the right data to encrypt, choosing only the encryption technologies that you need, and managing encryption keys effectively.

“There is still no right way to apply encryption,” says Jon Oltsik, an information security analyst at Enterprise Strategy Group in Milford, Massachusetts. “It depends on what you perceive the risks to be and where the money is to solve the problem. Focus on figuring out one or two technologies that will take care of the biggest chunk of issues.”

Here’s a look at some of the newest encryption technologies.

Encryption Decrypted
A glossary of common storage-encryption terms:

Sensitive data. Depending on the type of business, sensitive data can include Social Security numbers, credit card information, financial records, health data, intellectual property documents or information about sexual orientation. Most companies will find an average of 8 to 12 pieces of data per record that need encryption. The difficulty is locating every place where that information is stored.

Encryption appliance. This hardware sits between servers and storage systems and encrypts data as it moves back and forth. Many of these appliances can run in SAN, NAS, iSCSI and tape infrastructures. They encrypt data at close to wire speed with very little latency. In comparison, encryption software on servers and in storage systems slows backups.

Library-based tape encryption. Security features embedded in tape drive and tape library hardware are often used when data is stored at an off-site facility. Encryption co-processors process the data stream at wire speed as it enters the library. Security functions are completely transparent to the software, and no external software or operating system support is needed. But it also means that the tape vendor is entirely responsible for managing security.

Edge encryption. This includes encrypting data at the point of entry on laptops, handhelds and desktop PCs. Basic encryption that requires a username and password offers little protection, but it's better than nothing, say industry watchers. A global key-management system for Windows offers better protection. Some laptop manufacturers are incorporating encryption capabilities in new models.

Enterprise digital rights management. This is the next big thing in key-management technology. Still in its early stages, DRM offers the potential for persistent encryption and security as data travels from laptop to e-mail, database and storage tape by assigning access rights to the file. DRM becomes more important as companies distribute protected documents beyond the enterprise to partners and vendors.

Quorum-based recovery. This is one of three key-management approaches that companies should consider. Quorum-based recovery requires a group of three to five administrators to grant permission before encryption keys can be recovered. Encryption specialists also advise that tape libraries shouldn't have to maintain the mapping of keys to tape volumes; that method adds another point of management and complicates long-term key escrow. It's also important to automatically replicate keys to an escrow service or tape library at a disaster recovery site for fast data recovery in case the originals are lost.

Data compression. Appliances trump software-based encryption at the database level when it comes to compression. Software-encrypted data can't be compressed. Encryption hardware devices have a compression chip in them, so they compress before they encrypt, which yields tape-drive space savings of about 1.5 to 1.

— S.C.
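The quorum idea above is usually implemented with secret sharing, where a master key is split so that any k of n shares reconstruct it and fewer reveal nothing. A minimal Shamir-style sketch in Python follows; the prime field, share counts and key size are illustrative assumptions, not any vendor's implementation.

import secrets

PRIME = 2**127 - 1  # a Mersenne prime large enough for this demo's key values

def split_secret(secret: int, n: int, k: int):
    # Build a random degree-(k-1) polynomial whose constant term is the secret,
    # then hand out n points on it; any k points determine the polynomial.
    coeffs = [secret] + [secrets.randbelow(PRIME) for _ in range(k - 1)]
    def f(x):
        acc = 0
        for c in reversed(coeffs):        # Horner evaluation mod PRIME
            acc = (acc * x + c) % PRIME
        return acc
    return [(x, f(x)) for x in range(1, n + 1)]

def recover_secret(shares):
    # Lagrange interpolation at x = 0 over GF(PRIME) recovers the constant term.
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

master_key = secrets.randbelow(PRIME)      # stand-in for a master encryption key
shares = split_secret(master_key, n=5, k=3)  # five admins; any three can recover
assert recover_secret(shares[:3]) == master_key
assert recover_secret(shares[1:4]) == master_key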


Back-end Appliances
Companies that want blanket encryption coverage on the back end, before data goes to backup, should consider appliances that sit between servers and storage systems and encrypt the data as it moves back and forth, says W. Curtis Preston, VP of data protection at GlassHouse Technologies, a storage services company.

Specialized encryption appliances like Decru’s DataFort and NeoScale Systems’ CryptoStor can run in storage-area network (SAN), network-attached storage (NAS), iSCSI and tape infrastructures. They encrypt data at close to wire speed, with little latency. Both vendors have also developed versions of their products that will encrypt backup tapes. Decru’s offering encrypts NetApp storage, as well as EMC, Hewlett-Packard, Sun Microsystems and IBM storage.

Fusca says encrypting and decrypting data goes unnoticed by users at Dartmouth. “When they get up on the analytical servers and start drawing data through either the tape library or the electronic storage through the DataForts, it is relatively transparent, and there are no discernable delays in accessing the data,” he says.

Key management has been simplified. “Once we identify the appropriate client stations that are on the virtual private network that can draw requested encrypted data into their ‘cryptainer’ [a device that stores decrypted data on the desktop], it’s relatively fast and painless for them,” Fusca adds.

Appliances also trump software-based encryption at the database level when it comes to compression, because software-encrypted data can’t be compressed. “Hardware devices have a compression chip, so they compress before they encrypt,” Preston says, which yields tape-drive space savings of about 1.5 to 1.
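The order of operations is easy to demonstrate. The sketch below (Python, with zlib and the third-party cryptography package standing in for a backup pipeline) compares encrypt-then-compress against compress-then-encrypt on a deliberately repetitive payload; the payload and exact sizes are illustrative assumptions.

import zlib
from cryptography.fernet import Fernet

box = Fernet(Fernet.generate_key())
data = b"claim-code=A7 " * 4096          # repetitive, highly compressible payload

enc_then_comp = zlib.compress(box.encrypt(data))
comp_then_enc = box.encrypt(zlib.compress(data))

print(len(data), len(enc_then_comp), len(comp_then_enc))
# enc_then_comp stays about as large as the original payload: encryption has
# already destroyed the redundancy the compressor needs.
# comp_then_enc shrinks to a few hundred bytes: compressing first preserves
# the savings, which is exactly what the appliance's compression chip does.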

Library-based Tape Encryption
In the highly competitive microprocessor market, protecting intellectual property is a serious concern, especially when sensitive data goes to an off-site storage facility.

At Advanced Micro Devices’ Longmont Design Center, information systems manager Tom Dixon has been evaluating the beta version of Spectra Logic’s BlueScale environment for three months. Spectra Logic is one of two tape library vendors that have recently incorporated security into tape drive and tape library hardware. Quantum’s proprietary DLTsage architecture also offers a tape security feature at the drive level.

“Library-based encryption is a good idea for firms that need to lower the risk associated with sending tapes off-site,” wrote analyst Galen Schreck in a January report for Forrester Research.

The Spectra Logic product performs data encryption within the library using an enhanced version of its Quad Interface Processor board. Three months into his evaluation, Dixon says the hardware was “fairly easy” to set up. “You don’t have to do anything on the host,” he says. “They set up the library, and you set up your keys. That’s the biggest headache. We haven’t even talked about that yet.”

The hardware’s encryption keys are managed within the library and can be exported to a USB flash drive or via encrypted e-mail. The keys can then be imported into another Spectra library or used with a software decryption utility if no library hardware is available.

Library-based security has two big benefits over software-based alternatives, according to Schreck.

Have a Key-recovery Plan

While encryption products can improve security, they also introduce additional management tasks, especially for companies using multiple encryption products. Always include a strong key-management approach, including quorum-based recovery.

“Encryption products that don’t provide a means of recovering keys are asking for trouble, particularly in a disaster recovery scenario where files may be lost or disorganized,” Forrester analyst Galen Schreck wrote in a January report. “Quorum-based recovery allows a certain number of parties ... to present their credentials and recover encryption keys.” Also, tape libraries shouldn’t have to maintain the mapping of encryption keys to tape volumes; it adds another point of management and complicates long-term key escrow.

It’s also important to automatically replicate keys to an escrow service or tape library at a disaster recovery site for fast data recovery in the event that the originals are lost, Schreck says.

And don’t forget the human aspects of key management, says Eric Ouellet, an analyst at Gartner. “You may actually have controls that already exist that you can leverage, like better authentication or better separation of duties, or better access control” with databases or applications, he adds. “If you focus on those areas, then you don’t necessarily need to deploy encryption everywhere.”

Employee access and separation of duties should be a top priority. “Maybe the encryption technologies work fine, but does someone have access to a file that they shouldn’t have access to? Or do they have a key to get access to that data? If so, you’ve just compromised your system,” says Ouellet.

What’s more, systems administrators should not be system users, and auditors should not be able to grant themselves access or privileges. “Anything that would cause a conflict of interest would not be allowed,” he says.

— S.C.


First, there are no performance penalties: by embedding encryption in the tape subsystem, vendors can use encryption co-processors to process the data stream at wire speed. Second, security functions are completely transparent to the software. To outside applications and servers, the device behaves just like a regular tape library, and no external software or operating system support is necessary.

But it also means that the tape vendor is completely responsible for managing security. So customers should look for products with strong key-management features, like quorum-based recovery, integration with backup and recovery tools, and automated replication of keys to an escrow service or tape library at a disaster recovery site.

Laptop and ‘Edge’ Encryption
While encryption efforts focus on back-end and off-site storage tapes, Preston says fewer companies are implementing edge-level encryption methods, such as encrypting data on laptops. What’s more, basic laptop encryption offers little protection.

“Most people use a Windows name and password. That becomes the key to encrypt the data. If someone actually stole your laptop to steal your data, that key would not stop them for very long,” Preston says. A harder-to-crack, global key-management system for Windows exists as part of Microsoft’s Active Directory infrastructure, “but not everyone uses it,” he adds.
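The weakness is not the cipher but the key's origin. Below is a minimal sketch, assuming a disk key derived from a login password via PBKDF2 (Python's hashlib; the salt and iteration count are made-up parameters). However strong the derived 256-bit key looks, an attacker only has to search the much smaller password space.

import hashlib

salt = b"example-salt"   # hypothetical; real systems use a random per-user salt

def derive_key(password: str) -> bytes:
    # Stretch the password into a 256-bit key (100,000 PBKDF2-SHA256 rounds)
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000, dklen=32)

disk_key = derive_key("Winter06!")

# A laptop thief doesn't brute-force the 256-bit key space; guessing likely
# passwords and re-running the derivation recovers the key directly:
assert derive_key("Winter06!") == disk_key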

Laptop manufacturers like Lenovo are incorporating encryption capabilities into their systems, and Microsoft will add encryption to Windows Vista, the upcoming version of its operating system.

Don’t Encrypt Everything
When it comes to assessing what constitutes ‘sensitive’ data, most companies will find that there are only 8 to 12 pieces of information per record, on average, that need encryption, says Gartner’s Ouellet. Depending on the type of business, these can include Social Security numbers, credit card information, financial records, health information, intellectual property documents or information about sexual orientation.

“Once you’ve identified what those bits are, you can choose what solution gives you the biggest carpet covering over the area,” says Ouellet. He offers the example of a large retailer that performs online and telephone transactions and holds a lot of credit card information. Within the database, the most sensitive data should be protected.

“Pick the most sensitive fields and encrypt those. Don’t encrypt everything, because you’re going to kill the performance on the database or have other issues with searching and access,” Ouellet says.
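Field-level protection of this kind is straightforward to express. Here is a minimal sketch in Python (using the third-party cryptography package; the field names, record and key handling are hypothetical) that encrypts only the sensitive columns so the rest of the record stays searchable.

from cryptography.fernet import Fernet

SENSITIVE = {"ssn", "card_number"}        # the handful of fields worth protecting
vault = Fernet(Fernet.generate_key())     # stand-in for a managed key service

def protect(record: dict) -> dict:
    # Encrypt only the sensitive fields; leave queryable fields in the clear
    return {field: vault.encrypt(value.encode()).decode() if field in SENSITIVE else value
            for field, value in record.items()}

row = {"name": "A. Patient", "ssn": "000-12-3456", "city": "Hanover"}
print(protect(row))   # name and city remain searchable; ssn becomes ciphertext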

Also, keep track of sensitive data elements as they move throughout the process. “They go from one database to maybe a smaller database,” Ouellet says. “Is there a way you can leverage centralized storage, like a NAS or SAN, where both databases store their information in the SAN? There’s replicated data, but at least it can be protected using an encryption appliance.”

Few Shortcuts for Persistent Encryption
Although encryption strategies exist for laptops, databases and backup tapes, transferring encrypted data from one storage level to the next remains a sticking point. In most cases, data must be decrypted and re-encrypted as it travels from one resting place to another.

“There are some solutions that bridge a couple of the different areas, such as laptop encryption and e-mail,” Ouellet explains. “But as far as persistent encryption across the network, not right now.”

A few vendors, including RSA Security and nCipher, offer key-management software that can exchange keys between applications from the same vendor. But that technology is in its infancy, Ouellet says.

Enterprise digital rights management (DRM) technologies have the potential to streamline this process. DRM offers persistent encryption and security, with access rights defined as part of the file itself. “There’s a tag that’s assigned to the file. If I want to view or print the file, I have to validate that I have the proper access rights for that activity,” Ouellet says. DRM becomes even more important if companies need to distribute protected documents beyond the enterprise. Microsoft and Adobe Systems are developing DRM products. Adobe plans to ship its LiveCycle Policy Server in the third quarter of this year.
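Conceptually, the tag is just policy metadata that travels with the file and is checked before decryption. A toy sketch follows, with entirely hypothetical names rather than the API of any shipping DRM product.

# Hypothetical rights tag travelling with a protected file (illustrative only)
tag = {
    "file": "q3-forecast.xls",
    "rights": {"alice": ["view"], "bob": ["view", "print"]},
}

def authorized(user: str, action: str) -> bool:
    # The policy check runs before any decryption of the file is attempted
    return action in tag["rights"].get(user, [])

assert authorized("bob", "print")
assert not authorized("alice", "print")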

“In five years, DRM is going to be the most pervasive way to protect your data,” Ouellet says. “Until then, there is no hybrid right now that covers everything. You’re going to have different areas that are covered with different types of technology.” CIO

Reprinted with permission. Copyright 2006 Computerworld. Send feedback about this feature to [email protected].

How Long Will It Be Safe?

Even with all the new encryption technology, vulnerabilities still exist. Algorithms once thought to be safe, like the MD5 and SHA-1 hash functions, were eventually broken. How long will today’s 3DES or 256-bit AES encryption keys last?

“With any encryption algorithm, at some point there will be enough number-crunching capacity to work through it,” says W. Curtis Preston, vice president of data protection at GlassHouse Technologies.

Using the fastest computers on the planet, how long would it take to crunch these numbers and come up with the code? “With 40-bit encryption, the answer is a couple of weeks,” says Preston. Some people believe that older algorithms like 3DES will become obsolete within five to 10 years. “But right now, it’s fine,” he says. “AES 256 goes an order of magnitude beyond that ... As long as you’re using something at or beyond 256-bit encryption you’re fine,” Preston adds.

— S.C.
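The arithmetic behind such estimates is just the doubling rule: each extra key bit doubles the brute-force work. A quick back-of-the-envelope in Python, at an assumed testing rate; the rate is a made-up parameter, and real attack speeds vary enormously.

SECONDS_PER_YEAR = 3600 * 24 * 365
rate = 10**9   # hypothetical keys tested per second; substitute your own estimate

for bits in (40, 56, 128, 256):
    years = 2**bits / rate / SECONDS_PER_YEAR
    # 40 bits falls in minutes at this rate; 128 and 256 bits are astronomical
    print(f"{bits}-bit keyspace: ~{years:.3g} years at {rate:.0e} keys/sec")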


Pundit

Vendors Rewrite the Rules
Market competition ramps up as EMC and Dell, IBM and LSI Logic, and NetApp unveil new products.
By Mario Apicella

STORAGE SOLUTIONS | It seems inevitable, like death and taxes. Every year, storage vendors renew their portfolios, delivering significant changes if not complete redesigns of critical product lines. Recently, almost simultaneous major announcements from EMC-Dell, LSI Logic-IBM, and NetApp followed similar chest-thumping from Hitachi Data Systems, HP and Sun.

Why did I pair up EMC and Dell, and LSI Logic and IBM? Because each name on the right side of the dash is a major OEM of the one on the left, so although the name may be different, the pair is essentially announcing the same new product. And IBM will probably resell NetApp’s new products, too. Confused yet?

Let’s dig in, starting in alphabetical order. The EMC-Dell duo has announced a new Clariion line, dubbed the CX3 Ultra Scale series, that extends beyond the capacity and performance of older versions. CX3 includes three models (the CX3-20, CX3-40 and CX3-80) with capacity ranging from 365GB to 239TB. EMC suggests that a CX3-80 about doubles the performance of the older CX700. All of the CX3 boxes replace PCI-X with better-performing PCI Express connectivity. In addition, the arrays support both 4Gbps and 2Gbps FC (Fibre Channel) drives, and can move data between virtual LUNs (logical unit numbers) without disrupting applications. Another interesting new feature of the CX3 line is that customers can replace faulty components without the help of an engineer.

Moving on to LSI Logic and IBM. LSI is introducing three new arrays, the 3992, 3994 and 6994, which can mount 16 4Gbps FC drives in a 3U enclosure. Adding expansion modules raises their capacity further, to between 5TB and 100TB.

If you need more than the 112 drives that the 3992/4 can host, you can perform a non-disruptive migration to the larger 6994, which holds up to 224 drives. Performance gain could be another reason to migrate, because the 6994 can sustain twice the IOPS of the 3992. The modular architecture of the three LSI Logic arrays allows the same enclosure to be used for both controller and expansion modules, a welcome manufacturing simplification for its OEMs.

Now, NetApp. First, allow me to explain that these announcements are not necessarily targeting the same market. While NetApp is attacking the higher tier, where the company had no products, the other vendors are hoping to grab more mid-tier market share by beefing up the capacity and performance of their arrays.

NetApp is announcing the FAS6030 and FAS6070, which extend capacity up to 420TB and 504TB, respectively. To improve performance the two arrays mount 64-bit processors — four on the 6030 and eight on the 6070. They also have a larger cache, 32GB and 64GB, respectively. And did I mention the support for 4Gbps FC?

Further, NetApp is releasing Data ONTAP 7G, a new version of its OS. It allows you to set priority resource allocations for some volumes, increases the number of volumes supported to 500, and offers tools to eliminate data redundancy. NetApp is also adding more service options to support its new strategy.

What sense can we make of this marathon of announcements? For one thing, there is more competition between vendors, which you can leverage to strike better bargains. Because of a more crowded market, vendors will be comparing products’ performance. Make sure that those numbers are publicly posted and can be independently verified by groups like the Storage Performance Council. CIO

Reprinted with permission. Copyright 2006 InfoWorld. Send feedback about this column to [email protected].
