How to get to utility computing

Moving from departmental silos to a shared architecture won't be easy, but the pay-off is huge.

It's called grid, utility or on-demand, but it's all about the same thing: creating computing infrastructures that can dynamically shift resources between tasks as processing needs ebb and flow. It's a grand vision, but getting there won't be easy. Here's how to start.

A slew of vendors - advocates of so-called utility computing - are promising to give IT executives the tools to deploy IT resources on the fly as business conditions change. In theory, utility computing lets managers achieve greater utilisation of data-centre resources at lower operating cost. At their disposal will be flexible computing, storage and network capacity that can react automatically to changes in business priorities. The data centre of the future will also have self-configuring, self-monitoring and self-healing features so managers can reduce today's manual configuration and troubleshooting chores, advocates say.

The allure of utility computing is easy to see, but there's no clear road map. Getting there requires an approach that encompasses network gear, servers, software, services and IT governance. It also requires balancing the maintenance of existing IT resources with strategic new investments.

Having a clear picture of what already has been deployed is crucial before companies start to roll out new, intelligent devices. This might sound obvious, but experts say it's not always done.

Gartner analyst John Phelps says many companies don't know where all their servers are located, who controls and owns them, and the main functions and applications running on them.

Companies also lack a clear picture of how their IT assets relate to each other. For example, one e-business transaction might depend on data culled from several applications running on different platforms in multiple locations. Figuring out these sorts of dependencies is a prerequisite for any higher-level, utility-style computing, analysts say.

"Utility computing does not exist in a vacuum," says Jasmine Noel, principal with JNoel Associates. "The only way to do it is to first understand the relationships between hardware and software resources delivering a particular business service. Inventory discovery and relationship mapping are the keys to starting."

While companies focus on the fundamentals, vendors are working to create the intelligent devices, management tools and services that utility computing will consume. The field is crowded, dominated by HP, IBM and Sun. IBM uses "on-demand" to describe its initiative, HP has its Utility Data Centre lineup, and Sun has its N1 data-centre architecture.

Several software vendors have a stake in utility computing, including management software makers such as BMC Software and Computer Associates and storage management software maker Veritas Software.

There's plenty of work to do. Analysts say it will be a long time before routers can reconfigure themselves, servers can provision themselves and applications can dedicate more resources to themselves on the fly without human intervention. Building a true utility computing infrastructure is at least a seven- to 10-year effort, analysts say.

That doesn't mean companies should shelve their plans. There are plenty of opportunities to begin consolidating, standardising and automating data-centre resources today - and begin reaping the rewards of improved system management and reduced complexity.

Management counts
Management is the cornerstone of utility computing. Management software in the new data centre proposes to do more than monitor devices; it will store and enforce policies, discover devices, track changes, meter usage and ultimately take action when performance degrades.
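
As a rough illustration of what "store and enforce policies ... and take action" might look like, here is a minimal sketch. The metric names, thresholds and actions are assumptions for the example, not any vendor's actual product behaviour.

```python
# Minimal sketch of policy-driven management: stored policies are
# evaluated against live metrics, and actions are queued when a
# threshold is crossed. All names and numbers are illustrative.

POLICIES = [
    # (metric, upper threshold, action when the threshold is crossed)
    ("cpu_utilisation", 0.90, "provision_additional_server"),
    ("response_time_ms", 500, "shift_capacity_to_service"),
    ("disk_used_ratio", 0.90, "expand_storage_pool"),
]

def evaluate(metrics):
    """Compare one device's metrics against every stored policy."""
    actions = []
    for metric, threshold, action in POLICIES:
        value = metrics.get(metric)
        if value is not None and value >= threshold:
            actions.append((action, metric, value))
    return actions

# Example reading from a hypothetical monitored server:
sample = {"cpu_utilisation": 0.95, "response_time_ms": 120}
for action, metric, value in evaluate(sample):
    print(f"{metric}={value}: triggering {action}")
```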

The challenge will be capturing end-to-end systems data and consolidating it into something manageable, says George Hamilton, a senior analyst with The Yankee Group. "IT managers need to focus on getting all the data they capture from instrumentation and testing, and consolidate it in one place so they can more effectively manage," he says.
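
A sketch of that consolidation step, assuming the devices expose their instrumentation through some fetchable interface; the device list and the stub fetch_metrics function are placeholders for whatever collection mechanism the equipment actually offers.

```python
import json
import time

DEVICES = ["router-edge-1", "switch-core-2", "server-db-3"]

def fetch_metrics(device):
    # Stub: a real collector would query the device's own
    # instrumentation (SNMP, a vendor API, an agent) here.
    return {"device": device, "cpu": 0.42, "uptime_s": 86400}

def collect(devices):
    """Pull every device's readings into one consolidated snapshot."""
    return {"taken_at": time.time(),
            "readings": [fetch_metrics(d) for d in devices]}

# One snapshot in one place, instead of per-device islands of data:
print(json.dumps(collect(DEVICES), indent=2))
```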

But it won't be easy. Today, most companies use multiple management systems to collect performance and availability data, identify potential failures, and provision devices, applications and end users. To be useful in an adaptive environment, management systems must evolve beyond islands of expertise.

"It's a challenge to get multiple systems optimised and running and tied together. Anybody looking at utility computing should try to find a vendor that has a comprehensive solution that can help people tie the pieces together," says Bob Ackerly, president of Smith and Associates. The Houston semiconductor company uses Vieo's Adaptive Application Infrastructure Management (AAIM) appliance to monitor about 40 servers in its data centre.

Some vendors are working to make products more cooperative. For example, BMC recently partnered with security vendor Symantec and storage leader EMC to share management data across systems. Similarly, Cisco and IBM signed a deal last year in which the two will develop a common way to detect, log and resolve system problems.

Vendors also are developing management tools that not only watch devices but also monitor distinct business functions. Concord Communications, Mercury Interactive and Micromuse have begun extending their software tools to track the success and failure of business processes.

Following the path of a business process means crossing Web and application servers, databases, storage devices and the routers that direct traffic. To adequately track the path, the software first must find relationships between data-centre components; map those relationships into a logical topology; and configure the devices to report on how they perform, change and respond to application requests, Noel says. "It can be a mess unless some smart operations planning is done upfront."
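
A toy version of that tracking, under the assumption that discovery has already produced a logical topology for one business process; the components and statuses below are invented for the example.

```python
# Hypothetical logical topology for one business process, mapping the
# process to every component its path crosses.
TOPOLOGY = {
    "place-order": ["core-router-1", "web-server-01", "app-server-02",
                    "orders-database", "san-array-a"],
}

def check_process(process, component_status):
    """Report the first degraded component on the process's path."""
    for component in TOPOLOGY[process]:
        state = component_status.get(component, "unknown")
        if state != "ok":
            return f"{process}: degraded at {component} ({state})"
    return f"{process}: healthy end to end"

status = {"core-router-1": "ok", "web-server-01": "ok",
          "app-server-02": "ok", "orders-database": "slow",
          "san-array-a": "ok"}
print(check_process("place-order", status))
```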

If moving to a utility computing model required IT executives to abandon their existing infrastructure and start from scratch, there likely would be very few takers. Fortunately, industry watchers say, companies can find inexpensive ways to incrementally add components of an adaptive data centre to existing setups.

For example, today's routers, switches, servers and storage devices can do more than their standard tasks in the data centre - if instrumented correctly. Vendors such as Cisco, HP and IBM today deliver 'intelligent' devices that can report detailed information about their own state. These features could let them perform self-diagnosis, self-healing and self-management tasks, shortening the time it takes administrators to identify and resolve potential problems.

"Enterprise companies can, through attrition, introduce and embed these intelligent devices that have self-managing features into the network fabric," says Ahmar Abbas, managing director at research firm Grid Technology Partners. As new or existing projects necessitate hardware purchases, IT managers should invest in equipment with automated data collection and export features, which could eliminate the need for future instrumentation on the devices, Abbas says.

Equipment consolidation is another strategic move to consider. Managers can decrease the number of physical servers they maintain by combining applications into fewer, larger multiprocessor servers, or merge their storage into pools that can be managed from one interface. "A company can't get to a dynamic environment if it doesn't get rid of some of the hodgepodge," says Mary Johnston Turner, vice president at Summit Strategies.

Mark McNamara, IT director at WeightWatchers.com, says thinning out the number of servers is an attractive option, but vendors need to enable virtualisation beyond storage and servers for utility computing to take off. The New York company recently started working with BladeLogic to automate server provisioning.

"Hardware vendors are laying the groundwork, but the initiative will need more supporters," McNamara says. "Virtualisation will need hardware, software, network and storage vendor support."

Prune those apps
Companies also need to consolidate applications. As they move toward simpler, standardised hardware, they should also consider who their key software providers are, so that platform decisions - such as cutting the number of operating-system variants and release levels to reduce complexity and cost - don't preclude using those vendors' applications, Turner says.

Companies should consider defining and implementing standards in areas such as database management systems, application interfaces, development languages and middleware, Gartner's Phelps says. Many companies are bogged down with multiple directories, data repositories and rogue Web servers - often the result of departmental initiatives undertaken in the absence of corporate-wide system standards. Paring down the number of applications will improve system management and reduce complexity, Turner says.

Likewise, technologies such as identity management can simplify operations and automate redundant IT tasks. Identity management processes are aimed at creating and maintaining common security profiles across multiple applications - reducing the burden on IT managers to handle mundane tasks such as resetting passwords.
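
A minimal sketch of that idea - one centrally held security profile pushed to several applications - with invented application names and a standard-library hash; real identity-management products do considerably more.

```python
import hashlib
import os

class IdentityStore:
    """One security profile per user, shared by several applications."""

    def __init__(self, applications):
        self.applications = applications
        self.profiles = {}

    def create_user(self, user, roles):
        self.profiles[user] = {"roles": roles, "pw_hash": None, "salt": None}

    def reset_password(self, user, new_password):
        salt = os.urandom(16)
        digest = hashlib.pbkdf2_hmac("sha256", new_password.encode(),
                                     salt, 100_000)
        self.profiles[user].update(pw_hash=digest, salt=salt)
        # One reset propagates everywhere, instead of one ticket per system:
        for app in self.applications:
            print(f"pushed new credential for {user} to {app}")

store = IdentityStore(["billing", "crm", "intranet"])
store.create_user("jsmith", roles=["sales"])
store.reset_password("jsmith", "correct horse battery staple")
```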

Analysts also identify opportunities to gain efficiencies at the business application level - although consolidation is toughest at this level, Turner says. It's not uncommon for companies to support multiple versions of the same applications, each with distinct data definitions. Reconciling the separate instances often requires re-deploying software.

If companies don't want to tamper with existing installations, there are other ways to simplify application infrastructure. In the same way storage resources can be virtually linked rather than physically consolidated, companies can link information to reconcile different data formats without tangling with underlying data structures. Information-integration tools such as those from IBM and start-up Avaki add a layer of abstraction that makes it easier for data elements to become part of shared resources across the organisation, she says.
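
The abstraction layer can be pictured as a set of adapters: each source system keeps its native record format, and a thin translation presents one shared shape on read. Both formats below are invented for illustration.

```python
# Each source keeps its own record format; adapters translate on read
# so consumers see one shared shape. Formats are hypothetical.

def from_legacy_billing(record):
    return {"customer_id": record["CUST_NO"],
            "name": record["CUST_NAME"].title()}

def from_web_storefront(record):
    return {"customer_id": record["id"],
            "name": f'{record["first"]} {record["last"]}'}

ADAPTERS = {
    "legacy_billing": from_legacy_billing,
    "web_storefront": from_web_storefront,
}

def unified_view(source, record):
    """Present one customer shape regardless of where the data lives."""
    return ADAPTERS[source](record)

print(unified_view("legacy_billing", {"CUST_NO": 1042, "CUST_NAME": "ACME LTD"}))
print(unified_view("web_storefront", {"id": 1042, "first": "Jo", "last": "Kim"}))
```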

Turner recommends starting small: Companies should consider first migrating the applications used by a portion of the employee population - such as sales and marketing - from individual servers to a shared infrastructure. Within that shared infrastructure, users can determine application priorities, common security privileges, and automated policy responses.

Pay attention to processes
Building a smarter data centre requires incorporating business goals and processes into technology systems. The issue is part cultural and part technical. Business leaders need to communicate business objectives to the IT department, and IT executives need to map those objectives to technology resources. Collaboration is required - but not always easy.

"The IT organisation almost has to become social workers," Turner says. "They have to sit down with all these different business constituencies who have no interest in making each other happy, and they have to convince them that there is a business reason and a business benefit to them, to their little business silo, of running on a shared architecture. And they have to demonstrate that they can protect the interests of each particular business area."

Lee Adams, vice president of infrastructure services at Hospital Corporation of America, agrees. "That's the hard work of delivering complete service-level management to the business process. You have to go out and pin folks down in different departments and then take their processes and put them into the software."

Adams works with BMC to automate service management across billing, medication and patient applications. He says inputting "how people get their job done, from the most basic chore to complex processes" is necessary to automate any system that supports critical applications.

On the technical side, IT executives should start by documenting the steps they take today to ensure IT systems are available and performing as expected, and then export that information into a workflow management system. One caveat is that the more convoluted a company's IT processes are today, the more difficult it will be to translate those manual steps into automation-ready tasks, Noel says.
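
One way to picture that export step: each documented manual step becomes a task record a workflow engine can execute or escalate. The runbook below is an invented example; convoluted real-world procedures will map far less cleanly than this.

```python
# A documented manual procedure, re-expressed as automation-ready tasks.
# Steps and ordering are an invented example of an availability runbook.
RUNBOOK = [
    {"step": "check service heartbeat", "automated": True},
    {"step": "scan error log for known failure signatures", "automated": True},
    {"step": "restart the application server", "automated": True},
    {"step": "page the on-call DBA if the restart fails", "automated": False},
]

def run(workflow):
    for task in workflow:
        if task["automated"]:
            print(f'executing: {task["step"]}')
        else:
            print(f'escalating to a human: {task["step"]}')

run(RUNBOOK)
```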

Consider services
Service providers offer an alternative way to ease into utility computing.

HP, IBM and Sun are among the companies offering such services, which provide everything from core server and storage capacity to the delivery of specialised business applications.

Uptake for utility computing is on the rise, according to Gartner. The research firm estimates 15 per cent of corporations will adopt a utility computing arrangement this year, and the market for utility services in North America will increase from $8.6 billion this year to more than $25 billion in 2006. By 2006, 30 per cent of companies will have some sort of utility computing arrangement, Gartner predicts.

It won't be easy, but the potential payoff is compelling.

