Archive for March, 2013

A Virtualized Data Center is a Happy Data Center

March 29th, 2013

A virtualized data center is a happy data center, says survey.

Server virtualization crossed an important milestone in 2012, according to a recent report by Aberdeen Group Inc. The market research firm found that the share of applications running in virtual environments passed the 50% mark.

“Enterprises now feel quite comfortable deploying virtualized servers,” noted Dick Csaplar, senior research analyst at Aberdeen Group Inc.

Higher utilization rates and a reduction in server requirements are a few of the reasons the technology has become so popular. Traditionally, companies purchased a server, or multiple systems, for each application. During nonpeak times, those systems often sat idle.

Virtualization enables enterprises to place multiple applications on a single server, thereby increasing system utilization. Typically, companies aim for a utilization rate of about 80% of capacity, leaving some headroom in case an emergency pops up. In many cases, virtualization raises utilization from roughly 10% to 15% before consolidation to 50% to 75% afterward.

With servers operating more efficiently, businesses need less hardware. In fact, Aberdeen found that companies achieve a 10- to 15-fold reduction in hardware systems by moving to virtualization.
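To make that arithmetic concrete, here is a minimal back-of-the-envelope sketch in Python. The CPU-only model and all of the numbers are illustrative assumptions, not figures from the Aberdeen report.

```python
import math

# Back-of-the-envelope consolidation estimate. Purely illustrative:
# this CPU-only model is not the methodology Aberdeen or Kroll used.

def hosts_needed(physical_servers: int,
                 avg_utilization: float,
                 target_utilization: float) -> int:
    """Estimate how many virtualization hosts can absorb the same CPU load."""
    total_load = physical_servers * avg_utilization   # aggregate CPU demand
    return math.ceil(total_load / target_utilization)

# 100 servers averaging 12% busy, consolidated onto hosts run at about 70%:
print(hosts_needed(100, avg_utilization=0.12, target_utilization=0.70))
# -> 18 hosts, roughly a 5.5x reduction from CPU load alone. Real ratios
# vary widely: many legacy servers sit nearly idle, and memory or I/O,
# rather than CPU, is often the binding constraint.
```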

Server virtualization streamlines data centers

Kroll Factual Data, a firm with 300 employees that provides information services to the mortgage industry, offers a good example of the potential savings. The company delivers credit reports, risk assessment reports, business background research, collection information services and employee screenings to clients. In the last five years, Kroll Factual Data acquired 58 companies, and the purchases resulted in a hodgepodge of servers.

Maintaining the systems was a manually intensive effort. Traditionally, it took the IT staff two weeks to configure a server. IT had to physically transport an acquired firm’s servers to Kroll Factual Data’s data center, start up the servers to ensure they were operational and then integrate the new systems into its data center. The process was extremely time-consuming (it took 30 to 60 days), involved a huge amount of risk (some of the components did not work together) and resulted in occasional failures. Consequently, the IT professionals spent their time on mundane server configuration tasks rather than on creating new value-added business applications.

In early 2008, the business decided to streamline its data center infrastructure. Virtualization was a good fit because the corporation wanted to reduce its data center costs and enable its server infrastructure to better absorb spikes in demand. The firm's business activity often fluctuated; changes in interest rates or federal lending policies, for example, created surges in demand for its services.

Kroll Factual Data surveyed the virtualization software market and evaluated the available products, including Citrix Systems Inc.'s XenServer, Microsoft's Hyper-V and VMware's offering. The financial services company ultimately selected Hyper-V.

“About 99% of our applications run on Windows, so it made sense for us to go with the Microsoft solution,” said Chris Steffen, principal technical architect at Kroll Factual Data, which has served as a beta site for all three Hyper-V releases.

In the summer of 2008, Kroll Factual Data began updating its servers to Windows Server 2008 Datacenter with Hyper-V, a process that took several months. Upon completion, the company dramatically pruned its physical servers, going from 650 servers to 22 systems, which resulted in cutting its annual hardware expenditures by tens of thousands of dollars.

The savings did not stop there.

“With virtualization, companies reduce their energy consumption significantly,” said David Brown, president of Datotel LLC, an IT services provider that operates a 34,000-square-foot data center in St. Louis and offers colocation, managed services and cloud computing resources to enterprises. Because companies run fewer servers, they consume less energy and shrink their data center footprint. Kroll Factual Data, for instance, reduced its energy costs by $440,000 annually.

Personnel savings are also possible. At Kroll Factual Data, provisioning a virtual machine now takes 10 to 15 minutes rather than the weeks previously required. Consequently, the staff concentrates more on higher-value projects, such as creating new reports for customers, and less on deploying, maintaining and upgrading servers.

But virtualization presents enterprises with new challenges. Training is one possible hurdle.

“Data center technicians need to develop the skills so they understand how to deploy and maintain virtualized servers,” said Datotel’s Brown.

New management challenges arise as well. Information from many different applications is consolidated on single servers, which means large amounts of data need to be sifted through and interpreted. Also, virtualization software dynamically moves workloads from server to server depending on which has unused processing cycles. Consequently, technicians often lack clear visibility into what data is being processed where, which makes troubleshooting more difficult.
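To picture that dynamic placement, the toy heuristic below repeatedly moves the smallest virtual machine off the busiest host whenever doing so narrows the utilization gap. It is only a sketch: production schedulers such as VMware's Distributed Resource Scheduler also weigh memory pressure, affinity rules and migration cost, and every host and VM name here is hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Host:
    name: str
    cpu_capacity: float                        # e.g., total GHz
    vms: dict = field(default_factory=dict)    # vm name -> CPU demand (GHz)

    @property
    def load(self) -> float:
        return sum(self.vms.values()) / self.cpu_capacity

def rebalance_once(hosts: list) -> bool:
    """Move the smallest VM from the busiest host to the idlest one,
    if that actually narrows the utilization gap. Returns True on a move."""
    busiest = max(hosts, key=lambda h: h.load)
    idlest = min(hosts, key=lambda h: h.load)
    if not busiest.vms or busiest is idlest:
        return False
    vm, demand = min(busiest.vms.items(), key=lambda kv: kv[1])
    # Improvement test; assumes hosts of comparable capacity.
    if busiest.load - idlest.load <= 2 * demand / busiest.cpu_capacity:
        return False
    idlest.vms[vm] = busiest.vms.pop(vm)
    print(f"migrated {vm}: {busiest.name} -> {idlest.name}")
    return True

hosts = [Host("hostA", 24.0, {"web1": 6.0, "web2": 5.0, "db1": 8.0}),
         Host("hostB", 24.0, {"mail1": 2.0})]
while rebalance_once(hosts):
    pass
```

Running the sketch migrates web2 from hostA to hostB and then stops, because any further move would widen the gap rather than narrow it, which is exactly the kind of churn-avoidance a real scheduler must reason about.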

Finally, corporations have shown the trepidation that accompanies any new technology. “Companies have moved most of their simple apps to a virtualized environment but have been less likely to tinker with mission-critical systems,” said Csaplar.

As evidence, more than two-thirds of enterprises have already migrated their test, light business and Web applications to a virtualized environment. However, fewer than half of their database, email and mission-critical applications run on virtualized servers, according to Aberdeen.

“Traditionally, virtualization solutions could not support the number of CPUs and memory needed to run large, complex, transaction-oriented applications,” said Csaplar.

But recent releases of these products have raised those upper limits, now 32 CPUs and 1,000 GB of memory, so that hurdle has been cleared. Consequently, Aberdeen found that enterprises have already approved projects to move more applications to virtualized servers. In fact, once the projects are completed, more than 70% of corporate applications will operate in this manner.

“Virtualization has been evolving, so it is becoming the foundation for every application in some enterprises,” said Csaplar.

ABOUT THE AUTHOR: Paul Korzeniowski is a freelance writer who specializes in cloud computing and data-center-related topics. He is based in Sudbury, Mass., and can be reached at paulkorzen@aol.com.

Understanding Server Virtualization

March 25th, 2013

Virtualization is a hot trend, but small businesses that decide virtualization is the right move for them need a complete understanding of their existing resources, hardware and applications before taking action.
1. Understand your infrastructure. Before you make a decision about virtualization, it is imperative to learn about your business's existing infrastructure. This includes the numbers and types of servers, operating systems, CPU and memory utilization, and application names and versions. Without a thorough understanding of these components, it is difficult to see how virtualization technologies could best be used within your organization.
2. Don’t virtualize everything. Although virtualization is a flexible technology that can bring benefits to a wide variety of environments, it is not the answer for everything. Operating system virtualization provides the most benefit when it replaces a physical server that is underutilized. For example, a server running Active Directory that uses only a small percentage of its processing power is an ideal candidate for virtualization.
3. Understand your administration model. Virtualization brings a new style of administration that may affect existing processes within an organization. Server teams with provisioning responsibilities may have to adapt to this new model in order to create new virtual servers.
4. Understand the applications you have. “Before virtualizing any applications, it is best practice to understand exactly what applications are included in the estate, what versions they are currently using and how they work,” Colombo said. When you have a complete understanding of your applications, you’ll be able to make the best decisions when considering virtualizing those applications.
5. Make capacity planning decisions. Understanding the infrastructure that will be used to virtualize an environment is a must, especially the specification and capacity of the chosen systems. Get this wrong and the solution you choose may not provide the expected performance and service levels; a minimal sizing sketch follows this list.
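As a rough illustration of steps 1 and 5 taken together, the sketch below checks whether an inventory of workloads fits on a candidate host while preserving headroom. All server names, demand figures and host specifications are hypothetical assumptions, for illustration only.

```python
# First-pass capacity check for a single virtualization host.
# Inputs are the kind of inventory data step 1 recommends collecting;
# every name and number here is made up for the example.

vm_demands = [
    # (name, peak CPU GHz, peak memory GB) from monitoring current servers
    ("active-directory", 0.8, 4),
    ("file-server",      1.2, 8),
    ("intranet-web",     2.0, 8),
    ("accounting-db",    3.5, 16),
]

HOST_CPU_GHZ = 2.4 * 8      # eight 2.4 GHz cores
HOST_MEM_GB = 64
HEADROOM = 0.80             # plan to fill the host to at most 80%

cpu_needed = sum(cpu for _, cpu, _ in vm_demands)
mem_needed = sum(mem for _, _, mem in vm_demands)

print(f"CPU: {cpu_needed:.1f} GHz of {HOST_CPU_GHZ * HEADROOM:.1f} GHz usable")
print(f"Memory: {mem_needed} GB of {HOST_MEM_GB * HEADROOM:.0f} GB usable")

fits = (cpu_needed <= HOST_CPU_GHZ * HEADROOM
        and mem_needed <= HOST_MEM_GB * HEADROOM)
print("Fits on one host" if fits else "Needs another host or a bigger one")
```

A check like this is only a starting point; real capacity planning also accounts for peak overlap, storage I/O and failover reserve.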
If you need more information, attend our next webinar, Understanding Server Virtualization, at The Computer Company on Thursday, April 4th at 2:00 p.m. EST.