

Posted by Community Admin on Jan 7, 2009 12:01:50 AM
Even with the incredible traction virtualization is making in the market it’s still easy to get confused as to what virtualization means in every case. Is it server virtualization provided by hypervisors like Xen or ESX? Is it storage virtualization? Network virtualization? Broadly, it's all of the above.


One thing that's clear is that the market for virtualization is growing and maturing, and one advantage of this evolution is that industry analysts are starting to categorize and differentiate between virtualization technologies and virtualization vendors. This certainly helps us, but it also helps customers understand where they are in their own deployments versus where they can go in the future.


Recently, the pundits have begun making distinctions between the different types or phases of virtualization. For example, IDC has identified two current levels of virtualization: Virtualization 1.0 and 2.0+. Virtualization 1.0 is defined as server virtualization – using hypervisors to partition resources – while Virtualization 2.0+ is defined as the next generation of virtualization technology – focused on virtualizing data center infrastructure beyond just the server.


Virtualization 1.0 is targeted at reducing capex through consolidation, something that hypervisors do extremely well. Virtualization 2.0+ focuses on reducing opex by adding capabilities around infrastructure virtualization, management tools to simplify management, and higher value tools like Disaster Recovery and hardware failover. Ultimately, Virtualization 2.0+ transforms data center infrastructure into flexible, changeable assets that can be deployed, moved and managed seamlessly.


The categorization that's happening now is important. It allows customers to wade through the confusing virtualization landscape and choose products that they can actually benefit from and that complement what they already have installed. We know that virtual machine sprawl is becoming real, and with this sprawl comes a new set of challenges – managing the infrastructure that connects the sprawling VMs.


Virtualization of IT infrastructure has been a huge success. We are effectively separating the software from the hardware, which can provide a multitude of benefits. One of the early benefits has been consolidation. As IT has evolved to a "one application, one server" mentality, server virtualization offers a way to radically consolidate hardware resources. Virtual networks let us share resources without building new infrastructures.


It has been done on mainframes for decades. The point is that we now can do it on hardware that is affordable and scalable, and that demands little in terms of fault tolerance. Virtualization is destined to go well beyond its beginnings and become a key underpinning of the Flat IT world. Utilization will be a part of it but, more importantly, virtualization allows us to create a completely fluid and dynamic IT environment. This fluidity is the linchpin in terms of how we really build IT utilities.


If you have ever configured a server, think about the time it takes to change a server from a Web server to a database server, to an Email server. Even with the best tools, it is not easy and it is definitely not dynamic. With virtualization the fluidity of change will simply move to a completely new level, allowing IT resources to be applied (leveraging other technologies like Grid) almost instantaneously to meet changing needs.
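To make the contrast concrete, here is a minimal Python sketch (all names hypothetical) of the virtualized version of that repurposing task: changing a server's role becomes swapping which prebuilt VM image runs on the host, rather than reinstalling and reconfiguring the machine.

```python
# Hypothetical sketch: with virtualization, "changing a server's role"
# is just pointing the host at a different prebuilt image.
IMAGES = {"web": "web-vm.img", "db": "db-vm.img", "mail": "mail-vm.img"}

class Host:
    """A physical host that can run one VM image at a time."""
    def __init__(self):
        self.running = None

    def assign(self, role):
        # Tear down the old VM and boot the prebuilt image for the new
        # role -- no OS reinstall, no application re-setup on the host.
        self.running = IMAGES[role]

host = Host()
host.assign("web")
host.assign("db")        # repurposed in a single step
print(host.running)      # db-vm.img
```

The point of the sketch is the single `assign` call: the slow, manual rebuild collapses into an image swap that an orchestration layer can perform on demand.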


Most IT managers, I believe, would just be happy not having to plan downtime when they want to migrate a server or storage system. Yes, in ten years we might have a fully autonomous, dynamic, self-managing IT environment but, there is huge value in just the first basic step – separating the software from the hardware.


Key changes:
Virtual Infrastructures – Virtualization has been a hot trend for some time now, and the technology will exist wherever there is hardware. Virtualization is important for utilization but also ultimately critical for building a truly dynamic IT environment. Virtualization will simply free IT from any specific coupling to hardware.


Information-Centric IT – In existing IT environments, we've moved from being server-centric to OS-centric to application-centric. In the next generation, we will become more network-centric but fundamentally start building information technology around the information itself. This is powerful. It means that information is no longer captive to a single application but can be leveraged across any number of applications.


Services Oriented Architecture – You may think there is nothing new here, but this is where major new changes are coming. We've always considered application services as part of this construct, but now all interaction with data/information will occur at the SOA layer. Applications and users will receive and store their information by interacting with Information Services. These services will provide protection, archiving, compliance, security, and other capabilities as a service. A single application will no longer "own" data. Information will exist as an independent element that can be managed independently and used by any authorized application. Combined with delivering resources, this creates a Services Oriented Infrastructure.
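A minimal Python sketch, with hypothetical names, of what an Information Service at the SOA layer might look like: applications read and write records only through the service, which enforces access control centrally, so no single application "owns" the data.

```python
# Hypothetical sketch: an "information service" that owns data on behalf
# of many applications, applying access policy centrally instead of
# inside any single application.
class InformationService:
    def __init__(self):
        self._store = {}   # key -> record
        self._acl = {}     # key -> set of authorized applications

    def put(self, app, key, record, authorized_apps):
        # The service, not the writing application, tracks who may read.
        self._store[key] = record
        self._acl[key] = set(authorized_apps) | {app}

    def get(self, app, key):
        # Any authorized application can use the same information.
        if app not in self._acl.get(key, set()):
            raise PermissionError(f"{app} is not authorized for {key}")
        return self._store[key]

# Two different applications sharing one piece of information:
svc = InformationService()
svc.put("crm", "customer:42", {"name": "Acme"}, authorized_apps=["billing"])
print(svc.get("billing", "customer:42"))   # {'name': 'Acme'}
```

In a real deployment the same chokepoint is where the protection, archiving, and compliance capabilities mentioned above would be attached, since every application interaction passes through it.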


Composite Apps built without code – Within the services framework complete applications are simply connected with workflow (BPM) tools just like working with Visio. Composite applications are built by coupling information, security, application, and other services together in a prescribed way.
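The idea above can be reduced to a few lines of Python (all service names hypothetical): each existing service is a callable, and the BPM-style tool's job is essentially to wire them into an ordered workflow, so a "new application" appears without new business code.

```python
# Hypothetical sketch: a composite application assembled by connecting
# existing services with a workflow, rather than writing new code.
def authenticate(ctx):
    # Existing security service (toy stand-in).
    ctx["user"] = ctx.get("token", "").replace("tok-", "user-")
    return ctx

def fetch_order(ctx):
    # Existing information service (toy stand-in).
    ctx["order"] = {"id": 7, "owner": ctx["user"]}
    return ctx

def audit(ctx):
    # Existing compliance service (toy stand-in).
    ctx.setdefault("log", []).append(f"order {ctx['order']['id']} read")
    return ctx

def compose(*steps):
    # The workflow tool's essence: run the prescribed steps in order.
    def app(ctx):
        for step in steps:
            ctx = step(ctx)
        return ctx
    return app

order_viewer = compose(authenticate, fetch_order, audit)  # drawn, not coded
result = order_viewer({"token": "tok-alice"})
print(result["order"])   # {'id': 7, 'owner': 'user-alice'}
```

The Visio analogy maps directly: `compose` is the canvas, and the services are the shapes being connected.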


Model-Based Management Provides Orchestration of Resources and Services – To pull all of these capabilities together we need management. Traditional framework-centric management is just not going to cut it, however. Today's management technologies simply can't handle the virtual, dynamic, and complex environments that will be constructed. This is where model-based management comes in; it will transform how we think about management. Simply, model-based management will provide the orchestration necessary to deliver highly reliable and scalable systems across these complex environments.


Virtual Appliances as the Preferred Delivery Model for Application Services – As all interaction and communications between application services and information services will operate at the SOA layer, many of the complex, driver-centric functions that exist within today's operating systems will simply no longer be utilized. Base operating environments will exist principally to provide a compute environment for applications. Hence, we will start to see more applications embed a base OS and other base capabilities directly within their offerings. This will simplify integration, test, security, delivery and support. We are seeing major examples of this today – for example with Oracle's recent embedded Linux announcement.