What Is Virtualization

Virtualization refers to the process of recreating key components of a computing system, such as hardware platforms and operating systems, in virtual form in order to make more efficient use of the system’s resources. As a means of extending the functional capacity of complex computing systems, virtualization has been evolving steadily since it was first used in the 1960s. The mainframe computers of that era consumed vast resources to perform relatively simple tasks, so virtualization was developed as a way to distribute a mainframe’s resources more efficiently, allowing numerous applications to run simultaneously.

Since those early days, virtualization has developed significantly, in keeping with the manifold changes in hardware, software, and computing technology that have taken place over the last several decades. Today, virtualization is used to increase operational efficiency and lower running costs for complex computing systems in a variety of settings. Virtualization has numerous modern variations, including hardware virtualization, application virtualization, operating-system-level virtualization, and storage, network, and server virtualization.

Hardware virtualization is one of the most commonly used techniques, enabling the creation of a virtual guest computer running on the host machine, which is the physical hardware platform. The virtual guest machine contains its own operating system and functions like another real computer. This type of virtualization can take one of three forms, each marked by a different degree of hardware simulation. Full virtualization, as the name suggests, simulates the underlying hardware completely, so that an unmodified guest operating system can run independently. Partial virtualization simulates only some of the hardware, meaning that some guest programs may need to be modified before they can run. Paravirtualization simulates the least: instead of mimicking the hardware, it presents a software interface to the guest, so the guest operating system itself must be modified to cooperate with the host.
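The contrast between full virtualization and paravirtualization can be sketched in a toy model. All of the names below (the Hypervisor class, its trap and hypercall methods, the "cli" instruction string) are illustrative assumptions, not a real hypervisor API; the point is only the difference in how each kind of guest reaches the hardware.

```python
# Toy model contrasting full virtualization and paravirtualization.
# All class and method names here are illustrative, not a real API.

class Hypervisor:
    """Mediates guest access to the physical hardware."""

    def trap_privileged_instruction(self, instruction):
        # Full virtualization: the guest runs unmodified and issues
        # privileged instructions as if on real hardware; the hypervisor
        # intercepts ("traps") and emulates each one.
        return f"emulated {instruction}"

    def hypercall(self, request):
        # Paravirtualization: the guest OS is modified to call the
        # hypervisor directly, skipping trap-and-emulate.
        return f"handled {request}"


class FullyVirtualizedGuest:
    def __init__(self, hypervisor):
        self.hv = hypervisor

    def disable_interrupts(self):
        # The guest believes it is on bare metal; the hypervisor
        # transparently traps the privileged instruction.
        return self.hv.trap_privileged_instruction("cli")


class ParavirtualizedGuest:
    def __init__(self, hypervisor):
        self.hv = hypervisor

    def disable_interrupts(self):
        # The modified guest cooperates with the hypervisor explicitly.
        return self.hv.hypercall("disable_interrupts")


hv = Hypervisor()
print(FullyVirtualizedGuest(hv).disable_interrupts())   # emulated cli
print(ParavirtualizedGuest(hv).disable_interrupts())    # handled disable_interrupts
```

Either way the guest gets the same result; the design trade-off is that full virtualization preserves guest compatibility at the cost of trapping every privileged operation, while paravirtualization trades compatibility for a faster, explicit interface.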

Operating system virtualization is one of the oldest forms of virtualization, dating back to the days of mainframe computers, and allows multiple operating system instances to run simultaneously on one piece of hardware. Network, storage, and server virtualization all constitute relatively recent developments in virtualization techniques.

Network virtualization separates the total bandwidth available to a network into distinct channels, which can be assigned to different servers or guest devices at different times. Network virtualization therefore increases the functional reach and capacity of a network, enabling more users to access it at once.
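The channel-splitting idea above can be sketched as a small allocator. The class name, the equal-split scheme, and the bandwidth figures are all illustrative assumptions, not a description of any real networking stack:

```python
# Toy sketch of network virtualization: dividing a link's total
# bandwidth into independent channels and assigning them to guest
# devices. The equal split and all names are illustrative assumptions.

class VirtualNetwork:
    def __init__(self, total_bandwidth_mbps, channels):
        # Split the physical link's capacity into equal virtual channels.
        self.channel_bandwidth = total_bandwidth_mbps / channels
        self.assignments = {}                 # channel id -> device name
        self.free = list(range(channels))     # channels awaiting a device

    def attach(self, device):
        # Assign the next free channel to a guest device, if any remain.
        if not self.free:
            return None
        channel = self.free.pop(0)
        self.assignments[channel] = device
        return channel

    def detach(self, channel):
        # Release a channel so it can be reassigned to another device,
        # which is what lets channels "rotate" among users over time.
        self.assignments.pop(channel, None)
        self.free.append(channel)


net = VirtualNetwork(total_bandwidth_mbps=1000, channels=4)
ch = net.attach("guest-vm-1")   # each channel carries 250 Mbps
net.detach(ch)                  # the channel is now free for reuse
```

Because channels are released and reassigned rather than permanently owned, the same physical link can serve more devices over time than it has channels at any one moment.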

Storage virtualization creates a virtual repository where all of the information from a system’s storage devices is centrally administered. It is important to note that although storage virtualization presents the appearance of a central storage bank, in reality all of the data is still stored on the various physical storage devices; storage virtualization simply renders all of that data centrally accessible.
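A minimal sketch can make that point concrete: the virtualization layer keeps a mapping from one logical address space to many physical devices, so callers read and write as if storage were centralized while the data stays distributed. The class, the block-numbering scheme, and the fewest-blocks placement rule below are illustrative assumptions, not any real storage product:

```python
# Toy sketch of storage virtualization: one logical address space,
# with data physically spread across separate devices. The placement
# rule and all names are illustrative assumptions.

class VirtualStorage:
    def __init__(self):
        self.devices = {}   # device name -> {physical block: data}
        self.mapping = {}   # logical block -> (device, physical block)

    def add_device(self, name):
        self.devices[name] = {}

    def write(self, logical_block, data):
        # Place the block on the least-loaded device (fewest blocks),
        # then record where it went so reads can find it later.
        device = min(self.devices, key=lambda d: len(self.devices[d]))
        physical_block = len(self.devices[device])
        self.devices[device][physical_block] = data
        self.mapping[logical_block] = (device, physical_block)

    def read(self, logical_block):
        # The caller sees one address space; the virtualization layer
        # resolves where the data physically lives.
        device, physical_block = self.mapping[logical_block]
        return self.devices[device][physical_block]


store = VirtualStorage()
store.add_device("disk-a")
store.add_device("disk-b")
store.write(0, "alpha")
store.write(1, "beta")
print(store.read(0))   # "alpha", regardless of which disk holds it
```

The mapping table is the entire trick: nothing is copied into a central bank, yet every block is reachable through one uniform interface.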

Server virtualization creates virtual servers in a network setting, enabling greater server capacity and masking the system’s complexity from users.

Virtualization is an important aspect of system setup and management in corporate and industrial settings, and in applications such as game console emulation. Without virtualization, the vast majority of today’s complex, network-dependent computing systems would be unable to function efficiently and would prove a serious drain on increasingly limited computing resources.