Infrastructure virtualization uses existing software and hardware to emulate other software or hardware. The idea is nearly as old as modern computing, but it really took off in the early 21st century, when comprehensive server virtualization options became available. Almost any piece of hardware or software can be virtualized, which makes the scope of the technology very broad. Among the most common applications of infrastructure virtualization are operating system emulation, virtual desktops, and virtual servers.
The technology that eventually became infrastructure virtualization began in the mid-1960s. Initially, virtualization had two main goals: the creation of a “virtual memory” system and a “machine emulator” capable of running software designed for other computer platforms. Several companies eventually achieved these goals, but results were mixed, and research into virtualization continued.
The middle years of the research had their ups and downs. True virtual machines were developed in the mid-1970s. These were programs that mimicked entire computers closely enough that ordinary software could run inside them. These early virtual machines rarely had enough horsepower to run applications of any real size or complexity, but the technology showed enough potential that several companies kept pursuing it. Infrastructure virtualization research in the 1980s and 1990s brought many improvements but few breakthroughs.
In 2003, the first open source hypervisor was released. This program made it possible to run and monitor multiple operating systems simultaneously on a single machine. Virtual machine monitors had existed since the earliest days of virtualization, but this program was free, full-featured, and powerful. Combined with multi-core processors, it allowed almost any real server to host multiple virtual servers with little reduction in overall performance. By running virtual servers, a company could cut energy costs and increase the overall capacity of its network.
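As a rough sketch of what this makes possible, the short Python snippet below lists the virtual servers defined on a single physical host. It is illustrative only: it assumes the open source libvirt management library with its Python bindings and a local QEMU/KVM hypervisor, and the "qemu:///system" connection URI is an assumption that varies by installation.

    # Minimal sketch: list the virtual machines defined on one physical host.
    # Assumes the libvirt Python bindings and a local QEMU/KVM hypervisor;
    # the "qemu:///system" URI is an assumption and differs by setup.
    import libvirt

    conn = libvirt.open("qemu:///system")  # connect to the local hypervisor
    try:
        for domain in conn.listAllDomains():  # every defined virtual machine
            state = "running" if domain.isActive() else "stopped"
            print(f"{domain.name()}: {state}")
    finally:
        conn.close()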
In the corporate world, virtualization is often the norm. Server rooms may have machines each running a half dozen or more virtual servers with little or no reduction in speed or power. Virtual desktops have removed the need for a dedicated computer for every worker: instead of each employee having a personal machine, a copy of a base machine image is delivered over the network, giving the worker access to a virtual computer, and all of that worker's information is stored on a central server.
Home computer users come across infrastructure virtualization all the time, even though many are unaware of it. Programs that run through web portals without any form of installation are usually virtualized, often to reduce transmission lag and improve performance. Applications originally designed to run on one operating system (OS) can now run on multiple systems thanks to virtual operating system wrappers. These programs run within a host program and convert input and output into the form the underlying operating system expects. This is especially common when moving PC-based games to other systems.
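The snippet below is a toy illustration of that translation idea, not a sketch of any real wrapper such as Wine; the function name and the "/mnt" mount root are invented for the example. It shows one small piece of the job: mapping Windows-style file paths from a guest application into the POSIX form a Unix-like host expects.

    # Toy sketch of one job an OS wrapper performs: translating Windows-style
    # file paths from a guest application into POSIX paths on the host.
    # Illustrative only; real wrappers intercept far more than file paths.
    from pathlib import PurePosixPath, PureWindowsPath

    def translate_path(windows_path: str, mount_root: str = "/mnt") -> str:
        """Map a path like 'C:\\Games\\save.dat' to '/mnt/c/Games/save.dat'."""
        win = PureWindowsPath(windows_path)
        drive = win.drive.rstrip(":").lower()  # 'C:' -> 'c'
        rest = win.parts[1:]                   # everything after the drive anchor
        return str(PurePosixPath(mount_root, drive, *rest))

    print(translate_path("C:\\Games\\save.dat"))  # -> /mnt/c/Games/save.dat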