While virtualization technology can be traced back to the 1960s, it wasn’t widely adopted until the early 2000s.

The technologies that enabled virtualization, like hypervisors, were developed decades ago to give multiple users simultaneous access to computers that performed batch processing. Batch processing was a popular computing style in the business sector that ran routine tasks thousands of times very quickly (like payroll).

But over the next few decades, other solutions to the many-users/single-machine problem grew in popularity while virtualization didn’t. One of those other solutions was time-sharing, which isolated users within operating systems, inadvertently leading to other operating systems like UNIX, which eventually gave way to Linux®. All the while, virtualization remained a largely unadopted, niche technology.

Fast forward to the 1990s. Most enterprises had physical servers and single-vendor IT stacks, which didn’t allow legacy apps to run on a different vendor’s hardware. As companies updated their IT environments with less-expensive commodity servers, operating systems, and applications from a variety of vendors, they were left with underused physical hardware: each server could only run 1 vendor-specific task.

This is where virtualization really took off. It was the natural solution to 2 problems: companies could partition their servers and run legacy apps on multiple operating system types and versions. Servers started being used more efficiently (or not at all), thereby reducing the costs associated with purchase, setup, cooling, and maintenance.

Virtualization’s widespread applicability helped reduce vendor lock-in and made it the foundation of cloud computing.