France: Virtualization

Virtualisation 2.0 pour les nuls (en anglais) — "Virtualization 2.0 for Dummies (in English)"

Issue link: http://hub-fr.insight.com/i/532074

Page 13 of 87

2013. By 2015, respondents expect to virtualize more than 70 per cent of their x86 servers. Virtualization has clearly gone mainstream in the 21st century, but why? How did we reach a point where virtual machines and resources are everywhere, propagating like mad?

In the Beginning: The Mighty Mainframe

The data center as we know it today evolved from the mainframes of the 1950s. Back then, mainframes were housed in large, climate-controlled, secure facilities with sophisticated power and cooling systems. Mainframes are complex, expensive and powerful, and after 50 years, they still play a role in today's computing hierarchy.

By the 1990s, many mainframes were being replaced by server rooms, where banks of servers were connected, powered, cooled and maintained onsite. Server rooms brought improvements over mainframes, allowing a more modular approach to provisioning resources and handling increased data growth. Like mainframes, though, server rooms are complex and expensive, and can generate enough heat to warm an entire building.

In the 2000s, offsite colocation facilities entered the scene, allowing companies to house their servers and data in multiple locations. It was the beginning of the modern data center: Data started flowing to servers that might be offsite or onsite, creating a level of abstraction between a company and its data. That trend continues today with cloud-connected, virtualized data centers.

Like the early mainframes, today's virtualized data center comes in all shapes and sizes. In fact, it's misleading to refer to 'the' data center as though a single blueprint exists. It doesn't.
