Do cloud applications need an operating system?
Microsoft claims that Windows Server 2012 is the definitive cloud OS; Red Hat claims the same. But is the term outmoded?
There is a theory in software application development circles that developers programming for the cloud should rethink the way their apps integrate with (and ultimately depend upon) the underlying operating system.
Actually, that sentence should read: developers should rethink the way their apps integrate with (and ultimately depend upon) the underlying operating system, or whether they need to depend on it at all.
But would a cloud application work without an underlying operating system to support it?
If we accept that the cloud model disaggregates traditional computing structures into independent units with the power for discrete, controlled, selectively manageable deployment, then a new opportunity for dynamism must surely exist.
If we also accept that a cloud application might not necessarily need to be positioned via the host machine (the hypervisor and virtual machine deployment route), then we could imagine some type of management software controlling the host processor and resources (rather than the hypervisor) and "bundling" the app to a state of cloud deployment with access to the functional resources that it needs.
In the above conceptual scenario there is no cloud operating system and no virtual machine.
Software in this environment could also be engineered to share a symbiotic cross-application interdependency. The cloud itself would ultimately become disconnected from legacy structures, so much so that the much-vaunted promises of agility and flexibility are finally and completely delivered upon.
Of course, this is not what is happening right now: the Linux Foundation reports healthy sales of Linux-based servers in its Linux Adoption 2012 survey. These are companies buying servers, building towards the cloud and using an operating system. Simple.
The Red Hat Enterprise Linux team sits (unsurprisingly) in the ‘cloud needs a traditional operating system’ camp. The team blog reads, “Linux is the original cloud OS. Its popularity as a cloud architecture remains high today, and we expect it to continue to be the cloud OS. Unlike competing OSes, Linux was born on the Internet, making it a natural fit for cloud.”
But could a shake-up of immense proportions actually occur? Could the cloud operating system crumble away as applications become a sort of all-powerful focus for both enterprise and personal use (if they haven’t already, that is)? Could this mean that our cloud application structures start to spawn their own virtual deployment life support systems?
For an answer, let’s look at the market.
Windows Server 2012 was officially announced back in September 2012, and Microsoft’s Satya Nadella described it at the time as a "cornerstone" of the cloud OS, with Azure providing one consistent platform across private, hosted and public clouds.
"The operating system has always been the heartbeat of IT and is now undergoing a renaissance in the new world of continuous cloud services, connected devices and big data," said Nadella.
Microsoft has claimed to have a "unique legacy" spanning operating systems, applications and cloud services. But unique legacies don’t always count for everything, especially in the fast-moving cloud computing market.
The cloud is a radical space of still-nascent, mould-breaking new architectures, is it not? We’re just sayin’ …