Cloud security is very much up in the air. If anything, the question of security in new virtualised cloud models has overshadowed what might have been an even more meteoric ascendancy, had those service-based data uncertainties not existed in the first place.
The problem here is that the marketplace is too fluid and doesn't know its left foot from its right. The Cloud Security Alliance (CSA), after all, exists to “promote the use of best practices” for providing cloud security assurance. Its self-stated remit is “to promote”, rather than any more actionable mandate with the power to sanction (and therefore endorse) a solid cloud security assurance standard.
But is the problem here the fault of the security bodies, or does the responsibility lie with cloud service providers as they exist today? Perhaps we should even lay the blame at the feet of the commercial organisations and individuals who are worried about data security in the first place. Could some of these misgivings stem from an erroneous, possibly even invalid, perspective?
Does it actually matter if the US government can access our personal cloud files? As previously reported on Cloud Pro, this puts us at roughly the same risk of wider data loss as a traditional local data centre or corporate network environment would. Will companies find their cloud-based accounting ledgers hacked, copied and then passed off as some other firm's work? Again, as discussed on Cloud Pro, this is “possible but improbable” in reality.
A distributed apportionment of blame
Whatever the real severity of these threats, a much-needed set of cloud security insurance and assurance standards is required if cloud is to fulfil its widest potential. If we can move forward with some distributed apportionment of blame, then perhaps we can start to lay down some responsibilities.
Cloud providers themselves are generally happy to tell us that they are there to provide cloud access with application processing and storage functionality. That is, they are not security specialists, so customers should apply the same level of security controls to their cloud data as they would to any terrestrial data.
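What might "the same level of security controls" look like in practice? As one minimal, hedged sketch (the key, data and function names below are illustrative assumptions, not any provider's API), a customer can keep an integrity check on their side of the fence, tagging data with an HMAC before upload and verifying it on retrieval, exactly as they might for files on a local file server:

```python
import hashlib
import hmac

# Hypothetical shared secret held only by the customer, never by the provider.
SECRET_KEY = b"shared-secret-held-only-by-the-customer"

def tag_for(data: bytes) -> str:
    """Compute an HMAC-SHA256 tag over data before it leaves our network."""
    return hmac.new(SECRET_KEY, data, hashlib.sha256).hexdigest()

def verify(data: bytes, expected_tag: str) -> bool:
    """After retrieving data from the cloud, check it against the stored tag."""
    return hmac.compare_digest(tag_for(data), expected_tag)

ledger = b"example accounts-payable ledger contents"
tag = tag_for(ledger)  # stored locally alongside the upload record

assert verify(ledger, tag)             # untouched data passes
assert not verify(ledger + b"x", tag)  # any modification is detected
```

The point is not this particular mechanism, but that the control lives with the customer regardless of where the bytes are hosted.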
So what are bodies like the aforementioned CSA, the European Telecommunications Standards Institute (ETSI) and the European Network and Information Security Agency (ENISA) doing to promote and support EU-wide certification schemes for trustworthy cloud computing - and do the cloud providers follow current developments?
ETSI confirms that there is a "jungle" of existing cloud standards out there to navigate through. The organisation says it is working to identify those standards worth adhering to based upon their ability to deliver on three factors: interoperability, data portability and reversibility.
It is this data portability element that may prove to be the most telling and yet most basic factor. Much as Mac and PC users were once unable to open each other's Microsoft Word documents, the most fundamental facilitating technology behind cloud data portability is an exchangeable document format. So it is that the Open Virtualisation Format (now at version OVF 2.0) has come about.
Cloud-standards.org describes OVF as, “A specification to describe an open, secure, portable, efficient and extensible format for the packaging and distribution of software to be run in virtual machines.”
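At its heart, an OVF package is built around an XML descriptor listing the files, disks and virtual systems it contains. As a rough, hedged illustration of that packaging idea (the element names follow the flavour of the DMTF schema, and the namespace URI here is an assumption, not a spec-complete 2.0 descriptor), such a descriptor can be assembled with nothing more than the Python standard library:

```python
import xml.etree.ElementTree as ET

# Assumed namespace URI for illustration; consult the DMTF spec for the real one.
OVF_NS = "http://schemas.dmtf.org/ovf/envelope/2"

def build_descriptor(disk_file: str, system_id: str) -> str:
    """Build a minimal OVF-style envelope referencing one disk and one VM."""
    ET.register_namespace("", OVF_NS)
    envelope = ET.Element(f"{{{OVF_NS}}}Envelope")
    refs = ET.SubElement(envelope, f"{{{OVF_NS}}}References")
    ET.SubElement(refs, f"{{{OVF_NS}}}File", {"id": "file1", "href": disk_file})
    system = ET.SubElement(envelope, f"{{{OVF_NS}}}VirtualSystem", {"id": system_id})
    info = ET.SubElement(system, f"{{{OVF_NS}}}Info")
    info.text = "A single packaged virtual machine"
    return ET.tostring(envelope, encoding="unicode")

print(build_descriptor("appliance-disk1.vmdk", "example-appliance"))
```

Because the descriptor is plain, schema-governed XML, any conforming platform can parse it and reconstruct the appliance - which is precisely the portability the standards bodies are after.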
The OVF exists as part of the work that has come out of the Cloud Standards Customer Council, a group formed by founding enterprise members including IBM, Kaavo, CA Technologies, Rackspace and Software AG. More than 400 of the world's leading organisations have joined this council, including Lockheed Martin, SAP, Citigroup and Fujitsu. That's great news, isn't it? No, it's not. A mere 400 firms is a drop in the ocean if cloud computing safety standards are going to be ironed out, as big as these founding member firms are.
No visible signs of accord
There is a real sense that things are getting crowded with no visible signs of accord and solidarity. In addition to the already mentioned handful of working bodies, community initiatives, special interest groups and perhaps even sponsored walks all focused on cloud safety certification, there are plenty of other factions to consider besides.
Clearly we have some way to go on the road ahead towards the standard secure certified cloud dream. But where did all the security concerns start?
Cloud computing has been mired in so-called FUD (Fear, Uncertainty and Doubt) ever since high-profile incidents such as the Chinese hacking attack on Google's infrastructure in late 2009, after which Google withdrew from China. That kind of story makes headlines. Enterprise attitudes have evolved considerably since then, but widespread pockets of caution still exist across this seemingly fragmented landscape.
The problem is that public clouds are not inherently engineered with an architecture that speaks fluently of robust authentication and encryption technologies. This is not their initial raison d'être; they were born to connect, host, process, store and repeat until fade. Where public clouds do exhibit security functionality it is in the related areas of ID federation, API security controls and virtualisation privacy, all of which need to mature beyond their state as we stand in 2013 if they are to contribute more tangibly to the security debate.
Ovum senior analyst for software IT solutions Laurent Lachal points out that based upon the resources at their disposal in terms of technology and expertise, public cloud providers claim that they are “at least as secure” as most internal datacentres.
Specialisation, homogeneity, automation and centralisation
“They point out that the specialisation, homogeneity, automation and centralisation that public clouds offer increase security. They can remain tightly focused on their particular offering, which they can easily and rapidly update or upgrade in case of a problem, whereas enterprise IT staff have to be more general,” Lachal says.
“Providers also rightfully assert that, while some public clouds have been used as platforms for malware, they are mostly used as platforms for a new generation of security-related services (such as identity management services, disaster recovery, antivirus and application security testing) to secure both private and public clouds,” he adds.
In a whitepaper entitled Cloud-computing quality of service in perspective, Lachal lays a weight of accountability upon the shoulders of a variety of players throughout the cloud stack. While IaaS service providers should shoulder responsibility for the security of the compute, storage and network resources they provide, the Ovum analyst then reasons that customers are in charge of the rest – namely, securing virtual images and their content (operating system, applications and data).
According to Lachal’s apportionment of responsibility: “Developers need to build their virtual stacks carefully, removing any unnecessary software and opting for a closed configuration policy by default. System administrators need to keep an eye on their applications to detect any sudden changes of traffic pattern that might indicate a security breach.”
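The traffic-monitoring duty Lachal assigns to system administrators can be sketched very simply. The following is a hedged illustration only (the window size, threshold and sample figures are assumptions, not anything Lachal prescribes): keep a rolling baseline of recent traffic and flag any sample that deviates sharply from it.

```python
from collections import deque
from statistics import mean, pstdev

# Illustrative tuning values, not a recommendation.
WINDOW = 10            # how many recent samples form the baseline
THRESHOLD_SIGMAS = 3.0  # how far outside the baseline counts as "sudden"

def detect_spikes(samples):
    """Yield (index, value) for samples deviating sharply from the rolling baseline."""
    window = deque(maxlen=WINDOW)
    for i, value in enumerate(samples):
        if len(window) == WINDOW:
            baseline, spread = mean(window), pstdev(window)
            if spread > 0 and abs(value - baseline) > THRESHOLD_SIGMAS * spread:
                yield i, value
        window.append(value)

# Steady traffic (say, requests per minute) with one abrupt surge
traffic = [100, 102, 98, 101, 99, 103, 100, 97, 102, 100, 2500, 101, 99]
print(list(detect_spikes(traffic)))  # → [(10, 2500)]
```

A real deployment would feed this from live metrics and alert rather than print, but the principle - a sudden change of traffic pattern as a possible breach indicator - is exactly the one Lachal describes.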
Taking these arguments full circle, we might also argue that countries should put increased national-level cloud security stipulations in place to ensure that firms operate within controllable data parameters.
If this higher-level control can be matched by Lachal's law of shared cloud security responsibility, then perhaps we can integrate it with the pick-and-mix assortment of still-emerging standards in this space to eventually lock down cloud security.