The Internet of Things could spell doom for the datacentre, with enterprises and service providers struggling to match their capacity to the untold volumes of data it could generate.
By 2020 there will be around 5,200 GB of data for every person on the planet, according to CIO strategy advisor Tim Crawford. In future, the smallest unit of data under discussion will be the petabyte, not the gigabyte or megabyte, he added.
“Data is a huge problem”, he told journalists at Monaco’s Enterprise Cloud Forum. “Our traditional datacentres, platforms and applications simply can’t handle the Internet of Things.”
The Internet of Things poses a problem organisations are already encountering today, said Crawford, forcing them to rapidly reorganise their services and platforms.
Pointing to the US airline Southwest Airlines, he said the engines alone of the company’s 600 Boeing 737s spin off around 256 PB per day from more than 5,000 onboard data points. This information is imperative for modelling and analysing engine performance, but requires CIOs to sift through an enormous amount of output.
“If you can’t leverage data using visualisation and automation, you’ll already be behind the curve,” he added.
It's not all doom and gloom, though. Datacentres survived virtualisation; they can survive this, Andrew Roughan, product and marketing director of London-based datacentre provider Infinity SDC, told Cloud Pro.
“I think the market is organic enough to react to new challenges,” he said. “Ten years ago virtualisation was just an idea and now almost all mature enterprises will be 50-70 per cent virtualised.
“Datacentre requirements have gone from 100 per cent to 30 per cent almost overnight, which has freed up a large amount of space we can use in order to handle whatever the Internet of Things produces,” he added.