Research institute CERN is set to open a new datacentre to cope with the volume of data that it has to process and store.
The organisation is looking to isolate the Higgs boson, widely regarded as a missing piece in the jigsaw explaining the origin of the universe, and it has a Big Data problem. Or rather, it has a massive data problem.
Speaking at the Structure Europe conference, Tim Bell, CERN's head of infrastructure, explained that when the Large Hadron Collider (LHC) is operational, particle collisions are filmed by 100-megapixel cameras taking 40 million pictures a second, leading to the creation of 1 petabyte of data every second.
“Our challenge is that 35 petabytes of that data needs to be recorded every year, and the Large Hadron Collider will double that,” he said.
“And the scientists want to keep that data for 20 years,” he added, for good measure.
To cope with the additional data demands, CERN is moving some of its functionality to a new Hungarian datacentre at Wigner, near Budapest. This will provide an additional 2.5MW on top of the 3MW capacity of the existing CERN datacentre.
The new facility already has about 500 servers, 20,000 computing cores, and 5.5 petabytes of storage available to CERN researchers.
Bell said that by 2015, when the LHC is switched on again, the organisation will be running 90 per cent of its computing in the private cloud.
Bell said three lessons had been learned in expanding beyond the datacentre: don't listen to people who counsel against haste (“You can't do transformation in transitional stages”); move away from silos; and treat people as part of the process.
“You have to include the people who check the lights are on in the morning and are the last to leave at night,” Bell added.