As cloud computing continues to thrive and demand for data grows ever greater, Vicky Glynn, product manager at Brightsolid, asks whether Microsoft’s underwater data centre pilot could point the way forward
Microsoft has launched Project Natick, a research project to explore the energy efficiency benefits of operating a data centre under the sea.
Sinking computers into the North Sea sounds almost as barmy as sending a sports car into space.
However, the subsea data storage experiment may offer invaluable insight and intelligence to inform the future of data storage.
Does this ultimately mean we’ll see an emergence and growth of data centre farms at the bottom of the ocean? Probably not. Or at least, not any time soon. The cost of set-up is likely to be prohibitive on any meaningful scale, and it’s unlikely we will have to consider such radical logistics to manage data in future. Developments in quantum computing, with its ability to process more with less, mean we will be able to reduce energy and space requirements.
However, Microsoft’s experiment will offer the chance to explore key issues surrounding data storage and could lead to some not-so-barmy, but highly valuable learnings.
The project’s main goal is to examine the energy efficiency benefits of siting data centres under the sea, where the surrounding water provides a natural cooling source. This seems simplistic – put something in cold water to keep it cool. However, there could be more to this. The coast off Orkney has been chosen as the location for this ground-breaking project, and the islands have a growing reputation for renewable energy production. Combine this local knowledge, expertise and infrastructure with Project Natick and we might start to see some interesting developments around using renewable energy sources to cool data centres.
Cooling is a major consideration and cost factor in data centre management. Currently, air conditioning is the dominant method of cooling data centres. It’s an expensive, energy-hungry approach, but it’s the best option widely available. Building data centres in the sea possibly isn’t going to be a better, less expensive option. However, this project may identify ways to adapt current technology to make better use of sea water and other renewable energy sources.
Likewise, the subsea conditions in this test will potentially present some useful intelligence on how to make equipment more resilient to challenging, less stable environments, such as oil rigs, ships and other sea vessels. The Internet of Things, artificial intelligence and robotics are creating revolutionary technology for the oil and gas and maritime sectors, among others. Building technology that is better suited to sea and salt water environments, that doesn’t degrade under or near water and doesn’t require hands-on maintenance, will potentially create huge opportunities for several industries.
Geography remains a key concern for customers in the market for data storage and support: proximity reduces latency and keeps equipment close enough to maintain and repair. It will be interesting to see what we can learn from a submerged data centre that will run without maintenance for five years. How will it cope without the ability to switch it off and on again?
Game Changer for Data
This is one of the potential challenges of the project, but also one of its biggest opportunities. If the computers break, they can’t be fixed. But what if they don’t break? Could we be looking at the emergence of long-life, highly resilient, self-healing computers? This type of learning will be a game changer for data storage across the world.
As the journey to the cloud accelerates for organisations of all sizes and in all sectors, questions around energy, accessibility, reliability and performance become increasingly important. Whether Microsoft’s experiment proves to be logistically, economically and environmentally sound remains to be seen, but the industry will no doubt be watching with keen interest.
This article is also published on Digit.