[By Alix Willemez]
Do you know where your computer data is stored? Servers used to be located within companies. According to Didier Renard, director of Cloudwatt, a French company specializing in cloud computing founded in 2012 by Orange and Thales, “in the early 1990s, when computer engineers drew network architecture diagrams showing where to place equipment such as servers, they drew a cloud wherever a connection left the corporate network.”
What was then the exception has now become the rule. In most cases, IT infrastructure is outsourced: we save our documents online on services like Google, iCloud and Dropbox, and for a few euros can buy megabytes of storage. The cloud is clearly essential to the functioning of the Internet.
Unfortunately, these outsourced data centers contain many servers that generate a lot of heat. If the temperature climbs too high, the servers fail to function properly. That is why they are usually located in very cold places such as Alaska and Finland (Google) or Sweden (Facebook).
Image courtesy ABB
Facebook also created a data center in Clonee, Ireland, where there are significant wind energy resources, and Google based its servers in Hamina, Finland, where the company uses seawater for cooling. The goal is always to reduce the energy cost of cooling the servers.
These data centers also consume a lot of electricity: while they enable strong economic growth, they require large amounts of power for their operation and cooling. Those located in France account for 9 percent of the country’s total electricity consumption. In the US, data centers owned by Google and Facebook consume as much power as a city of 250,000 inhabitants.
The Natick Project and the Leona Philpot Prototype
Eager to reduce energy consumption in its data centers, market leader Microsoft has been working for three years on a project called Natick, which seeks to determine the technical feasibility of a new type of data center submerged on the ocean floor. A first prototype weighing 17 tons, the Leona Philpot (named after a character in the Xbox game series Halo), was submerged at a depth of over thirty meters off the California coast. The idea originated in a February 2013 paper by Sean James, a Microsoft employee and former submariner in the US Navy. The paper caught the eye of Norm Whitaker, another Microsoft executive, who had served at DARPA (Defense Advanced Research Projects Agency). In 2014, Whitaker created a team within Microsoft’s NEXT unit (New Experiences and Technologies) to launch Project Natick, which represents both a technological and an energy challenge.
During its 2015 deployment, the Leona Philpot prototype was powered by onshore electricity, but the plan is for future versions to harness tidal and wave energy for their power supply. In addition, at thirty meters deep, the cold surrounding seawater would lower the data center’s temperature and prevent overheating.
Finally, these data centers would be located near the coast, where most of the world’s population lives (50 percent of the US population lives within 200 kilometers of the sea), which would accelerate data transmission. This is why the Leona Philpot was placed one kilometer off the coast between August and November 2015.
Microsoft’s aim is therefore to shorten the distance between where data is stored and its final users, and thus reduce latency, the time data takes to travel from source to destination.
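To see why distance matters, here is a rough, back-of-the-envelope propagation estimate (the figures and function below are illustrative assumptions, not from the article): light travels through optical fiber at roughly two-thirds of its vacuum speed, about 200,000 kilometers per second.

```python
# Illustrative sketch: round-trip propagation delay over optical fiber.
# Assumes signals travel at ~200,000 km/s in fiber (about 2/3 of the
# speed of light in a vacuum). Real-world latency also includes routing
# and processing overhead, which is not modeled here.

SPEED_IN_FIBER_KM_PER_MS = 200.0  # 200,000 km/s = 200 km per millisecond

def round_trip_ms(distance_km: float) -> float:
    """Round-trip propagation delay in milliseconds over fiber."""
    return 2 * distance_km / SPEED_IN_FIBER_KM_PER_MS

# A coastal user 200 km from a submerged data center, versus a user
# served by a distant inland facility 2,000 km away:
print(round_trip_ms(200))    # 2.0
print(round_trip_ms(2000))   # 20.0
```

Even on this idealized model, moving storage ten times closer cuts the propagation component of latency tenfold, which is the intuition behind siting data centers near coastal populations.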
Project Natick; image courtesy Microsoft
According to Whitaker, it would be possible to deploy a submarine data center within 90 days versus two years for a land-based data center. This could allow a rapid response to demand, particularly during natural disasters or when organizing a large event such as the World Cup.
After a deployment cycle of five years, which corresponds to the lifespan of the computers it carries, the Leona Philpot would be retrieved, its computers replaced, and the prototype returned to the water. A submarine data center could thus run for ten to twenty years without the need for on-site personnel before being recovered and recycled. But making a data center fully resistant to the marine environment presents major technical difficulties.
What could Microsoft do in case of a failure, breakdown or leak in the prototype? The usual parameters must also be taken into account: currents, corrosion, wildlife, marine traffic, pressure, humidity, and so on. The Leona Philpot is equipped with about a hundred sensors to track the daily state of the data center and the computers inside it, but the slightest incident could turn into a real rescue mission.
Testing of the prototype will be extended. This first, conclusive test has allowed Microsoft to begin construction of three similar prototypes, with further tests planned for 2016, particularly in Florida and the North Sea. The research group is designing a submarine system three times larger, coupled to an alternative energy system that has not yet been chosen.
What Environmental Impact?
The Microsoft team says that marine life quickly adapted to the presence of the prototype, but some observers worry that placing data centers in the ocean could warm it. The Natick program could perhaps help Microsoft improve its standing in Greenpeace’s ranking of data centers by environmental impact.
Alix Willemez previously served as a French Navy’s Deputy Bureau Chief for State Action at Sea, New Caledonia Maritime Zone and as a policy advisor to the New Zealand Consul General in New Caledonia. She now works in the marine renewable energy sector and is currently writing her PhD on the laws regarding the exploitation of marine energies and deep sea minerals.
This article appears courtesy of CIMSEC, and may be found in its original edition at http://cimsec.org/an-underwater-cloud/23466.