Is the Future of Data Centers Under the Sea?

Several companies are putting data centers underwater. Are these submerged data centers just a novelty -- or is data management really better down where it’s wetter?

Richard Pallardy, Freelance Writer

November 28, 2023

At a Glance

  • Microsoft staged the first large-scale underwater data center experiment beginning in 2015.
  • The inert gases and liquids used to fill underwater data centers are less corrosive than ambient air.
  • Probably the most compelling factor in favor of siting data centers in the deep is the non-existent cost of cooling.

Energy use by data centers is a growing concern. From powering the servers themselves to cooling the facilities that house them, these operations are likely to be an increasing drain on the power grid. Data centers account for anywhere from 1% to 3% of global energy use.

Numerous efficiencies have been devised, from throttling the CPUs of servers that are not in use to siting data centers in regions less likely to need artificial cooling to altering the architecture of the buildings that house the equipment. Still, it is estimated that these approaches may reduce energy consumption by only around 10%.

Now, some companies are putting their data centers under the sea. Microsoft staged the first large-scale underwater data center experiment beginning in 2015 with its Project Natick initiative. Several smaller companies have followed suit and have begun marketing their services.

Some terrestrial data centers, such as one operated by Interxion in Stockholm, have already experimented with the use of seawater to cool their operations. And immersion cooling is gaining traction -- though it is still energy-intensive to circulate contained pools of water.

While actually putting data centers underwater may sound like a gimmick, this seemingly impractical approach has much to recommend it.

For one, a limitless supply of cold ocean water is available to cool servers as they churn data -- a process that generates an enormous amount of heat. Cooling is believed to account for around 40% of a data center's electricity use. Overheating can significantly reduce servers' effectiveness and may result in failure. Given that energy costs on the whole amount to some 70% of operating costs in IT, any reduction is a win.
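
As a rough illustration of what that 40% share means in dollar terms, the back-of-envelope arithmetic below assumes a hypothetical $10 million annual power bill; the figures are invented for the example, not drawn from any operator.

```python
# Back-of-envelope estimate with invented figures.
annual_electricity_cost = 10_000_000   # hypothetical yearly power bill, in dollars
cooling_share = 0.40                   # cooling's ~40% share of electricity use, cited above

cooling_cost = annual_electricity_cost * cooling_share
print(f"Cooling spend: ${cooling_cost:,.0f} per year")
# If passive seawater cooling drives that share toward zero, the potential saving
# approaches the full cooling spend -- here, roughly $4 million per year.
```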

Further, reductions in energy use equate to decarbonization, an increasingly crucial goal for data center operators due to both the importance of environmental, social, and governance (ESG) scores in obtaining investment and looming legislative demands. Underwater data centers are particularly amenable to renewable sources, such as wind and wave energy, that are abundant in marine environments.

Here, InformationWeek investigates the history of these pioneering data centers -- and whether they will be practical in the long term -- with insights from Owen Williams, a technical manager with underwater data center operator Subsea Cloud.

The First Experiments

Microsoft was among the first to experiment with underwater data centers -- to the tune of approximately $25 million. The idea was first raised in a white paper presented at a 2014 ThinkWeek event, during which employees are encouraged to advance unconventional ideas. It was soon put into action.

The company launched Phase 1 of Project Natick in August of 2015. Microsoft submerged a 10-foot-long cylindrical structure containing a single rack of 24 servers off the coast of California near San Luis Obispo; the rest of the space was filled with load trays to generate heat. The vessel sat 30 feet below the surface of the Pacific and was retrieved in November of that year. This phase was essentially a proof of concept -- a test of whether the cooling system could effectively moderate the temperature within the pod.

Phase 2 kicked off in June 2018, when a larger data center was sunk off the coast of the Orkney Islands in Scotland, near the European Marine Energy Centre (EMEC), which tests wave and tidal energy systems.

This data center was much larger than the one used in the initial trial. At 40 feet long by 10 feet wide, the steel cylinder was roughly the size of a shipping container and contained 12 racks housing 864 servers. Constructed by French naval defense company Naval Group, it was submerged at a depth of 117 feet.

The system was powered by a combination of sustainable sources -- onshore wind and solar as well as tidal and wave energy. And the cooling system drew on technology already in use on submarines, pulling cold seawater into a radiator system at the back of the unit and then expelling the warmed water back into the ocean. The container was filled with nitrogen gas, which is less corrosive than oxygen.
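
The standard relation Q = m × c × ΔT gives a feel for how little seawater such a heat exchanger needs. The sketch below assumes a per-server power draw of about 250 W and a 5°C allowable rise in water temperature; both numbers are assumptions for illustration, not Project Natick figures.

```python
# Rough sizing of a seawater cooling loop (assumed figures, not Project Natick specs).
heat_load_w = 864 * 250     # 864 servers at an assumed ~250 W each -> ~216 kW of heat
cp_seawater = 3990          # specific heat of seawater, J/(kg*K)
delta_t = 5.0               # assumed allowable rise in water temperature, K

mass_flow_kg_s = heat_load_w / (cp_seawater * delta_t)
print(f"Required seawater flow: ~{mass_flow_kg_s:.1f} kg/s, roughly {mass_flow_kg_s:.0f} liters per second")
```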

The unit was retrieved two years later, in July 2020. While a number of servers failed, the operators claim that the rate of failure was an eighth of that observed in land-based data centers.

Microsoft has declined to comment on its much-anticipated Phase 3, offering only this boilerplate statement: “While we don’t currently have data centers in the water, we will continue to use Project Natick as a research platform to explore, test, and validate new concepts around data center reliability and sustainability.”

Potential Benefits of Underwater Data Centers

One of the most appealing aspects of underwater data centers is their proximity to large population centers. Around half of the world’s population lives within 125 miles of a coastal area. Situating data centers near coastal population centers would allow for lower latency and more efficient handling of data. This would increase speeds for various digital services.
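
The latency argument comes down to propagation delay: light in optical fiber covers roughly 200 kilometers per millisecond, so every kilometer shaved off the path matters. The sketch below compares a coastal pod with a hypothetical inland facility; the distances are illustrative, not measurements from any deployment.

```python
# Best-case round-trip propagation delay over fiber (ignores routing and server time).
FIBER_KM_PER_MS = 200.0   # light travels ~200 km per millisecond in optical fiber

def round_trip_ms(distance_km: float) -> float:
    return 2 * distance_km / FIBER_KM_PER_MS

print(round_trip_ms(200))    # ~2 ms for a pod ~200 km (125 miles) from its users
print(round_trip_ms(1500))   # ~15 ms for a hypothetical inland site 1,500 km away
```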

Perhaps counterintuitively, the servers themselves might also benefit from being dropped in the drink. The inert gases and liquids used to fill underwater data centers are less corrosive than ambient air, leading to longer lifespans for the equipment. The servers are also protected from the incidental damage caused by everyday human activity -- people banging into them, dropping them, or accidentally unplugging them.

Placing a data center pod or retrieving it for maintenance is fairly simple, according to Subsea Cloud’s Williams. “Let's say the water is 100 meters deep. It's just an hour's job. If it’s 3,000 meters deep, it will probably take five or six hours to get the pod down.”

Further, the vessels that house the servers are modular and relatively easy to manufacture, as are the components used to build the cable infrastructure connecting them to the shore. “We do prefer at least a ring net -- that means two points of contact on the fiber optic side, just for redundancy,” Williams says.

Ocean environments are also rich in potential sources of sustainable energy. Wind, wave, tidal, and even solar plants can provide sufficient energy to power these data centers, reducing greenhouse gas emissions. Some have suggested that purpose-built plants might make them essentially self-sustaining. And backup power from the land-based grid is relatively easy to access through cable connections, which are already necessary for transmitting data.

[Image: Subsea Cloud rendering of one of its pods]

“We do need power out to the units. We take that from land,” Williams expounds. “The perfect scenario for us is to host the data center under a floating wind farm or other renewable source. In those circumstances we can tap off [from land-based power supplies] when there’s high production.”

Probably the most compelling factor in favor of siting data centers in the deep is the non-existent cost of cooling. The ocean is essentially a limitless resource in that regard, particularly at greater depths. Undersea data centers do not draw from freshwater resources needed for other human purposes -- a persistent criticism of land-locked data centers, which sometimes draw directly from potable water supplies. Their water usage effectiveness (WUE) score is zero.
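
WUE, an industry metric, is simply the liters of water a site consumes per kilowatt-hour of IT energy. The comparison below uses invented annual figures for a land-based facility to show why a pod cooled by the surrounding ocean scores zero.

```python
# Water usage effectiveness: liters of water consumed per kWh of IT energy.
def wue(annual_water_liters: float, annual_it_energy_kwh: float) -> float:
    return annual_water_liters / annual_it_energy_kwh

# Hypothetical land-based facility using evaporative cooling.
print(wue(50_000_000, 30_000_000))   # ~1.67 L/kWh
# An underwater pod draws no freshwater at all, so its score is exactly zero.
print(wue(0, 30_000_000))            # 0.0
```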

Potential Downsides of Underwater Data Centers

While placing data centers underwater does have clear benefits, accessibility is a potential problem, especially if they are located in deeper waters or areas that tend to experience rough seas. Even when redundancy is built into the system to avoid the need for frequent maintenance, underwater data centers will periodically need to be hauled to the surface for replacement of malfunctioning servers and parts.

Depending on the size and shape of the structure, housing an optimum number of servers may also prove challenging. Microsoft’s data centers, for example, were cylindrical, while servers are rectangular. Algorithmically designed solutions have the potential to minimize wasted space.

And though the availability of renewable energy in coastal areas is high, it can fail. Wind farms only work when there is wind. Wave energy is most productive with steady wave action, which may be seasonal. And solar energy waxes and wanes with the extent and intensity of sunlight.

Thus, it is generally most efficient to position these data centers near large urban areas. “We need to know what kind of access they would have to power and whether there is connectivity to broadband fiber optic cables nearby,” Williams claims.

It is also preferable to site them near shore for jurisdictional reasons. While operating in the open ocean has its benefits, most clients prefer the legislative protections of placing their data in nationally controlled waters, Williams confides. Permitting can, however, be challenging in some cases.

“We have countries that are willing to expedite this dramatically because they understand the benefits,” he relates. “And then you have countries that are sticking to the rules and letting us go through the process. It can take anywhere from two weeks to five years to get proper permission.”

Ocean water temperatures are also inconsistent. Marine heat waves may necessitate backup cooling systems. Banking cold water has been proposed as one way to mitigate the damage done when an influx of warm water can no longer keep server temperatures in check. Sensors installed on the data centers would determine when the water in these reservoirs is deployed. It may also be possible to implement warm-water cooling using servers equipped with thermoelectric cooling mechanisms.
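
No operator has published its control logic, but the sensor-driven scheme described above boils down to a threshold check: release the banked cold water when intake or server temperatures climb too high. The temperatures and function below are hypothetical, sketched only to make the idea concrete.

```python
# Hypothetical sketch of sensor-driven backup cooling; thresholds are invented.
SEAWATER_ALARM_C = 28.0      # assumed intake temperature at which ambient cooling turns marginal
SERVER_INLET_ALARM_C = 35.0  # assumed maximum acceptable server inlet temperature

def should_release_reserve(intake_temp_c: float, server_inlet_temp_c: float) -> bool:
    """Decide whether to route banked cold water through the heat exchanger."""
    return intake_temp_c >= SEAWATER_ALARM_C or server_inlet_temp_c >= SERVER_INLET_ALARM_C

if should_release_reserve(intake_temp_c=29.1, server_inlet_temp_c=33.4):
    print("Marine heat wave response: releasing banked cold water")
```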

But the potential environmental disruptions caused by even warmer water being emitted during heat waves still need to be explored. Warmer water may contribute to deoxygenation and thus have a deleterious effect on surrounding organisms.

Additional ecological impacts remain unknown, especially since existing underwater projects have been on a relatively small scale. Microsoft’s data center was hauled out of the water encrusted in marine life, suggesting that at least some organisms adapted to the presence of a foreign object in their environment. And Microsoft has patented a data center that might serve as an artificial reef.

In determining the location of a center, the stability of the seafloor must also be taken into account. “Is it an area where there’s relatively low seismic activity?” Williams asks when he and his colleagues are determining an optimum spot for their centers.

Security may be an issue, too. While physically accessing an underwater data center would be challenging for the average criminal, determined attackers might still be able to disrupt service with the right knowledge and equipment. Some have suggested that acoustic attacks could be used to crash hard drives and rattle the data center structures themselves.

Williams says that Subsea Cloud’s data centers are heavily monitored for both human and non-human interference.

“Each of the pods has seismic sensors. There are cameras on all corners. There are lights that we can turn on and off.” If something goes wrong, he says, “We will have a look at the cameras and seismic sensors to see what’s going on.”

Current Efforts to Establish Underwater Data Centers

Though Microsoft appears to have hit the pause button on its underwater projects -- or at least remains tight-lipped about their progress -- several companies are bringing underwater data centers to market.

Subsea Cloud now has several in operation. It has announced the deployment of data centers in the North Sea, near Port Angeles, Washington, and in the Gulf of Mexico. Spokespeople declined to specify exact numbers and locations for all planned facilities.

Unlike Microsoft, Subsea Cloud uses rectangular structures more suited to the shapes of servers and server racks. And rather than nitrogen gas, the structures are filled with an oily, viscous fluid. Some may be deployed at depths of up to 9,850 feet.

“Instead of countering the water pressure with brute force -- thick walls -- we fill the whole pod with fluid,” Williams explains. “Inside the pod, we have the same pressure as the water on the outside and that makes it possible for us to use very thin walls.”
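
The physics behind that claim is hydrostatic pressure, P = ρgh. At Subsea Cloud's maximum stated depth the external pressure is enormous, but a pressure-equalized, fluid-filled pod sees almost no pressure difference across its walls. The calculation below uses rounded constants.

```python
# Hydrostatic pressure at depth: P = rho * g * h (rounded constants).
rho_seawater = 1025.0   # density of seawater, kg/m^3
g = 9.81                # gravitational acceleration, m/s^2
depth_m = 3000.0        # ~9,850 feet, the deepest deployment mentioned above

pressure_pa = rho_seawater * g * depth_m
print(f"External pressure: ~{pressure_pa / 1e6:.1f} MPa, about {pressure_pa / 101_325:.0f} atmospheres")
# Because the pod's internal fluid is at the same pressure, the walls bear
# essentially zero net load -- hence the thin-walled design.
```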

The design, Williams says, obviates the need for additional power to circulate gas, as in the Microsoft design. “In our pod there are absolutely no active parts contributing to the cooling mechanism,” he enthuses.

The company plans to service the equipment every three to five years, though it will observe the schedule set by the client. The pods themselves are designed to last around 25 years and can be reused multiple times.

At least one other company has started dunking servers, too. Beijing Highlander Digital Technology sank its prototype model, containing four racks of servers, in 2021 off the coast of Zhuhai, China. The company uses cylindrical vessels similar to those used in Microsoft's project. It then established a data center comprising 100 more vessels off the coast of Sanya, a city on Hainan Island. That facility began operating in December 2022. Its first two clients were telecommunications companies, which have reported satisfactory performance.

While this novel technology has yet to be adopted on a wide scale, if these early efforts are any indication, everything from our streaming movies to our banking services may be bubbling up from under the waves.

About the Author(s)

Richard Pallardy

Freelance Writer

Richard Pallardy is a freelance writer based in Chicago. He has written for such publications as Vice, Discover, Science Magazine, and the Encyclopedia Britannica.
