Enthusiasts have been water-cooling PCs and even enterprise servers for years.
So surely water-cooling an entire data centre by dropping it into the ocean can’t be that ridiculous an idea… can it?
Well, it’s not.
Because that’s exactly what Microsoft did when they deployed Project Natick Phase 2 off the coast of the Orkney Islands in Scotland.
I was intrigued and naturally wanted to learn more, to understand:
- What are the technical and ecological benefits?
- Is it a viable business model?
- Can it become a repeatable solution?
… and those are the questions that this blog post will aim to answer.
With some help from Jules Verne…
Twenty Thousand Leagues Under the Sea
Something instinctively stood out for me. I couldn’t help but draw parallels to Jules Verne’s Twenty Thousand Leagues Under the Sea. The pivotal character in this well-known science fiction novel is known to us as Captain Nemo, though he was later revealed to be Prince Dakkar.
Prince Dakkar was an Indian Prince who journeyed to the depths of the ocean in his submarine – the Nautilus.
The parallels between the Prince, his submarine and Project Natick are central to this post. It’s not just the commonality between the names – Nautilus vs. Natick – but the parallels in the underlying political motivations that drove the creation of both projects:
- Captain Nemo’s mission was driven by his aversion towards imperialism, with the social injustice of the British Empire as his primary antagonist
- Project Natick’s mission is driven by our aversion towards global warming, with our ever-increasing carbon footprint as this project’s primary antagonist
Both creations were born from a drive to change the world, deliver benefit to those around them and leave a lasting footprint on the globe.
And where did they both exist? In the depths of the ocean.
Why does Project Natick exist?
- Data centres have a global annual energy consumption of between 200TWh to 500TWh – that’s quite a range, but it covers the disparity in the reporting and estimation
- This represents between 1% and 2.5% of the world’s energy consumption, which equates to between 0.3% and 0.5% of the world’s carbon emissions footprint
- When you fold the lower end of these estimates into the entire ICT sector – networking, digital devices, televisions and cellular comms – the industry today accounts for approx. 2% of global emissions. This is equivalent to the carbon output of the airline industry!
Cooling a vast array of servers, storage and networking equipment is the largest energy burn for data centres.
Which is why Project Natick exists: to deploy data centres in locations where the requirement for cooling is not just reduced but eliminated, and where power can be sourced from renewable means.
Future Energy Projections
Our data centre energy consumption is bound to increase but there are two schools of thought here:
- Data centre providers argue that compute is becoming more efficient and energy demand will be steady as our data requirements grow
- Environmentalists are projecting an 8-fold increase in power consumption in as little as 5 years
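To put that environmentalist projection in perspective, an 8-fold increase over 5 years implies a compound annual growth rate of roughly 52% per year. A quick sketch of the arithmetic:

```python
# An 8x increase over 5 years implies a compound annual growth rate of
# 8**(1/5) - 1, i.e. the per-year factor that multiplies to 8x overall.
factor, years = 8, 5
annual_growth = factor ** (1 / years) - 1
print(f"Implied annual growth: {annual_growth:.0%}")  # roughly 52% per year
```

That rate dwarfs the flat-demand scenario the data centre providers argue for, which is exactly why the two projections diverge so sharply.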
It’s not surprising that there is a polarised view between these two communities. Nor is it surprising that there is such a disparity in the current actuals.
However, whatever side of the spectrum you lean towards, it’s largely irrelevant. Just measuring ourselves against today’s emissions – even at the lower end – is enough to justify why we need to act now.
What’s the specification of the submarine-like vessel?
- The unit comprises two core components, a pressure vessel and a subsea docking structure
- It’s the approximate size of an ISO shipping container, the ones that we typically see on the back of a lorry
- The payload in this pressurised vessel is 2 racks with 864 standard Microsoft data centre servers and 27.6 PB of storage
- It has a maintenance-free life span of 5 years and has the data centre designation of Northern Isles – SSDC-002
What are the environmental benefits of the Project?
- It’s purposefully positioned in the EMEC – European Marine Energy Centre – around the Orkney Islands
- The EMEC is the world’s largest site for wave and tidal based power, so the data centre runs entirely from 100% renewable energy
- The vessel uses a saltwater cooling system adapted from a submarine
The operational power demand for the project is entirely carbon neutral.
What are the business benefits of the Project?
Business Benefit #1 – Latency
With more organisations moving and deploying services into the cloud, the physical distance from networking hubs/offices to cloud-based applications or data stores can be problematic:
- The fibre optic cables that transmit data are limited to the speed of light. The further the data has to travel the longer it takes.
- You might wonder why anything on Earth would need to travel faster than the speed of light. Well, in many data scenarios – especially synchronous data processing, where data writes must be acknowledged at the receiving end before processing can continue – this delay can be performance limiting.
- The advent of Machine Learning and Artificial Intelligence, coupled with the fact that 50% of the world’s population lives by the sea, will increase the demand for data to be physically closer to people and remote devices.
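The speed-of-light point is easy to quantify. Light in fibre travels at roughly two-thirds of its speed in a vacuum (a rule-of-thumb figure based on the refractive index of glass; the distances below are illustrative, not any provider’s actual routes):

```python
# Rough round-trip propagation delay over fibre. Light in glass travels
# at ~2/3 of c due to the fibre's refractive index (~1.5).
C_VACUUM_KM_S = 299_792                  # speed of light in a vacuum, km/s
C_FIBRE_KM_S = C_VACUUM_KM_S * 2 / 3    # effective speed in fibre

def round_trip_ms(distance_km: float) -> float:
    """One-way distance in km -> round-trip propagation delay in ms."""
    return (2 * distance_km / C_FIBRE_KM_S) * 1000

for km in (100, 1_000, 5_000):
    print(f"{km:>5} km -> {round_trip_ms(km):.1f} ms round trip")
```

Roughly 1 ms of round-trip delay per 100 km of fibre, before any switching or processing overhead, which is why halving the physical distance to a coastal population matters for synchronous workloads.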
Business Benefit #2 – Time to Deploy
This is where the story gets really interesting. This project took only 90 days to build and drop into the ocean. Deploying a whole data centre in less than 3 months is an incredible turnaround time. It’s often taken me 3 months to just get servers purchased, racked and deployed into an existing data centre. This solves two problems for cloud providers:
- Planning consent is likely to be quicker and easier to achieve in comparison to the planning and build of a new on-site facility
- Acquisition time and variable purchasing costs are eliminated, as many hyperscalers have been increasing their cloud footprint by negotiating the procurement of existing data centres
Business Benefit #3 – Reliability
This model is going to drive more focus on the reliability and redundancy of hardware. For the hardware geeks out there, MTTF – Mean Time To Failure – will have to be greatly increased.
- As system architects, we usually design for a 5-year lifecycle
- However, traditional deployments are rarely maintenance-free within that time frame
- When a datacentre in the ocean has a 5-year maintenance cycle, the equipment within must have sufficient reliability and redundancy to avoid re-floating, servicing and resubmerging
- Having to physically maintain the payload within its lifecycle is unlikely to be economically viable
- This entirely changes the focus around architecture design and reliability in the minds of engineers, architects and manufacturers
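To see why MTTF matters so much here, a toy reliability sketch helps. Assuming a simple exponential failure model (my assumption for illustration, not Microsoft’s actual figures), the probability a server survives the full 5-year maintenance-free window is exp(-t/MTTF):

```python
import math

# Toy reliability model (assumed exponential failures, illustrative
# MTTF values): probability a server runs t years without failing.
def survival(t_years: float, mttf_years: float) -> float:
    return math.exp(-t_years / mttf_years)

servers, lifespan = 864, 5   # Natick Phase 2 payload, 5-year cycle
for mttf in (25, 50, 100):
    p = survival(lifespan, mttf)
    expected_failures = servers * (1 - p)
    print(f"MTTF {mttf:>3}y -> {p:.1%} survive {lifespan}y, "
          f"~{expected_failures:.0f} of {servers} expected to fail")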
Deployment of such vessels is not limited to cold locations. You only need to drop a container to 200m below sea level – even in tropical climates – to leverage the same cooling benefits.
It also opens up a vast gateway into the unknown around data sovereignty. Using the UK as an example, The Crown Estate can only exercise territorial jurisdiction out to 12 nautical miles.
The pandemic has shifted our use of physical cash towards digital payments, and Bitcoin alone consumes 0.33% of global electricity. If the trend towards digital currency continues – which I’m sure it will – then this is another factor likely to drive energy demand upwards.
Is there a future business model here?
If this data centre concept is going to be economically viable, then the payload will need to stand the test of time. Redundancy and reliability for maintenance-free operation is a key requirement.
Ironically, I drafted this post before Microsoft refloated the vessel last week. Though Microsoft stated a lower failure rate compared to equivalent land-based deployments, it’s far too early in the lifecycle of this programme to make predictions. There are no other independent studies – not at this scale – and there are indeed a set of corporate optics that the marketing team at Microsoft must align to.
Captain Nemo’s Nautilus, even as a work of fiction, was an engineering achievement of epic proportions. Verne himself described the Nautilus as “a masterpiece containing masterpieces”, a testament to his imagination and scientific foresight, both unprecedented at the time.
It’s the same degree of creativity that will drive similar creations to Project Natick. A containerised deployment where cloud providers could allow customers to design and deploy their own subsea datacentres.
Imagine customising a payload, having it fitted, the vessel sealed shut and dropped to the bottom of the ocean, only to resurface for a payload refit in 5 years. All this in 90 days, with a whole array of opportunities to scale out by simply buying more containers.
This concept as a commoditised product would be an achievement as grand as when Verne penned the Nautilus on paper. It’s an exciting space to watch, not only for the environmental benefits but to support the imminent increase in demand for edge computing.