Building A Water Cooled Raspberry Pi 4 Cluster

A couple of weeks ago I built a single water cooled Raspberry Pi 4 just to see how well it would work. This was obviously crazy overkill for a single Raspberry Pi, but that isn’t actually why I bought the water cooling kit. I bought it along with 7 other Raspberry Pi 4Bs so that I could try building my own water cooled Raspberry Pi 4 Cluster.

Here’s my video of the build; read on for the write-up:

While water cooling a single Raspberry Pi doesn’t make too much sense, water cooling a whole cluster is a bit more practical. The whole system is cooled by a single 120mm fan, which is significantly quieter than even one small 40mm fan. The water cooling system, while expensive on its own, actually costs a bit less than buying eight individual coolers, one for each Pi.

An Ice Tower is an effective cooling solution for an individual Pi, but it’s quite noisy, and at around $20 each, you’re looking at $160 just to cool the cluster. The water cooling system was only around $85 for the whole kit, blocks, and additional tubing.

Ice Tower On Raspberry Pi

For those of you who don’t know what a Pi cluster is, it’s essentially a set of two or more Raspberry Pis connected together on a local network, working together to perform computing tasks by sharing the load.

Raspberry Pi Cluster

There is usually one Pi designated as the host or master node, which is in charge of breaking the job up into smaller tasks and sending these out to all of the nodes to work on. The master node then compiles the completed results back into a final answer.
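
To make the master/worker idea concrete, here’s a minimal sketch of the pattern in Python using mpi4py (MPI is what the follow-up post’s example script uses, though the code below is my own illustration, not that script):

```python
# Minimal master/worker illustration with mpi4py (illustrative sketch).
# Run across the cluster with something like:
#   mpiexec -n 8 --hostfile hosts python3 demo.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()   # this process's ID (0 = master node)
size = comm.Get_size()   # total number of processes in the job

# The master splits the job into one chunk per process
numbers = list(range(1, 1001))
chunks = [numbers[i::size] for i in range(size)] if rank == 0 else None

# Every process receives its chunk, works on it, and sends the result back
chunk = comm.scatter(chunks, root=0)
partial = sum(n * n for n in chunk)   # stand-in for real work
results = comm.gather(partial, root=0)

if rank == 0:
    print(f"Combined result from {size} processes: {sum(results)}")
```

Every node runs the same script; rank 0 plays the master, scattering chunks of work out and gathering the partial results back into the final answer.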

The Parts I Used To Build My Cluster

To build my cluster, I got together 8 Raspberry Pis, a network switch, a USB power supply, the water cooling kit and cooling blocks, and a bunch of network cables, USB C cables, standoffs, and screws to put it all together.

I also used a 3mm MDF board and some wood sections I had lying around to make up the mounting board and frame.

Building The Raspberry Pi 4 Cluster

Making & Assembling the Cooling Block Brackets

I started off by making up the 8 acrylic brackets to hold the cooling blocks in position over each Pi’s CPU.

Liquid Cooled Raspberry Pi 4B

These are the same design as the one used previously for my single Raspberry Pi, but are now red to suit the cables and fan.

Laser Cutting Cooling Brackets
Water Cooling Block Bracket

Each bracket consists of two parts which are glued together to hold the cooling block in place.

Gluing Brackets Together
Water Cooling Blocks

I also had to include a spacer to lift the cooling block a bit higher off the CPU so that it cleared the surrounding components; otherwise, I’d have had to remove the display connector from all 8 Raspberry Pis. I used a bit of thermal paste between the blocks and the spacers.

Spacer Blocks Installed

The cooling blocks were then mounted onto the Raspberry Pis. I started by securing the Pi between some red aluminium standoffs, which would be used to mount the Pi onto the base, and some nylon standoffs for the cooling block to screw into.

Installing Standoffs On Each Pi

The bracket locates on the holes in the standoffs and clamps the cooling block down onto the Pi’s CPU.

Cooling Block Mounted With Standoffs

I then repeated this 7 more times for the other Pis needed to build the 8-node Raspberry Pi 4 Cluster.

All 8 Raspberry Pis Prepared With Cooling Blocks

Deciding on the Cluster Layout

The traditional way to build a cluster is to place standoffs onto each Pi and then mount them on top of each other to form a stack. This is the easiest and most compact way to assemble them, but doesn’t really work that well with my cooling block bracket and isn’t all that eye-catching.

Raspberry Pi Cluster

This got me thinking of a way to better lay out the Raspberry Pi 4 Cluster so that the cooling water circuit was clearly visible and the cluster was both functional and eye-catching. It would be even better if it could be hung on a wall as a feature piece.

I played around with a couple of layout options, considering the placement of the components to minimise cable and tube lengths and trying to maintain some symmetry to keep it looking neat.

Pi Cluster Layout Option 1
Pi Cluster Layout Option 2
Raspberry Pi 4 Cluster Chosen Layout Option

I settled on having four Pis on each side of the radiator, keeping the large fan as the focal point in the design. I’d then put the reservoir and pump underneath the radiator to circulate the water through the loop. The Ethernet switch would be positioned at the top of the cluster to feed the network cables down to each node.

Ethernet Patch Leads To Be Used

I’d be connecting the Pis to the switch using some red patch leads. I found some red 50cm low-profile leads which looked like they would work well for this application. The thinner leads meant that the excess cable could be coiled up more easily, and since the runs were really short, signal attenuation wasn’t a big issue.

USB C Cables and Charging Hub

To power the Raspberry Pis, I bought a high-power USB charging hub and some short USB C cables. The hub provides up to 60W, distributed over 6 ports. I couldn’t find a suitable 8-port one, so I settled on splitting two of the ports in two.

I’d also have to keep an eye on the power consumption, as the official power supply for the Pi 4B is rated at 3 amps, a bit more than this hub can supply to each Pi (60W split across eight boards averages 7.5W, or about 1.5 amps at 5V). But I have also never seen one of my Pis draw over 1 amp in practice, even under load. If need be, I could buy a second power supply down the line.
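
Since roughly 1.5 amps per board is below the official supply’s rating, under-voltage is the thing to watch for. As a rough sketch of how each node could keep an eye on its own supply, the vcgencmd tool that ships with Raspberry Pi OS reports throttling flags; the polling loop below is my own illustration:

```python
# Watch for under-voltage on a Pi sharing a USB hub (illustrative sketch).
# `vcgencmd get_throttled` is the stock firmware tool on Raspberry Pi OS.
import subprocess
import time

# Flag bits documented for `vcgencmd get_throttled`
UNDER_VOLTAGE_NOW = 1 << 0    # under-voltage detected right now
UNDER_VOLTAGE_EVER = 1 << 16  # under-voltage has occurred since boot

def throttle_flags() -> int:
    out = subprocess.check_output(["vcgencmd", "get_throttled"], text=True)
    return int(out.strip().split("=")[1], 16)  # e.g. "throttled=0x50000"

while True:
    flags = throttle_flags()
    if flags & UNDER_VOLTAGE_NOW:
        print("Under-voltage right now - the hub may be overloaded")
    elif flags & UNDER_VOLTAGE_EVER:
        print("Under-voltage has occurred since boot")
    else:
        print("Supply OK")
    time.sleep(5)
```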

Positioning the Pis on the Back Board

Once I had my layout in mind, I started making the backboard. I positioned all of the major components onto a piece of 3mm MDF and then marked out where they would be placed and the holes needed to mount them.

Checking Clearances With Cables Installed

I checked the clearances required for the cables and then started planning the cooling water tube routing. It was at this point that I realised that having four Pis arranged in a square would result in an unnecessarily complex cooling water loop, so I switched to having the four Pis in a straight line on each side. With the four in a line, the tubing could just be looped from one to the next along each side.

Change Layout To Two Vertical Lines

I also had to decide how best to run the cooling water loop. If I put each Pi in series, then the first will be the coolest in the loop and each will get progressively warmer, with the last one running the warmest. If I put the Pis in parallel, then they’ll all receive water at the same temperature, but balancing the flow rates becomes a problem, and it’s quite likely that the one or two furthest from the pump would receive little to no flow through them. I decided that warm water was better than no water, and I didn’t want to buy 8 valves to try to balance the flow rates, so I set out connecting them in series.
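
To get a feel for how much warmer the last Pi in a series loop actually runs, here’s a quick back-of-the-envelope estimate. The per-Pi heat load and pump flow rate below are my own assumptions, not measured values from this build:

```python
# Rough series-loop temperature rise estimate (assumed figures, not measured).
heat_per_pi_w = 4.0    # assumed heat dumped into the water by each Pi, watts
flow_lpm = 1.0         # assumed pump flow rate, litres per minute
c_p = 4186.0           # specific heat of water, J/(kg*K)

mass_flow = flow_lpm / 60.0  # kg/s (1 litre of water is ~1 kg)

# Temperature rise across each cooling block: dT = P / (m_dot * c_p)
dt_per_block = heat_per_pi_w / (mass_flow * c_p)
print(f"Rise per block: {dt_per_block:.3f} C")
print(f"Water reaching Pi 8 is ~{7 * dt_per_block:.2f} C warmer than at Pi 1")
```

Even with these fairly conservative figures, the water reaching the last block is only around 0.4°C warmer than at the first, so the series penalty is tiny in practice.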

Adding A Display To The Middle Of The Raspberry Pi 4 Cluster

I also had a lot of spare space in the gap at the top, so I decided to pull out an old touch panel which I had used on a previous project. Having a display on the master node meant that I would have a way to monitor the system and even display stats, graphs or diagnostics directly on the cluster.

I then marked out the positions for each of the components on the back board and their mounting holes.

Marking Out Layout On The MDF

Making the Back Board

I decided to cut the corners off the back board to give it a bit more of an interesting shape. I used a Dremel to cut the board to size, trim the corners, and cut a section out of the middle for the airflow through the radiator.

Use Dremel To Cut The Board
MDF Cutouts Made

To mount the Raspberry Pis onto the board, I decided to design and laser cut small acrylic bases to add a red accent and guide each power cable through to the back.

Designed And Cut Out Bases For Pis

Each base consists of a red bottom layer and a black top layer, which were glued together with some acrylic cement.

Glued Bases Together

I also designed a couple of cable and tube management stands to help with the routing of the cables and the tubes.

Cut Out Some Cable And Tube Mounts

I then checked all of the positions of the mounting holes and drilled them out.

Drilled Holes For Mounting Components

I decided to add some wooden sections to the back of the board to stiffen it and to create an area behind the board for cable management and the power supply. I used a spare length of 40mm x 20mm pine which I cut into three sections.

Glue Wood Strips Onto Back Of Cluster Board

I made holes underneath the acrylic bases for the USB cables to run through. These aligned with the large hole in the acrylic bases.

Cut Holes For Cabling

To complete the back board, I sprayed the front and back of the board black.

Spraying The Back Board Black

Assembling the Raspberry Pi 4 Cluster Components onto the Back Board

I then mounted the Pis, the network switch and cooling water components onto the back board.

Mounting The Raspberry Pi 4Bs

Each Pi was secured using four M3 x 8mm button head screws, which screwed into the aluminium standoffs.

Pis Secured With Button Head Screws

The cooling water components were mounted with the fasteners which came in the kit.

Mounted Switch And Cooling Water Components

I then cut the cooling water tubes to the correct lengths and pushed them onto the fittings.

Cut Water Cooling Tubing Runs
Installed Water Cooling Tubing

I could then start adding the Ethernet and power cables. I started by plugging a USB C cable into each Pi and routing these to the back of the board, where I mounted the USB hub.

Feed Cables Through To Back Of Pi

I then added an Ethernet cable to each Pi, routing these up each side and into the cutout which would be behind the display, before bringing them back out to the front of the board and up to the switch. This meant that the excess cabling could be coiled up behind the board, out of sight.

Plugging In Network Cables on the Raspberry Pi 4 Cluster

I used the acrylic stand which I had cut out to hold and guide the Ethernet cables.

Use Stands To Secure Network Cables

The last thing to add was the display, which I mounted on an acrylic face panel with some acrylic side supports to hold it in place.

Mounted Touch Screen Display

You’ll notice that I had to mount the master node a bit lower than the others so that I could get the HDMI cable in without clashing with the Pi next to it.

Cable Routing On The Back Of The Cluster

I tied up all of the cables at the back of the cluster using some cable ties and some cable holders which I cut from acrylic and glued into place. The cabling at the back isn’t particularly neat, but it would be out of sight in any case.

I also made up a connector to take the 12V supply feeding the switch and split it off to power the 120mm fan and cooling water pump. Together they only draw around 150mA while running, so there was some spare capacity in the power supply.

RGB Strip For Back Accent Colour

As a final touch, I added an RGB LED strip to the back to create some accent lighting on the wall behind it. The strip has an RGB remote control with a number of colours and features, but I was only going to be using the red LEDs.

Radiator and Fan Components on the Raspberry Pi 4 Cluster

Filling Up The Cooling Water Loop

With all that done, I just needed to fill the cooling water circuit and hope that I wouldn’t drown one of the Pis. I had visions of filling it up and having water pour all over one of the new boards.

I obviously kept everything turned off while filling the reservoir, although I did flick the pump and fan on a couple of times to circulate the water a bit so that I could top the reservoir up again.

I turned the cluster onto its side to fill it so that the reservoir was upright.

Filling Water Reservoir on the Raspberry Pi 4 Cluster

Luckily there were no major leaks!

There was one minor slow leak on the inlet to the first cooling block, probably because of the twist and pressure on the tube to get to the radiator. I clamped the tube with a small cable tie and the leak stopped.

All the cooling water loop needed now was some colour.

Adding Some Colour To The Cooling Water

Preparing Raspberry Pi OS On The Pis

I prepared a copy of Raspberry Pi OS Lite on 7 microSD cards and a copy of the full Raspberry Pi OS on the 8th for the master node, which has the display attached.

Burned Pi OS Images To SD Cards

I could then power the cluster on for a first test. This was primarily to check that all of the Pis booted up correctly, could run from the single power supply, and were all recognised and accessible over the network.
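
A quick reachability check along these lines can be run from any machine on the network; the node1.local-style hostnames are hypothetical stand-ins for whatever the Pis are actually named:

```python
# Ping each node once to confirm it booted and joined the network
# (sketch; substitute your nodes' real hostnames or IP addresses).
import subprocess

NODES = [f"node{i}.local" for i in range(1, 9)]  # assumed mDNS hostnames

for host in NODES:
    # One ping with a 2 second timeout; return code 0 means the node answered
    result = subprocess.run(
        ["ping", "-c", "1", "-W", "2", host],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    print(f"{host}: {'up' if result.returncode == 0 else 'DOWN'}")
```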

I’m not going to get into the software side of the cluster in this post, as there are a number of options depending on what you’d like to do with it and how experienced you are with network computing. I’ll be covering that in a future post, so make sure that you sign up to my newsletter or subscribe to my YouTube channel to follow my projects.

Running The Completed Water Cooled Raspberry Pi 4 Cluster

With all of the SD cards inserted and the water cooling circuit surviving a 10 minute run with no leaks, I powered up the USB hub to supply power to the Pis.

Raspberry Pi 4 Cluster System Running And Booted Up

The system was initially quite noisy as the air bubbles worked their way out of the radiator and cooling water loop, but it eventually settled down. You can hear an audio clip of the cluster running in the video at the beginning of the post.

Raspberry Pi 4 Cluster Running

The display is also a nice way to run scripts and visualise information or performance stats for the cluster.

Running Scripts On Touch Screen

Here I’m just running the script I used previously to display the CPU temperature of the Pi.
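
For reference, a minimal version of that kind of readout polls the SoC temperature from sysfs; this is an illustrative sketch rather than the exact script from the earlier post:

```python
# Minimal CPU temperature readout for a Raspberry Pi (illustrative sketch).
# The sysfs path below is standard on Raspberry Pi OS.
import time

def cpu_temp_c() -> float:
    # The kernel reports the temperature in millidegrees Celsius
    with open("/sys/class/thermal/thermal_zone0/temp") as f:
        return int(f.read().strip()) / 1000.0

while True:
    print(f"CPU temperature: {cpu_temp_c():.1f} C")
    time.sleep(2)
```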

Stats Display on Raspberry Pi 4 Cluster

Have a look at my follow-up post in which I set up the cluster to find prime numbers and compare its performance to my computers.

I hope you’ve enjoyed following this Raspberry Pi 4 Cluster build with me. Please share this post with a friend if you did, and leave a comment to let me know what you liked or disliked about the build.

Michael Klements
Hi, my name is Michael and I started this blog in 2016 to share my DIY journey with you. I love tinkering with electronics, making, fixing, and building - I'm always looking for new projects and exciting DIY ideas. If you do too, grab a cup of coffee and settle in, I'm happy to have you here.

27 COMMENTS

    • Thanks Brian! There will definitely be an update soon on how it’s running! I’ll probably do a performance comparison with a PC as well.

  1. Excellent work! Looks like a fun project. For balancing water flow, consider a “reverse return” arrangement, most commonly used in HVAC systems. Would be interesting to see if there is any performance difference between the first and last nodes on the loop w/ and w/o it.

    • That’s a great idea, I’ll have a look at testing that out as well. I don’t think there would be any performance difference, as long as the last node is not near the thermal throttling temperature.

  2. I enjoyed the video. Tried the MPI script on my SLURM cluster and ran it on 4x 7210 Xeon Phi nodes (hyperthreading disabled). I got the following results:
    100,000 = 2.05s
    200,000 = 7.40s
    500,000 = 38.52s
    1,000,000 = 123.24s

  3. Hi. We’d love to feature this build in a piece I’m writing for HackSpace magazine. Are we ok to use a picture from this post (with full credit, of course)?

  4. Hey, this is probably the coolest Pi cluster I have seen. Really GOOD work there. A couple of questions:

    Would you have the schematics of the adapter you made to power fan/water pump from the switch?
    Did you also power 8 Pis using a 6-port USB hub?

    Working on my version of this cluster

    • Hi Thiago,
      Thanks for the great feedback.
      I just used a standard 12V 3A power supply (like one for an Arduino) and connected the 12V and GND to the fan, pump and switch individually. There’s nothing really unique to it.
      I did initially (using two USB splitters on two ports); this works to just boot them up and idle, but if you push the CPUs then they start having supply issues. I landed up adding a second hub, so it’s split 4 and 4 now.
      Good luck with your build!

  5. OHHHH… one power supply providing power to the switch, fan and pump. For some reason I thought you had a switch with PoE and were taking power from the Ethernet ports. lol

    • Yeah, it’s just that simple. The Pis become expensive if you add a PoE HAT to each, so I chose not to go that route.

    • Hi Assad,
      Being a completely custom-built cluster, it would likely cost far more to buy than its perceived value.

  6. Hi Michael,
    I would like to discuss the details. How much would it cost, including delivery to Germany?

  7. Well, until you tell us something about the software, it’s not a cluster. It’s just a bunch of Pis sharing a cooling system and sending their metrics to the “master” so you can monitor and graph them.

    I’m mainly here for the cluster, not the cooling system. 😉

    • There is a follow-up video and blog post which goes through the setup and running of an example script, this is just the build post.

  8. Great video and instructions Michael. I actually was looking for a price to buy this unit assembled. Are there any for this or your other projects?

  9. Hi Michael,

    Just wondering, what is running on your cluster today? Have you had a chance to test infoDB? It should be quite useful as your central DB for time-series data across all the different projects you run.

    Thanks,
    Miguel

  10. Not wanting to be negative (to be fair, the build does look really good: neat and functional), but the blocks you have used are aluminium, whereas the radiator usually isn’t (it will normally be brass or copper). Over time the aluminium will attack the radiator through the water via galvanic corrosion, although it shouldn’t be an overnight thing! I’m looking into water cooling my “array” of Pis, and have been into water cooling computers in general in the past (your video was another “inspiration nudger” of many I have watched), so I would advise you to look out for (and swap to) copper blocks in future if you intend to keep the loop running; this is my current “hunt item” for the loop components and organisation I’m going with. I also highly recommend some additive if the dye doesn’t have any, to prevent any growths, and always flush once a year 🙂 That’s my only “complaint”: the rest is really good. It looks good, the colours are good, and I like the symmetry and the overall layout. Nice job.

    • Thank you for the feedback!
      The radiator is aluminium as well, it’s not brass or copper.
      Yeah, I need to pick up some proper coloured coolant with anti-microbial and anti-corrosion additives; that’s definitely an issue I’ve noticed on some of my older builds over time.

  11. Have you ever tried to run the High Performance Linpack (HPL) benchmark? I have seen the results for a Turing Pi 2 cluster with four Raspberry Pi 4B nodes and for a single Raspberry Pi 4B, and it would be nice to compare these results with your cluster. I’m thinking about building a cluster of 24 Pi 4Bs for OpenFOAM simulations, and maybe running the Linpack test across several nodes, like your cluster, would be a good indicator of performance.
