The portable cloud

By Karanbir Singh | October 15, 2015

In late 2012 I constructed myself a bare-bones cluster of a couple of motherboards, stacked up and powered, to be used as a dev cloud. It worked, but it was a huge mess on the table, and it was certainly neither portable nor quiet. That didn't mean I wouldn't carry it around – I did, across the Atlantic a few times and over to Asia once. It worked. Then in 2014 I gave the stack away, which caused a few issues, since living in a certain part of London means I must put up with a rather sad 3.5 Mbps ADSL link from BT. Had I been living in a rural setting, government grants etc. would ensure super-high-speed internet, but not in London.

I really needed my development and testing cluster back (my work pattern had come to depend on it). Time to build a new one!

Late last summer the folks at Protocase kindly built me a cloud box to my specifications. This is a single case that can accommodate up to 8 mini-ITX (or 6 mini-ATX, which is what I am using) motherboards, along with all the networking kit for them and a disk each. It's not yet left the UK, but the box is reasonably well travelled within the country. If you come along to the CentOS Dojo in Belgium or the CentOS table at FOSDEM, you should see it there in 2016. Here you can see the machine standing on its side, with the built-in trolley for mobility.

Things to note here: you can see the ‘back’ of the box, with the power switches, the PSU with its 3 hot-swap modules, the 3 large case-cooling fans, and the cutout for the external network cable to enter the box. While there is only one PSU, the way things are cabled inside the box makes it possible to power up to 4 channels individually. So with 8 boards, you'd be able to power-manage each pair on its own.

Box-1

Here is the empty machine as it was delivered. The awesome guys at Protocase pre-plumbed the PSU and wired up the case fans (there are 3 at the back and 2 in the front; the front ones are wired from the PSU so they run all the time, whereas the back 3 are connected as regular case fans onto the motherboards, so they come up when the corresponding machine is running). I thought long and hard about moving the fans to the top/bottom, but since the machine lives vertically, this position gives me the best airflow. On the right side, opposite the PSU, you can see 4 mounting points; this is where the network switch goes in.
Box-2

Close-up of the PSU used in this machine. I've load-tested it with 6x i5 4690K boards and it works fine, under sustained load for a full 24 hours. Next time I do that, I'll get some wattage and amperage readings as well. It's rated for 950 W max, and I suspect anything more than 6 boards will get pretty close to that mark. Also worth keeping in mind is that this is meant to be a cloud or mass-infra testing machine; it's not built for large storage. Each board has its own 256 GB SSD, and if I need additional storage, it will come over the network from a Ceph/Gluster setup outside the box.
Box-3
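As a rough sanity check on that 950 W figure, here is a back-of-the-envelope power budget. This is a minimal sketch only: the per-board and overhead wattages below are my assumptions for illustration, not readings taken from this box.

```python
# Back-of-the-envelope power budget for the box.
# All figures below are assumptions for illustration, not measurements.
PSU_RATING_W = 950      # rated maximum of the PSU
BOARD_LOAD_W = 130      # assumed draw per i5 4690K board under sustained load
OVERHEAD_W = 60         # assumed total for SSDs, case fans and the network switch

for boards in (6, 7, 8):
    total = boards * BOARD_LOAD_W + OVERHEAD_W
    print(f"{boards} boards: ~{total} W estimated, {PSU_RATING_W - total} W headroom")
```

On those assumed numbers, 6 boards sit comfortably below the rating, while 7 or 8 start eating into the headroom, which lines up with the feeling above that anything beyond 6 boards gets close to the mark.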

The PSU output is split and managed in multiple channels; you can see 3 of the 4 here, along with some of the spare case-fan lines.
Box-4

Another shot of the back 3 fans; you can also see the motherboard mounting points built into the base of the box. They put these in for mini-ITX / mini-ATX as well as regular ATX. I suspect it's possible to get 4 ATX boards in there, but it's going to be seriously tight and the case fans might need an upgrade.
Box-5

Close-up of the industrial trolley that is mounted onto the box (it's easy to remove when it's not needed; I just leave it on).
Box-6

The right side of the box hosts the network switch; this allows me to put the power cables on the left and back, with the network cables on the right and front. Each board has its own onboard network port (as they do), and I use a USB3-to-gigabit converter at the back to give me a second port. This then allows me to split public and private networks, or use one for storage and another for application traffic, etc. Since this picture was taken, I've stuck another 8-port switch on the front of this switch's cover, to give me the 16 ports I really need.
Box-7
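To make that split a bit more concrete, here is a minimal sketch of how the two networks could be laid out. The subnets, addresses and interface names below are hypothetical, purely for illustration, not what is actually configured in this box.

```python
import ipaddress

# Hypothetical subnets for the two networks; the real addressing in the box may differ.
public_net = ipaddress.ip_network("192.168.10.0/24")   # application / public traffic (onboard NIC)
private_net = ipaddress.ip_network("192.168.20.0/24")  # storage / private traffic (USB3 gigabit NIC)

public_hosts = list(public_net.hosts())
private_hosts = list(private_net.hosts())

# One address per board on each network, numbered by position in the box.
for board in range(6):
    print(f"board{board}: eth0={public_hosts[board]}  eth1={private_hosts[board]}")
```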

Here is the rig with the first motherboard added in, with an Intel i5 4690K CPU. The board can take 32 GB; I had 16 GB in it then and have upgraded since.
Box-8

Now with everything wired up. There is enough space under the board to run the network cables through.
Box-9

And with a second board added in, this time an AMD FX-8350. It's the only AMD in the mix; I wanted one to have the option to test with, but the rest of the rig is all Intel. The i5s have fewer cores, but overall far better power-usage patterns, and they run cooler. With the box fully populated and running at max load, things get warm in there.
Box-10

The boards layer up on top of each other with an offset: in the picture above, the Intel board is aligned to the top of the box, while the next tier's board is aligned to the bottom side of the box. This gives the CPU fans a bit more headroom and has a massive impact on temperature inside the box. Initially, I had just stacked them up, 3 on each side; ambient temperature under sustained load was easily touching 40°C in the box. Staggering them brought the ambient temperature down to 34°C.

One key tip came from Rich Jones discovering threaded rods: these fit right into the motherboard mounting points and run all the way through to the top of the box. You can then use nuts on the rod to hold the motherboard at whatever height you need.

If you fancy a box like this for yourself, give the guys at Protocase a call and ask for Stephen MacNeil; I highly recommend their work. The quality is excellent. In a couple of years' time, I am almost certainly going to be back talking to them about the cloudybox2. And yes, they are the same guys who build the 45Drives Storinator machines.

Update: the box runs pretty quiet. I typically only have 2 or 3 machines running in there, but even with all 6 running a heavy sustained load it's not massively loud; the airflow is doing its thing. The key thing there is that the front fans are set to ingest air – and they line up perfectly with the CPU placements, blowing directly at the heat sinks. I suspect the topmost tier of boards only gets about 50% of the airflow compared to the lower two tiers, but they also get the least utilisation of the lot.

enjoy!

4 thoughts on “The portable cloud”

  1. Sean Hull

    This story gives me warm fuzzies… a few years back, in 1992, right around when Linux was first released, I was all excited in a similar way. I bought all the parts to build a 486 tower: graphics card, motherboard, memory cards & IDE drives. Keyboard, monitor, even an optical mouse, which was cool at the time because it felt like you were sitting at a Sun workstation. This was at home!

    I remember putting all this together, and loading the first floppy disk into the thing. Did I image these disks properly? Will it really load something?

    Up comes the BIOS and sure enough it’s booting off of the floppy drive. Whoa, mother of god!

    From there I had init running, and soon the very seat of the soul, the Unix OS itself. That felt so darn cool.

    After that you might spend a week configuring X Windows, but to have a GUI seemed like mission impossible. And then you’d go about tweaking and rebuilding your kernel for this or that.

    So thx for your story Karanbir. It’s a great one!

  2. Thibault

    Very cool stuff, but I’m a bit disappointed – I was expecting more pictures of the beast once mounted! Please post some 😉

    1. Karanbir Singh Post author

      I have a slight problem – the way things are set up right now, the top tier of motherboards in there is unreleased / NDA’d hardware 🙂 so putting pictures of that up is going to cause some problems. I hope to have this resolved later in the year and will add a lot more pictures.

      regards,

  3. Zdenek Sedlak

    A nice box. I believed multi-mobo cases had almost disappeared, but this one looks pretty good 🙂

    Got some thoughts on where to put my OpenStack home lab 🙂

    Thanks for sharing the information…

    //Zdenek
