My Homelab


This week, I’m discussing the homelab I use for personal infrastructure development. A few folks have reached out to ask what I did and how I set it up, so I’ll go over each piece of hardware and explain why I chose those parts.

Goals

1. I want to run an entire development environment locally on my systems

This is serious business. I can’t sit around and wait for AWS, Azure, or GCP to maybe spin up VMs, and I don’t want to get charged for VMs I don’t need sitting out there running for extended periods. Doing it in-house is faster and a heck of a lot cheaper.

2. That server needs to be quiet.

It’s in my office, so I couldn’t go with off-the-shelf server gear. Most of it is loud and unruly, so building my own was the way to go.

3. I want a single system that can handle all my hobbies and daily tasks.

This was a personal challenge to myself. I wanted it to do it all, and for the most part it does.

4. I want it to sit in a rack.

Hard requirement. Nothing goes on the floor. I have a rack, so I might as well use a rack.

Hardware

Current Specs

Case: Chenbro RM41300-FS81

I chose this case for one simple reason: it can fit an XL-ATX motherboard. Having that extra slot means I can fit a total of four dual-slot PCIe 4.0 cards. It also looked pretty easy to gut, and it has plenty of space for adding new toys.

CPU: AMD Threadripper 3970X

I wanted something fast with a high core count. I also considered an Intel i9-10980XE, but sourcing one anywhere near MSRP was close to impossible in early 2020. The 3970X cost more, but it was priced right.

Motherboard: ASRock Rack TRX40D8-2N2T

When this board was announced back in May of 2020, it had a few killer features that made me immediately fall in love with it.

  • Integrated dual 10GbE and 2.5GbE Ethernet
  • Internal USB ports
  • Integrated BMC/KVM/Video out

One thing Threadripper lacks is integrated graphics, so once ESXi booted up and the GPU was passed through, you’d lose console access. The BMC gives that back. The layout is also better for rackmount servers, since the socket and RAM are rotated 90 degrees. That’s a very sexy feature to have when you’re blowing air through the motherboard.

RAM: 256GB Corsair Vengeance LPX 3000MHz C16 (8x32GB)

Really, this was the fastest RAM I could find at the time I started this build that wasn’t prohibitively expensive. Nowadays there are 4000MHz kits that are maybe 60 dollars more per 128GB set, but I’m already invested in this RAM, so I won’t be changing it out. No frills, rock-solid RAM.

Graphics Cards

Virtualizing GeForce cards is blocked at the driver level, and while there are workarounds, in my experience they were very buggy. This system was already going to lose some performance compared to my gaming system, which had dual 2080 Tis in SLI. Why not AMD? The VFIO reset bug. Pretty simple.
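
For reference, passing a card through in ESXi mostly comes down to toggling the device for passthrough on the host and then adding a couple of advanced settings to the VM. A rough sketch only (the MMIO size below is just an example value for a large-VRAM card, and the passthrough device itself gets added through the vSphere UI):

  pciPassthru.use64bitMMIO = "TRUE"
  pciPassthru.64bitMMIOSizeGB = "64"
  hypervisor.cpuid.v0 = "FALSE"

That last line is the well-known “hide the hypervisor” trick that the GeForce workarounds rely on; the Quadro cards don’t need it.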

Nvidia Quadro RTX 5000

I wanted as much bang for the buck as possible. The RTX 6000, which would have been roughly equivalent in performance to a single 2080 Ti, cost $3000 for a USED card. This one was available on eBay for about $1400. The system needed to perform, but I knew I couldn’t sleep at night after spending 3 grand on a graphics card that cost more than both of my old GPUs. I use this GPU primarily for ML and gaming. For ML, it’s attached to a Linux VM that I do my processing on, and for gaming, it’s attached to a Windows VM. Simple enough use case. It works great, it’s simple, and it fits in nicely. I even threw a nice water block on it. :)

Nvidia Quadro RTX 4000

I wanted to replace my work laptop with a more powerful Linux VM and migrate my development onto the homelab. My Mac was starting to misbehave because of battery problems, and I knew I’d have to send it out for repair, so I picked up another Quadro card. Why reinvent the wheel?

Got it cheap off eBay. It’s roughly equivalent to a 2070 Super, and it’s single slot. Great performance. Added bonus: I now have two workstations running off the same system.

Liquid cooling:

Because why not? I said I wanted it to be quiet, right?

  • EKWB CPU Block - Gotta go with the best here.
  • Corsair XD3 Res/Pump combo - I wanted something low profile that fit in a rack.
  • Soft tubing - because it’s in a rack. You won’t see it anyhow
  • Alphacool NexXxoS ST30 radiator
  • Alphacool Eisblock ES Acetal GPU Block - This fits the Quadro RTX 5000 perfectly!
  • Be Quiet! 2000 RPM fans (2x 80mm for exhaust, 3x 120mm for the radiator) - because quiet. I’ve noticed a bit of overheating on the VRMs, so I’ll be replacing the 120mm fans with 3000 RPM Noctua fans soon to move a bit more air.

Power Supply: NZXT C850

Why only 850W? Because the total wattage with every part of the system at 100% load is only 710W. I used math. Plus, it’s a rebrand of a Seasonic Gold unit, and it was actually available at the time I was building this. All of those factors made this an easy choice.
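
For the curious, the rough math: by their published power ratings, the 3970X is 280W, the Quadro RTX 5000 is 230W, and the Quadro RTX 4000 is 160W, which puts the big three at 670W; the board, RAM, SSDs, fans, and pump add a few dozen watts on top of that. That lands right around the 710W figure, so 850W leaves a comfortable margin.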

Gaming Monitors: 4x 27" Monoprice TN 1440p 144Hz monitors

Great monitors for the price. I actually own six of them; two are on my wife’s machine, and four are in a 2x2 grid on my desk, all hooked up via DisplayPort. The colors are fine, the viewing angles are pretty top notch for TN panels, and the fast refresh rate makes gaming on them great.

Work Monitors: 2x 27" LG 4K 60Hz monitors

4K is great when you’re working, and a 60Hz refresh rate is fine for watching videos and staring at text. 27 inches seems to be the sweet spot. My only complaint is that they aren’t curved; if I do upgrade, I’ll go with larger, curved monitors for better viewing angles.

USB Peripherals:

USB-C from the Nvidia cards to USB hubs - Everything runs off a USB hub; not much to say. The built-in USB-C port on each GPU passes through great and lets me hook up my peripherals without having to mess with USB passthrough on both systems. I did try a few different PCIe USB controllers, but none were terribly stable in ESXi. This works well enough.
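
If you’re curious what that looks like on the host, the GPU’s USB-C controller shows up as its own PCI function right next to the graphics device, so it can be toggled for passthrough on its own. A quick way to spot it from the ESXi shell (exact output varies by version; this is just the general idea):

  lspci | grep -i nvidia

The device listed as a USB controller alongside the graphics function is the one that carries that port.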

Infrastructure

Rack: Dell 24U rack

I found it on Craigslist for $160. Works great.

PDUs: 4x Dell 6016 PDUs (5-port)

Found them on eBay for 50 bucks. I used them back when I was actually building servers. Great PDUs.

Network: 10GbE

This was a requirement I grew into. I originally built the system to use local SSDs, but I wasn’t thrilled with the performance, so I ended up moving the disks to a NAS for speed and redundancy.

Router: Unifi UDM Pro

I wanted a high-quality 10GbE router that could do all the things. It gives me a 10GbE gateway with fast networking between all my things.

Switch: Mikrotik CRS309-1G-8S+IN

Originally, I was planning on using this as a 10GbE router. It’s a 10GbE switch (eight SFP+ ports plus a gigabit port) with routing capability. Quite simply, it couldn’t keep up with my infrastructure in routing mode, so I ended up just using it as a switch, with a single 10-gig uplink between it and the UDM Pro.
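
For anyone setting up a CRS3xx the same way, “just a switch” on RouterOS basically means putting all the SFP+ ports into a single hardware-offloaded bridge and letting the UDM Pro handle the routing. A rough sketch (bridge name and port list are just examples, and recent RouterOS versions ship a default config that already looks a lot like this):

  /interface bridge add name=bridge1
  /interface bridge port add bridge=bridge1 interface=sfp-sfpplus1 hw=yes
  /interface bridge port add bridge=bridge1 interface=sfp-sfpplus2 hw=yes
  # ...and so on for the remaining SFP+ ports

As long as every port stays hardware-offloaded in one bridge, the switch chip does the forwarding and the CPU stays out of the data path.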

Wifi Access Points: 3x Unifi Nano-HD Access Points

I mean, if you have a UDM Pro, you might as well get the access points too, right? Three access points cover my house perfectly; it’s long, L-shaped, and stucco. Two downstairs, one upstairs in the loft. Not much else to say. They’re rock solid.

NAS: QNAP TS-932X with 9x 1TB WD Blue SATA SSDs and 16GB RAM (upgraded)

Why did I choose this NAS? I don’t need virtualization on the NAS, and skipping that cut the price by about 500 bucks. This one has dual 10GbE SFP+ ports and was super easy to set up. I use RAID6 for my disk configuration, for a total of 7TB usable, which is more than enough for gaming and VMs. I upgraded the RAM after seeing the initial 4GB run out of memory during high-write operations. It works great, with great performance, for less than half what the Intel-based model costs.
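
Quick sanity check on that number: RAID6 keeps two disks’ worth of parity, so nine 1TB drives works out to (9 - 2) x 1TB = 7TB usable.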

Cable Modem: Arris S33

So, why not get a new cable modem with 2.5Gbps networking built in? I’ve got gigabit cable, but I was losing some of that top end with the old modem. Might as well upgrade and actually get the full gigabit speed, right?

The junkyard

This is where things I originally bought for the project go to die. I may or may not have sold these items; they just didn’t work out or were replaced by something better.

Gaming System: i9-9900K, 32GB RAM, dual Nvidia GeForce 2080 Ti GPUs

This was the original system I built and used as my gaming box for about two years. At one point, it had 6x 1440p screens attached to it. Hell of a chip in that one; I could get it to stay stable at 5.3GHz without messing with the voltage. It has been transplanted into a much nicer case and is currently in use by my spousal unit as her gaming machine, minus the one GPU I ended up selling about a week before the 3000-series announcement. This bad boy started to really struggle past 20 VMs, and I couldn’t put more than 128GB of RAM in it. Out with the old, in with the new.

Case: Be Quiet Dark Base Pro 900, rev.2

Great case, if you’re into cases. Terrible case if you’re into fiddling with your stuff after it’s in the case. Hot, but quiet. The RGB controller ended up dying at some point, and it’s now in the graveyard. Plus, the rackmount case gives me a much better form factor; this thing was big and heavy. Loved the case, hated building in the case.

Gigabyte Aorus TRX40 Designare

Great board. Big board. Not a board made for doing server stuff: no IPMI, no BMC, no fast networking. When the ASRock Rack board was announced, this one was destined for the yard. It had some cool features, like Thunderbolt 3 support (which did work in ESXi, by the way). Ended up selling this one.

Network card: dual 10GbE iSCSI offload card

I can’t remember the model, but it didn’t work very well. It ended up in the pile.

SCSI Controller: LSI 9260-8i

Once I went to the NAS, this went to the yard. Rock solid card, and I would recommend it.

Final notes

As I said above, this was a personal challenge, just to see if I could do it. After breaking my shoulder last year, I needed a project, and this definitely fit the bill. It’s faster than I could have imagined, it handles more than I ever expected, and it’s more powerful than I could have dreamed. I’m not recommending you spec your system anywhere near this; you can get better gaming performance for under $2000 if gaming is all you’re after. This was the project I did to prove to myself that I’ve truly still got it.

My next project is going to be extreme overclocking. I’ll be posting more on that in the future.
