Dell PowerEdge C6100: Upgrading to All-Flash Home Lab – Part 2

June 4, 2016

In the previous article, I walked through the reasons that led me to acquire an old Dell PowerEdge C6100 with 3 nodes as my “new” home lab. In this article, you will see the upgrades I made to build an All-Flash Home Lab. These upgrades make it possible to run solutions like VMware VSAN or Nutanix Community Edition.

1st Upgrade – USB as ESXi drive

When I bought the C6100, I added 3 x 300 GB SAS 3.5″ 10K RPM drives to my configuration because at the time I wasn’t planning to build an All-Flash VSAN. If you are thinking about buying a C6100 for a home lab and making it all-flash, I recommend you do NOT buy the SAS drives. You can put that money towards the USB drives instead.

As you know, ESXi can be installed on an SD card or USB drive. The minimum space required to install ESXi is really low, so an 8 GB drive per node is enough. I bought 3 x SanDisk SDCZ33-008G-B35 8GB for £4.14 each.

SanDisk SDCZ33-008G-B35 8GB for ESXi installation

I followed the recommendation of Vladan Seget (@vladan) in his article about using VMware Workstation as a tool to install ESXi onto the USB drives. It worked like a charm.

2nd Upgrade – NVMe + Adapter

Samsung SM951 NVMe 128GB M.2 (Cache for All-Flash Home Lab)

Following the recommendations that William Lam (@lamw) gathered from his readers and posted on his blog virtuallyGhetto, I bought 2 x Samsung SM951 NVMe 128GB M.2 for the “Caching” tier. I bought just one at the beginning to check whether it worked in a C6100. After verifying its performance and reliability, I decided to acquire a second one and build a VMware VSAN ROBO deployment (2 x VSAN nodes + 1 x Witness appliance running on Workstation). To install the VSAN Witness Appliance, I also followed an article by William Lam, “How to deploy and run the VSAN 6.1 Witness Virtual Appliance on VMware Fusion & Workstation?”

Two models of the Samsung SM951 128GB are available: the MZVPV128HDGM (the one I got, NVMe) and the MZHPV128HDGM (AHCI). The first one is a bit cheaper, but the main difference is that you can’t boot an OS from it on this hardware. If you’re looking to boot from the M.2 drive, you must buy the MZHPV128HDGM model with AHCI support.

SM951 NVMe 128GB (MZVPV128HDGM) for “Caching” Tier

Lycom DT-129 Host Adapter for PCIe-NVMe M.2

The Dell PowerEdge C6100 doesn’t include an M.2 socket, so you need a PCIe adapter to install the NVMe drive. Each node has only one PCIe 2.0 slot and one mezzanine slot, which means your options for additional components are limited. Still, these two slots are enough for an NVMe drive (PCIe slot) plus 10 GbE or LSI® 2008 6Gb SAS (mezzanine slot). Currently I’m only using the PCIe 2.0 slot for the NVMe drive, so in the future I can expand with a 10 GbE mezzanine card. The adapter I bought is the Lycom DT-129 (2 of them); it supports both PCIe 3.0 and 2.0.

Lycom DT-129 Host Adapter for PCIe-NVMe M.2

Note: you won’t get the maximum performance out of your NVMe drive since we’re using PCIe 2.0, but it will be more than enough for functional tests.
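To put a number on that ceiling: PCIe 2.0 runs at 5 GT/s per lane with 8b/10b encoding, while PCIe 3.0 runs at 8 GT/s per lane with 128b/130b encoding. A quick back-of-the-envelope sketch (assuming the x4 link of the M.2 adapter) shows the difference:

```python
# Theoretical one-way PCIe bandwidth for an x4 M.2 adapter, before
# protocol overhead. Per-lane rates and encodings are the standard
# PCIe 2.0 / 3.0 figures.

def pcie_x4_gbs(gt_per_s, encoding_efficiency, lanes=4):
    """Gigatransfers/s * usable fraction * lanes, converted bits -> bytes."""
    return gt_per_s * encoding_efficiency * lanes / 8

gen2 = pcie_x4_gbs(5.0, 8 / 10)      # ~2.0 GB/s
gen3 = pcie_x4_gbs(8.0, 128 / 130)   # ~3.9 GB/s
print(f"PCIe 2.0 x4: ~{gen2:.1f} GB/s, PCIe 3.0 x4: ~{gen3:.1f} GB/s")
```

So the Gen2 slot tops out around 2 GB/s in theory (less in practice), which is roughly where the SM951’s rated sequential reads sit; for a home lab “caching” tier that’s plenty.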

3rd Upgrade – SSD + 2.5″ to 3.5″ Converter

Samsung 850 EVO 500 GB 2.5″ SSD (Capacity for All-Flash Home Lab)

To build an all-flash VSAN platform, I needed to replace the 300 GB SAS 3.5″ drives with SSDs for the “capacity” tier. I bought 2 x Samsung 850 EVO 500 GB. The drives are connected to the SATA ports available on the motherboard (ICH10), which present a queue depth of 31.

Samsung 850 EVO 500 GB 2.5″ SSD for “Capacity” Tier
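By the way, if you want to confirm the queue depth your hosts report, here is a minimal sketch using pyVmomi (assuming vSphere 6.0 or later; the host name and credentials are placeholders for your own lab values):

```python
# Minimal pyVmomi sketch: print every SCSI device an ESXi host sees,
# together with the queue depth it reports.
import ssl

from pyVim.connect import SmartConnect, Disconnect

ctx = ssl._create_unverified_context()  # home lab: self-signed certificate
si = SmartConnect(host="esxi01.lab.local", user="root",
                  pwd="password", sslContext=ctx)

# Connecting straight to an ESXi host exposes a single datacenter and host.
host = (si.content.rootFolder.childEntity[0]
        .hostFolder.childEntity[0].host[0])

for lun in host.config.storageDevice.scsiLun:
    # The ICH10 SATA ports show up here with a queue depth of 31.
    print(f"{lun.displayName}: queue depth = {getattr(lun, 'queueDepth', 'n/a')}")

Disconnect(si)
```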

Tip: if you’re looking to build an all-flash home lab, don’t make the same “mistake” as me: you don’t need to add any drives when configuring your Dell PowerEdge C6100 bundle, just add the caddies (at least 3). I added the SAS drives because I didn’t know whether the SSD and USB drives would work fine. Now that you know they do, put that money towards other components, like the USB drives for the ESXi installation.

ICY Dock EZConvert 2.5″ to 3.5″

The Dell PowerEdge C6100 is available in two models: 12 x 3.5″ bays or 24 x 2.5″ bays. The model I bought is the 12-bay 3.5″ one, so I needed a 2.5″-to-3.5″ converter for the SSD drives. I did a lot of research into which converters work properly, since the disks are mounted in a caddy and inserted into a bay. If the SSD doesn’t sit exactly like a 3.5″ drive, you will face a lot of connectivity issues.

I bought two different converters to see which one was better: the “official” Dell converter (1 x Dell 9W8C4 Y004G) and 2 x ICY Dock EZConvert Lite MB882SP-1S-3B. To be honest, even though it’s the cheaper ICY Dock model, the EZConvert Lite is much better than the “official” Dell converter. The Dell converter is cheaper, but it’s really flimsy: it leaves the SSD hanging in the air, and the screw holes don’t line up properly. I highly recommend the ICY Dock EZConvert Lite MB882SP-1S-3B for any 3.5″ bay, regardless of your server vendor.

ICY Dock

ICY Dock EZConvert Lite MB882SP-1S-3B (RECOMMENDED)

Dell 9W8C4 Y004G (NOT recommended)

Conclusion

With these upgrades, you can have a really powerful home lab for a modest investment. The VSAN ROBO installation with 2 nodes + Witness in Workstation works like a charm, even using the embedded dual-port NIC.

I can still do more upgrades to support more workloads with another small investment. The future upgrades I’m considering are:

  • Buy the components above for the third node and move from a VSAN ROBO setup (2 nodes + Witness in Workstation) to an all-flash VSAN platform with 3 nodes.
  • If I need more CPU & RAM, I can add a second CPU to each node, plus 48 GB more RAM per node, for a total of 6 x Intel L5630 + 288 GB RAM (around £350 for this upgrade). As for storage, I can still add two more drives per node.
  • When the price of 10 GbE switches drops, I can add the 10 GbE mezzanine cards to increase network performance and avoid any bottleneck with VSAN or any other SDS solution.
  • Finally, if I still need more resources, a fourth node is not that expensive: for £250-300 you get a node with 2 x Intel L5630 / 96 GB RAM / 128 GB NVMe / 500 GB SSD.

Below you can see the Bill of Materials (BoM) for the upgrade, along with the cost of the next upgrade to bring the third node into the VSAN cluster.

BoM Dell PE C6100 Upgrade

Component                                      Qty.  Price    Total
SanDisk SDCZ33-008G-B35 8GB                    3     £4.14    £12.42
SM951 NVMe 128GB (MZVPV128HDGM)                2     £59.99   £119.98
Lycom DT-129 Host Adapter for PCIe-NVMe M.2    2     £21.99   £43.98
Samsung 850 EVO 500 GB 2.5″ SSD                2     £112.00  £224.00
ICY Dock EZConvert Lite MB882SP-1S-3B          2     £11.05   £22.10
Grand Total                                                   £422.48

Future Upgrade Investment for 3rd Node

Component                                      Qty.  Price    Total
SM951 NVMe 128GB (MZVPV128HDGM)                1     £59.99   £59.99
Lycom DT-129 Host Adapter for PCIe-NVMe M.2    1     £21.99   £21.99
Samsung 850 EVO 500 GB 2.5″ SSD                1     £112.00  £112.00
Grand Total (Upgrade 1 node)                                  £193.98
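
If you want to double-check the arithmetic, a few lines of Python reproduce the totals above (quantities and prices taken straight from the tables):

```python
# Recompute the BoM grand totals from the quantities and unit prices above.
upgrade = [
    ("SanDisk SDCZ33-008G-B35 8GB",                 3,   4.14),
    ("SM951 NVMe 128GB (MZVPV128HDGM)",             2,  59.99),
    ("Lycom DT-129 Host Adapter for PCIe-NVMe M.2", 2,  21.99),
    ("Samsung 850 EVO 500 GB 2.5\" SSD",            2, 112.00),
    ("ICY Dock EZConvert Lite MB882SP-1S-3B",       2,  11.05),
]
# The 3rd-node upgrade is one more of each flash component (no USB needed).
third_node = [(name, 1, price) for name, _, price in upgrade[1:4]]

for label, bom in (("Upgrade", upgrade), ("3rd node", third_node)):
    total = sum(qty * price for _, qty, price in bom)
    print(f"{label} grand total: £{total:.2f}")
# Upgrade grand total: £422.48
# 3rd node grand total: £193.98
```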

As of today (04/06/2016), the total invested in this home lab is £1,007.44.

For Sale!!!

The C6100 came with an MR-SAS-8704EM2 SAS/SATA controller in the PCIe slot. This controller is supported by VMware vSphere 6.0 U2, but NOT for VSAN. vSphere sees it as an Avago (LSI) MegaRAID SAS 1078 controller with a queue depth of 975. The controller provides 3 Gb/s SAS across 1 port. If you’re interested in buying one of these controllers, I’m selling mine (3 of them) for £25 each (£350 new). You can reach me with a comment or through Twitter.

Avago (LSI) MegaRAID SAS 8704EM2 with a queue depth of 975