vSpecialist Labs: Roadmap

Part Two – Roadmap

In the hardware post for the vSpecialist Labs, we listed the kit we have to play with to construct the Lab. The challenge now is deciding what to do with it. Here is a scratch list of the projects we will be running in the Labs over the coming months.

Status key: Testing (T), Research (R), Proof-of-Concept (POC), Evaluation (E).

Infrastructure / vSphere:

- ghettoVCB VM backups (T)

- ESXi Hardening using host profiles (T)

- Cisco UCS Platform Emulator v2.0 deployment and configuration (T) (E)

- vCenter Orchestrator appliance (T)

- Integrating vCenter Orchestrator appliance with Cisco UCS platform emulator (T) (E)

- vCenter Operations (E)
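
Several of the vSphere items above will come down to scripting against the vSphere API, so as a starting point, here is a minimal sketch using VMware's pyVmomi Python SDK to connect to a vCenter (or a standalone ESXi host) and list the hosts it manages. The hostname and credentials are placeholders for illustration, not our actual lab details.

```python
import ssl
from pyVim.connect import SmartConnect, Disconnect
from pyVmomi import vim

# Placeholder connection details -- substitute your own vCenter/ESXi host.
VCENTER = "vcenter.lab.local"
USER = "administrator@vsphere.local"
PASSWORD = "changeme"

# The lab uses self-signed certificates, so skip verification here.
context = ssl._create_unverified_context()
si = SmartConnect(host=VCENTER, user=USER, pwd=PASSWORD, sslContext=context)
try:
    content = si.RetrieveContent()
    # A container view walks the inventory for every HostSystem object.
    view = content.viewManager.CreateContainerView(
        content.rootFolder, [vim.HostSystem], True)
    for host in view.view:
        hw = host.summary.hardware
        print(f"{host.name}: {hw.numCpuCores} cores, "
              f"{hw.memorySize // (1024 ** 3)} GB RAM")
    view.DestroyView()
finally:
    Disconnect(si)
```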

vCloud Director:

- Multi-site vCloud Director configuration with v1.5 (R) (POC)

- vCloud Director appliance (T)

- Multi-site PvDCs per Organisation (R)

- vCloud Connector 1.5 within an Organisation as an endpoint (T)
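
Much of the vCloud Director work above can also be driven through the vCD 1.5 REST API. As a rough sketch (the endpoint and credentials below are placeholders): logging in is a POST to /api/sessions with Basic auth, and the x-vcloud-authorization token returned in the response headers is passed on subsequent calls.

```python
import requests

# Placeholders -- substitute your own vCD cell and credentials.
VCD = "https://vcd.lab.local"
USER = "administrator@System"   # format is user@organisation
PASSWORD = "changeme"

# vCD 1.5 negotiates the API version through the Accept header.
headers = {"Accept": "application/*+xml;version=1.5"}

# Log in: the session token comes back in the x-vcloud-authorization header.
session = requests.post(f"{VCD}/api/sessions",
                        auth=(USER, PASSWORD),
                        headers=headers,
                        verify=False)  # lab uses self-signed certs
session.raise_for_status()
headers["x-vcloud-authorization"] = session.headers["x-vcloud-authorization"]

# Reuse the token to list the organisations visible to this user.
orgs = requests.get(f"{VCD}/api/org", headers=headers, verify=False)
print(orgs.text)
```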

Site Recovery Manager 5:

- SRM within a vCD Organisation as an endpoint (T)

- vSphere Replication (T)

VMware View 5:

- View deployment with Cisco UCS (T) (POC)

Over time, we will link these roadmap items to posts with more information. Please also see the vSpecialist Labs Overview page for lab configuration information.

vSpecialist Labs: Hardware

Part One – Hardware

I’ve been fortunate to get my hands on some pretty decent hardware (for a ‘home’ lab setup anyway) to build this lab. The kit is as follows:

Servers:

- 2 Servers @ 2 x Quad Xeon 2.0GHz / 16GB RAM

- 2 Servers @ 2 x Dual Xeon 2.66GHz / 16GB RAM

- 1 Server @ 2 x Quad Xeon 2.0GHz / 8GB RAM

- 1 Server @ 2 x Dual Xeon 2.6GHz / 8GB RAM

- 1 Server @ 2 x Dual Xeon 3.0GHz / 4GB RAM

Storage:

- 1 x HP MSA1000 (14 x Ultra320 SCSI 15k)

- 2 x HP / Brocade switches (7 ports per switch)

- 1 x MSA60 (14 x Ultra320 SCSI 15k)

Network:

- 4 x 1Gb NICs per ESXi host

- 2 x 4Gb FC HBAs per ESXi host

- 1 x 48-port Cisco access switch

All the kit is now installed in the racks and available via remote control. Fortunately, this isn’t a true home build in the definitive sense, as the kit is being hosted in a Tier 3 datacentre in the South West – something that will make my wife happy (show me a long-suffering vAdmin’s significant other who loves servers in the spare room) and also keep the power bills at something near normal levels.

Back to vSpecialist Labs: Overview; next page: vSpecialist Labs: Roadmap.

vSpecialist Labs: Overview

So, after several months of procrastination and searching for build information, my ‘home lab’ build begins; over time it will grow into the new ‘vSpecialist Labs’ sandbox infrastructure. With it, I plan to do test and research work on configurations for production deployments, as well as have an environment which we at www.vspecialist.co.uk can use for study (VCP, VCAP and VCDX) too.

This page is the basis for documenting the configuration of our lab, the reference information used to create the configurations, and hints and tips learned along the way. If you are building a home lab or are looking for information on how to go about this, then we hope you find it useful.

A note about licensing: the majority of the licenses used in the lab for research are the 60-day evaluation licenses available from VMware. Currently, there is a movement to get the popular VMTN subscription (comparable to a MS TechNet account) reinstated by VMware. If you would like more information, the discussion is being driven on Twitter by Mike Laverick (@Mike_Laverick) using the hashtag #VMTNSubscriptionMovement.

vSpecialist Labs Documentation
Part One – Hardware: http://www.vspecialist.co.uk/vspecialist-labs-hardware/
Part Two – Roadmap: http://www.vspecialist.co.uk/vspecialist-labs-roadmap/