If you’re running your vSphere deployment on HP kit then there’s a pretty good chance you use the HP Customized ISO Image for installation, for example this one for ESXi 4.1 U1. These customised images typically contain HP management tools and drivers and are great for saving time during the installation process. Naturally you will be upgrading ESXi at some point, but it’s important to keep the HP components up to date as well.
The HP Enterprise Virtual Array Family with VMware vSphere 4.0, 4.1 and 5.0 Configuration Best Practices Guide, available here, contains many recommendations for ESXi configuration. Among them are a number of recommended settings to enhance storage performance; I picked a subset appropriate for the environment and then needed to configure them on all ESXi hosts.
These can be implemented via PowerCLI, and the script below demonstrates how the different types of setting can be configured.
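As a minimal sketch (the vCenter name, setting values and the HP vendor filter here are illustrative assumptions, not the guide's definitive list; check the guide for the values appropriate to your array):

# Connect to vCenter (hypothetical server name)
Connect-VIServer -Server vcenter.example.com

foreach ($vmhost in Get-VMHost) {
    # Host-level advanced software setting, e.g. outstanding disk requests per LUN
    Set-VMHostAdvancedConfiguration -VMHost $vmhost -Name "Disk.SchedNumReqOutstanding" -Value 32

    # Per-LUN storage setting, e.g. Round Robin multipathing on the EVA-presented disks
    Get-ScsiLun -VMHost $vmhost -LunType disk |
        Where-Object { $_.Vendor -eq "HP" } |
        Set-ScsiLun -MultipathPolicy RoundRobin
}

This covers the two different types of setting involved: host advanced settings and per-LUN storage settings, applied across every host in one pass.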
Whilst attempting to install the Dell ESXi Agent and upgrade the ESXi revision on a new VMware host, I hit the following issues:
Dell Agent - Error encountered:
Description - I/O Error (17) on file /var/tmp: [Errno 17] File exists: '/var/tmp'
Message - Unable to create, write or read a file as expected. I/O Error (17) on file /var/tmp: [Errno 17] File exists: '/var/tmp'
ESXi upgrade with Update Manager - The host returns esxupdate error codes: 10.
In ESXi 4.0 the advanced software setting UserVars.CIMOEMProvidersEnabled is used to enable an agent such as the Dell OpenManage offline bundle for ESXi, which provides hardware management for Dell PowerEdge servers. This is supplied as a vSphere Installation Bundle (VIB), and Alan Renouf has a great post on his blog on how to install a VIB like this one via PowerCLI.
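For reference, a minimal sketch of that kind of approach (the host name and bundle path below are hypothetical placeholders, and the host needs to be in maintenance mode before patching):

# Put the host into maintenance mode first (hypothetical host name)
$vmhost = Get-VMHost "esxi01.example.com" | Set-VMHost -State Maintenance

# Upload and install the Dell OpenManage offline bundle (path is a placeholder)
Install-VMHostPatch -VMHost $vmhost -LocalPath "C:\Temp\Dell_OpenManage_offline_bundle.zip"

# Enable the OEM CIM providers so the agent loads; a reboot is typically required
Set-VMHostAdvancedConfiguration -VMHost $vmhost -Name "UserVars.CIMOEMProvidersEnabled" -Value 1
Restart-VMHost -VMHost $vmhost -Confirm:$false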
Whilst testing out installing the same Dell Agent into ESXi 4.
During the initial stages of an upgrade of a number of VMware hosts from ESX 3.5 U5 to ESXi 4.0 U2, boot times rose from the normal few minutes (most of which are Dell hardware checks) to around 12 minutes.
In particular, it appeared to hang for 5 minutes while the screen displayed the following:
Loading module multiextent
This would only happen after the install was completed and the host was reconnected to the Fibre Channel SAN; otherwise boot times were normal.