Horizon Logon Monitor Data Analysis
For those of you who don't know, the very useful Horizon Logon Monitor fling has been built into the Horizon agent as of 7.1. If you are running 6.x or 7.0, I strongly urge you to install this lightweight tool. The monitor gives you fantastic insight into what is chewing away at precious vCPU cycles during the Windows logon process and causing longer-than-anticipated logon times.
I wrote a nifty PowerShell script that parses the logs for a specific line, sums the time values, and calculates the average logon time for a selected set of logs. The more logs you have, the longer it takes, but the script can process about 10,000 logs per minute. Here is the script in its current form:
$LogPath = "\\servername\share\folder"

# Sum the parsed values and return the mean
function averageArray($times) {
    $Total = 0
    foreach ($i in $times) {
        $Total += [double]$i
    }
    return $Total / $times.Count
}

# Pull the logon time out of each log's summary line
$values = Select-String -Path "$LogPath\*.txt" -Pattern "\[LogonMonitor::LogSummary\] Logon Time:\s*([\d.]+)" |
    ForEach-Object { $_.Matches.Groups[1].Value }

$Avg = averageArray $values
$Avg = [math]::Round($Avg, 2)
echo "`nThe average logon time for the pool is $Avg seconds`n"
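As a side note, the same average could be computed without the helper function by letting Measure-Object do the math. A quick sketch, reusing the $values array from above:

# Cast the captured strings to doubles and let Measure-Object average them
$Avg = ($values | ForEach-Object { [double]$_ } | Measure-Object -Average).Average
$Avg = [math]::Round($Avg, 2)

Either approach works; the helper function just makes the arithmetic explicit.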
By default, the logs are written to %ProgramData%\VMware\VMware Logon Monitor\Logs\vmlm___.txt. It is possible to redirect these logs to a UNC path on a file share, so long as the user account that is logging in has write access to that share. To do this, modify the "RemoteLogPath" string registry value located at HKLM\Software\VMware, Inc.\VMware Logon Monitor, using the UNC path as the value. For our larger pools, we create per-pool folders in the share for their logs.
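If you'd rather script the registry change than click through regedit, something like this would do it from an elevated PowerShell prompt (the "Pool01" folder name is just a placeholder for one of your per-pool folders):

$regPath = 'HKLM:\SOFTWARE\VMware, Inc.\VMware Logon Monitor'
# Point the agent at a per-pool folder on the share
Set-ItemProperty -Path $regPath -Name 'RemoteLogPath' -Value '\\servername\share\folder\Pool01'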
After recomposing, let a day or two go by so the logs can collect. They are rather small, so you don't need to worry too much about space utilization right away, but I am implementing a log cleanup method in the next revision of my script. We usually make changes at least once a month, so I'd clear out everything older than thirty days.
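Until that revision ships, something along these lines would handle the cleanup (a sketch reusing the $LogPath variable from the script above):

# Delete any monitor logs not written to in the last thirty days
Get-ChildItem -Path "$LogPath\*.txt" |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddDays(-30) } |
    Remove-Item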
Stay tuned for an updated script that will provide more data. I am planning to drill into specific processes and find out which process is taking the most time.