Lync 2013

Monitoring Your Lync 2013 Peak Call Capacity

Recently, my company wanted to analyze our SIP trunk usage with our vendor and determine whether we had excess capacity. When the vendor was only able to provide call detail records, I went looking for solutions on the web.

I found several, but they all relied on querying the LcsCDR database to determine which sessions were open. That behavior changed with Lync 2013: session detail is no longer written to the database until the call ends, which means you cannot query for active/open sessions to determine usage.

I found this script by Rune Stoknes in his article Monitoring your Lync Peak Call Capacity. It wasn’t viable as written for Lync 2013 because the counter names changed.

I updated it for the 2013 counter names and made a few other adjustments to the loop to stabilize the script on our Windows Server 2012 Mediation Server.

Download here: Get-CallCounters

<# ---------------------------------------------------------------------
    .SYNOPSIS
    Get-CallCounters.ps1
    
    Script to retrieve the performance counters of ongoing inbound and
    outbound calls, and also calculate the combined usage.
    
    It will output the average and peak values for each hour for as long 
    as it runs.

    .NOTES
    Must be run on Mediation server.
    
    Made by Rune Stoknes - https://stoknes.wordpress.com/
	Adaptations by Duncan Bachen - http://eureka.greenhead.com

    v0.5 - Apr 15 2014 - first simple version, CSV output only
    v0.8 - Apr 16 2014 - added console output for current counters
    v0.9 - Feb 16 2015 - Altered the outbound and inbound call paths to match what the counter references were in Lync 2013 [Duncan Bachen]
         - Feb 17 2015 - Changed clear screen to not use the console class so it can be tested in ISE and other shells [Duncan Bachen] 
					   - Script kept hanging during long running without explanation. Changed the wait for keypress logic slightly.
					   - Added loop output time and counter to the display so that it's easier to monitor and make sure it's progressing.  
					   - Altered loop time to 10 secs to have finer granularity of counters [Duncan Bachen] 
		 
   
   -------------------------------------------------------------------- #>

   # Let's define the actual counter paths we will be fetching.
   # Changed to 2013 paths
<#
   $outboundCallPath = "\LS:MediationServer - 00 - Outbound Calls(_total)\- 000 - Current"
   $inboundCallPath = "\LS:MediationServer - 01 - Inbound Calls(_total)\- 000 - Current" #>
   
   
   $outboundCallPath = "\LS:MediationServer - Outbound Calls(_Total)\- Current"
   $inboundCallPath = "\LS:MediationServer - Inbound Calls(_Total)\- Current"

  
   # Create CSV file for output. It will be named \Users\<user>\Documents\Call counters DDMMYYYY.csv
   
   $outputPath = $env:HOMEPATH + "\Documents\Call counters " + (Get-Date -UFormat "%d%m%Y") + ".csv"
   
   try
   {
        "Date;Hour;Avg inbound;Peak inbound;Avg outbound;Peak outbound;Avg concurrent;Peak concurrent" | Out-File -FilePath $outputPath -Encoding default
   }
   catch
   {
        "Unable to create file " + $outputPath
        exit
   }

   # Create empty arrays to store counters. These will be reset every hour.

   [int[]]$inboundCalls = $null
   [int[]]$outboundCalls = $null
   [int[]]$concurrentCalls = $null
   [int]$counter = 0
   
   # Get the time
   $now = Get-Date

   # Now, every 10 seconds we will repeat this same procedure:
   # -> Fetch current call counter for inbound and outbound
   # -> Store data in array
   # -> if the hour changes, calculate average and peak counters and store to file

   $Stopped = $False

   do {

        # That was then, this is now
        $then = $now
        $counter += 1

        # Get the counters we want, and add them to the arrays
        [int]$currentInbound = (Get-Counter $inboundCallPath).CounterSamples[0].CookedValue
        [int]$currentOutbound = (Get-Counter $outboundCallPath).CounterSamples[0].CookedValue
                
        $inboundCalls += $currentInbound
        $outboundCalls += $currentOutbound
        $concurrentCalls += ($currentInbound + $currentOutbound)

        
        # Now let's get the time
        $now = Get-Date
        
        # Output to console
        # The Console class is not available in ISE and other shells, so don't call it directly
        # [System.Console]::Clear()
        Clear-Host
        "Current number of inbound calls: " + $currentInbound
        "Current number of outbound calls: " + $currentOutbound
        "Current number of concurrent calls: " + ($currentInbound + $currentOutbound)
	    "Current Hour Counter: " + $counter
        "Current Loop Start: " +  $Now

        
        # Has the hour changed since then?
        if ($now.Hour -ne $then.Hour)
        {
 
            # The peak and average value can be calculated and derived using Measure-Object
            $inbound = ($inboundCalls | Measure-Object -Maximum -Average)
            $outbound = ($outboundCalls | Measure-Object -Maximum -Average)
            $concurrent = ($concurrentCalls | Measure-Object -Maximum -Average)

            # Let's append this to the CSV file we created
            (Get-Date -Date $then -UFormat "%d %b %Y;") + $then.Hour + ";" + $inbound.Average + ";" + $inbound.Maximum +  ";" + $outbound.Average + ";" + $outbound.Maximum + ";" + $concurrent.Average + ";" + $concurrent.Maximum | Out-File -FilePath $outputPath -Encoding default -Append

            # Let's not forget to reset our arrays now that the hour has changed
            [int[]]$inboundCalls = $null
            [int[]]$outboundCalls = $null
            [int[]]$concurrentCalls = $null
            [int]$counter = 0
                        
        }

        # Let's hear if the user wants to end the script
	    Write-Host "`n `nPress ESC to end the script on next 10s loop." -ForegroundColor Red

                
        if ($Host.UI.RawUI.KeyAvailable -and ($Host.UI.RawUI.ReadKey("IncludeKeyUp,NoEcho").VirtualKeyCode -eq 27)) 
         {
           Write-Host "`nExiting shortly...`nCheck output file for any historical data:`n" $outputPath "`n" -Background DarkRed
              # Clean up and exit while pointing to the results file
           $Stopped = $True
        }

    Start-Sleep -Seconds 10

   }
   Until ($Stopped)

Lync Monitoring Time Zone

When attempting to utilize Lync 2013 Monitoring Reports, we were presented with an error which began with:
An error has occurred because the time zone of SQL Server Reporting Services (SSRS) is inconsistent with the time zone of the…

We run our SQL servers in GMT because of our international presence, but our Lync servers run in EST. We recently rebuilt our Lync environment and hadn’t verified the monitoring reports on the new installation because they were so rarely used.

The error message itself provides the steps for resolution. These steps need to be run against both the LcsCDR and QoEMetrics databases; the error message may reference either one, as both need to be fixed.

1. Run the dbo.RTCClearTimeZoneInfo stored procedure on the [LcsCDR/QoeMetrics] database
2. Open the reports home page in your web browser using the SSRS that has the time zone that you want to use
3. Run the dbo.RtcGenerateSummaryTables stored procedure on the [LcsCDR/QoEMetrics] database to improve your reporting experience.
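For reference, step 1 is a plain stored procedure call. A minimal sketch, assuming the default monitoring database names, run from SQL Server Management Studio against the monitoring back end (repeat with USE QoEMetrics for the other database):

```sql
-- Sketch of step 1: clear the stored time zone info in the CDR database
USE LcsCDR;
GO
EXEC dbo.RtcClearTimeZoneInfo;
GO
```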

However, after running the procedure in step 1, we were presented with the following error:

Msg 547, Level 16, State 0, Procedure RtcClearTimeZoneInfo, Line 6

The DELETE statement conflicted with the REFERENCE constraint “FK_DaylightSavingYears_1”. The conflict occurred in database “QoEMetrics”, table “dbo.DaylightSavingYears”, column ‘TimeZone’.

The statement has been terminated.

(1 row(s) affected)

I found a thread on TechNet that correctly identified the problem in Lync 2013; the original poster had resolved it through a Microsoft support call.

There are two issues which prevent the successful execution of the stored procedure in Lync 2013.

  1. There is now a foreign key constraint between the DaylightSavingYears table and the TimeZones table which prevents the TimeZones table from being cleared first.
  2. There is now a trigger associated with the TimeZoneConfiguration table which requires there always be exactly one row in the table.

Resolution

To work around this issue, the following SQL script can be run to perform the same actions as the stored procedure. It executes the operation on both the LcsCDR and QoEMetrics databases.

USE [LcsCDR]
GO
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
declare @Status int
set @Status = 0
BEGIN TRANSACTION
    DELETE from DaylightSavingYears WITH (TABLOCKX)
    if (@@error <> 0) begin
        ROLLBACK TRANSACTION
    end
    DELETE from TimeZones WITH (TABLOCKX)
    if (@@error <> 0) begin
        ROLLBACK TRANSACTION
    end
    ALTER TABLE TimeZoneConfiguration DISABLE TRIGGER "TimeZoneConfigurationTrigger";
    DELETE from TimeZoneConfiguration WITH (TABLOCKX)
    if (@@error <> 0) begin
        ROLLBACK TRANSACTION
    end
    ALTER TABLE TimeZoneConfiguration ENABLE TRIGGER "TimeZoneConfigurationTrigger";
COMMIT TRANSACTION
exec @Status = RtcTruncateSummaryTables
SELECT @Status

USE [QoEMetrics]
GO
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
declare @Status int
set @Status = 0
BEGIN TRANSACTION
    DELETE from DaylightSavingYears WITH (TABLOCKX)
    if (@@error <> 0) begin
        ROLLBACK TRANSACTION
    end
    DELETE from TimeZones WITH (TABLOCKX)
    if (@@error <> 0) begin
        ROLLBACK TRANSACTION
    end
    ALTER TABLE TimeZoneConfiguration DISABLE TRIGGER "TimeZoneConfigurationTrigger";
    DELETE from TimeZoneConfiguration WITH (TABLOCKX)
    if (@@error <> 0) begin
        ROLLBACK TRANSACTION
    end
    ALTER TABLE TimeZoneConfiguration ENABLE TRIGGER "TimeZoneConfigurationTrigger";
COMMIT TRANSACTION
exec @Status = RtcTruncateSummaryTables
SELECT @Status

The nice thing about the script is that it combines the procedures for both LcsCDR and QoEMetrics in a single pass. After running it, I still needed to generate the summary tables with dbo.RtcGenerateSummaryTables in each of the databases.
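That final step can be sketched as follows, again assuming the default database names:

```sql
-- Regenerate the reporting summary tables in both monitoring databases
USE LcsCDR;
GO
EXEC dbo.RtcGenerateSummaryTables;
GO
USE QoEMetrics;
GO
EXEC dbo.RtcGenerateSummaryTables;
GO
```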