The Surly Admin

Father, husband, IT Pro, cancer survivor

DFS Replication Monitor With History Upgrade 2.6

By far my most popular post is the DFS Replication Monitor With History script, so I like to revisit it every now and then and make sure I’m doing things the best way I know how.  Read on to see what’s new in this upgrade.

Version 2.6

This isn’t a big upgrade, but while looking at someone else’s script it suddenly occurred to me that I could be saving my data in XML format instead of CSV.  Why is this important?  With a CSV every property is stored as a string, but one of the properties in the script absolutely has to be a working DateTime value.  This turned into a problem for international users (read more detail here) because Windows can be set to format dates and times differently when they’re converted to strings.  Not to mention I had to run a little loop after loading the CSV data to convert every date property back into a DateTime, and then convert it to a string again before saving.

XML solves this problem because the CLIXML serializer preserves .NET data types, which saves me from having to code specifically for international date formats.
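As a quick illustration of the difference (file and property names here are just examples, not the script’s actual ones), Export-Clixml round-trips a DateTime intact, while Export-Csv flattens it to a culture-dependent string:

    # Hypothetical demo record; property names are illustrative only
    $Record = New-Object PSObject -Property @{ RFName = "Share1"; RunDate = Get-Date }

    # CSV: RunDate comes back as a [String] in whatever format the local culture uses
    $Record | Export-Csv C:\Temp\demo.csv -NoTypeInformation
    (Import-Csv C:\Temp\demo.csv).RunDate.GetType().Name    # String

    # CLIXML: RunDate comes back as a real [DateTime], no parsing required
    $Record | Export-Clixml C:\Temp\demo.xml
    (Import-Clixml C:\Temp\demo.xml).RunDate.GetType().Name # DateTime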

Turns out, converting the script to use XML was pretty easy too.  First I had to take the existing data and convert it to XML; that code became Convert-DFSDataToXML.ps1, and if you want to move to the new version of DFS Monitor with History and keep your historical data you’ll need to download it and run it once.  The “gist” of the script is that you take your array and pipe it into Export-Clixml and voila, your data is perfectly preserved in an XML file.  The only downside is that the data file, for me, jumped from 202 KB to 1.4 MB.
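A minimal sketch of what a one-time conversion like that might look like (the file name and the culture-sensitive parse are my assumptions, not the actual script):

    Param (
        [string]$DataLocation = "C:\Scripts\DFSMonitor"
    )

    # Load the old CSV history and rebuild real DateTime values one last time.
    # Note: [DateTime]::Parse uses the local culture, which is exactly the
    # problem the CSV format had in the first place.
    $Data = Import-Csv (Join-Path $DataLocation "DFSData.csv") | ForEach-Object {
        $_.RunDate = [DateTime]::Parse($_.RunDate)
        $_
    }

    # Save it back out as CLIXML -- the types survive the round trip from here on
    $Data | Export-Clixml (Join-Path $DataLocation "DFSData.xml")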

After that I just had to remove the conversion loops: one at load time, where I converted from string to date/time, and one just before saving, where I converted back to a string in a very specific format.  One minor problem was that the script died trying to add the new data, which turned out to be a variable type mismatch.  When you load the XML back into an object you get an array of PSCustomObjects, but my save code was adding new items to the array as PSObjects.  Simply changing the save portion to create PSCustomObjects solved that problem.
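A simplified sketch of that fix (the property names are illustrative, not the script’s actual ones):

    # Before: new records were created as PSObject
    #   $Data += New-Object PSObject -Property @{ ... }

    # After: create them as PSCustomObject so they match the type
    # Import-Clixml hands back when the history is loaded
    $Data += New-Object PSCustomObject -Property @{
        RFName       = "Share1"
        BacklogCount = 0
        RunDate      = Get-Date
    }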

How do you upgrade?

First download the conversion script here, then download the new DFS Monitor with History script here.  I’m going to assume you are running this on the server/workstation where you save your historical data, in a path called C:\Scripts\DFSMonitor.  Save both scripts to that location.

  1. Open PowerShell and navigate to the dfsmonitor folder.
  2. Run the conversion script:  .\Convert-DFSDataToXML.ps1 -DataLocation C:\Scripts\DFSMonitor
  3. When it completes, modify the PARAM section of DFSMonitorWithHistory.ps1 to match your environment

That’s it.  If you already have a scheduled task just make sure it’s pointing to the new script, and if you saved it with the same name it should be.
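If you’re creating the task from scratch, something along these lines would work (the path and schedule here are examples, adjust for your environment):

    # Hypothetical example: run the monitor daily at 6:00 AM.
    schtasks.exe /Create /TN "DFS Monitor" /SC DAILY /ST 06:00 `
        /TR "powershell.exe -NoProfile -ExecutionPolicy Bypass -File C:\Scripts\DFSMonitor\DFSMonitorWithHistory.ps1"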

If you’d like more information about setting up a scheduled task to run PowerShell scripts, you can read the How-To I wrote for Spiceworks.

Feedback

As always, I want to hear from you, especially if you’re an international user with your funky date/time formats (kidding!).  I might not answer you in comments here, but I will always try to get back to you in email.


December 20, 2012 | PowerShell

8 Comments

  1. […] Version 2.6 release: An update to this post, I’ve recently released version 2.6 of DFS Monitor with History that now saves the data in XML format.  Read more about it here. […]

    Pingback by DFS Replication Monitoring « The Surly Admin | December 20, 2012 | Reply

  2. This is a great script and we are currently using it to monitor our many DFS replication groups. Question for you though. Is there a way the Y axis could auto-scale, so that it is better scaled to the time slot you are looking at? We have spikes at times when there are 17,000 files being replicated, but other times there are only 30 files. It would just be a bit easier to read the graph if this were an option. Thanks again for all your work on this. It has been a great reporting tool for us.

    Comment by Brian | February 27, 2014 | Reply

    • I wish there was but Google doesn’t give any controls for that, so unfortunately no.

      Comment by Martin9700 | February 27, 2014 | Reply

  3. Trying to run this script on a 2012 R2 replication group is hitting a weird snag: the backlog is being reported twice for some oddball reason (the stock Windows 2012 PowerShell cmdlet does the same thing, so the problem isn’t necessarily with this script but with the way WMI is feeding back the result set). The error when I run the script says “Measure-Object: input object is not numeric”, so I dug in a little and found the $Output array was getting populated with a sub-array, meaning my backlog was being expressed twice, i.e. {4, 4} – hence the “that’s not a number, dude” error.

    The workaround below is VERY far from elegant, since I’m not a very experienced PowerShell scripter, but it eliminated the error, and I am hoping a more experienced script writer can contribute a better long-term fix to replace it.

    Looking at the foreach loop starting at line 472 I made the following changes, ultimately with the goal to select only the FIRST of the two returned values in the $output array. (The decision to implement a new “Serverpair” array instead of working with the $output array was deliberate in an attempt to avoid messing with the original code and to keep the change local to that code section.)

    #Now add the new data
    ForEach ($GroupName in $AllGroupNames.Values)
    {   #$GroupName = ($Group.Split(":"))[1]
        $UniqueReplFolders = $Output | Where {$_.GroupName -eq $GroupName} | Select Folder -Unique
        ForEach ($Folder in $UniqueReplFolders)
        {
            $Serverpair = @()
            ForEach ($Pair in $Output)
            {
                $Serverpair += New-Object PSCustomObject -Property @{
                    Name      = $Pair.InServer + " replicating to " + $Pair.OutServer
                    GroupName = $Pair.GroupName
                    Folder    = $Pair.Folder
                    Backlog   = $Pair.Backlog | Select-Object -First 1
                }
            }

            $BacklogCount = ($Serverpair | Where {$_.GroupName -eq $GroupName -and $_.Folder -eq $Folder.Folder} | Measure-Object Backlog -Sum).Sum
            $NewRGName = $Folder.Folder + ":" + $GroupName
            $Data += New-Object PSCustomObject -Property @{
                RFName       = $Folder.Folder
                RGGUID       = $NewRGName
                BacklogCount = $BacklogCount
                RunDate      = $ScriptRunDate
            }
        }
    }

    Comment by Christopher | October 29, 2014 | Reply

  4. […] other problem is I don’t have the infrastructure around to support some of the scripts, the DFS Monitor comes to mind!  No DFS here at work so hard to […]

    Pingback by Opening up my Scripts « The Surly Admin | April 14, 2015 | Reply

  5. This is a great script, thank you for writing it. I am happy our Microsoft support guy showed us this one. A couple of questions from a non-scripting pleb.

    1. Is there a way to remove the last column with the file names? There are times when knowing the backlogged files are key, but for the general view, I would love to be able to just show the numbers.

    2. Is there a way to target specific replication groups? When I run this in our environment, I really only care about 6 specific RGs. But the servers are members of many more RGs, ones I have no permission to or care about, and I would like to exclude them if possible. A lot of these extra RGs end up throwing errors when running the script as a bunch of servers do not have DFS running. If there were a way to focus on specific RGs, that would make this even better.

    I realize from your previous posts that you do not have an environment to test in. I would be happy to test any changes/improvements, I just would have no idea how to code them.

    Comment by David | June 5, 2015 | Reply

  6. This script has saved many days of trouble. Recommended by Microsoft tech as well.

    The script worked great for a while. Then recently a Microsoft patch or something changed, and I’m getting WMI errors on lots of servers and the report fails.

    PS>TerminatingError(Write-Debug): "Cannot bind argument to parameter 'Message' because it is null."
    Write-Debug : Cannot bind argument to parameter 'Message' because it is null.
    At \\eu-solarwinds\c$\inetpub\wwwroot\DFSRMon\DFSMonitorWithHistory.ps1:465 char:25
    + Write-Debug $Result.ErrorDetail
    + ~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : InvalidData: (:) [Write-Debug], ParameterBindingValidationException
    + FullyQualifiedErrorId : ParameterArgumentValidationErrorNullNotAllowed,Microsoft.PowerShell.Commands.WriteDebugCommand

    DEBUG: Receiving job number: 55

    Comment by Wayne | July 3, 2015 | Reply

    • I’m having the same WMI error issues as Wayne; it was working perfectly before.

      Comment by David | September 7, 2016 | Reply
