As I’ve mentioned, I’m doing a ton of SQL work lately (more on that later!) and a co-worker just asked me for something, so I threw together a quick query. One of our backups had failed and he just wanted to know where the backup file was being placed (in most cases we back up locally, but we do have some exceptions). Here’s the query to locate the file name:
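The original snippet didn’t survive the move to this theme, but it was along these lines, pulling the most recent backup locations out of the msdb history tables:

```sql
-- Where did recent backups land? msdb keeps the history.
SELECT TOP 20
       bs.database_name,
       bs.backup_finish_date,
       bmf.physical_device_name          -- the actual backup file path
FROM   msdb.dbo.backupset bs
JOIN   msdb.dbo.backupmediafamily bmf
       ON bs.media_set_id = bmf.media_set_id
ORDER BY bs.backup_finish_date DESC;
```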
I also have a new PowerShell script that reports on SQL backups that I want to get published, as well as a more intelligent index rebuilder that is Availability Group and log shipping aware.
I have a confession to make. I’m really busy at work. So busy, and challenged, that I haven’t had much time to keep the blog going and certainly not enough time to help support all the scripts I’ve written. In fact, if you’re subscribed to the Employee Directory post, you’ve seen the back and forth going on there! I feel really bad that I haven’t been able to get back to the few of you who have reached out to me because you’re running into problems, but the work/life balance is a little skewed right now and the blog and supporting you is what’s had to drop off.
The other problem is I don’t have the infrastructure around to support some of the scripts, the DFS Monitor comes to mind! No DFS here at work so hard to test!
So what to do? I want people to enjoy my scripts and I’d like to continue seeing development happen on them. So time to get these things out on GitHub. This way if you want to make changes you can fork the code off and do your thing and even put in a pull request–I swear that should be a push request, but that’s just me–and if I like what I see I’ll merge it in with the main code.
You can find me here, and the scripts I’ve published so far:
That’s all I have right now, but if there’s a script you’d really like to see up there just let me know in comments and I’ll get it posted ASAP. I hope this helps!
Continuing my HTML reporting series, I have a new twist on coloring columns. There are times when you want to group a number of rows together, so I created this function to do just that. It’s essentially a riff on the Set-AlternatingRows with a little bit of Set-CellColor thrown into the mix.
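A rough sketch of the idea (not the published function, and the regex handling here is simplified): walk the HTML that ConvertTo-Html produces one row per line, and flip between two CSS classes every time the value in the grouping column changes.

```powershell
# Sketch only: flip between two CSS classes whenever the value in the
# chosen column changes, so each group of rows shares one color.
function Set-GroupRowColorsByColumn {
    param (
        [Parameter(ValueFromPipeline=$true)][string[]]$InputObject,
        [string]$ColumnName,
        [string]$CSSEvenClass = 'even',
        [string]$CSSOddClass  = 'odd'
    )
    begin {
        $index   = -1      # position of the grouping column, found in the header row
        $current = $null   # last value seen in that column
        $class   = $CSSEvenClass
    }
    process {
        foreach ($line in $InputObject) {
            if ($line -match '<th>') {
                $headers = [regex]::Matches($line, '<th>(.*?)</th>') |
                    ForEach-Object { $_.Groups[1].Value }
                $index = [array]::IndexOf($headers, $ColumnName)
                $line
            }
            elseif ($line -match '<td>' -and $index -ge 0) {
                $cells = [regex]::Matches($line, '<td>(.*?)</td>') |
                    ForEach-Object { $_.Groups[1].Value }
                if ($cells[$index] -ne $current) {
                    # New group: swap the class
                    $class = if ($class -eq $CSSEvenClass) { $CSSOddClass } else { $CSSEvenClass }
                    $current = $cells[$index]
                }
                $line -replace '<tr>', "<tr class=""$class"">"
            }
            else { $line }
        }
    }
}
```

You’d pipe the output of ConvertTo-Html through it, the same way Set-AlternatingRows works.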
So last post I talked about dynamic properties in objects, and it was pretty cool. Then a crazy thing happened. Remember that colleague who was playing around with a faster Get-ChildItem project? He decided to get some real work done and began working on an Office 365 script, but the Get-MailboxStatistics cmdlet is a little different with O365 in that the TotalItemSize property is deserialized and pretty much only has a lousy string output. So how do you get the raw number without the ToMB() method? A little Googling and we found an Exchange blog post using RegEx to strip out all the extra crud, and a dynamic property! In the years I’ve been using PowerShell I’ve never once seen a dynamic property, and the very day I decided to learn about them we found a real-life usage! What are the odds? Anyway, the twist was that the Exchange team was applying Add-Member to an array of objects. I didn’t realize you could do that. And it worked!
But what about performance?
Typically we’d use a calculated field to strip out the crud and calculate the number, right? It’s the PowerShell way. This method seemed really cool and we wanted to use it, but the fear was that it’d be slower than Select-Object with a calculated field. So I tried it with some sample data and ran them both through Measure-Command. And my jaw dropped. The dynamic property and Add-Member technique was about 15x faster than Select-Object with a calculated field. What?! A few more tests proved it out.
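The shape of the comparison looked something like this (sample data and regex are illustrative, not our exact test): strip the byte count out of an Exchange-style size string both ways and time them.

```powershell
# Build some fake Get-MailboxStatistics-style objects to play with
$data = 1..5000 | ForEach-Object {
    [PSCustomObject]@{ Name = "user$_"; TotalItemSize = '1.5 GB (1,610,612,736 bytes)' }
}

# Classic approach: Select-Object with a calculated field
$select = Measure-Command {
    $data | Select-Object Name,
        @{Name='SizeBytes';Expression={ [int64]($_.TotalItemSize -replace '.*\(|\sbytes\)|,','') }}
}

# Dynamic property approach: Add-Member with a ScriptProperty
$member = Measure-Command {
    $data | Add-Member -MemberType ScriptProperty -Name SizeBytes -Value {
        [int64]($this.TotalItemSize -replace '.*\(|\sbytes\)|,','')
    } -PassThru
}

"Select-Object: $($select.TotalMilliseconds) ms"
"Add-Member  : $($member.TotalMilliseconds) ms"
```

Part of the speed difference is that a ScriptProperty is lazy: the scriptblock isn’t evaluated until you actually read the property, while the calculated field does the regex work on every object up front.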
Remember this last “wish list” item I posted yesterday?
Let’s actually do it in real life, this time using both the Select-Object and Add-Member techniques, and see who wins. Here’s the test code:
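The original listing was lost, but it was essentially this (the path is a stand-in for my scrap folder): add a SizeMB property to every file object, both ways, and time them.

```powershell
$path = 'C:\Test'   # stand-in for my test folder of script scraps

# Select-Object with a calculated field
Measure-Command {
    Get-ChildItem -Path $path -Recurse -File |
        Select-Object FullName, @{Name='SizeMB';Expression={ $_.Length / 1MB }}
}

# Add-Member with a ScriptProperty
Measure-Command {
    Get-ChildItem -Path $path -Recurse -File |
        Add-Member -MemberType ScriptProperty -Name SizeMB -Value { $this.Length / 1MB } -PassThru
}
```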
This is against my test folder where I keep a bunch of files (mostly scraps from other scripts!). There are about 234 files in these folders. The results?
Not the 15x speed difference we saw when doing string manipulation, but still 3x faster. So next I decided to run it against my entire C: drive. There are 216,177 files on my PC!
This time we got a little over FOUR times faster.
If you find yourself working with really big data sets and you need a calculated field, there’s no question this is a great technique and much faster than Select-Object with calculated fields. But there are some limitations. With Select you can actually draw information from other sources. On Friday I needed to produce a report of all of our VMs that were not on hardware version 10 and had a 2TB virtual disk (we actually have a few). Turns out VMware does not support VMDKs over 2TB on hardware version 9. So I set up a loop over our vCenter servers, then a loop over our VMs, and used Get-HardDisk on each VM. The problem is the pipeline from Get-HardDisk doesn’t include the vCenter information or the VM name, so I used calculated fields in a Select statement to refer to that information from my loop. Since the object created by Get-HardDisk does not contain that information, I wouldn’t be able to use the Add-Member method to pull it in from the loop.
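The shape of that report, assuming PowerCLI and with $vCenterServers standing in for our server list, looked roughly like this; the calculated fields reach out to the loop variables, which is exactly what a ScriptProperty evaluated later couldn’t do:

```powershell
# Sketch: calculated fields pulling vCenter and VM name from the enclosing loops
foreach ($vCenter in $vCenterServers) {
    Connect-VIServer -Server $vCenter | Out-Null
    foreach ($VM in Get-VM) {
        Get-HardDisk -VM $VM |
            Select-Object @{Name='vCenter';Expression={ $vCenter }},
                          @{Name='VM';Expression={ $VM.Name }},
                          Name, CapacityGB
    }
}
```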
OK, technically that’s not true. There is some reference information in Get-HardDisk, so in the Value parameter I could use Get-VM and from that data work my way back to that information, but it would be evaluated every time you display the object, which would be painfully slow!
I love this. I love its speed, and I love its simplicity. It’s a narrow use case, admittedly, but another powerful weapon in my arsenal. But Holy Crap! I’ve been trying to get people off of Add-Member (at least for normal object creation) for years, and now suddenly I’m going to embrace it like a champ! Fun stuff!
Short post today, but I wanted to talk about dynamic properties in objects. I haven’t had too much need for this kind of thing, but the fact that PowerShell can do it so easily is pretty cool. Imagine having an object where you plug a value into one field and dynamically another field in the object changes based on that first value. Kind of sounds like a function, but it’s all done right in the object itself.
A little background. A colleague of mine was trying to use the CMD.exe DIR command to make a fast Get-ChildItem, one that doesn’t get every single object property for a file, which in theory would retrieve that information much faster than Get-ChildItem. For now he’s abandoned the project because the speed wasn’t there, but it brought up some interesting points, and one of them was directory name. When using DIR you often only get the BaseName property, not the full path, but as a scripter I almost always want the full path. There are certainly ways of getting that information, but we would really want to display both. What if we could just have the object do it and not worry about it?
What if you were dealing with a lot of disk size values and wanted the flexibility of seeing MB and GB conversions? Sure, you could send it off to a function, but let’s consider another way.
It all begins with Add-Member. Let’s keep it simple and go with the BaseName calculation I mentioned before. First let’s define some things:
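The original listing didn’t survive, but it was essentially this (the path is just an example): grab a folder and stash its full path in a new object.

```powershell
# Get some data and build an object with the full path as a property
$data = Get-Item -Path 'C:\Windows\System32'
$object = [PSCustomObject]@{
    FullName = $data.FullName
}
```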
No real drama, get some data and create an object with that data as a property (FullName). Now I want to add my dynamic property:
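Again the listing was lost, but it was essentially this, assuming the $object built a moment ago:

```powershell
# Add a ScriptProperty whose value is computed from FullName on every read
$object | Add-Member -MemberType ScriptProperty -Name BaseName -Value {
    # $this is the object itself; split the path and keep the last element
    ($this.FullName -split '\\')[-1]
}
```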
This is where it gets cool. We take the object we created and pipe it into Add-Member, which adds a property, in this case a ScriptProperty that evaluates its value based on the scriptblock in the Value field. It’s a pretty simple scriptblock: I take the current object ($this), take its FullName property, and split it on the backslash. Then I take the last element of the returned array for display (which, in this case, will be the last folder in the path).
Cool. But we could have just assigned the BaseName value when we created the object, right? Check this out:
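That listing was lost too; it amounted to this, reusing the same $object (path is again just an example):

```powershell
# Change the FullName property...
$object.FullName = 'C:\Windows\System32\drivers'

# ...and BaseName re-evaluates its scriptblock, without us touching it
$object.BaseName    # now shows "drivers"
```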
Notice all we did was change the FullName property, but the BaseName property dynamically changed with it. You’ve essentially created a class that dynamically updates itself based on the values you assign. You could still mimic all of this behavior using functions, but it’s pretty interesting nonetheless. With PowerShell 5.0, Microsoft is promising us the ability to create classes without having to drop into C#, which would really open up these kinds of techniques.
Imagine something like this:
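Purely speculative until 5.0 actually lands, but a class version might look something like this sketch:

```powershell
# Hypothetical PowerShell 5.0 class: store the raw length once,
# expose MB/GB conversions computed on demand
class FileSize {
    [int64]$Length

    FileSize([int64]$Length) { $this.Length = $Length }

    [double] GetMB() { return $this.Length / 1MB }
    [double] GetGB() { return $this.Length / 1GB }
}

$file = [FileSize]::new(2147483648)
$file.GetGB()    # 2
```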
Where you’ve defined a class that automatically takes the Length property and converts it to MB and GB for you.
Follow-up post coming soon that explains this tweet:
Love finding new techniques that increase performance… do you?
If you’re looking to read about Modules and what they are and how to make them, this isn’t the blog post for you. No, the problem I have is living with Modules. I’ve only recently begun using them with the new job, but they have quickly made a place in my heart, because being able to just call up a module and reuse code saves so much time! But, as always happens, I want to make a change or fix a bug, and that’s where modules fall on their face (a little). This is my attempt to address that fact.
If you’re a Windows Administrator and have Microsoft SQL Servers in your environment–and you’d be a rare bird if you didn’t–then you’ve encountered this problem. Backups are missed for a while, a massive restore is done, something happens and your transaction log file grows way too big. It’s not uncommon to find it much larger than the actual database! If you’re like me you would RDP to the server, check out the data drive and see that the log file itself was oversized. You’d then hit Google to find out how to do a shrink file, then open SQL Server Management Studio and try to figure out the logical name of the log file, then run the query. Usually about 20 minutes of lookups, all so you can run a SQL query that typically takes a few seconds to complete. Time to address this using PowerShell.
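For reference, the by-hand version that used to eat those 20 minutes boils down to two statements (MyDatabase is a placeholder):

```sql
-- Find the log file's logical name...
USE MyDatabase;
SELECT name, type_desc, size FROM sys.database_files;

-- ...then shrink it to a target size in MB
DBCC SHRINKFILE (MyDatabase_log, 1024);
```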
There are a whole bunch of blog posts and scripts out there to do SQL queries in PowerShell, so I’ve really hesitated about posting my own version of it. But hey, it’s my blog and I can do what I want to. Do what I want to.
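Mine boils down to something like this minimal sketch (the function name and parameters here are illustrative, and the published version has more to it):

```powershell
# Run a query against a SQL Server using .NET's SqlClient and return a DataTable
function Invoke-SqlQuery {
    param (
        [string]$Server,
        [string]$Database,
        [string]$Query
    )
    $connection = New-Object System.Data.SqlClient.SqlConnection
    $connection.ConnectionString = "Server=$Server;Database=$Database;Integrated Security=True"
    $command = New-Object System.Data.SqlClient.SqlCommand($Query, $connection)
    $adapter = New-Object System.Data.SqlClient.SqlDataAdapter($command)
    $table = New-Object System.Data.DataTable

    $connection.Open()
    [void]$adapter.Fill($table)    # Fill returns a row count we don't need
    $connection.Close()

    $table
}
```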
Another interesting day at SpiceWorld. Sat through the morning session about IT Policies and I’m not going to lie, got a good nap in during that one. As necessary as they are it would take an amazing speaker to make it interesting!! The guys here were good, but not up to that monumental task.
Next session I attended was the JEA session from Jeffrey Snover. Great session talking about “Just Enough Administration” and how this is now possible with the tool sets provided by PowerShell. As with most security measures it’s only as good as the effort you put into it–which in my experience often means no one does much with it at all. As it stands it’s a great idea and the functionality is there, but I doubt it will be widely adopted. The reason is everything is custom: you create your custom tool sets with custom permissions and then override, or “proxy,” known PowerShell cmdlets with your requirements. What I mean by that is you can actually modify Restart-Computer with a custom validation set so someone using that toolset could only restart Server1 and Server3, but would be unable to touch Server2. Cool stuff, but honestly it seems impractical at this juncture. Once easier interfaces, templates, etc. start becoming available, I believe you’ll see this take off.
After that I went to the Windows 2012 Deduplication session. This was put on by a Veeam employee, but he took great pains to make sure we understood he was doing this as an enthusiast and not a speaker from Veeam. He was very successful. Great session and really highlighted the insane compression ratios possible with Windows 2012 in the proper circumstances. In this case the massive backup files created by Veeam backup software. Ninety-four percent? Are you insane?! Couldn’t wait to take that back to the office.
After that just visited a few vendors, then sat through the raffle. This was a lot of fun as you had to be in the room when the raffle called your name. If you weren’t your ticket was put into the shredder to chants of “Shred it! Shred it!” and the crowd was out for blood.
Kris Bushover also put on his yearly presentation of the best tickets his department received from Spiceworkers. Most of them were pretty terrified of this particular presentation, but it’s all in good fun and great for a laugh.
After that was an after-party there in the convention center, which was essentially a last chance to catch up and visit with everyone, new and old friends alike. Then a few of us met up with the community managers over at Champions, but Vegas rules were in effect so I can’t say anything more. Although I did find out what camel case meant, thanks Justin.
Great time, and the PowerShell session was so much fun to put on; I just love doing it. Not sure if I’ll be able to go back next year as it’s a very expensive trip and I had to foot the entire bill. I will try.