So last post I talked about dynamic properties in objects, and it was pretty cool. Then a crazy thing happened: that same colleague who was playing around with a faster Get-ChildItem project? He decided to get some real work done and began working on an Office 365 script, but the Get-MailboxStatistics cmdlet is a little bit different with O365 in that the TotalItemSize property is deserialized and pretty much only gives you a lousy string output. So how do you get the raw number without the ToMB() method? A little Googling and we found an Exchange blog post using RegEx to strip out all the extra crud, and a dynamic property! In the years I’ve been using PowerShell I’ve never once seen a dynamic property, and the very day I decided to learn it we actually found a real-life usage! What are the odds? Anyway, the twist was that the Exchange team was applying Add-Member to an array of objects. I didn’t realize you could do that. And it worked!
But what about performance?
Typically we’d use a calculated field to strip out the crud and calculate the number, right? It’s the PowerShell way. This method seemed really cool and we wanted to use it, but the fear was it’d be slower than Select-Object with a calculated field. So I tried it with some sample data and ran them both through Measure-Command. And my jaw dropped. The dynamic property and Add-Member technique was about 15x faster than Select-Object with a calculated field. What?! A few more tests proved it out.
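To give a feel for the comparison, here’s a sketch. The mailbox data is made up (the real test ran against Get-MailboxStatistics output), and the regex is the spirit of the technique, not a verbatim copy of the Exchange team’s:

```powershell
# Fake some deserialized mailbox statistics with string-only size values
$Data = 1..5000 | ForEach-Object {
    [PSCustomObject]@{ Name = "Mailbox$_"; TotalItemSize = "1.23 GB (1,322,154,752 bytes)" }
}

# Technique 1: Select-Object with a calculated field
$SelectTime = Measure-Command {
    $Data | Select-Object Name,
        @{Name="TotalItemSizeBytes";Expression={ [int64]($_.TotalItemSize -replace '.*\(|\sbytes\)|,') }}
}

# Technique 2: Add-Member ScriptProperty applied to the whole array at once
$AddMemberTime = Measure-Command {
    $Data | Add-Member -MemberType ScriptProperty -Name TotalItemSizeBytes -Value {
        [int64]($this.TotalItemSize -replace '.*\(|\sbytes\)|,')
    }
}

"Select-Object: $($SelectTime.TotalMilliseconds) ms"
"Add-Member:    $($AddMemberTime.TotalMilliseconds) ms"
```

A big part of why Add-Member wins is that a ScriptProperty is lazy: Add-Member just attaches the scriptblock, and it isn’t evaluated until you actually read the property, whereas Select-Object evaluates the expression for every object up front.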
Remember this last “wish list” item I posted yesterday?
Let’s actually do it in real life, this time using both the Select-Object and Add-Member techniques, and see who wins. Here’s the test code:
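Something along these lines; the path is a placeholder and SizeMB is my illustrative calculated field:

```powershell
$Path = "C:\Test"   # placeholder for my test folder

# Technique 1: Select-Object with a calculated field
$SelectTime = Measure-Command {
    Get-ChildItem -Path $Path -Recurse -File |
        Select-Object FullName, Length,
            @{Name="SizeMB";Expression={ [math]::Round($_.Length / 1MB, 2) }}
}

# Technique 2: Add-Member with a ScriptProperty
$AddMemberTime = Measure-Command {
    Get-ChildItem -Path $Path -Recurse -File |
        Add-Member -MemberType ScriptProperty -Name SizeMB -Value {
            [math]::Round($this.Length / 1MB, 2)
        } -PassThru
}

"Select-Object: $($SelectTime.TotalMilliseconds) ms"
"Add-Member:    $($AddMemberTime.TotalMilliseconds) ms"
```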
This is against my test folder where I keep a bunch of files (mostly scraps from other scripts!). There are about 234 files in these folders. The results?
Not the 15x speed difference we saw when doing string manipulation, but still 3x faster. So next I decided to run it against my entire C: drive. There are 216,177 files on my PC!
This time we got a little over FOUR times faster.
If you find yourself working with really big data sets and you need a calculated field, there’s no question this is a great technique and much faster than Select-Object with calculated fields. But there are some limitations. With Select you can actually draw information from other sources. On Friday I needed to produce a report of all of our VMs that were not on hardware version 10 and had a 2TB virtual disk (we actually have a few). Turns out VMware does not support VMDKs over 2TB on hardware version 9. So I set up a loop over our vCenter servers, then a loop over our VMs, and used Get-HardDisk on each VM. But the problem is the pipeline from Get-HardDisk doesn’t include the vCenter information or the VM name, so I used calculated fields in a Select statement to pull that information from my loop. Since the object created by Get-HardDisk does not contain that information, I wouldn’t be able to use the Add-Member method to get information from the loop.
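For context, the report loop looked roughly like this. This is a sketch only: it assumes VMware PowerCLI is loaded, and $vCenters and the exact property names are illustrative:

```powershell
# Hypothetical vCenter list -- substitute your own
$vCenters = "vcenter01", "vcenter02"

$Report = foreach ($vCenter in $vCenters) {
    Connect-VIServer -Server $vCenter | Out-Null
    foreach ($VM in Get-VM) {
        # Calculated fields reach outside the pipeline and grab the loop variables
        Get-HardDisk -VM $VM |
            Where-Object { $_.CapacityGB -ge 2048 } |
            Select-Object @{Name="vCenter";Expression={ $vCenter }},
                @{Name="VM";Expression={ $VM.Name }},
                Name, CapacityGB
    }
    Disconnect-VIServer -Server $vCenter -Confirm:$false
}
```

Those calculated fields pull $vCenter and $VM from the enclosing loops, which is exactly what a ScriptProperty can’t do: by the time the property is evaluated, those loop variables are long gone.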
OK, technically that’s not true. There is some reference information in Get-HardDisk, so in the Value parameter I could use Get-VM and from that data use Get-VMHost to get that information, but it would be evaluated every time you display the object, which would be painfully slow!
I love this. I love its speed, and I love its simplicity. It’s a narrow use case, admittedly, but another powerful weapon in my arsenal. But Holy Crap! I’ve been trying to get people off of Add-Member (at least for normal object creation) for years and now suddenly I’m going to embrace it like a champ! Fun stuff!
Short post today, but I wanted to talk about dynamic properties in objects. I haven’t had too much need for this kind of thing, but the fact that PowerShell can do it so easily is pretty cool. Imagine having an object where you plug a value into one field and dynamically another field in the object changes based on that first value. Kind of sounds like a function, but it’s all done right in the object itself.
A little background. A colleague of mine was trying to use the CMD.exe DIR command to make a fast Get-ChildItem, one that doesn’t get every single object property for a file, which in theory would retrieve that information much faster than Get-ChildItem. For now he’s abandoned the project because the speed wasn’t there, but it brought up some interesting points, and one of them was directory name. When using DIR you often only get the BaseName property, not the full path, but as a scripter I almost always want the full path. There are certainly ways of getting that information, but we would really want to display both. What if we could just have the object do it and not worry about it?
What if you were dealing with a lot of disk size values and wanted the flexibility of seeing MB and GB conversions? Sure, you could send it off to a function, but let’s consider another way.
It all begins with Add-Member. Let’s keep it simple and go with the BaseName calculation I mentioned before. First let’s define some things:
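A minimal setup; the path here is just an example:

```powershell
# Create an object with a folder's full path stashed in a property
$Object = [PSCustomObject]@{
    FullName = "C:\Users\Martin\Documents\Scripts"
}
```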
No real drama, get some data and create an object with that data as a property (FullName). Now I want to add my dynamic property:
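A sketch of the dynamic property ($Object is recreated here so the snippet stands on its own):

```powershell
$Object = [PSCustomObject]@{ FullName = "C:\Users\Martin\Documents\Scripts" }

# ScriptProperty: the scriptblock runs every time the property is read
$Object | Add-Member -MemberType ScriptProperty -Name BaseName -Value {
    ($this.FullName -split '\\')[-1]
}

$Object.BaseName    # returns "Scripts"
```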
This is where it gets cool. We take the object we created and pipe it into Add-Member, which will add a property, in this case a ScriptProperty, which evaluates its value based on the scriptblock in the Value field. This is a pretty simple scriptblock where I take the current object ($this), take the FullName property, and split it on the backslash. Then I take the last element in the returned array for display (which will be the last folder in the path, in this case).
Cool. But we could have just assigned the BaseName value when we created the object, right? Check this out:
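Here the object and property are recreated so this runs on its own, then we change FullName and read BaseName again:

```powershell
$Object = [PSCustomObject]@{ FullName = "C:\Users\Martin\Documents\Scripts" }
$Object | Add-Member -MemberType ScriptProperty -Name BaseName -Value {
    ($this.FullName -split '\\')[-1]
}

# Change the input property...
$Object.FullName = "C:\Users\Martin\Music"

# ...and the dynamic property re-evaluates on access
$Object.BaseName    # now returns "Music"
```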
Notice all we did was change the FullName property, but the BaseName property dynamically changed. You’ve essentially created a class that dynamically changes itself based on the values you assign. You could still mimic all of this behavior by using functions, but it’s pretty interesting nonetheless. With PowerShell 5.0, Microsoft is promising us the ability to create classes without having to drop into C#, which would really open up these kinds of techniques.
Imagine something like this:
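Something in this spirit; the class syntax here is my guess at how PowerShell 5.0 will look, since it isn’t final yet:

```powershell
# Hypothetical PowerShell 5.0 class with size-conversion methods
class FileInfo2 {
    [int64]$Length

    [double] ToMB() { return [math]::Round($this.Length / 1MB, 2) }
    [double] ToGB() { return [math]::Round($this.Length / 1GB, 2) }
}

$File = [FileInfo2]::new()
$File.Length = 1536MB
$File.ToGB()    # 1.5
```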
Where you’ve defined a class that automatically takes the Length property and converts it to MB and GB for you.
Follow-up post coming soon that explains this tweet:
Love finding new techniques that increase performance…. do you?
If you’re looking to read about Modules and what they are and how to make them, this isn’t the blog post for you. No, the problem I have is living with Modules. I’ve only recently begun using them at the new job, but they have quickly made a place in my heart, because being able to just import a module and reuse code saves so much time! But, as always happens, I want to make a change or fix a bug, and that’s where modules fall on their face (a little). This is my attempt to address that fact.
If you’re a Windows Administrator and have Microsoft SQL Servers in your environment–and you’d be a rare bird if you didn’t–then you’ve encountered this problem. Backups are missed for a while, a massive restore is done, something happens, and your transaction log file grows way too big. It’s not uncommon to find it much larger than the actual database! If you’re like me you would RDP to the server, check out the data drive, and see that the log file itself had grown out of control. You’d then hit Google to find out how to do a shrink file, then open SQL Management Studio and try to figure out the logical name of the log file, then run the query. Usually about 20 minutes of lookups, all so you can run a SQL query that typically takes a few seconds to complete. Time to address this using PowerShell.
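The core of the idea can be sketched like this. Server and database names are hypothetical, and it uses System.Data.SqlClient directly so nothing beyond stock .NET is required:

```powershell
$Server   = "SQL01"          # hypothetical server
$Database = "MyDatabase"     # hypothetical database

$Connection = New-Object System.Data.SqlClient.SqlConnection
$Connection.ConnectionString = "Server=$Server;Database=$Database;Integrated Security=True"
$Connection.Open()

# Look up the logical name of the log file -- the step that used to cost 20 minutes
$Command = $Connection.CreateCommand()
$Command.CommandText = "SELECT name FROM sys.database_files WHERE type_desc = 'LOG'"
$LogName = $Command.ExecuteScalar()

# Shrink it (target size in MB)
$Command.CommandText = "DBCC SHRINKFILE ([$LogName], 1024)"
$Command.ExecuteNonQuery() | Out-Null

$Connection.Close()
```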
There are a whole bunch of blog posts and scripts out there to do SQL queries in PowerShell, so I’ve really hesitated about posting my own version of it. But hey, it’s my blog and I can do what I want to. Do what I want to.
Another interesting day at SpiceWorld. Sat through the morning session about IT Policies and, I’m not going to lie, got a good nap in during that one. As necessary as they are, it would take an amazing speaker to make that topic interesting! The guys here were good, but not up to that monumental task.
Next session I attended was the JEA session from Jeffrey Snover. Great session talking about “Just Enough Administration” and how this is now possible with the tool sets provided by PowerShell. As with most security measures it’s only as good as the effort you put into it–which in my experience often means no one does much with it at all. As it stands it’s a great idea and the functionality is there, but I doubt it will be widely adopted. The reason is everything is custom: you create your custom tool sets with custom permissions and then override, or “proxy,” known PowerShell cmdlets with your requirements. What I mean by that is you can actually modify Restart-Computer with a custom validation set so someone using that tool set could only restart Server1 and Server3, but would be unable to touch Server2. Cool stuff, but honestly it seems impractical at this juncture. Once some easier interfaces, templates, etc. start becoming available I believe you’ll see this take off.
After that I went to the Windows 2012 Deduplication session. This was put on by a Veeam employee, but he took great pains to make sure we understood he was doing this as an enthusiast and not a speaker from Veeam. He was very successful. Great session and really highlighted the insane compression ratios possible with Windows 2012 in the proper circumstances. In this case the massive backup files created by Veeam backup software. Ninety-four percent? Are you insane?! Couldn’t wait to take that back to the office.
After that I just visited a few vendors, then sat through the raffle. This was a lot of fun, as you had to be in the room when the raffle called your name. If you weren’t, your ticket was put into the shredder to chants of “Shred it! Shred it!” and the crowd was out for blood.
Kris Bushover also put on his yearly presentation of the best tickets his department received from Spiceworkers. Most of them were pretty terrified of this particular presentation, but it’s all in good fun and great for a laugh.
After that was an after-party there in the convention center, which was essentially a last chance to catch up and visit with everyone, new and old friends alike. Then a few of us met up with the community managers over at Champions, but Vegas rules were in effect so I can’t say anything more. Although I did find out what camel case meant, thanks Justin.
Great time, and the PowerShell session was so much fun to put on; I just love doing it. Not sure if I’ll be able to go back next year as it’s a very expensive trip and I had to foot the entire bill. I will try.
SpiceWorld 2014 hasn’t officially started, but this year was a little different from last year in that they have set aside the main “gallery”–this is mostly where we ate last year–for some of the vendors. They now have full convention-style booths in there, which was interesting! CDW actually had a bus, which I’m looking forward to walking through. But as always with SpiceWorld, a lot more time was spent catching up with online friends and talking than looking at the goods! Already have a bag full of swag, which is typical for SpiceWorld. Did I mention I brought a much bigger piece of luggage than I actually needed because I learned my lesson from last year!
After that Rob and I broke off and did one last practice run of our Introduction to PowerShell presentation. We ran it on Rob’s Surface, which was mostly OK except we did run into a few problems. First, the Help files didn’t contain all the help information?! And then when we tried to run Update-Help we discovered he was off the WiFi! We were running late so we just winged it–and we’ve done this enough times that we can pretty much do that in our sleep–and I think it’s going to be pretty good this year. I added some slides after the last SpiceCorp; the concept was good but the slides needed tweaking, and Rob was able to do exactly that, and I think they’re really solid. It’s all around objects and really hammering home what they are.
After that we walked about a mile–ok, it was probably less but to my feet it felt like a mile!–down to Banger’s for a ton more socializing! Too hot to try out the sausages, but they looked amazing. Ran into a lot of friends there and was able to catch up, and also ran into several people from Boston SpiceCorp, so great to see the Northeast so well represented! Also got to meet Jeffrey Snover and that was really fantastic. Didn’t get to talk to him long but seemed like such a great guy–I’m sure he’s not an ax murderer. No more than I am! <Insert evil villain laugh here>
I ran back to the hotel after that and tried to catch the premiere of Gotham, but at 9pm they showed an hour and a half of news! And it wasn’t on at 8pm. The weird thing was they covered Gotham in the newscast and said it would premiere today! Oh well, I did tape it at home so it’ll have to wait until I return!
That’s it for today, should be more to write about tomorrow!
And I’m in Austin! Looking forward to getting SpiceWorld started. Our Introduction to PowerShell session is one of the first ones out of the gate; not sure if that’s good or bad! I’ll take it as good, since I can enjoy the rest of the convention after that. Looking forward to meeting Jeffrey Snover, who arrived yesterday (according to his Twitter feed).
Excited! And very hungry! Have to find something to eat!
Getting very excited that SpiceWorld 2014 is just around the corner. Heading out to Austin in a couple of weeks to enjoy the spicy goodness. Rob Dunn and I are again putting on our Introduction To PowerShell presentation, and I think we’ve made some good tweaks to the presentation to improve on it from last year. If you’re new to PowerShell and coming to SpiceWorld this year, you HAVE to come check us out. Unless anything changes we’ll be up in Room 9 on the second floor.
Oh, did I mention that Jeffrey Snover will be there? He’ll be talking about “Just In Time, Just Enough Admin” so should be a fantastic 1, 2 punch of PowerShell–and general best practices–goodness.
It’s been so busy at work, and the PowerShell scripts have been flowing fast and furious. Unfortunately they’re so specific to the environment here that I haven’t been able to publish anything. I did finally come across another way to produce WhoIs data, so my WhoIs script is back in business! One nice thing about the report is I’m using a technique that converts HTML tables into XML and allows you to manipulate them directly. This is good because there’s a current bug in my Set-CellColor script: if two cells in a row have the same data, and you’re trying to color the cell with that data, BOTH cells will get the color treatment. Using XML looks very promising for stamping that out, but I just haven’t had time to dig into it yet.
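The HTML-to-XML trick, in sketch form. The data here is a stand-in, and real report data may need its HTML entities cleaned up before the [xml] cast will succeed:

```powershell
# Build an HTML table fragment, then treat it as XML
$Data = Get-Process | Select-Object -First 5 Name, Id, WS
[xml]$Table = $Data | ConvertTo-Html -Fragment

# tr[0] is the header row, so tr[1] is the first data row;
# each cell is now an addressable node instead of a blob of text
$Cell = $Table.table.tr[1].SelectNodes("td")[1]
$Cell.SetAttribute("style", "background-color:red")

$Table.OuterXml    # drop the colored table back into the full report
```

Because you’re targeting a specific cell node rather than string-matching on the cell’s contents, two cells with identical data no longer get painted together.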
Wish I could show you all the cool things I’ve been able to work on! I’ll have to write about some of them as I can generalize a lot of it–I think!
By now, you may have noticed I’m always on the lookout for better-performing code. This has turned out to be a good habit now that I’m working at athena health, as the pure scale of things is so much larger than the places I’ve worked in the past. One piece I’ve never been able to speed up, though, is iterating through folders and files. Nicolas1847, a PowerShell scripter on Spiceworks, has come up with an ingenious method to get simple directory information using Robocopy (of all things), and a colleague at athena health likes to shell out to CMD.exe and use the old DIR command. But are they actually faster? And if so, which one?
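If you want to play along at home, the shape of the head-to-head is easy to sketch. The path and the exact Robocopy switches here are my choices, not necessarily Nicolas1847’s:

```powershell
$Path = "C:\Windows"    # any big folder tree

# Plain Get-ChildItem
$GCI = Measure-Command {
    $a = Get-ChildItem -Path $Path -Recurse -ErrorAction SilentlyContinue
}

# Robocopy in list-only mode (/L) enumerates without copying; destination is a dummy
$Robo = Measure-Command {
    $b = robocopy $Path $env:TEMP /L /S /NJH /NJS /NS /NC /NDL /FP
}

# Old-school DIR via CMD.exe
$Dir = Measure-Command {
    $c = cmd /c dir $Path /s /b
}

"Get-ChildItem: $([int]$GCI.TotalMilliseconds) ms"
"Robocopy:      $([int]$Robo.TotalMilliseconds) ms"
"DIR:           $([int]$Dir.TotalMilliseconds) ms"
```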