Transcript Logging – Why you should do it
I’d been writing PowerShell scripts for a couple of years before I came to athena health, and had a pretty good feel for how I liked to do things. I knew about transcript logging, but it always felt so inelegant. Most of the time it didn’t capture much anyway, because I really do believe in running clean scripts: a good script shouldn’t have any problems, it should just run without returning much feedback at all. Especially if it’s running as a scheduled task; no one’s there to read it anyway! And that’s the rub: if you’re not there to see what went wrong, how do you troubleshoot it? Logging, of course!
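The mechanics themselves couldn’t be simpler: wrap the body of the script in Start-Transcript and Stop-Transcript. Here’s a minimal sketch, with the log path just as an example:

# A minimal sketch of wrapping a script in a transcript; the log path is just an example.
$LogPath = "C:\Logs\MyScript-$(Get-Date -Format 'yyyyMMdd-HHmm').log"
Start-Transcript -Path $LogPath

try {
    # ...the real work of the script goes here...
    Write-Verbose "Doing the work" -Verbose
}
finally {
    # Stop the transcript even if the script blows up, so the log file gets closed out.
    Stop-Transcript
}

The try/finally isn’t strictly required, but it means you still get a complete transcript when something does go wrong, which is the whole point.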
SQL Backups Report
This is a simple report to tell you the status of your SQL Server backups.
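As a rough sketch of the idea (the server name is a placeholder, and it assumes Invoke-Sqlcmd from the SqlServer module is available), pulling the last full backup date for each database out of msdb looks something like this:

# Minimal sketch: last full backup per database. "SQLSERVER01" is a placeholder.
$Query = @"
SELECT d.name AS DatabaseName,
       MAX(bs.backup_finish_date) AS LastBackup
FROM   sys.databases AS d
LEFT JOIN msdb.dbo.backupset AS bs
       ON bs.database_name = d.name AND bs.type = 'D'
GROUP BY d.name
"@

Invoke-Sqlcmd -ServerInstance "SQLSERVER01" -Query $Query |
    Format-Table DatabaseName, LastBackup -AutoSize

The full report dresses this up with formatting and emailing, but that query is the heart of it.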
Read Text Files Faster than Get-Content
This was a fun little script I threw together after a particular conversation came up at Spiceworks. If you’ve worked with PowerShell for long, you’ve used Get-Content to read a file. 99% of the time it’s fine and you just continue on with life. This blog post is about the 1% of the time when Get-Content is SLOW. The .NET IO.StreamReader is where people turn to speed things up, so I decided to create a function around it that works much like Get-Content does. This is its story.
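Just to sketch the technique (this isn’t the actual function from the post), reading a file line by line with the StreamReader looks something like this:

# Sketch of reading a large file line by line with System.IO.StreamReader.
# Illustrates the technique only, not the exact function from the post.
function Read-FileFast {
    param (
        [Parameter(Mandatory)]
        [string]$Path
    )
    $reader = [System.IO.StreamReader]::new($Path)
    try {
        while (-not $reader.EndOfStream) {
            # Emit each line to the pipeline, much like Get-Content does.
            $reader.ReadLine()
        }
    }
    finally {
        $reader.Dispose()
    }
}

# Usage: Read-FileFast -Path C:\Logs\huge.log | Select-String 'ERROR'

Because each line is emitted to the pipeline as it’s read, you keep the streaming behavior of Get-Content while skipping most of its overhead.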
SQL Backup File Name
As I’ve mentioned, I’m doing a ton of SQL work lately (more on that later!), and a co-worker just asked me for something, so I threw together a quick query. One of our backups had failed and he just wanted to know where the backup file was being placed (in most cases we back up locally, but we do have some exceptions). Here’s the query I used to locate the file name:
SELECT bs.database_name AS Name,
       bs.backup_finish_date AS LastBackup,
       bmf.physical_device_name AS BackupFile
FROM msdb.dbo.backupmediafamily AS bmf
JOIN msdb.dbo.backupset AS bs
     ON bmf.media_set_id = bs.media_set_id
WHERE bs.type = 'D'
ORDER BY bs.backup_finish_date DESC
I also have a new PowerShell script that reports on SQL backups that I want to get published, as well as a more intelligent index rebuilder that is Availability Group and log shipping aware.
Stay tuned!
Opening up my Scripts
I have a confession to make. I’m really busy at work. So busy, and challenged, that I haven’t had much time to keep the blog going and certainly not enough time to help support all the scripts I’ve written. In fact, if you’re subscribed to the Employee Directory post, you’ve seen the back and forth going on there! I feel really bad that I haven’t been able to get back to the few of you who have reached out to me because you’re running into problems, but the work/life balance is a little skewed right now, and the blog and script support are what have had to drop off.
The other problem is that I don’t have the infrastructure around to support some of the scripts; the DFS Monitor comes to mind! There’s no DFS here at work, so it’s hard to test!
So what to do? I want people to enjoy my scripts, and I’d like to continue seeing development happen on them, so it’s time to get these things out on GitHub. That way, if you want to make changes you can fork the code, do your thing, and even put in a pull request (I swear that should be a push request, but that’s just me), and if I like what I see I’ll merge it into the main code.
You can find me here, and the scripts I’ve published so far:
That’s all I have right now, but if there’s a script you’d really like to see up there just let me know in comments and I’ll get it posted ASAP. I hope this helps!
HTML Reporting – Grouping Rows
Continuing my HTML reporting series, I have a new twist on coloring columns. There are times when you want to group a number of rows together, so I created this function to do just that. It’s essentially a riff on Set-AlternatingRows with a little bit of Set-CellColor thrown into the mix.
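To give you the flavor of the idea (this is just an illustrative sketch, not the actual function), the trick is to flip a CSS class every time the value in the grouping column changes, so each group of rows shares one color:

# Illustrative sketch only: group table rows by alternating a CSS class
# whenever the value in a chosen column changes.
$Data = Get-Service | Select-Object Status, Name, DisplayName | Sort-Object Status

$Class = 'odd'
$Previous = $null
$Rows = foreach ($Item in $Data) {
    if ($Item.Status -ne $Previous) {
        # New group: flip the class so the whole group shares one background color.
        $Class = if ($Class -eq 'odd') { 'even' } else { 'odd' }
        $Previous = $Item.Status
    }
    "<tr class='$Class'><td>$($Item.Status)</td><td>$($Item.Name)</td><td>$($Item.DisplayName)</td></tr>"
}

$Html = @"
<style>.odd { background-color: #ffffff } .even { background-color: #dddddd }</style>
<table><tr><th>Status</th><th>Name</th><th>DisplayName</th></tr>
$($Rows -join "`n")
</table>
"@
$Html | Out-File GroupedReport.html

The real function works against the HTML that ConvertTo-Html produces, but the class-flipping logic is the same.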
Dynamic Properties in Objects…. and performance!
So last post I talked about dynamic properties in objects, and it was pretty cool. Then a crazy thing happened: that same colleague who was playing around with a faster Get-ChildItem project? He decided to get some real work done and began working on an Office 365 script, but the Get-MailboxStatistics cmdlet is a little bit different with O365 in that the TotalItemSize property is deserialized and pretty much only gives you a lousy string output. So how do you get the raw number without the ToMB() method? A little Googling and we found an Exchange blog post using RegEx to strip out all the extra crud, and a dynamic property! In the years I’ve been using PowerShell I’d never once seen a dynamic property, and the very day I decided to learn it we actually found a real-life use for it! What are the odds? Anyway, the twist was that the Exchange team was applying Add-Member to an array of objects. I didn’t realize you could do that. And it worked!
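To illustrate the pattern (the regex and property name here are only an approximation of what the Exchange post used, and you need an Exchange Online session for the mailbox cmdlets), it looks something like this:

# Rough illustration: add a ScriptProperty to a whole array of objects that parses
# the byte count out of a deserialized TotalItemSize string such as
# "1.5 GB (1,610,612,736 bytes)". The regex is an approximation, not the exact
# one from the Exchange blog post.
$Stats = Get-Mailbox -ResultSize Unlimited | Get-MailboxStatistics

$Stats | Add-Member -MemberType ScriptProperty -Name TotalItemSizeMB -Value {
    $Bytes = [int64]($this.TotalItemSize.ToString() -replace '.*\(([\d,]+) bytes\).*', '$1' -replace ',', '')
    [math]::Round($Bytes / 1MB, 2)
}

$Stats | Select-Object DisplayName, TotalItemSizeMB

Because Add-Member runs against every object coming down the pipeline, the whole array picks up the calculated property in one shot.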
Dynamic Fields in Objects – what?
Short post today, but I wanted to talk about dynamic properties in objects. I haven’t had much need for this kind of thing, but the fact that PowerShell can do it so easily is pretty cool. Imagine having an object where you plug a value into one field, and another field in the object dynamically changes based on that first value. It kind of sounds like a function, but it’s all done right in the object itself.
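Here’s a tiny made-up example of what I mean (the property names are purely for illustration):

# Made-up example: FullName recalculates itself every time FirstName or LastName changes.
$Person = [PSCustomObject]@{
    FirstName = 'John'
    LastName  = 'Smith'
}
$Person | Add-Member -MemberType ScriptProperty -Name FullName -Value {
    "$($this.FirstName) $($this.LastName)"
}

$Person.FullName        # John Smith
$Person.FirstName = 'Jane'
$Person.FullName        # Jane Smith, without calling any function

The ScriptProperty runs its script block every time you read the property, which is what makes it “dynamic.”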
Building Modules
If you’re looking to read about modules and what they are and how to make them, this isn’t the blog post for you. No, the problem I have is living with modules. I’ve only recently begun using them at the new job, but they’ve quickly made a place in my heart, because being able to just pull in a module and reuse code saves so much time! But, as always happens, I want to make a change or fix a bug, and that’s where modules fall on their face (a little). This is my attempt to address that fact.
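For anyone who hasn’t hit it yet, the specific annoyance is that once a module is loaded in your session, editing the .psm1 on disk changes nothing until you force a reload. The usual quick workaround (the module name here is just an example) is:

# "MyTools" is just an example module name.
# Import-Module is a no-op when the module is already loaded, so after editing
# the .psm1 you have to force the reload to pick up your changes:
Import-Module MyTools -Force

# Or remove the old copy first and import the updated version:
Remove-Module MyTools -ErrorAction SilentlyContinue
Import-Module MyTools

That works, but doing it by hand every time you touch the module gets old fast, which is what this post is about.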
Shrink SQL Log Files
If you’re a Windows Administrator and have Microsoft SQL Servers in your environment (and you’d be a rare bird if you didn’t), then you’ve encountered this problem. Backups are missed for a while, a massive restore is done, something happens and your transaction log file grows way too big. It’s not uncommon to find it much larger than the actual database! If you’re like me, you would RDP to the server, check out the data drive, and see that the log file itself had blown up in size. You’d then hit Google to find out how to do a shrink file, then open SQL Server Management Studio and try to figure out the logical name of the log file, then run the query. Usually about 20 minutes of lookups, all so you can run a SQL query that typically takes a few seconds to complete. Time to address this using PowerShell.
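For reference, the manual process boils down to two steps: look up the logical name of the log file, then shrink it. Here’s a rough sketch of those two steps (server, database, and target size are placeholders, and it assumes Invoke-Sqlcmd is available), not the full script:

# Placeholders: adjust server, database and target size for your environment.
$Server   = 'SQLSERVER01'
$Database = 'MyDatabase'

# Step 1: find the logical name of the transaction log file.
$LogFile = Invoke-Sqlcmd -ServerInstance $Server -Database $Database -Query `
    "SELECT name FROM sys.database_files WHERE type_desc = 'LOG'"

# Step 2: shrink it (target size here is 1024 MB).
# Note: with a full recovery model the log usually needs a log backup before it will shrink much.
Invoke-Sqlcmd -ServerInstance $Server -Database $Database -Query `
    "DBCC SHRINKFILE ([$($LogFile.name)], 1024)"

Wrap that in a function with a little error handling and the 20 minutes of lookups turns into a one-liner.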