Monitoring SQL Server with sp_Blitz

My team and I currently manage six SQL Server instances, each on its own server. Our main need is to monitor how the databases on our instances are evolving and to keep an eye on disk space.
Our sysadmin team manages the OS and monitors the VMs' vitals, but we have no hand in that monitoring, and we want to see how things are evolving before it's too late. So we started looking for a solution that could help us.

Professional monitoring tools are attractive, but they were way too expensive to justify buying for this kind of need.
I first stumbled upon SQLWatch, developed by Marcin Gminski. Great tool! We used it for a couple of weeks, but it had two issues for us:
– it requires Power BI online, and we're an "exclusively on-premises" shop. That meant we always had issues with the dashboard because of Power BI versions and the limitations of Power BI Report Server;
– the log databases were also getting pretty big. I don't know if I screwed up the purge mechanism or we just had a lot of stuff going on, but it was a small rock in my shoe as well.

And one day, while reading the full documentation of sp_Blitz, a stored procedure we use to get feedback on our SQL Server instances, I noticed the results could be output to a table. Even better, this was done using linked servers, and you could use a central repository to store the results from multiple servers. Isn't that EXACTLY what we need?
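Writing the results to a table boils down to a handful of output parameters on sp_Blitz. A minimal sketch (the server, database, and table names here are examples, not the real ones from our setup):

```sql
-- Run sp_Blitz and write the findings to a central table over a linked server.
-- 'CentralMonitoring' is the linked server name, 'DBATools' the repository DB.
EXEC dbo.sp_Blitz
    @OutputServerName   = 'CentralMonitoring',
    @OutputDatabaseName = 'DBATools',
    @OutputSchemaName   = 'dbo',
    @OutputTableName    = 'BlitzResults';
```

If the output table doesn't exist yet, sp_Blitz creates it on the first run, so every monitored instance can append to the same repository table.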

sp_Blitz

sp_Blitz is a great tool developed by Brent Ozar as part of the First Responder Kit. For those who don't know Brent, he's kind of the Tyrion Lannister of SQL Server. It's what he does: he drinks and he knows things.
sp_Blitz, and more generally the First Responder Kit it is part of, gives extremely valuable information about SQL Server instances and databases. It is provided under the MIT License.

Running this stored procedure was something I used to do on a weekly basis on all instances to check server health (storage, backups, reliability, etc.). I also had developers check that the databases they each owned were not going crazy. But to really get what I wanted, I had to get two things working: centralizing the data and creating some kind of reporting.

The BlitzResults table

I was all happy to see the stored procedure had an @OutputServerName parameter. I was going to be able to set up that central repo. AWESOME!
So I tried it, and it worked on one server.
And then I RTFMed. 😭
On the very, very last line of the documentation it read:

@OutputServerName - not functional yet. To track (or help!) implementation status: https://github.com/BrentOzarULTD/SQL-Server-First-Responder-Kit/issues/293

Ok, let’s see what that issue says. It was quite the read but apparently, somebody had coded something in. Isn’t that awesome? So I tested it and it worked! I had my central monitoring solution almost up and ready. For free!

All I had to do were the following steps:

  • Setup a linked server on all my servers to the one I had chosen to hold the central repo
  • Schedule a job to run sp_Blitz every morning and export the results to that server.
  • (and add a cleanup job to purge the old data)
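The steps above can be sketched as follows. This is an illustration under assumed names (the linked server, central host, and retention period are placeholders, not our actual configuration):

```sql
-- On each monitored instance: point a linked server at the central repo.
EXEC master.dbo.sp_addlinkedserver
    @server     = N'CentralMonitoring',  -- linked server name used by sp_Blitz
    @srvproduct = N'',
    @provider   = N'SQLNCLI',
    @datasrc    = N'CENTRALSRV01';       -- hostname of the repository server

-- Body of the cleanup job on the central server:
-- purge results older than 90 days so the repo doesn't grow forever.
DELETE FROM DBATools.dbo.BlitzResults
WHERE CheckDate < DATEADD(DAY, -90, SYSDATETIMEOFFSET());
```

The daily job on each instance is then just the sp_Blitz call with the @Output* parameters, scheduled through SQL Server Agent.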

And I was done! I just had to have a look at the contents of that table every time I wanted to check on my kingdom and life was instantly easier.

Making sp_Blitz better

What I had was great, but it wasn’t perfect. First of all, I wanted to be able to exclude some checks.

Some warnings weren't relevant to us. For example: backing up to the same drive. We have Veeam backing up the databases from disk using snapshots as soon as they're done and shipping them off.

But I couldn't do this easily. @SkipChecksServer exists on sp_Blitz, but it wasn't implemented either. So all I had to do was code that feature myself. And I did: my first ever serious open source contribution. And there I had my central repo and my central SkipChecks table.
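With that in place, excluding a check for every server is one row in a central table. A sketch, assuming the table layout from the First Responder Kit documentation (the CheckID value below is an example; use the CheckID shown in your own sp_Blitz results):

```sql
-- Central skip-checks table: a NULL ServerName/DatabaseName means
-- "skip this check everywhere".
CREATE TABLE dbo.SkipChecks (
    ID           INT IDENTITY(1, 1) PRIMARY KEY,
    ServerName   NVARCHAR(128) NULL,
    DatabaseName NVARCHAR(128) NULL,
    CheckID      INT NULL
);

-- Example: skip the "backing up to the same drive" warning on all servers
-- (93 is illustrative; check the CheckID column in your results).
INSERT dbo.SkipChecks (ServerName, DatabaseName, CheckID)
VALUES (NULL, NULL, 93);

-- On each instance, point sp_Blitz at the central table via the linked server.
EXEC dbo.sp_Blitz
    @SkipChecksServer   = 'CentralMonitoring',
    @SkipChecksDatabase = 'DBATools',
    @SkipChecksSchema   = 'dbo',
    @SkipChecksTable    = 'SkipChecks';
```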

I also had an issue with the reporting of disk space. sp_Blitz only reported free space on the disk. Great. But out of how much? I wanted the total size and the ratio. So again, I tried to implement it and submitted it to the project. It got approved, and finally I had my ideal solution in place.

Creating my own DBATools DB

Customizing the view

After a couple of days looking at the results, I had almost all I wanted, but a few things weren't enough.
I needed to alias the servers. So I created a mapping table.
I also needed a view on the results table to focus on what I needed: issues with a priority under 200, and disk space only. So I created a view.
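The mapping table and view could look something like this. This is a hypothetical sketch, not the actual DBATools objects; the column names come from the standard BlitzResults output table, and the alias table is my own addition:

```sql
-- Map raw instance names to friendlier aliases for reporting.
CREATE TABLE dbo.ServerAliases (
    ServerName NVARCHAR(128) NOT NULL PRIMARY KEY,  -- as reported by sp_Blitz
    Alias      NVARCHAR(128) NOT NULL               -- name shown in reports
);
GO

-- Focus the reporting on what matters: priority-under-200 findings.
CREATE VIEW dbo.vBlitzReport
AS
SELECT
    a.Alias,
    r.Priority,
    r.FindingsGroup,
    r.Finding,
    r.DatabaseName,
    r.Details,
    r.CheckDate
FROM dbo.BlitzResults AS r
JOIN dbo.ServerAliases AS a
    ON a.ServerName = r.ServerName
WHERE r.Priority < 200;
```

A second, similar view filtered on the drive-space findings covers the disk reporting side.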

Now I had three tables and a view. Well, let's put that in source control. And so the DBATools database was born.

Mailing the results

Now, my view returns exactly what I need. But I still need to open SSMS or Azure Data Studio and query it. And I’m lazy. And my team is busy so they’re not doing it either. Let’s create a dashboard!

That's still a work in progress, but for now, I mail the results to the team. And so I made sp_mailResults.
What this procedure does is take the view, generate two colorful HTML tables from it, and send them by email using SQL Server Database Mail: one mail with the first 50 lines of database "issues", sorted by priority of course, and one mail with the disk size information for all servers. Scheduled once a week.
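The core of such a procedure is sp_send_dbmail with an HTML body. A minimal sketch of the "issues" mail (the profile name, recipients, and view are placeholders; the real sp_mailResults does more formatting):

```sql
-- Build an HTML table from the report view using the FOR XML PATH trick,
-- then mail it via Database Mail.
DECLARE @html NVARCHAR(MAX);

SET @html =
    N'<h3>Top issues</h3>'
    + N'<table border="1"><tr><th>Server</th><th>Priority</th><th>Finding</th></tr>'
    + CAST((SELECT TOP (50)
                td = Alias, N'',
                td = Priority, N'',
                td = Finding, N''
            FROM dbo.vBlitzReport
            ORDER BY Priority
            FOR XML PATH('tr'), TYPE) AS NVARCHAR(MAX))
    + N'</table>';

EXEC msdb.dbo.sp_send_dbmail
    @profile_name = 'MonitoringProfile',       -- Database Mail profile (example)
    @recipients   = 'dba-team@example.com',
    @subject      = 'Weekly sp_Blitz report',
    @body         = @html,
    @body_format  = 'HTML';
```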

And with that, I have my monitoring solution. I get a weekly view of how my ecosystem is evolving, and developers get insight into which tables have issues that need addressing (missing indexes, partitioning alignment, etc.).

♥ Open Source ♥

Working on this topic has made me understand the power of Open Source. It’s not just getting free software on the internet. Actually, it’s not that at all! It’s being able to make it better. It’s being able to make it do what YOU need it to do. I feel solving this monitoring issue wasn’t just about the result. It was mostly about the journey. And for sure, I’ll be writing about what it brought me on a personal level, and what I think it can bring any developer out there.

For now, go check out the First Responder Kit if you’re working with SQL Server. Also, feel free to use my DBATools stuff. Just be aware that it isn’t very “community friendly” yet and some things like the mail profile are hardcoded in.
