Every so often, Microsoft puts something extra into their SQL Server product that doesn’t get as much time in the spotlight as it should. Nope, not talking about any of the great new things coming in with SQL Server 2022. Today’s post takes a look at something called the SQL Assessment API. It’s not a very exciting name for a feature, but it definitely provides an exciting new capability!
So, what exactly is the SQL Assessment API, and where and how would you use it?
What is the SQL Assessment API?
Let’s hear it from Microsoft directly.
To summarize: it’s a programmatic way to perform assessments of SQL Servers at scale, repeatably, using baselines defined by the SQL Server team. If you have a large SQL Server estate you need to manage, now would be a perfect time to feel a little excited. Likewise, it’s fine if part of that estate is running on Azure PaaS, because the list of supported versions and deployment models includes Managed Instance.
The SQL Assessment API comes packed with 200+ rules that are evaluated against instances or databases. While this is a rather decent list by itself, it’s also possible to add your own rules or change any of the existing ones. Since the default ones come from the SQL Server team and are based on widely recognized best practices, I would just start with those first.
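Customizations are supplied as a JSON configuration file that you pass to the assessment cmdlets via their -Configuration parameter. As a rough sketch only, a file that switches off one default rule might look like the following; the rule ID and property names are assumptions based on the general pattern Microsoft documents, so verify them against the official customization guide before use:

```json
{
  "schemaVersion": "1.0",
  "version": "0.1",
  "name": "Custom Assessment Configuration",
  "rules": [
    {
      "itemType": "override",
      "id": "TF634",
      "on": false
    }
  ]
}
```

You would then run the assessment with something like Invoke-SqlAssessment -Configuration .\custom.json so that the override is applied on top of the default ruleset.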
How do you run it?
There are two ways to use it: by running PowerShell cmdlets or by using Azure Data Studio with the proper extension. There are a couple of prerequisites you need to take care of first, namely installing the following items.
With PowerShell cmdlets
I’ve spent quite a lot of time working with DBATools, so these days when I try to do anything, my first thought is typically: “How can I do this with PowerShell?” Most of the functionality is based around these two cmdlets:
The first one provides a list of assessments that can be run against the target, which in this case is either the server or a database. This is done by simply piping the instance or database into the cmdlet.
Get-SqlInstance -ServerInstance localhost | Get-SqlAssessmentItem
Get-SqlDatabase -ServerInstance localhost | Get-SqlAssessmentItem
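Both cmdlets also accept a -Check parameter, which lets you narrow things down to specific checks by ID, name, or tag. As a sketch (the tag used here is an assumed example; the actual tag names depend on the ruleset):

```powershell
# List only the checks carrying the 'Performance' tag
# ('Performance' is an assumed example tag)
Get-SqlInstance -ServerInstance localhost |
    Get-SqlAssessmentItem -Check Performance
```

The same parameter works with Invoke-SqlAssessment, which is handy when you only want to run a subset of the rules.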
The simplest way to execute the assessment is to simply pipe an instance or database into the Invoke-SqlAssessment cmdlet. This can be useful if you only want to take a quick look and not persist the results anywhere.
Get-SqlInstance -ServerInstance localhost | Invoke-SqlAssessment
However, to make the most out of the process of continuously checking for issues, you should consider persisting the results somewhere. A typical solution would be to use the toolbox or utility database most of us tend to create when onboarding instances. Once you have that in place, you can take the output of Invoke-SqlAssessment and store it in its own table (or even schema, if you prefer) using Write-SqlTableData.
Get-SqlInstance -ServerInstance localhost | Invoke-SqlAssessment -FlattenOutput | Write-SqlTableData -ServerInstance localhost -DatabaseName tempdb -SchemaName sqlassessment -TableName AssessmentResults -Force
In the example above, we’re running the SQL Assessment against the instance. The -FlattenOutput switch makes the output usable for Write-SqlTableData, and since the sqlassessment schema doesn’t exist yet, we invoke Write-SqlTableData with the -Force parameter so our execution creates it. If we take a look at tempdb (note that if you really want to keep the data, don’t put it here, as tempdb is recreated every time the instance starts), we can see the schema and the table.
We can also query the table to see the results of the execution.
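As a minimal sketch, a query like the following would show the latest findings. The column names (Timestamp, Severity, CheckName, Message, TargetPath) are assumptions based on the flattened assessment output, so verify them against the table that was actually created:

```sql
-- Latest findings first; column names assume the shape produced
-- by Invoke-SqlAssessment -FlattenOutput
SELECT TOP (50)
       [Timestamp],
       Severity,
       CheckName,
       [Message],
       TargetPath
FROM   sqlassessment.AssessmentResults
ORDER  BY [Timestamp] DESC;
```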
This was quick and easy, now let’s move on to test the Azure Data Studio extension!
With Azure Data Studio
I admit that I don’t use Azure Data Studio too frequently; I often have an easier time finding what I’m looking for with good old Management Studio. However, since the SQL Assessment API has an extension for this exact purpose, let’s give it a go.
After installing the extension and selecting Manage on the server, there’s a new option available for me under the General section called SQL Assessment. I am already liking it at this point.
After I select the assessment, I can immediately execute it by clicking the Invoke Assessment button. My test box is relatively small, running on default settings, and I’ve restored a couple of AdventureWorks databases there. I have to admit that I really like what I’m seeing. We don’t just get a listing of the identified issues, but also a more detailed description in the bottom window.
If you click on the View applicable rules option, it shows you all the rules that the instance and the databases were evaluated against. As a bonus, there is historic information available that you can see by selecting the History option next to Assessment. This is rather handy, as you can track changes over time.
You can also get to the details if you select one of the historic assessments.
After executing the assessment, it’s also possible to save it as an HTML page. It has all the same information, with links leading to external documentation for more details on the issues and suggested fixes.
And that’s it for now. We have now learned at least two easy ways to check the configuration of your SQL Server instances and databases.
I feel like the SQL Assessment API is a considerable step in the right direction when it comes to managing large SQL Server estates. If you need to do a quick lookup on a server, there’s the Azure Data Studio extension for that, and it also provides a decent-looking report that you can send forward. If you want to run assessments on a schedule and against larger estates, go with PowerShell and store the results in a central location. Writing your own report to pull out the results isn’t too difficult, and if you’re feeling fancy you might even point Power BI at it.
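As a sketch of that scheduled, estate-wide approach, the loop below runs the assessment against several instances and writes everything to one central table. The instance names, central server, and database are placeholders, not anything from this post:

```powershell
# Placeholder list of instances; in practice this might come from
# your inventory table or a Central Management Server
$instances = 'SQL01', 'SQL02', 'SQL03'

foreach ($instance in $instances) {
    Get-SqlInstance -ServerInstance $instance |
        Invoke-SqlAssessment -FlattenOutput |
        Write-SqlTableData -ServerInstance 'CENTRAL01' `
            -DatabaseName 'DBAToolbox' `
            -SchemaName 'sqlassessment' `
            -TableName 'AssessmentResults' `
            -Force
}
```

Wrap this in a SQL Agent job or a scheduled task and you have a simple, repeatable assessment pipeline for the whole estate.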
It will be fascinating to see how Microsoft eventually integrates this feature with Azure. As Microsoft already has a way to pull together information from large Azure-based estates with their SQL Server extension, adding this data into the same view would seem like a logical next step.