You are now reading the third part of the four-part series on backing up databases to Azure. In Part 1 we looked at some benefits of using Azure for your backups, and in Part 2 we set up the Storage Account with the Block Blob storage container.
In this post, we’ll take a look at how to use the freshly created Blob storage with our customized backup routines.
I personally love Ola Hallengren's backup solution, so that is what I'll be using for this post. If you're not familiar with it, I highly recommend that you check it out! However, it doesn't really matter which customized solution you are using; backing up to Azure works pretty much the same way with any of them.
So let’s get started by first…
Creating shared access signature and SQL Server credentials
As mentioned in the previous post, to make the posts easier to follow, I'll be using the Azure Portal for everything (where possible). If you'd rather use PowerShell or the Azure CLI, you can accomplish the same results with those as well.
Shared Access Signature
The first thing we need to do is to generate a Shared Access Signature (SAS). We will then use this signature to create the SQL Server credential. Creating the SAS is done by going to the Azure Portal, finding our Storage Account, and then going to Shared access signature under Settings.
I've marked in red the few things to pay special attention to. These are what I'd recommend setting up as a minimum:
- Allowed services. As we're only using Blobs, it makes sense to exclude the other services.
- Start and expiry date/time. I am a bit old-fashioned in the sense that I want things to eventually expire, so I am forced to renew and review what I've set up earlier. For demo purposes, I've set it for a day; for production I'd probably go with 6 or 12 months, unless something else is specified by the company security policy.
- Allowed IP addresses. This one is quite obvious, I think: you can use it to limit the IP address or range of IP addresses that can access the container using this key. For backups, you really want to limit access to your SQL Servers and whatever box you use for administering them.
- Allowed protocols. I can't think of a reason to use anything but HTTPS, so leave the default on!
After you're satisfied, hit the button that says Generate SAS and connection string. What you're going to need for the SQL Server credential is the middle field, called SAS token.
You want to copy that, but not all of it. There's a question mark prefix at the beginning of the line; do not copy that, only everything after it. This really had me scratching my head the first time I did it. I guess the SAS token is mostly meant to be used as part of a URL by services other than SQL Server backup.
Next we'll need the URL to our container, which we can easily find by going to the Blob service section in the Azure Portal and then selecting the container we created (I created one called db-backups).
Once the container is selected, we'll go to the Properties page and get the URL from there.
There's a button at the end of the field to copy the URL. Now that we have both of these, it's just a matter of creating the credential using the following script.
SQL Server Credential
USE master
GO
-- Creating a credential for use with a Shared Access Signature.
-- The credential name must match the URL of the container you back up to.
CREATE CREDENTIAL [https://testingstoacc.blob.core.windows.net/db-backups] -- Container URL
WITH IDENTITY = 'SHARED ACCESS SIGNATURE' -- this is a mandatory string, do not change it
    , SECRET = 'sv=2018-03-28&ss=b&srt=sco&sp=rwdlac&se=2019-08-06T21:00:00Z&st=2019-08-05T21:00:00Z&spr=https&sig=etc...' -- Token generated from the Azure Portal, without the leading question mark
GO
And that’s it. If you check from Management Studio under Security and Credentials (or by querying sys.credentials), you should see your freshly created credential there.
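If you prefer checking from T-SQL rather than clicking through Management Studio, a quick query against sys.credentials does the same sanity check (the identity string below is the fixed value used for SAS credentials, as in the script above):

```sql
-- List SAS-based credentials; the container URL appears as the credential name
SELECT name, credential_identity, create_date
FROM sys.credentials
WHERE credential_identity = 'SHARED ACCESS SIGNATURE';
```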
After that, it's just a matter of setting up the backup.
Backup to URL
As I already mentioned, I will be using Ola Hallengren's excellent backup solution for this. You can use the BACKUP DATABASE command as well, but this is simply a nicer way to do it. To save some time, I'll also back up just the system databases, as they are a bit smaller.
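For comparison, if you'd rather use the plain BACKUP DATABASE command, backing up to the container looks something like this (a minimal sketch, reusing the storage account and container names from the earlier example; the backup file name is just illustrative):

```sql
-- Native backup to URL; SQL Server uses the credential whose name matches the container URL
BACKUP DATABASE [master]
TO URL = 'https://testingstoacc.blob.core.windows.net/db-backups/master_FULL.bak'
WITH COMPRESSION, CHECKSUM, FORMAT;
GO
```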
If you're familiar with Ola's solution already, you'll notice that the syntax is basically the same whether backing up to disk or to URL. The only thing you'll need to do is change the @Directory parameter to @URL, and you're set.
EXECUTE dbo.DatabaseBackup
    @Databases = 'SYSTEM_DATABASES',
    @URL = 'https://testingstoacc.blob.core.windows.net/db-backups',
    @BackupType = 'FULL',
    @Compress = 'Y',
    @Verify = 'Y'
After running this a couple of times, I can go back to the Azure Portal and to my Storage Account. From there, we can open up the Storage Explorer. Storage Explorer is a really nice way to browse the contents of your storage account, and there's also a separate download available that works outside the Azure Portal. If you're dealing with plenty of files, as is often the case with backups, it's really a good idea to use the standalone application for this.
As you can see, running the backup stored procedure also created the folder structure commonly used with Ola's solution.
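Another way to confirm where the backups went, without leaving Management Studio, is to check the backup history in msdb. This is just a sanity-check query, not part of Ola's solution; for backups to URL, the blob URL shows up as the physical device name:

```sql
-- Show the most recent backups and their destinations
SELECT TOP (10)
    bs.database_name,
    bs.backup_finish_date,
    bmf.physical_device_name -- the blob URL for backups to URL
FROM msdb.dbo.backupset AS bs
JOIN msdb.dbo.backupmediafamily AS bmf
    ON bs.media_set_id = bmf.media_set_id
ORDER BY bs.backup_finish_date DESC;
```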
One more thing
We're almost finished, but there's one more thing to consider. When you use a customized script, such as the one I used here, you need to figure out a way to clean up old backups that have passed their required retention time.
While Azure has enough storage to hold your backups until the end of days, you will incur costs for the space you use. That, and the security policies you have in place, usually require that you remove old backups. This is where Ola's script unfortunately cannot help you: if you try to use the @CleanupTime parameter with backups to URL, it'll tell you outright that it isn't supported.
This limitation isn't unique to custom scripts, though; it's also there when using SQL Server's built-in Maintenance Plans for backups. Don't worry, there's a whole slew of different solutions for cleaning up, ranging from scheduled PowerShell scripts to Azure Functions. I also created a Logic App some time ago to do it, which was quick and easy with a ready-made template from Microsoft.
And with that, we’re done. Hopefully, you found this post useful and enjoyable. If you did, you might also want to read the other posts in this series: