Testing Your Database Backups Using SQL Server on Linux Containers
We recently published a blog post by Shand Glenn titled "Your Disaster Recovery Strategy Is Useless If Your SQL Server Backups Haven’t Been Tested" that discusses the need to test your backups and outlines a process to help automate that testing. It got me thinking about how SQL Server on Linux containers can be used along with Docker and Azure DevOps to create an automated process for testing your backups.
This idea is still relatively rough and isn't a replacement for testing on an equivalent production system. However, it will help you detect corruption in your backups and give you additional confidence that they are in a restorable state.
We will leverage my previous blog post, "Restoring an AdventureWorks Database on a SQL Server on Linux Container," to bootstrap this idea. At the end of that blog post, there is a section called "An Alternative Approach to Building the Container" that shows how to download the backup into the container to restore it. That example will be the base Dockerfile for demonstrating how this could work. If your backups aren’t available over HTTP, I will also cover an alternative approach.
Organizing Your Dockerfiles
You don’t have to leverage Azure DevOps to create a similar setup; that is just what I am going to demonstrate. This approach should work for any other system that can run tasks and store your Dockerfiles.
Let’s create a directory called backup-tests, and inside of that folder create two additional directories, adventureworks2019 and adventureworks2017. Inside each of those directories, let’s make a Dockerfile. You should now have a directory structure that looks like the following:
$ tree .\backup-tests\ /f
~\BACKUP-TESTS
│   README.md
│
├───adventureworks2017
│       Dockerfile
│
└───adventureworks2019
        Dockerfile
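If you prefer to script this setup, a few PowerShell commands along these lines will create the layout (a minimal sketch; adjust the root path to taste):

# Create the root folder and one folder per backup we want to test
New-Item -ItemType Directory -Path .\backup-tests\adventureworks2017, .\backup-tests\adventureworks2019 | Out-Null

# Drop an empty Dockerfile into each of the backup folders
New-Item -ItemType File -Path .\backup-tests\adventureworks2017\Dockerfile, .\backup-tests\adventureworks2019\Dockerfile | Out-Null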
We are placing the Dockerfiles in their own directories because of how Docker loads the build context; you can read more about that in the Docker documentation. This is a fairly standard approach that you will see often.
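When we build the images later, each directory is passed to docker build as the context, so only that folder's contents are sent to the Docker daemon, and the image tag can simply mirror the directory name. For example:

docker build -t adventureworks2017 .\adventureworks2017\
docker build -t adventureworks2019 .\adventureworks2019\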
Creating Our Dockerfiles
Now that we have our layout in place, let’s open the Dockerfile in the adventureworks2019 directory and paste in the following code:
# Dockerfile
#
# AdventureWorks Database on SQL Server 2019
FROM mcr.microsoft.com/mssql/server:2019-CU5-ubuntu-18.04

# Note: This isn't a secure password, and please don't use this for production.
ENV SA_PASSWORD=ThisIsAReallyCoolPassword123
ENV ACCEPT_EULA=Y

# Change to the root user to run wget and move the file
USER root

RUN wget --progress=bar:force -q -O AdventureWorks2019.bak https://github.com/Microsoft/sql-server-samples/releases/download/adventureworks/AdventureWorks2019.bak \
    && chmod 777 AdventureWorks2019.bak \
    && mkdir /var/opt/mssql/backup \
    && cp AdventureWorks2019.bak /var/opt/mssql/backup/

# Change back to the mssql user to restore the database
USER mssql

# Launch SQL Server, confirm startup is complete, restore the database, then terminate SQL Server.
RUN ( /opt/mssql/bin/sqlservr & ) | grep -q "Service Broker manager has started" \
    && /opt/mssql-tools/bin/sqlcmd -S localhost -U SA -P ${SA_PASSWORD} -Q 'RESTORE DATABASE AdventureWorks2019 FROM DISK = "/var/opt/mssql/backup/AdventureWorks2019.bak" WITH MOVE "AdventureWorks2017" to "/var/opt/mssql/data/AdventureWorks2019.mdf", MOVE "AdventureWorks2017_Log" to "/var/opt/mssql/data/AdventureWorks2019_log.ldf", NOUNLOAD, STATS = 5' \
    && pkill sqlservr

CMD ["/opt/mssql/bin/sqlservr"]
This example shows how to pull a backup file over HTTP into the container and restore it as part of the image build. If the restore fails, the docker build fails, and that failure tells you the backup may be corrupt or otherwise unusable.
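Once the image is built (tagged adventureworks2019, as shown above), you can also sanity-check the restore manually. This is just a quick sketch; the container name is arbitrary and the password matches the one baked into the Dockerfile:

# Start a container from the freshly built image
docker run -d --name aw2019 adventureworks2019

# Give SQL Server a moment to start before connecting
Start-Sleep -Seconds 30

# Confirm the restored database shows up
docker exec aw2019 /opt/mssql-tools/bin/sqlcmd -S localhost -U SA -P ThisIsAReallyCoolPassword123 -Q "SELECT name FROM sys.databases"

# Clean up
docker stop aw2019
docker rm aw2019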
Let’s turn to pulling a backup that isn’t available over HTTP. This scenario is harder to cover generically because Docker may need extra plugins and libraries installed before it can mount shared drives. If that is the direction you want to take, the Docker documentation on volumes and volume drivers is a good place to start.
We are going to take a less elegant approach: provide a mechanism outside of Docker to download the backup and place it in the same directory as the Dockerfile, then simply copy the backup into the container as part of the image build. Let’s open the Dockerfile in the adventureworks2017 directory and add the following.
# Dockerfile
#
# AdventureWorks 2017 Database on SQL Server 2019
FROM mcr.microsoft.com/mssql/server:2019-CU5-ubuntu-18.04

# Note: This isn't a secure password, and please don't use this for production.
ENV SA_PASSWORD=ThisIsAReallyCoolPassword123
ENV ACCEPT_EULA=Y

# Setting the user
USER mssql

COPY AdventureWorks2017.bak /var/opt/mssql/backup/

# Launch SQL Server, confirm startup is complete, restore the database, then terminate SQL Server.
RUN ( /opt/mssql/bin/sqlservr & ) | grep -q "Service Broker manager has started" \
    && /opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P ${SA_PASSWORD} -Q 'RESTORE DATABASE AdventureWorks2017 FROM DISK = "/var/opt/mssql/backup/AdventureWorks2017.bak" WITH MOVE "AdventureWorks2017" to "/var/opt/mssql/data/AdventureWorks2017.mdf", MOVE "AdventureWorks2017_Log" to "/var/opt/mssql/data/AdventureWorks2017_log.ldf", NOUNLOAD, STATS = 5' \
    && pkill sqlservr

CMD ["/opt/mssql/bin/sqlservr"]
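As for that "mechanism outside of Docker," something as simple as copying the backup into the adventureworks2017 directory before the build will do. A minimal sketch, assuming a hypothetical network share holding your backups:

# Hypothetical share path; point this at wherever your backups actually live
Copy-Item -Path \\backupserver\sql-backups\AdventureWorks2017.bak -Destination .\adventureworks2017\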
Orchestrating All of These Steps
Okay, we have all of the pieces in place, so we can now set up some automation. I prefer to put the work into a script that I can run locally and then call from Azure DevOps; this lets me test locally and keeps my build YAML small. Let’s create our build.ps1 to start.
if ((Get-ChildItem -Recurse -Path **\Dockerfile).Count -gt 0) {
    $dockerFiles = Get-ChildItem -Recurse -Path **\Dockerfile |
        Select-Object -ExpandProperty DirectoryName -Unique

    $dockerFiles | ForEach-Object {
        Push-Location $_

        $directory = Get-Item Dockerfile | Select-Object -ExpandProperty Directory
        docker build -t $directory.Name .

        Pop-Location
    }
}
else {
    Write-Host "No Dockerfiles found!"
    exit 1
}
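One refinement worth considering: the script above doesn't check docker build's exit code, so a failed restore inside a build won't fail the script on its own. Replacing the docker build line inside the loop with something like this sketch would surface that failure to the pipeline:

docker build -t $directory.Name .
if ($LASTEXITCODE -ne 0) {
    Write-Host "Build failed for $($directory.Name) -- the backup may be bad!"
    Pop-Location
    exit 1
}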
Now that we have our build.ps1, let's run it locally to make sure that it works.
$ .\build.ps1
[+] Building 0.2s (8/8) FINISHED
 => [internal] load .dockerignore                                                          0.0s
 => => transferring context: 2B                                                            0.0s
 => [internal] load build definition from Dockerfile                                       0.1s
 => => transferring dockerfile: 981B                                                       0.0s
 => [internal] load metadata for mcr.microsoft.com/mssql/server:2019-CU5-ubuntu-18.04      0.1s
 => [internal] load build context                                                          0.0s
 => => transferring context: 46B                                                           0.0s
 => [1/3] FROM mcr.microsoft.com/mssql/server:2019-CU5-ubuntu-18.04@sha256:a4c896f11c73fd6eecaab1b96eb256c6bc0bdc  0.0s
 => CACHED [2/3] COPY AdventureWorks2017.bak /var/opt/mssql/backup/                        0.0s
 => CACHED [3/3] RUN ( /opt/mssql/bin/sqlservr & ) | grep -q "Service Broker manager has started"     && /opt/mss  0.0s
 => exporting to image                                                                     0.1s
 => => exporting layers                                                                    0.0s
 => => writing image sha256:8a5e9367c12e78943f6f1e0dc5423290f8bbaf8bc9a38e44895fedc5dabacae5  0.0s
 => => naming to docker.io/library/adventureworks2017                                      0.0s
[+] Building 0.2s (7/7) FINISHED
 => [internal] load build definition from Dockerfile                                       0.0s
 => => transferring dockerfile: 1.33kB                                                     0.0s
 => [internal] load .dockerignore                                                          0.1s
 => => transferring context: 2B                                                            0.0s
 => [internal] load metadata for mcr.microsoft.com/mssql/server:2019-CU5-ubuntu-18.04      0.1s
 => [1/3] FROM mcr.microsoft.com/mssql/server:2019-CU5-ubuntu-18.04@sha256:a4c896f11c73fd6eecaab1b96eb256c6bc0bdc  0.0s
 => CACHED [2/3] RUN wget --progress=bar:force -q -O AdventureWorks2019.bak https://github.com/Microsoft/sql-serve  0.0s
 => CACHED [3/3] RUN ( /opt/mssql/bin/sqlservr & ) | grep -q "Service Broker manager has started"     && /opt/mss  0.0s
 => exporting to image                                                                     0.1s
 => => exporting layers                                                                    0.0s
 => => writing image sha256:844dfb9edec2e93e247cd7e3731d5163d6fa869a57bcc46dbd34d57aab22a6b3  0.0s
 => => naming to docker.io/library/adventureworks201
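If you want to double-check that both images exist and were named after their directories, Docker's reference filter can list them:

docker image ls --filter "reference=adventureworks*"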
The final step is to set this up as a pipeline in Azure DevOps, which makes it easy to trigger and execute the container builds automatically. Create an azure-pipelines.yml next to your build.ps1 with the following content.
trigger:
- master

stages:
- stage: BackupTests
  displayName: Test Database Backups
  pool:
    vmImage: 'Ubuntu-18.04'
  jobs:
  - job:
    displayName: Test Backups
    steps:
    - powershell: ./build.ps1
      displayName: 'Build containers'
Next Steps
We now have a good foundation to start extending. Additional Dockerfiles could be added for more backups, along with other types of checks, such as running DBCC CHECKDB against the restored databases, to further build confidence that your backups are healthy (a rough sketch of that follows).
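For example, an integrity check against the restored 2019 database after the image is built might look like this. The container and image names follow the earlier examples, and the password matches the one in the Dockerfiles; treat it as a sketch rather than a finished script:

# Start a container from the image that already contains the restored database
docker run -d --name aw2019-check adventureworks2019 | Out-Null

# Wait for SQL Server to accept connections, retrying for up to a minute
$ready = $false
foreach ($attempt in 1..30) {
    docker exec aw2019-check /opt/mssql-tools/bin/sqlcmd -S localhost -U SA -P ThisIsAReallyCoolPassword123 -Q "SELECT 1" *> $null
    if ($LASTEXITCODE -eq 0) { $ready = $true; break }
    Start-Sleep -Seconds 2
}
if (-not $ready) { Write-Host "SQL Server did not start"; exit 1 }

# The -b switch makes sqlcmd return a non-zero exit code if DBCC CHECKDB reports errors
docker exec aw2019-check /opt/mssql-tools/bin/sqlcmd -S localhost -U SA -P ThisIsAReallyCoolPassword123 -b -Q "DBCC CHECKDB (AdventureWorks2019) WITH NO_INFOMSGS"
$checkResult = $LASTEXITCODE

# Clean up the container and fail if corruption was found
docker stop aw2019-check | Out-Null
docker rm aw2019-check | Out-Null
if ($checkResult -ne 0) { Write-Host "DBCC CHECKDB reported problems!"; exit 1 }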
Thank you for reading! I hope this helps you generate ideas for how to make it easier to test your backups.