Hey THWACK Community,
I wanted to share a handy script I've been using to streamline my SQL Server backup process. Like many of us, I ensure my SQL Server database is backed up daily. To enhance our disaster recovery capabilities and facilitate easier management, I've set up a workflow to transfer these backups directly to Amazon S3.
Here’s a quick guide on what I’ve done and the PowerShell script I’m using to automate this process.
Steps I Followed:
- Daily SQL Server Backup: Scheduled a daily backup job in SQL Server Management Studio (SSMS) to save the .bak files locally.
- Create an S3 Bucket: Provisioned an S3 bucket in AWS named solarwinds-sql-backups to store these backup files (a rough sketch of how these first two steps can be scripted follows this list).
- Automate Backup Transfer to S3: Created a PowerShell script to upload the latest backup files from the local directory to the designated S3 bucket.
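For anyone who wants to script the first two steps as well, here's a minimal sketch. It assumes the SqlServer module (for Backup-SqlDatabase) and the AWS Tools for PowerShell (for New-S3Bucket) are installed and that AWS credentials are already configured; the instance name, database name, and file path are placeholders, so adjust them to your environment.

# Sketch only: the instance, database, and backup path below are placeholders.
Import-Module SqlServer
Import-Module AWS.Tools.S3

# Step 1: back up a database into the local backup folder
Backup-SqlDatabase -ServerInstance "localhost" -Database "MyAppDb" `
    -BackupFile "C:\MySQLBackups\MyAppDb\MyAppDb_$(Get-Date -Format yyyyMMdd).bak"

# Step 2 (one-time): create the S3 bucket that will hold the backups
New-S3Bucket -BucketName "solarwinds-sql-backups" -Region "us-east-1"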
And here's the PowerShell upload script for step 3:
# Requires the AWS Tools for PowerShell module that provides Write-S3Object
# (e.g., AWS.Tools.S3), with AWS credentials already configured on the host.

# Define local backup path and S3 bucket details
$localpath = "C:\MySQLBackups\"
$bucketName = "solarwinds-sql-backups"
$awsRegion = "us-east-1"

# Get the latest .bak file in each directory and subdirectory
$latestFiles = Get-ChildItem $localpath -Filter *.bak -Recurse |
    Group-Object { $_.DirectoryName } |
    ForEach-Object {
        $_.Group | Sort-Object LastWriteTime -Descending | Select-Object -First 1
    }

# Upload each latest backup file to S3; the object key is the file name,
# so backup files should be uniquely named across subdirectories
foreach ($file in $latestFiles) {
    $backupFile = $file.FullName
    $s3Key = $file.Name
    Write-S3Object -BucketName $bucketName -Key $s3Key -File $backupFile -Region $awsRegion
    Write-Host "Uploaded $backupFile to s3://$bucketName/$s3Key"
}
I then added the script to Windows Task Scheduler to run daily at 3 AM, as shown below.
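If you'd rather create the scheduled task from PowerShell instead of the Task Scheduler GUI, a minimal sketch using the built-in ScheduledTasks cmdlets could look like this. The script path C:\Scripts\Upload-SqlBackups.ps1 is a placeholder for wherever you save the upload script above.

# Registers a daily 3 AM task that runs the upload script (placeholder path)
$action = New-ScheduledTaskAction -Execute "powershell.exe" `
    -Argument "-NoProfile -ExecutionPolicy Bypass -File C:\Scripts\Upload-SqlBackups.ps1"
$trigger = New-ScheduledTaskTrigger -Daily -At 3am
Register-ScheduledTask -TaskName "Upload SQL Backups to S3" -Action $action -Trigger $trigger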
Script Breakdown:
- Variables:
  - $localpath: Path where the .bak files are stored.
  - $bucketName: Name of the S3 bucket.
  - $awsRegion: AWS region where the S3 bucket is located.
- Steps to Execute:
  - The script finds all .bak files in the backup directory and its subdirectories.
  - It groups these files by directory and sorts them by last write time.
  - Only the latest backup file in each directory is selected.
  - Each selected file is uploaded to the specified S3 bucket, and the upload is logged to the console (a quick verification snippet follows this list).
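After a scheduled run, you can spot-check that the objects actually landed in the bucket. This is just a convenience snippet using the same AWS Tools module; the bucket name and region match the variables above.

# List the objects currently in the backup bucket to confirm the uploads
Get-S3Object -BucketName "solarwinds-sql-backups" -Region "us-east-1" |
    Select-Object Key, Size, LastModified |
    Format-Table -AutoSize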
Benefits:
- Automated Backup Transfer: Easily automate the backup and upload process.
- Disaster Recovery: Ensures that backups are stored offsite in AWS S3, adding an extra layer of resilience.
- Convenience: Saves time and reduces manual intervention.