Automating a SQL smoke test

by Jon Shah | Jun 19, 2025 | Az DevOps, PowerShell

In modern database development workflows, schema and data updates should be tested before release so they don’t introduce bugs or cause data loss. This kind of testing is often called a smoke test: a quick pass designed to catch immediate errors before later-stage tests such as integration or performance testing.

In this post, we’ll explore what a SQL smoke test is, why it matters, and how to automate it in your pipeline.

What is a SQL Smoke Test?
A smoke test in the context of SQL is a basic check to confirm that your database scripts:

  • Execute without syntax errors
  • Don’t break database objects like tables, views, functions and stored procedures
  • Produce expected changes

It’s not meant to catch every bug, but rather to act as a “sanity check” before code moves into broader testing or production environments.
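To make the idea concrete, here is a minimal sketch of a smoke test in Python against an in-memory SQLite database. This is only an illustration of the concept, not the Synapse pipeline built later in this post: run a script, surface syntax errors, and confirm it actually created objects.

```python
import sqlite3

def smoke_test(script):
    """Run a SQL script against a throwaway database and report failures."""
    errors = []
    conn = sqlite3.connect(":memory:")
    try:
        conn.executescript(script)  # fails fast on syntax errors
    except sqlite3.Error as exc:
        errors.append(f"script error: {exc}")
    else:
        # Confirm the script actually produced tables or views.
        rows = conn.execute(
            "SELECT name FROM sqlite_master WHERE type IN ('table', 'view')"
        ).fetchall()
        if not rows:
            errors.append("script created no tables or views")
    finally:
        conn.close()
    return errors

# A valid script passes; a broken one is caught before deployment.
print(smoke_test("CREATE TABLE ratings (movie TEXT, stars INTEGER);"))  # []
print(smoke_test("CREATE TABL ratings (movie TEXT);"))  # syntax error reported
```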

Why Automate SQL Smoke Tests?
Manual database testing is time-consuming and error-prone. Automating smoke tests offers:

  • Early feedback: Catch broken scripts before they’re merged or deployed.
  • Consistency: Enforce a standard validation step across teams.
  • Speed: Run checks as part of every pull request or CI/CD event.

This example uses Azure CLI commands to run the test. It would typically run as a build validation pipeline, triggered when a pull request is created.
The process deploys the Git code to Azure Data Lake Storage and triggers a Synapse pipeline to ingest the code into the database.
The DevOps agent is held open for the duration of the test, polling the Synapse pipeline at one-minute intervals for a pass or fail status.


If the test passes, the pull request is released, allowing the branch to be merged to main.

A failure holds the pull request open for a code fix and a re-queue of the smoke test.
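The poll-until-terminal-status pattern at the heart of the test can be sketched generically. This Python version (with an injectable `get_status` stand-in for the Synapse status call) shows the timeout and interval logic the PowerShell script applies later; names and defaults here are illustrative.

```python
import time

def poll_until_complete(get_status, interval_s=60, timeout_s=3600,
                        clock=time.monotonic, sleep=time.sleep):
    """Poll get_status() until it returns a terminal state or the timeout elapses."""
    deadline = clock() + timeout_s
    while clock() < deadline:
        status = get_status()
        if status in ("Succeeded", "Failed"):
            return status
        sleep(interval_s)
    return "TimedOut"

# Simulated pipeline: in progress twice, then it succeeds.
statuses = iter(["InProgress", "InProgress", "Succeeded"])
result = poll_until_complete(lambda: next(statuses), interval_s=0)
print(result)  # Succeeded
```

Injecting the clock and sleep functions keeps the loop testable without waiting on real time.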

Initialisation

  • Set initial variables and retrieve pipeline parameters:
  • parameters.project: The name of the project.
  • parameters.pullRequestId: The pull request ID.
write-host @"           
   __                    ___           
  (_  _  _   _ ( _  _     )  _   _ _)_ 
 .__) ) ) ) (_) )\ )_)   (  )_) (  (_  
                  (_       (_   _)     
"@

$organisation = "movie-ratings"
$uriPrefix = "https://dev.azure.com/$organisation"
$environment = "dev"
$workspace = "movie01"
$adlsAccount = "$environment$workspace"
$adlsContainer = "dbScripts"
$project = "$(System.TeamProject)"
$pullRequestId = "$(System.PullRequest.PullRequestId)"

Query API for Pull Request and Commit

  • Fetch pull request details via Azure DevOps REST API.
  • createdBy.displayName: The person who created the pull request.
  • repository.name: The repository name.
  • sourceRefName: The source branch ref.
  • lastMergeSourceCommit.commitId: The branch commit id.
$token = "Bearer $(System.AccessToken)"
$headers = @{Authorization=$token}

$uri = "$uriPrefix/$project/_apis/git/pullrequests/$($pullRequestId)?api-version=7.2-preview.2"
$prInfo = Invoke-RestMethod -Uri $uri -Method get -headers $headers -ContentType "application/json"

$pullRequestUser = $prInfo.createdBy.displayName
$repository = $prInfo.repository.name
$sourceRefName = $prInfo.sourceRefName
$commitId = $prInfo.lastMergeSourceCommit.commitId
$sourceBranch = $sourceRefName.Replace('refs/heads/','')
$runDateTime = Get-Date -UFormat "%d/%m/%Y %R"
  • Fetch commit details via Azure DevOps REST API.
  • committer.name: The person who created the commit.
  • committer.date: The commit date.
$uri = "$uriPrefix/$project/_apis/git/repositories/$repository/commits/$($commitId)?api-version=5.0"
$commitInfo = Invoke-RestMethod -Uri $uri -Method get -headers $headers -ContentType "application/json"

$lastCommitUser = $commitInfo.committer.name
$commitDate = $commitInfo.committer.date
$lastCommitDate = [datetime]::Parse($commitDate).ToString('dd\/MM\/yyyy HH:mm')

Deploy Code

  • Fetch the target Azure Data Lake directory from a CSV lookup.

code-groups.csv

$csv = Import-Csv $(System.DefaultWorkingDirectory)\csvlookups\code-groups.csv `
            | Where-Object {$_.project -eq $project} `
            | Select-Object adls_directory
$adlsDirectory = $csv.adls_directory
  • Create working directory for code pull.
Set-Location -Path $env:SYSTEM_DEFAULTWORKINGDIRECTORY
New-Item $repository -Type Directory | Out-Null
Set-Location -Path $repository
$sourceDir = (Get-Location).Path
  • Checkout code from Git.
$uri = "https://$env:SYSTEM_ACCESSTOKEN@dev.azure.com/$organisation/$project/_git/$repository"
git init --quiet
If ((git remote).count -gt 0) { git remote remove origin }
git remote add origin $uri
git config --global user.email "movie-ratings.co.uk"
git config --global user.name "movie-dev"
git fetch --quiet
git checkout $commitId --quiet
  • Start writing to the console.
write-host "Code:"
write-host " > $project > $repository > $sourceBranch"
write-host " > Last branch commit $commitId will be tested"
write-host " > Committed by $lastCommitUser on $lastCommitDate"
write-host "Test:"
write-host " > Triggered by $pullRequestUser with pull request $pullRequestId"
write-host " > Started $runDateTime in $environment-$workspace"
write-host " > Checking out and deploying code"
  • Deploy code.
$dlCheck = (az storage fs directory exists `
                --account-name $adlsAccount `
                --auth-mode login `
                -n $adlsDirectory/$repository `
                -f $adlsContainer | ConvertFrom-Json).exists
if($dlCheck) {
  az storage fs directory delete `
      --account-name $adlsAccount `
      --auth-mode login --file-system $adlsContainer `
      --name $adlsDirectory/$repository `
      --yes | Out-Null
}

az storage blob upload-batch `
    --account-name $adlsAccount `
    --auth-mode login -d $adlsContainer `
    --destination-path $adlsDirectory/$repository -s $sourceDir `
    --no-progress | Out-Null

Trigger Synapse pipeline

  • Start pipeline plRunSmokeTest.
$pipeline = "plRunSmokeTest"
$parameters = '{\"pGroupName\":\"'+$repository+'\"}'

$plTestStart = Get-Date -Format "yyyy-MM-ddTHH:mm:sszzz"
$plTest = az synapse pipeline create-run `
                --workspace-name ${{ variables.workspace_name }} `
                --name $pipeline `
                --parameters $parameters `
                --only-show-errors | ConvertFrom-Json
$plTestRunid = $plTest.runId
  • Write to the console.
write-host " > Running pipeline $plTestRunid ($pipeline)"
write-host " > The code test has started.  Please wait..."

Declare function

  • Helper used during polling: queries all pipeline runs in the test window and appends any child runs invoked by a run already in the array.
function BuildRunIdArray {
  param( [string[]]$plParentRunId, [string]$plTestStart, [string]$plTestEnd )

  $plChild = (az synapse pipeline-run query-by-workspace `
                  --workspace-name ${{ variables.workspace_name }} `
                  --last-updated-after $plTestStart `
                  --last-updated-before $plTestEnd `
                  --only-show-errors) | ConvertFrom-Json

  for($i=0 ; $i -lt $plParentRunId.Length; $i++) {
    for ($k = 0; $k -lt $plChild.value.count; $k++) {
      $plChildParentRunId = $plChild.value[$k].invokedBy.pipelineRunId
      $plChildId = $plChild.value[$k].runId

      if ($plChildParentRunId -eq $plParentRunId[$i]) {
          if ($plParentRunId.contains($plChildId) -eq $false) {
              $plParentRunId += $plChildId
          }
      }
    }
  }
  return $plParentRunId
}
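BuildRunIdArray widens the run-id array by one level of the child-pipeline tree per call, which is why the polling step later invokes it repeatedly. For comparison, the same traversal can be written as a breadth-first walk with no fixed depth limit; this Python sketch uses `children_of` as a hypothetical stand-in for the query-by-workspace call.

```python
def collect_run_ids(root, children_of):
    """Walk the parent->child pipeline-run tree and return every run id once."""
    seen = [root]
    frontier = [root]
    while frontier:
        next_frontier = []
        for parent in frontier:
            for child in children_of(parent):
                if child not in seen:   # de-duplicate, as the original does
                    seen.append(child)
                    next_frontier.append(child)
        frontier = next_frontier        # descend one level per iteration
    return seen

# Hypothetical run tree: parent "A" invokes "B" and "C"; "B" invokes "D".
tree = {"A": ["B", "C"], "B": ["D"]}
print(collect_run_ids("A", lambda r: tree.get(r, [])))  # ['A', 'B', 'C', 'D']
```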

Return Test Result

  • Start polling the Synapse pipeline status.
  • Create array to store all pipeline run Ids involved in the test.
  • Start a 60-minute timer with a 60-second polling interval.
  • Poll status of InProgress pipeline plRunSmokeTest.
  • When test completes, build array of all child pipeline run ids involved.
$plArray = @()
$testFailed = $false
$timeout = new-timespan -Minutes 60
$sw = [diagnostics.stopwatch]::StartNew()

while ($sw.elapsed -lt $timeout){
  $plParent = az synapse pipeline-run show `
                --workspace-name ${{ variables.workspace_name }} `
                --run-id $plTestRunid `
                --only-show-errors | ConvertFrom-Json

  if ($plParent.status -eq "Succeeded" -Or $plParent.status -eq "Failed"){
    $plTestEnd = Get-Date -Format "yyyy-MM-ddTHH:mm:sszzz"
    
    #build runId array, walking up to 7 levels down the child pipeline tree
    $plArray += $plParent.runId
    1..7 | ForEach-Object {
      $plArray = BuildRunIdArray $plArray $plTestStart $plTestEnd
    }
  • Write test result to console.
  • If test fails, get status and error of each run id involved in test.
    write-host "Result:"

    if($plParent.status -eq "Succeeded") {
      write-host " > The code test has passed"
    }
    else {
      $testFailed = $true
      $scriptErrors = @()
      $reportedErrors = @()
      
      foreach($plrid in $plArray) {
        $plQuery = az synapse pipeline-run show `
                        --workspace-name ${{ variables.workspace_name }} `
                        --run-id $plrid `
                        --only-show-errors | ConvertFrom-Json

        $plRunid = $plQuery.runId
        $plName = $plQuery.pipelineName
        $pipelineError = $plQuery.message
  • Remove unwanted strings from the error message.
  • Strip from the pipeline error message any strings found in error-strings.csv.

error-strings.csv

        $csv = Import-Csv $(System.DefaultWorkingDirectory)\csvlookups\error-strings.csv -Delimiter "|" `
                            | Select-Object error_string
        foreach ($row in $csv) {
          $noisyString = $($row.error_string)
          $pipelineError = $pipelineError -replace($noisyString,'')
        }
  • Write test result to console.
  • Store errors from plRunScript pipeline runs.
        if($plQuery.status -eq "Failed") {
          if($plname -eq "plRunSmokeTest") {
            write-host " > The code test has failed!"
            write-host "Errors:"
          }

          if($plname -eq "plRunScript") {
            $sqlScript = $plQuery.parameters | Select-Object -ExpandProperty pFileName

            $scriptErrors += [PSCustomObject]@{
              script = $sqlScript
              errorMessage = $pipelineError
              scriptErrorMessage = $sqlScript+$pipelineError
            }
          }
        }
      }
  • Write script errors to the console.
  • Iterate the script errors.
  • Check whether each script error has already been reported.
  • Store the reported script error.
  • Write unreported script errors to the console.
      for ($i = 0; $i -lt $scriptErrors.count; $i++)  {
        $script = $scriptErrors[$i].script
        $errorMessage = $scriptErrors[$i].errorMessage
        $scriptError = $scriptErrors[$i].scriptErrorMessage

        $isReportedError = ($reportedErrors | Where-Object { $_.reportedError -eq $scriptError }).count
        
        $reportedErrors += [PSCustomObject]@{
          reportedError = $scriptError
        }

        if ($isReportedError -eq 0) {
          write-host "`nScript: [$script]"
          write-host "Error: $errorMessage"
        }
      }
    }
    if ($testFailed) {   
      exit 1
    }
    return
  }
  start-sleep -Seconds 60
}

#no terminal status within the timeout
write-host "Result:"
write-host " > The code test timed out after 60 minutes"
exit 1