Whilst continuous integration (CI) is a well-established practice in the context of application development, its use within database development has typically lagged behind.
In this post I'm walking through a CI framework for SQL Server that's built upon Visual Studio 2015, SSDT, GitHub, DACPACs, TeamCity and PowerShell.
My implementation example for this framework is broken down into these 9 sections:
(1) Create a Visual Studio SSDT SQL Server project and import a development database into it
(2) Add a DACPAC-based PowerShell deployment script to the project
(3) Create an empty repository for the database project in GitHub
(4) Create a local Git repository and publish to the remote GitHub repository
(5) Create a TeamCity project and connect it to the GitHub repository
(6) Add a build configuration, 2 build steps and 1 trigger to the TeamCity project
(7) Create a target QA database and populate it with a copy of the development database
(8) Make a change to the development database project, build and test in the dev environment and then check in
(9) Check that the CI process has automatically updated the QA database with this change
Note that my scope here spans only the initial stages of CI, whereby a developer check-in to source control triggers a build of the project on a CI server, after which the DACPAC output from that build is applied to a QA database. The next stage – running automated tests on the QA database – is the subject of my next post.
Also because this example relies solely on DACPACs to deploy changes to the QA server, only database meta-data changes are considered (see here for a good explanation of DACPACs). If table data (e.g. within system lookup tables) is updated in the development project and needs transferring to QA, then another mechanism is required to handle this, such as defining a post-deployment script within the SSDT project.
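As an illustration, a post-deployment script that keeps a lookup table's contents in sync on every deployment might use a MERGE statement. The dbo.OrderStatus table and its columns below are invented purely for this sketch:

```sql
-- Script.PostDeployment.sql (set its Build Action to PostDeploy in the SSDT project)
-- Hypothetical lookup table: make dbo.OrderStatus match this list on every deployment
MERGE dbo.OrderStatus AS target
USING (VALUES
    (1, N'Open'),
    (2, N'Shipped'),
    (3, N'Closed')
) AS source (StatusId, StatusName)
ON target.StatusId = source.StatusId
WHEN MATCHED AND target.StatusName <> source.StatusName THEN
    UPDATE SET StatusName = source.StatusName
WHEN NOT MATCHED BY TARGET THEN
    INSERT (StatusId, StatusName) VALUES (source.StatusId, source.StatusName)
WHEN NOT MATCHED BY SOURCE THEN
    DELETE;
```

Because the script runs after every DACPAC deployment, it needs to be written idempotently, which is why a MERGE is preferable to plain INSERT statements here.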
For this post I’m using Visual Studio 2015 Community Edition, SQL Server Data Tools (SSDT) v16.5, SQL Server 2014 SP1 and JetBrains TeamCity v10.0.5.
Our first step is to create a new SQL Server Database project in SSDT:
I’ve called the project ContosoRetail, and I’ll be using it to hold the database definition for Microsoft’s ContosoRetailDW sample database. If I right-click the project in the SSDT solution explorer I have the option to import an existing database into it:
Once I select ContosoRetailDW from the subsequent dialog and import it, the project populates as follows:
Note that this import includes database meta-data only – no table rows are imported. Note also the significance of this step – I now have a meta-data description of the database that I can check into source control.
Because my build target is SQL Server 2014, I right-click the project node, select Properties and change the Target Platform from the default of SQL Server 2016 to SQL Server 2014.
The CI process that we’re implementing uses a CI server to build the SSDT database project and then apply the resulting DACPAC to the QA database. The following PowerShell script (which originates from here) is used to apply the DACPAC:
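A minimal sketch of such a script, using the Microsoft.SqlServer.Dac (DacFx) API, is shown below. The DacFx DLL path and the DACPAC's bin\Debug location under the checkout directory are assumptions for illustration:

```powershell
param(
    # TeamCity checkout directory, passed in via -tcDir by the TeamCity build step
    [string]$tcDir = "."
)

# Load the DacFx assembly (the path varies with the SQL Server / SSDT version installed)
Add-Type -Path "C:\Program Files (x86)\Microsoft SQL Server\120\DAC\bin\Microsoft.SqlServer.Dac.dll"

# The DACPAC produced by the build, relative to the checkout directory
$dacpacPath = Join-Path $tcDir "ContosoRetail\bin\Debug\ContosoRetail.dacpac"

# Connect to the local SQL Server installation and deploy to the ContosoRetailQA database
$dacServices = New-Object Microsoft.SqlServer.Dac.DacServices -ArgumentList "Data Source=localhost;Integrated Security=True"
$dacPackage  = [Microsoft.SqlServer.Dac.DacPackage]::Load($dacpacPath)
$dacServices.Deploy($dacPackage, "ContosoRetailQA", $true)   # $true = upgrade the existing database
```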
This needs to be customised for the target environment – above you can see that I reference the local SQL Server installation, the ContosoRetail.dacpac DACPAC file and a QA database called ContosoRetailQA.
Also note that deployment of a DACPAC can fail, for example if applying the change would result in data loss, so the script should include some error handling to deal with this situation gracefully.
I save this to a file called Deploy.ps1, then in SSDT I right-click the project and add a new folder called Scripts, and finally add the script into that folder by right-clicking it and selecting Add > Existing Item:
I use the GitHub GUI to create a repository called ContosoRetail:
I use the SSDT Team Explorer tab to create a new local Git repository in the same folder as the SSDT project:
and I then right-click the SSDT ContosoRetail solution node and select “Add Solution to Source Control”, which commits the ContosoRetail project to this repository. Now via the Team Explorer tab we can link and publish this to the remote repository that we created in the previous step:
From the Administration page of TeamCity I create a new project:
and then on the VCS Roots page I add a reference to my GitHub repository:
Now from the General Settings page we can create a manual Build Configuration:
and on the following page I select the VCS root that I created previously as the source control for this build configuration.
We can now create the build steps. For the first step we select the “Visual Studio (sln)” runner, since this step will be doing our database project build:
Then we create a second build step, this time using the “PowerShell” runner since here we will be running the Deploy.ps1 script shown previously, to deploy the previous step’s DACPAC output to the QA database:
Note the use of the -tcDir %teamcity.build.checkoutDir% parameter to pass the TeamCity checkout directory to the deploy script, since this is where the DACPAC file will be.
For the purposes of this demo example, these are the only steps I’m including. A build step that runs tests on the QA build is obviously required in practice, and in my next post I will discuss automated testing and show how it can be added to this TeamCity project. For now though, TeamCity shows this summary of our build configuration:
Now we can add a trigger to the build configuration that determines when the build will run. For this example the build runs on every check-in (an alternative would be to run the build on a schedule instead):
Then under Failure Conditions for the build configuration, we check all options:
And then from the top toolbar in TeamCity, we click on “Agents” to verify that we have an agent assigned to the TeamCity project.
Using SSMS I restored a backup of the ContosoRetailDW database to a QA database called ContosoRetailQA.
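The equivalent T-SQL for that restore would be along the following lines; the backup path, data file paths and logical file names here are placeholders (RESTORE FILELISTONLY reports the actual logical names inside a backup):

```sql
RESTORE DATABASE ContosoRetailQA
FROM DISK = N'C:\Backups\ContosoRetailDW.bak'
WITH MOVE N'ContosoRetailDW' TO N'C:\SQLData\ContosoRetailQA.mdf',
     MOVE N'ContosoRetailDW_log' TO N'C:\SQLData\ContosoRetailQA_log.ldf',
     RECOVERY;
```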
As a simple test, using SSDT I’m changing the data type of the AccountLabel column of the DimAccount table from nvarchar(100) to nvarchar(200):
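In the SSDT project this amounts to a one-line edit to the declarative DimAccount.sql table definition, abridged here to show only the relevant column:

```sql
CREATE TABLE [dbo].[DimAccount] (
    [AccountKey]   INT            NOT NULL,
    -- ...other columns unchanged...
    [AccountLabel] NVARCHAR (200) NULL
);
```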
I build this locally to verify the change and then commit the change and push it to GitHub.
The TeamCity project is configured to check the GitHub repository every minute, and within that time we see the build start to run:
It succeeds, and if we go to the ContosoRetailQA database in SSMS, we see that the AccountLabel column has had its data type amended from nvarchar(100) to nvarchar(200):
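One way to confirm this from a query window is via the metadata views:

```sql
SELECT CHARACTER_MAXIMUM_LENGTH
FROM   ContosoRetailQA.INFORMATION_SCHEMA.COLUMNS
WHERE  TABLE_NAME = 'DimAccount'
AND    COLUMN_NAME = 'AccountLabel';
-- returns 200 once the CI deployment has run
```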
The above was a whistle-stop tour through the implementation of CI for a SQL Server environment. For any organisation, the devil is in the detail of what a final solution would look like. The tools used within this implementation – e.g. SSDT, PowerShell and TeamCity – are incredibly flexible and so offer many alternatives and tweaks for how any single CI solution can be designed and developed.
I’ve specifically focused on the technical details of database CI implementation in this post, but for a broader look at database CI in general, Steve Jones’ SQLBits presentation on Continuous Integration for Databases is an informative watch.