Agile BI Tools - Database versioning with SSDT #1

Technical Value

Microsoft's SQL Server Data Tools (especially for databases) have become stronger and stronger. They can compare whole database catalogs, generate deployment scripts, deploy initial table data, and much more. In the end you can fully maintain (develop, build, deploy, document, and reverse engineer) a database.

I have been using SSDT for a while now, and sometimes I ask myself how to maintain the migration of a database developed in SSDT. Up to now we have been able to avoid complex migrations, but it would be nice to have a process that handles such migrations more or less automatically. Before we dive into the different migration scenarios, I have to point out that a software version for the database is mandatory for maintaining complex migration scripts. Typically you write migration scripts to migrate a database from version x.y to x.y+1. This lets you handle complex scenarios, for example migrating a CDC-enabled database or big fact tables.

The Software Version in SSDT projects

SSDT (to be precise DacFx, the backend of SSDT) already implements a versioning mechanism. Let's take a look at the database properties:

[Screenshot: SSDT_Example 05] [Screenshot: SSDT_Example 06]

As you can see, the database's logical name and version are stored in the project file of the database. Because the project file is a plain MSBuild file, we have many opportunities to extend the build behavior: for example, auto-incrementing a build number at the end of the version string, or adding a debug or release suffix. One drawback is that this version number cannot be accessed by the scripts themselves. It would be desirable to have a SQLCMD variable like $(DacVersion).

A marginal note: it seems that Microsoft's preferred way of maintaining migration scripts is to implement a build or deployment contributor (see http://msdn.microsoft.com/en-us/library/dn268597(v=vs.103).aspx for reference). That implies writing some C# code that handles the migration for you. If you go this way, you have to address the usual bug-fixing and deployment challenges to get it working in your development environment.

Let's get back to our "scripting" way of handling database migrations and examine an SSDT project file. As you can see, the DacVersion property holds our version information. This property is a normal MSBuild property.

[Screenshot: SSDT_Example 01]

To access this information in our scripts we need to add a SQLCMD variable. Let's call it CurrentDacVersion.

[Screenshot: SSDT_Example 02]

In the next step we have to edit the project file. Search for the CurrentDacVersion SQLCMD variable. You should find something like this:

[Screenshot: SSDT_Example 03]

This is the definition of a SQLCMD variable, and it is connected to the MSBuild variable $(SqlCmdVar__1). Changing this to $(DacVersion) solves the problem. Save the project file and reload it in Visual Studio.
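To make the edit concrete, here is a sketch of the relevant fragment of a .sqlproj file after the change (the version value is an example; surrounding MSBuild elements are omitted):

```xml
<PropertyGroup>
  <!-- The version maintained in the database properties window -->
  <DacVersion>1.0.0.0</DacVersion>
</PropertyGroup>
<ItemGroup>
  <SqlCmdVariable Include="CurrentDacVersion">
    <DefaultValue>
    </DefaultValue>
    <!-- Originally $(SqlCmdVar__1); changed to forward the DacVersion property -->
    <Value>$(DacVersion)</Value>
  </SqlCmdVariable>
</ItemGroup>
```

Because $(DacVersion) is resolved like any other MSBuild property, the SQLCMD variable now always follows the version set in the project properties.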
Examining the SQLCMD Variables tab of the project properties will now show this:

[Screenshot: SSDT_Example 04]

Now you can reference your current database version via the $(CurrentDacVersion) SQLCMD variable in your pre- or post-deployment scripts, with some limitations:

  • The local value overrides the default value, so the default value cannot be used anymore.
  • Changing the local value will not affect the version number in the project file; the version has to be changed in the properties window. So this SQLCMD variable is read-only for developers.
  • You need to synchronize your publishing profiles so that the variable gets updated. However, it is best practice to review the publishing profiles occasionally anyway.
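For the last point, a publish profile carries its own copy of the SQLCMD variable values, so the entry there has to be kept in sync by hand. A sketch of such a profile (database name, connection string, and version value are examples):

```xml
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <TargetDatabaseName>SSDT_Example</TargetDatabaseName>
    <TargetConnectionString>Data Source=localhost;Integrated Security=True</TargetConnectionString>
  </PropertyGroup>
  <ItemGroup>
    <!-- Must be kept in sync with the DacVersion in the .sqlproj -->
    <SqlCmdVariable Include="CurrentDacVersion">
      <Value>1.0.0.0</Value>
    </SqlCmdVariable>
  </ItemGroup>
</Project>
```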

The next step is writing some SQL scripts that compare the target database version with the current database version and migrate certain database objects (for example, CDC-enabled tables). I will cover this topic in my next post.
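As a rough sketch of the idea: assuming the target database was registered as a data-tier application, its deployed version can be read from the msdb.dbo.sysdac_instances view and compared with the version being deployed (the database name is an example):

```sql
-- Pre-deployment sketch: compare the deployed version with the one being deployed.
DECLARE @deployedVersion nvarchar(64);

SELECT @deployedVersion = CAST(type_version AS nvarchar(64))
FROM   msdb.dbo.sysdac_instances
WHERE  instance_name = N'SSDT_Example';

-- Note: a plain string comparison is only a sketch; real version numbers
-- would need a part-by-part numeric comparison (e.g. 1.10 > 1.2).
IF @deployedVersion IS NOT NULL AND @deployedVersion < N'$(CurrentDacVersion)'
BEGIN
    PRINT 'Migrating from version ' + @deployedVersion
        + ' to version $(CurrentDacVersion)...';
    -- custom migration steps go here, e.g. for CDC-enabled tables
END
```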


D Esser
Fr, 30.10.2015 - 09:47

Hi Manuel,

SSDT handles database consistency and deployment very well, but some topics are too complex to be handled in a standard way. For example, when migrating huge fact tables with many indexes, SSDT would drop the indexes, create a temporary table with the updated schema (as in CTAS), drop the old table, and rename the new one. There are many reasons why this is a bad idea: the table is locked and unavailable, the run-time is long, and so on.

In my opinion you have three options. The first is to do the operation manually and import the changes back into the SSDT project (this sounds weird, I know). The second is to automate the manual operation. Here at ORAYLIS we've built a template for the most common operations; the solution is to use pre- and post-deployment scripts together with T4 text templating. Have a look at Dave Ballantyne's blog: <a href="http://dataidol.com/davebally/2014/03/29/t4-support-in-ssdt/" rel="nofollow">T4 Support in SSDT</a>.

The third option is a combination of the first two: first develop the manual operation, then integrate the custom migration script into your pre- and post-deployment scripts. For example, you can find a simple solution for CDC-enabled tables here: <a href="http://blog.oraylis.de/2014/06/agile-bi-tools-database-versioning-with-…" rel="nofollow">Agile BI Tools – Database versioning with SSDT #2</a>.

Kind regards

Do, 22.10.2015 - 11:39

Hi Daniel,

how do you handle custom migration scripts with SSDT in your projects?

