SQL Server Data Files on Azure


Azure | SQL Server 2014

With SQL Server 2014 it’s easy to move database files to Azure Blob storage, even if the SQL Server instance runs on-premises. Azure Blob storage offers reliable, inexpensive, and highly available storage, which can be useful for “cold” data, for example.

However, the configuration is a little tricky, so I’m going to walk through the process step by step.

1. Create an Azure Blob store account and a container

Log into Azure and create a new storage account. For my example, I’m using “db4” as the name as shown below:


Next, I’m going to create a blob store container, which I name “data” here:


In order to access the container, we need the URL to the container (https://db4.blob.core.windows.net/data in my example) and the storage key. The key can be obtained by clicking “Manage Access Keys” at the bottom of the screen:


You can copy the key to the clipboard by clicking the icon right beside the Primary Access Key box.


For the next task I’m using Windows Azure Storage Explorer (download here). There you can add your storage account by pasting the access key into the storage account key input box:



2. Create a Shared Access Signature for the container

In Azure Storage explorer, select the container (data) and click on ‘Security’:


This brings up the following dialog. Make sure to select the List, Delete, Read, and Write permissions. After clicking ‘Generate Signature’, a shared access signature is created. Copy this signature to the clipboard.



3. In SQL Server: Create a credential for the blob container

In SQL Server, we use the CREATE CREDENTIAL statement to create a credential for the blob store. The credential name must be the URL of the container. Make sure to replace the secret with the shared access signature generated in the last step (I obfuscated the key by overwriting part of it with ‘x’):


  CREATE CREDENTIAL [https://db4.blob.core.windows.net/data]
  WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
  SECRET = 'sv=2014-02-14&sr=c&sig=c%2Fxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx3%3A00%3A00Z&se=2014-12-31T23%3A00%3A00Z&sp=rwdl'

If you like, you can check the credential with “SELECT * FROM sys.credentials”:
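For instance, a quick check against the catalog view could look like this (a minimal sketch; the newly created credential should show up with the container URL as its name):

```sql
-- List all server-level credentials; the one from the previous step
-- should appear with 'SHARED ACCESS SIGNATURE' as its identity
SELECT name, credential_identity, create_date
FROM sys.credentials;
```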



4. In SQL Server: Create a database that uses the blob container

The next step is simple: we just create a database that uses the container as its storage:


  CREATE DATABASE testdb
  ON
  ( NAME = testdb_dat,
    FILENAME = 'https://db4.blob.core.windows.net/data/TestData.mdf' )
  LOG ON
  ( NAME = testdb_log,
    FILENAME = 'https://db4.blob.core.windows.net/data/TestLog.ldf' )

You can create tables and load data just as you would with a local database file. Azure Storage Explorer lists the database files that were created:
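As a quick illustration (the table and column names below are made up for this example), data lands in the blob-hosted data file exactly as it would in a local .mdf:

```sql
USE testdb;
GO

-- Hypothetical "cold data" table; its pages are stored in
-- TestData.mdf, which lives in the Azure blob container
CREATE TABLE dbo.ColdSales (
    SaleId   INT IDENTITY(1,1) PRIMARY KEY,
    SaleDate DATE NOT NULL,
    Amount   DECIMAL(10,2) NOT NULL
);

INSERT INTO dbo.ColdSales (SaleDate, Amount)
VALUES ('2014-01-15', 199.90),
       ('2014-02-03', 49.50);
```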



5. Optional: Register the blob store in SQL Server Management Studio

You can register the blob store in SQL Server Management Studio by creating a connection to Azure:


The “Access Key” is the key we obtained in the first step and can simply be copied into the account key field:


After connecting to the Azure blob store, Management Studio shows our container together with the database files:


Of course, when database files are placed on Azure, a connection to the blob store is required. Without this connection, you will not be able to access the database:




With SQL Server 2014 it is easy to put data files on an Azure storage account, even for an on-premises SQL Server. Use cases include

  • store data that is not heavily queried
  • store data that you want to secure in a geo-redundant way
  • enhance the local storage of a SQL Server
  • perform a single table backup to the cloud
  • … and many more
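The cloud-backup case can also be covered by SQL Server 2014’s backup-to-URL feature. This is a minimal sketch; note that, unlike the data-file scenario above, BACKUP TO URL uses a credential built from the storage account name and access key rather than a shared access signature, and the credential name here is just an assumption for illustration:

```sql
-- Credential for backup to URL: identity = storage account name,
-- secret = the storage account's primary access key
CREATE CREDENTIAL AzureBackupCredential
WITH IDENTITY = 'db4',
SECRET = '<primary access key>';

-- Back up the database directly into the blob container
BACKUP DATABASE testdb
TO URL = 'https://db4.blob.core.windows.net/data/testdb.bak'
WITH CREDENTIAL = 'AzureBackupCredential';
```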
