Best Practice for running hourly backups on an SQL Azure Database?


I need to back up my production database at least every hour, as it contains a lot of financial information. I've looked into SQL Data Sync, but given the fast pace of schema changes we make all the time, it won't do the trick.

This article: http://msdn.microsoft.com/en-us/library/windowsazure/jj156170.aspx presents a few methods of achieving this, but I couldn't figure out which one is better: Database Copy or the Import/Export Service (.bacpac).

Another thing I came across is a technique for Database Mirroring across different datacenters: http://msdn.microsoft.com/en-us/library/windowsazure/jj870961.aspx. Isn't there an easier way to do it? And why would I even need it if Azure has automated backups? (Assuming it does.)

My question is what's the best way to do it?

Features I'm looking for:

  1. The process has to be entirely automated.
  2. I don't want to write new code / scripts to achieve the automation.
  3. This has to be reliable and the backup has to be simple to recover.

Thank you!


1 Answer

Answered by Eoin Campbell:

"Best Way" is going to get this closed as opinionative but I can tell you that we use the I/E Services and dump a bacpac file and it works well.

Backups are stored in Blob Storage and are easily accessed.

It's also easy to use the Import/Export data-tier functionality in SQL Server to pull a bacpac down and import it directly into your local/dev SQL Server instance.
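If you'd rather script that step than click through the wizard, here's a minimal sketch using SqlPackage.exe. The file, server, and database names are placeholders, and it assumes the bacpac has already been downloaded from Blob Storage and that SqlPackage.exe is on the PATH:

```python
# Minimal sketch: restore a previously exported .bacpac into a local SQL Server
# instance via SqlPackage.exe. All names below are placeholders; the .bacpac is
# assumed to have already been pulled down from Blob Storage.
import subprocess

subprocess.run(
    ["SqlPackage.exe", "/Action:Import",
     "/SourceFile:ProductionDb-backup.bacpac",      # the downloaded backup file
     "/TargetServerName:localhost",                  # local/dev instance
     "/TargetDatabaseName:ProductionDb_Restore"],    # restored copy
    check=True)
```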

Feature-wise...

  1. Process is completely automated - we've a nightly job that does it and the bacpac is just there each morning (a sketch of a similar scheduled job follows after this list).
  2. You won't have to write code - have a look here - http://msdn.microsoft.com/en-us/library/jj650016.aspx
  3. It's very easy to recover - bacpacs are stored in Blob Storage and can be restored to local via Import Export Data-Tier or restored to Azure via the management portal.
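For what it's worth, if you ever do want to roll your own scheduled export rather than rely on the documented job, a rough sketch of what it could look like is below. Everything here is illustrative: the connection strings, container name, and file naming are placeholders, SqlPackage.exe is assumed to be installed and on the PATH, and the azure-storage-blob Python package is assumed to be available for the upload step.

```python
# Rough sketch: export an Azure SQL database to a .bacpac with SqlPackage, then
# upload it to Blob Storage. Schedule this hourly with Task Scheduler / cron / an
# Azure job. All credentials and names are placeholders.
import subprocess
from datetime import datetime, timezone
from azure.storage.blob import BlobServiceClient

SOURCE_CONN = ("Server=tcp:myserver.database.windows.net,1433;"
               "Database=ProductionDb;User ID=backupuser;Password=...;Encrypt=True;")
STORAGE_CONN = "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;"
CONTAINER = "sql-backups"

def export_and_upload():
    stamp = datetime.now(timezone.utc).strftime("%Y%m%d-%H%M")
    bacpac = f"ProductionDb-{stamp}.bacpac"

    # Export the database to a local .bacpac file using SqlPackage.
    subprocess.run(
        ["SqlPackage.exe", "/Action:Export",
         f"/SourceConnectionString:{SOURCE_CONN}",
         f"/TargetFile:{bacpac}"],
        check=True)

    # Push the .bacpac into Blob Storage so it can be restored later.
    blob = (BlobServiceClient.from_connection_string(STORAGE_CONN)
            .get_blob_client(container=CONTAINER, blob=bacpac))
    with open(bacpac, "rb") as f:
        blob.upload_blob(f, overwrite=True)

if __name__ == "__main__":
    export_and_upload()
```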