HiveBrain v1.2.0

Syncing two databases in SQL Server

Submitted by: @import:stackexchange-dba
Tags: databases, sql, two-server, syncing

Problem

I have two SQL Server databases. One is on the client (a Windows application) and the other is on the server. I want to sync these two databases periodically (e.g. every 2 minutes).

I have read about different ways of syncing, such as replication, timestamps, log tables maintained by triggers, Microsoft Sync Framework, and so on.

I would rather not use a syncing method that is a black box (like replication), because I don't want the SQL Server tables to be blocked while I'm updating them and syncing them with the server.

  • Which method do you think I should use in these circumstances? Remember that every few minutes I must send changes from several tables from the client to the server, and also fetch changes from two tables on the server.

  • I have also thought of an approach that seems strange but new. Would it be possible to log all executed (specifically chosen) stored procedures on the client and send them, with their parameters, in a .sql file to the server to be executed there? The same would happen on the server side and be sent back to the client. Do you think this is a simple yet useful method, or not?

  • Please suggest any useful approach you can. Thank you very much.

EDIT: Remember that this is near-real-time synchronization, and that is what makes it special. While a client user is working with the tables, the synchronization with the server must run every few minutes, so none of the tables may be locked.

Solution

Well, I might not fully understand the situation, but I'll try to answer.

You said you need a high-performance solution that runs often (at least every 2 minutes) and an approach that is fast and avoids locking. But you don't want a black-box system.

Instead of a black-box system that is used in millions of installations with good results, you want to reinvent the wheel and build your own solution? Hm, that sounds a bit odd.

That said, here are my suggestions.

  • Replication, even though you said you won't use it. It is by far the easiest and best solution for this: it is simple to set up, replicates quickly, and you don't have to reinvent the wheel. If you're only worried about locking, try setting the isolation level to READ_COMMITTED_SNAPSHOT. This will use up part of your tempdb for row versions, but your tables stay readable and writable while replication works in the background.



See the example below:

ALTER DATABASE yourDatabase SET ALLOW_SNAPSHOT_ISOLATION ON
ALTER DATABASE yourDatabase SET READ_COMMITTED_SNAPSHOT ON
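
To confirm that both settings took effect, you can query sys.databases (replace yourDatabase with your actual database name; note that the READ_COMMITTED_SNAPSHOT change requires no other active connections to the database):

SELECT name,
       snapshot_isolation_state_desc,
       is_read_committed_snapshot_on
FROM sys.databases
WHERE name = 'yourDatabase';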


  • CDC (Change Data Capture) can also be a solution, but with this approach you need to build nearly everything on your own, and in my experience CDC can be fragile in some circumstances. CDC captures all data changes on a watched table (you must register each watched table manually). Afterwards you get the value before and the value after an INSERT, UPDATE, or DELETE. CDC retains this information for a period of time that you can configure. The approach would be to enable CDC on the tables you need to watch and manually replicate those changes to the other database. By the way, CDC uses SQL Server replication under the hood too. ;-)




Warning: CDC is not aware of DDL changes. This means that if you alter a watched table and add a new column, CDC will keep watching the table but ignore all changes to the new column; it records only NULL as the before and after value for it. You need to reinitialize CDC after DDL changes to a watched table.
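
If you go the CDC route, enabling it looks roughly like this (dbo.Customers is a placeholder table name; sys.sp_cdc_enable_db requires sysadmin rights, and SQL Server Agent must be running for the capture job to pick up changes):

-- Enable CDC at the database level
EXEC sys.sp_cdc_enable_db;

-- Enable CDC for one specific table
EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name   = N'Customers',
    @role_name     = NULL;  -- NULL = no gating role for reading changes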

  • The approach you described above is similar to capturing a workload with SQL Server Profiler and replaying it against another database for benchmarking. It could work, but there are too many side effects for my taste. What do you do if you capture a procedure call on your client and then run the same command on your principal database while it is out of sync? The procedure may complete, but it may delete, update, or insert rows that were never present on your client. And how do you handle multiple clients with one principal? I think this is too fragile. In the worst case, you destroy your data integrity.



  • Another idea is an application-based approach or triggers, depending on how many tables you want synced. You can write all changes to a separate staging table and run a SQL Server Agent job every x minutes to sync the staging rows to your master. But this may be too heavy if you try to sync, say, 150 tables; the overhead would be considerable.
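
A minimal sketch of the trigger/staging-table idea (the table and column names here are made up for illustration; the inserted and deleted pseudo-tables inside the trigger hold the new and old row images):

-- Staging table that records pending changes
CREATE TABLE dbo.SyncStaging (
    StagingId INT IDENTITY(1,1) PRIMARY KEY,
    TableName SYSNAME   NOT NULL,
    KeyValue  INT       NOT NULL,
    Operation CHAR(1)   NOT NULL,  -- 'I', 'U' or 'D'
    ChangedAt DATETIME2 NOT NULL DEFAULT SYSUTCDATETIME()
);
GO

-- Trigger on a watched table that logs every change
CREATE TRIGGER dbo.trg_Customers_Sync
ON dbo.Customers
AFTER INSERT, UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;

    -- New or updated rows (an UPDATE has rows in both pseudo-tables)
    INSERT INTO dbo.SyncStaging (TableName, KeyValue, Operation)
    SELECT 'Customers', i.CustomerId,
           CASE WHEN EXISTS (SELECT 1 FROM deleted) THEN 'U' ELSE 'I' END
    FROM inserted AS i;

    -- Pure deletes (no matching inserted rows)
    INSERT INTO dbo.SyncStaging (TableName, KeyValue, Operation)
    SELECT 'Customers', d.CustomerId, 'D'
    FROM deleted AS d
    WHERE NOT EXISTS (SELECT 1 FROM inserted);
END;

The Agent job would then read dbo.SyncStaging in ChangedAt order, apply the rows to the other database, and delete the processed entries.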



Well, these are my two cents. Hopefully this gives you a good overview, and maybe one of these solutions works for you.


Context

StackExchange Database Administrators Q#105965, answer score: 14
