Microsoft SQL Transaction Log Shipping to a Remote Server without SMB/CIFS -
Not Actually That Tricky
Applications, Cool Stuff, Replication / Data Shipping, Scripting, Servers -
August 8, 2012

from http://blog.glo-networks.com/2012/08/08/microsoft-sql-transaction-log-shipping-to-a-remote-server-without-smbcifs-not-actually-that-tricky/


Every now and then we get non-typical requests. Normally it'll be for something
mundane like helping out with a Managing Director's kids' computer, or providing
broadband for a board member.

However, the most recent one was to get Microsoft SQL database transaction logs
shipped to a remote server that's on a different domain and can't be reached
over a normal SMB/CIFS share.

Immediately the first thought was of VPNs, but that wasn't an option for
political and security reasons. The next option was to avoid transaction log
shipping and move to replication over HTTPS -- unfortunately also ruled out for
political reasons.

So we started breaking down transaction log shipping. What is it? In essence, a
transaction log backup, which is then copied and restored. Figuring that we
could set this up in parts, we queued up The A-Team theme and went to work
setting up a lab.

It turns out that it's surprisingly easy to do. On the primary server, cheat
and use the SQL Server Management Studio GUI to set up the primary as you
normally would, but don't add a secondary server. This gives you the backups
in a directory of your choice.

The next job is to transport those backups to the secondary server(s). For this
we chose to use WinSCP. It's something we often use and it's easily scriptable
(we're aware of other options -- rsync and so on). We used WinSCP in console
mode (/ini=path/to.ini) to create and save a session, then created a WinSCP
script that opens the connection and calls the synchronize command to sync the
local and remote folders. This is then run by a scheduled task every X minutes.
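As a rough sketch of what that WinSCP script looks like (the session name,
local path and remote path here are placeholders -- substitute your own):

```text
option batch abort
option confirm off
open saved-session-name
synchronize remote "C:\Path\To\SQL\Transaction Log Shipping" "/backups"
exit
```

The scheduled task then just runs winscp.com with /ini pointing at the saved
INI file and /script pointing at the file above.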

Now that the backups are being copied to the secondary server, the next and
final step is to get those backups restoring into the secondary SQL Server.
The first step is to restore a full copy of the database in standby mode (as
usual, if you manually seed). Next we need to set up the tasks to restore the
transaction log backups.
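For the manual seed, the restore looks something like the following -- the
backup file name, undo file path and logical file names are placeholders you'd
replace with your own:

```sql
-- Restore the seed full backup on the secondary in STANDBY mode,
-- leaving the database read-only and able to accept further log restores.
RESTORE DATABASE [secondary-database-name]
FROM DISK = N'C:\Path\To\SQL\Transaction Log Shipping\seed-full-backup.bak'
WITH STANDBY = N'C:\Path\To\SQL\Transaction Log Shipping\undo.tuf',
     -- MOVE is only needed if file paths differ from the primary
     MOVE N'LogicalDataFileName' TO N'C:\SQLData\secondary.mdf',
     MOVE N'LogicalLogFileName' TO N'C:\SQLData\secondary_log.ldf';
```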

Here we can't use the GUI, so we'll need to do it manually using the log
shipping stored procedures. If you're not familiar with these, I highly suggest
that you look them up rather than blindly running them. The timings were very
much tweaked for our settings -- you'll certainly need to alter the
"primary-server-name", "secondary-server-name", and "C:\Path\To\SQL\Transaction
Log Shipping" (this needs to match the destination that your files are copied
to on your secondary server) -- you'll also want to alter the retention and
deletion period to match the settings on your primary.




declare @LSCopy_Job uniqueidentifier,
    @LSRestore_Job uniqueidentifier,
    @LSSecondary_Id uniqueidentifier

exec sp_add_log_shipping_secondary_primary
    @primary_server = 'primary-server-name',
    @primary_database = 'primary-database-name',
    @backup_source_directory = 'C:\Path\To\SQL\Transaction Log Shipping',
    @backup_destination_directory = 'C:\Path\To\SQL\Transaction Log Shipping',
    @copy_job_name = 'LSCopy_primary-server-name_primary-database-name',
    @restore_job_name = 'LSRestore_primary-server-name_primary-database-name',
    @file_retention_period = 2160,
    @overwrite = 1,
    @copy_job_id = @LSCopy_Job OUTPUT ,
    @restore_job_id = @LSRestore_Job OUTPUT ,
    @secondary_id = @LSSecondary_Id OUTPUT

exec sp_add_log_shipping_secondary_database
    @secondary_database =  'secondary-database-name',
    @primary_server = 'primary-server-name',
    @primary_database = 'primary-database-name',
    @restore_delay = 0,
    @restore_all = 1,
    @restore_mode = 1,
    @disconnect_users = 1,
    @restore_threshold = 120



Now that we have the jobs set up, they'll need schedules set and to be enabled.
This is doable using the sp_add_schedule, sp_attach_schedule and sp_update_job
stored procedures; however, it can also be done via SQL Server Management
Studio -- use whichever you're more comfortable with. The important thing to
note is that the copy job is not required (since the copying is happening over
FTP(S)/SFTP/SCP/whatever) and can either be deleted or left disabled, and your
schedule must be sane.
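The stored-procedure route looks roughly like this -- the schedule name and
the 15-minute interval are assumptions, and the job name must match the
@restore_job_name used above:

```sql
-- Create a schedule that fires every 15 minutes, attach it to the
-- restore job, and enable the job.
DECLARE @schedule_id int;

EXEC msdb.dbo.sp_add_schedule
    @schedule_name = N'LSRestore every 15 minutes',
    @freq_type = 4,               -- daily
    @freq_interval = 1,           -- every day
    @freq_subday_type = 4,        -- at a minutes interval
    @freq_subday_interval = 15,
    @schedule_id = @schedule_id OUTPUT;

EXEC msdb.dbo.sp_attach_schedule
    @job_name = N'LSRestore_primary-server-name_primary-database-name',
    @schedule_id = @schedule_id;

EXEC msdb.dbo.sp_update_job
    @job_name = N'LSRestore_primary-server-name_primary-database-name',
    @enabled = 1;
```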

The only thing left to do now is to set up some form of monitoring. This is the
most important step. You can either use SQL alerts or look for event log
entries indicating that the databases are out of sync.
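One option is to poll the log shipping monitor tables on the secondary -- a
sketch along these lines (the exact alerting wrapper around it is up to you):

```sql
-- On the secondary: list any log-shipped database whose last restore
-- is older than its configured restore threshold.
SELECT secondary_database,
       last_restored_date,
       restore_threshold
FROM msdb.dbo.log_shipping_monitor_secondary
WHERE DATEADD(minute, restore_threshold, last_restored_date) < GETDATE();
```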

Now you can sit back, relax and utter the words "I love it when a plan comes
together".