SQLUNINTERRUPTED

I am just a medium, SQL Server the Goal

Monthly Archives: July 2015

Introduction to Stretch Databases – Part Three

Continuing with my earlier posts on Stretch Databases – Part1, Part2, I will discuss some key aspects to keep in mind when stretching a database.

Note- SQL Server 2016 is still in preview and some of this information can change in the future.

Insert, Update and Delete

As I mentioned in my earlier posts, once records have been moved to the remote Azure database, they cannot be deleted or updated from the local server. All updates and deletes need to be run explicitly on the remote server. The following error is returned if an attempt is made to update or delete a record which is already on the remote server.

Update and delete of rows eligible for migration in table ‘Test_RemoteArchive’ is not allowed because of the use of REMOTE_DATA_ARCHIVE.
Msg 3609, Level 16, State 1, Line 3
The transaction ended in the trigger. The batch has been aborted.

New data inserted into the table lands in the local table first and is archived to the remote server at a later point in time.
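You can watch this migration progress through a dynamic management view. A minimal sketch (DMV and column names are as of the SQL Server 2016 preview and may change):

```sql
-- Shows the batches of rows migrated from the local table to the remote Azure database
SELECT * FROM sys.dm_db_rda_migration_status;
```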

Select Operations

As previously mentioned, a SELECT operation executes the query on both the local and the remote server and then concatenates the data before sending it to the client. A typical query plan for a query on a stretched table includes a Remote Query operator for the portion that runs against the Azure database.
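For example, assuming the Test_RemoteArchive table from the error message above, a plain SELECT is transparently split into a local scan plus a remote query, and the results are combined before being returned:

```sql
-- Counts rows across both the local table and the rows already migrated to Azure
SELECT COUNT(*) FROM dbo.Test_RemoteArchive;
```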

If SQL Server is not able to connect to the remote linked server (either because of network related issues or because of authentication issues) the following error would be returned.

OLE DB provider “SQLNCLI11” for linked server “ServerName” returned message “Login timeout expired”.
OLE DB provider “SQLNCLI11” for linked server “ServerName” returned message “A network-related or instance-specific error has occurred while establishing a connection to SQL Server. Server is not found or not accessible. Check if instance name is correct and if SQL Server is configured to allow remote connections. For more information see SQL Server Books Online.”.

Msg 53, Level 16, State 1, Line 3
Named Pipes Provider: Could not open a connection to SQL Server [53].

This error is also logged in the SQL Server error log. If the connection to the remote server gets terminated in the middle of query execution, you would get the following error:

OLE DB provider “SQLNCLI11” for linked server “ServerName” returned message “Communication link failure”.
Msg 10054, Level 16, State 1, Line 3
TCP Provider: An existing connection was forcibly closed by the remote host.

Backup and Restore

Databases enabled for Stretch can be backed up and restored just like ordinary databases. The only difference is that immediately after a restore you won't be able to query the remote table data. In order to query the remote data, you first have to reconnect the database to the remote database. This can be done using the stored procedure sys.sp_reauthorize_remote_data_archive.

Exec sp_reauthorize_remote_data_archive N'sql_azure_sysadmin_username', N'password'

Zipping and Unzipping Files – The Fast and Furious Way

Recently, while working on a customer project, we were required to zip and unzip hundreds of files of varying size. The objective was to import IIS logs from 100+ servers into the Microsoft Analytics Platform System (APS) for further analytics. The SLA we were working against was 1 hour, i.e. APS should have the data available with a maximum latency of 1 hour. The processing required us to copy logs from these IIS servers to a central NAS, and from there to the APS Landing Zone. The hop through the central NAS was required because of some regulatory requirements at the client.

The IIS logs, which were created every hour, were anywhere between 1 MB and 50 MB, depending on the activity on the servers. Copying the raw files was out of the question for obvious reasons. So we had to zip the files (thankfully IIS logs compress tremendously well) and copy them to the central NAS, and from there copy and unzip the files on the Landing Zone.

We tried multiple options, such as calling executables (rar, unrar, 7-zip) from PowerShell, but that did not scale well. Eventually we settled on the .NET compression routines (.NET 4.5 and above) for this, and saw overwhelming results (~13-15 times faster execution) for both compression and decompression.

For the purpose of this blog, I created a test folder and filled it with 204 files of different types (images from our last trip, some PowerPoint presentations, etc.) and varying sizes (max 7 MB, min 5 KB), for a total size of 320 MB.

Compression Results using rar/7-zip

$files = Get-ChildItem -Recurse "C:\Intel\BlogTests\TestFolder"
$ZipUtility = 'C:\Program Files\WinRAR\rar.exe'
$starttime = Get-Date
foreach ($file in $files)
{
    $fileToCompress = $file.FullName
    $outputFile = "C:\Intel\BlogTests\ZipFiles\" + $file.Name.Replace(".","_") + ".rar"
    # "a" adds the file to a new archive
    $zipCommandparam = " a " + $outputFile + " " + $fileToCompress
    # -Wait blocks until rar.exe finishes before moving to the next file
    Start-Process -FilePath $ZipUtility -ArgumentList $zipCommandparam -NoNewWindow -Wait
}
$endTime = Get-Date
Write-Host "Time to Complete Using RAR -: "$endTime.Subtract($starttime).ToString()

Results With Rar.exe  —> Total Duration – 00:03:28 (3 minutes, 28 seconds)
Results With 7-zip.exe —> Total Duration – 00:03:48 (3 minutes, 48 seconds)

Compression Results using .Net Compression Routine


# Load the assembly that provides System.IO.Compression.ZipFile (.NET 4.5+)
Add-Type -AssemblyName 'System.IO.Compression.FileSystem'

$files = Get-ChildItem -Recurse "C:\Intel\BlogTests\TestFolder"
$starttime = Get-Date
foreach ($file in $files)
{
    $fileToCompress = $file.FullName
    # Stage each file in its own temporary directory, since CreateFromDirectory zips a whole directory
    $directorypath = "C:\Intel\BlogTests\ZipFiles\" + $file.Name.Substring(0,$file.Name.LastIndexOf("."))
    $outputzipfile = "C:\Intel\BlogTests\ZipFiles\" + $file.Name.Replace(".","_") + ".zip"

    $directory = [IO.Directory]::CreateDirectory($directorypath)
    $filetocopy = $directorypath + "\" + $file.Name
    [IO.File]::Copy($file.FullName,$filetocopy)

    # Zip the staging directory, then remove it
    [System.IO.Compression.ZipFile]::CreateFromDirectory($directorypath,$outputzipfile)
    [IO.Directory]::Delete($directorypath,$true)
}
$endTime = Get-Date
Write-Host "Time to Complete Using .Net Routine -: "$endTime.Subtract($starttime).ToString()

Results With .Net Routine —> Total Duration – 00:00:17 (17 Seconds)

We achieved similar results while decompressing the files.
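For completeness, here is a minimal decompression sketch using the same .NET API (the paths are illustrative):

```powershell
# Load the assembly that provides System.IO.Compression.ZipFile (.NET 4.5+)
Add-Type -AssemblyName 'System.IO.Compression.FileSystem'

$zips = Get-ChildItem "C:\Intel\BlogTests\ZipFiles" -Filter *.zip
$starttime = Get-Date
foreach ($zip in $zips)
{
    # Extract each archive into its own folder under the output directory
    $destination = "C:\Intel\BlogTests\UnzippedFiles\" + $zip.BaseName
    [System.IO.Compression.ZipFile]::ExtractToDirectory($zip.FullName, $destination)
}
$endTime = Get-Date
Write-Host "Time to Complete Decompression -: "$endTime.Subtract($starttime).ToString()
```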