Hello, and welcome to the final part of this series. This script combines the first four parts of the series into a complete, end-to-end process for synchronising an SSAS database from one server to another. Even though I've used this script for a while, it still gets tweaks and I'd still consider it a work in progress; however, it's as complete now as it has ever been. No doubt when I make more changes I'll post an update.

The script so far includes:

- creating a role on the source database and granting it administer permission
- adding the target instance's service account to that role
- checking the disk sizes and free space on both the source and target servers
- optionally dropping the target database, after capturing its data source, roles and members
- running the synchronisation itself via the Invoke-ASCmd cmdlet
- restoring the original data source, roles, members and permissions on the target

How Sync Works

Here’s a very high-level overview of how sync works:

As in part 4 of this series, we use the Invoke-ASCmd PowerShell cmdlet. We store a template of the sync XMLA file on disk. The template looks like this:


<Synchronize xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
	<Source>
		<ConnectionString>Provider=MSOLAP.6;Data Source=@@Source@@;Integrated Security=SSPI</ConnectionString>
		<Object>
			<DatabaseID>@@DatabaseID@@</DatabaseID>
		</Object>
	</Source>
	<SynchronizeSecurity>IgnoreSecurity</SynchronizeSecurity>
	<ApplyCompression>true</ApplyCompression>
</Synchronize>

Save this XMLA as “Synchronize.xmla” in “C:\temp”.

The @@DatabaseID@@ and @@Source@@ placeholders are replaced using PowerShell and the result is saved to a new XMLA file, so that the template can be reused. The new, saved sync XMLA file is then executed via the Invoke-ASCmd PowerShell cmdlet.
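
On its own, that step is just a couple of string replacements followed by one cmdlet call. Here is a minimal sketch of it, using made-up instance and database names ("olap01", "olap02", "ADWORKS"); the full script below does exactly this, driven by its parameters:

# replace the placeholders in the template and save a per-database copy
$syncxmla = Get-Content "C:\temp\Synchronize.xmla"
$syncxmla = $syncxmla.Replace("@@DatabaseID@@", "ADWORKS")
$syncxmla = $syncxmla.Replace("@@Source@@", "olap01")
$syncxmla | Out-File "C:\temp\SynchronizeADWORKS.xmla"
# the command is executed against the *target* instance, which pulls the database from the source named in the xmla
Invoke-ASCmd -InputFile "C:\temp\SynchronizeADWORKS.xmla" -Server "olap02"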

The idea of this script is to automate as much of the synchronisation as possible and to anticipate disk space issues where your cubes are very large or your disk space is limited. Ideally there should always be enough free space on your disk to allow a full process of the largest cube your instance hosts: when a full process runs, you end up with 2 copies of the database, and the new copy then replaces the old one. Likewise, if you plan on using synchronise, the ideal is to have enough space to hold 2 copies of the largest database. If that is not the case, then dropping the target database and re-syncing the whole thing is a decent enough alternative.
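
To make the disk requirement concrete, the heart of the check is comparing the target drive's free space against the size of the source database folder. A simplified sketch is below; $targetsvr, $TargetDriveLetter and $sourceDBSize are all worked out in the full script further down:

# warn if the target drive cannot hold a second copy of the database
$targetfreespace = (Get-WmiObject Win32_LogicalDisk -ComputerName $targetsvr -Filter "DeviceID='$TargetDriveLetter'").FreeSpace / 1gb
if ($targetfreespace -le $sourceDBSize)
{
Write-Warning "Not enough free space to hold two copies of the database; consider dropping the target copy first."
}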

It’s fairly generic; the only value you will want to change internally is the name of the data source (~line 32).

A large part of this script anticipates that the original cube may need to be dropped, so extracting the original data source, roles and members takes up a significant portion of it. Remember, this is only really important if the target and source databases genuinely differ between environments. I use this script when I need to refresh our UAT cubes with data from our production cubes after a release, hence the need for different data sources etc.
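
As a small illustration, preserving the target's data source boils down to reading the connection string before the sync and writing it back afterwards (the full script reconnects to the target and looks the data source up again before writing). A sketch, using the $targetdb and $DataSourceName variables defined in the script:

# capture the target's connection string before the sync overwrites it with the source's
$targetds = $targetdb.DataSources.FindByName($DataSourceName)
$targetconnectionString = $targetds.ConnectionString
# ... synchronise here ...
# then put the original connection string back and save the change
$targetds.ConnectionString = $targetconnectionString
$targetds.Update()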

Apart from synchronise, this approach should also work with Robocopy. You would just have to remove the sync part of this script and execute Robocopy instead.
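
If you went down that route, the copy itself would be a single Robocopy call along these lines; the data folder paths here are made-up examples, and the target instance would typically need to be stopped (or the database detached) while its files are overwritten:

# hypothetical example: mirror one database's data folder from the source server to the target server
# /MIR mirrors the directory tree (including deletions), /Z uses restartable mode, /R and /W control retries
robocopy '\\olap01\D$\OLAPData\ADWORKS.0.db' '\\olap02\D$\OLAPData\ADWORKS.0.db' /MIR /Z /R:2 /W:5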

I have added comments to the script, but if not all is clear then feel free to leave comments below.

# database is an optional parameter
# if not included then script will run for all databases on source instance
# run for specific databases by including a comma separated list of names 
# example syncCubes.ps1 -sourceInstance "olap01" -targetInstance "olap02" -syncAccount "ad1\sql_analysis" -Database "ADWORKS,ADWORKSDW" -Drop
# you can include just one database name
# drop is also an optional parameter; if disk space is not sufficient for 2 copies of the db then it will drop the target database
# full explanation of how drop works in comments below
[CmdletBinding()]
param(
[Parameter(Position=0,mandatory=$true)]
[string] $sourceInstance,
[Parameter(Position=1,mandatory=$true)]
[string] $targetInstance,
[Parameter(Position=2,mandatory=$true)]
[string] $syncAccount,
[Parameter(Position=3)]
[string[]] $database,
[Parameter(Position=4)]
[switch] $Drop)

[void][System.reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices")

#setup some variables to use in script
$sourcesvr = new-Object Microsoft.AnalysisServices.Server
$sourcesvr.Connect($sourceInstance)

$targetsvr = new-Object Microsoft.AnalysisServices.Server
$targetsvr.Connect($targetInstance)

$rolename = "Sync01"

$DataSourceName = "AdventureWorks"

#sort out whether we are running the sync for one, several, or all databases on the instance
[Microsoft.AnalysisServices.Database[]] $db = @()
if([string]::IsNullOrEmpty($database))
{
$db = $sourcesvr.Databases
}
else 
{
$database = $database.Split(",") 
ForEach ($d in $database)
{
$d = $d.Trim()
$db += $sourcesvr.Databases.FindByName($d)
}
}

# check if role exists. if it does not exist we create it and add permissions
foreach ($sourceDB in $db)
{
$t = Get-Date -format "d MMM yy HH:mm:ss"
Write-Output "${t}:*** Running Sync Script for $sourceDB ***"
# at some point we may need to skip syncing a particular database because of disk space issues
# whenever skip is set to 1 we will not sync
# let's be optimistic and set it to 0 for starters
$skip = 0
$role = $null
$role = $sourceDB.Roles.FindByName($rolename)
if ($role -eq $null) 
{
#Create Role
$t = Get-Date -format "d MMM yy HH:mm:ss"
"${t}:role $rolename created on $sourcesvr $sourceDB"

$roleToCreate = new-Object([Microsoft.AnalysisServices.Role])($rolename)
$sourceDB.Roles.Add($roleToCreate)
$roleToCreate.Update()

#add the permission
$t = Get-Date -format "d MMM yy HH:mm:ss"
"${t}:admin permission added for $roletoCreate on $sourcesvr $sourceDB"

$sourceDBperm = $sourceDB.DatabasePermissions.Add($roleToCreate.ID)

$sourceDBperm.Administer = [Microsoft.AnalysisServices.ReadAccess]::Allowed
$sourceDBperm.Update()
$role = $sourceDB.Roles.FindByName($rolename)
}
 
#check to see if role that exists has administer permission
#if not then administer is added
$sourceDBperm = $sourceDB.DatabasePermissions.FindByRole($role)
if ($sourceDBperm.Administer -eq $false)
{
$t = Get-Date -format "d MMM yy HH:mm:ss"
"${t}:admin permission added for $role on $sourcesvr $sourceDB"
$sourceDBperm.Administer = [Microsoft.AnalysisServices.ReadAccess]::Allowed
$sourceDBperm.Update()
}

# check if service account that runs target instance is a member of the role
# if it is not then we will add it to role
# this is necessary to ensure that the databases can be synced

$foundMember = $null
foreach ($member in $role.Members)
{
if ($member.Name -eq $syncAccount)
{
$foundMember = $member
}
}

If ($foundMember -eq $null)
{
$t = Get-Date -format "d MMM yy HH:mm:ss"
"${t}:Adding access to " + $syncAccount

$newMem = New-Object Microsoft.AnalysisServices.RoleMember($syncAccount)
$role.Members.Add($newMem)
$role.Update()
}
else
{
$t = Get-Date -format "d MMM yy HH:mm:ss"
"${t}:Usergroup " + $syncAccount + " already added"
}

# we want the data source to be the same as it was before the sync
# get connection string before update
$targetdb = $targetsvr.Databases.FindByName($sourceDB.name)
if([string]::IsNullOrEmpty($targetdb))
{
#set targetdb to null for further checks
$t = Get-Date -format "d MMM yy HH:mm:ss"
Write-Warning "${t}:$sourceDB not found on $targetsvr. This means that roles and members of the role and data sources may not be correct for the environment."
$targetdb = $null
}

if ($targetdb -ne $null)
{
$targetds = $targetdb.DataSources.FindByName($DataSourceName)
$targetconnectionString = $targetds.ConnectionString
}

# disk checks
# get disk details for target server

$targetDisk = $targetsvr.ServerProperties

$targetDisk = $targetDisk | Select-Object name, value | Where-Object {$_.Name -eq 'DataDir'} | Select-Object -Expand value 
$TargetDisk.ToString() | Out-Null
$TargetDriveLetter = $targetDisk.Substring(0,2)

$targetdisksize = Get-WmiObject Win32_LogicalDisk -ComputerName $targetsvr -Filter "DeviceID='$TargetDriveLetter'" | Select-Object -Expand Size
$targetfreespace = Get-WmiObject Win32_LogicalDisk -ComputerName $targetsvr -Filter "DeviceID='$TargetDriveLetter'" | Select-Object -Expand FreeSpace
$targetdisksize = $targetdisksize / 1gb
$targetfreespace = $targetfreespace / 1gb

# get disk details for source server

$sourceDisk = $sourcesvr.ServerProperties

$sourceDisk = $sourceDisk | Select-Object name, value | Where-Object {$_.Name -eq 'DataDir'} | Select-Object -Expand value 
$sourceDisk.ToString() | Out-Null
$SourceDriveLetter = $SourceDisk.Substring(0,2)

$sourcedisksize = Get-WmiObject Win32_LogicalDisk -ComputerName $sourcesvr -Filter "DeviceID='$sourceDriveLetter'" | Select-Object -Expand Size
$Sourcefreespace = Get-WmiObject Win32_LogicalDisk -ComputerName $Sourcesvr -Filter "DeviceID='$SourceDriveLetter'" | Select-Object -Expand FreeSpace
$sourcedisksize = $sourcedisksize / 1gb
$Sourcefreespace = $Sourcefreespace / 1gb


 # first check; are the target and source disks the same size?

 if ($targetdisksize -ge $sourcedisksize)
{
$t = Get-Date -format "d MMM yy HH:mm:ss"
"${t}:both target and source disk sizes are the same or target is larger"
}

# if disk are not the same size, then how much smaller is the target? 
# we still might be able to sync with the smaller disk

if ($targetdisksize -lt $sourcedisksize) 
{
# note: both sizes were converted to gb above, so the difference is already in gb
$total = $sourcedisksize - $targetdisksize 
$t = Get-Date -format "d MMM yy HH:mm:ss"
"${t}:target disk is smaller than the source disk by $total gb "

}

# option here to drop the target database and sync the entire db across
# when a sync runs, it first copies all the data that exists in the source database but not in the target database 
# over the network from the source database into a temporary folder on the target
# the sync job then copies the remaining data from the target database into the temporary folder
# any data that exists on the target server but not on the source server is deleted/not copied over
# once the copy is complete, the temp folder is renamed to the database name and the number on the folder name is incremented by 1
# the original folder is then deleted and replaced by the new folder
# this means that at any one time while syncing there will be 2 copies of the database on disk 
# this is not a problem if the databases are small; however it is a very big problem if the databases are very big
# this check will determine if there is enough space on the target to accommodate a copy of the source database. If not we can always drop
# the target copy, and the sync will then copy the entire db over the network

$sourceDBfolder = $sourceDB.name

$sourceDBSize = 
 invoke-command -computername $sourcesvr -scriptblock{param ($sourceDBfolder, $SourceDisk) $path = Get-ChildItem $SourceDisk | Where-Object {$_.PSIsContainer -eq $true -and $_.Name -match $sourceDBfolder}
 $path = $SourceDisk+"\"+$path
 $fso = New-Object -comobject Scripting.FileSystemObject
 $folder = $fso.GetFolder($path)
 $size = $folder.size
 $size = $size / 1gb
 $size
 } -Args $sourceDBfolder, $sourceDisk

if ($targetfreespace -le $sourceDBSize )
{
$t = Get-Date -format "d MMM yy HH:mm:ss"
Write-warning "${t}:There may not be enough free space to synchronize the database $sourceDB. Consider dropping the target database $sourceDB on $targetsvr."

if ($targetdb -eq $null)
{
write-error "${t}:There is no target database $sourcedb to drop. The disk will have to be increased; the database will not be synced" 
$skip = 1
}

# however, if the target database is to be dropped, we have to recreate its roles, members and permissions afterwards
# so prior to dropping the database we store those values in arrays so that we can re-create them

if($Drop.IsPresent -and $targetdb -ne $null)
{
$targetDB = $targetsvr.Databases.FindByName($sourceDB.name)
[Microsoft.AnalysisServices.Role[]] $targetroleArray = @()
foreach ($targetrole in $targetDB.Roles)
{
$targetroleArray += [Microsoft.AnalysisServices.Role[]] $targetrole

# enumerate the members so the collection is loaded into memory before the database is dropped (it is read again after the drop)
foreach($targetmember in $targetrole.members) {}
}
$targetDBpermissionCollection = @()

foreach ($targetDBpermission in $targetDB.DatabasePermissions)
{
$targetDBpermission
$targetDBpermissionCollection += $targetDBpermission
}
$t = Get-Date -format "d MMM yy HH:mm:ss"
"${t}:Dropping $targetDB ..."
$targetDB.Drop()
$t = Get-Date -format "d MMM yy HH:mm:ss"
"${t}:$targetDB dropped"
}
}

# check the sync template as there are some settings you can alter in the template, such as copying permissions from source and compression etc
if ($skip -eq 0)
{
$syncTemplate = "C:\temp\Synchronize.xmla"
$syncexec = "C:\temp\Synchronize$sourceDB.xmla"
# Invoke-ASCmd writes its trace output here
$tf = "C:\temp\Synchronize$sourceDB.trc"

$syncxmla = Get-Content $syncTemplate
$syncxmla = $syncxmla.Replace("@@DatabaseID@@", $sourcedb)
$syncxmla = $syncxmla.Replace("@@Source@@", $sourcesvr)
$syncxmla | Out-File $syncexec

$t = Get-Date -format "d MMM yy HH:mm:ss"
"${t}:Synchronising $sourcedb from $sourcesvr to $targetsvr"
[xml]$resultXml = Invoke-ASCmd -InputFile $syncexec -Server $targetsvr -TraceFile $tf -TraceLevel Low
Write-Output $resultXml.InnerText
"${t}:Sync Complete for $sourceDBfrom $sourcesvr to $targetsvr"

if ($targetdb -ne $null)
{
# update the data source
# use the connection string we got at the beginning of the process
$newTargetSvr = new-Object Microsoft.AnalysisServices.Server
$newTargetSvr.Connect($targetInstance)
$newTargetDB = $newTargetSvr.Databases.FindByName($targetDB)
$resetds = $newTargetDB.DataSources.FindByName($DataSourceName)
$resetds.connectionstring = $targetconnectionString
$resetds.Update()
$t = Get-Date -format "d MMM yy HH:mm:ss"
"${t}:datasource updated for $newtargetdb to $targetconnectionString"
}


# if the database was not dropped then all roles/members and permissions are as they were; if it was dropped, recreate them from the values captured earlier
if($Drop.IsPresent -and $targetdb -ne $null)
{
foreach ($tr in $targetroleArray)
{

$newTargetSvr = new-Object Microsoft.AnalysisServices.Server
$newTargetSvr.Connect($targetInstance)
$newTargetDB = $newTargetSvr.Databases.FindByName($targetDB)
$newroleToCreate = new-Object([Microsoft.AnalysisServices.Role])($tr.Name)
$newTargetDB.Roles.Add($newroleToCreate)
$newroleToCreate.Update() | Out-Null
$r = $targetDB.Roles.FindByName($tr.Name)

foreach ($tm in $r.Members)
{
$t = Get-Date -format "d MMM yy HH:mm:ss"
"${t}:Adding " + $tm.Name + " into the role "+$newRoleToCreate
$newMem = New-Object Microsoft.AnalysisServices.RoleMember($tm.name)
$newroletocreate.Members.Add($newMem) | Out-Null
$newroleToCreate.Update() | Out-Null
}
}

foreach ($newTargetDBpermission in $targetDBpermissionCollection)
{

$newTargetDBperm = $null

if ($newTargetDBpermission.Read -eq [Microsoft.AnalysisServices.ReadAccess]::Allowed)
{
$t = Get-Date -format "d MMM yy HH:mm:ss"
"${t}:Adding read permissions to " + $newTargetDBpermission.Role.Name + " on $newTargetSvr $newtargetDB"
if($newTargetDBperm -eq $null)
{
$newTargetDBperm = $NewTargetDB.DatabasePermissions.Add($newTargetDBpermission.Role.Name)
}
$NewTargetDBPerm.Read = [Microsoft.AnalysisServices.ReadAccess]::Allowed
}

if ($newTargetDBpermission.Administer -eq $true )
{
$t = Get-Date -format "d MMM yy HH:mm:ss"
"${t}:Adding admin permissions to " + $newTargetDBpermission.Role.Name + " on $newTargetSvr $newtargetDB"
if ($newTargetDBperm -eq $null)
{
$newTargetDBperm = $NewTargetDB.DatabasePermissions.Add($newTargetDBpermission.Role.Name)
}
$NewTargetDBPerm.Administer = [Microsoft.AnalysisServices.ReadAccess]::Allowed
}

if ($newTargetDBpermission.Process -eq $true)
{
$t = Get-Date -format "d MMM yy HH:mm:ss"
"${t}:Adding process permissions to " + $newTargetDBpermission.Role.Name + " on $newTargetSvr $newtargetDB"
if ($newTargetDBperm -eq $null)
{
$newTargetDBperm = $NewTargetDB.DatabasePermissions.Add($newTargetDBpermission.Role.Name)
}
$NewTargetDBPerm.Process = [Microsoft.AnalysisServices.ReadAccess]::Allowed
}

if ($newTargetDBperm -ne $null)
{
$NewTargetDBPerm.Update()
}
}
}
}
$t = Get-Date -format "d MMM yy HH:mm:ss"
"${t}:Sync Script Completed for $sourceDB"
}