SSAS Activity Monitor Available
So it’s been a while since I’ve worked on any SSAS, but today I am happy to announce that there is an up-to-date version of the SSAS Activity Monitor. The solution has had a couple of changes to it.
Hello!
Recently someone got in touch with me about one of my earliest posts. I realised I had made a mistake in that post and have updated it. But I also realised that the script itself is actually part of a PowerShell module. So, wanting to make things as easy as possible, and being the helpful guy that I am, I have turned that example into a standalone script which I am posting here.
It’s worth stressing that the errors output to the console are fairly generic; if anything does go wrong, the log file will contain the full details.
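For illustration, a minimal sketch of that pattern, assuming a hypothetical $logFile path (the instance name is also a placeholder):

[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") | Out-Null
$logFile = "C:\Temp\ssasScript.log"    # hypothetical log location
try {
    $svr = New-Object Microsoft.AnalysisServices.Server
    $svr.Connect("localhost")          # placeholder instance name
}
catch {
    Write-Host "Connection failed - see the log for details"    # generic console message
    $_ | Out-File -FilePath $logFile -Append                    # full exception detail goes to the log
}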
Recently I needed to update some cubes in a test environment after the underlying data warehouse had been refreshed. Typically I’d sync the cubes from prod, but as sync causes locking, and I needed to get this done during the day, I needed to reprocess all of the cubes in the test environment, and there were a lot of them, ranging in size from a few MB to hundreds of GB. The simplest way to update the cubes was to run a Process Default on the dimensions and then a Process Default on the measure groups. Some of the measure groups were partitioned, so I could’ve got clever and fully processed just the unprocessed partitions, but I felt a default would do most of the hard work for me.
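In case it helps anyone, here’s a minimal AMO sketch of that approach; the instance and database names are placeholders:

[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") | Out-Null
$svr = New-Object Microsoft.AnalysisServices.Server
$svr.Connect("TESTSERVER\SSAS")                      # placeholder instance
$db = $svr.Databases.FindByName("MyCubeDatabase")    # placeholder database
# Process Default on every dimension first...
foreach ($dim in $db.Dimensions) {
    $dim.Process([Microsoft.AnalysisServices.ProcessType]::ProcessDefault)
}
# ...then Process Default on every measure group
foreach ($cub in $db.Cubes) {
    foreach ($mg in $cub.MeasureGroups) {
        $mg.Process([Microsoft.AnalysisServices.ProcessType]::ProcessDefault)
    }
}
$svr.Disconnect()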
Hello! And welcome to the final part of this series. This script combines the first 4 parts of this series to create an unabridged process of synchronising an SSAS database from one server to another. Even though I’ve used this script for a while, it still gets tweaks and I’d still consider it a work in progress; however, it’s as complete now as it ever has been. No doubt when I make some more changes I’ll post an update.
The script so far includes:
Hello, and welcome to part 4 of the series. In this post, I’m going to do a straight-up sync with none of the disk checks from yesterday’s post. The full script at the end of the series will include the disk checks.
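For reference, a bare-bones sync issues the XMLA Synchronize command on the target instance; here’s a sketch of doing that from PowerShell via AMO (server and database names are placeholders):

[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") | Out-Null
$targetSvr = New-Object Microsoft.AnalysisServices.Server
$targetSvr.Connect("TARGETSERVER")    # placeholder: the instance being synced TO
# The Synchronize command runs on the target and pulls the database from the source
$xmla = @"
<Synchronize xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Source>
    <ConnectionString>Provider=MSOLAP;Data Source=SOURCESERVER;Integrated Security=SSPI</ConnectionString>
    <Object>
      <DatabaseID>MyCubeDatabase</DatabaseID>
    </Object>
  </Source>
</Synchronize>
"@
$targetSvr.Execute($xmla)
$targetSvr.Disconnect()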
Hello, and welcome to part 3 of the series on syncing databases across servers. The first two parts focused on ensuring that the role exists on the source server with the appropriate permissions to sync, and on checking and adding the service account of the target instance to ensure that the sync can occur.
Part two of this series expands upon yesterday’s post, which was creating a database role and assigning permissions, by adding a member to that role. The script adds just one member to the role, but if you need more than one member you can either create a domain group, add all the users required into that group and use the group name in place of the user name, or do something clever with arrays.
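A minimal sketch of the member-adding step via AMO; the instance, database, role and account names are all placeholders:

[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") | Out-Null
$svr = New-Object Microsoft.AnalysisServices.Server
$svr.Connect("SOURCESERVER")                         # placeholder instance
$db = $svr.Databases.FindByName("MyCubeDatabase")    # placeholder database
$role = $db.Roles.FindByName("SyncRole")             # placeholder: the role created in part 1
# Placeholder account; a domain group name works here just as well
$member = New-Object Microsoft.AnalysisServices.RoleMember("DOMAIN\svc_ssas")
$role.Members.Add($member)
$role.Update()
$svr.Disconnect()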
Tomorrow’s post will look at some disk checks that are included to ensure there is enough space to sync cubes.
Here are the links for the other parts of the series:
Automate SSAS Syncing Part 1: Create SSAS Database Role
Automate SSAS Syncing Part 2: Create SSAS Role And Add Member
Hello!
Every day this week I am going to be posting part of a script that I recently wrote, but broken down into separate components that are useful in their own right. The full script, which I will be sharing at the end of the week, automates the synchronisation of SSAS cubes from one instance to another.
So to make this an automated process that was robust to any deviation, I had to make sure that the destination service account had full admin access to the source database, that the destination disk could withstand the anticipated growth as well as the copy of the synced cube, and that the datasource’s connection string on the destination database was reset to its original value. Typically the datasource does not need updating, but the destination databases are used in a different environment with their own datasource. And as the sync also syncs the datasource, it is necessary to update the datasource back to its original value.
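For the last of those steps, resetting the datasource looks roughly like this in AMO; the connection string and names are placeholders for your own environment:

[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") | Out-Null
$svr = New-Object Microsoft.AnalysisServices.Server
$svr.Connect("TARGETSERVER")                         # placeholder target instance
$db = $svr.Databases.FindByName("MyCubeDatabase")    # placeholder database
$ds = $db.DataSources[0]                             # assumes a single datasource
# The sync overwrites this with the source environment's connection string, so put it back
$ds.ConnectionString = "Provider=SQLNCLI11;Data Source=TESTDWSERVER;Initial Catalog=DataWarehouse;Integrated Security=SSPI"
$ds.Update()
$svr.Disconnect()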
Hello!
Here’s a simple PowerShell script to get the top-level info of the cubes on an instance. This is a good script for understanding how to use the AMO namespace to get info on the cubes via PowerShell. You can add many other properties to this list to expand the info that you want. I’ve often mused about moving a lot of the logic that I constantly put in these scripts into higher-tier functions in PowerShell, or even creating an SSAS library, but time is always at a premium…
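The shape of it is something like the sketch below; the instance name is a placeholder, and the properties picked are just a starting point:

[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") | Out-Null
$svr = New-Object Microsoft.AnalysisServices.Server
$svr.Connect("localhost")    # placeholder instance
foreach ($db in $svr.Databases) {
    foreach ($cub in $db.Cubes) {
        # Add any other AMO properties you want to this list
        $cub | Select-Object Name, State, StorageMode, LastProcessed
    }
}
$svr.Disconnect()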
One feature that has always irritated me greatly is how you cannot alter the connection for an XMLA query window via the right-click context menu. For .sql, .mdx and .dmx it is an option via the context menu, but not XMLA!
Earlier this year I posted a PowerShell script that will set the aggregation designs on your cube partitions. I recently created another script that contains a “Fix” flag: run it without the flag to get the info, or set the flag to fix the aggregation designs. I also added a “process” part that will run a Process Index on any partition whose aggregations are not fully processed.
The advantages of using this over the older version include the aforementioned process and checks, but also that you don’t have to specify the cube, measure group and aggregation design names. The disadvantage is that it assumes you want the latest aggregation design applied and processed. Somewhere along the way there will probably be a script which combines all of this nicely!
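In outline, the newer script does something like this; a sketch that assumes the last aggregation design in each measure group’s collection is the latest one (instance and database names are placeholders):

[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") | Out-Null
$Fix = $true    # set to $false to just report
$svr = New-Object Microsoft.AnalysisServices.Server
$svr.Connect("localhost")                            # placeholder instance
$db = $svr.Databases.FindByName("MyCubeDatabase")    # placeholder database
foreach ($cub in $db.Cubes) {
    foreach ($mg in $cub.MeasureGroups) {
        if ($mg.AggregationDesigns.Count -eq 0) { continue }
        # Assumption: the last design in the collection is the latest
        $latest = $mg.AggregationDesigns[$mg.AggregationDesigns.Count - 1]
        foreach ($part in $mg.Partitions) {
            if ($part.AggregationDesignID -ne $latest.ID) {
                Write-Host $cub $mg $part "is not using" $latest.Name
                if ($Fix) {
                    $part.AggregationDesignID = $latest.ID
                    $part.Update()
                    # Build the aggregations without re-reading the fact data
                    $part.Process([Microsoft.AnalysisServices.ProcessType]::ProcessIndexes)
                }
            }
        }
    }
}
$svr.Disconnect()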
When I first started administering SSAS, the HardMemoryLimit was a property on SSAS instances that I never really took much notice of. After all, there’s the LowerMemoryLimit and the UpperMemoryLimit, and once these are set, they are set in stone, right?
Right?
Before I start this post, I want to mention a few things:
I’ve been looking into using processor affinity on our SSAS instances to help control the workload on our servers. Sometimes a process can run which will grab as much CPU resource as it can, which means that queries on the cubes cannot run, grinding the front end to a halt. Unlike the database engine, there is no Resource Governor for Analysis Services. Sure, you can set the min/max memory simply enough, but setting limits on IO and CPU is a more complex task.
Today I needed to update our cube aggregation designs using the Usage-Based Optimisation Wizard, but I got stuck on an error I’d not seen before: “Estimated Count or Partition Count must be specified for cube objects that require counting.” OK, not too helpful. I highlighted each dimension and clicked the “Count” button for each one, but it still did not fix the error. I then looked a little closer and noticed that one of the dimensions had a red squiggle underneath it.
Recently I deployed updates to our cubes that resulted in the aggregation designs being removed from our partitions. This was because in the Visual Studio solution we had more than one aggregation design for each measure group, so that re-processing and adding new partitions caused the aggregation designs to be dropped. D’oh!
The PowerShell script below returns information on all the partitions within the measure groups of a given cube. I prefer to use PowerShell to get this data as there’s no way to get the info out of the UI this quickly. If, like me, you have lots of partitions within a measure group and want to exit the script, you can press “X” when the list reaches the bottom of the screen.
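The gist of it is below; a sketch with placeholder instance, database and cube names (EstimatedSize is AMO’s estimate, in bytes):

[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") | Out-Null
$svr = New-Object Microsoft.AnalysisServices.Server
$svr.Connect("localhost")    # placeholder instance
$cube = $svr.Databases.FindByName("MyCubeDatabase").Cubes.FindByName("MyCube")    # placeholders
foreach ($mg in $cube.MeasureGroups) {
    foreach ($part in $mg.Partitions) {
        [pscustomobject]@{
            MeasureGroup  = $mg.Name
            Partition     = $part.Name
            State         = $part.State
            SizeMB        = [math]::Round($part.EstimatedSize / 1MB, 2)    # bytes to MB
            LastProcessed = $part.LastProcessed
        }
    }
}
$svr.Disconnect()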
Way back when I started this blog I wrote about a few of the SSAS server properties, and today I’m revisiting this subject to talk about the AllowedBrowsingFolders property.
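To give a flavour, here’s a sketch of reading and appending to the property via AMO; the instance name and the extra folder are placeholders, and the folder list is pipe-delimited:

[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") | Out-Null
$svr = New-Object Microsoft.AnalysisServices.Server
$svr.Connect("localhost")    # placeholder instance
$prop = $svr.ServerProperties["AllowedBrowsingFolders"]
Write-Host $prop.Name "=" $prop.Value
# Append another folder (placeholder path) and save the change back to the server
$prop.Value = $prop.Value + "|D:\SSASBackups"
$svr.Update()
$svr.Disconnect()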
Apologies for the lack of snappy title! Some weeks back I posted about the Management Pack for SQL Server Analysis Services 2008 and 2012 that was recently released. If you do not know anything about the management pack, I recommend a read of my older post before reading this one. We recently installed the Management Pack for SQL Server Analysis Services 2012, and as promised I am posting some notes on the whole experience and install:
I was browsing Twitter last Sunday night and a tweet from Chris Webb caught my eye:
https://twitter.com/Technitrain/status/429582908010274816
Great! Finally, a Management Pack for SSAS! I don’t know how much people know about Systems Center Operations Manager (SCOM), so a brief introduction: SCOM is, according to the Microsoft Marketing Executives:
I’m on holiday now until the beginning of January, getting some well-earned rest. That is, until I remember I have a 4-month-old daughter who is currently unwell and wishes to vocalise her displeasure at this rather than getting some sleep.
One of the useful features of the SSAS Activity Viewer 2012 is the ability to cancel sessions and queries.
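Under the covers a cancel is just the XMLA Cancel command; here’s a sketch of issuing one yourself from PowerShell (the instance name and SPID are placeholders):

[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") | Out-Null
$svr = New-Object Microsoft.AnalysisServices.Server
$svr.Connect("localhost")    # placeholder instance
# Cancel the session/query with the given SPID (placeholder value)
$xmla = @"
<Cancel xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <SPID>12345</SPID>
</Cancel>
"@
$svr.Execute($xmla)
$svr.Disconnect()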
This morning I am happy to announce the release of an alpha version of the SSAS Activity Viewer for SQL Server 2014 CTP2. The name has had a slight change and is now called Analysis Services Server Activity Monitor (ASSAM) 2014.
This morning I am happy to announce the release of a stable version of SSAS Activity Viewer 2012. The build number is 1.1.0.0. The new features have been highlighted below in the release notes section.
(Worth noting that there is a more recent version available here!)
Hi all, yesterday morning I created a project based on an old blog post Upgrade Analysis Services Activity Viewer 2008 to SQL Server 2012. Over the course of the day I made a few changes that I wanted to see myself and this afternoon I have released a beta version, SQL Server Analysis Services Activity Viewer 2012 1.0.0.1, or SSASAV 2012 1.0.0.1 for short.
Not long ago we were experiencing growing pains with our cubes: we have over 20 SSAS databases with 3 MOLAP cubes on each database, and we had aggregations applied to one of the 3 cubes. To quote Jen Stirrup, aggregations are a “data structure that contains a pre-calculated summary of data. The aggregation size is smaller than the fact data it is based on, so it is expected that this will be quicker to query.”
Late night maintenance on databases also means I get to write a blog post during the downtime. Tonight’s blog post is more PowerShell/SSAS: following on from calculating the size of dimensions and Finding Unprocessed Objects, here’s a PowerShell script that will process all the databases on a given instance that match the current process state you input. If you’ve deployed all the databases to an instance, entering Unprocessed for $CurrentProcessState and ProcessFull for $TypeOfProcess will process all the objects.
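The core of the script looks something like this sketch; the two parameter names match the post, and the instance name is a placeholder:

[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") | Out-Null
$CurrentProcessState = "Unprocessed"    # the state to look for
$TypeOfProcess = "ProcessFull"          # the processing type to apply
$svr = New-Object Microsoft.AnalysisServices.Server
$svr.Connect("localhost")               # placeholder instance
foreach ($db in $svr.Databases) {
    if ($db.State -eq $CurrentProcessState) {
        Write-Host "Processing" $db "with" $TypeOfProcess
        # Cast the string to the AMO ProcessType enum
        $db.Process([Microsoft.AnalysisServices.ProcessType]$TypeOfProcess)
    }
}
$svr.Disconnect()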
Update: I have uploaded the changes to my own Codeplex project. Now you can run the Activity Viewer on a box that only has SSAS 2012 installed. https://asactivityviewer2012.codeplex.com/ Hey folks, it’s been a massively busy week post-release, but I thought I’d share a tip on upgrading the Activity Viewer to 2012. First, a bit of background: Activity Viewer, or Analysis Services Activity Viewer 2008 to give it its full name, is a solution available to download on the Codeplex website.
[CmdletBinding()]
param(
    [Parameter(Position=0,Mandatory=$true)]
    [string] $ssasInstance
)

# Load AMO and connect to the instance
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") | Out-Null
$svr = New-Object Microsoft.AnalysisServices.Server
$svr.Connect($ssasInstance)

# Walk databases -> cubes -> measure groups, reporting anything not fully processed
foreach ($db in $svr.Databases) {
    if ($db.State -ne "Processed") {
        Write-Host ("-" * 120) -ForegroundColor Yellow -BackgroundColor DarkCyan
        Write-Host
        Write-Host "Database" $svr $db "is" $db.State
        Write-Host
        foreach ($cub in $db.Cubes) {
            if ($cub.State -ne "Processed") {
                Write-Host $db "Cube" $cub "is" $cub.State
                Write-Host
                foreach ($mg in $cub.MeasureGroups) {
                    if ($mg.State -ne "Processed") {
                        Write-Host $db $cub "Measure Group" $mg "is" $mg.State
                    }
                }
            }
        }
    }
}
$svr.Disconnect()
I posted this PowerShell script in my previous post; however, I feel it’s worth sharing on its own. For full details read my previous post. Unlike SQL Server databases, there are no DMVs or sys tables in SSAS that will provide us with size metrics. I have written about sys tables for SQL before here and here. Instead, you need to get the size of the dimension from the folder that resides within the database folder on the disk.
Introduction
I’m still using my newly acquired string function skills to aid me in writing better scripts. In this post I’m focusing on PowerShell. As an added bonus, I’m also providing a script that will get the total size of the dimensions within an Analysis Services database, although it can be altered quite easily. Unlike SQL databases, it’s not possible to determine the size using built-in DMVs. Instead, you need to get the size of the dimension from the folder that resides within the database folder on the disk.
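The gist of the disk-based approach is sketched below; the data folder path and database folder name are placeholders, and in practice the folder names carry version suffixes:

# Sum the size of each .dim folder under the database's data folder
$dataDir = "C:\Program Files\Microsoft SQL Server\MSAS11.MSSQLSERVER\OLAP\Data"    # placeholder data folder
$dbFolder = Join-Path $dataDir "MyCubeDatabase.0.db"                               # placeholder; real names carry version numbers
Get-ChildItem $dbFolder -Directory -Filter "*.dim" | ForEach-Object {
    $bytes = (Get-ChildItem $_.FullName -Recurse -File | Measure-Object Length -Sum).Sum
    [pscustomobject]@{ Dimension = $_.Name; SizeMB = [math]::Round($bytes / 1MB, 2) }
}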
I’ve been on New Orleans time all this working week and have been up late at night, not doing much, just hitting the ‘random’ button on DBA Reactions (and posting them on Twitter) when I came across this one on Anti Virus running on SQL file-types.
“Good one” I thought to myself. And then I wondered if any of the files that make up the cubes on our SSAS boxes were being scanned, or whether they were excluded. We have some big cubes in our system, and by big I mean terabytes’ worth of cubes, some of them over 700GB. So I fired up System Center 2012 Endpoint Protection on one of the SSAS boxes, and sure enough, all files were being scanned. But should they be? This Microsoft KB article confirmed my suspicions that they can be exempt from scanning: How to choose antivirus software to run on computers that are running SQL Server
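If the Defender/SCEP PowerShell cmdlets are available on the box, adding an exclusion can be scripted; a sketch, with a placeholder data folder path (check the KB article for the full list of what to exclude):

# Assumes the Add-MpPreference/Get-MpPreference cmdlets are present on the server
Add-MpPreference -ExclusionPath "C:\Program Files\Microsoft SQL Server\MSAS11.MSSQLSERVER\OLAP\Data"    # placeholder
Get-MpPreference | Select-Object -ExpandProperty ExclusionPath    # verify the exclusion took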
With Service Packs and Cumulative Updates rapidly released for SQL, it can be hard to keep up with which version you have installed where. It’s easy enough to find out which version of SQL Server you have installed: just run the following T-SQL and you can find out all the info you need. But what about SSAS?
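For SSAS, AMO exposes the version directly; a minimal sketch with a placeholder instance name:

[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") | Out-Null
$svr = New-Object Microsoft.AnalysisServices.Server
$svr.Connect("localhost")    # placeholder instance
# Version is the build number; Edition tells you Standard/Enterprise etc.
Write-Host $svr.Name $svr.Version $svr.Edition
$svr.Disconnect()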
Microsoft have released SQL Server Data Tools for Analysis Services, Integration Services and Reporting Services projects with Visual Studio 2012. Finally, it looks like the BIDS suite is going to be usable with the most up-to-date version of Visual Studio; after we upgraded to SQL Server 2012, for about 3 months last year we were able to use Visual Studio 2010 as our sole IDE, before the update to Visual Studio 2012 rolled out at our office.
Here is a neat little script: it gets the status info of an Analysis Services database and the cubes within that database. The status ought to be “Processed”; if not, you know that something has gone wrong in your database. This script can quite easily be expanded to include dimensions, measure groups and partitions, but I personally split that out into different scripts, otherwise the screen looks too full of info and you cannot see the forest for the trees.
When releasing a new version of our software there are typically cube changes that have to be deployed. We have only one SSAS Solution but we deploy it many times for reasons I won’t go into here. Suffice to say that cube deployment and processing is by far the biggest task of any release due to the number of databases we deploy and size of the cubes. We have been able to automate much of the deployment thanks to the use of Powershell and MSBuild.
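Much of it hangs off the Analysis Services Deployment Wizard’s command line wrapped in PowerShell; a sketch along these lines, where the paths are placeholders and the switches are, as I understand them, /s for silent mode with a log and /o to emit the deployment XMLA rather than execute it:

# Placeholder paths; adjust the tools folder for your SQL Server version
& "C:\Program Files (x86)\Microsoft SQL Server\110\Tools\Binn\ManagementStudio\Microsoft.AnalysisServices.Deployment.exe" `
    "C:\Build\MyCube.asdatabase" /s:"C:\Build\deploy.log" /o:"C:\Build\deploy.xmla"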
Hello folks, I hope you all had a relaxing holiday break. Welcome to a quick blog on SSAS optimisation. There are a few quick wins w/r/t SSAS that can optimise the instance. One of these has to do with a feature called the FlightRecorder log. This log is set to ON by default, but unless you are trying to diagnose an issue it doesn’t really provide any benefit; it just keeps writing to the log location, which is a particular problem if the log location is on the same drive as the data location.
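Turning it off can be done in the instance properties in SSMS, or scripted via AMO; a sketch below, assuming the property name mirrors the msmdsrv.ini path and takes true/false (the instance name is a placeholder):

[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") | Out-Null
$svr = New-Object Microsoft.AnalysisServices.Server
$svr.Connect("localhost")    # placeholder instance
# Property name mirrors the msmdsrv.ini path Log\FlightRecorder\Enabled
$prop = $svr.ServerProperties["Log\FlightRecorder\Enabled"]
Write-Host "FlightRecorder currently:" $prop.Value
$prop.Value = "false"
$svr.Update()
$svr.Disconnect()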