SSAS Activity Monitor Available
So it’s been a while since I’ve worked on any SSAS, but today I am happy to announce that there is an up-to-date version of the SSAS Activity Monitor. The solution has had a couple of changes to it.
Hello!
Recently someone got in touch with me about one of my earliest posts. I realised I had made a mistake on that post and had updated it. But I also realised that the script itself is actually part of a PowerShell module. So wanting to make things as easy as possible, and being the helpful guy that I am, I have altered that example into a script which I am posting here.
It’s worth stressing that the errors that output to the console are fairly generic, and that there is a log file that will contain all the information if there are any errors.
Recently I needed to update some cubes in a test environment after the underlying data warehouse had been refreshed. Typically I’d sync the cubes from prod, but as sync causes locking, and I needed to get this done during the day, I needed to reprocess all of the cubes in the test environment, and there were a lot of them, all ranging in size from a few MB to hundreds of GB. The simplest way to update the cubes was to run a process default on the dimensions and then run a process default on the measure groups. Some of the measure groups were partitioned, so I could have got clever and fully processed only the unprocessed partitions, but I felt a default would do most of the hard work for me.
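A minimal AMO sketch of that approach (the instance and database names are placeholders, and this assumes the SSAS client libraries are installed):

```powershell
# Load the AMO assembly
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") | Out-Null

$server = New-Object Microsoft.AnalysisServices.Server
$server.Connect("TestSSASInstance")             # placeholder instance name
$db = $server.Databases.FindByName("MyCubeDb")  # placeholder database name

# Process Default on every dimension first...
foreach ($dim in $db.Dimensions) {
    $dim.Process([Microsoft.AnalysisServices.ProcessType]::ProcessDefault)
}

# ...then Process Default on every measure group in every cube
foreach ($cube in $db.Cubes) {
    foreach ($mg in $cube.MeasureGroups) {
        $mg.Process([Microsoft.AnalysisServices.ProcessType]::ProcessDefault)
    }
}

$server.Disconnect()
```

Process Default only brings unprocessed objects up to date, which is what makes it cheaper than a full process across a large estate.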
Hello! And welcome to the final part of this series. This script combines the first 4 parts of this series to create an unabridged process of synchronising an SSAS database from one server to another. Even though I’ve used this script for a while, it still gets tweaks and I’d still consider it a work in progress, however it’s as complete now as it ever has been. No doubt when I make some more changes I’ll post an update.
The script so far includes:
Hello, and welcome to part 4 of the series. In this post, I’m going to do a straight-up sync with none of the disk checks from yesterday’s post. The full script at the end of the series will include the disk checks.
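For reference, the core of a sync is a single XMLA Synchronize command along these lines (the source server and database IDs here are placeholders), which can be executed from an XMLA window or via AMO’s Server.Execute:

```xml
<Synchronize xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Source>
    <ConnectionString>Provider=MSOLAP;Data Source=ProdSSAS;Integrated Security=SSPI</ConnectionString>
    <Object>
      <DatabaseID>MyCubeDb</DatabaseID>
    </Object>
  </Source>
  <SynchronizeSecurity>SkipMembership</SynchronizeSecurity>
  <ApplyCompression>true</ApplyCompression>
</Synchronize>
```

SkipMembership keeps the target’s role membership intact, which matters when the two environments have different users.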
Hello, and welcome to part 3 of the series on syncing databases across servers. The first two parts focused on ensuring that the role exists on the source server with the appropriate permissions to sync. We then check and add the service account of the target instance to ensure that the sync can occur.
Part two of this series expands upon yesterday’s post, which was creating a database role and assigning permissions, by adding a member to that role. The script adds just one member to the role, but if you need more than one member you can either create a domain group and add all the users required into that group, then use the group name in place of the user name, or do something clever with arrays.
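The core of adding a member via AMO looks something like this (the instance, database, role, and account names are placeholders):

```powershell
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") | Out-Null

$server = New-Object Microsoft.AnalysisServices.Server
$server.Connect("SourceSSASInstance")             # placeholder instance name
$db   = $server.Databases.FindByName("MyCubeDb")  # placeholder database name
$role = $db.Roles.FindByName("SyncRole")          # placeholder role name

# Add a single member; pass a domain group name here to cover multiple users
$member = New-Object Microsoft.AnalysisServices.RoleMember("DOMAIN\svc_ssas_target")
$role.Members.Add($member)
$role.Update()

$server.Disconnect()
```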
Tomorrow’s post will look at some disk checks that are included to ensure there is enough space to sync cubes.
Here are the links for the other parts of the series Automate SSAS Syncing Part 1: Create SSAS Database Role Automate SSAS Syncing Part 2: Create SSAS Role And Add Member
Hello!
Every day this week I am going to be posting part of a script that I recently wrote, but broken down into separate components that are useful in their own right. The full script, which I will be sharing at the end of the week, automates the synchronisation of SSAS cubes from one instance to another.
So to make this an automated process that was robust to any deviation, I had to make sure that the destination service account had full admin access to the source database, that the destination disk could withstand the anticipated growth as well as the copy of the synced cube, and that the data source’s connection string on the destination database was reset to its original value. Typically the data source does not need updating, but the destination databases are used in a different environment with their own data source. And as the sync also syncs the data source, it is necessary to update the data source back to its original value.
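Resetting the data source after the sync is a few lines of AMO (the instance, database, data source, and connection string below are all placeholders):

```powershell
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") | Out-Null

$server = New-Object Microsoft.AnalysisServices.Server
$server.Connect("TestSSASInstance")               # placeholder instance name
$db = $server.Databases.FindByName("MyCubeDb")    # placeholder database name

# The sync overwrites the data source, so point it back at the test warehouse
$ds = $db.DataSources.FindByName("MyDataSource")  # placeholder data source name
$ds.ConnectionString = "Provider=SQLNCLI11;Data Source=TestSQL;Initial Catalog=MyWarehouse;Integrated Security=SSPI"
$ds.Update()

$server.Disconnect()
```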
Hello!
Here’s a simple PowerShell script to get the top level info of the cubes on an instance. This is a good script to understand how to use the AMO namespace to get info on the cubes via PowerShell. You can add many other properties to this list to expand the info that you want. I’ve often mused about moving a lot of the logic that I constantly put in these scripts into higher tier functions in PowerShell, or even create an SSAS library, but time is always at a premium…
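A minimal version of that kind of script looks like this (the instance name is a placeholder; State and LastProcessed are just two of the cube properties AMO exposes):

```powershell
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") | Out-Null

$server = New-Object Microsoft.AnalysisServices.Server
$server.Connect("MySSASInstance")   # placeholder instance name

# One row per cube across every database on the instance
foreach ($db in $server.Databases) {
    foreach ($cube in $db.Cubes) {
        [pscustomobject]@{
            Database      = $db.Name
            Cube          = $cube.Name
            State         = $cube.State
            LastProcessed = $cube.LastProcessed
        }
    }
}

$server.Disconnect()
```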
One feature that has always irritated me greatly is how you cannot alter the connection for an XMLA query window via the right-click context menu. For .sql, .mdx and .dmx it is an option via the context menu, but not for XMLA!
Earlier this year I posted a PowerShell script that will set the aggregation designs on your cube partitions. I recently created another script that contains a “Fix” flag to set the aggregation design, so you can just run it without the flag to get the info, or set the flag and fix the aggregation designs. I also added a “process” part that will run a process index on any partition whose aggregations are not fully processed.
The advantages of using this over the older version include the aforementioned process and checks, but also that you don’t have to specify the cube, measure group, and aggregation design name. The disadvantage is that it assumes you want the latest aggregation design applied and processed. Somewhere along the way there will probably be a script which combines all of this nicely!
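The check-and-fix loop can be sketched like this (a simplified version, with placeholder instance and database names; it assumes the last aggregation design in each measure group’s collection is the latest):

```powershell
param([switch]$Fix)

[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") | Out-Null
$server = New-Object Microsoft.AnalysisServices.Server
$server.Connect("MySSASInstance")               # placeholder instance name
$db = $server.Databases.FindByName("MyCubeDb")  # placeholder database name

foreach ($cube in $db.Cubes) {
    foreach ($mg in $cube.MeasureGroups) {
        # Assumption: treat the last design in the collection as the latest
        $latest = $mg.AggregationDesigns | Select-Object -Last 1
        if ($null -eq $latest) { continue }
        foreach ($p in $mg.Partitions) {
            if ($p.AggregationDesignID -ne $latest.ID) {
                "$($cube.Name)/$($mg.Name)/$($p.Name): expected design '$($latest.Name)'"
                if ($Fix) {
                    # Apply the design, then build its aggregations
                    $p.AggregationDesignID = $latest.ID
                    $p.Update()
                    $p.Process([Microsoft.AnalysisServices.ProcessType]::ProcessIndexes)
                }
            }
        }
    }
}
$server.Disconnect()
```

Run it without `-Fix` to report, and with `-Fix` to apply and process.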
Very recently I had to delete 600 partitions from a single measure group in a cube. This is a high number of partitions, but to make matters worse, it was 600 random partitions out of the 1,000 or so partitions already there. So, I could’ve spent the best part of a day picking through which partitions to delete, or I could use SQL to script out the XMLA for me and then execute the XMLA manually. Ideally I would have done this in PowerShell. And I probably will if I have to do something like this again. But seeing as it is a one shot, I decided to write it in T-SQL. At any rate, I’ve posted the T-SQL below.
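The shape of the generator is simple: one XMLA Delete command per partition ID. A sketch (the work table and the database/cube/measure group IDs are placeholders):

```sql
-- Assumes the partition IDs to drop have been loaded into a work table
SELECT
    '<Delete xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
      <Object>
        <DatabaseID>MyCubeDb</DatabaseID>
        <CubeID>MyCube</CubeID>
        <MeasureGroupID>MyMeasureGroup</MeasureGroupID>
        <PartitionID>' + p.PartitionID + '</PartitionID>
      </Object>
    </Delete>'
FROM dbo.PartitionsToDelete AS p;
```

Paste the output into an XMLA window on the instance and execute.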
Before I start this post, I want to mention a few things:
I’ve been looking into using processor affinity on our SSAS instances to help control the workload on our servers. Sometimes a process can run which will grab as much CPU resource as it can, which means that queries on the cubes cannot run, grinding the front end to a halt. Unlike the db engine, there is no resource governor for Analysis Services. Sure, you can set the min/max memory simply enough, but setting limits on IO and CPU is a more complex task.
Recently I deployed updates to our cubes that resulted in the aggregation designs being removed from our partitions. This was because in the Visual Studio solution we had more than one aggregation design for each measure group, so that re-processing and adding new partitions caused the aggregation designs to be dropped. D’oh!
The PowerShell script below returns information on all the partitions within the Measure Groups of a given cube. I prefer to use PowerShell to get this data as there’s no way to get the info out of the UI this quickly. If, like me, you have lots of partitions within a Measure Group and want to exit the script you can press “X” when the list reaches the bottom of the screen.
Way back when I started this blog I wrote about a few of the SSAS server properties, and today I’m revisiting this subject to talk about the AllowedBrowsingFolders property.
Apologies for the lack of snappy title! Some weeks back I posted about the Management Pack for SQL Server Analysis Services 2008 and 2012 that was recently released. If you do not know anything about the management pack I recommend a read of my older post before reading this one. We recently installed the Management Pack for SQL Server Analysis Services 2012 and as promised I am posting some notes on the whole experience and install:
I was browsing Twitter last Sunday night and a tweet from Chris Webb caught my eye:
https://twitter.com/Technitrain/status/429582908010274816
Great! Finally, a Management Pack for SSAS! I don’t know how much people know about System Center Operations Manager (SCOM), so a brief introduction: SCOM is, according to the Microsoft Marketing Executives:
One of the useful features of the SSAS Activity Viewer 2012 is the ability to cancel sessions and queries.
This morning I am happy to announce the release of an alpha version of the SSAS Activity Viewer for SQL Server 2014 CTP2. The name has had a slight change and is now called Analysis Services Server Activity Monitor (ASSAM) 2014.
This morning I am happy to announce the release of a stable version of SSAS Activity Viewer 2012. The build number is 1.1.0.0. The new features have been highlighted below in the release notes section.
(Worth noting that there is a more recent version available here!)
Hi all, yesterday morning I created a project based on an old blog post Upgrade Analysis Services Activity Viewer 2008 to SQL Server 2012. Over the course of the day I made a few changes that I wanted to see myself and this afternoon I have released a beta version, SQL Server Analysis Services Activity Viewer 2012 1.0.0.1, or SSASAV 2012 1.0.0.1 for short.
I’ve been on New Orleans time all this working week and have been up late at night, not doing much, just hitting the ‘random’ button on DBA Reactions (and posting them on Twitter) when I came across this one on Anti Virus running on SQL file-types.
“Good one” I thought to myself. And then I wondered if any of the files that make up the cubes on our SSAS boxes were being scanned or whether they were excluded. We have some big cubes in our system, and by big I mean terabytes’ worth of cubes, some of them over 700GB. So I fired up System Center 2012 Endpoint Protection on one of the SSAS boxes, and sure enough, all files were being scanned. But should they be? This Microsoft KB article confirmed my suspicions that they can be exempt from scanning: How to choose antivirus software to run on computers that are running SQL Server
With Service Packs and Cumulative Updates rapidly released for SQL, it can be hard to keep up with what version you have installed where. It’s easy enough to find out which version of SQL Server you have installed: just run the following T-SQL and you can find all the info you need. But what about SSAS?
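For SSAS, the AMO Server object exposes the same kind of detail (the instance name below is a placeholder):

```powershell
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") | Out-Null

$server = New-Object Microsoft.AnalysisServices.Server
$server.Connect("MySSASInstance")   # placeholder instance name

# Version maps to the SQL Server build, e.g. 11.0.xxxx for SQL Server 2012
"{0} {1} (build {2})" -f $server.ProductName, $server.Edition, $server.Version

$server.Disconnect()
```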
Microsoft have released SQL Server Data Tools for Analysis Services, Integration Services and Reporting Services projects with Visual Studio 2012. Finally, it looks like the BIDS suite is going to be usable with the most up-to-date version of Visual Studio; after we upgraded to SQL Server 2012, for about 3 months last year we were able to use Visual Studio 2010 as our sole IDE, before the update to Visual Studio 2012 rolled out at our office.
Here is a neat little script; it gets all the status info of an Analysis Services database and the cubes within that database. The status ought to be “Processed”, and if it is not, you know that something has gone wrong in your database. This script can quite easily be expanded to include dimensions, measure groups, and partitions, but I personally split that out into different scripts, otherwise the screen looks too full of info and you cannot see the forest for the trees.
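The heart of such a script is only a few lines of AMO (the instance and database names are placeholders):

```powershell
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") | Out-Null

$server = New-Object Microsoft.AnalysisServices.Server
$server.Connect("MySSASInstance")               # placeholder instance name
$db = $server.Databases.FindByName("MyCubeDb")  # placeholder database name

# Database-level status, then one row per cube
[pscustomobject]@{ Object = "Database"; Name = $db.Name; State = $db.State; LastProcessed = $db.LastProcessed }
foreach ($cube in $db.Cubes) {
    [pscustomobject]@{ Object = "Cube"; Name = $cube.Name; State = $cube.State; LastProcessed = $cube.LastProcessed }
}

$server.Disconnect()
```

Anything whose State is not “Processed” is a candidate for investigation.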
Working in an Agile environment means that we have many releases throughout the year. Despite our sprints being of the same length, there generally seem to be releases where there is a lot of functionality, and some where not so many changes are going in. Releases tend to be a team effort here, but the overall management of the process falls to me. But no matter how regularly we deploy, or how big an impact the release will have, it always makes sense to be diligent when preparing for a release.