2018-06

2018-05

2016-10

2016-08

log shipping. One of the key points I mentioned was how slow restoring can be for a read-only log shipped database. If going and reading the whole thing is too much effort for you now, I’ll save you the effort (you child of the internet you) and tell you it’s because the database needs to be kept transactionally consistent in between restores when it is made available for read-only access. It creates a .tuf file (transaction undo file) to track the progress of all the pages that have uncommitted transactions saved to them. The checking, creation, applying and re-applying of this can take some time when there are a significant number of uncommitted transactions within a log backup.
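For illustration, here’s a minimal sketch of the manual equivalent of what log shipping does on the secondary: restoring a log backup WITH STANDBY, which is the option that creates the undo file. The database name and paths are hypothetical:

```sql
-- Hedged sketch: restore a log backup in standby mode. STANDBY rolls back
-- uncommitted transactions into the named .tuf file so the database stays
-- readable, then reapplies them before the next log restore.
-- Database name and file paths are hypothetical placeholders.
RESTORE LOG [SalesDB]
FROM DISK = N'D:\LogShip\SalesDB_20160801.trn'
WITH STANDBY = N'D:\LogShip\SalesDB.tuf';
```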

2016-02

2015-12

greatly deprecated form, and you cannot read from the secondary. So it will be good for DR, but not for reporting (as an aside, it still might be easier to set up log shipping for DR than AlwaysOn Basic, because AlwaysOn Basic requires you to set up a failover cluster. Read through the “how to set up Standard Edition Availability Groups” here.) However, you do need to be careful when setting up log shipping across different editions of SQL Server: whilst you can log ship from Enterprise to Standard/Web, if the database uses any Enterprise features then you’ll need to log ship to an Enterprise edition of SQL Server. And because you’ll be using the database for reporting, you’ll need to get it licensed.
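If you’re unsure whether a database is using any Enterprise-only features, the sys.dm_db_persisted_sku_features DMV will tell you; a quick check looks something like this (the database name is hypothetical):

```sql
-- Lists edition-specific features (e.g. Compression, Partitioning) persisted
-- in the current database. An empty result set means the database can be
-- log shipped to a Standard/Web edition secondary.
USE [ReportingDB];  -- hypothetical database name
SELECT feature_name, feature_id
FROM sys.dm_db_persisted_sku_features;
```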

2015-04

2014-10

2014-08

SQL Server 2014 Clustered Columnstore Indexes and Partitioning. The script below will create the database and the objects necessary. I’m going to state the obvious, but read through the scripts before you run them; things like file locations differ from box to box:
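The script itself isn’t reproduced in this excerpt, but a minimal sketch of the kind of objects involved (partition function, partition scheme, table, and clustered columnstore index; all names are hypothetical) would look like this:

```sql
-- Hypothetical sketch: a partitioned table with a clustered columnstore
-- index on SQL Server 2014. Adjust filegroups/file locations for your box.
CREATE PARTITION FUNCTION pf_OrderDate (date)
AS RANGE RIGHT FOR VALUES ('2014-01-01', '2014-02-01', '2014-03-01');

CREATE PARTITION SCHEME ps_OrderDate
AS PARTITION pf_OrderDate ALL TO ([PRIMARY]);

CREATE TABLE dbo.FactOrders
(
    OrderDate  date  NOT NULL,
    CustomerId int   NOT NULL,
    Amount     money NOT NULL
) ON ps_OrderDate (OrderDate);

-- The clustered columnstore index inherits the table's partitioning.
CREATE CLUSTERED COLUMNSTORE INDEX cci_FactOrders ON dbo.FactOrders;
```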

2014-06

Connect article already open.

2014-04

2013-12

moving large tables across filegroups. This was something that I have had to do a few times over the past few years; however, the data warehouse in particular was on Enterprise, so it used page compression and partitioning.
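The standard trick for moving a table to another filegroup is to rebuild its clustered index there; a hedged sketch (table, index and filegroup names are hypothetical):

```sql
-- Rebuild the clustered index onto a new filegroup; the table's data moves
-- with it. The PAGE compression shown here is an Enterprise-only feature.
CREATE CLUSTERED INDEX cix_BigTable
ON dbo.BigTable (Id)
WITH (DROP_EXISTING = ON, DATA_COMPRESSION = PAGE)
ON [FG_Archive];
```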

2013-11

Analysis Services Server Activity Monitor (ASSAM) 2014.

SQL Relay R2 2013 is taking place in the UK. This is the 2nd series of events this year. SQL Relay is a series of one-day events run throughout the UK. Each event is a single-track conference (or more tracks in the case of London), with between 50 and 200 attendees and six to eight hour-long talks by SQL Server professionals, MVPs, authors, technical experts etc.

2013-10

to download here. It is also available as a VM on Azure. According to a tweet from Glenn Berry, the version number in the demo is 12.0.1524.
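Once installed, you can confirm the build you’re running directly from the instance:

```sql
-- Returns the build number (e.g. 12.0.1524) and product level of the
-- connected instance.
SELECT SERVERPROPERTY('ProductVersion') AS build,
       SERVERPROPERTY('ProductLevel')   AS product_level;
```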

Before you download and install, here’s what you ought to know:

SSAS Activity Viewer 2012. The build number is 1.1.0.0. The new features have been highlighted below in the release notes section.


Hi all, yesterday morning I created a project based on an old blog post, Upgrade Analysis Services Activity Viewer 2008 to SQL Server 2012. Over the course of the day I made a few changes that I wanted to see myself, and this afternoon I released a beta version, SQL Server Analysis Services Activity Viewer 2012 1.0.0.1, or SSASAV 2012 1.0.0.1 for short.

2013-09

2013-07

2013-06

Visual Studio 2013 Preview and now the SQL Server 2014 CTP1. And that’s just the ones I’m focusing on! There’s the Windows 8.1 Preview as well as Windows Server 2012 R2. What with all the keynotes from Build and both TechEds this month, it is clear that Microsoft have certainly accelerated the release cycle and embraced a cloud-first development model. For me this is most noticeable in Team Foundation Service, the Azure-based source control solution. Features were turned on regularly before they were released in the three updates we’ve had since the release of TFS 2012 back in August. This is a big change from Microsoft’s previous strategy of developing for its on-premises products first and then pushing features to the cloud afterwards.

At a later date I’ll dig more into features of SQL 2014, as this week my focus has been on Visual Studio 2013, which is out sometime this year, as well as doing my job in real life. The duties of blogging…

2013-05

Part 2 focuses on clearing out the SSISDB by creating a new stored proc based on the one used by the maintenance job. If your maintenance job is taking hours to run, you need to check this out.)
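For context, the maintenance job’s cleanup works through SSISDB’s retention window; one complementary tweak is to shrink that window so each cleanup run has less to delete. The value below is purely illustrative:

```sql
-- Reduce how many days of operational history SSISDB retains, so the
-- cleanup job has less to remove per run. 30 days is an illustrative value.
EXEC SSISDB.catalog.configure_catalog
     @property_name  = N'RETENTION_WINDOW',
     @property_value = 30;
```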

No WiX Wednesday this week, owing to commitments in real life. Instead, here is something regarding SSIS 2012 Deployment. Enjoy!

As part of our CI and test builds we have automated the deployment of two SSIS projects. One is fairly large and the other contains only two dtsx packages. Recently we have been getting timeout issues with the deployment of these solutions.
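For reference, a scripted catalog deployment of an .ispac usually looks something like the sketch below (folder, project name and path are hypothetical); it’s this deploy_project call that can run long enough to time out:

```sql
-- Hedged sketch: deploy an .ispac into the SSIS catalog from T-SQL.
-- Folder, project name and file path are hypothetical placeholders.
DECLARE @project varbinary(max), @operation_id bigint;

-- Load the built project file as a binary stream.
SELECT @project = BulkColumn
FROM OPENROWSET(BULK N'C:\Builds\LargeProject.ispac', SINGLE_BLOB) AS ispac;

EXEC SSISDB.catalog.deploy_project
     @folder_name    = N'CI',
     @project_name   = N'LargeProject',
     @project_stream = @project,
     @operation_id   = @operation_id OUTPUT;
```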

2013-04

brentozar.com ran one of their Technology Triage Tuesdays on compression in SQL Server. The video does not appear to be up yet, but it provided good insight, certainly better than the TechNet pages on the same subject. As we have a multi-terabyte data warehouse at work, on an Enterprise-licensed instance of SQL Server 2012, I’m familiar with the subject of data compression in SQL. Until recently, however, another of our SQL Servers was on a Standard license. That instance has now been upgraded to Enterprise, so I was able to compress the database. Although it is not as large as our data warehouse, it was well worth compressing some of its larger tables.
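Before compressing anything, it’s worth estimating the savings first; a hedged sketch using the built-in estimator, then rebuilding with compression (the table name is hypothetical):

```sql
-- Estimate the space PAGE compression would save on a candidate table.
EXEC sp_estimate_data_compression_savings
     @schema_name      = N'dbo',
     @object_name      = N'FactSales',  -- hypothetical table name
     @index_id         = NULL,
     @partition_number = NULL,
     @data_compression = N'PAGE';

-- If the estimate looks worthwhile, rebuild with compression enabled.
ALTER TABLE dbo.FactSales
REBUILD WITH (DATA_COMPRESSION = PAGE);
```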

2013-03

2013-01

2012-12

Hey folks, and welcome to my first proper blog post. One of the things that I like to monitor through my daily checks is the file size and the free space of each file within the databases. During a busy loading period some databases can grow massively, so it’s important to keep an eye on the growth. Through the UI, SSMS only gives us the size of the files and the max size they can grow to, which is clearly not very useful. Fortunately, a quick query on dbo.sysfiles in each database that we want to monitor gives us some info:
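The query itself isn’t reproduced in this excerpt; a minimal sketch of that kind of query against dbo.sysfiles:

```sql
-- Per-file size information from dbo.sysfiles (values are in 8 KB pages).
-- Run in each database you want to monitor.
SELECT name,                                    -- logical file name
       filename,                                -- physical path
       size,                                    -- current size, in 8 KB pages
       maxsize,                                 -- max size it can grow to
       FILEPROPERTY(name, 'SpaceUsed') AS used  -- pages actually in use
FROM dbo.sysfiles;
```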

" rel="attachment wp-att-11

but this isn’t entirely too useful. For starters, the size is reported in 8 KB pages. This makes sense, as databases store data in 8 KB pages. Whilst that may be OK for a small database like this one, our data warehouses are far too big for sizes in pages to be useful. Also, we can infer the remaining space, but it’s not much help to have to figure that out for ourselves.
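A refinement of the sketch above, converting pages to MB and deriving the free space directly rather than leaving it to mental arithmetic:

```sql
-- Convert 8 KB pages to MB and compute the free space per file.
SELECT name,
       size * 8 / 1024                                      AS size_mb,
       FILEPROPERTY(name, 'SpaceUsed') * 8 / 1024           AS used_mb,
       (size - FILEPROPERTY(name, 'SpaceUsed')) * 8 / 1024  AS free_mb
FROM dbo.sysfiles;
```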