PoshSSDTBuildDeploy 2.0.271.0 - Now With Better Exception Handling!
Another quick bulletin about the changes I made to PoshSSDTBuildDeploy today -
Concluding this week's work on PoshSSDTBuildDeploy is one of the features I spoke about yesterday: joining the OperationSummary and Alerts objects to produce a new table that will make it easier to figure out which Operation is creating which Issue.
Let’s revisit the xml file that I’m running my tests with -
Continuing on with my efforts to make the SSDT Deployment Report more useful, I’m pleased to say that I’ve added reporting on any alerts that appear in the Deployment Report. Let’s take a look at a report that has warnings in it -
This is going to be a short one because I’ve been working hard all evening on this, and frankly I’m a little tired. But I’ve made a change to PoshSSDTBuildDeploy which is going to be the start of more changes. Intrigued? Well, read on…
It is my birthday: another year older, another year not necessarily wiser. I’m planning on spending the evening watching the BBC and eating pizza. There are worse ways to spend a birthday…
One of the things I feel strongly about is that a build/release pipeline needs to be as complete as possible - that is, all configuration required to complete the build/deploy process needs to be automated in some way or another. And as we will see from today’s post, this tends to create further complexity, and unfortunately sometimes we have to get inventive for a process to work…
Hello! Some weeks ago I got the tests working on the CI build for PoshSSDTBuildDeploy. The tests were a bit of a mishmash; however, they sort of did the trick. Thankfully Eugene Niemand came along and very kindly offered to update the tests into Pester tests, and also just generally tidy up the mess. It’s always great to have a second pair of eyes on any code, as it keeps it from being quite so auteured…
Hello! So, I’ve been working on a project recently where we’ve been using an Azure SQL Database that is stored in source control as an SSDT Project. We’ve also been making use of the excellent TSQLT to run unit tests as part of the build. And typically I set up a Unit Test database to reference the main database so that the tests can be deployed separately from the main code.
Hello! I have recently been creating a pipeline in VSTS that will check if certain software/PowerShell Modules are installed on the box, and if it isn’t then install it. It’s a sort of a DSC/Chef/Puppet process done via VSTS so that I don’t have to configure the aforementioned software. But I needed a way to target a specific build agent to run this build on; specifying an agent pool is no good as it will deploy on any in that pool.
Hello! I’m currently working on a project that requires building and deploying SSDT-based projects. So it’s been a great opportunity to make use of PoshSSDTBuildDeploy and work through any issues. And there have been a few minor updates to the module over the past fortnight which I’ll go through now! As I type the latest version is 2.196.0. Return Path of MSBuildDataTools - Just so much easier to have the location returned so that when you execute Invoke-MSBuildSSDT and Publish-DatabaseDeployment you can just pass the outputted variable in.
Hello! I have been working an awful lot with ARM Templates and publishing resources using PowerShell. As part of the process we’re using a Service Endpoint to publish resources to Azure. So the login and subscription context are set by the endpoint itself. However, when running the scripts locally, it is necessary to log in first. So you have two choices: log in before running the script, which isn’t always possible, or include Login-AzureRmAccount in every script, and then obviously forget to remove it prior to checking in and wonder why releases take forever…
Hello! I’ve recently started a new project, and we’re going to use VSTS for the build and release pipeline. This includes deploying infrastructure from ARM templates. This is going to provide its own set of challenges, in that the VSTS Team is new and there is no authentication between VSTS and the Subscription the objects are deploying to. So it really is a “green field” project. But this is a good thing, as we get to deploy as much as we can from source control and limit the interaction with the Azure Portal.
…but crucially I now have https on my static website. And all I had to do was migrate to GitLab! Next post will be about how this site is built. Stay tuned!
Hello! I’m due to sit an exam this week, the excitingly titled “Designing and Implementing Cloud Data Platform Solutions”. Revision is hard going, as my ability to concentrate on reading pretty much anything seems to be less and less potent as I get older. And so by way of a distraction/procrastination, I thought I’d write a brief summary of some of the changes I’ve made to projects in GitHub this month.
Hello! I’ve been playing around with a PowerShell module called RunSyncRun that I’ve never quite got around to completing. What it aims to do is to execute the sync function between SSAS Databases. Part of the problem is that it takes a while to get a test environment set up, but I finally got the WideWorldImporters sample project up and running as a good place to start. The crux of the whole module is to run this relatively small piece of xmla found in StartDatabaseSync -
Hello! I’ve been helping out a team with setting up a deployment pipeline that uses Azure Data Lake Store, and so the obvious way to do this is to make use of the AzureRM.DataLakeStore PowerShell module. For those of you not ITK, Microsoft provide a set of PowerShell cmdlets that make use of the Azure Resource Manager model for managing Azure resources. These can be downloaded via the PowerShell Gallery. And because there is a fairly large number of cmdlets, they are split into subsets, such as the aforementioned AzureRM.
This is the first post in earnest as I begin the arduous task of migrating away from BlueHost/WordPress and their paid services to a free static web site. Write-Host "I have no idea what I am doing."
Hello! As part of SQLDWSchemaMigrate there is a function called Compare-Rows, which finds if there is a difference between the number of columns for a given table that exists on both source and target tables. And I need to return a hash table of the schema/table where there is a mismatch in column count. What makes this a challenge is that hash tables cannot store duplicate keys. I was setting the key to the schema name, so theoretically I could just set the value to the schema and the key to the table.
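One way around the duplicate-key restriction (a sketch with hypothetical names, not the function's actual implementation) is to use a composite "schema.table" key, which is unique per table even when many tables share a schema:

```powershell
# Hash tables can't hold duplicate keys, so a composite "schema.table" key
# keeps one entry per table even when several tables share a schema name.
$mismatchedTables = @{}

# Hypothetical list of tables whose column counts differ between source and target
$mismatches = @(
    @{ Schema = 'dbo';   Table = 'Orders' },
    @{ Schema = 'dbo';   Table = 'Customers' },
    @{ Schema = 'sales'; Table = 'Orders' }
)

foreach ($m in $mismatches) {
    $key = '{0}.{1}' -f $m.Schema, $m.Table
    $mismatchedTables[$key] = $m.Schema   # the value can stay as just the schema
}

$mismatchedTables.Keys   # dbo.Orders, dbo.Customers, sales.Orders
```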
Hello! I’ve been working on a PowerShell module that will migrate the schema of an Azure Data Warehouse from one database to another. The process has been through quite a few iterations, and at one point I was heavily using the sqlcmd utility to execute each “CREATE [OBJECT]” statement. Whilst this proved effective when playing around with a small database, when I tested the module against a real ADW it became clear that this process was far too slow.
Hello! There are many things I like about PowerShell, one of which is the automatic variables that are created. For example, $PsScriptRoot is set to the location of the running script (and if you run $PsScriptRoot in a .psm1 file, it sets $PSScriptRoot as the path of the file as opposed to the script.) And one of the automatic variables that I recently discovered is $PSBoundParameters. This handy little variable is a hash table that keeps track of which parameters have had arguments passed to them in either a script or a function.
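A minimal sketch of the idea (function and parameter names are made up for illustration): $PSBoundParameters only contains the parameters the caller explicitly supplied, which makes it easy to tell a default value apart from one that was actually passed.

```powershell
function Show-BoundParams {
    param (
        [string]$Name,
        [int]$Count = 1
    )
    # $PSBoundParameters holds only the parameters the caller actually supplied,
    # so a defaulted $Count and an explicitly passed $Count can be distinguished.
    if ($PSBoundParameters.ContainsKey('Count')) {
        "Count was explicitly passed: $Count"
    }
    else {
        "Count was not supplied; using default: $Count"
    }
}

Show-BoundParams -Name 'test'            # Count was not supplied; using default: 1
Show-BoundParams -Name 'test' -Count 3   # Count was explicitly passed: 3
```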
Hello! Not so long ago, I wrote about how dbatools now has functions that can be used to deploy dacpacs. What I didn’t actually include was a script to do the same, so without further ado, below is a script that makes use of Publish-DbaDacpac - There is also a -Verbose option. But even without it a successful deployment will spit out quite a bit of information - And of course, if you run Get-Help Publish-DbaDacpac, you get the built in help -
Hello! 3 updates in one week?! It appears that though I took some holiday and didn’t even so much as touch a PC over the Christmas/New Years season, other people were still working hard. SSISMSBuild is a project that has had a long life. In short, you can build ispac projects without needing Visual Studio, however you do need Integration Services installed. Broadly speaking the project is not mine, I’m just hosting it.
Hello! Another day, another bug. This time in a VSTS extension I wrote that would alter the ModuleVersion Number in a psd1 file. Sadly I didn’t test this against psd1 files that have required modules, so the ModuleVersion numbers were updated for those also! This is helpful to precisely no one, so I made a minor change in how the ModuleVersion is found. Link to GitHub Issue. The bug itself is fixed as of 1.
Hello! Very recently a user of salt got in touch with me about a bug when setting the job schedules up - when creating a schedule an owner is created. And only the owner (or sysadmin) can modify the schedule. So if the deployment account is not sysadmin then it needs to be the owner of the schedule. Problem An owner is identified by a SID, which is stored in a varbinary column in SQL Server, so when we set the SID variables in PowerShell they are stored as byte arrays.
Hello! As is always the case, I was about the furthest away from thinking about computers when I realised that I had made some stupid decision WRT the logic in vstsBuildWatcher. So in between kicking a ball around my living room with The Boy (and causing very little damage in the process) I spent the afternoon updating the vstsBuildWatcher Module. I’ve pushed the changes out to GitHub and PowerShell Gallery.
Hello! Recently I’ve been working on a PowerShell module called PoshSSDTBuildDeploy that makes use of Microsoft.Data.Tools.MSBuild to build and deploy SSDT-based database projects. And then not long after the fact I started working on functions within the dbatools PowerShell module to deploy dacpacs. In fact here’s the tweet of the announcement - https://twitter.com/psdbatools/status/940650316235706369 And here is a link to the source code on GitHub. This is all good stuff because I’ve spoken to a number of people who have written, rewritten and re-rewritten the PowerShell to deploy dacpacs numerous times before, and we’ve all agreed that once is enough.
Hello! With a spare Nexus 5 knocking about, and somehow too much time on my hands, I decided to flash the phone with Ubuntu Touch. Having tweeted said decision I got a response from someone about an annoying bug which will drain your data - Seems to have an annoying bug that makes repeated requests to "https://t.co/xT6GOIi3KY". Needs blocking in localhost or you will eat mobile data. It's actually pretty good but needs lots more people to support it.
Hello! Lately I’ve been working on PowerShell Modules that are published to PowerShell Gallery as part of a build process in VSTS. And so I’ve been creating version numbers and passing them as variables. However I wanted to validate them before they were applied. Below is a sample of how to verify that the version number you’re pushing conforms to semver.
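A sketch of how such a check might look (a simplified regex, not the full semver.org grammar, and not necessarily the post's exact script):

```powershell
# Minimal SemVer check: major.minor.patch, with optional pre-release
# and build-metadata segments. A simplified pattern, not the full spec.
function Test-SemVer {
    param ([Parameter(Mandatory)][string]$Version)
    $pattern = '^\d+\.\d+\.\d+(-[0-9A-Za-z\.-]+)?(\+[0-9A-Za-z\.-]+)?$'
    return [bool]($Version -match $pattern)
}

Test-SemVer '1.2.3'          # True
Test-SemVer '1.2.3-beta.1'   # True
Test-SemVer '1.2'            # False
```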
Hello! Below the “how to” script, there are two functions for you; the first will download MicrosoftDataToolsMSBuild. The second will compile an SSDT project/solution using the downloaded package. There are no prerequisites other than MSBuild Tools being downloaded/installed on the box.
Hello! Wow, day 7 already! And for today’s post, we’re getting meta. I have a few builds on VSTS that run PSScriptAnalyzer on some PowerShell modules, and also run some tests. After these steps have run successfully I publish to both NuGet and PowerShell Gallery (yes that’s right, Continuous Development FTW). Packaging via NuGet is simple enough: there’s several out-of-the-box NuGet steps in VSTS, including even a Custom step that allows you to submit version number/prefix.
Hello! Continuing on with our daily PowerShell Advent Calendar Extravaganza, this script here will return the current .NET version installed on the machine. You can either choose to add the target .NET version or not. If you add a .NET version number and at least that version isn’t on the machine, the script will throw an error, but will continue to run and return the PSObject, unless you set the “ErrorAction” to “Stop”.
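A sketch of the registry lookup such a script might use (not necessarily the post's exact script): the installed .NET Framework 4.x version is exposed as the "Release" DWORD under a well-known registry key, and Microsoft document the mapping of Release numbers to versions (e.g. 461808 for 4.7.2, 528040 for 4.8).

```powershell
# Read the .NET Framework 4.x "Release" number from the registry and wrap it
# in a PSObject, which is the shape of output the post describes returning.
$key = 'HKLM:\SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full'
$release = (Get-ItemProperty -Path $key -Name Release -ErrorAction SilentlyContinue).Release

$result = [PSCustomObject]@{
    ReleaseNumber = $release
    AtLeast472    = ($release -ge 461808)   # 461808 = minimum Release for 4.7.2
}

if (-not $result.AtLeast472) {
    # Write-Error (rather than throw) keeps running unless -ErrorAction Stop is set,
    # matching the behaviour described above.
    Write-Error 'Target .NET version is not installed on this machine.'
}
$result
```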
Hello! As someone who has some PowerShell modules knocking about, it’s important to get some checks running prior to publishing. As these modules are in source control, it makes sense to launch builds that run PSScriptAnalyzer as soon as a pull request is made on the branch. But running PSScriptAnalyzer on the hosted build agent is a little difficult as not only does it not come with it installed, the service account running the builds doesn’t have permission to install it!
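One way around the permissions problem (a sketch, not necessarily the approach the post settles on) is to avoid the machine-wide module path entirely and save the module into a folder the build account can write to:

```powershell
# Hosted agents won't let the build account install to the machine-wide module
# path, but Save-Module can drop the module inside the build workspace instead.
$modulePath = Join-Path $env:TEMP 'BuildModules'
Save-Module -Name PSScriptAnalyzer -Path $modulePath -Force

# Make the saved copy discoverable for this session only
$env:PSModulePath = "$modulePath;$env:PSModulePath"
Import-Module PSScriptAnalyzer

# Path here is a placeholder for whichever module the build is checking
Invoke-ScriptAnalyzer -Path .\MyModule.psm1
```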
Hello! Not too much to say here except that here is an improved script on deploying cubes with PowerShell!
Hello! When I was recently putting together a repo I needed to get a few packages from NuGet. I also wanted to automate getting the packages for when other people downloaded the repo and ran the scripts. So it made sense to create a function that would download the package and test that the download was successful. For info, read the synopsis!
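A sketch of such a function (hypothetical names; not the post's actual script): a .nupkg can be fetched directly from nuget.org's v2 package endpoint, and checking the file landed on disk is a simple success test.

```powershell
# Download a NuGet package as a .nupkg (really a zip) and verify it arrived.
function Get-NuGetPackage {
    param (
        [Parameter(Mandatory)][string]$Name,
        [Parameter(Mandatory)][string]$Version,
        [string]$Destination = $env:TEMP
    )
    $uri  = "https://www.nuget.org/api/v2/package/$Name/$Version"
    $file = Join-Path $Destination "$Name.$Version.nupkg"

    Invoke-WebRequest -Uri $uri -OutFile $file

    # Confirm the download actually produced a file before declaring success
    if (-not (Test-Path $file)) {
        throw "Download of $Name $Version failed."
    }
    return $file
}

# Example usage with a placeholder package name/version:
# Get-NuGetPackage -Name 'Microsoft.Data.Tools.Msbuild' -Version '10.0.61804.210'
```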
Hello! Today’s PowerShell Snippet will check if a module is installed, and if it isn’t will install it. Finally it will import the module.
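The snippet described above amounts to a few lines (module name here is just a placeholder):

```powershell
# Install a module only when it's missing, then import it either way.
$moduleName = 'PSScriptAnalyzer'   # any module name will do

if (-not (Get-Module -ListAvailable -Name $moduleName)) {
    Install-Module -Name $moduleName -Scope CurrentUser -Force
}
Import-Module -Name $moduleName
```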
Hello! When building up urls from different parameters in something like TeamCity, or Octopus, it’s simple enough to get double “//” in urls if the parameters are not consistent. So little helper functions are always useful to have imported to manage such things. Below is an example of such a thing!
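A minimal sketch of such a helper (hypothetical name): trimming stray slashes from each fragment before joining means "http://server/" + "/api" never becomes "http://server//api".

```powershell
# Join URL fragments safely by trimming leading/trailing slashes from each part.
# The internal "//" after the scheme is never touched because Trim only works
# on the ends of each string.
function Join-Url {
    param ([Parameter(Mandatory)][string[]]$Parts)
    ($Parts | ForEach-Object { $_.Trim('/') }) -join '/'
}

Join-Url 'http://teamcity:8111/', '/app/rest/', 'builds'
# http://teamcity:8111/app/rest/builds
```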
Hello! Today I’ve finally got around to altering PoshSSDTBuildDeploy to generate the deployment scripts/report. These can be generated with or without publishing the changes. The Run-Test file has examples of how this works. It is, however, a breaking change, hence the update to version 2. This is because I moved from using the Deploy method to the Publish method. You can read about this method on the SSDT Blog Post. This method was introduced way back in October 2016.
One of the big challenges in the whole “build once/deploy many” ideology is the need for environment configuration. And this is true with SSDT-based database deployments. Happily the variables are stored in the publish.xml file. And so today’s release of PoshSSDTBuildDeploy added a switch on the Publish-DatabaseDeployment function. When included, this switch (getsqlcmdvars) will attempt to resolve SqlCmd variables via matching PowerShell variables explicitly defined in the current context. The drawback here is that all SqlCmdVars have to exist in the current context.
Hello! Late on Friday I published a PowerShell Module that gives working examples of using DataToolsMSBuild. All very interesting; go and have a read of my previous post. One of the things I really want the module to do is to set up all software required for building/deploying. And so from version 18.104.22.168 onwards there are three new Functions that will check if LocalDb is installed, and if it isn’t then download and install it.
Hello! Over the recent past I’ve been working on a variety of build/deploy PowerShell scripts that make use of Microsoft.Data.Tools.MSBuild. And I’ve been wanting to get like a standardized “this is how I would do it” set of scripts together. And so I’m happy to say that I’ve got around to it. The source code is on GitHub, and the releases are on both NuGet and PowerShell Gallery. There is a test file to show the expected workflow.
You can tell a lot about how much someone has embraced continuous delivery by how they react when either a build or an automated deployment fails. I have worked with quite a few teams who are at different levels of working with continuous delivery, but irrespective of whether they’re familiar with the practice or not, you can tell how adaptive a team member is when something goes wrong, like a failed build or deployment.
Hello! One of the most interesting features of SQL Server 2012 Service Pack 4 is the availability of the management command DBCC CLONEDATABASE. The idea of it is to create an “empty” copy of the database; all the metadata and statistics of the original and clone are identical, but the clone contains no data. The syntax of the command is very straightforward, requiring only the source and target database names. So I ran this on a very small (50mb) database, and within a few seconds it was completed.
Hello! So, last month I posted an article about changing XML values in memory. All very useful for when you have to manage configuration in the modern world of “build once, deploy many.” I’m not going to repeat myself on what I said on that blog; it’s all very interesting for people who like that sort of thing, go and have a read. Yet it only focused on a somewhat simple example, and so as XML can get very complex, I wanted to show a similar method to find elements where they are not unique by making use of attributes that are.
Hello! So AssistDeploy has been out for a little while now and so I decided to release the project I was using for testing during development. At the moment it uses just one scenario I used for testing as opposed to the several that I ran through. This is because initially I just want to provide a straightforward working sample. What became quite a big challenge was the fact that you need more than just an Integration Services project.
One of the useful features of the SSAS Activity Monitor is the ability to cancel sessions and queries. Though there is no obvious way to do this through SSMS you can in fact write an xmla query to kill either the session, connection or command: If you specify the connection ID it will kill all sessions, if you kill a session it will cancel all SPIDs that pertain to that session, and if you kill the SPID it will kill that one command.
Hello! I was helping a team debug a failed build today. Despite the fact that the build compiled on the dev’s box - something that’s become almost cliché to hear these days - it would fail using devenv.com via cmdline. Annoyingly, there were no errors output. So I cloned the repo and took a look myself - This is helpful to approximately no one. But based on experience we figured out that the Azure Feature Pack for SSIS needed to be installed on the build agent.
I’ve spent what feels like a lifetime working on automating SSIS deployments, more specifically SSIS packages that have been created in Visual Studio that use the project deployment model. If your SSIS Project is using this model you’ll know that the output of a build is an .ispac file. But what is in an ispac? Funnily enough, someone forwarded me a link to a page that covers some details about the ispac file format on MSDN.
Hello! Recently someone got in touch with me about one of my earliest posts on my old blog. I realised I had made a mistake on that post and had updated it. But I also realised that the script itself is actually part of a PowerShell module. So wanting to make things as easy as possible, and being the helpful guy that I am, I have altered that example into a script which I am posting here.
Hello! I’ve spent a large chunk of the past few months putting together a PowerShell Module that will automate the deployment of SSIS solutions. Recently it went open source, and because I have a poor sense of humour I called it AssistDeploy. Aside from the readme, each function has its own documentation in the header. I’ve also written a rather large post on it at my company’s blog. All very detailed, go and have a read.
It’s been a long time coming, but now SQL Server Integration Services is in Public Preview on Azure! I’ve written about it elsewhere in greater depth, but here are the headlines: It makes use of SSIS Scale Out, which was released as part of SQL Server 2017. Although it is based on SSIS Scale Out, you can’t actually configure SSIS Scale Out to run on the instance. If this confuses you then read my in-depth post.
Hello! 2 posts in 1 day? When you’re hot, you’re hot! I’m actually writing this one whilst I’m waiting for a BACPAC to restore to Azure. It’s actually incredibly easy: use SQLPackage! Set the /SourceFile to your bacpac, and use the connection string published on Azure, and you’re done! Only a short post yes, but the precursor to some much longer and more interesting posts…
Hello! One of the patterns in Continuous Delivery is to “build once, deploy many”. Another is to “deploy the same way to each environment”. This is certainly easier said than done: The 3 biggest challenges in creating a Continuous Delivery pipeline can be summed up as follows: Configuration Configuration Configuration Obviously values of variables are going to change across environments. So how can we take something static that contains metadata of an environment, like say some xml, and update the values so that it is correct at runtime?
Hello! Let’s be brief and not mince words: PowerShell is great. And so recently I needed to get the VersionInfo of a given DLL, and so was able to write up a module that did exactly that.
Hello! As I have mentioned many times before, I have conflicting opinions when it comes to Octopus Deploy. On the one hand I think it does things that are a bit lousy, like variables only being strings for PowerShell (so it is not possible to include HashTables scoped easily). And whilst we’re on the subject of variables, the page to enter/edit variables is a tedious process. Firstly it has an autocomplete feature for the scoping, which is fine for tiny installs but is painfully slow on larger servers.
This month’s T-SQL Tuesday topic is about automating with PowerShell. I’ve been using PowerShell to help automate deployments for all sorts of SQL solutions; SSDT-based database projects, ISC-based SSIS projects, SQL Agent Jobs, multi-dimensional SSAS projects, to name a few. And when I worked as a DBA I wrote many ad-hoc scripts to automate many tedious processes I was required to do. And so when it came to pick a topic for today’s post I decided to go for something that not so much offers a complete, out-of-the-box, one-size-fits-all solution to everyone’s automation woes, but a bit of a left field example of how you can make use of .
I’ve been using TRUNCATE TABLE to clear out some temporary tables in a database. It’s a very simple statement to run, but I never really knew why it was so much quicker than a delete statement. So let’s look at some facts: The TRUNCATE TABLE statement is a DDL operation, whilst DELETE is a DML operation. TRUNCATE Table is useful for emptying temporary tables, but leaving the structure for more data.
Hello! I’ve yet to write up in any detail on the No Holds Barred Competition, but frankly seeing as this write-up is going to be a lot simpler and shorter I’m going to start here. The competition for September is called “Mega Melee”. This Single Battle competition allows for you to use only Pokémon that are capable of Mega Evolution. To be clear, despite a team of 3-6 being added, only 1 Pokémon will be used during each match.
Hello! One month ago I became a dad for the 3rd time. And so my spare time, which has always been at a premium, has shrunk even further. But here’s some thoughts on some stuff: SQL Stuff SSDT Supports Visual Studio 2017. And not only that, SSDT supports SQL Server Database, Analysis Services, Reporting Services, and Integration Services projects in Visual Studio 2017. SSDT has come a long way since those days of requiring two different IDEs - one to support BIDS and the other to support Database projects.
Hello! Security has always been an important issue, and this is more true today than it ever has been. And one of the cardinal sins of IT Security is storing sensitive data as clear text in files. So with the case of SQL scripts generated by SSDT, this poses an issue, as the variables are stored as clear text. I know of no way to mask variables, and so there are three options: Do something clever involving KeyVault that may well get convoluted.
Hello! About 2 weeks ago I had just come off the back of a very busy weekend where some way or another I was still able to find time to take part in The Weakness Cup. I didn’t play the full quota of matches, but at any rate something was far better than nothing. I’ll post my team review sometime. But for now I thought I’d share some things I learned whilst taking part in this competition.
Aloha! (If you want to cut to the chase and just get the generic PowerShell that will find all possible combinations, this gist here is what you’re looking for. If however you want a bit of context then read on…) So, I’ve recently been working on a solution for automating SQL Agent Job deployments, and I came across a problem. In my solution, the constituent parts of a SQL Agent Job would be stored in an XML file.
Hello! The title is only half serious, but recently someone told me about Pi-hole, an ad blocker that runs on Raspberry Pi. This meant that I am able to change my DNS Server settings on each of my devices that use my home network, and now all ads are blocked. Neat! So I installed Raspbian Jessie Lite on a Pi I had spare from when it was running XBMC/Kodi/whatever it is called now, set up SSH and plugged it into a switch that runs off my Router.
Hello! I’ve been writing a PowerShell Module that will handle deploying SQL Server Agent Jobs by using the Agent namespace in SMO. But outside of work I’ve been very much into playing around with Linux: I’m slowly making my way through Linux From Scratch; I’ve got Pi-hole setup on a Raspberry Pi; and my laptop dual boots into Windows and Ubuntu. And the last time I booted into Windows was May.
Hello! So, there are many different product SKUs of SQL Server available, and generally when developing it’s probably best to use SQL Server Development Edition, even when it used to cost you money to get a product key. The disadvantage of using it though is that it is a large install. The team behind SSDT recognised this and so LocalDB was created, with the intention of using it as a very lightweight version of the database engine to develop against.
Lately I’ve been working on a deployment process for Integration Services Projects that makes use of the views and stored procedures in SSISDB to deploy folders, variables, ispacs etc etc. The reason why I’ve gone this route instead of any of the others is so that I can just use the System.Data.SqlClient DLL, which is part of .NET. So no extra moving parts required when deploying. And as part of the deployment I want to run the validate project stored procedure.
I’ll admit this post is a little off-piste, so let’s see how this works out. I’ve ruminated over this topic as a blog post that I have had in the back of my head for a couple of years now. And what with it being T-SQL-Tuesday, and having just come off the back of playing Minecraft for the first time in a while with my nephew, now is as good a time as any to write this post.
Aloha! It’s my birthday, and I’m also on holiday, so what better way to spend it than writing a blog post. In any Software Development Lifecycle Methodology, be it Scrum or Waterfall, there is a beginning and an end to the process, no matter how frequent those iterations are. That is to say, there is a process which relies on input from an upstream process before it can proceed. Generally it goes like this: the customer wants, BA’s spec, Devs code, Testers, umm, test, it gets released and in between that whole dev/test thingy is the non-trivial matter of deployment.
Hello! This here is more for the benefit of me than anyone else, but this is a super simple PowerShell script that checks connection to a SQL instance, as well as switching to a database. It references standard ADO.NET classes so nothing else needs to be installed on the box other than .NET. So this will even work on Linux, provided .NET Core is installed of course.
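A sketch of what such a script might look like (server/database names are placeholders): it opens a SqlConnection, then switches databases on the open connection.

```powershell
# Standard ADO.NET classes ship with .NET, so nothing extra to install.
# The connection string values here are placeholders.
$connString = 'Server=localhost;Database=master;Integrated Security=True;'
$conn = New-Object System.Data.SqlClient.SqlConnection $connString

try {
    $conn.Open()
    "Connected to $($conn.DataSource)"
    $conn.ChangeDatabase('tempdb')      # switch database on the open connection
    "Now using $($conn.Database)"
}
catch {
    "Connection failed: $($_.Exception.Message)"
}
finally {
    $conn.Dispose()
}
```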
Aloha! Someone emailed me with a problem they were having on compiling SSIS projects. Or rather, it’s a case of the projects not building. This was there output: “Rebuild All: 0 succeeded, 0 failed, 0 skipped Build succeeded. 0 Warning(s) 0 Error(s)” You see, when it comes to Visual Studio Project files, they’re essentially just a MSBuild file. And so when they’re built they’re compiled using MSBuild engine. Even those through Visual Studio.
Hello! Recently I needed to check that a variable exists in the PowerShell session currently running. This is actually far easier than it sounds. So here is a simple demo for how it works. The magic here is the “Test-Path variable:my_variable” on line 4. It tests that a variable of that name exists. If it does, great, let’s print out the value. If not, let’s alert that it doesn’t. The second example of this on line 11 will do exactly that.
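The trick described above can be demonstrated in a few lines (variable names here are just placeholders): `variable:` is a PSDrive, so Test-Path can probe it like a file path.

```powershell
$my_variable = 'hello'

# "variable:" is a PSDrive, so Test-Path can check whether a variable exists
if (Test-Path variable:my_variable) {
    "Value is: $my_variable"
}
else {
    "Variable my_variable does not exist"
}

# A variable that was never assigned fails the same check
if (Test-Path variable:not_defined) { 'exists' } else { 'does not exist' }
# does not exist
```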
Aloha! Recently I’ve revisited one of my GitHub projects, TurboLogShip. I’ve been wanting to write a presentation around what it does and why it’s actually really important and super useful for people using log shipping in SQL Server. Essentially it speeds up log shipping restores to a readable secondary to be as fast as they can be without upgrading hardware or software. And the idea of trying to do something for free when your hardware/software may not even be the problem is a pretty compelling reason to use it.
Hello! In case you missed it, a fortnight ago SQLBits was held at the Telford International Centre. Believe it or not this was in Telford. For those of you not in the know SQLBits is Europe’s largest community SQL conference. I suppose the name would suggest that the conference focuses on SQL Server. In fact it’s far more than SQL, it’s really more of the Microsoft Data Platform. Of course whilst there’s plenty of talks on SQL Server subjects such as Clustered Columnstore, SQL Azure and SQL Server on Linux, there’s plenty of talks on tools such as Power BI, Document DB, R Server….
Hello! I’ve returned from the dead: not literally, but I was so ill over the past week that I thought that as opposed to calling for a doctor I was going to have to call for an undertaker. But now I am better and ready to prod buttock. Anyway, some random musings over the Easter time. Zesty Zapus Is Glorious Ubuntu is still installed on my Surface Pro, and I’m enjoying it very much.
Hello! So, on another blog a long time ago, I wrote a couple of posts about foreign keys and what steps can be taken to suspend them when we are trying to load/delete data. There’s the elegant way and the brute force way. Hang On… “But aren’t they there to stop this sort of thing?!” I hear the sensible people cry. And yes, this is true. BUT! Say we are inserting static data into a database at the point of creation using a tool like SSDT, we may not care about what order the data is loaded in, just that it is in.
Hello! Non Fools Disclaimer I will preface this post with the following: despite the fact that this is a post on April Fools, and that the name of the post does say April Fools, there is no fooling in this post. There’s enough ways to waste time on the internet without reading made up stuff on my patch of it. My Number One SQL Server Feature Request If you were to ask me what new feature I would like to see most in SQL Server, I would immediately say “logless tables”, and maybe after a slight hesitation I’d say “logless databases”.
Aloha! If you’re here because you desperately need to go back a step template version and don’t have a copy saved anywhere except within Octopus, and you haven’t got time to read up on my two cents, scroll down to the “I Didn’t Come Here For A Lecture On Communism” section. But be warned; this is not a tried and trusted method, but a hack. I accept no responsibility for this not working or breaking your Octopus Deployment application.
This is hardly breaking news, but it’s important to know: using NOLOCK table hints in SQL Server can still cause blocking. Yep, you read that right. Given that the table hint is called NOLOCK, it’s counter-intuitive to what you’d instinctively think, but it’s fact, it’s by design, and also, there’s nothing wrong with it. Part of what’s wrong with NOLOCK can be summed up below: Reading: https://t.
If you need to configure the amount of memory available to an instance of SSRS you have to get your hands dirty and edit the RsReportServer.config file. The RsReportServer.config file stores settings that are used by Report Manager, the Report Server Web service, and background processing. The location of the file is generally “\Program Files\Microsoft SQL Server\MSRS1111.MSSQLSERVER\Reporting Services\ReportServer” Within the RsReportServer.config file, configuration settings that control memory allocation for the report server include WorkingSetMaximum, WorkingSetMinimum, MemorySafetyMargin, and MemoryThreshold.
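For illustration, those four elements in RsReportServer.config might look something like this (the WorkingSet values below are example numbers only, not recommendations; the two WorkingSet settings are in kilobytes, while the other two are percentages of WorkingSetMaximum):

```xml
<Configuration>
  <!-- Hard ceiling on report server memory, in KB -->
  <WorkingSetMaximum>4194304</WorkingSetMaximum>
  <!-- Minimum working set the report server keeps, in KB -->
  <WorkingSetMinimum>2097152</WorkingSetMinimum>
  <!-- % of WorkingSetMaximum at which memory-pressure reduction begins -->
  <MemorySafetyMargin>80</MemorySafetyMargin>
  <!-- % of WorkingSetMaximum at which pressure is treated as high -->
  <MemoryThreshold>90</MemoryThreshold>
</Configuration>
```

Note that WorkingSetMaximum and WorkingSetMinimum are not present in the file by default; you have to add them yourself, and the service needs a restart to pick up the change.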
Aloha! Late last week I decided to set up an Ubuntu box in Azure with the intention of installing SQL Server for Linux and VS Code on the same box. VS Code is necessary because there is no Management Studio shipped with SQL Server for Linux (command line only, see?) I know, I could use the command line, but I’m lazy and there’s an extension for VS Code called mssql, which funnily enough is exactly what you think it is.
I’m hoping to critique Pokémon in terms of their suitability for competitive battling in occasional blog posts. I’ve made a rod for my own back by having a blog, and I don’t want to feel compelled to post every week, which a “Pokémon of the Week” series would demand. So I thought that random entries into my misadventures of losing way more than I win in Battle Spot with Pokémon I like would perhaps be interesting, if not informative…
Hello! If you know me you know that I love Pokémon. It is my favourite video game series going today. It’s been going for 21 years now, and within the game there is a lot of data that clearly sits within a database. And being a database guy, I’ve always wondered how the database would be structured, and how the changes to the game would be incorporated. Long ago, a guy called Veekun created a copy of this database and posted it on GitHub.
Hello! Sitting here waiting for a Runbook in Azure to run whilst waiting for a deployment to run in Octopus. So might as well go meta and post some thoughts. Brand BZZZT! First, a bit of a re-brand, including even my Twitter handle, which was never great. The inspiration for the name is a soft toy of my kids. See You @ SQLBits Second, I’ve bagged myself full access to SQLBits.
Well, it only took me, on and off, 8 months, but I finally completed the Duolingo French course. For those of you not ITK, Duolingo is a very cool and very free (advertising notwithstanding) tool that aims to gamify learning a new language. It also aims to get people to translate simple statements in documents. Amazingly, Duolingo is valued at nearly half a billion dollars! I can’t say that I am now 40% fluent, despite what the app tells me, but I’m certainly better at reading and writing French, and I can listen OK.
Hello! Earlier this week the Training Days and Agendas for Friday and Saturday were released for SQLBits. I will be there on the Friday and Saturday, which means I’ll get to enjoy the party! Last year’s party was nuts in that there was a lot to do and lots of free food, and even free alcohol! Take a look on the media page of the SQLBits Twitter account to get an idea of the conference.
Hello! So, recently at work I have been working on automating SSIS Builds using DevEnv, but called through MSBuild. Fortunately I wrote a blog post about this on another blog a long time ago, so was able to retrieve the info easily enough. Trouble is, the sample code I had on the script had some formatting issues. See all those &quot; entities? You see, sometimes double quotes are formatted as the HTML &quot; entity.
Hello! And a belated Happy New Year! Occasionally there is a need to get a dll out of the GAC and onto the file system: sometimes it’s useful to load one directly into PowerShell with Add-Type from a folder rather than from the GAC. And I needed a way of getting A LOT of dlls out of the GAC, so here is the command line to extract them.
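One way to do a bulk extraction, assuming the default Windows paths (C:\GACDump is a made-up target folder for this sketch): the Explorer view of C:\Windows\assembly hides the real folder structure, but plain cmd tools see straight through it, so xcopy can sweep the lot out in one go.

```bat
REM Copy every dll out of the .NET 2.0/3.x GAC into a flat dump folder.
REM /s = include subdirectories, /y = no overwrite prompts, /i = treat target as a folder
xcopy "C:\Windows\assembly\GAC_MSIL\*.dll" "C:\GACDump" /s /y /i
```

Note that .NET 4 assemblies live under C:\Windows\Microsoft.NET\assembly instead, so you may need a second pass over that path.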
I love Christmas, and I love Christmas films. One of my favourites is “Elf” with Will Ferrell. I know that the world and his wife have seen it, but for those of you that haven’t: The plot revolves around Will Ferrell’s character Buddy coming to terms with the fact that he is not an Elf, but is a human adopted by his Elfish father, played brilliantly by Bob Newhart, and goes on a journey in New York at Christmas time to find his biological dad.
Prior to him discovering he is human, Buddy’s apparent incompetence in the toy-making department sees him transferred from there to the testing department, where the “special” elves go. And this I always felt was a good analogy to how the relationship between testers and developers is seen in the IT Industry.
Today the pre-cons for SQL Bits 2017 were made public, and what a diverse list there is. Increasing from 24 last year to 32, there are plenty of sessions to choose from on the Training Days. Naturally I will be there for the full week. If you’ve never been, and you live in the UK, or even Europe, be sure not to miss out. The cost is very reasonable, there is a party on the Friday night, and the Saturday is free to attend.
..This guy! If the party season has you all partied out, and the train strikes have not put you off coming into London, why don’t you join me at the Impact Hub King’s Cross where I will be talking about NuGet packages and their uses, specifically pertaining to the Microsoft.Data.Tools.Msbuild NuGet package released by Microsoft, and the PowerShell.SqlServer.Modules package published by yours truly.
Microsoft’s announcements during the Keynote yesterday were like buses: you wait for one, then 17 appear all at once! In case you don’t have 2 1⁄2 hours spare to watch yesterday’s Keynote, I made some notes: Microsoft’s development ideology is: any developer, any app, any platform. GitHub is growing; last year there were 5,000 new projects per day, now it’s 10,000. Contributors are also growing: last year 1,000 per day, now it is 2,000 per day.
Although in reality the notion that “Microsoft have gone open source” is hardly a news bulletin, today in the Connect(); keynote Microsoft showed just how far they have come by announcing that they are now an official Platinum Partner of the Linux Foundation. This is another major milestone for a company that, according to Rob Mensching, never really “understood what the Open Source community was really about”. In other news, Visual Studio Code (Microsoft’s open source code editor) continues to grow apace with more users (1 million active users) and more plugins written, including one for SQL.
PowerShell is ten years old today. If you work with Microsoft products you’ll be familiar with PowerShell. It’s almost impossible not to use PowerShell. You’d have to be pretty belligerent to ignore PowerShell. “PowerShell is so powerful, because I am a deeply flawed human being” - Jeffrey Snover If you know me you know I love PowerShell. I truly believe that anything can be achieved by using PowerShell and a well-written API.
Today I’m going on a mini rant about people abusing PowerShell, specifically to do with naming PowerShell functions. If you’re going to take the trouble to write PowerShell functions and add them to a module for others to have access to, please, don’t be a jerk: use a verb from the approved verbs list that is freely and readily available on the MSDN Developers site, or even by typing “Get-Verb” in the console.
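Checking a verb before you commit to a function name takes seconds (the Fetch/Get example below is mine, picked to show a non-approved verb next to its approved replacement):

```powershell
# List every approved verb, with the group it belongs to
Get-Verb

# Thinking of writing Fetch-Widget? Check the verb first.
# 'Fetch' returns nothing, so the approved name is Get-Widget.
Get-Verb -Verb Fetch
Get-Verb -Verb Get
```

As a bonus, functions exported from a module with non-approved verbs trigger a warning at Import-Module time, so your users will see the mess even if you don’t.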
Earlier this year, I had a wood burning stove installed in my house. I’ve always wanted a real fire, but growing up we only ever had the electric type. So I was really pleased to finally get a proper fire. They’re pretty to look at, but wood burning stoves also generate a lot of heat. You can burn coal, but I’m a bit of an environmentalist and so I’ll stick to burning wood.
OK, so here we are again, writing my 4th ever “Hello World” type blog post. TL;DR - the last blog’s content was merged with another blog, and I had the opportunity to restart making content again. And so this blog is that aim realised. In the coming days I will post more and get into why I started yet another blog. But for now, hello, good evening, and welcome.