Continuous Delivery Lyrics – DDD Melbourne

At the closing of DDD Melbourne last week I fulfilled a bet that I lost by kickin a rhyme to wrap up the conference. I’ve had a few requests for the lyrics since – so here they are! If you’d like to hear my live rendition click here, or if you’re really keen to see me make a fool of myself check YouTube.

Warning: I am not a vocalist in any way, shape or form. Subject your ears to it at your own risk!

————————————–

continuous delivery is what I spit
new words on old beats for the instant hit
got j5 here rockin on my microphone
so tweet me up real quick if you like the tone

coming to ya straight outta the S Y D
is the Readify consultant with the funky beat
left right combination fast bringin the heat
deadly like Jackie Chan and Chuck Norris feet
lyrical roundhouse put ya back into ya seat
now check the flavour from my DJ while I chill on this beat

If you only knew, the trials and tribulations we been through
then you’d know, CI’s for everyone and that means you
if you only knew, broken checkins they make me spew
then you’d know, the chicken dance is what you gotta do

got delivery pipelines flowing out hot rhymes
yall be actin like you ain’t got no time
for this charismatic character who is number one
rockin hard live for fun, in the burn city sun
now you think i’m done, but I’ve only just begun
so get ya toes wet and come get some

if you only knew, the trials an tribulations we been through
then you’d know, DI’s for everyone and that means you
if you only knew, resolution’s a hard thing to do
then you’d know, IOC’s what you gotta do

building on the back of hard work
cheeky smile, smart smirk
black hoodie, dark shirt
stay sharp or get burned
when I drop this verse like it hurt
then come back around and spin it all in reverse

– INTERLUDE –

if you only knew, the trials an tribulations we been through
then you’d know, simple is for everyone and that means you
if you only knew, complexity can get you screwed
then you’d know, DDD’s what you gotta do

before I give it up, give it up for your hosts
who spent the whole damn Sat’day keepin yall engrossed
to the dudes with most, I’d like to raise a toast
I wanna make it clear, these fellas right here

better get a few free beers, in the bars tonight
from the crowd, or myself just to set it right
for the conference of the year, run so damn tight
with a setup and speakers that were outta sight
winding up with a hot track and beats that bite
to get us goin for a party way into the night

if you only knew, the trials an tribulations we been through
then you’d know, we’re real people homie just like you
if you only knew, a conference ain’t easy to crew
then you’d know, buyin beer is what you gotta do

Publishing samples with Team Build and TFS 2010

It’s common when writing an API to publish a set of samples. This is a great way to give users an idea of the intended usage of your API. Samples intended for publishing are essentially another production application you need to develop, and they should therefore go through the same quality checks and processes as the rest of your code, including version control.

This can raise a couple of challenges once you start to look at how you publish these samples. The issue is that a server-based system needs some way to track client-side changes, and with most popular version control systems this means extra files scattered around the samples directories. As a consumer of the samples, the presence of these files in the published artefacts is far from ideal. I really don’t want the samples trying to connect to your remote server, or even to a local instance of a version control provider I might not have installed.

So how do we break the dependency on the source control provider when packaging these samples for publishing? We could manually go through and delete all of the bindings, but that would expose us to the following types of waste:

  • Defects – manual processes are more prone to defects
  • Waiting – builds need to wait on the samples to be prepared for packaging
  • Extra Processing – extra steps to make our packaging pick up our manually cleansed samples

We can avoid most of this waste by automating the process. So how do we automate it?

Prerequisites

Download the following script: https://gist.github.com/967976. The script was originally written by Damian Maclennan, and essentially removes the TFS source binding files along with the solution and project elements that reference the TFS instance (a simplified sketch of what it does follows the list below). My fork adds the option to back the files up before the version control sections are removed. I usually choose not to – I don’t want the backup files in the output, and the originals are in version control if I ever need them. In addition you may want to do the following:

  • Create a batch file wrapper for the script, with the same name as the PowerShell script, using the approach defined by Jason Stangroome at http://bit.ly/kmyfY2
  • Commit the script to a path in your repository that is part of your build workspace
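
If you just want a feel for what the script does before you grab it, here’s a heavily simplified PowerShell sketch. This is an illustration only – the file patterns are real TFS binding artefacts, but the real script does a more thorough job on the solution and project contents – so use the gist for anything serious:

    # Illustrative only - see the gist above for the real script.
    param([string]$Path)

    # TFS scatters binding files through the tree; remove them all.
    Get-ChildItem $Path -Recurse -Include *.vssscc, *.vspscc | Remove-Item -Force

    # Strip the TFS source control section out of each solution file.
    Get-ChildItem $Path -Recurse -Include *.sln | ForEach-Object {
        $content = [IO.File]::ReadAllText($_.FullName)
        $pattern = '(?s)\s*GlobalSection\(TeamFoundationVersionControl\).*?EndGlobalSection'
        [IO.File]::WriteAllText($_.FullName, ($content -replace $pattern, ''))
    }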

First Step: Create a custom build template

  • Set up a build to create your packages, and when you get to choosing the process template, clone the default template into a new template
  • Check the new template into source control

Second Step: Set up a folder to publish to

  • Open your cloned template and navigate to the point at which MSBuild is invoked with the ‘Build’ target
  • Drop in a new sequence activity and name it ‘Publish Samples’

  • Create an argument named ‘SampleDropFolder’.
    • This will be the configurable name for the folder placed in the output directory with our samples inside.
    • We’ll talk about how to surface this on the build configuration dialog later.

  • Create a variable to hold the full path to the output folder.
    • I’ve named mine ‘SamplesDropFolder’, though that may be a bit close to the argument name for your taste.
    • I default this to a combination of the outputDirectory variable specified in the scope of the Compile and Test activity and the SampleDropFolder argument we specified previously (the exact expression is shown just after this list).

  • Open up your ‘Publish Samples’ sequence activity and drop in a ‘Create Directory’ activity.
    • Configure it with the ‘SamplesDropFolder’ variable we set up in the last step.
    • This will set up a root directory we can copy all our samples to, and makes it easy to run our binding removal script later on.
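
For the default value on ‘SamplesDropFolder’ I use an expression along these lines (variable names as per my template – adjust them to match yours):

    System.IO.Path.Combine(outputDirectory, SampleDropFolder)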

Third Step: Copy the samples

Now we’ve got our directory, we need to work out what we want to put in it. In most cases, we’ll have more than a single set of folders to move, so we need to put some smarts around how we identify our targets.

  • First create an Argument called ‘Samples’ and configure it to be of the type String[].

  • Drop a ForEach activity into the ‘Publish Samples’ sequence and configure it to iterate over our Samples argument we just created.
  • Add a string variable named ‘sampleLocalPath’ to hold the local path of the sample directory we’re currently working with
  • Inside the ForEach activity drop a ‘ConvertWorkspaceItem’ activity. This will take our server paths and work out the local paths of the directories for us. You’ll need to configure it to convert from server to local paths (a sketch of the configuration follows this list).

  • Drop in a ‘CopyDirectory’ activity to copy our sample directory from the source to the output directory. The sketch below shows how I’ve wired both activities inside the ForEach.
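
For those reading without the screenshots, the gist of the configuration is as follows. Property names come from the stock TFS 2010 activities; ‘sample’ is what I’ve called the ForEach iteration variable, ‘Workspace’ is the default template’s workspace variable, and dropping each sample into its own subfolder is simply my choice:

    ConvertWorkspaceItem
      Input: sample
      Result: sampleLocalPath
      Direction: ServerToLocal
      Workspace: Workspace

    CopyDirectory
      Source: sampleLocalPath
      Destination: System.IO.Path.Combine(SamplesDropFolder, System.IO.Path.GetFileName(sampleLocalPath))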

Fourth Step: Remove the source control bindings

Now we’ve got our samples into the target directory, we need to strip out the source control bindings so our customers don’t try to connect to our server when they open the solutions.

  • Add an Argument to specify the server path to the batch file we checked in back at the start of the process.

  • Create a variable to house the local path of our binding removal script. I’ve named mine ‘SourceControlRemovalScriptLocalPath’
  • Drop in another ‘ConvertWorkspaceItem’ activity after your ForEach activity. This will be used to convert the server path argument we just created to the local path of our source binding stripping script. It should be configured like this (see the sketch after the note below):

Note: While the variables look like they’re the same item, they aren’t, I promise!
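
Again in text form, assuming the server path argument from two steps back was named ‘SourceControlRemovalScriptPath’ (that name is my invention – use whatever you called it):

    ConvertWorkspaceItem
      Input: SourceControlRemovalScriptPath
      Result: SourceControlRemovalScriptLocalPath
      Direction: ServerToLocal
      Workspace: Workspace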

  • Drop in an ‘InvokeProcess’ activity after the ‘ConvertWorkspaceItem’ you just added. A little care is required when configuring this activity to ensure reliable execution, so I’ll list out how I’ve configured each property.

Arguments: Microsoft.VisualBasic.Chr(34) + SamplesDropFolder + Microsoft.VisualBasic.Chr(34)

Display Name: Strip Source Control Bindings

File Name: Microsoft.VisualBasic.Chr(34) + SourceControlRemovalScriptLocalPath + Microsoft.VisualBasic.Chr(34)

Working Directory: BinariesDirectory

Any other properties remain unaltered from their default state. (The Chr(34) calls simply wrap each path in double quotes so spaces don’t break the command line.)
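
One optional tweak while you’re in there: InvokeProcess exposes handlers for the process output streams, and dropping the standard build message activities into them gets the script’s output into your build log. With the designer’s default handler variable names (stdOutput and errOutput, if memory serves) that looks like:

    Handle Standard Output: WriteBuildMessage with Message = stdOutput
    Handle Error Output: WriteBuildError with Message = errOutput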

Your ‘Publish Samples’ sequence activity should now look a little like this:

Fifth Step: Surface the arguments

That’s nearly everything we need to do to support publishing samples into our output directory. However, we’ve set up a few arguments along the way to keep our template re-usable, and we now need to surface them to users via the build configuration dialog. To do this:

  • On the ‘Metadata’ argument for the build workflow, click the ellipsis. You should get a pop-up dialog
  • Configure the three arguments we’ve added as follows
    • The samples list
    • The samples drop folder name
    • The source control removal script path
    • These should be configured in the editor as follows:
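
In text form, the three entries look roughly like this (the display names are simply what I picked, and the script path parameter name is again my assumed name from earlier):

    Parameter Name: Samples – Display Name: Samples to Publish
    Parameter Name: SampleDropFolder – Display Name: Samples Drop Folder
    Parameter Name: SourceControlRemovalScriptPath – Display Name: Source Control Removal Script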

The only things that change across the parameters are the Display Name – the name we surface to the editor – and the Parameter Name – the name we gave the argument that corresponds to this parameter.

  • Check in the template

Final Steps: Configure the build!

Now we’ve got our template done, let’s go configure a build! The only real points of interest here are the custom parameters we set up on the way through, so we’ll focus on them – this is a long enough read already!

The points of interest are all on the process tab, so let’s skip there. If you expand your custom section you should see something like this:

All you need to do is fill in the values, so it looks more like this:

Once that’s done, kick off a build and you should be able to locate your samples, without the binding configuration, in the drop directory of your build output!

Conclusion

In this article I’ve shown you how to create a reusable template for including useful samples in your build output. I’ve used this particular approach with a few customers, and what I particularly like about it is that we aren’t moving too far from the out-of-the-box activity set that comes with Team Build. This saves us overhead, and allows the template to be put together pretty quickly.

Auckland .NET Users Group

Are you an NZ native? From Auckland? Are you interested in .Net development? Then you should really check out the Auckland .Net Users Group! It’s kicking off on the 24th of this month with a great session about Behaviour Driven Development (or BDD for short) and Team Foundation Server. This is a great way to get your toes wet with a set of .Net BDD frameworks and understand how you can integrate them into your development workflow. It’s also a good chance to get along and heckle – I mean hear – Rob Maher, whom I can highly recommend as a speaker.

If you’re interested in the details or to RSVP head over to http://aucklandnetusergroup.groups.live.com/

The short:

Who: Auckland .Net User Group

What: BDD Overview

When: 1730, Tuesday 24th May

Where: IAG – 1 Fanshawe St, Auckland

Speaking at the New Zealand ALM Conference

It’s getting to that time of year again where everyone wakes fully from their Christmas/New Year slumber. Work is busy, and events and conferences start to occur on a reasonably regular basis.

Last year saw the inauguration of a new conference in both Australia and New Zealand focusing on application life cycle management. I was lucky enough to nab a speaking spot, and apparently did enough to convince the organisers that I was worthy of presenting again this year. The difference this year is that I’ll be heading ‘across the pond’ to Wellington to present at the New Zealand conference as well! I’ll be presenting on a topic starting to get some traction within ALM, known as continuous delivery. What’s continuous delivery? You’ll have to attend my session to find out!

The conference runs on Wednesday the 6th and Thursday the 7th of April. You can get all the details on the other speakers, the swag available and how to get your hands on a ticket over at the conference site.

Hopefully I’ll see you there!

MSDeploy and TFS Deployer

Recently I was asked for my opinion on an article by a friend, Peter Gfader. In his article Peter talks about automating the creation of a deployment package using MSDeploy. Along similar lines, I’m regularly asked for my opinion on the contrasts between TFS Deployer and MSDeploy. I’ve finally decided to distil these thoughts into a consistent expression of my opinion.

The approach taken by MSDeploy is very similar to the database deployment model used by the Visual Studio Database tooling. It uses the project state to create a set of configuration and artefact descriptors, which it can then compare to a target to generate a change script. This covers things like configuration setting transforms, artefact deployment, IIS setting changes and so on. There is no need to have two IIS instances to compare; as Peter points out quite well, you can simply have the settings packaged into a zip file with a specific format, which can then form the basis of a later deployment comparison.

The first difference between the two tools is that TFS Deployer does not attempt to create a deployment package. It assumes a deployable unit is generated as part of the standard build process, by whatever means is appropriate to the tool set used by your organisation. As a personal preference, I err towards the use of WiX and the Votive tooling, which lets me put together an exact specification of the deployment unit I’d like. I have had many clients where this has not been appropriate, though, and I recognise the need for automation in this space – Peter’s solution is quite useful for this purpose.

TFS Deployer operates primarily as a way of managing two important concerns that I feel MSDeploy does not yet deal with elegantly. The first is the automation of the deployment action as an operation separate from the build. As MSDeploy is invoked either via an argument supplied to the MSBuild engine or as a separate tool, it appears to encourage one of two approaches: either couple the build and deployment actions so that deployment happens immediately as part of each build (which I strongly discourage), or define a build that does no more than call MSDeploy with the correct arguments to begin the deployment. While the second approach is agreeable in so far as it separates build and deployment actions, it too has issues that remain to be addressed.

The primary issue I have with creating a build to initiate a deployment is that to action a successful deployment you first need two pieces of information:

  • What am I deploying?
  • Where am I deploying to?

There are a couple of ways to deal with this using a separate build template. One is to define a build for each environment, which can quickly become annoying, especially without a meaningful grouping mechanism for build definitions. Another is to define a single build in which the parameters specifying the two items above are injected at execution time by the build initiator, which I feel provides little value over manually invoking MSBuild itself, given the human error factors it introduces.

So where does TFS Deployer fit into this picture? The primary benefit of using TFS Deployer is that it takes the above pieces of information and makes them something you only need to think about once. By defining a trigger – in most cases a change in build quality – and a mapping from trigger to a specific target environment, TFS Deployer allows you to execute a deployment with a single click, rather than a click and some manual configuration.

The second benefit TFS Deployer provides is that it is built around the use of PowerShell. PowerShell is the de facto server management technology on Microsoft operating systems, so scripting your deployment actions with it ensures that your deployments can be quickly understood by your operations teams as well as your development teams. It can also – depending on your packaging choices – help limit the amount of change needed on your target servers to support deployment, by using PowerShell’s remote execution features to initiate the deployment on the target rather than needing a TFS Deployer instance on the machine itself.
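
To make that concrete, here’s the shape of a deployment script I might wire up to a TFS Deployer mapping. Everything in it – the parameter plumbing, the server name and the package name – is illustrative rather than lifted from a real setup:

    # Illustrative only - names and parameter plumbing depend on your setup.
    param([string]$DropLocation)

    # Remoting means the target box needs nothing installed beyond
    # PowerShell remoting itself - no TFS Deployer instance required.
    Invoke-Command -ComputerName web01 -ScriptBlock {
        param($drop)
        # Install the package the build produced (a WiX-built MSI in my case).
        Start-Process msiexec.exe -ArgumentList "/i `"$drop\MyProduct.msi`" /qn" -Wait
    } -ArgumentList $DropLocation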

So if these are the features, are the two products mutually exclusive? The answer is no, and nor should they be. Where MSDeploy has strengths you should utilise them, as described in Peter’s article. I say the same of TFS Deployer: why attempt to bend or wrap the usage of MSDeploy with custom code or build definitions when TFS Deployer provides a simple and effective interface for managing both the trigger and target mappings for your product?

TFS 2010 Unboxing – Melbourne

After successful events in Sydney the TFS 2010 Unboxing comes to Melbourne! Check out the details below…

—-

Since 2005, Team Foundation Server (TFS) has been providing integrated version control, work management and build capabilities. The release of TFS 2010 builds on the foundations formed in the earlier 2005 and 2008 products and focuses on lowering the barrier of entry for teams wanting to get the maximum benefit from TFS with minimal implementation fuss.

If you’ve been thinking about adopting TFS, or are just interested to see what it could bring to your development effort, this is a great opportunity to experience what TFS 2010 has to offer. In this session you’ll see both me and our test partners at KJ Ross walk through everything to do with TFS: from installation and getting the basics of version control, build and work item tracking running, through to what’s possible with a fully integrated build, test and lab setup.

By the end of the half-day session you will:

  • be able to install and configure TFS to suit your needs
  • understand the benefits that SharePoint and SQL Reporting can bring as optional extras
  • know the role TFS plays in each of the SDLC disciplines and how it can add value

This is definitely an opportunity not to be missed!

The details:

What: TFS Unboxed – A journey through 2010

When: Monday the 2nd of August (morning and afternoon sessions available)

Who: Stephen Godbold (Readify), Dr Mark Pedersen and Dr Tafline Murnane (KJ Ross)

Where: Microsoft, Melbourne – Level 5, 4 Freshwater Place, Southbank

Cost: $75 (includes light refreshments)

Register for the morning session here or the afternoon session here

TFS Unboxing – Links

Thanks to everyone who attended the TFS 2010 Unboxing at Microsoft North Ryde. For those unable to attend – there is potential for the session to be hosted again in other cities around Australia. If you’d like a session in your city, the best way to have it happen is to either leave a comment here or contact me on stephen.godbold at gmail.com with an expression of interest.

Some of the links I promised to provide:

The PowerShell scripts I mentioned are not yet ready for release. For those interested, I’d suggest keeping an eye on Jason Stangroome’s blog – he’s got a couple of really useful scripts in the pipeline.

Thanks again to all of those that made it, hopefully I was able to give you a picture of what Team Foundation Server is capable of and how you can get the most out of the product!