Category Archives: Build

Publishing samples with Team Build and TFS 2010

It’s common when writing an API to publish a set of samples. This is a great way to give users an idea of the intended usage of your API. Samples intended for publishing are essentially another production application you need to develop, and they should therefore go through the same quality checks and processes as the rest of your code, including version control.

This raises a couple of challenges once you start to look at how you publish these samples. The issue is that a server-based version control system needs some way to track client-side changes, and with most popular systems this involves the presence of extra files around the samples directories. As a consumer of the samples, the presence of these files in the published artefacts is far from ideal. I really don’t want the samples trying to connect to your remote server, or even to a local instance of a version control provider I might not have installed.

So how do we break the dependency on the source control provider during the packaging of these samples for publishing? We could manually go through and delete all of the binding mechanisms, but that would expose us to the following types of waste:

  • Defects – manual processes are more prone to defects
  • Waiting – builds need to wait on the samples to be prepared for packaging
  • Extra Processing – extra steps to make our packaging pick up our manually cleansed samples

We can avoid most of this waste by automating the process. So, if it can be automated, how do we do it?


Download the following script. The script was originally written by Damian Maclennan, and essentially removes the TFS source binding files, along with the solution and project elements that reference the TFS instance. My fork adds the ability to decide whether you want the files backed up before the version control sections are removed. I usually choose not to: I don’t want the backup files in the output, and the actual versions are in version control if I need them. In addition you may want to do the following:

  • Create a batch file wrapper for the script using the approach defined by Jason Stangroome
  • Commit the script to a path in your repository that is part of your build workspace with the same name as the PowerShell script
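
The removal the script performs boils down to three things: deleting the binding marker files, cutting the version control section out of solution files, and cutting the Scc* elements out of project files. Here’s a rough sketch of that logic in Python (illustrative only — the actual script is PowerShell; the file extensions and section names below are the standard TFS 2010 binding format):

```python
import os
import re

def strip_tfs_bindings(root):
    """Remove TFS source control bindings from everything beneath `root`."""
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            if name.endswith(('.vssscc', '.vspscc')):
                # Binding marker files: just delete them.
                os.remove(path)
            elif name.endswith('.sln'):
                # Solutions: cut the version control global section.
                _rewrite(path,
                         r'\tGlobalSection\(TeamFoundationVersionControl\)'
                         r'.*?\tEndGlobalSection\n',
                         flags=re.DOTALL)
            elif name.endswith(('.csproj', '.vbproj')):
                # Projects: cut the Scc* binding elements.
                _rewrite(path, r'\s*<Scc\w+>.*?</Scc\w+>')

def _rewrite(path, pattern, flags=0):
    with open(path) as f:
        text = f.read()
    with open(path, 'w') as f:
        f.write(re.sub(pattern, '', text, flags=flags))
```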

First Step: Create a custom build template

  • Set up a build to create your packages, and when you get to choosing the process template, clone the default template into a new template
  • Check the new template into source control

Second Step: Set up a folder to publish to

  • Open your cloned template and navigate to the point at which MSBuild is invoked with the ‘Build’ target
  • Drop in a new sequence activity and name it ‘Publish Samples’

  • Create an argument named ‘SampleDropFolder’.
    • This will be the configurable name for the folder placed in the output directory with our samples inside.
    • We’ll talk about how to surface this on the build configuration dialogue later.

  • Create a variable to hold the full path to the output folder.
    • I’ve named mine ‘SamplesDropFolder’, though that may be a bit close to the argument name for your liking.
    • I default it to a combination of the outputDirectory variable, specified in the scope of the Compile and Test activity, and the SampleDropFolder argument we specified previously.

  • Open up your ‘Publish Samples’ sequence activity and drop in a ‘Create Directory’ activity.
    • Configure it with the ‘SamplesDropFolder’ variable we set up in the last step.
    • This will set up a root directory we can copy all our samples to, and makes it easy to run our binding removal script later on.
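
For the record, the default value I give the ‘SamplesDropFolder’ variable is a VB expression along these lines (names as above; outputDirectory comes from the default template):

```vb
System.IO.Path.Combine(outputDirectory, SampleDropFolder)
```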

Third Step: Copy the samples

Now we’ve got our directory, we need to work out what we want to put in it. In most cases, we’ll have more than a single set of folders to move, so we need to put some smarts around how we identify our targets.

  • First create an Argument called ‘Samples’ and configure it to be of the type String[].

  • Drop a ForEach activity into the ‘Publish Samples’ sequence and configure it to iterate over our Samples argument we just created.
  • Add a variable to contain the local path of the sample directory we’re currently working with as a string with the name ‘sampleLocalPath’
  • Inside the ForEach activity, drop in a ‘ConvertWorkspaceItem’ activity. This will take our server paths and work out the local paths of the directories for us. You’ll need to configure it as follows:

  • Drop in a ‘CopyDirectory’ activity to copy our sample directory from the source to the output directory. Your ForEach should now look something like this:
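
For reference, here’s roughly how I’ve configured the two activities inside the ForEach (the property names come from the standard workflow activities; the destination expression is my own, so adjust it to taste):

```
ConvertWorkspaceItem
  Direction:  ServerToLocal
  Input:      sample              (the ForEach iteration variable)
  Result:     sampleLocalPath
  Workspace:  Workspace

CopyDirectory
  Source:       sampleLocalPath
  Destination:  Path.Combine(SamplesDropFolder, New DirectoryInfo(sampleLocalPath).Name)
```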

Fourth Step: Remove the source control bindings

Now we’ve got our samples into the target directory, we need to strip out the source control bindings so our customers don’t try to connect to our server when they open the solutions.

  • Add an Argument to specify the server path to the batch file we checked in back at the start of the process.

  • Create a variable to house the local path of our binding removal script. I’ve named mine ‘SourceControlRemovalScriptLocalPath’.
  • Drop in another ‘ConvertWorkspaceItem’ activity after your ForEach activity. This will convert the argument we just created, holding the server path to our source binding stripping script, to its local path. It should be configured like this:

Note: While the variables look like they are the same item, they aren’t, I promise!

  • Drop in an ‘InvokeProcess’ activity after the ‘ConvertWorkspaceItem’ you just added. A little care is required when configuring this activity to ensure we get a reliable execution, so I’ll list out how I’ve configured each property.

Arguments: Microsoft.VisualBasic.Chr(34) + SamplesDropFolder + Microsoft.VisualBasic.Chr(34)

Display Name: Strip Source Control Bindings

File Name: Microsoft.VisualBasic.Chr(34) + SourceControlRemovalScriptLocalPath + Microsoft.VisualBasic.Chr(34)

Working Directory: BinariesDirectory

Any other properties remain unaltered from their default state

Your publish samples sequence activity should now look a little like this

Fifth Step: Surface the arguments

That’s nearly everything we need to do to support publishing samples into our output directory. However, we’ve set up a few arguments along the way to ensure our template is re-usable, and we now need to surface them to users via the build configuration dialog. To do this:

  • On the ‘Metadata’ argument for the build workflow, click the ellipsis. You should get a pop-up dialog
  • Configure the three arguments we’ve added:
    • The samples list
    • The samples drop folder name
    • The source control removal script path
    • Each should be configured in the editor as follows:

The only things that change across the parameters are the Display Name (the name we surface to the editor) and the Parameter Name (the name we gave the argument that corresponds to this parameter).

  • Check in the template

Final Steps: Configure the build!

Now we’ve got our template done, let’s go configure a build! The only real points of interest here are the custom parameters we set up on our way through, so we’ll focus on them – this is a long enough read already!

The points of interest are all on the process tab, so let’s skip there. If you expand your custom section you should see something like this:

All you need to do is fill in the values, so it looks more like this:

Once that’s done, kick off a build and you should be able to locate your samples, without the binding configuration in the drop directory of your build output!


In this article I’ve shown you how to create a reusable template for including useful samples in your build output. I’ve used this particular approach with a few customers, and what I particularly like about it is that we aren’t moving too far from the out-of-the-box activity set that comes with Team Build. This saves us overhead, and allows the template to be put together pretty quickly.


MSDeploy and TFS Deployer

Recently I was asked for my opinion on an article by a friend, Peter Gfader. In his article Peter talks about automating the creation of a deployment package using MSDeploy. Along a similar line, I am regularly asked for my opinion on the contrasts between TFS Deployer and MSDeploy. I’ve finally decided to distil these thoughts into a consistent expression of my opinion.

The approach taken by MSDeploy is very similar to the database deployment model used by the Visual Studio Database tooling. It uses the project state to create a set of configuration and artefact descriptors, which it can then compare to a target to generate a change script. This includes things like configuration setting transforms, artefact deployment, IIS setting changes and so on. There is no need to have two IIS instances to compare; as Peter points out quite well, you can simply have the settings packaged into a zip file with a specific format, which can then form the basis of a later deployment comparison.
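
For a concrete picture, creating and then applying such a package from the MSDeploy command line looks something like this (the site name, package path and server name here are invented for illustration):

```
msdeploy -verb:sync -source:iisApp="Default Web Site/MyApp" -dest:package="C:\drops\MyApp.zip"
msdeploy -verb:sync -source:package="C:\drops\MyApp.zip" -dest:auto,computerName=TargetServer
```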

The first difference between the two tools is that TFS Deployer does not attempt to create a deployment package. It assumes a deployable unit is generated as part of the standard build process, by whatever means is appropriate to the tool set used by your organisation. As a personal preference, I err towards the use of WiX and the Votive tooling, which allows me to put together an exact specification of the deployment unit I’d like. I have had many clients where this has not been appropriate, so I recognise the need for automation in this space and see Peter’s solution as quite useful for this purpose.

TFS Deployer operates primarily as a way of managing two important items that I feel MSDeploy does not yet deal with elegantly. The first is the automation of the deployment action as an operation separate from the build. As MSDeploy is initiated either via an argument supplied to the MSBuild engine or as a separate tool, it appears to encourage one of two approaches: either couple the build and deployment actions so that the deployment is completed immediately as part of each build (which I strongly discourage), or specify a build that does no more than call MSDeploy with the correct arguments to begin the deployment process. While the second approach is agreeable in so far as it separates build and deployment actions, it too has some issues that remain to be addressed.

The primary issue I have with creating a build to initiate a deployment is that to action a successful deployment you first need two pieces of information:

  • What am I deploying?
  • Where am I deploying to?

There are a couple of ways to deal with this using a separate build template. One is to define a build for each environment, which can quickly become annoying, especially without a meaningful grouping mechanism for build definitions. Another is to define a single build in which the two parameters above are injected at execution time by the build initiator, which I feel provides little value over manually initiating MSBuild itself, given the human error it introduces.

So where does TFS Deployer fit into this picture? The primary benefit of using TFS Deployer is that it takes the above pieces of information and makes them something you only need to think about once. By defining a trigger (in most cases a change in build quality) and a mapping from trigger to a specific target environment, TFS Deployer allows you to execute a deployment with a single click, rather than a click and some manual configuration.

The second benefit TFS Deployer provides is that it is built around the use of PowerShell. As the de facto server management technology within Microsoft operating systems, the use of PowerShell to script your deployment actions ensures that your deployments can be quickly understood by your operational teams as well as your development teams. Depending on your packaging choices, it can also help limit the amount of change needed on your target servers to support deployment, by using the PowerShell remote execution features to initiate the deployment on the target rather than needing a TFS Deployer instance on the machine itself.

So if these are the features, are the two products mutually exclusive? The answer is no, and nor should they be. Where MSDeploy has strengths you should utilise them, as described in Peter’s article. However, I say the same of TFS Deployer. Why attempt to bend or wrap the usage of MSDeploy with custom code or build definitions when TFS Deployer provides a simple and effective interface for managing both the trigger and target mappings for your product?

Australian ALM Conference Wrap Up

Last week I had the pleasure of speaking at the first annual Australian ALM Conference. My talk covered a “What’s new” of Team Build as well as the customisation experience for Team Build 2010. It was really great to see a large group of people interested in Team Build and I was really excited by some of the post-session questions.

If you were interested in obtaining the slide deck, or the sample code I wrote during the session the event organisers have posted them in a zip on the Agenda page of the Australian ALM site.

If you have any questions I didn’t get to answer during/after the session feel free to contact me on stephen.godbold at or tweet me on @SteveGodbold

Code Analysis in Team Build 2010

Code Analysis provides an executable set of rules that can be checked during a build to ensure standards and practices are being adhered to during development. This functionality is a great addition to a team build, and the tools required come as part of the Team Build Agent 2010 installation. Recently I’ve been doing some digging into how Code Analysis is set up and triggered during a build.

Configuring Code Analysis

The first step to having code analysis run as part of your team build is to configure the rule sets and the execution of analysis during the build of your project files. To do this, right click your project file and bring up the properties interface. Go to the Code Analysis tab and check the Enable Code Analysis on Build check box. This defines a constant that MSBuild will use to determine whether the analysis should be run. You then need to pick your rule set, which can be set in two places. Firstly, you can set it individually on each project you want to run analysis against.
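
Under the covers, checking that box writes a few MSBuild properties into the active configuration section of the project file, roughly like this (the rule set shown is just the default minimum rules set; your configuration condition may differ):

```xml
<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Debug|AnyCPU' ">
  <DefineConstants>DEBUG;TRACE;CODE_ANALYSIS</DefineConstants>
  <RunCodeAnalysis>true</RunCodeAnalysis>
  <CodeAnalysisRuleSet>MinimumRecommendedRules.ruleset</CodeAnalysisRuleSet>
</PropertyGroup>
```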

Project Properties

Configuring code analysis in project properties

This will get you started with Code Analysis, but once you’ve turned on the analysis, there must be a better way to get a view of the rule sets across the board for your solution, right? If you right click your solution file, open up the properties interface, and then click the Code Analysis Settings item, you’ll see something like this:

Configuring rule sets in solution properties


This will give you a nice view of the code analysis rule sets for all of the projects in your solution. Something to note here is that the rule settings can be chosen on a per configuration basis. You can also use the ‘All Configurations’ option to set them across the board.

Once you’ve enabled Code Analysis and picked a rule set you should be able to run a local build and see a set of warnings shown for those rule violations that exist in your code. We’ve now got code analysis in our local build!

Code analysis warnings

Code analysis warnings in local build

Configuring Team Build to execute code analysis

Having local code analysis configured is a great place to start. Adding support for executing the code analysis in Team Build is the next step on the road to ensuring consistency in code. Setting up code analysis is really quite simple in the default build template.

First open or create the build definition you’d like to run code analysis in. Once you’ve got the Build Definition screen open, head to the Process tab. On this tab, you should see a parameter in the Basic group titled ‘Perform Code Analysis’. The default value here is ‘AsConfigured’, which will ensure execution for those project files that specify the code analysis constant and have rule sets defined. You can also turn it on with an ‘Always’ setting, or off with a ‘Never’ setting.

Build Definition

Configuring code analysis in your build definition

How does code analysis get run?

There are a lot of good posts on how FxCop itself works, so I won’t cover that here. I also won’t cover writing custom rules or custom rule sets. What I will cover is how it gets called as part of your Team Build. My images here are based on the default template, so the activities used are all included in the standard install.

The settings for Code Analysis are passed into the executing build workflow as arguments. You can check these by opening the XAML, and clicking the ‘Arguments’ button at the bottom of the workflow designer.

Build Arguments

Build arguments on workflow designer

The list you see contains the arguments defined for the workflow as a whole. If you’d like to see the variables scoped to a specific activity, use the ‘Variables’ button next door.

If you start with a collapsed view (recommended) you’ll need to navigate down the activity tree to locate the Run MSBuild for Project activity, which is an instance of the MSBuild activity. If you click this activity and open your properties window, you’ll see that the activity accepts a ‘RunCodeAnalysis’ parameter, which is bound to the ‘RunCodeAnalysis’ argument specified for the build workflow.

Activity Configuration

Properties as configured for MSBuild activity

Pretty simple so far, right? Time for some reflectoring to see what actually occurs here. Once we’ve loaded the Microsoft.TeamFoundation.Build.Workflow assembly into Reflector, we can navigate through the Activities namespace and locate the MSBuild activity. Once there, we’ll take a look at what exists inside. What you’ll see is pretty much what you’ve seen in the properties dialogue in Visual Studio: lots of properties to allow the configuration of the build call. What you didn’t see in Visual Studio is the list of helper methods used inside the activity. There’s one particular method we’re interested in here, titled ‘GetFxCopPath’.

Reflected MSBuild Activity

Content of the MsBuild activity viewed in Reflector

If we disassemble this method, we’ll see that it checks a registry key to find the FxCop executable path and returns that to the caller.

GetFxCopPath Disassembled

GetFxCopPath Method Content

Digging a little deeper into the activity we find it utilises an internal code activity known as GetCommandLineArguments. This activity contains the following code snippet in its execute override:

GetCommandLineArgs Snippet

GetCommandLineArgs method content

This code sets up the execution of Code Analysis based on our build and project settings, and on the path to the FxCop executable that was located via the registry key earlier. From here on, code analysis runs as part of the normal MSBuild process, based on the settings we’ve provided.
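
The net effect is much the same as passing the relevant property straight to MSBuild yourself from a command line (project name invented for illustration):

```
msbuild MyProject.csproj /p:RunCodeAnalysis=true
```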

Known Issues

Unfortunately, due to a bug discovered very late in the release cycle, the install for Code Analysis does not execute on x64 build servers. This means you won’t see code analysis running despite correct configuration. The workaround for now is to install a Visual Studio SKU that includes Code Analysis on the build machine, to get the FxCop installation.


Making use of Code Analysis during a Team Build is a great way to help ensure a good level of consistency across your code base. Armed with this article, and some knowledge of how FxCop inspects code and applies rules you should now be ready to incorporate this useful tool into your automated build process.

Publishing Information to the Build Log in Team Build 2010

I’m currently working on a build activity for the Team Build 2010 contrib project. While debugging an issue I was having, I thought it would be great to push some information into the build log. There are already some great posts on customising build information on Patrick Carnahan’s blog (here and here), but I was looking for a simple way to write what was essentially a chunk of debug information into my log. I knew I had come across some extension methods for CodeActivityContext that made build information tracking much simpler, but couldn’t remember where they lived.

To access the CodeActivityContext extension methods, you need to first add a reference to the Microsoft.TeamFoundation.Build.Workflow assembly, then a using or import statement for the Microsoft.TeamFoundation.Build.Workflow.Activities namespace. By adding the reference and using/import statement you get access to TrackBuildMessage, TrackBuildError and TrackBuildWarning. These extensions give a quick entry point for basic build information logging and error reporting which I’ve found quite handy.
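
From inside a custom code activity, usage looks roughly like this (the activity and message strings are an invented example, and you should check the exact overloads in the assembly; the extension methods themselves are the ones named above):

```csharp
using System.Activities;
using Microsoft.TeamFoundation.Build.Workflow.Activities;

public sealed class PublishSamplesDiagnostics : CodeActivity
{
    protected override void Execute(CodeActivityContext context)
    {
        // Informational message pushed into the build log
        context.TrackBuildMessage("Copying samples to the drop folder");

        // Warnings and errors surface in the build summary
        context.TrackBuildWarning("No samples were configured");
        context.TrackBuildError("Binding removal script not found");
    }
}
```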

Team Build 2010 – Community Content

Something I regularly monitor in the “VSTS” space is the VSTS Feed Aggregator hosted by Accentient. It’s a great way to keep in touch with a group of MVP and team blogs without having to track down the feeds individually. One of the blogs I do subscribe to directly though is the Ed Squared blog. The product of Ed Blankenship and Ed Kisinger, the blog has recently seen an influx of content on Team Build 2010 from Ed B, including this morning’s post, which aggregates a bunch of great Team Build posts.

Also hidden in there is a link to the Team Build 2010 Contrib project on CodePlex. The intent of this project is to be:

“a place for build engineers to share workflow activities, build processes, and tools for Team Build 2010.”

The project is currently empty, but the call is out for contributors and content. I’ve signed up as a contributor, and have a couple of tasks in mind. If you’ve played with Team Build 2010 and have written some custom workflow activities, have a request for an activity, or would like to add your expertise to the project, why not head over to the site and either put in a request or get in touch with one of the coordinators and offer your assistance!

Gated Check In Build Fails – File Locked


The process for initiating a gated check-in containing one or more locks has been improved for RTM!

The default is now to display the options expander, with the ‘Preserve my Pending Changes’ check box unchecked and greyed. A tooltip is supplied on hover for the check box which states “The pending changes being checked in contain one or more locks. The build service cannot unshelve or check in your changes if your local changes are preserved.”.

The following therefore only applies to Beta 2 of TFS 2010.


Working with gated check-ins can save you time by verifying your changes before committing them to the repository. This is great in terms of keeping your code base healthy and enabling team development. Recently I’ve seen teams start to use this feature and struggle with a consistently breaking gated check-in build. What I noticed was common across these builds was the cause – a locked file.

Check In Build - Locked File Failure

Gated Check In Build - Locked File Failure

This occurs when your check-in involves a binary file, which TFS automatically forces into a locked check-out. This means that when your gated build goes to ‘get’ the file into its build workspace, it clashes with the lock you hold locally.

The fix here is to un-check the “Preserve my pending changes locally” option on the gated build initiation screen. This will undo your local changes and allow the build to get the binary into its workspace. The thing to remember is that if the build breaks, you’ll need to know the name of the shelveset to get it back down locally and fix the issue. Luckily it’s right there on the build initiation screen for you, and you can either screenshot it or copy it out for later.

Gated Check In Build

Your other option is to use the dialog that pops up with the build results to unshelve the changes…

Gated Check In Results Dialog

Results Dialog