Simplifying Web Package Deployment with TFS Build


In my last post several months ago I described some of the work we’ve been doing to automate web package deployment using MSDeploy and Team Foundation Server 2010 Build Server. In that post I introduced a number of “hard coded” customisations to the Default Template workflow script that is used to drive the TFS Build.

Recently I’ve been refining this script further to allow the reuse of the unmodified workflow scripts between projects through the use of XAML arguments, and thought I might share the output.

Parameterising the MS Deploy activity

This post will focus on modifications to the standard workflow template to allow for an MS Deploy activity to be invoked based purely on the build definition, rather than customising the workflow for each team project.

Step 1: Add parameters to the Default Template xaml file

I did take one shortcut at this stage: the ‘ideal’ approach to collecting the deployment parameter information would be to create a strongly typed object structure, then import that into the build process. I'm avoiding this for the moment, as we don't currently need any custom assemblies for the build process and I'd rather not add extra dependencies.

Therefore I have used the very limited approach of defining a single set of deployment arguments, as opposed to a nice and tidy collection of strongly typed objects we could loop through.

The parameters being added are as follows:

  • Deploy Package 1 (default false)
  • Deploy Package 1 Package Path
  • Deploy Package 1 Script
  • Deploy Package 1 Script Args
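In the template's XAML these translate to four new entries in the workflow's argument list, along the following lines (a sketch only; the argument names are my own choice, and the false default for the flag is supplied separately in the template's default-value section):

```xml
<x:Members>
  <!-- Flag controlling whether the deployment runs at all -->
  <x:Property Name="DeployPackage1" Type="InArgument(x:Boolean)" />
  <!-- Location of the web package, relative to the drop location -->
  <x:Property Name="DeployPackage1PackagePath" Type="InArgument(x:String)" />
  <!-- Name of the generated .deploy.cmd script -->
  <x:Property Name="DeployPackage1Script" Type="InArgument(x:String)" />
  <!-- Arguments passed to the script when it is invoked -->
  <x:Property Name="DeployPackage1ScriptArgs" Type="InArgument(x:String)" />
</x:Members>
```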

Step 2: Set up the metadata to group the deployment arguments

By default, all the added arguments will appear under the “Misc” category in the Build Definition Editor. Once you’ve added more than a couple of customisations this becomes fairly confusing, so it is important to set up the metadata as you go.

  1. Find the Metadata argument within the Arguments list
  2. Click the ‘…’ button to open the dialog editor
  3. Add a new entry for each new argument added
  4. Ensure that the Category is the same for each parameter to ensure they are grouped together
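Behind the dialog, each entry is stored as a ProcessParameterMetadata element within the template's Metadata property. A sketch for the first of the new arguments (the category name and description are just my choices; element names as they appear in the default template):

```xml
<this:Process.Metadata>
  <mtbw:ProcessParameterMetadataCollection>
    <!-- One entry per custom argument; a shared Category keeps them grouped -->
    <mtbw:ProcessParameterMetadata Category="Deployment"
                                   Description="Set to True to deploy the first web package."
                                   DisplayName="Deploy Package 1"
                                   ParameterName="DeployPackage1" />
  </mtbw:ProcessParameterMetadataCollection>
</this:Process.Metadata>
```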

Step 3: Modify the Invoke Process activity to use the arguments

For this demonstration I’ve chosen to trigger the deployment immediately after the “Try Compile, Test, and Associate Changesets and Work Items” block. This is convenient, since it is directly after the output directories are populated, though I am not yet sure that this step will stay here once the integration tests are plugged in.

The parameterised activity results in the following xaml being produced:

<If Condition="[DeployPackage1]" DisplayName="If Deploy Package 1">
  <Sequence DisplayName="Deploy Package 1" mtbwt:BuildTrackingParticipant.Importance="Low">
    <mtbwa:InvokeProcess Arguments="[DeployPackage1ScriptArgs]" DisplayName="Deploy Package 1"
                         FileName="[String.Format(&quot;{0}\{1}&quot;, BuildDetail.DropLocation, DeployPackage1Script)]" />
  </Sequence>
</If>

You’ll note that I’ve wrapped the old Invoke Process activity in a new If block – the primary purpose of this is to allow us to use this workflow even for projects that do not use the deployment features.

Note: Don’t forget to check in your xaml changes at this point – otherwise you will not be able to set any of your new arguments in the build definition editor!

Step 4: Set Up the Build Definition

Note that this assumes you have already prepared your destination server to allow for remote deployments. See my previous post on Using TFS Build Server for Continuous Integration for help with this configuration if necessary.

The final step is to setup the build definition.

  1. Create a new Build Definition
  2. On the Process tab, note that a new category for “Deploy Package 1” is now displayed
  3. Enter the path to your deployment package, relative to the output directory. Typically this is _PublishedWebsites\MyProject_Package.
  4. Enter the name of the generated command script file.
  5. Enter the arguments required for the command script file. Typically this is /y /M:ServerName /u:UserName /p:Password.
  6. Change the Deploy Package 1 flag to true.
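Put together, the settings above mean the InvokeProcess activity ends up running something equivalent to the following command (the server, drop and package names here are purely illustrative):

```shell
"\\buildserver\drops\MyBuild\MyBuild_20110101.1\_PublishedWebsites\MyProject_Package\MyProject.deploy.cmd" /y /M:ServerName /u:UserName /p:Password
```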

Step 5: Trigger a build, and watch it work!

What’s Next?

There are a number of changes that we have introduced to the standard workflow that I will endeavour to describe in my next few posts:

  • Cleaning up the Output Directory to include only packages
  • Running Integration tests post-deployment
  • Using transformations on the integration test app.config file to modify the target endpoints for testing

TFS Automated Deploy – Avoid MSDEPLOY Deleting Log Files

Last weekend I posted the details of how we are using TFS Build Server in conjunction with the Web Deployment Toolkit to enable automated build and deploy of some of our internal development environments.

One piece of feedback that I received from the previous post was that the configuration by default will delete all files in the deployment destination location. Normally this is perfectly acceptable behaviour – but in other cases less than ideal. Specifically, one of the solutions that we are building logs output to a logging folder within the application directory, and we do not want to lose the logs each time a build is deployed.

When deploying with the one-click publishing functionality in Visual Studio 2010, this is easily solved by selecting the “Leave Extra Files on Destination” option on the Publish dialog for a web application (the Publish dialog can be accessed by right clicking on the web application project in Solution Explorer, then selecting Publish). The solution is less obvious when using MSDEPLOY or the TFS 2010 build packages directly.

After some time searching through Google I came across a thread that describes the MSDEPLOY parameter -enableRule:DoNotDeleteRule. This parameter can be added to the Windows Workflow script used by TFS2010 for a build definition by modifying the InvokeProcess activity used in my previous post as follows (note the additional -enableRule argument):

Arguments="/y /M:YourServer /u:UserName /p:Password &quot;-setParam:'IIS Web Application Name'='IISSite/VirtualDirectory'&quot; -enableRule:DoNotDeleteRule"
DisplayName="Deploy Web Service"
FileName="[String.Format(&quot;{0}\YourProjectName\YourProjectName.deploy.cmd&quot;, BuildDetail.DropLocation)]">
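As an aside, the generated command script passes any extra arguments straight through to MSDEPLOY, so the same rule can be applied when running the script by hand (server name illustrative):

```shell
YourProjectName.deploy.cmd /y /M:YourServer -enableRule:DoNotDeleteRule
```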

First Look: Using TFS2010 for Continuous Integration


Any programmers reading this might remember what their first attempt at coding resulted in. Undoubtedly it lacked a lot of the finesse of something that you would write today. The output of my first real attempt at using TFS as a build server feels much the same as that first page of code many years ago. I’m sure there are many things that could be done more ‘correctly’, and for this reason I don’t really recommend this post as learning material for getting started with TFS Build.

The scenario I’m working towards is to get an existing solution being compiled and deployed in a pseudo continuous-integration setup. I say pseudo continuous-integration in this case, because I’m ignoring a number of key steps at this stage. Specifically, I’m not customising the configuration files for different environments, deploying the database projects, or looking at how to retarget the automated integration tests at the deployed server.

The VS2010 solution for this scenario contains three WCF-based web service projects (as well as a number of supporting assemblies), all targeting .NET 3.5. I’m making use of the Web Deployment packaging and deployment tools provided as part of Visual Studio.


  • Download and install the Microsoft Web Deployment Tool (or the 32bit equivalent) on the deployment machine (where the web services will ultimately be deployed).
  • Create the IIS Web Site on the deployment server. The Virtual Directories will be created automatically, but we do need the IIS Site in place to start with.
  • Create a TFS Build Agent on the TFS server (See MSDN article Create and Work with Build Agents for details)

Lesson Learned 1: The Web Deployment Tool does not install the Remote Agent Service by default. Be sure to select this option as part of the installation wizard. If the Remote Agent Service is not configured correctly, the deployment activities outlined later in this post will fail with the error message “The response header ‘MSDeploy.Response’ was ‘’ but ‘v1’ was expected”.

Lesson Learned 2: The Web Deployment Agent Service is configured to start manually by default. It is necessary to reconfigure this to start automatically.
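This can be done from the Services console on the target machine, or from an elevated command prompt (assuming the default service name of MsDepSvc for the Web Deployment Agent Service):

```shell
sc config MsDepSvc start= auto
sc start MsDepSvc
```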

Lesson Learned 3: If you use the default Working Directory for your build agent then the path will include the full name of your TFS project. This can cause build issues: I have seen instances where the combined length of the file name and path exceeded the maximum (which surprisingly is only about 260 characters). To avoid this I’ve used the Build Definition ID rather than the Build Definition Path parameter as part of my Working Directory setup.

Basics First – Building the Solution

Set up a new definition by right clicking on the Builds node within Team Explorer for your Team Project and selecting “New Build Definition”.

You will need to select a UNC path for the output location on the Build Defaults tab, but aside from that the only configuration we need to worry about for now is on the Process tab.

For the moment, the only setting we need to change is to select the solution file to build, and select the build configuration. Since this is for an internal test deployment environment, I’ve chosen to use a Debug build for the moment.

Lesson Learned 4: Avoid spaces in the Build Definition Name. I’m sure this problem can be avoided by quoting the execution command, but I had some problems executing the web deployment command file due to spaces in the path.

With the build configuration saved, simply right click on the newly created build in Team Explorer and select “Queue New Build”. I’ve found that even if the solution is already building successfully on a local machine, there is still some effort required at this stage to work through any build errors that are produced on the server.

Lesson Learned 5: The TFS Build Server is missing some resources required to build advanced projects. One such issue I ran into was an error stating that “resgen.exe” could not be found on the server. This can be resolved by either installing Visual Studio 2010 on the Team Foundation Server (an option I wasn’t really all that keen on), or installing the Windows SDK.

Next Step – Set Up Web Deployment Packages

Configure the Package setup options for each WCF service being deployed by right clicking on the WCF project in Solution Explorer, and selecting “Package/Publish Settings”.

In general, the default settings are enough to get the package building; however for this scenario I’ve tweaked the following:

  • Uncheck the option to “Include all databases …”. For the moment I’m excluding this because I am not trying to automate the deployment of the database – but even when we do get to this, I believe that the Database project would be better handled with its own deployment package.
  • Change the IIS Site and Application Name for the destination server to something more appropriate. The syntax used here is {IIS Web Site Name}/{Virtual Directory Name}.

Lesson Learned 6: TFS doesn’t build web deployment packages by default. Having the WCF projects configured in Visual Studio 2010 isn’t quite enough; we also need to instruct TFS to build the packages as part of a build. The easiest way to do this is to modify the build definition (right click on the build definition in Team Explorer, select Edit Build Definition) to include the following MSBuild arguments in the Advanced section of the Process tab:

/P:CreatePackageOnPublish=true /P:DeployOnBuild=true
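These are the same property switches you would pass to MSBuild locally, so a quick way to sanity-check them before queuing a build is to run the build by hand (solution name illustrative):

```shell
msbuild MySolution.sln /p:Configuration=Debug /p:DeployOnBuild=true /p:CreatePackageOnPublish=true
```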

Final Step – Build Template Tweaks to Deploy Web Packages

The build templates are stored as Windows Workflow files within the TFS project folder $/{Project}/BuildProcessTemplates. You can either choose to edit the Default Template (it is in source control after all, so there is no risk in messing this up), or create a new template to work from.

Note that if you create a new build template, you must be sure to set the Build Definition to use the appropriate build template file using the drop down list at the top of the Process tab of the Edit Build Definition dialog.

Within the build template, I’ve made all my modifications within the “Revert Workspace and Copy Files to Drop Location” step, which in turn is within the finally block of the “Try Compile, Test, and Associate Changesets and Work Items” workflow step. Note that each of these tasks can be added via the Windows Workflow user interface in Visual Studio – though I’ve found that editing the XML file directly is quicker on a machine with limited resources.

Remove the CopyDirectory task for the Binaries Directory:

By default, the TFS build template will copy all the project output into the Build output location. This includes all project files (such as ASPX, SVC and ASMX files), as well as all the compiled binaries. Since we are deploying our application using Web Deployment packages my preference is not to have anything in the output folder except for the required packages.

<!-- <mtbwa:CopyDirectory Destination="[BuildDetail.DropLocation]" DisplayName="Copy Files to Drop Location" Source="[BinariesDirectory]" /> -->

Add a CopyDirectory task to copy the web deployment package output:

<mtbwa:CopyDirectory DisplayName="Copy Deployment Package to Drop Location"
                     Destination="[String.Format(&quot;{0}\YourProjectName&quot;, BuildDetail.DropLocation)]"
                     Source="[String.Format(&quot;{0}\_PublishedWebsites\YourProjectName_Package&quot;, BinariesDirectory)]" />

Add an InvokeProcess task to run the generated deployment command file:

<mtbwa:InvokeProcess Arguments="/y /M:YourServer /u:UserName /p:Password &quot;-setParam:'IIS Web Application Name'='IISSite/VirtualDirectory'&quot;"
                     DisplayName="Deploy Web Service"
                     FileName="[String.Format(&quot;{0}\YourProjectName\YourProjectName.deploy.cmd&quot;, BuildDetail.DropLocation)]">


Custom Date Format for the Work Item DateTimeControl (TFS2010)

While working on a new work item template (WIT) for Team Foundation Server 2010 I encountered a requirement for a field where the user could select the time a call was received. The DateTimeControl for TFS work items displays only a date picker by default, which for this particular use case was not quite accurate enough.

My first reaction was to start building a custom work item control – but with a bit of digging into the existing one using Reflector I discovered that the ability to change the control format was indeed supported, just not immediately obvious.

It turns out that for any implementation of IWorkItemControl, the Properties property is set to contain all the attributes on the control’s declaration line within the WIT. For example, the following control declaration will result in four entries within the Properties collection (FieldName, Type, Label and LabelPosition).

<Control FieldName="..." Type="DateTimeControl" Label="Time Received:" LabelPosition="Left" />

Additional attributes can be added to the control without invalidating the schema, so work item controls can make use of this feature to support additional configuration parameters. The DateTimeControl includes the following properties that can be set using this mechanism:

  • Format: A valid selection from the DateTimePickerFormat enumeration (Long, Short, Time, Custom)
  • CustomFormat: If the Format is set to Custom then the CustomFormat property is used to define the display format. This uses the standard formatting strings for .NET date formats.

Thus the following control definition can be used to add a Time-aware picker to a work item.

<Control FieldName="..." Type="DateTimeControl" Label="Time Received:" LabelPosition="Left" Format="Custom" CustomFormat="dd/MM/yyyy hh:mm:ss" />

For anyone who prefers to use the WIT editor within the TFS Power Tools, these properties can be added using the String Collection Editor associated with the Attributes property of the DateTimeControl.

As an added bonus, the control renders correctly in both Team Explorer and Team Web Access views of the work item. Unfortunately it is not possible to change the DateTime selection to use the spin control (aka the up-down control) rather than the calendar drop down, but that is a relatively minor problem.

Debugging .NET IL at Runtime using Reflector / Deblector

The other day we encountered a problem with some inconsistent processing within the POP adapter for BizTalk. If the adapter was processing an email with a very specific set of attachments, the adapter failed to hand the message to the BizTalk engine, and all subsequent messages on the receive adapter became blocked. Anecdotally we believed this was related to the S/MIME decoder used by the adapter, but we were struggling to identify the specific characteristics of the attachments required to reproduce the issue.

Let me make this clear from the start … we didn’t actually manage to resolve this issue using this technique. Having said that I did find the whole process of debugging IL for runtime code very interesting and wanted to share the basics for getting started.

Step 1: Install the components

The tools we used for this are both available for free download. These will need to be copied onto the machine where the .NET process you wish to debug is located.

Installing Reflector is just a matter of extracting the runtime files onto your machine. The add-in can be installed by extracting its files to a subdirectory under the Reflector application path, then registering the add-in within Reflector (Menu -> View -> Add-Ins).

Deblector can then be activated within Reflector from the Tools menu.

Step 2: Attach to the BizTalk process

The next step is to load the required assemblies into Reflector. In my case this was the Microsoft.BizTalk.Pipeline.Components.dll file that comes with BizTalk Server 2006.

Note that at this stage it pays to navigate through the methods you are likely to be debugging, to ensure that all the dependent assemblies are also included. Reflector will prompt you to load any dependencies that are identified for the methods you are reading. This is also a good chance to learn a bit about how the application hangs together, and where logical debugging breakpoints should be applied.

Once the assemblies have been loaded we need to attach Deblector to the process to be debugged. In my case this was the BTSNTSvc.exe process.

Note: For BizTalk this got a bit complicated, as there are a number of instances of this process running, so some trial and error was required to find the right one. I also found that stopping all the host instances in the BizTalk Administration Console, except for the one containing the pipeline to be debugged, helped to filter the list.

Note that the debugger is automatically in pause mode when you attach to the process.

Step 3: Set your breakpoints, and Debug!

Using the explorer included in Reflector, navigate to the method to be debugged (in my case, MIME_SMIME_Decoder.Execute2).

Examine the IL view, and identify a line that you recognise. Reading IL can be a bit daunting, and I’m definitely not proficient with it as a language; I’ve found that searching for the strings used to write log entries is an easy way to find a recognisable starting point.

Setting up the breakpoint had me a bit confused to begin with, as I could see the breakpoint button but it was not obvious how to select the line to apply the breakpoint to. In the end it’s as simple as selecting the line in the IL View tab and clicking the breakpoint icon. Alternatively you can use the Command textbox and the break command (b [location]); I found this useful when I needed to set up a number of breakpoints again after restarting the BizTalk process and reattaching the debugger.

At this stage you also want to consider whether you want the debugger to break if an exception is thrown. This can be toggled using one of the icons on the toolbar.

Once you have all the breakpoints setup and ready to go, click the Play icon to let the attached process continue until the next breakpoint.

Now it’s debugging as normal, except the commands being stepped through take a bit more effort to comprehend. As a final note, I did notice that the highlighted line often indicates the instruction prior to the one being executed, so if you want to step into a specific function call you need to do so a couple of instructions before the function is called.

TFS Timesheets is now on CodePlex

One of the key weaknesses (in my opinion) with Team Foundation Server work items at present is the difficulty of recording time spent against a project. Sure, you can track time in the ‘Work Completed’ field, and can even report on what changes were made to this field over time. The difficulty is that this still assumes developers will always update the work items on the correct day, and it fails to give simple visibility of how much time different team members have spent on a given work item.

Some time back I created a small timesheet utility for Team Foundation Server 2010. This was partly driven by a requirement to track this level of detail for a project I was working on, but also to serve as a sample project for my previous blog posts about creating custom work item controls.

At the moment the code posted is far from perfect and I haven’t had a lot of time for updating the sample since the posts, but as a number of people have requested access to either view or contribute to the source code I’ve decided to post it to CodePlex ‘warts and all’ so to speak. If you are interested in the code for this component it can now be found at the following location.

We’ve used this control on a couple of projects now and have found that it has helped to keep the work remaining / work completed values up to date by creating visibility of how much time team members are logging against work items on any given day. There are definitely some improvements that can be made (especially in terms of reporting) – so if you do happen to try out this component please drop me a note to let me know what you think, or if you have any ideas for how it could be improved!

Successfully upgraded TFS 2010 from Beta 2 to RC on EC2

Once again I have my TFS instance on EC2 up to date, and I’m able to connect to it successfully from my Visual Studio 2010 RC instance here at home. One word of advice though: once you install VS2010 RC, VS seems to crash if you then try to install Team Explorer 2010 Beta 2, so make sure your code is all checked in before the upgrade, otherwise you won’t be able to make it available to anyone else on your team until the process is complete!

I was surprised not to run into more issues as part of this process; the upgrade was fairly painless, thanks mostly to the TFS 2010 Beta 2 to RC Upgrade Guide posted by Brian Krieger. I ran into a bit of trouble setting up SSRS and WSS again, but that was my own fault rather than an issue with the upgrade process (I’d forgotten the credentials I’d used last time).

The good news is that everything now seems to be working, including my old VS2010 Beta 2 solution files, which opened without any issues or the need for an upgrade.

I did come across one breaking change in the TFS APIs. It would seem that the TeamFoundationServer class no longer supports the AuthorizedIdentity property (in fact the TeamFoundationServer class has now been deprecated in favour of TfsTeamProjectCollection). Changing the reference to use TfsTeamProjectCollection.AuthorizedIdentity resolved the problem, and everything is building again.
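For anyone hitting the same compile error, the change amounted to something like the following sketch (the server URL is illustrative, and class and namespace names are as I recall them from the RC client object model):

```csharp
using System;
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.Framework.Client;

class Program
{
    static void Main()
    {
        // Old (now deprecated):
        // var tfs = new TeamFoundationServer("http://tfsserver:8080/tfs");
        // var identity = tfs.AuthorizedIdentity;

        // New: connect to a specific project collection instead
        var collection = new TfsTeamProjectCollection(
            new Uri("http://tfsserver:8080/tfs/DefaultCollection"));

        TeamFoundationIdentity identity = collection.AuthorizedIdentity;
        Console.WriteLine(identity.DisplayName);
    }
}
```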