Programmatically finding a Stored Query Definition within TFS2010

Today I’m building a small utility and have a requirement to execute a stored work item query from within Team Foundation Server 2010. Previously I’d used the StoredQueries collection on the Project instance to find a query by name … however it would appear that this was deprecated as part of the TFS2010 release.

I was surprised to find that there doesn’t seem to be a simple replacement for finding a specified stored query by name – part of me thinks I must be missing a simpler solution, however if anyone else is encountering the same issue I hope that the following code snippet will be of use.

Required namespaces:

  • Microsoft.TeamFoundation.Client
  • Microsoft.TeamFoundation.WorkItemTracking.Client

/// <summary>
/// <para>Find the TFS QueryDefinition for a specified Team Project</para>
/// <para>Note that if multiple queries match the requested queryName
/// only the first will be used</para>
/// </summary>
/// <param name="tfsUrl">URL to the TFS project, including the
/// collection name (e.g. http://tfsserver:8080/tfs/DefaultCollection)</param>
/// <param name="projectName">Name of the TFS Team Project</param>
/// <param name="queryName">Name of the Stored Query. Note if multiple
/// exist the first found will be used</param>
/// <returns>The matching QueryDefinition, or null if no match was found</returns>
public static QueryDefinition FindQueryItem(string tfsUrl, string projectName, string queryName)
{
    // Set up the connection to TFS
    TfsTeamProjectCollection projectCollection =
        TfsTeamProjectCollectionFactory.GetTeamProjectCollection(new Uri(tfsUrl));

    WorkItemStore workItemStore = projectCollection.GetService<WorkItemStore>();

    Project project = workItemStore.Projects[projectName];

    return FindQueryItem(queryName, project.QueryHierarchy);
}

/// <summary>
/// <para>Recursively find the QueryDefinition based on the requested queryName</para>
/// <para>Note that if multiple queries match the requested queryName
/// only the first will be used</para>
/// </summary>
/// <param name="queryName">Name of the Stored Query. Note if multiple exist
/// the first found will be used</param>
/// <param name="currentNode">The current node in the recursive search</param>
/// <returns>The matching QueryDefinition, or null if no match was found</returns>
private static QueryDefinition FindQueryItem(string queryName, QueryItem currentNode)
{
    // Attempt to cast to a QueryDefinition
    QueryDefinition queryDefinition = currentNode as QueryDefinition;

    // Check if we've found a match
    if (queryDefinition != null && queryDefinition.Name == queryName)
    {
        return queryDefinition;
    }

    // Attempt to cast the current node to a QueryFolder
    QueryFolder queryFolder = currentNode as QueryFolder;

    // All further checks are for child nodes, so if this is not an
    // instance of QueryFolder then no further processing is required.
    if (queryFolder == null)
    {
        return null;
    }

    // Loop through all the child query items
    foreach (QueryItem qi in queryFolder)
    {
        // Recursively call FindQueryItem
        QueryDefinition ret = FindQueryItem(queryName, qi);

        // If a match is found no further checks are required
        if (ret != null)
        {
            return ret;
        }
    }

    return null;
}
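To round this out, here is a hypothetical usage sketch showing how the returned QueryDefinition might be executed. The server URL, project name and query name are all illustrative, and I'm assuming the WorkItemStore is obtained the same way as in FindQueryItem above:

```csharp
using System;
using System.Collections.Generic;
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.WorkItemTracking.Client;

// Hypothetical usage; URL, project and query names are illustrative.
QueryDefinition definition = FindQueryItem(
    "http://tfsserver:8080/tfs/DefaultCollection", "MyProject", "Active Bugs");

if (definition != null)
{
    TfsTeamProjectCollection projectCollection = TfsTeamProjectCollectionFactory
        .GetTeamProjectCollection(new Uri("http://tfsserver:8080/tfs/DefaultCollection"));
    WorkItemStore workItemStore = projectCollection.GetService<WorkItemStore>();

    // Stored queries commonly reference the @project macro, so supply it
    // via the context dictionary overload of Query.
    var context = new Dictionary<string, string> { { "project", "MyProject" } };
    WorkItemCollection results = workItemStore.Query(definition.QueryText, context);
}
```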



Adding Integration Tests to TFS Build Workflow


In my last post I described how to deploy web applications to a build integration server using Team Foundation Server 2010. The next logical step once the build is successfully deploying to the integration server is to trigger a set of integration tests to verify the deployment. In this post I will describe the changes to the Default Template build workflow to execute Integration Tests separately from the existing Unit Tests.

Unit Tests

It is important to consider at this stage why we would run integration tests separately, as opposed to the unit tests executed as part of the assembly build process.

Unit tests executed as part of the build are intended to verify the individual components are functioning correctly, and often would use mocked interfaces to ensure that only the specific functions being tested are executed. Unit tests are typically not reliant on deployed components and therefore can be run as soon as the assemblies have been built.

Integration tests on the other hand are intended to run against the fully deployed environment to ensure that the individual components successfully execute together. Integration tests therefore need to be executed after the application components have been deployed to an integration server. Failures in integration testing might indicate breaking changes such as database changes, missing data, or changed interfaces into other components of the system.

Note that running the deployment and integration tests adds to the duration required to execute a build. Rather than performing this action every time something in the solution changes, it might be more pragmatic to have one Build Definition to build and run unit tests on a per-check-in basis, while another is configured to run the full integration tests on a nightly basis.

Modify the Build Workflow

Workflow Sequence Overview

The integration tests have to run within the context of a build agent, so the activity needs to take place at the end of the Run On Agent activity, directly after the packages have been deployed to the build integration server by the Deploy Packages activity.

Changing variable scopes

Because we are going to borrow heavily from the existing “Run Tests” activity, but the execution will be outside the “Try Compile, Test, and Associate Changesets and Work Items” activity, we need to modify the scoping of the following variables. This is most easily done by editing the XAML directly in your favourite XML editor.

  • outputDirectory – copy from the “Compile and Test for Configuration” activity up a level to the “Run On Agent” activity.
  • treatTestFailureAsBuildFailure – copy from the try block of “Try Compile, Test, and Associate Changesets and Work Items” to the “Run On Agent” activity.
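For reference, after the move the variable declarations end up sitting directly under the agent scope. A minimal sketch of the resulting XAML (surrounding activities omitted; the exact element layout in your template may differ slightly) might look like this:

```xml
<mtbwa:AgentScope DisplayName="Run On Agent">
  <mtbwa:AgentScope.Variables>
    <Variable x:TypeArguments="x:String" Name="outputDirectory" />
    <Variable x:TypeArguments="x:Boolean" Name="treatTestFailureAsBuildFailure" />
  </mtbwa:AgentScope.Variables>
  <!-- existing child activities, including Deploy Packages, go here -->
</mtbwa:AgentScope>
```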

Add new Integration Tests workflow arguments

The parameters being added are as follows:

  • Integration Tests Disabled (Boolean). I’m not a fan of negative argument names (e.g. Disabled rather than Enabled), but I have decided to keep this consistent with the existing Tests Disabled argument.
  • Integration Test Specs (TestSpecList).

The default value for the Integration Test Specs argument provides the defaults for filtering down to only the integration tests. Ideally I would have liked to filter this to *test*.dll with a test category of Integration; however, based on some rudimentary experimentation it appears that the TestAssemblySpec constructor can only set the assembly name filter. In the end I’ve used the following TestSpecList definition as the default value:

New Microsoft.TeamFoundation.Build.Workflow.Activities.TestSpecList(
    New Microsoft.TeamFoundation.Build.Workflow.Activities.TestAssemblySpec("**\*test*.dll"))

Note: Don’t forget to change the Metadata property to ensure the new arguments are displayed in a suitable category in the Build Definition editor.

Add the Run Integration Tests Activity

Follow the following steps to add the new Run Integration Tests activity to the workflow

  1. Add a new foreach activity after the Deploy Packages activity, but still within the Run on Agent activity. This activity will be used to iterate through the project configurations defined in the build definition.
    <ForEach x:TypeArguments="mtbwa:PlatformConfiguration" DisplayName="Run Integration Tests" Values="[BuildSettings.PlatformConfigurations]">
    <ActivityAction x:TypeArguments="mtbwa:PlatformConfiguration">
    <DelegateInArgument x:TypeArguments="mtbwa:PlatformConfiguration" Name="platformConfiguration" />

  2. Create a copy of the existing activity titled “If Not Disable Tests” into the foreach statement created above
  3. Modify the copied workflow to use the added workflow arguments
    • Use Integration Tests Disabled instead of Disable Tests
    • Use Integration Test Specs instead of Test Specs

Configure the Build Definition

Configuring the filters for your integration tests is a matter of personal preference, though I’ve found the following approaches fairly simple:

  • Define all integration tests in a separate project and utilise the Test Assembly Filespec filter
  • Add a Test Category of Integration to each of the tests and use the Category Filter.
  • Configure a custom .testsettings file so you can accurately specify the order in which tests should be executed
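As a sketch of the second approach (the class and method names here are hypothetical), tagging a test with a category is a single attribute on the test method:

```csharp
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class OrderServiceIntegrationTests
{
    // The Category Filter in the build definition (e.g. "Integration")
    // will select only tests carrying this attribute.
    [TestMethod, TestCategory("Integration")]
    public void SubmitOrder_AgainstDeployedService_ReturnsConfirmation()
    {
        // Call the deployed endpoint here and assert on the response.
    }
}
```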

What’s Next?

Having the integration tests executing successfully is all well and good, however you will find that it is necessary to configure the endpoints in the app.config file of your unit test project to always point to the integration server, which causes some inconvenience if you wish to run the same tests locally in a development environment.

In a future post, I will have a look at how to perform transformations on the app.config file as part of the deployment, similar to the way web.config is transformed as part of the deployment package creation.

Simplifying Web Package Deployment with TFS Build


In my last post several months ago I described some of the work we’ve been doing to automate web package deployment using MSDeploy and Team Foundation Server 2010 Build Server. In that post I introduced a number of “hard coded” customisations to the Default Template workflow script that is used to drive the TFS Build.

Recently I’ve been refining this script further to allow the reuse of the unmodified workflow scripts between projects through the use of XAML arguments, and thought I might share the output.

Parameterising the MS Deploy activity

This post will focus on modifications to the standard workflow template to allow for an MS Deploy activity to be invoked based purely on the build definition, rather than customising the workflow for each team project.

Step 1: Add parameters to the Default Template xaml file

I did take one shortcut at this stage – the ‘ideal’ approach to collecting the deployment parameter information would be to create a strongly typed object structure, then import that into the build process. I’m avoiding this for now, as we don’t currently need any custom assemblies for the build process and I don’t want to add extra dependencies.

Therefore I have used the rather limited approach of defining a single set of deployment arguments, as opposed to a nice and tidy collection of strongly typed objects we could loop through.

The parameters being added are as follows:

  • Deploy Package 1 (default false)
  • Deploy Package 1 Package Path
  • Deploy Package 1 Script
  • Deploy Package 1 Script Args
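In the XAML these arguments appear as workflow members. A sketch of the declarations (the names follow the list above; the argument types are my assumption) looks something like:

```xml
<x:Members>
  <x:Property Name="DeployPackage1" Type="InArgument(x:Boolean)" />
  <x:Property Name="DeployPackage1PackagePath" Type="InArgument(x:String)" />
  <x:Property Name="DeployPackage1Script" Type="InArgument(x:String)" />
  <x:Property Name="DeployPackage1ScriptArgs" Type="InArgument(x:String)" />
</x:Members>
```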

Step 2: Setup the metadata to group the deployment arguments

By default, all the added arguments will be placed in the “Misc” category in the Build Definition Editor. Once you’ve added more than a couple of customisations this becomes fairly confusing, so it is important to set up the metadata as you go.

  1. Find the Metadata argument within the Arguments list
  2. Click the ‘…’ button to open the dialog editor
  3. Add a new entry for each new argument added
  4. Ensure that the Category is the same for each parameter to ensure they are grouped together

Step 3: Modify the Invoke Process activity to use the arguments

For this demonstration I’ve chosen to trigger the deployment immediately after the “Try Compile, Test and Associate Changesets and Work Items” block. This is convenient, since it is directly after the output directories are populated – though I am not yet sure that this step will stay here once the integration tests are plugged in.

The parameterized activity results in the following xaml being produced.

<If Condition="[DeployPackage1]" DisplayName="If Deploy Package 1">
  <Sequence DisplayName="Deploy Package 1" mtbwt:BuildTrackingParticipant.Importance="Low">
    <mtbwa:InvokeProcess Arguments="[DeployPackage1ScriptArgs]" DisplayName="Deploy Package 1"
        FileName="[String.Format(&quot;{0}\{1}&quot;, BuildDetail.DropLocation, DeployPackage1Script)]" />
  </Sequence>
</If>

You’ll note that I’ve wrapped the old Invoke Process activity in a new If block – the primary purpose of this is to allow us to use this workflow even for projects that do not use the deployment features.

Note: Don’t forget to check-in your xaml changes at this point – otherwise you will not be able to set any of your new arguments in the build definition editor!

Step 4: Setup the Build Definition

Note that this assumes you have already prepared your destination server to allow for remote deployments. See my previous post on Using TFS Build Server for Continuous Integration for help with this configuration if necessary.

The final step is to setup the build definition.

  1. Create a new Build Definition
  2. On the Process tab, note that a new category for “Deploy Package 1” is now displayed
  3. Enter the path to your deployment package, relative to the output directory. Typically this is _PublishedWebsites\MyProject_Package.
  4. Enter the name of the generated command script file.
  5. Enter the arguments required for the command script file. Typically this is /y /M:ServerName /u:UserName /p:Password.
  6. Change the Deploy Package 1 flag to True.
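With those values filled in, the Invoke Process activity effectively runs a command along these lines (the drop path, server name and credentials are all placeholders):

```
\\buildserver\drops\MyBuild\MyProject.deploy.cmd /y /M:ServerName /u:UserName /p:Password
```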

Step 5: Trigger a build, and watch it work!

What’s Next?

There are a number of changes that we have introduced to the standard workflow that I will endeavour to describe in my next few posts:

  • Cleaning up the Output Directory to include only packages
  • Running Integration tests post-deployment
  • Using transformations on the integration test app.config file to modify the target endpoints for testing

First Look: Using TFS2010 for Continuous Integration


Any programmers reading this might remember what their first attempt at coding resulted in. Undoubtedly it lacked a lot of the finesse of something that you would write today. The output of my first real attempt at using TFS as a build server feels much the same as that first page of code many years ago. I’m sure there are many things that could be done more ‘correctly’, and for this reason I don’t really recommend this post as learning material for getting started with TFS Build.

The scenario I’m working towards is to get an existing solution being compiled and deployed in a pseudo continuous-integration setup. I say pseudo continuous-integration in this case, because I’m ignoring a number of key steps at this stage. Specifically, I’m not customising the configuration files for different environments, deploying the database projects, or looking at how to retarget the automated integration tests at the deployed server.

The VS2010 solution for this scenario contains three WCF-based web service projects (as well as a number of supporting assemblies), all targeting .NET 3.5. I’m making use of the Web Deployment packaging and deployment tools provided as part of Visual Studio.


  • Download and install the Microsoft Web Deployment Tool (or the 32bit equivalent) on the deployment machine (where the web services will ultimately be deployed).
  • Create the IIS Web Site on the deployment server. The Virtual Directories will be created automatically, but we do need the IIS Site in place to start with.
  • Create a TFS Build Agent on the TFS server (See MSDN article Create and Work with Build Agents for details)

Lesson Learned 1: The Web Deployment Tool does not install the Remote Agent Service by default. Be sure to select this option as part of the installation wizard. If the Remote Agent Service is not configured correctly, the deployment activities outlined later in this post will result in an error message along the lines of “The response header ‘MSDeploy.Response’ was ‘’ but ‘v1’ was expected”.

Lesson Learned 2: The Web Deployment Agent Service is configured to start manually by default. It is necessary to reconfigure this to start automatically.

Lesson Learned 3: If you use the default Working Directory for your build agent then the path will include the full name of your TFS project. This can cause issues building, as I have seen instances where the combined length of the file name and path has exceeded the maximum (which, surprisingly, is only about 260 characters). To avoid this I’ve used the Build Definition ID rather than the Build Definition Path parameter as part of my Working Directory setup.
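As a concrete illustration, here is the stock TFS2010 working directory expression alongside the shortened variant described above (the exact stock value may differ slightly between installs):

```
Default:   $(SystemDrive)\Builds\$(BuildAgentId)\$(BuildDefinitionPath)
Shortened: $(SystemDrive)\Builds\$(BuildAgentId)\$(BuildDefinitionId)
```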

Basics First – Building the Solution

Set up a new definition by right clicking on the Builds node within Team Explorer for your Team Project and selecting “New Build Definition”.

You will need to select a UNC path for the output location on the Build Defaults tab, but aside from that the only configuration we need to worry about for now is on the Process tab.

For the moment, the only settings we need to change are the solution file to build and the build configuration. Since this is for an internal test deployment environment, I’ve chosen to use a Debug build for the moment.

Lesson Learned 4: Avoid spaces in the Build Definition Name. I’m sure this problem can be avoided by quoting the execution command – but I had some problems executing the web deployment command file due to spaces in the path.

With the build configuration saved, simply right click on the newly created build in Team Explorer and select “Queue New Build”. I’ve found that even if the solution is already building successfully on a local machine, there is still some effort required at this stage to work through any build errors that are produced on the server.

Lesson Learned 5: The TFS Build Server is missing some resources required to build advanced projects. One such issue I ran into was an error stating that “resgen.exe” could not be found on the server. This can be resolved either by installing Visual Studio 2010 on the Team Foundation Server (an option I wasn’t really all that keen on), or by installing the Windows SDK.

Next Step – Setup Web Deployment Packages

Configure the Package setup options for each WCF service being deployed by right clicking on the WCF project in Solution Explorer, and selecting “Package/Publish Settings”.

In general, the default settings are enough to get the package building; however for this scenario I’ve tweaked the following:

  • Uncheck the option to “Include all databases …”. For the moment I’m excluding this because I am not trying to automate the deployment of the database – but even when we do get to this, I believe that the Database project would be better handled with its own deployment package.
  • Change the IIS Site and Application Name for the destination server to something more appropriate. The syntax used here is {IIS Web Site Name}/{Virtual Directory Name}.

Lesson Learned 6: TFS doesn’t build web deployment packages by default. Having the WCF projects configured in Visual Studio 2010 isn’t quite enough – we also need to instruct TFS to build the packages as part of a build. The easiest way to do this is to modify the build definition (right click on the build configuration in Team Explorer, select Edit Build Definition) to include the following MSBuild Arguments in the Advanced section of the Process tab:

/P:CreatePackageOnPublish=true /P:DeployOnBuild=true
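If you want to check the packaging behaviour outside of TFS, the same properties can be passed to a local MSBuild invocation (the solution name is illustrative):

```
msbuild MySolution.sln /p:Configuration=Debug /p:DeployOnBuild=true /p:CreatePackageOnPublish=true
```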

Final Step – Build Template Tweaks to Deploy Web Packages

The build templates are stored as Windows Workflow files within the TFS project folder $/{Project}/BuildProcessTemplates. You can either choose to edit the Default Template (it is in source control after all, so there is no risk in messing this up), or create a new template to work from.

Note that if you create a new build template, you must be sure to set the Build Definition to use the appropriate build template file using the drop down list at the top of the Process tab of the Edit Build Definition dialog.

Within the build template, I’ve made all my modifications within the “Revert Workspace and Copy Files to Drop Location” step, which in turn is within the finally block of the “Try Compile, Test, and Associate Changesets and Work Items” workflow step. Note that each of these tasks can be added via the Windows Workflow user interface in Visual Studio – though I’ve found that editing the XML file directly is quicker on a machine with limited resources.

Remove the CopyDirectory task for the Binaries Directory:

By default, the TFS build template will copy all the project output into the Build output location. This includes all project files (such as ASPX, SVC and ASMX files), as well as all the compiled binaries. Since we are deploying our application using Web Deployment packages my preference is not to have anything in the output folder except for the required packages.

<!-- <mtbwa:CopyDirectory Destination="[BuildDetail.DropLocation]" DisplayName="Copy Files to Drop Location" Source="[BinariesDirectory]" /> -->

Add a CopyDirectory task to copy the web deployment package output:

<mtbwa:CopyDirectory DisplayName="Copy Deployment Package to Drop Location"
    Destination="[String.Format(&quot;{0}\YourProjectName&quot;, BuildDetail.DropLocation)]"
    Source="[String.Format(&quot;{0}\_PublishedWebsites\YourProjectName_Package&quot;, BinariesDirectory)]" />

Add an InvokeProcess task to run the generated deployment command file:

<mtbwa:InvokeProcess Arguments="/y /M:YourServer /u:UserName /p:Password &quot;-setParam:'IIS Web Application Name'='IISSite/VirtualDirectory'&quot;"
    DisplayName="Deploy Web Service"
    FileName="[String.Format(&quot;{0}\YourProjectName\YourProjectName.deploy.cmd&quot;, BuildDetail.DropLocation)]" />


Custom Date Format for the Work Item DateTimeControl (TFS2010)

While working on a new work item template (WIT) for Team Foundation Server 2010 I encountered a requirement for a field where the user could select the time a call was received. The DateTimeControl for TFS work items displays only a date picker by default, which for this particular use case was not quite accurate enough.

My first reaction was to start building a custom work item control – but with a bit of digging into the existing one using Reflector I discovered that the ability to change the control format was indeed supported, just not immediately obvious.

It turns out that for any implementation of IWorkItemControl, the Properties property is set to contain all the attributes on the control’s declaration line within the WIT definition. For example, the following control declaration (the field reference name here is illustrative) will result in four entries within the Properties collection (FieldName, Type, Label and LabelPosition).

<Control FieldName="MyCompany.TimeReceived" Type="DateTimeControl"
    Label="Time Received:"
    LabelPosition="Left" />

Additional attributes can be added to the control without invalidating the schema, so work item controls can make use of this feature to support additional configuration parameters. The DateTimeControl includes the following properties that can be set using this mechanism:

  • Format: A valid selection from the DateTimePickerFormat enumeration (Long, Short, Time, Custom)
  • CustomFormat: If the Format is set to Custom then the CustomFormat property is used to define the display format. This uses the standard formatting strings for .NET date formats.

Thus the following control definition can be used to add a time-aware picker to a work item (again, the field reference name is illustrative).

<Control FieldName="MyCompany.TimeReceived" Type="DateTimeControl"
    Label="Time Received:"
    Format="Custom"
    CustomFormat="dd/MM/yyyy hh:mm:ss"
    LabelPosition="Left" />

For anyone that prefers to use the WIT editor within the TFS Power Tools (found here), these properties can be added using the String Collection Editor associated with the Attributes property for the DateTimeControl.

As an added bonus, the control is rendered correctly in both the Team Explorer and Team Web Access views of the work item. Unfortunately it is not possible to change the DateTime selection to use the spin control (aka the up-down control) rather than the calendar drop-down, but that is a relatively minor problem.

TFS Timesheets is now on CodePlex

One of the key weaknesses (in my opinion) with Team Foundation Server work items at present is the difficulty of recording time spent against a project. Sure, you can track time in the ‘Work Completed’ field and can even report on what changes were made to this field over time. The difficulty is that this assumes developers will always update the work items on the correct day, and it fails to give simple visibility of how much time different team members have spent on a given work item.

Some time back I created a small timesheet utility for Team Foundation Server 2010. This was partly driven by a requirement to track this level of detail for a project I was working on, but also to serve as a sample project for my previous blog posts about creating custom work item controls.

At the moment the code posted is far from perfect and I haven’t had a lot of time for updating the sample since the posts, but as a number of people have requested access to either view or contribute to the source code I’ve decided to post it to CodePlex ‘warts and all’ so to speak. If you are interested in the code for this component it can now be found at the following location.

We’ve used this control on a couple of projects now and have found that it has helped to keep the work remaining / work completed values up to date by creating visibility of how much time team members are logging against work items on any given day. There are definitely some improvements that can be made (especially in terms of reporting) – so if you do happen to try out this component please drop me a note to let me know what you think, or if you have any ideas for how it could be improved!

Successfully upgraded TFS 2010 from Beta 2 to RC on EC2

Once again I have my TFS instance on EC2 up to date, and I’m able to connect to it successfully from my Visual Studio 2010 RC instance here at home. One word of advice though: once you install VS2010 RC, VS seems to crash if you then try to install Team Explorer 2010 Beta 2, so make sure your code is all checked in before the upgrade, otherwise you won’t be able to make it available to anyone else on your team until the process is complete!

I was surprised that I didn’t run into more issues as part of this process; the upgrade was fairly painless – thanks mostly to the TFS 2010 Beta 2 to RC Upgrade Guide posted by Brian Krieger. I ran into a bit of trouble setting up SSRS and WSS again … but that was my own fault, not an issue with the upgrade process (I’d forgotten the credentials I’d used last time).

The good news though is that everything now seems to be working, including my old VS2010 Beta 2 solution files, which opened without any issues or the need for an upgrade.

I did come across a breaking change in the TFS APIs. It would seem that the TeamFoundationServer class no longer supports the AuthorisedIdentity property (in fact the TeamFoundationServer class has now been deprecated in favour of TfsTeamProjectCollection). Changing the reference to use TfsTeamProjectCollection.AuthorizedIdentity seems to have resolved the problem, and everything is building again.
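For anyone hitting the same compile error, the change amounts to something like the following sketch (the server URL is illustrative):

```csharp
using System;
using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.Framework.Client;

// Old (deprecated in TFS 2010):
//   TeamFoundationServer server = TeamFoundationServerFactory.GetServer(tfsUrl);
//   var identity = server.AuthorizedIdentity;

// New:
TfsTeamProjectCollection collection =
    TfsTeamProjectCollectionFactory.GetTeamProjectCollection(
        new Uri("http://tfsserver:8080/tfs/DefaultCollection"));
TeamFoundationIdentity identity = collection.AuthorizedIdentity;
```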