Custom States values for Work Items in TFS 2010

In one of my previous blog posts I was asked how to add custom ‘State’ values for a work item in Team Foundation Server 2010. In this post I will look at how this can be achieved.

The state field is a unique field within the Work Item definition, and the mechanism to edit the values is not quite as intuitive as with other fields. Normally, the allowed values for a field are stored directly within the Field tag, using an AllowedValues (or even SuggestedValues) node:

<FIELD name="Priority" refname="Microsoft.VSTS.Common.Priority" type="Integer" reportable="dimension">
  <HELPTEXT>Importance to business</HELPTEXT>
  <ALLOWEDVALUES expanditems="true">
    <LISTITEM value="1" />
    <LISTITEM value="2" />
    <LISTITEM value="3" />
    <LISTITEM value="4" />
  </ALLOWEDVALUES>
  <DEFAULT from="value" value="2" />
</FIELD>

The State field is slightly different – the field definition itself doesn’t actually contain any of the valid state values:

<FIELD name="State" refname="System.State" type="String" reportable="dimension">
  <HELPTEXT>Active = work remains to be done. Closed = tested and checked in.</HELPTEXT>
</FIELD>

Instead, valid state values are added to the STATES node within the WORKFLOW node of the Work Item. The example below adds a ‘Proposed’ state to the Work Item:

<WORKFLOW>
  <STATES>
    <STATE value="Proposed" />
    <STATE value="Active">
      <FIELDS>
        <FIELD refname="Microsoft.VSTS.Common.ClosedDate">
          <EMPTY />
        </FIELD>
        <FIELD refname="Microsoft.VSTS.Common.ClosedBy">
          <ALLOWEXISTINGVALUE />
          <EMPTY />
        </FIELD>
      </FIELDS>
    </STATE>
    <STATE value="Closed" />
  </STATES>
  …

Transition rules can then be applied to the Work Item State using the Transitions node:

<TRANSITIONS>
  <TRANSITION from="" to="Proposed">
    <REASONS>
      <DEFAULTREASON value="New" />
    </REASONS>
  </TRANSITION>
  <TRANSITION from="Proposed" to="Active">
  …
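For completeness, a fuller transition between two named states might look something like the sketch below. The reason value and field rules here are my own illustrative choices rather than anything mandated by the MSF templates:

```xml
<TRANSITION from="Proposed" to="Active">
  <REASONS>
    <DEFAULTREASON value="Accepted" />
  </REASONS>
  <FIELDS>
    <!-- Stamp who activated the work item and when -->
    <FIELD refname="Microsoft.VSTS.Common.ActivatedBy">
      <COPY from="currentuser" />
    </FIELD>
    <FIELD refname="Microsoft.VSTS.Common.ActivatedDate">
      <SERVERDEFAULT from="clock" />
    </FIELD>
  </FIELDS>
</TRANSITION>
```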

With these modifications made, you should now be able to re-import the Work Item Type definition file back into Team Foundation Server 2010.

C:\TFS>"C:\Program Files\Microsoft Visual Studio 10.0\Common7\IDE\witadmin" importwitd /collection:localhost\defaultcollection /p:"{ExistingProjectName}" /f:Task.xml

Note – don’t forget to restart your Visual Studio environment before trying to use your updated Work Item in order to refresh the cache.

Further Experiences with TFS 2010 Beta 2 on Amazon EC2

A few weeks ago I published a post containing my experiences trying to build a Team Foundation Server 2010 Beta 2 instance on Amazon EC2. Since then I’ve used the instance I created a number of times, but not without struggles. For a number of reasons, each time I wanted to use my TFS instance to do productive work I found that I had to fight with the configuration quite a bit to get all the Windows services running properly.

So after battling with the TFS 2010 instance that I set up on Amazon EC2 for about two weeks now, I’ve made some amendments to the build that seem to have stabilised the environment a little.

  • Disable the automatic generation of the computer name for new instances. While for a lot of EC2 usage scenarios it makes sense to create a new dynamic computer name for each instance created, for a lot of Microsoft server-based products this causes a number of difficulties. Specifically, connection strings and hard-coded URLs seem to be littered throughout configuration files and security settings. A colleague of mine recently hit the same problem in MS CRM, and I would expect to see similar issues in SharePoint, BizTalk Server etc.

    For Team Foundation Server 2010, the first issue I hit was that the database connection string for the TFS web services was set to the machine name by default – this one was easily fixed. The more significant issue I faced was that the SQL Server security settings were all based around Windows users in the format {machinename}\{user}. The result was that when I set up a new instance all the SQL Server security roles were based on the previous machine name and had to be reset before Team Foundation Server could connect.

    Thankfully Amazon has provided us with a relatively straightforward mechanism to force the same computer name to be maintained. There is a property in the config.xml file (C:\Program Files\Amazon\Ec2ConfigSetup) called Ec2SetComputerName. If this is set to disabled then the computer name set when the AMI is bundled will be maintained. An even easier mechanism (that I didn’t discover until much later) is the EC2 Config Service Settings (Ec2ConfigServiceSettings.exe) utility in the same folder, which wraps these configuration settings.
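For reference, the relevant fragment of config.xml should end up looking something like the following – the exact element names and set of plugins vary between EC2 Config versions, so treat this as a sketch rather than a definitive listing:

```xml
<Ec2ConfigurationSettings>
  <Plugins>
    <!-- Keep the computer name that was baked in when the AMI was bundled -->
    <Plugin>
      <Name>Ec2SetComputerName</Name>
      <State>Disabled</State>
    </Plugin>
  </Plugins>
</Ec2ConfigurationSettings>
```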

  • Turn off automatic sysprep of new instances. Even with Ec2SetComputerName disabled, if sysprep is enabled for the machine then a randomised instance name will still be assigned. Interestingly, this is of a different format to the normal instance names … but nonetheless this is still less than ideal for an instance we are going to want to create several times.

    Using the Ec2 Config Service Settings utility again, disable the Sysprep option from the Bundle tab. Under the covers, this will update the BundleConfig.xml file.

  • Setup the program / data Elastic Storage Block drive to be automatically assigned the appropriate drive letter based on its volume name.

    One of the other issues I hit was getting the timing right for attaching the Elastic Storage Block (ESB) drive that I’m using for data when initiating a new instance of my Team Foundation Server AMI. If this is attached too soon then it may get automatically assigned to the D: drive. The result of this is quite severe:

    • None of the services will start, since SQL Server is expecting the data files to be on E: drive
    • The Windows page file is expecting to run on D: drive – so a large page file is suddenly loaded onto our relatively small data drive
    • The EC2 instance storage (normally the D: drive) is not initialised and remains as an inactive drive that needs to be manually mounted and formatted.

    Thankfully, Amazon has an answer for this issue as well. Provided we give our ESB drive an appropriate name (in my case, I’ve renamed it to DataVolume), we can use the same Ec2 Config Service Settings tool we used above to force the volume to use a specified drive letter when it is remounted in future.
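Under the covers this writes a mapping file (DriveLetterConfig.xml, in the same Ec2ConfigSetup folder); the fragment should look roughly like the one below – the volume name and drive letter are obviously my values, so adjust to suit:

```xml
<?xml version="1.0"?>
<DriveLetterMapping>
  <!-- Always remount the volume named DataVolume as E: -->
  <Mapping>
    <VolumeName>DataVolume</VolumeName>
    <DriveLetter>E:</DriveLetter>
  </Mapping>
</DriveLetterMapping>
```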

  • Disable all drives on MagicDisc. Similar to the problem I encountered with the Elastic storage block drive being assigned to the D: drive, if MagicDisc is still enabled then this will get automatically assigned to the next available drive letter. As this is mounted before the instance storage, the first MagicDisc drive gets assigned to D: drive. Rather than uninstall completely, I’ve just disabled all drives before bundling the AMI (right click on the tray bar icon, select ‘set number of drives’ then ‘disable’).
  • Unrelated to the stability issues I was hitting, I’ve also installed the Visual Studio 2010 Express tools for Web Development. My next project is likely to be looking at setting up Team Foundation Server event subscriptions for changed work items, so having the development tools on the same box should be useful. One important observation though … of the 10GB hard disk made available by EC2 … I’m now down to 1.5GB free on C: drive … 😦

There is one issue I don’t have a fix for yet … and I’m not even sure that it’s an issue … but anecdotally I believe I’m also having trouble with the permissions assigned to the built-in Network Service account on my elastic storage block drive. A couple of times the SQL Server services have failed to start due to the Network Service account not having enough rights to the data directories. I’m going to keep an eye on this for a while, but for now I’m just re-assigning ‘modify’ rights whenever I come across this issue. I’m wondering if it’s worth using a local service account rather than Network Service … but it’s not causing me enough issues to want to rebuild the instance yet again (it’s been 5 times so far!!!).

I’m sure that there are still a few more idiosyncrasies that I need to iron out before I’m totally happy with this build … but at least I’m starting to feel a little more comfortable that when I start the instance I can use it productively within half an hour or so – rather than having to mess around with the configuration every time.

Setting up a TFS Development Environment on Amazon EC2

Background

I’ve been using the Amazon EC2 environment for a while now, mostly for basic development or testing environments. Since these environments have all been very easily reproducible, we haven’t worried too much about setting them up to resist an instance failure.

This week however, I’ve had three separate conversations regarding how we can get environments safely built for more complex applications:

  • Team Foundation Server Beta 2
  • Microsoft CRM / SharePoint Server Development Environments
  • Oracle SE1

In each of these cases, the environment is only required for a short time while developing or demonstrating … then they will be shut down again.

Each of these needs some form of database storage … and since this will be a development environment we don’t want to be paying $1 / hour for the Amazon SQL Server instance, given that we are entitled to use our development licenses for this purpose.

One serious note about EC2 Drives

Hard drives on EC2 are split into three different categories:

  • (C:) is the system drive, and is automatically attached when an EC2 instance is launched. It should be noted that ONLY data on the C: drive is persisted when an instance is bundled!
  • (D:) is a temporary storage drive, and should be used strictly for short-term storage, as it does not get persisted when an instance is bundled.
  • Additional Elastic Storage Blocks (ESB) may be added. ESBs are treated as additional hard drives, but they are stored within S3 and therefore will (usually) remain intact beyond the life of an EC2 server instance. As the capacity of the EC2 system drive is so low (10GB for a small instance) it is necessary to store all programs and data that need to be persisted within ESB drives.

I make this distinction right at the start of my post – because I got caught out by the fact that the D: drive does not get bundled … and had to start all over again 😦

Initial Setup

For this configuration we’re going to need the following setup on EC2

  • 1 x 10GB ESB for storing installers and ISOs
  • 1 x 5GB ESB for database
  • 1 x EC2 instance for downloading files
  • 1 x EC2 instance for Application installation

I’m working on the assumption that everything is installed on the same instance today – though obviously one could separate the various components of the installation, and use Amazon security groups as a way to restrict access and provide layers of security.

Preparation – Download the Installers

Maybe I’m being overly cautious, but I tend to set up a specific instance just for downloading the application installers and copying them to an Elastic Storage Block drive. This way any bloat that the download causes does not affect the size of the installed application AMI when we go to bundle it later.

Note: One of the gotchas I ran into was that the default security settings for Internet Explorer restrict access to download some of the installation files that we need. Rather than messing around with the security settings for every new instance, I decided to download Firefox and run it straight off an ESB. This gets around the security issues, and allows Firefox to be easily used on other instances just by attaching the ESB. Instructions for setting up Firefox this way can be found here: http://www.articleworld.org/index.php/How_to_run_Firefox_from_an_USB_stick

To get started download installers for the following applications:

Copy each of these installers to an ESB drive, and then shut down the current EC2 instance.

Setup the Base AMI image

I decided to base my build on the “Amazon Public Images – Basic Microsoft Windows Server 2003” image, though in theory any of the Windows based images could be used. One note of caution though – if the author of the image is not Amazon, then ask yourself whether you trust the author not to have built in any rogue elements to the image.

Once the image is fully booted, add the new ESB drive (for storing the application database files) to the instance first, followed by the ESB containing the installers. The timing and order of adding these drives is important. If you load the new ESB drive before the instance is fully booted then the instance gets a little confused and loads the ESB as D: rather than E: – causing inconsistencies later when we create new instances from the bundled AMI.

After formatting and installing the new ESBs, the instance setup should look something like this:

  • (C:) System drive containing the OS
  • (D:) 160GB Blank instance drive
  • (E:) 10GB+ Blank ESB drive for storing data
  • (G:) 10GB ESB drive containing our installation media

Other components that need to be installed before going any further are:

  • IIS
  • ASP.NET

In order to install the additional windows components, it is necessary to mount the Windows 2003 media. The following article contains a list of snapshots that can be used to create Elastic Storage Block volumes containing the windows installation disks:

Note: At this stage I’d recommend changing the Administrator password to something that you will remember next time … I had to restart my installation after finding that a bundled AMI does not seem to regenerate new passwords for instances 😦

Installing SQL Server

On our new EC2 instance install MagicDisc (or some other ISO mounting tool) and mount the SQL Server installers.

The key configuration variations from the defaults were as follows:

  • Setup the data directories for the SQL Server Instance to be stored on the E: drive.
  • Ensure that all SQL Server features you require are installed. Since this particular install is for TFS 2010, I’ve enabled Full Text Search, Analysis Services and Reporting Services.
  • Ensure that the SQL Server management studio is installed.

After the installation is complete, it pays to double check that all of the SQL Server folders on the E: drive have been assigned the correct permissions. I found with my install that I had to give the Network Service account some additional privileges in order to get the SQL Server services to start.

Installing Team Foundation Server 2010 Beta 2 (Pre-Configuration)

Installing Team Foundation Server 2010 has been described in many blog posts before, so I’m limiting the description in this post to the specifics of getting the install working on EC2.

Installation of TFS 2010 is done in two steps: first the installer is run to set up the base application files, then the configuration tool is used to set up Team Foundation Server for use.

  1. Mount the Team Foundation Server 2010 Beta 2 ISO image
  2. Set the install location for the Features to the E: drive

Note: At this stage I’d recommend bundling an AMI from our progress so far. The configuration step for TFS (coming up next) can be a bit finicky … so having an easy rollback option is not a bad idea. If you do bundle the AMI at this stage, be sure to reload the ESB drives in the same order as last time.

Configuring Team Foundation Server

  1. Create a local service account to run the Team Foundation Service. I’ve called mine TFSServiceAccount.
  2. Start the Team Foundation Server Administration Console
  3. Enter the local user account set up in step 1 (TFSServiceAccount) to run the Team Foundation Server services.
  4. Select whether you are after a Basic or a Standard configuration scenario. Either way, all the defaults should be sufficient to get the installation working – though I did find I had to spend a bit of time sorting out IIS and the Reporting Server (since I’d made the mistake of installing SQL Server before IIS).

Setting up SQL Server Backups

Even though the SQL Server databases are being stored on Elastic Storage Block drives – there is still a chance that the ESB itself will fail. Don’t get me wrong … ESB is much safer than storing the data directly on the instance storage – but there is still a risk that it can be irretrievably lost. The following article by Michael Friss discusses the different backup strategies for SQL Server on an EC2 instance – and also includes a good PowerShell script for performing the backup and saving it to S3.

Any Comments …

I have to admit, I hit a number of issues performing this setup and had to restart the overall process three times. I’m sure there are better ways to set up some of these features … especially the SQL Server components. I’d like to hear any feedback you’ve got, or what works well – so please leave a comment if you have any thoughts.

TFS 2010 Beta 2 – Custom Work Item Controls. Step 2, Work Item Setup

Overview

[Edit: Source code for the control discussed in this post can be found here: TFS TimeSheets (Codeplex).]

In the first post (Step 1, Getting Started) in this series I looked at how to create a basic Work Item Custom Control for Team Foundation Server 2010, and went through the process of configuring and installing the custom control for use on the Task work item.

Today I’m looking at setting up the Task work item definition to create a new field for our custom work item control to persist time sheet data to.

The screen shot below depicts my goals for the control; however this post focuses more on how to create and configure the control than the actual time sheeting ability itself. Yes, I know this one isn’t going to win any prizes for prettiness … but that can be fixed later 🙂

As we are restricted within Team Foundation Server to a non-relational structure for our work item data, my current thinking is that time sheet data will be stored as XML within a new field configured on the work item. The other option I considered was looking at the ‘History’ field type … but a bit of investigation suggests that the way the work item history control works is to look at the various revisions of the work item, rather than store a collection of values.
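To make that concrete, the serialized payload stored in the new field might look something like the fragment below. This shape is entirely my own invention at this stage – it is not part of any TFS schema:

```xml
<TimeSheet>
  <!-- One entry per block of time recorded against the task -->
  <Entry user="DOMAIN\jsmith" date="2009-12-07" minutes="90" comment="Initial design" />
  <Entry user="DOMAIN\jsmith" date="2009-12-08" minutes="45" comment="Code review" />
</TimeSheet>
```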

Prerequisites:

Before getting started, the following prerequisite setup should be completed:

  • Visual Studio 2010 Beta 2 (for development)
  • Visual Studio 2010 Team Explorer Beta 2 (for testing)
  • Team Foundation Server 2010 Beta 2
  • The output from the project setup in Step 1.
  • A new reference added to our custom control library to Microsoft.TeamFoundation.Client
  • A new reference added to our custom control library to Microsoft.TeamFoundation.WorkItemTracking

Modify the Work Item Template

Using the following command the existing work item template for the Task work item can be extracted from the Team Foundation Server.

C:\>"C:\Program Files\Microsoft Visual Studio 10.0\Common7\IDE\witadmin.exe" exportwitd /collection:localhost\defaultcollection /p:"{ExistingProject}" /n:Task /f:Task.xml

If you haven’t worked with the work item template in the past, at this stage it’s worth having a good read through the documentation. For the purposes of the changes today we are only going to need to add a new field; however there is a lot of value in understanding some of the other features available within the schema.

The first thing we need to do is declare a new field to store the time sheet data. The XML fragment below shows the new field definition in bold font. It should be noted that the refname must be unique across the entire Team Foundation Server collection database, so be sure to apply a suitable name at this stage. I tend to prefix all my custom fields with ‘Custom’ just to make them easier to identify.


<FIELD name="Authorized As" refname="System.AuthorizedAs" type="String" syncnamechanges="true" />
<FIELD name="TimesheetRawData" refname="Custom.Timesheets.TimesheetRawData" type="PlainText" />
</FIELDS>

We also need to add the FieldName attribute to the control definition for the WorkItemTimeSheetControl:

<Tab Label="TimeSheets">
  <Control Type="WorkItemTimeSheetControl" LabelPosition="Top" FieldName="Custom.Timesheets.TimesheetRawData" />
</Tab>

The modified work item can then be imported back into TFS 2010 using the following command:

C:\TFS>"C:\Program Files\Microsoft Visual Studio 10.0\Common7\IDE\witadmin.exe" importwitd /collection:localhost\defaultcollection /p:"{ExistingProjectName}" /f:Task.xml

Using the IWorkItemControl Members

The Work Item Custom Control we created earlier implemented the IWorkItemControl interface. Using this interface we can get access to the currently loaded work item, and manipulate its values programmatically.

The full documentation for the interface can of course be found on MSDN, but below are my additional notes on the usage of some of the members.

WorkItemDataSource – A reference to the work item is provided to the user control via the WorkItemDataSource property. The property itself is defined as an object, but so far I’ve found that this reference can be safely typecast to the stronger type Microsoft.TeamFoundation.WorkItemTracking.Client.WorkItem.
WorkItemFieldName – If the work item control has been set up with a fieldname value then this will be provided to the custom control via the WorkItemFieldName property.
InvalidateDataSource – The InvalidateDataSource method is called during the loading (and unloading) of a work item. This method should be used to populate the fields of the Work Item Custom Control with their persisted values, or default values for a new work item.

Writing to the Work Item using the Object Model

A small number of standard properties may be set via strongly typed properties on the Work Item … however the majority of the Work Item data must be accessed from within the Fields collection.

CurrentWorkItem.Fields[WorkItemFieldName].Value = TimeSheet.Serialize(TimeSheet);

Whenever any data is changed within the Fields collection the work item interface automatically detects that the work item is ‘dirty’ and sets it to appear modified in the user interface with the * next to the work item name.

The one gotcha I did come across on the way was how to obtain the name of the current user in the context of the connection to Team Foundation Server. It turns out this can be retrieved from the following property:

CurrentWorkItem.Store.TeamFoundationServer.AuthorizedIdentity.DisplayName;

The changes I’ve made to the control at this point have been very basic. I’ve added a couple of user input controls to the form, and setup the add button handler to add a new time sheet entry entity to a time sheet list, then serialize the resulting list back to the work item based on the WorkItemFieldName.

At a high level, the functionality implemented is:

  • Ability to enter a new timesheet entry
  • Ability to view all timesheet entries associated to the work item
  • The value of Remaining Work is decreased automatically based on the minutes entered
  • The value of the Completed Work is increased automatically based on the minutes entered

Testing the Changes …

As before, it is necessary to reopen the testing instance of Visual Studio 2010 – otherwise the old work item definition may still be cached.

If we reopen Visual Studio then create a new Task work item our custom control should be displayed under the TimeSheets tab, and loading / saving Task work items should result in our changes being persisted to the database.

References

While working through this investigation process I’ve found the following resources very useful, and my thanks go to their authors:

Can’t debug VS2010 from VS2008

So as part of my experiments with creating and installing Work Item Custom Controls for use in Team Explorer 2010 I inevitably wanted to be able to attach a debugger to an instance of Team Explorer 2010 to interrogate the Work Item object model at runtime.

I’ve done this before for Team Explorer 2008 and wasn’t anticipating any problems. The basic setup can be obtained using these instructions on the Team System Notes blog.

Having followed these instructions I hit a bit of a roadblock. It turns out that VS2008 doesn’t like attaching a debugger to a VS2010 instance. If VS2010 is set up as the start-up application then an error is thrown when starting the project: “The debugger’s protocol is incompatible with the debuggee”. I’m guessing here, but I’d assume that the issue is that VS2010 has been compiled using .NET 4.0, and as VS2008 only supports up to .NET 3.5 the debugger is unable to correctly attach to a 4.0 managed process. I had hoped to keep this machine (the primary one I use for work) free of the full VS2010 components … however having hit this issue I’ve had to download and install the Beta 2.

Anyway, the upshot is that installing VS2010 and attaching the debugger to another VS instance from there solved the problem.

TFS 2010 Beta 2 – Custom Work Item Controls. Step 1, Getting Started

Overview

A few months ago I spent a week or so getting my head around the Custom Work Item Controls in Team Foundation Server 2008. At the time it was obvious that a lot of value could be added to our development teams if we could add some additional custom work item functionality to our processes.

My original intent at the time was to build a work item explorer that could navigate through work items based on their relationships … but that is fairly redundant now that we have proper work item hierarchies in TFS 2010. This time my intention is to have a look at how we can improve the way we record our time against tasks within Team Foundation Server.

This post (Step 1, Getting Started) will cover the process of setting up the required project and setup files, creating a basic ‘Hello World’ custom work item control and loading that control into the work item interface.

Prerequisites:

  • Visual Studio 2008 or 2010 (for development)
  • Visual Studio 2010 Team Explorer Beta 2 (for testing)
  • Team Foundation Server 2010 Beta 2

Visual Studio Project Setup

Creating Custom Work Item Controls is much like creating any other control library in Visual Studio. The control itself is based on the standard UserControl base class, and should be set up within a .NET Class Library project.

Before going any further, we’re going to need to add a reference to the TFS Work Item Tracking Controls assembly. This is installed as part of Visual Studio 2010, and can be found in the following folder:

C:\Program Files\Microsoft Visual Studio 10.0\Common7\IDE\PrivateAssemblies\Microsoft.TeamFoundation.WorkItemTracking.Controls.dll

At a minimum, we’ll also need the following artefacts in our solution:

  • One User Control (WorkItemTimeSheetControl)
  • One Work Item Custom Control definition file (.wicc). Note that it is important to set the Build Action for this file to ‘Content’ in the properties window.

Create the User Control

There is nothing overly special about a user control housed within a Work Item form – except that it implements the IWorkItemControl interface. For the moment the goal is only to create a user control and embed it within the Work Item form, so the control itself contains nothing more than the compulsory ‘Hello World’ label in bold 24pt black font.

We’ll set up the various members of the IWorkItemControl interface in more detail in a future post – for the moment a basic implementation with the default NotImplementedExceptions removed will do the job. Just create basic backing variables for any of the properties that need to be defined.

Create the Work Item Custom Control definition (WICC)

The .wicc file is an XML file that defines the custom control to be loaded within a work item. When a Custom Control is referenced within a Work Item Template, Visual Studio will look within the custom controls folder for a .wicc definition file that matches the name of the required custom control. In the case of our current custom control, Visual Studio will be looking for a file named WorkItemTimeSheetControl.wicc.

The following XML is the contents of the WorkItemTimeSheetControl.wicc file:

<?xml version="1.0" encoding="utf-8" ?>
<CustomControl xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
  <Assembly>TimeSheetControl.dll</Assembly>
  <FullClassName>TimeSheetControl.WorkItemTimeSheetControl</FullClassName>
</CustomControl>

Create the Windows Installer

Unfortunately the runtime for a Work Item Custom Control is not hosted on the Team Foundation Server instance, but rather must be installed on every Visual Studio client machine that will be interacting with the work item.

The one gotcha for the setup folder is that all the output must be installed within the custom controls folder on the client machine. On Vista, this is located at C:\ProgramData\Microsoft\Team Foundation\Work Item Tracking\Custom Controls\10.0 … though this will vary depending on the OS.

  1. Create a new Setup Project within the solution
  2. Add a new ‘Custom Folder’ named CommonAppDataFolder to the output of the setup project.
  3. In the properties window for the new custom folder, set the Default Location to [CommonAppDataFolder].
  4. Create the folder structure shown in the screenshot above under the CommonAppDataFolder. (Microsoft, Team Foundation, Work Item Tracking, Custom Controls, 10.0)
  5. Add the Primary Output and Content Files from the user control class library we created earlier.
  6. Build and install the setup file on the local machine.

The client components are now all in place – the only remaining step is to add our custom control to one of the work item templates.

Modify the Work Item Template

I’m going to select the Task work item from the MSF for Agile Software Development 5.0 Beta 2 process templates as the testing work item template to apply our new custom control to.

Using the following command the existing work item template for the Task work item can be extracted from the Team Foundation Server.

C:\>"C:\Program Files\Microsoft Visual Studio 10.0\Common7\IDE\witadmin" exportwitd /collection:localhost\defaultcollection /p:"{ExistingProject}" /n:Task /f:Task.xml

A couple of notes here:

  1. The WITAdmin tool is new to TFS 2010. When working with TFS 2008 the witimport / witexport tools would be used instead.
  2. In TFS 2008 adding custom controls to a work item template broke the process editor power tool – so all operations to import / export the work items must be done by command line. I am not sure whether this is going to be supported in the TFS 2010 power toys or not.

Now that we’ve got the work item template extracted we need to modify it to include our custom work item control. I’ve chosen to add the control to a new tab in the bottom half of the screen.

The XML fragment below shows the addition of the time sheet control to the Task work item file in bold black font. Make sure that the ‘Type’ attribute for the Control node is set to the name of the .wicc file that has been deployed to the client machines.

        …
        <Tab Label="Attachments">
          <Control Type="AttachmentsControl" LabelPosition="Top" />
        </Tab>
        <Tab Label="TimeSheets">
          <Control Type="WorkItemTimeSheetControl" LabelPosition="Top" />
        </Tab>
      </TabGroup>
    </Layout>
  </FORM>
</WORKITEMTYPE>
</witd:WITD>

The modified work item can then be imported back into TFS 2010 using the following command:

C:\TFS>"C:\Program Files\Microsoft Visual Studio 10.0\Common7\IDE\witadmin" importwitd /collection:localhost\defaultcollection /p:"{ExistingProjectName}" /f:Task.xml

Testing the Changes …

Back on the Visual Studio client machine, load up an instance of Visual Studio 2010 Team Explorer Beta 2 and open a Task from within the team project that has had the modified work item template loaded. Assuming everything has gone to plan a new tab should exist on the Work Item form, and the custom control should have loaded and displayed correctly.

One gotcha to keep in mind here – Team Explorer will cache the work item templates and controls – so it may be necessary to restart the Team Explorer client in order to see the updated layout.

Next Steps

Hopefully this ‘getting started’ post is of some use to others as they work through this process … or at least will help me out next time I get some free time to continue with this side project 🙂

Now that we’ve got a user control successfully loading from within the work item layout, my next (likely) steps include:

  • Adding interaction with the Work Item data model
  • Investigating how / if the TFS Web Interface can have a similar custom control applied
  • Investigating whether using WPF controls will simplify the implementation of thick client / web client work item templates
  • Putting together some actual logic to solve the initial issue of time sheeting!