Modelling with Domain-Specific Languages

Introduction

With the industry moving as quickly as it has for the past few years, it is not surprising that every once in a while you discover a technology or capability that has been around for years yet has escaped your attention.

Domain-Specific Languages (specifically the implementation within Visual Studio) have been one of those cases for me, but over the past few weeks I’ve been spending my spare time playing with the models and considering their role, primarily in the production of design documentation. I’m always on the look-out for ways to improve communication with members of the customer’s team, and one thing is obviously true: it doesn’t matter whether they are managers, business users, or analysts, pretty pictures do a great job of communicating complex ideas simply.

I can almost hear you saying ‘but Domain-Specific Languages have been around for years (since VS2005 in fact) … so how is it that this is new to you?’ Well, I was feeling pretty embarrassed about that, until I presented the concept internally to some of my colleagues at work and found that no-one else had investigated DSLs either. Now I don’t feel so bad.

So for those readers who haven’t used DSLs in the past, it’s probably safe to say that you’ve at least used models produced with the toolkit. The DSL modelling SDK is responsible for many of the graphical designers within Visual Studio, such as:

  • The Visual Studio Architecture Tools
  • The Entity Framework designer
  • The BizTalk ESB Toolkit Itinerary designer
  • The Web Services Software Factory designer

Why I’m Excited

Generating a nice diagram is all well and good, but the fact is Visio can do that. The power of the DSL SDK comes with its extensibility. Some of the features that have me interested are:

  • Customized Rules and Validation. This is key to helping new team members understand the model (a minimal validation sketch follows this list).
  • Model-generated meta-data. A lot of the data we deal with is meta-data that ultimately needs to be loaded into a database or system of some form.
  • Model-generated design documentation. I’m quite excited about the idea of transforming the model XML (using Visual Studio’s T4 templates) into Word XML or similar.
  • Distributable. As most models can be run within the Visual Studio Shell, they could easily be shared with business analysts and customer representatives who might not already have Visual Studio.
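
To give a feel for the first point, below is a minimal sketch of a custom validation rule. The DocumentLibrary class, its Name and ContentType properties, and the error code are hypothetical names borrowed from the SharePoint example later in this post; the attributes and ValidationContext type come from the DSL SDK’s Microsoft.VisualStudio.Modeling.Validation namespace.

using Microsoft.VisualStudio.Modeling.Validation;

// Partial class extending the generated DocumentLibrary domain class (hypothetical name).
[ValidationState(ValidationState.Enabled)]
public partial class DocumentLibrary
{
    // Runs when the model is saved, or when validation is invoked from the context menu.
    [ValidationMethod(ValidationCategories.Save | ValidationCategories.Menu)]
    private void ValidateContentType(ValidationContext context)
    {
        if (this.ContentType == null)
        {
            context.LogError(
                string.Format("Document library '{0}' has no content type assigned.", this.Name),
                "SP001", // arbitrary error code
                this);
        }
    }
}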

On top of this, model development is fast! Without writing any code at all it doesn’t take long to produce a model. Models could easily be built for short-lived project requirements.

Getting Started

As with any technology that has been around for seven years already, there is some good material on the internet for getting started, and I’m not trying to replicate that in this post.

A Basic Example

As an example, consider for a moment the process of defining document libraries in a SharePoint environment. This often involves defining a Content Type for the library, configuring the document library (possibly with additional library-specific columns), and perhaps adding an approval workflow.
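
For readers less familiar with SharePoint, the snippet below shows roughly the kind of artifact the model ultimately describes: a content type with a single column, declared using the standard SharePoint feature schema. The content type ID, field ID and names are invented for illustration.

<!-- Illustrative only: IDs and names are invented -->
<ContentType ID="0x010100A1B2C3D4E5F6478901B2C3D4E5F6A7B8"
             Name="Contract Document"
             Group="Custom Content Types">
  <FieldRefs>
    <FieldRef ID="{1511BF28-A787-4061-B2E1-71F64CC93FD5}" Name="ContractNumber" />
  </FieldRefs>
</ContentType>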

The Domain Class model below outlines three core types (note I’ve hidden some of the relationships for simplicity here):

  • Content Types, including:
    • Embedded relationship to Columns
  • Document Libraries, including:
    • Embedded relationship to Columns
    • Reference relationship to Content Type
    • Reference relationship to Workflow
  • Workflows, including:
    • Several Domain Properties defining attributes for workflow configuration

Each of these Domain Classes has been mapped to Component Shape and Connector elements to support the visualization of the model.

The resulting model allows users to graphically model the Content Types, Document Libraries and Workflows required for the solution, resulting in the following diagram. In this case I’ve shown Content Types in purple, Document Libraries in orange and Workflows in lime green. As you can see, this provides a high-impact, top-level view of the domain model, and is well suited to inclusion in design documentation communicating the solution to customers, developers and maintenance teams alike.

While the design process is being conducted, it is often simpler to capture any additional details required for the solution at the same time; this is where diagrams in Visio start to fall short a little. Since the DSL model is hosted within Visual Studio, we have full access to the property grid, so all custom Domain Properties can be captured simply.

To extend this example a little further, consider the possibilities for artifacts that could be generated from this model, such as:

  • Automation of web service calls to implement the content types / libraries and workflows directly from the model
  • Generation of Word tables containing the properties stored against each Domain Class. This could include the detailed attributes from the property grid that are not shown in the visualization (a minimal T4 sketch follows this list).
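
To give a feel for the second point, here is a minimal T4 sketch that reads the saved model file as plain XML and emits a short documentation entry per document library. The file name, element names and attribute names are assumptions about how such a model might serialise; they would need to match the XML your own DSL actually produces, and a real template would emit Word XML rather than plain text.

<#@ template language="C#" hostspecific="true" #>
<#@ assembly name="System.Xml" #>
<#@ import namespace="System.Xml" #>
<#@ output extension=".txt" #>
<#
    // Load the saved model file (hypothetical file name and schema).
    var model = new XmlDocument();
    model.Load(this.Host.ResolvePath("SharePointSolution.spdsl"));

    foreach (XmlElement library in model.GetElementsByTagName("documentLibrary"))
    {
#>
Document Library: <#= library.GetAttribute("name") #>
  Content Type:   <#= library.GetAttribute("contentType") #>
  Workflow:       <#= library.GetAttribute("workflow") #>
<#
    }
#>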

TFS Automated Deploy – Avoid MSDEPLOY Deleting Log Files

Last weekend I posted the details of how we are using TFS Build Server in conjunction with the Web Deployment Toolkit to enable automated build and deploy of some of our internal development environments.

One piece of feedback that I received from the previous post was that the configuration by default will delete all files in the deployment destination location. Normally this is perfectly acceptable behaviour, but in other cases it is less than ideal. Specifically, one of the solutions that we are building logs output to a logging folder within the application directory, and we do not want to lose the logs each time a build is deployed.

This is easily solved when deploying with the one-click publishing functionality in Visual Studio 2010: select the “Leave Extra Files on Destination” option on the Publish dialog for a web application (the Publish dialog can be accessed by right-clicking on the web application project in Solution Explorer, then selecting Publish). The solution is less obvious when using MSDEPLOY or the TFS 2010 build packages directly.

After some time searching through Google I came across this thread that describes the MSDEPLOY parameter -enablerule:DoNotDeleteRule. This parameter can be added to the Windows Workflow script used by TFS2010 for a build definition by modifying the InvokeProcess activity used in my previous post as follows (the added -enablerule argument is the only change):

<mtbwa:InvokeProcess
    Arguments="/y /M:YourServer /u:UserName /p:Password &quot;-setParam:'IIS Web Application Name'='IISSite/VirtualDirectory'&quot; -enablerule:DoNotDeleteRule"
    DisplayName="Deploy Web Service"
    FileName="[String.Format(&quot;{0}\YourProjectName\YourProjectName.deploy.cmd&quot;, BuildDetail.DropLocation)]">
</mtbwa:InvokeProcess>
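
The same flag can also be passed when running the generated command file by hand, which is a quick way to confirm the behaviour outside of the build. The server name and credentials below are placeholders:

YourProjectName.deploy.cmd /y /M:YourServer /u:UserName /p:Password -enablerule:DoNotDeleteRule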

First Look: Using TFS2010 for Continuous Integration

Overview

Any programmers reading this might remember what their first attempt at coding resulted in. Undoubtedly it lacked a lot of the finesse of something that you would write today. The output of my first real attempt at using TFS as a build server feels much the same as that first page of code many years ago. I’m sure there are many things that could be done more ‘correctly’, and for this reason I don’t really recommend this post as learning material for getting started with TFS Build.

The scenario I’m working towards is to get an existing solution being compiled and deployed in a pseudo continuous-integration setup. I say pseudo continuous-integration in this case, because I’m ignoring a number of key steps at this stage. Specifically, I’m not customising the configuration files for different environments, deploying the database projects, or looking at how to retarget the automated integration tests at the deployed server.

The VS2010 solution for this scenario contains three WCF-based web service projects (as well as a number of supporting assemblies), all targeting .NET 3.5. I’m making use of the Web Deployment packaging and deployment tools provided as part of Visual Studio.

Preparation

  • Download and install the Microsoft Web Deployment Tool (or the 32-bit equivalent) on the deployment machine (where the web services will ultimately be deployed).
  • Create the IIS Web Site on the deployment server. The Virtual Directories will be created automatically, but we do need the IIS Site in place to start with (one way to script this is shown after this list).
  • Create a TFS Build Agent on the TFS server (see the MSDN article Create and Work with Build Agents for details).
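
As a rough sketch, the IIS site can be created from an elevated command prompt on the deployment server using appcmd; the site name, port and physical path below are placeholders:

%windir%\system32\inetsrv\appcmd add site /name:"InternalTestSite" /bindings:http/*:8080: /physicalPath:"C:\inetpub\InternalTestSite"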

Lesson Learned 1: The Web Deployment Tool does not install the Remote Agent Service by default. Be sure to select this option as part of the installation wizard. If the Remote Agent Service is not configured correctly, the deployment activities outlined later in this post will result in the error message “The response header ‘MSDeploy.Response’ was ‘’ but ‘v1’ was expected”.

Lesson Learned 2: The Web Deployment Agent Service is configured to start manually by default. It is necessary to reconfigure this to start automatically.
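
Assuming the service name is MsDepSvc (displayed as the Web Deployment Agent Service), the following commands reconfigure it to start automatically and start it immediately; run them from an elevated command prompt on the deployment server:

sc config MsDepSvc start= auto
net start MsDepSvc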

Lesson Learned 3: If you use the default Working Directory for your build agent then the path will include the full name of your TFS project. This can cause build issues; I have seen instances where the combined length of the file name and path exceeded the limit (which, surprisingly, is only about 260 characters). To avoid this I’ve used the Build Definition ID rather than the Build Definition Path parameter as part of my Working Directory setup.
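
For reference, this is the kind of change involved in the build agent’s Working Directory property. I believe the first line is the TFS 2010 default; the second is the shorter variant using the Build Definition ID:

$(SystemDrive)\Builds\$(BuildAgentId)\$(BuildDefinitionPath)
$(SystemDrive)\Builds\$(BuildAgentId)\$(BuildDefinitionId)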

Basics First – Building the Solution

Set up a new definition by right-clicking on the Builds node within Team Explorer for your Team Project and selecting “New Build Definition”.

You will need to select a UNC path for the output location on the Build Defaults tab, but aside from that the only configuration we need to worry about for now is on the Process tab.

For the moment, the only setting we need to change is to select the solution file to build, and select the build configuration. Since this is for an internal test deployment environment, I’ve chosen to use a Debug build for the moment.

Lesson Learned 4: Avoid spaces in the Build Definition Name. I’m sure this problem can be avoided by quoting the execution command, but I had some problems executing the web deployment command file due to spaces in the path.

With the build configuration saved, simply right click on the newly created build in Team Explorer and select “Queue New Build”. I’ve found that even if the solution is already building successfully on a local machine, there is still some effort required at this stage to work through any build errors that are produced on the server.

Lesson Learned 5: The TFS Build Server is missing some resources required to build more advanced projects. One such issue I ran into was an error stating that “resgen.exe” could not be found on the server. This can be resolved by either installing Visual Studio 2010 on the Team Foundation Server (an option I wasn’t really all that keen on), or installing the Windows SDK.

Next Step – Setup Web Deployment Packages

Configure the Package setup options for each WCF service being deployed by right clicking on the WCF project in Solution Explorer, and selecting “Package/Publish Settings”.

In general, the default settings are enough to get the package building; however for this scenario I’ve tweaked the following:

  • Uncheck the option to “Include all databases …”. For the moment I’m excluding this because I am not trying to automate the deployment of the database – but even when we do get to this, I believe that the Database project would be better handled with its own deployment package.
  • Change the IIS Site and Application Name for the destination server to something more appropriate. The syntax used here is {IIS Web Site Name}/{Virtual Directory Name}.

Lesson Learned 6: TFS doesn’t build web deployment packages by default. Having the WCF projects configured in Visual Studio 2010 isn’t quite enough – we also need to instruct TFS to build the packages as part of a build. The easiest way to do this is to modify the build definition (right-click on the build definition in Team Explorer, select Edit Build Definition) to include the following MSBuild Arguments in the Advanced section of the Process tab:

/P:CreatePackageOnPublish=true /P:DeployOnBuild=true
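
The same properties can be passed to MSBuild on a developer machine, which is a handy way to confirm that the package builds before involving the build server; the project file name and configuration are placeholders:

msbuild YourProjectName.csproj /p:DeployOnBuild=true /p:CreatePackageOnPublish=true /p:Configuration=Debug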

Final Step – Build Template Tweaks to Deploy Web Packages

The build templates are stored as Windows Workflow files within the TFS project folder $/{Project}/BuildProcessTemplates. You can either choose to edit the Default Template (it is in source control after all, so there is no risk in messing this up), or create a new template to work from.

Note that if you create a new build template, you must be sure to set the Build Definition to use the appropriate build template file using the drop down list at the top of the Process tab of the Edit Build Definition dialog.

Within the build template, I’ve made all my modifications within the “Revert Workspace and Copy Files to Drop Location” step, which in turn is within the finally block of the “Try Compile, Test, and Associate Changesets and Work Items” workflow step. Note that each of these tasks can be added via the Windows Workflow user interface in Visual Studio – though I’ve found that editing the XML file directly is quicker on a machine with limited resources.

Remove the CopyDirectory task for the Binaries Directory:

By default, the TFS build template will copy all the project output into the Build output location. This includes all project files (such as ASPX, SVC and ASMX files), as well as all the compiled binaries. Since we are deploying our application using Web Deployment packages my preference is not to have anything in the output folder except for the required packages.

<!-- <mtbwa:CopyDirectory Destination="[BuildDetail.DropLocation]" DisplayName="Copy Files to Drop Location" Source="[BinariesDirectory]" /> -->

Add a CopyDirectory task to copy the web deployment package output:

<mtbwa:CopyDirectory
    DisplayName="Copy Deployment Package to Drop Location"
    Destination="[String.Format(&quot;{0}\YourProjectName&quot;, BuildDetail.DropLocation)]"
    Source="[String.Format(&quot;{0}\_PublishedWebsites\YourProjectName_Package&quot;, BinariesDirectory)]" />

Add an InvokeProcess task to run the generated deployment command file:

<mtbwa:InvokeProcess
    Arguments="/y /M:YourServer /u:UserName /p:Password &quot;-setParam:'IIS Web Application Name'='IISSite/VirtualDirectory'&quot;"
    DisplayName="Deploy Web Service"
    FileName="[String.Format(&quot;{0}\YourProjectName\YourProjectName.deploy.cmd&quot;, BuildDetail.DropLocation)]">
</mtbwa:InvokeProcess>

Successfully upgraded TFS 2010 from Beta 2 to RC on EC2

Once again I have my TFS instance on EC2 up to date, and I’m able to connect to it successfully from my Visual Studio 2010 RC instance here at home. One word of advice though: once you install VS2010 RC, VS seems to crash if you then try to install Team Explorer 2010 Beta 2, so make sure your code is all checked in before the upgrade, otherwise you won’t be able to make it available to anyone else on your team until the process is complete!

I expected to run into a few issues as part of this process, but the upgrade was fairly painless – thanks mostly to the TFS 2010 Beta 2 to RC Upgrade Guide posted by Brian Krieger. I ran into a bit of trouble setting up SSRS and WSS again … but that was my own fault, not an issue with the upgrade process (I’d forgotten the credentials I’d used last time …).

The good news though is that everything now seems to be working, including my old VS2010 Beta 2 solution files, which opened without any issues or the need for an upgrade.

I did come across a breaking change in the TFS APIs. It would seem that the TeamFoundationServer class no longer supports the AuthorisedIdentity property (in fact the TeamFoundationServer class has now been deprecated in favor of TfsTeamProjectCollection). Changing the reference to use TfsTeamProjectCollection.AuthorizedIdentity seems to have resolved the problem, and everything is building again.
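
For anyone hitting the same compile error, below is a minimal sketch of the updated call. It assumes the client-side TfsTeamProjectCollection class from Microsoft.TeamFoundation.Client and uses a placeholder collection URL:

using System;
using Microsoft.TeamFoundation.Client;

class ShowIdentity
{
    static void Main()
    {
        // Placeholder URL: point this at your own project collection.
        var collection = new TfsTeamProjectCollection(new Uri("http://yourserver:8080/tfs/DefaultCollection"));

        // Replaces the old TeamFoundationServer AuthorisedIdentity reference.
        var identity = collection.AuthorizedIdentity;
        Console.WriteLine(identity.DisplayName);
    }
}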

Can’t debug VS2010 from VS2008

So as part of my experiments with creating and installing Work Item Custom Controls for use in Team Explorer 2010 I inevitably wanted to be able to attach a debugger to an instance of Team Explorer 2010 to interrogate the Work Item object model at runtime.

I’ve done this before for Team Explorer 2008 and wasn’t anticipating any problems. The basic setup can be obtained using these instructions on the Team System Notes blog.

Having followed these instructions I hit a bit of a roadblock. It turns out that VS2008 doesn’t like attaching a debugger to a VS2010 instance. If VS2010 is set up as the start-up application then an error is thrown when starting the project: “The debugger’s protocol is incompatible with the debuggee”. I’m guessing here, but I’d assume that the issue is that VS2010 has been compiled using .NET 4.0, and as VS2008 only supports up to .NET 3.5 the debugger is unable to correctly attach to a 4.0 managed process. I had hoped to keep this machine (the primary one I use for work) free of the full VS2010 components … however having hit this issue I’ve had to download and install the Beta 2.

Anyway, the upshot is that installing VS2010 and attaching the debugger to another VS instance from there solved the problem.