Successfully upgraded TFS 2010 from Beta 2 to RC on EC2

Once again I have my TFS instance on EC2 up to date, and I’m able to connect to it successfully from my Visual Studio 2010 RC instance here at home. One word of advice though: once you install VS2010 RC, Visual Studio seems to crash if you then try to install Team Explorer 2010 Beta 2, so make sure your code is all checked in before the upgrade – otherwise you won’t be able to make it available to anyone else on your team until the process is complete!

I was surprised not to run into more issues as part of this process – the upgrade was fairly painless, thanks mostly to the TFS 2010 Beta 2 to RC Upgrade Guide posted by Brian Krieger. I ran into a bit of trouble setting up SSRS and WSS again … but that was my own fault, not an issue with the upgrade process (I’d forgotten the credentials I’d used last time … 😦).

The good news though is that everything now seems to be working, including my old VS2010 Beta 2 solution files, which opened without any issues or the need for an upgrade.

I did come across a breaking change in the TFS APIs. It would seem that the TeamFoundationServer class no longer supports the AuthorizedIdentity property (in fact the TeamFoundationServer class has now been deprecated in favor of TeamProjectCollection). Changing the reference to use TeamProjectCollection.AuthorizedIdentity seems to have resolved the problem, and everything is building again.
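For reference, the before/after shape of the fix – treat this as a sketch only, since I’m quoting the RC object model from memory (the collection class that ships in Microsoft.TeamFoundation.Client is actually named TfsTeamProjectCollection, and the server URL is a placeholder):

```csharp
using System;
using Microsoft.TeamFoundation.Client;

class IdentityCheck
{
    static void Main()
    {
        // Beta 2 (now deprecated in the RC):
        //   var tfs = new TeamFoundationServer("http://tfsserver:8080/tfs");
        //   var who = tfs.AuthorizedIdentity;

        // RC: go through a project collection instead.
        var tpc = new TfsTeamProjectCollection(
            new Uri("http://tfsserver:8080/tfs/DefaultCollection"));
        tpc.EnsureAuthenticated();
        Console.WriteLine(tpc.AuthorizedIdentity.DisplayName);
    }
}
```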

Further Experiences with TFS 2010 Beta 2 on Amazon EC2

A few weeks ago I published a post containing my experiences building a Team Foundation Server 2010 Beta 2 instance on Amazon EC2. Since then I’ve used the instance I created a number of times, but not without struggles. For a number of reasons, each time I wanted to use my TFS instance to do productive work I found that I had to fight with the configuration a fair bit in order to get all the Windows services running properly.

So after battling with the TFS 2010 instance that I set up on Amazon EC2 for about two weeks now, I’ve made some amendments to the build that seem to have stabilised the environment a little.

  • Disable the automatic generation of the computer name for new instances. While it makes sense for a lot of EC2 usage scenarios to create a new dynamic computer name for each instance, for many Microsoft server-based products it causes a number of difficulties. Specifically, connection strings and hard-coded URLs are littered throughout configuration files and security settings. A colleague of mine recently hit the same problem in MS CRM, and I would expect to see similar issues in SharePoint, BizTalk Server etc.

    For Team Foundation Server 2010, the first issue I hit was that the database connection string for the TFS web services was set to the machine name by default – this one was easily fixed. The more significant issue was that the SQL Server security settings were all based around Windows users in the format {machinename}\{user}. The result was that when I set up a new instance all the SQL Server security roles referenced the previous machine name, and had to be reset before Team Foundation Server could connect.

    Thankfully Amazon has provided a relatively straightforward mechanism to keep the same computer name. There is a property in the config.xml file (C:\Program Files\Amazon\Ec2ConfigSetup) called Ec2SetComputerName. If this is set to disabled then the computer name set when the AMI is bundled will be maintained. An even easier mechanism (that I didn’t discover until much later) is the EC2 Config Service Settings (Ec2ConfigServiceSettings.exe) utility in the same folder, which wraps these configuration settings.

  • Turn off automatic sysprep of new instances. Even with Ec2SetComputerName disabled, if sysprep is enabled for the machine then a randomised computer name will still be assigned. Interestingly, this is in a different format to the normal instance names … but nonetheless it is still less than ideal for an instance we are going to want to create several times.

    Using the EC2 Config Service Settings utility again, disable the Sysprep option on the Bundle tab. Under the covers, this updates the BundleConfig.xml file.

  • Set up the program / data Elastic Block Store (EBS) volume to be automatically assigned the appropriate drive letter based on its volume name.

    One of the other issues I hit was getting the timing right when attaching the EBS data volume while initiating a new instance of my Team Foundation Server AMI. If it is attached too soon then it may get automatically assigned to the D: drive. The result is quite severe:

    • None of the services will start, since SQL Server expects the data files to be on the E: drive
    • The Windows page file is expecting to run on the D: drive – so a large page file is suddenly loaded onto our relatively small data volume
    • The EC2 instance storage (normally the D: drive) is not initialised and remains an inactive drive that needs to be manually mounted and formatted.

    Thankfully, Amazon has an answer for this issue as well. Provided we give our EBS volume an appropriate name (in my case, I’ve renamed it to DataVolume), we can use the same EC2 Config Service Settings tool we used above to force the volume to use a specified drive letter when it is remounted in future.

  • Disable all drives in MagicDisc. Similar to the problem I encountered with the EBS volume being assigned to the D: drive, if MagicDisc is still enabled then its virtual drive will get automatically assigned to the next available drive letter. As this is mounted before the instance storage, the first MagicDisc drive gets assigned to the D: drive. Rather than uninstall completely, I’ve just disabled all drives before bundling the AMI (right-click on the tray icon, select ‘set number of drives’ then ‘disable’).
  • Unrelated to the stability issues I was hitting, I’ve also installed the Visual Studio 2010 Express tools for web development. My next project is likely to involve setting up Team Foundation Server event subscriptions for changed work items, so having the development tools on the same box should be useful. One important observation though … of the 10GB hard disk made available by EC2 … I’m now down to 1.5GB free on the C: drive … 😦
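Taken together, the first three tweaks all come down to small XML edits under C:\Program Files\Amazon\Ec2ConfigSetup. A hedged sketch of the relevant fragments (element and property names as I recall them – the EC2 Config Service Settings utility writes these for you, so verify against your own files):

```xml
<!-- Config.xml: stop EC2Config regenerating the computer name -->
<Plugin>
  <Name>Ec2SetComputerName</Name>
  <State>Disabled</State>
</Plugin>

<!-- BundleConfig.xml: skip sysprep when the AMI is bundled -->
<Property>
  <Name>AutoSysprep</Name>
  <Value>No</Value>
</Property>

<!-- DriveLetterConfig.xml: pin the volume named DataVolume to E: -->
<DriveLetterMapping>
  <Mapping>
    <VolumeName>DataVolume</VolumeName>
    <DriveLetter>E:</DriveLetter>
  </Mapping>
</DriveLetterMapping>
```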

There is one issue I don’t have a fix for yet … and I’m not even sure that it’s an issue … but anecdotally I believe I’m also having trouble with the permissions assigned to the built-in Network Service account on my EBS volume. A couple of times the SQL Server services have failed to start because the Network Service account didn’t have sufficient rights to the data directories. I’m going to keep an eye on this for a while, but for now I’m just re-assigning ‘modify’ rights whenever I come across the issue. I’m wondering if it’s worth using a local service account rather than Network Service … but it’s not causing me enough grief to want to rebuild the instance yet again (it’s been 5 times so far!!!).

I’m sure that there are still a few more idiosyncrasies that I need to iron out before I’m totally happy with this build … but at least I’m starting to feel a little more comfortable that when I start the instance I can use it productively within half an hour or so – rather than having to mess around with the configuration every time.

Setting up a TFS Development Environment on Amazon EC2

Background

I’ve been using the Amazon EC2 environment for a while now, mostly for basic development or testing environments. Since these environments have all been very easy to reproduce, we haven’t worried too much about setting them up to survive an instance failure.

This week however, I’ve had three separate conversations regarding how we can get environments safely built for more complex applications:

  • Team Foundation Server 2010 Beta 2
  • Microsoft CRM / SharePoint Server Development Environments
  • Oracle SE1

In each of these cases, the environment is only required for a short time while developing or demonstrating … after which it will be shut down again.

Each of these needs some form of database storage … and since these are development environments we don’t want to be paying $1/hour for the Amazon SQL Server instance, given that we are entitled to use our development licenses for this purpose.

One serious note about EC2 Drives

Hard drives on EC2 fall into three different categories:

  • (C:) is the system drive, and is automatically attached when an EC2 instance is launched. It should be noted that ONLY data on the C: drive is persisted when an instance is bundled!
  • (D:) is a temporary instance-storage drive, and should be used strictly for short-term storage, as it does not get persisted when an instance is bundled.
  • Additional Elastic Block Store (EBS) volumes may be added. These are treated as additional hard drives, but they persist independently of the instance (with snapshots stored in S3) and therefore will (usually) remain intact beyond the life of an EC2 server instance. As the capacity of the EC2 system drive is so low (10GB for a small instance) it is necessary to store all programs and data that need to be persisted on EBS volumes.

I make this distinction right at the start of my post – because I got caught out by the fact that the D: drive does not get bundled … and had to start all over again 😦
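As an aside, creating and attaching an EBS volume from the command line with the EC2 API tools looks roughly like this – the volume ID, instance ID, availability zone and device name below are all placeholders:

```shell
REM Create a 10GB EBS volume in the same availability zone as the instance
ec2-create-volume --size 10 --availability-zone us-east-1a

REM Attach it to the running instance; Windows will surface it as a new
REM disk to be brought online and formatted in Disk Management
ec2-attach-volume vol-12345678 -i i-12345678 -d xvdf
```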

Initial Setup

For this configuration we’re going to need the following set up on EC2:

  • 1 x 10GB EBS volume for storing installers and ISOs
  • 1 x 5GB EBS volume for the database
  • 1 x EC2 instance for downloading files
  • 1 x EC2 instance for the application installation

I’m working on the assumption that everything is installed on the same instance today – though obviously one could separate the various components of the installation, and use Amazon security groups as a way to restrict access and provide layers of security.

Preparation – Download the Installers

Maybe I’m being overly cautious, but I tend to set up a specific instance just for downloading the application installers and copying them to an EBS volume. This way any bloat caused by the downloads does not affect the size of the installed application AMI when we go to bundle it later.

Note: One of the gotchas I ran into was that the default security settings for Internet Explorer block the download of some of the installation files we need. Rather than messing around with the security settings for every new instance, I decided to download Firefox and run it straight off an EBS volume. This gets around the security issues, and allows Firefox to be easily used on other instances just by attaching the volume. Instructions for setting up Firefox this way can be found here: http://www.articleworld.org/index.php/How_to_run_Firefox_from_an_USB_stick

To get started download installers for the following applications:

Copy each of these installers to an EBS volume, and then shut down the current EC2 instance.

Setup the Base AMI image

I decided to base my build on the “Amazon Public Images – Basic Microsoft Windows Server 2003” image, though in theory any of the Windows-based images could be used. One note of caution though – if the author of the image is not Amazon, ask yourself whether you trust the author not to have built any rogue elements into the image.

Once the image is fully booted, attach the new EBS volume (for storing the application database files) to the instance first, followed by the volume containing the installers. The timing and order of adding these drives is important. If you attach the new EBS volume before the instance is fully booted then the instance gets a little confused and mounts it as D: rather than E: – causing inconsistencies later when we create new instances from the bundled AMI.

After attaching and formatting the new EBS volumes, the instance setup should look something like this:

  • (C:) System drive containing the OS
  • (D:) 160GB blank instance drive
  • (E:) 10GB+ blank EBS volume for storing data
  • (G:) 10GB EBS volume containing our installation media

Other components that need to be installed before going any further are:

  • IIS
  • ASP.NET

In order to install the additional Windows components, it is necessary to mount the Windows 2003 media. The following article contains a list of snapshots that can be used to create Elastic Block Store volumes containing the Windows installation disks:

Note: At this stage I’d recommend changing the Administrator password to something that you will remember next time … I had to restart my installation after finding that a bundled AMI does not seem to regenerate new passwords for instances 😦

Installing SQL Server

On our new EC2 instance, install MagicDisc (or some other ISO-mounting tool) and mount the SQL Server installation media.

The key configuration variations from the defaults were as follows:

  • Set the data directories for the SQL Server instance to be stored on the E: drive.
  • Ensure that all the SQL Server features you require are installed. Since this particular install is for TFS 2010, I’ve enabled Full-Text Search, Analysis Services and Reporting Services.
  • Ensure that SQL Server Management Studio is installed.
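Given how many times I’ve rebuilt this instance, it may be worth scripting the install. A hedged sketch of a SQL Server 2008 unattended install with roughly these options – parameter names from the SQL Server 2008 command-line setup, and the paths and accounts are placeholders, so verify against your own media before relying on it:

```shell
REM Quiet install: engine, full-text, Analysis Services, Reporting
REM Services and Management Studio, with data files on the E: drive
setup.exe /Q /ACTION=Install ^
  /FEATURES=SQLENGINE,FULLTEXT,AS,RS,SSMS ^
  /INSTANCENAME=MSSQLSERVER ^
  /INSTALLSQLDATADIR="E:\SQLData" ^
  /SQLSYSADMINACCOUNTS="BUILTIN\Administrators"
```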

After the installation is complete, it pays to double check that all of the SQL Server folders on the E: drive have been assigned the correct permissions. I found with my install that I had to give the Network Service account some additional privileges in order to get the SQL Server services to start.
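Re-granting those rights can be scripted rather than clicked through; a sketch using the built-in cacls tool on Server 2003 (the path is a placeholder for wherever your data directories ended up):

```shell
REM Give NETWORK SERVICE change (modify) rights on the SQL data folder,
REM recursively (/T), editing the existing ACL (/E) rather than replacing it
cacls "E:\SQLData" /T /E /G "NETWORK SERVICE":C
```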

Installing Team Foundation Server 2010 Beta 2 (Pre-Configuration)

Installing Team Foundation Server 2010 has been described in many blog posts before, so I’m limiting the description in this post to the specifics of getting the install working on EC2.

Installation of TFS 2010 is done in two steps: first the installer is run to lay down the base application files, then the configuration tool is used to set Team Foundation Server up for use.

  1. Mount the Team Foundation Server 2010 Beta 2 ISO image
  2. Set the install location for the Features to the E: drive

Note: At this stage I’d recommend bundling an AMI from our progress so far. The configuration step for TFS (coming up next) can be a bit finicky … so having an easy rollback option is not a bad idea. If you do bundle the AMI at this stage, be sure to reattach the EBS volumes in the same order as last time.

Configuring Team Foundation Server

  1. Create a local service account to run the Team Foundation Service. I’ve called mine TFSServiceAccount.
  2. Start the Team Foundation Server Administration Console
  3. Enter the local user account set up in step 1 (TFSServiceAccount) to run the Team Foundation Server services.
  4. Select whether you are after a Basic or a Standard configuration scenario. Either way, all the defaults should be sufficient to get the installation working – though I did find I had to spend a bit of time sorting out IIS and the Reporting Server (since I’d made the mistake of installing SQL Server before IIS).

Setting up SQL Server Backups

Even though the SQL Server databases are being stored on Elastic Block Store volumes – there is still a chance that the EBS volume itself will fail. Don’t get me wrong … EBS is much safer than storing the data directly on the instance storage – but there is still a risk that it can be irretrievably lost. The following article by Michael Friss discusses the different backup strategies for SQL Server on an EC2 instance – and also includes a good PowerShell script for performing the backup and saving it to S3.
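Even without the full PowerShell script, the core of each backup is a one-liner; a sketch using sqlcmd against the default instance with Windows authentication (the database name and target path are placeholders – TFS 2010 creates several databases, so in practice you’d loop over all of them):

```shell
REM Back up one TFS database to a local file; ship the .bak to S3 afterwards
sqlcmd -S . -E -Q "BACKUP DATABASE [Tfs_Configuration] TO DISK='E:\Backups\Tfs_Configuration.bak' WITH INIT"
```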

Any Comments …

I have to admit, I hit a number of issues performing this setup and had to restart the overall process three times. I’m sure there are better ways to set up some of these features … especially the SQL Server components. I’d like to hear any feedback you’ve got on what works well – so please leave a comment if you have any thoughts.