Monday, December 08, 2008

Excluding Build Types from Alerts

Friday I got a question from one of my colleagues asking how to exclude a particular Build Type from their Alerts. I knew it could be done, but wasn't sure exactly what the proper filter would be. Thanks to some general guidance from Jason Prickett's blog post, I think we figured it out.

The easiest way to do this is to use the Power Tools "Alerts" interface, which shows up under a Team Project in Visual Studio's Team Explorer. Simply add a 'DefinitionPath' = '\TeamProject\TeamBuildType' clause, and add Or clauses if there is more than one Build Type in a Team Project you want to grab.
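
For instance, a combined filter for two build types (the definition names here are hypothetical) would look something like:

'DefinitionPath' = '\TeamProject\NightlyBuild' Or 'DefinitionPath' = '\TeamProject\CIBuild'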

For those of you using Teamprise, you can install the free Visual Studio Team Explorer and use its command line tool called BisSubscribe. From the command line, run something similar to this:

bissubscribe.exe /eventType BuildCompletionEvent2 /server http://tfs.mycompany.com:8080 /address firstname.lastname@mycompany.com /deliveryType EmailHtml /filter "\"TeamProject\" = 'TeamProject' AND \"DefinitionPath\" = '\TeamProject\TeamBuildType'"


Hope this helps.

Tuesday, December 02, 2008

Using the command line to search for Changeset comments.

Question: Mac, how can we search Changeset comments, recursively, for comments matching a certain string (e.g. "refactoring")?

My Answer: I would just use the command line client and pipe the output to a string searcher.

Windows: tf history /server:http://tfs.mycompany.com:8080 "$/Team Project/Development" /noprompt /recursive | findstr refactoring

Linux: tf history /server:http://tfs.mycompany.com:8080 "$/Team Project/Development" /recursive | grep refactoring

*The Linux example uses Teamprise's Command Line Client software.

TF214007: No build was found with the URI

After making some changes to our build machine, we started getting the following error. I got asked to look into the issue and became immediately confused.

Exception Message: TF214007: No build was found with the URI vstfs:///Build/Build/14365. Either the URI does not exist, or TLR\svcTFSService does not have permission to access it. (type BuildNotFoundForUriException)

I was able to log onto the build machine and hit the PROD URL for build 14365 just fine, so I knew it couldn't be permissions. What was it, then? I wondered while scratching my head.

Six months back we migrated our TFS instance from DEV hardware to PROD data center hardware. As part of that process, we updated DEV's Instance IDs so we wouldn't run into conflicts when trying to hit either site (we continue to use DEV for testing).

While any new connection to DEV would get the new Instance IDs, the legacy connections on our build servers still had the cached version of DEV's Instance IDs, which just so happened to be our PROD Instance IDs (as expected, since we migrated from DEV to PROD).

So what was happening was: when someone was making changes to the build machine, they opened Visual Studio and got connected to DEV by mistake. DEV, using the cached Instance IDs (again, PROD's), somehow became the active TFS server. So when Team Build fired off, it was actually hitting DEV, which obviously didn't have a build with URI 14365. Thus the error.

To fix this, we deleted the TFS cache on the build server. Then to test it, we purposely logged into DEV and checked our local cache. Sure enough, the new DEV entry in the cache had the new Instance IDs. All was well after that.
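
If you need to do the same, the cache is the same folder that holds the client's VersionControl.config (the path below is for the 2.0 client; I'm assuming the build server uses the same location):

rd /s /q "%USERPROFILE%\Local Settings\Application Data\Microsoft\Team Foundation\2.0\Cache"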

Wednesday, November 26, 2008

Check-in comment to dynamically enable or disable CI builds.

I just got done adding some custom MSBuild changes to automatically check a Jar file in to another build type (i.e. MainBuild in this example) after the Jar file build type (i.e. CommonBuild in this example) finishes.

Buck showed me a really cool way to "enable" or "disable" continuous integration (CI), which is turned on for us, in the MainBuild. That is, when CommonBuild checks in the Jar file to MainBuild, we can control whether we want to fire off a MainBuild.

First we set a Property called ExecuteMainBuildCIBuild to true|false. If true, we set the check-in comment to /comment:"AUTO CHECKIN FROM TFS FOR COMMON JAR BUILT IN $(BuildNumber)". If false, we set the check-in comment to /comment:"$(NoCICheckinComment)". $(NoCICheckinComment) resolves to the string ***NO_CI***, which tells Team Build not to fire a CI build.
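
A rough sketch of what this looks like in MSBuild (the CheckinComment property name is ours for illustration; $(NoCICheckinComment) and $(BuildNumber) are standard Team Build properties):

<PropertyGroup>
  <ExecuteMainBuildCIBuild>true</ExecuteMainBuildCIBuild>
</PropertyGroup>
<PropertyGroup Condition=" '$(ExecuteMainBuildCIBuild)' == 'true' ">
  <CheckinComment>AUTO CHECKIN FROM TFS FOR COMMON JAR BUILT IN $(BuildNumber)</CheckinComment>
</PropertyGroup>
<PropertyGroup Condition=" '$(ExecuteMainBuildCIBuild)' != 'true' ">
  <!-- Resolves to ***NO_CI***, which suppresses the CI trigger on this check-in -->
  <CheckinComment>$(NoCICheckinComment)</CheckinComment>
</PropertyGroup>

The resulting value is then passed to tf checkin as /comment:"$(CheckinComment)".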

End users can also use ***NO_CI*** if they want to control the CI build behavior. For example, if you wish to check in a change without executing a CI build, you can simply put ***NO_CI*** in the Comments section. I'm not sure management will enjoy us sharing this back door; however, I've found it to be a valuable tool for solving some unique problems.

Friday, November 21, 2008

Update on our TFS adoption

Hello again! Yes, I'm still alive and kicking. Sorry for not writing any updates lately. I've been working on some other things outside of TFS. In addition, I've got a number of colleagues who are taking over the primary role as TFS administrators. I still get involved at times, but it's good to see others taking an interest in supporting the product.

I do have to share some tough news about TFS. TFS is taking a bit of a public relations beating in my new group. So much so, in fact, that a "TFS Improvement" team has been created to look at changes we can make, both to our use of the tool and as possible suggestions for the vendor on how to make TFS better.

I'm a bit confused about where the movement is coming from, as I'm not part of the "improvement team." However, in reviewing the meeting notes, it looks like the Work Items used to support an Agile development effort are a primary target. Here is a sampling of the complaints:

- Need a hierarchical representation of work items that shows partial completion
- Enable users to find and track dependencies more easily
- Need to map work items to current priorities, in/out list
- Need to make it easier to set up alerts on work items for yourself and others
- People should be able to create a Team Query
- TFS should auto-complete stories when all tasks are marked as completed. Or, depending on workflow, it should auto-magically assign the story to the business partner for final validation
- TFS should do the same to backlog stories when the associated tech stories in the work stream are completed


Looking through this list, I think there are many things we could do to address these somewhat vague issues people are complaining about. For example, Alerting is now enhanced with the new Power Tools, and I see TSWA also has a way to set up Alerts. That complaint may have come from the Teamprise people, who are still having some trouble with Alerting; we've shown them TSWA and I think that is helping. The Team Query item is simply a permissions matter that we can work out internally.

However, there is one primary feature that I think would solve a number of the problems we're having: a hierarchy that can perform "Actions" when parents or children are updated. For example, the two bottom "ideas" (from a former Version One user who recently joined our company) are about a parent Work Item's State changing when its child Tasks are completed or updated. This very vocal person would also like to see field roll-ups. That is, if the child Tasks had "Hours Worked On Task," those values should be rolled up into the parent Work Item (a Product Backlog Item in our case), so you can see the total "Hours Worked On" across all the child Tasks.

We'll see where this goes. There is a lot of movement toward looking at Version One right now, and I see they have a TFS interface, so I'm sure that will be talked about in the management ranks. Personally, I like what we have and wouldn't suggest a change. However, I often think decisions get made because a few squeaky wheels start to complain. I guess we'll see what comes of this all. And hopefully Rosario will provide a number of nice new features for Work Item tracking; I think it will.

Friday, October 24, 2008

Team Companion

I'm actually on vacation today, but wanted to share this. The business and management folks don't like using an IDE to work with TFS. Turns out even going to TSWA is a struggle for some.

My colleague was doing some digging and came across Team Companion, a TFS plug-in for Outlook. We've been piloting it for a few weeks. The users least likely to ever touch TFS have been enthralled with having Work Items inside Outlook. From what I hear, they can't say enough great things about it. We're actually getting people interested in normalizing the paperwork behind software development.

If you're having a hard time getting management and business folks working in TFS, give Team Companion a try. With TFS embedded in the only software package they use (i.e. Outlook), TFS becomes even more integral to our business process.

Friday, October 10, 2008

"Mac, how do I compile Java with Team Build?"

In the past week, I've fielded a number of questions from people within my company regarding how to get started compiling Ant based Java projects within TFS. Here is what I tell them. And it would be what I'd tell you if you asked.

My Colleagues: "Mac, can you provide us some information on how to get started using Team Build to compile our Java projects?"

Mac: "There are a number of different ways to solve the problem, but I'd suggest looking at Teamprise's Team Build Extensions. This is an open source project, supported by Teamprise, to run Ant scripts via Team Build (which is nothing more than MSBuild behind the covers). They have a very good tutorial (as part of the distribution if I remember) and support forum.

I'd start by trying to get one of your simpler Java projects to build. Once you've built that, you'll have a better understanding of how the tooling can best solve your problem. What we do is have one parent Ant script that calls all the other tiers' Ant scripts. So basically Team Build is just the high-level driver, and we let Team Build Extensions and Ant take care of the rest. This seems to work pretty well for both [MY CURRENT BUSINESS UNIT] and [MY OLD BUSINESS UNIT]. Also, if you have JUnit tests that run as part of the build and you follow some simple rules, Teamprise Build Extensions will publish the results back to TFS. Pretty cool!"


So if you're looking for help getting started, consider this my response to you. Let me know how it goes.

Wednesday, October 01, 2008

Be careful with the TFS 2008 Retention Policy

Be careful with the TFS 2008 Retention Policy. With TFS 2008, when you set the Retention Policy to remove old builds, it also removes the Label on the source code. Bad! In SP1, Microsoft allows a configuration override to change this behavior (i.e. <add key="PreserveLabelsOnBuildDeletion" value="True"/>). However, since we're still on TFS 2008 without SP1, we just got burnt by this.
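
For reference, my understanding is that the SP1 override is an appSettings entry (we haven't applied it ourselves yet, so treat the exact placement as an assumption):

<appSettings>
  <add key="PreserveLabelsOnBuildDeletion" value="True" />
</appSettings>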

What happened was, we needed to branch off a build that was in our customer's hands. I informed my colleague to simply branch off that build and let the developers make the appropriate fix. However, when he looked for the Label, he couldn't find it. I told him that was simply not possible. Then I read the post Retention Policy and missing labels and started to get very, very nervous.

We ended up with a workaround that solved our problem. We have a Build report that shows all builds and when they ran. We were able to find the build number and then, using the time it started execution, find the last Changeset before the build initiation. That allowed us to track down where we needed to branch the code from.
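
Something like the following tf command (server and path hypothetical) will return the last changeset checked in before a given build start time:

tf history /server:http://tfs.mycompany.com:8080 "$/TeamProject/Main" /recursive /noprompt /stopafter:1 /version:D2008-09-29T14:30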

Thursday, September 25, 2008

Manually Adding Files to Pending changes in Teamprise's Plug-in

There have been a few times recently when users have added files in Eclipse, but the files were not picked up by Teamprise's Eclipse Plug-in as a Pending Change. If this happens, the new file will have a "?" on it. It's one of those incidents we can't seem to reproduce, but it happens every so often, and it seems to hit a small subset of users.

If you run into this problem, a workaround is to manually add the files. I'm doing this from memory, so the steps might be slightly off. However, I think it's close.

  1. Show the Team Explorer Panel in Eclipse.  Window > View > Teamprise > Team Explorer.
  2. Make sure your Team Project is selected and expand it.  There you will see Source Control.
  3. Double click on Source Control.
  4. In Source Control, path yourself down to the correct directory where the files were intended to be added.
  5. Right Click on the directory you're wanting to add the file to and say "Add Files to Source Control" (or something like that).
  6. Now select the files that were not added by the plug-in.
  7. Back in Eclipse's Package Explorer (or whatever view you're using), Right Click > Team > Synchronize.
  8. The Synchronize command should now make the "?" files show up as "+" files and thus added to your Pending Changes.

Hope this helps anyone who runs into a similar situation.

Tuesday, September 09, 2008

<UpdateBuildNumberDropLocation> and NFS File Systems

Our current Team Build drop location is a Common Internet File System (CIFS) mount to some network attached storage (NAS).  To a novice like me, the drop location looks just like a Windows server drive that is shared.  However, on the back end this CIFS mount has all the bells and whistles that the data center offers (e.g. backups, redundancy, etc.).

The issue we have with the CIFS mount is that while our Windows servers can pick up code from there for deployment, our Linux boxes can't see the CIFS mount. Since a lot of our development is done in Java and runs on Linux, we have an issue.

Enter the fine storage and system admin teams, who have tested the use of a Network File System (NFS) mount, as a replacement for the CIFS mount, that is accessible by both Windows and Linux servers. We've tested the NFS mount and, sure enough, both our Windows and Linux servers can mount the same drop location and pick up the build output used for deployment. Perfect, right? Unfortunately, there is a small issue.

According to this post, Team Build's <UpdateBuildNumberDropLocation> task has code in it which needs to update permissions on the drop location. From what I understand, the code changes the permissions so the App Tier service account has access to the drop location (it needs access to delete builds and such). To see this, right-click on a file located in the drop location and notice how the App Tier service account has full access to the folder and any sub-files or directories. The <UpdateBuildNumberDropLocation> task is doing this.

The issue I'm running into is that when the drop location is an NFS system, there are no permissions at the file or directory level. For Windows, the permissions are at the share level; I'm assuming Linux permissions depend on whether the mount is read or write. So when the <UpdateBuildNumberDropLocation> task runs, it gets a message back saying permissions can't be set and throws this friendly message back to the user: "TF209025: The build process is unable to set the permissions on the drop directory <actual drop location> (Detail Message: Attempted to perform an unauthorized operation.). Make sure that the build service account has proper permissions on the build drop directory and try again."

What's interesting is that the share is wide open, and thus the NFS mount is wide open, and Team Build actually writes files to the location. So functionally everything should work fine if we could get past the part of the <UpdateBuildNumberDropLocation> task where it tries to adjust permissions.

I'm not sure what a good solution is for this. However, I think the current implementation needs adjusting. In my Connect submission to Microsoft, I've asked them to provide a Property (e.g. UpdatePermissions="false") so you can turn off the permission change when you want to use an NFS mount for your drop location.

Team Explorer "did not load because of previous errors"

My colleague installed TFS 2008 SP1 on our app tier. He then uninstalled it because we were having some issues, which we're still tracking down. However, after uninstalling, we were unable to open Team Explorer due to the following error:

The Microsoft.TeamFoundation.Client.ServicesHostPackage, Microsoft.VisualStudio.TeamFoundation.TeamExplorer, Version=9.0.0.0, Culture=neutral, PublicKeyToken=b03f5f7f11d50a3a ({8E01EC3E-2928-4AA5-B720-E28C163818E6}) did not load because of previous errors. For assistance, contact the package vendor. To attempt to load this package again, type 'devenv /resetskippkgs' at the command prompt.

I'm not sure what /resetskippkgs does, but I ran it and Team Explorer is now back to being functional. If you run into a similar problem, I'd be interested to hear whether /resetskippkgs works for you.

Friday, September 05, 2008

"Find in Source Control"

It's fall, which means all the interns from MIT, the University of Wisconsin, and the University of Minnesota (I'm an alumnus) head back to school. It's also time for us full-time employees to pick up the messes they left.

For me, their messes usually include thousands of files left checked out in TFS (or VSS, back when we used that). In the past, I just deleted their workspaces from the command line. That works pretty well and, to be honest, should be the way we do this.
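
For reference, the command looks something like this (workspace and user names hypothetical; the same syntax appears in the TF50605 post further down the page):

tf workspace /delete /server:http://tfs.mycompany.com:8080 InternWorkspace;InternUser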

However, my colleague showed me a really cool feature called "Find in Source Control" where you can search on a person's ID to see all their pending changes. Then you can undo them right there in a nice GUI. Works great when you've got some lead engineer breathing down your neck because they can't update source code while some intern holds an exclusive lock on the file. Nice feature, Microsoft. Keep up the good work.

Friday, August 29, 2008

HintPath and Team Build

Very long week, so I've got to keep this short. For some reason, when a developer added a Reference to his csproj file, it didn't add a <HintPath> element. The build worked locally, and it worked on the build machine when we ran the build manually through Visual Studio. However, when trying to run the build via Team Build, we got the error "Could not resolve this reference. Could not locate the assembly." We could see that when Csc.exe was called, our Reference was not referenced.

The fix was to add a <HintPath> to the csproj file. After doing that, Team Build ran fine and we could see the Reference added to the Csc.exe command line. I didn't get time to figure out why, but thought I'd document it here for when it happens again.
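
For reference, a Reference with a HintPath looks like this in a csproj file (assembly name and path hypothetical):

<Reference Include="ThirdParty.Library">
  <HintPath>..\..\lib\ThirdParty.Library.dll</HintPath>
</Reference>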

Tuesday, August 12, 2008

Source Control Cache Hits on our server.

I enjoy reading Grant's blog. In this Cache Hits post, he shows how to get your Cache Hit Ratio.

We're currently hitting our local Source Control cache 96.06% of the time. What are your numbers? I'm interested in comparing.

Monday, August 11, 2008

Using $(WebProjectOutputDir) in the *.csproj file's PostBuildEvent

A colleague of mine wanted to add a PostBuildEvent to their *.csproj file which would run a custom executable to create generated JavaScript files.

The change (see BEFORE below) worked locally, as the path where they wanted the generated files was $(ProjectDir), which mapped to the .\Sources directory on the build machine and their local directory within their IDE. The issue was that we needed the generated files under .\Binaries on the build machine, which didn't exist on their machine. More specifically, we needed the generated file created under $(OutDir)_PublishedWebsites\$(MSBuildProjectName) on the build machine, which translates to .\Binaries\_PublishedWebsites\$(MSBuildProjectName).

What we ended up doing was, instead of using the $(OutDir) property, using $(WebProjectOutputDir), which is set in the $(MSBuildExtensionsPath)\Microsoft\VisualStudio\v9.0\WebApplications\Microsoft.WebApplication.targets file. Using this value allows the output directories to be correctly set both on the build machine (where we want .\Binaries) and locally (where we want the $(ProjectDir) structure).

BEFORE:
"$(TargetDir)JavascriptFromMvcRoutes.exe" "$(TargetDir)$(TargetFileName)" "$(ProjectDir)js\Generated.js"

AFTER:
"$(TargetDir)JavascriptFromMvcRoutes.exe" "$(TargetDir)$(TargetFileName)" "$(WebProjectOutputDir)js\Generated.js"

Thursday, August 07, 2008

TSWA Profile > Options Not Staying

A colleague of mine tracked down an issue where users' TSWA Profile > Options settings were not sticking after they closed the browser. For example, a user would change their Theme to Olive, but after they closed their browser it would revert to the default.

We're not sure if this is the right fix, but we had to change permissions on the D:\Program Files\Microsoft Visual Studio 2008 Team System Web Access\Web folder so that "Users" had Modify permissions. Users are now able to change their Options, and they stick.

Friday, August 01, 2008

Question Marks "?" on files in Teamprise

A colleague of mine summoned me over to his desk. He had question marks "?" on all the files he had checked out from the Teamprise Plug-in and was wondering why.

Typically I see files with question marks on them when the user has changed the file permissions without checking the file out. Teamprise, not knowing what the status of the file is (as I would expect), labels the file with a small question mark indicating that something has changed, but it's not sure what. Or something like that.

However, in this case the user had promised me the files were checked out. He had the Pending Changes to prove it.

I did a quick search and after digging through a couple of my old posts, I found Martin's post Teamprise V2 Preview 2 where he discusses better off-line support. It's here he states "When you next connect, all the read/write files will have a little question mark on them inside Eclipse."

Now, the user is on Teamprise 3.0, and my thought is that this "question mark" behavior wasn't changed until 3.1 (which, by the way, is what I'm using). He's been having a number of network issues, so my guess is that his network connection dropped, causing Teamprise 3.0 to go into "off-line" mode. Once he was back online, he would have needed to run "Team > Synchronize" to sync back up with the server. This is exactly what we did, and the question marks were changed back into check marks indicating checkout.

I couldn't reproduce this, as I think Teamprise changed the behavior in 3.1. If I get some time next week, I'm going to roll back my Teamprise version to 3.0, check out some files, drop my network connection, and see if I can reproduce the error. Or, more likely, simply upgrade him to 3.1!

Thursday, July 24, 2008

Bug in TFS 2008: "TF50605: There was an error looking up the SID for "

Here is a Bug we found in TFS 2008. At least I think it's a Bug. It happens when you try to delete (or perform other workspace actions on) a workspace owned by a user who is no longer in Active Directory.

1) Create an AD account and use it to log into TFS 2008, create workspaces and check out files.

2) Then, leaving some files checked out, delete the account in AD. This is what happens when someone leaves your company.

3) Go back to TFS and try to delete their workspace. E.g. tf workspace /delete /server:http://tfs.int.mycompany.com:8080 Z055638-XPA;Z055638 (where Z055638 is the user who left the company).

4) You should get the error "TF50605: There was an error looking up the SID for Z055638", which to me means that since the account can't be found in AD (the user left the company), you can't delete their workspace and thus can't undo any files they have checked out.

To resolve this, we had to manually update the Tbl_Workspace table in the TfsVersionControl database. Cameron Vetter's post gave us the idea. We changed the OwnerId field to a valid user for that workspace. Then we were able to delete it.

Not ideal, but worked for us. I've submitted this to Connect as well.

"Waiting for version updates to finish..."

Last week we noticed a colleague's Teamprise hanging at the end of a Get Latest Version. The message was something along the lines of "Waiting for version updates to finish...". Unfortunately, on my local machine everything worked as expected. Typical, I guess.

Anyway, I was playing around with the VPC image developers (and the person having trouble) use, and noticed that *.tfs.mycompany.com* was not listed in the Eclipse proxy exclusion list. It is my belief that if *.tfs.mycompany.com* is not added, you'll get the strange "Waiting for version updates to finish..." behavior, and possibly other network-related issues. In addition, Teamprise has a good article titled TKB00021 Diagnosing Common Connection Problems that also provides help with connection issues.

To properly set your proxy settings in Eclipse, go to Window > Preferences > General > Network Connections and add *.tfs.mycompany.com* to your "No Proxy for:" host list. There is also a shortcut in Teamprise if you hit a wall during the initial Import.

Wednesday, July 23, 2008

Why some files don't get pushed to the TFS drop location.

Yesterday I got a question about why *.dat files were not being added to the .\Binaries folder during a Team Build, and thus not pushed to the TFS drop location. The *.dat files are used by third-party software we use to track browser requests.

After digging around a bit, we found that the *.dat files were added to the .NET 3.5 C# web project with the element <None>. Other files from this third party were added as <Content>.

...
<Content Include="Views\Browse\ItemControl.ascx" />
<None Include="bin\bhawk_sp.dat" />
...

A very intellectual developer in our group (not me) found that the *.dat files were added with the "Build Action" property defaulted to None. To change this, I believe he right-clicked on the file > Properties > and changed the Build Action to Content.
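
After the change, the entry in the project file looks like this:

<Content Include="bin\bhawk_sp.dat" />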



Now the *.dat files get copied over to the .\Binaries directory and pushed to the Drop Location.

Friday, July 11, 2008

Analysis Services hung on Data Tier

While looking at our TFS 2008 Data Tier today, I noticed that the msmdsrv.exe process (i.e. Analysis Services) was using 99% of the Data Tier CPU. I knew that when OLAP processing happened, the Data Tier CPU would spike. But this spike was constant. Something was wrong.

When running the GetWarehouseStatus web service (i.e. http://localhost:8080/Warehouse/v1.0/warehousecontroller.asmx?op=GetWarehouseStatus) we got "ProcessingOlap". Typically this state lasts a minute or two for us. Oddly, we had been seeing this status for over an hour (though I'm sure it was happening for a while longer than that).

We figured the process was hung and wanted to restart it. We have about 150 users on the system, so we didn't want to recycle IIS and drop them. So instead of recycling IIS on the App Tier, we shut down the Analysis Services service on the Data Tier. This brought the Data Tier CPU usage back to normal.

Then to test things out, we invoked the data warehouse update (using the Run web service). When the OLAP processing ran, the Data Tier process neared 100% a number of times (as expected). After about a minute, the process finished and the Data Tier was back in good standing.
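
(For reference, the Run operation lives on the same controller as GetWarehouseStatus, i.e. http://localhost:8080/Warehouse/v1.0/warehousecontroller.asmx?op=Run.)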

We're not sure what to take away from this, but it's something we'll have to monitor.

Tuesday, July 01, 2008

Odd Security issue on Windows 2008 Server running IIS 7.0

To be honest, I can't make sense of this error or its resolution. However, I wanted to write it down in case we ever run across it again.

My project team has the first Windows 2008 servers in the data center. This means that if something goes wrong, everyone gets to stare back at us saying, "Why in the heck did you guys adopt this new OS so fast?" And true to form, something did go wrong. We started getting the error "The server encountered an error processing the request. The exception message is 'Access is denied.'" when performing some actions (of which I'm not 100% sure, as another sub-group wrote the code) in a Web Service running under IIS 7.0.

We knew this had to be some kind of security issue, so we went through the standard steps. Oddly enough, our Data Center contact (who is a Microsoft wizard) stumbled on a resolution where he added "Network Service" to the local Administrators group (this was a last resort). As we kind of expected, this resolved the error. He then removed "Network Service," as having it in the local Administrators group would be stupid, and rebooted. We expected the error to come back. Nope, that seemed to fix "it." We repeated this on one other machine and magically it fixed that one as well.

We're not sure why this works. We continue to laugh about it and just hang our heads low.

The server encountered an error processing the request. The exception message is 'Access is denied.'. See server logs for more details. The exception stack trace is:

at System.ServiceModel.Dispatcher.SyncMethodInvoker.Invoke(Object instance, Object[] inputs, Object[]& outputs)
at System.ServiceModel.Dispatcher.DispatchOperationRuntime.InvokeBegin(MessageRpc& rpc)
at System.ServiceModel.Dispatcher.ImmutableDispatchRuntime.ProcessMessage5(MessageRpc& rpc)
at System.ServiceModel.Dispatcher.ImmutableDispatchRuntime.ProcessMessage4(MessageRpc& rpc)
at System.ServiceModel.Dispatcher.ImmutableDispatchRuntime.ProcessMessage3(MessageRpc& rpc)
at System.ServiceModel.Dispatcher.ImmutableDispatchRuntime.ProcessMessage2(MessageRpc& rpc)
at System.ServiceModel.Dispatcher.ImmutableDispatchRuntime.ProcessMessage1(MessageRpc& rpc)
at System.ServiceModel.Dispatcher.MessageRpc.Process(Boolean isOperationContextSet)

Monday, June 30, 2008

Bug in TfsAdminUtil Status

While practicing hardware moves for both the App Tier and Data Tier, we're making use of the TfsAdminUtil command. We've had some issues with this tool, mainly around really bad error reporting. It seems like you can get the error "ERROR: TF55020: Could not access database." for any number of reasons, and no other help is provided (e.g. "Check to make sure the Team Foundation Server App Pools and Site are up and running"). I'll try to gather some of the possible causes and solutions when I get more time.

While trying to track down the root cause of this error, we made use of the "Status" switch. Below is what came back. You can see that Status says no Windows services are running as our TFSReports account. However, in the Services window you can see that "SQL Server Reporting Services" most certainly is running as TFSReports. I think this might be a bug in the tool, which, given the poor error message handling, would not surprise me.

Friday, June 20, 2008

Ant and TFS 2008

Yesterday estrada pinged me about how we build Java projects in TFS 2008. If you remember, I wrote an article a while back (about a year ago) on how we did it with TFS 2005.

My current group has written a custom MSBuild task to call Ant scripts. I would prefer to use Teamprise Build Extensions, but I joined this group late and they already had much of the infrastructure written. We do some custom things like writing JUnit test results back to TFS. I don't believe Teamprise Build Extensions has that functionality, though I could be wrong.

Anyway, my recommendation for calling Ant scripts with TFS 2008 would be to use Teamprise Build Extensions. The setup instructions are great and it's a nice, easy way to get up and running.

Let me know how it goes!

Wednesday, June 11, 2008

"Windows SharePoint Services version 2 templates are not supported in this version of the product."

Today a trusted colleague of mine tried to create a new Team Project in TFS 2008 using the eScrum process template. We've created around 10 of them in the past with the eScrum process template and everything went fine. Today, however (which happened to be a day I'm off sick with a nasty cold), we got the error "Windows SharePoint Services version 2 templates are not supported in this version of the product." Why we're getting this now and not in the past, I'm not sure. Either something changed or the eScrum template got corrupted on the server; I'm not sure which is more probable.

After pounding my head against the wall, I ran into "Mike's Blog" (a great read if you have some time) where he gives one solution to our error. Here is my summary.

1) Did all the steps under Step 1) of Mike's article, except bouncing IIS. Here were my commands for deleting and uploading the site template:

stsadm -o deletetemplate -title eScrum
stsadm -o addtemplate -filename C:\eScrum.stp -title eScrum

2) I also downloaded the eScrum process template, saved the eScrum.stp to .eScrum\SharepointTemplate (overwriting the one loaded), and uploaded the new process template labeled as "eScrum with WSS 3.0". To be honest, I'm not sure if this was needed, but I did it anyway.

3) Tried to create a Team Project called "Vertical Static Content" with the new upload I called "eScrum with WSS 3.0", and it worked!

NOTES: During some experimentation, we got the error "Plugin error text: TF30272: Template not found on the server" when trying to create a Team Project after deleting the eScrum template and before adding it back again. This leads me to believe there is something, other than what's in the Process Template, which needs to be on the server when creating Team Projects. Why? I can't understand.

Monday, June 02, 2008

Changing TFS Build Agent to look at a different TFS Server

Just because I continue to forget: if you need to change the TFS Server for your TFS Build Agent, change the add key="AllowedTeamServer" value="http://tfs.mycompany.com:8080/" element in the C:\Program Files\Microsoft Visual Studio 9.0\Common7\IDE\PrivateAssemblies\tfsbuildservice.exe.config file, then bounce the TFS Build Agent via Services.
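
In context, the entry looks like this:

<appSettings>
  <add key="AllowedTeamServer" value="http://tfs.mycompany.com:8080/" />
</appSettings>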

We have seen some instances where this change does not take effect right away. We even bounced the agent more than once with no effect. Oddly enough, when we got back from lunch everything was working. My assumption is that the agent uses a cache that was not getting refreshed. Either that, or heading to lunch fixes most everything ;).

CSS change for rendering Reporting Services correctly in Firefox

Our Firefox users are having an issue with Reports rendering in a small scroll-like box. They can only see the first two inches of a report and must scroll down to see the rest.

From what I can tell, a number of people are having this issue. If you search for "Reporting Services in Firefox", the first hit you get is Jon Galloway's post, where he describes making the following CSS change.

/* CUSTOM CHANGE BY <My Company Name>: Fix report IFRAME height for Firefox */

.DocMapAndReportFrame {
    min-height: 860px;
}

I ran a few test reports after making this change and I think it's going to work for us for now.

Thursday, May 29, 2008

Adding an Organization Hierarchy in Team Build


We really like Team Build, especially Team Build 2008, which adds continuous integration. However, any good product can always be improved.

To the left you'll see a screenshot of one of our Team Projects and the builds under it. You can't see them all, but there are 42 build definitions set up. Having them all in a flat view can be a bit overwhelming, so adding the ability to group these build types would be very cool. For example, the DownloadServlet builds might live under a folder called \Team Builds\DownloadServlet. Builds that run the unit tests could be grouped as well.

The feedback I'm expecting is: why didn't we just break these out into separate Team Projects? Separate Team Projects would provide the organization I'm suggesting. While valid, we have hundreds if not thousands of "applications" to support, most of them maintained by sub-groups in the company. For example, the applications behind these build types are all worked on by one group. We find it easier to group those applications together in one Team Project instead of giving each its own autonomous Team Project. We do the same for other groups and it works very well.

What are your thoughts? Would adding the ability to "group" build types be advantageous for other teams?

Tuesday, May 27, 2008

Bounce IIS after re-stamping your TFS Databases with InstanceInfo.exe

We saw an interesting scenario today after restoring our TFS PROD instance to a TFS QA environment for testing. Even though we changed the Instance ID for each database (see here for instructions), the App Tier was still returning the old Instance ID for the TfsVersionControl database. This was causing the Source Control view to keep looking at the PROD server even though it said "QA". All other databases were returning the new Instance ID.

If you're not familiar with the Instance ID, let me take a second to explain it. The Instance ID is basically a GUID (globally unique identifier) for a TFS database. While you may refer to the TFS Server as tfs.mycompany.com, your client software (e.g. Team Explorer) prefers to use the GUID.

From what Grant Holiday says, basically what happens is your client sends the URL (e.g. tfs.mycompany.com) to the server's Registration.asmx web service to get the GUID (e.g. 2dbgh947-049b-7z6c-94y2-4b0767ggf790). Once the client has the GUID, it will use that for future communication.

In our case, we properly changed the GUID using the InstanceInfo.exe utility and the directions in this forum. However, the App Tier must have been caching the TfsVersionControl GUID, as both PROD and QA were responding with the same GUID even though the database GUID had been re-stamped. Here are the PROD and QA sections of our VersionControl.config file stored under %USERPROFILE%\Local Settings\Application Data\Microsoft\Team Foundation\2.0\Cache. Interestingly enough, the repositoryGuid would stay the same but the uri would change when we switched TFS Servers. Bad!!!

<ServerInfo uri="http://tfsqa.mycompany.com:8080/" repositoryGuid="2dbgh947-049b-7z6c-94y2-4b0767ggf790" />
<ServerInfo uri="http://tfsprod.mycompany.com:8080/" repositoryGuid="2dbgh947-049b-7z6c-94y2-4b0767ggf790" />

The fix was to bounce IIS on the QA App Tier. This reset the App Tier cache and thus resolved our issue. The lesson learned here is to bounce IIS after you re-stamp your TFS databases with new GUIDs.

Thursday, May 15, 2008

Team Build's 'Get' always writes files with the current date

We use Team Build for all of our builds, including the packaging of our Static Content application. When Team Build does a 'Get' of Sources, it writes the files with a creation/modify date equal to the current build time. For example, if I checked in a file (e.g. Mac.jpeg) yesterday (5/14/08) but didn't run a Team Build until today, the file date would be today (5/15/08). For most things this is not an issue, but for Static Content it is.

The reason having the current date/time on Static Content is an issue is the way the browser caches things. When a request is made to our site(s), the browser sees that we have images (e.g. Mac.jpeg). Before grabbing an image from the web server, the browser checks its local cache. If the image in the local cache has the same modify date as the image on the web server, the image is not requested, which improves performance. However, if the locally cached version of the image file is older than the web server's version, the new image is sent to the requesting user.

The issue is that because Team Build always writes the files with the current date (e.g. 5/15/08), when we deploy them to the Static Content servers they all look like updated image files. This is bad, as it invalidates every user's cache and thus slows down our application while every image is re-sent to every user.

I've read a bunch of things on this, and from what I've heard, the TFS Product Team might be looking into a fix. I've dropped our local Microsoft rep a note asking if he has any details about that. I've also found this post from Cory Foy which looks like a custom task we might be able to write in order to get the behavior we need. If I get any word back from my Microsoft contact, I'll let everyone know. In the meantime, we are looking into Cory Foy's approach.
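
One shape such a custom step could take is MSBuild's built-in Touch task, resetting timestamps after the Get. This is only a sketch; @(StaticContentFiles) and $(LastCheckinTime) are hypothetical values you would have to compute (e.g. from each file's changeset date):

<Touch Files="@(StaticContentFiles)" Time="$(LastCheckinTime)" ForceTouch="true" />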

I'm also going to drop a note to the good folks over at Teamprise to get their input. We're using their Ant interfaces to TFS for some of our non-Microsoft builds and thus could have a similar issue.

Thursday, May 08, 2008

Bug in the TFS 2008 Uninstall on Standby App Tier

A colleague of mine, who has taken over many of my former TFS duties, ran into an interesting problem today. For reasons I won't get into, we were trying to uninstall TFS 2008 from our Standby App Tier.

The error we ran into was as follows. We found this error in the VSMsiLog which is created for installs and uninstalls.

05/08/08 11:35:14 DDSet_Status: --- STATUS: Found Reports.ReportsService=http://tfs.mycompany.com/ReportServer/ReportService.asmx
05/08/08 11:35:14 DDSet_Status: --- STATUS: Writing VSTF_RS_SERVER=tfs.mycompany.com into C:\Documents and Settings\appwesttfnqasetup\Local Settings\Temp\TfsCurrentConfig.ini section Config
05/08/08 11:35:16 DDSet_Error: *** ERROR: Failed to call WMI on the RS server. The most likely cause is that the firewall is blocking WMI calls or that the RS server is not reachable: The RPC server is unavailable. (Exception from HRESULT: 0x800706BA)
05/08/08 11:35:16 DDSet_Status: Process exited with exit code: 16

05/08/08 11:35:16 DDSet_Error: GetCurrentTfsProperties failed with exit code: 16
MSI (s) (B4!A4) [11:35:22:083]: Product: Microsoft Visual Studio 2008 Team Foundation Server - ENU -- There is a problem with this Windows Installer package. Please refer to the setup log for more information.

You'll notice that the uninstall was looking for the value ReportsService, which is found in the TfsIntegration database's tbl_service_interface table. The value in that table is an FQDN, which we use because we have Primary and Standby App Tiers. As defined in the TFS documentation, only one is active at a time.

The Bug in the uninstall is that if you are set up behind an FQDN, the value of ReportsService will be the FQDN (e.g. tfs.mycompany.com) instead of the name of the Standby App Tier, which is what you're trying to uninstall.

A quick Google search led me to this blog post by Nick Berardi on his Coder Journal. He ran into a similar issue during an upgrade and got around the error by changing the ReportsService value to match his App Tier.

While I never recommend messing with the TFS databases, since we're in a QA environment I gave Nick's idea a shot. Sure enough, the uninstall finished successfully. I then changed the value back to our FQDN so the Primary App Tier could stay functional.

I'm not sure why the uninstall would need these values for Reporting Services; from what the uninstall documentation says, you must uninstall Reporting Services (and SharePoint) as an independent step anyway. Again, while I never recommend manually changing values in the TFS databases, we ended up needing to in order to get around the uninstall issue.

Thursday, May 01, 2008

Deployment to IIS 7.0 using PowerShell

Yes, I'm still here. Sorry for not writing anything lately on TFS. I've been in one of those two-week Agile (with a capital A) "Iterations", which is when we get all the work done.

I don't really have anything important to share right now on our TFS adoption. Everything is going pretty well. Knock on wood!

While not really related to TFS, I've fallen in love with PowerShell 1.0. Every day I find another problem to solve with "commandlets" and all the great things a full-featured shell offers. Along with the AppCmd command which ships with IIS 7.0, we're using PowerShell scripts to drive our deployment mechanism. Here is a simple example of what our scripts look like. It's actually working very well for us, with far less overhead than creating an MSI or InstallShield package.

You'll notice that we take the nuke and pave approach to deployment. That is, we check to see if the application directory is there, and if it is, we purge it. Then we copy over the new application version's code.


#Always get the script root. Per Microsoft's recommendation, each script should know its respective script root.
$SCRIPT_ROOT = Split-Path (Resolve-Path $myInvocation.MyCommand.Path)

#Check to see if these values are already set. If not, set them to defaults.
if ($ENV_SITE_NAME -eq $null) {sv ENV_SITE_NAME -value Website}
if ($ENV_APPPOOL_NAME -eq $null) {sv ENV_APPPOOL_NAME -value WebsiteAppPool}
if ($ENV_PORT_NUMBER -eq $null) {sv ENV_PORT_NUMBER -Value "8080"}

sv APPCMD -value $env:systemroot\system32\inetsrv\AppCmd.exe
sv SITE_INSTALL_PATH -value "D:\Web Sites\$ENV_SITE_NAME"
sv APP_SERVICES -Value "aspnet_state", "W3SVC"
# ***************************************************
# Step 1)
# Clean up old install.
Invoke-Expression "$APPCMD delete site $ENV_SITE_NAME"
Invoke-Expression "$APPCMD delete apppool $ENV_APPPOOL_NAME"
if (Test-Path $SITE_INSTALL_PATH) {rmdir -Recurse -Force $SITE_INSTALL_PATH}
# ***************************************************
# Step 2)
# Make dirs
New-Item $SITE_INSTALL_PATH -type directory
# ***************************************************
# Step 3)
# Create the AppPool for this site
Invoke-Expression "$APPCMD add apppool /name:$ENV_APPPOOL_NAME"
Invoke-Expression "$APPCMD set apppool /apppool.name:$ENV_APPPOOL_NAME /processModel.identityType:NetworkService"
# ***************************************************
# Step 4)
# This will add the site on port $ENV_PORT_NUMBER. This does not create an App or Virtual Directory by default. *****
Invoke-Expression "$APPCMD add site /name:$ENV_SITE_NAME /bindings:`"http/*:${ENV_PORT_NUMBER}:`""
# ***************************************************
# Step 5)
# Create App with the physical path and set the AppPool we need ******
Invoke-Expression "$APPCMD add app /site.name:$ENV_SITE_NAME /path:/ /physicalPath:`"$SITE_INSTALL_PATH`" /applicationPool:$ENV_APPPOOL_NAME"
# ***************************************************
# Step 6)
# Copy over files recursively (/E and /I), overwrite read-only files (/R), suppress confirmation (/Y), quietly (/Q) *****
echo "Copy Files to WebSite"
xcopy $SCRIPT_ROOT\Website $SITE_INSTALL_PATH /E /I /R /Y /Q
# ***************************************************
# Step 7)
# Make sure any needed services are started before finishing, and spit out their status.
Invoke-Expression "$APPCMD start apppool /apppool.name:$ENV_APPPOOL_NAME"
Invoke-Expression "$APPCMD start site /site.name:$ENV_SITE_NAME"
$APP_SERVICES | Start-Service
$APP_SERVICES | Get-Service
# ***************************************************
# Step 8)
# Make sure we didn't get any errors at deployment.
if ($error.count -lt 1)
{
Write-Output "" "Deployment Successful!" ""

}
else
{
Write-Output "" "Deployment Failed!" ""
$error
Write-Output ""
}

Tuesday, April 15, 2008

Follow-up on our Restore Attempt(s)

Two weeks ago today I left the building frustrated with Microsoft because of poor documentation. You can see my rant here.

I'm still a bit miffed that the "How to restore..." document was (and still is, as of this writing) wrong, but I'm feeling better now that they have graciously apologized to me, and with the help of Support we're up and running in the restored test environment. Since I'll most likely forget, here are the steps we had to take to get this test restore environment up and running.

First, Support walked through steps 1-14 of "Restore and Test SQL Report Server, Reporting Services, and Default Reports" BEFORE we did the "Rename the Team Foundation Data-Tier Server and Activate the Team Foundation Application-Tier Server" and "Move User and Service Accounts" sections. Why? I'm not sure I understood the reasoning.

Second, Support did the following steps on the App Tier and Data Tier once I handed over control with Easy Assist.

1) App Tier: Before they ran RenameDT, they changed the ReportServer AppPool identity from NetworkService to the TFSReports account.
2) App Tier: They ran RSKeyMgmt.exe -d, which deleted all encrypted data on the server.
3) App Tier: They ran RSKeyMgmt.exe -r on the GUID from the initial install of TFS on this new hardware. Somehow this GUID gets re-added?
4) Data Tier: Opened the tbl_database table in the TfsIntegration database and changed all the 'servername' values to the new Data Tier.
5) Data Tier: Opened the tbl_service_interface in the TfsIntegration database and changed ReportsService and BaseReportsURL to have the new App Tier name.
6) In the Reporting Services Configuration tool, they made sure that the TFSReports account was used instead of a built-in account.

After they did all this, RenameDT worked. As with my first point, I'm not sure I understood why they did all this, or whether it was all really needed.

Third, after "Rename the Team Foundation Data-Tier Server and Activate the Team Foundation Application-Tier Server" we did the "Move User and Service Accounts" section. All this worked just fine.

Fourth, this wasn't really related to the restore per se, but we did have to give the AppTier\Users group Read, List, and Read & Execute on the C:\Program Files directory. Without this, the only users who could log in were those who were members of the AppTier\Administrators group. I'm not sure if this is correct, or why the TFS install didn't set it up for us. When I compared this test restore hardware with our current PROD hardware, the PROD hardware was configured with AppTier\Users having that access to C:\Program Files, so we just mimicked the behavior and it all worked. UPDATE: We also had to give AppTier\Users Full Control access to %Program Files%\Microsoft Visual Studio 2008 Team System Web Access\Cache.

Fifth, we finished steps 15 - 29 of "Restore and Test SQL Report Server, Reporting Services, and Default Reports". This all worked fine.

Sixth, and lastly, we changed the Instance ID so that we didn't have the same Instance ID on both our current PROD hardware and our test restore hardware. See this forum thread for how to do this.

After all this, we currently have a test copy of our current PROD system restored so we can play around with it.

Wednesday, April 09, 2008

What is needed to write complex Reports

There have been a number of people asking me what they need to install in order to get up and running with report writing using the "Business Intelligence Development Studio". Here is what I send them.

1) Install Visual Studio 2005. This step is optional.

2) Install "Microsoft SQL Server 2005 Express Edition Toolkit" which gives you the "Business Intelligence Development Studio (BIDS)"

3) Read Buck's blog and download the attached documentation to get your data sources set up and an initial report written.

4) Obtain a PhD in Computer Science if you have to write any MDX queries.

Tuesday, April 08, 2008

IE 6.0 crashes with SharePoint 3.0

Some of our users have had IE 6.0 crash on them (intermittently) while opening documents from SharePoint 3.0. While the error messages don't exactly match, they are similar to what's described in Steve's post.

I followed up with the users, and it seems that Steve's fix resolves their issues with IE crashing.

Thanks Steve for posting your find!!!! I'm hoping it will get me a free beer at the bar.

Thursday, April 03, 2008

Cannot create a connection to data source 'TfsOlapReportDS'. (rsErrorOpeningConnection)

If you ever see an error like the one below when trying to render a Report in TFS, make sure MS SQL Analysis Services is started under Control Panel > Services. Since we're running a dual-server install, our MS SQL Analysis Services runs (or isn't running, when we get this error) on the Data Tier.

An error has occurred during report processing. (rsProcessingAborted)
Cannot create a connection to data source 'TfsOlapReportDS'. (rsErrorOpeningConnection)
For more information about this error navigate to the report server on the local server machine, or enable remote errors

Tuesday, April 01, 2008

Frustrated with "How to: Move Your Team Foundation Server from One Hardware Configuration to Another"

Nothing frustrates me more than when documentation is wrong or misleading. Case in point: the document "How to: Move Your Team Foundation Server from One Hardware Configuration to Another", explaining how to move TFS from one set of hardware to another, is probably one of the worst-written documents I've ever read. Here are just a few of my observations.

- We bombed out at "To rename the Team Foundation data-tier server". The first issue was that a small yet critical detail is left out of the document: the statement explaining that before running TfsAdminUtil RenameDT, you must modify the Service's web.config, making sure the connection string on the new server is referencing the old server. The Community Content says this at the bottom, and the "Microsoft Visual Studio 2005/.NET Framework 2.0" version says it too. So why doesn't the master document for TFS 2008 say this? If you skip this, the command comes back and says "data tier name not changed."

- Even after getting past this, we are still stuck. According to Support, we actually should do the "Rename the Team Foundation Data-Tier Server and Activate the Team Foundation Application-Tier Server" and "Move User and Service Accounts" sections AFTER we do other things like "Restore and Test SQL Report Server, Reporting Services, and Default Reports", which is two sections BELOW "Move User and...". What? If this is true (we're verifying with our field rep), why does the document have them out of order? How can someone be expected to know this?

- Lastly, the support rep (who was actually very helpful, by the way) said that you basically have to disconnect the old TFS server when you do the migration. Meaning, you basically can't do a test restore on some QA hardware before you come in on a weekend to do the same steps on a live PROD system. We're checking with our field rep to make sure this is accurate, but if it is (and I'm praying it's not), where in this document does it state that you must disconnect the old system before doing the restore-based move?

Sorry for my rant. I just get so frustrated when documentation is wrong or misleading.

Thursday, March 27, 2008

Running Tests and getting Code Coverage for our .NET projects.

As I wrote about last week, the entire testing portion of Visual Studio is a mystery to me. I've never been a tester (for good reason), so I've really never spent the time to dig into Microsoft's offering. That was until I was forced to figure some stuff out because my new team wanted it.

From what I've found, TFS 2008 makes executing users' tests at build time very easy. All we had to do was add the following line to our Team Build project file and the magic happened. We had test results!!

<TestContainer Include="$(OutDir)\Unit Tests.dll" />

The code coverage was missing, though, so we had to track that down. Thanks to Benday, all we had to do was update our Team Build project file again and, magically, we had coverage results.

<RunConfigFile>$(SolutionRoot)\LocalTestRun.testrunconfig</RunConfigFile>

Of course this assumed our .testrunconfig file was set up for Code Coverage, which in our case it was.
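
Putting the two together, the relevant TFSBuild.proj fragment looks roughly like this (placement per our build; your paths will differ):

<ItemGroup>
  <TestContainer Include="$(OutDir)\Unit Tests.dll" />
</ItemGroup>
<PropertyGroup>
  <RunConfigFile>$(SolutionRoot)\LocalTestRun.testrunconfig</RunConfigFile>
</PropertyGroup>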

So far, even a testing novice has found it pretty easy to get tests to run at build time. Feel free to share your experiences, both good and bad.

Friday, March 21, 2008

Publishing Test Results using the Testers Edition of Visual Studio

I fully admit, I don't know anything about the Testers Edition of Visual Studio. However, since I know a bit about TFS, I've been asked a number of questions about how the two talk to each other. Thus it's been a learning experience for me!

Here is the way I understand the Test Results publishing process to TFS for the Tester Edition of VS. Again, this is all really new to me so if I have something wrong, please correct me.

Per the document here, I see that a Tester will run a test on their local machine or use the "rig" to run on remote machines. After the tests are run, the test results are stored in a *.trx file on the tester's machine. Testers can then open that file and "Publish" the results to TFS's operational store (which must be the TfsBuild database, maybe?). That data then gets moved to the TfsWarehouse per the warehouse schedule, at which point you can report on it from the warehouse.
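
From my reading of the docs (not something we've run ourselves), the same publish can be done from the MSTest command line; the build name and project here are hypothetical:

mstest /publishresultsfile:MyResults.trx /publish:http://tfs.mycompany.com:8080 /teamproject:TeamProject /publishbuild:MainBuild_20080321.1 /platform:"Any CPU" /flavor:Release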

Now, there seems to be one small deviation in that Load Tests need to be loaded into a local SQL database which is defined outside of TFS. That is, the Load Test database stores Load Test data before it's published to TFS. There must be some way, then, for the Tester Edition of VS to look in that local database where the Load Test results are stored and publish the results to TFS. I think this is an important detail, as there may be some ambiguity about the difference between the Load Test database and the standard databases that make up TFS (e.g. TfsBuild, TfsWarehouse, etc.).

If I have anything wrong here, I'd love for you to share your insight as we're trying to put the big picture all together.

Wednesday, March 19, 2008

Teamprise 3.0 is released!

If you didn't see, Teamprise 3.0 has just been released. As I've written a number of times, Teamprise is a great offering if you're looking for a cross-platform solution for Team Foundation Server. NO, I do not work for Teamprise or sell their product for a living. I'm simply a customer at a large software firm who uses Teamprise for our Java development groups.

We use every piece of their suite including the Ant scripts and Teamprise Build Extensions which allows you to call your Ant scripts from Team Build.

Teamprise has a great product, but an even better support group. If you have not checked them out, do so soon. You'll find it to be one of the best products around.

Monday, March 17, 2008

The "Copy" task could not be initialized with its input parameters

As I wrote about last week, we've started to use a few Tasks from Sdc. Per the instructions, we started by importing the entire list of Tasks. Something like this:

<Import Project="$(TasksPath)\Microsoft.Sdc.Common.tasks" />

Doing this led to an error in our Team Build though.

...
(EndToEndIteration target) (1)
(CoreDropBuild target)
C:\Program Files\MSBuild\Microsoft\VisualStudio\TeamBuild\Microsoft.TeamFoundation.Build.targets(1310,11): error MSB4064: The "SourceFiles" parameter is not supported by the "Copy" task. Verify the parameter exists on the task, and it is a settable public instance property.
C:\Program Files\MSBuild\Microsoft\VisualStudio\TeamBuild\Microsoft.TeamFoundation.Build.targets(1309,5): error MSB4063: The "Copy" task could not be initialized with its input parameters.
...

I took a look in the Team Build targets file and sure enough, it uses Copy, but it needs the MSBuild one, not the Sdc one. I'm guessing this is a naming conflict in Sdc.

Using guidance from the forums, we had to change our global import to import just the TaskNames we needed. This got us around the Copy conflict.

<UsingTask TaskName="Microsoft.Sdc.Tasks.Tools.PsExec" AssemblyFile="$(TasksPath)\Microsoft.Sdc.Tasks.dll" />

Friday, March 14, 2008

Check out SdcTasks

We've adopted the SdcTasks found on CodePlex pretty much wholesale. When comparing MSBuild to Ant, I've always been disappointed with the Core tasks in MSBuild. The default list seems limited. NOTE: I have not done a task-for-task comparison, only a cursory one.

The limited task set in MSBuild is what really excited me about SdcTasks. There is a ton of stuff in this add-in. Want to talk to Active Directory, send email, or talk to SQL? Take a look at SdcTasks for help.

Since I'm lazy, I'd like to see these rolled into the standard install for Team Build or MSBuild.

Friday, March 07, 2008

Other tools we use in conjunction with TFS

While TFS can do a lot of things, I thought I'd share a list of tools we're using (or just looking at) in conjunction with TFS for our full "Build and Deploy" solution.

- PSTools - We use this fine set of tools for executing deployment scripts on remote machines. Once PowerShell allows this ability (I think it's coming in 2.0) we might look at using PowerShell for our remote communication. In the meantime PSTools will work, though we are getting some output from Psexec which is hanging MSBuild's Exec task.

- PowerShell 1.0 - While I'm not a shell expert, from what I've found PowerShell seems to be a great upgrade from CMD. We've changed all our deployment scripts to use PowerShell as their basis.

- AppCMD - AppCMD (pronounced App Command by the Microsoft person I talked to last week) is a command-line interface for IIS 7.0. Instead of using our home-grown tool for writing to the IIS metabase (which we did with IIS 5/6) we're using AppCMD. Now since IIS 7.0 does not have the concept of a metabase, we could just xcopy the applicationHost.config file over and call it a day. However, we're currently hosting multiple applications on the box and don't want one deployment to bring down all apps. By using AppCMD to purge/pave our IIS 7.0 configuration, we scope the downtime to the application we're trying to deploy (there's a small sketch of what this looks like after this list).

- IIS 7.0 and Tomcat - Our ASP.NET apps are running under IIS 7.0 (as you would guess) and our Java applications are running under Tomcat.

- Altiris Deployment Server and Altiris Software Delivery System - So our deployment runs fine on a machine or two, but we need something that will execute the deployment on 1000+ machines all at the same time. Altiris has a couple of different products we're looking at to do this. Basically, this would remove our need for the Psexec tool from PSTools and the FTP Tasks in Ant we use to deploy a WAR file to Tomcat.
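
To give a flavor of the AppCMD purge/pave approach mentioned above, here is a rough sketch (the site and path names are made up, and I'm paraphrasing our scripts from memory):

appcmd delete app "Default Web Site/MyApp"
appcmd add app /site.name:"Default Web Site" /path:/MyApp /physicalPath:C:\deploy\MyApp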

I know, I know this has very little to do with TFS, but we do hook the tools into TFS (and vice-versa) so I thought I'd share it with you.

Friday, February 29, 2008

I'm basically an End User

Sorry for my lack of blogging lately. Basically, I've turned into an end user of TFS. How the mighty have fallen. At one time I was the first person to bring TFS into a company of 35,000 people and administer the only TFS instance. My calendar was filled with demonstrations to both Directors and Vice Presidents. Now we have five or six implementations and I'm back to being a simple grunt. Which, to be honest, is probably where I belong.

While I don't do much TFS administration anymore (at least right now), I do offer my expertise. Just yesterday a "Scrum Leader" asked the person administering TFS how hard it would be to not default the Current User into the Assigned To field for our Bug Work Item. I gave a nod of encouragement indicating it would be a simple change.
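
For anyone curious, that change amounts to pulling the defaulting rule off the Assigned To field in the work item type definition. Our template's exact XML will differ from yours, but the piece in question looks roughly like this (a sketch, assuming the template uses a DEFAULT or COPY rule on System.AssignedTo):

<FIELD name="Assigned To" refname="System.AssignedTo" type="String">
  <DEFAULT from="currentuser" />  <!-- remove this rule to stop defaulting to the current user -->
</FIELD>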

We have ordered new hardware for my current group's instance though, and I'm sure I'll get involved as we lay out the topology and do the migration. In the meantime, I'll just sit back and enjoy TFS as an end user while I work on our deployment strategy.

Monday, February 25, 2008

TFS Statistics Query

A while back Brian posted the SQL commands he uses to pull the Dogfood statistics. I seem to always lose this posting, so I thought I'd write it down so I can quickly look it up on my blog.

I used most of them this morning on our TFS 2008 server and all the ones I grabbed worked just fine. My "Scrum Master" (and yes, I do bow before him) wants our numbers pulled every iteration, which for us is two weeks.

Compared to my old group, we've got some pretty small numbers. For example, my old group had close to 10,000 changesets while this group has just short of 800. But even though the numbers are small, I enjoy looking at them.

Anyway, if you need the queries to pull your own numbers, which I think you should, see Brian's post. Enjoy!

Thursday, February 21, 2008

Web Service Versioning

On my new project team there is a lot of discussion around supporting multiple versions of our applications (which are most likely going to be a collection of web services). This affects me, as I'm responsible for deployment of these applications. It got me thinking: how does TFS do versioning for its web services? TFS 2008 supports Team Explorer 2005 and 2008, so they had to have had the same discussion we're having now.

As it turns out, it looks like they support multiple versions within the TFS 2008 code base by simply organizing the versions in a folder hierarchy. It looks very well organized and like something that would be easy to deploy! Let me explain what I see.

Underneath %Program Files%\Microsoft Visual Studio 2008 Team Foundation Server\Web Services\Build I find both v1.0 and v2.0 folders. My assumption is that the TFS development team decided to put all new 2008 functionality in the v2.0 folder for each web service. In this example I just show "Build" but you can see similar things in "Services" as well. I'm guessing when the next release comes out, we'll see v3.0.

I'm the type of person who has really never created anything original. Well, that's not 100% true, as the book I'm writing is not simply copied from other fiction thriller writers. However, the fundamentals are based on the same story structure and writing format that most all novel writers use; I simply follow the rules. Back to TFS though. I brought up the web service versioning I saw TFS use to our architects, and from what it sounds like, they are going in a slightly different direction.

What we "plan" (plan is used loosely as they are constantly changing things) on doing is hosting each "version" under a separate IIS 7.0 site. Something like this I guess.

IIS
/AppPool1.0
    /WebServiceSite1.0
        /mycode
/AppPool2.0
    /WebServiceSite2.0
        /mycode

My understanding is that this will put each "version" in a separate worker process. The approach TFS took would not.
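
If we go that route, standing up a new version would be a couple of AppCMD calls per box. A rough sketch under the layout above (the names, port, and paths are placeholders, not our real configuration):

appcmd add apppool /name:AppPool2.0
appcmd add site /name:WebServiceSite2.0 /bindings:http/*:8082: /physicalPath:C:\svc\2.0
appcmd set app "WebServiceSite2.0/" /applicationPool:AppPool2.0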

Personally I think the way TFS does web service versioning is better. That is, simply versioning the web services within a single deployment (i.e. site under IIS). But given we're selling/supporting different types of software (ours is hosted internally and we have full control over deploying it), I can see an argument for supporting multiple versions as separate deployments under IIS.

Now this all comes with my standard caveat that I often misunderstand very technical details. I'm sure there are a number of white papers and studies on web service versioning that trump any opinion I have. This is simply an observation of how I see TFS doing some kind of versioning.

Wednesday, February 20, 2008

Sorting data in Reporting Services

Most things come hard for me. Whether it's been work, school or play, nothing has come easy.

So it should be of no surprise that I had a hell of a time trying to figure out how to sort our custom TFS build report. It honestly took me about four hours to figure out that all you need to do is set a Sorting Expression on the report's table when in layout mode. Here is a good article that explains it in detail.

I was trying to modify the MDX query, which is nearly impossible, by using an ORDER function. That was getting me nowhere, as I could not get the syntax figured out. So after lunch I decided to do a bit of research and stumbled onto the TechNet article. About five seconds later, I found my answer.

If you need to do some sorting, check this article out.
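
In case it saves you the four hours it cost me: select the table in layout mode, open its properties, go to the Sorting tab, and add an expression there. Ours sorts on a date field from the build data; the field name below is a placeholder, so swap in whatever your dataset calls it:

=Fields!BuildStartTime.Value   (Direction: Descending)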

Wednesday, February 13, 2008

Team Build hangs when using Pstools and the Exec task

In my new position they have given me deployment. Basically after the build and unit tests are run, they want the code deployed to the servers. We're running under IIS 7 so I've decided to write the deployment logic using AppCmd from the IIS 7 team. It looks like a very nice tool!

To execute the series of AppCmd commands (actually we've grouped them into a batch file that we call) we're using yet another Microsoft technology called Pstools. In this suite of tools there is psexec, which allows you to execute remote commands. After we got past some odd domain --> workgroup permission errors, it works pretty well. The command we use is something like this: "c:\tools\pstools\psexec /accepteula \\server cmd.exe /c c:\deploy\install.bat 0.0.25"

Back in our TFSBuild.proj file we're using the Exec task to execute this. Unfortunately the Exec task hangs. "Gonzobent" has submitted a wonderful write-up on this to Connect. I added my two cents in as well.

To get around this (as we need to so our deployment efforts can continue), we put the "non-interactive" switch (i.e. -d) on the psexec command. So now the command looks like this: "c:\tools\pstools\psexec -d /accepteula \\server cmd.exe /c c:\deploy\install.bat 0.0.25". This is not ideal, as we don't get any output back in the Team Build log file to see whether the deployment works or not. We just assume it does - since I wrote it ;)

As I put in my comments on Connect, I think psexec is sending back output MSBuild is not expecting. We ran into this a few years ago when we wrote a homegrown build system which had a Java application as its build client. When a user ran an Ant script that failed, the system would just hang. There is a good article here that explains how we fixed it. We basically read the output and error streams on separate threads.
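
For the curious, the fix back then looked roughly like this (a from-memory sketch in Java, not our actual production code; the class name is made up):

import java.io.BufferedReader;
import java.io.InputStream;
import java.io.InputStreamReader;

public class RunCommand {
    // Drain a stream on its own thread so the child process can't block
    // on a full stdout/stderr pipe and hang the caller.
    static Thread gobble(final InputStream in) {
        Thread t = new Thread(new Runnable() {
            public void run() {
                try {
                    BufferedReader r = new BufferedReader(new InputStreamReader(in));
                    String line;
                    while ((line = r.readLine()) != null) {
                        System.out.println(line); // or write it to the build log
                    }
                } catch (Exception e) {
                    // good enough for a sketch; log this in real code
                }
            }
        });
        t.start();
        return t;
    }

    public static void main(String[] args) throws Exception {
        Process p = Runtime.getRuntime().exec(args);
        Thread out = gobble(p.getInputStream());
        Thread err = gobble(p.getErrorStream());
        int exit = p.waitFor();
        out.join();
        err.join();
        System.exit(exit);
    }
}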

If anyone has any other work around ideas, please let us know.

Monday, February 11, 2008

Be careful with consultants

This goes for all software, but being I've been working with TFS for the past year, I thought it would be good to mention it on this blog.

In my last group, we implemented TFS by ourselves. That is, we did all the setup, migrations, training and ongoing support. We decided this was the best approach since the team was very technical and motivated to tackle a new configuration management tool. And for the most part, we were successful.

My new group took a different approach (before I got here) and hired out most of the TFS work. Everything from setup to customization has been done with the consultants taking the initial lead. We've been bumping heads a bit this past week because I take a slightly different view on how to implement a new product than they do. I think they have sold my new group on too many customizations.

I was a consultant for two years back in the early 2000s so I know a bit about the business. A consultant's number one goal is to stay billable. Meaning if you, as the customer, keep nodding your head, they will continue to do what you say and keep billing. If you're a consultant and reading this, don't get all bent out of shape. You are very necessary and we can't live without you, but there are times when a good consultant needs to help the customer say "no" to themselves.

We as customers get a bit too excited about all the possibilities TFS has, and forget that at the end of the day, everything costs money. So when you say "hey, we'd like to have check-in policies for this and that," everything can be done, but it will cost you. And don't forget about the ongoing maintenance costs.

So here is my suggestion. Start with the basics. That is, try to take what TFS has out of the box, then slowly start making the needed modifications with a well balanced team of consultants (if you need them) and full time employees. You'll be better off in the end.

Thursday, February 07, 2008

Safari

I'll have to make this quick as my son is starting to toss up his sweet potatoes. We've had a number of "business" folks who want to use Safari for TSWA. They have had some connection issues that look similar to what happens when you don't have your proxy settings configured correctly in Firefox. That is, you continually get the authentication prompt while a portion of the screen renders in the background.

We've read about a number of proxy server and/or Windows Authentication issues with Safari. While I can't say that is the primary issue, we've directed the users to Firefox as that works just fine after the proxy connection is configured correctly.

Monday, February 04, 2008

eScrum instead of CMMI

It's been an interesting day in my new group. My former organization used the TFS CMMI templates. Our group-wide CMMI efforts have pretty much dissolved, but TFS and the templates are still in use.

The new group is using something called "eScrum" and an Agile development methodology. From the looks of it, eScrum has been around a while, though I've never looked at it before. I thought Scrum and Agile were two different things, but from what I've found so far, people around here use the terms interchangeably.

They have an existing TFS system set up in their DEV environment. They were having some issues with creating "Build" reports, as the eScrum templates don't include one. So I pumped out a few pretty quickly, referring back to Buck's posting here when needed. If you have not checked this out, you should. It's very helpful if you need to get up to speed with Reports quickly.

We also upgraded their TSWA from 2005 to 2008. I'm not sure why they were running 2005 as we're running 2008 for everything else. The upgrade was very smooth. It only took about 10 minutes after we got started.

Overall they have things pretty well set up in their TFS 2008 DEV environment. They do plan on scaling out to a full TFS deployment (i.e. primary/standby app servers with clustered data tiers), which we'll have to tackle when that time comes.

Friday, February 01, 2008

Slight change in my role

If you're a regular reader of my daily Thoughts blog, you'll notice that I'm changing groups starting Monday. Oddly enough, the first assignment in my new group is to get TFS up and running for them.

The new group is starting off with TFS 2008 so I won't get to experience the 2005 to 2008 upgrade. However, given I'm just a floor away, I'm sure my old group will ask for some help when the time comes.

Given I'm heading to a new group, my postings may change slightly. I'll still be supporting Teamprise users, but over half the group uses Team Explorer so that may sway my subject matter a bit. I'm also taking on some more responsibility (with very little more pay). In addition to leading up the TFS adoption, which includes Work Item Tracking, Reporting, Documents, Team Build, and Source Control features (including branching and code-line policy), I'm responsible for deployment strategy. That is, how in the heck we're going to blow out the build results to large server farms. Daunting to say the least.

I hope you'll keep reading as I still plan on posting what has become simply a brain dump on our company's adoption of TFS.

Thursday, January 31, 2008

Odd Behavior for Workspace Mappings

Say you have the following Team Project source control structure, where 1.0 under "releases" is the 1.0 branched version of the mainline.

$/TeamProject
  /mainline
    /src
    /lib
  /releases
    /1.0
      /src
      /lib

Then set your workspace mapping up like this.

$/TeamProject/mainline --> c:\Development
$/TeamProject/releases/1.0/src --> c:\Development

Then do a Get on $/TeamProject/mainline/src. You'll find that $/TeamProject/mainline/src shows "Not downloaded" under the "Latest" column, while the "release" branch shows "Yes" under "Latest."

I can actually kind of understand why it behaves like this. The deepest mapping is honored. However, it's kind of odd that you do a "Get" on mainline and the "release" branch is what gets Got.

Wednesday, January 30, 2008

Phase 2: Tracking down the TFS license key.

Here's an update on trying to track down our TFS 2008 license key (a.k.a product key). For background see my post last week. If you're struggling with how to get your license key for TFS 2008, check out Martin's post or Brian's post. Both are very helpful.

From what I understand, our company of 33,000 buys Microsoft software from a procurement vendor, who in turn buys it from a Microsoft distributor, who buys it from Microsoft (or something like that). Getting product keys always seems to be an issue.

Luckily we just got it figured out. There is a small body of people who can download "licensed" versions of software from Microsoft which have the license key embedded in the install. Unfortunately these people don't always know they have such powers, or they don't advertise themselves very well.

Anyway, we found a person who was able to download the standard edition of TFS 2008 for us, and then using Martin's post (above) we grabbed the "Product Key" and updated our trial edition of TFS 2008. Like Martin suggested, just to make sure, we used Brian's TFSVersionDetection and everything was kosher.

Now that was not that hard was it!

Tuesday, January 29, 2008

Firefox authentication issue resolved

The good folks at Microsoft reached out to us to see if they could get more information on the Firefox troubles we've been having. That request forced me to look into the Firefox issues a bit, and as it turns out, I think we have the primary problem resolved on both Windows and Linux.

Here is what we found out: we use a proxy server for the entire company. Because of the proxy, we have a defined inclusion list in IE that tells IE and Visual Studio not to go through the proxy for internal sites. We use *.int.mycompany.com*. If you're wondering why the trailing "*" is there, check out this post. This list is company-wide and managed by the support folks.

Since Firefox is not "company supported," the support folks don't send out an inclusion list, so people build their own. This is typically done by just copying what IE has. When the users copied IE, they got *.int.mycompany.com*. By the looks of it though, Firefox handles the list a bit differently and was giving the consistent "Authentication Prompt" as the page rendered. To fix the issue, we needed just *.int.mycompany.com without the trailing "*".

So far this has worked with the small body of users we've had try it. Now that we have this fixed, we might have some more users go back to Firefox and test it out more.

Monday, January 28, 2008

New Blogger

My colleague Greg Collins is now blogging. While my blog is geared towards TFS and Teamprise, Greg's will be more on the software configuration management side of our business. For example in his first post, he wrote about how to grab certain values out of our major.minor.bug.build release number schema using Ant. We needed this solution so that our Teamprise Ant scripts know where to grab release code from during build time on our Linux/Linux64/Unix machines.

Being that we do most all of our CM work in TFS, I'm sure he'll have a number of postings related to specific solutions that we've found for TFS.

Greg is actually the person who figured out how to use Team Build for our Java builds. I was a bit skeptical, but Greg was persistent, and as it turns out, we love using Team Build for our Java-based builds.

I encourage you to check out his blog at softwareconfiguraton.blogspot.com

Friday, January 25, 2008

Statistics for TfsWitDisplayNames Issue

I was asked to collect some statistics on how often we run the TfsWitDisplayNames tool to fix our 'displayName' issue. Here are our results for the past two weeks.

1/14/08 - one time for 1 user. Changed business units.*
1/14/08 - one time for 1 user. Changed first name.
1/18/08 - one time for 1 user. Changed business units.
1/22/08 - one time for 119 users. Changed business unit name.
1/23/08 - one time for 3 users. Left company or changed business units.
1/25/08 - one time for 1 user. Left company.

As you can see, on average we do an update about every other day. The large change on the 22nd was due to our business unit name change. That is not normal, but it does happen.

Overall the tool works very well. Other than the issue that I posted on here, we've had pretty good success using it. We have a modest number of work items (around 3000) so the tool runs very quickly.

The system administrators and DBAs don't like it very much as we have to run the tool with an account that has access to the data tier. Since this is against company policy, we've had a temporary account created until this is fixed. I threw out the idea that maybe we should have the DBA run the command for us, at which point they quickly gave us access ;).

Hopefully this provides some quantitative data of how much this Bug affects our group. While not overly difficult to fix, it's a pain to have to do this every few days.

* The Active Directory 'company' field is used for our business unit names. Thus when someone changes business units, the 'company' name is modified. 'company' is then used in our 'displayName' format. The overall 'displayName' format is Lastname, Firstname (Company). If you're wondering why we don't change this, it's because this is the format for 33,000 employees and the business doesn't want to change it for everyone just because TFS has an issue. I can't argue.

Thursday, January 24, 2008

TFS and Teamprise slides are posted

Last week I gave a presentation to the Minnesota Visual Studio User Group on our adoption of TFS and Teamprise for a J2EE development group. If you're interested in getting the slides, you can find them here under "Mac Noland Teamprise-TFS presentation".

To be honest, I've never downloaded many presentations so by no means feel that I'm forcing these upon you. However, if you're thinking about implementing TFS in a Java shop, you may be interested in some of our information.

Wednesday, January 23, 2008

Phase 1: Tracking down the TFS license key.

Today I had intentions of trying to track down our TFS 2008 license key. This was a big problem for us last time, so I started with an "Escalation Specialist" from the "NACS Response Management Team-MSDN".

After dialing the number they gave me, I ended up talking to City Gold and Smith Barney. Thinking that was wrong, I gave the MSDN number I have (800-759-5474) a call and after a transfer was able to talk to someone from the "Activation Team."

The Activation Team member said that to get the license key I just needed to look on the back of the CD. We didn't actually receive any media, as we installed the trial edition (per Brian Harry's suggestion). So then he directed me to have our procurement department talk to the reseller. We're a company of 33,000 people, so you can imagine how hard it is to find the procurement department.

From what he said, if the reseller does not know the key (which they didn't with TFS 2005) then we need to have our procurement tell our reseller to contact their local distributor. At this point I lost interest in the conversation and offered my thanks for his time.

So I'm back trying to find our procurement department. Once the reseller's distributor tells the reseller that they don’t have the key, our procurement department will most likely direct me back to the vendor at which point I'll start over again with the "Escalation Specialist".

Tuesday, January 22, 2008

Port Issue Fixed in TFS 2008 Install

When we installed TFS 2005 we had a nasty issue getting it installed on a different port (i.e. 8888). The default (i.e. 8080) was consumed by a data center-wide service (some RAID copy service, I'm told) and the fine data center support folks were reluctant to change it just for us. Being that I like standardization, I could not argue with them.

So after a few days and numerous calls to support, they told us installing TFS 2005 on a different port was impossible. See here for a workaround, if you're ever so unfortunate.

Luckily they have this fixed in TFS 2008. Or at least their install documentation says it's fixed. See the section "How to Customize the Port Assignment for Team Foundation Server" for instructions.

FYI: If you're adamant about putting TFS on 8080 and need to figure out what is consuming it, try the command "netstat -ano". From there you should be able to find out what process is using 8080 (or any other port for that matter).
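
For example, something like this should do it (the PID in the second command is whatever netstat reports back, so 1234 is just a placeholder):

netstat -ano | findstr :8080
tasklist /fi "PID eq 1234"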

Thursday, January 17, 2008

How we branch our code in TFS

There is always a lot of discussion about branching philosophies. From what I've found, there is no one right answer. That being said, some approaches are better than others.

We do things pretty consistently across our development groups, so I thought I'd share our approach with you here. In my opinion, it is the best branching approach for developing and supporting multiple releases based off of one code base.

What we do is a variation of what they call the "Hybrid Branching Structure" in Microsoft's Live Meeting called "Application Platform: Branching and Merging: Which Approach Is Right for You (Level 200)".

First we have a mainline for each Application. Some people may call this "main" but we refer to it as mainline. Mainline is wide open for all developers working on that Application.

$/TeamProject
     /App
          /mainline

Then when we do a code freeze, we branch the mainline to a folder under releases with its appropriate major.minor release number. Release branches are read-only for everyone. If a change (either a Bug fix or Enhancement) is needed, permissions are opened for that one user after the Release Manager gives approval. Once the change is made, permissions are removed.

$/TeamProject
     /App
          /mainline
          /releases
               /releaseX.X (* branched from mainline)

For an application that has been around for a while and has had a number of releases, the TFS structure looks something like this. In this example, we release every month and use year.month for our major.minor release numbers.

$/TeamProject
     /App
          /mainline
          /releases
               /release8.1 (* branched from mainline)
               /release8.2 (* branched from mainline)
               /release8.3 (* branched from mainline)
               /release8.4 (* branched from mainline)

This approach seems to be the best for us. It minimizes the need for merging and allows us to support multiple releases at once.

UPDATE: There have been some questions about how we merge using the branch philosophy stated above. When a change is made to a release branch, that change is merged back to the mainline. Depending on how many "live" branches you're supporting, you may (or may not) have to merge it into other branches as well. For example, if you made a change to release8.3, it should be merged into mainline AND release8.4.
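
For reference, the command-line versions of the branch and the merge look something like this (using the release numbers from the example above; each command is followed by a check-in of the pending changes):

tf branch $/TeamProject/App/mainline $/TeamProject/App/releases/release8.4
tf merge /recursive $/TeamProject/App/releases/release8.3 $/TeamProject/App/mainline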

Monday, January 14, 2008

VSTS Meeting on Wednesday, January 16th

If you're in the Minneapolis/St. Paul area you'll want to make sure you stop by the Visual Studio Team System Minnesota User Group on Wednesday night at 5:00PM. Details can be found on their website www.vstsmn.net.

I'll be the keynote speaker, which probably does little justice to the term "keynote." In all seriousness, I'll be speaking about our use of Team Foundation Server and Teamprise in a Java world. That is, we're using TFS even though almost our entire development area works in Java.

My discussion will cover our configuration management tool evaluation, why we choose TFS, how we use TFS, and some suggestions and things to look out for when implementing TFS for a Java development group.

Lastly, I hope you've enjoyed my blog so far. You may have noticed that my blog is a bit different than most TFS blogs. There are basically two reasons for this. First, we're using TFS in a Java world. I'd be willing to bet most companies are using TFS for .NET developers. But I think that is starting to change. At least it is for us, as we've been very happy with TFS and Teamprise for our Java developers.

Second, I don't work for Microsoft, Teamprise or a consulting company implementing a Microsoft or Teamprise product. I work for a large software company and have simply been in charge of choosing and implementing a new configuration management tool. So I'm not here to sell anything, just state my observations in hope of helping others adopt TFS and to hopefully help make the product better.

I hope you've enjoyed it so far!

Thursday, January 10, 2008

Bug in the TfsWitDisplayNames tool

So HR contacts me yesterday afternoon and says we're going to have a 'displayName' change in Active Directory. Being the TFS Administrator, my heart sank. I knew this was going to be an issue.

I've actually been through this many times before, yet not with 250+ users. Usually it's just a handful every week. I've been using a tool called TFSLocalize which I believe was a precursor to the new tool TfsWitDisplayNames.

To get the tool you need to call support. It can take a bit to get routed to the correct group. I'd recommend you tell the "router" to look up the KB article number (i.e. 932717) so they know what to do, or at least ask someone in their group what to do. As usual my experience was a bit rough, but eventually I did get to the 2nd level support group. Once I got there, they were very helpful.

Anyway, like I said, we've been through this a number of times in the past, so I'm pretty familiar with the issue. The TfsWitDisplayNames tool works pretty well. The documentation is well put together, which helps calm your nerves after you get over the fact that you have to map 250+ users.

Here is an observation we had when running the tool with the /d switch, which I think indicates you are accounting for deleted users. The error we get might happen without the /d switch as well, but I didn't test that.

Here is our first mapping file and its output. You can see that we have three entries. The first two are for users who are no longer with the company; I mapped them both to me. The third user is an employee who changed business units and thus had his 'displayName' changed.

Mapping File #1
oldValue="Gates, Bill (MyCompany)" newValue="Noland, Mac (MyCompany)"
oldValue="Harry, Brian (MyCompany)" newValue="Noland, Mac (MyCompany)"
oldValue="Hodges, Buck (MyCompany)" newValue="Hodges, Buck (MyOherCompany)"
Output
Processing changes for ...[Field Count : 13; Value Count: 3]
Value: Gates, Bill (MyCompany) -> Noland, Mac (MyCompany)
Value: Harry, Brian (MyCompany) -> Noland, Mac (MyCompany)
[WARNING]: Skipping update. Value 'Harry, Brian (MyCompany)' for field '' has already been changed.
Processing changes completed.


You can see that with Noland, Mac (MyCompany) appearing as the newValue twice in a row, the tool stops after the first update with a warning. Now look at our second mapping file. Here I've removed the duplicate, and it produces correct results.

Mapping File #2
oldValue="Gates, Bill (MyCompany)" newValue="Noland, Mac (MyCompany)"
oldValue="Harry, Brian (MyCompany)" newValue="Noland, Brock (MyCompany)"
oldValue="Hodges, Buck (MyCompany)" newValue="Hodges, Buck (MyOherCompany)"

Output
Processing changes for ...[Field Count : 13; Value Count: 3]
Value: Gates, Bill (MyCompany) -> Noland, Mac (MyCompany)
Value: Harry, Brian (MyCompany) -> Noland, Brock (MyCompany)
Value: Hodges, Buck (MyCompany) -> Hodges, Buck (MyOtherCompany)
Processing changes completed.


This produced the results we expected, but I think TfsWitDisplayNames has a bug. I sent this back to support, so hopefully the duplicate issue will be fixed. In the meantime, you have to remove all duplicates and/or use multiple mapping files. Unless you've accepted, like I have, that this is a huge pain in the butt, that may frustrate you even more.

Attention Microsoft: This is becoming a bigger, and bigger, and bigger problem as our TFS instances (currently four) continue to grow. I can safely speak for the 1000+ users we have (or soon will have).