Thursday, December 27, 2007

Installing TFS Power Tools with PowerShell 1.0

This is not really related to TFS, but I need a place to write it down and my TFS blog felt like the best place. To install TFS Power Tools on our application tier we first had to install PowerShell 1.0. Unfortunately, when we tried to install PowerShell we got the error listed below.

0.313: 2007/12/27 10:42:01.209 (local)
0.328: d:\4dd898e4a95d73bd5cfec8a50379\update\update.exe (version 6.2.29.0)
0.344: Failed To Enable SE_BACKUP_PRIVILEGE
0.344: Setup encountered an error: You do not have permission to update Windows Server 2003.
Please contact your system administrator.
0.359: You do not have permission to update Windows Server 2003.
Please contact your system administrator.
68.000: Message displayed to the user: You do not have permission to update Windows Server 2003.
Please contact your system administrator.
68.000: User Input: OK
68.000: Update.exe extended error code = 0xf004
68.000: Update.exe return code was masked to 0x643 for MSI custom action compliance.

I found this article from Microsoft. It looks like the machine's group policies were a bit out of whack. We fixed them by going to Start > Run > gpedit.msc and, following the steps in the article above, adding machine-name\Administrators to the "Back up files and directories" and "Restore files and directories" policies.

After a reboot we were able to install PowerShell 1.0 and TFS Power Tools.

Teamprise 3.0 Preview Release #1

If you use Teamprise, which you should if you want a great cross platform tool, you'll want to check out the Teamprise 3.0 Preview Release that just came out. 3.0 has a number of nice features like check in policies, Team Build integration and recursive compare that you'll want to take advantage of. Contact support for details.

I've played around mostly with the recursive compare and sent them the following feedback.

  • When you "Compare" a source tree (e.g. $/TeamProject) to a local workspace (e.g. c:\my_development), the Compare tool you set up under Tools > Preferences > Compare Tools is not used when you double-click a file. Should it be? I noticed that when I compare file to file, the compare tool I selected is used.

  • Maybe just a personal preference, but I notice that other tree-structure diff tools (e.g. TreeDiff for Team Explorer, Beyond Compare, or even SourceGear's DiffMerge) use a common format where location #1 is on the left-hand side and location #2 is on the right-hand side. I think something like that would be more familiar than the single Structure Compare window that 3.0 preview #1 has now.

  • It might be nice to add some of the Source Control functions to the compare window. For example, if you see a file has changed but is not checked out, the user could right-click and say Check Out for Edit. I think VSS had this, and the TFS Power Tools Tree Difference has it as well.

  • Along with the bullet right above, it might be nice to have some Source Control metadata with the files, like showing whether or not they are checked out.

Friday, December 21, 2007

Team Explorer and Teamprise Date format difference

So I'm writing this extension for Anthill Open Source so that Anthill can talk to TFS from a Unix/Linux machine. We still have a need for Anthill and it was requested of me by our testers. I did the same for Windows months ago and it has worked remarkably well. See my previous posting on where to get the code. I only mention "remarkably well" as there is probably a good reason why I don't develop much software; I was never very good at it.

The Windows adapter uses tf.exe, which is provided by Microsoft. The Unix/Linux version uses Teamprise. Both command lines work well for what Anthill needs to do.

In Anthill there is logic where it will check to see when the last code change was and compare it with the last successful build Anthill has recorded. If there is a change AFTER Anthill's last successful build, a build will fire off. If there is NOT a change, the build will not run. Very nice feature that I think Team Build will have in TFS 2008. If not, my colleague wrote a program that does this for us in Team Build 2005 and we can just apply the concept to Team Build 2008.

Back to Anthill. What we do is run a history command to get back the "Date" of the most current changeset on a particular folder. Tf.exe gives us back the format "Wednesday, May 23, 2007 4:25:41 PM" which translates into a SimpleDateFormat of "EEEE, MMMM d, yyyy h:mm:ss aa". Teamprise on the other hand, returns "May 23, 2007 5:24:39 AM" which translates into a SimpleDateFormat of "MMM d, yyyy h:mm:ss aa". This causes a cross platform issue as the SimpleDateFormat is used for the compare with Anthill's last successful build.
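Since the only things I can vouch for are the two sample strings above, here is a minimal sketch (the class and method names are mine, not Anthill's) of how the adapter can try both patterns:

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Locale;

public class ChangesetDates {
    // The two observed patterns, taken from the post above.
    private static final String[] PATTERNS = {
        "EEEE, MMMM d, yyyy h:mm:ss aa", // tf.exe: "Wednesday, May 23, 2007 4:25:41 PM"
        "MMM d, yyyy h:mm:ss aa"         // Teamprise: "May 23, 2007 5:24:39 AM"
    };

    // Hypothetical helper (not Anthill's actual code): try each known
    // format in turn and return null if none of them match.
    public static Date parseChangesetDate(String text) {
        for (String pattern : PATTERNS) {
            try {
                return new SimpleDateFormat(pattern, Locale.US).parse(text);
            } catch (ParseException ignored) {
                // fall through to the next pattern
            }
        }
        return null;
    }
}
```

If neither format matches, the method returns null so the caller can fail the build trigger gracefully instead of firing a spurious build.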

I've modified our code to account for the difference and will be posting it back up to Anthill OS as soon as we have it fully tested. I doubt many people will have an issue like this, but if you are dependent on the "Date" format coming from Team Explorer and Teamprise, note they are slightly different.

Tuesday, December 18, 2007

Permissions for Labeling

A colleague of mine ran across an interesting scenario today. They have a large Team Project with multiple code bases in it. They control permissions in Source Control so only certain groups can work on certain code lines. An example is something like this:

$/TeamProject/CodeLine1
$/TeamProject/CodeLine2

They wanted to ensure that only "CodeLine1 Contributors" could label CodeLine1. And vice versa. To do this, they gave "CodeLine1 Contributors" LABEL permission to $/TeamProject/CodeLine1. And vice versa. But when a member of "CodeLine1 Contributors" tried to label "CodeLine1" they got the error TF14098: Access Denied: User %USER% needs Label permission(s) for $/TeamProject.

My assumption is that when you label a folder in the structure, the label is actually applied to the folder's parents as well. I'm pretty sure there is good reason for that. I'm guessing it has something to do with the fact that when you label something you want to take a point-in-time baseline of the artifacts. Since the parents are part of the artifact structure, it applies the label to the parent as well. This is just a hypothesis, but I think it makes sense.

Anyway, to fix this, "CodeLine1 Contributors" and "CodeLine2 Contributors" need Label access to $/TeamProject. At that point I think you can DENY "CodeLine1 Contributors" on CodeLine2 and DENY "CodeLine2 Contributors" on CodeLine1. Obviously, if you have a ton of code lines in one Team Project this can take a few minutes to get set up. The good thing is, I think you only have to do it once.

Please feel free to refute my assumptions if you find them incorrect.

Thursday, December 13, 2007

Initial Results of TeampriseBuild for TFS 2005

I've spent a few hours working with TeampriseBuild and to be honest, I love it! With Martin's instructions a novice could get up and running within an hour or so (I seem to take longer due to excessive note taking and pondering). That is assuming their Ant scripts work autonomously from the initiation mechanism. In our case they do, so I was able to simply use his instructions to call an Ant file just like we'd use Anthill or CruiseControl to call an Ant script.

I did have to edit the WorkspaceMapping.xml to make sure we weren't bringing down too much code. This has nothing to do with TeampriseBuild; it's just an observation about Team Build that is not always apparent.

I also like Martin's idea of not having to create the bogus solution file. "In a future release of Teamprise we hope to remove this restriction". While it's easy to create the bogus solution file, it would be nice to avoid it. It just confuses people more than anything.

We did need one deviation from Martin's write-up. To get the logging correct we had to change the TFSBuild.rsp file to use /v:d, which is detailed logging for MSBuild, instead of /v:n. /v:n did not give us Ant's standard output; /v:d did. So while you have to dig through a bunch of MSBuild junk, you do get to see the output returned by Ant.
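For reference, the change boils down to one switch in the response file. The rest of your TFSBuild.rsp contents will differ, so treat this as a sketch:

```
# TFSBuild.rsp - additional command-line options passed to MSBuild.
# /v:n (normal) hid Ant's standard output for us; detailed verbosity surfaces it.
/v:d
```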

Some of our Ant scripts take care of their own logging so it's not a big deal. But it would be nice to not see all the "detail" information in the log file and just get back Ant's standard output.

Overall, I give this effort an "A". Keep up the good work Teamprise.

Tuesday, December 11, 2007

Teamprise Offers Team Build Solution

A few months back a colleague of mine found a way to use Team Build to call our Ant build scripts. It works quite well to be honest. You can see my original post on how we do it.

I just saw that the good folks at Teamprise are offering what I would guess, is a better solution. I've got that same colleague of mine checking it out, but if it's anything like other Teamprise offerings, it should be a nice addition for those of us using TFS in J2EE development groups.

Check TeampriseBuild out and give Martin some feedback on what you think. We plan on doing the same.

Friday, December 07, 2007

A scare when changing workspace mappings

Here is a good way to scare even the most seasoned developer.


  • Set your workspace U000001-XPA mapping so $/TeamProject/MySource is pulled down to c:\MySource
  • Check out $/TeamProject/MySource/MyFile.cs to c:\MySource\MyFile.cs
  • Make a change to MyFile.cs, but do not check in.
  • Now let's say for some reason you want to get the code $/TeamProject/MySource to c:\MyTempSource. To do so you decide to change workspace U000001-XPA mapping so $/TeamProject/MySource is pulled down to c:\MyTempSource.
  • Do a Get Latest.
  • Open up My Computer and you should see code under c:\MyTempSource. You should also notice that all the code, including the file that was checked out and changed, is gone from c:\MySource.
  • Start freaking out.

This just happened to one of our tech leads. Luckily he checked in the code before running through these steps. So he felt alright, but did have me over to his desk to take a look and see if it's correct behavior. I didn't think it was and did some testing.

Well, good news. Team Explorer and Teamprise are smart enough to move the changes under the new mapping location, in our case c:\MyTempSource. In addition, Team Explorer and Teamprise are smart enough not to delete any files that were in the original location (i.e. c:\MySource) but NOT added to the list of pending changes.

I think this is correct behavior, but it does scare us a bit.

Thursday, December 06, 2007

Long source file names don’t work well in TSWA 2005

We have not run into an issue with long file paths in TSWA, but another TFS administrator in this monster of a company has. Apparently it's caused by having a file path of more than 260 characters. We're probably not running into the error for one of these two reasons.

First, our localCache path is set to D:\cachedir from the default of C:\Program Files\Microsoft Visual Studio 2005 Team System Web Access\cache\. Thus saving us 64 characters.

Second, who interacts with source code from TSWA? While it's a nice feature to put in the marketing material, our users don't use it often.

Here is a good posting explaining some more of the gory details. Sounds like the issue is fixed in TSWA 2008.

Tuesday, December 04, 2007

Setting up DiffMerge in Teamprise

To be honest, I don't like the compare tool that Teamprise defaults to. And if you've noticed - or if I've missed it somehow - Teamprise does not ship with a merge utility. Teamprise does, however, provide documentation on how to set up new compare and merge tools. See their user guide for how to set up kdiff3.

Anyway, I seem to like DiffMerge better. So here is how we set up DiffMerge on our machines. NOTE: Some developers use other compare/merge tools as well, but I recommend DiffMerge if they don't have a preference.

I'll assume you know how to get DiffMerge installed and find your way to Teamprise Preferences. If not, then read the user guide as it will lead you there.

For compare our Arguments are %2 %1 -t1 %7 -t2 %6. If you read through DiffMerge's documentation, this syntax will put the server file (i.e. %2) on the left hand side, your local file (i.e. %1) on the right hand side and title the panels with (Latest Version) on the left and (Local Version) on the right.

For merge our Arguments are %2 %3 %1 -result %4 -t1 %7 -t2 %8 -t3 %6. If you read through DiffMerge's documentation, this syntax will put the local file (i.e. %2) on the left hand side, original baseline file (i.e. %3) in the middle, server file on the right hand side (i.e. %1), resulting file (i.e. %4), and then title the panels with Yours on the left, Original the middle and Theirs on the right.

Hope this helps!

Wednesday, November 28, 2007

"Does TFS support Java code?"

One of the most common questions I get from managers of J2EE development groups is, "Does TFS support Java code?" To the untrained eye, this is a valid question as most managers are a bit removed from their coding days.

Those of us in the know, though, understand that most development languages are written in ASCII-based files. In Java, for example, we write our code in .java files, which are ASCII based. TFS versions them just fine, just like it would Perl, Python, C/C++, etc. And even if you write in some format that spits out a binary file, TFS would version that just fine as well, though, like all Source Control tools, not quite as efficiently.

To TFS, an ASCII file is an ASCII file. TFS knows nothing of J2EE and, to be honest, does not care. I know most of you understand this, but I wanted to drop a note as it's a common question we get from non-technical folks.

Tuesday, November 27, 2007

Update 2 on the horrid "There may be problems with the work item type definition" error

Well I'm back from an extended vacation / company holiday. Before I left a week or so ago, I was working with a member of the TFS development team on what I have been referring to as the horrid "There may be problems with the work item type definition" error. Search this blog for "horrid" and you'll find a number of posts.

First of all, I'd like to thank the TFS development team for reaching out to us on this error. It's very random and I can't reproduce the error on demand, so I was ambivalent about submitting it through connect. The TFS development team however, reached out and asked if they could help. As it turns out, they could.

What we're finding is the error seems to be related to a user's SQL View being out of sync with the tables, after a work item template change. Example changes we're making are adding/removing states and then adding/removing rules (e.g. REQUIRED) around those states. We've come to this conclusion, as first it only happens to certain users (on a variety of different machines). And second, the most common fix is to remove/add the user. From what development tells me, this act will force a refresh of the user's View.

Anyway, we're not out of the woods yet on this. But I'm really happy with the support MS has been offering us with TFS. Compared with my frustrating customer support experiences with ClearCase, working with TFS (and Teamprise) support has gone swimmingly.

Monday, November 19, 2007

Migrating Legacy Defect Data to TFS

Not too much new here. We just passed the halfway mark in our TFS migration and so far everything is pretty good. A few minor issues like the AD displayName not syncing with TFS, but nothing too major. Knock on wood.

There are a number of posts on this, but I thought I'd add my experiences with migrating legacy defect data to TFS. Like most everyone else, we simply used Excel with the TFS plug-in. Here is our process.

- Lock down the legacy defect management tool so it's read only.
- Export the data to Excel. We just migrated the open defects.
- Map the fields. For example, Owner in the old tool with the format "FirstName LastName" mapped to Assigned To in TFS with the format "LastName, FirstName (Company)". This is actually the hardest part.
- Run the Publish.
- Verify results.

The one thing we did that really helped was to migrate the legacy ID number to a simple text field in TFS. This allowed the user a reference to go back and look at data we were unable to migrate like state history.

Good luck with your migrations!

Thursday, November 15, 2007

TFS 1.5, TFS Orcas, VSTS 2008 Team Foundation Server?

I've been referring to the next release of TFS as "TFS 1.5."

"Hey Mac, what's the next release for TFS?"
"That would be TFS 1.5 my good man. Can't wait to get our hands on it as it has some nice Team Build features, like continuous integration, we want to try out."

And like a lot of things, I've been misinforming people in the office. While I've heard people refer to the next release as TFS 1.5, Brian Harry has confirmed for me that there is no official "1.5" release number in the next TFS release. He did admit though that some people have called it 1.5, informally that is.

I've also heard of it called "Orcas" which as it turns out, is somewhat correct. The official name from MS marketing (or whoever makes up these names) is "Visual Studio Team System 2008 Team Foundation Server."

Semantics aside, this release is a minor release that does not have a lot of the bells and whistles that, what I've heard people call Rosario, will have. If I remember right, Rosario will have the really cool things like work item hierarchy.

Not to cut TFS 2008 (yet another term I just made up for it) short though. Among other things, it will provide SharePoint 2007, Continuous Build, server side tools for deleting work item types (we're going to love this), and the obligatory list of bug fixes. Brian has a nice list of things here that you can check out.

Monday, November 12, 2007

One way to rollback changes using Teamprise

I don't purport to say this is the best, or only way to rollback changes using Teamprise Explorer. It's just one way we came up with.

Before you do this, make sure you DON'T have "Automatically get the latest version before check out" selected under Tools > Preferences > Source Control. From my findings, this setting can cause some issues with the solution below. I'm working with Teamprise to see if there is a bug, or if I'm solving a problem with the wrong solution.

To rollback changes, or in other words, revert the latest version to a previous version:

  • Right Click on the file you need to rollback and select View History. Right Click on the appropriate changeset and select Get this version.

  • Right Click on the file in the Source Control window and say Check out for Edit. You should now have the old version of the file checked out.

  • Right Click on the file and select Check In Pending Changes...

  • The Resolve Conflict window will pop up. Select the file and click Resolve... Select Undo Server changes. This will tell the server that you mean to change the most current version of the file on the server.

  • Lastly, Right Click on the file and select Check In Pending Changes... This will commit the changes to the database.

Thursday, November 08, 2007

Disabling Visual Studio 2005 Just-In-Time Debugger

Being a TFS administrator, I run into some of the oddest situations. Here's today's.

A user of mine works with files located on a Unix machine, but accessible via Samba. She basically just looks at logs. Because Notepad does not handle the "end of line" character well (it will not automatically stick in a carriage return when viewing the file), she uses WordPad. Yes, you're probably just as surprised as I am. Some people still use WordPad!

I had a good laugh asking her why she used such an antiquated tool. It's because WordPad will insert a carriage return when viewing so the log file looks nice. I directed her to more advanced tools (Textpad is my favorite), and she said she'd look into getting something better in the future.

Anyway, the real problem was that WordPad was throwing random exceptions for her. In the past she got the "WordPad MFC Application has encountered a problem and needs to close. We're sorry for the inconvenience." Thanks Microsoft. I appreciate your apology.

After installing Visual Studio Team Explorer, it was tossing up the Just-In-Time debugger and asking her if she wanted to debug the error in WordPad. This confused the user and prompted an email to the TFS administrator who made her install the offending software.

My goal was to turn off the debugger and return to Microsoft's apology note. I first started by going to Visual Studio's Tools > Options > Debugging > Just-In-Time and unchecking all the boxes. The standard Just-In-Time debug message went away, but it was replaced with an ugly "you don't have a default debugger configured, dummy!" message. Or something along those lines.

A quick Google search led me to this article. It shows you the two reg hacks to fully disable debugging on your machine. Remember, standard registry backups should be applied before doing these steps.

I'm glad to note the user is now back to normal and continues to use WordPad and still gets random exceptions.

Monday, November 05, 2007

SQL Reporting Services Training

Three of us are in "SQL Reporting Services" training this week. Class "2793A: Implementing and Maintaining Microsoft SQL Server 2005 Reporting Services" to be exact. My suggestion is, if your first introduction to Reporting Services is with TFS (like ours was), spend the time and money to take a class. Reporting Services is not easy.

This class is pretty basic, but it does help you get the fundamentals down. After you take the class, I'd suggest you take something that works with Analysis Services. The material in 2793A is centered around pulling data from a relational model. TFS, on the other hand, has the cube, which we'd like to use. Thus the instructor suggested our next stop be an advanced course in Analysis Services.

Friday, November 02, 2007

Update on the horrid "There may be problems with the work item type definition" error

The issue TFS has with the AD displayName changing is bad. But now that we know what it is, it's not so bad. On the other hand, the horrid "There may be problems with your work item type definition.." error continues to give us fits.

The first time we got it, the error was legitimate, though not descriptive. We had a state named the same as a group. Microsoft is fixing this. Fair enough once we got it figured out. Now we seem to get it whenever we're making a lot of state and transition changes. Throw in a few rules (e.g. REQUIRED) and we have real problems.

We got it again on Monday. This time I tabled everything I was working on and dedicated myself to figuring it out. I turned logging on by adjusting the web.config's traceLevel to 4 and traceWriter to true. Be careful with this if you don't have a lot of disk space, as it will generate a lot of information. Even with all that information, nothing good came from the logs except the standard "Microsoft.TeamFoundation.WorkItemTracking.Server.ValidationException: Forcing rollback ---> System.Data.SqlClient.SqlException: Forcing rollback" stack trace. Then I deleted all my local cache (typically your first step), but continued to get the error.
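For what it's worth, the two key names come straight from the steps above, but exactly where they live inside the application tier's web.config may differ, so verify against your own file before editing. A sketch:

```xml
<!-- Sketch only: confirm the actual section in your application tier's
     web.config before making this change. -->
<appSettings>
  <add key="traceLevel" value="4" />
  <add key="traceWriter" value="true" />
</appSettings>
```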

Next, out of curiosity, I tried logging into a new machine with my account. Same issue. I then tried Teamprise and TSWA. Same issue in both. Finally, before quitting my job and moving to Mexico, I tried logging into Teamprise using a different account, something you can do with Teamprise and a nice feature at that. This time it worked!

So why? My local cache was all removed. I tried the same account on multiple machines and still got the error. The permissions between my account and the account that works are the same. As a last resort, I restarted IIS thinking maybe something was cached on the server side. Just to be extra safe, I also removed all my local cache again. Sure enough, that did it. I could once again create work items.

So my hypothesis is, TFS must be doing some server side caching for work items. Or maybe there was something in memory that was not being cleaned up correctly? I'll continue to see if this solution works and post my results.

Wednesday, October 31, 2007

"Culture name 'en-securid' is not supported. Parameter name: name"

A user in England was getting the error "Culture name 'en-securid' is not supported. Parameter name: name" when accessing Team System Web Access 1.0. I asked Google and Live. No luck.

I then started to dig. In IE if you click Tools > Internet Options > Languages.. you'll see the Language Preference dialog. The user with the issue only had "User Defined [en-securid]" listed. They added "English (United States) [en-us]" right below "User Defined [en-securid]", but still got the error. We then had them Move Up "English (United States) [en-us]" to the top value, click Ok > Ok and the error was resolved.

If you run into a similar error, give this a shot and let me know how it goes.

Monday, October 29, 2007

"Application Not Found" in Team Explorer

A user in my group was getting "Application Not Found" when clicking on any attachment using the Team Explorer in Visual Studio 2005. The error was fixed when we (actually Desktop Support) changed the user's default browser from Firefox to Internet Explorer.

Because I'm curious, I changed my default browser to Firefox but was unable to repeat the error on my machine. I got a prompt from Firefox asking me what program to open the file with. My guess is, the user's Firefox settings are a bit off.

Anyway if you do get "Application Not Found", you may want to see what the default browser is and see if changing it to IE will fix it.

In other news, another person blamed firewall settings. You can find that potential solution here.

Friday, October 26, 2007

How to implement check in policies

We met with Teamprise this morning to discuss check in policies. One of the larger business units in the company has a long list of policies they'd like to see, including, but not limited to, Work Item Association, Comment Association, a Forbidden Pattern policy and an Are You Breathing policy. We (a smaller business unit), on the other hand, are only looking for the first one, Work Item Association.

If you're coming from a tool that has never enforced check in policies before (like we are with VSS) here is my suggestion for how to best implement them. Keep things simple and follow the three step process I've laid out below.

Step 1) Don't implement any policy right away. That is, try to mimic your current source control workflow in TFS. If users could check in and out at will, then let them. This non-intrusive approach allows end users to get comfortable with the new tool by using a common approach they're accustomed to. Remember, change is hard for people.

Step 2) Start with one policy and one policy only. I'd suggest all user bases start with Work Item Association. Asking the user to simply select what piece of paperwork (e.g. Task or Bug) the code was addressing, is an innocuous approach to improving your processes. Remember, change is hard for people so don't Big Bang them. Lull them into a better place like you do when boiling a frog (not that I've ever tried that...).

Step 3) Don't get crazy. That is, don't feel like you have to implement every single check in policy known to man. Engineers are smart, that's why we're engineers for goodness sake. Assume that you don't need a check in policy to verify that we got morning coffee. We all know how to get morning coffee. A tool does not need to tell us so.

By following this three step process, you'll find a more accepting user base and a better TFS/Teamprise implementation.

Tuesday, October 23, 2007

Accessing TFS from Windows Explorer with "Dubbelbock TFS"

When the big wig at IBM came in to show off ClearCase, he waved in my face the ability to interface with ClearCase by Right Clicking on a file in Windows Explorer, and then seeing the Source Control features at the tip of my proverbial fingers. While I didn't ever think I'd use it, I did find it an interesting feature. Personally, I like to interact with Source Control in my IDE only, but obviously others differ.

Anyway, to my point. A fellow blogger, who happens to work at the same company but uses a separate TFS instance, found a tool called "Dubbelbock TFS" that allows just this. I don't think I'll ever use it, but I did find it a nice example of how easy it is to extend TFS and how many small projects are out there to improve developer efficiency.

Lastly, just a note on my game theory analysis. I see ClearCase as a direct competitor of TFS. Since ClearCase has this Right Click in Windows Explorer functionality, I'm thinking MS might want to add it, even though I don't think users like me will ever use it. Reason being: when all the vendors show up knocking on your door, they like to tout all the neat bells and whistles centered around functionality. When the ClearCase vendor showed this Right Click deal, I looked around the room and saw a bunch of manager types peek above their BlackBerrys and nod, as if indicating, "Oh, that is nice. Let's make it a mandatory requirement for our next SCM tool so I don't get fired for buying a tool that does not have it." So even though I think the functionality is unneeded (at least by my group), game theory says MS should add it to a forthcoming release.

Monday, October 22, 2007

ReportViewer only pulls back 100 Work Items by default

Our release manager comes to me today and says he really likes the Report Viewer option in Team System Web Access, but hates the fact that he's limited to viewing only 100 work items. At first I thought he was crazy - I have similar thoughts about most manager types - but after peering over his shoulder, he was right; only 100 work items showed up.

I started by digging on the forums and didn't find much. Turns out Google didn't know either so I started to track it down in our DEV environment.

If you open the Web.config under \Program Files\Microsoft Visual Studio 2005 Team System Web Access\Web, you'll find a setting for "maxWorkitemsInReportList" which defaults to 100. Push the value up a bit (we set ours to 200) and you get the requested results.
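In context the change looks something like the fragment below. The enclosing element is my assumption, so find the existing key in your file rather than pasting this in blindly:

```xml
<!-- Web.config under \Program Files\Microsoft Visual Studio 2005
     Team System Web Access\Web; surrounding structure assumed. -->
<appSettings>
  <add key="maxWorkitemsInReportList" value="200" />
</appSettings>
```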

Why didn't I set it to 2000? Here is my logic; I figure Microsoft put a limit on the number of work items for a reason. I'm guessing performance, though I don't have any proof. The release manager said he would not expect anything near 200 at this point, but did have queries that pulled back more than 100. So we agreed on 200 for now and shook hands. We can gradually push the value up as we go.

I'd be interested to hear from someone with far more Report Viewer knowledge than I as to why there is a default max limit set.

Thursday, October 18, 2007

Bug in the Power Tools recursive "Compare" utility (i.e. TreeDiff) when working with Windows and Unix ASCII files.

If you're a user of VS.NET 2005 with the TFS Power Tools and work with files edited in both Windows and Unix, you may find some odd results with the recursive Compare tool, which I believe uses TreeDiff and DiffMerge combined. The issue is, when you set up Compare to ignore end-of-line characters (/ignoreeol), which you need to do as Windows uses Carriage Return (CR) + Line Feed (LF) and Unix only uses LF, the Tree Difference does not honor the /ignoreeol setting. Nor did we find a way to tell TreeDiff to ignore EOL. Below you'll find the steps to reproduce and my opinion on why this is wrong. I've submitted this to Microsoft, so you can get the details here.

1) Open a file in Windows Notepad and enter two lines of text (e.g. c:\temp\CRLF\test.txt).
2) FTP the file in binary mode to a UNIX machine.
3) Run cat -v on the file (e.g. cat -v test.txt) to see the ^M which shows the carriage return.
4) Run dos2unix on the file (e.g. dos2unix test.txt) to remove the ^M.
As an alternative, you can also transfer the file to Unix in ASCII mode, which removes the ^M.
5) Transfer the file back to Windows to a different directory (e.g. c:\temp\LF) in BINARY mode, which will not add the ^M back in (e.g. c:\temp\LF\test.txt).
6) Edit your VS.NET 2005 Compare Tool (i.e. DiffMerge) to use /ignoreeol. See my post for details on how to do that.
7) Now use the Right Click > Compare tool in VS.NET 2005 with the Power Tools add-in to compare directories c:\temp\LF and c:\temp\CRLF.

The Tree Difference will show the files as different, but if you Right Click > Compare Files, you should get "The files are identical" as it remembers you included /ignoreeol in your compare settings.

This is incorrect: the Tree Difference should honor your Compare setting overrides (or allow overrides of its own) and not flag the files as different.
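A quick way to convince yourself outside the Tree Difference window is to call DiffMerge directly with the same arguments the Compare tool uses (paths are from the repro steps above):

```
"C:\Program Files\Microsoft Visual Studio 8\Common7\IDE\diffmerge.exe" c:\temp\CRLF\test.txt c:\temp\LF\test.txt /ignoreeol
```

With /ignoreeol the two files should come up as identical, which is exactly what the Tree Difference fails to reflect.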

Wednesday, October 17, 2007

The importance of Teamprise

A few months ago Microsoft's Gregg Boer was nice enough to fly in and spend a day with us discussing TFS. If I understood him right, one of his goals was to "not" develop a SCM solution for Microsoft, and Microsoft only. That is, TFS needs to be a product that helps companies outside of Microsoft solve problems. As a software company, my organization struggles with this as well. We get caught up developing software that works well for us, not necessarily for the user base.

Anyway, I took Gregg's comments to heart, as we're using TFS in a business unit that develops all of its products in Java. We don't need a SCM solution geared towards Visual Studio users; we need one for Java developers. Thus we leverage Teamprise for our RAD/Eclipse users and Linux desktop users. So far we've had pretty good success and great support from the vendor.

As much as Microsoft would like to see it, large companies are not going to use .NET exclusively. Sure we're going to use .NET, but we're also going to use other languages like Java, Python, or Perl. If TFS wants to compete with ClearCase or Perforce (which I think it should, as TFS is positioning itself very competitively), then Microsoft needs to ensure vendors like Teamprise continue to offer cross platform solutions. I would go so far as to say that if it were not for Teamprise, our organization would have gone with ClearCase/ClearQuest or Perforce (plus an additional workflow tool). Both of those tools give our Java developers and Linux users interfaces from their native environments, which was one of our top ten requirements.

In closing, we really like TFS and feel we made the right decision. But if it were not for Teamprise, we would be using a different SCM tool. That's how important Teamprise is to Microsoft.

Monday, October 15, 2007

"Connection Failed" in our Webex Training files for TFS

This is not really related to TFS, but I wanted to write it down someplace so when I have to do it again (hopefully never) I know where to look. We use Webex a lot when training users on TFS. Showing a user something rather than writing up a long-winded email seems to work better.

Anyway, we (actually our server support folks) just got our Webex web server set up. It's nothing more than a simple web server running IIS 6.0. Unfortunately we were getting a "Connection Failed" error when launching the WRF files. In the past we had just a simple file share and the WRF files launched right away. After reading this post from Experts Exchange, we found that we had to add a MIME type for WRF. Once we did this, the WRF files fired right up. Below are the steps for setting this up in IIS 6.0.

1) Right Click on Website and select Properties
2) Click on HTTP Headers
3) Select MIME Types at the bottom
4) Select New
5) Enter "wrf" (no quotes) for the Extension and "application/octet-stream" (no quotes) for the MIME Type
6) OK, OK, OK.
7) Restart Website. Not sure if this is needed, but it's always a good step to take with Microsoft products.

* Here is another site we used for reference.

Friday, October 12, 2007

Fixing odd diff results when using Visual Studio .NET 2005 Compare on files with line feeds

The Compare tool that ships with Visual Studio .NET 2005 is called DiffMerge.exe and it's located under "C:\Program Files\Microsoft Visual Studio 8\Common7\IDE" by default. The posting on MSDN says the default "Option" passed in on the tool is only /ignorespace.

This causes a problem for our hybrid Windows/Unix users (i.e. users who edit files on Unix, then FTP them over to their Windows box to check them in to TFS), as files with Unix line endings were not diffing correctly. Oddly, the line above the line that was different would show a difference, even though the two lines were identical. The line that was actually different (right below it) did not show a difference.

To fix this we had to change the DiffMerge Option to /ignoreeol. These values override the default values (described in the link above). Here are our steps.

1) In VS.NET 2005 go to Tools > Options > Source Control > Visual Studio Team Foundation Server
2) Click on Configure User Tools > Add
3) Fill in the Following

- Extension = *
- Operation = Compare
- Command = C:\Program Files\Microsoft Visual Studio 8\Common7\IDE\diffmerge.exe
- Arguments = %1 %2 /ignoreeol

4) Ok > Ok > Ok
5) Tree down to file $/MyteamProject/MyApp/.../myaffectedfile > Right Click > View History
6) Highlight the last two versions > Right Click > Compare.

For us, the files now diff correctly.

Thursday, October 11, 2007

Long string values in the drop down list for Team System Web Access.

We're seeing a slight usability issue with Team System Web Access. When submitting a new Work Item (e.g. Task), the user is given a new, smaller browser pop-up with the template form. This is fine, but if you have drop-down fields on the right-hand side of the new window which contain long string values (e.g. "This is a really long string value that we have in our drop down list"), the user is not able to get to the scroll bar as it's off the screen.

The solution is to expand the pop-up window so you can scroll the drop-down list. This problem happens in Mozilla as well. We're using 1024 x 768 pixels. Because we're fielding - no pun intended - a number of questions on this, we thought it would be best to submit a case with Microsoft.

Details can be found here.

Monday, October 08, 2007

A follow-up on a few suggestions for TFS.

I thought I'd provide an update on some of the suggestions I've posted the past few weeks.

"Find In Files" - Good news. Microsoft said they plan on adding this feature in an upcoming release. I'm not sure if it's in the Orcas release or Rosario. Since it was just "resolved" in September of '07, I'm guessing it will be in the Rosario release in late 2008 - or so I've heard. Click here for details.

"Multiple Source Control trees pointed to one local directory" - Bad news. It does not look like Microsoft has plans to change this anytime soon. Click here for details.

"Right Click and 'Set Working Folder'" - I've submitted this to Microsoft to see if they'd like to add the feature. Our Teamprise users really like it, so we'll see what Microsoft comes back with. Click here for details.

"Allow 'Get Latest Version at Checkout' to be a preference setting" - We've submitted a suggestion to Microsoft, so we'll see what they come back with. We really like this feature in Teamprise, so hopefully Microsoft will listen to us! Click here for details.

"Undo your check out, but keep changes" - We've submitted this to Microsoft as a suggestion. Click here for details.

"Diff files at check in" - We've submitted this to Microsoft as a suggestion. Click here for details.

Thursday, October 04, 2007

Finding your Active Directory server when extending TFS for Alerts

This is only somewhat related to TFS, but I wanted to document it anyway for my future reference. Because TFS RTM 1.0 Alerts stink, we wrote our own notification server. It simply looks at each work item change, and if the "Assigned To" field changes, it uses the "displayName" (e.g. LastName, FirstName (Company)) to do a look-up in Active Directory and find the user's email address.

In a posting way back - where I share all the things we dislike about TFS - I describe how we're using Mariano Szklanny's solution.

Anyway, I wanted to do some browsing in the Active Directory tree to see what other information we could get back, but I couldn't find the DNS name to point my LDAP browser at. After digging around for an hour, I kind of found it here on Microsoft's page.

I'm not sure if this is right, but basically all I did was run the "nslookup" command. It gave me back the domain controller (I think that is what it gave me), and I was able to connect to Active Directory and get what I needed.

Again, I'm not sure if this is the best way, but it seems to work for us.
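If you want something more targeted than a bare nslookup, Active Directory publishes its domain controllers as DNS SRV records, so you can ask for them by name (mycompany.com below is a placeholder for your AD domain):

```
nslookup -type=SRV _ldap._tcp.dc._msdcs.mycompany.com
```

Each record returned points at a domain controller you can aim an LDAP browser at, typically on port 389.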

Tuesday, October 02, 2007

Using the "TFSUrlPublic" value

When we first setup our TFS instance, for some reason we didn't change the "TFSNameUrl" value to be our DNS name (e.g. tfs.mycompany.com) or uncomment and change the "TFSUrlPublic" value. Not sure why as we did do all the other things you need to do, which are spelled out in a number of blogs. Here is a pretty good one from Etienne.

Anyway, everything worked fine except the alerts were including the local app tier server name (TFSAppTier1) instead of the friendly DNS name. To fix this we simply uncommented the "TFSUrlPublic" value and changed it to match our DNS name (e.g. tfs.mycompany.com).

We also found out, if you don't uncomment "TFSUrlPublic" then "TFSNameUrl" is used as the default. But once you uncomment "TFSUrlPublic" then it overrides the "TFSNameUrl" value for alerts. At least that is the behavior we found.
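For reference, the finished appSettings entries in the application tier's web.config end up looking something like this (URLs are placeholders; the exact placement is from memory, so double-check your own file):

```xml
<!-- Used for alerts when TFSUrlPublic is still commented out -->
<add key="TFSNameUrl" value="http://tfs.mycompany.com:8080" />
<!-- Once uncommented, this overrides TFSNameUrl in alert links -->
<add key="TFSUrlPublic" value="http://tfs.mycompany.com:8080" />
```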

http://geekswithblogs.net/etiennetremblay/archive/2006/07/28/86542.aspx

Monday, October 01, 2007

The horrid "There may be problems with the work item type definition" error - cont.

We got the horrid "There may be problems with the work item type definition" error again today. We get it every few days while making changes to Work Item Templates.

This time we got it when changing the "for=GroupName" attribute on a Transition. The error was only happening to a user who was both a member of a group nested inside GroupName and added to GroupName individually.

We removed the individual user from GroupName and everything worked fine. Unfortunately, like the posting below, adding the individual user back into GroupName did not reproduce the error. It's almost as if the root cause is caching, where the only way to force the tooling to refresh the cache (server-side, I'm guessing, as we got the same error when logging onto a new machine with the same user) is to make a change that forces the system to update it.

Wednesday, September 26, 2007

The horrid "There may be problems with the work item type definition" error

If you've changed any work item templates in TFS, you most likely have run into the horrid There may be problems with the work item type definition error. We've run into this error a number of times and it's been caused by a number of things (e.g. local cache out of sync with the server, having a group with the same name as a state, looking at the system too long ;), etc.).

Anyway, today's horrid There may be problems... error message was caused by having a field REQUIRED in two Team Projects and not REQUIRED in a third Team Project. When we removed the REQUIRED element from one of the first two Team Projects, we got the error in that Team Project. The only way we were able to fix this was to first remove the REQUIRED element from the field in question in all Team Projects, then add it back to the one Team Project that wanted it.

Tracking down this issue was difficult, but since we were only making this one change, we knew it had to be related to the field. This is why we make one change at a time and test it before moving on to the next work item template change. If you make too many in a row and get the error, good luck backing out all the changes to figure out which one is the culprit.

While this seemed to fix the issue, we can't seem to reproduce it so unfortunately I don't think I can submit it to Microsoft. Anyway if you run into this situation, try removing all rule elements (e.g. REQUIRED) from the field and then re-add them. That is what seemed to fix this for us.
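For rule surgery like this we round-trip the template with the witexport/witimport command-line tools; a sketch with placeholder server, project, and type names (check witexport /? for the exact switches in your version):

```
witexport /f Task.xml /t tfs.mycompany.com /p MyTeamProject /n Task
rem edit Task.xml to remove the REQUIRED rule, then push it back up
witimport /f Task.xml /t tfs.mycompany.com /p MyTeamProject
```

Doing it from the command line also makes it easy to diff the exported XML before and after a change.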

Teamprise's "Download (Save Locally).." for Attachments Rocks!

In one of our Work Items we have a process where a number of LDAP modification scripts get attached. In a recent request we had 34 files that needed to be downloaded so the end user could run them.

In VS.NET 2005 we had to "Open" each one and save them individually. In Teamprise Explorer, we were able to highlight all the files and select "Download (Save Locally).." which downloaded all the files in about one second.

A very nice feature from the fine folks at Teamprise. Microsoft should add this to their own Team Explorer.

Tuesday, September 25, 2007

Source Control Usability improvements for Team Explorer and Teamprise - Part 2

Here are some additional observations from end users.

- Diff files at check in

A few users have commented that they like VSS's "Diff" option at check in. I argued that there was no such function, but then I tried it out and sure enough, there is - and it's a nice feature. I've used it a couple of times and found it valuable.

- Show Deleted files in Teamprise

I think this is something that is coming in the future, but I've had one or two users asking why their VS.NET colleague can see deleted files and they can't.

- Undo your check out, but keep changes

In VSS, users can "Undo Check Out" and choose to "Leave" their local changes. As of right now, I think users are making full backups of files, because when they do "Undo Pending Changes" their local files are replaced with the server files. Now you could argue VSS gave the user too much flexibility, but I'm not here to argue; I'm just relaying what users are doing. Personally, I think providing a "Leave" option like VSS does today is fine.

Wednesday, September 19, 2007

Source Control Usability improvements for Team Explorer and Teamprise

Good day! Now that we've been up and running in TFS for about five months and have 150 or so users on the system, I thought it would be a good time to aggregate some feedback from the users on how the Source Control experience can be improved.

- Add "Your current project" functionality like VSS has.

In a VSS user's ss.ini file there is a "Project" setting (e.g. Project = $/Development/client) that remembers where the user was in the tree when they were last logged in. When end users close and reopen Team Explorer or Teamprise Explorer, they have to traverse the tree to get back to the structure they were working in. It would be nice if the tooling remembered where they were, so the user does not need to start over each time they open the tool.

- Changes between or after labels.

I understand why "Labels" don't show up in Version History in TFS. Buck Hodges does a nice job explaining the reasons. That being said, there should be an easy way for the end user to quickly answer the question "What changed since this label (or better, what happened after this label and is not included in it)?" My thought would be to include a tool in the interface (i.e. Team Explorer or Teamprise) where the user can pick a label and then see the changes after it or before it. For us, adding the label to the version history would be sufficient, as we don't edit labels; we just re-label for a change. But again, I understand the issues with doing that.

- Show the date of when the label was first applied (or updated).

Right now, we can't see when a label was applied in the interface (i.e. Team Explorer or Teamprise). In addition to the label name, owner, and comment, I think seeing the label date (applied or last updated) would be helpful.

- Change the "User" in the Version History to a friendly name

No one knows that I'm U000124578924. They simply know me as "Noland, Mac (Company)". Yet in Version History, the unique ID is used and thus we have to look the unique ID up in Outlook when we want to see who actually changed something.

- Enabling the Compare selection on files that are not downloaded locally.

I know this problem can be solved by selecting two files in the Version History result window, but as an Administrator I get a number of questions about why users have to download a file first before "Comparing" it with another file.

- Allow "Get Latest Version at Checkout" to be a preference setting.

In Teamprise we really like the preference setting where a user can select "Get Latest Version at Checkout" instead of having to do a Get Latest and then check out. Team Explorer should include this setting as well. While it's probably philosophical, I'd like to see the setting be the default; it would avoid a number of merge conflicts in our group.

- Right Click and "Set Working Folder"

Teamprise Explorer makes setting up a workspace mapping as easy as a Right Click and "Set Working Folder." Team Explorer should implement this feature, as I field a number of questions from Team Explorer users on how to set up their workspaces. Teamprise users never ask.

- Recursive Compare of server files with local files.

Team Explorer with Power Tools 1.2 allows this. I hear Teamprise will have this feature in a future release as well. It's a very nice feature and one our team used all the time in VSS.

- "Find in Files"

Maybe I'm missing something, but I don't see a "Find in Files" function like VSS had. I've got a number of people asking for this feature. Right now they have to do a Get Latest and then use a separate tool (e.g. Eclipse) to do the search.
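Until Microsoft adds it, the workaround can at least be scripted. A sketch (tree, local path, and search string are placeholders):

```
tf get $/MyTeamProject /recursive
findstr /s /i /m "getConnection" c:\development\*.java
```

findstr's /s recurses subdirectories, /i ignores case, and /m prints just the matching file names.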

- Multiple Source Control trees pointed to one local directory

This is the most common question: "How do I map $/TeamProject/mainline and $/TeamProject/release/7.12 to my c:\Development folder like I do in VSS?" Our answer is, "You can't." That is, unlike VSS you can't have a local directory (e.g. c:\Development) that looks at two Source Control trees. While I can see why TFS did this (e.g. it helps you avoid stepping all over yourself), most people want this feature back in TFS.

I'm sure there will be more requests that pop up, but these are the most common questions we've gotten from the user base so far.

Monday, September 10, 2007

"The permissions granted to user 'Domain\UserId' are insufficient for performing this operation. (rsAccessDenied)" in TFS

We've been up and running with TFS now for a few months. We started by using Source Control, then moved onto Team Build and Work Item tracking. We're now trying to implement Reports. Here was our latest issue.

We started off by using the TFS Administration Tool v1.2 to add everyone as "Contributors" in TFS, "Contributor, Web Designer" in SharePoint, and "Publisher" in Reporting Services. From what we can tell, this adds each user to the Publisher role in Reporting Services under the "Folder" created along with the Team Project (e.g. for Team Project Test123, a Reporting Services "Folder" called Test123 is created).

Unfortunately after doing this, the user base continued to get the error "The permissions granted to user 'Domain\UserId' are insufficient for performing this operation. (rsAccessDenied)". We scratched our heads as when checking "Folder" permissions, the user shows up and has "Publisher" access.

This went on for days (actually months to be honest). After an extra long lunch, I came back to my desk and asked the question "What permissions does the Publisher really have?" You may argue that I should have asked this question long before, but in SQL Lesson 2 they say "Assign the Publisher role to users who will perform all of the tasks provided in the previous roles, with additional permissions for publishing reports and models from Business Intelligence Development Studio." The Browser role is right above this statement.

As it turns out the Publisher had the ability to "author reports or models in Report Designer or Model Designer and then publish those items to a report server", but could not View a report. I'd argue if you can publish a new report to the server, you should be able to View it like Browser right? Wrong.

Maybe we missed something in the TFS setup, but as far as I can tell the "Publisher" role did not have View Folders, View Models, View Reports, or View Resources. So when the TFS Administration Tool set up a Contributor with "Publisher" permissions as the default, we took it - and the documentation - as gospel, and thus frustrated end users and administrators alike.

We ended up fixing the problem by appending all of Browser's permissions (View Folders, View Models, View Reports, View Resources) to Publisher, and so far the problem is solved.

Outside of the "Lesson 2: Setting Item-level Permissions on a Report Server" documentation being incorrect - or at least ambiguous - I'm not sure if there is anyone to blame here. We should have checked the permissions of Publisher instead of assuming Microsoft's documentation was correct.

All in all, I think the root cause is that TFS is laborious to administer and thus causes tons of headaches when trying to implement it. From what I hear, they are working on administration for TFS v2.0, but until then, if you add a user to the Publisher role in Reporting Services, either make sure Publisher has all the permissions Browser has, or add all users to both roles. If you've already got 150+ users set up in TFS, I'd recommend the former so you don't have to edit each user.

Friday, August 31, 2007

When "All Files are NOT up to date"

I admit, my TFS blogging has slowed in recent weeks. We're right in the middle of our largest migrations, so I've been spending more time doing work than telling you about my work. Anyway, today's posting is about removing files from the file system outside of Teamprise Explorer or Team Explorer.

While I don't keep count, I'm guessing ten or so people have stopped by my desk asking why "Get Latest Version" does not work. At first I freaked out and thought we'd implemented a product developers couldn't use, but then found it was simply users working around the tooling.

99.9% of the time when people have this problem, it goes like so. They first set up their workspaces and get the code, but since they're new to TFS, they typically set up the workspace not to their liking (e.g. $/ --> c:\development) and do a Get. This results in a lengthy Get of all the code we have.

To correct this, they first delete the code locally. That is, they open My Computer and delete c:\development\*.*. They don't want all that code filling up their 100 gig hard drives (insert smile here).

After the full delete finishes (it usually takes most of the lunch hour given they just pulled five million lines of code locally), they return to Teamprise Explorer or Team Explorer and try to do a Get just on the projects they need (e.g. $/MyProject/MyApp). Unfortunately TFS says "All Files are up to date" and won't allow them to Get files. This is when they call me, the administrator.

I simply fix their issue by having them do a Get Specific Version and force it to pull the files down locally. To be honest, the message they get says this as well, but for some reason humans avoid reading messages. To prove that, just two days ago a developer asked me how to resolve the check-in policy error "Please provide some comments about your check-in" (insert second smile here).
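For the command-line inclined, the same fix looks something like this when run from inside the mapped local folder (paths are placeholders):

```
tf get $/MyProject/MyApp /version:T /recursive /force
```

/version:T asks for the latest (tip) version, and /force pulls the files down even though TFS believes you already have them.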

So I'm not sure what the ideal solution is, but I'm wondering if TFS ever could be so smart as to first look at the file system before it says "All Files are up to date." This would add overhead, but might save overworked administrators some headaches.

Friday, August 17, 2007

"Undo Pending Changes..." for users no longer with the company

Well I'm back from a two week parental leave and excited to tackle TFS again. Funny how a nice break makes everything better at work. Here is a question posed to me a few minutes ago.

How do you undo a change for a person who is no longer with the company? Apparently people do leave this place. From what I can tell, the needed functionality is not provided in the Team Explorer interface. Luckily it is provided by the command line tf.exe tool. Here is the syntax I used.

tf undo /workspace:U0000001-XPA;Domain\U0000001 /server:http://tfs.mycompany.com:8080 $/MyTeamProject/MyApp/mainline/build.xml /noprompt

Replace the needed values. In our example, U0000001 is the user's unique ID in Active Directory. Yours is obviously much cleaner than the cryptic format we use.
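If you don't know the departed user's workspace name, tf.exe can list it first; a sketch using the same hypothetical server and user:

```
tf workspaces /owner:Domain\U0000001 /server:http://tfs.mycompany.com:8080
```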

Lastly you'll have to make sure you have the correct permissions. I'm in the [Server]\Team Foundation Administrators group, so I'm as close to Superman as you can get.

Wednesday, August 01, 2007

New "Team System Web Access" is great!

August 2007 - We just loaded Team System Web Access 1.0 on our sandbox and we love it! The ability to start, stop, and monitor builds is a wonderful addition. No need to fire up Studio just to launch a build. Or even better, we can now have our wireless-device-toting managers start builds from their BlackBerrys. I think Teamplain 2.0 had some of these features, but we stayed away from it since it was not "officially" released.

In addition to some new features, the load time has decreased significantly, which is comforting. With Teamplain 1.0 it often took a minimum of 5-10 seconds for the application to load in the user's browser. While unscientific, Team System Web Access loads in a second or so.

I also tested the Documents interface, and it looks like a minor bug (actually quite major if you hadn't abandoned the interface like we did) involving data loss was addressed. With Teamplain 1.0 we noticed that all changes were lost when checking a document back in. I'm not 100% sure whether that was because we had something messed up. With Team System Web Access we are now planning to actually use the Documents interface.

We haven't played around with the Source Control tab yet, but there looks to be some nice light weight tools for viewing source and source meta-data.

Our final opinion will be made once we get this on our DEV and PROD systems, but so far we've had nothing but good vibes about this product. It will make adopting TFS much easier for management (who don't want to install Visual Studio).

Thanks for releasing this before Orcas!

Tuesday, July 17, 2007

"No history entries were found for the item and version combination specified"

July 2007 - My worst TFS fear almost happened today. I Right Clicked on a folder under Source Control and selected View History using Team Explorer. After spinning for a few seconds I got the error "No history entries were found for the item and version combination specified." "What the heck is that?" I said to myself. I was able to tree through the source and open a file, but couldn't do a Get Latest or perform any other function. "Am I losing my mind? This just worked fine yesterday."

My first few hypotheses were based on my previous experiences as a VSS administrator. "We have a corruption issue," I screamed out to my local contemporaries. The .NET guys had worry written all over their faces. The Java guys started to snicker.

I Googled the error and got back two unrelated hits. I tried Live Search and, ironically, got the same two hits. Coincidence?

Anyway, after suffering a near breakdown, for some reason I decided to check the permissions of the project. When Right Clicking on the project, selecting Properties, then Security, the project had no one set up with permissions (or so I thought). That was odd. Since I'm part of the [Server]\Team Foundation Server Administrators group, I thought I'd be able to add my account. Nope.

I then logged into the App Tier (don't tell our system administrators) and checked the Security. Sure enough, I could see there was one group that had access to the project, and it didn't include me. The Team Project administrator in London had adjusted permissions for the project. We centralized the Server administration, but decentralized some of the Team Project administration. I logged onto the App Tier with an account in the local Administrators group and was able to add my account. Back on my machine, I was able to do View History just fine. The .NET guys were happy to see the error was so simple to resolve; the Java guys frowned and stuck their faces back in Eclipse.

So the moral of the story is, if you get the error "No history entries were found for the item and version combination specified" make sure you have correct permissions setup on the Source Control projects (a.k.a Folders).

Thursday, July 05, 2007

Using Team Build for Java builds that use Ant

July 2007 - We're a Java development group using TFS. You're probably asking why a Java development group would buy TFS, which is a valid question. The answer is quite long and complicated (political), so I'll save the gory details for a future post. In short; though my group uses primarily Java, a group across the hall, which is equally sized, uses .NET. We're a fairly large software company who was looking for a cost effective solution that would satisfy both Java and .NET developers. After evaluating the array of offerings, TFS offered this for us.

To compile our Java code we use Ant exclusively and Anthill Open Source as a high-level driver to kick off the builds. If you're interested in using Anthill Open Source to talk to TFS, I hear there is a friendly open source developer ;) who has written an adaptor for you.....

Anyway, we really wanted our builds to list out the work items linked to changesets included in each build. Team Build gives us exactly what we want, but again we're a Java shop using Anthill. How can we use Team Build to run our Ant scripts?

We (actually my wonderful configuration management colleague, who deserves all the credit for figuring this stuff out) had to override the "CoreCompile" target in our TFSBuild.proj file so we have control over what "compile" means for us. The full list of things "CoreCompile" does is located in the "microsoft.teamfoundation.build.targets" file. Since "compile" to us means calling Ant, we can override "most" of that stuff.

I say "most" because we do need to call MSBuild. If we don't, build results are not published to the data warehouse. This is a bit lame, but after we all got done cussing and swearing at Microsoft (remember, we're a Java development group and take every good-natured opportunity to bash Microsoft), we simply created a blank solution file, placed it next to our build.xml in source control, and called it. The only two lines in the solution file are "Microsoft Visual Studio Solution File, Format Version 9.00" and "# Visual Studio 2005". It does nothing more than somehow signal Team Build to publish our results to the data warehouse.
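For reference, the placeholder solution file is literally just those two lines:

```
Microsoft Visual Studio Solution File, Format Version 9.00
# Visual Studio 2005
```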

After the superfluous MSBuild task, we do the real work: we use the Exec task to call Ant with the needed parameters. Lastly, we copied the OnError task from the original "CoreCompile," which will execute the "OnBuildBreak" target in the master targets file (i.e. microsoft.teamfoundation.build.targets).

I don't consider this an optimal solution, but a step in the right direction. Our current Anthill setup has a nice feature where it only builds if there are changes; Team Build fires away every night, which ends up creating a number of needless builds. We also frown every time we set up a new project and have to put a blank Solution File next to the build.xml. Again, we get over it with an afternoon coffee and the obligatory Microsoft bash session (all in good fun of course).

So if you're a Java group using TFS, give Team Build a try. It seems to be working fine for us, and once Orcas gets here Team Build should have a number of additional features (e.g. continuous integration builds) that you'll want to take advantage of.

<Target Name="CoreCompile">
  <TeamBuildMessage Tag="Configuration" Value="%(ConfigurationToBuild.FlavorToBuild)" />

  <!-- Build the blank solution so Team Build publishes results to the data warehouse -->
  <MSBuild Condition=" '@(SolutionToBuild)'!='' "
           Projects="@(SolutionToBuild)"
           Properties="Configuration=%(ConfigurationToBuild.FlavorToBuild);Platform=%(ConfigurationToBuild.PlatformToBuild);SkipInvalidConfigurations=true;VCBuildOverride=$(MSBuildProjectDirectory)\TFSBuild.vsprops;FxCopDir=$(FxCopDir);OutDir=$(OutDir);ReferencePath=$(ReferencePath);TeamBuildConstants=$(TeamBuildConstants);$(CodeAnalysisOption)"
           Targets="Build" />

  <!-- The real work: call Ant with the parameters we need -->
  <Exec Command='ant -f "$(SolutionRoot)\ant\build.xml" -Dappdeploy_nightly="true" -Dversion=0.0.0.0'>
    <Output TaskParameter="ExitCode" ItemName="ExitCodes" />
  </Exec>

  <OnError ExecuteTargets="OnBuildBreak;" />
</Target>

Tuesday, July 03, 2007

SQL 2005 SP2 and TFS RTM

July 2007 - We're running our Data Tier in a cluster and were thus unable to upgrade to TFS SP1, because until recently TFS SP1's install was not supported on a clustered Data Tier. That has been fixed now, but since we're running fine in PROD, we've decided to wait for Orcas before we upgrade. With the amount of time we spent getting TFS RTM running, we want to use the product, not continue to troubleshoot it.

We started off with SQL 2005 SP1 (plus a number of hot fixes that the DBAs take care of). For a reason unknown to me (probably just consistency in our data center) the DBAs wanted to upgrade to SQL 2005 SP2 (plus a number of hot fixes). We were a little nervous about this because, according to Mr. Harry, Microsoft runs SQL 2005 SP2, but they also run TFS SP1 (see link).

We got confirmation from our sales engineer that Microsoft was able to test the SQL 2005 SP2 and TFS RTM configuration and everything looks good. We've updated our DEV environment and can confirm that we've not run into any issues. We're planning on upgrading our PROD environment in two weeks.

If your DBAs are looking to get you to SQL 2005 SP2 and you're still running TFS RTM like us, feel free to upgrade SQL 2005 as it's worked fine so far.

Friday, June 22, 2007

Deleting a workspace pointing to a server that no longer exists

June 2007 - During the last year or so we've had a number of TFS servers up and running. The first server was a "proof of concept" server that ended up expiring on us right before a demo. Microsoft was nice enough to give us an extra 30 day extension on the 120 day trial license, but after 150 days of running TFS (both App Tier and Data Tier) on an old desktop machine, we had had enough of it.

Our next TFS instance was for the "pilot," and to be honest it was not much better. In fact performance might have been worse (not due to TFS though). It did however fulfill our pilot needs and got the TFS product approved by upper management. Once we purchased TFS, we bought a bunch of bulky hardware and have it all under tight control in our datacenter. We also ditched our "pilot" TFS box as the one gig of RAM was giving me heartburn. Enough about our old TFS instances though.

Just yesterday I was trying to set up a workspace and got the error "The remote name could not be resolved: 'F1-777-2KA'". This perplexed me, as 'F1-777-2KA' was our pilot server that the local system administrators hauled off to the graveyard a few months back. Then I remembered I must have had a workspace from the old pilot machine stuck in my local cache. Sure enough, issuing the command 'tf workspaces' showed the old workspace pointing at a machine that no longer exists. I'll just delete it, right? Wrong!

Logically, I issued the command 'tf workspace /delete TestMainline' thinking I could delete the workspace from my cache. The command returned the error "The remote name could not be resolved: 'F1-777-2KA'". Sure it can't. The machine is retired in some garbage heap.

After digging around I found Buck's post which lays the solution out for us. Simply open the 'VersionControl.config' file which for me was under 'C:\Documents and Settings\%USERNAME%\Local Settings\Application Data\Microsoft\Team Foundation\1.0\Cache' and delete the invalid <ServerInfo> element. Below is what I had to delete. Good luck deleting your workspaces!

<ServerInfo uri="http://F1-777-2KA:8080/" repositoryGuid="0544w34a0-6cb2-498c-a64c-8ae38a6575657">
  <WorkspaceInfo comment="" computer="U0000001-XPB" name="TestMainline" ownerName="Domain\U0000001" lastSavedCheckinTimeStamp="0001-01-01T00:00:00Z">
    <MappedPaths>
      <MappedPath path="C:\anthill" />
      <MappedPath path="C:\tmp\ant_test" />
    </MappedPaths>
  </WorkspaceInfo>
</ServerInfo>

Thursday, June 14, 2007

Our hardware and software configuration for TFS

I thought we'd take some time and share our hardware and software configuration for TFS. On the App Tier we're running a Primary with a Standby. Only one machine is "Active" at a time. We control which machine is "Active" with a load balancer (i.e. BIG-IP) and a DNS entry (i.e. tfs.mycompany.com). We don't actually use the load balancer's "load balancing" functionality (unfortunately not supported by TFS), but we do use its interface to drain users before activating the Standby. This allows anyone with a long session (e.g. a large Get from Source Control) to finish before we switch over to the Standby. Moving back to the Primary works the same way.

Each App Tier machine is running Windows 2003 Server 32-bit Standard Edition SP1, has four 2.8 GHz Intel P4 processors and four gigs of memory. Windows says it only has 3,456 megs of memory, but from what the Sys Admins tell me it really has four gigs. My guess is the 32-bit OS is reserving the rest of the 4 GB address space for memory-mapped devices, which is why Windows doesn't report the other half gig.

On the Data Tier we're using an active/passive cluster. Both machines are running Windows 2003 64-bit Enterprise Edition SP1, have four 3.8 GHz Intel P4 processors and four gigs of memory. Oddly enough, unlike the App Tier, Windows recognizes the full four gigs (4,095 megs to be exact) on the Data Tier. I'm guessing that's because we're running 64-bit Windows, which doesn't have the 32-bit address space limitation.

The App Tier is running Reporting Services 32-bit Standard Edition using the included Reporting Services license that ships with TFS. The Data Tier is running the 64-bit Enterprise Edition of Reporting Services, which we had to buy separately. As a side note, make sure someone with four PhDs and two law degrees reads the licensing white paper; figuring out what you need to license can be a nightmare. Here is the one big thing we missed: if you have a "Dual-Server Deployment" you can only use the included Reporting Services license on either the App Tier or the Data Tier. You must buy the second copy of Reporting Services separately. And here is something we almost missed: since the Data Tier is an active/passive cluster, you only need one license of Reporting Services for both machines. Because the second machine is "passive," only one license is required; when the "passive" machine switches to "active" (and vice versa) the license switches over with it (just on paper of course). Again, make sure the suits in procurement get this all straight before you sign on the dotted line.

If we run into performance problems the first thing we're going to do is increase the RAM on our Data Tier. We're only running four gigs, which may be a bit low. Our user base is expected to slowly grow to around 200-250 users so we'll monitor the usage patterns to see if we need to add memory.

Feel free to share your configurations!

Tuesday, June 12, 2007

Should you use Active Directory Groups or Users when managing users in TFS?

June 2007 - One decision you'll have to make when implementing TFS is whether to use Active Directory Groups or Active Directory Users when setting up users in TFS. There are advantages and disadvantages to both, which I'll discuss below.

Active Directory (AD) Groups - AD Groups are typically set up and managed by system administrators, or set up by system administrators and managed by a help desk. Though I've never used the AD tooling to manage users, I can't imagine it being more difficult than managing users in TFS. Remember, TFS permissions must be managed in TFS, SharePoint, and Reporting Services. Although the TFS Administration Tool makes things easier, even it can be unbearable at times. We're hopeful Microsoft will produce a more robust administrative tool for TFS in the near future. In the meantime, when using AD Groups all you do is add the group once in each Team Project and you're done messing with TFS administration.

Using AD Groups also offloads the administrative process of adding/removing users from TFS. As a development team, we have little interest in adding/removing users all day. We're paid to develop software, not manage users.

Active Directory (AD) Users - Users give you a bit more control. Unlike with AD Groups, you get to control who is added to your TFS Team Projects and who is not. No need to worry about the help desk adding a third-party consultant to the wrong AD group and thus giving away your intellectual capital to someone who should not see it. You also have more flexibility in which TFS groups you assign users to. For example, we have a Testing TFS group in each of our Team Projects. Using AD Users, we can add specific testers to the specific Testing groups in their respective Team Projects. If we used AD Groups, we'd need one AD Group for each Testing group in each Team Project. If you use a number of TFS Groups, that can create quite a mess.

Using AD Users also helps when troubleshooting rights. At our company, finding out who is part of what AD Group involves a call to the help desk and a 2-4 hour turn around. If a user is having issues getting access to something or getting too much access to something, it can be very difficult to find out why. If you use AD Users, you can see what permissions that specific user has.

The use of AD Users is not without its faults though. The biggest issue is what a pain it is to add the users. Again, the TFS Administration Tool helps, but it's still not the easiest tool to use. By using AD Groups, you avoid fighting with TFS.

In conclusion, we decided to use AD Users, as having 100% control over who sees what is very important to us. We have a billion-dollar product in TFS with some very sensitive code that we only want select developers to see. With the number of contractors we have coming and going, we need to safeguard our priceless assets. Though the use of AD Groups would be easier, we need tight control, so AD Users works better for us. Since this will not change anytime soon, we'll look forward to Microsoft giving us a more robust TFS administration tool.

Friday, June 08, 2007

Automatic Checkouts with Team Explorer

June 2007 - Last night, while I should have been sleeping, I was wondering why Team Explorer doesn't implement automatic checkouts like Teamprise's Eclipse Plug-in. I couldn't believe Microsoft left that out. It then dawned on me that maybe you have to tell Team Explorer you want to "associate" your code with Source Control. Sure enough, I bound my .NET solution file (and all its associated projects) to Source Control, and now Team Explorer does the automatic check out for me. This is very nice as it saves us a few more clicks when working on .NET code.

I'm probably overlooking this functionality as well, but while we can "Bind" a solution file I can't seem to "Bind" anything else in Team Explorer. For example we have our Process Template modifications under Source Control. Seems like a prudent thing to do since we're the Configuration Management group. Anyway, when using Teamprise and modifying Task.xml, we get the nice automatic check out feature. Unfortunately when we edit the same file in Team Explorer, we don’t get that feature and have to "Overwrite," "Check Out," and "Check In."

I can't say automatic check out is the reason I use Teamprise for most of my Process Template modifications (the real reason is 60%+ of our user base uses Teamprise and I have to support them), but it is a very nice feature. Keep up the good work, Teamprise folks!

Thursday, June 07, 2007

Automatic Checkout with Teamprise

June 2007 - I doubt this posting will help anyone in trouble, but I thought I'd post it anyway as I think it's a pretty cool feature. One of the things I really like about Teamprise's Eclipse Plug-in is that you can open up a file and start to edit it, and Teamprise automatically checks the file out for you.

Unfortunately I don't see this feature in Team Explorer. In Team Explorer I find myself making a few changes, clicking Save and getting the "Save As, Overwrite, Cancel" prompt. I then "Overwrite" the file, check it out, and check it in. By my math, Teamprise saves you at least two clicks with the automatic check out. Below is a (very) high level workflow.

Teamprise Eclipse Plug-in
Open file (click 1)
Save file (click 2)
Check in (click 3)

Team Explorer
Open file (click 1)
Save file (click 2)
Overwrite file (click 3)
Check out file (click 4)
Check in file (click 5)

Wednesday, June 06, 2007

"Code Coverage Analysis Service" in Stopped Status

June 2007 - We've been looking into an "issue" where our "Code Coverage Analysis Service" sits in a "Stopped" state in both our DEV and PROD environments. Our Sys Admins alerted us to the potential "issue" as they were getting paged every time the service stopped. I'm not 100% sure how their monitoring works, but from what I understand they are paged whenever a service set to "Automatic" stops. Once these servers went out of maintenance mode, the Sys Admin pagers wouldn't stop buzzing!

When starting the service we'd periodically get a message stating "The Code Coverage Analysis service on Local Computer started and then stopped. Some services stop automatically if they have no work to do, for example, the Performance logs and Alerts service." So the Sys Admins set the service to "Manual" until we got this figured out.

While working with Microsoft support on a different issue we asked about the service being stopped. According to our support representative this is normal behavior. Apparently other services do this as well. I'm surprised our Sys Admins never ran into it before given we run over a thousand Windows boxes.

Anyway, the services have been put back to "Automatic" and the Sys Admins have excluded this particular service from their "Started/Stopped" monitoring.

I'd be interested to see if anyone has similar experiences.

Friday, May 25, 2007

RSKEYMGMT command for failover to standby TFS server

We're a daring group of folks who have added a standby TFS App Tier server to our topology. We'd like to see TFS support true failover, but for now we're comfortable with a warm backup that with some manual intervention we can get up and running if need be.

We followed How to: Activate a Fail-Over Application-Tier Server to a tee and, as it turned out, that was our downfall. I'm not saying the documentation is wrong; all I'm saying is we had to modify step 4) under Reporting Services to get our standby up and running.

After a number of hours, we broke down and called Microsoft Support who directed us to a very competent TFS support engineer. He correctly diagnosed the issue as being Reporting Services related and joined us with another very competent Reporting Services support engineer. Microsoft should be proud of the work they both did. Great support is one of the primary reasons we buy commercial software/support.

The issue was with the command "RSKEYMGMT -a -i <instance ID for AT2> -f c:\backups\My_RSBackup_TFS_AT01 -p aPassword". We got the error "Unable to locate the Report Server Windows service for instance <instance ID>". From what I understand, since the TFS install requires Reporting Services to be installed as the "default" instance (see step 10 in "How to: Install Microsoft SQL Server 2005 Reporting Services for Team Foundation Server (Dual-Server Deployment)" in the TFS install documentation), you can't activate it by a named instance name. Thus the "-i" switch in the command was throwing the error.

Removing the -i and running the command "RSKEYMGMT -a -f c:\backups\My_RSBackup_TFS_AT01 -p aPassword" on the standby App Tier worked just fine.

Although the process is not perfect, we're now able to failover. Hope this helps if you run into the same issue.

Thursday, May 24, 2007

TFS Data Warehouse Not Being Updated by TFSServerScheduler

May 2007 - We had an ongoing issue where the TFS Data Warehouse was not being updated by TFSServerScheduler, though it could be invoked manually. What we think happened is that when we installed TFS on a different port (i.e. 8888) and went through the process of putting our App Tier behind a Fully Qualified Domain Name, not all of the Registry entries were updated as needed.

With the help of this posting and some sleepless nights, we sifted through the App Tier's registry and found a number of references to AppTier-B01:8080 instead of AppTier-B01:8888, or better yet tfs.int.mycompany.com:8888, which is a DNS entry for AppTier-B01:8888. We also found a number of duplicate references where both AppTier-B01:8080 and tfs.mycompany.com:8888 were listed. We logged on as the TFSInstall, TFSService, and TFSReports accounts and did the clean-up, as some of the settings are under HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\8.0\TeamFoundation\Servers.

The lesson learned is to make sure all your registry settings are correct for ALL USERS. Because our AppTier name did not change (we just put it behind DNS), we think the most important piece was the port number which we changed from 8080 to 8888. Below are our current settings on the App Tier.

Sys Admin Account:

HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\VisualStudio\8.0\TeamFoundation\Servers
tfs.mycompany.com = http://tfs.mycompany.com:8888

HKEY_CURRENT_USER\SOFTWARE\Microsoft\VisualStudio\8.0\TeamFoundation\Servers
tfs.mycompany.com = http://tfs.mycompany.com:8888

TFSService Account:

HKEY_CURRENT_USER\SOFTWARE\Microsoft\VisualStudio\8.0\TeamFoundation\Servers
tfs.mycompany.com = http://tfs.mycompany.com:8888
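If you prefer to script the clean-up rather than edit each hive by hand, a .reg fragment like the one below can be imported while logged on as each affected account. This is a sketch based on the entries listed above; the REG_SZ value type is an assumption, so verify against an export of your own key before importing.

```
Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\8.0\TeamFoundation\Servers]
"tfs.mycompany.com"="http://tfs.mycompany.com:8888"
```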

Wednesday, May 23, 2007

Can you have TFS states and groups with the same name?

May 2007 - We've been dealing with the very informative user error message "TF26212: Team Foundation Server could not save your changes. There may be problems with the work item type definition. Try again or contact your Team Foundation Server Administrator." for about two weeks now and we finally think we have an answer.

The error seemed to show up when we had a state (e.g. Testing) named the same as a group (e.g. Testing). Here are our steps and final thoughts on how we resolved the issue.

Start by editing the Web.config file under \Program Files\Microsoft Visual Studio 2005 Team Foundation Server\Web Services by changing

key="traceLevel" value="1" to key="traceLevel" value="4"

and

key="traceWriter" value="false" to key="traceWriter" value="true"

This will start logging copious amounts of information under C:\WINDOWS\Temp\TFLogFiles (or some other directory if you change the defaults).
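Assuming the standard appSettings format (a sketch — check the element names against your own Web.config), the two entries end up looking like this:

```xml
<appSettings>
  <!-- 1 = minimal tracing; 4 = verbose. Remember to set this back when you're done. -->
  <add key="traceLevel" value="4" />
  <!-- Writes trace output under C:\WINDOWS\Temp\TFLogFiles by default -->
  <add key="traceWriter" value="true" />
</appSettings>
```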

After you make these changes (you don't need to bounce IIS) try to get the error again. If you see an error like below in the TFLogFiles you may be in the same boat as us.

[WI] [Error, 2916, 5, 11:31:33.016] SvrEx: Microsoft.TeamFoundation.WorkItemTracking.Server.ValidationException: Forcing rollback ---> System.Data.SqlClient.SqlException: Forcing rollback
at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection)
at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection)
at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj)
at System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj)
at System.Data.SqlClient.SqlDataReader.HasMoreRows()
at System.Data.SqlClient.SqlDataReader.ReadInternal(Boolean setTimeout)
at System.Data.SqlClient.SqlDataReader.NextResult()
at Microsoft.TeamFoundation.WorkItemTracking.Server.PayloadTableCollection.Populate(SqlDataReader reader)
at Microsoft.TeamFoundation.WorkItemTracking.Server.SqlAccess.ExecuteBatchPayloadImpl(IRequestContext context, String sqlBatch, List`1 parameterList, Boolean& errorOnBulkUpdate, String connectionString)
--- End of inner exception stack trace ---.


While we've had differing results with a number of use cases, the root cause seems to be related to having a state and group named the same. We saw the issue with a state called CCB and group called CCB and the same issue with a state called Testing and a group called Testing.

What's even more alarming is they don't even have to be in the same project. We're pretty sure if you have a Testing group in Project1 and a Testing state in Project2, you will get a conflict.

Again, we're not 100% sure this should be written in stone, but our error disappeared once we renamed the CCB state to Change Control Board and the Testing group to Testers, and removed all references to the CCB state and Testing group from all the projects in our instance. This seemed to work for us.

Good luck!

Friday, May 11, 2007

Report Server Windows Service (MSSQLSERVER) cannot connect to the report server database

May 2007 - We just got our standby server setup and failed over for the first time. I'll send out the steps we took in a forthcoming post as we're still working out some minor details.

We were a bit concerned because after setting up the standby server we were getting the error messages below in Event Viewer. The first error message happened just once (as far as we can tell). The second one was recorded every minute in Event Viewer on the standby.

When we failed over to the standby, both errors went away and started to appear on the primary. I posted a message to the MSDN forums and Mr. Chen confirmed this was expected. I'm not sure I followed him on this, but he said the reason was "This is usually used to improve production quality in the future." Again, I'm not sure why this will help to improve quality, but I'll take his word for it.

Happy failover!

Error Messages:

Report Server Windows Service (MSSQLSERVER) has not been granted access to the catalog content.

Report Server Windows Service (MSSQLSERVER) cannot connect to the report server database.

Thursday, May 03, 2007

HTTP Status 400 With Reporting Services

May 2007 - There are probably a myriad of reasons to get the error "The request failed with HTTP status 400: Bad Request" in Reporting Services running with TFS, but here is why we got the error.

While trying to get our TFS instance behind a fully qualified DNS name, we added tfs.int.mycompany.com as a host header entry on the Default Web Site on our TFS App Tier. After adding this entry, users could still see Reports in Visual Studio, but when right clicking and selecting Show Report Site, Internet Explorer would pop up and give us the error "The request failed with HTTP status 400: Bad Request". As a side note, adding this host header entry did nothing for us while trying to get our App Tier behind DNS.

Once we removed that value, the error went away. Hope this helps anyone else.

Tuesday, May 01, 2007

Our own Dog Food Statistics

May 2007 - What a month this has been. We had some issues getting TFS up and running (see my previous post for some of the bigger ones), but we finally have a PROD system that is capable of taking on users. For those of you who are thinking about implementing TFS, make sure you have dedicated resources to work on it. We're fortunate that our company has funded a full time administrator to get this up and running for 200+ users (1 admin --> 200 users). If you're a Sys Admin who, in addition to your day-to-day responsibilities, is supposed to get TFS up and running, good luck!

I'm a bit of a statistics nut, so while we wait for Brian Harry to release the TFSServerManager PowerTool (http://blogs.msdn.com/bharry/archive/2007/01/22/tfsservermanager-powertool.aspx) I thought I'd pull some simple statistics that we might find valuable and post them. Since our first project was created today, it seems like an apropos time to make this posting. I'm a bit pressed for time, so I didn't have the opportunity to put together the myriad of statistics that Brian has; I'll wait for him to release the TFSServerManager PowerTool to get those crazy numbers. That being said, we find this information helpful, so here it goes.

Team Projects = 1; This was a pretty easy one to find!

Users = 6; This one was a bit tougher. I started by running the TfsSecurity tool (http://msdn2.microsoft.com/en-us/library/ms252504(VS.80).aspx) with the "/imx all:" parameter. Unfortunately this gave us back all the users plus the 50 Sys Admins who also have access to the machine but would never use TFS with their administration accounts (e.g. MyDomain\M0000001). Consequently we don't want to count them. The good thing is our user accounts all start with \U (U for user, they tell me), so I was able to find actual TFS users by running "TfsSecurity /server:tfs.int.mycompany.com /imx n:"[SERVER]\Team Foundation Valid Users" | find "MyDomain\U"". This gave us back the MyDomain\U users (e.g. MyDomain\U000001), which are the only users we add to TFS projects.

Work items = 16; This was also easy, as each new work item gets a new incremented id. There are 15 standard CMMI process template Tasks that get created. After that we created one work item to test changing a Change Request state, which was number 16.

Files/Folders = 1859/31; I got these values using the following commands. For files I ran "tf dir /server:tfs.int.mycompany.com $/ /recursive" less the value I got for the number of folders, which is next. For folders I ran "tf dir /server:tfs.int.mycompany.com $/ /folders /recursive".

Changesets = 7; Another easy one for now. I got this value by running "tf changeset /server:tfs.int.mycompany.com /latest /noprompt"

That's it for now. If you find additional or better ways to find your own Dog Food statistics before Brian releases TFSServerManagerTool, let us know.

Monday, April 09, 2007

Trials and tribulations of a TFS install

Ah, it's spring and we're still plugging away at getting TFS installed in our DEV environment. In TFS's defense, most of the big delays are big-company bureaucracy. That being said, installing TFS is not as easy as installing MS Word on my grandmother's PC. Here are some of the issues we ran into and their resolutions. I've also included some big-company standard deviations we made. By big-company standard deviations I mean things like our system administrators making us install "third-party" software on the D: drive instead of the C: drive. I can't imagine the number of hours upper management spent on that decision.

First a bit of background. From what I can tell our company supports Windows based servers with two groups. For this discussion let's call the database group the "DBAs" and the system administrators "Sys Admins." Upper management actually refers to them as something totally different, but the average sane person can't understand upper management's terminology (or thought process for that matter). The DBAs and Sys Admins work like this: In the morning the DBAs get coffee on the second floor, Sys Admins the fourth. DBAs go out to lunch at 11AM. Sys Admins eat at their desk around noon. DBAs drink Diet Coke, Sys Admins drink regular Pepsi. To an outsider one could conclude the two groups try to avoid each other at all costs. Thus, it's with these two groups we must coordinate the temperamental TFS install.

  • The DBAs like to use a standard company SQL install image for all data tier machines. Unfortunately the big company image does not include Integration Services and Analysis Services. Apparently we're the first team that needs such "advanced technology." Lesson number one: don't install a base SQL 2005 64-bit image on a cluster and then expect to install Integration Services and Analysis Services afterwards. After three days working with Microsoft, our DBAs gave up and re-installed SQL 2005 64-bit with Integration Services and Analysis Services selected on the initial install. I'm not sure why adding these two components after the cluster is set up is such a big deal, but judging by the amount of vulgar language the DBAs used, it must be.
  • Our DBAs have "Big Company Approved" Reporting Services install instructions that includes "Configuration." We had the DBAs do the basic install with their instructions, then tackled them before they could run through the "Configuration" section. From what we're told, the TFS install does the configuration for us.
  • We installed the TFS App Tier software to the d: drive. That is a requirement from the Sys Admins. No big deal here.
  • Here is a big one. The entire datacenter has some kind of "flash copy agent" software running on port 8080. Conventional wisdom would lead you to believe one can simply change the TFS msiproperty.ini to some other port (e.g. 8888) and install TFS. There I go again thinking conventional wisdom is logical. Changing the msiproperty.ini file resulted in the following error: "Error 28925.TFServerStatusValidator: Calling the Team Foundation Server ServerStatus Web service failed. Additional details about the problem can be found in the setup log. Verify your network configuration. For more information on troubleshooting this error, see the Microsoft Help and Support Center. Log file: TFServerStatusValidator - Team Foundation Server Status Validator tool (C) Copyright 2006 Microsoft Corporation. All rights reserved." The details are outlined in my forum posting (http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=1411189&SiteID=1&mode=1). After giving up and calling Microsoft, the support folks from India had the Sys Admin shut down the agent software running on 8080, install TFS on port 8080, step through a number of manual steps which included making a database change to switch URL values from 8080 to 8888 (steps are in my forum posting above), restart all TFS related applications/services, and finally start the flash copy agent software back up on 8080. The look on our Sys Admin's face after all this was priceless!
  • Since our DBAs don't let the Sys Admins on the database machines and vice versa, we had some trouble installing the App Tier. From what we found out, the TFSSetup account used to install the App Tier needs to be a local administrator on the App Tier and a member of the local administrator group on the Data Tier. After bringing in a United Nations negotiator, we got the DBAs to "temporarily" add the Sys Admin's TFSSetup account to the local administrator group on the Data Tier. The install went fine after that.
  • The TFS administrator (i.e. me) won't be a local administrator on either the App Tier or Data Tier. That would be against company policy. So we followed the three sections outlined by Microsoft (http://msdn2.microsoft.com/en-us/library/ms253096(VS.80).aspx) and added our team's group name to the appropriate locations in TFS, SharePoint, and SQL Reporting Services. Amazingly this seems to work just fine.
  • Lastly we had to add the TFSReports user to the local administrator group on the App Tier to get around the error "The application-specific permission settings do not grant Local Activation permission for the COM Server application with CLSID {BA126AD1-2166-11D1-B1D0-00805FC1270E} to the user NT AUTHORITY\NETWORK SERVICE SID (S-1-5-20). This security permission can be modified using the Component Services administrative tool." The end user error was "Reporting Services Error --- An error has occurred during report processing. (rsProcessingAborted) Get Online Help Cannot impersonate user for data source 'TfsOlapReportDS'. (rsErrorImpersonatingUser) Get Online Help Logon failed. (rsLogonFailed) Get Online Help For more information about this error navigate to the report server on the local server machine, or enable remote errors --- SQL Server Reporting Services." Since the install instructions say "This account should not be an administrator on Team Foundation Server computers," I'm pretty sure this is not correct. However, at the time of this posting, no one from Microsoft can give us any guidance on a better way to get Reporting Services to work. I'll keep this forum thread (http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=1345776&SiteID=1) updated if we ever figure out a better way to handle this.
And there it is. Our take on the joys of installing TFS on an App Tier with a clustered Data Tier. Happy installing!