Monthly Archives: March 2014

Customizing TFS work item types

One of the best features of TFS is that changes to the source code are documented not only by the changeset comments, but you can also bind every change to work items. TFS provides several work item types out of the box, however they sometimes don't perfectly suit our needs and require some customization.

Traditionally you can use the witadmin.exe command line tool to manage work item types. You can export their XML definitions and import them back to the server after you have modified them. The benefit of this approach is that you can add the modified XML files to your source control to track the changes.
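For example, a typical export-edit-import round trip looks something like this (the collection URL, project name and work item type name below are placeholders for your own environment):

witadmin exportwitd /collection:http://myserver:8080/tfs/DefaultCollection /p:MyProject /n:Task /f:Task.xml

witadmin importwitd /collection:http://myserver:8080/tfs/DefaultCollection /p:MyProject /f:Task.xml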

If you prefer GUI tools, you can install the TFS Power Tools, which add a Process Editor item to the Tools menu that you can use to open the work item type definition directly from the server:

[Screenshot: vs-open-wit-from-server]

In the next step you have to select which work item type you want to edit:

[Screenshot: vs-select-work-item-type]

And you instantly get the list of the properties of the work item type:

[Screenshot: vs-work-item-type-properties]

If you don't like that by default the Assigned To field lists all users of your server, you can select this field from the list and click the Edit button to modify it. On the second tab (Rules) of the popup edit dialog you can see the reason for this behavior:

[Screenshot: vs-field-definition]

The problem is the VALIDUSER entry, which refers to the Team Foundation Server Valid Users group. You can remove it and add the ALLOWEDVALUES rule instead, which allows you to configure the list of selectable items. If you want to add user groups, you can refer to server-level groups in the [Global]\groupname form, and to groups of the current project in the [Project]\groupname form. For example:

[Screenshot: vs-field-definition-allowedvalues]
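In the exported XML definition the same change would look roughly like this (a sketch only; the refname attributes and the group names are illustrative and depend on your process template):

<FIELD name="Assigned To" refname="System.AssignedTo" type="String" syncnamechanges="true">
    <ALLOWEDVALUES expanditems="true">
        <LISTITEM value="[Project]\Contributors" />
        <LISTITEM value="[Global]\MyCompany Developers" />
    </ALLOWEDVALUES>
</FIELD>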

You can find detailed documentation of the rules available in the Rule Type dialog on the All FIELD XML elements reference MSDN page.

When you click Save, the changes are committed directly to the server, and you may need to click Refresh in the Team Explorer window to let the Visual Studio client pick up the changes.

 


UriFormatException when sending a test e-mail from TFS

I like the Team Foundation Server Administration Console. It is obvious that when its UI was designed, they went through the typical admin tasks and tried to support them as much as possible. For example it is very easy to change the service account or the URL of the server, and you can even find a Test button near almost every option, which you can use to quickly validate your settings.

One of the most useful features is the Send Test Email button which allows you to check the SMTP settings of your TFS server:

[Screenshot: tfs-email-settings]

After clicking this button you can enter the recipient’s e-mail address and optionally a message which will be included in the test e-mail:

[Screenshot: tfs-test-email]

Click OK and if you are lucky, TFS can send the test e-mail immediately:

[Screenshot: tfs-test-email-success]

But it may happen that sending the test e-mail fails, and you get the following error message:

Unable to connect to the TFS server to test email settings. Url = ‘/_api/_common/TestMailSettings?sendTo=myuser%40example.com&message=Test+email’. Exception = System.UriFormatException: Invalid URI: The format of the URI could not be determined.
   at System.Uri.CreateThis(String uri, Boolean dontEscape, UriKind uriKind)
   at System.Net.WebRequest.Create(String requestUriString)
   at Microsoft.TeamFoundation.Admin.Console.Models.DlgSendTestMailViewModel.SendEmail()

[Screenshot: tfs-test-email-error]

After reading the error message you might check the SMTP settings again; however, this time the problem lies elsewhere. Open the Change URLs dialog and check whether you have selected the Use localhost option in the Server URL section:

[Screenshot: tfs-change-urls]

The error above is raised only if you have selected the second option and entered the real name of your server instead of the default localhost in the Use: field. If you are hit by this issue, vote on Connect, and hopefully it will be fixed soon.

 


Upgrading TFS 2005 to 2013 with a database hack

We had an ancient source control server which was running Team Foundation Server 2005. Yes, that classic 1.0 version! Because the underlying OS, the SQL Server, the SharePoint framework, as well as the hardware were becoming quite outdated, I decided to upgrade it to the latest 2013 version. Unfortunately you cannot upgrade from 2005 to 2013 in a single step (which is not a surprise because there were 3 versions between them); first you have to upgrade to 2010, and then from 2010 to 2013:

[Screenshot: tfs-2005-2013]

Upgrading from 2005 to 2010 is a bit tricky, because you have to upgrade SQL Server and SharePoint as well, but you can find a good step-by-step guide and a best practices collection on the TFS Setup Support Team blog. The TFS Team also published a very useful article on upgrading WSS 2.0 to 3.0 for TFS.

In theory upgrading TFS is fairly simple: just uninstall the previous version but leave the SQL databases in place, then install the new release and choose Upgrade in the install wizard, which will update the SQL database schema to the latest version. On paper it does not look complicated, and it looks downright simple if you follow Tim Elhajj's 61-page Upgrade Team Foundation Server 2012: The Ultimate Upgrade Guide.

However the practice can be a bit different.

First, none of the above blog posts warns you that there is a serious bug in the TFS 2010 upgrade process which may cause inconsistencies in the database, so you have to be very careful with the upgrade. You can read more about it in Brian Harry's TFS 2010 Upgrade Issue blog post, which also contains a link to the hotfix.

It was even more frustrating that after successfully completing the upgrade wizard I could not connect to the server. Different clients greeted me with different error messages, for example:

TF400324: Team Foundation services are not available from server MyServer\MyCollection. Technical information (for administrator): Unable to connect to the remote server.

Or:

TF205020: A connection could not be made to the following server: MyServer\MyCollection. This server was used during your most recent session, but it might be offline, or network problems might be preventing the connection. Contact the administrator for Team Foundation Server to confirm that the server is available on the network.

Or simply:

TF31002: Unable to connect to this Team Foundation Server

It even happened that Team System Web Access running on the same server could access the data, while remote Visual Studio clients could not connect to it. According to Network Monitor the communication between the hosts flowed flawlessly, but the server mostly returned HTTP 4xx and 5xx errors (which were visible in the IIS log anyway). It also happened that I could connect to the server and manage the work items, but could not access the version control and file history data.

Particularly interesting was that the clients could successfully connect to a new collection I created on the upgraded server, which meant that there was no problem with system-level settings (e.g. port, firewall, certificate); the issue was somehow related to the upgraded database. I even tried the Best Practices Analyzer from the TFS 2013 Power Tools, but it could not find any relevant issue at either server or collection level.

Because I ran out of ideas, I went down to the database level. I monitored the queries with Profiler and looked into the tables via Management Studio. I compared the data of the freshly created, working collection database with the records in the database of the upgraded collection. Finally I found the tbl_ServiceDefinition table, which seems to contain the URL endpoints of the TFS .asmx web services. It was interesting that while every cell contained data in the working database, the same table in the upgraded database contained several NULLs in the RelativePath column. So I took a deep breath, created a snapshot of the VM, and manually updated the records in the tbl_ServiceDefinition table. I changed the values of the RelativeToSetting, RelativePath and IsSingleton fields in those rows where RelativePath was NULL and where I could find a matching row in the working database based on the GUID value in the Identifier column. This is my final version:

[Screenshot: tfs-service-definitions]
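I applied the changes by hand in Management Studio, but the equivalent of what I did could be expressed roughly like the sketch below, run from PowerShell. The database names are placeholders for the upgraded and the freshly created collection databases; take a backup or snapshot before trying anything like this:

# Rough sketch of the manual fix: copy the missing service definition values
# from a known-good collection database, matching rows on the Identifier GUID.
# Database and server names are placeholders for your own environment.
$query = @"
UPDATE broken
SET    broken.RelativeToSetting = good.RelativeToSetting,
       broken.RelativePath      = good.RelativePath,
       broken.IsSingleton       = good.IsSingleton
FROM   [Tfs_UpgradedCollection].dbo.tbl_ServiceDefinition AS broken
JOIN   [Tfs_NewCollection].dbo.tbl_ServiceDefinition AS good
  ON   broken.Identifier = good.Identifier
WHERE  broken.RelativePath IS NULL
"@
Invoke-Sqlcmd -ServerInstance "MyServer\MySqlInstance" -Query $query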

On the clients I cleared the Visual Studio Team Explorer cache folder (C:\Users\username\AppData\Local\Microsoft\Team Foundation), added the server again, tried to connect aaaaaaaaaaaaaand it worked!
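If you prefer to clear the cache from PowerShell instead of Explorer (close Visual Studio first), something like this does the same thing:

# Removes the Team Explorer client cache for the current user;
# Visual Studio rebuilds it on the next connection.
Remove-Item -Recurse -Force "$env:LOCALAPPDATA\Microsoft\Team Foundation"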

I have to add that this is a total hack! It is not a good solution, not supported and not recommended; it just works on my machine. If you have similar issues, you are probably better off contacting Microsoft via the official support channels.

After this, using the Configure Features wizard and manually updating the process templates was not that terrible at all.

 

Remembering remote desktop passwords

It is really annoying that the remote desktop client sometimes remembers only your login name but not your password, because:

Your credentials did not work

Your system administrator does not allow the use of saved credentials to log on to the remote computer COMPUTER because its identity is not fully verified. Please enter new credentials.

[Screenshot: rdp-credentials]

To fix this start the Local Group Policy Editor (gpedit.msc), and navigate to this branch: Computer Configuration –> Administrative Templates –> System –> Credentials Delegation. Here open the Allow delegating saved credentials with NTLM-only server authentication option and set it to Enabled.

Click the Show… button in the Add servers to the list option and add the servers you want to apply this setting to. You must use the TERMSRV/computername format to specify a single computer, or you can use TERMSRV/* to refer to all servers.

 


Exception calling "SqlBackup" with "1" argument(s)

We use the PowerShell script below to back up our SQL Server databases:

# $dbInstance, $dbName and $localSqlBackupPath are set earlier in the full script.
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoExtended") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.ConnectionInfo") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoEnum") | Out-Null

# Connect to the instance and describe a full database backup to a single file.
$server = New-Object ("Microsoft.SqlServer.Management.Smo.Server") $dbInstance
$backup = New-Object ("Microsoft.SqlServer.Management.Smo.Backup")
$backup.Action = "Database"
$backup.BackupSetDescription = "Full backup of " + $dbName
$backup.BackupSetName = $dbName + " backup"
$backup.Database = $dbName
$backup.MediaDescription = "Disk"
$backup.Devices.AddDevice("$localSqlBackupPath", "File")
$backup.SqlBackup($server)

This script had been running fine for years; however, recently it started to fail. It successfully backed up most databases, but occasionally failed on some others that had previously been backed up without problems. I found this error in the log:

Exception calling "SqlBackup" with "1" argument(s):
"Backup failed for Server 'MyServer\MySqlInstance'. "
At D:\Backups\BackupSite.ps1:151 char:22
+ $backup.SqlBackup <<<< ($server)
    + CategoryInfo          : NotSpecified: (:) [], MethodInvocationException
    + FullyQualifiedErrorId : DotNetMethodException

After a long investigation it turned out that the error has nothing to do with how we call the SqlBackup method; the real issue is that the statement times out after 10 minutes. I turned off the timeout via the StatementTimeout property of the ServerConnection object and the error is gone:

$server.ConnectionContext.StatementTimeout = 0
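In the script above this line belongs right after the Server object is created, so the setting is already in effect when SqlBackup is called:

$server = New-Object ("Microsoft.SqlServer.Management.Smo.Server") $dbInstance
$server.ConnectionContext.StatementTimeout = 0   # 0 = no statement timeout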

 


Grouping files in Solution Explorer

A nice feature of Visual Studio Web Essentials is that it can process your file right after you save it. For example, it can compile your TypeScript file to JavaScript, and even create the map file for it. Or it can generate the CSS from your LESS file, including the minified version.

This feature is really useful if this kind of transformation is not part of your build process, and in that case it is quite logical to add the generated files to the project file. Unfortunately Visual Studio does not always recognize the connection between these files and does not group them in the Solution Explorer window. If you are brave enough, you can edit the .csproj file and force the Solution Explorer window to show the related files in a nested hierarchy. The trick is the DependentUpon element, for example this code:

<TypeScriptCompile Include="js\Main.ts" />
...
<Content Include="js\Main.js">
    <DependentUpon>Main.ts</DependentUpon>
</Content>
<Content Include="js\Main.js.map">
    <DependentUpon>Main.ts</DependentUpon>
</Content>

results in this:

[Screenshot: solution-explorer-group-ts]

You can even create a deeper hierarchy:

<None Include="css\default.less" /> 
... 
<Content Include="css\default.css"> 
    <DependentUpon>default.less</DependentUpon> 
</Content> 
<Content Include="css\default.min.css"> 
    <DependentUpon>default.css</DependentUpon> 
</Content>

The above code results in this display in the Solution Explorer:

[Screenshot: solution-explorer-group-less]

 


Compiling your MVC views without IIS errors

By default Visual Studio and ASP.NET do not compile your MVC views during build; instead, the runtime processes them at run time. The consequence of this is that any error in your views may break your app only at run time.

Fortunately you can force Visual Studio to compile your MVC views as part of the build process. Just open your .csproj file and switch the value of the MvcBuildViews element from false to true:

<MvcBuildViews>true</MvcBuildViews>
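In a typical MVC project file the element already exists (set to false) in the main PropertyGroup; if yours does not have it, you can add it there, roughly like this:

<PropertyGroup>
    ...
    <MvcBuildViews>true</MvcBuildViews>
</PropertyGroup>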

This fixes the original problem, however it often introduces another issue. You may receive the following error when you try to build or publish your web application:

It is an error to use a section registered as allowDefinition=’MachineToApplication’ beyond application level.  This error can be caused by a virtual directory not being configured as an application in IIS.

I don't know who made up this message or what they intended to communicate. It helps neither to find the error nor to fix it.

The solution – which is definitely not obvious from the previous error – is to delete the obj folder before build.

If you do not want to manually delete that folder every time, you can add that step into your build process. Thankfully MSBuild provides the BaseIntermediateOutputPath variable which points right to the obj folder, and you can combine it with the RemoveDir task to get rid of that folder. Just open the .csproj file again and add a new element to the BeforeBuild target:

<Target Name="BeforeBuild">
    <RemoveDir Directories="$(BaseIntermediateOutputPath)" />
</Target>

If you don’t like the bin folder and want to remove that as well, you can refer to it via the BaseOutputPath variable.
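For example, extending the same target to clean both folders might look like the sketch below (it assumes BaseOutputPath resolves to your bin folder in your project):

<Target Name="BeforeBuild">
    <RemoveDir Directories="$(BaseIntermediateOutputPath);$(BaseOutputPath)" />
</Target>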

 
