Occasionally you may want to move your Team Foundation Server database to another server, for example because:
- You want to upgrade the underlying hardware or software infrastructure.
- You want to split the project collection between multiple servers.
- You want a copy of your live data in your test environment.
The process is fairly simple:
1. Ensure that the target server runs the same or a newer SQL Server version than the source environment. This is important because you cannot restore a SQL backup that was created with a newer version of SQL Server.
2. Ensure that you have exactly the same version of TFS in both environments. Not only service packs but also minor hotfixes matter!
3. On your source server:
   a. Click Stop Collection in the TFS Admin Console.
   b. Click Detach Collection in the TFS Admin Console.
   c. Start SQL Server Management Studio and create a full backup of the project collection database.
4. Copy the SQL backup to your target SQL Server.
5. On your target server:
   a. Use SQL Server Management Studio to restore the database backup to a new database.
   b. Use Attach Collection in the TFS Admin Console.
   c. Update the SharePoint and Report Server settings according to your needs.
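The backup and restore steps can also be scripted in plain T-SQL. A minimal sketch, assuming a hypothetical collection database called Tfs_MyCollection and made-up file paths (run RESTORE FILELISTONLY against the backup to get the real logical file names before restoring):

```sql
-- Source server: create a full backup of the detached collection database
BACKUP DATABASE [Tfs_MyCollection]
TO DISK = N'D:\Backups\Tfs_MyCollection.bak'
WITH INIT, NAME = N'Full backup of Tfs_MyCollection';

-- Target server: restore the backup as a new database
RESTORE DATABASE [Tfs_MyCollection]
FROM DISK = N'D:\Backups\Tfs_MyCollection.bak'
WITH MOVE N'Tfs_MyCollection'     TO N'D:\Data\Tfs_MyCollection.mdf',
     MOVE N'Tfs_MyCollection_log' TO N'D:\Data\Tfs_MyCollection_log.ldf';
```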
It may happen that when you try to attach the project collection TFS cannot find the restored database and you receive the following error message:
TF254078: No attachable databases were found on the following instance of SQL Server: MyServer. Verify that both the name of the server and the name of the instance are correct and that the database was properly detached using the detach command in the Team Foundation Administration Console.
The error message is actually accurate, so it is worth checking the following:
- Verify that you can connect to the SQL Server instance and the database in it with the TFS service account.
- Verify that you have exactly the same TFS version in both environments.
- Verify that you have not skipped Step 3b and correctly detached the project collection from TFS.
TFS verifies the second and the third criteria by querying the list of databases in the SQL Server and then executing the following query in each of them:
SELECT name, value FROM sys.extended_properties WHERE name LIKE 'TFS_%'
This query returns the custom properties of the database that start with “TFS_”. You can run the same query in your target environment and compare the results for the Configuration database, for an attached collection database, and for the restored database. The key difference is that a correctly detached collection database contains a TFS_SNAPSHOT_STATE property with the value Complete.
If you cannot see the TFS_SNAPSHOT_STATE property with the Complete value, there is a fairly good chance that you forgot to detach the project collection in the TFS Admin Console before creating the SQL backup.
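If you want to check every database on the instance in one go, you can push the same query through sp_MSforeachdb (an undocumented but widely used procedure that ships with SQL Server):

```sql
-- Lists the TFS_* extended properties of every database on the instance
EXEC sp_MSforeachdb '
    SELECT ''?'' AS [database], name, value
    FROM [?].sys.extended_properties
    WHERE name LIKE ''TFS_%''';
```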
If you don’t need the data of a project any more, you can choose among various tools to delete it from Team Foundation Server:
- There is a Delete button on the Team Projects tab in the TFS Administration Console.
- You can run tf.exe with the delete command, which runs fast, but it only sets a flag that marks the files as deleted, so you can restore them later using the undelete command.
- The destroy command of tf.exe permanently deletes the items, so you cannot restore them later. Unfortunately it can only delete items from the version control database.
- TFSDeleteProject.exe is another command line tool, which can delete not only from the TFS database, but also from the reporting and SharePoint databases.
Whichever method you choose, you may notice that the size of the database is not reduced immediately after you delete a large amount of data. This is because all of the above methods leave some orphaned data behind, which is deleted later by a job run by the TFS Background Job Agent. However, this job runs only once a day!
If you don’t want to wait that long, you can run tf destroy with the /startcleanup switch, which immediately kicks off the cleanup job.
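For example (the collection URL and item paths below are placeholders; destroy is irreversible, so double-check the itemspec before running it):

```
rem Pend a recoverable delete (must be checked in; restorable with tf undelete)
tf delete $/MyProject/OldFolder /recursive

rem Permanently destroy the items and start the cleanup job immediately
tf destroy $/MyProject/OldFolder /startcleanup /collection:http://myserver:8080/tfs/DefaultCollection
```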
Another option is to dive deep into the database, and run the cleanup stored procedures manually. If your Content table is large:
EXEC prc_DeleteUnusedContent 1
If your Files table is large:
EXEC prc_DeleteUnusedFiles 1, 0, 1000
This second sproc may run for a long time; that’s why it has a third parameter, which defines the batch size. You should run this sproc multiple times, or if it completes quickly, you can increase the chunk size.
Obviously, this is not supported, but worked on my machine.
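If you want to automate the repeated runs, a loop like the following can help. It is only a sketch with an arbitrary iteration cap, and as noted above, calling these sprocs manually is unsupported:

```sql
-- Unsupported: run the batched cleanup several times against the
-- collection database; lower the cap if a pass already completes quickly.
DECLARE @i INT = 0;
WHILE @i < 10  -- arbitrary safety cap
BEGIN
    EXEC prc_DeleteUnusedFiles 1, 0, 1000;  -- third argument = batch size
    SET @i = @i + 1;
END
```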
One of the best features of TFS is that changes to the source code are documented not only by the changeset comments, but you can also bind all changes to work items. TFS provides several work item types out of the box; however, they sometimes do not perfectly suit our needs and require some customization.
Traditionally you can use the witadmin.exe command line tool to manage work item types. You can export their XML definitions and import them back to the server after you have modified them. The benefit of this approach is that you can add the modified XML files to your source control to track the changes.
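For example, to round-trip the Bug work item type of a hypothetical MyProject project:

```
rem Export the work item type definition to an XML file
witadmin exportwitd /collection:http://myserver:8080/tfs/DefaultCollection /p:MyProject /n:Bug /f:Bug.xml

rem Import the modified definition back to the server
witadmin importwitd /collection:http://myserver:8080/tfs/DefaultCollection /p:MyProject /f:Bug.xml
```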
If you prefer GUI tools, you can install the TFS Power Tools, which add a Process Editor item to the Tools menu that you can use to open the work item definition directly from the server:
In the next step you have to select which work item type you want to edit:
And you instantly get the list of the properties of the work item type:
If you don’t like that by default the Assigned To field shows all users of your server, you can select this field from the list and click the Edit button to modify it. On the second, Rules tab of the popup edit dialog you can see the reason for this behavior:
The problem is the VALIDUSER entry which refers to the Team Foundation Server Valid Users group. You can remove that and add the ALLOWEDVALUES rule instead, which allows you to configure the selectable list items. If you want to add user groups, you can refer to server-level groups in the [Global]\groupname form, and to the groups of the current project in [Project]\groupname form. For example:
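A possible replacement rule could look like this (the group names are placeholders for your own groups):

```xml
<ALLOWEDVALUES expanditems="true">
  <LISTITEM value="[Global]\Developers" />
  <LISTITEM value="[Project]\Contributors" />
</ALLOWEDVALUES>
```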
You can find the detailed documentation of the available rules in the Rule Type dialog on the All FIELD XML elements reference MSDN page.
When you click Save, the changes are committed directly to the server, and you may need to click Refresh in the Team Explorer window to make the Visual Studio client pick up the changes.
I like the Team Foundation Server Administration Console. It is obvious that when its UI was designed, they went through the typical admin tasks and tried to support them as much as possible. For example it is very easy to change the service account or the URL of the server, and you can even find a Test button near almost every option, which you can use to quickly validate your settings.
One of the most useful features is the Send Test Email button which allows you to check the SMTP settings of your TFS server:
After clicking this button you can enter the recipient’s e-mail address and optionally a message which will be included in the test e-mail:
Click OK and if you are lucky, TFS can send the test e-mail immediately:
But it may happen that sending the test e-mail fails, and you get the following error message:
Unable to connect to the TFS server to test email settings. Url = ‘/_api/_common/TestMailSettings?sendTo=myuser%40example.com&message=Test+email’. Exception = System.UriFormatException: Invalid URI: The format of the URI could not be determined.
at System.Uri.CreateThis(String uri, Boolean dontEscape, UriKind uriKind)
at System.Net.WebRequest.Create(String requestUriString)
After reading the error message one can check the SMTP settings again, however this time the problem is somewhere else. Open the Change URLs dialog and check that you have selected the Use localhost option in the Server URL section:
The error above is raised only if you have selected the second option and entered the real name of your server instead of the default localhost in the Use: field. If you are hit by this issue, vote on Connect, and hopefully it will be fixed soon.
We had an ancient source control server which was running Team Foundation Server 2005. Yes, that classic 1.0 version! Because the underlying OS, the SQL Server, the SharePoint framework as well as the hardware were becoming quite outdated, I decided to upgrade it to the latest 2013 version. Unfortunately you cannot upgrade from 2005 to 2013 in a single step (which is no surprise, because there were three versions between them); first you have to upgrade to 2010, and then from 2010 to 2013:
Upgrading from 2005 to 2010 is a bit tricky, because you have to upgrade SQL Server and SharePoint as well, but you can find a good step-by-step guide and a best practices collection on the TFS Setup Support Team blog. The TFS Team also published a very useful article on upgrading WSS 2.0 to 3.0 for TFS.
In theory upgrading TFS is fairly simple: just uninstall the previous version but leave the SQL databases in place, then install the new release and choose Upgrade in the install wizard, which will update the SQL database schema to the latest version. It does not look complicated, especially if you follow Tim Elhajj’s 61-page Upgrade Team Foundation Server 2012: The Ultimate Upgrade Guide.
However the practice can be a bit different.
First, none of the above blog posts calls your attention to the fact that there is a serious bug in the TFS 2010 upgrade process which may cause inconsistencies in the database, so you have to be very careful with the upgrade. You can read more about it in Brian Harry’s TFS 2010 Upgrade Issue blog post, which contains a link to the hotfix as well.
It was even more frustrating that after successfully completing the upgrade wizard I could not connect to the server. Different clients greeted me with different error messages, for example:
TF400324: Team Foundation services are not available from server MyServer\MyCollection. Technical information (for administrator): Unable to connect to the remote server.
TF205020: A connection could not be made to the following server: MyServer\MyCollection. This server was used during your most recent session, but it might be offline, or network problems might be preventing the connection. Contact the administrator for Team Foundation Server to confirm that the server is available on the network.
TF31002: Unable to connect to this Team Foundation Server
It even happened that Team System Web Access running on the same server could access the data, while remote Visual Studio clients could not connect to it. According to Network Monitor the communication between the hosts went flawlessly, but the server mostly returned HTTP 4xx and 5xx errors (which were visible in the IIS log anyway). It also happened that I could connect to the server and manage the work items, but could not access the version control and the file history data.
Particularly interesting was that the clients could successfully connect to a new collection I created on the upgraded server, which meant that there was no problem with system-level settings (e.g. port, firewall, certificate); the issue was somehow related to the upgraded database. I even tried the Best Practices Analyzer from the TFS 2013 Power Tools, but it could not find any relevant issue on either the server or the collection level.
Because I ran out of ideas, I went down to the database level. I monitored the queries with Profiler and looked into the tables via Management Studio, comparing the data of the freshly created, working collection database with the records in the database of the upgraded collection. Finally I found the tbl_ServiceDefinition table, which seems to contain the URL endpoints of the TFS .ASMX web services. Interestingly, while all cells contained data in the working database, the same table in the upgraded database contained several NULLs in the RelativePath column.
So I took a deep breath, created a snapshot of the VM, and manually updated the records in the tbl_ServiceDefinition table. I changed the values of the RelativeToSetting, RelativePath and IsSingleton fields in those rows where RelativePath was NULL and where I could find a matching row in the working database based on the GUID value in the Identifier column. This is my final version:
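The manual fix can be sketched as an UPDATE like the one below. The database names are placeholders, and this is of course exactly the kind of unsupported surgery that warrants a VM snapshot first:

```sql
-- Unsupported hack: copy the missing service paths from a known-good
-- collection database, matching the rows on the Identifier GUID.
UPDATE u
SET u.RelativeToSetting = g.RelativeToSetting,
    u.RelativePath      = g.RelativePath,
    u.IsSingleton       = g.IsSingleton
FROM Tfs_UpgradedCollection.dbo.tbl_ServiceDefinition AS u
JOIN Tfs_WorkingCollection.dbo.tbl_ServiceDefinition  AS g
    ON g.Identifier = u.Identifier
WHERE u.RelativePath IS NULL;
```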
On the clients I cleared the Visual Studio Team Explorer cache folder (C:\Users\username\AppData\Local\Microsoft\Team Foundation), added the server again, tried to connect aaaaaaaaaaaaaand it worked!
I have to add that this is a total hack! It is not a good solution, not supported and not recommended; it just works on my machine. If you have similar issues, you had better contact Microsoft via the official support channels.
After this, using the Configure Features wizard and manually updating the process templates was not that terrible at all.
It is really annoying that the remote desktop client sometimes remembers only your login name but not your password, and greets you with messages like these:
Your credentials did not work
Your system administrator does not allow the use of saved credentials to log on to the remote computer COMPUTER because its identity is not fully verified. Please enter new credentials.
To fix this start the Local Group Policy Editor (gpedit.msc), and navigate to this branch: Computer Configuration –> Administrative Templates –> System –> Credentials Delegation. Here open the Allow delegating saved credentials with NTLM-only server authentication option and set it to Enabled.
Click the Show… button in the Add servers to the list option and add the servers you want to apply this setting to. You must use the TERMSRV/computername format to specify a single computer, or you can use TERMSRV/* to refer to all servers.
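If you prefer scripting it (or gpedit.msc is not available), the same policy can be set through the registry. To the best of my knowledge these are the value names the policy writes, but treat them as an assumption and verify them on your system:

```
reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows\CredentialsDelegation" /v AllowSavedCredentialsWhenNTLMOnly /t REG_DWORD /d 1 /f
reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows\CredentialsDelegation" /v ConcatenateDefaults_AllowSavedNTLMOnly /t REG_DWORD /d 1 /f
reg add "HKLM\SOFTWARE\Policies\Microsoft\Windows\CredentialsDelegation\AllowSavedCredentialsWhenNTLMOnly" /v 1 /t REG_SZ /d "TERMSRV/*" /f
```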
We use the PowerShell script below to backup our SQL Server databases:
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoExtended") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.ConnectionInfo") | Out-Null
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SmoEnum") | Out-Null
$server = New-Object ("Microsoft.SqlServer.Management.Smo.Server") $dbInstance
$backup = New-Object ("Microsoft.SqlServer.Management.Smo.Backup")
$backup.Action = "Database"
$backup.BackupSetDescription = "Full backup of " + $dbName
$backup.BackupSetName = $dbName + " backup"
$backup.Database = $dbName
$backup.MediaDescription = "Disk"
$backup.Devices.AddDevice("$localSqlBackupPath", "File")
$backup.SqlBackup($server)
This script ran fine for years; however, recently it started to fail. It successfully backed up most databases, but occasionally failed on other databases which had previously been backed up without problems. I found this error in the log:
Exception calling "SqlBackup" with "1" argument(s):
"Backup failed for Server 'MyServer\MySqlInstance'. "
At D:\Backups\BackupSite.ps1:151 char:22
+ $backup.SqlBackup <<<< ($server)
    + CategoryInfo          : NotSpecified: (:) , MethodInvocationException
    + FullyQualifiedErrorId : DotNetMethodException
After a long investigation it turned out that the error has nothing to do with how we call the SqlBackup function; the real issue is that it times out after 10 minutes. I turned off the timeout via the StatementTimeout property of the ServerConnection object and the error is gone:
$server.ConnectionContext.StatementTimeout = 0