Mixed content warning

August 18, 2014

It is sad when a webpage falls apart in the browser, like this one in Chrome:

[Screenshot: the broken page in Chrome]

Why is that? Oh, isn’t it obvious? The explanation is there, let me help you:

[Screenshot: Chrome mixed content warning icon]

It is called the mixed content warning, and although it is a warning, it is very easy to miss. Let’s see the same page in Firefox:

[Screenshot: the same page in Firefox]

Do you get it? Here it is:

[Screenshot: Firefox blocked content indicator]

Internet Explorer is not so gentle; it immediately calls the user’s attention to the problem:

[Screenshot: Internet Explorer mixed content prompt]

Although you don’t have to search for a shield icon here (the shield being one of the most overused symbols in IT history), because you immediately receive a textual message, the situation is not really better. Average users understand neither this message nor the real cause behind it. What’s more, it is not only users who don’t get it: web developers don’t understand the security consequences either, otherwise there would be no pages with this warning at all.

It is very easy to get rid of the mixed content warning: if you load the page via the https:// protocol, then you must load all referenced content (yes, all of it) via https as well. If there is a single http:// URL in your page, the browser will trigger the mixed content warning. If you load content from a third-party domain and cannot use relative URLs, then start your reference URLs with “//”, which tells the browser to use the same protocol that was used to load the page itself. This is called a “protocol-relative”, “scheme-relative” or “scheme-less relative” URL, and you can find its description as far back as RFC 3986 (dated January 2005), which specifies the URI syntax. Thankfully all browsers understand it.
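The resolution rule is easy to try out: Python’s urljoin in the standard library implements the RFC 3986 reference resolution algorithm, so it shows what a browser does with a scheme-relative URL (the domain names below are just placeholders):

```python
from urllib.parse import urljoin

# A scheme-relative reference inherits the protocol of the page that contains it.
page_over_https = "https://example.com/article"
page_over_http = "http://example.com/article"
reference = "//cdn.example.com/script.js"

print(urljoin(page_over_https, reference))  # https://cdn.example.com/script.js
print(urljoin(page_over_http, reference))   # http://cdn.example.com/script.js
```

The same reference resolves to https on an https page and to http on an http page, which is exactly why it avoids the mixed content warning.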

It is time to fix these pages, and sooner or later browsers should completely block such poorly implemented pages.

 

Categories: Security, WebDev

How many requests do you need to authenticate?

It sometimes happens that a webservice requires Basic authentication, which is usually not an issue in .NET clients thanks to the Credentials property on the generated proxy class, where you can set the user name and password:

MyService ws = new MyService
{
    Credentials = new NetworkCredential( "user", "password" )
};

You may think that after setting this property the client will send an HTTP request which contains the authentication data, but unfortunately things happen differently. First a request is sent without the Authorization header, and if the server returns a 401 Unauthorized response, a second request is submitted which contains the user name and password in the header. So the result is doubled traffic.
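For the curious, the Authorization header carried by that second request is nothing secret: per the Basic scheme it is just the Base64-encoded user:password pair. A minimal sketch in Python (the credentials are of course placeholders):

```python
import base64

# A Basic authentication header is simply "user:password" Base64-encoded.
# Preauthenticating means sending this header with the first request
# instead of waiting for the 401 challenge.
def basic_auth_header(user, password):
    token = base64.b64encode(f"{user}:{password}".encode("utf-8")).decode("ascii")
    return f"Basic {token}"

print(basic_auth_header("user", "password"))  # Basic dXNlcjpwYXNzd29yZA==
```

Note also that Base64 is an encoding, not encryption, which is why Basic authentication should only travel over HTTPS.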

If you don’t like this (and why would you?), you can use the PreAuthenticate property, which forces the client to always send the authentication header without waiting for a 401 response:

MyService ws = new MyService
{
    Credentials = new NetworkCredential( "user", "password" ),
    PreAuthenticate = true
};

 

Categories: Security, WebDev

Calling a PHP webservice from .NET

August 14, 2014

It happened that I had to call a webservice implemented in PHP from a .NET client via a standard SOAP interface, however it turned out again that Simple Object Access Protocol is not so simple in the real world.

The client received a 400 Bad Request response status from the server, which by itself, without any more details, didn’t help much in finding the real problem. The Apache webserver log contained an “Invalid URI in request” entry, but that didn’t bring me closer to the solution either.

As many times before, Fiddler helped. Diving into the HTTP traffic I noticed that the request contained an Expect: 100-continue header which was not handled by the webserver. This is a very interesting header, which allows the client to ask the server to evaluate the request headers before the client submits the request body (see RFC 2616 Section 8.2.3 for more details). In short this single header can drastically change the classic request-response sequence to something like this:

Client –> Server:

POST / HTTP/1.1
Host: example.com
request headers
Expect: 100-Continue

Server –> Client:

HTTP/1.1 100 Continue
response headers

Client –> Server:

request body

Server –> Client:

HTTP/1.1 200 OK
response headers
response body

The .NET client always sends this header to minimize the network traffic, but it seems that not all servers tolerate it. You can use the Expect100Continue property of the ServicePointManager class to override this behavior:

ServicePointManager.Expect100Continue = false;

You may even have better performance this way, because the HttpWebRequest class by default waits 350 msec for the Continue response.

 

Categories: .NET, WebDev

.NET Framework 4.5.2

Microsoft has just released a new version of the .NET Framework. Announcement and quick overview of the new features: Announcing the .NET Framework 4.5.2

More details about the new features: What’s new in the .NET Framework 4.5.2

Install packages:

This is an in-place upgrade! It installs side by side with .NET 3.5 SP1 and earlier versions, but upgrades the 4, 4.5 and 4.5.1 versions in place. That’s why the following knowledge base article may be important for you:
KB 2962547 – Known issues for the .NET Framework 4.5.2

Download!

 

Categories: .NET, WebDev

Migrating a TFS project collection to another server

Occasionally you may want to move your Team Foundation Server database to another server, for example because:

  • You want to upgrade the underlying hardware or software infrastructure.
  • You want to split the project collection between multiple servers.
  • You want a copy of your live data in your test environment.

The process is fairly simple:

  1. Ensure that the target server runs the same or a newer SQL Server version than the source environment. This is important because you cannot restore a SQL backup that was created with a newer SQL Server.
  2. Ensure that you have exactly the same version of TFS in both environments. Not only service packs, but also minor hotfixes matter!
  3. In your source server:
    1. Click Stop Collection in the TFS Admin Console.
    2. Click Detach Collection in the TFS Admin Console.
    3. Start SQL Server Management Studio and create a full backup of the database of the project collection.
  4. Copy the SQL backup to your target SQL Server.
  5. In your target server:
    1. Use SQL Server Management Studio to restore the database backup to a new database.
    2. Use Attach Collection in the TFS Admin Console.
    3. Update the SharePoint and Report Server settings according to your needs.

It may happen that when you try to attach the project collection, TFS cannot find the restored database and you receive the following error message:

TF254078: No attachable databases were found on the following instance of SQL Server: MyServer. Verify that both the name of the server and the name of the instance are correct and that the database was properly detached using the detach command in the Team Foundation Administration Console.

[Screenshot: TF254078 attach error dialog]

The error message is actually accurate, so check the following:

  • Verify that you can connect to the SQL Server instance and the database in it with the TFS service account.
  • Verify that you have exactly the same TFS version in both environments.
  • Verify that you have not skipped Step 3b and correctly detached the project collection from TFS.

TFS verifies the second and third criteria by querying the list of databases on the SQL Server instance and then executing the following query in each of them:

SELECT name, value 
FROM   sys.extended_properties 
WHERE  name LIKE 'TFS_%'

This query returns the custom properties of the database whose names start with “TFS_”. You can do the same in your target environment, and you will get something similar for your Configuration database:

[Screenshot: extended properties of the Configuration database]

This is for an attached database:

[Screenshot: extended properties of an attached collection database]

And finally you will get something like this for a correctly detached database:

[Screenshot: extended properties of a correctly detached collection database]

If you cannot see the TFS_SNAPSHOT_STATE property with the Complete value, then there is a fair chance that you forgot to detach the project collection in the TFS Admin Console before creating the SQL backup.

 

Categories: Visual Studio

Cleaning up and reducing the size of the TFS database

If you don’t need the data of a project any more, you can choose among various tools to delete it from the Team Foundation Server:

  • There is a Delete button on the Team Projects tab in the TFS Administration Console.
  • You can run tf.exe with the delete switch, which runs fast, but it only sets a flag to mark deleted files, so you can restore them later using the undelete command.
  • The destroy switch of tf.exe permanently deletes the items, so you cannot restore them later. Unfortunately it can only delete items from the version control database.
  • TFSDeleteProject.exe is another command-line tool, which can delete not only from the TFS database, but also from the reporting and SharePoint databases.

Whichever method you choose, you may notice that the size of the database is not reduced immediately after you delete a large amount of data. This is because all of the above methods leave some orphan data behind, which is deleted later by a job run by the TFS Background Job Agent. However, this job runs only once a day!

If you don’t want to wait that much, you can run tf destroy with the /startcleanup switch which immediately kicks off the cleanup job.

Another option is to dive deep into the database, and run the cleanup stored procedures manually. If your Content table is large:

EXEC prc_DeleteUnusedContent 1

If your Files table is large:

EXEC prc_DeleteUnusedFiles 1, 0, 1000

This second sproc may run for a long time; that’s why it has a third parameter, which defines the batch size. You may need to run this sproc multiple times, or if it completes quickly, you can increase the chunk size.

Obviously, this is not supported, but worked on my machine.

 

Categories: Visual Studio

Customizing TFS work item types

One of the best features of TFS is that changes of the source code are documented not only by changeset comments, but you can also bind all changes to work items. TFS provides several work item types out of the box, however they sometimes do not perfectly suit our needs and require some customization.

Traditionally you can use the witadmin.exe command line tool to manage work item types. You can export their XML definitions and import them back to the server after you have modified them. The benefit of this approach is that you can add the modified XML files to your source control to track the changes.

If you prefer GUI tools, you can install the TFS Power Tools, which add a Process Editor item to the Tools menu; you can use it to open the work item definition directly from the server:

[Screenshot: opening a work item type from the server via the Process Editor menu]

In the next step you have to select which work item type you want to edit:

[Screenshot: Select Work Item Type dialog]

And you instantly get the list of the properties of the work item type:

[Screenshot: properties of the work item type]

If you don’t like that by default the Assigned To field shows all users of your server, you can select this field from the list and click the Edit button to modify it. On the second, Rules tab of the popup edit dialog you can see the reason for that behavior:

[Screenshot: field definition with the VALIDUSER rule on the Rules tab]

The problem is the VALIDUSER entry which refers to the Team Foundation Server Valid Users group. You can remove that and add the ALLOWEDVALUES rule instead, which allows you to configure the selectable list items. If you want to add user groups, you can refer to server-level groups in the [Global]\groupname form, and to the groups of the current project in [Project]\groupname form. For example:

[Screenshot: field definition with the ALLOWEDVALUES rule]
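In the exported XML definition, such a rule looks roughly like this (a sketch only; the group names in the LISTITEM elements are examples that you should adapt to your own server and project):

```xml
<FIELD name="Assigned To" refname="System.AssignedTo" type="String">
  <ALLOWEDVALUES expanditems="true">
    <LISTITEM value="[Project]\Contributors" />
    <LISTITEM value="[Global]\Team Foundation Administrators" />
  </ALLOWEDVALUES>
</FIELD>
```

The expanditems attribute makes TFS expand the group into its member users, so the dropdown shows individual names instead of the group itself.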

You can find the detailed documentation of the available rules in the Rule Type dialog on the All FIELD XML elements reference MSDN page.

When you click Save, the changes are committed directly to the server, and you may need to click Refresh in the Team Explorer window to let the Visual Studio client update.

 

Categories: Visual Studio