Monthly Archives: August 2014

Mixed content warning

It is so sad when a webpage falls apart in the browser, like this one in Chrome:

[Screenshot: the broken page in Chrome]

Why is that? Oh, isn’t it obvious? The explanation is right there; let me help you:

[Screenshot: Chrome’s mixed content shield icon in the address bar]

It is called the mixed content warning, and although it is a warning, it is very easy to miss. Let’s see the same page in Firefox:

[Screenshot: the same page in Firefox]

Do you get it? Here it is:

[Screenshot: Firefox’s mixed content blocked icon in the address bar]

Internet Explorer is not so gentle; it immediately calls for the user’s attention:

[Screenshot: Internet Explorer’s textual mixed content prompt]

Although here you don’t have to search for a shield icon (one of the most overused symbols in IT history), because you immediately receive a textual message, the situation is not really better. Average users don’t understand this message or the real cause behind it. What’s more, it is not only users who don’t get it: many web developers don’t understand the security consequences either, otherwise there would be no pages with this warning at all.

It is so easy to get rid of the mixed content warning: if you load the page via the https:// protocol, you must load all referenced content (yes, all of it) via https as well. If there is a single http:// URL in your page, the browser will trigger the mixed content warning. If you load content from a third-party domain and cannot use relative URLs, start your reference URLs with “//”, which tells the browser to use the same protocol that was used to load the page itself. This is called a “protocol-relative”, “scheme-relative” or “scheme-less relative” URL, and it is already described in RFC 3986 (dated January 2005), which specifies the URI syntax. Thankfully all browsers understand it, as the example below shows.
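For example (the CDN host name here is only a placeholder), a script reference like src="http://cdn.example.com/lib.js" triggers the warning on a page loaded via https://, while src="//cdn.example.com/lib.js" makes the browser fetch the script over the same protocol the page itself was loaded with.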

It is time to fix these pages, and to let browsers sooner or later block the poorly implemented ones completely.

 


How many requests do you need to authenticate?

It sometimes happens that a webservice requires Basic authentication, which is usually not an issue in .NET clients thanks to the Credentials property on the generated proxy class, where you can set the user name and password:

MyService ws = new MyService
{
    // NetworkCredential is in the System.Net namespace
    Credentials = new NetworkCredential( "user", "password" )
};

You may think that after setting this property the client sends an HTTP request which already contains the authentication data, but unfortunately things happen differently. First a request is sent without the Authorization header, and if the server responds with 401 Unauthorized, a second request is submitted which contains the user name and password in the header. So the result is doubled traffic.
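On the wire, the exchange looks roughly like this (an illustrative trace with abbreviated headers; the endpoint path and the Base64 string, which encodes "user:password", are only examples):

Client –> Server:

POST /MyService.asmx HTTP/1.1
request headers (no Authorization)

Server –> Client:

HTTP/1.1 401 Unauthorized
WWW-Authenticate: Basic realm="example"

Client –> Server:

POST /MyService.asmx HTTP/1.1
request headers
Authorization: Basic dXNlcjpwYXNzd29yZA==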

If you don’t like this (and why would you?), you can set the PreAuthenticate property, which forces the client to always send the authentication header without waiting for a 401 response:

MyService ws = new MyService
{
    Credentials = new NetworkCredential( "user", "password" ),
    PreAuthenticate = true
};

 


Calling a PHP webservice from .NET

It happened that I had to call a webservice implemented in PHP from a .NET client via a standard SOAP interface; however, it turned out once again that the Simple Object Access Protocol is not so simple in the real world.

The client received a 400 Bad Request response status from the server, which, without any further details, didn’t help much in finding the real problem. The Apache web server log contained an “Invalid URI in request” entry, but that didn’t bring me closer to the solution either.

As so many times before, Fiddler helped. Diving into the HTTP traffic, I noticed that the request contained an Expect: 100-continue header which was not handled by the web server. This is a very interesting header: it allows the client to ask the server to evaluate the request headers before the client submits the request body (see RFC 2616 Section 8.2.3 for more details). In short, this single header can drastically change the classic request-response sequence to something like this:

Client –> Server:

POST / HTTP/1.1
Host: example.com
request headers
Expect: 100-continue

Server –> Client:

HTTP/1.1 100 Continue
response headers

Client –> Server:

request body

Server –> Client:

HTTP/1.1 200 OK
response headers
response body

The .NET client always sends this header to minimize network traffic, but it seems that not all servers tolerate it. You can use the Expect100Continue property of the ServicePointManager class to override this behavior:

// Affects only service points created after this line (System.Net)
ServicePointManager.Expect100Continue = false;

You may even get better performance this way, because the HttpWebRequest class by default waits 350 msec for the 100 Continue response.
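If switching this off globally feels too drastic, here is a minimal sketch of a narrower alternative (the service URL is only a placeholder): you can set the same flag on the ServicePoint of a single endpoint:

// Both classes are in the System.Net namespace;
// this affects only requests sent to this particular endpoint.
ServicePoint sp = ServicePointManager.FindServicePoint( new Uri( "http://example.com/service.php" ) );
sp.Expect100Continue = false;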

 
