User-agent sniffing is just asking for problems. There's a reason why pretty much every notable person in the JavaScript world gave up on it a long time ago.
Another real-world example: suppose you decide to use UA sniffing to determine whether to use the pre-IE7 ActiveX implementation of XMLHttpRequest or a native browser object (though IE7's XMLHttpRequest isn't exactly "native", even though it behaves that way. Sort of.). So what do you do? If you look for "MSIE" in the UA string, you're likely to start throwing errors in Opera, which has a nasty habit of sending an IE UA string. Similarly, plenty of users of Mozilla-based browsers intentionally set their UA string to IE's to get around stupid browser-sniffing tricks.
The result is that any AJAX stuff you were doing stops working for a not-insignificant number of people who might otherwise have been paying customers. User-agent sniffing has been dead since 1995, when IE started including "Mozilla" in its UA string, and it has only gotten deader since -- Safari, for example, includes "like Gecko" in its UA string.
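A quick sketch of why the "MSIE" test misfires. The UA strings below are representative examples of the pattern described above (IE claiming "Mozilla", Opera claiming "MSIE"), not quotes from any specific browser version:

```javascript
// Representative UA strings (illustrative examples only):
var ieUA    = "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)";
var operaUA = "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; en) Opera 9.50";

// A naive sniff for IE...
function looksLikeIE(ua) {
  return ua.indexOf("MSIE") !== -1;
}

// ...flags Opera as IE too, so any IE-only code path (such as the
// ActiveX branch) would run in a browser with no ActiveX at all.
console.log(looksLikeIE(ieUA));    // true
console.log(looksLikeIE(operaUA)); // true -- false positive
```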
Which leaves you with, well, object detection. Want to know how to do XMLHttpRequest? Ask the browser whether it has a native object for that; if not, ask whether it has an ActiveX implementation. If the browser fails both tests, it can't do XMLHttpRequest, and you either need to fall back on other tricks (like iframes) or give up and force full-page reloads on the user. In the long run, this is both simpler than maintaining a massive list of compatible UA strings and more effective -- when a new browser comes out, you don't have to update all your JavaScript.
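The detection described above might be sketched like this. The `global` parameter is a stand-in for the browser's window object so the branch logic can be exercised anywhere; "Microsoft.XMLHTTP" is the classic ActiveX ProgID:

```javascript
// Object detection for XMLHttpRequest: ask the environment what it
// supports instead of guessing from the UA string.
// `global` stands in for the browser's window object.
function createXHR(global) {
  if (typeof global.XMLHttpRequest !== "undefined") {
    // Native implementation (Mozilla, Safari, Opera, IE7+).
    return new global.XMLHttpRequest();
  }
  if (typeof global.ActiveXObject !== "undefined") {
    // Pre-IE7: fall back to the ActiveX implementation.
    try {
      return new global.ActiveXObject("Microsoft.XMLHTTP");
    } catch (e) { /* fall through */ }
  }
  // Neither test passed: the caller must degrade gracefully
  // (iframe tricks or full-page reloads).
  return null;
}
```

In a real page you'd call `createXHR(window)`; nothing here depends on which browser is running, which is the whole point.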
Or does "the world's largest retailer" not care about missed sales due to easily-corrected coding mistakes?