Selection Bias When Reviewing Browser Stats
A recurring problem I see is web developers, their support teams, and their managers trying to evaluate who uses their site(s) by reviewing their web logs (or Google Analytics) in a vacuum. It is far too easy to look at statistics reporting which browsers visit a site and use them to decide which browsers to support as new work is done. Unfortunately, this rarely takes into account how well the current site works in those browsers.
If your site, for example, uses features that are unique to a particular browser, you can expect that browser to rank higher in your reports. That isn’t necessarily because it is your audience’s preferred browser, but because users of other browsers either can’t or won’t use the site and either leave (perhaps never to return) or come back in a different browser (which may not be their default/preferred browser).
This is essentially a selection bias when reviewing browser stats, effectively funneling a particular type of user (or browser) to the top of your reports, reinforcing a preset opinion or conclusion.
This is a mistake we all make for various reasons. For example, the 3rd Annual State of Web Development Survey was announced on March 10 by John Allsopp. He’s no slouch in the web development world, having even built a tool for editing CSS. But the survey site that gathers information from web developers has its own selection bias — it is unusable in Internet Explorer 8. The site’s CSS tries so hard to use CSS3 features (which look pretty swell in, say, Chrome) that visiting it in Internet Explorer 8 produces white text inside white form fields and white text on light gray buttons.
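This white-on-white failure is the classic symptom of styling text for a background that older browsers never paint. I have not seen the survey’s actual stylesheet, so the rules below are purely an illustration of the pattern, but a gradient declared without a solid fallback behaves exactly this way:

```css
/* Illustration only (hypothetical selectors, not the survey's CSS).
   IE8 ignores the gradient, so the button keeps its default (often
   white) background and the white label text disappears into it. */
.submit-button {
  color: #fff;
  background: linear-gradient(#8ab4e8, #3a6ea5); /* ignored by IE8 */
}

/* Safer: declare a solid color first. Browsers that understand the
   gradient paint it on top; older browsers keep the solid color and
   the white text stays readable. */
.submit-button-safe {
  color: #fff;
  background-color: #3a6ea5;                     /* fallback for IE8 */
  background-image: linear-gradient(#8ab4e8, #3a6ea5);
}
```

Declaring a plain background color before the fancier effect costs nothing in newer browsers, but it keeps the form usable in the ones that silently drop the CSS3 rule.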
In Chrome, however, the CSS3 effects make for some nice highlighting that helps indicate where I am as I go from field to field:
The survey asks what my default browser is, but to even reach that question I had to drop out of Internet Explorer and re-open the survey in Chrome. Web developers on the whole may not use Internet Explorer as their default browser (I do, because it’s what my clients use and I want to see the web as they do), so Web Directions can expect relatively few respondents to report IE. If Web Directions and John Allsopp review the logs/reports to see what browsers people actually use, they can expect to see far fewer users filling out the survey in Internet Explorer 8 than claim it as their default browser (because they cannot complete the survey). I hope they are all smart enough not to draw a conclusion from that disparity (say, that respondents are lying), although I have seen people make that mistake time and time again. To be safe, I did tweet them to let them know (no response yet).
Some developers, however, make a decision to block browsers without understanding the impact. It’s one thing for a site to display a message telling users to upgrade (see the latest incarnation, Microsoft Promoting the Death of IE6), but it’s a whole other issue when you actively block users of certain browsers (example images below). It’s easy to justify to your client/employer that nobody uses that browser when you show the reports, especially if the client/employer doesn’t use the browser either and you have provided inaccurate data to bolster your point.
I’ve seen this many times in my career, but recently ran into it with a local not-for-profit whose web developers had convinced it not only that Internet Explorer 6 and 7 weren’t widely used, but also — when faced with the fact that many users are restricted to those browsers at work — that end users shouldn’t, and for the most part don’t, surf from work due to corporate policies. My request to back up that assertion with a report of activity on the site by time of day was met with silence.
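That report is not hard to produce. As a sketch of what I was asking for — assuming standard Apache/NGINX combined-format access logs, with made-up sample lines — bucketing hits by hour of day immediately shows whether traffic peaks during business hours:

```python
# Sketch: count web-server requests per hour of day to test the claim
# that "users don't surf from work." Assumes combined log format, e.g.:
#   1.2.3.4 - - [10/Mar/2010:14:05:17 -0500] "GET / HTTP/1.1" 200 512
import re
from collections import Counter

# Pull the two-digit hour out of the bracketed timestamp.
TIMESTAMP = re.compile(r"\[\d{2}/\w{3}/\d{4}:(\d{2}):")

def hits_by_hour(lines):
    """Return a Counter mapping hour strings ('00'-'23') to hit counts."""
    hours = Counter()
    for line in lines:
        match = TIMESTAMP.search(line)
        if match:
            hours[match.group(1)] += 1
    return hours

# Hypothetical sample lines standing in for a real access log.
sample = [
    '1.2.3.4 - - [10/Mar/2010:09:15:00 -0500] "GET / HTTP/1.1" 200 512',
    '5.6.7.8 - - [10/Mar/2010:09:47:12 -0500] "GET /a HTTP/1.1" 200 88',
    '9.9.9.9 - - [10/Mar/2010:21:03:44 -0500] "GET /b HTTP/1.1" 200 310',
]
print(hits_by_hour(sample))
```

A real run over a month of logs would settle the "nobody browses from work" argument one way or the other in a few minutes.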
This is selection bias of the worst kind: driven by personal goals and agendas, ignoring data to the contrary, and refusing to acknowledge those who provide it. The Web Directions survey example is unfortunate, but most likely just a mistake. In this latter example there is no excuse, and it is your responsibility as a developer to provide a better level of service not only to your clients, but especially to your users.