Yahoo published data from 2010, and R. Reid published data from 2009 (taken from a site he had access to).
The findings from Yahoo were rather interesting at that time:
We took a combination of access logs and beacon data (previously
included in the page) and filtered out all of the automated requests,
leaving us with a set of requests we could confirm were sent by actual
users. This data, which is completely anonymous, gave us a good
indication of traffic patterns in several countries.
After crunching the numbers, we found a consistent rate of
traffic, with the highest rate being roughly 2 percent in the United
States and the lowest being roughly 0.25 percent in Brazil. All of the
other countries tested showed numbers very close to 1.3 percent.
This is about all I could find so far. But since this data is getting old, I wonder what the percentages are today.
I, therefore, ask you to provide:
links to any open, freely available statistics that touch this area
your own stats, preferably from larger sites that do not target developers
Give the basic info, with a clear route for how to go further – update your browser!
I think sacrificing functionality for 99% of users to accommodate 1% is sheer bloody-mindedness.
“Sorry, your computer is too old and slow to render this website.” OR
“Sorry, 99.9% of the planet, we’ve presented you with a sub-optimal 1993 experience because 0.1% of you have outdated tech”
Most active and extensive discussions on StackExchange sites on this topic:
You’re right… these are pretty hard to come by. I could actually only find the ones you mentioned, with the YDN 2010 article being referenced quite often.
I guess you could also use a traffic tracking and analysis suite to verify these stats on your own, if you have a site with sufficient traffic and the relevant demographic you are aiming for.
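If you want to roll this yourself, as suggested above, a common trick is a double beacon: a tracking pixel that only a running script injects, plus an `<img>` fallback inside a `<noscript>` block, then a comparison of hit counts in the access logs. Here is a minimal sketch of the log-counting side — the beacon paths `/js.gif` and `/nojs.gif`, the combined-log-style line format, and the crude user-agent bot filter are all assumptions for illustration, not any analytics product's API:

```python
import re

# Hypothetical beacon paths: /js.gif is requested by a script-injected
# pixel, /nojs.gif by an <img> inside a <noscript> block.
JS_BEACON = "/js.gif"
NOJS_BEACON = "/nojs.gif"

# Very rough bot filter on the whole log line, mirroring the
# "filter out automated requests" step described in the Yahoo quote.
BOT_RE = re.compile(r"bot|crawler|spider|preview", re.IGNORECASE)

def nojs_percentage(log_lines):
    """Return the share (in %) of presumed-human page views without JS."""
    js = nojs = 0
    for line in log_lines:
        # Assumed combined-log-style request field: "GET /path HTTP/1.1"
        m = re.search(r'"GET (\S+) HTTP/[\d.]+"', line)
        if not m or BOT_RE.search(line):
            continue
        if m.group(1) == JS_BEACON:
            js += 1
        elif m.group(1) == NOJS_BEACON:
            nojs += 1
    total = js + nojs
    return 100.0 * nojs / total if total else 0.0
```

As several answers below point out, even this number is noisy: content filters, failed script downloads and page pre-loading all land in the no-JS bucket without any human having flipped a setting.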
As insurance against link rot: the number was 1.1%, with 0.9% of that being visits where JS was enabled in the browser but otherwise not run, for reasons guessed to be things like corporate content filters, mobile network errors, and even page pre-loading.
If we could find out what constitutes that 0.9%, and how much of it is not a human sitting at an intentionally JS-disabled browser, then the case for investing effort and cost in progressive enhancement/graceful degradation could be weakened.
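Taking the 1.1% and 0.9% figures above as percentage points of all visits (an assumption about how that write-up phrased it), the remainder attributable to deliberately disabled JS is simple arithmetic:

```python
# Figures from the measurement quoted above, read as percentage
# points of all visits (an assumption about the phrasing).
total_without_js = 1.1     # % of visits where the JS enhancement never ran
enabled_but_not_run = 0.9  # % points: JS was on, but blocked/failed anyway

# What remains is an upper bound for deliberate no-JS users.
deliberately_disabled = round(total_without_js - enabled_but_not_run, 2)
```

On that reading, roughly 0.2% of visits is the ceiling for "a human sitting at an intentionally JS-disabled browser".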
In any case, it looks to be a tiny proportion.
Although progressive enhancement is dead to me, I do think JS should be used sparingly, unless it's a single-page app.
We recently invested a large amount of time into non-JS optimization and wanted to know whether the effort paid off. It turned out that exactly zero of our customers chose to sign a contract while having JS disabled, even though about 3% of the visits to our homepage are non-JS. Thus I think that most of that traffic is generated by bots.
Feel free to draw your own conclusions
Such statistics can only ever be useful for a specific site, and even then there are cases that are hard to interpret:
- What about users that execute some, but not all scripts of a site?
- What about users that don’t execute scripts of a site most of the time, but occasionally execute all/some?
Other factors to consider:
- Sites that require JS, even if only for some parts, can only gather biased statistics, as they have probably already put off the no-JS visitors in the past.
- If your site is JS-free, you gather statistics, and then start to add JS, blacklisters (which had JS enabled before) might block (some of) your scripts.
- No-JS visitors are probably more sensitive to privacy, so it’s likely that they are taking other measures in addition … they might look like bots in site statistics 😉
- Site topic (what is your audience like and interested in?), browser stats (NoScript is one of the most popular Firefox add-ons), country (the German Federal Office for Information Security strongly recommends¹ that all citizens install NoScript), and the available competition on the market (if your site is unique and I really want to use it, I’ll allow scripts; otherwise, I go to your competitor) might all have a strong influence.
¹ The BSI link is 404 now. Not sure if this recommendation is still somewhere on their site. For reference, here is the last snapshot of that page in the Internet Archive.
The statistics also differ between countries.