I'm saddened (and a little angry) at the number of developers here who absolutely insist it's so much extra work to build a site that doesn't require JavaScript. If you follow best practices, it's not a big deal. Web apps create, retrieve, update, and delete records. No matter how fancy your new startup's idea is, you have to realize that it's all the same task over and over again, and you can do that with simple web forms that don't require AJAX. A little unobtrusive JS to capture clicks, hide boring interfaces, and transform your dull non-JS interface can go a LONG way. Plus you can test that the underlying functionality of your site works very early.
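A minimal sketch of that approach (hypothetical form and endpoint names): the plain HTML form is the baseline that works everywhere, and the unobtrusive script layers AJAX on top only when scripting is available.

```
<!-- Baseline: a plain form that posts normally with JS disabled -->
<form id="comment-form" action="/comments" method="post">
  <textarea name="body"></textarea>
  <input type="submit" value="Post">
</form>
<script>
// Unobtrusive enhancement: if JS is on, submit via XHR and update
// the page in place; if JS is off, the ordinary form post still works.
var form = document.getElementById('comment-form');
form.onsubmit = function () {
  var xhr = new XMLHttpRequest();
  xhr.open('POST', form.action);
  xhr.setRequestHeader('Content-Type', 'application/x-www-form-urlencoded');
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      // insert the new comment into the page without a full reload
    }
  };
  xhr.send('body=' + encodeURIComponent(form.body.value));
  return false; // cancel the default full-page submission
};
</script>
```

The same server endpoint handles both paths; the only difference is whether the response is rendered as a full page or spliced into the current one.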
I'm required by law (Section 508) to build accessible web sites at my day job. It's much easier to make Section 508 compliant websites without JS. Then we put the icing on the cake to satisfy the other 98% of our users.
It seems like your zoom software interacts badly with behaviour that's common on some websites. Disabling Javascript seems like a reasonable workaround, but not without cost. I'm curious - why are web developers the target of your aggravation, rather than the people who wrote your zoom software?
It seems to me that a lot of expensive accessibility software copes well with computing circa 1998, but hits a wall beyond that. Most of the advocacy and lobbying around this issue seems to focus on forcing most developers to adapt to the limitations of this accessibility software, rather than pressuring accessibility software vendors to provide software which (a) delivers innovation and utilizes advanced technology, as you'd expect in 2010 and (b) deals with common patterns in modern software development.
Now imagine I'm surfing as a blind user, and I follow a link from HN, and my screen reader starts reading the page, only to be interrupted by a "take this survey" modal DIV. After a couple of times I'm turning off the JS, cos that's really annoying.
The people who make my zoom software make OSX (it's built into the operating system, and it works amazingly well for me except for applications which fire events when I hover).
Following the principles of UJS and web standards makes it infinitely easier, in my experience, to develop web sites that work with screen readers and still use that advanced technology that everyone's gaga over. But if you can't write JS easily that works across multiple browsers, how do you ever expect assistive technology to be able to interpret it correctly? :)
I am open to suggestions :).
If the target is HN-type people, I'm guessing the number of non-JavaScript users goes up.
If the website is pure information, why not provide it even to people who telnet to port 80?
But if the website is a rich interactive application, even if it could fall back on simple form submissions, I'd say it's fair that non-JS users sees a "This requires Javascript" message.
(my answer is very different if the "interactive application" was powered with something vendor/platform-specific)
And then there's sites like this: http://partiallystapled.com/~gxti/trash/2010/10/09-javascrip...
You can almost recognize who was mostly in charge of a site:
-- needs JavaScript just to take me to another page: some bad "programmer" whose best achievement was that he learned JS.
-- needs Flash just to take me to another page: some "decision maker" who wants "dynamic", or a "programmer" whose best achievement was that he learned Flash.
-- needs a magnifying glass to read the text set in 5 pt font: some "designer" who doesn't care about the text anyway. But look at the nice colors he picked. And everybody has the same screen resolution as his, right?
-- needs a magnifying glass to read the text, and the menus are half JS behind images, half Flash: you tell me, but there's a higher chance it was made by a West European guy.
etc.
Do you know many people who say: "I won't run C applications because they are vulnerable to buffer overflows."
And I also use Opera, exactly because its built-in settings let me enable both JS and Flash only for the few sites where I need them (like YouTube).
I have just checked again: I don't have JS turned on for HN. It just works.
Unless your C program takes in input from advertisers and posts from people on the internet, I don't think that's a valid comparison.
Also, you say "very unlikely", but these attacks happen all the time. Most of them don't make the news, but they're pretty much an everyday affair.
Very few websites give you a substantial improvement in user experience by enabling javascript.
I really don't understand what the big deal is about making sure your site works without JS; it's just some extra work that you're going to have to do anyway later down the line (and it will be more complex then).
But if your web page is a complex application, having to support no-JS either shuts you out from a huge set of interface tools, or forces you to maintain two interfaces in tandem. To me, 2% doesn't justify that, particularly when you can just politely inform the user that their browser has a significant feature disabled, and inform them how they can remedy the situation so that your site works.
Explain to an investor: "We don't have features X, Y, and Z, which our competitors have, because we were spending time maintaining the non-JS version for 2% of our user base."
The problem with bending over backwards for non-js is that you double your dev cycles (no AJAX, need regular forms for everything), lose dynamic menus (want to duplicate those?), lose a ton of instrumentation you may be doing, etc.
I think it's perfectly fine to say "non-JS means you have a read-only experience". Even Facebook doesn't go that far -- you can't even log in without JS enabled! (And they're definitely in the camp of 2% * 500M.)
What will people do who have js disabled? They'll get a tech friend to help, because that disabling was probably a misconfiguration :).
At some point you have to prioritize your time to work on the features that help 95% of your users.
It's worth reminding ourselves that the business and technical constraints of different web applications vary widely and so strategies will also vary widely.
For example, if you are building a greenfield app and your purpose is to try to search for a product/market fit, it generally doesn't make sense to optimize for the 2% of non-js users yet. You haven't yet found a steady source of revenue meaning that (1) you probably can't afford the extra effort and (2) your app's features are likely going to change drastically.
Shorter version: 2x more work does not justify 2%
We snubbed 2% of our potential user base and reinvested the money we saved in increasing the conversion rate of the 98%. Further, we are going to use some of that money to chase additional revenue streams. If we exploit all available revenue models and hit a wall with upping the conversion rate, we will look at creating a separate non-JS version of the site with reduced functionality, so that the two distinct development models are not intertwined, thereby reducing our maintenance burden.
How many do it unintentionally because they think JS is Java (applets ...), or because their browser does not support it?
I'm also wondering how many people change the advanced JS settings in the browser (do browsers other than Firefox even support them?).
I'm one of those 2%, for a few reasons (open for discussion):
- it blocks 90% of ads, because most banners are JS-driven
- I consider JS a privacy and security vulnerability
- disabling JS saves bandwidth
It never takes more than 2 seconds to get an idea if a page does work without JS, and another 2 seconds to decide if it is worth it to enable JS.
These are my usual reasons. I've run into a lot of sites with orders of magnitude more ads, fluff, formatting scripts, etc. than they have content (e.g. thereifixedit and related sites typically take me a full minute to load with JavaScript and often under a second without). I don't make much effort to block ads that don't make the page load much slower.
Then there are the NoScript people, who are willing to whitelist your site to allow JS if they think there is value in it. These people skew the numbers but are pretty reasonable when it comes to turning JS on.
Pretty much all of the attacks against Mozilla that might have actually affected me have been mitigated to some extent by NoScript, so it's pretty useful.
I also have scripting off because I tend to be curious about the diligence and talent of the authors at a site.
Wikipedia: JavaScript is an implementation of the ECMAScript language standard and is typically used to enable programmatic access to computational objects within a host environment.
If you want real privacy, then why not just disable cookies? ohhh your login won't work... well then just disable your internet connection then.
Disabling JS wastes bandwidth because you can't do partial page replacements. OMG so much stupidity.
Edit: So many downvoting trolls.
Edit #2: Even more trolls who can't even leave a reply.
a) reinforces the notion that you didn't care enough about your message to spend 15 seconds longer on it to write it properly, a feeling which already existed in readers' minds because of the presence of your (poorly reasoned) points, and
b) requires readers to spend longer parsing your comment before being able to comprehend the (poorly reasoned) points you were attempting to make.
I'd wager that the average NoScript user has at least two machines, so the total number of NoScript users is probably less than 2.7 million. (http://blog.brandonbloom.name/2010/06/noscript-add-on-instal...)
If you don't cater for those people, put some money aside for lawyers.
I've found that one of the major complaints that blind users have is that it's a total bitch to work with dynamically updating pages. If more developers paid attention to web accessibility, a lot of the major problems for disabled folks would be greatly alleviated.
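One concrete technique for exactly this complaint (a sketch, using the WAI-ARIA live region attribute; element names are hypothetical): mark the region that updates dynamically so the screen reader announces the change instead of silently missing it.

```
<!-- aria-live="polite" tells assistive technology to read out new
     content injected here once the user is idle, instead of the
     update going unnoticed (or rudely interrupting mid-sentence) -->
<div id="search-results" aria-live="polite">
  <!-- results inserted by script land here -->
</div>
```

For urgent messages there is also `aria-live="assertive"`, which interrupts immediately; `polite` is the right default for most dynamic updates.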
http://en.wikipedia.org/wiki/Web_Accessibility_Initiative http://en.wikipedia.org/wiki/Web_Content_Accessibility_Guide... http://en.wikipedia.org/wiki/Section_508_Amendment_to_the_Re...
I say this as a visually impaired developer with several blind friends and family. To me, web accessibility is serious business.
http://www.dojotoolkit.org/reference-guide/dijit-a11y-statem...
I have built many sites with the specific criterion that they work with JAWS. I do not use server-side web frameworks, and my UI logic is implemented entirely in JavaScript. I am an authority in the usability and accessibility field and am hired as such by many of the big boys. JavaScript has nothing to do with accessibility and in fact can be used to increase usability for the disabled. People confuse JavaScript with bad UX design that hinders accessibility.
I think it depends on what content you're providing and which users you want to reach.
I've seen text-only blogs that involve javascript in their visual design and cannot be read without it. I've also seen websites where the site navigation links are rendered by javascript. In both cases the sites are inaccessible to non-javascript users.
In my opinion if you can provide the same content without requiring active scripting, you should. If you're providing an interactive experience that can't be achieved without scripting at least be aware that you're limiting your user base. Even if users have scripting available but disabled by choice, they may decide that your content is not worth enabling scripting for.
I'm also concerned that technical users who disable javascript by choice might not frequent a site like "yahoo" as much as non-technical users. I'm curious if a more neutral but still popular site would have a larger percentage of non-javascript visitors.
That's certainly true for me. I don't care that a site requires scripting, but I do care when there's nothing to indicate that I'd be better off if I allowed javascript.
This is especially true for sites that let a user begin some set of actions (write a lengthy comment, or fill out a multi-step form), and then simply fail at the end because scripting is disabled.
It is trivial to add a "This site requires JavaScript" message. Leaving it out says you either don't give a shit or don't know what you're doing.
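It really is trivial: the standard `<noscript>` element renders its contents only when scripting is unavailable, so a one-paragraph notice covers it.

```
<noscript>
  <p>This site requires JavaScript. Please enable it in your
     browser settings to use the interactive features.</p>
</noscript>
```

A slightly friendlier variant links to instructions for enabling JS, or to a reduced read-only version of the page if one exists.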
"We took a combination of access logs and beacon data (previously included in the page) and filtered out all of the automated requests, leaving us with a set of requests we could confirm were sent by actual users."