To Hell with Bad Editors
By now everyone has had the opportunity to read the Web Standards Project (WaSP) position on old browsers, and see A List Apart (ALA) implement that campaign’s message with its own rebuild. Overall, a good message, and lots of good points within.
But WaSP has made it too easy. The people who partake in the WaSP and ALA are all too familiar with standards, compliance, and the failure of the browser manufacturers. In fact, none of what happened is really new to anyone who frequents the sites. In essence, WaSP and ALA are preaching to the choir. Not everyone in the choir may agree with the implementation, but they all understand the message.
And they’ve chosen the easiest targets out there — the browser makers. There’s loads of data to demonstrate that the browsers don’t adhere to standards, and that they’ve often ignored them. Microsoft is always a target, so it’s no great leap to attack their implementation of the marquee tag. Netscape is now an AOL company, so it’s easy to demonstrate how they took liberty with (then nascent) standards. Never mind that they both helped extend and stress-test the standards until the W3C was ready to take a strong role.
So now they lay blame on the older browsers, and praise the new ones for almost implementing standards that are up to four years old (even though there are more recent standards out there). They want developers to tell users to upgrade their browsers. With a campaign more annoying to users than the "Best viewed with…" buttons of the first six years of the web, WaSP suggests you kick users over to a page telling the users that their browsers are old and crappy. Who cares about the reason they may be using the browser? Who cares that the browser qualifies as good but the user had JavaScript disabled? Who cares that some users don’t care?
In focusing on the browsers, they’ve taken the pressure from where it really belongs — the editors. The browser makers are getting it; they’ve been making the changes (thanks in part to the WaSP and many developers). The developers who frequent ALA and WaSP get it; they’re coding to standards. The users are going to be assaulted with annoying redirects if WaSP has its way, so they may even upgrade in less than the projected 18-month window. But what about the developer who doesn’t partake? How is the campaign benefiting him/her or users of his/her sites?
Ultimately, there are two kinds of editors, people and software. Not all software writes bad code, and not all hand-coders write good code. But just as everyone thinks he or she is a good driver, nobody wants to fess up to the fact that someone is writing abysmal code.
Software
And by software, I primarily mean WYSIWYGs. This also includes those great text editors that offer incorrect HTML syntax guidance. And there are some that are self-described visual editors, or are really page layout applications, or even word processors. But ultimately, if it writes HTML for you, I’m talking about it. I don’t want to name any in particular, however, since I know people can be defensive about the tools they use. Some are bad, and some are good, and some are only as good as the user is bad.
I am, however, going to offer this statement from a company that makes all sorts of web tools. This statement was reported in a few places, including a review of the Web Standards Project Panel posted by Macromedia (don’t worry, there are other sources to verify it):
The compliancy argument, despite its good intention, does not have any important real-world application or meaning when considering the challenges Web designers face today. Nearly all professionally-created sites created with a plethora of web design visual authoring and coding tools will not pass compliancy tests as presented at http://validator.w3.org/. Failure of this test likewise does not serve any strong indication as to the validity of the Web site design itself in terms of user experience.
This circular argument basically says: nobody’s making sites with valid code, so we’re not going to make a tool that writes valid code. To some degree, all the tool vendors are guilty of promoting this logic. There are tools that happily insert invalid tags and attributes, allow incorrect nesting of elements, and even have incorrect (or misleading) documentation. The resulting code is often bloated, and is generally optimized for the developer’s system.
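A made-up fragment in the spirit of that output (not any particular tool’s actual markup) shows what “bloated, and generally optimized for the developer’s system” tends to mean in practice: presentational markup and pixel widths wired to the author’s own monitor:

```html
<!-- Hypothetical tool output: presentational markup hard-wired
     to a 1024-pixel-wide display -->
<TABLE WIDTH=1024 BORDER=0 CELLPADDING=0 CELLSPACING=0>
  <TR><TD><FONT FACE="Arial" SIZE=2>Welcome to our site</FONT></TD></TR>
</TABLE>
```

Under XHTML every attribute value must be quoted and every element name lowercased, and the font element fails a strict DTD; yet the page looks fine on the machine it was built on, which is exactly the trap.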
There are open-source tools out there that could be incorporated into the editors. Off the top of my head, I can think of three that would make any WYSIWYG (or otherwise) editor a much more viable solution for the developer who wants to code to standards:
- The W3C HTML/XHTML validator. This will validate the given page against the DTD listed within the page. The source code is distributed under a GPL-compatible license.
- The W3C CSS validator. Another tool that could be integrated into an editor.
- HTML Tidy. A handy stand-alone utility that searches for, and corrects, tag errors (nesting, unclosed tags, illegal tags, etc.). The source code is there, and they promote integration with other tools.
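As a rough sketch of what integrating HTML Tidy would buy (the fragment is invented, not any editor’s real output), consider the kinds of errors it catches and repairs:

```html
<!-- Before: misnested emphasis, unclosed paragraphs -->
<p>Welcome to <b><i>our site</b></i>
<p>News and updates

<!-- After a Tidy-style cleanup: properly nested and closed -->
<p>Welcome to <b><i>our site</i></b></p>
<p>News and updates</p>
```

An editor that ran such a pass on every save would quietly produce valid code without asking anything of the user.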
Granted, this doesn’t necessarily apply to some tools that only output to HTML as an ancillary function. But if they choose to market this feature and know developers rely on it (like creating entire sites from sliced images), then they should have the responsibility of building the tool to write correct code. Some tools offer the option to customize code by, for instance, letting you quote attributes. This should not be an option; attributes should be quoted. If somebody really wants to write non-compliant code, that person can edit it manually, but the tool should default to correct code at all times, and assume the user utilizes the tool because the user cannot or will not code by hand.
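The quoting rule is not pedantry. Here is a hypothetical but typical failure: an unquoted attribute value ends at the first space, so part of the value silently becomes a stray attribute:

```html
<!-- Unquoted: the browser reads alt as "Company" and treats "logo"
     as an unknown attribute -->
<img src=logo.gif alt=Company logo>

<!-- Quoted: the full text survives, and the markup can validate -->
<img src="logo.gif" alt="Company logo">
```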
Wouldn’t it be nice if the editor, or other non-dedicated tool (page layout tool, for instance), could notify the developer when he/she is creating inaccessible code? Wouldn’t it be nice if all those positioned divs were re-ordered, with prompting to the user, so that a screen reader could make sense of the content when linearized? Maybe it could coach the user for page titles instead of leaving blank titles everywhere. Perhaps it could tell the user that “click here” is an unacceptable string of text to make into a hyperlink. How about warning when a frame has no navigation in it? Image maps without text links? Lack of meta information? And the list goes on.
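A sketch of what that coaching might produce, using invented page content, shows how little markup the fixes require:

```html
<!-- What the tool should flag: an empty title, meaningless link text,
     an image with no text equivalent -->
<title></title>
<a href="jobs.html">click here</a> for our job postings
<img src="offices.gif">

<!-- What it should coach the user toward -->
<title>Acme Widgets: Job Postings</title>
<a href="jobs.html">Current job postings</a>
<img src="offices.gif" alt="Map of our three office locations">
```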
There are too many people who’ve been pushed into web development as part of their daily job, but have no idea what HTML is. I’ve seen too many human resource staffers expected to maintain the job posting section of a site. Why not provide them with a tool that does it right? They aren’t going to learn HTML, or even know about the WaSP campaign, so let’s target the software manufacturers who are the de facto authors of millions of invalid pages. Let the users create good code, despite themselves.
So I say to the tool developers, use your software to guide the user with correct code, validate all output, and cut out all that evil. For all the tool users, you must understand that the tool limits you. Unless you hand tweak the output (in which case I refer you to the next section of this piece), you can only generate what the tool will let you generate.
People
Standards and support are a well-understood problem; many people just don’t care. There are developers who want the easiest way out possible, and don’t care about standards in light of everything from the extra work to the nagging client. Just because they eschew WYSIWYGs doesn’t mean they can code their way out of a triply-nested table.
These people can’t or won’t make that change. The worst offenders are those who won’t. After surfing the responses to the ALA article, I saw way too many comments where the person was quite gung-ho about the "to hell with old browsers" message, and used it as justification to abruptly stop coding for older or alternative browsers. Nothing in the world changed just because ALA and the WaSP got some press on this issue. The same people using Navigator 3.04 yesterday are still using it today.
Yet too many people will use it as an excuse to dump all support for those older browsers. Let’s be clear: there’s no reason you can’t build pages that work and look generally good in old browsers while still validating; this very site is an example (in fact, you can read about how we did it). But there’s a certain gee-whiz factor with being on the bleeding edge. So now, instead of trying to get some bizarre DHTML trickery to work properly, a developer feels he/she can say, "It’s the browser’s fault. Tell them to upgrade." Immediately responsibility has been handed off. And all I wanted to do was buy a scarf. I did turn off JavaScript, though, since it kept crashing my version of IE5, so I guess it’s my fault.
These are the developers who need to learn to code for the user, while still adhering to standards. Too many sites are simply justification for playing with code. On a personal site, that’s great. On an e-commerce site, that’s probably a bit dim. On a community site, that’s just bad web karma.
In many cases, the culprit is that the developer is trying to apply old rules to a new medium. There are many things the web is not. It is not a CD-ROM presentation; users don’t come to your site to learn a new navigation technique. It is not print; you can’t control how text wraps, you can’t control the leading, hell, you can’t even control the typeface. The web is not television; users don’t navigate linearly and without bandwidth concerns. This isn’t to say we don’t see the web in these media, but we need to code for what the web is, a highly malleable medium where the user has as much control as the developer in how the content is presented. And I’m not the first person to say it — it’s been said on ALA, Jakob Nielsen has said it, and they’re on opposite ends of the developer scale. Somewhere in between are the rest of us. And yet we see developers constantly massaging image-sliced table layouts and DHTML effects designed to wow themselves, their boss, or their clients, but rarely their users.
Hand coders also need good resources for their skills. Many of them turn to books, given the ease with which one can read them versus surfing the W3C site. However, many of these books provide incorrect code samples. I’ve personally returned four HTML books because they had incorrect tags, attributes, or syntax throughout (I’ve seen them include both the spacer tag and the marquee tag, among other near-Greek tragedies). This isn’t limited to books on HTML, either, but is seen perhaps more readily in books covering server-side programming and scripting. Often the authors are only concerned with getting their script correct, and the HTML is the unfortunate offspring of the wonderful world of servers and scripts. As such, it is the bastard child of the code in the book, lacking in everything from quotes on the attributes to closing tags. After writing to an author of a server-side scripting book about the incorrect HTML and XHTML examples, I received this response:
You are correct about the sample of code shown, but it was done deliberately. It was meant to show a typical sample of HTML, whether or not that correctly conformed to standards. I agree that, in itself this isn’t really an excuse for writing ‘bad code’, but it wasn’t sloppyness. […] For myself I just hadn’t really been aware of XHTML and it’s importance – a pretty poor excuse I think you’ll agree.
To his credit, the author was aware of the importance of standards by the time I had found this book, and had made significant improvements in later books. But certainly this is indicative of an overall lack of strict standards compliance in the very “text books” so many developers use. And since those developers often don’t know about, or won’t take the time to visit, the W3C site, they are at a significant disadvantage.
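The fixes such books need are mechanical. In XHTML every element name is lowercase, every attribute is quoted, and every element is closed; here is a made-up before-and-after of the kind of markup a scripting example might print:

```html
<!-- The "bastard child" output: unquoted attributes, unclosed tags -->
<TABLE BORDER=1><TR><TD>Query results<BR>

<!-- The same markup as valid XHTML -->
<table border="1"><tr><td>Query results<br /></td></tr></table>
```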
So I say to the people who code, learn the standards, code to compliance, and always keep the user in mind, regardless of what unfortunate browser he or she might use.