My feelings about this are mixed. Of course, good standards are helpful, especially when they make browsers and end devices more capable and enable new, efficient user interfaces to be built on top of them.

On the other hand, the multitude of standards and sub-standards creates so much technical overhead that it becomes harder for ordinary people to get started. And however we feel about the resulting invalid, often haphazardly cobbled-together HTML dumps, it was precisely this easy access and the browsers' fairly tolerant handling of errors that allowed HTML and the web to take off in the first place.
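How tolerant is tolerant? A minimal sketch, using Python's stdlib `html.parser` as a stand-in for a browser's parser: like a browser, it happily processes markup in which nothing is closed, instead of rejecting it.

```python
from html.parser import HTMLParser

# Collect the start tags the parser encounters; the class name and
# structure here are illustrative, not from the original article.
class TagCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.tags = []

    def handle_starttag(self, tag, attrs):
        self.tags.append(tag)

parser = TagCollector()
# Invalid HTML: not a single tag is closed, yet parsing succeeds.
parser.feed("<html><body><p>Hello<p>World<b>bold")
print(parser.tags)  # ['html', 'body', 'p', 'p', 'b']
```

This forgiveness is exactly what let the view-source-and-copy culture work: sloppy input still produces a visible page.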

The format is far more accessible for far more people to produce: if need be, you take another site, view its source, and build something similar. Many started that way, and many never get beyond copying. But that doesn't matter: they are there.

Sure, designers recoil in horror, as do HTML standards purists and software developers. I myself could scream when I look at certain output on the web. But the fact remains: with more complicated techniques, these people wouldn't be here at all.

Would the web be better for it? Is it really sensible to wall yourself off behind technical barriers and make the web more elitist? Or is it precisely the haphazardly hacked-together and sometimes truly awful content that makes the web what it is: a genuinely popular medium?

The new W3C standards are becoming ever more technical, ever more complex, and in doing so they raise the barrier to entry. Sure, HTML 4 still exists and will certainly be supported for a long time, but it will become, so to speak, the dumbed-down version. Professionals will toss XHTML and XForms around; amateurs will muddle through with shoddy HTML 4.
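The stricter world looks very different. XHTML is XML, and a conforming XML parser must reject what an HTML browser quietly renders. A minimal sketch, again using Python's stdlib, this time the strict XML parser:

```python
import xml.etree.ElementTree as ET

# The same kind of sloppy markup a tolerant HTML browser accepts...
sloppy = "<p>Hello<p>World"

# ...is a hard error under XML's well-formedness rules: the second <p>
# opens before the first one is closed, and nothing is ever closed.
try:
    ET.fromstring(sloppy)
    result = "accepted"
except ET.ParseError:
    result = "rejected"

print(result)  # rejected
```

Copy-and-tinker stops working once a single missing close tag means a blank error page instead of a rendered document; that is the barrier the article is pointing at.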

I don't know which would be more fun for me. But I'm afraid it would be the shoddy HTML 4...

The original article is at heise online news.