Web Standards: Evolving A "Successful Esperanto" for the New Human Network…

What are we actually building as we build the Web? It may seem like we’re linking computers all over the world. But the computers, the cables, the protocols and the rest are just the tools and the glue.

What we’re really building here is a new Human Network which allows people all over the earth to create, distribute, share and consume information they can use to add value to their personal or business lives. (That’s why I believe my Digital Declaration of Independence is important.)

We’re not there yet, of course. There are hundreds of millions of people who don’t have access. It’ll take many decades – maybe even a couple of centuries – before that changes.

One of the things that separates people today is language. I speak English, and some French which seems to have survived since it was instilled in me at high school. Plonk me down in Germany, Tokyo or Timbuktu and I’m pretty helpless. I’ll eventually find what I need, but it’ll be a long, tedious process punctuated with miscommunications which are either hilarious or frightening.

Well-meaning people have tried to solve the language problem in the past with new artificial languages which were meant to eventually replace the natural-but-different ones spoken by different peoples.

It hasn’t worked. The most successful effort to date is probably Esperanto, and even that’s a pretty dismal failure as far as adoption is concerned.

On the Web we’ve done much better, thank goodness.

Anyone who wants to establish a proprietary standard for the Web knows it’s hopeless. They missed the bus a long time ago. It pulled out of the station in the early 1990s, and it had “HTML” written across its front. By now it’s uncatchable.

The problem is: the standards are still pretty awful. HTML has evolved over the years and been augmented with other standards like Cascading Style Sheets. But it started out as an information interchange format invented by geeks, for geeks. Little or no attention was paid to readability, the shorthand term I use for the vast array of typographic and design techniques developed in the 550 years since Gutenberg (or the 5500 years since humans first began to write) to make it easy for humans to absorb information from a page or screen.

When the first NCSA Mosaic browser appeared, for instance, you could have any typeface you wanted as long as it was Times (or Times New Roman). Lines spanned the whole width of the window, and long passages of text were just about impossible to read.
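
To make “readability” concrete: the basics are a capped line length, generous leading, and a centered column. Here’s a minimal sketch in CSS of what the cure looks like; the selector and the numbers are my own placeholders, not from any standard or real site:

    /* Illustrative only: cap the measure, open up the leading,
       and center the column. Selector and values are placeholders. */
    article {
      max-width: 33em;     /* roughly 60-70 characters per line */
      line-height: 1.5;    /* more generous leading than browser defaults */
      margin: 0 auto;      /* center the text column in the window */
    }

Three declarations, and long passages become readable again. The point is that the standards have to make room for this, so authors don’t have to fake it.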

I think it was 1993 when I first noticed “design coolness” attempting to break through. People started using all kinds of tricks to overcome the limitations of HTML.

They’re still having to do that, because Web standards have not yet evolved to support everything that’s needed. So we still see text served up as bitmapped graphics, or as Flash, just because the Web’s own handling of type is so bad.

There’s a lot more that needs to go into standards like CSS before they’re really capable of doing the job we need. And the browsers will have to keep changing in order to support those new standards.
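
Downloadable type is the obvious example: until authors can specify real typefaces, they’ll keep shipping text as pictures. A rough sketch of what that support looks like; “BodyFace” and the file URL are invented placeholders, not a real typeface:

    /* Sketch of downloadable type via @font-face (specified in CSS2,
       though browser support has been spotty). Names and URL invented. */
    @font-face {
      font-family: "BodyFace";
      src: url("fonts/bodyface.ttf");
    }
    body {
      font-family: "BodyFace", Georgia, serif;  /* fall back gracefully */
    }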

This is a real dilemma if you’re a company with a browser. Changes to standards may break older pages; in fact they will certainly break many. If your customers have built hundreds of millions – or billions – of pages to work well on one version of your browser, and you change it, they face a lot of work fixing them.

On the other hand, if you “fossilize” the browser so those old pages keep working forever, the standards can never evolve, and the Web stays as hard to read as it is today.
