Wednesday, January 12, 2005

Dump the World Wide Web!



Here's an interesting excerpt from an article written by Bill Thompson that appeared on the www.opendemocracy.net website on December 23rd, 2004. What Thompson is saying has a good deal of relevance to what we are doing with Croquet. I will let the reader decide where that relevance might lie. Comments on this posting are particularly welcome.

"The World Wide Web is dead. Like a cartoon character running off a cliff but making it some way out into space before awareness brings gravity back into operation, it may continue to dominate our online lives a little longer, but its day is over.

Soon the whole clumsy, inadequate edifice will come crashing to the cyberspatial equivalent of the ground and we will look back upon the crazy decade from 1994 to 2004 for what it was – a dead-end in the development of the networked world.

The reasons are simple: the web, like many a political refugee, lacks a state. What’s worse, it doesn’t speak a language that will let it express anything more than basic requests for food, shelter or yet another poorly-resized JPEG image. Like all analogies this one breaks down pretty quickly if you scratch it too hard, but it’s worth keeping in mind during the (necessarily) more technical explanation you’re about to encounter.

It is important to understand how the web works. The web, like email, uses a “client-server” model. The client, in this case your browser, requests something – a web page – from a server. When a request is received, and assuming the parts are all there and the client has permission to take them, they are sent over the network by the server. It’s then up to the client to deal with them appropriately. In the case of a web page the elements will usually be a document written using HTML, the hypertext markup language, some image files and maybe extra bits and pieces. It is all very simple, and it’s made even simpler because the browser and the server communicate using a language of their very own called the Hypertext Transport Protocol, or HTTP.
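
As a rough sketch of that exchange, the whole transaction boils down to a few lines of Python; the host name and path below are placeholders rather than a real service:

    # Minimal client side of an HTTP request, using only the standard library.
    import http.client

    conn = http.client.HTTPConnection("www.example.com")
    # The browser's request is essentially one line: "GET /index.html HTTP/1.1"
    conn.request("GET", "/index.html")

    response = conn.getresponse()            # the server's reply
    print(response.status, response.reason)  # e.g. 200 OK
    page = response.read()                   # the HTML document itself
    conn.close()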

The browser takes what it is given and displays it on your screen, laid out as prettily as it can manage. However once we want to do anything more complicated than display a page of text and graphics on a screen we rapidly discover that both HTML and HTTP are simply not up to the job.

The problems with HTML are serious but understandable. When Tim Berners-Lee created the web he wanted a simple text-based publishing tool for the high-energy physics community, and a simple markup language that let authors specify headings and link to other documents was fine.

But in 1993 two graduate students at a United States university decided they could improve on Tim’s work by writing a new browser which would display images too. In order to make this work they had to change HTML by adding the <IMG> tag – and they started a process of non-standard extensions which continues to this day.

The result is the mess we see today, where despite the best efforts of the standards bodies it is still necessary to write dozens of lines of code at the start of a web page in order to figure out which browser is in use, so that the “correct” version of the page can be sent over.

Present at the creation

It’s an appalling mess, but it wasn’t directly Tim’s fault. However the same cannot be said for HTTP, the protocol which allows browsers to ask for pages and servers to send them across the network. Here Tim’s desire for simplicity has led directly to our current problems, because he decided that the server should treat each request for a page from a browser as a separate transaction. The decision to make HTTP a “stateless” protocol has caused immense trouble. It’s rather like being served by a waiter with short-term memory loss: you can only order one course at a time because he will have forgotten your name, never mind your dessert order, by the time you’ve had your first spoonful of gazpacho.
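
A toy server makes that forgetfulness concrete. In the sketch below (Python standard library only; the port is arbitrary), every GET arrives as a self-contained event, and the handler has nothing that ties it to any earlier request from the same visitor:

    from http.server import BaseHTTPRequestHandler, HTTPServer

    class ForgetfulHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # All the server receives is this one request: a path and some headers.
            # There is no built-in notion of "the same client as before".
            body = "You asked for %s, but I have no idea who you are.\n" % self.path
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(body.encode("utf-8"))

    HTTPServer(("localhost", 8000), ForgetfulHandler).serve_forever()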

Unfortunately many of the things that we want the web to do for us, from online shopping to having a newspaper that tailors its pages to our interests, rely on some degree of long-term interaction between client and server. Cookies, small data files that are placed on a client computer by the server, provide a partial solution, rather like the tattoos sported by Guy Pearce in the film Memento, but they are inelegant, complicated and far from reliable. As, indeed, the tattoos turn out to be.
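
The mechanics of the workaround are simple enough to sketch: the server hands the client a small token in a Set-Cookie header, and the client must volunteer it again on every later request, because HTTP itself remembers nothing. The host name and cookie value below are purely illustrative:

    import http.client

    conn = http.client.HTTPConnection("shop.example.com")

    # First visit: the server has never seen us, so it issues a session token.
    conn.request("GET", "/basket")
    first = conn.getresponse()
    first.read()
    cookie = first.getheader("Set-Cookie")   # e.g. "session=abc123; Path=/"

    if cookie:
        # Later visit: the client must present the token itself on every request.
        conn.request("GET", "/checkout",
                     headers={"Cookie": cookie.split(";")[0]})
        print(conn.getresponse().status)
    conn.close()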

We have spent the last decade fighting against the limitations of the web standards, extending, breaking, reinventing and compromising with them to the point where you can just about do online shopping, make pages look reasonably attractive and even offer personalised services.

But enough is enough. Just as it is sometimes necessary to demolish old buildings to make way for new, so it is time to move on from the web. It isn’t as if we need to look far for an alternative – we’ve had one since 1990 when the web was just starting to emerge from CERN physics lab. It’s called “distributed processing” and it enables programs to talk to each other in a far richer, more complex and more useful way than the web’s standards could ever support.
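
To give a flavour of what "programs talking to each other" looks like in practice, here is a minimal sketch using XML-RPC from the Python standard library; the service and its single function are invented for illustration, and XML-RPC itself still travels over HTTP, so this only hints at the richer distributed systems the article has in mind:

    # Server process: exposes an ordinary function as a remotely callable one.
    from xmlrpc.server import SimpleXMLRPCServer

    def headlines(topic):
        # A real news service would query its own store here.
        return ["Latest on %s: example headline" % topic]

    server = SimpleXMLRPCServer(("localhost", 9000), allow_none=True)
    server.register_function(headlines, "headlines")
    server.serve_forever()

The calling program, running elsewhere, then invokes the remote function almost as if it were local:

    from xmlrpc.client import ServerProxy

    news = ServerProxy("http://localhost:9000")
    print(news.headlines("croquet"))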

Had it not been for the rush to embrace the web’s page-based publishing model, choosing the simple solution over the right one, we would have proper distributed systems available today. Instead we have to invent technologies which preserve the web approach while making it slightly more usable, like the eXtensible Markup Language, or XML. Any tool that is too embarrassed even to use the first letter of its full name for an abbreviation is surely in trouble from the start.

Unusually for a company which is credited with following trends rather than creating them, Microsoft saw this first. They never liked the web and it was only the horrible realisation that every company, every net user and every competitor was going to invest a vast amount of money, effort and resources making it seem like it worked that forced Bill Gates to turn the company around and give it a web focus late in 1995.

At the time their programmers were just beginning to explore the possibility of direct programme-to-programme communication and network-based collaboration between applications. Without the distraction of the web we may well have had widespread distributed online services five or even more years ago.

These services would not rely on the Web browser as the single way of getting information from an online service, but would allow a wide range of different programs to work together over the network. We already accept that email, chat and even music sharing do not have to be Web-based, but we can go much further.

A news site could deliver text, images, audio and even video through a program designed for the purpose, instead of having to use a general-purpose browser, or a shopping site could build its own shopping cart and checkout that did not rely on Web protocols. And we would have no need for Google, because information services would advertise their contents instead of having to be searched by inefficient ‘spiders’.

The web may have served a purpose once, giving net users something relatively simple to look at and use and convincing the world that being online was a good thing, but it has done so at great cost to the network’s architecture and has diverted research into usable, scalable and functional distributed systems for the last decade.

There is a deep need among the users for something better than the shoddy, half-baked hypertext publishing model that we geeks foolishly embraced back in the early 1990s. If we do not start delivering it the net itself will stumble, fail and eventually die away, trapped in this stateless web of deceit."

3 comments:

Anonymous said...

A few points I miss in the article: the aspect of where the data are stored, the aspect of the available hardware (or better: the environment), and the aspect of standards.

One effect of the original web was that people could look at a huge amount of data that could never be stored on their PC alone. And second, the viewing could be done with a relatively small PC running (nearly ;-) ANY operating system.

Given the hardware available at the time, this was also necessary, because only a few people had access to a lot of disk space and processing power.

These days that situation has obviously changed. But where are the data stored? If data and processing are equivalent, in the sense that it makes no difference to the user whether data are read from a database or computed on the fly, then what the text states makes sense: switching to a network of programs talking to each other, which in consequence also means a network of relatively equally powered nodes (PCs).

But for me this still leaves unanswered the questions of where the data are held and how the programs talk to each other, or in short: the aspect of establishing and living by standards. This seems to me as important in the "new" internet as in the current one: platform-independent software and a standard markup for data (and processes!).

For platform-independent and equally valued processing nodes, this is where Croquet comes onto the scene. But even on this platform I see no clear answer to "where are the data stored" and "how do programs talk to each other in a standard way", because in the texts I have read about Croquet the topic of databases was always missing, and web services or anything similar also seem to be ignored in the Croquet world. But perhaps I'm only a dummy ;-) Anyway, for the new internet Croquet will be exciting and the right step....

Darius said...

One counterexample to this essay, not explored by it, is that of Macromedia’s products. They allowed distributing programs with a common language on a platform that made all PCs behave identically for the programmer. Yet Flash & Shockwave never replaced HTML; Dreamweaver remains their strong seller. Why?
For the same reason, JavaScript never _replaced_ HTML. Why?

Another point: HTML created _a_ universal standard. There was no indication in ’95 that any other protocol could or would ever become a _universal_ standard; instead there were turf wars between entire protocol sets. There are HTML turf wars now, but at least there is a minimal universal standard that gets things _done_.

As a programmer, I don't like HTML either. Neither can I make a universal application without it, no matter how clever I may be. I do hope Croquet will lead to the next standard. However, there are certain critical needs it _must_ satisfy. I've not seen much effort to list these needs so that we can predict or judge how close Croquet is to meeting them.

Personally tailored hardware configurations are just too flaky for a server to guess what programs to send down to the client. Sending text is just "safe" in such a scenario. The Croquet team has already seen this exhibited at the level of 3D drivers.

Croquet has a chance now because games & commoditization helped push the price of 3D down and hence made it universal and fairly homogeneous. For example, Croquet would have no chance if the population of PCs owned today were made up of hundreds or thousands of completely incompatible 3D hardware & API configurations. The authors of Croquet acknowledge this in their introduction to Croquet.

Darius said...

Perhaps the Roadmap as a wiki will help here.