TUNES vs the WWW

An essay that contrasts the concept of Metatext in Tunes with that of the World Wide Web (WWW).

WARNING! Very early draft. It is only a bunch of references and needs work to make sense .. but anyway, it is food for thought. This is not intended to be "politically correct" or even merely "balanced"; it is my (MaD70) particular point of view, not necessarily endorsed by other members and contributors of TUNES, so beware. [Note that my English prose is horrible; feel free to improve it.]

See also Tunes Distributed Publishing by fare and the concept of Metatext in Tunes.

The fundamental realization that the Web is broken: Completely Rethinking the Web, by Dirk Knemeyer (some thoughts on something similar to Metatext).

Some technologies predating the Web that the W3C is trying to reinvent (badly):

§ = courtesy of the Wayback Machine; see Search Facilities.

As an introduction, see this small article by Tim Bradshaw: The wheel of reinvention§ (old broken link).

From less formalized/nearer the user to more formalized theories and resulting technologies (I established this hierarchy because I think that implementing the upper levels in terms of the lower ones is convenient; it is a good separation of concerns):

[Insert here a comment about the wastefulness and unreliability of current search engines, which are a centralized, poorly scalable solution, vs. true Distributed DBM/IR Systems.]

By Tim Berners-Lee:

Weakness of the Hypertext model: links!

[Review W3C XML Linking Language (XLink)].
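In the meantime, a minimal sketch of the core problem (a hedged example in Python; the document and URL below are invented for illustration): a traditional embedded link is one-way and owned entirely by the source document, so when the target moves or vanishes, neither side records the breakage.

    from html.parser import HTMLParser

    # A traditional embedded link: one-way, untyped, owned by the source.
    doc = '<p>See <a href="http://example.org/gone">the spec</a>.</p>'

    class LinkDump(HTMLParser):
        def handle_starttag(self, tag, attrs):
            if tag == "a":
                print(dict(attrs))  # {'href': 'http://example.org/gone'}

    LinkDump().feed(doc)
    # The target knows nothing about this link; if it disappears, nothing
    # in either document registers the breakage (link rot). XLink's
    # out-of-line extended links can instead store such arcs in a
    # separate linkbase.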

We can stress the weakness of traditional links with various contrasts, different in terminology but similar in the concepts they express; for example:

Essay by Tim Berners-Lee:

Weakness of HTTP (with which people are continually confronted):

[Review W3C HTTP - Hypertext Transfer Protocol.]

Winning the Application Server Arms Race: Using Smalltalk to Redefine Web Development, keynote by Avi Bryant at Smalltalk Solutions 2004 (also at James Robertson's blog: StS 2004 - The Seaside keynote).

Abstract: It would be hard to imagine a worse model for user interface development than HTTP. Would you use a GUI framework where every event from every widget in your application was handed to you at once, periodically, as a large hashtable full of strings? Where every time a single piece of data changed you had to regenerate a textual description of the entire interface? Where the event-based architecture was so strict that you couldn't ever, under any circumstances, open a modal dialog box and wait for it to return an answer?

Those are the costs of using the web browser as a client platform, and, by and large, we accept them. The dominant paradigms of web development -- CGI, Servlets, Server Pages -- do very little to hide or circumvent the low level realities of HTTP, and as a result, web applications are fragile, verbose, and ill-suited to reuse.

[..]
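To make the "large hashtable full of strings" point concrete, here is a minimal sketch (in Python; the form fields are invented, and any CGI- or WSGI-style framework exhibits the same shape):

    from urllib.parse import parse_qs

    # One POST body carries the state of *every* widget at once, as strings.
    body = "age=42&subscribed=on&comment=hello+world"
    fields = parse_qs(body)  # {'age': ['42'], 'subscribed': ['on'], ...}

    # No typed events, no per-widget callbacks: the application rummages
    # through the bag of strings and regenerates the entire page.
    age = int(fields["age"][0])  # manual string -> int conversion
    page = "<html><body>You are %d.</body></html>" % age
    print(page)

Seaside's answer, which the keynote goes on to demonstrate, is to hide this protocol behind per-component callbacks and continuations, recovering a conventional control flow (including modal call/answer).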

Googling for "weakness of HTTP" gives only 7 results (4-Jan-2003) .. amazing!

[Review REST (REpresentational State Transfer - RESTwiki)].

Weakness of Markup Languages:

A talk given by Aaron Crane: Does XML Suck? Or: Why XML is technologically terrible, but you have to use it anyway. On the same site, see The Big List of XML Technologies.

Why XML is awful§ (another copy and old broken link).

By Rita Knox: Here's What's Wrong With XML-Defined Standards - ".. Wasn't XML supposed to make data shareable? No. XML provides the tools to define shareable data models[sic], but it does not make them shareable any more than the alphabet makes every word in the English language understood by anyone who speaks English. .." [In other words, the naïve equate "self-describing" with not needing agreement on semantic and pragmatic issues before exchanging information.]
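A tiny sketch of that point (in Python; both fragments and their element names are invented): two parties can each emit well-formed, "self-describing" XML for the same fact, and a parser will happily read both without any way to tell that they denote the same thing.

    import xml.etree.ElementTree as ET

    # Two "self-describing" fragments for the same fact, produced by
    # parties that never agreed on a vocabulary.
    supplier_doc = '<item><price currency="USD">9.50</price></item>'
    buyer_doc = '<article><cost>9.50</cost></article>'

    for doc in (supplier_doc, buyer_doc):
        root = ET.fromstring(doc)  # parsing always succeeds...
        child = root[0]
        print(root.tag, child.tag, child.text)
        # ...but nothing in the markup says that <price> and <cost>
        # denote the same concept, nor what currency <cost> is in.

The agreement on meaning has to come from outside the markup, which is exactly Knox's point.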

Fabian Pascal shows that XML is a poor choice for data transmission too:

See also XML Binary Characterization Working Group Public Page:
The XML Binary Characterization Working Group is tasked with gathering information about use cases where the overhead of generating, parsing, transmitting, storing, or accessing XML-based data may be deemed too great for a particular application, characterizing the properties that XML provides as well as those that are required by the use cases, and establishing objective, shared measurements to help judge whether XML 1.x and alternate (binary) encodings provide the required properties.
As said in the XML node, this is an implicit admission by the W3C that the standard XML format is a waste of resources.
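A back-of-the-envelope illustration of that overhead (in Python; the record layout and field names are invented): the same small record encoded as textual XML and as a fixed-layout binary struct.

    import struct

    # One invented sensor reading: id (u32), timestamp (u32), value (f64).
    record = (42, 1072915200, 21.375)

    xml_doc = ('<reading>'
               '<id>42</id>'
               '<timestamp>1072915200</timestamp>'
               '<value>21.375</value>'
               '</reading>')
    binary = struct.pack('<IId', *record)  # 4 + 4 + 8 = 16 bytes

    print(len(xml_doc.encode('utf-8')), 'bytes as XML')  # 84 bytes
    print(len(binary), 'bytes as binary')                # 16 bytes

And this is before counting the cost of parsing the text back into typed values; with markup-heavy schemas the ratio only gets worse, which is the kind of measurement the working group was chartered to characterize.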

Erik Naggum, maintainer of the SGML Repository at the University of Oslo for nearly 6 years, before:
[??? where is it? it was a citation on the front page of an official SGML site ???]

.. and after the SGML cure: Erik Naggum on SGML and DSSSL, and Arguments against SGML.

Also see these threads on comp.lang.lisp: Core ideas behind SGML and XML, and "Re: The Next Generation of Lisp Programmers":
