Tuesday, January 23, 2007

The Metaphors of the Net

1. The Genetic Blueprint

A decade after the invention of the World Wide Web, Tim Berners-Lee is promoting the "Semantic Web". The Internet hitherto has been a repository of digital content. It has a rudimentary inventory system and very crude data location services. As a sad result, most of the content is invisible and inaccessible. Moreover, the Internet manipulates strings of symbols, not logical or semantic propositions. In other words, the Net compares values but does not know the meaning of the values it thus manipulates. It is unable to interpret strings, to infer new facts, to deduce, induce, derive, or otherwise comprehend what it is doing. In short, it does not understand language. Run an ambiguous term by any search engine and these shortcomings become painfully evident. This lack of understanding of the semantic foundations of its raw material (data, information) prevents applications and databases from sharing resources and feeding each other. The Internet is discrete, not continuous. It resembles an archipelago, with users hopping from island to island in a frantic search for relevancy.

Even visionaries like Berners-Lee do not contemplate an "intelligent Web". They are simply proposing to let users, content creators, and web developers assign descriptive meta-tags ("name of hotel") to fields, or to strings of symbols ("Hilton"). These meta-tags, arranged in semantic and relational "ontologies" (lists of meta-tags, their meanings and how they relate to each other), will be read by various applications and allow them to process the associated strings of symbols correctly (place the word "Hilton" in your address book under "hotels"). This will make information retrieval more efficient and reliable, and the information retrieved is bound to be more relevant and amenable to higher-level processing (statistics, the development of heuristic rules, etc.). The shift is from HTML (whose tags are concerned with visual appearances and content indexing) to languages such as the DARPA Agent Markup Language, OIL (Ontology Inference Layer or Ontology Interchange Language), or even XML (whose tags are concerned with content taxonomy, document structure, and semantics). This would bring the Internet closer to the classic library card catalogue.
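To make the shift concrete, here is a minimal, purely illustrative Python sketch (the ontology, the triples, and the function names are all hypothetical, not any real standard) of how meta-tags tied to an ontology let software infer that "Hilton" belongs under "hotels" - and, transitively, under "businesses" - where plain string comparison could not:

```python
# A toy illustration of the Semantic Web idea: strings become
# machine-interpretable once tied to an ontology of typed relations.

ontology = {
    "hotel": {"is_a": "lodging"},
    "lodging": {"is_a": "business"},
}

# Meta-tagged facts: (subject, predicate, object) triples.
triples = [
    ("Hilton", "is_a", "hotel"),
    ("Hilton", "located_in", "New York"),
]

def is_a(entity, category):
    """Follow 'is_a' links so 'Hilton' is recognised as a business."""
    for subj, pred, obj in triples:
        if subj == entity and pred == "is_a":
            current = obj
            while current is not None:
                if current == category:
                    return True
                current = ontology.get(current, {}).get("is_a")
    return False

print(is_a("Hilton", "business"))  # True - inferred, not string-matched
```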

Even in its current, pre-semantic, hyperlink-dependent, phase, the Internet brings to mind Richard Dawkins' seminal work "The Selfish Gene" (OUP, 1976). This would be doubly true for the Semantic Web.

Dawkins suggested generalizing the principle of natural selection to a law of the survival of the stable. "A stable thing is a collection of atoms which is permanent enough or common enough to deserve a name". He then proceeded to describe the emergence of "Replicators" - molecules which created copies of themselves. The Replicators that survived in the competition for scarce raw materials were characterized by high longevity, fecundity, and copying-fidelity. Replicators (now known as "genes") constructed "survival machines" (organisms) to shield them from the vagaries of an ever-harsher environment.

This is very reminiscent of the Internet. The "stable things" are HTML-coded web pages. They are replicators - they create copies of themselves every time their "web address" (URL) is clicked. The HTML coding of a web page can be thought of as "genetic material". It contains all the information needed to reproduce the page. And, exactly as in nature, the higher the longevity, fecundity (measured in links to the web page from other web sites), and copying-fidelity of the HTML code - the higher its chances to survive as a web page.

Replicator molecules (DNA) and replicator HTML have one thing in common - they are both packaged information. In the appropriate context (the right biochemical "soup" in the case of DNA, the right software application in the case of HTML code) - this information generates a "survival machine" (organism, or web page).

The Semantic Web will only increase the longevity, fecundity, and copying-fidelity of the underlying code (in this case, OIL or XML instead of HTML). By facilitating many more interactions with many other web pages and databases - the underlying "replicator" code will ensure the "survival" of "its" web page (= its survival machine). In this analogy, the web page's "DNA" (its OIL or XML code) contains "single genes" (semantic meta-tags). The whole process of life is the unfolding of a kind of Semantic Web.

In a prophetic paragraph, Dawkins described the Internet:

"The first thing to grasp about a modern replicator is that it is highly gregarious. A survival machine is a vehicle containing not just one gene but many thousands. The manufacture of a body is a cooperative venture of such intricacy that it is almost impossible to disentangle the contribution of one gene from that of another. A given gene will have many different effects on quite different parts of the body. A given part of the body will be influenced by many genes and the effect of any one gene depends on interaction with many others...In terms of the analogy, any given page of the plans makes reference to many different parts of the building; and each page makes sense only in terms of cross-reference to numerous other pages."

What Dawkins neglected in his important work is the concept of the Network. People congregate in cities, mate, and reproduce, thus providing genes with new "survival machines". But Dawkins himself suggested that the new Replicator is the "meme" - an idea, belief, technique, technology, work of art, or bit of information. Memes use human brains as "survival machines" and they hop from brain to brain and across time and space ("communications") in the process of cultural (as distinct from biological) evolution. The Internet is a latter-day meme-hopping playground. But, more importantly, it is a Network. Genes move from one container to another through a linear, serial, tedious process which involves prolonged periods of one-on-one gene shuffling ("sex") and gestation. Memes use networks. Their propagation is, therefore, parallel, fast, and all-pervasive. The Internet is a manifestation of the growing predominance of memes over genes. And the Semantic Web may be to the Internet what Artificial Intelligence is to classic computing. We may be on the threshold of a self-aware Web.

2. The Internet as a Chaotic Library

A. The Problem of Cataloguing

The Internet is an assortment of billions of pages which contain information. Some of them are visible and others are generated from hidden databases by users' requests (the "Invisible Internet").

The Internet exhibits no discernible order, classification, or categorization. Amazingly, as opposed to "classical" libraries, no one has yet invented a sorely needed Internet cataloguing standard (remember Dewey?). Some sites indeed apply the Dewey Decimal System to their contents (Suite101). Others default to a directory structure (Open Directory, Yahoo!, LookSmart, and others).

Had such a standard existed (an agreed-upon numerical cataloguing method), each site could have self-classified. Sites would have had an interest in doing so, to increase their visibility. This, naturally, would have eliminated the need for today's clunky, incomplete and highly inefficient search engines.

Thus, a site whose number starts with 900 would be immediately identified as dealing with history, and multiple classification would be encouraged to allow finer cross-sections to emerge. An example of such an emerging technology of "self-classification" and "self-publication" (though limited to scholarly resources) is the "Academic Resource Channel" by Scindex.
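By way of illustration, a minimal Python sketch of such Dewey-style self-classification (the class numbers follow the real Dewey hundreds; the lookup scheme and the example site number are invented for the illustration):

```python
# A toy version of numerical self-classification for web sites:
# the first digit of a site's self-assigned number reveals its subject.

DEWEY_CLASSES = {
    "0": "Computer science, information & general works",
    "1": "Philosophy & psychology",
    "2": "Religion",
    "3": "Social sciences",
    "4": "Language",
    "5": "Science",
    "6": "Technology",
    "7": "Arts & recreation",
    "8": "Literature",
    "9": "History & geography",
}

def classify(site_number: str) -> str:
    """A site whose number starts with 9 deals with history."""
    return DEWEY_CLASSES[site_number[0]]

print(classify("940.53"))  # -> "History & geography"
```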

Moreover, users would not be required to remember reams of numbers. Future browsers would be akin to catalogues, very much like the applications used in modern-day libraries. Compare this utopia to the current dystopia. Users struggle with mounds of irrelevant material to finally reach a partial and disappointing destination. At the same time, there likely are web sites which exactly match the poor user's needs. Yet, what currently determines the chances of a happy encounter between user and content are the whims of the specific search engine used and things like meta-tags, headlines, a fee paid, or the right opening sentences.

B. Screen vs. Page

The computer screen, because of physical limitations (size, the fact that it has to be scrolled), fails to effectively compete with the printed page. The latter is still the most ingenious medium yet invented for the storage and release of textual information. Granted: a computer screen is better at highlighting discrete units of information. So, these differing capacities draw the battle lines: structures (printed pages) versus units (screens), the continuous and easily reversible print versus the discrete screen.

The solution lies in finding an efficient way to translate computer screens to printed matter. It is hard to believe, but no such thing exists. Computer screens are still hostile to off-line printing. In other words: if a user copies information from the Internet to his word processor (or vice versa, for that matter), he ends up with a fragmented, garbage-filled and unaesthetic document.

Very few site developers try to do something about it - even fewer succeed.

C. Dynamic vs. Static Interactions

One of the biggest mistakes of content suppliers is that they do not provide a "static-dynamic interaction".

Internet-based content can now easily interact with other media (e.g., CD-ROMs) and with non-PC platforms (PDAs, mobile phones).

Examples abound:

A CD-ROM shopping catalogue interacts with a Web site to allow the user to order a product. The catalogue could also be updated through the site (as is the practice with CD-ROM encyclopedias). The advantages of the CD-ROM are clear: very fast access time (dozens of times faster than access to a Web site over a dial-up connection) and a data storage capacity hundreds of times bigger than that of the average Web page.

Another example:

A PDA plug-in (disposable) chip containing hundreds of advertisements or a "yellow pages". The consumer selects the ad or entry that she wants to see and connects to the Internet to view a relevant video. She could then also have an interactive chat or a conference with a salesperson, receive information about the company, about the ad, about the advertising agency which created the ad - and so on.

CD-ROM-based encyclopedias, such as the Britannica or the Encarta, already contain hyperlinks which carry the user to sites selected by an Editorial Board.

Note

CD-ROMs are probably a doomed medium. Storage capacity increases exponentially and, within a year, desktops with 80 GB hard disks will be a common sight. Moreover, the much-heralded Network Computer - the stripped-down version of the personal computer - will put at the disposal of the average user terabytes of storage capacity and the processing power of a supercomputer. What separates computer users from this utopia is the communication bandwidth. With the introduction of radio and satellite broadband services, DSL and ADSL, and cable modems, coupled with advanced compression standards - video on demand, audio and data will be available speedily and plentifully.

The CD-ROM, on the other hand, is not mobile. It requires installation and the utilization of sophisticated hardware and software. This is no user-friendly push technology. It is nerd-oriented. As a result, CD-ROMs are not an immediate medium. There is a long time lapse between the moment of purchase and the moment the user accesses the data. Compare this to a book or a magazine. Data in these oldest of media is instantly available to the user and they allow for easy and accurate "back" and "forward" functions.

Perhaps the biggest mistake of CD-ROM manufacturers has been their inability to offer an integrated hardware and software package. CD-ROMs are not compact. A Walkman is a compact hardware-cum-software package: it is easily transportable, it is thin, it contains numerous user-friendly, sophisticated functions, and it provides immediate access to data. So does the discman, or the MP3-man, or the new generation of e-books (e.g., E-Ink's). This cannot be said about the CD-ROM. By tying its future to the obsolete concept of the stand-alone, expensive, inefficient and technologically unreliable personal computer - the CD-ROM has sentenced itself to oblivion (with the possible exception of reference material).

D. Online Reference

A visit to the on-line Encyclopaedia Britannica demonstrates some of the tremendous, mind-boggling possibilities of online reference - as well as some of the obstacles.

Each entry in this mammoth work of reference is hyperlinked to relevant Web sites. The sites are carefully screened. Links are available to data in various forms, including audio and video. Everything can be copied to the hard disk or to a R/W CD.

This is a new conception of a knowledge centre - not just a heap of material. The content is modular and continuously enriched. It can be linked to a voice Q&A centre. Queries by subscribers can be answered by e-mail or by fax, posted on the site, or sent as hard copies by post. This "Trivial Pursuit" or "homework" service could be very popular - there is considerable appetite for "Just in Time Information". The Library of Congress - together with a few other libraries - is in the process of making just such a service available to the public (CDRS - the Collaborative Digital Reference Service).

E. Derivative Content

The Internet is an enormous reservoir of archives of freely accessible, or even public domain, information.

With a minimal investment, this information can be gathered into coherent, theme-oriented, cheap compilations on CD-ROMs, in print, as e-books or in other media.

F. E-Publishing

The Internet is by far the world's largest publishing platform. It incorporates FAQs (Q&As regarding almost every technical matter in the world), e-zines (electronic magazines), the electronic versions of print dailies and periodicals (in conjunction with on-line news and information services), reference material, e-books, monographs, articles, minutes of discussions ("threads"), conference proceedings, and much more besides.

The Internet offers major advantages to publishers. Consider the electronic version of a p-zine.

Publishing an e-zine promotes the sales of the printed edition, helps sign on subscribers, and leads to the sale of advertising space. The electronic archive function (see next section) saves the need to file back issues, the physical space required to do so, and the irritating search for data items.

The future trend is a combined subscription to both the electronic edition (mainly for the archival value and the ability to hyperlink to additional information) and the print one (easier for browsing the current issue). The Economist is already offering free access to its electronic archives as an inducement to its print subscribers.

The electronic daily presents other advantages:

It allows for immediate feedback and for flowing, almost real-time, communication between writers and readers. The electronic version, therefore, acquires a gyroscopic function: a navigation instrument, always indicating deviations from the "right" course. The content can be instantly updated and breaking news incorporated in older content.

Specialty hand-held devices already allow for downloading and storage of vast quantities of data (up to 4,000 print pages). The user gains access to libraries containing hundreds of texts, adapted to be downloaded, stored and read by the specific device. Again, a convergence of standards is to be expected in this field as well (the final contenders will probably be Adobe's PDF and Microsoft's MS Reader).

Currently, e-books are dichotomously treated either as:

a continuation of print books (p-books) by other means, or as a whole new publishing universe.

Since p-books are a more convenient medium than e-books - they will prevail in any straightforward "medium replacement" or "medium displacement" battle.

In other words, if publishers persist in the simple and straightforward conversion of p-books to e-books - then e-books are doomed. They are simply inferior and cannot offer the comfort, tactile delights, browseability and scanability of p-books.

But e-books - being digital - open up a vista of hitherto neglected possibilities. These will only be enhanced and enriched by the introduction of e-paper and e-ink. Among them:

  • Hyperlinks within the e-book and without it - to web content, reference works, etc.;
  • Embedded instant shopping and ordering links;
  • Divergent, user-interactive, decision driven plotlines;
  • Interaction with other e-books using a wireless standard - collaborative authoring or reading groups;
  • Interaction with other e-books - gaming and community activities;
  • Automatically or periodically updated content;
  • Multimedia;
  • Database, favourites, annotations, and history maintenance (archival records of reading habits, shopping habits, interaction with other readers, plot-related decisions and much more);
  • Automatic and embedded audio conversion and translation capabilities;
  • Full wireless piconetworking and scatternetworking capabilities.

The technology is still not fully there. Wars rage in both the wireless and the e-book realms. Platforms compete. Standards clash. Gurus debate. But convergence is inevitable and with it the e-book of the future.

G. The Archive Function

The Internet is also the world's biggest cemetery: tens of thousands of deadbeat sites, still accessible - the "Ghost Sites" of this electronic frontier.

This, in a way, is collective memory. One of the Internet's main functions will be to preserve and transfer knowledge through time. It is called "memory" in biology - and "archive" in library science. The history of the Internet is being documented by search engines (Google) and specialized services (Alexa) alike.

3. The Internet as a Collective Nervous System

Drawing a comparison from the development of a human infant - the human race has just commenced to develop its neural system.

The Internet fulfils all the functions of the nervous system in the body and is, both functionally and structurally, pretty similar. It is decentralized and redundant (each part can serve as functional backup in case of malfunction). It hosts information which is accessible through various paths, it contains a memory function, and it is multimodal (multimedia - textual, visual, audio and animation).

I believe that the comparison is not superficial and that studying the functions of the brain from infancy to adulthood is likely to shed light on the future of the Net itself. The Net - exactly like the nervous system - provides pathways for the transport of goods and services - but also of memes and information, their processing, modeling, and integration.

A. The Collective Computer

Carrying the metaphor of "a collective brain" further, we would expect the processing of information to take place on the Internet, rather than inside the end-user's hardware (the same way that information is processed in the brain, not in the eyes). Desktops will receive results and communicate with the Net to receive additional clarifications and instructions and to convey information gathered from their environment (mostly, from the user).

Put differently:

In future, servers will contain not only information (as they do today) - but also software applications. The user of an application will not be forced to buy it. He will not be driven into hardware-related expenditures to accommodate the ever-growing size of applications. He will not find himself wasting his scarce memory and computing resources on passive storage. Instead, he will use a browser to call a central computer. This computer will contain the needed software, broken down to its elements (applets - small applications). Anytime the user wishes to use one of the functions of the application, he will siphon it off the central computer. When finished - he will "return" it. Processing speeds and response times will be such that the user will not feel at all that he is not interacting with his own software (the question of ownership will be very blurred). This technology is available and it has provoked a heated debate about the future shape of the computing industry as a whole (desktops - really power packs - or network computers, little more than dumb terminals). Access to online applications is already offered to corporate users by ASPs (Application Service Providers).
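A minimal Python sketch of this client-server division of labour (both ends run locally here for demonstration; the server address and the "word_count" applet are hypothetical stand-ins, not any real ASP's API):

```python
# The "collective computer" idea in miniature: the application lives on
# a server; the user's machine only sends requests and displays results.

import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

def word_count(text):
    """The 'applet': processing happens server-side, not on the desktop."""
    return len(text.split())

# The central computer, hosting the applet.
server = SimpleXMLRPCServer(("localhost", 8000), logRequests=False)
server.register_function(word_count)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The thin client: no local application, just a call over the network.
proxy = ServerProxy("http://localhost:8000")
print(proxy.word_count("the network is the computer"))  # -> 5
```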

In the last few years, scientists have harnessed the combined power of online PCs to perform astounding feats of distributed parallel processing. Millions of PCs connected to the Net co-process signals from outer space and meteorological data, and solve complex equations. This is a prime example of a collective brain in action.
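A toy version of this work-unit model, sketched in Python (local processes stand in for volunteer PCs on the Net; the "signal" and the per-unit computation are invented for the example):

```python
# SETI@home-style distributed processing in miniature: a coordinator
# splits one big job into work units and farms them out for co-processing.

from multiprocessing import Pool

def process_unit(chunk):
    """Each 'volunteer PC' analyses one chunk of the signal."""
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    signal = list(range(1_000_000))
    # Split the signal into work units of 100,000 samples each.
    units = [signal[i:i + 100_000] for i in range(0, len(signal), 100_000)]
    with Pool() as pool:
        partials = pool.map(process_unit, units)
    print(sum(partials))  # the combined result from all work units
```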

B. The Intranet - a Logical Extension of the Collective Computer

LANs (Local Area Networks) are no longer a rarity in corporate offices. WANs (Wide Area Networks) are used to connect geographically dispersed organs of the same legal entity (branches of a bank, daughter companies of a conglomerate, a sales force). Many LANs and WANs are going wireless.

Wireless intranets/extranets and LANs are the wave of the future. They will gradually eliminate their fixed-line counterparts. The Internet offers equal, platform-independent, location-independent and time-of-day-independent access to corporate memory and nervous systems. Sophisticated firewall security applications protect the privacy and confidentiality of the intranet from all but the most determined and savvy crackers.

The intranet is an intra-organizational communication network, constructed on the platform of the Internet, and it therefore enjoys all of the latter's advantages. The extranet is open to clients and suppliers as well.

The company's server can be accessed by anyone authorized, from anywhere, at any time - with local, rather than international, communication costs. The user can leave messages (internal e-mail or v-mail), access information - proprietary or public - from it, and participate in "virtual teamwork" (see next chapter).

The development of measures to safeguard server-routed inter-organizational communication (firewalls) is the solution to one of two obstacles to the institutionalization of intranets. The second problem is the limited bandwidth which does not permit the efficient transfer of audio (not to mention video).

It is difficult to conduct video conferencing through the Internet. Even the voices of discussants who use Internet phones (IP telephony) come out slightly distorted.

All this did not prevent 95% of the Fortune 1000 from installing an intranet. 82% of the rest intend to install one by the end of this year. Medium-to-big-size American firms have 50-100 intranet terminals per every Internet one.

One of the greatest advantages of the intranet is the ability to transfer documents between the various parts of an organization. Consider Visa: it pushed 2 million documents per day internally in 1996.

An organization equipped with an intranet can - while protected by firewalls - give its clients or suppliers access to non-classified correspondence or inventory systems. Many B2B exchanges and industry-specific purchasing management systems are based on extranets.

C. The Transport of Information - Mail and Chat

The Internet - via its e-mail function - is eroding traditional mail. 90% of customers with on-line access use e-mail from time to time and 60% work with it regularly. More than 2 billion messages traverse the Internet daily.

E-mail applications are available as freeware and are included in all browsers. Thus, the Internet has completely assimilated what used to be a separate service, to the extent that many people make the mistake of thinking that e-mail is a feature of the Internet.

The Internet will do to phone calls what it has done to mail. Already there are applications (Intel's, Vocaltec's, Net2Phone's) which enable the user to conduct a phone conversation through his computer. The voice quality has improved. The discussants can cut into each other's words, argue and listen to tonal nuances. Today, the parties (two or more) engaging in the conversation must possess the same software and the same computer hardware. In the very near future, computer-to-regular-phone applications will eliminate this requirement. And, again, simultaneous multi-modality: the user can talk over the phone, see his party, send e-mail, receive messages and transfer documents - without obstructing the flow of the conversation.

The cost of transferring voice will become so negligible that free voice traffic is conceivable in 3-5 years. Data traffic will overtake voice traffic by a wide margin.

The next phase will probably involve virtual reality. Each of the parties will be represented by an "avatar", a 3-D figurine generated by the application (or the user's likeness mapped and superimposed on the avatar). These figurines will be multi-dimensional: they will possess their own communication patterns, special habits, history, preferences - in short: their own "personality".

Thus, they will be able to maintain an "identity" and a consistent pattern of communication which they will develop over time.

Such a figure could host a site, accept, welcome and guide visitors, all the time bearing their preferences in its electronic "mind". It could narrate the news, like the digital anchor "Ananova" does. Visiting sites in the future is bound to be a much more pleasant affair.

D. The Transport of Value - E-cash

In 1996, four corporate giants (Visa, MasterCard, Netscape and Microsoft) agreed on a standard for effecting secure payments through the Internet: SET. Internet commerce is supposed to mushroom to $25 billion by 2003. Site owners will be able to collect rent from passing visitors - or fees for services provided within the site. Amazon instituted an honour system to collect donations from visitors. PayPal provides millions of users with cash substitutes. Gradually, the Internet will compete with central banks and banking systems in money creation and transfer.

E. The Transport of Interactions - The Virtual Organization

The Internet allows for simultaneous communication and the efficient transfer of multimedia (video included) files between an unlimited number of users. This opens up a vista of mind-boggling opportunities which are the real core of the Internet revolution: the virtual, collaborative, "Follow the Sun" modes.

Examples:

A group of musicians is able to compose music or play it - while spatially and temporally separated;

Advertising agencies are able to co-produce ad campaigns in a real time interaction;

Cinema and TV films are produced from disparate geographical spots through the teamwork of people who never meet, except through the Net.

These examples illustrate the concept of the "virtual community". Space and time will no longer hinder team collaboration, be it scientific, artistic, cultural, or an ad hoc arrangement for the provision of a service (a virtual law firm, or accounting office, or a virtual consultancy network). The intranet can also be thought of as a "virtual organization", or a "virtual business".

The virtual mall and the virtual catalogue are prime examples of spatial and temporal liberation.

In 1998, there were well over 300 active virtual malls on the Internet. In 2000, 46 million shoppers frequented them in search of goods and services.

The virtual mall is an Internet "space" (pages) wherein "shops" are located. These shops offer their wares using visual, audio and textual means. The visitor passes through a virtual "gate" or storefront and examines the merchandise on offer, until he reaches a buying decision. Then he engages in a feedback process: he pays with a credit card, buys the product, and waits for it to arrive by mail (or downloads it).

The manufacturers of digital products (intellectual property such as e-books or software) have begun selling their merchandise on-line, as file downloads. Yet, slow communications speeds, competing file formats and reader standards, and limited bandwidth constrain the growth potential of this mode of sale. Once these are resolved - intellectual property will be sold directly from the Net, on-line. Until such time, the mediation of the Post Office is still required. As long as this is the state of the art, the virtual mall is nothing but a glorified computerized mail catalogue or Buying Channel, the only difference being the exceptionally varied inventory.

Websites which started as "specialty stores" are fast transforming themselves into multi-purpose virtual malls. Amazon.com, for instance, has bought into a virtual pharmacy and into other virtual businesses. It is now selling music, video, electronics and many other products. It started as a bookstore.

This contrasts with a much more creative idea: the virtual catalogue. It is a form of narrowcasting (as opposed to broadcasting): a surgically accurate targeting of potential consumer audiences. Each group of profiled consumers (no matter how small) is fitted with its own - digitally generated - catalogue. This is updated daily: the variety of wares on offer is adjusted to reflect inventory levels, consumer preferences, and goods in transit - and prices (sales, discounts, package deals) change in real time. Amazon has incorporated many of these features on its web site. The user enters the web site and there delineates his consumption profile and his preferences. A customized catalogue is immediately generated for him, including specific recommendations. The history of his purchases, preferences and responses to feedback questionnaires is accumulated in a database. This intellectual property may well be Amazon's main asset.
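A minimal sketch of such a digitally generated, per-customer catalogue in Python (the products, the profile fields, and the filtering rule are all hypothetical, not Amazon's actual method):

```python
# A per-customer catalogue in miniature: the shopper's profile filters
# the inventory in real time, so each consumer sees a different list.

inventory = [
    {"title": "Jazz Anthology", "category": "music", "price": 15},
    {"title": "Sci-Fi Classics", "category": "books", "price": 9},
    {"title": "Digital Camera", "category": "electronics", "price": 199},
]

profile = {"preferred_categories": {"books", "music"}, "max_price": 50}

def personal_catalogue(inventory, profile):
    """Return only the items matching this consumer's stated preferences."""
    return [
        item for item in inventory
        if item["category"] in profile["preferred_categories"]
        and item["price"] <= profile["max_price"]
    ]

for item in personal_catalogue(inventory, profile):
    print(item["title"], "-", item["price"])
```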

There are no technological obstacles to implementing this vision today - only administrative and legal (patent) ones. Big brick-and-mortar retail stores are not up to processing the flood of data expected to result. They also remain highly sceptical regarding the feasibility of the new medium. And privacy issues prevent data mining, or the effective collection and usage of personal data (remember the case of Amazon's "Readers' Circles").

The virtual catalogue is a private case of a new internet off-shoot: the "smart shopping agents". These are AI applications with "long memories".

They draw detailed profiles of consumers and users and then suggest purchases and refer to the appropriate sites, catalogues, or virtual malls.

They also provide price comparisons, and the new generation cannot be blocked or fooled by the use of differing product categories.

In the future, these agents will also cover brick-and-mortar retail chains and, in conjunction with wireless, location-specific services, will issue a map of the branch or store closest to an address specified by the user (the default being his residence), or yielded by his GPS-enabled wireless mobile or PDA. This technology can be seen in action in a few music sites on the web and is likely to be dominant with wireless Internet appliances. The owner of an Internet-enabled third-generation mobile phone is likely to be the target of geographically-specific marketing campaigns, ads and special offers pertaining to his current location (as reported by his GPS - satellite-based Global Positioning System).
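A sketch of the nearest-branch lookup in Python (the store locations and the user's coordinates are invented; the distance calculation is the standard haversine formula):

```python
# A location-aware shopping agent in miniature: given the user's GPS
# position, find the nearest branch of a retail chain.

from math import radians, sin, cos, asin, sqrt

stores = {
    "Downtown branch": (40.7128, -74.0060),
    "Uptown branch": (40.8610, -73.9340),
}

def haversine_km(a, b):
    """Great-circle distance between two (lat, lon) points, in km."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 + \
        cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))

user_position = (40.7580, -73.9855)  # e.g., reported by the GPS-enabled phone
nearest = min(stores, key=lambda name: haversine_km(user_position, stores[name]))
print(nearest)  # -> "Downtown branch"
```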

F. The Transport of Information - Internet News

Internet news enjoys clear advantages. It is frequently and dynamically updated (unlike static print news) and is always accessible (similar to print news), immediate and fresh.

The future will witness a form of interactive news. A special "corner" in the news Web site will accommodate "breaking news" posted by members of the public (or corporate press releases). This will provide readers with a glimpse into the making of the news, the raw material news is made of. The same technology will be applied to interactive TVs. Content will be downloaded from the Internet and displayed as an overlay on the TV screen or in a window within it. The contents downloaded will be directly connected to the TV programming. Thus, the biography and track record of a football player will be displayed during a football match, and the history of a country when it gets news coverage.

4. Terra Internetica - Internet, an Unknown Continent

Laymen and experts alike talk about "sites" and "advertising space". Yet, the Internet was never compared to a new continent whose surface is infinite.

The Internet has its own real estate developers and construction companies. Their real-life equivalents derive their profits from the scarcity of the resource that they exploit - the Internet counterparts derive their profits from the tenants (content producers and distributors, e-tailers, and others).

Entrepreneurs bought "Internet space" (pages, domain names, portals) and leveraged their acquisition commercially by:

  • Renting space out;
  • Constructing infrastructure on their property and selling it;
  • Providing an intelligent gateway, or entry point (portal), to the rest of the internet;
  • Selling advertising space which subsidizes the tenants (Yahoo!-Geocities, Tripod, and others);
  • Cybersquatting (purchasing specific domain names identical to brand names in the "real" world and then selling the domain name to an interested party).

Internet space can be easily purchased or created. The investment is low and getting lower with the introduction of competition in the field of domain registration services and the increase in the number of top-level domains.

Then, infrastructure can be erected - for a shopping mall, for free home pages, for a portal, or for another purpose. It is precisely this infrastructure that the developer can later sell, lease, franchise, or rent out.

But this real estate bubble was the culmination of a long and tortuous process.

At the beginning, only members of the fringes and the avant-garde (inventors, risk-assuming entrepreneurs, gamblers) invest in a new invention. No one can say what the optimal uses of the invention are (in other words, what its future is). Many - mostly members of the scientific and business elites - argue that there is no real need for the invention and that it substitutes a new and untried way for old and tried modes of doing the same things (so why assume the risk of investing in the unknown and the untried?).

Moreover, these criticisms are usually well-founded.

To start with, there is, indeed, no need for the new medium. A new medium invents itself - and the need for it. It also generates its own market to satisfy this newly found need.

Two prime examples of this self-recursive process are the personal computer and the compact disc.

When the PC was invented, its uses were completely unclear. Its performance was lacking, its abilities limited, and it was unbearably user-unfriendly. It suffered from faulty design, lacked any user comfort and ease of use, and required considerable professional knowledge to operate. The worst part was that this knowledge was exclusive to the new invention (not portable). It reduced labour mobility and limited one's professional horizons. There were many gripes among workers assigned to tame the new beast. Managers regarded it at best as a nuisance.

The PC was thought of, at the beginning, as a sophisticated gaming machine, an electronic baby-sitter. It included a keyboard, so it was thought of in terms of a glorified typewriter or spreadsheet. It was used mainly as a word processor and the outlay was justified solely on these grounds. The spreadsheet was the first real PC application and it demonstrated the advantages inherent to this new machine (mainly flexibility and speed). Still, it was more of the same: a speedier slide rule. After all, said the unconvinced, what was the difference between this and a hand-held calculator (some of which already had computing, memory and programming features)?

The PC was recognized as a medium only 30 years after it was invented, with the introduction of multimedia software. All this time, the computer continued to spin off markets and secondary markets, needs and professional specialties. The talk (as always) was centred on how to improve on existing markets and solutions.

The Internet is the computer's first important application. Hitherto the computer was only quantitatively different from other computing or gaming devices. Multimedia and the Internet have made it qualitatively superior, sui generis, unique.

Part of the problem was that the Internet was invented, is maintained and is operated by computer professionals. For decades these people have been conditioned to think in Olympic terms: faster, stronger, higher - not in terms of the new, the unprecedented, or the non-existent. Engineers are trained to improve - seldom to invent. With few exceptions, its creators stumbled across the Internet - it invented itself despite them.

Computer professionals (hardware and software experts alike) are linear thinkers. The Internet is non-linear and modular.

It is still the age of hackers. There is still a lot to be done in improving technological prowess and powers. But their control of the contents is waning and they are being gradually replaced by communicators, creative people, advertising executives, psychologists, venture capitalists, and the totally unpredictable masses who flock to flaunt their home pages and graphomania.

These all are attuned to the user, his mental needs and his information and entertainment preferences.

The compact disc is a different tale. It was intentionally invented to improve upon an existing technology (basically, Edison's gramophone). Market-wise, this was a major gamble. The improvement was, at first, debatable (many said that the sound quality of the first generation of compact discs was inferior to that of its contemporaneous record players). Consumers had to be convinced to change both software and hardware and to dish out thousands of dollars just to listen to what the manufacturers claimed was a more authentically reproduced sound. A better argument was the longer life of the software (though, when contrasted with the limited life expectancy of the consumer, some of the first sales pitches sounded absolutely morbid).

The computer suffered from unclear positioning. The compact disc was very clear as to its main functions - but had a rough time convincing the consumers that it was needed.

Every medium is first controlled by the technical people. Gutenberg was a printer - not a publisher. Yet, he is the world's most famous publisher. The technical cadre is joined by dubious or small-scale entrepreneurs and, together, they establish ventures with no clear vision, market-oriented thinking, or orderly plan of action. The legislator is also dumbfounded and does not grasp what is happening - thus, there is no legislation to regulate the use of the medium. Witness the initial confusion concerning copyrighted vs. licenced software, e-books, and the copyrights of ROM-embedded software. Abuse or under-utilization of resources grows. The sale of radio frequencies to the first cellular phone operators in the West - a situation which repeats itself in Eastern and Central Europe nowadays - is an example.

But then more complex transactions - exactly as in real estate in "real life" - begin to emerge. The Internet is likely to converge with "real life". It is likely to be dominated by brick-and-mortar entities, which are likely to import their business methods and management. As its eccentric past (the dot.com boom and the dot.bomb bust) recedes, a sustainable and profitable future awaits it.


