Content suppliers are the underprivileged sector of the Internet. They all lose money (even sites that offer basic, standardized goods – books, CDs), except sites proffering sex or tourism. No user seems grateful for the effort and resources invested in creating and distributing content. The recent breakdown of traditional roles (between publisher and author, record company and singer, etc.) and the direct access the creative artist gains to his or her paying public may change this attitude of ingratitude – but, so far, there are scarce signs of that.

Moreover, the author faces a stark choice: either the quality of presentation that only a publisher can afford – or ownership of the content, coupled with its (often shoddy) self-dissemination. A quality, fully commerce-enabled site costs up to 5,000,000 USD, excluding site maintenance and customer and visitor services. Despite these heavy outlays, site designers are constantly criticized for either a lack of creativity or an excess of imagination. More and more is asked of content purveyors and creators. They are exploited by intermediaries, free riders, and other parasites. This is all an off-shoot of the ethos of the Internet as a free-content area.

Content

Most users like surfing the net (browsing and visiting sites) without any particular reason or goal. This makes it difficult to apply traditional marketing techniques to the web.

What do “targeted audiences” or “market shares” mean in this context? If a surfer visits sites that deal with aberrant sex and nuclear physics in the same session – what is one to make of it?

Moreover, the public and legislative backlash against the gathering of surfer data by Internet ad agencies and other websites has led to growing ignorance regarding Internet users’ profiles, demography, habits, preferences, and dislikes. “Free” is a keyword on the Internet: the Net used to belong to the US government and to many universities. Users like information, with an emphasis on news and data about new products. But they do not like to shop on the net – yet. Only 38% of all surfers made a purchase online in 1998. It would seem that users will not pay for content unless it is unavailable elsewhere, qualitatively rare, or made “rare.” One way to make content “rare” is to review and rate it.

1. Quality-Rated Content

There is a long-term trend of clutter-breaking website ratings and critiques. It may have a limited influence on the consumption decisions of some users and on their willingness to pay for content. Browsers already sport “What’s New” and “What’s Hot” buttons. Most search engines and directories recommend specific sites. But users are still cautious. Studies discovered that users, no matter how heavy, have consistently revisited no more than 200 sites – a minuscule number. Recommendation services often produce random – at times, wrong – selections for their users.

There are also concerns regarding privacy. The backlash against Amazon’s “readers circles” is an example. Web critics, who today work mainly for the printed press, publish their wares on the net and collaborate with intelligent software that hyperlinks to websites, recommends them, and refers users to them. Some web critics (guides) have become identified with specific applications – expert systems – that incorporate their knowledge and experience. Most volunteer-based directories (such as the “Open Directory” and the late “Go” directory) work this way.

The flip side of content consumption is content creation, marketing, distribution, and maintenance.

2. The Money

Where is the capital needed to finance content likely to come from?

Again, there are two schools:

According to the first, sites will be financed through advertising – and so will search engines and other applications accessed by users. Certain ASPs (Application Service Providers, which rent out access to application software on their servers) are considering this model. The recent collapse in online advertising and click-through rates has raised serious doubts regarding the validity and viability of this model. Marketing guru Seth Godin has declared “interruption marketing” (ads and banners) dead.

The second approach is simpler and allows for the existence of noncommercial content. It proposes to collect negligible sums (cents or fractions of cents) from every user for every visit (“micro-payments”). These accumulated cents will enable site owners to update and maintain their sites and encourage entrepreneurs to develop and invest in new content. Certain content aggregators (especially of digital textbooks) have adopted this model (Questia, Fathom).
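To make the micro-payment arithmetic concrete, here is a minimal sketch; the per-visit fee and the traffic figures are illustrative assumptions, not numbers quoted by any actual service:

```python
# Illustrative micro-payment arithmetic: tiny per-visit fees
# multiplied by heavy traffic. All figures below are hypothetical.

FEE_PER_VISIT = 0.002   # 0.2 of a cent, in USD (assumed)

def monthly_revenue(visits_per_day: int, days: int = 30) -> float:
    """Accumulated micro-payment revenue over one month."""
    return visits_per_day * days * FEE_PER_VISIT

if __name__ == "__main__":
    for traffic in (10_000, 100_000, 1_000_000):
        print(f"{traffic:>9,} visits/day -> "
              f"${monthly_revenue(traffic):,.2f}/month")
    # 1,000,000 visits/day at 0.2 cent each yields $60,000 a month --
    # the kind of sum the model counts on to maintain and update a site.
```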

The adherents of the first school point to the 5 million USD invested in Internet advertising in 1995 and to the 60 million or so invested in 1996. Its opponents point at the very same numbers: ridiculously small when contrasted with more conventional advertising models. The potential of advertising on the net is limited to 1.5 billion USD annually in 1998, thundered the pessimists. The actual figure was double that prediction, but still woefully small and inadequate to support the Internet’s content development. Compare these figures to the sale of Internet software (4 billion), Internet hardware (3 billion), and Internet access provision (4.2 billion in 1995 alone!).

Even if online advertising were restored to its erstwhile glory, other bottlenecks would remain. Advertising encourages the consumer to interact and to initiate the delivery of a product to him. This – the delivery phase – is a slow and enervating epilogue to the exciting affair of ordering online. Too many consumers still complain of late delivery and of wrong or defective products.

The solution may lie in the integration of advertising and content. The late PointCast, for instance, integrated advertising into its news broadcasts, which were streamed continuously to the user’s screen even when it was inactive (it operated an active screen saver and ticker using “push technology”). Downloading digital music, video, and text (e-books) leads to the immediate gratification of consumers and increases the efficacy of advertising.

Whatever the case, a uniform, agreed-upon rating system as a basis for charging advertisers is sorely needed. There is also the question of what the advertiser pays for. The rates charged to many advertisers (e.g., Procter and Gamble) are based not on the number of hits or impressions (entries, visits to a site) – but on how often their advertisement was viewed (page views) or clicked through.
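For illustration, the difference between the two billing bases can be put in a few lines; the rates and counts below are assumptions, not quoted figures:

```python
# Two bases for charging advertisers, as described above: by how often
# the advertisement was viewed (priced per thousand views) versus by
# how often it was clicked through. All figures are hypothetical.

def cost_by_views(ad_views: int, rate_per_thousand: float) -> float:
    """Charge by advertisement views (impression-based billing)."""
    return ad_views / 1000 * rate_per_thousand

def cost_by_clicks(clicks: int, rate_per_click: float) -> float:
    """Charge only for actual click-throughs."""
    return clicks * rate_per_click

if __name__ == "__main__":
    ad_views, clicks = 500_000, 2_500   # a 0.5% click-through rate
    print(f"view-based billing:  ${cost_by_views(ad_views, 2.00):,.2f}")
    print(f"click-based billing: ${cost_by_clicks(clicks, 0.40):,.2f}")
    # Either way, the advertiser pays for measurable engagement with
    # the advertisement itself -- not for raw visits to the site.
```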

Finally, there is the paid subscription model – a flop, to judge by the meager number of venerable, leading newspapers whose sites charge a subscription: Dow Jones (The Wall Street Journal) and The Economist. Only two. All this is not very promising. But one should never forget that the Internet is probably the closest thing we have to an efficient market. As consumers refuse to pay for content, investment will dry up and content will become scarce (through the closure of websites). As scarcity sets in, consumers may reconsider.

This article deals with the future of the Internet as a medium. Will it be able to support its content creation and distribution operations economically?

If the Internet is a budding medium, we should derive great benefit from studying the history of its predecessors.

The Future History of the Internet as a Medium

The Internet is the latest in a series of networks that revolutionized our lives. A century before the Internet, the telegraph, the railways, the radio, and the telephone were similarly heralded as “global” and transforming. Every medium of communication goes through the same evolutionary cycle:

Anarchy – The Public Phase

At this stage, the medium and the resources attached to it are very cheap and accessible, and are under no regulatory constraints. The public sector includes higher education institutions, religious institutions, government, not-for-profit organizations, nongovernmental organizations (NGOs), trade unions, etc. Bedeviled by limited financial resources, they regard the new medium as a cost-effective way of disseminating their messages.

The Internet was not exempt from this phase, which ended only a few years ago. It started with complete computer anarchy manifested in ad hoc networks, local networks, and networks of organizations (mainly universities and organs of the government, such as DARPA, a part of the USA’s defense establishment). Noncommercial entities jumped on the bandwagon and began sewing these networks together (an activity fully subsidized by government funds). The result was a globe-encompassing network of academic institutions. The American Pentagon established the network of all networks, the ARPANET. Other government departments joined the fray, headed by the National Science Foundation (NSF), which withdrew from the Internet only lately.

The Internet (then under a different name) became semi-public property – with access granted to the chosen few. Radio took precisely this course. Radio transmissions started in the USA in 1920. Those were anarchic broadcasts with no discernible regularity. Noncommercial and not-for-profit organizations began their own broadcasts and even created radio broadcasting infrastructure (albeit of the cheap and local kind) dedicated to their audiences. Trade unions, certain educational institutions, and religious groups commenced “public radio” broadcasts.

The Commercial Phase

When the users (e.g., listeners on the radio or owners of PCs and modems on the Internet) reach a critical mass – the business sector is alerted. In the name of capitalist ideology (another religion, really), it demands the “privatization” of the medium. This demand harps on susceptible strings in every Western soul: the efficient allocation of resources that results from competition; the corruption and inefficiency intuitively associated with the public sector (“Other People’s Money” – OPM); the ulterior motives of members of the ruling political echelons (the infamous American paranoia); the lack of variety and of catering to the tastes and interests of certain audiences; and the automatic equation of private enterprise with democracy. Together, these lead to the privatization of the young medium.

The result is the same: the private sector takes over the medium from “below” (makes offers to the owners or operators of the medium that they cannot possibly refuse) – or from “above” (successful lobbying in the corridors of power leads to the appropriate legislation and the medium is “privatized”). Every privatization – especially that of a medium – provokes public opposition.

There are (usually well-founded) suspicions that the interests of the public are compromised and sacrificed on the altar of commercialization and ratings. Fears of the monopolization and cartelization of the medium are evoked – and proven correct in due course – as are fears of the concentration of control of the medium in a few hands. All these things happen, but the pace is so slow that the initial fears are forgotten and public attention reverts to fresher issues.

A new Communications Act was enacted in the USA in 1934. It was meant to transform radio frequencies into a national resource to be sold to the private sector, which would use it to transmit radio signals to receivers. In other words, radio was passed on to private and commercial hands. Public radio was doomed to marginalization. Similarly, the American administration withdrew from its last major involvement in the Internet in April 1995, when the NSF ceased to finance some of the networks. Thus it privatized its hitherto heavy involvement in the net.

A new Telecommunications Act was legislated in 1996. It permitted “organized anarchy”: media operators were allowed to invade each other’s territories. Phone companies were allowed to transmit video, and cable companies were allowed to provide telephony. This was all phased in over a long period – still, it was a revolution whose magnitude is difficult to gauge and whose consequences defy the imagination.

It carried an equally momentous price tag – official censorship. “Voluntary censorship,” to be sure, policed by somewhat toothless standardization and enforcement authorities, to be sure – still, censorship with its own institutions to boot. The private sector reacted by threatening litigation – but beneath the surface it is caving in to pressure and temptation, constructing its own censorship codes both in cable and on the Internet.

Institutionalization

This phase is the next in the Internet’s history, though few realize it. It is characterized by enhanced legislative activity. Legislators at all levels discover the medium and lurch at it passionately. Suddenly, “free” resources are transformed into “national treasures not to be dispensed with cheaply, casually, and with frivolity.”

It is conceivable that certain parts of the Internet will be “nationalized” (for instance, in the form of a licensing requirement) and tendered to the private sector. Legislation will deal with permitted and disallowed content (obscenity? incitement? racial or gender bias?). No medium in the USA (not to mention the wider world) has escaped such legislation. There are sure to be demands to allocate time (or space, or software, or content, or hardware) to “minorities,” to “public affairs,” to “community business.” This is a tax that the business sector will have to pay to fend off the eager legislator and his nuisance value.

All this is bound to lead to a monopolization of hosts and servers. The important broadcast channels will diminish in number and be subjected to severe content restrictions. Sites that refuse to succumb to these requirements will be deleted or neutralized. Even as we write, content guidelines (a euphemism for censorship) exist in all major content providers (CompuServe, AOL, Yahoo!-Geocities, Tripod, Prodigy).

The Bloodbath

This is the phase of consolidation. The number of players is severely reduced. The number of browser types will settle at 2-3 (Netscape, Microsoft, and Opera?). Networks will merge to form privately owned mega-networks. Servers will combine to form hyper-servers run on supercomputers in “server farms.” The number of ISPs will be considerably cut. Fifty companies ruled most of the media markets in the USA in 1983. By 1995 their number was 18. By the end of the century they will number 6.

This is the stage when companies – fighting for financial survival – strive to acquire as many users/listeners/viewers as possible. The programming is geared to the lowest (and widest) common denominator. Shallow programming dominates as long as the bloodbath proceeds.

From Rags to Riches

Tough competition produces four processes:

1. A Major Drop in Hardware Prices

This happens in every medium, but it applies doubly to a computer-dependent medium like the Internet. Computer technology seems to abide by “Moore’s Law,” which says that the number of transistors that can be put on a chip doubles every 18 months. As a result of this miniaturization, computing power quadruples every 18 months, and an exponential series ensues. Organic-biological-DNA computers, quantum computers, and chaos computers – prompted by vast profits and spawned by inventive genius – will ensure the continued applicability of Moore’s Law.

The Internet is also subject to “Metcalfe’s Law”: when we connect N computers to a network, its utility (and effective processing power) grows as N to the second power. And these N computers grow more powerful every year, in accordance with Moore’s Law. The growth of a network’s computing power is thus the product of the effects of the two laws: more and more computers, each with ever-increasing computing power, connect and create a sixteenfold growth in the network’s computing power every 18 months.
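A minimal sketch of this compounding argument, using the essay’s own loose figures as assumptions (per-machine power quadrupling every 18 months, and the number of connected machines doubling, so the Metcalfe N² term quadruples):

```python
# Reconstruction of the essay's arithmetic. Assumptions (the essay's,
# not measured figures): per-machine computing power quadruples every
# 18 months (its reading of Moore's Law), and the number of connected
# machines doubles, so the Metcalfe N**2 term quadruples as well.

MOORE_GAIN = 4    # per-machine power gain per 18-month period (assumed)
NODE_GROWTH = 2   # growth in connected machines per period (assumed)

def network_power_factor(periods: int) -> int:
    """Growth factor of total network computing power after a number
    of 18-month periods, under the assumptions above."""
    per_machine = MOORE_GAIN ** periods            # Moore's Law term
    metcalfe = (NODE_GROWTH ** periods) ** 2       # Metcalfe N**2 term
    return per_machine * metcalfe

if __name__ == "__main__":
    for p in (1, 2, 3):
        print(f"after {p * 18} months: x{network_power_factor(p)}")
    # after 18 months: x16 -- the essay's "sixteenfold growth".
```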

2. Content-Related Fees

Free content was prevalent on the Net until recently. Even potentially commercial software can still be downloaded for free. In many countries, television viewers still pay for television broadcasts – but in the USA and many other countries in the West, the basic package of television channels comes free of charge.

As users/consumers form a habit of using (or consuming) the software – it is commercialized and begins to carry a price tag. This happened with the advent of cable television: content was sold for subscription or per-usage (Pay Per View – PPV) fees.

Gradually, this is what will happen to most of the sites and software on the Net. Those who survive will begin to collect usage, access, subscription, downloading, and other appropriately named fees. These fees are bound to be low – but the principle counts. Even a few cents per transaction may accumulate to hefty sums with the traffic that characterizes some websites on the Net (or, at least, its more popular locales).

3. Increased User Friendliness

As long as the computer is less user-friendly and less reliable (predictable) than television – less of a “black box” – its potential (and its future) is limited. Television attracts 3.5 billion users daily. The Internet stands to attract – under the most exuberant scenario – less than one-tenth that number. The main reasons for this disparity are (the lack of) user-friendliness and reliability. Even browsers, among the most user-friendly applications ever, are insufficient. The user must still know how to use a keyboard and must possess some basic acquaintance with an operating system. The more mature the medium becomes, the friendlier it will become. Eventually, interaction with it will be conducted in speech or common language, with room for user “hunches” and built-in flexible responses.

4. Social Taxes

Sooner or later, the business sector must mollify the god of public opinion with offerings of a political and social nature. The Internet is an affluent, educated, yuppie medium. It requires literacy and numeracy, a lifelong interest in information and its various uses (scientific, commercial, or other), and a lot of resources (free time, money to invest in hardware, software, and connect time). It empowers its users and deepens the divide between the haves and the have-nots, the developed and the developing world, the knowledgeable and the ignorant, the computer literate and the computer illiterate.

In short, the Internet is an elitist medium. Publicly, this is an unhealthy posture. “Internet phobia” is already discernible. People (and politicians) talk about how unsafe the Internet is and about its possible uses for racist, sexist, and pornographic purposes. The wider public is in a state of awe.

So, site builders and owners will do well to improve their image: provide free access to schools and community centers, bankroll Internet literacy classes, distribute content and software freely to educational institutions, and collaborate with researchers, social scientists, and engineers. In short, they should encourage the view that the Internet is a medium catering to the needs of the community and the underprivileged – a mostly altruistic endeavor. This also makes good business sense, by educating and conditioning future users.

He who visited a site free of charge as a student will pay to do so when he becomes an executive. Such a user will also pass the information on, within and outside his organization. This is called media exposure. Undoubtedly, the future will witness public Internet terminals, subsidized ISP accounts, free Internet classes, and an alternative “noncommercial, public” approach to the Net. This may prove to be one more source of revenue for content creators and distributors.

Sam Vaknin is the author of “Malignant Self Love – Narcissism Revisited” and “After the Rain – How the West Lost the East.” He is a columnist for “Central Europe Review,” United Press International (UPI), and ebookweb.org, and the editor of the mental health and Central East Europe categories in The Open Directory, Suite101, and searcheurope.com. Until recently, he served as Economic Advisor to the Government of Macedonia.