Here is, once again, a quotation from one of my favorite online writers: Gerry McGovern (two of his articles are in issues 13 and 14 of this newsletter).
On August 24 he published a brilliant article on The Myth of the Individual. For decades we've been reading and discussing the end of mass production, mass markets and mass culture, and the growth of diversity and individual values. But we are still living in a desperately repetitive and uniform culture (or so it is pictured, and driven, by "mass media" on a global scale).
This is what Gerry McGovern says:
Like Gerry McGovern, I am not anti-American. There are many things in America that I admire, like and enjoy. But there are also things that I wish we wouldn't imitate so clumsily. And I am uncomfortable with the pseudo-Americans in this part of the world, including those "anti-Americans" who, with a can of Coke in one hand and a Big Mac in the other, criticize America without knowing what they are talking about, badly copying, or poorly translating, something that Americans have been saying about their own country.
But let's get back to the subject: individuality. There are two playing fields. In one, people and companies in this part of the world are likely to be the losers. Our comparatively lightweight, and somewhat clumsy, dinosaurs have a slim chance of prevailing in the fields dominated by the multinational Tyrannosaurs (of which not many are based in Europe, and very few in a country like mine).
Just like the Americans, we're busy merging and concentrating. Only "big", we think, can survive. But how big is big? We pay lip service to the notion that our economy is based on millions of small companies and individual enterprises. Yet attention is concentrated on the big companies and the big deals. We model our society on imitation and flattening standards. We discourage individuality and imagination. That stands in the way of our ability to compete in the global economy.
Is the internet an opportunity for small players? I think so. But the big conglomerates (government bodies as well as private corporations) want to be in control.
I think diversity and individuality are good for everyone around the world. For countries like mine, which don't have strong enough dinosaurs, they are a necessity.
I've been wondering why the highest use of the internet is in Finland. I've heard several explanations; the most obvious is that in a cold climate it's harder to meet people outside. Maybe... but that's not a good enough reason, as there are cold places, such as Russia, with low internet penetration.
Low population density is a better reason: we see high net activity in Canada, Australia, Scandinavia, etc. But there are also densely populated countries with heavy use of the internet, such as the Netherlands.
I think there are other reasons. Here are some observations by a friend of mine, David Casacuberta, who has just returned to Spain from one of his many visits to Finland.
Unfortunately I don't really know Finland. I was there only once, for a few days and only in Helsinki. The (not many) Finns I know fit David's description; and that's why I like them.
Other observations on the same subject come from Ireland. This is what Sorcha Ni hEilidhe says in an article published by NUA on August 12:
So the reason why the internet is so widely used in Finland is the way people there think, feel and behave. Their technical superiority is a consequence of their human needs (not vice versa).
There's a moral, I think, to this example. When we want to understand something about the net (or the use of any technology) we invariably find that human behavior is the key factor, with a bit of "political" support, such as a government that tries to cater for people's needs and refrains from gagging, censoring or otherwise controlling communication.
I've been repeating ad nauseam why I don't think "counting net users" is very important; why most "headcount" calculations aren't very reliable; and why data for different countries (or from different sources) aren't comparable. However... here's another way of trying to figure out "how many online".
I've always thought that there could not be a direct relation between hostcount and the number of people online. Maybe I'm wrong. Several people think that there is a fairly constant factor of four or five "users" per host. That's a bit strange, because then there would be 95 to 120 million users in the United States, while the people who tried to count them came up with figures between 30 and 70 million. But it seems to work better in the rest of the world.
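The arithmetic behind this "factor of four or five" can be sketched in a few lines. The hostcount figure below is purely illustrative (it is not taken from any survey quoted here); only the multiplication itself reflects the method described above.

```python
# Rough "users from hosts" estimate, assuming 4 to 5 users per host.
def estimate_users(hosts, low=4, high=5):
    """Return the (low, high) range of estimated users for a given hostcount."""
    return hosts * low, hosts * high

us_hosts = 24_000_000  # hypothetical hostcount, for illustration only
low, high = estimate_users(us_hosts)
print(low, high)  # 96000000 120000000
```

With a hostcount of that order of magnitude, the factor yields roughly the 95-120 million range mentioned above, which is the source of the discrepancy with direct headcounts.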
Of course not all .com, .net or .org domains are in the US. But if we analyze the data we find that such pseudo-American hosts can't be more than 2 or 3 percent, so that doesn't change the picture in any relevant way.
With all these disclaimers... let's take a look at what happens with such a calculation. Here are figures for the 29 countries that in the latest worldwide survey had more than 50,000 internet hosts (except the US), and for broad areas (not including the US and Canada). There are two figures for European countries; the first is based on the worldwide hostcount, the second on the data for the Europe-Mediterranean area (which are a bit different and more up to date).
This may be a bit whimsical but the results are surprisingly close to figures worked out in other ways. And in this case the figures are comparable, because hostcount criteria are basically the same worldwide.
The legal fight between Microsoft and US authorities is going on and on... and I don't find the details very interesting, except when they point to problems that go far beyond the case of any individual company.
Nathan Newman of NetAction, in an article published on August 20, explains that Microsoft was ordered by U.S. District Court Judge Thomas Penfield Jackson to turn over the source code of its Windows operating system to government lawyers, so they could determine whether the company has been using internal structures of the code to illegally expand its monopoly. Of course Microsoft complained quite loudly.
This, points out the article, highlights why a secret operating system is so incompatible with both legal and innovation needs in the new economy.
I leave it to legal experts to get into the legal issues. But I think this is an opportunity to focus on a very relevant problem.
The general trends of the New Economy were well defined in an article by Kevin Kelly that I quoted last year; and it's quite clear that open solutions are very important for the improvement of technology and especially for the internet.
Nathan Newman says:
The general issue of shared knowledge versus intellectual property goes far beyond any specific technology. But it's especially relevant in information technology, and even more so in open networks. This isn't just a matter of innovation, but of the basic efficiency and compatibility that we need, and aren't getting, with proprietary operating systems.
It's hard to understand if, how and when this problem can be solved. There could be a combination of factors: intelligent initiatives by competitors, choices made by large institutions, maybe the law. So far we see only tiny glimpses of light in what looks like a huge legal war of giants but is only a detail in the general development of the tools we need to communicate freely and effectively. We can only hope that some of the tiny cracks will grow large enough to break through the wall that stands between us and the solutions we need.
Here are some resources for more information on this subject, published in a special issue of Web Review and quoted by Nathan Newman in his article:
There are also two articles that I quoted last year: The Cathedral and the Bazaar by Eric Raymond and a report from the O'Reilly Perl Conference, Information Wants to be Valuable. More on this subject on the NetAction site.
Readers of this newsletter know that I am quite uncomfortable with the glut of so-called technological "innovation" that serves no practical purpose and often makes things unnecessarily difficult.
More from Ireland... an article published on August 24 by Antóin O Lachtáin, In Praise of Lo-Tech, explains the situation very clearly.
Two basic facts are confirmed here. Innovation works when it serves a relevant and useful purpose; technology for its own sake does more harm than good. And simple, open, compatible solutions work much better than any over-complex, or in any case "closed", systems.
I am becoming more and more impatient with many of the words that have become fashionable when discussing new technology and electronic communication; especially those with a confused and often misleading meaning, such as "virtual" or "multimedia" (or anything starting with "cyber").
But there is a word that, though it sounds a bit funny, has a precise and important meaning: hypertext.
Many people seem to think that it means mixing pictures (and maybe sounds and animations) with written text. Of course it can do that as well. But its most important quality is the way it allows us to organize information.
Expert readers, please forgive me if I say things that, for them, are obvious; or if I am not very precise on the technical side. I am not trying to explain the technology. I am simply trying to define a few basic concepts that, I think, are important for people who are not involved in the technical execution of a website or any other hypertext-based solution.
The notion of hypertext is over twenty years older than the technique most widely used today (HTML, HyperText Markup Language, the backbone of the World Wide Web) and can survive any change of technology. There is already a new and more flexible system: XML (Extensible Markup Language), which users can adapt to their specific needs, as some scientific communities are doing. MathML (Mathematical Markup Language) makes it easy to display equations without converting them to images; Chemical Markup Language enables browsers to display the chemical structure of a molecule from a text describing the compound; MusicML allows compositions to be stored as text but displayed as sheet music.
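The idea of an adaptable markup vocabulary can be shown with a toy example. The tags below are invented for illustration (this is not real MathML or any standard): the markup is plain text describing structure, and it is up to software to decide how to render it.

```python
import xml.etree.ElementTree as ET

# A made-up markup vocabulary, in the spirit of MathML: the tags say
# what each piece *is*, and a program decides how to display it.
doc = "<equation><var>x</var><op>+</op><num>2</num></equation>"
root = ET.fromstring(doc)

# A trivial "renderer": join the text of the parts in order.
rendered = "".join(child.text for child in root)
print(rendered)  # x+2
```

A real MathML renderer would of course do far more (layout, symbols, fractions), but the principle is the same: structure stored as text, presentation produced by software.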
One simple fact may be interesting for non-technical people. What we read as text is gibberish for a computer; while what software sees as "language" is instructions spelled out in "alphanumeric" characters (like text) that we don't see when we read something with a browser or any word processor.
What's really important, for anyone reading or offering material to be read, is the structure of information; often called (quite rightly) the architecture of a site (or a cd-rom, or a file system, or whatever).
Technically, this isn't quite as simple as it seems. But the really difficult, and most important, task is the organization of content. A "hypertext" structure allows a "potentially infinite" depth of information, which can be placed in a complex "hierarchy" on n levels. In addition, links can work across the system to reach related subjects; and in the case of a website, "outgoing" links can connect to anything anywhere on the net.
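A minimal sketch of such a structure, with invented page names: "children" form the hierarchy on n levels, while "links" cut across it to related subjects, exactly as described above.

```python
# Hypothetical site map: hierarchy plus cross-links (all names invented).
pages = {
    "home":     {"children": ["products", "about"], "links": []},
    "products": {"children": ["widget"],            "links": ["about"]},
    "about":    {"children": [],                    "links": []},
    "widget":   {"children": [],                    "links": ["about"]},
}

def depth(page, level=0):
    """Depth of the hierarchy below a page (cross-links are not followed)."""
    kids = pages[page]["children"]
    return level if not kids else max(depth(k, level + 1) for k in kids)

print(depth("home"))  # 2
```

Nothing limits the number of levels; the hierarchy can grow as deep as the content requires, which is precisely why the organization of content, not the technique, is the hard part.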
The problem is: how can we provide the greatest possible amount of information with the easiest possible access? The value of a website (or anything else with a hypertext structure) increases with its complexity (i.e. lots of information and links) but its usefulness is based on how simple and easy it appears to readers.
Putting the two together (depth of content and ease of access) is quite difficult; also because it needs to be done from the point of view of the reader, not of the content provider. But it's crucial: this is where hypertext is superior to any other possible form of communication. The quality of service offered to readers is one of the key success factors for an online site.