More than 80% of Internet users enter the Web through search engines, and among them the vast majority use Google. In less than ten years of existence, this search engine has made itself indispensable through the relevance of the answers it brings to queries. But the search for information, like access to the diversity of Web and blog content, is already showing its limits, and Google has thrown itself into an unrestrained diversification of services to secure its growth. The very conditions of navigating the Internet will then have to be reconsidered.
Where linguistic engines indexed and searched keywords, as indexes of content, Google, as is well known, strongly modulates this indexing with PageRank: the audience or notoriety of a site, measured by the number of links pointing to it. In Google's view, as in the scientometric models of citation ("Who cites whom? Who is cited by whom?"), the quality of content is postulated to be a function of the interest it arouses. In less than ten years, Google has made this algorithm the basis of an economic power without equal (nearly 100 billion dollars of market capitalization), enabling it to extend its services to geographic knowledge (Google Earth), and thus potentially to every coupling with geolocation (GPS...), to the content of digitized books and libraries (Google Book Search), to user-produced video (YouTube), to blogs (Google Blog Search), to mail and private correspondence (Gmail), to the press (Google News), and so on.
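The link-counting logic described above can be made concrete. The following is a minimal sketch of PageRank's power iteration, with an invented three-page graph and the damping factor commonly cited for the original formulation; it is an illustration of the principle, not Google's production algorithm.

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = set(links) | {t for targets in links.values() for t in targets}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start from a uniform distribution
    for _ in range(iterations):
        # every page gets a small baseline, plus shares from its in-links
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page in pages:
            targets = links.get(page, [])
            if not targets:  # dangling page: spread its mass uniformly
                targets = list(pages)
            share = damping * rank[page] / len(targets)
            for t in targets:
                new_rank[t] += share
        rank = new_rank
    return rank

# Three pages: A and C both link to B, so B accumulates the most "notoriety".
ranks = pagerank({"A": ["B"], "C": ["B"], "B": ["A"]})
assert ranks["B"] > ranks["A"] > ranks["C"]
```

The point the essay makes is visible even in this toy graph: B's score measures only how much it is pointed at, and says nothing about what B actually contains.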
All this is well known. What is less well known is that this economic power, which at bottom harvests notoriety, the doxa, the swelling opinion surrounding any piece of information, rests on an advertising model far more sophisticated than anything any medium had achieved before. As John Battelle puts it, the indexing Google performs is not only that of the sites and pages visited and accumulated by the robots of an immense computer network. It is also, and first of all, the constitution of "the database of our intentions". Each of our billions of monthly clicks in search of information, knowledge, leisure, or commercial and financial opportunities is identified and indexed. It is this "database" of our searches, needs and desires of every kind that is automatically "sold" to advertisers, who in return target commercial offers ever closer to the social or cultural demands of Internet users.
That a great Franco-German project, Quaero, is on the way to competing with Google can only be welcomed. The linguistic and statistical skills of European researchers are no doubt as good as those of Larry Page and Sergey Brin, the creators of Google, in 1997. After all, they drew heavily on the experience of Louis Monier, misunderstood in France, who left to create AltaVista in the United States in the nineties. French companies such as Exalead are themselves heirs of that search-engine adventure. But the question is obviously no longer posed on scientific or technological ground alone. Google's economic and social critical mass obliges us to consider other paths, to rethink the terms of reference.
On the one hand, in ten years the Web has developed considerably in the direction of the socialization that was expected of it. On the other hand, this socialization has developed within the virtuality that has characterized computer-mediated communication from the start.
First, Google is indeed approaching the limits of what makes its strength. The model of "democratic notoriety" at the bottom of its algorithm, relevance correlated with audience, is a model on the way to being vitiated. The notoriety of a piece of information, the number of links pointing to it, can only be an indication of its shared value, not of its intrinsic value. To be widely cited (and to cite widely!) in many fields of rationality may as easily be the sign of a passing fashion as of a substantive interest. Admittedly, a much-cited scientific article can be presumed more relevant than a little-cited one. But at the margins the reverse is just as true: a genuinely innovative article, widely misunderstood, will be neglected for a time, as were, in their day, Sadi Carnot's thermodynamics or Mendel's genetics.
Questions of “trust”…
Obviously, the Internet is something other than a scientific publication space, and sex, games, business and rumour reign there massively. But precisely: if every aesthetic or cultural creation is marginalized and crushed there, whatever libertarian justifications are advanced, if quality information is levelled down by free papers and free access, it is the sign that Google's algorithm manages to be nothing more than a thermometer of common opinion. For a mass and entertainment market, that is quite sufficient. For founding a cognitive and heuristic hegemony (for orienting oneself in the expansion of the network), it no longer is. The Mountain View company has seen this clearly, having just registered the concept of "TrustRank" (rank of trust), which is meant to supplement PageRank in classifying, filtering and ranking knowledge. This indexing would amount to the capacity to attach indices of confidence (of verification, of authority, of authentication) to the indices of notoriety already in use. One suspects that it is as much for economic and financial reasons as for cognitive or ethical ones that this need for "trust" is henceforth essential in the search for information.
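One plausible reading of the TrustRank idea, following the published TrustRank research literature rather than anything Google has disclosed, is a propagation of trust from a small, hand-verified seed set along hyperlinks. The sketch below uses an invented graph and seed; it shows how a heavily linked page can still receive almost no trust if no trusted page reaches it.

```python
def trustrank(links, seeds, damping=0.85, iterations=50):
    """links: page -> list of pages it links to; seeds: hand-verified pages."""
    pages = set(links) | {t for ts in links.values() for t in ts}
    # trust is injected only at the seeds, not uniformly as in PageRank
    seed_score = {p: (1.0 / len(seeds) if p in seeds else 0.0) for p in pages}
    trust = dict(seed_score)
    for _ in range(iterations):
        new_trust = {p: (1.0 - damping) * seed_score[p] for p in pages}
        for page, targets in links.items():
            for t in targets:
                new_trust[t] += damping * trust[page] / len(targets)
        trust = new_trust
    return trust

# "portal" is a hand-verified seed; "spam" has two inbound links, but none
# from the trusted neighbourhood, so its trust stays at zero.
links = {"portal": ["article"], "article": ["portal"],
         "farm1": ["spam"], "farm2": ["spam"], "spam": ["farm1"]}
scores = trustrank(links, seeds={"portal"})
assert scores["article"] > scores["spam"]
```

Unlike raw notoriety, trust here does not spring up everywhere: it can only flow outward from pages someone has actually verified, which is exactly why assigning it at Web scale is so costly.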
On the other hand, it is certainly neither simple nor feasible to assign such "indices of confidence". It presupposes classifying reference sites and contents, an encyclopaedic or ontological approach rather foreign to Google's culture. But the chief difficulty is that the Internet is not a static library where preserved documents wait to be read. It is a universe that at every moment deforms itself, grows, pushes out rhizomes, erases, cuts, adds, splicing and binding images, texts, data and computations in every direction.
In a way, Google runs up against what proves to be the principal characteristic of the Internet: information swells precisely where it is sensitive, where it is debated, culturally and politically. Where does global warming come from? Who killed Habyarimana on April 6, 1994 in Rwanda? How does avian flu spread? What are the causes and effects of September 11? And so on. In that sense, the "multiverse" that is the Internet echoes the enigmas and controversies of the century, even if one can also find there the recipe for tarte Tatin, or buy a car.
Which is to say that on this point, recourse to "trusted third parties" (academic institutions, governments, major media, etc.) can only be a partial answer: neither their function nor their power enables them to settle or resolve such abundant controversies. This intrinsic limit of Google, as Gaston Bachelard had already intuited, is to be sought on the side of the very meaning of what we call "searching", an activity that ranges from looking up a train timetable to the ultimate metaphysical question: "When one does not know what one is looking for, one does not understand what one finds."
Moreover, the Internet has rediscovered (in what we affect to call Web 2.0) that the indexing which governs the effectiveness of information retrieval could also, and more judiciously, be sought from Internet users themselves, as prescribers or appraisers of information. A free marking or labelling of any piece of information contextualizes it more surely than any semantic or logical categorization of an encyclopaedic nature. The world is not a hierarchically arranged directory; but if I can identify, through fairly effective chains of markers, the communities that index this or that piece of knowledge, this social positioning will be of rare qualitative effectiveness. There is then no need for a closed thesaurus of intangible "authority" terms to mark knowledge. A "cloud" of tags and marks suffices to locate the social "folksonomies", ontologies that never cease forming and deforming across the network according to communities of interest. The blogosphere, with its capacity to channel flows of information, to "syndicate" networks of content, to pool themes, is the very type of this heightened, dynamic socialization that only the Internet is able to bring forth.
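The folksonomy mechanism is simple enough to sketch. With invented users, resources and tags, free tagging yields both a "tag cloud" per resource and, in the other direction, the community of interest gathered around a tag; no closed thesaurus is involved.

```python
from collections import Counter, defaultdict

# (user, resource, tag) triples: free, uncontrolled labelling
taggings = [
    ("alice", "carnot1824.pdf", "thermodynamics"),
    ("bob",   "carnot1824.pdf", "thermodynamics"),
    ("bob",   "carnot1824.pdf", "history"),
    ("carol", "mendel1866.pdf", "genetics"),
]

cloud = defaultdict(Counter)   # resource -> tag frequencies (the "tag cloud")
community = defaultdict(set)   # tag -> users who applied it
for user, resource, tag in taggings:
    cloud[resource][tag] += 1
    community[tag].add(user)

# The most frequent tag contextualizes the resource...
assert cloud["carnot1824.pdf"].most_common(1)[0][0] == "thermodynamics"
# ...and, conversely, the tag locates its community of interest.
assert community["thermodynamics"] == {"alice", "bob"}
```

The ontology here is nothing but the aggregate of individual acts of tagging, which is why it can deform continuously as communities of interest shift.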
There is every reason to bet, however, that these logics, promising as they are, will not get beyond the creation of partial, "regional ontologies", and will bring only a limited answer to the search for "trust" as the key to future search. The Semantic Web, promoted by Tim Berners-Lee, runs into another kind of constraint, of a logical nature. The Internet, as an automaton, needs to compose the units of information located on the Web, for example within causal chains, so as to enrich information, to produce it in its complexity. In fact, as long as one navigates a space that is thematically and ontologically homogeneous (medicine, say, or viticulture), nothing prevents logical operators from extracting published content of various origins and logically recomposing it in response to a given request.
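That kind of composition within a homogeneous domain can be sketched with (subject, predicate, object) triples in the Semantic Web style. The facts, sources and the chaining rule below are invented for illustration; the point is that two statements from different origins combine logically to answer a request neither could answer alone.

```python
# Facts as (subject, predicate, object) triples, from two hypothetical sources
triples = {
    ("aspirin", "treats", "inflammation"),        # from source A
    ("inflammation", "symptom_of", "arthritis"),  # from source B
}

def relevant_to(condition):
    """Chain 'treats' and 'symptom_of' to find drugs relevant to a condition."""
    symptoms = {s for (s, p, o) in triples
                if p == "symptom_of" and o == condition}
    return {s for (s, p, o) in triples
            if p == "treats" and o in symptoms}

# Neither source mentions aspirin and arthritis together; the chain does.
assert relevant_to("arthritis") == {"aspirin"}
```

The composition works because the domain is homogeneous: both sources use the same vocabulary. Across heterogeneous regions of the Web, no such shared ontology can be presumed, which is the constraint the paragraph above describes.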
Is a crossing of the Semantic Web, in the logical sense, with social marking in terms of "folksonomy" possible? It is one of the open fields of current research in computing and documentation.
For a virtual engine!
But it is in a very different and much more promising direction that the future of search engines can be glimpsed. Paradoxically, it is not towards a rational, logical reduction (and at bottom an ethic of rigour) that we should move, but rather towards an amplification of what constitutes the very specificity of the Internet: its virtuality. In reality, the Internet involves an industrial production and a division of the labour of content creation of unequalled anarchic richness. Why not start from what not only permits but promotes this virtual creation, a priori outside the field of the "verifiable" (if only through the anonymity of avatars)? One remembers Steiner's 1993 cartoon: "On the Internet, nobody knows you're a dog." Potentially, from the origin of the Web, each and every one of us can become a creator, or at least a transformer, of content, so that the "informational reality" of the material sources is now completely exceeded, subjugated, by the virtual profusion of content. Games like Second Life, through the ambiguity they maintain between reality and fiction, the second amplifying the possibilities of the first, show the way. The sought-after "trust" has meaning only when confronted with virtual universes, rightly called "persistent universes", the only reference broad and stable enough to gradually evaluate and perennialize information. Precisely because networked game universes are not necessarily bound to time and space (teleportation is possible), because they institute a "play of the world" where everything seems possible, they make it possible, almost by excess where Google fails by default, to imagine the search engines of the future.
The home page of tomorrow's Web will be the entrance to a game like Second Life. In reality, "the database of our intentions" will then "logically" include our dreams as well, at the limit our desires for knowledge, while mobilizing "virtual communities" to reach, step by step, the very object of the request. "Reality" will be a subset of a broader, "virtual" universe, where possibilities apparently forbidden or constrained in "current" reality will open up, as so many social trials of what it might become. Controversy will no longer be locked up in what makes it enigmatic; it will unfold into as many hypotheses as this new Web can produce, starting from content itself produced by each virtual resident. Google's little search box will become a true initiatory voyage... where it will also be possible to find the recipe for tarte Tatin, or to buy a car.
More seriously, the virtualization of information should indeed be a response to the lack of "trust" that currently weakens access to and navigation of the Internet: it is by being confronted with the various "possible worlds" that a piece of information carries that its value, cognitive, aesthetic, ethical, and therefore its trust value, can be restored to it. A restrictive, authoritarian, academic, "verificationist" vision is surely no longer fitting. Do we really need to "verify" information, knowledge and facts by a vain reduction to reality, or rather to "virtualize" them, to learn from them what this "reality" is pregnant with? Come on, Messrs Schmidt, Page and Brin, one more effort! After PageRank and TrustRank, here comes DreamRank! After buying YouTube for 1.65 billion dollars, how much (in Linden Dollars perhaps) are you prepared to invest in buying Second Life?
Yannick Maignien
Aix-en-Provence, 11 December 2006
Bravo! I admire this line of thought. It's fun to parse a translation of a French rant now and then. Regards.
Posted by: Rob Baker | 15 December 2006 at 16:20