RoboEarth: Wikipedia for Our A.I. Friends
A group of European scientists is embarking on a milestone project known as RoboEarth, which will serve as a shared database and communications network for future standardized robots with learning capabilities. Envisioned as something akin to the Internet and Wikipedia for Artificial Intelligence-powered machines, the RoboEarth project aims to establish a common platform through which all robotic systems can benefit from the collective experience and memory shared among all robots.
According to the BBC's brief interview with Dr. Waibel, a Swiss scientist assigned to the four-year multinational project, "it would be a place that would teach robots about the objects that fill the human world and their relationships to each other." For example, the RoboEarth database could help any robot work out the sequence of actions required when it is asked to set the dinner table, as well as the objects needed to complete that task. With about 35 researchers participating in this EU-funded project, the initial development of a demo version is expected to span the next four years.
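To make the table-setting example concrete, here is a minimal sketch of what a shared "action recipe" might look like. The task name, object list, and step format are illustrative assumptions for this article; RoboEarth's actual recipe representation is not specified here.

```python
# A hedged sketch of a shared "action recipe" a robot might download.
# All names and the step format are illustrative assumptions, not the
# actual RoboEarth data format.

set_dinner_table = {
    "task": "set_dinner_table",
    "required_objects": ["plate", "fork", "knife", "glass"],
    "steps": [
        ("locate", "table"),
        ("fetch", "plate"), ("place", "plate", "center"),
        ("fetch", "fork"),  ("place", "fork", "left_of_plate"),
        ("fetch", "knife"), ("place", "knife", "right_of_plate"),
        ("fetch", "glass"), ("place", "glass", "upper_right"),
    ],
}

# A robot that has never performed this task could fetch the recipe and
# translate each abstract step into its own motion primitives.
for step in set_dinner_table["steps"]:
    print(step)
```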
How It Works
RoboEarth will include everything needed to close the loop from robot to RoboEarth and back to robot. The RoboEarth database, built in the style of the World Wide Web, will be implemented on a server with both Internet and intranet access. It stores the information required for object recognition (e.g., images, object models), navigation (e.g., maps, world models), and tasks (e.g., action recipes, manipulation strategies), and it hosts intelligent services (e.g., image annotation, offline learning).
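The robot-to-RoboEarth-to-robot loop can be illustrated with a small sketch, assuming a simple key-value server whose categories mirror the ones listed above (object recognition, navigation, tasks). The class name, method signatures, and sample data are illustrative guesses, not the real RoboEarth interface.

```python
# A minimal sketch of the robot -> RoboEarth -> robot loop.
# RoboEarthServer and its upload/download methods are assumptions made
# for illustration; only the three data categories come from the article.

class RoboEarthServer:
    """Stands in for the Web-style database with shared read/write access."""

    def __init__(self):
        self.store = {"object_models": {}, "maps": {}, "action_recipes": {}}

    def upload(self, category, key, value):
        # New experience from any one robot becomes shared knowledge.
        self.store[category][key] = value

    def download(self, category, key):
        return self.store[category].get(key)


server = RoboEarthServer()

# Robot A maps a kitchen and shares the resulting world model.
server.upload("maps", "kitchen_01",
              {"table": (2.0, 1.5), "cupboard": (0.5, 3.0)})

# Robot B, which has never visited that kitchen, reuses the map to navigate.
kitchen = server.download("maps", "kitchen_01")
print("navigating to table at", kitchen["table"])
```

The design point is the same one the paragraph makes: knowledge written once by any robot is immediately readable by every other robot, which is what closes the loop.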
Susan Blackmore's Third Replicator
So how is this relevant to our fascination with memes, you might ask. For a start, a robot-exclusive communications network would align closely with the concept of the "third replicator" proposed by writer and psychologist Dr. Susan Blackmore. Back in August 2010, Dr. Blackmore wrote an op-ed column in the New York Times describing a still-hypothetical scenario in which machines would purposefully generate, copy and share information amongst themselves (information she coined "temes," as in "technological memes"), much as we humans do through the medium of memes.
Computers handle vast quantities of information with extraordinarily high-fidelity copying and storage. Most variation and selection is still done by human beings, with their biologically evolved desires for stimulation, amusement, communication, sex and food. But this is changing. Already there are examples of computer programs recombining old texts to create new essays or poems, translating texts to create new versions, and selecting between vast quantities of text, images and data. Above all there are search engines. Each request to Google, Alta Vista or Yahoo! elicits a new set of pages -- a new combination of items selected by that search engine according to its own clever algorithms and depending on myriad previous searches and link structures.
This is a radically new kind of copying, varying and selecting, and means that a new evolutionary process is starting up. This copying is quite different from the way cells copy strands of DNA or humans copy memes. The information itself is also different, consisting of highly stable digital information stored and processed by machines rather than living cells. This, I submit, signals the emergence of temes and teme machines, the third replicator.
For more information on Dr. Blackmore's theory, check out her New York Times op-ed column, "The Third Replicator," and her video lecture on memetics from the 2008 TED Conference.