Image: An iRat meets a real rat. Credit: Minh McCloy.
On Tuesday June 11, at The Edge, State Library of Queensland, Janet Wiles, Professor of Complex and Intelligent Systems at The University of Queensland, speaks about how a system of memory and communication can be built from scratch, and developed through use.
Professor Wiles reveals ‘Lingodroids’.
She opens with a video of two lunchbox-sized robots on wheels, covered in antennae-like structures.
These particular Lingodroids are called iRats: language-learning robots emitting and receiving signals that sound like telephone button-presses: beep-boop-boop, boop-boop-beep-beep.
“What you just heard,” Professor Wiles explains, “was a conversation between two robots with words that they have invented themselves.”
They greet one another and name their location – spatial awareness is at the heart of this vocabulary.
The system was inspired by the “place cell”, a neuron in the rat brain that fires whenever the animal returns to a known location.
When two iRats are let loose in an environment, they create and internalise a map of their local area.
The goal is to let the pair encounter one another and have a conversation about the space based on their experience of it.
This project raises questions not only about how the robots “conceive” space, and their actions within that space, but also about how humans understand and use the concept.
“Why are they exploring space and time?” asks Professor Wiles. “Because they are the foundations of cognition.”
An interesting problem that arose in the course of the research is how to allow iRats to develop a vocabulary for meeting at unknown places at a particular time.
To do this, researchers break down the various cognitive components required to understand distance, direction and chronology.
Professor Wiles describes the robots as playing different types of conversational games: where-are-we games, meet-at games, go-to games and how-far games.
The first three require the iRats to remember a location; the final game requires a robot to link these locations, developing new words for the distance between them.
Directions, again, can be developed relationally between two known places.
Now, the robots can meet at a place they’ve not been before and name it, improving their vocabulary and, consequently, increasing their capabilities.
This achieved, these games might be extended indefinitely.
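The naming games described above can be sketched in miniature. The snippet below is a hypothetical illustration of a “where-are-we” game, not the Lingodroids implementation: two agents share a small grid, and when they meet at a cell, the speaker coins a word for it if none exists and the hearer adopts it. The syllable inventory, function names and grid are all invented for illustration.

```python
import random

SYLLABLES = ["ku", "zo", "pi", "re", "ja", "fo"]

def coin_word(rng):
    """Invent a new two-syllable word, much as the iRats invent beep sequences."""
    return "".join(rng.choice(SYLLABLES) for _ in range(2))

def play_where_are_we(speaker_lex, hearer_lex, location, rng):
    """One round: the speaker names the shared location; the hearer adopts the word."""
    if location not in speaker_lex:
        speaker_lex[location] = coin_word(rng)
    hearer_lex[location] = speaker_lex[location]

rng = random.Random(1)
lex_a, lex_b = {}, {}
for _ in range(20):
    loc = (rng.randrange(3), rng.randrange(3))   # meeting cell on a 3x3 map
    speaker, hearer = rng.sample([lex_a, lex_b], 2)
    play_where_are_we(speaker, hearer, loc, rng)

shared = {loc for loc in lex_a if lex_b.get(loc) == lex_a[loc]}
print(f"agreed words for {len(shared)} locations")
```

Because the hearer always adopts the speaker's word and coined words never change, the two lexicons converge on every location they have discussed, which is the sense in which a shared vocabulary emerges purely through use.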
iRats understand time through experience: they can detect light, and estimate the time of day from the intensity of sunlight and whether its level is rising or falling.
So an iRat can propose to its buddy that they rendezvous at sunset, at a location they’ve never visited.
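The light-based clock described here is simple to illustrate. The sketch below is a hypothetical classifier, with made-up threshold and names: intensity gives a coarse position within the day, and the trend (rising or falling) disambiguates morning from evening, so “dim and falling” reads as dusk.

```python
def time_of_day(light_now, light_earlier, dusk_level=0.2):
    """Classify the moment from sunlight intensity plus its trend.

    light_now / light_earlier are normalised brightness readings (0..1);
    the dusk_level threshold is an illustrative assumption.
    """
    rising = light_now > light_earlier
    if light_now < dusk_level:
        return "dawn" if rising else "dusk"
    return "morning" if rising else "afternoon"

print(time_of_day(0.1, 0.15))  # dim and falling -> "dusk"
```

A rendezvous “at sunset” then needs no shared clock at all: each robot waits until its own classifier reports dusk and heads for the agreed place.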
Professor Wiles is excited about the possibilities; perhaps one day these robots could learn any of the globe’s 6000 languages.
“It’s not just an academic exercise. Every child on the planet has that ability, why not our robots?”
The talk wraps up with Professor Wiles pointing out a revelation that comes from language work with robots: the need for “social skills”.
While the iRats have identical brains, experiments are being conducted into how robots with different cognitive structures could converse.
Professor Wiles highlights two vital social skills if this is to take place: timing and attention.
“The robots know they’re in conversation together because of the relative timing, and we know we’re in conversation because in fractions of a second we detect differences in who we’re speaking to.
“They need to pay attention to the same topic, whether that’s an event, space, time or distance.
“If you take a child with a new toy and they hear a new word, they’ll tend to associate that word with the toy they’re paying attention to; that’s what our robots are doing.”
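The attention rule in that last quote reduces to a very small piece of logic. This is an illustrative sketch only, with hypothetical names: an unfamiliar word heard while attending to something becomes bound to that thing, exactly as the child binds the new word to the new toy.

```python
def learn_word(lexicon, heard_word, attended_object):
    """Bind an unfamiliar word to whatever the learner is currently attending to."""
    if heard_word not in lexicon:
        lexicon[heard_word] = attended_object
    return lexicon[heard_word]

lexicon = {}
learn_word(lexicon, "blicket", "new_toy")   # novel word heard while holding the toy
print(lexicon["blicket"])                   # -> new_toy
```

Shared attention does the real work here: the rule only gives sensible meanings if both parties are attending to the same event, place, time or distance when the word is uttered.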
Lingodroids offer an extraordinary insight into the complexity of the many interacting biological systems comprising a human being.
To replicate even such a small element of social communication is a triumph, and a thrilling reminder of how often the complexity of human biology is taken for granted.
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License.