Researchers teach AI to think like a dog and find out what they know about the world

What can artificial intelligence learn from dogs? A lot, say researchers at the University of Washington and the Allen Institute for AI. They recently trained neural networks to interpret and predict canine behavior. Their results, they say, show that animals could provide a new source of training data for AI systems, including those used to control robots.

To train AI to think like a dog, the researchers first needed data. They collected this in the form of video and motion information captured from a single dog, a Malamute called Kelp. A total of 380 short videos were taken from a GoPro camera mounted on the dog's head, along with movement data from sensors on its legs and body. Essentially, Kelp was being recorded in the same way Hollywood uses motion capture to record actors who play CGI creations. But instead of Andy Serkis giving life to Gollum, they were capturing a dog going about its daily life: walking, playing, fetching, and going to the park.
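To give a concrete sense of what pairing the two data streams might involve, here is a rough sketch of aligning each video frame with the sensor reading closest to it in time. The frame rate, data layout, and function names are assumptions made for illustration; the paper's actual preprocessing is not described here.

# Sketch: align GoPro frames with body/limb sensor readings by timestamp.
# Assumes imu_readings is a list of (timestamp_seconds, joint_values) pairs
# sorted by time, and that the video runs at a fixed frame rate; both are
# hypothetical details, not taken from the paper.
import bisect

FRAME_RATE = 30.0  # assumed GoPro frame rate (frames per second)

def align_frames_to_imu(num_frames, imu_readings):
    """For each video frame, pick the sensor reading closest in time."""
    imu_times = [t for t, _ in imu_readings]
    aligned = []
    for frame_idx in range(num_frames):
        frame_time = frame_idx / FRAME_RATE
        # Find the insertion point, then compare neighbours for the nearest reading.
        pos = bisect.bisect_left(imu_times, frame_time)
        candidates = [i for i in (pos - 1, pos) if 0 <= i < len(imu_times)]
        nearest = min(candidates, key=lambda i: abs(imu_times[i] - frame_time))
        aligned.append((frame_idx, imu_readings[nearest][1]))
    return aligned

# Example: 5 frames paired with 3 fake sensor readings.
fake_imu = [(0.00, [0.1, 0.2]), (0.07, [0.15, 0.25]), (0.15, [0.2, 0.3])]
print(align_frames_to_imu(5, fake_imu))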




An image of Kelp the Malamute, showing the view from the GoPro mounted on his head and sensor data from his limbs.

With this information in hand, the researchers analyzed Kelp's behavior using deep learning, an AI technique that can be used to pick out patterns from data. In this case, that meant linking the movement data from Kelp's limbs and the visual data from the GoPro with various dog activities. The resulting neural network, trained on this information, could predict what a dog would do in certain situations. If it saw someone throwing a ball, for example, it would predict that a dog's reaction would be to turn around and chase it.
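As a rough illustration of the kind of model this implies, the sketch below (in PyTorch) encodes a short window of frames and predicts a discretized movement for each joint at the next time step. The layer sizes, number of joints, and movement classes are invented for the example and are not the paper's actual architecture.

# Minimal sketch of an "act like a dog" predictor: given a short window of
# video frames, predict the dog's next set of joint movements.
# All dimensions below are assumptions chosen for illustration.
import torch
import torch.nn as nn

NUM_JOINTS = 6            # assumed number of tracked joints / limb sensors
NUM_MOVEMENT_CLASSES = 8  # assumed discretization of each joint's movement

class DogActPredictor(nn.Module):
    def __init__(self):
        super().__init__()
        # Per-frame image encoder (a small CNN stand-in for a real backbone).
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Temporal model over the encoded frame sequence.
        self.rnn = nn.LSTM(input_size=32, hidden_size=64, batch_first=True)
        # One movement-class prediction per joint for the next time step.
        self.head = nn.Linear(64, NUM_JOINTS * NUM_MOVEMENT_CLASSES)

    def forward(self, frames):  # frames: (batch, time, 3, height, width)
        b, t = frames.shape[:2]
        feats = self.encoder(frames.flatten(0, 1)).view(b, t, -1)
        _, (hidden, _) = self.rnn(feats)
        logits = self.head(hidden[-1])
        return logits.view(b, NUM_JOINTS, NUM_MOVEMENT_CLASSES)

# Example forward pass on a fake batch: 2 clips of 5 frames at 64x64.
model = DogActPredictor()
fake_clips = torch.randn(2, 5, 3, 64, 64)
print(model(fake_clips).shape)  # torch.Size([2, 6, 8])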

Speaking to The Verge, the paper's lead author, Kiana Ehsani, explained that the system's predictive capacity was quite accurate, but only in short bursts. In other words, if the video shows a set of stairs, it can guess that the dog will climb them. But beyond that, life is too varied to predict. "Whether or not the dog will see a toy or object that it wants to chase, who knows," says Ehsani, a doctoral student at the University of Washington.

However, what is really clever is what the researchers did next. Taking the neural network trained on the dog's behavior, they wanted to see whether it had learned anything about the world that they had not explicitly programmed. As the paper explains, dogs "clearly demonstrate visual intelligence, recognizing food, obstacles, other humans and animals," so would a neural network trained to act like a dog show the same cunning?

It turns out that it does, although only in a very limited capacity. The researchers applied two tests to the neural network, asking it to identify different scenes (for example, indoors, outdoors, on stairs, on a balcony) and "walkable surfaces" (which are exactly what they sound like: places where you can walk). In both cases, the neural network was able to complete these tasks with decent accuracy using only the basic data it had about a dog's movements and whereabouts.
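One common way to run this kind of test is to freeze the features learned on the dog data and train only a small classifier on top of them. The sketch below takes that approach for the scene-identification task; the encoder, layer sizes, and training details are placeholders rather than the paper's actual setup.

# Sketch of the "did it learn anything else?" probe: keep the features learned
# from the dog data fixed and train only a small linear classifier on top.
# The tiny encoder here is a stand-in for the pretrained dog-behavior network.
import torch
import torch.nn as nn

SCENE_CLASSES = ["indoors", "outdoors", "stairs", "balcony"]  # from the article

pretrained_encoder = nn.Sequential(
    nn.Conv2d(3, 32, kernel_size=5, stride=4, padding=2), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
)
for param in pretrained_encoder.parameters():
    param.requires_grad = False  # keep the dog-learned features fixed

probe = nn.Linear(32, len(SCENE_CLASSES))  # only this layer is trained
optimizer = torch.optim.Adam(probe.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

def train_step(images, labels):
    """One training step of the linear probe on frozen features."""
    with torch.no_grad():
        feats = pretrained_encoder(images)
    logits = probe(feats)
    loss = loss_fn(logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Example step on a fake batch of 4 images at 64x64 with random labels.
print(train_step(torch.randn(4, 3, 64, 64), torch.randint(0, 4, (4,))))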

"Our intuition was that dogs are very good at finding where to walk, where they are allowed to go and where they are not," says Ehsani. "This is a very difficult task for a computer because it requires a lot of prior knowledge." This knowledge could be if a surface is too steep to walk or if it is pointed and uncomfortable. It would be a long time to program a robot with all these rules, but a dog already knows them all. So by observing the behavior of Kelp, the neural network learned these rules without needing to be taught. In other words, he learned from the dog.




The neural network trained on Kelp's data was able to accurately identify "walkable surfaces".

Now, it is important to include plenty of caveats here. The software created by Ehsani and colleagues is by no means a model of a dog's brain or its consciousness. All it does is learn some very basic rules from a limited set of data, namely, where a dog likes to walk. And as with any other artificial intelligence system of this kind, there is no reasoning here; the software simply looks for patterns in the data. That in itself is not new: researchers train artificial intelligence systems on similar data all the time.

But, as Ehsani points out, this seems to be the first time anyone has tried to learn from a dog, and the fact that it worked suggests that animals could be a useful source of training data. After all, dogs know many other things that would be very useful for robots: what humans look like, for example, or the difference between an adult and a baby. Dogs know how to avoid cars and how to navigate stairs, which are important lessons for any robot that needs to operate in a human environment.

Of course, this paper is just a very simple demonstration of how we can learn from animals, and much more work is needed before this paradigm becomes productive. But Ehsani is confident it could have some very useful applications. "An immediate thing that comes to my mind is making a robot dog. It is already a difficult task for a robot to know how to move and where to go, or whether it wants to chase something," says Ehsani. "This would definitely help us build a more efficient and better robot dog."
