'Smart Fur' Lets Robo-Pets Read Owners' Emotions

A piece of smart fur can tell the difference between a pet, a scratch, or even the breath of a human owner. (Image credit: University of British Columbia)

Man's best friend is getting an upgrade.

Pets can have positive effects on their owners' emotions, the logic goes, so would a robot pet be able to do the same? A robo-bunny developed at the University of British Columbia can mediate its users' emotions, calming them down or cheering them up by leading them through deep-breathing exercises, for example. The robo-bunny also has a pulse and can stiffen or relax its ears.

But for now, users of the robo-bunny need to be wired up to biometric sensors for the rabbit to sense the user's emotional state and react.

"You can't seriously expect kids to be wired up with sensors while they're using this," said Karon MacLean, a professor of computer science at UBC and the leader of the lab in which the bunny was developed.

The smart fur could lead to a new generation of robot pets capable of interacting with owners. (Image credit: University of British Columbia)

That's where a new "smart fur" that the team has created comes in.

Developed by graduate student Anna Flagg, the sensor — right now just a square blob a few inches long, vaguely reminiscent of a furry Star Trek tribble — can tell the difference between a pet, a scratch, even a breath, and ultimately will recognize up to 30 gestures.

"The end goal of this would be to try to infer a person's emotional state, given how they're touching the fur," Flagg said. Imagine a cat that, instead of biting you when you scratch it too hard, rolls over and purrs. "The one thing a robot can do that's different from an animal is truly be in the service of its owner and do what the owner needs it to do," MacLean said. "You can't always expect that from an animal."

The wired version of the "Haptic Creature" robot rabbit began as a theoretical experiment by Ph.D. student Steve Yohanan, who was interested in learning whether the language of touch is universal: that is, whether everyone expresses emotion through touch and interprets others' touches in the same way.

Flagg's pilot study suggests they do. "I was nervous when I was running [the study] because I thought, 'There's no way [the sensor] will be able to learn a pattern here,'" she said. The seven volunteers recruited to scratch and pet the fur sensor each had their own way of interacting with the blob, but enough similarities emerged that the system could tell the gestures apart. Much more research is needed, though.
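The idea behind such a system can be illustrated with a toy sketch: reduce each touch's pressure readings to a couple of simple features, then match a new touch to the closest known gesture. This is purely an illustration, not the UBC team's method; the gesture names, feature choices, and training numbers below are invented for the example.

```python
# Toy gesture classifier: nearest-centroid matching on two hand-picked
# features of a pressure time series. All data here is made up.

def features(samples):
    """Mean pressure, and mean absolute change between consecutive readings."""
    mean = sum(samples) / len(samples)
    change = sum(abs(b - a) for a, b in zip(samples, samples[1:])) / (len(samples) - 1)
    return (mean, change)

# Hypothetical training touches: a "pet" is smooth and moderate, a
# "scratch" is rapid and spiky, a "breath" is faint and slow.
TRAINING = {
    "pet":     [[3, 4, 5, 4, 3, 4, 5, 4], [4, 5, 4, 3, 4, 5, 4, 3]],
    "scratch": [[1, 9, 2, 8, 1, 9, 2, 8], [2, 8, 1, 9, 2, 8, 1, 9]],
    "breath":  [[1, 1, 2, 2, 1, 1, 2, 2], [1, 2, 1, 1, 2, 1, 1, 2]],
}

# Average the feature vectors of each gesture's examples into a centroid.
CENTROIDS = {
    label: tuple(sum(f[i] for f in map(features, examples)) / len(examples)
                 for i in range(2))
    for label, examples in TRAINING.items()
}

def classify(samples):
    """Label a new touch by its nearest centroid in feature space."""
    f = features(samples)
    return min(CENTROIDS,
               key=lambda lbl: sum((a - b) ** 2
                                   for a, b in zip(f, CENTROIDS[lbl])))

print(classify([3, 5, 4, 4, 3, 5, 4, 4]))  # smooth, moderate pressure -> "pet"
```

A real sensor would of course use richer features and many more training examples, but the core loop is the same: people touch differently, yet the shared structure in their gestures is enough for a simple learner to separate them.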

The smart fur will be on display at the 2012 IEEE Haptics Symposium March 4-7 in Vancouver. (Image credit: University of British Columbia)

The wired robot rabbit has already proved popular among its test subjects, though: children with anxiety disorders and children on the autism spectrum. The early results are promising. "Doctors, parents come through and are desperate to have them," MacLean said. "Kids repeatedly bug me: 'When can I take one home?'"

Integrating the fur sensor into the robot is a first step. More tests are planned: MacLean is organizing a study at the children's hospital in Vancouver to see whether the robot is useful for kids about to undergo surgery.

"We have ideas for adults," MacLean said. "Probably not a 20-pound robot, but your cellphone could do this. It would be interesting to have a little companion with me that could see when I'm becoming stressed and help guide my breathing, and maybe even notice it's happening before I notice it. We're wondering how this [effect] scales, if it's breathing in your pocket instead of in your lap."


This story was provided by InnovationNewsDaily, a sister site to LiveScience.

Rachel Kaufman

Rachel is a writer and editor based in Washington, D.C., who covers a range of topics for Live Science, from animals and global warming to technology and human behavior. Rachel also contributes to National Geographic News, Smithsonian Magazine and Scientific American, and she is currently a senior editor at Next City, a national urban affairs magazine. She has an English degree with a journalism concentration from Adelphi University in New York.