Jacob Whitehill has built an innovative smile detector that turns his face into a remote control, sending simple commands to a computer.
The computer science Ph.D. student and colleagues at the University of California, San Diego created the setup to read facial expressions and then alter the playback speed of a videotaped lecture, so that playback automatically matches a person's preferred viewing speed.
Whitehill demonstrates the setup in a video.
"If I am a student dealing with a robot teacher and I am completely puzzled and yet the robot keeps presenting new material, that's not going to be very useful to me," Whitehill said. "If, instead, the robot stops and says, 'Oh, maybe you're confused,' and I say, 'Yes, thank you for stopping,' that's really good."
In tests, the facial movements people made when they perceived the lecture to be difficult varied widely from person to person. Most of the eight test subjects, however, blinked less frequently during difficult parts of the lecture than during easier portions, a pattern consistent with findings in psychology, Whitehill's team said in a statement.
The next step: determine what facial movements a given person naturally makes when exposed to difficult or easy lecture material. From there, Whitehill says, he could train a user-specific computer model that predicts when a lecture should be sped up or slowed down based on the spontaneous facial expressions that person makes.
For example, nodding might indicate a person understands and the presentation should continue, while a puzzled look would suggest some rewinding is in order. No word on what would happen when the student falls asleep.
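The mapping described above can be sketched in a few lines of code. This is a hypothetical illustration only, not Whitehill's actual system: the expression labels, the lookup table, and the idea of feeding a detector's output into a playback controller are all assumptions made for the sake of the example.

```python
# Hypothetical sketch: turning detected facial-expression labels into
# playback commands, in the spirit of the article's examples. A real
# system would get these labels from an expression classifier; here we
# just assume string labels as input.

PLAYBACK_ACTIONS = {
    "nod": "continue",      # apparent understanding: keep playing
    "puzzled": "rewind",    # confusion: back up and replay the material
    "smile": "speed_up",    # engaged and following along: play faster
    "neutral": "continue",  # no strong signal: leave playback alone
}

def playback_command(expression: str) -> str:
    """Return a playback command for a detected expression label.

    Unrecognized labels fall back to "continue" so playback is never
    interrupted by a detection the controller does not understand.
    """
    return PLAYBACK_ACTIONS.get(expression, "continue")

if __name__ == "__main__":
    for expr in ["nod", "puzzled", "unknown"]:
        print(expr, "->", playback_command(expr))
```

A real controller would also smooth the detector's output over time, since a single frame of a puzzled look should not immediately rewind the video.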