
After the Laughter Comes a Serious Role for the SmartWig (Op-Ed)

[Image: the SmartWig. "Yep, looks like you've got too many browser windows open." Image credit: eliz.avery.]

This article was originally published at The Conversation. The publication contributed the article to LiveScience's Expert Voices: Op-Ed & Insights.

Want to move your slide presentation on? Just raise an eyebrow and let your SmartWig flip to the next graph on your behalf.

Well, not yet perhaps, but this is one of the ideas behind Sony's surprising plan to market a wig-based interface, apparently as an alternative to Google Glass. It's an attempt to get a piece of the wearable-technology market, in which firms are competing to sell new cyborg, hybrid, computery devices.

The basic idea is to hide a sensor, processor and communication interface under a wig. The patent filed last week also covers tactile feedback, GPS, an ultrasound transducer, a camera, a laser pointer and remote-control devices. But could it, as Sony hopes, become a "technically intelligent item and fashion item at the same time"? Mobile phones have become fashion accessories, so why not SmartWigs?

Because at the moment wigs are seen as a matter of vanity, even when they're worn by lawyers and judges. Where men once took on a hairpiece when things started to go awry upstairs, the shaved-head look has become the fashionable response to hair loss. That said, women's wigs have been used more successfully as a fashion accessory since the 1980s. Wigs transform someone's appearance so dramatically and obviously that noticing someone is wearing one is remarkable, even laughable. But to save the wearer's face, one should not remark and certainly not laugh.

But, as a discreet, wearable 3D helmet, the SmartWig could be very useful for someone who is blind or deaf, or who needs to be guided through an alien environment in which their senses are impaired. The faux-follicle tech revolution is also great news for gamers who don't want others to know what they are playing at.

Perhaps if they have a serious purpose, such as a discreet form of assistance for people with disabilities, it will be easier to take them seriously. After all, psychologists and neuroscientists have been making sensory smart wigs for many years and the TV show Masters of Sex recently showed us how wiring up the heads of naked subjects makes watching them at it through a glass screen somehow scientific.

Hairing your thoughts?

But do we want to give a commercial company such close access to our scalps? So far we know very little about how the SmartWig will be pitched, aside from the scant information gleaned from Sony's patent filing. Scientists have already worked out how to sidestep the messy gunk involved in attaching electrodes to the outside of our heads to find out what's going on inside, building brain sensors that pick up signals from neurons or tune into the brain's magnetic fields. What will our SmartWig be able to do once we've become accustomed to it?

More scarily, US government defence agency DARPA has turned humans into pattern recognition machines to help computers work out the significance of what people are seeing before they work it out for themselves. If Sony takes this road, the SmartWig wearer could become a humanoid mobile sensing device for commercial or military purposes. We might even find the NSA tuning into our wigs. Just whose smarts will the SmartWig use?

It might help if my bouffant could tell me what is behind me or let me see in the dark with a beam of light projected from my fringe, but I'm not sure I want to be seen wearing a wig in public. For now at least, I'll continue to jab at a button with my finger when I'm delivering a PowerPoint presentation … at least until neural dust reads my mind from the inside.

Tim Dant, Lancaster University, does not work for, consult to, own shares in or receive funding from any company or organisation that would benefit from this article, and has no relevant affiliations.

This article was originally published at The Conversation. Read the original article. The views expressed are those of the author and do not necessarily reflect the views of the publisher. This version of the article was originally published on LiveScience.