The Next Digital Interface Could Be Controlled By Your Facial Expressions

What Happened
Following the rise of voice-activated conversational interfaces powered by personal AI assistants like Amazon’s Alexa, Microsoft’s Cortana, and Google Assistant, the next big thing in digital interaction could be hands-free devices controlled by facial expressions.

Denys Matthies, a computer interaction researcher, created prototype earbuds that can detect the wearer’s facial expressions and turn them into commands to control the wearer’s phone. As The Verge explains:

The earbuds come with special electrodes that recognize the shape of the user’s ear canal using an electrical field. This bends and flexes in a consistent fashion when people pull certain faces, allowing the earbuds to detect specific expressions. So far, they can pick up on five separate movements with 90 percent accuracy: smiling, winking, making a “shh” sound, opening your mouth, and turning your head to the side. With these hooked up to your phone, you could open texts, play and pause your music, and so on, all without using your hands.
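The interaction model described above, where detected expressions trigger phone actions only when recognition is confident enough, can be sketched in a few lines. This is an illustrative mock-up, not the prototype’s actual API: the expression labels follow the five movements The Verge lists, but the action names, function, and threshold are assumptions.

```python
from typing import Optional

# Hypothetical sketch of an expression-to-command dispatcher.
# The prototype reportedly recognizes these five movements with ~90% accuracy,
# so detections below that confidence are ignored here.
CONFIDENCE_THRESHOLD = 0.9

# Illustrative mapping; the real prototype's command set is not specified.
COMMANDS = {
    "smile": "open_texts",
    "wink": "take_selfie",
    "shh": "mute_notifications",
    "open_mouth": "play_pause_music",
    "head_turn": "skip_track",
}

def handle_expression(expression: str, confidence: float) -> Optional[str]:
    """Return the phone action for a detected expression, or None if
    the detection is below the confidence threshold or unrecognized."""
    if confidence < CONFIDENCE_THRESHOLD:
        return None
    return COMMANDS.get(expression)

# Example: a confident "open_mouth" detection toggles music playback,
# while a low-confidence "smile" is discarded.
print(handle_expression("open_mouth", 0.95))  # play_pause_music
print(handle_expression("smile", 0.5))        # None
```

Gating on a confidence threshold matters for an interface like this: a false positive means your phone acts on a face you never intentionally pulled.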

What Brands Need To Do
While still in the early stages of development, this earbud prototype gives a glimpse into the future of digital interactions, enabling a more intuitive and private user experience. It would be especially suitable for hearable devices, allowing for non-verbal feedback. Moreover, this kind of facial-expression-powered interface could also theoretically be achieved with face-tracking technologies, making it viable for devices other than earbuds as well. If realized and mass-commercialized, this type of device could usher in a new era of brand-customer interactions.

In the larger picture, the earbud is but one example of the kind of emerging advanced interfaces that are changing the way brands will deliver their customer experiences and marketing messages. The smartphone supply chain has given rise to powerful, cheap computing components that can augment many parts of our bodies, our homes, and our cities with intelligence. We think about these things as IoT devices or wearables, but when backed by machine learning in the cloud, and networked with other devices, they create totally new kinds of advanced interfaces for media. For more on this topic, check out the first section in our Outlook 2017 report.


Source: The Verge

“Hearable” Is The New Wearable

When it comes to smartwatches and fitness bands, the wrist is the focus of the current wave of wearable technology. But wireless smart earbuds, also fashionably dubbed “hearables”, might just be the piece that truly mainstreams wearables. From Samsung’s necklace-like Gear Circle to Motorola’s new micro-headset The Hint, more and more tech companies are coming out with their own hearables that sit in your ears, freeing up your hands and eyes.

The prerequisite technologies for mature hearables—voice command, wireless connection, and cloud processing—already exist, while obstacles like battery life, connectivity, and cloud-processing speed are being tackled by industry leaders. Intel plans to develop a voice recognition system that uses offline processing for faster responses. Google, meanwhile, has been working on its voice control for years as part of its Google Now interface, and is allegedly grooming it to be featured on its wearables.

By utilizing the oft-ignored auditory sense and eliminating screens that require constant visual attention, hearables are convenient, unobtrusive, and most importantly, intuitive—all invaluable qualities that make them easily adoptable. With the hearables market predicted to be worth over $5 billion by 2018, these smart earbuds are set to take over as a key part of wearable tech.