Alexa now understands sign language, Tap to Alexa comes to Echo Show
IoT
CIO Bulletin
2018-07-25
Voice interfaces were presumed to be the future of computing, but “camera-and-screen based voice assistant is the ultimate use-case of the Amazon Echo prototype,” says Indian software developer Abhishek Singh (shek.it), who has created a mod that lets Amazon’s Alexa assistant understand some simple sign-language commands.
In a demonstration video, a laptop’s webcam records his gestures while back-end machine-learning software decodes them; the decoded instructions are then relayed to an Amazon Echo connected to the laptop. Singh taught the program to recognize the visual signs by feeding it training data.
The mod, which Singh calls a “thought experiment,” is built with TensorFlow.js, Google’s library for writing machine-learning applications in JavaScript.
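Singh has not published his code yet, but the pipeline he describes (webcam frames in, sign labels out) matches a standard TensorFlow.js transfer-learning pattern. Below is a minimal sketch of that pattern, assuming the off-the-shelf MobileNet and KNN-classifier models; the function names and model choices are illustrative assumptions, not details taken from Singh’s mod.

```js
// Minimal sketch of in-browser sign recognition with TensorFlow.js.
// Assumption: transfer learning with MobileNet embeddings and a KNN
// classifier, as in Google's Teachable Machine demos -- not Singh's code.
import * as tf from '@tensorflow/tfjs';
import * as mobilenet from '@tensorflow-models/mobilenet';
import * as knnClassifier from '@tensorflow-models/knn-classifier';

const classifier = knnClassifier.create();
let net;

// Load MobileNet and wrap a <video> element as a webcam frame source.
async function setup(videoElement) {
  net = await mobilenet.load();
  return await tf.data.webcam(videoElement);
}

// "Feeding training data": capture a frame while the user holds up a
// sign and store its MobileNet embedding under a text label.
async function addExample(webcam, label) {
  const img = await webcam.capture();
  const embedding = net.infer(img, true); // true => return the embedding
  classifier.addExample(embedding, label);
  img.dispose();
}

// Decode the current frame to a label; the label's text could then be
// relayed to the Echo, for example via speech synthesis.
async function predictSign(webcam) {
  const img = await webcam.capture();
  const embedding = net.infer(img, true);
  const { label } = await classifier.predictClass(embedding);
  img.dispose();
  return label;
}
```

Because the KNN classifier just compares embeddings, a handful of captured examples per sign is typically enough to separate a small vocabulary, and everything runs client-side in the browser.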
Coincidentally, Amazon has released its own Alexa update that lets users interact with the virtual assistant without any voice commands. The screen-equipped Echo Show now includes a feature called “Tap to Alexa,” which lets users with hearing and speech impairments tap the device’s screen to reach the digital assistant, and it can be configured with routines or personalized commands.
Still, Singh’s project could be the next step toward a more accessible way of interacting with voice assistants. He plans to open-source the code, saying “… people will be able to download it and build on it further or just be inspired to explore this problem space.”