Alexa is Implementing Self-Learning Techniques To Better Understand Users
Monday, December 10, 2018, 10:30 AM, from Slashdot
In a developer blog post published this week, Alexa AI director of applied science Ruhi Sarikaya detailed advances in machine learning that have allowed Alexa to better understand users through contextual clues. From a report: According to Sarikaya, these improvements have played a role in reducing user friction and making Alexa more conversational. Since this fall, Amazon has been working on self-learning techniques that teach Alexa to automatically recover from its own errors. The system, which had been in beta until now, launched in the US this week. It requires no human annotation; according to Sarikaya, it uses customers' "implicit or explicit contextual signals to detect unsatisfactory interactions or failures of understanding."
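The post doesn't spell out how those signals are mined, but the idea can be illustrated with a toy sketch. Everything below (the `Interaction` record, the three-second barge-in threshold, the word-overlap heuristic) is a hypothetical illustration of flagging unsatisfactory interactions from implicit signals, not Amazon's actual pipeline:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Interaction:
    utterance: str                     # what the customer said
    barge_in_seconds: Optional[float]  # how quickly they interrupted, if at all
    followup_utterance: Optional[str]  # their next request, if any

def is_unsatisfactory(ix: Interaction) -> bool:
    """Heuristic (assumed, not Amazon's): a fast barge-in or an immediate
    near-duplicate rephrase suggests the assistant misunderstood."""
    if ix.barge_in_seconds is not None and ix.barge_in_seconds < 3.0:
        return True  # customer cut the response off almost immediately
    if ix.followup_utterance is not None:
        a = set(ix.utterance.lower().split())
        b = set(ix.followup_utterance.lower().split())
        if a | b and len(a & b) / len(a | b) > 0.5:
            return True  # mostly the same words: likely a retry
    return False
```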
The contextual signals range from customers' historical activity, preferences, and the Alexa skills they use to where the Alexa device is located in the home and what kind of device it is. For example, during the beta phase, Alexa learned to recognize a customer's mistaken command "Play 'Good for What'" and correct it by playing Drake's song "Nice for What." Read more of this story at Slashdot.
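A correction like that can, in principle, be learned by simple counting once failed interactions are paired with the successful retries that follow them, with no human annotation. A minimal sketch under that assumption (the `RewriteLearner` class and its promotion threshold are illustrative, not the production system):

```python
from collections import Counter, defaultdict

class RewriteLearner:
    """Count how often a failed utterance is followed by a successful
    rephrase; promote frequent pairs to automatic rewrites (assumed scheme)."""
    def __init__(self, min_count: int = 50):
        self.pair_counts = defaultdict(Counter)
        self.min_count = min_count

    def observe(self, failed: str, successful_retry: str) -> None:
        self.pair_counts[failed][successful_retry] += 1

    def rewrite(self, utterance: str) -> str:
        candidates = self.pair_counts.get(utterance)
        if candidates:
            best, count = candidates.most_common(1)[0]
            if count >= self.min_count:
                return best  # e.g. "play good for what" -> "play nice for what"
        return utterance  # no confident rewrite learned yet
```

After enough customers retry "play good for what" with "play nice for what", rewrite("play good for what") would return the corrected query, and the assistant would play the intended song.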
rss.slashdot.org/~r/Slashdot/slashdot/~3/APGPasuepMI/alexa-is-implementing-self-learning-techniques-...