Artificial intelligence has found an excellent application in virtual assistants, turning voice into an interface of choice. Devices like Amazon Alexa and Google Assistant understand humans better today than ever before. They listen to us patiently, help us take notes, and organize our appointments. But our interactions with these machines are no longer confined to us alone.


Reportedly, there is a third party listening in. Google sends audio clips from users' devices, without their knowledge, to paid contractors who transcribe them in order to improve the company's voice-recognition AI. Normally, Alexa and Google Assistant start recording only when they hear their wake word, such as "Okay Google," but a VRT NWS report shows that these devices can be activated by accident and pass on sensitive information, such as names, contact numbers, and addresses, to Google's workers.

The publication's story focuses on Dutch- and Flemish-speaking Google Assistant users. Of the 1,000 audio clips the broadcaster reviewed, 150 had been recorded unintentionally. A Google contractor told VRT NWS that he transcribes about 1,000 audio clips from Google Assistant every week. While most contained intentional commands, such as requests for weather information or pornographic videos, some were fragments of phone calls and private moments, including details as intimate as a trip to the restroom or someone's love life.


Tech companies stress that transcription is necessary to improve their automated voice-processing features, and claim that only a small percentage of recordings is shared with transcribers. Google Home's privacy policy, for example, does not even mention the company's use of human contractors, presumably because that would look creepy. Nor does it warn users that the device might record them by mistake.
