Apple and Google stop employees from playing back voice recordings

Aug 6, 2019 | Mobile, Regulation

Google and Apple are suspending some of their voice data-review practices after revelations that both firms allow humans to listen to private conversations captured by smart speakers and voice apps.

The move follows news reports that third-party contractors used by Apple and Google had heard people having sex and discussing private medical information.

Following a data leak last month, Google confirmed that some of its contractors listen back to recordings of what people say to Google Assistant — this, it said, helps it improve its support for more languages, accents, and dialects.

Employees and contractors are not able to link recordings to user accounts, but many of the recordings nonetheless contained personally identifiable information, including names, addresses, and other private details. In addition, many of the recordings were captured after the assistant had been activated accidentally.
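
To see why de-linking alone is not enough, consider a minimal sketch (the field names and pipeline below are hypothetical, not Google’s actual system): removing the account identifier hides who sent a recording, but does nothing about personal details spoken inside it.

```python
import uuid

def pseudonymize(recording: dict) -> dict:
    # Replace the account link with a random review ID; the audio and
    # transcript themselves are untouched, so any names or addresses
    # spoken in the recording remain exposed to a human reviewer.
    return {
        "review_id": str(uuid.uuid4()),
        "audio": recording["audio"],
        "transcript": recording["transcript"],
    }

recording = {
    "account_id": "user-8841",  # dropped before review
    "audio": b"...",
    "transcript": "Book a taxi to 12 Elm Street for Jane Doe",
}
print(pseudonymize(recording)["transcript"])
# -> "Book a taxi to 12 Elm Street for Jane Doe"  (PII survives de-linking)
```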

However, many members of the public were unaware of the practice until the Bloomberg news agency reported the fact earlier this year.

Later in the month, a separate report alleged that Apple often allowed workers to access up to 30 seconds of “accidental” Siri recordings as part of its voice grading program. While it was already known that Apple listened to some Siri recordings to improve the service, the new report found that recordings were accessed not only by internal staff but also by contractors with high turnover rates.

Siri could be triggered accidentally, for example by the sound of a zip unfastening or by words that sound like “Siri,” capturing snippets of up to 30 seconds of audio.

Apple said the move would affect users worldwide.

“We are committed to delivering a great Siri experience while protecting user privacy,” Apple said in a statement.

“While we conduct a thorough review, we are suspending Siri grading globally.”

The company added that, in the future, users’ voice recordings would not be included in the grading process unless they had chosen to opt in.
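
A minimal sketch of such an opt-in gate (the flag name and queue below are hypothetical, not Apple’s actual implementation) makes the default clear: absent explicit consent, a recording is never eligible for human review.

```python
grading_queue: list = []  # recordings eligible for human review

def queue_for_grading(recording: dict, user_prefs: dict) -> bool:
    # Opt-in gate: a recording is queued for human grading only when the
    # user has explicitly consented; a missing flag defaults to exclusion.
    if not user_prefs.get("siri_grading_opt_in", False):
        return False  # never reaches human graders
    grading_queue.append(recording)
    return True

# A user who never set the preference is excluded by default.
assert queue_for_grading({"audio": b"..."}, {}) is False
```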

In Google’s case, the Hamburg Commissioner for Data Protection and Freedom of Information had ordered the company to stop harvesting Google Assistant voice data in Europe for human review.

Virtual assistants are supposed to send audio to remote computer servers only if they hear a “wake” word.

In practice, however, Siri and other services can activate in error after picking up sounds they mishear as their “wake” words.
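
A short sketch of that gating logic (the detector and function names below are hypothetical, not any vendor’s implementation) shows why a single misdetection is enough to upload audio the user never meant to share.

```python
import random

def wake_word_detected(frame: bytes) -> bool:
    # Stand-in for an on-device wake-word model. Real detectors are small
    # acoustic classifiers and occasionally misfire on similar-sounding
    # audio, such as a zip unfastening or a word that resembles "Siri".
    return random.random() < 0.01  # hypothetical false-positive rate

def stream_to_server(snippet: list) -> None:
    print(f"uploading {len(snippet)} frames for remote processing")

def assistant_loop(frames: list, snippet_len: int = 30) -> None:
    # Audio stays on the device until the local detector fires; only then
    # is a short snippet streamed to remote servers. A false trigger
    # therefore uploads audio the user never intended to send.
    for i, frame in enumerate(frames):
        if wake_word_detected(frame):
            stream_to_server(frames[i : i + snippet_len])

assistant_loop([b"\x00" * 160] * 1000)
```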

The Hamburg commissioner has also launched an investigation into Google over the practice, with which the search company is cooperating.

“The use of speech assistance systems must be transparent so that informed consent can be obtained from users,” added the commissioner, Johannes Caspar.