EU Drafts Data Regulations for Voice Assistant Developers

Written by ShannonFlynn | Published 2021/04/09
Tech Story Tags: data-security | data-protection | law-and-technology | european-union | eu | technology-news | hacker-news | voice-assistant

TLDR: The European Data Protection Board (EDPB) released its Guidelines on Virtual Voice Assistants (VVAs) for public comment on March 2, 2021. With 4.2 billion virtual assistants in use globally as of 2020, most of the EDPB's recommendations deal with transparency: they suggest VVAs should regularly update users about what data they collect. The guidelines also ask developers to prevent accidental listening, but if VVAs don't listen as closely, they may not understand when people try to wake them.

Voice assistants are some of the most popular pieces of tech today. There were 4.2 billion virtual assistants in use globally as of 2020, and yet the technology remains comparatively unregulated. That’s starting to change, though.
On March 2, 2021, the European Data Protection Board (EDPB) released new guidelines on these technologies for public comment. The proposed regulations, dubbed the Guidelines on Virtual Voice Assistants (VVAs), aim to protect users’ privacy on these increasingly popular devices. That will likely mean changes down the road for developers.

The EDPB’s New Recommendations

The European Union (EU) famously has some of the most stringent regulations for digital data practices. Its General Data Protection Regulation (GDPR) is one of the most recognizable examples of user privacy legislation, but it says very little about voice assistants. These newly proposed guidelines seek to fill that gap.
Most of the EDPB’s new recommendations deal with transparency. For instance, they suggest that VVAs should regularly update users about what data they collect. That’s a reasonable request since 52% of virtual assistant users are concerned their information isn’t secure.
Similarly, the regulations ask providers to avoid bundling VVAs with other services like video streaming. These bundles, which many VVA providers offer, lead to complicated, often unclear privacy policies. The EDPB argues that this could violate the GDPR’s transparency principle.
Another prominent issue the guidelines address is accidental listening. AI privacy expert Hannah Fry warns users to keep VVAs out of private areas, as they “keep recording for a short period” after activation. Multiple studies have also shown that these devices can wake accidentally, potentially recording without users’ knowledge.
The EDPB recommends that developers use technology such as noise filters to prevent accidental awakenings. They also suggest VVAs inform users about what data they’ve stored recently.
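The guidelines don’t prescribe a specific mechanism, but the idea behind a noise filter is easy to picture. The Python sketch below is purely illustrative: the energy threshold and the `detect_wake_word` stub are assumptions for the example, not anything specified in the EDPB text. It simply discards quiet background audio before a wake-word model ever sees it.

```python
# Illustrative sketch only: an energy-based noise gate that drops quiet
# background audio before it reaches a wake-word detector. The threshold
# and detect_wake_word() stub are assumptions, not EDPB requirements.
import numpy as np

ENERGY_THRESHOLD = 0.01  # hypothetical RMS level below which audio is ignored


def rms_energy(frame: np.ndarray) -> float:
    """Root-mean-square energy of one audio frame (float samples in [-1, 1])."""
    return float(np.sqrt(np.mean(np.square(frame))))


def detect_wake_word(frame: np.ndarray) -> bool:
    """Placeholder for a real wake-word model; always returns False here."""
    return False


def should_activate(frame: np.ndarray) -> bool:
    # Gate out low-energy frames (silence, distant chatter) so the device
    # never runs detection, let alone records, on background noise.
    if rms_energy(frame) < ENERGY_THRESHOLD:
        return False
    return detect_wake_word(frame)


# Example: a frame of near-silence never triggers the detector.
quiet_frame = np.random.normal(0, 0.001, 16000).astype(np.float32)
print(should_activate(quiet_frame))  # False
```

In practice, a filter like this is only a first pass; real devices layer acoustic models on top of it, but the privacy benefit is the same in principle: less audio ever gets processed or stored.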

Rising Regulations Around the Globe

While these guidelines are new, they’re part of a broader trend: privacy regulations around smart tech are rising worldwide. Even seemingly unrelated industries like the medical sector have suggested revising risk management standards in light of new technologies.
The U.S.’s Children’s Online Privacy Protection Act (COPPA) requires services to obtain parental consent before collecting kids’ voice data. This rule doesn’t apply to voice commands, as long as VVAs don’t save personal information and delete the data shortly after use.
Canada’s Personal Information Protection and Electronic Documents Act (PIPEDA) doesn’t specifically mention VVAs, but it does apply to them. Like the GDPR, it requires user consent to collect data, and consent must be obtained again if the company wants to use that data for a new purpose. It also says companies must set maximum retention periods for all information.
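Neither law says how a retention limit should be implemented, but a minimal sketch might look like the following. The 30-day limit and the in-memory list of recordings are hypothetical values chosen for illustration; the statutes set principles, not code.

```python
# Minimal sketch of a maximum-retention purge, assuming recordings are kept
# as timestamped records. The 30-day limit and in-memory store are
# hypothetical examples, not figures from PIPEDA or the EDPB guidelines.
from datetime import datetime, timedelta, timezone

MAX_RETENTION = timedelta(days=30)  # assumed policy value

stored_recordings = [
    {"captured_at": datetime.now(timezone.utc) - timedelta(days=45), "audio": b"..."},
    {"captured_at": datetime.now(timezone.utc) - timedelta(days=2), "audio": b"..."},
]


def purge_expired(recordings, now=None):
    """Drop any recording older than the maximum retention period."""
    now = now or datetime.now(timezone.utc)
    return [r for r in recordings if now - r["captured_at"] <= MAX_RETENTION]


stored_recordings = purge_expired(stored_recordings)
print(len(stored_recordings))  # 1 -- only the recent recording survives
```

A scheduled job running logic like this is one common way to honor a retention limit; the important part for compliance is that the limit exists and is enforced automatically rather than left to manual cleanup.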

Regulations’ Impact on Developers

These rising regulations mean voice assistant developers have more work ahead of them. Location-specific rules can apply to companies based outside that region, too. For example, an American tech company like Amazon still has to comply with the GDPR for its devices and services in Europe.
Most current regulations on voice data are matters of policy. They apply to how a company stores and uses customer information, which doesn’t necessarily concern the development side of things. The EDPB’s newly proposed regulations could change that.
Under these guidelines, developers would have to provide privacy by design, limiting devices’ capacity to breach users’ privacy in the first place. Balancing these considerations with convenience may be challenging.
Some improvements are straightforward enough, like providing regular updates to users about collected data. Others may be more complicated. The regulations suggest developers prevent accidental listening, but if VVAs don’t listen as closely, they may not understand when people try to wake them.
Helping VVAs differentiate between wake commands and other noise is primarily an issue of natural language processing (NLP). Building more nuanced NLP is also a matter of diversity. AI consultant Catherine Breslin emphasizes that language is full of ambiguity, and “technology that works … for those who have nonstandard speech” will go far.
If NLP models can recognize various speech patterns without accidental awakenings, VVAs will be far more secure. Developers will have more challenges to overcome with voice assistants, but the result will be better products on all fronts.
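To make that tension concrete, here is a toy Python sketch of a single detection threshold. Every confidence score and threshold value is invented for illustration; no real detector behaves exactly like this. It shows how raising the bar to avoid false wakes can also shut out genuine speakers with nonstandard pronunciation.

```python
# Toy illustration of the wake-word threshold tradeoff, not a real model.
# The scores are made-up confidence values a detector might assign to
# different utterances; the thresholds are arbitrary example numbers.
utterances = {
    "clear wake phrase": 0.95,
    "wake phrase, nonstandard accent": 0.62,
    "TV dialogue (no wake phrase)": 0.55,
    "background chatter": 0.20,
}


def wakes(score: float, threshold: float) -> bool:
    return score >= threshold


for threshold in (0.5, 0.7):
    accepted = [name for name, score in utterances.items() if wakes(score, threshold)]
    print(f"threshold={threshold}: wakes on {accepted}")

# threshold=0.5 wakes on the TV dialogue (a privacy problem);
# threshold=0.7 stays quiet for the TV but also misses the accented speaker,
# which is the accessibility gap Breslin describes.
```

Better models shrink this tradeoff rather than just sliding the threshold around, which is why training on diverse speech matters as much for privacy as it does for accessibility.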

As Tech Changes, so Too Does the Law

Regulatory changes are nothing new for the tech industry. There is always a period where technology advances beyond regulations, but the law catches up eventually. Developers have overcome these challenges before, and they’ll do it again.
As voice assistants become more popular, regulations will continue to change. Developers must stay on top of these shifts and pay attention to evolving consumer sentiment. In the end, more secure, private devices will benefit everyone.
