Do you use a digital voice assistant at work? If not, you likely soon will. Alexa, Google Home and other smart speakers and smart assistants are becoming prevalent not just in homes, but in the workplace as well. Roughly 4.2 billion digital voice assistants were in use worldwide in 2020, and Statista predicts that number will double to 8.4 billion by 2024.
The growing popularity of internet-connected, voice-activated smart speakers is easy to understand. With a simple phrase spoken to a patiently waiting device, music can play, lights can dim, or an email can be read. We can make a purchase, listen to the news, or be reminded to complete a task. We can even compose and send emails, text messages and more.
And, with more employees working from home than ever before, these smart speakers are designed to help us be more productive in our daily work as well. That convenience presents businesses with new issues around security and litigation.
Business Confidentiality Concerns
While some digital voice assistants are created specifically for the office (such as Microsoft Cortana and Alexa for Business), most have been designed for everyday use. That means they may not have the security features that an enterprise-focused or company-issued device should include.
With any digital capability or electronic device, there are potential threats to business confidentiality and concerns about hacking or malware. This is even more true for smart speakers and digital voice assistants in the work environment. Once compromised, these devices can be used to access sensitive corporate information.
That is scary considering how the technology works. These speakers and digital assistants are always listening for their “wake word,” like “Alexa” or “Hey Google,” which triggers them to start recording and respond to your request. Sometimes, however, they can be triggered accidentally. For example, an Oregon couple unintentionally triggered their smart home device, which recorded a private conversation and sent it to someone in the couple’s contact list. The couple wasn’t even aware it had happened until the recipient alerted them. If a sensitive conversation within a company were accidentally recorded, it’s not hard to imagine how devastating that could be to the organization’s privacy, security, reputation and competitive position.
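To see why accidental triggers happen, consider a minimal sketch of that always-listening loop (function names are hypothetical and the audio is simulated; this is not any vendor’s actual implementation). The device continuously scores short audio frames against the wake word, and any frame that crosses the detection threshold, including a false positive from ordinary conversation, starts a recording that leaves the device:

```python
import random

WAKE_THRESHOLD = 0.85  # confidence required to "hear" the wake word

def capture_frame():
    """Stand-in for reading a short slice of microphone audio."""
    return object()

def wake_word_score(frame) -> float:
    """Stand-in for an on-device wake-word model: returns a confidence
    that the frame contains the wake word. Simulated with random noise
    here so the sketch is runnable."""
    return random.random()

def record_and_send_request():
    """Once triggered, the device records the speech that follows and
    sends it to the cloud -- the step that captures a private
    conversation when the trigger was a false positive."""
    print("Recording... audio sent to the cloud for processing")

# The always-listening loop: nothing leaves the device until a frame
# crosses the threshold, but a sound that merely resembles the wake
# word triggers the same recording path.
for _ in range(1000):
    frame = capture_frame()
    if wake_word_score(frame) >= WAKE_THRESHOLD:
        record_and_send_request()
        break
```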
These devices are also new playgrounds for bad actors. Hackers have learned how to create an Alexa “skill” or Google “action” that looks harmless but is actually malicious. Once such a skill is activated, it can silently record people, harvest passwords, or disarm security systems, and new vulnerabilities are being found regularly. Because these devices can reach internal files and networks, they can also become entry points for malware or ransomware.
The devices can monitor people’s actions, too. Using AI, digital assistants can track tone, sentiment, heart rate, blood pressure and other nonverbal details, and that information could then be transmitted to third parties. The technology can even forecast what a person will do and later check whether its predictions were correct.
Data Privacy Issues
There is a new focus on transparency and disclosure in technology, especially as the U.S. legal system absorbs GDPR considerations and state privacy laws. Examples include the California Consumer Privacy Act (CCPA), which went into effect in 2020, and the Illinois Biometric Information Privacy Act (BIPA), which was passed in 2008 but has recently become a major focus of privacy litigation. Users want to know exactly what data is being collected and how it is being used. Before GDPR, BIPA and CCPA, for example, there were no legal provisions specific to biometric data, yet millions of people were already using their voice or a thumbprint as a password. Until there is a single, comprehensive federal law governing personal data, we have to rely on the companies that create these devices and on a patchwork of localized laws that may or may not apply depending on the user’s location.
When it comes to the device makers themselves, unfortunately, Amazon, Google and others are not doing all they could to engender trust. While they are keen to promote their privacy policies and security measures, researchers have found that these companies break their own developer rules: their policies are unclear, inconsistent, or not consistently enforced. A single rogue app can put your whole company at risk.
Adding even more concern and potential burden, the data from a digital voice assistant may need to be preserved, collected, and disclosed if it is pertinent to a legal matter. Whether or not the data was recorded intentionally, it could be collected and pored over by a team of investigators and attorneys, and then produced to the court as part of eDiscovery.
Legal Considerations
Digital voice assistants have already been used in legal matters. For example, recordings and data from an Amazon Echo were used in a first-degree murder case in Arkansas and a double-murder case in New Hampshire. However, in those cases it was clear who owned the device and its data. There are additional things to consider if a personal device is being used for business, or vice versa.
Typically, user agreements differ between consumer-focused and business-focused products. A company will have unequivocal ownership of its data if it owns the device. For a voice assistant, that includes all recordings, all requests to and answers from the device, and the corresponding metadata detailing the date and time of each action. Consumer agreements may not offer the same clarity. Technically, the user owns the data from the device, but that doesn’t stop the company that made it (such as Amazon or Google) from retaining the right to access that data.
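As a rough illustration of what “the data” means here, the sketch below models the kind of interaction record a voice assistant retains: the request, the response, and metadata about when and where it happened, which is exactly what a forensic collection would target. All field names and the storage path are hypothetical, not Amazon’s or Google’s actual schema:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class InteractionRecord:
    """One request/response exchange retained by a voice assistant.
    Field names are illustrative, not any vendor's real schema."""
    device_id: str       # which speaker captured the audio
    timestamp: datetime  # when the exchange occurred
    request_text: str    # transcript of what the user said
    response_text: str   # what the assistant said back
    audio_ref: str       # pointer to the stored voice recording

def collect_for_matter(records, start, end):
    """Toy eDiscovery-style filter: pull every exchange that falls
    within a legally relevant date range."""
    return [r for r in records if start <= r.timestamp <= end]

records = [
    InteractionRecord("echo-lobby-01", datetime(2020, 3, 2, 9, 15),
                      "what's on my calendar", "You have three meetings.",
                      "s3://voice-archive/rec-001.wav"),  # hypothetical path
]
relevant = collect_for_matter(records, datetime(2020, 3, 1), datetime(2020, 3, 31))
print(f"{len(relevant)} record(s) fall within the matter's date range")
```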
If the data on a personal smart speaker is relevant to a company’s legal matter, the company may need to gather it for the case. The company may incur costs just to find out whether such data exists, plus the costs of a forensic data collection. Thus, the best approach to digital assistants in the workplace, whether they’re owned by the company or not, is to have a plan.
Company Usage Policy
Smart speakers today are where smartphones were when they first arrived in the workplace. Some businesses ignored them until they became a problem; others banned them outright, which proved difficult to enforce. Companies that outlined specific usage policies, by contrast, let their people take advantage of the technology in a smart, consistent, and secure way. Now it’s time for forward-looking businesses to implement similar policies for digital assistants and smart speakers.
Your company’s policy should include the following (see the sketch after this list for one way such rules might be encoded):
- An official corporate statement about which, if any, digital voice assistants are permitted by the company
- Real cases relating to your operations that provide clarity around acceptable and unacceptable usage
- Specifics about where this technology can and cannot be used, such as a conference room or trading floor where sensitive company information is often discussed
- Instructions on how to engage privacy options, such as blocking the camera, muting the microphone, or enabling other privacy features
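One way to make such a policy concrete is to capture it in a machine-readable form that IT can audit against. The sketch below is hypothetical; the device names, zones, and settings are placeholders, not a recommended template:

```python
# A hypothetical, auditable encoding of the policy points above.
VOICE_ASSISTANT_POLICY = {
    "approved_devices": ["Alexa for Business"],       # official corporate statement
    "banned_zones": ["conference room", "trading floor"],  # where sensitive info is discussed
    "required_settings": {
        "microphone_muted_when_idle": True,
        "camera_blocked": True,
    },
}

def device_allowed(device: str, zone: str) -> bool:
    """Check a device/location pair against the policy."""
    return (device in VOICE_ASSISTANT_POLICY["approved_devices"]
            and zone not in VOICE_ASSISTANT_POLICY["banned_zones"])

print(device_allowed("Alexa for Business", "break room"))      # True
print(device_allowed("Alexa for Business", "trading floor"))   # False
```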
Once the policy is set up, train your people on it — and regularly retrain them — to ensure that digital assistants are used properly and securely. Routinely monitor usage and confirm that everyone is in compliance with corporate policies. I also recommend surveying employees from time to time about how they use the smart speakers. That can give you new ideas about incorporating the technology into your company and developing appropriate use guidelines and policies.
Just remember: The “smart” part of this technology still comes from people. When you are informed and strategic about the way you use digital voice assistants, you can keep your company data safe while also empowering your team to be more productive and efficient.
Brian Schrader, Esq., is president & CEO of BIA (www.biaprotect.com), a leader in reliable, innovative and cost-effective eDiscovery services and digital forensics. With early career experience in information management, computer technology and the law, Brian co-founded BIA in 2002 and has since developed the firm’s reputation as an industry pioneer and a trusted partner for corporations and law firms around the world. He can be reached at bschrader@biaprotect.com.