Washington, D.C. – As businesses develop and integrate virtual (VR), mixed (MR), and augmented (AR) reality technologies into education, healthcare, entertainment, and more, the urgency to consider potential privacy and data protection risks to users and bystanders grows. Lawmakers, regulators, and users themselves are increasingly interested in how XR technologies work, what data protection risks they pose, and what steps can be taken to mitigate these risks.
On August 13, 2024, the XR Association hosted a virtual webinar diving into these issues and more alongside Brittan Heller, an expert in technology and human rights; Alexandra Givens, President and CEO of the Center for Democracy & Technology; Carrie Valladares, Senior Corporate Counsel at Microsoft; and Jameson Spivack, Senior Policy Analyst of Immersive Technologies at the Future of Privacy Forum. Joan O’Hara, Senior Vice President of Public Policy at the XR Association, moderated the conversation.
Speakers discussed the new data types and biometrics needed to power immersive technologies. All agreed that while this creates new opportunities for technology, it also raises concerns around user and bystander privacy, as well as around XR data collection methods.
Speakers also debated the most effective mechanisms for protecting individual privacy rights, with some advocating for opt-in consent mechanisms and others for opt-out. Givens argued that consent alone is “insufficient” for protecting XR users’ biometric data while acknowledging limitations in users’ power to consent.
Valladares agreed, suggesting a layered consent approach in which users can choose what they do and do not want companies to access. Heller also emphasized the importance of education and transparency in companies’ data collection practices, pointing to a recent study that found users responded positively to understanding how XR systems work.
“To me, that is an ultimately optimistic point of view, where we have these legal premises like consent,” Heller said. “But it shows that as technology evolves, our understanding of not only what is possible, but what is desirable for users can evolve with it. And so for me, that is why consent is not the end all be all, but is a great place to start, because as long as you bring users along with you, and you make sure that it’s meaningful and informed.”
The conversation also touched on ethical challenges, such as data privacy and potential abuse scenarios, and the need for a human rights-based framework to address these concerns. Speakers discussed the balance of responsibility between government, tech companies, and users regarding data use and privacy, agreeing that all involved parties bear some role in staying informed and being mindful of why and what data is necessary to enable immersive experiences.
“Hopefully you will get more companies realizing that establishing trust with their customers is good business practice, and they will do the right thing, but there will be others that don’t,” said Valladares. “And so certainly, government does have a place in it, and education is in there too. Whoever can provide the education and whoever’s best interests are in it should be doing it. That’s how I would say those three groups play out.”
The conversation also touched on current legislation in the European Union (EU) and the U.S., including the proposed American Privacy Rights Act (APRA). Speakers debated whether the EU’s General Data Protection Regulation (GDPR) could serve as a model for the U.S. government as states outpace Congress with a patchwork of individual data privacy bills.
Spivack explained that consistent privacy standards, including a lexicon of terms, would help keep users protected across various regions and ensure smooth functioning and regulation for companies and developers. Givens agreed, suggesting that data minimization in AI-powered XR tech is challenging but important for privacy.
To address that challenge, she emphasized, usage controls and data minimization should be combined for added protection.
“This is why usage limitations come in as the important backstop in the end,” Givens said. “So even if we can’t control all of the inputs coming in – and there are some legitimate reasons why data is needed for training purposes – that’s when you buttress a regime with usage controls to add another layer of protection for people. And in this space, it’s going to be another essential part of the puzzle.”
The webinar concluded with a forward-looking discussion on the opportunities and challenges that lie ahead as the tech ecosystem continues to innovate. “Privacy as an Element of Trust” marks the third webinar in XRA’s Charting the Future of Immersive Technology series. Watch the webinar recording to follow the full discussion.