Washington, D.C. – On March 21 and April 10, the XR Association (XRA), under the auspices of its Future of XR Advisory Council (XRAC), convened its third working group, this one focused on XR and privacy. The XRAC is an independent, multi-stakeholder advisory council focused on defining the XR ecosystem and informing XRA's ongoing work, and specifically geared to discuss the technological innovations, regulations, and considerations shaping the future of immersive technologies. The working group is one of five that will examine specific topics in greater depth:

  • Safety and Well-Being of Young People
  • Interoperability
  • Privacy
  • Diversity, Equity, Inclusion and Accessibility (with a focus on the workforce)
  • Norms & Behaviors in Immersive Spaces

The working group started from this simple maxim: immersive technologies cannot flourish if there is no trust that one’s personal privacy will be respected. Functionality requires the use of many types of data, but this must be done responsibly and thoughtfully.

To unpack this premise, the first part of the discussion focused on what is “net new” when it comes to immersive technology and data privacy. Personally identifiable information (PII), biometric data, and spatial data have been in our lexicon for some time, and most legal and regulatory frameworks for privacy are largely aimed at such data. XR technology can also capture other forms of body-based data (e.g., autonomic responses or eye gaze). XR will likely raise the frequency with which these types of data are used. What differentiates immersive technologies even more is the volume and aggregation of this data, which can lead to a more granular understanding of things like geolocation and what is increasingly referred to as inferred data. Inferred data is the information and insights derived from data that has been processed, as opposed to raw data provided (knowingly or otherwise) directly by a person. In other words, “inferred data” is a conclusion reached about an individual based on the analysis of other discrete pieces of information.

To address safeguarding these categories of data, the working group focused on issues surrounding data collection, storage, and minimization. The conversation centered on what strategies will be necessary to protect data based on the use case at issue and the context in which the data is used. A subset of this topic examined how the amount of data collected by XR could facilitate more targeted advertising, and where the line falls between curation, persuasion, and manipulation.

Empowering users to make knowledgeable decisions about what data they grant access to was also widely discussed. Most agreed that existing consent practices for data collection and use in the context of immersive technologies may not adequately provide consumers with knowledge of how their data is used or with meaningful informed consent. The way forward for XR technology will require a stronger emphasis on “privacy by design,” as well as a rethinking of how consumers are informed about data collection and use, whether consumers have a legitimate choice to allow or deny that collection, and how to empower consumers to take a more proactive role in protecting their data privacy. Other policy areas covered included youth and privacy, and interoperability.

The challenge identified by this working group is how to articulate and execute a shared model of responsibility that does not place unrealistic or inappropriate burdens on the consumer, yet still allows different business models to develop and grow as immersive technology matures.