Exploring how organizations developing or adopting XR training can integrate best practices for accessibility into design, development and implementation processes.
1
00:00:00,033 --> 00:00:19,901
-Stephanie: Sounds cool. Well, let’s get started. So I’m going to say welcome. I’m Stephanie Montgomery for all of you who are listening with the XRA, and I wanted to thank you for joining me today for the breakout session on how to bake accessibility into XR-based training apps. So I’m just going to let Bill and Tim take it away.
9
00:00:19,901 --> 00:01:16,033
-Bill: Thank you, Stephanie, and we really appreciate XRA inviting PEAT here today as well as our esteemed speakers to talk about accessibility, a very important topic in our community. I’m Bill Curtis-Davidson. I co-direct the Partnership on Employment & Accessible Technology, otherwise known as PEAT. I’m a former accessibility leader at Magic Leap, and in the past, I’ve also worked at Level Access and IBM. My pronouns are he/him, and I am a middle-aged, definitely white male who has a balding hairline as well as a mixed salt-and-pepper goatee, and I wear glasses. So with that, I’d like to introduce my esteemed industry colleague, Tim Stutts. Tim, would you like to introduce yourself?
25
00:01:16,033 --> 00:01:44,901
-Tim: Hi, yes. Tim Stutts here, and as mentioned, I worked with Bill at Magic Leap, where I worked on design and partnered with Bill on accessibility-related features. My pronouns are he/him, and I guess, technically, I am also a middle-aged white male. I have a beard. I guess that’s it for me. I’ll pass it back to you, Bill.
33
00:01:44,901 --> 00:03:06,868
-Bill: Thanks, Tim. Now, I’d like to review quickly. We have such a short time. We could talk all day about this topic, but what we plan to do today is explore just a little bit about some strategies for baking in accessibility. What can your organization do in general to make sure that you’re designing inclusively and integrating best practices as they’re being developed? I’ll cover some prompts here for our discussion, and hopefully you can take some of them away and continue the dialogue with us. And then importantly, Tim will be spending about 10 minutes sharing some examples of XR product accessibility design that he’s been involved with to give you a sense of some of the work that’s happening in our community. And then we’ll save hopefully 5 to 8 minutes, something like that, or 5 to 7 minutes for some questions, and please put them in the chat if you have them. Finally, I’ll just mention that immediately following this breakout, I’m zipping over to our virtual booth in the expo area with my colleagues Alexa and Ashley, and we will be able to have dialogue with you until 4:30 when we all return to adjourn the session.
59
00:03:06,868 --> 00:09:04,067
-Bill: So let’s dive right into our discussion. So, again, there are five key strategies, things that you can think about doing. Strategy one is really about envisioning the value of inclusion. Inclusion is obviously not just about people with disabilities; considering the ways in which your app can be used by people with different sensory, physical and cognitive needs also extends to thinking about flexibility across the activities and environments where XR is used.
And here I’m showing a table that can help you envision how meeting the needs of people with disabilities can also help us consider supporting all users, who will experience limitations based on the activities or environments they’re utilizing XR in. The table has four columns covering key areas related to access: moving and touching, hearing and speaking, seeing and observing, and thinking and learning. When we design with inclusion in mind, we can work not only to provide access in, for example, the motor or mobility area (moving and touching) for people who use wheelchairs, canes or prosthetic devices, but also to design for uses of XR where people are stationary or have their hands busy, such as in surgery, or for use of XR in small spaces or from a seated position.
And I could go through each example here, but we have limited time. I think the last session that Ashley led with Meryl Evans indicated that captioning is valuable for people who are deaf or hard of hearing, for example, but it may also be valuable in quiet or loud environments.

The second key strategy is, we have to diversify our teams. We do not want teams that are only filled with the same kind of people. We need to make sure your teams include people with a diversity of lived experiences, perspectives and creative ideas that are shaped by race, ethnicity, gender, age, sexual identity, ability, disability and location, among many other things. And here I’m showing some icons representing different areas of diversity like vision, cognition, hearing, motor, emotional, socioeconomic and intersectional. And I’m also sharing some photos that, actually, Tim and I are represented in. These were real activities at Magic Leap where we were leveraging collaborations with the disability community and also our DEI and employee resource group efforts to make sure that our product development really factored in a variety of perspectives.

The third key strategy to think about is flexibility and options. If you think about nothing else, think about testing and designing for flexibility of input modalities, interaction modes and output modalities. For example, supporting different ways of communicating through voice, text or even sign language through a video feed might help employees in loud or quiet environments as well as helping people who are deaf or hard of hearing be included. In essence, you want to avoid designing for an average user. There is no such thing. We all need different things at different points, and we need to design in that flexibility.

Four, making a business case: as we’ve heard in many of the sessions today, it’s really important to think about this now. These technologies are mature. They’re clearly being implemented at scale by very large, leading companies, and those companies have strong business goals, whether maintaining a capable workforce or whatever other goals exist. We must consider inclusive design so that everyone can be involved in your business or what you offer from your business, so make sure your training products consider that context: what can this do to support your overall goals? We can think about areas, for example, like reskilling for clean energy and other high-growth jobs, or supporting the hybrid workforce and workplace through inclusive, immersive telework. Whatever it is, make a business case that works for your organization.

And finally, strategy five: speak with suppliers. Don’t be afraid to ask your vendors about this topic even though many of them may not have fully explored it. It’s important to open that conversation up and make sure the voice of the customer includes requirements for inclusion and for designing for the diverse populations who will use these technologies. This will help us make sure that they’re usable broadly in your organization. Find out if vendors have involved diverse people, including people with and without disabilities, in their product design. But even if they haven’t, you can involve diverse people in your supplier relationships and procurement activities as well as in planning and implementing the technologies themselves and building improvements over time as technologies become more accessible. And with that, I’m going to turn it over to Tim to share some examples. Over to you, Tim. Can you unmute?
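[Editor’s note: To make the flexibility-and-options strategy above concrete, here is a minimal Python sketch, with invented names and not drawn from any shipped product, of fanning one training cue out to several output modalities so a user can rely on whichever channel works for them.]

```python
from dataclasses import dataclass

@dataclass
class TrainingCue:
    """One event in a training app, e.g. 'step complete'."""
    text: str

@dataclass
class CuePresenter:
    """Fans a cue out to every output channel the user has enabled.

    The channel names (audio, captions, haptics) are illustrative;
    a real app would call its engine's audio, UI and haptic APIs.
    """
    audio_enabled: bool = True
    captions_enabled: bool = True
    haptics_enabled: bool = True

    def present(self, cue: TrainingCue) -> None:
        if self.audio_enabled:
            print(f"[audio]   speaking: {cue.text}")
        if self.captions_enabled:
            print(f"[caption] showing:  {cue.text}")
        if self.haptics_enabled:
            print("[haptic]  pulsing controller for attention")

# A user who is deaf, or anyone on a loud factory floor, can turn
# audio off and still get the same cue through captions and haptics.
presenter = CuePresenter(audio_enabled=False)
presenter.present(TrainingCue("Step 3 complete. Move to the next station."))
```

The point of the pattern is that disabling one channel never removes information; it only changes which channels carry it.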
171
00:09:10,634 --> 00:20:40,400
-Tim: I’m setting up a stopwatch for myself, so I’m going to talk for 10 minutes. Hi. So, excited to talk today about product accessibility, and the first example I’m going to bring up is the work that I did with the UX team and Bill and other folks at Magic Leap around inputs and feedback. So one of the things that I worked on while there was symbolic input for text entry for the Magic Leap 1 mixed reality wearable display device. There are a number of different ways to enter text. Pictured here is a virtual keyboard that floats in midair in front of the user. The user is able to type on that keyboard with a handheld controller that points at the characters on the keyboard that they want to type, use voice dictation, or point at the keyboard another way, using the controller’s touch pad to move the selection around, mouse-style. Next to that is an image of our mobile companion app, which functions as a controller just like the Control that ships with the device.
Here, text entry happens with the native iOS or Android keyboard once a user has selected a field in the UI. And then finally, another thing we support is text entry via an external Bluetooth keyboard. So pictured here is a hand on a Logitech K600 wireless Bluetooth keyboard with trackpad. So here, a user can type using a mechanical QWERTY keyboard.
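[Editor’s note: The multimodal text entry Tim describes is easy to sketch as an architecture: many input modalities, one field. The Python below is a hypothetical illustration, with class and method names invented rather than taken from Magic Leap’s API; the value is that the field never cares which modality produced the text, so a new modality, say sign-language recognition, can be added without touching it.]

```python
from typing import Protocol

class TextSource(Protocol):
    """Anything that can produce a chunk of text for a field."""
    def read(self) -> str: ...

class VirtualKeyboard:
    """Stands in for mid-air typing with a 6DOF pointer."""
    def read(self) -> str:
        return "bjork"

class VoiceDictation:
    """Stands in for a speech-to-text engine."""
    def read(self) -> str:
        return "bjork"

class BluetoothKeyboard:
    """Stands in for a paired mechanical QWERTY keyboard."""
    def read(self) -> str:
        return "bjork"

class TextField:
    """A UI field that accepts text from any source interchangeably."""
    def __init__(self) -> None:
        self.value = ""

    def accept(self, source: TextSource) -> None:
        # The field never knows which modality produced the text.
        self.value = source.read()

text_field = TextField()
for source in (VirtualKeyboard(), VoiceDictation(), BluetoothKeyboard()):
    text_field.accept(source)
    print(f"field now reads: {text_field.value}")
```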
I also worked on the LED and haptic feedback for Magic Leap 1. In terms of LEDs, our Control had 12 LEDs around a circular touch pad to provide feedback for interactions and system functions like battery status. Our Lightpack, which is our hip-mounted computer system, had its own LEDs to convey similar things, and our Lightwear (that’s the wearable) has yet another LED to cover things like surveillance recordings and so forth.

So, Bill, if you wouldn’t mind advancing to the next slide. Pictured to the left here is the settings area of Magic Leap, and Bill is going to demo a video of the numeric virtual keyboard soon. Our numeric virtual keyboard is useful for typing in PINs and IP addresses, and it features 10 keys in a row, ordered numerically; a user can use the Control to type on that keyboard. So, Bill, if you wouldn’t mind playing that video. It’s brief, so you can probably play it a few times. So, yeah, you can see the 6DOF cursor targeting different characters of the keyboard, and the user types in a PIN and completes it, and then that keyboard goes away. Next slide, please.

So here is the QWERTY virtual keyboard, also known as the full virtual keyboard. The QWERTY keyboard is arranged like the mechanical one, with the letters in a similar organization, which is useful for functions like search or typing a URL in a browser. In this example, you will see a video of a user pointing at the text field for the browser, called Helio. The virtual keyboard will float up in front of the user, and they’ll be able to then point at the characters they want. I should also mention there’s LED feedback on the touch pad of the Control and haptic feedback when the user hovers over a key. So, Bill, if you don’t mind playing this video: the user is searching for the artist Björk,
and this is a good example because Björk has an umlaut over the O. So to get to the umlaut, you have to do what’s called a force hold on the O key to get the alternate character, O with the umlaut. And so they successfully find that character and then pull up Björk, who’s one of my personal favorite artists, by the way. Okay. Next slide, Bill.

So, final Magic Leap example: I want to show you some examples of the hardware that’s used to convey different input and system functions. I mentioned three pieces of hardware with LEDs. This video will show two of them operating, and the thing to look out for is the similarities in the patterns; they reference each other. In terms of accessibility, we leverage color as well as motion, so if you have limited vision or colorblindness, you’d be able to discern these different patterns without being aware of what color they are. They’re all different visually. I should mention, too, that in the previous example the input is highly multimodal, which definitely has accessibility in mind.

So, Bill, if you wouldn’t mind going and playing this video: the first pattern is the Control and Lightpack turning on and starting up, and we see these kind of glowing sunrise patterns around the perimeters of both devices as they’re starting up, similar color and pattern moving left to right. Now the two devices are going to pair. You see two chasers following each other on both devices, and they merge to indicate pairedness. I’m going to adjust the volume level; there are buttons on the Lightpack and a pattern on the Lightpack to show you the level visually. And there’s a mode called reality mode where you press this button on the Lightpack, and it turns off the display and the audio so you can focus on real-life situations like kitchen disasters. And here you can graze the button on the Lightpack and see the charge level, represented in color and volumetric fill. And now I’m shutting down the device by holding down the Lightpack power button, and you see a sunset pattern in the opposite direction from starting up. All right. Next slide, please.

So I want to talk briefly about the work I’m doing at Vuforia. It’s very related to the Magic Leap work I was doing in the past. In fact, you’ll see at the bottom of this that Magic Leap is a Vuforia partner. So Vuforia Work Instructions, which is what I work on, allows you to author, edit, publish and scale intuitive step-by-step instructions for your organization’s most critical workflows.
Vuforia Work Instructions help you get started quicker with augmented reality. Our Work Instruction solutions enable intuitive content creation and maintenance,
streamlined editing and enhancement, easy publishing across all device formats, accelerated time to value, and enterprise security, control and scalability. So if you wouldn’t mind switching to the next slide, Bill. Before you start this video, I will just mention that the use case for our augmented reality is primarily a manufacturing setting, and we have a number of apps that are used to capture video and spatial data on devices like HMDs, like the HoloLens. And then we have an app called Editor that’s used to play back the, sorry, to author content, and then finally, an app called Vantage used to consume the content, that is, consume the instructional data. And then there’s even an app called Insights. So, Bill, if you wouldn’t mind playing the video, then I can talk a little bit about it. So, to start, we’re looking at a motorcycle.
Someone is pointing a tablet running Vantage at the motorcycle. They are detecting the 3D model of the motorcycle and putting labels on it. And now, we’re going back and
consuming those instructions, and we’re following steps to adjust an instrument on the motorcycle, confirming pass/fail. We have automatically detected spatially, with the iPad’s camera, what parts are in the scene, and finally there’s Insights, offering a report of the completed procedure. So I don’t have too much more time, but if you go to the next slides… Oops. Looks like we’re watching the video again. I can talk a little bit while we’re waiting for the technical difficulty to be worked out.

I was going to show you a picture of the HoloLens Capture app. That’s running at the beginning of our pipeline, and it currently runs on HoloLens and allows you to move around your space and capture spatial data and photos and images and set bookmarks, et cetera, all hands-free on device. So most of the Capture app is run using a head-pose cursor, so movement of the head. Also, there are voice commands and a simple gesture implementation if you want to click a button. But you don’t actually need the gesture, because you can also just dwell on a button with head pose; there’s a timer with each button, and the button will be pressed. The second image is of the Vuforia View app, and the View app is an app to play back instructions in the factory, relying on Azure and spatial markers and playing back content that was created in our Capture app right after it’s been edited in our Editor app. So that’s all I have to share today. Thank you, Bill.
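[Editor’s note: The dwell-to-click interaction Tim describes is worth a small illustration, since it is a common hands-free pattern in head-worn AR: a button activates once the head-pose cursor has rested on it long enough. This is a simplified Python sketch under assumed parameters; the 0.8-second threshold is invented, and this is not Vuforia’s or HoloLens’s actual implementation.]

```python
import time
from typing import Optional

DWELL_SECONDS = 0.8  # assumed threshold; real apps tune this per user

class DwellButton:
    """A button that 'clicks' after the gaze cursor dwells on it."""

    def __init__(self, label: str) -> None:
        self.label = label
        self._dwell_started: Optional[float] = None

    def update(self, cursor_is_over: bool, now: float) -> bool:
        """Call once per frame; returns True the moment the dwell completes."""
        if not cursor_is_over:
            self._dwell_started = None   # cursor left: reset the timer
            return False
        if self._dwell_started is None:
            self._dwell_started = now    # cursor arrived: start timing
            return False
        if now - self._dwell_started >= DWELL_SECONDS:
            self._dwell_started = None   # fire once, then re-arm
            return True
        return False

# Simulate the head-pose cursor resting on the button for one second.
button = DwellButton("Capture photo")
start = time.monotonic()
while time.monotonic() - start < 1.0:
    if button.update(cursor_is_over=True, now=time.monotonic()):
        print(f"'{button.label}' activated by dwell")
    time.sleep(1 / 60)  # ~60 fps frame tick
```

Because selection needs only head movement plus time, the same button works for users who cannot perform a pinch gesture, which is exactly the redundancy Tim points to.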
365
00:20:40,400 --> 00:20:44,501
-Bill: Tim, I’m sorry for the little glitch there.
366
00:20:44,501 --> 00:20:45,701
-Tim: Oh, that’s okay.
367
00:20:45,701 --> 00:20:50,367
-Bill: I’m just flashing up those last images. I know we only have about 4 minutes.
369
00:20:50,367 --> 00:20:53,667
-Tim: No worries. Bill, I’ll let you describe them.
371
00:20:53,667 --> 00:23:12,534
-Bill: Sure. Yeah, so I’m just showing what Tim was describing. This was showing Vuforia Capture, so we’ve got someone wearing a HoloLens who is recording her actions in different steps, and you can see there’s, like, a step collection of this information, which can be in different forms, voice, et cetera. And then we see a young gentleman wearing a HoloLens, viewing instructions and using gestures and voice to access the information, which, again, can be in multimodal form. So with that, I’m going to just say that if there are a few questions, we have about 4 minutes left, which is not a lot of time, but I want to thank Tim for the wonderful overview and such rich examples and, more importantly, for his excellent work in accessibility over the years. And so any questions you might have about accessibility really can involve multimodal inputs, equitable design considerations or some of the practices we alluded to. We’d be glad to entertain those.
And I’m going to go out of screen share mode so that I can actually see the chat, and then I’m also going to paste some links to our PEAT white paper, “Inclusive XR in the Workplace,” which Ashley Coffey mentioned in a previous breakout, but certainly we’d love for you to learn more about these topics if you’re interested. And if there are any questions, please paste them in the chat, and otherwise, we can have more time to dialogue post-conference. Looks like Alexa asks, “What kind of advice would you give to designers and developers who want to build more inclusive training applications but don’t have the resources to get started?” So, Tim, would you like to comment on that a little bit? Like, how do you get started when there’s not… We’ve had that experience, I think.
412
00:23:12,534 --> 00:24:31,567
-Tim: Well, you know, my first answer is: find a resource within the company. In the case of Magic Leap, Bill was the person hired who was in charge of that, so it was a really natural pairing. In fact, I think he was even the one who found me. But then when I found out Bill existed, yeah, it made it so much easier to get some of these things through and also come up to speed. In the absence of someone like Bill, or even with someone like Bill in your organization, you can join organizations like XR Access. And there’s also A11yVR from Thomas Logan. I mean, these orgs have resources to help you get up to speed for sure, and there are different subgroups within those orgs focusing on different work streams. So if your app or the thing you’re working on is, like, hardware-related, there is probably a group overseeing hardware best practices for UX and accessibility.
434
00:24:31,567 --> 00:25:00,567
-Bill: Thanks, Tim. XRAccess.org is worth checking out. This is our community-led effort, which involves the XR Association and many other community members, including many members of the disability community and large platform companies and application and content developers. We’re going to run out of time, so I want to say thank you, Tim, for your thought leadership and your work, and then we’ll turn it back over to Stephanie to close us out.
445
00:25:00,567 --> 00:25:33,634
-Stephanie: That’s right. Thank you both for joining us. Wonderful to have you here. We always enjoy working with XR Access. So, everybody, please join us back at the main stage in just a little while (I’m just here looking for my notes so I can get my timing right) at 4:30 for the final keynote.
And in the meantime, please take a moment to swing back over to our exhibitors at the expo, and Bill will be there as he said, and he can answer any questions you have.
See you all at the closing session. Bye.
457
00:25:33,634 --> 00:25:35,234
-Tim: Bye.
-Bill: Bye, thanks.
458
00:25:35,234 --> 00:25:36,400
-Stephanie: Thanks.
The XRAlert provides a fast, free, and curated update about the rapidly evolving immersive tech industry, brought to you by the XR Association. Delivered every Tuesday, it covers everything you need to know about XR today, tomorrow, and on the horizon, and the people who make it happen.