RealityVirtually2019/xable

X-able: Closing the XR Accessibility Divide

NOTE TO ALL: How the content in this README file is organized

  • INTRODUCTION: context on WHY... why this project, why now, and why this team/why we care.
  • PROBLEM: inclusivity issues encountered in XR; compelling data on how many are impacted.
  • HACKATHON SCOPE: goals, constraints, and theory of change for an open-source solution.
  • APPROACH: based on Design Thinking and the Section 508 guidelines/W3C-WAI initiative for web/mobile.
  • GITHUB DELIVERABLE: standard information for developers coming out of this project.
  • FUTURE VISION FOR OPEN SOURCE LIBRARY: ideas/concepts the team has brainstormed that are outside the project scope.

INTRODUCTION (why this project? why now? why this team/why we care?)

This team is keenly aware that as XR infiltrates our personal, business, social, and educational lives, there is a swath of the population worldwide being left behind, just as they were when web/mobile solutions were first introduced. Then Section 508 compliance regulations got government and commercial entities to pay attention to the needs of people with disabilities when creating new web/mobile experiences. We believe that XR creates an even bigger divide between those who can easily engage and those who can't. Therefore we're taking this opportunity at the RV Hackathon to shine a spotlight on the challenges of accessibility, create a starting point, and invite developers to collaborate in the future on creating and maintaining an open-source library that resolves accessibility issues in XR.

Team Background & Motivations

-- Scott is a creative technologist and interactive designer with nearly two decades of experience designing and building interactive experiences. He currently works for Pega Systems as an Innovation and Experience Manager. His previous work includes clients such as MINI Cooper, Fidelity, Puma, Converse, AthenaHealth, and Comcast.

-- Jordan leads immersive design at ByteCubed + CHIEF, teaches design as adjunct faculty at George Mason University, and currently serves as President of UXPA DC. Previously he worked in the Department of Defense and the United States Senate helping government make digital experiences more accessible and inclusive, and he is passionate about ensuring that nobody is left behind by emerging technology like XR.

-- Mike is the Boston chapter president of the VR/AR Association and recently left his position as Director of Wayfair Next to create a new company for 3D content creation. On a personal note, Mike has a son with low vision and color recognition challenges who generously tested some ideas with his Dad.

-- Frank O

-- Susan is an educator/entrepreneur and former Director of Digital Strategy at Harvard University; she previously managed a UX agency within a global consulting firm for 10 years, and for 7 years advocated for and delivered web/mobile accessibility with commercial/government clients including the US Dept. of Education. On a personal note, her husband has MS with cognitive/motor issues; he has tried and struggled with XR experiences.

PROBLEM

There are 650 million people, 10% of the world's population, with some form of disability. Imagine if you were one of them, with a hearing or vision impairment or limited motor skills, trying to engage in an XR experience. What if you were among the 8% of men who have some form of color blindness and could not distinguish colors in the XR space?

We're exploring ways to close the growing accessibility divide within XR by visualizing an open-source toolkit for developers that helps them integrate support for users with disabilities. Our theory of action is that by taking a cross-platform compliance approach to resolving accessibility issues, developers will be more inclined to include accessibility support when creating new XR experiences. Based on the four W3C/WAI principles (perceivable, operable, understandable, and robust), we believe that the Section 508 compliance guidelines for web/mobile set a powerful precedent that should be extended to XR.

We're keenly aware that there are recurring accessibility issues that may be common across multiple platforms. If we begin with the most common challenges, then a new XR accessibility baseline could be established. From there, the possibilities are endless. We've brainstormed about these options as well, using the W3C/WAI categories for accessibility issues to guide discussions. The categories are as follows: PHYSICAL, SPEECH, VISUAL, AUDITORY, COGNITIVE, NEUROLOGICAL, AGE, TEMPORARY DISABILITY, SITUATIONAL.

Here are two examples of common accessibility challenges and possible responses:

  1. A color-blind user can't distinguish red-green or blue-yellow spectrums. To address that, icon colors switch to a high-contrast mode, or a pattern overlay helps distinguish color differences.
  2. A vision-impaired user may have difficulty reading labels or navigating XR space. To address that, a function could be enabled to allow the user to enlarge elements when selected in 3D space.
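
As a concrete illustration of the two responses above, here is a minimal TypeScript sketch of a cross-platform accessibility settings object that an XR renderer could consult before drawing icons or handling selection. All names (`AccessibilitySettings`, `applyHighContrast`, `scaleOnSelect`) are hypothetical and not part of any existing toolkit.

```typescript
// Hypothetical sketch (not from any existing toolkit): a cross-platform
// accessibility settings object that an XR renderer could consult.

type Rgb = { r: number; g: number; b: number };

interface AccessibilitySettings {
  highContrast: boolean; // example 1: support for color-blind users
  zoomFactor: number;    // example 2: enlargement factor for selected elements
}

// Collapse hues to black/white by luminance so that red/green (or blue/yellow)
// pairs stay distinguishable when high-contrast mode is on.
function applyHighContrast(color: Rgb, settings: AccessibilitySettings): Rgb {
  if (!settings.highContrast) return color;
  const luminance = 0.2126 * color.r + 0.7152 * color.g + 0.0722 * color.b;
  return luminance > 127
    ? { r: 255, g: 255, b: 255 }
    : { r: 0, g: 0, b: 0 };
}

// Scale a selected element so a vision-impaired user can enlarge it in 3D space.
function scaleOnSelect(baseScale: number, settings: AccessibilitySettings): number {
  return baseScale * settings.zoomFactor;
}

const settings: AccessibilitySettings = { highContrast: true, zoomFactor: 2.0 };
const iconColor = applyHighContrast({ r: 200, g: 40, b: 40 }, settings); // dark red maps to black
const selectedScale = scaleOnSelect(1.0, settings); // 2.0
```

The luminance threshold and black/white palette are illustrative; a real toolkit might instead swap in a color-blind-safe palette or apply the pattern overlay mentioned above.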

HACKATHON SCOPE

We will explore and experiment with ways to close the accessibility divide within XR by visualizing an open-source toolkit for developers that helps them integrate support for users with disabilities. We envision a strategy that focuses on cross-platform compliance. With this in mind, we'll be selective when choosing 3-4 functions to simulate/prototype.

Our theory of change is that if we shine a spotlight on accessibility issues and demonstrate ways to think about and resolve them collaboratively, then we can build momentum for open-source solutions for accessibility. Ultimately we envision that users with disabilities will have more and more opportunities to enter the XR realm of amazing experiences.

APPROACH (based on Design Thinking)

We're taking an accelerated approach to the Hackathon project, based on the Design Thinking methodology. We're also leveraging our team's past experiences and building on the foundation of accessibility support for web/mobile established by the Section 508 guidelines and the work of W3C/WAI.

We began by establishing project goals/priorities, roles/responsibilities, and preferred communication channels. We've primarily used face-to-face discussions, our X-able Slack channel, and GitHub to share ideas, give updates, and share assets. The following is a high-level view of our daily activities and outputs.

DAY 1 ACTIVITIES:

   -Pitch problem/solution
   
   -Recruit Team 
   
   -Define Scope, constraints & priorities
   
   -Establish Goals
   
   OUTPUTS: Defined approach using collective work experiences and Design Thinking methods. Also agreed upon the following project goals/aspirations:
         -Shine a spotlight on the expanding digital divide that excludes people of all ages with disabilities from experiencing XR and pursuing social, entertainment, educational and career opportunities.
         -Explore/experiment on ways to close the accessibility gap by extending Section 508 guidelines and principles to common functions seen across XR platforms/devices.
         -Prototype elements for an open-source toolkit that helps developers add accessible features/functionality into current/future XR experiences.
         -Demonstrate that, with cross-platform compliant code, support for accessibility can transform XR experiences across platforms and devices for millions of users worldwide.

DAY 2 ACTIVITIES:

   -Brainstorm needs and opportunities
   
   -Research features/functionality gaps and impacts
   
   -Experiment across platforms & devices

   
   OUTPUTS: Tested zoom controls and object outlines; color contrast patterns, shading, and a strategic color palette; and air motion controls to adapt to limited motor skills.

DAY 3 ACTIVITIES:

  -Define scenarios
  
  -Begin prototyping
  
  -Continue testing
  
  -Continue documenting 
  
  -Update devpost; draft video
  
  -Update GitHub

  OUTPUTS: Defined sample user scenarios; drafted entry in GitHub; preliminary prototypes/simulations; multiple small tests for low vision, limited motion, and spatial navigation functions; draft video of one simulation; PowerPoint capturing the project journey; searched for and integrated multiple 3D images.

DAY 4 ACTIVITIES:

  -Final cycles of iterating/testing

  -Design set of accessible icons 
  
  -Update video, docs, and share 
  
  -Focus on final deliverables

  OUTPUTS: Final testing/completing simulations; designed accessible icons; updated GitHub README files; final uploads to GitHub.

GITHUB DELIVERABLE

Simulations to be uploaded to GitHub.

OUTLINE FOR RESOURCE LIBRARY OF STANDARD INFORMATION FOR DEVELOPERS - a starting point for next steps...

ASSETS

Assets used:

--HDRIs: https://hdrihaven.com/hdri/?c=low%20contrast&h=hall_of_finfish

--Microsoft HoloToolkit: https://github.com/Microsoft/MixedRealityToolkit-Unity/releases

--3D Models: http://digitallife3d.org/blacktip-shark

DOCUMENTATION

--Feature Areas

-- Platforms

--Required Software

--Getting started

--Accessibility Examples

--Helpful Links

https://www.w3.org/WAI/

EXTERNAL

PACKAGES

PROJECT SETTINGS

CONTRIBUTING & CODING GUIDELINES

--Open Source Terms of Use and Guidelines

--Coding Guidelines

--Documentation Guide(s)

--Feature/asset Contributions Process

W3C/WAI categories for accessibility issues:

--PHYSICAL --SPEECH --VISUAL --AUDITORY --COGNITIVE --NEUROLOGICAL --AGE --TEMPORARY DISABILITY --SITUATIONAL

HELP

FUTURE VISION FOR OPEN SOURCE LIBRARY

A small sampling of the many ideas/concepts our team has brainstormed that are outside the scope of this Hackathon project

https://trello.com/b/0n6Rkzah/functional-trends

https://trello.com/b/h609FPhv/brainstorm-1-w3c-categories

BEYOND THE XR ACCESSIBLE BASELINE...

-- Customize avatar voice: Offer the ability to personalize the experience by recording one's own voice for verbal commands in the XR experience.

--Provide the ability for the user to know/confirm "Where am I?" with a range of options: via visual, sound, GPS, or vibration cues.

--Include dual options for navigating, with clear, consistent iconography, color contrast, and voice-over options.

--Compensate for face blindness by pulling the user's contacts/images into the experience; for fictional experiences, embed descriptions of characters/players.

-- Allow a user AND a guide dog with a sensor to participate as a team in XR.

--Compensate for hearing loss with visual representations of actions needed to perform a function, plus emergency signals/alerts.

-- Give a deaf person the ability to sing with gestures/movements/color/vibrations.

--For cognitive issues, provide a memory recall option to "repeat that again for me," in text and/or visual representation.

-- For temporary issues, provide a menu of options/settings with on/off functions that support a variety of compensatory features, e.g., change brightness after an eye operation, then resume normal brightness when healed.
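
The last idea above, a menu of compensatory settings with temporary overrides, could be sketched as follows. The class, method, and setting names are hypothetical, for illustration only.

```typescript
// Hypothetical sketch (names are illustrative): a menu of compensatory
// settings with temporary overrides that can be restored when no longer needed.

type Setting = "brightness" | "contrast" | "textScale";

class CompensatorySettings {
  private values = new Map<Setting, number>([
    ["brightness", 1.0],
    ["contrast", 1.0],
    ["textScale", 1.0],
  ]);
  // Snapshot of the normal values, taken before any overrides.
  private defaults = new Map(this.values);

  // Temporarily override a setting, e.g. dim brightness after an eye operation.
  setTemporary(setting: Setting, value: number): void {
    this.values.set(setting, value);
  }

  // Restore the normal value once the temporary need has passed.
  restore(setting: Setting): void {
    this.values.set(setting, this.defaults.get(setting)!);
  }

  get(setting: Setting): number {
    return this.values.get(setting)!;
  }
}

const menu = new CompensatorySettings();
menu.setTemporary("brightness", 0.4); // dim while recovering
// ...after healing:
menu.restore("brightness");           // back to 1.0
```

Keeping the defaults snapshot separate from the live values means any temporary adjustment can always be undone without the user re-entering their normal preferences.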

AN OPEN INVITATION to all developers to consider the sample scenarios that follow and contribute solutions

--PHYSICAL user scenarios...

--SPEECH user scenarios...

--VISUAL user scenarios...

--AUDITORY user scenarios...

--COGNITIVE user scenarios...

--NEUROLOGICAL user scenarios...

--AGE user scenarios...

--TEMPORARY DISABILITY user scenarios...

--SITUATIONAL user scenarios...

FEEL FREE TO ADD YOUR EXAMPLES TO THE ABOVE LIST!


About

Accessible UI/UX for XR