Equal Entry: Contributing to a more accessible world
https://equalentry.com/

Every Jira Ticket Is Your Accessibility Policy
https://equalentry.com/every-jira-ticket-is-your-accessibility-policy/
Wed, 04 Mar 2026 19:00:38 +0000

Charlie Triplett explains how accessibility becomes sustainable when it is built into the everyday tools teams already use. Instead of relying on mindset shifts or expecting everyone to learn the Web Content Accessibility Guidelines (WCAG), Charlie shows how acceptance criteria, team agreements, and agile practices help designers, developers, and product owners understand what success looks like for real people. By embedding accessibility into user stories and Jira tickets, teams become more self-sufficient, reduce ambiguity, and create digital products that work for everyone.

The post Every Jira Ticket Is Your Accessibility Policy appeared first on Equal Entry.

This article is based on Charlie Triplett’s talk at A11yNYC about the currency on which modern software gets built: the Jira ticket. If it’s in the Jira ticket, it gets done. If accessibility isn’t there, a remediation fire drill is.

Building accessibility into everyday product development

Charlie Triplett’s work focuses on building accessibility programs that organizations would miss if they disappeared. He approaches accessibility as a practical, team-centered discipline rather than a compliance exercise.

The first thing to understand is how modern software development works. Teams operate within agile frameworks, and their work is shaped by tools such as Jira, team agreements, and acceptance criteria. These tools already define how teams communicate, plan, and deliver. Instead of asking teams to adopt a new mindset, focus on using the tools they already rely on.

Accessibility becomes sustainable when it is embedded in these existing systems. People come to work to do their jobs, not to maintain a mindset. When accessibility is part of the definition of ready, the definition of done, and the acceptance criteria for each story, it becomes a natural part of the workflow.

It’s important to understand how teams function. When accessibility professionals avoid agile meetings or dismiss scrum as “too many meetings,” they miss the opportunity to influence the process. Teams cannot incorporate accessibility if accessibility experts do not understand how teams work.

Why accountability matters in accessibility work

While well-intentioned, “everybody is responsible for accessibility” often leads to no one being accountable. Teams are made up of people with specific roles, and each role has defined responsibilities. Product managers, designers, developers, and quality assurance (QA) specialists all contribute differently.

Expecting everyone to hold the same level of accessibility expertise is unrealistic. Instead, teams need clarity about who writes acceptance criteria, who fulfills them, and how success is measured. Product owners typically write acceptance criteria, while designers and developers fulfill them. This structure mirrors how teams already operate.

By aligning accessibility with existing responsibilities, teams can integrate it without confusion. This approach respects people’s roles and avoids overwhelming them with expectations outside their job scope.

Teams thrive when expectations are testable and unambiguous. Acceptance criteria provide that clarity. They help teams understand what success looks like for people using assistive technology or device settings such as enlarged text.

How acceptance criteria create clarity and collaboration

Acceptance criteria are short, testable statements that describe what must be true for a feature to be considered complete. They help teams understand the user’s needs and ensure that everyone is aligned on the expected outcome.

There are two common formats for this: user stories and Gherkin-style criteria. User stories describe what a person wants and why. Gherkin criteria describe what is true, what action occurs, and what outcome should follow. Both formats help teams communicate clearly.

“Now, you’re likely going to realize you’re missing some details and you’ll need to talk with your developers and designers before you can finish the stories. That Is A Good Thing! Identifying the missing parts and reducing scope before even building is exactly what Gherkin uncovers for you,” writes Nic Werner in Writing User Stories With Gherkin.

Accessibility criteria can be embedded directly in user stories. For example, a shopping cart button may have visual states for default and hover, but the design may not include a focus state. When acceptance criteria require a visible focus indicator, the developer must ask the designer for guidance. This prompts collaboration without requiring an accessibility expert to intervene.
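To make the cart example concrete, here is what such a criterion could look like in Gherkin form. The button name and wording are invented for illustration, not quoted from the talk; the 3:1 ratio comes from WCAG 2.2 success criterion 1.4.11 (Non-text Contrast):

```gherkin
Feature: Shopping cart button

  Scenario: Keyboard focus is visible
    Given I am navigating the page with a keyboard
    When I press Tab until the "Add to cart" button receives focus
    Then a visible focus indicator appears around the button
    And the indicator has at least a 3:1 contrast ratio against the background
```

A developer picking up this ticket can test each line directly, and a missing focus state in the design surfaces before the story is marked done.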

Criteria can also cover screen reader behavior, keyboard interaction, mobile screen readers, and device settings such as enlarged text. These criteria help teams think about real people and real use cases.

Over time, teams become more self-sufficient. They stop asking basic questions and begin asking more nuanced ones. This shift frees accessibility experts to focus on complex issues rather than repeating foundational guidance.

Practical steps organizations can take

Organizations can begin by integrating accessibility into the tools and processes they already use. Team agreements can include accessibility expectations in the definition of ready and the definition of done. This ensures that accessibility is considered before work begins and verified before work is completed.

Product owners can add accessibility acceptance criteria to user stories. Designers can annotate Figma files with accessibility information. Developers can implement keyboard, screen reader, and device setting behaviors as part of their normal workflow.

Teams can use resources such as the Accessible Rich Internet Applications (ARIA) Authoring Practices Guide (APG) or tools like Charlie’s AtomicA11y website. These provide practical, component-level guidance that can be copied directly into Jira tickets, helping teams avoid reinventing the wheel and keep criteria consistent across projects.

By focusing on clarity, communication, and shared tools, organizations can build accessibility programs that scale across teams and products.

A sustainable path forward

This approach shows that accessibility becomes sustainable when it is woven into everyday work. Acceptance criteria help teams communicate clearly, collaborate effectively, and understand what success looks like for real people. When accessibility is part of the workflow, teams become more confident and self-sufficient.

This approach respects people’s roles, reduces ambiguity, and builds programs that last. It shifts accessibility from a specialist-driven effort to a shared, practical practice grounded in the tools teams already use.

Video highlights

Watch the presentation

Resources

Bio

Charlie Triplett is an accessibility leader with over 20 years of experience in UX design and UI engineering, now focused on building global enterprise accessibility management programs.

He invented AtomicA11y.com, wrote TheBookOnAccessibility.com, and contributed the Design Systems chapter of Inclusive Design for Accessibility: A practical guide to digital accessibility, UX, and inclusive web and app design.

FAQ

What are the acceptance criteria in accessibility?
Acceptance criteria are testable statements that describe what must be true for a feature to be complete. They help teams understand expected behavior for people using assistive technology or device settings.

Who writes accessibility acceptance criteria?
Product owners typically write acceptance criteria. Designers and developers fulfill them as part of their normal workflow.

How do acceptance criteria help teams?
Acceptance criteria reduce ambiguity, prompt collaboration, and help teams understand what success looks like for real people. They also help teams become more self-sufficient over time.

What tools can teams use to find accessibility criteria?
Teams can use the Accessible Rich Internet Applications (ARIA) Authoring Practices Guide and AtomicA11y to find ready-made criteria for common components.

XR Accessibility: What Meta Horizon Worlds Teaches Us About Inclusive Virtual Reality Design
https://equalentry.com/xr-accessibility-inclusive-design/
Tue, 17 Feb 2026 21:46:03 +0000

Based on his presentation, Thomas Logan explores the accessibility lessons from Meta Horizon Worlds and the Meta Quest 3. The article highlights how virtual reality introduces new barriers and new opportunities for inclusion, from head-based navigation and hand-controller dexterity to caption placement, screen reader support, and seated-mode design. The recap focuses on what works, where gaps remain, and how developers and organizations can build more accessible XR experiences that respect autonomy, safety, and dignity.

The post XR Accessibility: What Meta Horizon Worlds Teaches Us About Inclusive Virtual Reality Design appeared first on Equal Entry.

This article is based on Thomas Logan’s talk at A11yNYC on XR social accessibility in practice, drawing on lessons from Meta Horizon Worlds.

The evolving landscape of accessibility in virtual reality

Virtual reality has matured rapidly over the past decade, shifting from experimental novelty to a mainstream platform for socializing, learning, and collaboration. As more people enter these environments, accessibility becomes a foundational requirement rather than an optional enhancement.

Thomas traces this evolution through the lens of Meta Horizon Worlds and the Meta Quest 3, drawing on years of hands-on experience with eXtended reality (XR) accessibility. What emerges is a clear picture of both progress and persistent gaps.

His early work in virtual reality (VR) began during the COVID-19 pandemic, when virtual meetups became a lifeline for community connection. Those gatherings revealed the strengths of VR, such as global participation, shared presence, and immersive interaction. But they also exposed barriers that excluded many users.

The shift from flat screens to embodied interfaces introduced new physical demands, new safety concerns, and new assumptions about mobility and dexterity.

These insights shaped the presentation’s central question: How do we ensure that VR evolves to include everyone?

VR case study: Meta Horizon Worlds

Meta Horizon Worlds provides a useful case study because it spans multiple devices, including the headset, mobile, and web. Supporting such varied input methods forces developers to think beyond the default assumptions of VR.

It also highlights the tension between platform-level accessibility options and the custom work required from individual developers. The result is a landscape where some accessibility options are built in, others are optional, and many depend on the priorities of the people creating the experience.

Thomas emphasizes that accessibility in VR is not simply a technical challenge. It is a design philosophy grounded in autonomy, dignity, and safety. When accessibility is built into the platform, more people can participate without needing workarounds.

When it is left to developers, it often becomes a “nice to have” that never reaches implementation. Understanding this dynamic is essential for anyone building or evaluating XR experiences today.

How VR reshapes interaction and introduces new barriers

Unlike traditional digital interfaces, VR relies heavily on physical movement. The headset itself becomes an input device, requiring users to turn their heads to navigate menus or interact with objects. For people with limited neck mobility, this can be a significant barrier.

This seemingly simple design choice, treating head movement as a primary control, creates a new category of accessibility considerations that did not exist in mobile or web environments.

Hand controllers introduce another layer of complexity. The Meta Quest controllers assume two functional hands with full dexterity, including thumb, index, and middle-finger movements. Many VR experiences rely on gestures that require grip strength or precise finger placement.

While the Quest supports alternative inputs such as Bluetooth keyboards, mice, and gamepads, these options only help if developers intentionally support them. In practice, many experiences still rely on two-handed controllers as the default, leaving users with limited mobility or limb differences without a viable way to participate.

Physical reach is another challenge. Early VR experiences encouraged users to walk around the room, bend down to pick up objects, or reach overhead. This created safety risks and excluded people who cannot perform those movements.

Over time, user feedback and real-world mishaps, such as people running into televisions or knocking over furniture, pushed developers toward seated-mode design. By 2026, most VR experiences include seated options, reflecting both accessibility needs and general user preference for safer, more predictable interactions.

Physical accessibility standards, such as the Americans with Disabilities Act (ADA) reach ranges, can be applied directly to virtual environments. Because VR worlds are built using real-world measurements, designers can use established guidelines to determine how high or low interactive elements should be placed.

This connection between physical and digital accessibility is unique to VR and offers a powerful framework for inclusive design. When developers follow these principles, they reduce the need for users to bend, stretch, or strain to interact with virtual objects.
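As a rough illustration of that idea, a world builder could encode the 2010 ADA Standards’ unobstructed forward-reach range (15 to 48 inches, roughly 0.38 to 1.22 meters) as a placement check. The helper below is hypothetical, not part of any Meta SDK:

```python
# 2010 ADA Standards, unobstructed forward reach:
# between 15 in (~0.38 m) and 48 in (~1.22 m) above the floor.
ADA_REACH_MIN_M = 0.38
ADA_REACH_MAX_M = 1.22

def within_reach(height_m: float) -> bool:
    """True if an interactive element's height above the virtual
    floor falls inside the ADA forward-reach range."""
    return ADA_REACH_MIN_M <= height_m <= ADA_REACH_MAX_M

print(within_reach(0.9))   # desk-height button → True
print(within_reach(1.8))   # overhead lever → False
```

Because virtual worlds use real-world units, a check like this can run at build time, flagging levers, buttons, or grabbable objects placed where users would have to stretch or bend.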

Accessibility in Meta Horizon Worlds: Progress and limitations

Meta Horizon Worlds makes meaningful progress on accessibility. One example is the “near mode” interface, which allows users to pull menus closer to their faces. It works like a built-in zoom feature and supports people with low vision.

The platform also includes haptic feedback, spatial audio, and voice input, offering multiple ways to interact with the environment. These multimodal options can significantly improve usability, especially for people who cannot rely on hand controllers alone.

Another important improvement is the ability to access accessibility settings during device setup. Users can triple-press the Meta button (Oculus button on some controllers) to open these options before the headset is fully configured. This addresses a long-standing gap in many technologies: Accessibility support that becomes available after setup is complete. Providing access from the start supports users who need accommodations immediately, not after navigating an inaccessible onboarding process.

Color correction and contrast adjustments are available in the operating system. While these help, they can’t fully compensate for inaccessible app design. If an experience relies solely on color to convey meaning, no system-level filter can fix that. This underscores the need for developers to follow inclusive design practices rather than relying on platform features to solve everything.

Screen reader support is one of the most significant additions to the Quest ecosystem. Meta introduced an experimental screen reader that uses flick gestures on the controllers to navigate menus. While promising, the feature is still early and has known issues. Some users have reported situations where turning on the screen reader disables both controllers and hand tracking, leaving them unable to turn it off. A more reliable solution would be to provide voice commands, such as “turn screen reader on” or “turn screen reader off.”

Captioning is another area where Horizon Worlds shows both progress and limitations. Automatic captions are available and include speaker labels, which are essential for understanding who is talking.

However, captions cannot be moved, resized, or replaced with human-generated captions. This is a significant limitation for events that rely on professional captioners, such as the Accessibility NYC Meetup. Without a way to integrate high-quality captions, users must rely on automatic speech recognition, which may not meet accuracy needs.

Representation, communication, and the social dimension of accessibility

Representation plays a crucial role in how people experience virtual spaces. Horizon Worlds includes cochlear implant options for avatars, which is a meaningful step toward inclusive representation. However, many other forms of representation, such as wheelchairs, white canes, guide dogs, limb differences, or mobility devices, are not yet supported. Avatars should reflect the diversity of real people, not just idealized bodies.

Communication tools also shape accessibility. The “Speak Live” feature allows users to select preset or custom phrases that are spoken aloud through a synthesized voice. This supports people who use augmentative and alternative communication (AAC). Users can add frequently used phrases, mirroring how AAC devices are personalized in real life. This is only available in the headset, not on mobile or web. Nonetheless, it’s a meaningful step toward supporting non-verbal communication in social VR environments.

Caption placement in social settings is important. Captions that appear too far away, too close, or directly over interactive elements can disrupt the experience. Meta’s documentation recommends placing captions about one meter from the user and allowing repositioning, but developers must implement this. Without a standardized captioning system, quality varies widely across experiences.

Social VR also raises questions about identity and autonomy. Some users prefer anonymity or experimentation with their avatar’s appearance, while others want their virtual representation to match their real-world identity. Social VR must support both. For example, a user who uses a wheelchair in real life may want their avatar to reflect that, while another user may prefer not to disclose their disability. Inclusive design means offering options without assumptions.

What can organizations and developers do now?

Organizations building VR experiences have an opportunity to shape the future of accessibility. One of the most effective strategies is designing for multiple devices. Because Horizon Worlds runs on headsets, mobile phones, and the web, developers must assume that users may not have controllers or full mobility. This naturally encourages more accessible patterns and reduces reliance on physical gestures.

Another key principle is building accessibility into defaults. When accessibility is automatic, it benefits everyone and reduces the burden on developers. Conversely, when accessibility requires custom work, it often becomes a low-priority task that never gets completed. Platform-level support is essential for consistent, reliable accessibility.

Testing with disabled users is also critical. Real-world testing reveals barriers that guidelines alone cannot predict. This is especially true in VR, where physical comfort, motion tolerance, and safety vary widely. Organizations should involve users with diverse disabilities early and often throughout the process. This ensures that accessibility is part of the design process rather than an afterthought.

Finally, developers should prioritize multimodal input and communication. Captions, voice input, haptics, and spatial audio should be treated as core options. These tools support a wide range of users and create more flexible, resilient experiences. As VR continues to evolve, the most successful platforms will be those that embrace accessibility as a fundamental design principle.

Looking ahead

Virtual reality is at a pivotal moment. Platforms like Meta Horizon Worlds show meaningful progress in accessibility, from multimodal input to seated-mode design and early screen reader support. At the same time, many options still depend on developers choosing to implement them, leading to inconsistent experiences.

The path forward is clear: Accessibility must be built into the platform, not bolted on. When VR environments respect autonomy, safety, and representation, they become places where more people can participate fully and confidently. The future of XR depends on making these environments immersive and inclusive.

Video highlights

Watch the presentation

Resources

Bio

Thomas Logan has spent over twenty years helping organizations design and implement technology solutions that are accessible to people with disabilities. Throughout his career, he has led projects for federal, state, and local government agencies, as well as private sector organizations ranging from startups to Fortune 500 companies.

He is the founder and owner of Equal Entry, whose mission is to contribute to a more accessible world. Equal Entry advances this mission by providing training, education, and accessibility audits across websites, desktop and mobile applications, games, and virtual reality. The company partners with organizations that build digital technologies to ensure accessibility is embedded from the start.

FAQ

What is XR accessibility?

XR accessibility refers to making virtual, augmented, and mixed reality experiences usable by people with disabilities. It includes visual, auditory, mobility, cognitive, and sensory considerations.

Does Meta Horizon Worlds have captions?

Yes. Horizon Worlds includes automatic captions with speaker labels, though they cannot currently be moved, resized, or replaced with human-generated captions.

Is there a screen reader for the Meta Quest?

Meta introduced an experimental screen reader that uses controller gestures to navigate menus. It is still early and has known usability issues.

Can VR be used while seated?

Yes. Most modern VR experiences, including Horizon Worlds, support seated-mode design to improve safety and accessibility.

How do people with limited mobility interact with VR objects?

Meta includes extended grab distance, allowing users to pull objects toward themselves without bending or reaching. Developers can also support voice input, haptics, and alternative controllers.

Can avatars represent disabilities in Horizon Worlds?

Only limited options exist today, such as cochlear implants. Mobility devices, limb differences, and assistive tools are not yet supported.

Beyond Manual Audits: How Automation Strengthens Accessibility
https://equalentry.com/accessibility-audits-automation/
Wed, 14 Jan 2026 20:30:19 +0000

Accessibility issues often appear quietly: After a plugin update, a new comment, or a small content change no one notices. Manual audits catch the big, complex problems, but automated monitoring tools help spot the everyday issues that slip through. In this example, a simple reader comment created an invisible empty link that would confuse anyone using assistive technology. Tools can catch these hidden problems quickly, while people and processes ensure they get fixed. Strong accessibility comes from combining all three: human judgment, consistent workflows, and continuous automated monitoring to keep websites usable, accessible, and trustworthy.

The post Beyond Manual Audits: How Automation Strengthens Accessibility appeared first on Equal Entry.

Accessibility breaks most often in the quiet moments: After a plugin update, after a new comment, after a well‑intentioned content edit that introduces something no one notices. That’s why accessibility must be part of ensuring a website’s quality, and why a site monitoring tool belongs in every company’s website toolbox.

Website teams rely on automated monitoring tools every day to track broken links, SEO performance, and content quality. Accessibility belongs in the same category as critical site‑quality checks. While manual audits remain the primary solution for identifying complex barriers, automated monitoring plays a key role in detecting issues that appear between those deeper reviews.

At Equal Entry, our standard consulting process recommends comprehensive manual audits twice a year. That level of rigor matters because websites are living systems. New content, content management system (CMS) upgrades, plugin updates, and user‑generated comments can introduce accessibility issues at any time, even on sites with mature accessibility practices.

We hold ourselves to the same expectations we set for clients. Our six‑step accessibility auditing process is thorough. The remediation of issues identified in the audit means the site has addressed critical accessibility issues. However, organizations need a way to continuously check the site to account for the new content being added through dynamic site updates.

That’s where automation strengthens the process. To bridge the gap between manual audits, we use DubBot to monitor our site continuously. Paired with expert manual reviews, this continuous monitoring helps ensure accessibility remains a core part of overall site quality.

DubBot spots a problem

One day, DubBot surfaced an issue we wouldn’t have caught through manual reviews alone. A reader left a perfectly standard comment: “Thank you, you wrote a great article.” Sighted users saw only that sentence. But behind the scenes, the comment included an empty link.

DubBot flagged it because the link had no accessible name. A screen reader would announce it simply as “link,” offering no context and no purpose. That’s confusing for users and a clear accessibility failure.

Here’s what appeared on the page:

Thank you, you wrote a great article.

What the HTML showed:

Thank you, you wrote a great article.

<a href="https://spam-website..com/" rel="nofollow ugc"></a>

WCAG failures

This single comment caused the page to fail two WCAG 2.2 success criteria:

  • 2.4.4: Link purpose (in context)
  • 4.1.2: Name, role, value

WCAG 2.4.4 Link purpose (In context)

WCAG SC 2.4.4 requires that every link include descriptive text so users understand where it leads. An empty link is like a door with no sign. You can see the door but you have no idea what’s on the other side.

On a website, that uncertainty becomes a barrier. Users relying on assistive technologies need clear, descriptive link text to navigate confidently.

WCAG 4.1.2 Name, role, value

WCAG 4.1.2 requires that every user interface component contain a programmatically determinable:

  • Name: What the element is called (its label).
  • Role: What type of element it is (link, button, checkbox).
  • Value: Its current state (on/off, checked/unchecked).

A simple way to think about this is a remote control:

  • The label “Play” is the name.
  • The fact that it’s a pressable button is the role.
  • Whether the video is currently playing or paused is the value.

Interactive elements on a website, like links and buttons, need the same clarity. An empty link is like a mystery button with no label. You can press it, but you have no idea what it does. Without an accessible name, assistive technologies cannot communicate the link’s purpose to users, which creates an inaccessible experience.
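In HTML terms, the three properties map onto markup. This is a generic illustration, not the markup from the spam comment:

```html
<!-- Name: "Play" (the text assistive technology announces)
     Role: button (conveyed by the element type)
     Value: aria-pressed exposes the current toggle state -->
<button type="button" aria-pressed="false">Play</button>

<!-- The empty link has a role (link) but no name,
     so a screen reader can announce only "link": -->
<a href="https://example.com/"></a>
```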

Why these accessibility guidelines matter

When a link has no accessible name, assistive technology users hear only “blank” or “link with no context.” Keyboard users may tab to something that appears invisible. And vague link text like “click here” doesn’t help anyone. No one knows where it leads.

It’s the digital equivalent of walking through a hallway and finding a door labeled “open me.” You might end up in a library, a gym, or the wrong meeting room entirely. Clear labels, such as “library,” “gym,” “meeting room,” give people the information they need before they take action. Links work the same way.

The role of automation in accessibility

Automated tools like DubBot don’t replace manual audits, but they play a critical role in continuous monitoring. Websites change constantly, and automation helps surface hidden issues the moment they appear, keeping sites more resilient between scheduled reviews. When used thoughtfully, automation becomes an essential part of a mature accessibility strategy.

Automated tools aren’t perfect, as you can learn from our comparison of automated testing tools for digital accessibility. While they can miss issues or flag false positives, they provide an additional layer of protection.
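To see what this kind of check involves, here is a minimal sketch of an empty-link detector. It is not DubBot’s implementation; it uses only Python’s standard-library HTMLParser and treats visible text, an aria-label, or an image’s alt text as an accessible name:

```python
from html.parser import HTMLParser

class EmptyLinkChecker(HTMLParser):
    """Flags <a> elements with no accessible name: no link text,
    no aria-label, and no img alt text inside the link."""

    def __init__(self):
        super().__init__()
        self.flagged = []   # hrefs of links with no accessible name
        self._open = None   # [href, has_name] for the current <a>

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "a":
            # An aria-label counts as an accessible name.
            has_label = bool((attrs.get("aria-label") or "").strip())
            self._open = [attrs.get("href") or "", has_label]
        elif tag == "img" and self._open and (attrs.get("alt") or "").strip():
            self._open[1] = True   # alt text names the link

    def handle_data(self, data):
        if self._open and data.strip():
            self._open[1] = True   # visible link text names the link

    def handle_endtag(self, tag):
        if tag == "a" and self._open:
            href, has_name = self._open
            if not has_name:
                self.flagged.append(href)
            self._open = None

checker = EmptyLinkChecker()
checker.feed('Thank you. <a href="https://example.com/" rel="nofollow ugc"></a>')
print(checker.flagged)  # → ['https://example.com/']
```

A real monitoring tool does far more (crawling, scheduling, deduplication, reporting), but the core WCAG 2.4.4 / 4.1.2 check reduces to one question: does the link expose any name at all?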

Accessibility isn’t about checking boxes. It’s about ensuring every user can navigate with confidence. Empty links, vague labels, and unnoticed errors erode that experience. By pairing expert manual audits with automated monitoring, organizations can stay ahead of problems and maintain a consistently usable site for everyone.

“Some of our most successful customers start with a manual accessibility audit, then use DubBot to continuously monitor their sites to help stay in compliance,” says Blaine Herman, Founder of DubBot.

Why people, processes, and technology all matter

Automation can surface problems, but detection alone doesn’t improve accessibility. Many monitoring tools stop at scanning and reporting, leaving teams with a growing list of issues and no clear path to resolution. A mature accessibility practice needs more than alerts. It needs a system that connects the right people to the right problems at the right time.

That’s where DubBot’s workflow design stands out. Instead of generating static reports, DubBot makes it possible to assign each issue to a specific person who can investigate, verify, and close it. That accountability loop turns automated findings into actionable work. It ensures that problems don’t linger in dashboards or spreadsheets. They move through a process until they’re resolved.

This combination of people, processes, and technology is what keeps accessibility sustainable. Automation catches the unexpected. Processes ensure issues are tracked and managed. People bring the judgment and expertise to fix what the tools uncover. Together, they create a system that doesn’t just identify accessibility barriers. It removes them.

The takeaway: Accessibility is ongoing. Every link needs a clear purpose. Every button needs a clear role. Every user deserves a clear path forward.

What sustained accessibility really requires

Accessibility breaks most often in the quiet moments: After a plugin update, after a new comment, after a well‑intentioned content edit that introduces something no one notices. That’s why mature teams don’t treat accessibility as a project milestone. They treat it as part of site quality, the same way they treat uptime, SEO health, and security patches. The work doesn’t stop because the web doesn’t stop changing.

Manual audits provide the depth: The human judgment, the context, the nuance that automation can’t replicate. But automation provides the vigilance. It surfaces the issues that appear between audits, the ones introduced by everyday activity, the ones no one would think to look for. When those two approaches work together, organizations aren’t just “checking for compliance.” They’re proactively protecting the user experience.

The real goal is stability. Every link should communicate its purpose. Every interactive element should expose its role. And every user should be able to move through a site without guessing what will happen next. That level of clarity doesn’t come from a single audit or a single tool. It comes from a system of processes, people, and technology. They’re designed to catch issues early, fix them quickly, and keep the site trustworthy over time.

Do you need an accessibility audit and a VPAT / ACR?

Is it time to update your VPAT / ACR and WCAG conformance statements? We can help. We do accessibility audits and VPAT reviews. If you’re not sure about these or want more info, contact us.

The post Beyond Manual Audits: How Automation Strengthens Accessibility appeared first on Equal Entry.

From Reactive to Proactive: Building a Sustainable Accessibility Program https://equalentry.com/accessibility-program-sustainability/ https://equalentry.com/accessibility-program-sustainability/#respond Tue, 02 Dec 2025 20:14:31 +0000 https://equalentry.com/?p=197758 At an A11yNYC meetup, Lina Trifon, Senior Product Manager of Accessibility, shared practical strategies for building sustainable accessibility programs. Drawing from lived experience and product leadership, Lina outlined how to use the W3C Accessibility Maturity Model and a dual-track approach (reactive and proactive) to move organizations from awareness to integration. The conversation emphasized dignity, inclusion, and realistic steps for embedding accessibility into workflows.

The post From Reactive to Proactive: Building a Sustainable Accessibility Program appeared first on Equal Entry.

This article is based on Lina Trifon’s talk at A11yNYC on how to build a sustainable accessibility program using dual-track strategies and the W3C maturity model.

Understanding the challenge

Accessibility programs often begin with good intentions but quickly run into barriers. Lina named the most common ones:

  • Overwhelming technical debt.
  • Limited resources.
  • Competing priorities.
  • Lack of buy-in.

These challenges are familiar to anyone who’s tried to push for accessibility in a product-driven environment.

She emphasized that buy-in isn’t just a leadership issue. Resistance can come from any level of an organization, especially when accessibility is seen as extra work rather than an essential quality. That’s why it’s important to start with clarity by understanding where your organization stands before deciding how to move forward.

To do this, Lina recommends using the W3C Accessibility Maturity Model. It’s a framework that helps teams assess their current state and set realistic goals. The model includes four levels:

  1. Inactive
  2. Launch
  3. Integrate
  4. Optimize

Using the W3C maturity model

Most organizations Lina works with are in the “launch” phase. That means they’ve started thinking about accessibility but haven’t yet embedded it into their workflows. Her goal is to help them reach the “integrate” phase, where accessibility becomes part of everyday practice.

The W3C model includes seven criteria, but Lina focuses first on two: the software development lifecycle and knowledge/skill building. These areas offer the most leverage for product teams and are often the easiest to influence early on.

She referenced a resource from the North Carolina Department of Public Instruction, which includes a self-assessment rubric and dynamic recommendations for each maturity level.

The dual-track strategy: reactive and proactive

To move from launch to integrate, Lina uses a dual-track strategy. The reactive track addresses legacy issues. The proactive track embeds accessibility into future work. Both are necessary for long-term success.

Reactive work often starts with an audit. Lina prefers external, manual audits, especially in regulated industries like education and healthcare, because they carry more weight with stakeholders. Manual testing also catches more issues and reflects real user experience.

Once issues are identified, prioritization is key. Lina recommends considering frequency of use, severity of impact, and whether a feature is about to be rebuilt. This helps teams avoid wasting effort on soon-to-be-retired components.
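Lina’s three factors can be sketched as a simple scoring heuristic. The multiplicative weighting, the 1–5 scales, and the example issue names below are illustrative assumptions, not a formula from the talk:

```python
# Illustrative only: scales and weighting are assumptions for this sketch.
def priority_score(frequency, severity, slated_for_rebuild):
    """Rank an accessibility issue from 1-5 frequency and severity scores,
    plus a flag for features about to be rebuilt or retired."""
    if slated_for_rebuild:
        return 0  # don't spend effort on soon-to-be-retired components
    return frequency * severity

issues = [
    ("checkout form missing labels", priority_score(5, 5, False)),
    ("legacy report low contrast", priority_score(2, 3, True)),
    ("footer link ambiguous text", priority_score(4, 2, False)),
]
issues.sort(key=lambda pair: pair[1], reverse=True)
print([name for name, _ in issues])
# ['checkout form missing labels', 'footer link ambiguous text',
#  'legacy report low contrast']
```

Even a rough ranking like this makes the trade-offs visible and keeps teams from fixing components that are about to disappear.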

Remediation and component libraries

When it’s time to fix issues, batching them by component or user flow is more efficient than tackling them one by one. Lina also recommends leveraging component libraries, which are collections of reusable UI elements like buttons and forms. Fixing a component once can improve accessibility across the entire product.

However, she cautions that component libraries aren’t magic. Teams still need to verify implementation and ensure updates are applied consistently. In organizations without a design system, Lina has created shared decision logs to document accessibility choices and guide future work.

Documentation is critical. It helps teams avoid repeating mistakes and ensures that accessibility decisions are visible across roles.

Shifting left with proactive strategies

Reactive work is necessary, but it’s not enough. Without proactive strategies, teams will keep generating new accessibility debt. That’s why Lina focuses on shifting left. This means embedding accessibility into the earliest stages of product development.

Training is the first step. Lina recommends both general and role-specific training and stresses that learning should be ongoing. One-time workshops alone don’t cut it. People need to practice accessibility to retain it.

Next, she suggests creating role-based guidelines and checklists. These reduce friction and make accessibility easier to implement. Checklists aren’t perfect, but they help teams build habits and clarify expectations.

Making accessibility part of the process

The final step is process integration. Accessibility should be required, not optional. Lina recommends adding checkpoints to design reviews, code reviews, and QA testing. She also encourages design annotations that clearly communicate accessibility requirements.

To sustain progress, accessibility must be part of performance reviews, KPIs, and job descriptions. Treat it like security. In other words, non-negotiable and essential to quality. This helps shift the mindset from “extra work” to “core responsibility.”

These strategies worked at Lina’s previous company. Accessibility became normalized, and the dedicated accessibility team was eventually disbanded because the work was fully embedded.

Building buy-in and community

Getting buy-in is often the hardest part. One of the most effective tips is showing short video clips of users with disabilities interacting with your product. Real feedback builds empathy and helps stakeholders understand the impact.

Legal risk can also motivate change, especially in regulated industries. But it’s not a sustainable motivator. Shame and fear may spark action, but long-term success comes from pride, ownership, and inclusion.

The A11yNYC event itself modeled best practices. It included live captioning, ASL interpretation, accessible seating, and gender-neutral restrooms. The community emphasized iteration, feedback, and shared learning.

Embedding accessibility for the long haul

Building a sustainable accessibility program takes more than audits and checklists. It requires cultural change, practical tools, and human-centered design. Lina Trifon’s approach offers a clear path forward: assess honestly, act strategically, and embed accessibility into every layer of the organization. The A11yNYC community continues to lead by example, showing that accessibility is possible and essential.

What organizations can do next

  • Use the W3C Accessibility Maturity Model to assess your current state.
  • Start small within your role to raise awareness and ask questions.
  • Prioritize manual testing and real user feedback.
  • Build accessibility into KPIs, job descriptions, and performance reviews.
  • Document decisions and share knowledge across teams.
  • Treat accessibility as a core quality metric, not a compliance checkbox.

Video highlights

Watch the presentation

Bio

Lina Trifon (she/they) is the Senior Product Manager of Accessibility at Ellevation Education. Lina is passionate about creating inclusive and equitable experiences for all users. In her role, Lina develops and executes an accessibility strategy that builds a proactive, inclusive culture across the organization.

Lina’s work ranges from conducting audits and prioritizing what needs to be fixed to training teams and building new workflows that make accessibility a natural part of how the research and development team operates.

Resources

Frequently asked questions about sustainable accessibility programs

What is the W3C Accessibility Maturity Model?

The W3C Accessibility Maturity Model is a framework that helps organizations assess and improve their accessibility practices. It defines four levels of maturity:

  • Inactive: No accessibility awareness or program in place.
  • Launch: Initial efforts exist but are disorganized.
  • Integrate: Accessibility is embedded into workflows and processes.
  • Optimize: Accessibility is fully normalized and part of the company culture.

Organizations can use this model to identify gaps, set realistic goals, and track progress across seven key criteria, including development lifecycle, procurement, and skill building.

What does a dual-track accessibility strategy mean?

A dual-track accessibility strategy combines two approaches:

  • Reactive track: Fixes existing accessibility issues, often uncovered through audits.
  • Proactive track: Embeds accessibility into future workflows to prevent new issues.

This strategy helps organizations address legacy barriers while building sustainable, inclusive practices. It’s especially useful for teams starting at the “launch” phase of the maturity model.

Why is manual accessibility testing important?

Manual testing is essential because automated tools only detect a small percentage of accessibility issues, typically around 20 to 30%. Manual testing involves real people using assistive technologies like screen readers or keyboard navigation to identify barriers that automated scans miss.

Manual audits are especially valuable in regulated industries like healthcare, education, and finance, where credibility and thoroughness matter. They also provide richer insights into user experience and usability.

How can organizations prioritize accessibility issues after an audit?

Lina recommends prioritizing based on three factors:

  • Frequency of use: Focus on high-traffic pages or features.
  • Severity of impact: Address blockers that prevent task completion.
  • Product roadmap: Avoid fixing features that are being retired or rebuilt.

This approach helps teams avoid overwhelm and focus on changes that deliver the most impact for users.

What are component libraries and why do they matter for accessibility?

Component libraries are collections of reusable UI elements, such as buttons, forms, and icons, that developers use to build digital products. Fixing accessibility in a component library can improve multiple areas of a product at once.

However, updating the library isn’t enough. Teams must also verify that components are implemented correctly across the product. Lina recommends early remediation of shared components to maximize efficiency and reduce future debt.

How can accessibility be embedded into product development workflows?

Embedding accessibility means making it part of every stage of development. Lina suggests:

  • Adding accessibility checkpoints to design reviews, code reviews, and QA testing.
  • Using design annotations to communicate accessibility requirements.
  • Creating role-specific checklists and best practices.
  • Including accessibility in KPIs, job descriptions, and performance reviews.

This shift helps normalize accessibility and ensures it’s treated as a core quality metric.

What’s the best way to get stakeholder buy-in for accessibility?

One of the most effective strategies is showing real user feedback. Recording short clips of users with disabilities interacting with your product can build empathy and highlight barriers in a tangible way.

Legal risk is another motivator, especially in industries with compliance requirements. However, long-term buy-in comes from aligning accessibility with business goals, customer experience, and team pride.

Can accessibility programs ever stop doing reactive work?

Yes, but only when proactive practices are fully embedded. Lina shared that at a previous company, her team completed most reactive work and normalized accessibility across workflows. As a result, the dedicated accessibility team was disbanded, not because the work ended, but because it became everyone’s responsibility.

This outcome is bittersweet. It reflects success, but also requires ongoing vigilance to ensure accessibility remains a priority.

Beyond the Page: How an Inclusive Library Model Is Redefining Accessible Literacy https://equalentry.com/assistive-technology-print-disabilities/ https://equalentry.com/assistive-technology-print-disabilities/#respond Tue, 04 Nov 2025 15:09:53 +0000 https://equalentry.com/?p=197405 Most people take reading for granted. But for those with print disabilities, access to books, news, and learning materials can be life-changing. The Andrew Heiskell Braille and Talking Book Library transforms that access into reality, offering free talking books, Braille materials, digital downloads, and personalized tech support for anyone who can't process standard print. Through tactile arts labs, assistive technology coaching, and inclusive programming, the library empowers patrons to read, create, and connect on their own terms. They prove that literacy is about freedom.

The post Beyond the Page: How an Inclusive Library Model Is Redefining Accessible Literacy appeared first on Equal Entry.

]]>
This article is based on Chancey Fleet and Shane Smith’s “Literacy Beyond Print: Accessible Reading, Tech and Graphics at the New York Public Library” talk at A11yNYC. They explain the purpose of the Andrew Heiskell Braille and Talking Book Library, who it serves, and how tech and tactile graphics expand literacy beyond print for blind, low-vision, and print-disabled readers.

Most people never think twice about reading. You pick up a book, scroll a screen, scan a label, and move on. But for millions of Americans, those simple acts are blocked by barriers that have nothing to do with interest or intellect. They’re blocked by print.

The Andrew Heiskell Braille and Talking Book Library removes those barriers and reimagines what literacy looks like when access is the starting point, not the afterthought.

This isn’t a story about accommodations. It’s about infrastructure. From tactile maps and talking book players to long-term loan Braille e-readers and personalized tech coaching, Heiskell’s model centers dignity, autonomy, and joy. It’s a blueprint for inclusive design that works for every institution trying to serve with respect.

When print becomes a barrier

Reading is often treated as a universal skill, something everyone can do given time, interest, or education. But that assumption erases a critical reality: Millions can’t access printed text at all because the format is inaccessible. This is known as a print disability.

A print disability isn’t a diagnosis. It’s a functional barrier. It means someone can’t read standard print due to blindness, low vision, mobility disabilities that make it hard to hold or turn pages, or cognitive and learning disabilities that affect how text is processed. These barriers show up in everyday moments: Trying to read a prescription label, follow a recipe, fill out a form, or enjoy a book. And they’re often invisible to those who don’t face them.

The consequences are far-reaching. When reading is inaccessible, so is education, employment, civic participation, and leisure. That’s why literacy must be understood as infrastructure and a public right. And that’s where the Andrew Heiskell Braille and Talking Book Library comes in.

In addition to offering books in alternative formats, Heiskell reimagines the entire reading experience. This includes everything from tactile design and digital access to personalized support and community programming. It’s a case study in what happens when access is the foundation.

A library engineered for access and equity

While The Andrew Heiskell Braille and Talking Book Library is a branch of the New York Public Library, it’s also a designated regional library for the National Library Service for the Blind and Print Disabled (NLS). NLS is a Library of Congress program.

That dual identity gives it a rare kind of reach. It serves local patrons while also supporting readers across all five boroughs and Long Island through its NLS designation.

Although the five boroughs are in NYC, not all are served by NYPL. Heiskell bridges that gap by functioning as both a local and regional provider. That’s what makes it unique. It’s not just a neighborhood library; it’s a regional hub for accessible literacy.

It provides free access to talking books, Braille materials, digital downloads, and assistive technology for anyone who can’t use standard print. That includes people who are blind, low vision, mobility disabled, or who have reading disabilities that affect how they process text. And it’s not limited to individuals. Schools, senior centers, service agencies, and other institutions can register to support their communities directly.

What makes Heiskell stand out is how it delivers. Materials are mailed postage-free, devices are loaned long-term, and digital access is barrier-free. Patrons can personalize their preferences, download books instantly, and receive tech coaching without gatekeeping. It’s a model built on dignity, not bureaucracy.

And it’s scalable. Every state has at least one NLS-designated library. But Heiskell’s approach, blending tactile creation, community programming, and personalized support, offers a blueprint for what inclusive public services looks like when they’re designed with users at the center. The library is an infrastructure for equity.

Tools that redefine how we read

At the Andrew Heiskell Library, access is tactile, audible, and personal. Patrons receive free talking book players designed for ease of use, with large tactile buttons, built-in speakers, and no internet required. Books arrive by mail on cartridges, and users can listen at their own pace, with no due dates and no penalties.

For Braille readers, the library offers long-term loan Braille e-readers, a major shift made possible by a change in federal law. These devices allow patrons to download books directly from Braille and Audio Reading Download (BARD) and read them on refreshable Braille displays, giving patrons autonomy. They choose what to read, when they read it, and how they read it.

Patrons can also set genre preferences and receive automatic shipments of new titles, removing friction from the reading experience. Whether it’s tactile, digital, or audio, the tools are built around the reader instead of the system.

Digital access without the digital divide

Patrons at the Heiskell Library can download thousands of titles through BARD, the Braille and Audio Reading Download service from the Library of Congress. It works on mobile devices and computers. Users can choose their own books, download them instantly, and read at their own pace. There are no due dates and no limits.

Staff walk patrons through the setup process, help them install the app, and make sure they’re ready to use it independently. For those who prefer desktop access, BARD Online offers the same collection through a web browser.

Heiskell also connects users to Bookshare and NFB Newsline. These platforms provide access to newspapers, magazines, and academic texts in formats compatible with screen readers and Braille displays. All services are free to eligible users and designed to meet a range of reading needs. Every tool is chosen for usability. Every step is supported.

Coaching, creation, and community

At Heiskell, support starts with conversation. Staff offer one-on-one tech coaching to help patrons navigate BARD, Bookshare, screen readers, and mobile apps. The goal is practical independence by getting readers to the point where they can choose, download, and enjoy content without barriers.

The Dimensions Lab expands that mission into tactile creation. It gives people a place to create accessible tactile materials, such as tactile graphics, raised-line drawings, and 3D prints.

Patrons collaborate with staff to produce custom tactile graphics, maps, and educational tools. These aren’t off-the-shelf solutions. They’re built to meet specific needs, from classroom diagrams to museum layouts. Blind makers and designers also use the space to prototype their own projects, with support from staff.

Workshops and meetups bring people together across disability and design communities. Whether it’s building a tactile voting guide or exploring 3D printing, the focus is on shared problem-solving. The library is a place where people build tools, skills, and relationships that last.

What you can do

Tell readers who use audiobooks or large print, as well as teachers and community groups, about library services for people with print disabilities. If you work in education, healthcare, tech, or public service, you already serve people with print disabilities. The question isn’t whether they’re in your audience. It’s whether your systems let them in.

Audit your materials. Are your forms, flyers, and digital content accessible to someone who can’t see, hold, or process standard print? If not, you’re creating barriers. These can be fixed with the right tools and mindset.

Refer eligible individuals and institutions to the Andrew Heiskell Library. Help them apply for free access to talking books, Braille materials, and tech coaching. Consider having a talking book player at your institution.

If they live outside NYC or Long Island, connect them to their state’s NLS regional library. NYPL’s virtual programs are open to anyone, anywhere, and out-of-town visitors are welcome to stop by.

Support the model. Volunteer, donate, or advocate for tactile creation labs, long-term device loans, and friction-free digital access.

Video highlights

Watch the presentation

Bio

Chancey Fleet is the Assistive Technology Coordinator, and Shane Smith is the Managing Librarian, at NYPL’s Andrew Heiskell Braille and Talking Book branch.

Resources

Frequently asked questions

What is a print disability?

A print disability is any condition that prevents someone from reading standard printed text. This includes blindness, low vision, and mobility disabilities that make it hard to hold or turn pages, and some cognitive or learning disabilities.

Who qualifies for services at the Andrew Heiskell Braille and Talking Book Library?

Anyone who is a New York City or Long Island resident with a print disability, whether due to vision, physical mobility, or reading challenges, can apply. Institutions like schools, senior centers, and service agencies can also register to support their communities.

What kinds of materials are available?

The library offers talking books, printed Braille, digital Braille, and access to over 160,000 audio titles. Materials can be mailed or downloaded, and patrons can personalize their preferences for genre, format, and delivery.

How do I apply or refer someone?

Individuals and institutions can apply online at talkingbooks.nypl.org. Certification can be provided by a wide range of professionals, including doctors, educators, social workers, and librarians.

What is BARD and how does it work?

Braille and Audio Reading Download (BARD) is a free app and web platform that gives eligible patrons access to the library’s digital collection. Users can download books to mobile devices or specialized players, with no due dates or limits.

What is NLS?

NLS stands for the National Library Service for the Blind and Print Disabled, a division of the Library of Congress. It runs a free national library program that provides Braille and audio materials to people who can’t read standard print due to visual, physical, or reading disabilities.

Here’s what makes NLS valuable.

  • It partners with a network of regional libraries across all 50 states, including the Andrew Heiskell Library in NYC.
  • Materials are mailed postage-free under the official USPS designation “Free Matter for the Blind or Handicapped,” though many organizations now prefer more inclusive language like “print disabilities.”
  • Patrons can also download books instantly using the BARD mobile app or web platform.
  • All content is exempt from copyright restrictions, so users can keep books as long as they want.

It’s not just a library. It’s a national infrastructure for accessible literacy.

Can I volunteer or support the library?

Yes. Volunteers help record audiobooks, assist with tech coaching, and support events. Tech professionals, educators, and accessibility advocates are especially encouraged to get involved.

Got Data, Now What? Storytelling Through Accessible Design https://equalentry.com/got-data-now-what-storytelling-through-accessible-design/ https://equalentry.com/got-data-now-what-storytelling-through-accessible-design/#respond Tue, 21 Oct 2025 18:19:34 +0000 https://equalentry.com/?p=197326 Based on Dr. Angela Young’s A11yNYC talk, this article explores how inclusive design transforms raw data into stories everyone can understand and act on. They unpacked the hidden costs of inaccessible dashboards, explained how to design for access from the start, and showed how storytelling can make complex data clear and memorable. The takeaway: Accessible data design is more than compliance. It’s a strategy for clarity, equity, and trust. When organizations build accessibility into every chart, caption, and narrative, they turn information into insight and ensure that every voice can participate in decision-making.

The post Got Data, Now What? Storytelling Through Accessible Design appeared first on Equal Entry.

This article is based on Dr. Angela Young’s talk at A11yNYC, which explored how accessibility transforms data from a static report into a shared story that drives understanding and action.

Organizations collect more data than ever before. Yet many teams struggle to turn those numbers into insights everyone can use. Angela tackled one of the most persistent challenges in digital communication: How to make data meaningful, inclusive, and equitable.

Their approach was practical and grounded in lived experience. They didn’t dwell on theory or compliance checklists. Instead, they focused on real-world friction. How inaccessible design decisions, often made with good intentions, quietly reinforce exclusion. The message was clear: If your data isn’t accessible, it isn’t complete.

The hidden cost of inaccessible dashboards

Data visualization can make or break understanding. Charts, dashboards, and infographics are supposed to simplify complexity. But when designed without accessibility, they obscure meaning and exclude people. Angela described this as a silent problem. When only a few can interpret the data, decision-making turns into an exclusionary exercise.

Too often, teams prioritize aesthetics, vibrant colors, sleek visuals, and interactive filters, without realizing those choices can become barriers. Flattened images of charts, hover-only filters, and color-coded categories without labels might look polished, but they silently tell some users: This data isn’t for you.

Angela shared a vivid example. In one project, a data dashboard was created as a single static image. It seemed harmless until a colleague with low vision couldn’t access it. Suddenly, an entire perspective was missing from the conversation. The solution was simple education and redesign, but the takeaway was deeper. Accessibility isn’t a technical fix. It’s a structural responsibility.

When data isn’t accessible, you’re leaving the decision-making to only a portion of the available audience. That isn’t equity.

They also highlighted cognitive overload as another common pitfall. Dashboards overflowing with charts and metrics can overwhelm users, especially those with cognitive or learning disabilities. Cluttered layouts, unlabeled filters, and poor navigation increase cognitive effort and fatigue.

Designing for access from the start

Angela urged teams to rethink how they approach accessibility. It shouldn’t be bolted on after launch. It should be integrated from the first sketch. “Accessibility isn’t perfection. It’s vigilance,” they said. “It’s the ongoing habit of checking assumptions.”

They shared several principles for designing accessible data experiences from the beginning:

  • Use sufficient color contrast. Follow WCAG standards, maintaining at least a 4.5:1 ratio for text and visuals.
  • Label everything clearly. Descriptive axis titles and legends are essential; avoid placeholders like “Series 1.”
  • Don’t rely on color alone. Reinforce meaning with patterns, textures, or direct labels.
  • Keep text legible. Sans-serif fonts in 12–14 points work well, and all caps should be avoided for readability.
  • Make interactions accessible. Filters and dashboards should be keyboard navigable and screen reader compatible.
  • Write meaningful alt text. Describe insights, not just shapes or colors.
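The contrast guidance in the first bullet can be checked programmatically. The sketch below implements the WCAG 2.x relative-luminance and contrast-ratio formulas in Python; the sample colors are arbitrary examples, not values from the talk:

```python
def relative_luminance(rgb):
    """WCAG 2.x relative luminance from 0-255 sRGB channel values."""
    def channel(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio between two colors; WCAG AA body text needs >= 4.5."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # black on white: 21.0
print(contrast_ratio((118, 118, 118), (255, 255, 255)) >= 4.5)  # #767676 on white passes AA
```

Running a check like this against a design system’s color tokens turns the 4.5:1 rule from a manual review item into an automated gate.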

In practice, this means designing for comprehension, not just presentation. Accessibility is design that anticipates difference rather than reacting to it. Each of these ensures that data is usable by more people and that no single group holds the keys to understanding. Accessibility transforms data from something presented to something shared.

Simplifying for meaning

Complexity doesn’t automatically create clarity. Angela urged teams to resist the temptation to display every data point. Overloaded dashboards filled with decimals, percentages, and long tables make it harder for users to find the story behind the numbers.

They recommended focusing on the trends that matter most. Summaries and key callouts help people understand the message quickly. A brief statement like “Revenue grew 17% last quarter” is often more effective than a page of raw figures. The goal is to guide readers toward insights, not overwhelm them with details.

Whitespace supports this clarity. Many designers treat empty space as wasted real estate, but it functions as visual breathing room that helps users separate ideas and process information. Accessible dashboards aren’t about flash. They’re about comprehension.

Writing alt text that tells the story

Angela emphasized that alt text should communicate meaning, not just describe visuals. Too often, alt text for charts lists colors and shapes without explaining what they show. Instead, the description should summarize the takeaway of the visualization.

A good example would be writing, “Bar chart comparing monthly sales for Product A and Product B, showing a 20% increase for Product A in August.” This phrasing conveys the story within the data rather than its structure.

Treating alt text as part of the design process has another advantage: it forces teams to clarify what they want each chart to say. If a visualization can’t be described in one or two sentences, it may not be telling a coherent story. Alt text, then, becomes an accessibility tool and a storytelling discipline.
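Takeaway-first alt text can even be generated alongside the chart itself. The helper below is a hypothetical sketch, its name and parameters invented for illustration, that derives the headline change from the underlying numbers and reproduces the example sentence above:

```python
# Hypothetical helper: composes chart alt text that states the takeaway,
# not just the chart's shapes and colors.
def chart_alt_text(label, series_a, series_b, month, before, after):
    change = round((after - before) / before * 100)
    direction = "increase" if change >= 0 else "decrease"
    return (f"Bar chart comparing {label} for {series_a} and {series_b}, "
            f"showing a {abs(change)}% {direction} for {series_a} in {month}.")

print(chart_alt_text("monthly sales", "Product A", "Product B",
                     "August", before=100, after=120))
# Bar chart comparing monthly sales for Product A and Product B,
# showing a 20% increase for Product A in August.
```

Generating the description from the same data as the chart keeps the two from drifting apart, and if no one-sentence takeaway can be computed, that is a signal the chart itself may lack a story.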

Building structure and layout for understanding

Accessibility is about more than color and text contrast. It’s about how information is structured. Angela pointed out that even well-designed charts can fail if the layout creates friction. Legends that sit far from their corresponding visuals, filters hidden behind hover states, or dashboards that require precise mouse actions can all block access.

Good structure makes it easier to follow a narrative. Related elements should be grouped together, spacing should be consistent, and unnecessary visuals should be removed. Testing designs with real users, rather than relying solely on automated tools, reveals where confusion or barriers still exist.

Responsive design also matters. Dashboards should function on mobile devices and low-bandwidth connections and work with screen readers. Accessibility, at its core, is adaptability: creating experiences that work for everyone, not just the average user.

Turning data into stories

People remember stories far better than they remember raw numbers. Angela’s framework for accessible storytelling brings together three layers that make data more human:

  • Visual layer: Colors, labels, and structure that make information perceivable.
  • Language layer: Captions, summaries, and alt text that make it understandable.
  • Interaction layer: Controls, filters, and exports that make it operable.

When these layers work in harmony, data transforms from an abstract display into a shared narrative. Plain-language summaries, consistent phrasing across visuals, and clear headlines all help readers find meaning faster.

Effective data storytelling does more than make numbers emotional. It also makes them relatable. By organizing visuals in a way that mirrors how people read and process information, teams can help audiences understand complex insights without needing specialized training.

Seeing data as dialogue

Angela described data as a conversation rather than a monologue. A dashboard needs to do more than display information to be consumed. It should invite interaction, reflection, and collaboration. When data is accessible, it encourages participation from everyone, not just those who are already comfortable interpreting charts.

This shift from presentation to dialogue democratizes knowledge. Instead of a small group of analysts or designers holding the keys to interpretation, accessibility opens the conversation to more voices. When more people can access and understand the data, the insights become richer and more representative.

Accessible storytelling also reduces friction. A clear narrative, consistent structure, and inclusive design allow users to engage with the material immediately. Instead of deciphering visuals, they can focus on problem-solving and decision-making.

Accessibility as a signal of trust

Beyond usability, accessibility signals credibility. When organizations consistently present information in formats that everyone can understand, they build trust. Users come to expect clarity and inclusion, and that expectation strengthens engagement.

Inaccessible dashboards, on the other hand, concentrate knowledge in the hands of a few “data gatekeepers.” Those who can interpret complex visuals end up making decisions for everyone else. Accessible design removes that imbalance. It makes transparency and participation part of the organization’s culture.

More than a compliance requirement, accessibility is a demonstration of respect for all users. It shows that every person’s ability to understand information matters equally.

Real-world friction and practical fixes

Angela’s examples illustrated that accessibility challenges are rarely about neglect. They’re about awareness. In one client project, performance charts relied entirely on red and green bars. For colorblind users, the entire message was lost. In another case, filters were available only through mouse hover actions, making them invisible to keyboard users.

These aren’t unusual mistakes. They’re common across industries. The key is how teams respond. Angela encouraged treating accessibility gaps as opportunities to learn rather than failures to be corrected. Each small improvement expands access and reduces exclusion.

They also acknowledged that accessibility work happens within constraints, such as tight schedules, competing priorities, and legacy systems. The goal isn’t perfection but progress. Every accessible choice, no matter how small, makes a difference.

From data to action

Angela framed accessible storytelling as a three-part narrative arc: set the stage, show the shift, and call to action.

  • Set the stage: Provide context for why the dataset matters.
  • Show the shift: Explain what changed or what patterns emerged.
  • Call to action: Identify what decisions or steps should follow.

A well-structured story presents findings and guides people toward understanding and action. For example, instead of listing engagement rates, tell the story of how accessibility improvements doubled engagement and what that means for future strategy. The data stays the same, but the story makes it memorable.

Accessibility as strategy

Angela outlined a strategy for how organizations can think, communicate, and lead more effectively while designing inclusive charts. Accessible data design fosters clarity, equity, and trust. It ensures that everyone has the same opportunity to understand and act on information.

Accessibility is empathy in process. It’s an intentional effort to include others in understanding. When data storytelling is inclusive, it reflects the diversity of the audience it serves.

In the end, accessibility isn’t a box to check or a compliance hurdle to clear. It’s the foundation for effective communication. If data is meant to drive action, then the story it tells must be one that everyone can read.

Video highlights

Watch the presentation

Bio

Dr. Angela Young (they/them) is a queer, nonbinary, multiply disabled accessibility strategist with a passion for transforming complex systems into inclusive experiences. As Lead, Enterprise Technical Accessibility Training and Awareness, Angela helps teams embed accessibility into everyday product decisions.

With a background in education, tech, and design, Angela combines practical training with systems thinking and lived experience. They specialize in demystifying accessibility for devs, designers, and data teams, reminding us that inclusion is not extra work but the real work.

Resources

  • Power BI: Microsoft’s flagship BI tool with built-in accessibility features, including keyboard navigation, screen reader support, and alt text for visuals.
  • Tableau: Powerful visualization tool; accessibility features are improving, but it requires disciplined design for compliance.
  • Excel: Often overlooked, but one of the most screen reader-friendly tools for building accessible charts and tables.
  • Flourish: Web-based visualization platform supporting alt text and responsive embeds for inclusive online charts.
  • Color Oracle: Free color blindness simulator for checking designs against multiple vision types.
  • Stark: Accessibility plugin for Figma, Sketch, and XD; checks color contrast and generates alt text guidance.
  • Storytelling with Data (Cole Nussbaumer Knaflic): Classic guide to clarity, simplicity, and storytelling with charts.
  • Data Feminism (Catherine D’Ignazio and Lauren Klein): Examines power, equity, and inclusion in how we use and present data.
  • Accessibility Guidelines for Data Visualization (WCAG 2.2): WCAG 2.2 documentation applied to data viz contexts.
  • Nightingale (Data Visualization Society): Online journal of the Data Visualization Society with many accessibility-oriented articles.
  • Tamara Munzner: Author of Visualization Analysis and Design and a thought leader in visualization theory.
  • Andy Kirk: Visualization consultant, trainer, and author of Data Visualization: A Handbook for Data Driven Design.
  • Nadieh Bremer: Award-winning data visualization designer sharing innovative approaches.
  • Stephanie Evergreen: Specialist in data reporting and visualization for clarity and accessibility.

The post Got Data, Now What? Storytelling Through Accessible Design appeared first on Equal Entry.

]]>
Accessibility Audits: Because Everyone Deserves to Stay in Focus https://equalentry.com/accessibility-audits-focus-reflow/ https://equalentry.com/accessibility-audits-focus-reflow/#respond Thu, 09 Oct 2025 19:55:26 +0000 https://equalentry.com/?p=197169 This article explains why accessibility audits are essential for websites and digital content. It shows how even small design changes can create barriers for users. The Equal Entry team identifies these issues and fixes them. It emphasizes that accessibility is an ongoing responsibility, not a one-time task. Fixing problems improves usability, protects credibility, and shows respect for all users.

The post Accessibility Audits: Because Everyone Deserves to Stay in Focus appeared first on Equal Entry.

]]>
Accessibility isn’t a one-and-done checklist. It’s a living commitment. Even teams like ours that build accessibly from the start can introduce barriers when content or design evolves. That’s why regular accessibility audits matter. They catch issues that would frustrate users and erode credibility.

In our latest audit, we uncovered two issues that could easily go unnoticed by sighted mouse users. However, they create friction for many others. One blocked some content when zoomed in. The other turned keyboard navigation into guesswork. We didn’t just flag them. We fixed them as we always do.

Here’s where some companies go wrong. They pay for an accessibility audit, but they don’t fix issues. What’s the value in an accessibility audit if they don’t address the issues?

We work with our clients to help them fix their accessibility issues. Here, we share the process of what meaningful accessibility work looks like.

Reflow

During the audit, we discovered a problem that happens when users zoom in on the page. This is a reflow issue.

At the bottom of the page, we have a newsletter signup section. However, when we zoom in, Knomo, our mascot, overlaps with the text that says, “Get free accessibility tips and news delivered to your inbox.”

How much should someone be able to zoom a page? According to the Web Content Accessibility Guidelines (WCAG) Success Criterion 1.4.10 Reflow, users should be able to zoom to the point where the width is the equivalent of 320 pixels.

The equivalent of 320 pixels means that if your browser width is 1280 pixels, you should be able to zoom by 400%.
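The relationship is simple division: the browser width divided by 320 gives the zoom factor. As a quick sketch (the function name is ours, not part of WCAG):

```javascript
// WCAG 1.4.10 Reflow: content must reflow at a width equivalent to 320 CSS pixels.
// For a given browser width, compute the zoom level (in %) that reaches
// that 320-pixel equivalent.
function requiredZoomPercent(browserWidthPx) {
  return (browserWidthPx / 320) * 100;
}

console.log(requiredZoomPercent(1280)); // 400 (a 1280px-wide window must support 400% zoom)
console.log(requiredZoomPercent(1024)); // 320
```

This is why the test procedure below fixes the window at 1280 pixels and then zooms to 400%.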

The following video explains and demonstrates.

Here’s how we tested this:

  1. Set the zoom rate to 100%.
  2. Set the window width to 1280 using Window Resizer, a Chrome extension.
  3. Zoom the page by 400%.

We look for any loss of information. Knomo overlaps with the text, which makes it hard to read.

After applying the fix, the character image and the text no longer overlapped. Now, the user can read without any overlapping content.

How to fix the reflow accessibility issue

Every solution will be different based on the code and design. Before we fixed the issue, the vertical blue line (<div class="line"></div>) connecting the numbers was absolutely positioned against the whole container (<div class="our-process">), which had position: relative.

Before the reflow fix

Here’s the HTML.

<div class="our-process"> <!-- Line container -->
  <h2>Our accessibility auditing process</h2>
  <p>Our auditing process follows careful steps:</p>
  <button onclick="expandAllItems()" id="expand-all" aria-expanded="false" aria-controls="item-1 item-2 item-3 item-4 item-5 item-6 item-7">Expand All</button>
  <ol class="list">
      [...]
  </ol>
  <div class="line"></div> <!-- line element -->
</div> <!-- End of line container -->

Here’s the CSS.

.our-process {
  [...]
  position: relative; /* Used as reference when using position absolute on child elements */
}

.our-process .line {
  [...]
  position: absolute;
  top: 290px;
  bottom: 164px;
  left: 210px;
}

After the reflow fix

The position: relative should be moved to a new div that contains only the ol and the line element.

Here’s the revised HTML.

<div class="our-process">
  <h2>Our accessibility auditing process</h2>
  <p>Our auditing process follows careful steps:</p>
  <button onclick="expandAllItems()" id="expand-all" aria-expanded="false" aria-controls="item-1 item-2 item-3 item-4 item-5 item-6 item-7">Expand All</button>
  <div class="list-container"> <!-- New container -->
    <ol class="list">
      [...]
    </ol>
    <div class="line"></div> <!-- line element -->
  </div> <!-- End of new container -->
</div>

Here’s the CSS for the fix.

.our-process .list-container {
  position: relative; /* Setting a new reference on child elements with position absolute */
}

.our-process .list-container .line {
  [...]
  position: absolute;
  top: 100px;
  bottom: 110px;
  left: 60px;
}

Now you can zoom in without any content hiding behind other elements.

Focus Visible

For all audits, we test the accessibility of navigating a website using only the keyboard. Not everyone uses a mouse. Many people rely on the keyboard. They press the Tab key to navigate through interactive elements. Instead of a mouse click, they use Enter or Space to activate the element.

In this scenario, we navigated Equal Entry’s accessibility services using only the keyboard. When you scroll down, there’s a section describing our six-step auditing process. Each step is clickable, and there’s also an “Expand all” button to reveal all the steps.

For keyboard users, it’s crucial that the current focus is always visible. This lets them know where they are on the page.

Here’s the problem. As we Tab through the steps, the focus indicator is visible on each one. But when you land on the “Expand all” button, the focus indicator disappears. The following video demonstrates the problem: the focus indicator is invisible when it should be visible.

You can still activate “Expand all” with the keyboard. However, you have no idea where the focus went. This fails Web Content Accessibility Guidelines (WCAG) Success Criterion 2.4.7: Focus Visible. This criterion requires a clear focus indicator on a keyboard-operable interface.

How to fix the focus visible accessibility issue

We fixed the focus visible issue by adding a visible focus rectangle to the “Expand all” button. Now, it’s consistent with the other interactive elements.

Before the focus visible fix

Here’s the “before” CSS, which removed the browser’s default focus indicator.

#expand-all:focus {
  border: none;
  outline: none; /* Removes default browser visual indicator */
}

After the focus visible fix

Here’s the “after” CSS, where we added the visual indicator.

#expand-all:focus {
  border: none;
  outline: none;
  box-shadow: var(--focus-border); /* Added custom focus visual indicator */
}

Try it out! Here’s the accessibility audits page. Press Tab until you get to the “Expand All” button. Then, press the spacebar or Enter key.

The importance of fixing accessibility issues found in auditing

Finding accessibility issues is only the first step. Fixing them is what protects users and your organization. When reflow breaks, users lose access to content. When focus indicators disappear, navigation turns into guesswork. These aren’t minor problems. They’re barriers that exclude people and hurt the user experience.

Unresolved issues can lead to legal exposure, brand image damage, and lost trust. But when you fix them, you show users that their experience matters. You reduce risk, improve usability, and build credibility with every resolved barrier.

That’s why our audits don’t stop at identification. We work with clients to fix what’s broken and explain why it matters. Because accessibility is more than compliance. It’s respect and better user experiences for everyone.

Ready to strengthen your accessibility efforts and compliance?

Whether you’re updating your VPAT, reviewing your ACR, or tackling WCAG conformance, we’ve got you covered. We specialize in accessibility audits and VPAT reviews that go beyond checkboxes: reducing risk, improving usability, and respecting your users.

Not sure where to start? Let’s chat.

The post Accessibility Audits: Because Everyone Deserves to Stay in Focus appeared first on Equal Entry.

]]>
Making Content Accessible for People with Limited English Proficiency https://equalentry.com/accessible-content-limited-english-proficiency/ https://equalentry.com/accessible-content-limited-english-proficiency/#comments Wed, 10 Sep 2025 16:47:59 +0000 https://equalentry.com/?p=197036 Based on Irina Morozova’s A11yNYC talk, this explores how thoughtful content and inclusive design can make digital experiences more accessible for people with limited English proficiency. She emphasizes that unclear language and poor design can turn everyday tasks into stressful barriers, eroding independence and dignity. The article offers practical strategies such as using plain language, meaningful headings, avoiding idioms and abbreviations, and designing multimedia with clarity and calmness. Irina advocates for respectful, intuitive design that empowers users and reduces cognitive load, highlighting the importance of planning for accessibility from the start

The post Making Content Accessible for People with Limited English Proficiency appeared first on Equal Entry.

]]>
This article is based on Irina Morozova‘s talk at A11yNYC. She talked about how to make content accessible through thoughtful content and inclusive design for people who aren’t fluent in English. Irina is an accessibility experience designer. She has a linguistics and teaching background.

Irina opens with an ATM example to illustrate how routine tasks can become inaccessible when design and language don’t align with the user’s needs. When you use a familiar ATM in your native language, with no one waiting behind you and no distractions, the task is simple. But remove just one element, like language familiarity or a clear interface, and the experience quickly shifts from simple to stressful.

For users with limited English proficiency, unclear instructions or poor design can erode independence in critical moments. The ATM turns into a barrier. Irina uses this to emphasize that inclusive design is essential. It preserves dignity and independence in everyday life.

Why language access matters

The main language in New York is English; hence, much of the online content is also English. A Statista graph shows that in February 2025, English was the dominant language, at almost half.

This means the other half of the people in New York do not speak English as a primary language. Therefore, almost half of the people interacting with a digital product don’t have English as their main language. If organizations don’t consider those not fluent in English, they’re potentially losing half of their customers and users.

Users with different language proficiencies vary in their skills. These skills vary by background, exposure, and context. Language speakers are not a homogeneous group. Irina uses “people with limited English proficiency” because it’s descriptive and not judgmental. It’s stating a fact without any assumptions.

Poorly designed content is a barrier for everyone, and even more so for people with limited language proficiency. It can cause cognitive overload, stress, and anxiety. How can we make content more accessible and reduce stress for speakers who aren’t fluent in English?

Inclusive content principles

The key is to use plain language written at a 7th to 9th grade reading level. Word choices and sentences are one part of it. How the text is presented also matters. This includes the use of meaningful headings, which improve scannability. Users may only be interested in one part of the page, and clear headings allow them to choose the section to read. This minimizes cognitive overload.

For example, you’re planning to go to a café. You want to know if you can bring your laptop. You find the rules, but you don’t want to read the whole page. Headings can help you find the specific section.

Spell out abbreviations and acronyms before using them. For example, APR may be a common abbreviation in the U.S. However, not everyone knows it means annual percentage rate. Spelling out the term first helps users connect it with the abbreviation or acronym.

Also, avoid using idioms, metaphors, and phrasal verbs. Phrasal verbs are expressions that combine a verb and another element, such as an adverb or a preposition. Instead of “look up,” use “search.” Not everyone knows what “snail mail” means, so using “mail” works better. “Saving for a rainy day” is an idiom. Keep it simple and use “saving.”

Design and multimedia considerations

Structure, short sentences, plain language, and word choices can get a boost when you use consistent flows, icons, and clear task instructions. Users need enough time to process. Even with good headings, the content needs to be in manageable segments.

Inclusive design means planning for error prevention and easy recovery for when a mistake happens. This is why it’s crucial to plan for accessible content and presentation early in the design process.

Like the best practices for content, there are well-known rules for audio, video, and other multimedia. These include providing subtitles, captions, and transcripts. Ensure there are pauses between sentences rather than between words in audio. Non-verbal communication also matters.

Be mindful of accent comprehension barriers. Irina gave the example of an elevator’s voice being in a Scottish accent. It made it harder for some people to understand, even if they spoke the same language. Anyone can have an accent, whether a person’s primary language is English or another language. Accents can potentially add to the cognitive load. Hence, it’s important to pay attention to accents in the media.

Another important recommendation is to avoid background audio, unneeded pop-ups, and autoplaying audio and video. These can stress the user. Focus on calm technology, which means technology needs to simplify complexities, not introduce new ones.

One key to seamless technology is to use the correct language tags. When a website has the wrong language tag, it causes the content to be pronounced incorrectly.
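In HTML, this means declaring the page’s primary language on the root element and tagging any inline passage that switches languages. A minimal sketch (the text content is illustrative):

```html
<!-- Declares the page's primary language for screen readers and translation tools -->
<html lang="en">
  <body>
    <p>Welcome! We are glad you are here.</p>
    <!-- An inline passage in another language gets its own lang attribute,
         so a screen reader switches to the correct pronunciation -->
    <p lang="es">¡Bienvenido! Nos alegra que estés aquí.</p>
  </body>
</html>
```

Without the lang="es" attribute, a screen reader would read the Spanish sentence with English pronunciation rules, adding exactly the kind of cognitive load Irina warns against.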

Empowerment through respectful design

Content and digital experiences should bring joy, dignity, and empowerment. Respectful and inclusive design consists of clear and intuitive content. This happens when you plan for accessible content and presentation early in design. Apply accessibility principles, which reduce cognitive load. Think about designing with less noise and more intention.

The resources Irina shared are in the resources section of this article. They include Web Content Accessibility Guidelines (WCAG) success criteria that support inclusive design for limited English proficiency. She also shared resources that can help with designing intuitive, inclusive, and respectful content. As you work on designing experiences, ask this question: “How can your content reduce stress for users with limited English proficiency?”

Video Highlights

Watch the Presentation

Bio

Irina Morozova is a seasoned accessibility experience designer with a background in linguistics, teaching, and UX. Her work highlights the importance of clear communication, cognitive accessibility, and culturally sensitive design that supports users with varying levels of language proficiency.

Her passion for accessible design comes from her own experiences navigating multilingual environments. This personal perspective fuels her commitment to crafting content that not only informs but also empowers users from all backgrounds.

Resources

The post Making Content Accessible for People with Limited English Proficiency appeared first on Equal Entry.

]]>
How a Blind Person Uses Social Media and AI to Drive Accessibility https://equalentry.com/accessibility-social-media-ai/ https://equalentry.com/accessibility-social-media-ai/#respond Tue, 12 Aug 2025 14:35:45 +0000 https://equalentry.com/?p=196439 James Warnken, who is legally blind, shares how his background in digital marketing led him to a career in accessibility. In this conversation, he talks about using social media to gather real feedback from people with disabilities, how AI can help bridge accessibility gaps, and why lived experience matters in designing inclusive digital spaces. He also explains the difference between real accessibility efforts and performative ones as well as how organizations can do better by listening to the community.

The post How a Blind Person Uses Social Media and AI to Drive Accessibility appeared first on Equal Entry.

]]>
https://equalentry.com/wp-content/uploads/2025/07/EPI36-James-Warnken.m4a

James Warnken traces his career path from college internships in SEO and marketing to becoming a passionate accessibility educator and consultant. A powerful turning point came when he was asked, “How do you make your websites accessible?” A question that prompted him to embrace his disability more publicly and dive into the world of digital accessibility.

In this episode, James explains how accessibility shouldn’t be siloed in organizations. He also discusses his innovative use of social media to crowdsource real-time insights from people with disabilities. These insights often contradict assumptions baked into standards and guidelines.

James highlights the importance of lived experience in shaping accessible solutions and shares how he’s used AI tools to self-accommodate, even customizing web experiences when existing overlays fall short.

Hello, everyone. This is Thomas Logan from Equal Entry here with Ken Nakata of Converge Accessibility. In this episode of Accessibility Insights, we’re talking with James Warnken, a legally blind accessibility expert, about his experiences working in the field.

Digital marketing and accessibility

Ken Nakata: I am excited to have James here too, because I’ve known him for years and he’s a fantastic guy who knows a lot about accessibility.

James, from what I know, you started in digital marketing and only later came to accessibility. How did that happen, and can you describe some of the opportunities that you see in that overlap?

James Warnken: Absolutely. I went to college for digital marketing. I have a bachelor’s degree.

But I started working in marketing. While I was in college, I did a series of internships, and had my own business on the side. So, I started in the world of search engine optimization, optimizing websites for the search engines. From there I tiptoed into the world of design and development, and that led to those internships, one of them being working on, at the time, it was a 21,000 product catalog on Amazon.

I did a short stint in cybersecurity in Washington, DC, and ended my college internship career working as an SEO intern, working primarily on automations. And drip campaigns and email funnels and things like that. But I did do a lot of side work, building small business websites during college, fixing payment gateways.

A lot of the technical, like “My DNS records are not working. Can you fix them?” And then I’d watch a YouTube video, learn what needed to be done, go in and fix it. For a lot of small businesses that’s how I learned, was trading experience for fixing technical problems with them knowing that I hadn’t done it before.

That’s where I started. Coming out of college, I joined a startup here in northeast Ohio. Over the last couple of years, that startup has grown pretty steadily. As of 2021, I would say, I was in a conversation with an organization that was looking for digital marketing help and they were looking for somebody with a disability specifically, they wanted to hire somebody with a visual impairment.

I figured that was a pretty good opportunity for me to explore. And in that conversation, I was asked the question: “How do you make the websites you work on accessible?” As a person with a disability who didn’t openly talk about it, I didn’t disclose it unless I had to. That question hit me like a ton of bricks and it dug the first shovel full out of the rabbit hole of what accessibility is in the digital world.

And Ken, you and I shortly thereafter met, after I met with Mike Kess and we had several conversations and that just opened this whole industry, this whole field, this whole conversation up to me. And I went through a personal growth pattern and transition of becoming comfortable talking about my disability.

And in doing so, I learned how to, in a way, leverage my lived experience over the last 18 years. In this conversation and combine that with the marketing background and the tech background and all of it just came together and it made sense. And, I was asked by Mike very early on in that process, “Do you want to identify the problems or be the solution?”

And for me, one of the solutions that I wanted to bring to this space was teaching, educating, getting people that work in technical fields like designers and developers and engineers and content creators and marketers and all of those people working accessibility into their day-to-day, just like I had to learn how to do when I was asked that question.

And doing so from the perspective of lived experience, doing the bit from the perspective of learning accessibility during its a whole separate career and then transitioning. Creating that space where people can ask their questions about disability. They can let their curiosity lead them into how they can bring accessibility into what they do every day when they show up to work.

Ken Nakata: This is a cool connection because just yesterday we were talking about the same issue with Xian Horn. This idea of how disability is an empowering feature, not a setback. And so I imagine, James, that as I’m listening to you talk, describing this, I’m thinking, “Wow!” This is an area where your disability and your knowledge of accessibility make you a better digital marketer, and your experience learning digital marketing makes you better at accessibility.

James Warnken: Exactly. And now with everybody that I teach, whether it’s a person with a disability or not, I echo that, and that accessibility belongs in every conversation within the business.

Whether it’s marketing, whether it’s branding, it’s communication, it’s leadership and management. It’s HR, it’s PR, accessibility has a place in all of those. I don’t necessarily know if accessibility should be its own division within an organization or its own department, more so that it should be integrated naturally across the organization.

And who better to lead that conversation than somebody who lives it every day? And so when I teach and I get the opportunity to train other individuals with disabilities who want to become accessibility specialists or engineers or experts, the biggest thing that I emphasize to them is, you’re learning accessibility, but nobody can challenge you on your lived experience.

Nobody can tell you what’s right or what’s wrong for you, and you learning the perspectives and the experiences of those in this community alongside you is just going to make you that much more valuable in this conversation of driving that change of being that difference. Getting us to a point where every website that gets built has accessibility in it.

Every time a new logo gets drawn up, accessibility is considered, and so on. And so forth.

Real-time data from users with disabilities using social media

Ken Nakata: Exactly. So, as I’ve known you for a long time, I know that you’ve been able to get real-time data from users with disabilities using social media. Can you describe that for the audience and how do you go about crowdsourcing this data?

James Warnken: First, I want to say I am excited to be here. I feel like I’ve moved up to the cool kids’ table, excited to be here just sharing and having a conversation around all of our favorite topic accessibility and I guess jumping right into crowdsourcing the data.

I didn’t anticipate that as what it would turn into. I originally just started out sharing content on social media, primarily TikTok originally, about accessibility. And one of the things that I had seen as a user who had been on TikTok for a while before then was that there is a large community of blind and visually impaired individuals on social media, sharing their experiences, telling stories, good and bad.

Talking about what works for them, sharing assistive technology. There’s just a huge community, so much so that a #BlindTalk hashtag has emerged around that community of people experiencing visual impairments, visual disabilities, all of those things. And so when I started making content, originally I wanted to just talk about accessibility and start to build my personal brand.

And as I got into it, I quickly dove in and started asking questions. And I believe one of the first questions I asked was, “As a person with a disability, what assistive technology do you use?” I was genuinely curious. As a person with low vision, I use magnification some days, screen readers the next, and from all the documentation I had been reading while learning about accessibility, a lot of it said “Blind people use screen readers, low vision people use magnification.”

And so from that video, I got a ton of comments, a ton of direct messages, compared to what I normally see as an engagement rate. And going through the comments, there were people saying, “I’m low vision and I use a screen reader,” “I’m low vision and I use magnification.” And so again, marketing background, I said, “Wow, that’s helpful insight from actual people.”

Not a report that I read online that didn’t have a published date on it. And so I shared it back with Ken and Jeff and we dove down that rabbit hole of what other assumptions are being made about the way people with disabilities are engaging and interacting.

And so I’ve maintained that as part of my brand: being able, as a member of the community, to ask those kinds of questions that might seem a little bit out of left field, or a little bit insensitive coming from a business person or a researcher who’s not a part of the community.

And so it’s been cool to be that person bridging both sides of the conversation: those who care about accessibility and want to do it right, and want to make sure that they’re meeting the needs and the wants and the personal preferences of their users, and the community of users that just want things to work.

Thomas Logan: James, it’s very interesting that you’ve been able to gather this information, the wisdom of the crowds. I think as an accessibility professional, who usually uses the Web Content Accessibility Guidelines (WCAG) as my guide, I’m always feeling like I need to be learning more from real users of the technologies.

So from your work and your crowdsourcing, what’s an interesting insight that you came across from your conversations on social media with other people with disabilities?

James Warnken: I think the first one is the elephant in the room: you don’t have to be blind to use a screen reader. People with low vision use them. I’ve known people over the years who have dyslexia that use screen readers or built-in text-to-speech generators or different things like that, because it’s easier for them to listen than it is to read with dyslexia and other forms of disabilities.

Screen readers are more popular than people realize, even beyond just the blind and visually impaired community. But I think the other thing that I’ve started to pick up on and recognize throughout the different years of doing this and being on social media, consuming content more so than creating it, is just how open the disability community is to sharing those insights and those opinions, and how few people are asking for them.

I know that there are screen reader surveys that go out every year and collect data on that. But I don’t see a whole lot of creators or a whole lot of organizations that are necessarily going out and asking the disability community. And I think part of that really just boils down to the intimidation factor of “We don’t want to ask the wrong questions. We don’t want to seem insensitive. We don’t want to seem like we’re asking for this data to increase our profit margins.”

And I think there’s a right and a wrong way to approach that conversation with the disability community as a fully able-bodied person or an organization or a business, and if you get it right, the community is going to meet you with more than you could have ever expected.

Ken Nakata: Very cool.

Finding accessibility solutions

Thomas Logan: That’s interesting, James. And I agree. In my world, I don’t see people asking enough questions of the community to learn. They just take the WCAG standards as truth. So, for screen magnification, could you talk about the tools you use and what tips and techniques you use with the screen magnifier on your platforms?

James Warnken: The combination that I use daily is the built-in browser magnification, anywhere from 175% to 300%, depending on how teeny tiny the text is that I’m trying to read. But beyond that, I am a huge fan of dark mode. For some reason, black text on a white background is harder for me to read than the inverted version, a dark background with light text.

And so with that, I’ve encountered a ton of websites that have color contrast issues, right? Going back to WCAG, contrast minimum is a big one that affects me just as much as resize text and reflow do. As AI has emerged over the last couple of years, instead of not being able to read the content, or trying to reach out to each business to get a color changed, I’ve used AI to create some Chrome extensions or bookmarklets for myself that actually go in and override the CSS of big brand websites, to make it so I can set the text color and the background color to whatever I need them to be as a user.

I would say I’m not very technical, but fairly technical, and knowing how to use technology to self-accommodate, I’ve done that on some websites that get millions of users, because they’re using gray text on a blue background, or light gray on a white background that I can’t read at all.

I’ve designed a couple of solutions myself. I’m using dark mode, I’m using magnification. And when I have a day where I’m fatigued after a long day and I still need to read a long email or a document, those longer reading sessions are where I’ll flip NVDA or a screen reader on and switch over either mid session or changing between tasks and use the screen reader, so that I’m not continuing to constantly strain my eyes and potentially cause a headache or that migraine that nobody wants to deal with.

It just depends on the website. It depends on the day and how I’m feeling. Whether I’m using just magnification, magnification plus dark mode or high contrast mode or in some scenarios, a screen reader.
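James doesn’t share the code behind his extensions, but the technique he describes, overriding a page’s colors with user-defined CSS, can be sketched in a few lines. Everything below (the function name, the color values) is illustrative, not his actual tool:

```javascript
// Build a user-preference stylesheet that overrides a site's colors.
// Hitting every element via the universal selector plus !important is
// what makes this work on pages whose own CSS the user cannot edit.
function buildOverrideCss(textColor, backgroundColor) {
  return [
    "* {",
    `  color: ${textColor} !important;`,
    `  background-color: ${backgroundColor} !important;`,
    "}",
  ].join("\n");
}

// In a bookmarklet or extension content script, the override would be
// injected roughly like this (browser-only, so commented out here):
//   const style = document.createElement("style");
//   style.textContent = buildOverrideCss("#e8e8e8", "#1b1b1b");
//   document.head.appendChild(style);

console.log(buildOverrideCss("#e8e8e8", "#1b1b1b"));
```

Because the rule applies from the root rather than to specific classes or IDs, it also catches the elements that, as James notes later, class-targeted overlay widgets tend to miss.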

Thomas Logan: That’s awesome. I want to follow that up. Okay, so you’ve developed a custom solution for the modes that you can use to read, as you mentioned, like mainstream websites, do you have an opinion for sites that have an accessibility overlay? Do you benefit if they have an overlay or does it work better for you just to use what you already built yourself?

James Warnken: I know mine works.

The simple answer is that a lot of those overlays, when they get added, from what I’ve seen, have no testing done to verify that they are working properly and functioning as planned. And so I’ve encountered websites where, with those widgets, the resize text option causes the text to overlap or go behind an image, or the color inverters or color changers don’t change all of the text, only the elements that have certain CSS classes or ID attributes on them.

It’s not looking at the root element. It’s looking for a certain class or a certain ID. And if it doesn’t have it, it doesn’t work. I’ve seen this on big websites. I’ve seen it on small local websites. Those tools, they’re not as reliable as the ones that me and ChatGPT have built for James.

And so, I would use them if they were a little bit more reliable. But, even then that sort of leads into the whole conversation of, who is that overlay there to serve? And I personally, as somebody who’s been losing my vision for 19 years, I know my technology. I know how to magnify, I know how to turn on high contrast mode at the device level or within the browser settings.

A lot of those features I’ve been using for years. That one website that has it, that’s cool for somebody who was diagnosed yesterday or somebody who’s maybe a little bit older and doesn’t want to learn how to use assistive technology, but for me, who’s been doing it for years, I’m going to use what’s reliable.

AI and accessibility

Thomas Logan: James, you mentioned the role of AI in accessibility. What’s your personal experience with using AI? Do you feel like AI is something that’s improving accessibility, or is it a stopgap? How do you feel about AI and accessibility?

James Warnken: I think it’s absolutely opening up the opportunity to enhance accessibility. I think there’s still a lot of fear around AI. Whether that’s taking our jobs, it’s allowing students to cheat in the classroom, I think there’s still a lot of cautious fear around AI and what it’s capable of doing.

And until we experiment and until we try, we never know and we’ll never push those boundaries. And so I’ve actually worked a lot over the last couple of years with as many different AI models or tools as possible to try to figure out where accessibility fits in this conversation. One of the common questions I always get is about accessibility overlays and widgets.

The second one that shortly follows is accessibility and AI. My answer is a two-perspective answer. The first perspective is users with disabilities: 96% of websites aren’t accessible, so that content is not readily available for a screen reader user, somebody who’s using a braille display, or any type of assistive technology.

AI can fill those gaps. I’ve seen, and experienced firsthand, getting a document that wasn’t accessible, whether I downloaded it from a website or somebody emailed it to me, and taking it to a large language model like ChatGPT, Copilot, or even Gemini now. They’re all getting significantly better every day. The AI doesn’t need an inaccessible document or webpage to be accessible to be able to extract the information and give it back to me in a plain text format.

And plain text is the best way to communicate. I can get image descriptions. I can get that bulleted list that was just visually styled to look like a bulleted list. I can strip away all the colors using AI to make it visually distinguishable for someone like myself. AI can open up so many opportunities for self-advocacy and self-accommodation, but I don’t want that to lead businesses or organizations or decision makers to say, “We’ll leave accessibility up to the individual,” because there is still that meeting in the middle that has to happen. If I’m going to put in the time and the effort to learn how to use a screen reader, or I’m going to buy a $3,000 braille display, or I’m going to buy a $10,000 wheelchair, but there are still stairs in front of your building, if there are still unlabeled buttons and images and links and things that don’t have proper names and roles and labels, why did I spend $3,000? Why did I buy a $10,000 wheelchair? It doesn’t make any sense for me as a user to go through all of that effort.

And you assume that you don’t have to put in any effort on your end. And so AI on that front can come in and help educate, can help inform. AI can write code: you can ask it to create an accessible form in HTML and it will spit it out for you. All you have to do is copy, paste, and style it with CSS, and there you go.

It has all the titles, it has all the IDs, it has everything that it needs so that a screen reader and other assistive technology can properly interpret and deliver that to a user. It’s a two-perspective answer, but AI can help businesses to be better and create accessible products and services.

And it can help individuals fill the minor gaps and inconveniences that are beyond the WCAG guidelines, or beyond whatever a team’s ability to create accessible content is.
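As an illustration of the kind of labeled form markup James describes AI producing, here is a minimal sketch, paired with a naive check that every input is labeled. Both the markup and the regex-based checker are simplified examples, not output from any particular model:

```javascript
// The kind of markup an LLM should produce: every input has an id and a
// <label for="..."> pointing at it, so a screen reader announces the
// field's purpose. (Illustrative sample, not generated output.)
const formHtml = `
<form>
  <label for="name">Full name</label>
  <input id="name" name="name" type="text" autocomplete="name">

  <label for="email">Email address</label>
  <input id="email" name="email" type="email" autocomplete="email">

  <button type="submit">Subscribe</button>
</form>`;

// A naive "is it labeled?" check: collect input ids and label targets,
// and report any input no label points at. Real tooling (axe-core, etc.)
// inspects the live DOM instead of matching strings.
function unlabeledInputs(html) {
  const inputIds = [...html.matchAll(/<input[^>]*\bid="([^"]+)"/g)].map(m => m[1]);
  const labelFors = new Set(
    [...html.matchAll(/<label[^>]*\bfor="([^"]+)"/g)].map(m => m[1])
  );
  return inputIds.filter(id => !labelFors.has(id));
}

console.log(unlabeledInputs(formHtml)); // → []
```

A real validation pass would run against the rendered DOM rather than a string, but the principle is the same: check the relationships between elements, not just the elements themselves.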

Ken Nakata: James, sometimes you’ve talked about people watching online as a way to learn. What’s something you saw recently on TikTok or another platform that changed your thinking or inspired you to take action?

James Warnken: Absolutely. So I’m currently in the middle of a research project, and I think this is a perfect example of how to use it and use social media to engage with different communities of different backgrounds and cultures, and even abilities. So this research project is training an AI model on how to write long descriptions that are more effective and more practical for blind and visually impaired people.

That is a pretty specific research topic. And so within that research, they contracted with me because I’m a part of that community and I have direct access to some of that community. What I wasn’t prepared for was the response. When I put it out on social media that I was conducting that research, and that I was looking for people who cared about art and identified with a disability to answer a couple of questions for me in interview-style user testing, I got almost 400 responses. When we talk about recent, real-time data, when we talk about engaging with the community: the community even surprised me recently.

And I constantly see videos of new emerging technologies, or people sharing their experiences and ideas, not just complaining that something was a bad experience, but giving actual feedback: this is what should be done, this is what should have happened. And all of that is a learning opportunity that is completely free.

There’s no paywall as long as you have an email and a password. That content is readily accessible. I constantly see new apps that are coming out to do audio descriptions, to read menus, or to do different things. I have a full page on my iPhone of just accessibility apps now, and I would’ve never known about any of them if it wasn’t for social media.

So I think it’s about not letting your day-to-day newsfeed be the only content you consume. I was at a conference last winter, and they talked about searching for different types of disabilities in the search functions of those social media platforms and just watching. The more you watch, the more it’s going to work into your feed, and the more you diversify your feed, the more you’re going to be able to understand, relate, and, in some cases, empathize with those negative or poor experiences enough to do something about them.

Thomas Logan: James, I want to finish off with asking you about the list of the companies that you’ve learned about from social media. And really this concept is real accessibility versus performative accessibility.

Like when you encounter people making different claims, such as novel audio description solutions, do you have a process for determining whether they’re legit, or whether it’s actually just a performance and they’re trying to make a solution but have never really worked with people with disabilities?

What do you think about that?

James Warnken: As a member of the community, it’s pretty obvious to see what’s a marketing tactic (“we could make so much money off of this”) versus what’s being done to genuinely help a community that is traditionally overlooked or ignored or not even thought about in the first place.

It’s pretty obvious. Some of the big companies that are leading the way for accessibility don’t put it on social media. They don’t directly come out and talk about those features improving. I can’t remember the last time I saw Apple talking about improvements to VoiceOver.

So the companies that are doing it are, in most cases, doing it silently, and the community is there. We appreciate it; we know that it’s there for us to be able to share as close to the same experience as possible. And then there’s when it’s done for financial reasons, or just to avoid the next lawsuit, or whatever that might be.

It’s pretty obvious. And I think the best way people can see that is by looking at some of these big organizations and their accessibility statements. Some of them are like two sentences long. Some of them claim they’re fully AA compliant with the Web Content Accessibility Guidelines, and right there on their homepage, you can’t expand the dropdown menus with a keyboard. A lot of it, just from true user experience, is pretty obvious. But as soon as you start to peel back some of the layers, you can generally tell who’s doing it and for what reasons.

I’ve even seen VPATs (Voluntary Product Accessibility Templates) that are fully supported down the middle column, and they had “click here” buttons on their homepage. Those issues are so simple to remediate, to avoid, to fix, and somehow they’re making it into VPATs.

So it goes all the way down to the legal conversation of government contracting. When you put it in front of somebody that lives it every day, the curtain falls pretty quickly.

Thomas Logan: James, how do people get in touch with you?

James Warnken: I would say the best way to get in touch with me is probably LinkedIn.

At this point, my LinkedIn is the one that I check the most often. But if you’re looking for just content or more information, jameswarnken.com is my personal website.

Thomas Logan: We’d love to hear from you. Let’s continue the conversation. Thank you so much, James, for the conversation, and we look forward to our listeners engaging with us.

On any channel where you encounter this content, we’re looking for your feedback, and we will have continued conversations. Thank you so much for your time, and we’ll see you in our next episode.

Do you need help reducing accessibility risk?

Our years of experience working with lawyers and being expert witnesses in lawsuits have given us a unique perspective in explaining and justifying our clients’ accessibility compliance. If you are concerned about legal issues related to accessibility of your digital product, website, or app, please contact us to discuss how we can help.

The post How a Blind Person Uses Social Media and AI to Drive Accessibility appeared first on Equal Entry.

]]>
Digital Accessibility’s Gap: AI to Bridge Mobile and Web Barriers https://equalentry.com/digital-accessibility-ai/ https://equalentry.com/digital-accessibility-ai/#respond Wed, 30 Jul 2025 22:05:27 +0000 https://equalentry.com/?p=196505 Mobile accessibility remains challenging, with 60–80% of users encountering barriers across common apps. Jason Tan frames the challenge through four apocalyptic horsemen: lack of mobile-specific standards, weak automation tools, opaque UI structures, and restrictive platform APIs. He proposes a new automation layer to bridge gaps.

Michael Bervell adds a human-tech lens, arguing AI can amplify accessibility professionals’ impact when used wisely. A study found that “cyborg” workflows outperform traditional “centaur” ones. While AI shows promise for automating up to 95% of WCAG remediation, human expertise remains irreplaceable. Be a cyborg, not a centaur.

The post Digital Accessibility’s Gap: AI to Bridge Mobile and Web Barriers appeared first on Equal Entry.

]]>
This article is based on a talk by Jason Tan and Michael Bervell, co-founders of TestParty, who spoke about Digital Accessibility’s Gap: AI to Bridge Mobile and Web Barriers at A11yNYC.

The four horsemen of mobile accessibility

Jason Tan reveals that around 20% of iOS users use larger text. He references a mobile accessibility survey published by the American Foundation for the Blind. The survey looked at diverse application groups, from crypto to banking to ordering. Overall, mobile accessibility remains a big barrier: for most apps, 60% to 80% of respondents run into a barrier.

That leads to the four main issues of mobile accessibility, which are like four apocalyptic horsemen. The first one is around standards. Many may be familiar with WCAG, the Web Content Accessibility Guidelines. So where are the accessibility guidelines for mobile?

To have a thriving ecosystem of developers and open source tools requires a unified standard. The W3C has a working draft on mobile accessibility that adds different controls that don’t exist on the web.

The second thing that makes this mobile accessibility hard is the lack of automation. There are some open source tools like Appium, but they tend to be hard to use. In fact, some companies choose to skip user interface (UI) tests because the frameworks are hard to maintain.

Without the right open source automation tools, it’s hard to start testing systematically. On the web, with OpenAI’s Operator, you can tell it to do something; it uses tools like Puppeteer and Playwright, takes screenshots, and navigates. This can’t be done on mobile unless you’re crafty and willing to overengineer a lot of things.

The third issue is the opaque element structure. Deque’s axe-core is one of the best open source frameworks, and it primarily relies on embedding itself within the HTML DOM. Yet there is no such DOM in mobile. If you’ve tried cross-platform mobile development, like React Native, you might have seen the React Native debugging tree. It is technically available for you to crawl, but it is nowhere near as rich as what the HTML DOM provides.

DOMs are crucial for constructing the relationships between elements on a screen. Without this sort of representation, you must rely on simple visual analysis. The Document Object Model (DOM) shows the structure of how a page is laid out.

For mobile, you can get a view hierarchy. That’s helpful for debugging, but it won’t help with things that don’t appear in it, and it isn’t useful for automated testing.

Hence, the fourth issue is platform limitations. Apple and Google have restrictions on their APIs for what you can and can’t do. While they have a lot of prebuilt components, they only have a couple of fields available for accessibility. Some are useful while others are restrictive. That makes it hard to navigate how to make things accessible.

Automating mobile accessibility testing

TestParty has built an automation framework that strips away the ugly parts of Appium to create a new standard that is closer to Puppeteer. Developers can write a YAML-based test file, navigate through screens, and create hierarchies around them.

It lets you capture the element types and the various relationships inside a hierarchy. That hierarchy can be used for automated testing. The tool can do it while detached from the Apple ecosystem or the Google ecosystem, as it’s strictly within VSCode. This lets you take control of the running of a device and test things remotely.
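The talk doesn’t show TestParty’s actual file format. Purely as a hypothetical illustration of what a YAML-based mobile UI test in this general style might look like (every key, value, and app identifier below is invented, not TestParty’s schema):

```yaml
# Hypothetical test file — illustrative only, not TestParty's actual format.
app: com.example.shop
steps:
  - launch: {}
  - tap: { label: "Sign in" }
  - type: { field: "Email", text: "user@example.com" }
  - capture_hierarchy: {}        # snapshot the element tree for later checks
  - assert:
      element: "Submit"
      has_accessibility_label: true
      min_touch_target: 44       # points; common platform guidance
```

The point of such a format is that the test author describes screens and intent, while the framework handles driving the device and recording the element hierarchy for accessibility checks.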

Combining AI and humans

Michael tells the story of one of his favorite superheroes, who is half cybernetic robot and half human. The combination of the person’s technology and humanness makes them a superhero. Along these lines, people can use AI to turn themselves into something like an accessibility superhero. It’s a combination of assistive AI tools plus the human heart.

Where we are today with WCAG remediation automation

Where are we today? This is chapter 1, episode 1, season 1; this is the pilot. Today, 94% of the million most visited home pages are inaccessible, per the WebAIM Million. There’s been a little progress, which is good.

Nonetheless, the web is also becoming more complex. This year, there were 11.8% more elements on these home pages, meaning nearly 12% more things to test than last year. The web will continue to grow more complex because people will get more creative and build with more interesting new components.

The reality is that manual audits are slow. It takes months to complete a digital accessibility audit. Michael’s company went through SOC 2 compliance (system and organization controls), and it took eight weeks to complete, and then another three months to get the complete report of the audit.

Accessibility professionals have a lot on their plates: almost 61% of accessibility employees do not have the resources they need to do their accessibility work effectively. They’re giving 110% to make the web 5% more accessible than last year. How can we do better? By using artificial intelligence.

What can AI do?

Artificial intelligence in this context means frontier large language models (LLMs) like Gemini and ChatGPT trained on massive data sets. These large language models essentially work by taking huge amounts of data, adding weights to that data in terms of how they interpret it, and making predictions. These predictions are getting good.

All these models can score well on standardized tests. You might think you don’t need a lawyer when an AI model can score high on the LSAT. Benchmarks show that these AI tools are getting better and better at general logic and general reasoning.

Michael was a lead research assistant on a Harvard Business School study that came out in September 2023. It delineates what types of learning AI are best at versus worst at.

In the experiment, the researchers worked with the Boston Consulting Group’s (BCG) global workforce. They had consultants do 18 tasks around creativity, analytics, and persuasion. Half had access to GPT-4 via ChatGPT, and the other half had no access.

What the study found was that those who had access to AI, with no training, did 12.5% more work, 25% faster, and at 40% higher quality.

Some people were 10% better than their other AI-using counterparts. The researchers wanted to know what the difference was. One group, called centaurs after the mythological half-horse, half-human creatures, divided the work between themselves and the AI.

It’s almost as if you were to use a calculator and ask, “What is 2+2?” It would tell you “4.” They would use 4 to do the rest of their work. That’s centaur work. The AI is an assistant to their thinking.

The other half of consultants who performed better were called cyborgs. They integrated AI into their workflow. They continually interacted with technology. On average, the cyborgs prompted AI 3.5 times more than a centaur.

A centaur might say, “Hey, I have to design this new marketing campaign for Nike. Give me 10 example names.”

Whereas a cyborg would ask that question and then say, “I like two of the names. Give me more that are like that.” They kept refining it to another subset. They used it almost like a sounding board or a teammate.
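The two workflows can be sketched in code. `askModel` below is a stub standing in for any real LLM call; the centaur hands off once, while the cyborg keeps a refinement loop running:

```javascript
// Stub standing in for a real LLM API call — illustrative only.
function askModel(prompt) {
  return `response to: ${prompt}`;
}

// Centaur: one handoff. Ask once, take the answer, move on.
function centaur(task) {
  return askModel(task);
}

// Cyborg: stay in the loop. Each answer is critiqued and fed back as a
// refined prompt; the human (here, a refine callback) steers every turn.
function cyborg(task, refine, rounds = 4) {
  let answer = askModel(task);
  for (let i = 0; i < rounds; i++) {
    const nextPrompt = refine(answer); // e.g. "I like options 2 and 7 — more like those"
    answer = askModel(nextPrompt);
  }
  return answer;
}

const result = cyborg("Name a marketing campaign", (a) => `Refine: ${a}`);
console.log(result);
```

The structural difference is simply how many times control passes back through the human: once for the centaur, every round for the cyborg.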

There are two takeaways. The first is that the expert AI is going to be better than the average human. And the expert human will be better than the average AI. The Frontier AI model can turn an average individual into an almost-expert in any field. However, experts in specific fields beat AI.

The second takeaway is that how someone uses AI matters more than just using AI. In other words, be an AI cyborg, not a centaur.

One way to be more effective is to integrate AI into your everyday work. Good examples of this in action are Grammarly, Be My Eyes, GitHub Copilot, and Cephable.

Three practical WCAG AI remediations

Can you automate WCAG compliance? This applies to remediation, not testing. Michael asked several AI models how much of WCAG they could automatically remediate.

  • DeepSeek said 70%: 20% fully automated, plus 50% with a human in the loop.
  • Claude said 75%: 17% fully automated, plus 60% with a human in the loop.
  • ChatGPT said 80%: 50% fully automated, plus 30% with a human in the loop.
  • Gemini said 95%: 56% fully automated, plus 39% with a human in the loop.

What can’t AI remediate when it comes to the Web Content Accessibility Guidelines? In summary, experts in accessibility will always be valuable because AI can’t replace their expertise.

AI can’t replace what humans do at their best, which is to be empathetically human. However, AI can help you always operate as your best human. An accessibility superhero has the AI tools, the heart, and their knowledge. The key is to keep the human in the loop. This is the best way to moderate AI systems to ensure the outputs are accurate.

This is where the centaur versus the cyborg mentality is the most different. A cyborg is constantly involved in the loop of the AI interaction. When someone prompts 8 times in a row, they’re validating, re-prompting, and re-validating.

Everyday examples of human-in-the-loop AI technology include spellcheck, autocomplete for forms, and fraud alerts. A fraud alert involves the system texting the human to ask whether they made a purchase that seemed out of the ordinary.

Now apply this concept to an AI agent in the loop. Can we have an agent looking at the results and validating them? How do we use technology to validate technology to create better and more effective models within a specific niche like accessibility, security, design, and usability testing?

The question then becomes: How good is your validation agent? Perhaps you can have a positive validation agent that says something looks right, and a negative validation agent that says it looks wrong. They give feedback differently. Put them together, and it turns into a cyborg AI system rather than a human-in-the-loop centaur system.
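A minimal sketch of that dual-agent idea, with both validators stubbed out as simple heuristics rather than real models (the function names and checks are invented for illustration):

```javascript
// Two stub validation agents with opposite biases. In a real system,
// each would be a separate model prompted to argue "this looks right"
// versus "this looks wrong" about the same output.
const positiveAgent = (output) => ({
  ok: output.trim().length > 0,
  issue: "output is empty",
});
const negativeAgent = (output) => ({
  ok: !/TODO|lorem ipsum/i.test(output),
  issue: "placeholder text present",
});

// Accept only when both agents agree; otherwise escalate to a person,
// keeping the human in the loop exactly where the agents disagree.
function review(output) {
  const verdicts = [positiveAgent(output), negativeAgent(output)];
  if (verdicts.every((v) => v.ok)) return { verdict: "accept" };
  return {
    verdict: "escalate to human",
    reasons: verdicts.filter((v) => !v.ok).map((v) => v.issue),
  };
}

console.log(review('<img src="cat.jpg" alt="A tabby cat asleep on a windowsill">').verdict); // → "accept"
console.log(review("alt text: TODO").verdict); // → "escalate to human"
```

The escalation path is the important part: the agents filter the easy cases, and the human reviews only the disputed ones.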

Video Highlights

Watch the Presentation

Bios

Jason Tan is the co-founder and CTO of TestParty, a startup automating digital accessibility testing across platforms. Jason studied Computer Science, Economics, and Latin at Princeton, and previously worked as an iOS engineer at Twitch, where he encountered accessibility challenges during a live lawsuit.

At TestParty, he brings a uniquely technical and humanistic lens to mobile and web accessibility. Jason is passionate about building developer-first tools that don’t just detect problems — but help fix them.

Michael Bervell is the CEO and co-founder of TestParty, an AI-powered digital accessibility platform that automates WCAG remediation across web and mobile. He previously consulted on accessibility for Google and the United Nations and was awarded an NSF SBIR grant for advancing automated compliance technologies.

A published author and Harvard graduate, Michael’s work sits at the intersection of AI, education, and inclusion. He’s passionate about building tools that make the internet equitable for everyone.

The post Digital Accessibility’s Gap: AI to Bridge Mobile and Web Barriers appeared first on Equal Entry.

]]>