[Image: Ally accessibility platform interface deployed at a museum, showing the audio description system]
● Live 2024

Ally — Envision Technologies

AI-powered accessibility platform for blind and low vision users

Accessibility · Conversational AI · Web · Voice Interface · React

📊 Metrics

  • Users Globally: 100,000+
  • User Satisfaction: 100%
  • Deployment: Museums (San Francisco)
  • Interaction Mode: Voice + Keyboard

🏆 Recognition

  • Deployed in San Francisco museum pilots
  • 100% of pilot participants reported an improved experience
  • Reduced dependency on human assistance

Overview

Ally is an AI-powered accessibility platform built by Envision Technologies for blind and low vision users. The platform enables natural, conversational interaction with digital environments — removing barriers that traditional interfaces were never designed to address.

At Envision, I worked as a Software Engineer on the Ally platform, contributing to systems that now serve over 100,000 users worldwide.

What I Built

My work at Envision focused on two primary areas:

Accessible Conversational AI Systems

Built the infrastructure for voice-driven, conversational interaction with Ally’s AI layer. This involved designing interaction flows that work without visual feedback — relying entirely on audio output, spatial cues, and keyboard navigation. Every decision had to account for users who can’t see the interface at all.
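One way to make such a flow legible without a screen is to model voice turn-taking as an explicit state machine, where every state change triggers an audible cue. The sketch below is illustrative only: the state names, events, and earcon files are my assumptions, not Ally's actual implementation.

```typescript
// Hypothetical voice turn-taking state machine. Each state maps to an
// audio earcon so a user with no visual feedback always knows where
// they are in the interaction.
type VoiceState = "idle" | "listening" | "processing" | "speaking";
type VoiceEvent =
  | "pressToTalk"
  | "speechEnd"
  | "responseReady"
  | "playbackDone"
  | "cancel";

// Pure transition function: unknown events leave the state unchanged,
// so a stray keypress can never strand the user in a silent dead end.
function transition(state: VoiceState, event: VoiceEvent): VoiceState {
  switch (state) {
    case "idle":
      return event === "pressToTalk" ? "listening" : state;
    case "listening":
      if (event === "speechEnd") return "processing";
      if (event === "cancel") return "idle";
      return state;
    case "processing":
      if (event === "responseReady") return "speaking";
      if (event === "cancel") return "idle";
      return state;
    case "speaking":
      return event === "playbackDone" || event === "cancel" ? "idle" : state;
  }
}

// Illustrative earcon mapping (file names are placeholders).
const earcons: Record<VoiceState, string> = {
  idle: "ready.wav",
  listening: "listen-start.wav",
  processing: "thinking-loop.wav",
  speaking: "", // the spoken response itself is the feedback
};
```

Keeping the transitions pure makes the flow easy to unit-test, and the "cancel from anywhere returns to idle" rule gives keyboard users a single, predictable escape hatch.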

Web Platform Development

Developed the web-based version of Ally, supporting both voice and keyboard-first interaction. The implementation had to meet WCAG standards rigorously — not as a checklist, but as a product requirement. Screen reader compatibility, focus management, and semantic HTML were foundational, not afterthoughts.
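A concrete example of focus management in this spirit is the WAI-ARIA roving-tabindex pattern for keyboard-navigable lists. The helper below is a minimal sketch of the index arithmetic, not code from Ally; the function name is mine.

```typescript
// Compute the next focus index for an arrow-key navigable list,
// following the WAI-ARIA roving-tabindex pattern: exactly one item
// has tabIndex=0 (the rest -1), so Tab enters the list once and
// arrow keys move within it.
type NavKey = "ArrowDown" | "ArrowUp" | "Home" | "End";

function nextFocusIndex(current: number, count: number, key: NavKey): number {
  if (count === 0) return -1; // nothing to focus
  switch (key) {
    case "ArrowDown":
      return (current + 1) % count; // wrap from last item to first
    case "ArrowUp":
      return (current - 1 + count) % count; // wrap from first to last
    case "Home":
      return 0;
    case "End":
      return count - 1;
  }
}
```

In a React component this would drive which item receives `tabIndex={0}` and a `focus()` call; keeping the arithmetic in a pure function makes the keyboard behavior testable without a DOM.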

Museum Deployments

Ally was piloted in San Francisco museums as an accessibility layer for exhibits and wayfinding. The system allowed blind and low vision visitors to engage with exhibit content through conversational AI — asking questions, receiving descriptions, and navigating independently.

Key outcomes from the pilot:

  • 100% of participants reported an improved museum experience
  • Significant reduction in reliance on human docent assistance
  • Validated real-world usability of AI-driven audio description at scale

Scale and Impact

The platform serves over 100,000 users globally across Envision’s product ecosystem. Working at this scale meant that accessibility was not an optional feature — it was the product. Every decision, from API response time to audio clarity, directly affected users who had no fallback to a visual interface.

What This Work Taught Me

Building Ally reinforced a core belief: accessibility infrastructure must be treated with the same engineering rigor as any performance-critical system. Latency matters. Reliability matters. Ambiguity in an audio prompt causes real confusion for real users.

This experience also sharpened my thinking on designing AI systems for users whose needs are specific and non-negotiable — a mindset I carry directly into Safe Step and future assistive projects.