OCAD University Open Research Repository

Empowering Agentic Non-Visual Web Navigation Through Tactile Controls and AI Support

Amin, Forootan (2025) Empowering Agentic Non-Visual Web Navigation Through Tactile Controls and AI Support. [MRP]

Item Type: MRP
Creators: Amin, Forootan
Abstract:

Blind and low-vision (BLV) users rely on screen readers to access digital content, yet these tools often impose strictly linear, text-based navigation that fails to communicate spatial layout, contextual changes, or emotional tone. This disconnect leads to cognitive overload, frustration, and reduced autonomy. In response, this research proposes a new model of screen reader interaction: a modular, tangible interface grounded in affordance-based design to enhance agentic non-visual navigation.
Through interviews and co-design sessions with BLV users, the study identifies six key experiential barriers: (1) loss of spatial orientation, (2) lack of state-change feedback, (3) absence of emotional/paralinguistic cues, (4) dependence on sequential logic, (5) inefficient input methods, and (6) mistrust of over-automated AI. These findings informed a series of design iterations, evolving from a conversational AI prototype to a tactile, multi-modal controller.
The final design features a rotary knob for sequential traversal, a rotor switch for hierarchical navigation, haptic and auditory feedback to signal changes and navigation boundaries, and a context-aware AI assistant. Mapped to NVDA screen reader commands and aligned with the POUR (Perceivable, Operable, Understandable, Robust) accessibility framework, each component reinforces spatial awareness, user agency, and reduced cognitive demand.
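The abstract describes the mapping from physical controls to NVDA commands only at a high level. The following is a minimal, purely illustrative sketch (not the MRP's implementation) of how rotary-knob events and rotor-switch granularities might be translated into NVDA's standard browse-mode quick-navigation keys (Down/Up arrow for lines, "h"/"shift+h" for headings, "k"/"shift+k" for links, "d"/"shift+d" for landmarks). The event names, direction convention, and controller interface are assumptions for illustration.

    # Illustrative sketch only: hypothetical controller events mapped to the
    # keystrokes NVDA uses in browse mode. The actual controller firmware and
    # command set are not specified in the abstract.

    # Rotor positions select a navigation granularity; each maps to the
    # next/previous quick-navigation keys NVDA recognizes for that element type.
    ROTOR_MODES = {
        "line":     {"next": "downArrow", "previous": "upArrow"},
        "heading":  {"next": "h",         "previous": "shift+h"},
        "link":     {"next": "k",         "previous": "shift+k"},
        "landmark": {"next": "d",         "previous": "shift+d"},
    }

    def knob_turn_to_keystroke(rotor_mode: str, direction: str) -> str:
        """Translate one rotary-knob detent into the keystroke sent to NVDA.

        rotor_mode: granularity chosen with the rotor switch (e.g. "heading").
        direction:  "next" for clockwise, "previous" for counter-clockwise
                    (a hypothetical convention for this sketch).
        """
        try:
            return ROTOR_MODES[rotor_mode][direction]
        except KeyError:
            # Unknown mode or direction: report a navigation boundary,
            # which the hardware could signal with haptic/auditory feedback.
            return "boundary"

    if __name__ == "__main__":
        print(knob_turn_to_keystroke("heading", "next"))   # -> h
        print(knob_turn_to_keystroke("line", "previous"))  # -> upArrow

In such a design, the rotor switch changes only the lookup table in use, so a single physical gesture (turning the knob) stays consistent across granularities, which is one way the component mapping could reinforce spatial awareness and reduce cognitive demand.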

Date: 7 May 2025
Divisions: Faculty of Design
Date Deposited: 09 May 2025 16:18
Last Modified: 09 May 2025 16:18
URI: https://openresearch.ocadu.ca/id/eprint/4779
