
Persona: DeafUser

Category: Accessibility Personas

Description: Users who are deaf or hard of hearing and rely primarily or entirely on visual cues for all information, including audio content

Overview

Deaf and hard of hearing users navigate without audio cues. That means not just videos and podcasts, but also notification sounds, error beeps, and confirmation tones. The deaf experience exposes how heavily interfaces rely on sound.

Deaf users often have strong visual attention and pattern recognition, process visual information efficiently, and may notice visual changes that hearing users miss. The design rule that follows: every piece of information conveyed through sound must have a visual equivalent.

Designing for deaf users helps everyone. Noisy environments, libraries, late-night browsing, and muted devices all create situations where sound is unavailable, and captions, visual alerts, and text alternatives benefit every user in them.

Trait Profile

All values on 0.0-1.0 scale.

Core Traits (Tier 1)

Trait Value Rationale
patience 0.6 Moderate; may become frustrated with inaccessible audio content
riskTolerance 0.4 Cautious; may miss audio warnings or alerts
comprehension 0.8 High; strong visual processing and reading comprehension
persistence 0.7 Will seek alternatives for inaccessible content
curiosity 0.6 Interested in exploring; wary of video-heavy content without captions
workingMemory 0.6 Normal capacity; skilled at visual multitasking
readingTendency 0.9 Very high; rely on text as primary information channel

Emotional Traits (Tier 2)

Trait Value Rationale
resilience 0.7 Adapted to navigating hearing-centric world
selfEfficacy 0.7 Confident in visual navigation; frustrated by inaccessible content
trustCalibration 0.6 Evaluate through visual and text-based cues
interruptRecovery 0.7 Good visual memory; can track context without audio cues

Decision-Making Traits (Tier 3)

Trait Value Rationale
satisficing 0.5 Balanced; may accept captioned alternative over uncaptioned ideal
informationForaging 0.6 Strong visual scanning; avoid video content without captions
anchoringBias 0.5 Moderate; decisions based on available visual information
timeHorizon 0.5 Balanced perspective on immediate vs long-term needs
attributionStyle 0.6 Recognize accessibility failures as system issues

Planning Traits (Tier 4)

Trait Value Rationale
metacognitivePlanning 0.7 Strategic about avoiding audio-dependent content
proceduralFluency 0.7 Strong with visually-presented procedures
transferLearning 0.7 Transfer visual patterns well; challenged by audio-only instructions

Perception Traits (Tier 5)

Trait Value Rationale
changeBlindness 0.3 Low; highly attentive to visual changes, compensating for the absence of audio alerts
mentalModelRigidity 0.5 Flexible; adapt to various visual presentation styles

Social Traits (Tier 6)

Trait Value Rationale
authoritySensitivity 0.5 Moderate; evaluate based on visual credibility cues
emotionalContagion 0.6 Sensitive to visual emotional cues; strong facial reading
fomo 0.5 May feel excluded by audio-only content
socialProofSensitivity 0.5 Value text-based reviews and visual social signals
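The six trait tiers above can be collected into a single profile object. A minimal sketch, with one sanity check for the documented 0.0-1.0 scale; the object shape and the `isValidProfile` helper are illustrative assumptions, not the actual cbrowser schema:

```javascript
// Hypothetical sketch of the DeafUser trait profile as a plain object.
// Field names mirror the trait tables above; the exact schema used by
// cbrowser is an assumption.
const deafUserTraits = {
  // Tier 1: core
  patience: 0.6, riskTolerance: 0.4, comprehension: 0.8,
  persistence: 0.7, curiosity: 0.6, workingMemory: 0.6,
  readingTendency: 0.9,
  // Tier 2: emotional
  resilience: 0.7, selfEfficacy: 0.7, trustCalibration: 0.6,
  interruptRecovery: 0.7,
  // Tier 3: decision-making
  satisficing: 0.5, informationForaging: 0.6, anchoringBias: 0.5,
  timeHorizon: 0.5, attributionStyle: 0.6,
  // Tier 4: planning
  metacognitivePlanning: 0.7, proceduralFluency: 0.7, transferLearning: 0.7,
  // Tier 5: perception
  changeBlindness: 0.3, mentalModelRigidity: 0.5,
  // Tier 6: social
  authoritySensitivity: 0.5, emotionalContagion: 0.6, fomo: 0.5,
  socialProofSensitivity: 0.5,
};

// Sanity check: every trait must sit on the documented 0.0-1.0 scale.
function isValidProfile(traits) {
  return Object.values(traits).every((v) => v >= 0 && v <= 1);
}
```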

Behavioral Patterns

Navigation

Deaf users navigate efficiently through visual interfaces. They read thoroughly and rely on text labels, icons, and visual indicators. They check for captions before engaging with video. Visual feedback (loading spinners, success checkmarks) is essential since audio confirmation is unavailable.

Decision Making

Decisions are based entirely on visual information. Users rely on text, visual ratings, images, and diagrams. They check video for captions first. Transcripts are valued for audio content.
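The caption check described above can be sketched against the standard `textTracks` list on a `<video>` element; the `hasCaptions` helper name is hypothetical, not part of any API:

```javascript
// Hypothetical sketch: decide whether a video is safe to surface to this
// persona by checking its text-track list for captions or subtitles.
// Accepts any array-like of TextTrack-shaped objects ({ kind: ... }).
function hasCaptions(textTracks) {
  return Array.from(textTracks).some(
    (t) => t.kind === "captions" || t.kind === "subtitles"
  );
}

// In a browser this would be called as:
//   hasCaptions(document.querySelector("video").textTracks)
```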

Error Recovery

Error notifications must be visual: color changes, icons, and prominent text. Vibration on mobile can supplement visual alerts. Error messages should be text-based with clear hierarchy. Audio-only alerts will be missed completely.
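A minimal sketch of a sound-free error notification, assuming a hypothetical `visualAlert` helper that pairs text, an icon, and a vibration pattern for `navigator.vibrate()` on supporting mobile browsers:

```javascript
// Hypothetical sketch: describe an error notification with no audio
// channel. Each field mirrors a recommendation in the text above.
function visualAlert(message) {
  return {
    role: "alert",            // ARIA live-region role: rendered AND announced as text
    icon: "error",            // redundant visual cue alongside color change
    text: message,            // the message itself; never conveyed by sound alone
    vibrate: [200, 100, 200], // pattern for navigator.vibrate() where supported
  };
}
```

In a browser, the result would drive both the DOM (`el.setAttribute("role", "alert")`, icon, and text rendering) and, on mobile, `navigator.vibrate(alert.vibrate)` as a supplement to the visual cue.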

Abandonment Triggers

  • Videos without captions or transcripts
  • Audio-only content (podcasts, voice messages) without alternatives
  • Important information conveyed only through sound effects
  • Audio CAPTCHA without a visual alternative
  • Phone-call-only support channels
  • Notifications that rely solely on sound
  • Live events without real-time captioning

UX Recommendations

  • Audio content: Provide captions for all video; transcripts for audio
  • Notification sounds: Visual notifications; screen flashes; vibration (mobile)
  • Error alerts: Visual error indicators; never rely on beeps alone
  • Confirmation feedback: Visual confirmation (checkmarks, success states); don't rely on sounds
  • Real-time communication: Text chat options; video with sign language interpretation
  • Phone support: Offer text-based alternatives (chat, email, relay services)
  • Ambient audio cues: Translate all audio cues to visual equivalents
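The last recommendation, translating audio cues to visual equivalents, can be sketched as a lookup; the cue names and mappings are illustrative assumptions, not a standard vocabulary:

```javascript
// Hypothetical sketch: map common audio cues to the visual equivalents
// recommended above. Unknown cues fall back to plain text.
const visualEquivalents = {
  "notification-sound": "banner + badge count",
  "error-beep": "inline error text + icon + color change",
  "success-tone": "checkmark + success state",
  "incoming-call": "full-screen flash + vibration",
};

function toVisualCue(audioCue) {
  return visualEquivalents[audioCue] ?? "text fallback";
}
```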

Research Basis

  • W3C WCAG 2.2 (2023). Audio accessibility guidelines - 1.2 Time-based Media
  • National Association of the Deaf. Technology access research
  • Marschark, M. & Spencer, P.E. (2010). Oxford Handbook of Deaf Studies - Cognitive research
  • Kushalnagar, R. et al. (2010). Closed-caption quality research - Caption timing and accuracy
  • World Federation of the Deaf. Guidelines for digital accessibility

Usage

From code:

await cognitive_journey_init({
  persona: "deaf-user",
  goal: "complete checkout",
  startUrl: "https://example.com"
});

Or from the command line:

npx cbrowser cognitive-journey --persona deaf-user --start https://example.com --goal "complete checkout"



Copyright: (c) 2026 Alexa Eden.

License: MIT License

Contact: [email protected]
