I place a high emphasis on an inclusive design methodology in all my work, making it a core characteristic of my practice as a designer and strategist. While my approach is “inclusion”, one of its natural outcomes is “accessibility”. You can read more about my inclusive design methodology at this link. This methodology has enabled me to realize several practical design solutions that expand target markets, foster sustainability, and increase usability and relevance for everyone.
Universal Access Point
During the conceptualization of the Canadian Museum for Human Rights core exhibitions, a design challenge was how to ensure all content would be accessible to all visitors. Solutions for digital media seemed relatively obvious to me, but making physical content and experiences inclusive required careful consideration. For example, how would we make the artefact in a case, the framed photo, or the text screened on a wall accessible to those who couldn’t see or read them? The solution I developed was to leverage the visitor’s mobile device. It could and would be used for personalization, but we would consider personalization from an inclusion and accessibility perspective as well.
The Universal Access Point (UAP) is a system of five components:
- A tactile, cane-detectable floor strip
- A tactile, braille marker
- A low-energy Bluetooth beacon (iBeacon)
- A mobile app on a mobile device (iOS, Android)
- A Content Management System (and Digital Asset Management System)
With this solution, content is catalogued in a CMS. Content hierarchies are created and assigned numbers, and the iBeacons are assigned the same numbers, which are redundantly represented on the tactile markers. Using the mobile app, visitors can have content delivered to them dynamically and have static content read aloud via the device’s text-to-speech functions. Supplemental rich content (e.g., ASL, audio description) can also be assigned to the content structures and sent to the app. Visitors can self-guide using the cane-detectable strips and tactile marker vocabulary, or simply use a “near-me” mode and be prompted by the Bluetooth signals.
Universal Keypad
Having worked on a number of accessible web projects over my career, developing the Universal Keypad concept was relatively straightforward in principle. The bulk of the work was landing on the specifications for my original concept, and this was done through partnerships with Ralph Appelbaum Associates, the Inclusive Design Research Centre (OCAD), Kubik Maltby, and Electrosonic, as well as my own teams at the CMHR, including contract consultants Bruce Wyman and Sina Bahram on the software semantics and standardization of my proposed solution.
My initial concept was to use the principles of Web semantics, JAWS (screen-reading software), and an abridged keyboard (instead of the typical QWERTY) to create a standard interface for touchscreen and gesture-based interfaces, thereby facilitating use by a greater variety of individuals with varying needs and preferences. I hired the IDRC at OCAD to prove the concept, and then worked with my other partners to produce prototypes and, eventually, a final design integrated into the built environment.
The UKP-I and its semantic software structure, using text-to-speech (as well as accessibility-compliant media), ensure that any user can navigate a digital or tangible interface, zoom the display, adjust volume, and rest their wrist. Moreover, it ensures that a consistent interface is applied across various styles of touchscreen interfaces, tangible interfaces, and gesture-based installations. This consistency provides very high usability, since visitors never need to relearn how to use an interface at each new installation.
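To make the idea of screen-reader-style navigation via an abridged keypad concrete, here is a minimal sketch. The key names (`NEXT`, `PREV`, `HOME`), the flat element model, and the announcement format are all assumptions for illustration, not the UKP-I’s actual specification.

```python
# Illustrative model of an abridged keypad driving screen-reader-style
# traversal over a semantic interface, with each move announced via TTS.

class SemanticNavigator:
    """Moves a focus cursor over a flat list of labelled elements,
    announcing each one the way a screen reader would."""

    def __init__(self, elements: list[str]):
        self.elements = elements
        self.index = 0

    def announce(self) -> str:
        # In an installation this string would be sent to text-to-speech.
        return f"{self.elements[self.index]}, {self.index + 1} of {len(self.elements)}"

    def press(self, key: str) -> str:
        if key == "NEXT":
            self.index = min(self.index + 1, len(self.elements) - 1)
        elif key == "PREV":
            self.index = max(self.index - 1, 0)
        elif key == "HOME":
            self.index = 0
        return self.announce()


nav = SemanticNavigator(["Heading: Gallery 1", "Button: Play film", "Link: More stories"])
```

Because the same small key vocabulary drives every installation, a visitor who learns it once can operate any touchscreen, tangible, or gesture-based exhibit, which is the consistency the UKP-I is designed to deliver.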
The UKP-A is an audio-only version. This keypad is used in instances where the media is non-navigable, such as film presentations. In these instances it provides access to the described audio tracks, as well as individual volume control.