Accessible Design

I place a strong emphasis on inclusive design, making it a core characteristic of my practice as a designer. This inclusive design methodology has produced several practical design solutions, some of which are detailed here.

Universal Access Points
Universal Keypad (A and I)

Universal Access Point
During the conceptualization of the Canadian Museum for Human Rights core exhibitions, a central design challenge was how to ensure that all content would be accessible to all visitors. Solutions for digital media seemed relatively obvious, but making physical content and experiences inclusive required careful consideration. The solution I developed was to leverage the visitor's mobile device: it was already going to be used for personalization, and we could approach personalization from an inclusion and accessibility perspective as well.

The Universal Access Point (UAP) is a system of five components:

  1. A tactile, cane-detectable floor strip
  2. A tactile, braille marker
  3. A Bluetooth Low Energy beacon (iBeacon)
  4. A mobile app on a mobile device (iOS, Android)
  5. A Content Management System (and Digital Asset Management System)

With this solution, content is catalogued in the CMS. Content hierarchies are created and each is assigned a number; the iBeacons are assigned the same numbers, which are redundantly represented on the tactile markers. Through the mobile app, content is delivered dynamically to the visitor, so static exhibit content can be described via text-to-speech. Supplemental rich content (e.g. ASL, audio description) can also be assigned to the content structures and sent to the app. Visitors can self-guide using the cane-detectable strips and the tactile marker vocabulary, or simply use a "near-me" mode and be prompted by the Bluetooth beacon signals.
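As a rough sketch of how the shared-number mapping and "near-me" mode might look in app code (the names, content shape, and lookup logic below are illustrative assumptions, not the CMHR implementation):

```typescript
// Hypothetical sketch: the shared number ties the CMS record, the iBeacon,
// and the tactile marker together. Shapes and names are assumptions.

interface ExhibitContent {
  number: number;               // shared ID: CMS node, iBeacon, tactile marker
  title: string;
  description: string;          // read aloud via text-to-speech
  supplements: {
    aslVideoUrl?: string;       // ASL interpretation, if assigned
    audioDescriptionUrl?: string;
  };
}

// Content fetched from the CMS, keyed by the shared number.
const contentByNumber = new Map<number, ExhibitContent>();

// "Near-me" mode: the strongest nearby beacon determines which content to announce.
function nearestContent(
  beacons: { number: number; rssi: number }[]
): ExhibitContent | undefined {
  const nearest = [...beacons].sort((a, b) => b.rssi - a.rssi)[0];
  return nearest ? contentByNumber.get(nearest.number) : undefined;
}

// Self-guided mode: the visitor enters the number found on the braille marker.
function contentForMarker(markerNumber: number): ExhibitContent | undefined {
  return contentByNumber.get(markerNumber);
}
```

Either path resolves to the same content record, so the text-to-speech description and any supplemental media stay consistent whether the visitor is prompted by a beacon or navigates by the tactile markers.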


Universal Keypad

Having worked on a number of accessible web projects over my career, developing the Universal Keypad concept was relatively straightforward in principle. The bulk of the work was landing on the specifications for my original concept, and this was done through partnerships with Ralph Appelbaum Associates, the Inclusive Design Research Centre (OCAD), Kubik Maltby, and Electrosonic, as well as my own teams at the CMHR, including contract consultants Bruce Wyman and Sina Bahram on the software semantics and standardization side of this solution.

The initial concept was to use the principles of web semantics, JAWS (screen-reading software), and an abridged keyboard (instead of the typical QWERTY layout) to create a standard interface for touchscreen and gesture-based interfaces, thereby facilitating use by a greater variety of individuals with varying needs and preferences.

The UKP-I and its semantic software structure, using text-to-speech (as well as accessibility-compliant media), ensure that any user can navigate a digital or tangible interface, zoom the interface, adjust volume, and benefit from built-in wrist support. Moreover, a consistent interface is applied across various styles of touchscreen or gesture-based installations, providing high usability and avoiding the need to relearn how to use each interface.
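To illustrate the idea of a small, consistent key set driving a semantically structured screen, here is a minimal sketch; the key names, element model, and zoom/volume ranges are assumptions for illustration, not the UKP specification:

```typescript
// Illustrative sketch only: an abridged keypad navigating a semantic
// element list, screen-reader style, with zoom and volume adjustments.

type KeypadKey =
  | "NEXT" | "PREVIOUS" | "ACTIVATE"
  | "ZOOM_IN" | "ZOOM_OUT" | "VOLUME_UP" | "VOLUME_DOWN";

interface SemanticElement {
  role: "button" | "heading" | "link" | "region"; // web-style semantics
  label: string;                                  // spoken via text-to-speech
  activate?: () => void;
}

class KeypadNavigator {
  private index = 0;

  constructor(
    private elements: SemanticElement[],
    private speak: (text: string) => void,        // kiosk TTS engine (assumed)
    private display: { zoom: number; volume: number }
  ) {}

  handle(key: KeypadKey): void {
    switch (key) {
      case "NEXT":
        this.index = Math.min(this.index + 1, this.elements.length - 1);
        this.announce();
        break;
      case "PREVIOUS":
        this.index = Math.max(this.index - 1, 0);
        this.announce();
        break;
      case "ACTIVATE":
        this.elements[this.index].activate?.();
        break;
      case "ZOOM_IN":
        this.display.zoom = Math.min(this.display.zoom + 0.25, 4);
        break;
      case "ZOOM_OUT":
        this.display.zoom = Math.max(this.display.zoom - 0.25, 1);
        break;
      case "VOLUME_UP":
        this.display.volume = Math.min(this.display.volume + 0.1, 1);
        break;
      case "VOLUME_DOWN":
        this.display.volume = Math.max(this.display.volume - 0.1, 0);
        break;
    }
  }

  // Read the focused element aloud, as a screen reader would.
  private announce(): void {
    const el = this.elements[this.index];
    this.speak(`${el.label}, ${el.role}`);
  }
}
```

Because every installation exposes its content through the same small key vocabulary, a visitor who has learned the keypad at one exhibit can use it at any other without relearning the interface.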

The UKP-A is an audio-only version. This keypad is used where the media is non-navigable, such as a film presentation. In this instance it provides access to the described audio tracks and offers individual volume control.