A digital interface adaptation company

Making inaccessible interfaces accessible for people with disabilities using artificial intelligence.

Available on all major platforms on

November 18, 2023



While there have been noteworthy accessibility advances over the past two decades, the inaccessibility of consumer appliances and other products with digital interfaces continues to be a significant problem for many people with disabilities. Increasingly, consumer appliances like washing machines, microwaves, refrigerators, and dishwashers have flat, digital interfaces that are largely inaccessible for people with visual, motor, and cognitive disabilities. This problem is projected to grow in frequency and severity as manufacturers continue to move toward predominantly digital interfaces.



MyUI.AI is a cloud-based system that leverages artificial intelligence to adapt the digital interfaces of consumer appliances, kiosks, and nearly any compatible IoT device to the specific interaction needs of people with disabilities (PWD).
Collecting disability and interaction data via an accessible mobile application, MyUI learns the interaction needs and preferences of users and then uses artificial intelligence to ‘adapt’ inaccessible digital interfaces to specific user needs and preferences.
From a practical perspective, this means that an interface that by default supports only touch or tactile interaction may now support gestures, eye gaze, affect, and/or voice commands. The user interface (UI) adaptation further extends to the visual elements of the UI, including adaptations of interface colors, contrast, font size, spacing, button size, and layout.
MyUI enables people with disabilities to interact with customized, rather than default and potentially inaccessible, digital interfaces.
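As an illustrative sketch, a per-user adaptation profile for a system like this might pair preferred input modalities with visual UI adjustments. The field names and logic below are hypothetical, not MyUI's actual data model:

```python
# Hypothetical sketch of a per-user UI adaptation profile; field names
# are illustrative and do not reflect MyUI's actual data model.
from dataclasses import dataclass


@dataclass
class AdaptationProfile:
    input_modalities: list          # e.g., ["voice", "eye_gaze"] instead of touch only
    font_scale: float = 1.0         # multiplier applied to the default font size
    high_contrast: bool = False     # swap palette for a high-contrast theme
    min_button_px: int = 44         # enlarge touch targets for motor disabilities


def apply_profile(default_ui: dict, profile: AdaptationProfile) -> dict:
    """Return a copy of a default UI description adapted to the profile."""
    ui = dict(default_ui)
    ui["font_size"] = round(default_ui["font_size"] * profile.font_scale)
    ui["theme"] = "high_contrast" if profile.high_contrast else ui["theme"]
    ui["button_px"] = max(default_ui["button_px"], profile.min_button_px)
    ui["inputs"] = profile.input_modalities
    return ui


# A low-vision user who prefers voice alongside touch.
low_vision = AdaptationProfile(["voice", "touch"], font_scale=1.5, high_contrast=True)
adapted = apply_profile(
    {"font_size": 14, "theme": "default", "button_px": 32, "inputs": ["touch"]},
    low_vision,
)
```

The key idea is that the same default interface description yields different concrete UIs depending on the profile applied.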


Take a video tour of MyUI

MyUI learns your interaction preferences through a series of minigames and adapts any compatible digital interface using artificial intelligence.

Solution / Value Proposition

The MyUI ecosystem of technologies has three components:

  1. the user-facing mobile application,
  2. a machine learning model deployed to the cloud, and
  3. tooling to support adaptive human-machine interface (HMI) development.

The mobile application collects demographic and interaction-related data, such as type of disability, age, and interaction preferences, through a series of minigames. This data is fed to a machine learning model deployed to the cloud, which processes the interaction and demographic data to predict the specific user interface characteristics (e.g., font type, layout, contrast) and modalities (e.g., eye gaze, affect, speech) necessary to support optimal interaction. The digital interface of a consumer device, designed by the manufacturer using our tooling, then adapts to user needs based on data retrieved from the cloud-based model.
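The three-stage flow described above, app data in, interface parameters out, can be sketched end to end. All function names, fields, and the rule-based "model" below are hypothetical placeholders, not MyUI's actual API or trained model:

```python
# Hypothetical end-to-end sketch: the app collects data, a cloud model
# predicts UI characteristics, and a device applies them. Illustrative only.

def collect_interaction_data() -> dict:
    """Stage 1: data gathered by the mobile application's minigames."""
    return {"disability": "low_vision", "age": 62,
            "avg_tap_error_px": 18, "prefers_voice": True}


def predict_ui_characteristics(user_data: dict) -> dict:
    """Stage 2: stand-in for the cloud-hosted ML model. A real model would
    be trained on interaction data; simple rules serve as a placeholder."""
    prediction = {"font_size": 14, "contrast": "default", "modalities": ["touch"]}
    if user_data["disability"] == "low_vision":
        prediction["font_size"] = 22
        prediction["contrast"] = "high"
    if user_data["prefers_voice"]:
        prediction["modalities"].append("voice")
    return prediction


def adapt_device_interface(prediction: dict) -> str:
    """Stage 3: the manufacturer-built HMI applies the retrieved parameters."""
    return (f"font={prediction['font_size']}px, "
            f"contrast={prediction['contrast']}, "
            f"inputs={'/'.join(prediction['modalities'])}")


summary = adapt_device_interface(predict_ui_characteristics(collect_interaction_data()))
```

Separating the stages this way mirrors the architecture described above: the app and device never need to embed the model, only to exchange data with the cloud.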

MyUI Milestones


Semifinalist award in the US DOT’s Inclusive Design Challenge.


Accepted into the gBETA 2023 mobility accelerator in Indianapolis, Indiana.


May-November: raised funds through crowdfunding to expand our engineering team.

Meet our amazing team

In-depth expertise in accessibility.

16+ years in UX, accessibility, and software engineering. ~$6 million in grants from the US DOT, NHTSA, NSF, Google Research, and others.

Julian Brinkley

Ph.D., PMP

7 years in AI, data science, and software engineering at some of the world's largest tech companies.

Kuntal Mait


Subscribe to our newsletter