By Amanda Florian, The Washington Post
Over decades, input devices in the video game industry have evolved from simple joysticks to sophisticated controllers that deliver haptic feedback. But with Enabled Play, a new piece of assistive tech created by self-taught developer Alex Dunn, users are embracing a different kind of input: facial expressions.
While companies like Microsoft have sought to expand accessibility through adaptive controllers and accessories, Dunn’s new device takes those efforts even further, translating users’ head movements, facial expressions, real-time speech and other nontraditional input methods into mouse clicks, keystrokes and thumbstick movements. The device has users raising eyebrows — quite literally.
“Enabled Play is a device that learns to work with you — not a device you have to learn to work with,” Dunn, who lives in Boston, said via Zoom.
Dunn, 26, created Enabled Play so that everyone — including his younger brother with a disability — can interface with technology in a natural and intuitive way. At the beginning of the pandemic, the only thing he and his New Hampshire-based brother could do together, while approximately 70 miles apart, was game.
“And that’s when I started to see firsthand some of the challenges that he had and the limitations that games had for people with really any type of disability,” he added.
At 17, Dunn dropped out of Worcester Polytechnic Institute to become a full-time software engineer. He began researching and developing Enabled Play two and a half years ago. The work initially proved challenging, as most speech-recognition programs lagged in response time.
“I built some prototypes with voice commands, and then I started talking to people who were deaf and had a range of disabilities, and I found that voice commands didn’t cut it,” Dunn said.
That’s when he started thinking outside the box.
Having already built Suave Keys, a voice-powered program for gamers with disabilities, Dunn created Snap Keys — an extension that turns a user’s Snapchat lens into a controller when playing games like “Call of Duty,” “Fall Guys” and “Dark Souls.” In 2020, he won two awards for his work at Snap Inc.’s Snap Kit Developer Challenge, a competition among third-party app creators to innovate Snapchat’s developer tool kit.
With Enabled Play, Dunn takes accessibility to the next level. With a wider variety of inputs, users can connect the assistive device — equipped with a robust CPU and 8 GB of RAM — to a computer, game console or other device to play games in whatever way works best for them.
Dunn also spent time making sure Enabled Play was accessible to people who are deaf, as well as people who want to use nonverbal audio input, like “ooh” or “aah,” to perform an action. Enabled Play’s vowel sound detection model is based on “The Vocal Joystick,” which engineers and linguistics experts at the University of Washington developed in 2006.
“Essentially, it looks to predict the word you are going to say based on what is in the profile, rather than trying to assume it could be any word in the dictionary,” Dunn said. “This helps cut through machine learning bias by learning more about how the individual speaks and applies it to their desired commands.”
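To illustrate the idea Dunn describes — and only as a rough sketch, not Enabled Play’s actual code — a recognizer can snap whatever it hears to the closest phrase in a user’s saved profile instead of searching the entire dictionary. The profile entries, action names and matching logic below are hypothetical.

```python
import difflib

# Hypothetical user profile: spoken phrases mapped to controller actions.
# In a real system these would come from the user's saved configuration.
PROFILE = {
    "jump": "press_a",
    "reload": "press_x",
    "open map": "press_back",
}

def match_command(recognized_text: str, profile: dict, cutoff: float = 0.6):
    """Snap a raw speech-recognition hypothesis to the closest profile phrase.

    Limiting candidates to the profile (rather than an open vocabulary)
    makes the match more tolerant of accents and atypical speech.
    """
    candidates = difflib.get_close_matches(
        recognized_text.lower().strip(), profile.keys(), n=1, cutoff=cutoff
    )
    return profile[candidates[0]] if candidates else None

# A slurred or misheard phrase still resolves to the intended action.
print(match_command("jum", PROFILE))     # -> "press_a"
print(match_command("relode", PROFILE))  # -> "press_x"
```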
Dunn’s AI-enabled controller takes into account a person’s natural tendencies. If a gamer wants to set up a jump command every time they open their mouth, Enabled Play would identify that person’s individual resting mouth position and set that as the baseline.
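A minimal sketch of that calibration step, assuming a facial landmark tracker supplies a normalized “mouth openness” value (the numbers, thresholds and function names are illustrative, not Enabled Play’s implementation): sample the user at rest, treat that as the baseline, and fire the jump command only when openness clearly rises above it.

```python
from statistics import mean, pstdev

def calibrate_baseline(resting_samples: list[float]) -> tuple[float, float]:
    """Learn the user's resting mouth openness from a short calibration window."""
    return mean(resting_samples), pstdev(resting_samples)

def is_mouth_open(openness: float, baseline: float, spread: float, k: float = 3.0) -> bool:
    """Trigger only when openness clearly exceeds the personal baseline."""
    return openness > baseline + k * max(spread, 0.01)

# Openness readings (e.g., normalized lip distance from a landmark tracker).
baseline, spread = calibrate_baseline([0.12, 0.14, 0.13, 0.15, 0.12])
if is_mouth_open(0.42, baseline, spread):
    print("send jump command")  # e.g., emit an A-button press or spacebar
```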
In January, Enabled Play officially launched in six countries — its user base extending from the U.S. to the U.K., Ghana and Austria. One of Dunn’s primary goals was to close the gap in accessibility and pricing left by other assistive gaming devices.
“There are things like the Xbox Adaptive Controller. There are things like the HORI Flex [for Nintendo Switch]. There are things like Tobii, which does eye-tracking and stuff like that. But it still seemed like it wasn’t enough,” he said.
Unlike some devices that are compatible with only one gaming system or computer at a time, Dunn’s AI-enabled controller — priced at $249.99 — supports a combination of inputs and outputs. Speech therapists say that compared to augmentative and alternative communication (AAC) devices, which are medically essential for some people with disabilities, Dunn’s device offers simplicity.
“This is just the start,” said Julia Franklin, a speech language pathologist at Community School of Davidson in Davidson, N.C. Franklin introduced students to Enabled Play this summer and feels it’s a better alternative to other AAC devices on the market that are often “expensive, bulky and limited” in usability. High-tech AAC systems can range from $6,000 to $11,500, and even low-end eye-trackers run in the thousands of dollars. A person can also download AAC apps on a mobile device, which cost between $49.99 and $299.99 for the app alone.
“For many people who have physical and cognitive differences, they often exhaust themselves to learn a complex AAC system that has limits,” she said. “The Enabled Play device allows individuals to leverage their strengths and movements that are already present.”
Internet users have applauded Dunn for his work, noting that asking for accessibility should not equate to asking for an “easy mode” — a misconception often cited by critics of making games more accessible.
“This is how you make gaming accessible,” one Reddit user wrote about Enabled Play. “Not by dumbing it down, but by creating mechanical solutions that allow users to have the same experience and accomplish the same feats as [people without disabilities].” Another user who said they regularly worked with young patients with cerebral palsy speculated that Enabled Play “would quite literally change their lives.”