With the tablet computer on her lap, Grace Simon, 13, hoisted her hand and set it down on the screen. It didn’t matter where her fingers landed. Just like that, she made a selection in an early version of an app designed to help her converse about her favorite activities.
“That’s pretty awesome,” said her mother, Jennifer Simon, of Westphalia, Mich.
Grace has athetoid cerebral palsy. While she can read, understand spoken words and signal ‘yes’ or ‘no’, she can’t speak or use extensive sign language to communicate because of her limited muscle control. A software engineering class at the University of Michigan has spent the past semester building systems that might help the sixth-grader connect with others and act more independently at school and at home. On Dec. 11, all 14 teams will demonstrate their systems for instructors – and for Grace.
“They’re all feeling a responsibility that this needs to work well,” said teaching assistant Chris McMeeking. “This is different from a typical class. When a project fails for us, we understand; we’re developers. But if it fails for Grace, it’s like we let her down.”
The students made a variety of systems. Some are specific – attempting to give Grace tools to draw, color and practice math. Others could help her answer questions or convey more information about what she needs, wants or likes. And several groups focused on giving Grace more general language tools. If she could just enter information into a computer – whether words, letters or choices from a list – that could vastly expand her ability to communicate. One group made a decision tree that could let her narrow her choice from a broad category to a single word or concept. Another designed a modified keyboard.
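The decision-tree approach the students describe can be sketched as a simple nested structure that each selection narrows until a single word or word list remains. The categories and words below are hypothetical illustrations, not the team's actual vocabulary or code:

```python
# Hypothetical decision tree: each selection narrows a broad category
# toward a single word or short word list.
TREE = {
    "people": {"family": ["mom", "dad"], "school": ["teacher", "friend"]},
    "things": {"school": ["book", "tablet"], "play": ["game", "puzzle"]},
}

def narrow(tree, choices):
    """Follow a sequence of selections down the tree."""
    node = tree
    for choice in choices:
        node = node[choice]
    return node

# Example: narrow(TREE, ["things", "school"]) yields ["book", "tablet"]
```

With only a handful of coarse selections per level, even a large vocabulary stays reachable in a few steps, which matters when each input action is physically costly.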
Because a typical keyboard, mouse or touchscreen wouldn’t work for Grace, the students used joysticks, Microsoft Kinect motion sensors and Intel Gesture Cameras to build systems she could potentially control with limb or facial movements.
The team named ‘Vision with Grace’ uses a Kinect camera to identify objects in view and then lets Grace make a selection by looking at an object for several seconds.
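The dwell-based selection described here can be sketched as a small state machine: a selection fires once gaze rests on the same object for a set number of seconds. The timing value and the per-frame update interface are assumptions for illustration; the real system reads gaze and object data from the Kinect:

```python
class DwellSelector:
    """Select an object once gaze dwells on it long enough."""

    def __init__(self, dwell_seconds=3.0):
        self.dwell_seconds = dwell_seconds
        self.target = None      # object currently being looked at
        self.elapsed = 0.0      # time gaze has rested on it

    def update(self, looked_at, dt):
        """Called each frame with the gazed-at object id (or None)
        and the frame time dt. Returns the object id when the dwell
        threshold is reached, otherwise None."""
        if looked_at != self.target:
            # Gaze moved: restart the dwell timer on the new target.
            self.target = looked_at
            self.elapsed = 0.0
            return None
        self.elapsed += dt
        if self.target is not None and self.elapsed >= self.dwell_seconds:
            self.elapsed = 0.0  # reset so the same object can be re-picked
            return self.target
        return None
```

Resetting the timer whenever gaze shifts is what makes the interface forgiving of involuntary movement: only sustained attention triggers a choice.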
“When Grace and her family first visited our class, they expressed frustration that they were sometimes unable to quickly understand her needs,” said team member Stephen Lanham, a senior computer science major. “They said sometimes Grace would have to crawl across the room to show her parents that she was interested in some object such as a book for school. Our solution provides a clean and simple interface for Grace to choose items identified by the Kinect.”
Many of the teams that relied on a modified touchscreen built upon a project developed in the class several years ago. ASK Interfaces, created by teaching assistant McMeeking and his colleagues, scans through possible selections on a tablet computer and essentially turns the entire screen into a button.
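The scanning idea behind ASK Interfaces can be sketched as a highlight that cycles through the options on a timer, with any press anywhere on the screen choosing whichever option is currently highlighted. This cycling logic is an assumption about how such a scanner typically works, not ASK's actual implementation:

```python
class Scanner:
    """Cycle a highlight through options; any press selects the
    currently highlighted one, so the whole screen acts as one button."""

    def __init__(self, options):
        self.options = options
        self.index = 0  # highlighted option

    def advance(self):
        """Move the highlight to the next option (driven by a timer)."""
        self.index = (self.index + 1) % len(self.options)
        return self.options[self.index]

    def select(self):
        """A press anywhere on screen picks the highlighted option."""
        return self.options[self.index]
```

Because selection no longer depends on where a finger lands, the scan rate becomes the key tunable parameter, which is exactly what the students found they had to adjust for Grace.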
One group that uses ASK built ‘My Favorite Things’, which impressed Grace and her mom in a mid-semester beta release. It is a database of items that Grace enjoys, organized into categories.
“We want to help Grace express herself and tell others about her favorite activities. We hope this helps her answer questions and allows her to have more two-sided conversations,” said Catherine Fisher, a senior computer science major in the class.
Students say they learned more than how to code.
“Grace’s enthusiasm and willingness to participate with helping us refine our product has been a pleasure and reminded us of the value of creating highly configurable software,” said Eyad Makki, a senior studying computer science. “What seemed like a reasonable speed for us wasn’t for her and we had to tweak it.”
The instructor behind the course is David Chesney, a lecturer and research investigator in computer science and engineering who has made a point of putting social context into his senior- and freshman-level classes for more than a decade.
Last year, his class focused on developing games to help children with autism with large motor and social skills. One of those games, PATH, is in a clinical trial with patients with brachial plexus palsy – a congenital movement disorder in which one arm is weaker and shorter than the other. And McMeeking’s ASK Interfaces is expected to be available for public use in early 2014.
“The work has to be important,” Chesney said. “It would be easy enough to come up with contrived projects semester after semester, but this isn’t contrived. This is something we’re doing that really has the potential to help people. That makes it so much more meaningful.”
Microsoft and Intel both donated technologies to the class and the Mott Golf Classic has also provided funding.