Inside BABC member Microsoft's efforts to make AI systems more inclusive of people with disabilities

Saqib Shaikh says people who are blind, like himself, typically develop highly organized routines to keep track of their things — putting keys, wallets, canes and other essentials in the same places each time.


Saqib Shaikh, Microsoft principal software engineering lead and one of the founders of Seeing AI. Photo by John Brecher.


But sometimes life gets messy: A child needs help finding a lost stuffed animal, identical garbage bins get moved around on the curb or coats get jumbled together at a party.


Today, a person using Microsoft’s Seeing AI app can point a phone camera at a scene, such as a conference room table, and hear a description of what’s in the frame: laptops, water bottles, power cords, phones. But it would sometimes also be useful for the machine learning algorithms powering the app to recognize objects that are specific to that individual person, said Shaikh, a Microsoft engineer whose team invented Seeing AI.


Until recently, there hasn’t been enough relevant data to train machine learning algorithms to tackle this kind of personalized object recognition for people with vision disabilities. That’s why City, University of London, a Microsoft AI for Accessibility grantee, has launched the Object Recognition for Blind Image Training (ORBIT) research project to create a public dataset from scratch, using videos submitted by people who are blind or have low vision.


The data will be used to train and test new algorithms to recognize and locate important personal objects, which can range from cell phones to face coverings to kitchen tools.
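The personalization task this describes is often framed as few-shot learning: a user registers a handful of examples of each personal object, and the system matches new sightings against those. Below is a minimal, purely illustrative sketch of one common approach (nearest-prototype classification over feature embeddings); it is not ORBIT's or Seeing AI's actual method, and all names and vectors are hypothetical stand-ins for embeddings a real vision model would produce.

```python
# Illustrative sketch of few-shot personalized object recognition:
# each personal object's few example embeddings are averaged into a
# "prototype", and a new sighting is labeled by its nearest prototype.
# In a real system the vectors would come from a vision model applied
# to the user's video frames; here they are toy 2-D features.
from math import dist  # Euclidean distance (Python 3.8+)

def make_prototypes(support):
    """Average each object's example embeddings into one prototype vector."""
    protos = {}
    for label, examples in support.items():
        n = len(examples)
        protos[label] = [sum(component) / n for component in zip(*examples)]
    return protos

def classify(query, protos):
    """Return the personal-object label whose prototype is closest."""
    return min(protos, key=lambda label: dist(query, protos[label]))

# Toy usage: two personal objects, each registered from two "frames".
support = {
    "my_keys": [[0.9, 0.1], [1.1, 0.0]],
    "my_mug":  [[0.0, 1.0], [0.2, 0.8]],
}
protos = make_prototypes(support)
print(classify([1.0, 0.05], protos))  # → my_keys
```

The appeal of this family of methods for accessibility scenarios is that adding a new personal object requires only a few user-supplied examples, not retraining the underlying model.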

“Without data, there is no machine learning,” said Simone Stumpf, senior lecturer at the Centre for Human-Computer Interaction Design at City, University of London, who leads ORBIT. “And there’s really been no dataset of a size that anyone could use to introduce a step change in this relatively new area of AI.”


The lack of machine learning datasets that represent or include people with disabilities is a common roadblock for researchers and developers working with those communities, whether they aim to build intelligent tools that assist with everyday tasks or to create AI systems that are less likely to amplify biases that skew decision making.


Read the full story and watch the video.
