Beginning around the release of the Apple iPhone X, mainstream media attention to Face Recognition has gone from nearly nonexistent to the press equivalent of a floodlight shined directly on the industry. With this new attention have come concerns about privacy and safety, and we've been listening.
Last summer, our Diversity & Ethnicity Recognition App was released, and over 10 million people uploaded their selfies, curious to see how the algorithms would classify their ethnicities.
In 'Diversity Gone Viral' we expressed our gratitude for the amazing response to the app and acknowledged some of the feedback we received, both positive and negative. We also learned a few valuable lessons about the complexity of the taxonomy of ethnicity, which has given us a runway toward better understanding the current concerns associated with face recognition software.
"While most users will get a spot-on result, we acknowledge that the ethnicity classifiers currently offered (Black, White, Asian, Hispanic, ‘Other’) fall short of representing the richly diverse and rapidly evolving tapestry of culture and race."
- Brian Brackeen, CEO, Kairos
Time for change
Recent news concerning Amazon's Rekognition software has sparked conversations about the overall safety of using Face Recognition in law enforcement. At the center of this concern is bias: specifically, the often-inaccurate results the software returns when tasked with recognizing people of color.
"Imperfect algorithms, non-diverse training data, and poorly designed implementations increase the chance for bad outcomes. Surveillance uses, such as face recognition enabled body-cams, ask too much of today’s algorithms." https://t.co/xzG1b6BMw1 #privacy #AI #surveillance
— Kairos (@LoveKairos) May 29, 2018
We've written about this bias and have expressed our opposition to commercial Face Recognition in law enforcement. Yet we have continued to entertain millions of people with an app that, we know, will for at least some users return an inaccurate and possibly offensive result.
In consideration of this, and of our overall commitment to responsibly representing our industry, we have decided to shutter our Diversity and Ethnicity App. While our app is meant exclusively for entertainment, we recognize that the line between the use of commercial Face Recognition for entertainment and its use in law enforcement is currently blurred. Until this line is more clearly defined, our concerns around sensitivity to bias and safety just won't allow us to continue contributing to the confusion.
Standing up for what is right
We'll think hard about how we might be able to bring back the Diversity and Ethnicity app. If we do, it will signal the end of commercial Face Recognition in law enforcement applications, which for many citizens could actually mean the difference between life and death. We'd want to bring it back to help us understand each other, and to help us openly design and share a diversity dataset that will help all AI companies build products we can all use and enjoy equally.
We said it back in March, and today it's never been more relevant... https://t.co/2RpuYCiiDO
— Kairos (@LoveKairos) June 15, 2018
Brian Brackeen
Brian is the CEO at Kairos, a Human Analytics platform that radically changes how companies understand people.