Source: aiuniverse.xyz
An artificial intelligence tool that Google provides to developers will no longer add gender labels to images; the company says a person's gender can't be determined just by how they look in a photo, Business Insider reports.
The company emailed developers today about the change to its widely used Cloud Vision API tool, which uses AI to analyze images and identify faces, landmarks, explicit content, and other recognizable features. Instead of using "man" or "woman" to identify people in images, Google will tag them with labels like "person," as part of its larger effort to avoid instilling human bias in AI algorithms.
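For context, developers reach this analysis through the Cloud Vision client libraries. The sketch below is a minimal, hedged example assuming the google-cloud-vision Python package is installed and GOOGLE_APPLICATION_CREDENTIALS points at a valid service account key; the file path is hypothetical. It requests label annotations for a local image, and under the change described above, a photo of a person would come back tagged "Person" rather than with a gendered term.

```python
# Minimal sketch of a Cloud Vision label-detection request.
# Assumes: google-cloud-vision is installed and
# GOOGLE_APPLICATION_CREDENTIALS points at a valid key file.
from google.cloud import vision


def detect_labels(path: str) -> None:
    """Print the labels Cloud Vision assigns to the image at `path`."""
    client = vision.ImageAnnotatorClient()

    with open(path, "rb") as f:
        image = vision.Image(content=f.read())

    response = client.label_detection(image=image)

    # Per the change described above, images of people yield labels
    # like "Person" instead of "Man" or "Woman".
    for label in response.label_annotations:
        print(f"{label.description}: {label.score:.2f}")


if __name__ == "__main__":
    detect_labels("photo.jpg")  # hypothetical local image path
```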
In the email to developers announcing the change, Google cited its own AI guidelines, Business Insider reports. “Given that a person’s gender cannot be inferred by appearance, we have decided to remove these labels in order to align with the Artificial Intelligence Principles at Google, specifically Principle #2: Avoid creating or reinforcing unfair bias.”
AI image recognition has been a thorny issue for Google in the past. In 2015, a software engineer noted that Google Photos' image recognition algorithms were categorizing his black friends as "gorillas." Google promised to fix the issue, but a follow-up report by Wired in 2018 found Google had simply blocked its AI from recognizing gorillas and had not done much else to address the problem at its core.
Google released its AI principles in 2018, in response to backlash from Google employees who protested the company's work on a Pentagon drone project. The company pledged not to develop AI-powered weaponry, and it also outlined a number of principles, such as the one referenced above, to address bias, oversight, and other potential ethical issues in its future development of the technology.