An AI tool Google offers to developers will no longer add gender labels to images. The change was first reported by Business Insider, which noted that a person’s gender cannot be determined from their appearance alone. Google’s Cloud Vision API will instead tag such images as “person” to avoid reinforcing bias.
Yesterday, the company emailed developers about the change to its widely used Cloud Vision API. The tool uses artificial intelligence to analyze images and detect faces, landmarks, explicit content, and other recognizable features. Going forward, Google will tag images of people with labels like “person” instead of “man” or “woman.”
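For developers curious about what the tool actually returns, here is a minimal sketch of calling label detection with Google’s Python client library for the Cloud Vision API. The file path is a placeholder, and the exact labels and client version may differ; after the change described above, people in photos are expected to come back as “Person” rather than “Man” or “Woman.”

```python
# Minimal sketch: label detection with the google-cloud-vision client.
# Assumes credentials are configured and "photo.jpg" is a placeholder path.
from google.cloud import vision

def label_image(path: str) -> list[str]:
    """Return label descriptions (e.g. 'Person') for a local image file."""
    client = vision.ImageAnnotatorClient()
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.label_detection(image=image)
    # Per the policy change, person images should be labeled 'Person'
    # rather than a gendered term.
    return [label.description for label in response.label_annotations]

if __name__ == "__main__":
    print(label_image("photo.jpg"))  # hypothetical local image
```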
According to the Business Insider report, Google cited its own AI principles in the email announcing the change:
“Given that a person’s gender cannot be inferred by appearance, we have decided to remove these labels in order to align with the Artificial Intelligence Principles at Google, specifically Principle #2: Avoid creating or reinforcing unfair bias.”
What do you think of the search giant’s latest step to prevent gender bias? Do you believe AI systems will continue to show bias toward a particular gender in the future?