
    Google’s Keyword Planner, which suggests keywords to associate with adverts, has been found to have recommended pornographic phrases related to the terms “Black girls,” “Latina girls,” and “Asian girls”.

    The same keywords appeared for the same ethnicities if “girls” was changed to “boys”.

    The keywords did not appear for “white boys” and “white girls”.


    The Markup, which first reported on the findings, suggested that Google’s keyword suggestions linked minorities with sexual objectification.

    Google’s pornography filter highlighted 203 of 435 suggested terms as “adult ideas”.

    It is unclear what this label refers to, but the Markup implies that Google was aware of the keywords’ sexual connotations.

    The results included Black women performing oral sex, as well as “Piper Perri Blacked”, referring to the adult actress and the porn production company.

    The Markup also implied that the tool’s behaviour made it more difficult to market services related to minorities.

    “The language that surfaced in the keyword planning tool is offensive and while we use filters to block these kinds of terms from appearing, it did not work as intended in this instance,” a Google spokesperson said in response to the report.

    “We’ve removed these terms from the tool and are looking into how we stop this from happening again.”

    “Within the tool, we filter out terms that are not consistent with our ad policies,” the spokesperson said. “And by default, we filter out suggestions for adult content. That filter obviously did not work as intended in this case and we’re working to update our systems so that those suggested keywords will no longer be shown.”

    Google emphasised that a phrase being suggested as a keyword does not mean that adverts using that suggestion would be approved.

    The company did not provide an explanation for why searches for “white boys” and “white girls” did not return suggested results. The Independent has reached out to Google for more information.

    Google’s Keyword Planner is used by marketers to decide which keywords would be most effective for adverts in Google’s search results, an advertising business that reportedly generated over $134 billion (£105 billion) in revenue for the search giant in 2019.

    Google’s products can reflect biases and stereotypes because its algorithms incorporate data from the internet, something that has been visible in the company’s other algorithms.

    It was suggested that the racial fetishisation of women of colour was the cause of these recommendations.

    This is not the first instance in which Google’s algorithms have produced apparently racist results. In 2015, Google’s Maps software was flooded with racist associations, so that searches for a set of racist slurs directed users to the White House.

    In 2017, it was discovered that Google allowed advertisers to target content to users based on racial slurs, including derogatory epithets such as “wetback”, as well as the category “Nazi”.

    Instagram, owned by Facebook, has also recently said that it will take action to change its algorithm to avoid bias against people of colour.
