Eric Risberg, Associated Press
In this Tuesday, Oct. 4, 2016, file photo, Google CEO Sundar Pichai speaks during a product event in San Francisco. Pichai has declared artificial intelligence more important to humanity than fire or electricity. Google and other tech companies that deal with artificial intelligence, such as LinkedIn and New York-based startup Agolo, are learning it's difficult to get gendered pronouns right.

SALT LAKE CITY — If you're one of the 1.5 billion people who use Gmail, you might have noticed last year that your emails suddenly started writing themselves. But did you notice that the autocomplete feature never uses gendered pronouns like she/he or him/her?

In May, Google introduced Smart Compose, which helps users finish sentences. Another feature called Smart Reply generates quick, automatic responses to emails, including phrases like "No problem!" and "Unfortunately, I can't make it."

In November, Reuters reported that a Google researcher discovered the potential for bias when he typed “I am meeting an investor next week,” and Smart Compose suggested, “Do you want to meet him?” instead of “her,” according to Gmail product manager Paul Lambert.

As a result, Google decided Smart Compose and Smart Reply would not suggest gendered pronouns at all, Reuters reported.

Google and other tech companies that deal with artificial intelligence, such as LinkedIn and New York-based startup Agolo, are learning it's difficult to get gendered pronouns right.

According to Nick Haynes, director of data science at Automated Insights, a company that specializes in natural language generation software, getting gender pronouns right is a priority for tech companies because gender is an especially sensitive issue in today's cultural climate.

"Despite the tremendous recent growth and hype around artificial intelligence (AI) applications, many people are understandably suspicious of AI," said Haynes. "This makes the process of building trust with users a critical part of deploying an AI system, but misgendering a person can be a glaring mistake that can quickly erode a user's trust in an entire product or company."

" Because AI is built and trained by humans, AI systems inherit the same challenges and biases in the use of language that its human creators and users experience. "
Nick Haynes, director of data science at Automated Insights

Haynes said pronouns are tricky because the English language is often ambiguous when it comes to gender. Names like Taylor and Leslie can be unisex, whereas nouns like doctor or secretary often carry gendered connotations even though they're not explicitly gendered, he said.

"Because AI is built and trained by humans, AI systems inherit the same challenges and biases in the use of language that its human creators and users experience," said Haynes.

Google is familiar with how bias can creep into products and result in embarrassing blunders. In 2015, the company apologized when its image-recognition software mislabeled a black couple as gorillas. The next year, it had to fix Google Search's autocomplete function after it suggested the question "Are Jews evil?" to users seeking information about Jewish people.

To avoid more gaffes, Google has also banned expletives, racial slurs and mentions of tragic events from its predictive technologies, according to Reuters.

Programs like Smart Compose are created with natural language generation, a method by which computers analyze the relationships between words in text written by humans and learn to write sentences of their own.

"The (process) successfully captures analogy relations, such as 'Man is to king as woman is to queen.' However, the same (process) also yields 'Man is to doctor as woman is to nurse' and 'Man is to computer programmer as woman is to homemaker,'" said said Londa Schiebinger, professor of History of Science at Stanford University and author of a case study on gender and ethnic bias in machine learning algorithms. "Taking no action means that we may relive the 1950s indefinitely."

Marcio Jose Sanchez, Associated Press
This Tuesday, July 19, 2016, file photo shows the Google logo at the company's headquarters in Mountain View, Calif.

A message about Smart Compose from Google's Help Center reads, "As language understanding models use billions of common phrases and sentences to automatically learn about the world, they can also reflect human cognitive biases. Being aware of this is a good start, and the conversation around how to handle it is ongoing."

Google is not alone

Google isn't the only company dealing with the challenge of getting pronouns right.

Agolo, a New York-based startup, uses artificial intelligence to summarize business documents. It is difficult for the company's technology to reliably determine which pronoun goes with which name, said chief technology officer Mohamed AlTantawy. To improve accuracy, the company's program pulls as much context from the document as possible.

" You have no way of knowing the gender of Andy or Alex. "
Mohamed AlTantawy, Agolo chief technology officer

"The rule here is if any task is intellectually hard for humans, it's also hard to solve using AI," said AlTantawy.

For example, take the sentence: "Andy and Alex met yesterday when she gave him the gift."

"You have no way of knowing the gender of Andy or Alex," said AlTantawy. "You would assume that Andy is a female because that name appeared first in the sentence."

But additional context helps: "Andy and Alex met yesterday when she gave him the gift. Alex is a great mother."

"Now this changed everything! It turns out that Alex is the female," AlTantawy explained.

As another layer of fact-checking, Agolo also uses a database of known facts about companies, including each company's headquarters, products, and the names and genders of prominent employees, AlTantawy said.

Microsoft’s LinkedIn also avoids gendered pronouns in its year-old predictive messaging tool, Smart Replies.

According to Arpit Dhariwal, principal product manager at LinkedIn, the company does not collect the gender of members and allows members to use their preferred professional names on their profiles. In addition, features including Smart Replies only use pronouns that are gender neutral, like "their" or "you."

"We find it more effective to use a member’s profile name in responses, where applicable, e.g., 'Thank you, Erica,'" Dhariwal said.

'O bir doktor': ambiguity in foreign languages

Some autocomplete suggestions might not offend people but still have gender-related implications. In its iMessage application, Apple suggests “policemen” to complete “police” and “salesman” for “sales,” for example.

When you type the gender-neutral Korean sentence "Geubun-eun gyosu ibnida" into Google Translate, it gives you "He is a professor" in English. So do Microsoft's translator app and Alibaba's Language Service.

In December, Google published a press release that said the company was addressing gender bias by providing feminine and masculine translations for some gender-neutral words.

"Now you’ll get both a feminine and masculine translation for a single word — like 'surgeon' — when translating from English into French, Italian, Portuguese or Spanish. You’ll also get both translations when translating phrases and sentences from Turkish to English. For example, if you type 'O bir doktor' in Turkish, you’ll now get 'She is a doctor' and 'He is a doctor' as the gender-specific translations," the statement reads.


AI Now’s 2017 report says that the lack of women and ethnic minorities working in the field of artificial intelligence is a foundational problem shaping the development of AI products and their impact on society.

"Many technology companies are making diversity a priority in the hope that employees with broader experiences can help reduce the risk of hidden bias," said Haynes.

"Any company whose technology speaks directly to consumers using artificial intelligence, machine learning, or deterministic rules will run into this problem at some point and come to the crossroads of deciding how their technology is best suited to broach the subject," he added.