TechLair


Report: Female-sounding voice assistants fuel gender bias

Tuesday, May 21, 2019 by Piyush Suthar


AI-powered voice assistants from Google, Amazon, Apple, and others could be perpetuating harmful gender biases, according to a recent UN report. The report, titled “I’d blush if I could” — Siri’s response to provocative queries or flirtatious statements — says these female-voiced helpers are often depicted as “obliging and eager to please,” which reinforces the idea that women are “subservient.” Worse, the report states, is the way they give “deflecting, lacklustre, or apologetic responses” to abuse or criticism. Because most voice assistants speak with a female voice, it sends a signal that women are… docile helpers, available at the touch…

This story continues at The Next Web




    Copyright © 2019 TechLair. All rights reserved.