4/7/2023

iSchool for the future!

[Photo: Aylin Caliskan, photographed at Seattle's Olympic Sculpture Park, says a more diverse tech workforce will help fight bias in artificial intelligence. (Photos by Doug Parry)]

Can machines be sexist? Researcher Aylin Caliskan found out the answer when she entered “O bir profesör, O bir öğretmen” into Google Translate on her computer. “O” is a gender-neutral pronoun in Turkish; it can mean she, he or it. The Google program, powered by statistical machine translation algorithms, translated the Turkish sentences as “He is a professor. She is a teacher.”

The gender-biased response was not an anomaly, she found. As they process massive language datasets, rapidly learning as they go, artificial intelligence (AI) programs tend to associate female terms – she, hers, her, sister, daughter, mother, grandmother – with arts and family. They link male terms with science, mathematics, power and career. “This is the way AI perceives the world,” says Caliskan, who joins the iSchool faculty this fall as an assistant professor specializing in the emerging field of AI ethics.

Sexism, racism, ageism, ableism and LGBTQ discrimination are rapidly spreading through such everyday big-tech tools as text generation, voice assistance, translation and information retrieval, Caliskan warns. “Machines are replicating, perpetuating and amplifying these biases, yet no regulation is in place to audit them.”

Caliskan will join the UW’s new Responsible AI Systems & Experiences team, a research group investigating how intelligent information systems interact with human society. “A diverse set of developers can start building AI systems that align with their own needs and values and test it based on their own lived experiences,” she says.
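The kind of word-association bias Caliskan describes can be illustrated with a small sketch: comparing cosine similarities between gendered terms and attribute terms in a vector space. This is only a toy illustration with made-up three-dimensional vectors, not Caliskan's actual method, code, or data; real studies use embeddings trained on large corpora.

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# Hypothetical toy "embeddings" chosen for illustration only.
vectors = {
    "she":    [0.9, 0.1, 0.2],
    "he":     [0.1, 0.9, 0.2],
    "family": [0.8, 0.2, 0.3],
    "career": [0.2, 0.8, 0.3],
}

def association(word, attr_a, attr_b):
    """Positive => word leans toward attr_a; negative => toward attr_b."""
    return cosine(vectors[word], vectors[attr_a]) - cosine(vectors[word], vectors[attr_b])

# In this toy data, "she" associates with "family" and "he" with "career".
print(association("she", "family", "career"))  # positive
print(association("he", "family", "career"))   # negative
```

With real embeddings learned from web-scale text, the same comparison tends to reproduce the gendered associations the article describes, which is why auditing these systems matters.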