By Vama Saini, September 22, 2023
Dr. Gideon Christian, an assistant professor at the Faculty of Law at the University of Calgary, has taken on the task of confronting racial bias in artificial intelligence (AI) facial recognition technology. His research has garnered attention and $50,000 in funding from the Office of the Privacy Commissioner of Canada.
The implications of this research are extensive, given the widespread use of facial recognition technology across various sectors, including law enforcement, healthcare, education and finance. Christian discussed his motivations, findings and proposed solutions.
“For AI to reach its full potential, it’s critical to examine AI’s adverse impact on people of colour,” said Christian.
Christian’s research delves into the racial bias in facial recognition technology when identifying individuals of colour, specifically Black women. He said that the root cause of this bias lies in the data used to train AI models.
“Technology is not inherently unbiased — it’s a product of the data it’s built upon, and biases can arise when training AI on data that doesn’t represent the intended environment or population,” explained Christian. “When technology is designed using data predominantly from a white population and deployed in Indigenous environments, it results in biased outcomes.”
Christian shed light on six cases in which Black individuals were wrongly arrested as a result of faulty AI identifications. One prominent case he highlighted involved Porcha Woodruff, a Black woman who was eight months pregnant when she was falsely implicated in a carjacking.
“Despite her obvious pregnancy and the victim’s description of the assailant not matching her situation, the arrest was solely guided by AI facial recognition,” said Christian. “This case exemplifies the inherent risks of relying solely on AI and the critical need for human oversight in such applications.”
Christian also highlighted the inequitable application of facial recognition technology in Canada. Inaccurate matches against government databases were used to challenge the claims of individuals with refugee status, with the government alleging that they had made their claims under false identities. Notably, all of the cases involved Black women.
“Employing facial recognition technology on vulnerable refugee populations, despite its well-documented errors in recognizing minority groups, raises serious concerns about fairness and justice in our immigration system,” said Christian.
To confront this racial bias in AI development, Christian proposed a framework that would require gender- and race-based analysis of commercial facial recognition tools before they are deployed for public use.
“Where such tools are used in decision-making in the public sector, such as immigration decisions, individuals impacted by such decisions should be informed of the technology’s use and given the opportunity to challenge the decision,” said Christian.
Christian stressed the need for diversity within the AI industry — not just in its workforce, but in the design of the technology itself, so that it incorporates different perspectives.
“Diversity in design matters. The current industry’s success caters to a homogenous image,” said Christian. “It needs diverse perspectives and representative data to ensure fair technology.”
Using data that accurately mirrors the population where the technology will be deployed is critical. Christian explained that proportional representation in training data can significantly reduce bias.
“Racism in policies isn’t determined by intention — it’s about the impact on individuals based on their race,” said Christian.
To advance his research goals, Christian has initiated a collaboration with the Alberta Civil Liberties Research Centre (ACLRC), an institution dedicated to promoting civil liberties and human rights in Alberta through research and education.
“This partnership aims to increase public understanding and awareness of the racial impacts of AI facial recognition technology through webinars, workshops and the publication of research papers,” said Christian.
Through this collaborative effort, backed by government funding, Christian strives to foster an inclusive and equitable technological landscape that upholds the rights and dignity of all.