The tech industry’s struggle with racism and sexism
Students call out the systemic issues in the tech space
“The distribution of preferences and abilities of men and women differ in part due to biological causes and that these differences may explain why we don’t see equal representation of women in tech and leadership,” stated former Google software engineer James Damore in his now-infamous manifesto.
Damore was fired in 2017, shortly after his memo, “Google’s Ideological Echo Chamber: How bias clouds our thinking about diversity and inclusion,” in which he criticized the company’s diversity initiatives, began circulating publicly.
Despite the attention brought by Damore's manifesto, it was far from the first instance of prejudice within the tech sector. Google itself has faced numerous allegations of discriminatory practices, underscoring the pervasive nature of bias within the industry.
Alex, a second-year software engineering student at Concordia University whose last name has been omitted for his safety, shared how stereotypes have affected him as a Chinese-Canadian in his field of study.
“I've often felt limited by stereotypes. I've often felt trapped by the assumption that I must excel in the field,” he said.
“My classes are filled with people who look like me. Most of my peers come from similar backgrounds,” said Alex. “There’s this idea that people like me are meant to be tech-savvy, yet tech fields and representations in the media clash with my reality, since I don’t look like the typical geek. It is hard to keep up with.”
Despite coming from a family with an engineering background, Alex emphasized the need to transcend stereotypes and advocate for diversity and inclusion initiatives in the tech field.
In fact, due to a historic lack of diversity and inclusion initiatives in the tech sector, decades’ worth of technology still in use and production has been encoded with racial bias, disproportionately harming people of colour.
For example, research on the image generator Stable Diffusion has shown that artificial intelligence (AI) models associate lighter skin tones with higher-paying jobs and darker skin tones with lower-paying ones.
Additionally, researchers at ProPublica found that criminal risk assessment algorithms used in policing and the courts systematically assigned Black defendants higher risk scores, while white defendants were more often rated as lower risk by the same racially biased technology.
Lindsay Rodgers, an advisor at Concordia's Applied AI Institute, emphasized the pivotal role of interdisciplinary collaboration in addressing biases within the tech field. Recognizing the fast-paced nature of the tech industry, she highlighted the importance of incorporating diverse perspectives to ensure responsible AI development.
“Computer scientists are not trained to identify biases,” Rodgers said. “I'm not trained to generate a data set, it is not my skill set.” She advocates for proactive measures to embed ethical considerations into all stages of tech development.
Rodgers hopes diversity and inclusion will become integral components of tech innovation. Her working group, Affecting Machines, offers workshops on gender equity and bias to help demystify AI and promote its equitable application across diverse fields.
As generative AI becomes increasingly prevalent, estimates suggest it could dominate marketing content by 2032.
Amid these challenges, the Applied AI Institute and the Concordia Student Union’s (CSU) AI Launch Lab want to address bias and promote diversity in artificial intelligence development.
Timothy Pereira, co-founder of ConcordAI, stressed the importance of fostering a more inclusive tech environment.
"Although I don't want to sound pessimistic, the programs we offer are a safe space that unfortunately does not currently reflect the tech industry." Pereira said the significance of these programs as safe spaces for individuals to work and learn without fear of judgment.
However, Pereira hasn’t lost hope. “The optimal solution would be for all tech companies to reflect these diversity programs, but in the meantime, we are hoping that those who participate in said programs can take what they learn from us and apply it to their work.”
Pereira underscored the necessity of bridging the gap between the program’s ideals and the realities of the industry. “While these efforts mark a positive step towards greater inclusivity, they still have to prioritize impact. It is really important for tech to reflect society, not the field,” he said.
Personal biases, a lack of diversity, and the tech industry’s tendency to prioritize profit over societal impact all perpetuate discrimination in algorithmic systems, according to Pereira.
Rodgers believes it is crucial to raise awareness, take proactive measures, and strengthen regulation in the industry. “Although many think that these issues do not concern them, tech is more than code and data. Targeted applications or Netflix suggestions might seem small, but they concern us and affect us. Taking the time to learn and be involved is important for all of us.”
This article originally appeared in Volume 44, Issue 12, published March 19, 2024.