My colleague AI

Getting to know my newest coworker before it outsmarts me

Erin Seize queried DALL-E 2 for an image of “illustrated realistic line art. journalist inspired by iris apfel and betsy johnson meeting a robot touching fingers, we see entire figures and there are plants on the boarders with twisting vines and realistic variegated leaves unfurling. sunny day a few clouds in the sky. a tree far in the background between the journalist and the robots. have fun.” In response, this is what it sent back.

I'm a semester away from becoming a journalism school graduate and being pushed out of the proverbial nest. My main concern is: Am I prepared for what’s out there? Luckily, I've got a pretty eclectic background. I've worked in horticulture, visual effects and animation production. I’ll happily pass on boring routine tasks to my “colleague AI,” as the German Journalists’ Association refers to it. Automation sounds exciting to me.

The ability to harness artificial intelligence (AI) tools is a valuable skill that helps me stay nimble in a rapidly evolving technological landscape. AI-based transcription and translation software has been vital in freeing up time to focus on the more meaningful aspects of storytelling, like connecting with people and hearing their stories.
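To give a sense of how little effort that can take, here is a rough sketch of transcribing an interview with the open-source Whisper model in Python. Whisper is just one option among many, and the file name is a stand-in, not a real recording.

```python
# Rough sketch: transcribing an interview with the open-source
# openai-whisper package (one option among many; the file name is a stand-in).
import whisper

model = whisper.load_model("base")          # small, CPU-friendly model
result = model.transcribe("interview.mp3")  # hypothetical recording
print(result["text"])                       # full transcript as plain text
```

The point is that the routine part now takes a few lines, leaving more time for the interview itself.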

Still, fewer jobs and more reliance on AI could limit my prospects. Canadian census data from 2010 to 2020 follows the news industry's downward financial trend, with fewer than half as many people listing journalism as their occupation by the end of that decade. In the same period, data volume worldwide surged 5,000 per cent, according to Forbes. Yet I have the daunting responsibility of fact-checking at least some of this data, and I won't have a team of researchers at my disposal.

AI has been around for a while. It includes things like autocomplete, voice assistants and those annoying alerts about pictures of years gone by. We’re currently in the age of generative artificial intelligence (genAI). 

Unlike earlier task-specific AI, genAI can produce original output that was never explicitly programmed into it, such as text, images and audio. Think ChatGPT, Bard, Midjourney and DALL-E.

Fortunately, my human skills still hold value. Chatbots routinely hallucinate, or make things up, and you're bound to find yourself in hot water if you're not one for fact-checking. Publishers can't totally hand over the reins to genAI yet. Some outlets have tried to do just that and failed. Experts agree that human oversight is a non-negotiable element of journalism at this point.

The Montreal Gazette has seen its newsroom staff dwindle by 85 per cent over the last 35 years. In 1988, it was a buzzing newsroom of 240 people. That number now stands at 35. Meanwhile, census data shows that the population of metropolitan Montreal has continued to grow over the same period. More is happening, with far fewer people to cover it.

The Associated Press recently launched an initiative to bridge the AI awareness gap for local news providers. Tools are trained to meet the specific needs of smaller newsrooms, and AI is already delivering real-world results. Breaking weather alerts are being delivered more quickly, which can save lives. According to the AP, newsroom editors can receive thousands of emails weekly. AI-powered technology can sort, classify and even enter noteworthy events into the calendar, which saves valuable time.

There are ethical implications to consider. AI learns from real-world data, which can be biased. Take health data, as argued in this article in PLOS Digital Health. The omission of race and ethnicity data in Canadian national health databases makes it hard to account for groups whose outcomes differ from those of the broader population. Disability is another area lacking in most health-related datasets, which can lead to algorithms that exclude these individuals from data-driven discussions and policies.

If AI systems are trained on skewed data, they can reinforce and entrench that bias. It's essential for me to keep this in mind when doing my journalistic sleuthing.

I checked in with DALL-E again, asking it to generate “a Canadian family.” The bot typically generates four images per prompt. My initial guess was that at least one of the four images would include a visible minority, given that one in four people in Canada identify as part of a “racialized group,” according to the 2021 census.

Erin Seize queried DALL-E 2 for an image of “a Canadian family.” In response, this is what it sent back.

I was wrong. The families all appear white, without an image to the contrary. 
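For the curious, the same experiment can be reproduced against DALL-E 2's public API. Below is a minimal sketch using OpenAI's Python client; I ran my test through the regular web interface, so treat the parameters here as illustrative assumptions rather than a record of exactly what I did.

```python
# Minimal sketch: requesting four DALL-E 2 images for one prompt
# via OpenAI's Python client (assumes an OPENAI_API_KEY is set).
from openai import OpenAI

client = OpenAI()
response = client.images.generate(
    model="dall-e-2",
    prompt="a Canadian family",
    n=4,                  # four candidate images, as in my experiment
    size="1024x1024",
)
for image in response.data:
    print(image.url)      # links to the generated images
```

Running a prompt like this a few times is a quick, low-stakes way to spot-check a model's defaults before trusting it with anything editorial.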

AI cannot replace a human’s unique lived experience. “Context and interpretation is everything in our industry,” according to journalists responding to a 2020 global survey of what news organizations are doing with AI. These are the types of qualities that machines struggle to replicate. 

The public is skeptical of AI delivering their news. A poll commissioned by the Canadian Journalism Foundation found that 90 per cent of respondents wanted transparency from news organizations about their use of AI.

Reports recently surfaced alleging that Sports Illustrated created fake author profiles to accompany entirely AI-generated stories, which were not labeled as such. This is not a unique case.

AI will get smarter, and fast. The next milestone is artificial general intelligence (AGI), or strong AI, which aims to make machines as intelligent as humans, including self-awareness, problem solving, planning and learning from experience. The industry is pushing hard, and the genAI market is projected to be worth $1.3 trillion by 2032, according to Bloomberg.

The integration of AI tools into journalism is non-negotiable: adapt or die. Learning how to tell these tools what we need so they can streamline our work, whether in a newsroom or as a self-employed freelance journalist, is a must. Until the flow of data subsides, I see co-existing with my colleague AI as my journalistic duty.