Big brother

The dos and don'ts of AI monitoring. Plus the right to a healthy environment, AI washing, and dangerous product warnings.

Good morning.

In the midst of an ongoing shareholder dispute, Gildan Activewear is now up for sale. It’s like Succession — but with basic Canadian t-shirts in place of a global media empire. Pour one out for Gildan’s lawyers, who have presumably worked around the clock the past few weeks.

— Dylan Gibbs


6 min read

  • Privacy commissioner weighs in on AI monitoring

  • In memoriam: two highly respected former Chief Justices

  • The right to a healthy environment

  • How specific should a dangerous product warning be?


Filling the AI-regulation gap


Re McMaster University, 2024 CanLII 17583 (ON IPC)

Canada’s lack of AI regulation is like a bat signal for privacy commissioners. While governments are busy hashing out policies and legislation, privacy commissioners are stepping up to recommend AI guardrails. Case in point: Ontario’s privacy commissioner recently investigated the use of AI-powered exam software by McMaster University.

What happened: McMaster started proctoring exams remotely during the pandemic. The University uses AI-powered software that monitors test-takers’ behaviour through their webcams. The software flags suspicious students as potential cheaters. And it saves a recording of each student’s exam so that human instructors can double-check the flagged incidents.

McMaster mostly complied with Ontario’s existing privacy laws. The commissioner said McMaster is allowed to use the monitoring software. But it needs to button a few things up. McMaster has to give students clearer notice about the personal information it collects during exams. And it has to stop letting its software provider use student recordings to “improve” the software.

But the commissioner didn’t stop there. Recognizing that Canada’s AI laws and policies are still a work in progress, the commissioner recommended McMaster take extra steps to safeguard its use of invasive AI technology. Here are some of the “additional guardrails” the commissioner told McMaster to put in place:

  • Do an impact assessment on the risks of automated decision-making.

  • Consult with students, especially those most likely to suffer from biased decisions. (The commissioner cited research suggesting that exam monitoring software flags students with darker skin tones more often and generates false positives for students with disabilities.)

  • Don’t bring the “full weight” of an academic integrity allegation against a student without giving them an informal chance to challenge the software’s assessment.

  • Let students opt out of the invasive monitoring entirely by writing their exams in person.

You can read the commissioner’s full list of recommendations at the end of her report.

Big picture: Canada is working on broader AI regulation, but we’ve still got plenty of ground to cover before catching up to the EU. The House of Commons is studying the proposed Artificial Intelligence and Data Act, which would put tighter regulations on “high impact” systems like McMaster’s biometric monitoring software. On the provincial front, Ontario is still working on the Trustworthy AI Framework it started in 2021. And Alberta recently said it might need 18 months to draft AI legislation.

But the wait for a comprehensive scheme doesn’t mean it’s the Wild West. Expect privacy and human rights regulators to continue shedding light on appropriate uses of AI, with or without legislation directly on point.

In Memoriam

The Honourable Roy McMurtry, former Attorney General and Chief Justice of Ontario, passed away at age 91. He’s remembered for legislation that established a bilingual court system and reformed family law. As chief justice, he was on a panel that issued one of the first Canadian appellate decisions legalizing same-sex marriage.

The Honourable Claude Bisson, former Chief Justice of Québec, passed away just shy of his 93rd birthday. He was appointed as an Officer of the Order of Canada in 1999:

Throughout his career as a judge, he was an effective counsel and an example to all the members of his profession. He is known for his tremendous humanity and his tireless efforts to ensure the orderly administration of justice. He oversaw the implementation of numerous reforms which greatly improved the way laws are applied. Very committed socially and generous with his time, he is a remarkable man in the eyes of his community.

Order of Canada Recipients: The Honourable Claude Bisson


🌳 A new report from the Law Commission of Ontario recommends major changes to Ontario’s Environmental Bill of Rights — the proposed changes would guarantee that everyone in Ontario has the right to a healthy environment. You can review all 58 of the report’s recommendations here.

🏛️ BC’s Ministry of the Attorney General released an update on its plan to create a combined regulator for the province’s legal professions. The Law Society of BC doesn’t think the proposed plan does enough to keep the regulator independent from government.

🤖 The US Securities and Exchange Commission fined two investment firms that were a bit too eager to ride the AI wave. Calling it “AI-washing”, the SEC said the firms lied about using AI to make themselves look more attractive.

  • One of the two firms was Toronto-based Delphia, which the SEC hit with a $225,000 fine for making statements like “[Delphia] put[s] collective data to work to make our artificial intelligence smarter so it can predict which companies and trends are about to make it big and invest in them before everyone else.”

  • Anticipating a trend, SEC Chairman Gary Gensler put out a video warning other companies about the risk of AI-washing.

💰 KPMG is coming for Big Law. The latest game plan? Invest more money in AI.


Getting specific about the duty to warn

How specific does a dangerous product warning need to be?

According to the BC Court of Appeal, it’s not enough to warn passengers that a tour bus doesn’t have seatbelts. The manufacturer of a beltless bus needs to spell out the specific risk that rollover collisions (while rare) can eject passengers and seriously injure them. If the company operating the bus knows about the risk of a rollover (as most would), then the operator is also on the hook to give passengers that specific warning.

In this case, the manufacturer and the tour bus operator didn’t give passengers a proper warning. But it didn’t matter. The plaintiff knew about the risk of ejection and chose to ride the bus anyway.


Dylan Gibbs

That’s all for today. Govern yourself accordingly.

You can also find me on LinkedIn and X/Twitter @DylanJGibbs. If someone sent you this email, subscribe here.


Don’t keep us a secret. Get your friends to sign up and you’ll be rewarded. You can find your custom referral link in the email version of Hearsay.

Referral rewards include coffee, sticker pack, t-shirt, book, crewneck, and a $500 prepaid credit card