Yukon Supreme Court says lawyers must disclose use of AI

Yukon's Supreme Court says lawyers in the territory have to inform the court when they use artificial intelligence to produce court documents or conduct legal research.

The court issued those instructions last month, citing "legitimate concerns about the reliability and accuracy of information generated from the use of artificial intelligence" and specifically referring to the chatbot ChatGPT.

Manitoba's Court of King's Bench released similar directions last month, after two New York lawyers were sanctioned by a U.S. court for using ChatGPT to produce a legal brief. The document contained fake quotes and referred to non-existent case law.

Thomas Slade, an Ottawa lawyer with Supreme Advocacy, said he is surprised Canadian courts are taking a "pre-emptive" approach to potential AI threats.

He added he isn't convinced the directive is necessary at this point. "The incidents out of the States, I think, are somewhat isolated. I haven't heard of anything similar happening in Canada," he said. "Usually, practice directions come about because there's some problem that's recurring. In this situation, I don't think there's a recurring problem yet."

Though not especially concerned about the presence of AI in courtrooms, Slade said he is worried about how people will use these systems to navigate legal matters in their day-to-day lives as the tools become more popular.

"The biggest threat is the risk of fake information or misinformation," he said. "I think there might be people out there that don't have the resources to pay for lawyers or don't want to go to a lawyer for whatever reason, so then they turn to these tools, and they may not realize the information these tools are generating is not legally sound."