The Centre for Public Legal Education Alberta (CPLEA) undertook a trial of Artificial Intelligence (AI) tools to explore ways to improve both internal workflows and its public legal education and information resources.

The 2022–2023 Bridging the Gaps Legal Needs Assessment from the Alberta Law Foundation highlighted AI as a promising way to enhance access to justice in Alberta. Specifically, the report noted generative AI as a potential tool for delivering public legal education and information (PLEI), including generating forms, making referrals, and translating documents. Generative AI is a type of artificial intelligence that can create new content, such as text, images, videos, or music, in a way that can look or sound like a human made it.
Although the private legal sector has long used AI for legal research, document discovery, and client management, these tools have historically been out of reach for non-profits due to cost. However, the rapid rise of accessible, cost-effective AI tools is changing that landscape.
Trialing AI tools
In April 2024, CPLEA received special project funding from the Alberta Law Foundation to trial AI within our organization. In keeping with the spirit of people-centered justice, CPLEA took a bottom-up approach to explore how AI could:
- help us provide translated PLEI in more languages and formats
- support self-guided help for users (e.g., through interactive forms or chatbots)
- present legal content in multimedia formats to better meet diverse needs
To ensure our team shared the same foundational knowledge, we started with training from the Alberta Machine Intelligence Institute (Amii). We learned about AI best practices, which also helped us uncover further opportunities for AI use in the public legal education space.
Following the insights we gained from Amii, our team then worked together to plan a well-scoped eight-month trial of AI tools to address four areas of our everyday work:
- Translation – to test whether AI could speed up and lower the cost of translating our materials
- Web and interactivity – to explore the possibility of adding chatbots to enhance web experiences
- Video and multimedia – to improve our video production workflows and explore new formats using AI
- Plain language editing and writing – to improve our plain language editing and writing workflows wherever possible
Key findings
Our team trialed various AI tools (or tools with AI capabilities) in translation, video and multimedia, and plain language editing/writing. Below is a summary of our key findings.
AI supports human expertise but cannot replace it.
In the areas that we tested, AI tools offer valuable support to translation, video and multimedia production, and plain language editing and writing. However, we found that all AI outputs still require human review to ensure accuracy and clarity.
AI translation tools show potential for short-form content but there are review hurdles.
Tools like DeepL performed well across multiple languages, especially French, Spanish, and Turkish, when translating short-form content (which we capped at four pages). With a human review process, we believe such tools can significantly reduce the time needed to produce multilingual PLEI material. However, longer-form content, such as web pages, remained a challenge: translations were error-prone and labour-intensive to verify. We also encountered practical hurdles in finding suitable language reviewers in Alberta.
Plain language editing and writing AI tools improved internal workflows … and more.
CPLEA does not currently use AI to generate content that goes to the public, but AI tools like ChatGPT and Grammarly helped speed up our team’s editing and writing processes. With OpenAI tools like ChatGPT developing at an astonishing pace, we are starting to see more sophisticated applications in areas ranging from design, video, and multimedia creation to web and coding support.
AI video production is still immature, but shows promise for voiceovers, captions/transcripts, and editing.
Tools such as ElevenLabs and Adobe Premiere proved useful for improving audio clarity, creating quality voiceovers, and generating captions and transcripts, saving weeks (if not more) of manual recording and editing work.
Chatbot and interactive web tools are not viable for us – yet.
Our early chatbot experiment raised some accuracy and hallucination concerns. Our current technical infrastructure is also not designed to support development of an effective chatbot/interactive web tool. Going forward, we will be working to overcome these technical barriers.
Our guiding principles for responsible use
Based on our AI tool trial and internal discussions, CPLEA has developed a few initial guiding principles to support safe, effective, and ethical use of artificial intelligence across our work. These principles reflect lessons learned from real-world experience during the trial period and will inform our organization’s approach. Because AI technology and best practices are ever-evolving, we anticipate revisiting these guiding principles regularly.
Always have a human in the loop
AI tools can help us do our work. However, no tool replaces the need for human review, oversight, and final decision-making.
Use and disclosure
CPLEA’s current policy is to not use AI to generate PLEI content from scratch. When using generative AI tools for tasks such as translation, we will disclose what AI tool we used and how we used it. Due to the risk of hallucinated or outdated legal information, we will not use generative AI for legal research.
Always consider responsible AI
We must always consider responsible AI issues such as privacy, bias, security risks, trust and safety, and the role of AI tools (including their human impacts). We’ve identified several high-level questions to guide our thinking. Below are a few examples.
Privacy: How does this tool handle input and output data? Does any data become part of this tool’s own training model? Are we inputting personal or confidential information into this tool?
Bias: Is the input or output data of this tool showing bias? How might this tool perpetuate bias?
Security risks: What are the tool’s terms and conditions? What is the tool’s input and output data being used for? Who owns the tool’s outputs?
Trust and safety: Is this tool accurate? Have we fact-checked the tool’s input and output? Is the output going to be used for internal or public purposes? Does this tool’s output (or our use of this tool) undermine public trust in us or public safety?
Role of AI tools and their human impacts: What’s the use case for this tool? How does using this tool affect how we work? Does it augment our work, and can we use it effectively? Are we becoming dependent on it? How does using this tool affect other people more broadly? In other words, is it replacing a human?
Be strategic and selective when using AI
We will only use tools that ethically increase our team’s capacity to deliver on our mission of creating resources that help people understand their rights and obligations under the law. While we will continuously monitor promising developments for future AI exploration and implementation, we will be cautiously selective when choosing new tools to use.
Future directions
Building on the insights from our AI tool trial project, CPLEA has identified areas for further exploration and potential implementation. These ideas reflect both the opportunities AI presents for improving access to PLEI and the organizational capacity-building efforts needed to make them a reality.
One such area is translating our longer-form material. Our project demonstrated that AI tools can help produce quality translations of short-form PLEI material into certain languages, especially when paired with human review for accuracy, consistency, and quality. Natural next steps would be to test these tools on slightly longer-form content, such as our existing booklets on foundational legal topics (for example, Renting 101), and in a wider variety of languages.
Another area is investing in our technical infrastructure to make it more compatible with AI tools. This longer-term project will better position us to consider implementing a chatbot, or virtual justice navigator, to curate information and referrals for users.
CPLEA is committed to continuing to explore and invest in AI opportunities, both internal and user-facing, that ultimately allow us to improve both access to justice and the quality of justice available to Albertans.
DISCLAIMER The information in this article was correct at time of publishing. The law may have changed since then. The views expressed in this article are those of the author and do not necessarily reflect the views of LawNow or the Centre for Public Legal Education Alberta.