
AI and access to the justice system: design challenges and considerations

Gordon Ross, VP and Partner at OXD, stresses that AI can’t improve public access to court information unless the underlying information is addressed first.

Is there a place for AI in digital services for the justice system?

Recent developments in artificial intelligence (AI), including the launch of ChatGPT by OpenAI1 in November 2022, have significantly raised public awareness of concepts like machine learning (ML) and large language models (LLMs). This coverage and attention have compelled organizations across various sectors to weigh the costs, benefits, and risks of AI, and to develop policy and guidance for its responsible use.

The justice sector and legal system in Canada are not immune to these developments in AI. While some sectors appear to have been caught flat-footed in considering AI until recently, legal scholars and computer scientists have been discussing the topic for nearly 40 years at conferences like the International Conference on Artificial Intelligence and Law2, founded in 1987. The 2022 UNESCO-sponsored AI and the Rule of Law3 online course covers important topics relating to the use of AI in the courts. 

Locally, the BC Ministry of Attorney General explored the use of AI in the sector through the AI Justice Challenge4 in 2018 and 2019, supported by InnovateBC. This initiative investigated five key areas of AI use:

  • assisted form completion tools,
  • intelligent document reviewers,
  • chatbots,
  • audio transcription,
  • and a courts inquiry platform.

Within BC’s administrative justice sector, Darin Thompson5 of the Civil Resolution Tribunal considered how simple AI can be applied to online dispute resolution in his 2015 International Journal of Online Dispute Resolution article6.

AI is more than just ChatGPT...

It’s important to note that AI is a broad category of different technologies and applications. 

As Dentons Canada’s Kirsten Thompson, Luca Lucarini, and Noah Walters7 wrote in their article titled “Know the Limitations and Liabilities of ChatGPT”8, it’s common to find four types of AI9 referred to:


Reactive machines

Have no memory and react only to present inputs—e.g., IBM Deep Blue10 playing chess.


Limited memory systems

Have some memory, with past experiences informing decisions—e.g., self-driving cars and ChatGPT.


Theory of mind systems

Attempt to understand the emotions and beliefs of humans—generally considered not yet achieved.


Self-awareness

Are conscious and self-aware—also not considered achieved.

For instance, ChatGPT operates as a limited memory system, generating responses based on its training data set, which can be quite large (e.g., historical court decisions).

Judges, court administrators, and government technology policy makers are discussing how AI could affect citizens’ experiences of the justice system. Much of OXD's work in the justice sector over the past 20 years has focused on improving access to justice11 by applying legal design practices that increase the accessibility and usability of the justice system for citizens.

So, how might an ever-growing number of people with legal problems12 use AI tools to follow and navigate court processes?

AI is the solution, but what’s the problem?

Before looking at how AI can assist the public with the courts, one should consider what makes courts hard to use now. Our previous work with courts shows that they struggle to present information clearly, as evidenced by how self-represented litigants13 navigate court forms and rules. This points to a long-standing issue14: courts need to present their information resources in a more understandable manner.

People without a legal background can find legal terminology difficult to understand. Literacy data shows how widespread these challenges are: 48% of adult Canadians have literacy skills that fall below a high school level15. While courts have made information more understandable through the ongoing plain language work of writers, lawyers, and judges, the journey of making courts accessible is far from done16.

The best course of action for the design of any information resource is to start with evidence of user needs, gathered through design research. That research should be followed by sound information architecture and content strategy. This holds true regardless of whether AI is used to interpret, summarize, or re-package information for the end user.

Justice system information providers still need to focus on removing redundant, out-of-date, and untimely information from their websites before considering AI. Re-writing that information will help people feel more confident about what to expect from the courts, understand how the courts work, and navigate the procedures relevant to their legal situation.

This work on high-quality information helps those who navigate the system feel clear on what to expect and how to proceed. Clear and concise legal information and guidance sets those expectations and is a significant determinant of how participants feel about their justice experience.

Design challenges that lie ahead with AI and the justice system

For now, OXD continues to consider several questions regarding the proposed use of AI to assist the general public in navigating the justice system: 

  • Can we design information resources that give members of the public insight into the choices and decisions they face in a legal process? When do these choices occur, and how can we design information resources that illustrate the passage of time and the transitions between stages of a legal dispute, in a way that enables users to make informed decisions?
  • What happens when members of the public seek legal advice through AI interfaces? Will AI respond with legal advice? Many self-represented litigants struggle to understand the distinction between legal advice and procedural legal information17. Will AI interfaces sharpen or blur this crucial distinction?
  • Can we depend on AI to be a reliable source of information and guidance18? How will the system be trained, and how will the responses be verified? 

As Content Design London states in their article19, “An AI is only as good as the examples it is trained on, and as we all know, the world is full of misconceptions, half-truths, and lies. You cannot trust it to always get its facts right. You must always fact-check anything you intend to publish [especially if you’ve used AI to generate content]”.


If people with legal issues are turning to AI-powered tools to comprehend court processes due to the difficulty of finding and understanding information—leading to confusion and frustration—the primary goal should be to make that information easier to find and comprehend. AI shouldn’t be viewed as a seductive short-cut to circumvent this hard work.

Simplifying complex court information with AI/ML risks over-simplifying or omitting key points, which can cause errors, delays, higher costs, and poor justice outcomes.

So, is there a place for AI in the justice system? Yes. But not yet.

Practicing sound user-centred design, such as conducting design research and usability testing with representative users, can improve the sector’s ability to produce helpful and authoritative information. That information may one day form the basis of authoritative, verified, and accurate training data for an AI.

Until that day comes, there’s still much work to be done to improve our current legal information for everyone, and in doing so we can reap the benefits of a more accessible justice system along the way. 


Footnotes

  1. Introducing ChatGPT by OpenAI
  2. 19th International Conference on Artificial Intelligence and Law - ICAIL 2023
  3. Artificial Intelligence and The Rule of Law - The National Judicial College
  4. Salman Azar. “Artificial Intelligence, the “AI Justice Challenge” and the Future of Law.” The Canadian Bar Association British Columbia Branch
  5. About Darin Thompson
  6. Thompson, Darin. “Creating New Pathways to Justice Using Simple Artificial Intelligence and Online Dispute Resolution.” HeinOnline
  7. About Dentons Data
  8. Lucarini, Luca; Thompson, Kirsten; Walters, Noah. “Know the limitations and liabilities of ChatGPT.” Lexpert
  9. “Understanding the different types of artificial intelligence.” IBM
  10. IBM Deep Blue
  11. Improving access to justice
  12. McDonald, Susan; Savage, Laura. “Experiences of serious problems or disputes in the Canadian provinces, 2021.” Statistics Canada
  13. National Self-Represented Litigants Project (NSRLP)
  14. Samsudeen Alabi. “Expanding Online Access to Procedural Resources for Self-Represented Litigants.” National Self-Represented Litigants Project (NSRLP)
  15. “Literacy at a Glance.” ABC Life Literacy Canada
  16. “Plain language – essential for real access to justice.” Provincial Court of British Columbia
  17. “Legal Information vs. Legal Advice: What is the difference?” Center for Public Legal Education Alberta (CPLEA)
  18. Jason Proctor. “Air Canada found liable for chatbot's bad advice on plane tickets.” CBC
  19. Neil Fazakerley. “Yet another blog post about ChatGPT.” Content Design London

Continue the conversation

Contact Gordon Ross to learn more about how to conduct user-centred design research in the justice sector.