Generative AI and access to justice

It’s been almost a year since ChatGPT was released to the public back in November 2022. Although much has been written about the impact of generative AI on the legal sector as a whole, there has been less focus on its potential to improve access to justice for the ordinary citizen who cannot afford a lawyer. Most commentators will recommend obtaining legal advice and never relying on the answers generated by chatbots – but what’s the reality?

The rise of DIY law

The idea that individuals should always consult lawyers when faced with a legal issue is somewhat tied up with the concept of specialisation, as heralded by the 18th-century economist Adam Smith. The traditional indecipherability of legal concepts, couched in Latin terms and jargon, made law feel inaccessible to the “man on the Clapham omnibus”. But in theory all citizens should be able to understand and uphold their legal rights without having to pay a legal professional; in other words, DIY law is arguably an integral element of access to justice.

The internet has significantly improved access to legal information for lay people over the past couple of decades. Legislation and case law are now available with a few clicks on any mobile device, whereas previously it would have been necessary to visit a law library. There are also countless blogs on virtually every aspect of law, explaining its application to the real world, and search engine algorithms can direct users to the most relevant results from just a few search terms.

So despite the standard protestations of doctors and lawyers against relying on Google, most people automatically turn to Silicon Valley algorithms as their first port of call for virtually any problem they face in life. Although this can lead to people being misled by out-of-date or inaccurate information, it has undoubtedly helped many individuals to obtain legal redress.

Generative AI turbo-charges legal research. Instead of running multiple searches using different terms and trawling through pages of results, a user can pose a question in natural language and receive a single answer condensed into a few short paragraphs. The dawn of ChatGPT arguably marks the zenith of DIY law, enabling anyone to obtain answers to their legal queries. But can it be relied on?

Can generative AI provide reliable answers in response to legal questions?

In order to gauge the reliability of generative AI, let’s consider the two leading players: OpenAI’s ChatGPT and Google’s Bard. Both deploy the most advanced large language models (LLMs) currently available, trained on internet content. A notable difference for legal research is that ChatGPT’s training data has a cut-off in 2021, limiting its knowledge to the internet as it existed then, although this limitation is being phased out.

To find out how they fare as DIY law tools, it’s worth assessing their responses to a fairly straightforward employment law query:

“What is the current national minimum wage for a 19 year old in the UK?”

ChatGPT: “As of my last knowledge update in September 2021, the national minimum wage rates in the UK were as follows: For workers aged 18 to 20: £6.56 per hour”

Bard: “The current national minimum wage for a 19 year old in the UK is £7.49 per hour.”

Since national minimum wage (NMW) rates are updated annually, ChatGPT is useless for this purpose, whereas Bard gives the correct response. And Bard goes further, taking into account the position of apprentices who are 19, stating: “There is a separate minimum wage rate for apprentices, which is currently £5.28 per hour. However, apprentices aged 19 or over who have completed the first year of their apprenticeship are entitled to the national minimum wage for their age.”

Obviously most people could easily find the latest NMW rates with a simple Google search, so let’s look at a trickier question which might be more likely to land on the desk of an employment lawyer:

“My boss sacked me because I was late for work but I’m suffering from depression – is that legal?”

ChatGPT: “In the UK, employees have certain rights, including protection against unfair dismissal. However, it’s essential to consider several factors to determine whether your dismissal was legal or not.

1. Length of Employment: In general, if you’ve been employed for less than two years with the same employer, you have limited protection against unfair dismissal. This means your employer can generally terminate your employment for various reasons, as long as it’s not discriminatory.

2. Discrimination: The Equality Act 2010 in the UK prohibits discrimination based on certain protected characteristics, including disability. If your depression qualifies as a disability under the law, your employer should make reasonable adjustments to accommodate your condition. If they failed to do so and terminated your employment solely because of your depression, it could potentially be considered discriminatory.

3. Reasonable Adjustments: Employers have a legal obligation to make reasonable adjustments for employees with disabilities, which may include flexible working arrangements or time off for medical treatment. If your employer didn’t explore these options or refused to make reasonable adjustments, it could be seen as unfair treatment.

4. Company Policies: Review your employment contract and your company’s policies on attendance, punctuality, and disciplinary procedures. Your employer should follow their own policies and procedures when terminating your employment.”

Bard: “Whether or not your boss can sack you for being late to work due to depression depends on a number of factors, including the severity of your depression, how often you have been late, and whether your employer has made reasonable adjustments to support you.

Under the Equality Act 2010, it is illegal for an employer to discriminate against an employee with a disability. Depression is a mental health condition and can be considered a disability under the Equality Act. This means that your employer has a duty to make reasonable adjustments to support you in your role, including if you are struggling to get to work on time due to your depression.

If you have spoken to your employer about your depression and they have not made any reasonable adjustments to support you, then they may be discriminating against you. If your employer has sacked you because of your depression, you may have a claim for unfair dismissal.”

Both ChatGPT and Bard go on to suggest seeking professional legal advice, but their responses certainly provide some decent information about the relevant points of law in such a scenario – namely that depression may qualify as a disability under the Equality Act and that employers have a duty to make reasonable adjustments (eg potentially with a view to permitting instances of lateness).

These are both somewhat cumbersome examples of the legal questions an individual might want to ask, but they demonstrate that generative AI can, at the very least, provide some pointers to the relevant law. This can give a lay person a basic understanding of their legal rights and a valuable reference point from which to defend their interests, essentially improving access to justice.

Pros and cons of using generative AI as a legal assistant

The obvious advantage of generative AI as a source of legal information is that its basic features are free, at least at present. Although the more advanced version of ChatGPT costs $20 per month, even an annual subscription will be less punishing on the wallet than an hour of a high street lawyer’s time. And it’s available 24/7 from any smartphone, providing instant answers to any question.

Furthermore, both AI chatbots are multilingual, able to understand and respond to queries in a wide range of languages and to analyse legal texts from around the world, making them particularly useful for cross-jurisdictional legal research.

One of the main drawbacks of generative AI chatbots stems from the very reason they are so powerful: they are trained on the internet. This means they are reliant on the veracity of the sources used to research and formulate their responses. Additionally, they can sometimes make up answers entirely – a phenomenon known as “AI hallucination” (one which already has legal precedent).

But perhaps the crucial downside of using these sophisticated chatbots is that they still struggle to understand queries which are not expressed clearly. This can prove particularly problematic for the very demographic in most dire need of access to justice – namely the portion of the population which lacks education and struggles with literacy. The inability to pose well-worded questions, compounded by the heightened emotions of a period of legal uncertainty, could render generative AI far less useful.

Taming legal AI: reassuringly expensive

One solution to the problem of standard generative AI models providing responses which are inaccurate or out of date is to create custom versions trained on closed sets of trusted data. This process, known as fine-tuning and currently available via the ChatGPT API, has an obvious application in large legal databases such as LexisNexis and Thomson Reuters.
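
To illustrate, here is a minimal sketch of how a fine-tuning job might be submitted through OpenAI’s Python SDK. The file name and training data are hypothetical, and a real legal deployment would require far more curation, review and evaluation than this suggests.

# A minimal sketch (not a production recipe) of fine-tuning via the OpenAI Python SDK.
# Assumes a hypothetical JSONL file of question/answer pairs drawn from a
# trusted, up-to-date legal source.
from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

# Upload the curated training data (one chat-formatted example per line)
training_file = client.files.create(
    file=open("trusted_legal_qa.jsonl", "rb"),
    purpose="fine-tune",
)

# Start a fine-tuning job on top of a base chat model
job = client.fine_tuning.jobs.create(
    training_file=training_file.id,
    model="gpt-3.5-turbo",
)

print(job.id, job.status)  # the job can then be polled until it completes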

LexisNexis has actually released its own generative AI product, known as Lexis+ AI, which is currently being rolled out in the US. It harnesses the power of generative AI but with the reassurance that its responses are all based on accurate and up-to-date legal information maintained by one of the world’s most trusted legal publishers. However, since it’s part of the LexisNexis suite of products, there is obviously a significant cost attached. This may be commercially viable for in-house counsel who want to reduce the cost of outsourcing work, and private practice lawyers trying to enhance efficiency. But individuals who cannot afford to consult a lawyer will likewise be unable to pay the subscription fees for access to Lexis+ AI. Therefore, although it’s more trustworthy as a legal assistant, it does not help with access to justice.

Further reading

AI: opening the door to justice – Law Society of Scotland

ChatGPT – 50 questions to road test its legal advice – Linklaters

Alex Heshmaty is technology editor for the Newsletter. He runs Legal Words, a legal copywriting agency based in the Silicon Gorge. Email alex@legalwords.co.uk.
