SRA considers risks from Artificial Intelligence


A new report published by the Solicitors Regulation Authority (SRA) looks at the opportunities – and risks – that artificial intelligence presents to law firms. The report, the latest in the SRA’s Risk Outlook series, examines how artificial intelligence is affecting the legal services sector, outlines current and potential future developments, and highlights issues that firms may need to consider in each area to help them assess if and how they might be affected.

The report acknowledges that artificial intelligence (AI) is not a new topic of risk. Its use and potential have been considered before, including in the 2018 Outlook on technology and legal services. However, the rapid pace of recent developments in AI – including systems such as ChatGPT and DALL-E – has made these systems more available to firms and made the use of AI for everyday tasks more mainstream.

The report points out that whilst dedicated AI for legal work was once only accessible to the largest firms, this is no longer the case: an increasing range of commercial products is making it easier for smaller firms to benefit.

The report goes on to state that the use of AI is rising rapidly and that by the end of 2022:

  • three-quarters of the largest solicitors’ firms were using AI, nearly twice the number from just three years ago
  • over 60% of large law firms were at least exploring the potential of the new generative systems, as were a third of small firms
  • 72% of financial services firms were using AI.

Opportunities that firms could consider include using AI to complete administrative tasks more efficiently, freeing up staff capacity for more complex work. Automation can also reduce costs – it could be used, for instance, to capture client information before a first consultation. Firms may already have access to software that can help them harness AI and develop new ways of working.

However, as with any new technology, there will be risks. These may include:

  • Accuracy and bias problems – these can cause AI to produce incorrect and possibly harmful results, whether through hallucinations or through amplification of existing bias in the data. These risks are compounded by the fact that people often place more trust in computers than in humans.
  • Client confidentiality – firms must maintain client confidentiality when using AI, not only protecting against exposure to third parties but also making sure sensitive information is kept secure both within the firm and when dealing with the system provider.
  • Accountability – solicitors need to remember that they are still accountable to clients for the services provided, whether or not external AI is used.

Paul Philip, SRA Chief Executive, said:

‘It is difficult to predict how quickly AI will change the legal sector, but increasingly we won’t be able to ignore its impacts.

‘So far it has mainly been larger firms using AI. However, with such technology becoming increasingly accessible, all firms can take advantage of its potential. There are opportunities to work more efficiently and effectively. This could ultimately help the public access legal services in different and more affordable ways.

‘Yet there are risks. Firms need to make sure they understand and mitigate against them – just as a solicitor should always appropriately supervise a more junior employee, they should be overseeing the use of AI. They must make sure AI is helping them deliver legal services to the high standards their clients expect.’

The use of AI in law firms was also covered in the SRA Risk Outlook paper of June 2022 on innovation in a competitive landscape and in the 2021 research report on innovation in legal services.

The Law Society has also published a report on generative AI, which can be found at www.lawsociety.org.uk/topics/ai-and-lawtech/generative-ai-the-essentials.
