AI and the law
Discover how AI has been used in the legal profession, and its future, in this Chambers Legal Topic article.
AI in the legal profession
2023 will be remembered as the year we realised AI is now part of the essential fabric of our world, and not just a novel suite of apps or the latest life hack. All sectors have been forced to reckon with what AI means for business models and employment prospects, and the legal profession is no different. And whilst some lawyers may fear for their jobs in the medium-to-long term, law firms are not wasting any time in trying to harness new potentialities.
Earlier this year, Allen & Overy launched Harvey, a platform based on GPT-4 which, according to A&O, “uses natural language processing, machine learning and data analytics to automate and enhance various aspects of legal work, such as contract analysis, due diligence, litigation and regulatory compliance”. Harvey’s output, the firm hastens to add, is given “careful review by an A&O lawyer”.
Meanwhile, Simmons & Simmons has sent ripples through the matrix with its Rocketeer product, billed as “the world’s first AI trademark lawyer”. Rocketeer, which is said to “predict the outcome of EU trademark conflicts to more than 92% accuracy”, offers the promise of data-driven insights in a field traditionally characterised by subjectivity and qualitative rather than quantitative analysis. Since its launch in 2021, Rocketeer has been licensed to many law firms as well as corporate clients.
AI and intellectual property
But if the rise of the robots feels inexorable, they aren’t having it all their own way. The American computer science researcher Dr Stephen Thaler has been on a years-long quest through courts in leading IP jurisdictions to secure legal recognition of his AI system, DABUS, as an inventor. So far, the European Union, the US Federal Circuit and courts in Germany and Australia have ruled that DABUS cannot lay legal claim to original works, be it copyright for artworks or patents for a ‘neural frame’ and a ‘fractal container’ it devised. These decisions have all hinged on the reasoning that, since DABUS is not human, it cannot be considered an autonomous individual, and therefore cannot be a named inventor under interpretations of existing statute.
The case is currently awaiting judgment by the UK Supreme Court, the UK Intellectual Property Office having ruled that an inventor owning patent rights must be a ‘natural person’, a decision upheld by the Patents Court and the Court of Appeal.
Questions of originality and authorship posed by the emergence of sophisticated AI models inevitably test the very foundations of intellectual property law, not least since the materials from which generative AI fashions its creations are usually the fruits of human research and imagination.
2023 has seen several class actions brought by American authors before US federal courts, all claiming that OpenAI’s ChatGPT infringes their copyright by training its models on entire bodies of work without permission or compensation. The UK Society of Authors has welcomed these claims, and since the UK’s ‘fair dealing’ copyright exceptions are more narrowly defined than the analogous US ‘fair use’ doctrine, it is likely we will soon see litigation in a similar vein on our shores.
Chambers coverage of AI-focused work
Most of the growing number of AI-focused work highlights seen in the Chambers UK Guide this year have concerned the commercialisation of AI products used in life sciences and drug discovery, hi-tech research and IT, and financial services operations. It is exciting to see the germination of technologies that will save countless lives or promise to realise the boldest aspirations of modernity.
But we also expect to see more instructions grappling with AI’s troubling implications for individual rights, from data protection to questions of systemic human biases and their reflection in artificial frameworks that can make decisions of grave consequence in a millisecond.
The rate of innovation so far outpaces legislators’ capacity to make sense of these implications, so we should expect much fuzzy logic and error-driven learning going forward. It will be lawyers as much as scientists on whom we depend for the imposition of intelligent control on artificial intelligence.