Artificial Intelligence

A new publication from the UK’s financial regulator signals to firms that they should take steps to manage risks in the use of AI.

By Stuart Davis, Fiona M. Maclean, Gabriel Lakeman, and Imaan Nazir

The UK’s Financial Conduct Authority (FCA) has published its latest board minutes highlighting its increasing focus on artificial intelligence (AI), in which it “raised the question of how one could ‘foresee harm’ (under the new Consumer Duty), and also give customers appropriate disclosure, in the context of the operation of AI”. This publication indicates that AI continues to be a key area of attention within the FCA. It also demonstrates that the FCA believes its existing powers and rules already impose substantive requirements on regulated firms considering deploying AI in their services.

The government has announced that it will develop a new code of practice to replace an earlier approach that faced opposition from the creative sectors.

By Deborah Kirk and Brett Shandler

Latham previously reported on the UK government’s proposal to introduce a new copyright and database exception that would allow text and data mining (TDM) for any purpose, provided that the party employing TDM obtains lawful access to the material (June 2022 TDM Proposal). The UK government has now announced that it is abandoning this proposal and instead intends to consult with AI firms and rightholders to produce a code of practice that helps AI firms access copyrighted works as inputs to their models, whilst ensuring protections for generated output to support rightholders. It has indicated that this code of practice, due by summer 2023, may be followed by legislation if it is not adopted or agreement is not reached.

The directives aim to assist claimants in proving the causation of damages and product defectiveness in complex AI systems, creating legal certainty for providers.

By Deborah J. Kirk, Thomas Vogel, Grace E. Erskine, Ben Leigh, Alex Park, and Amy Smyth

On 28 September 2022, the European Commission issued two proposed directives to reform and clarify liability rules on artificial intelligence (AI):

  1. The Directive on Adapting Non-Contractual Civil Liability Rules to Artificial Intelligence (AI Liability Directive) introduces rules on evidence and causation to facilitate civil claims for damages in respect of harm to end users caused by AI systems.
  2. The Directive on Liability for Defective Products (Revised Product Liability Directive) seeks to repeal and replace the 1985 Product Liability Directive (Directive 85/374/EEC) with an updated framework to better reflect the digital economy. The Revised Product Liability Directive proposes to explicitly include AI products within the scope of its strict liability regime and to modify the burden of proof for establishing defectiveness of technically or scientifically complex products like AI systems.

The UK government and regulators have taken several steps to implement a 10-year strategy published last year outlining the government’s pro-innovation national approach to AI.

By Deborah J. Kirk, Laura Holden, Nara Yoo, and Amy Smyth

The UK Department for Digital, Culture, Media and Sport (DCMS) published its 10-year National AI Strategy for the regulation and promotion of artificial intelligence (AI) in the UK (Report). DCMS seeks to build “the most pro-innovation regulatory environment in the world”.

In Lexology’s Getting the Deal Through: Digital Health 2021 (UK), Latham & Watkins considers the key regulatory and transactional issues faced by market players and practitioners.

By Frances Stocks Allen, Oliver Mobasser, Sara Patel, Mihail Krepchev, and Samantha Peacock

The UK has an active digital health market comprising both the private and public sectors. Venture capital funding in the digital health sector has increased significantly in recent years, with the majority of investment appearing to come from private investment firms, although public financing through IPOs is also on the rise. The COVID-19 pandemic has further heightened the positive and dynamic investment climate for digital health technologies in the UK. In particular, the pandemic has highlighted the need for resilience in healthcare systems, including through digital health solutions. As a result, the pandemic has significantly accelerated the uptake of digital health solutions in the UK and the related investment opportunities, and has challenged structural barriers that had previously slowed investment in digital health innovations.

Digital health in the UK is currently governed by a patchwork of different legal regimes, rather than bespoke legislation, while various regulatory and enforcement bodies have jurisdiction over the digital health sector.