Welcome to the October edition of Technol-AG, Addleshaw Goddard's monthly technology update.


Following on from the White Paper on Artificial Intelligence, the European Commission has published a Proposal for an Artificial Intelligence Liability Directive ('AI Liability Directive'). The AI Liability Directive aims to improve the functioning of the internal market by providing uniform rules on non-contractual civil liability for damage caused with the involvement of AI systems.

The proposal covers two main measures:

1. New disclosure obligations regarding evidence and information about high-risk AI systems; and

2. A rebuttable presumption of causality regarding the link between non-compliance with a duty of care and damage caused by AI systems.

The rebuttable presumption of causality will only apply if three conditions are satisfied: (1) the defendant's fault, consisting in non-compliance with a duty of care, has been demonstrated; (2) it can reasonably be considered likely that the fault has influenced the output, or failure to produce an output, of the AI system; and (3) the claimant has demonstrated that the output produced, or the failure to produce an output, gave rise to the damage.

If the proposal is adopted, it will have a significant impact on companies developing and using AI systems, as it will be easier for claimants to bring claims for failures and non-compliance. A potential benefit for such companies is increased certainty regarding their liability exposure.


The recent High Court decision in D'Aloia v Persons Unknown & Others [2022] EWHC 1723 (Ch) may signify the start of more mainstream acceptance of non-fungible tokens (NFTs) as a viable tool of commerce and law.

The victim of an alleged cryptocurrency fraud (who was induced to deposit cryptocurrency into two digital wallets, which he wrongly believed belonged to a legitimate US-regulated business) was given permission by the court to serve legal proceedings on the defendants by transferring NFTs into the defendants' digital wallet addresses. NFTs are unique identifiers for digital assets, which are recorded on a digital ledger (the "Blockchain"). NFTs cannot be copied, and have unique metadata allowing them to be distinguished from each other (and verified via the Blockchain). Service by NFT would consist of transferring the court documents into the wallets of the defendants, who sat behind the offending website.

The court noted that service by NFT was 'likely to lead to greater prospect of [the defendant's] being put on notice of the making of this order, and the commencement of these proceedings.' The court did not consider, however, that it would be appropriate to allow service by NFT without also requiring service by email.

The decision demonstrates the English courts' readiness to embrace modern technologies, particularly in assisting victims of alleged cryptocurrency fraud. This may act as a wake-up call in the digital asset space, and prompt crypto exchanges to invest more heavily in compliance and security infrastructure to seek to prevent such fraud occurring. It may also act as a deterrent for would-be perpetrators of crypto crime, who (arguably) can no longer hide behind such exchanges to evade being served with legal proceedings. It could also subsequently invite the use of blockchain technology elsewhere in dispute resolution, such as disclosure, e-signatures and document exchange.


A robot has addressed a House of Lords committee about the potential impact of AI on the creative industries.

The committee meeting was not without technical issues: the robot needed rebooting at one point, and some of its algorithms performed less effectively than others. However, with this use of technology in its infancy, who knows where this 'first' may lead.

There has been criticism of the use of AI to make art, because the machines are trained on datasets that collate content from the internet, without necessarily having the permission of the artists. It has been acknowledged that this could lead to intellectual property ownership issues in the future.

Whether AI (such as the robot) can make a valid contribution to a debate has also been challenged, because humans train the technology that AI uses to produce speech and text, meaning that what an AI robot says is directly aligned to the information a human feeds it. It could therefore be suggested that an AI robot presenting to a House of Lords committee is merely a novel mouthpiece for various human opinions, rather than an independent voice.


Have you been experiencing challenges in negotiating cloud contracts? AG's Tech Group is delighted to introduce our AG Cloud Market Trends Report. This interactive report sets out the key negotiation trends we have identified on the most hotly negotiated cloud contract topics, based on our experience negotiating and managing a significant number of cloud contracts. Request your copy of the report here.

Key contacts

Susan Garrett

Partner, Co-Head of Tech Group
Manchester, UK
