11 October 2023

ICO launches its first action against GenAI use

To The Point
(3 min read)

On 6 October the UK Information Commissioner's Office (ICO) announced that it had issued a preliminary enforcement notice against Snap, Inc and Snap Group Limited (Snap) over Snap's potential failure to properly assess the privacy risks posed by its generative AI chatbot "My AI". This is the ICO's first action in relation to GenAI. While there are currently no AI-specific laws in the UK or EU, data protection law governs the processing of personal data using AI, and the ICO has issued guidance on how the UK GDPR applies to such processing. All organisations using or proposing to use AI to process personal data need to consider the relevant law and guidance, including the requirement to conduct a data protection impact assessment (DPIA). Read on to find out how this could impact your business.

The ICO's investigation into Snap's "My AI" chatbot provisionally found that the risk assessment Snap conducted before launching the chatbot did not adequately assess the data protection risks posed by the generative AI technology, particularly to children. Because the chatbot uses innovative technology and processes the personal data of 13- to 17-year-old children, such a risk assessment is particularly important.

While we wait for the EU to finalise its draft AI Act and for the UK government to announce its response to the consultation on the AI white paper it published in March this year, existing law, including the GDPR and UK GDPR, already governs any use of AI that involves the processing of personal data. In addition, the ICO has issued guidance on AI and data protection, which explains how the law applies to AI. Personal data processing using AI must comply with the data protection principles, including lawfulness and transparency, as well as the rules on automated decision-making. The UK GDPR requires organisations to conduct a DPIA before carrying out high-risk processing, in particular processing using new technologies such as GenAI. Using children's data is another factor that makes processing "high risk", triggering the requirement for a DPIA.

At this stage, the ICO has issued Snap with a preliminary notice setting out the steps the Commissioner may require, and Snap has the right to make representations, which the ICO will consider before reaching a final decision. Nonetheless, the ICO's action indicates that the Commissioner intends to take a proactive approach to enforcing data protection law as it applies to the use of AI. The ICO's decision to publish a press release shows that it wants to be seen to be taking action, which can result in negative publicity for the organisations concerned. Accordingly, organisations using or proposing to use AI to process personal data, in particular data relating to children or other vulnerable people, should review the relevant law and guidance and consider how to implement it, seeking expert advice as necessary. Some key actions are:

  • Identifying a lawful basis for the processing
  • Making sure that you are transparent with individuals about what data you are processing for what purposes
  • Complying with the UK GDPR requirements on automated decision-making, including providing meaningful information about the logic involved, if you are using AI to make automated decisions about individuals
  • Identifying whether you need to conduct a DPIA
