Snapchat, the popular social media app, has recently come under the scrutiny of the UK data watchdog, the Information Commissioner’s Office (ICO). The concern centres on its AI chatbot feature, My AI, and an allegedly inadequate privacy risk assessment. This is particularly worrying given that a significant portion of Snapchat’s users are children and young adults. The chatbot launched as part of Snapchat+, the platform’s subscription service, and was made available to all users in April.
Snapchat’s parent company, Snap, was issued a preliminary enforcement notice by the ICO. This follows the provisional findings of an investigation, which suggested that Snap failed to “adequately identify and assess the risks” to its millions of users in the UK, including those aged 13 to 17. The ICO has given Snap until October 27 to respond before a final decision on action is made.
If a final enforcement notice is served, Snapchat could face severe consequences. The company would have to stop processing data related to My AI for its UK customers until an “adequate risk assessment” is carried out. It could also face a hefty fine: up to £17.5m, or 4% of Snap’s global turnover, which stands at about £3.8bn, whichever is higher.
In response to these concerns, a Snap spokesperson said the company was closely reviewing the ICO’s provisional decision. They highlighted that Snap’s product development approach includes a robust legal and privacy review process before any feature is made publicly available, and that the company is committed to working constructively with the ICO to ensure its risk assessment procedures meet the regulator’s requirements.