UK Government targets minorities with social media ads despite Facebook ban
In recent years, government bodies and police forces have turned to social media as a communication channel, using hyper-targeted adverts to disseminate messages on a wide range of topics. The approach is not without controversy: many of the ads are targeted using data linked to protected characteristics such as race, religious belief, and sexual orientation, sparking debate about their ethical implications.
The practice of ‘microtargeting’, as it is known, has come to light through an analysis of more than 12,000 ads that ran on Facebook and Instagram between late 2020 and 2023. The data, originally provided to UK academics by Meta (formerly Facebook), offers a glimpse into the state’s use of targeted advertising built on profiling by the world’s largest social media company. Although Meta announced in 2021 that it would ban ad targeting based on race, religion, and sexual orientation, interest labels assigned by Facebook appear to be serving as substitutes.
Targeted adverts can be beneficial, for instance in improving diversity or promoting public health and safety: they have been used to encourage Covid vaccine uptake and crime reporting. However, critics argue that these ads rely heavily on “old-fashioned” assumptions about social groups and use “ridiculous stereotypes”. As a result, people who don’t fit those cultural stereotypes may be excluded and so miss out on important information.
Calls for regulation and transparency are growing louder in the face of these concerns. Advocates argue for an open register of digital campaigns run by public sector bodies, along with details of their targeting approaches. As we continue to navigate the digital age, it’s clear that the intersection of technology, advertising, and ethics warrants careful attention.