Meta to seek disclosure on political ads that use AI ahead of Canada elections

Meta scrapped its US fact-checking programmes earlier in 2025. PHOTO: REUTERS

Meta Platforms will ask advertisers to disclose the use of artificial intelligence (AI) or other digital techniques to create or alter a political or social issue ad, the Facebook owner said on March 20, aiming to curb misinformation ahead of the Canadian federal elections.

The disclosure mandate will apply if an ad contains a photorealistic image, video or realistic-sounding audio that has been digitally created or altered to depict a real person as saying or doing something they did not actually say or do.

It also extends to ads that show a person who does not exist or a realistic-looking event that did not happen, that alter footage of a real event, or that depict an event that allegedly occurred but is not shown in a true image, video or audio recording.

In November 2024, Meta said it would extend its ban on new political ads after the US election, in response to rampant misinformation during the previous presidential election.

In 2023, Meta also barred political campaigns and advertisers in other regulated industries from using its new generative AI advertising products.

However, Meta scrapped its US fact-checking programmes earlier in 2025, along with curbs on discussions around contentious topics such as immigration and gender identity, yielding to pressure from conservatives in the biggest overhaul of its approach to managing political content.

The Instagram owner also said in December 2024 that generative AI had limited impact across its apps that year, with influence operations failing to build significant audiences on Facebook and Instagram or to use AI effectively.

Meta has also added a feature that lets people disclose when they share AI-generated images, video or audio, so the company can label the content. REUTERS
