Meta is making users who opted out of AI training opt out again, watchdog says

If you already opted out of Meta’s AI training back in 2024, chances are you have been notified that you need to opt out again.

Noyb, a Vienna-based privacy watchdog organization, has sent a cease-and-desist letter to Meta, arguing that this practice violates the GDPR. If the issue is not resolved in time, it may lead to yet another class action lawsuit.

Privacy watchdog Noyb sent a cease-and-desist letter to Meta Wednesday, threatening to pursue a potentially billion-dollar class action to block Meta’s AI training, which starts soon in the European Union.

In the letter, Noyb noted that Meta only recently notified EU users on its platforms that they had until May 27 to opt their public posts out of Meta’s AI training data sets. According to Noyb, Meta is also requiring users who already opted out of AI training in 2024 to opt out again or forever lose their opportunity to keep their data out of Meta’s models, as training data likely cannot be easily deleted. That’s a seeming violation of the General Data Protection Regulation (GDPR), Noyb alleged.

“Meta informed data subjects that, despite the fact that an objection to AI training under Article 21(2) GDPR was accepted in 2024, their personal data will be processed unless they object again—against its former promises, which further undermines any legitimate trust in Meta’s organizational ability to properly execute the necessary steps when data subjects exercise their rights,” Noyb’s letter said.

Curious to hear whether any European Union residents here have received this notice again. I have no idea why this needs to be repeated in the first place.


Couldn’t you guess already? Scummy behavior.

My best guess is that far too many people opted out before, so they are reneging on their “promise” and forcing everyone to do it again, betting that fewer people will know about it or bother this time around.
