
Meta Agrees to Unmask Child Exploitation Accounts in South Africa Following Legal Pressure


In a landmark legal development, Meta Platforms will hand over identifying data of offenders behind child exploitation content shared on WhatsApp and Instagram.

Meta Platforms, the parent company of WhatsApp, Instagram, and Facebook, has formally agreed to disclose subscriber information linked to accounts responsible for sharing explicit content involving South African school children. The move follows an urgent court application brought by digital law specialists demanding accountability from the global tech firm.

The case, spearheaded by the Digital Law Company, culminated in a court-approved settlement that compels Meta to provide names, email addresses, and phone numbers associated with the perpetrators. The order also mandates the establishment of a dedicated child protection hotline, in partnership with the Digital Law Company, to fast-track future abuse reports for the next two years.

“This is a landmark moment,” the Digital Law Company said in a statement. “We believe this is the first time in South African legal history that a global tech giant has agreed, in writing and in court, to these kinds of terms.”

The urgent application was filed after Emma Sadleir, founder of the Digital Law Company and a well-known expert in social media law, was alerted on 11 July to accounts circulating “horrific” material. According to Sadleir, the content not only involved children but also encouraged further submissions from minors.

“This wasn’t about extortion, which is often the case in these incidents,” Sadleir said. “This was about humiliation and shame.”

After discovering that one of the WhatsApp channels involved had threatened to release a significant cache of illegal material, the firm moved swiftly, filing a court application on 14 July. Judge Mudunwazi Makamu responded with an urgent order instructing Meta to shut down the implicated WhatsApp, Instagram, and Facebook accounts and to provide identifying user information.

While Meta initially complied by removing some of the flagged accounts, the Digital Law Company claimed that new channels emerged almost immediately. The firm accused Meta of not taking decisive steps to prevent the same perpetrators from regaining access and of withholding crucial data needed to press charges.

During their investigation, the Digital Law Company uncovered over 1,000 posts, including images and videos, circulated across 30 different Meta accounts within a matter of days.

Despite Meta’s public statement that it has “zero tolerance for child sexual exploitation” and that it had reported the accounts to the U.S.-based National Center for Missing and Exploited Children (NCMEC), friction emerged between the tech company and South African legal authorities.

In a move that raised eyebrows, Facebook South Africa’s legal team sent a letter claiming that the wrong entity had been cited in the application and that future legal correspondence should be directed to Meta officials in the United States. The Digital Law Company responded, calling the letter “puzzling,” and reaffirmed that their application correctly named the responsible parties.

When Meta missed the court-mandated deadline to release the requested data by noon on 16 July, the firm filed a contempt of court application, seeking a writ that would see Meta’s Southern Africa Head of Policy imprisoned for 30 days or until compliance.

Following the settlement order issued by Judge Makamu, the Digital Law Company emphasized the broader implications of the case, saying it “reinforces that tech companies operating in South Africa must comply with local laws, court orders, and standards of dignity and child protection.”

The newly established hotline and Meta’s written legal commitment could represent a turning point in how global platforms handle abuse cases in South Africa.

“Tech companies cannot operate above the law,” Sadleir said. “This case sets a powerful precedent, and we hope it signals a shift in how digital platforms engage with urgent threats to child safety.”
