Indonesia gives Meta ‘stern warning’ about harmful content on Facebook, Instagram, and WhatsApp: Not even 30% of flagged content…


Indonesia’s Ministry of Communications has issued a strong warning to Meta Platforms. According to a Reuters report, the warning concerns how Meta handles harmful content across its platforms, including Facebook, Instagram, and WhatsApp. There have been concerns about online gambling and misinformation on the platforms. The ministry stated that Meta had taken action on only 28.47% of items reported on these issues.

The warning followed an unscheduled visit to Meta’s operational office in Jakarta by Indonesia’s communications and digital affairs minister Meutya Hafid, the Reuters report noted. The ministry said Meta’s compliance with the country’s regulations covering disinformation, online gambling, defamation, and hate speech remained limited.

“Disinformation, defamation, and hate content threaten lives in Indonesia, yet Meta has allowed them to persist,” Meutya said. The Reuters report also noted that the ministry urged the company to strengthen its content moderation systems and accelerate the removal of illegal and harmful material. Last year, Indonesia urged Meta and other social media platforms to strengthen their content moderation.

Age restrictions and child-safety measures for social media services to be implemented in Indonesia soon

Government Regulation Number 17 of 2025, or PP TUNAS, is set to impose age restrictions and child-safety measures on social media services operating in Indonesia. The regulation strengthens the country’s Personal Data Protection Law. President Prabowo Subianto signed PP TUNAS on March 28, 2025, and it has been effective since April 1, 2025, with a one-year transition period for digital services.

Some of the requirements include: Electronic System Providers must implement age verification measures, restrict access to services for users under a certain age, and implement measures to protect children on their services. Providers must also filter out content that could be harmful to minors, establish easy reporting processes, and ensure clear, prompt reporting.

Minister of Communication and Digital Affairs Meutya Hafid said the government expects platforms to be ready before enforcement begins. “We feel that we have made it clear enough that this will start in March. So hopefully they (the platforms) will also support it, because we must understand and acknowledge that this regulation is to protect children in this country, in the digital realm, and it will be effective with the support and willingness of our friends at the platforms to also comply and follow the regulation,” she said.

The regulation also prohibits platforms from profiling children’s data for commercial purposes and requires them to prioritise child protection over commercial interests. Platforms considered high risk must either block users under 16 or introduce parental supervision systems; access for minors will no longer be fully independent. Authorities said strict sanctions could be imposed on platforms that fail to comply once full enforcement begins in March 2026.


