
Social media platforms to remove harmful content, add safeguards for the young under S'pore's Internet rules

Social media platforms like Facebook, TikTok and Twitter will soon be legally required to implement community standards and content moderation processes to minimise users' risk of exposure to harmful online content, under Singapore's new set of Internet rules.

They will also need to provide additional safeguards for users under 18, including tools to help these users or their parents minimise their exposure to inappropriate content and unwanted interactions.

Minister for Communications and Information Josephine Teo announced some details of the proposed new rules in a Facebook post on Monday (June 20).

"There is a growing global movement pushing to enhance online safety, recognising harms come along with the good when people engage on social media," she said.

"Many countries have enacted or are in the process of enacting laws to protect users against online harms."

Mrs Teo said Singapore's preferred approach to strengthening online regulation is to do so in a consultative and collaborative manner.

"This means learning from other countries' experiences, engaging tech companies on the latest tech developments and innovations, and understanding our people's needs.

"These will allow us to develop requirements that are technologically feasible, can be effectively enforced and that are fit for our purpose."

The Ministry of Communications and Information (MCI) said on Monday that it has been consulting the tech industry since earlier this month, and that public consultations will begin next month.

The new Code of Practice for Online Safety and the Content Code for Social Media Services are aimed at codifying these standards in law and giving the authorities powers to take action against platforms that fail to meet the requirements.

The codes are expected to be added to the Broadcasting Act following the consultations.

The Infocomm Media Development Authority (IMDA) will be empowered to direct social media services to disable access to harmful online content for Singapore users.

Platforms will also be required to produce annual accountability reports, which will be published on the IMDA website.

These reports will need to include metrics to show the effectiveness of their systems and processes.

Asked what other consequences errant platforms could face, the ministry said it is too early to give details as the specifics are still being developed in collaboration with the tech industry.

The codes were first mentioned during the debate on the ministry's budget in March.

Mrs Teo told Parliament the codes will focus on three areas: child safety, user reporting and platform accountability.

She also said MCI is working with the Ministry of Home Affairs to provide Singaporeans with more protection from illegal activities carried out online.

This includes strengthening Singapore's laws to deal with illegal online content such as terrorist materials, child pornography, scams and content that incites violence.
