Social media platforms such as Facebook, TikTok, and Twitter will be legally required to implement community standards and content moderation processes to minimise the risk of users being exposed to harmful online content, under Singapore’s proposed new set of Internet rules.
They will also need to ensure that users under the age of 18 have additional safeguards, such as tools to minimise their exposure to inappropriate content and unwanted interactions.
Minister for Communications and Information Josephine Teo announced details of the proposed new rules in a Facebook post on Monday (June 20).
“As more people take to social media, there is a growing global movement to recognise that harm comes along with the good, and to make the online space safer,” she said.
“Many countries have enacted or are in the process of enacting laws to protect users from online harm.”
Mrs Teo said Singapore prefers to strengthen its online regulations in a consultative and collaborative manner.
“This means learning from the experience of other countries, engaging technology companies on the latest technology developments and innovations, and understanding the needs of our people.
“These will enable us to develop requirements that are technically feasible, effective, and suited to our purposes.”
The Ministry of Communications and Information (MCI) said on Monday that it has been in discussions with the technology industry since the beginning of this month, and that public consultations will begin next month.
The new codes of practice for online safety and content on social media services aim to codify these standards and empower the authorities to take action against platforms that do not meet the requirements.
After the consultations, the codes will be incorporated into the Broadcasting Act.
The Infocomm Media Development Authority (IMDA) will be empowered to direct social media services to disable access to harmful online content for users in Singapore.
The platforms will also need to produce annual accountability reports, to be published on the IMDA website.
These reports should include metrics showing the effectiveness of their systems and processes.
Asked about other consequences that errant platforms may face, the ministry said it was too early to give specifics, as the details are still being developed in collaboration with the tech industry.
The codes were first mentioned during the Budget debate in March.
Mrs Teo told Parliament that the codes will focus on three areas: child safety, user reporting, and platform accountability.
She also said that MCI is working with the Ministry of Home Affairs to strengthen protection for Singaporeans against illegal online activities.
This includes strengthening Singapore’s laws to deal with illegal online content such as terrorist material, child sexual exploitation material, scams, and content that incites violence.