The instant messaging platform Telegram came under significant pressure in 2024 to strengthen the security and integrity of content on its service.
That pressure intensified after the arrest of its founder, Pavel Durov, in France, where he faces legal action related to the alleged dissemination of harmful content via the platform.
In response, Telegram announced in September that it had implemented substantial measures to address these issues.
By the end of the year, the company claimed to have removed more than 15.4 million groups and channels identified as spreading harmful content, including fraud, hate speech, and terrorist activity.
Telegram attributes this to advanced artificial intelligence (AI) tools that support its moderation processes.
As reported by Techcrunch.com on Tuesday, December 17, 2024, the announcement also coincided with the launch of a new moderation page on Telegram, intended to increase transparency and improve communication with the public about the company's efforts to combat harmful content.
In a post on Durov's official Telegram channel, the company noted a significant increase in moderation enforcement since his arrest in August 2024.
Meanwhile, the legal case against Durov in France is still ongoing: he is currently free on €5 million (approximately IDR 82.5 billion) bail, and his future remains uncertain while the proceedings continue.