English Abstract
Content moderation refers to detecting and identifying user content that is illegal or does not comply with a platform's terms and conditions, and taking specific measures against that content, such as demotion, access restriction, removal, or the termination or suspension of user accounts. With respect to content moderation mechanisms, the EU Digital Services Act mainly imposes corresponding obligations on platforms: all restrictions must be clearly stipulated and explained, content moderation must be carried out fairly, and the principles of diligence, objectivity, proportionality, and protection of fundamental rights must be observed. In addition, platforms are subject to transparency reporting obligations and must provide users with internal and external remedies. The EU's regulatory framework emphasizes information transparency and establishes appropriate procedures to protect users' rights. However, the relevant legislative model should take care that, in emphasizing information disclosure to enhance transparency, it does not sidestep the substantive fairness and legality of content moderation itself. It is also necessary to develop guidelines for content moderation that protect users' rights.