English Abstract
With the increasingly urgent need for algorithm governance, the right to explanation of automated decision-making has been proposed as a ground for empowering users and affected individuals: it supports their autonomy, enables the exercise of their technological due process rights, and helps prevent the externalization and shifting of the costs and harms caused by algorithmic operations. The General Data Protection Regulation (GDPR) designed the right to explanation of automated decision-making in a limited and weakened form, yet it also created a combined and reinforced scheme for algorithmic explanation by recognizing various rights of the data subject together with the data protection impact assessment mechanism. Nevertheless, the GDPR scheme has several limitations: the relevant provisions are structurally incomplete, leaving ambiguity, and the scope of application is inappropriately narrow. When establishing a localized version of the right to explanation of algorithms, its position and function in algorithm governance should be properly clarified, and the major elements and contents of the right should be fully identified. In accordance with the degree of social embeddedness and the specific fields of application, an accurate and scenario-based scheme for measuring the required explanation should be introduced, followed by an integrated approach to governance built on collaboration between internal and external actors.