English Abstract
Big data investigation is the deep integration of big data technology with investigative activities, and it is an inevitable product of the informatization of human society. The wide application of big data technology and the continuous expansion of the functional scope of big data investigation have not only changed the form of investigative activities in terms of the composition of subjects, the logic of action, and the methods of coercion, but have also exposed non-negligible structural defects in the procedural control system of investigative activities. First, the intervention of third-party subjects has changed the pattern of investigative power, weakening the ability of traditional institutional power centers to control investigative activities. Second, the dominance of algorithms has transformed the logic of investigative action: the procedural nodes of filing a case and of initiating and concluding an investigation have become blurred, so that external control over investigative activities loses its grip. Third, the separation of fields means that the coercive methods of investigation seldom leave traces in physical space and often have no immediate impact on citizens' lives, making it very difficult to control big data investigation through ex post facto remedies. In the face of these challenges, the traditional control approach, dominated by the philosophies of writ reviewism and rights protectionism, is caught in an institutional quagmire at both the judicial and legislative levels, and the position of legal retentionism can hardly withstand scrutiny for logical self-consistency. The key to procedural control of big data investigation lies in functional retention: preventing the dominant elements of big data technology from being used at will, so as to avoid conflating judicial logic with administrative and commercial logic. To this end, behavioral regulationism should be taken as the core philosophy of the macro approach, and the technical rules of conduct in big data investigation should be clarified around the three key links of data collection, data utilization, and data verification. Specifically, on the basis of a conceptual integration of data collection behavior, rules controlling data collection should be established around rights protection and external source control, rules restricting data utilization should be established around hierarchical control and technical boundaries, and rules verifying data content should be established around guarantees of authenticity and reliability, so as to upgrade the procedural control of big data investigation simultaneously in terms of the control system, procedural subjects, procedural flow, and rights relief. In terms of procedural subjects, introducing third-party subjects such as network platforms into the control framework of investigative power is not meant to make them sole representatives of the interests of the state, of individuals, or of themselves, but to have them maintain technical neutrality in individual cases as far as possible, while assigning them certain litigation roles, such as witness, appraiser or expert assistant, and investigative assistant, so that they can bear the responsibilities of data controllers or providers.
In terms of procedural flow, measures such as extending control nodes, optimizing control means, and enhancing knowledge collaboration among control subjects are needed to control big data investigative activities dynamically. In terms of rights relief, in line with the requirements of strict accountability and technical compliance under the concept of preventive justice, the logic of sanctioning infringements in big data investigative activities should be adjusted through the state compensation system and the exclusionary rule for illegally obtained evidence.