Algorithmic decision-making in automated administration has transformed the traditional administrative model and brought challenges and potential risks to the protection of civil rights and the administrative rule of law. As a key change in automated administrative operations, algorithms form algorithmic power through their unique technical advantages, a power that has the nature of "quasi-public power". Although algorithmic decision-making in automated administration can improve administrative efficiency, promote public welfare, and enhance governance effectiveness, it also creates new problems such as the algorithmic black box and algorithmic hegemony. It is therefore urgent to regulate algorithmic decision-making by law. To ensure the sound operation of algorithmic decision-making in automated administration, its procedural justice must be guaranteed. Algorithmic decision-making in automated administration should follow the basic principles of administrative procedure, such as administrative openness, administrative justice, and public participation. In practice, however, it often undermines due process and leads to violations of civil rights because of its inexplicability, its technical limitations, and the dependence of administrative subjects on algorithmic decisions. To achieve algorithmic justice in digital social governance, the procedural regulation of algorithmic decision-making in automated administration can draw on foreign experience. On this basis, the following prospects for the regulation of algorithmic decision-making in China can be set out. At the conceptual level, the ideas of power restriction, procedural justice, and cooperative governance should be adhered to. At the institutional level, the scope of application of algorithmic decision-making can be strictly limited, the reason-giving system and the hearing system perfected, and professional evaluation agencies introduced to achieve visible algorithmic justice.