With the development and application of algorithmic technology, algorithmic discrimination has attracted growing attention, and its legal regulation is urgently needed. "Data-feeding" algorithmic discrimination covers all discrimination by autonomous algorithms that is caused by the data fed to them; among the various types of algorithmic discrimination, it is the most feasible and effective to regulate. A review of the existing legal regulation of discrimination shows that foreign regulation of algorithmic discrimination centers on data and algorithms, while China has regulated discrimination in the Constitution, the Cybersecurity Law, the E-Commerce Law, and other statutes. However, these regulatory paths leave loopholes with respect to "data-feeding" algorithmic discrimination: the "over-protection" of the right against automated decision-making and the formalism of express consent in the EU, the limitations of anti-discrimination judicial review in the US, and the insufficiency of China's laws both ex ante and ex post. Specifically, the data used to train algorithms is not processed in sufficient depth, data feeding after an algorithm is put into use goes unregulated, and there is no systematic accountability regime for the discriminatory results of "data-feeding" algorithms. It is therefore necessary to reconstruct the legal regulation of "data-feeding" algorithmic discrimination in two stages, according to whether a discriminatory result has yet occurred. In the ex ante stage, systems of data cleaning, data transparency and traceability, and data impact assessment are proposed to address the insufficient processing of training data; a discrimination-data recognition function should be built into the algorithm, and the algorithm reviewed, to address discrimination caused by data feeding after deployment; and users' right to use the algorithm should be restricted to prevent them from feeding it discriminatory data. In the ex post stage, which regulates the results of "data-feeding" algorithmic discrimination, the complexity of algorithmic systems means that regulatory and technical correction cannot completely eliminate discrimination, so the harmful consequences of "data-feeding" algorithmic discrimination must be clearly identified and responsibility for them allocated. Algorithmic explanation of how the discrimination occurred is significant for allocating liability for the harm and increases the transparency of the algorithm, and constructing an algorithm accountability system provides a remedy for members of the public harmed by discriminatory results. Under the no-fault liability principle, the subject responsible for the results of "data-feeding" algorithmic discrimination is determined to be the data trainer or the algorithm user, who bears the corresponding liability.