Machine Learning Diagnostics Based On The Manhattan Distance That Can Incorporate A Variety Of Information

Posted on: 2021-04-02    Degree: Master    Type: Thesis
Country: China    Candidate: S H Zhu    Full Text: PDF
GTID: 2415330611490842    Subject: Applied Psychology
Abstract/Summary:
With the advent of the era of big data, researchers can obtain not only students' response information from tests but also information about students' past performance. If this past information is used effectively and combined with the response information from a cognitive diagnostic test, a more accurate diagnosis can be obtained. However, existing cognitive diagnosis methods cannot process students' response information and their past information simultaneously. This thesis therefore proposes a machine learning diagnostic method based on the Manhattan distance that can incorporate multiple sources of information.

The thesis comprises four studies. Study 1 theoretically analyzes the advantages of the Manhattan distance and the feasibility of introducing past information through Bayes' theorem, and constructs two diagnostic methods, M-PNN and MB-PNN. Study 2 compares the classification accuracy of M-PNN with that of MDD, EDD, KNN, and PNN under different conditions. Study 3 examines the accuracy of MB-PNN, M-PNN, and PNN under different conditions when past information is introduced. Study 4 compares the effectiveness of PNN, M-PNN, and MB-PNN on empirical data.

The main findings are as follows: (1) Theoretically, the Manhattan distance is better suited to cognitive diagnosis than the Euclidean distance, and Bayes' theorem can be used to introduce past information. (2) Under different conditions, the M-PNN method, which replaces the Euclidean distance in PNN with the Manhattan distance, achieves higher accuracy than the three nonparametric methods KNN, PNN, and EDD, and is comparable to MDD; replacing the Euclidean distance with the Manhattan distance therefore improves validity. (3) When past information is introduced, the MB-PNN method, which processes that information with Bayes' theorem, yields better diagnostic results than M-PNN and PNN. This shows that Bayes' theorem can be used to introduce past information and demonstrates the effectiveness of incorporating Bayes' theorem into MB-PNN. (4) In the empirical analysis, comparing the classification coverage, the distribution of attribute mastery patterns, and the attribute pass rates of the different methods shows that MB-PNN outperforms M-PNN and M-PNN outperforms PNN, confirming that M-PNN and MB-PNN are also effective on empirical data, and that modifying the distance measure and adding Bayes' theorem remain useful in empirical research.
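The abstract does not describe the algorithms in detail. As a rough illustration of the idea behind M-PNN and MB-PNN, the Python sketch below classifies examinees by the Manhattan distance between their observed response vector and the ideal response pattern of each attribute-mastery pattern, and optionally weights the match by a prior built from past information via Bayes' theorem. The conjunctive (DINA-like) ideal-response rule, the Laplacian kernel, and all function and variable names are assumptions for illustration, not the thesis's actual specification.

# Hypothetical sketch of an M-PNN / MB-PNN style classifier for cognitive diagnosis.
import numpy as np
from itertools import product

def ideal_responses(q_matrix):
    """Enumerate all attribute-mastery patterns and their ideal item responses
    under a conjunctive rule: an item is answered correctly only if every
    attribute it requires is mastered (an assumption, not stated in the abstract)."""
    n_items, n_attrs = q_matrix.shape
    patterns = np.array(list(product([0, 1], repeat=n_attrs)))   # all 2^K patterns
    ideal = (patterns @ q_matrix.T) == q_matrix.sum(axis=1)      # conjunctive rule
    return patterns, ideal.astype(int)

def mb_pnn_classify(responses, q_matrix, prior=None, bandwidth=1.0):
    """Assign each examinee to the attribute pattern with the highest posterior.

    Likelihood: a Laplacian kernel on the Manhattan distance between the observed
    responses and each ideal response pattern (the "M" in M-PNN).
    Prior: optional probabilities over attribute patterns built from past
    information; combining them via Bayes' theorem gives the MB-PNN variant,
    while a uniform prior reduces the method to M-PNN."""
    patterns, ideal = ideal_responses(q_matrix)
    if prior is None:
        prior = np.full(len(patterns), 1.0 / len(patterns))      # uniform prior -> M-PNN
    labels = []
    for x in responses:
        d = np.abs(ideal - x).sum(axis=1)                        # Manhattan distance
        likelihood = np.exp(-d / bandwidth)                      # Laplacian kernel
        posterior = likelihood * prior                           # Bayes' theorem (unnormalized)
        labels.append(patterns[np.argmax(posterior)])
    return np.array(labels)

# Toy usage: 4 items measuring 2 attributes, 2 examinees.
Q = np.array([[1, 0], [0, 1], [1, 1], [1, 0]])
X = np.array([[1, 0, 0, 1],    # consistent with mastering attribute 1 only
              [1, 1, 1, 1]])   # consistent with mastering both attributes
print(mb_pnn_classify(X, Q))

Supplying a non-uniform prior estimated from students' past records is where the "variety of information" enters: the Manhattan-distance kernel handles the test responses, and Bayes' theorem folds in the prior knowledge.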
Keywords/Search Tags: multiple information, Manhattan distance, Bayes' theorem, machine learning algorithm