In the second half of the twentieth century, several computational models of intelligence-test problems were proposed. By the early twenty-first century, artificial intelligence IQ tests had built a new bridge between psychometrics, cognitive science, and artificial intelligence, attracting growing research attention and yielding notable results. The number-sequence prediction problem in IQ tests, however, remains difficult because of the diversity of sequence patterns and the range of mathematical knowledge they may involve; existing results are unsatisfactory, so new number-sequence prediction algorithms need to be explored in an effort to make machine performance comparable to that measured by the Turing test. To address these problems, this paper proposes a deep learning method for the number-sequence problem in IQ testing. We first collected a large dataset from multiple sources, comprising more than 1,600 IQ number sequences drawn from more than ten books and more than ten websites and covering linear, power, Fibonacci, and several other types. We then propose a new location-based forgetting re-encoding (LFR) method to address the loss of representational power for sequences that exceed the input layer size. Finally, we train a deep neural network to predict the next term of a number sequence, comparing a plain DNN, an RNN, and an extended LSTM network. Experimental results show that our method outperforms related methods in the literature; in particular, its average performance exceeds the average IQ-test scores of several groups of graduate students.
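The sequence families named above can be illustrated with a small sketch. This is not the paper's actual dataset-construction code; the generator functions and the train-example format below are illustrative assumptions showing how sequences of each family could be paired with their next term as a prediction target.

```python
# Hypothetical generators for three of the sequence families the abstract
# names (linear, power, Fibonacci). Names and parameters are illustrative.

def linear_seq(a, d, n):
    """Arithmetic progression: a, a+d, a+2d, ... (n terms)."""
    return [a + d * i for i in range(n)]

def power_seq(base, n):
    """Powers of a fixed base: base^1, base^2, ..., base^n."""
    return [base ** i for i in range(1, n + 1)]

def fibonacci_seq(n):
    """Fibonacci-style recurrence: each term is the sum of the previous two."""
    seq = [1, 1]
    while len(seq) < n:
        seq.append(seq[-1] + seq[-2])
    return seq[:n]

def to_example(seq):
    """One training example: the leading terms as input, the last as label."""
    return seq[:-1], seq[-1]

if __name__ == "__main__":
    print(to_example(linear_seq(3, 4, 5)))   # ([3, 7, 11, 15], 19)
    print(to_example(power_seq(2, 5)))       # ([2, 4, 8, 16], 32)
    print(to_example(fibonacci_seq(6)))      # ([1, 1, 2, 3, 5], 8)
```

A next-term predictor (DNN, RNN, or LSTM, as compared in the paper) would then be trained on such (prefix, next-term) pairs.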