With the development of information technology and the advance of informatization, communication equipment has grown steadily in variety and capability, and the electromagnetic environment has become increasingly crowded with interference. Multiple interference sources interweave and interact across several domains, degrading communication quality. Because the communication environment is constantly affected by interference, anti-jamming communication technology has long been a research hotspot. At the same time, artificial intelligence is developing rapidly, and learning-based methods have achieved remarkable results in image processing and natural language processing. This has led researchers to combine artificial intelligence with communication anti-jamming, so that anti-jamming methods become more intelligent and closer to real time. This new generation of intelligent anti-jamming communication technology deserves in-depth study.

In this paper, we survey and analyze the domestic and international research status of traditional and intelligent communication anti-jamming technologies, together with commonly used interference models. On this basis, the paper analyzes the feasibility of using convolutional neural networks to learn interference models and to suppress interference. Departing from the traditional spread-spectrum perspective, this paper designs a deep-learning-based anti-jamming receiving algorithm from the perspective of interference learning and computation-based interference suppression. The main work is organized as follows:

(1) To address the excessive bandwidth occupied by traditional direct-sequence spread-spectrum anti-jamming communication, this paper proposes a convolutional neural network (CNN) based anti-jamming algorithm at the receiver. The method first preprocesses the received signal containing interference to obtain a low-pass sampled signal. The signal is then formatted and converted into an input that the neural network can be trained on. The conventional convolutional neural network structure is adapted to process one-dimensional communication data. Through iterative feature extraction and learning, the network eventually produces an interference estimate, which is subtracted from the received signal to achieve interference suppression. Instead of spreading the signal to suppress interference power, the algorithm spends the computing resources of the learning algorithm, which significantly reduces the occupied bandwidth at comparable performance.
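The abstract does not give implementation details, so the following is only a minimal sketch of the estimate-and-cancel idea described in (1). It assumes PyTorch; the layer count, channel widths, kernel sizes, and the names InterferenceEstimator and cancel_interference are illustrative assumptions, not the author's actual architecture.

```python
# Minimal sketch (assumption): a 1-D CNN that maps received baseband samples
# to an interference estimate, which is then subtracted from the received
# signal. Layer count, widths, and kernel sizes are illustrative only.
import torch
import torch.nn as nn

class InterferenceEstimator(nn.Module):
    def __init__(self, channels: int = 2):      # 2 channels: I and Q samples (assumed)
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(channels, 32, kernel_size=9, padding=4),
            nn.ReLU(),
            nn.Conv1d(32, 32, kernel_size=9, padding=4),
            nn.ReLU(),
            nn.Conv1d(32, channels, kernel_size=9, padding=4),
        )

    def forward(self, received: torch.Tensor) -> torch.Tensor:
        # received: (batch, channels, samples) -> interference estimate, same shape
        return self.net(received)

def cancel_interference(model: InterferenceEstimator,
                        received: torch.Tensor) -> torch.Tensor:
    """Subtract the estimated interference from the received signal."""
    with torch.no_grad():
        return received - model(received)

# Training-loop sketch: the target is the known interference waveform,
# so the loss drives the network toward an interference estimate.
model = InterferenceEstimator()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

received = torch.randn(16, 2, 1024)      # placeholder received-signal batch
interference = torch.randn(16, 2, 1024)  # placeholder interference labels

for _ in range(10):
    optimizer.zero_grad()
    loss = loss_fn(model(received), interference)
    loss.backward()
    optimizer.step()
```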
(2) To address the problem that a single anti-jamming algorithm cannot adapt to different interference types and power levels at the same time, this paper improves the learning-based anti-jamming algorithm described above and analyzes its generalization performance under fixed signal-to-noise ratio and interference power. In addition, an interference classification algorithm is added at the front end. This algorithm senses the signals in the communication environment, recognizes the current interference type, and selects the corresponding learning module, so that multiple types of interference can be estimated and suppressed. A corresponding learning module is trained for each interference type to build an interference database and continuously enhance the algorithm's adaptability to interference (a sketch of this classify-and-select front end is given below).

In summary, the simulation results show that, unlike the traditional spread-spectrum method, which exchanges bandwidth for anti-jamming gain, the learning-based anti-jamming receiving algorithm proposed in this paper guarantees the system's bit-error performance while using computing power to improve the anti-jamming performance of the receiver. Moreover, the improved learning-based anti-jamming algorithm performs well in the interference environments constructed in this paper and shows better generalization performance.
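As a further illustrative assumption (the abstract only describes the idea in prose), the classify-and-select front end of (2) can be sketched as a small classifier that maps the sensed signal to an interference type and then dispatches to a per-type estimator module drawn from the interference database. The class names, the number of interference types, and the module registry below are hypothetical.

```python
# Minimal sketch (assumption): an interference-type classifier that selects
# the matching per-type estimator before cancellation. The number of types
# and all names are illustrative, not the author's actual design.
import torch
import torch.nn as nn

NUM_INTERFERENCE_TYPES = 4  # e.g. single-tone, multi-tone, sweep, pulse (assumed)

class InterferenceClassifier(nn.Module):
    def __init__(self, channels: int = 2, num_types: int = NUM_INTERFERENCE_TYPES):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(channels, 16, kernel_size=9, padding=4),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.head = nn.Linear(16, num_types)

    def forward(self, received: torch.Tensor) -> torch.Tensor:
        # received: (batch, channels, samples) -> logits over interference types
        return self.head(self.features(received).squeeze(-1))

class AdaptiveAntiJammingReceiver:
    """Classify the interference type, then run the matching estimator."""
    def __init__(self, classifier: nn.Module, estimators: list):
        self.classifier = classifier
        self.estimators = estimators  # "interference database": one trained module per type

    def receive(self, received: torch.Tensor) -> torch.Tensor:
        with torch.no_grad():
            jam_type = int(self.classifier(received).argmax(dim=-1)[0])
            estimator = self.estimators[jam_type]
            return received - estimator(received)   # estimate-and-cancel step

# Usage sketch, reusing the InterferenceEstimator from the previous sketch:
# receiver = AdaptiveAntiJammingReceiver(
#     InterferenceClassifier(),
#     estimators=[InterferenceEstimator() for _ in range(NUM_INTERFERENCE_TYPES)])
# clean = receiver.receive(received_batch)
```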