
Research On Key Technologies Of Signal Processing In Distributed Relay System

Posted on: 2018-05-02
Degree: Master
Type: Thesis
Country: China
Candidate: Y Qiang
Full Text: PDF
GTID: 2348330515958251
Subject: Information and Communication Engineering
Abstract/Summary:
3GPP LTE-A introduced relay technology to extend network coverage, bringing signal to places where shadow fading is severe or reception is poor, such as indoor environments and business-dense areas. Relay technology can extend the coverage area, increase system capacity, improve anti-fading performance and robustness, and reduce cost, offering a more energy-efficient way to deploy networks amid the rapid growth of communications.

There are several types of relay. A distributed digital relay uses one near-end unit connected to multiple remote units, with optical fiber or cable linking the near-end unit to the remote units. Compared with the traditional analog relay, the distributed digital relay is widely studied and deployed because of its low long-distance transmission attenuation, stable performance, flexible networking, comprehensive monitoring, ease of signal processing, versatility, and compatibility. However, a distributed digital relay raises the noise floor at the base station and thereby reduces base-station sensitivity. This thesis studies how to solve this problem through digital signal processing.

First, based on the operating principle of the distributed digital relay system, the uplink noise is analyzed for a system in which one near-end unit is connected to several remote units. The analysis shows that the noise increment in the distributed digital relay system is determined mainly by the noise figure of the remote units, the uplink gain, and the number of remote units. Then, using the 3GPP Case 1 path-loss model for the LTE system, uplink simulation and calculation are carried out for both the traditional analog relay scenario and the distributed digital relay scenario. The results show that the distributed digital relay achieves a better link gain than the traditional analog relay, but as the number of remote units increases, the system noise
will also increase significantly, degrading system performance. To address this, this thesis presents uplink noise suppression methods for both the GSM and LTE systems. The basic principle is as follows: first, the uplink signal is filtered to obtain the digital baseband signal of each user frequency band; the filtered baseband signal then undergoes power detection; finally, the detected power is compared with a noise threshold according to defined decision rules, and a band whose power is sufficiently small is judged to be noise interference and deleted, eliminating the interference.

Second, the noise suppression scheme for the distributed digital relay is simulated systematically in both the GSM and LTE systems. The uplink transmission schemes of GSM and LTE are introduced first, and uplink signals for both systems are generated with the MATLAB simulation tool. The digital filter bank, the power detection, and the threshold setting in the noise-floor suppression module are then designed in detail and their performance analyzed. Finally, the design is applied to GSM and LTE distributed digital relay systems. Bit-error-rate simulations at different SNRs show that the scheme suppresses the uplink noise well and markedly improves the system bit error rate.

Finally, this thesis optimizes the noise suppression module for the LTE system and analyzes its performance. The noise threshold setting is analyzed and optimized first: by defining two thresholds, a noise-on threshold and a noise-off threshold, and proposing an adaptive noise threshold method, the threshold tracks the real-time noise power, making the noise decision more accurate, so that the truncation error and the misjudgment
error can be reduced. Bit-error-rate simulations confirm that interference noise is suppressed more accurately and the system bit error rate is further improved. Next, the filter bank in the noise-floor suppression module is optimized: the prototype filter of the DFT filter bank is redesigned to reduce the overlap between adjacent filters and improve the filtering performance. Lastly, the performance of the noise suppression module is evaluated and analyzed. The effect of each module's delay on the system is analyzed and a way to mitigate its adverse influence is proposed; the computational complexity of the filter bank is analyzed, and a method that treats several adjacent resource blocks as one filtering unit is put forward to reduce the amount of computation.
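The per-band filtering described above can be realized with a uniform DFT filter bank, in which a single lowpass prototype filter is modulated to each band centre. The following Python/NumPy sketch illustrates the idea only; the windowed-sinc prototype, tap count, and band count are illustrative assumptions, not the thesis's actual design:

```python
import numpy as np

def lowpass_prototype(num_taps, cutoff):
    """Windowed-sinc lowpass prototype; cutoff in cycles/sample."""
    n = np.arange(num_taps) - (num_taps - 1) / 2
    h = np.sinc(2 * cutoff * n) * np.hamming(num_taps)
    return h / h.sum()

def dft_filter_bank(x, num_bands, num_taps=129):
    """Split x into num_bands sub-band signals by modulating one
    lowpass prototype to each band centre (uniform DFT bank)."""
    h = lowpass_prototype(num_taps, cutoff=1.0 / (2 * num_bands))
    n = np.arange(num_taps)
    bands = []
    for k in range(num_bands):
        # shift the prototype passband to band k's centre frequency
        h_k = h * np.exp(2j * np.pi * k * n / num_bands)
        bands.append(np.convolve(x, h_k, mode="same"))
    return bands
```

In an efficient implementation the modulation is folded into a polyphase decomposition plus an FFT, which is where the thesis's complexity analysis applies.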
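The filter–detect–decide principle (power detection per sub-band, then deletion of bands below the noise threshold) can be sketched as follows; the threshold value is a placeholder, not a value from the thesis:

```python
import numpy as np

def suppress_noise_bands(subband_signals, noise_threshold):
    """Power-detect each filtered sub-band and delete (zero) any band
    whose average power falls below the noise threshold."""
    cleaned = []
    for band in subband_signals:
        avg_power = np.mean(np.abs(band) ** 2)
        if avg_power < noise_threshold:
            cleaned.append(np.zeros_like(band))  # judged as noise: deleted
        else:
            cleaned.append(band)                 # judged as user signal: kept
    return cleaned
```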
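The dual-threshold scheme (noise-on / noise-off) is, in effect, a hysteresis decision, and the adaptive part can be approximated by tracking the noise floor with a running average. A minimal sketch of that behaviour follows; the smoothing factor and margins are illustrative assumptions rather than the thesis's optimized values:

```python
def hysteresis_noise_decision(powers, noise_floor0=1e-4,
                              alpha=0.95, on_margin=4.0, off_margin=2.0):
    """Per-frame noise decision with two adaptive thresholds.

    A band is marked active (signal) once its power exceeds
    on_margin * noise_floor, and marked noise again only when it drops
    below off_margin * noise_floor. The floor estimate is an exponential
    moving average updated only from frames judged to be noise, so the
    thresholds track the real-time noise power.
    Returns a boolean list: True = keep frame, False = suppress it.
    """
    noise_floor = noise_floor0
    active = False
    keep = []
    for p in powers:
        if active:
            if p < off_margin * noise_floor:
                active = False   # signal has fallen back to the floor
        else:
            if p > on_margin * noise_floor:
                active = True    # signal detected above the floor
        if not active:
            # update the floor estimate only from noise-only frames
            noise_floor = alpha * noise_floor + (1 - alpha) * p
        keep.append(active)
    return keep
```

Using two separated thresholds instead of one prevents a power level hovering near a single threshold from toggling the decision on and off, which is the truncation/misjudgment trade-off the abstract describes.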
Keywords/Search Tags:Distributed Digital Relay, Noise Suppression, Filter Bank, Power Detection, Noise Threshold