The fusion of multi-source precipitation observations can reduce measurement error, improve retrieval accuracy, and yield more reliable and accurate precipitation datasets. Ground-based and spaceborne radar precipitation estimates each have their own error structure and representative spatio-temporal scales, and accurately characterizing these error structures is the key to obtaining optimal fusion of multi-source precipitation observations. Stratiform and convective precipitation involve different microphysical processes during precipitation development, and their corresponding error structures differ substantially. It is therefore important to study the error structure characteristics of ground-based and spaceborne precipitation radars by precipitation type, to build prior and likelihood models on that basis, and to develop optimal fusion models and algorithms for ground-based and spaceborne radar precipitation data with different spatio-temporal scales and uncertainties, so as to obtain high-accuracy precipitation estimates.

First, based on C-band dual-polarization radar data from Nanjing University of Information Science and Technology and disdrometer observations in Nanjing, a large number of typical stratiform and convective precipitation events were selected; the raindrop size distribution (DSD) parameters at the corresponding times at three disdrometer stations were compiled, and a log10(Nw)-D0 precipitation-type classification line applicable to the Nanjing area was fitted. The classification line was then applied to DSD parameters retrieved from the ground-based radar with a variational method, realizing precipitation-type classification of the ground-based radar data. Typical events were selected to verify the classification, the results were compared with the precipitation classification products of the spaceborne Dual-frequency Precipitation Radar (DPR), and the classification results were also applied to radar quantitative precipitation estimation. Building on this classification, and accounting for the influence of precipitation type and intensity on the error structure of radar precipitation estimation, a prior error model of the ground-based radar estimate relative to rain gauges and a likelihood error model of the spaceborne radar relative to the ground-based radar were constructed and their distribution parameters quantified; a hierarchical Bayesian framework was then used to fuse the ground-based and spaceborne radar precipitation information. Finally, matched spaceborne and ground-based radar cases were selected for precipitation data fusion, and the fused results were evaluated against rain gauges.

The results show that the fitted classification lines at the three disdrometer stations in Nanjing are consistent and apply well to ground-based radar precipitation classification: the typical stratiform (convective) events used for verification fall cleanly on their respective sides of the classification line. Taking the DPR precipitation classification products as a reference, the Nanjing classification line achieves the highest overall recognition rates, 84.56% for stratiform and 72.64% for convective precipitation. The hierarchical Bayesian fusion of ground-based and spaceborne radar based on precipitation-type classification greatly improves the original spaceborne radar precipitation estimate, reducing the relative bias against rain gauges by at least 20% and increasing the correlation by at least 30%. By accounting for the influence of precipitation type and intensity on the precipitation error structure, the hierarchical Bayesian fusion improves the root-mean-square error by 25%, the normalized mean absolute error by 11%, and the relative bias by 14% compared with conventional Bayesian fusion.
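The log10(Nw)-D0 separation described above can be sketched as a simple line test in DSD-parameter space. The slope and intercept below are illustrative placeholders, not the values fitted for Nanjing in this study; points above the line are labeled convective and points below stratiform.

```python
import numpy as np

def classify_precip(log10_nw, d0, slope=-1.6, intercept=6.3):
    """Classify precipitation type from DSD parameters.

    Points with log10(Nw) above the line slope * D0 + intercept are
    labeled convective, points below are stratiform. The slope and
    intercept here are illustrative placeholders; the study fits its
    own classification line to Nanjing disdrometer observations.
    """
    line = slope * d0 + intercept
    return np.where(log10_nw > line, "convective", "stratiform")

# Two hypothetical radar-retrieved DSD pixels
log10_nw = np.array([3.2, 4.8])   # log10 of normalized intercept Nw
d0 = np.array([1.8, 1.5])         # median volume diameter D0 (mm)
print(classify_precip(log10_nw, d0))  # ['stratiform' 'convective']
```

Applied pixel by pixel to variationally retrieved DSD fields, this yields a precipitation-type mask over the full radar domain.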
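A minimal sketch of the fusion step, assuming Gaussian conjugate forms: the ground-radar estimate provides the prior (with error quantified against gauges) and the spaceborne estimate provides the likelihood (with error quantified against the ground radar), so the posterior mean is a precision-weighted average. The error standard deviations below are hypothetical, type-dependent placeholders; the actual study quantifies the distribution parameters from matched observations within a fuller hierarchical model.

```python
import numpy as np

def bayes_fuse(r_ground, r_space, sigma_prior, sigma_like):
    """Gaussian conjugate fusion of a ground-radar prior with a
    spaceborne-radar likelihood.

    Prior N(r_ground, sigma_prior^2): ground-radar error relative to
    rain gauges. Likelihood N(r_space, sigma_like^2): spaceborne-radar
    error relative to the ground radar. Returns the posterior mean
    (precision-weighted average) and posterior variance.
    """
    w_prior = 1.0 / sigma_prior**2
    w_like = 1.0 / sigma_like**2
    post_mean = (w_prior * r_ground + w_like * r_space) / (w_prior + w_like)
    post_var = 1.0 / (w_prior + w_like)
    return post_mean, post_var

# Illustrative, type-dependent error standard deviations (mm/h);
# convective errors are assumed larger than stratiform ones.
SIGMA = {"stratiform": (0.8, 1.5), "convective": (2.5, 4.0)}

sig_prior, sig_like = SIGMA["convective"]
fused, var = bayes_fuse(r_ground=12.0, r_space=18.0,
                        sigma_prior=sig_prior, sigma_like=sig_like)
print(round(fused, 2), round(var, 2))  # 13.69 4.49
```

Because the weights depend on the type- and intensity-conditioned error variances, the fused estimate leans toward whichever sensor is more reliable for the current precipitation regime, which is the mechanism behind the improvements over conventional (unconditioned) Bayesian fusion reported above.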