Most conventional seismic processing techniques have embedded within them a dependence on least squares estimation. However, it has been shown in both the statistical and geophysical literature that least squares estimation may yield unreliable results when the contaminating noise does not follow a Gaussian probability density. Within the statistical domain, a theoretical framework has been developed to address the problem of non-Gaussian noise by seeking estimators which retain high efficiency in the Gaussian case yet remain stable when the noise is otherwise distributed.

Robust statistical methods can be integrated into most seismic data processing schemes simply by replacing the least squares criterion with one that is less sensitive to aberrant data values. Three specific areas in which this may be done are filtering, stacking, and deconvolution. Filtering by medians and alpha-trimmed mean filters provides protection against spikes and glitches in the seismic trace. Replacing the standard stack with a robust estimate of location, such as the alpha-trimmed mean, produces stacks which are insensitive to bad traces. Additionally, by using a robust stack as an initial signal estimate, weighted stacks may be computed with weights that are a function of the distance of each trace from the reference trace.

Finally, the process that geophysicists refer to as deconvolution is known to statisticians as multiple regression. By exploiting this connection, a methodology for computing robust deconvolutions can be developed from the theory of robust regression. One method for rapidly solving the resulting non-linear deconvolution problems is known in the statistical literature as iteratively reweighted least squares. Sketches of each of these ideas follow.
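To make the filtering idea concrete, the following minimal sketch implements a sliding-window alpha-trimmed mean filter in Python with NumPy; the window length and trim fraction are illustrative choices, not values from the original work. With `alpha = 0` the filter reduces to a running mean, and as `alpha` approaches 0.5 it approaches a running median, so the same routine covers both filters mentioned above.

```python
import numpy as np

def alpha_trimmed_mean_filter(trace, window=5, alpha=0.2):
    """Sliding-window alpha-trimmed mean: sort each window, discard the
    lowest and highest alpha fraction of samples, average the rest."""
    half = window // 2
    padded = np.pad(trace, half, mode="edge")   # replicate edges
    k = int(alpha * window)                     # samples trimmed from each end
    out = np.empty(len(trace))
    for i in range(len(trace)):
        w = np.sort(padded[i:i + window])
        out[i] = w[k:window - k].mean()
    return out
```

Because the trimming discards the extreme samples in every window, an isolated spike never enters the average, which is what gives the filter its resistance to glitches.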
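The same trimming idea, applied across traces at each time sample rather than along time, gives the robust stack. The sketch below assumes an NMO-corrected gather stored as an `(n_traces, n_samples)` array; that data layout, and the trim fraction, are assumptions for illustration.

```python
import numpy as np

def alpha_trimmed_stack(gather, alpha=0.2):
    """Stack an (n_traces, n_samples) NMO-corrected gather with the
    alpha-trimmed mean across traces: at each time sample, sort the
    trace values, drop the alpha fraction at each end, average the rest."""
    n = gather.shape[0]
    k = int(alpha * n)                 # traces trimmed from each end
    s = np.sort(gather, axis=0)        # sort trace values at each sample
    return s[k:n - k].mean(axis=0)
```

A few bad traces then fall into the trimmed tails at the samples where they misbehave and cannot bias the stack.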
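The two-pass weighted stack can use this robust stack as its reference trace. The original text says only that the weights are a function of each trace's distance from the reference; the particular decreasing weight function below (Cauchy-style, scaled by the median distance) is a hypothetical choice for illustration.

```python
import numpy as np

def weighted_stack(gather, alpha=0.2):
    """Two-pass stack: form a robust reference via the alpha-trimmed mean,
    then weight each trace by a decreasing function of its distance from
    that reference."""
    n = gather.shape[0]
    k = int(alpha * n)
    ref = np.sort(gather, axis=0)[k:n - k].mean(axis=0)  # robust reference
    d = np.abs(gather - ref).mean(axis=1)   # mean |deviation| per trace
    scale = np.median(d) + 1e-12            # robust scale; guards divide-by-zero
    w = 1.0 / (1.0 + (d / scale) ** 2)      # hypothetical Cauchy-style weights
    w /= w.sum()
    return (w[:, None] * gather).sum(axis=0)
```

Traces far from the reference receive small weights, so they contribute little to the final stack without being discarded outright.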
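Finally, once deconvolution is posed as a regression, the filter design becomes a linear system `A x ≈ b`, and iteratively reweighted least squares solves the robust version by alternating a weighted least-squares solve with a residual-based reweighting. The sketch below uses L1 (least-absolute-deviation) weights as one common choice; the influence function used in practice may differ.

```python
import numpy as np

def irls(A, b, n_iter=20, eps=1e-8):
    """Iteratively reweighted least squares for a robust (here L1)
    regression criterion: repeatedly solve a weighted least-squares
    problem whose weights shrink the influence of large residuals."""
    x = np.linalg.lstsq(A, b, rcond=None)[0]    # ordinary LS starting point
    for _ in range(n_iter):
        r = b - A @ x
        w = 1.0 / np.maximum(np.abs(r), eps)    # L1 weights: big residual -> small weight
        sw = np.sqrt(w)                         # lstsq takes the square-rooted weights
        x = np.linalg.lstsq(A * sw[:, None], sw * b, rcond=None)[0]
    return x
```

In one common formulation of predictive deconvolution, for example, `A` would be a Toeplitz matrix of lagged trace samples and `b` the trace advanced by the prediction distance, so that `x` plays the role of the prediction-error filter; each reweighted solve is an ordinary least-squares problem, which is what makes the non-linear robust problem rapid to solve.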