
The effect of interferometric noise due to crosstalk on the bit error rate of an optical network

Posted on: 2002-03-28
Degree: Ph.D
Type: Dissertation
University: University of California, San Diego
Candidate: Kuan, Gary M
Full Text: PDF
GTID: 1468390011998337
Subject: Engineering
Abstract/Summary:
Crosstalk in optical networks produces interferometric intensity noise, which increases the error rate and degrades system performance. In a study of multi-path crosstalk on a two-path optical network, interferometric intensity noise due to coherent crosstalk is observed to increase the error rate by up to twelve orders of magnitude. Incoherent crosstalk is also found to interfere with the data signal, producing error rates up to ten orders of magnitude greater than expected for non-interfering signals. A more detailed analysis of optical interference reveals that incoherent interference exists in optical communications networks even when interference fringes are otherwise unobservable. The manifestation of optical interference depends on the RF bandwidth of the detector relative to the linewidth of the RF intensity noise, which is characterized by the cross-correlation of the two interfering optical signals. Increasing the RF bandwidth of the detection system effectively increases its temporal resolution; if the temporal resolution is great enough, intensity fluctuations due to the relative stochastic phase fluctuations of the two signals are observed. This phenomenon is confirmed by both theoretical and experimental investigation.
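The bandwidth dependence described in the abstract can be illustrated with a minimal two-beam interference simulation. All parameters below (laser linewidth, crosstalk power ratio, sample rate, filter bandwidth) are hypothetical illustration values, not figures from the dissertation; the relative phase between the signal and the crosstalk term is modeled as a simple Wiener process:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical illustration parameters (not from the dissertation)
fs = 1e9            # detector sample rate: 1 GS/s
n = 200_000         # number of samples
linewidth = 1e6     # combined laser linewidth: 1 MHz (Lorentzian)
eps = 0.01          # crosstalk-to-signal power ratio: -20 dB

# Relative phase between signal and crosstalk as a Wiener process;
# a step variance of 2*pi*linewidth/fs yields a 1 MHz beat linewidth.
phi = np.cumsum(rng.normal(0.0, np.sqrt(2 * np.pi * linewidth / fs), n))

# Two-beam interference: |E1 + E2|^2 = 1 + eps + 2*sqrt(eps)*cos(phi)
intensity = 1 + eps + 2 * np.sqrt(eps) * np.cos(phi)

def lowpass(x, taps):
    """Crude moving-average low-pass filter (bandwidth ~ fs / (2 * taps))."""
    return np.convolve(x, np.ones(taps) / taps, mode="valid")

# A wideband detector resolves the full interferometric beat noise;
# a detector much narrower than the beat linewidth averages it away.
wide_std = intensity.std()                     # full detection bandwidth
narrow_std = lowpass(intensity, 10_000).std()  # ~50 kHz << 1 MHz linewidth

print(f"wideband intensity noise (std):   {wide_std:.3f}")
print(f"narrowband intensity noise (std): {narrow_std:.3f}")
```

Running this, the narrowband detector sees much weaker intensity noise than the wideband one, mirroring the abstract's claim that the interferometric noise becomes observable only when the detector's RF bandwidth (temporal resolution) is large relative to the linewidth of the beat noise.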
Keywords/Search Tags:Optical, Error rate, Noise, Crosstalk, Interferometric, Due