Abstract:
A mathematical model based on the time-correlation approach is developed to analyse the influence of anisotropy defects in an optical path on the correlation characteristics of low-coherence radiation. Several interference schemes of practical importance for optical coherence tomography (OCT) are analysed. Experimental data are presented that demonstrate the effect of anisotropy defects in an optical path on the output interference signal and on the quality of two-dimensional images (tomograms) of biological objects obtained by OCT.