In modern autonomous systems, measurement repeatability and precision are crucial for robust decision-making algorithms. Stereovision, widely used in safety applications, provides information about an object’s shape, orientation, and 3D localisation. A camera’s lens distortion is a common source of systematic measurement error, which can be estimated and then eliminated, or at least reduced, with a suitable correction/calibration method. In this study, a set of cameras equipped with Basler lenses (C125-0618-5M F1.8 f6mm) and Sony IMX477R sensors is calibrated using the state-of-the-art Zhang–Duda–Frese method, and the resulting distortion coefficients are used to correct the images. The calibrations are evaluated with two novel methods for measuring lens distortion. The first is based on linear regression applied to images of a pattern of vertical and horizontal lines. Based on these evaluation tests, outlying cameras are eliminated from the test set by applying the 2σ criterion. For the remaining cameras, the MSE is reduced by a factor of up to 75.4, to 1.8 px–6.9 px. The second method evaluates the impact of lens distortion on stereovision applied to bird tracking around wind farms. A bird’s flight trajectory is synthetically generated to estimate changes in disparity and distance before and after calibration. The method shows that, at the margins of the image, lens distortion can introduce distance-measurement errors of +17% to +20% for camera pairs with identical distortion, and from −41% up to (Formula presented.) for camera pairs with different lens distortions. These results highlight the importance of well-calibrated cameras in systems that require precision, such as stereovision bird tracking in bird–turbine collision risk assessment systems.
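The distance errors quoted above arise because lens distortion shifts pixel positions and therefore the stereo disparity. A minimal sketch of this mechanism, using the standard pinhole stereo relation Z = f·B/d (not the paper's actual pipeline; the focal length, baseline, and disparity values below are illustrative assumptions):

```python
# Sketch: how a disparity error caused by lens distortion propagates to a
# stereo distance estimate via the pinhole relation Z = f * B / d.
# All numeric parameters below are illustrative assumptions, not values
# from the study.

def stereo_distance(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Pinhole stereo range: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

def distance_error_pct(focal_px: float, baseline_m: float,
                       true_disp_px: float, disp_shift_px: float) -> float:
    """Relative distance error (%) when distortion shifts the disparity."""
    z_true = stereo_distance(focal_px, baseline_m, true_disp_px)
    z_meas = stereo_distance(focal_px, baseline_m, true_disp_px + disp_shift_px)
    return 100.0 * (z_meas - z_true) / z_true

if __name__ == "__main__":
    f_px, baseline = 1500.0, 0.5   # assumed focal length [px] and baseline [m]
    d_true = 10.0                  # assumed true disparity [px]
    for shift in (-2.0, -1.0, 1.0, 2.0):
        print(f"disparity shift {shift:+.1f} px -> "
              f"distance error {distance_error_pct(f_px, baseline, d_true, shift):+.1f}%")
```

Because Z is inversely proportional to disparity, even a shift of a few pixels near the image margins, where distortion is strongest, produces double-digit percentage errors in range, which is consistent in spirit with the −41% to +20% figures reported above.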