Camera Geometry

This section introduces the camera calibration classes and geometry primitives available in the dv-processing library.

Calibration files

Multiple calibration file formats were introduced in different DV software components. Single / stereo camera calibration files were introduced in the DV camera calibration module, which produces these files after a successful calibration. A new JSON calibration file format is introduced in dv-processing, which can store calibration information not only for single / stereo camera setups, but also for multi-camera setups with any number of cameras, as well as IMU calibration and its noise parameters.

All of these calibration files can be read and written using the provided dv::camera::CalibrationSet class. The CalibrationSet class contains three main types of calibration parameters:

  • dv::camera::calibrations::CameraCalibration - Single camera intrinsic calibration: pinhole camera and distortion model parameters. It also contains an extrinsic transformation matrix which describes the physical displacement of the camera with respect to one of the cameras in the rig. Usually the first camera, denoted “C0”, is selected as the reference: it has an identity matrix as its transformation, while all other cameras contain transformation matrices describing their pose relative to the first camera.

  • dv::camera::calibrations::StereoCalibration - Stereo camera extrinsic parameters; this class contains the essential and fundamental matrices between a selected pair of cameras.

  • dv::camera::calibrations::IMUCalibration - IMU extrinsic calibration and IMU noise parameters. This class contains the extrinsic parameters - the transformation between the IMU and the camera frame and a timestamp offset (time calibration). It also contains the IMU measurement noise parameters: gyroscope and accelerometer measurement biases, noise densities, and random walk parameters.

The dv::camera::CalibrationSet contains collections of these three types of calibration parameters, which is sufficient for storing the calibration of any generic visual-inertial multi-camera rig. For the full list of available parameters, please see the public members of each of the calibration classes.
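
The short sketch below shows how a loaded calibration set can be queried for a specific camera by its designation. It is a minimal illustration only: it assumes a “calibration.json” file such as the one produced in the writing sample below, and it relies solely on the getCameraCalibrations() accessor that also appears in the reading sample further down.

#include <dv-processing/camera/calibration_set.hpp>

#include <iostream>

int main() {
    // Load a previously written calibration set (file name used here for illustration)
    const auto calibrationSet = dv::camera::CalibrationSet::LoadFromFile("calibration.json");

    // Search the available camera calibrations for the reference camera designated "C0"
    for (const auto &[designation, camera] : calibrationSet.getCameraCalibrations()) {
        if (designation == "C0") {
            std::cout << "Reference camera: " << camera.name << std::endl;
        }
    }

    return 0;
}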

Writing a calibration file

The following code sample shows how to generate a calibration file from hardcoded calibration parameters. The generated file is used as input in the next section, Reading a calibration file.

#include <dv-processing/camera/calibration_set.hpp>

int main() {
    // Initialize a calibration set
    dv::camera::CalibrationSet calibration;

    // Add a camera calibration with hardcoded calibration parameters, the exact values are just for illustration.
    calibration.addCameraCalibration(dv::camera::calibrations::CameraCalibration("DVXplorer_DXA000312", "left", true,
        cv::Size(640, 480), cv::Point2f(320, 240), cv::Point2f(640, 640), {}, dv::camera::DistortionModel::None,
        std::vector<float>({1.f, 0.f, 0.f, 0.f, 0.f, 1.f, 0.f, 0.f, 0.f, 0.f, 1.f, 0.f, 0.f, 0.f, 0.f, 1.f}),
        dv::camera::calibrations::CameraCalibration::Metadata()));

    // Add an IMU calibration as well, the exact values are just for illustration of use here.
    calibration.addImuCalibration(dv::camera::calibrations::IMUCalibration("DVXplorer_DXA000312", 100.f, 98.1f,
        cv::Point3f(0.f, 0.f, 0.f), cv::Point3f(0.f, 0.f, 0.f), 1.f, 1.f, 1.f, 1.f, 1.f, 1.f, 3500,
        std::vector<float>({1.f, 0.f, 0.f, 0.f, 0.f, 1.f, 0.f, 0.f, 0.f, 0.f, 1.f, 0.f, 0.f, 0.f, 0.f, 1.f}),
        dv::camera::calibrations::IMUCalibration::Metadata()));

    // Write the generated calibration set into a file.
    calibration.writeToFile("calibration.json");

    return 0;
}

Reading a calibration file

The following code sample shows the use of the dv::camera::CalibrationSet class to load an existing calibration file.

Note

This sample requires a “calibration.json” file in the directory the binary is executed from. The file can be generated by running the previous sample, “Writing a calibration file”.

#include <dv-processing/camera/calibration_set.hpp>

#include <iostream>

int main() {
    // Load a calibration set from a file
    const auto calibrationSet = dv::camera::CalibrationSet::LoadFromFile("calibration.json");

    // Iterate through available camera calibrations. The designation here is an internal camera abbreviation
    // used to refer to a specific sensor in the camera rig.
    for (const auto &[designation, calibration] : calibrationSet.getCameraCalibrations()) {
        // Print the designation and the camera name of current calibration
        std::cout << "[" << designation << "] Found calibration for camera with name [" << calibration.name << "]"
                  << std::endl;

        // Print the intrinsic calibration parameters for this camera: focal length, principal point, distortion model
        // and parameters of the distortion model
        std::cout << "\t Focal length: " << calibration.focalLength << std::endl;
        std::cout << "\t Principal point: " << calibration.principalPoint << std::endl;
        std::cout << "\t Distortion model: " << calibration.getDistortionModelString() << std::endl;
        std::cout << "\t Distortion parameters: "
                  << fmt::format("[{}]", fmt::join(calibration.distortion.begin(), calibration.distortion.end(), ", "))
                  << std::endl;
    }

    // Iterate through available IMU calibrations in the file
    for (const auto &[designation, calibration] : calibrationSet.getImuCalibrations()) {
        // Print the designation and the camera name of current calibration
        std::cout << "[" << designation << "] Found IMU calibration for camera with name [" << calibration.name << "]"
                  << std::endl;

        // Print some available information: accelerometer and gyroscope measurement limits, calibrated time offset and
        // biases
        std::cout << "\t Maximum acceleration: " << calibration.accMax << " [m/s^2]" << std::endl;
        std::cout << "\t Maximum angular velocity: " << calibration.omegaMax << " [rad/s]" << std::endl;
        std::cout << "\t Time offset: " << calibration.timeOffsetMicros << " [μs]" << std::endl;
        std::cout << "\t Accelerometer bias: " << calibration.accOffsetAvg << " [m/s^2]" << std::endl;
        std::cout << "\t Gyroscope bias: " << calibration.omegaOffsetAvg << " [rad/s]" << std::endl;

        // Print noise density values for the IMU sensor
        std::cout << "\t Accelerometer noise density: " << calibration.accNoiseDensity << " [m/s^2/sqrt(Hz)]"
                  << std::endl;
        std::cout << "\t Gyroscope noise density: " << calibration.omegaNoiseDensity << " [rad/s/sqrt(Hz)]"
                  << std::endl;
    }

    return 0;
}

Monocular camera geometry

While the dv::camera::CalibrationSet class is designed for reading / writing the calibration parameters of a multi-camera rig and dv::camera::calibrations::CameraCalibration stores the intrinsic calibration parameters of a single pinhole camera model, these classes do not provide any operations for projecting or (un)distorting point coordinates. The dv::camera::CameraGeometry class provides an efficient implementation of the mathematical operations for:

  • Forward / backward projection,

  • Applying and removing lens distortion (distortion / undistortion).

Note

The operations are intended for sparse point operations, since an event camera produces sparse pixel data. For operations on full image frames, prefer the highly optimized routines available in OpenCV.
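
As a quick illustration of the projection round trip on a handful of sparse points, the sketch below back-projects a few pixel coordinates and projects them again, which recovers the original pixels for an ideal pinhole model. It reuses the backProjectSequence and projectSequence calls from the full sample below; the assumption that these templated methods also accept a plain std::vector<cv::Point2f> as input (rather than only an event store) is ours.

#include <dv-processing/camera/camera_geometry.hpp>

#include <iostream>
#include <vector>

int main() {
    // Ideal pinhole camera model without distortion, VGA resolution (same parameters as the sample below)
    dv::camera::CameraGeometry geometry(640.f, 640.f, 320.f, 240.f, cv::Size(640, 480));

    // A few sparse pixel coordinates to operate on
    const std::vector<cv::Point2f> pixels = {{320.f, 240.f}, {0.f, 0.f}, {639.f, 479.f}};

    // Back-project the pixels into 3D points at unit depth in the camera frame
    const auto points = geometry.backProjectSequence<std::vector<dv::Point3f>>(pixels);

    // Projecting the points again (without any transformation in between) recovers the original pixels
    const auto reprojected = geometry.projectSequence<std::vector<cv::Point2f>>(points);

    for (std::size_t i = 0; i < pixels.size(); i++) {
        std::cout << pixels[i] << " -> (" << points[i].x() << ", " << points[i].y() << ", " << points[i].z()
                  << ") -> " << reprojected[i] << std::endl;
    }

    return 0;
}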

The following code sample shows the use of dv::camera::CameraGeometry to back-project pixel coordinates into 3D space, apply a 3D transformation to the points using the kinematics library, and project the points back into pixel coordinates.

#include <dv-processing/camera/camera_geometry.hpp>
#include <dv-processing/core/event.hpp>
#include <dv-processing/data/generate.hpp>
#include <dv-processing/kinematics/transformation.hpp>
#include <dv-processing/visualization/colors.hpp>

#include <opencv2/highgui.hpp>

int main() {
    // Use VGA resolution for this sample
    const cv::Size resolution(640, 480);

    // Initialize an ideal pinhole camera model with no distortion
    dv::camera::CameraGeometry geometry(640.f, 640.f, 320.f, 240.f, resolution);

    dv::EventStore positiveEvents;

    // Generate a sample set of events and filter out only positive events for this sample
    dv::polarityFilter(dv::data::generate::dvLogoAsEvents(0, resolution), positiveEvents, true);

    // Back project the events into a set of 3D points
    const auto points = geometry.backProjectSequence<std::vector<dv::Point3f>>(positiveEvents);

    // Apply some 3D transformation
    dv::kinematics::Transformationf shift(
        0, Eigen::Vector3f(0.5f, 0.3f, 0.1f), Eigen::Quaternionf(0.24f, -0.31f, -0.89f, 0.18f));

    // Apply the transformation above to each of the back-projected points
    std::vector<dv::Point3f> shiftedPoints;
    for (const auto &point : points) {
        shiftedPoints.push_back(shift.transformPoint<dv::Point3f>(point));
    }

    // Forward project the points with transformation
    const auto rotatedPixels = geometry.projectSequence<std::vector<cv::Point2f>>(shiftedPoints);

    // Choose a color for visualization and store it in a variable that can be efficiently assigned to a pixel intensity
    const auto blue = dv::visualization::colors::iniBlue;
    const cv::Vec3b color(static_cast<uint8_t>(blue[0]), static_cast<uint8_t>(blue[1]), static_cast<uint8_t>(blue[2]));

    // Draw input events on an image for input preview
    cv::Mat input(resolution, CV_8UC3, dv::visualization::colors::white);
    for (const auto &event : positiveEvents) {
        input.at<cv::Vec3b>(event.y(), event.x()) = color;
    }

    // Draw output pixels on another image
    cv::Mat output(resolution, CV_8UC3, dv::visualization::colors::white);
    for (const auto &pixel : rotatedPixels) {
        output.at<cv::Vec3b>(pixel) = color;
    }

    // Concatenate both images for a single preview image
    cv::Mat preview;
    cv::hconcat(input, output, preview);

    // Create preview window and show the image
    cv::namedWindow("Preview", cv::WINDOW_NORMAL);
    cv::imshow("Preview", preview);
    cv::waitKey();

    return 0;
}
[Figure: reprojected_transform.png]

Expected result of running the sample code for camera geometry transformation.

Stereo camera geometry

The dv::camera::StereoGeometry class is intended for efficient rectification of sparse point coordinates and 3D depth estimation. It is built on top of the monocular dv::camera::CameraGeometry projections and adds sparse stereo rectification that aligns coordinates for stereo disparity matching. The class also provides convenience methods to estimate 3D depth from a known disparity value.
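
The depth computation behind this convenience functionality follows the standard rectified pinhole stereo relation, depth = focal length × baseline / disparity. The snippet below is a sketch of that arithmetic only, with made-up focal length, baseline, and disparity values; it does not use the StereoGeometry API itself.

#include <iostream>

int main() {
    // Rectified stereo relation: Z = f * B / d, with f in pixels, B in meters, d in pixels.
    // All values below are illustrative only.
    const float focalLengthPx = 640.f;  // focal length [px]
    const float baselineM     = 0.05f;  // stereo baseline [m]
    const float disparityPx   = 16.f;   // matched disparity [px]

    const float depthM = (focalLengthPx * baselineM) / disparityPx;

    std::cout << "Estimated depth: " << depthM << " [m]" << std::endl;

    return 0;
}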

For sample usage of the stereo geometry class, please refer to the full disparity estimation samples available here.