Reading camera data

The dv-processing library provides a convenient way of reading camera data from live connected cameras and persistent files.

Camera name

Camera name is used to identify a unique camera produced by iniVation. A camera name consists of the camera model and a serial number, concatenated by an underscore (“_”) character. The library refers to the camera name in multiple methods, so this value can be used consistently across the library.

Some examples of camera names:

  • DVXplorer: DVXplorer_DXA00093, DVXplorer_DXM00123

  • DAVIS: DAVIS346_00000499
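
Since a camera name is simply the model and the serial number joined by an underscore, it can be split back into its parts with plain string handling. Below is a minimal sketch (not a library function) of such a split:

#include <iostream>
#include <string>

int main() {
    // A camera name following the "<model>_<serial>" convention
    const std::string cameraName = "DVXplorer_DXA00093";

    // Split at the underscore: model first, serial number second
    const auto separator = cameraName.rfind('_');
    const std::string model  = cameraName.substr(0, separator);
    const std::string serial = cameraName.substr(separator + 1);

    std::cout << "Model: " << model << ", serial number: " << serial << std::endl;

    return 0;
}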

Note

This definition is valid for USB cameras. The camera name is also reported for network streaming sources, but in that case it can be set manually by the developer, so the naming convention for the models might not be followed exactly.

From a camera

The easiest approach to access data from a live camera is to use the dv::io::CameraCapture class. This section provides an in-depth explanation of its usage, along with code samples.

Discover connected cameras

Camera names can be inspected using the command-line utility dv-list-devices, which is available in the dv-processing packages. Sample output of the utility:

$ dv-list-devices
Device discovery: found 2 devices.
Detected device [DAVIS346_00000499]
Detected device [DVXplorer_DXA00093]

Device discovery is also possible using library methods. The following sample shows how to detect connected devices using the discovery method:

#include <dv-processing/io/discovery.hpp>

#include <iostream>

int main() {
    // Call the discovery method
    const std::vector<std::string> cameras = dv::io::discoverDevices();

    std::cout << "Device discovery: found " << cameras.size() << " devices." << std::endl;

    // Loop through detected camera names and print them
    for (const auto &cameraName : cameras) {
        std::cout << "Detected device [" << cameraName << "]" << std::endl;
    }

    return 0;
}

Opening a camera

The dv::io::CameraCapture class follows the RAII pattern for resource management. Creating an instance of the class opens the camera connected over USB and starts reading data immediately; the resources are released when the object instance is destroyed.

The constructor of this class accepts two arguments: a camera name [string] and a camera type [enum], which specify which camera to open. The default argument values do not constrain the camera specification and effectively open the first camera detected in the system.

#include <dv-processing/io/camera_capture.hpp>

// Open first detected camera in the system
dv::io::CameraCapture capture;

It’s also possible to open a specific camera on the system by providing its camera name:

// Open the specified camera
dv::io::CameraCapture capture("DVXplorer_DXA000000");

The camera type argument can be used to open a camera of a given type. If both parameters are provided, a camera needs to match both requirements to be opened by the dv::io::CameraCapture class:

// Open any DAVIS camera (camera name not specified)
dv::io::CameraCapture capture("", dv::io::CameraCapture::CameraType::DAVIS);

Checking camera capabilities

The dv::io::CameraCapture class abstracts all cameras manufactured by iniVation. Since different cameras provide different data types, the capture class provides methods to test what data the connected camera can provide:

#include <dv-processing/io/camera_capture.hpp>

#include <iostream>

int main() {
    // Open any camera
    dv::io::CameraCapture capture;

    // Print the camera name
    std::cout << "Opened [" << capture.getCameraName() << "] camera, it provides:" << std::endl;

    // Check whether event stream is available
    if (capture.isEventStreamAvailable()) {
        // Get the event stream resolution, the output is a std::optional, so the value() method is
        // used to get the actual resolution value
        const cv::Size resolution = capture.getEventResolution().value();

        // Print the event stream capability with resolution value
        std::cout << "* Events at " << resolution << " resolution" << std::endl;
    }

    // Check whether frame stream is available
    if (capture.isFrameStreamAvailable()) {
        // Get the frame stream resolution
        const cv::Size resolution = capture.getFrameResolution().value();

        // Print the frame stream capability with resolution value
        std::cout << "* Frames at " << resolution << " resolution" << std::endl;
    }

    // Check whether the IMU stream is available
    if (capture.isImuStreamAvailable()) {
        // Print the IMU data stream capability
        std::cout << "* IMU measurements" << std::endl;
    }

    // Check whether the trigger stream is available
    if (capture.isTriggerStreamAvailable()) {
        // Print the trigger stream capability
        std::cout << "* Triggers" << std::endl;
    }

    return 0;
}

Configuring camera options

Some advanced properties of our cameras can be configured using a number of functions. They are listed here for reference; please check the detailed API documentation for more information.

DVXplorer camera advanced control functions:

#include <dv-processing/io/camera_capture.hpp>

int main() {
    // Open a DVXplorer camera
    dv::io::CameraCapture capture("", dv::io::CameraCapture::CameraType::DVS);

    // Configure event sensitivity to default. Other sensitivities available: VeryLow, Low, High, VeryHigh
    capture.setDVSBiasSensitivity(dv::io::CameraCapture::BiasSensitivity::Default);

    // Configure event-frame readouts per second (here variable 5000 FPS, the default value)
    // See detailed API documentation for other available values
    capture.setDVXplorerEFPS(dv::io::CameraCapture::DVXeFPS::EFPS_VARIABLE_5000);

    // Enable global hold setting (already the default)
    capture.setDVSGlobalHold(true);
    // Disable global reset setting (already the default)
    capture.setDVXplorerGlobalReset(false);

    return 0;
}

Read more about DVXplorer biases on our documentation page, and find specific details about the eFPS implementation in the dv-processing source code.

Note

On DVXplorer, setting global hold to false can help in certain applications that observe repeating patterns, such as flickering LEDs.
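
For instance, to apply the tip from this note on an already opened DVXplorer capture (a small sketch reusing the setDVSGlobalHold() call from the sample above):

// Disable global hold when observing repeating patterns such as flickering LEDs
capture.setDVSGlobalHold(false);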

DAVIS camera advanced control functions:

  • General options:

    #include <dv-processing/io/camera_capture.hpp>

    int main() {
        // Open a Davis camera
        dv::io::CameraCapture capture("", dv::io::CameraCapture::CameraType::DAVIS);

        // Setting camera readout to events and frames (default). Other modes available: EventsOnly, FramesOnly
        capture.setDavisReadoutMode(dv::io::CameraCapture::DavisReadoutMode::EventsAndFrames);
        // Configure frame output mode to color (default), only on COLOR cameras. Other mode available: Grayscale
        capture.setDavisColorMode(dv::io::CameraCapture::DavisColorMode::Color);

        return 0;
    }
    
  • Frame options:

    #include <dv-processing/io/camera_capture.hpp>

    #include <chrono>

    int main() {
        using namespace std::chrono_literals;

        // Open a Davis camera
        dv::io::CameraCapture capture("", dv::io::CameraCapture::CameraType::DAVIS);

        // Enable frame auto-exposure (default behavior)
        capture.enableDavisAutoExposure();
        // Disable auto-exposure, set frame exposure (here 10ms)
        capture.setDavisExposureDuration(dv::Duration(10ms));
        // Read current frame exposure duration value
        std::optional<dv::Duration> duration = capture.getDavisExposureDuration();
        // Set frame interval duration (here 33ms for ~30FPS)
        capture.setDavisFrameInterval(dv::Duration(33ms));
        // Read current frame interval duration value
        std::optional<dv::Duration> interval = capture.getDavisFrameInterval();

        return 0;
    }
    
  • Event options (biases):

    Warning

    Before using biases, make sure that you absolutely need to change them and that you understand them by reading about biases on our documentation page.

    #include <dv-processing/io/camera_capture.hpp>

    int main() {
        // Open a Davis camera
        dv::io::CameraCapture capture("", dv::io::CameraCapture::CameraType::DAVIS);

        /// Access biases raw value
        // Photoreceptor bias
        uint16_t defaultPrBpInt = capture.deviceConfigGet(DAVIS_CONFIG_BIAS, DAVIS346_CONFIG_BIAS_PRBP);
        // Source follower bias
        uint16_t defaultPrSfBpInt = capture.deviceConfigGet(DAVIS_CONFIG_BIAS, DAVIS346_CONFIG_BIAS_PRSFBP);
        // Differential bias
        uint16_t defaultDiffBnInt = capture.deviceConfigGet(DAVIS_CONFIG_BIAS, DAVIS346_CONFIG_BIAS_DIFFBN);
        // On threshold bias
        uint16_t defaultOnBnInt = capture.deviceConfigGet(DAVIS_CONFIG_BIAS, DAVIS346_CONFIG_BIAS_ONBN);
        // Off threshold bias
        uint16_t defaultOffBnInt = capture.deviceConfigGet(DAVIS_CONFIG_BIAS, DAVIS346_CONFIG_BIAS_OFFBN);
        // Refractory period bias
        uint16_t defaultRefrBpInt = capture.deviceConfigGet(DAVIS_CONFIG_BIAS, DAVIS346_CONFIG_BIAS_REFRBP);

        /// Change biases values
        // Convert bias integer to values
        caer_bias_coarsefine coarseFinePrBp   = caerBiasCoarseFineParse(defaultPrBpInt);
        caer_bias_coarsefine coarseFinePrSfBp = caerBiasCoarseFineParse(defaultPrSfBpInt);
        caer_bias_coarsefine coarseFineDiffBn = caerBiasCoarseFineParse(defaultDiffBnInt);
        caer_bias_coarsefine coarseFineOnBn   = caerBiasCoarseFineParse(defaultOnBnInt);
        caer_bias_coarsefine coarseFineOffBn  = caerBiasCoarseFineParse(defaultOffBnInt);
        caer_bias_coarsefine coarseFineRefrBp = caerBiasCoarseFineParse(defaultRefrBpInt);
        // For example here, add 1 on the log-scale coarse value, i.e. multiply bias value by 10 (approximately)
        coarseFinePrBp.coarseValue   += 1;
        coarseFinePrSfBp.coarseValue += 1;
        coarseFineDiffBn.coarseValue += 1;
        coarseFineOnBn.coarseValue   += 1;
        coarseFineOffBn.coarseValue  += 1;
        coarseFineRefrBp.coarseValue += 1;
        // Convert back
        const uint16_t newPrBp   = caerBiasCoarseFineGenerate(coarseFinePrBp);
        const uint16_t newPrSfBp = caerBiasCoarseFineGenerate(coarseFinePrSfBp);
        const uint16_t newDiffBn = caerBiasCoarseFineGenerate(coarseFineDiffBn);
        const uint16_t newOnBn   = caerBiasCoarseFineGenerate(coarseFineOnBn);
        const uint16_t newOffBn  = caerBiasCoarseFineGenerate(coarseFineOffBn);
        const uint16_t newRefrBp = caerBiasCoarseFineGenerate(coarseFineRefrBp);

        /// Set biases raw value
        // Setting photoreceptor bias
        capture.deviceConfigSet(DAVIS_CONFIG_BIAS, DAVIS346_CONFIG_BIAS_PRBP, newPrBp);
        // Setting source follower bias
        capture.deviceConfigSet(DAVIS_CONFIG_BIAS, DAVIS346_CONFIG_BIAS_PRSFBP, newPrSfBp);
        // Setting differential bias
        capture.deviceConfigSet(DAVIS_CONFIG_BIAS, DAVIS346_CONFIG_BIAS_DIFFBN, newDiffBn);
        // Setting on threshold bias
        capture.deviceConfigSet(DAVIS_CONFIG_BIAS, DAVIS346_CONFIG_BIAS_ONBN, newOnBn);
        // Setting off threshold bias
        capture.deviceConfigSet(DAVIS_CONFIG_BIAS, DAVIS346_CONFIG_BIAS_OFFBN, newOffBn);
        // Setting refractory period bias
        capture.deviceConfigSet(DAVIS_CONFIG_BIAS, DAVIS346_CONFIG_BIAS_REFRBP, newRefrBp);

        return 0;
    }
    

Read events from a live camera

Incoming event data from a camera can be read sequentially using the dv::io::CameraCapture::getNextEventBatch() method. The following is a minimal sample showing how to read events sequentially from a camera:

#include <dv-processing/io/camera_capture.hpp>

#include <chrono>
#include <iostream>
#include <thread>

int main() {
    using namespace std::chrono_literals;

    // Open any camera
    dv::io::CameraCapture capture;

    // Run the loop while camera is still connected
    while (capture.isRunning()) {
        // Read batch of events, check whether received data is correct.
        // The method does not wait for data to arrive, it returns immediately with
        // the latest available data or, if no data is available, returns a `std::nullopt`.
        if (const auto events = capture.getNextEventBatch(); events.has_value()) {
            // Print received packet information
            std::cout << *events << std::endl;
        }
        else {
            // No data has arrived yet, short sleep to reduce CPU load.
            std::this_thread::sleep_for(1ms);
        }
    }

    return 0;
}

Read frames from a live camera

Incoming frames from a camera can be read sequentially, frame by frame, using the dv::io::CameraCapture::getNextFrame() method. The following is a minimal sample showing how to read frames sequentially from a camera:

#include <dv-processing/io/camera_capture.hpp>

#include <opencv2/highgui.hpp>

#include <iostream>

int main() {
    // Open any camera
    dv::io::CameraCapture capture;

    // Initiate a preview window
    cv::namedWindow("Preview", cv::WINDOW_NORMAL);

    // Run the loop while camera is still connected
    while (capture.isRunning()) {
        // Read a frame, check whether it is correct.
        // The method does not wait for a frame to arrive, it returns immediately with
        // the latest available frame or, if no data is available, returns a `std::nullopt`.
        if (const auto frame = capture.getNextFrame(); frame.has_value()) {
            std::cout << *frame << std::endl;

            // Show a preview of the image
            cv::imshow("Preview", frame->image);
        }
        cv::waitKey(2);
    }

    return 0;
}

Read IMU data from a live camera

Incoming IMU data from a camera can be read sequentially using the dv::io::CameraCapture::getNextImuBatch() method. The following is a minimal sample showing how to read IMU data sequentially from a camera:

#include <dv-processing/io/camera_capture.hpp>

#include <chrono>
#include <iostream>
#include <thread>

int main() {
    using namespace std::chrono_literals;

    // Open any camera
    dv::io::CameraCapture capture;

    // Run the loop while camera is still connected
    while (capture.isRunning()) {
        // Read IMU measurement batch, check whether it is correct.
        // The method does not wait for data to arrive, it returns immediately with
        // the latest available IMU data or, if no data is available, returns a `std::nullopt`.
        if (const auto imuBatch = capture.getNextImuBatch(); imuBatch.has_value() && !imuBatch->empty()) {
            std::cout << "Received " << imuBatch->size() << " IMU measurements" << std::endl;
        }
        else {
            // No data has arrived yet, short sleep to reduce CPU load.
            std::this_thread::sleep_for(1ms);
        }
    }

    return 0;
}

Read triggers from a live camera

Note

To understand what triggers are and where they come from, read more about them on our documentation page.

Incoming trigger data from a camera can be read sequentially using the dv::io::CameraCapture::getNextTriggerBatch() method. The following is a minimal sample showing how to read trigger data sequentially from a camera:

#include <dv-processing/io/camera_capture.hpp>

#include <chrono>
#include <iostream>
#include <thread>

int main() {
    using namespace std::chrono_literals;

    // Open any camera
    dv::io::CameraCapture capture;

    // Depending on the incoming signal, enable the detection of the desired type of pattern, here we enable everything.
    // Note: In the following variables, replace 'DVX' with 'DAVIS_CONFIG' in case the device used is a DAVIS.
    // Enable rising edge detection
    capture.deviceConfigSet(DVX_EXTINPUT, DVX_EXTINPUT_DETECT_RISING_EDGES, true);
    // Enable falling edge detection
    capture.deviceConfigSet(DVX_EXTINPUT, DVX_EXTINPUT_DETECT_FALLING_EDGES, true);
    // Enable pulse detection
    capture.deviceConfigSet(DVX_EXTINPUT, DVX_EXTINPUT_DETECT_PULSES, true);
    // Enable detector
    capture.deviceConfigSet(DVX_EXTINPUT, DVX_EXTINPUT_RUN_DETECTOR, true);

    // Run the loop while camera is still connected
    while (capture.isRunning()) {
        // Read trigger batch, check whether it is correct.
        // The method does not wait for data to arrive, it returns immediately with
        // the latest available data or, if no data is available, returns a `std::nullopt`.
        if (const auto triggers = capture.getNextTriggerBatch(); triggers.has_value() && !triggers->empty()) {
            std::cout << "Received " << triggers->size() << " Triggers" << std::endl;
        }
        else {
            // No data has arrived yet, short sleep to reduce CPU load.
            std::this_thread::sleep_for(1ms);
        }
    }

    return 0;
}

Sample application - reading data from a live camera

An application reading multiple types of data from a live camera can be found among the code samples in the source code repository of the dv-processing library.
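
The repository sample itself is not reproduced here. As an illustration, a minimal sketch that combines the per-stream reads shown above into a single polling loop could look as follows:

#include <dv-processing/io/camera_capture.hpp>

#include <chrono>
#include <iostream>
#include <thread>

int main() {
    using namespace std::chrono_literals;

    // Open any camera detected in the system
    dv::io::CameraCapture capture;

    // Check once which streams this camera provides (e.g. DVXplorer cameras have no frame stream)
    const bool hasFrames   = capture.isFrameStreamAvailable();
    const bool hasImu      = capture.isImuStreamAvailable();
    const bool hasTriggers = capture.isTriggerStreamAvailable();

    // Run the loop while the camera is still connected
    while (capture.isRunning()) {
        // Track whether any data arrived in this iteration
        bool received = false;

        // Events: print packet information if a batch is available
        if (const auto events = capture.getNextEventBatch(); events.has_value()) {
            std::cout << *events << std::endl;
            received = true;
        }

        // Frames: print frame information if a frame is available
        if (hasFrames) {
            if (const auto frame = capture.getNextFrame(); frame.has_value()) {
                std::cout << *frame << std::endl;
                received = true;
            }
        }

        // IMU: print the number of measurements if a batch is available
        if (hasImu) {
            if (const auto imuBatch = capture.getNextImuBatch(); imuBatch.has_value() && !imuBatch->empty()) {
                std::cout << "Received " << imuBatch->size() << " IMU measurements" << std::endl;
                received = true;
            }
        }

        // Triggers: print the number of triggers if a batch is available
        if (hasTriggers) {
            if (const auto triggers = capture.getNextTriggerBatch(); triggers.has_value() && !triggers->empty()) {
                std::cout << "Received " << triggers->size() << " triggers" << std::endl;
                received = true;
            }
        }

        // If nothing arrived, short sleep to reduce CPU load
        if (!received) {
            std::this_thread::sleep_for(1ms);
        }
    }

    return 0;
}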

From a file

Data from iniVation cameras is usually recorded using the AEDAT4 file format. The dv-processing library provides tools for reading such files. This section contains explanations and samples showing how data can be read from AEDAT4 files. More detailed information on the AEDAT4 file format can be found here.

Inspecting AEDAT4 files

The AEDAT4 file format supports recording multiple data streams into a single file; recordings from multiple cameras are also supported. The library provides a command-line utility, dv-filestat, for inspecting AEDAT4 files; it reports the streams recorded in a file.

The utility provides information on stream sizes, timestamps, and durations. More information about the utility can be found here.
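
A typical invocation passes the path of a recording file as the argument (shown here as an assumed usage; the printed output depends on the contents of the file and is omitted):

$ dv-filestat path/to/file.aedat4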

Opening a file

AEDAT4 files can be opened and read using the dv::io::MonoCameraRecording class. This class assumes that the recording was made using a single camera. The following is a minimal code sample that opens a recording and prints information about it.

A file can be opened by providing its path in the filesystem:

#include <dv-processing/io/mono_camera_recording.hpp>

#include <iostream>

int main() {
    // Open a file
    dv::io::MonoCameraRecording reader("path/to/file.aedat4");

    // Get and print the name of the camera that the data was recorded from
    std::cout << "Opened an AEDAT4 file which contains data from [" << reader.getCameraName() << "] camera"
              << std::endl;

    return 0;
}

Checking available streams

Data recordings might contain various data streams. The dv::io::MonoCameraRecording class provides easy-to-use methods to inspect which data streams are available. The following sample code shows how to check for the existence of various data streams:

#include <dv-processing/io/mono_camera_recording.hpp>

#include <iostream>

int main() {
    // Store the file path
    const std::string pathToFile = "path/to/file.aedat4";

    // Open a file
    dv::io::MonoCameraRecording reader(pathToFile);

    // Print file path and camera name
    std::cout << "Available streams in [" << pathToFile << "]:" << std::endl;

    // Check if event stream is available
    if (reader.isEventStreamAvailable()) {
        // Check the resolution of event stream. Since the getEventResolution() method returns
        // a std::optional, we use the * operator to get the value. The method returns std::nullopt
        // only in case the stream is unavailable, which is already checked.
        const cv::Size resolution = *reader.getEventResolution();

        // Print that the stream is present and its resolution
        std::cout << "  * Event stream with resolution " << resolution << std::endl;
    }

    // Check if frame stream is available
    if (reader.isFrameStreamAvailable()) {
        // Check the resolution of frame stream. Since the getFrameResolution() method returns
        // a std::optional, we use the * operator to get the value. The method returns std::nullopt
        // only in case the stream is unavailable, which is already checked.
        const cv::Size resolution = *reader.getFrameResolution();

        // Print that the stream is available and its resolution
        std::cout << "  * Frame stream with resolution " << resolution << std::endl;
    }

    // Check if IMU stream is available
    if (reader.isImuStreamAvailable()) {
        // Print that the IMU stream is available
        std::cout << "  * IMU stream" << std::endl;
    }

    // Check if trigger stream is available
    if (reader.isTriggerStreamAvailable()) {
        // Print that the trigger stream is available
        std::cout << "  * Trigger stream " << std::endl;
    }

    return 0;
}

The dv-processing library also supports recordings of other data types. Type-agnostic access is available through templated methods in C++, while Python only provides a limited set of named methods (since templating is unavailable in Python). The following sample shows the use of the generic method for checking the availability of a stream with a given name and type:

#include <dv-processing/io/mono_camera_recording.hpp>

#include <iostream>

int main() {
    // Open a file
    dv::io::MonoCameraRecording reader("path/to/file.aedat4");

    // Store some stream name
    const std::string streamName = "poses";

    // Check if such a stream name is available and validate the data type of this stream
    if (reader.isStreamAvailable(streamName) && reader.isStreamOfDataType<dv::Pose>(streamName)) {
        std::cout << "The file contains a stream named [" << streamName << "] and of data type [dv::Pose]" << std::endl;
    }

    return 0;
}

Read events from a file

The following sample reads events in batches while the stream has data available. While reading from a file, dv::io::MonoCameraRecording::getNextEventBatch() returns data until the end of the stream is reached; the dv::io::MonoCameraRecording::isRunning() method returns false once the end is reached.

#include <dv-processing/io/mono_camera_recording.hpp>

#include <iostream>

int main() {
    // Open a file
    dv::io::MonoCameraRecording reader("path/to/file.aedat4");

    // Run the loop while data is available
    while (reader.isRunning()) {
        // Read batch of events, check whether received data is correct.
        if (const auto events = reader.getNextEventBatch(); events.has_value()) {
            // Print received event packet information
            std::cout << *events << std::endl;
        }
    }

    return 0;
}

Read frames from a file

The following sample reads frames one by one while the stream has data available. While reading from a file, dv::io::MonoCameraRecording::getNextFrame() returns frames until the end of the stream is reached; the dv::io::MonoCameraRecording::isRunning() method returns false once the end is reached.

#include <dv-processing/io/mono_camera_recording.hpp>

#include <opencv2/highgui.hpp>

#include <iostream>

int main() {
    // Open a file
    dv::io::MonoCameraRecording reader("path/to/file.aedat4");

    // Initiate a preview window
    cv::namedWindow("Preview", cv::WINDOW_NORMAL);

    // Variable to store the previous frame timestamp for correct playback
    std::optional<int64_t> lastTimestamp = std::nullopt;

    // Run the loop while data is available
    while (reader.isRunning()) {
        // Read a frame, check whether it is correct.
        // The method does not wait for a frame to arrive, it returns immediately with
        // the latest available frame or, if no data is available, returns a `std::nullopt`.
        if (const auto frame = reader.getNextFrame(); frame.has_value()) {
            // Print information about received frame
            std::cout << *frame << std::endl;

            // Show a preview of the image
            cv::imshow("Preview", frame->image);

            // Calculate the delay between last and current frame, divide by 1000 to convert microseconds
            // to milliseconds
            const int delay = lastTimestamp.has_value() ? (frame->timestamp - *lastTimestamp) / 1000 : 2;

            // Perform the sleep
            cv::waitKey(delay);

            // Store timestamp for the next frame
            lastTimestamp = frame->timestamp;
        }
    }

    return 0;
}

Read IMU data from a file

The following sample reads IMU data in batches while the stream has data available. While reading from a file, dv::io::MonoCameraRecording::getNextImuBatch() returns IMU measurement batches until the end of the stream is reached; the dv::io::MonoCameraRecording::isRunning() method returns false once the end is reached.

#include <dv-processing/io/mono_camera_recording.hpp>

#include <iostream>

int main() {
    // Open a file
    dv::io::MonoCameraRecording reader("path/to/file.aedat4");

    // Run the loop while data is available
    while (reader.isRunning()) {
        // Read IMU measurement batch, check whether it is correct.
        if (const auto imuBatch = reader.getNextImuBatch(); imuBatch.has_value() && !imuBatch->empty()) {
            // Print IMU batch information
            std::cout << "Received " << imuBatch->size() << " IMU measurements" << std::endl;
        }
    }

    return 0;
}

Read triggers from a file

The following sample reads triggers in batches while the stream has data available. While reading from a file, dv::io::MonoCameraRecording::getNextTriggerBatch() returns trigger batches until the end of the stream is reached; the dv::io::MonoCameraRecording::isRunning() method returns false once the end is reached.

#include <dv-processing/io/mono_camera_recording.hpp>

#include <iostream>

int main() {
    // Open a file
    dv::io::MonoCameraRecording reader("path/to/file.aedat4");

    // Run the loop while data is available
    while (reader.isRunning()) {
        // Read trigger batch, check whether it is correct.
        if (const auto triggers = reader.getNextTriggerBatch(); triggers.has_value() && !triggers->empty()) {
            // Print the trigger batch information
            std::cout << "Received " << triggers->size() << " triggers" << std::endl;
        }
    }

    return 0;
}

[Advanced] Reading custom data types

The previous samples show how to use named functions to read different data types. The C++ API provides templated methods to read any type of data. Below is a sample that shows how to read data using the generic templated API:

Note

Since templated methods are only available in C++, the generic reading methods are only available in the C++ API.

#include <dv-processing/data/timed_keypoint_base.hpp>
#include <dv-processing/io/mono_camera_recording.hpp>

#include <iostream>

int main() {
    // Open a file
    dv::io::MonoCameraRecording reader("path/to/file.aedat4");

    // Define and contain a stream name in a variable
    const std::string stream = "keypoints";

    // Check whether a timed-keypoint stream is available
    if (!reader.isStreamAvailable(stream) || !reader.isStreamOfDataType<dv::TimedKeyPointPacket>(stream)) {
        throw dv::exceptions::RuntimeError("Stream named 'keypoints' not found");
    }

    // Run the loop while data is available
    while (reader.isRunning()) {
        // Read timed keypoint batch, check whether it is correct.
        if (const auto keypoints = reader.getNextStreamPacket<dv::TimedKeyPointPacket>(stream); keypoints.has_value()) {
            // Print the number of keypoints read
            std::cout << "Read " << keypoints->elements.size() << " timed keypoints" << std::endl;
        }
    }

    return 0;
}

Sample application - reading data from a recorded AEDAT4 file

An application reading multiple types of data from an AEDAT4 file can be found in the source code repository of the dv-processing library.
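
The repository sample is not reproduced here. As an illustration, a minimal sketch that combines the per-stream reads shown above for a single recording could look as follows:

#include <dv-processing/io/mono_camera_recording.hpp>

#include <iostream>

int main() {
    // Open a file
    dv::io::MonoCameraRecording reader("path/to/file.aedat4");

    // Check once which streams the recording contains
    const bool hasEvents   = reader.isEventStreamAvailable();
    const bool hasFrames   = reader.isFrameStreamAvailable();
    const bool hasImu      = reader.isImuStreamAvailable();
    const bool hasTriggers = reader.isTriggerStreamAvailable();

    // Run the loop while data is available
    while (reader.isRunning()) {
        // Events: print packet information if a batch is available
        if (hasEvents) {
            if (const auto events = reader.getNextEventBatch(); events.has_value()) {
                std::cout << *events << std::endl;
            }
        }

        // Frames: print frame information if a frame is available
        if (hasFrames) {
            if (const auto frame = reader.getNextFrame(); frame.has_value()) {
                std::cout << *frame << std::endl;
            }
        }

        // IMU: print the number of measurements if a batch is available
        if (hasImu) {
            if (const auto imuBatch = reader.getNextImuBatch(); imuBatch.has_value() && !imuBatch->empty()) {
                std::cout << "Received " << imuBatch->size() << " IMU measurements" << std::endl;
            }
        }

        // Triggers: print the number of triggers if a batch is available
        if (hasTriggers) {
            if (const auto triggers = reader.getNextTriggerBatch(); triggers.has_value() && !triggers->empty()) {
                std::cout << "Received " << triggers->size() << " triggers" << std::endl;
            }
        }
    }

    return 0;
}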