Reading camera data
The dv-processing library provides a convenient way of reading camera data, both from live, connected cameras and from recorded files.
Camera name
The camera name is used to identify a unique camera produced by iniVation. It consists of the camera model and a serial number, concatenated with an underscore (“_”) character. The library refers to the camera name in multiple methods; the value can be used consistently across the library.
Some examples of camera names:
DVXplorer: DVXplorer_DXA00093, DVXplorer_DXM00123
DAVIS: DAVIS346_00000499
Note
This definition applies to USB cameras. The camera name is also reported by network streaming sources; in that case it can be set manually by the developer, so the naming convention for the models might not be followed exactly.
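Because the convention is simply the model and serial number joined by a single underscore, the two parts can be recovered with a plain string split. The following is a minimal sketch (not part of the library API) that assumes the underscore convention holds, which, as noted above, may not be the case for manually named network sources:

#include <iostream>
#include <string>

int main() {
	// Hypothetical camera name following the "model_serial" convention
	const std::string cameraName = "DVXplorer_DXA00093";

	// Split at the underscore separating model and serial number
	const auto separator = cameraName.find('_');
	const std::string model  = cameraName.substr(0, separator);
	const std::string serial = cameraName.substr(separator + 1);

	std::cout << "Model: " << model << ", serial number: " << serial << std::endl;

	return 0;
}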
From a camera
The easiest way to access data from a live camera is to use the dv::io::camera::open() functions. This section provides in-depth explanations of their usage, along with code samples.
Discover connected cameras
The command-line utility dv-list-devices is shipped with the dv-processing packages and shows information about the connected cameras. Sample output of the utility:
$ dv-list-devices
Device discovery: found 2 devices.
Detected device [DAVIS 00000499]
Detected device [DVXPLORER DXA00093]
Note
This utility reports the device type and serial number, not the full camera name described above. The getCameraName() function provides the full name once a device has been opened.
Device discovery is also possible using library methods. The following sample shows how to detect connected devices using the discovery method:
#include <dv-processing/io/camera/discovery.hpp>

#include <iostream>

int main() {
	// Call the discovery method
	const auto cameras = dv::io::camera::discover();

	std::cout << "Device discovery: found " << cameras.size() << " devices." << std::endl;

	// Loop through detected camera names and print them
	for (const auto &camera : cameras) {
		std::cout << "Detected device [" << camera.cameraModel << " " << camera.serialNumber << "]" << std::endl;
	}

	return 0;
}
import dv_processing as dv

cameras = dv.io.camera.discover()

print(f"Device discovery: found {len(cameras)} devices.")
for camera_name in cameras:
    print(f"Detected device [{camera_name}]")
Opening a camera
The dv::io::camera::DVS128, dv::io::camera::DAVIS, dv::io::camera::DVXplorer and dv::io::camera::DVXplorerM classes follow the RAII pattern for resource management. Creating an instance of one of these classes opens the camera connected over USB and starts reading data immediately; the camera is shut down when the object instance is destroyed. These classes allow full interaction with the cameras, exposing all available functionality.
It is also possible to open cameras in a more generic way, using the dv::io::camera::open() functions. For cameras with multi-camera synchronization support, you can also use the dv::io::camera::openSync() functions. This gives access to a set of features common to all cameras. For access to all features, you will have to cast the generic camera to the class representing its full type.
// Open the first camera found
auto capture = dv::io::camera::open();
import dv_processing as dv
# Open the first camera found
capture = dv.io.camera.open()
It is also possible to open a specific camera on the system by providing the serial number:
// Open the specified camera
auto capture = dv::io::camera::open("DXA00093");
import dv_processing as dv
# Open the specified camera
capture = dv.io.camera.open("DXA00093")
To access all features, it’s easiest to open the camera using its full class:
// Open any DAVIS camera (camera name not specified)
auto capture = dv::io::camera::DAVIS();
import dv_processing as dv
# Open any DAVIS camera (camera name not specified)
capture = dv.io.camera.DAVIS()
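If a camera was already opened through the generic dv::io::camera::open() call, its full type can instead be recovered by casting the returned object. The following is a minimal sketch, assuming open() returns a smart pointer to dv::io::camera::CameraInputBase and that the connected device is a DVXplorer; adjust the target class and the cast to the pointer type actually returned:

#include <dv-processing/io/camera/discovery.hpp>
#include <dv-processing/io/camera/dvxplorer.hpp>

#include <iostream>

int main() {
	// Open the first camera found through the generic interface
	auto capture = dv::io::camera::open();

	// Attempt to recover the full DVXplorer type from the generic handle.
	// dynamic_cast returns nullptr if the connected camera is not a DVXplorer.
	if (auto *dvxplorer = dynamic_cast<dv::io::camera::DVXplorer *>(capture.get())) {
		// All DVXplorer-specific functionality is now accessible
		dvxplorer->setGlobalHold(true);
	}
	else {
		std::cout << "The connected camera is not a DVXplorer" << std::endl;
	}

	return 0;
}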
Checking camera capabilities
Since different cameras provide different data types and capabilities, the base class dv::io::camera::CameraInputBase provides methods to test what data a camera can provide:
#include <dv-processing/io/camera/discovery.hpp>

#include <iostream>

int main() {
	// Open any camera
	auto capture = dv::io::camera::open();

	// Print the camera name
	std::cout << "Opened [" << capture->getCameraName() << "] camera, it provides:" << std::endl;

	// Check whether event stream is available
	if (capture->isEventStreamAvailable()) {
		// Get the event stream resolution, the output is a std::optional, so the value() method is
		// used to get the actual resolution value
		const cv::Size resolution = capture->getEventResolution().value();

		// Print the event stream capability with resolution value
		std::cout << "* Events at " << resolution << " resolution" << std::endl;
	}

	// Check whether frame stream is available
	if (capture->isFrameStreamAvailable()) {
		// Get the frame stream resolution
		const cv::Size resolution = capture->getFrameResolution().value();

		// Print the frame stream capability with resolution value
		std::cout << "* Frames at " << resolution << " resolution" << std::endl;
	}

	// Check whether the IMU stream is available
	if (capture->isImuStreamAvailable()) {
		// Print the imu data stream capability
		std::cout << "* IMU measurements" << std::endl;
	}

	// Check whether the trigger stream is available
	if (capture->isTriggerStreamAvailable()) {
		// Print the trigger stream capability
		std::cout << "* Triggers" << std::endl;
	}

	return 0;
}
import dv_processing as dv

# Open any camera
capture = dv.io.camera.open()

# Print the camera name
print(f"Opened [{capture.getCameraName()}] camera, it provides:")

# Check whether event stream is available
if capture.isEventStreamAvailable():
    # Get the event stream resolution
    resolution = capture.getEventResolution()

    # Print the event stream capability with resolution value
    print(f"* Events at ({resolution[0]}x{resolution[1]}) resolution")

# Check whether frame stream is available
if capture.isFrameStreamAvailable():
    # Get the frame stream resolution
    resolution = capture.getFrameResolution()

    # Print the frame stream capability with resolution value
    print(f"* Frames at ({resolution[0]}x{resolution[1]}) resolution")

# Check whether the IMU stream is available
if capture.isImuStreamAvailable():
    # Print the imu data stream capability
    print("* IMU measurements")

# Check whether the trigger stream is available
if capture.isTriggerStreamAvailable():
    # Print the trigger stream capability
    print("* Triggers")
Configuring camera options
Some advanced properties of our cameras can be configured through a number of functions. A few are listed here for reference; please check the detailed API documentation for more details.
DVXplorer camera advanced control options
Sensitivity and eFPS
#include <dv-processing/io/camera/dvxplorer.hpp>

int main() {
	// Open a DVXplorer camera
	dv::io::camera::DVXplorer capture{};

	// Configure event sensitivity. Range 0-17, default 9.
	capture.setContrastThresholdOn(9);
	capture.setContrastThresholdOff(9);

	// Configure event-frame readouts per second (here variable 5000 FPS, the default value)
	// See detailed API documentation for other available values
	capture.setReadoutFPS(dv::io::camera::DVXplorer::ReadoutFPS::VARIABLE_5000);

	// Enable global hold setting (already the default)
	capture.setGlobalHold(true);
	// Disable global reset setting (already the default)
	capture.setGlobalReset(false);

	return 0;
}
import dv_processing as dv

# Open a DVXplorer camera
capture = dv.io.camera.DVXplorer()

# Configure event sensitivity. Range 0-17, default 9.
capture.setContrastThresholdOn(9)
capture.setContrastThresholdOff(9)

# Configure event-frame readouts per second (here variable 5000 FPS, the default value)
# See detailed API documentation for other available values
capture.setReadoutFPS(dv.io.camera.DVXplorer.ReadoutFPS.VARIABLE_5000)

# Enable global hold setting (already the default)
capture.setGlobalHold(True)
# Disable global reset setting (already the default)
capture.setGlobalReset(False)
Read more about DVXplorer biases and details of the ReadoutFPS implementation on our documentation page.
Note
On DVXplorer, setting global hold to false can help in certain applications that observe repeating patterns, such as flickering LEDs.
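As a minimal sketch of that note, the global hold setting shown in the sample above can simply be flipped when observing such patterns:

#include <dv-processing/io/camera/dvxplorer.hpp>

int main() {
	// Open a DVXplorer camera
	dv::io::camera::DVXplorer capture{};

	// Disable global hold when observing repeating patterns such as flickering LEDs
	capture.setGlobalHold(false);

	return 0;
}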
Crop events to a region of interest
#include <dv-processing/io/camera/dvxplorer.hpp>

int main() {
	// Open a DVXplorer camera
	dv::io::camera::DVXplorer capture{};

	// Set ROI area, automatically takes effect
	capture.setCropArea({0, 0, 200, 200});

	return 0;
}
import dv_processing as dv

# Open a DVXplorer camera
capture = dv.io.camera.DVXplorer()

# Set ROI area, automatically takes effect
capture.setCropArea((0, 0, 200, 200))
DAVIS camera advanced control options
General options
#include <dv-processing/io/camera/davis.hpp>

int main() {
	// Open a Davis camera
	dv::io::camera::DAVIS capture{};

	// Setting camera readout to events and frames (default).
	capture.setEventsRunning(true);
	capture.setFramesRunning(true);

	// Configure frame output mode to color (default), only on COLOR cameras. Other mode available: GRAYSCALE
	capture.setColorMode(dv::io::camera::parser::DAVIS::ColorMode::DEFAULT);

	return 0;
}
import dv_processing as dv

# Open a Davis camera
capture = dv.io.camera.DAVIS()

# Setting camera readout to events and frames (default).
capture.setEventsRunning(True)
capture.setFramesRunning(True)

# Configure frame output mode to color (default), only on COLOR cameras. Other mode available: GRAYSCALE
capture.setColorMode(dv.io.camera.DAVIS.ColorMode.DEFAULT)
Frame options
#include <dv-processing/io/camera/davis.hpp>

#include <chrono>

int main() {
	using namespace std::chrono_literals;

	// Open a Davis camera
	dv::io::camera::DAVIS capture{};

	// Enable frame auto-exposure (default behavior)
	capture.setAutoExposure(true);

	// Disable auto-exposure, set frame exposure (here 10ms)
	capture.setAutoExposure(false);
	capture.setExposureDuration(10ms);

	// Read current frame exposure duration value
	std::chrono::microseconds duration = capture.getExposureDuration();

	// Set frame interval duration (here 33ms for ~30FPS)
	capture.setFrameInterval(33ms);

	// Read current frame interval duration value
	std::chrono::microseconds interval = capture.getFrameInterval();

	return 0;
}
from datetime import timedelta

import dv_processing as dv

# Open a Davis camera
capture = dv.io.camera.DAVIS()

# Enable frame auto-exposure (default behavior)
capture.setAutoExposure(True)
# Disable auto-exposure, set frame exposure (here 10ms)
capture.setAutoExposure(False)
capture.setExposureDuration(timedelta(milliseconds=10))
# Read current frame exposure duration value
duration = capture.getExposureDuration()
# Set frame interval duration (here 33ms for ~30FPS)
capture.setFrameInterval(timedelta(milliseconds=33))
# Read current frame interval duration value
interval = capture.getFrameInterval()
Event options (biases)
Warning
Before using biases, make sure that you absolutely need to change them and that you understand them by reading about biases on our documentation page.
#include <dv-processing/io/camera/davis.hpp>

int main() {
	// Open a Davis346 camera
	dv::io::camera::DAVIS capture{};

	/// Access biases raw value
	// Photoreceptor bias
	auto biasPhotoreceptor = capture.getDavis346BiasCoarseFine(dv::io::camera::DAVIS::Davis346BiasCF::Photoreceptor);
	// Source follower bias
	auto biasPhotoreceptorSourceFollower
		= capture.getDavis346BiasCoarseFine(dv::io::camera::DAVIS::Davis346BiasCF::PhotoreceptorSourceFollower);
	// Differential bias
	auto biasDiff = capture.getDavis346BiasCoarseFine(dv::io::camera::DAVIS::Davis346BiasCF::Diff);
	// On threshold bias
	auto biasOn = capture.getDavis346BiasCoarseFine(dv::io::camera::DAVIS::Davis346BiasCF::On);
	// Off threshold bias
	auto biasOff = capture.getDavis346BiasCoarseFine(dv::io::camera::DAVIS::Davis346BiasCF::Off);
	// Refractory period bias
	auto biasRefractory = capture.getDavis346BiasCoarseFine(dv::io::camera::DAVIS::Davis346BiasCF::Refractory);

	// Add 1 on the log-scale coarse value, i.e. multiply bias value by 8 (approximately)
	biasPhotoreceptor.first += 1;
	biasPhotoreceptorSourceFollower.first += 1;
	biasDiff.first += 1;
	biasOn.first += 1;
	biasOff.first += 1;
	biasRefractory.first += 1;

	/// Set biases raw value
	// Setting photoreceptor bias
	capture.setDavis346BiasCoarseFine(
		dv::io::camera::DAVIS::Davis346BiasCF::Photoreceptor, biasPhotoreceptor.first, biasPhotoreceptor.second);
	// Setting source follower bias
	capture.setDavis346BiasCoarseFine(dv::io::camera::DAVIS::Davis346BiasCF::PhotoreceptorSourceFollower,
		biasPhotoreceptorSourceFollower.first, biasPhotoreceptorSourceFollower.second);
	// Setting differential bias
	capture.setDavis346BiasCoarseFine(dv::io::camera::DAVIS::Davis346BiasCF::Diff, biasDiff.first, biasDiff.second);
	// Setting on threshold bias
	capture.setDavis346BiasCoarseFine(dv::io::camera::DAVIS::Davis346BiasCF::On, biasOn.first, biasOn.second);
	// Setting off threshold bias
	capture.setDavis346BiasCoarseFine(dv::io::camera::DAVIS::Davis346BiasCF::Off, biasOff.first, biasOff.second);
	// Setting refractory period bias
	capture.setDavis346BiasCoarseFine(
		dv::io::camera::DAVIS::Davis346BiasCF::Refractory, biasRefractory.first, biasRefractory.second);

	return 0;
}
import dv_processing as dv

# Open a Davis camera
capture = dv.io.camera.DAVIS()

# - Access biases raw value
# Photoreceptor bias
defaultPrBpInt = capture.getDavis346BiasCurrent(dv.io.camera.DAVIS.Davis346BiasCF.Photoreceptor)
# Source follower bias
defaultPrSfBpInt = capture.getDavis346BiasCurrent(dv.io.camera.DAVIS.Davis346BiasCF.PhotoreceptorSourceFollower)
# Differential bias
defaultDiffBnInt = capture.getDavis346BiasCurrent(dv.io.camera.DAVIS.Davis346BiasCF.Diff)
# On threshold bias
defaultOnBnInt = capture.getDavis346BiasCurrent(dv.io.camera.DAVIS.Davis346BiasCF.On)
# Off threshold bias
defaultOffBnInt = capture.getDavis346BiasCurrent(dv.io.camera.DAVIS.Davis346BiasCF.Off)
# Refractory period bias
defaultRefrBpInt = capture.getDavis346BiasCurrent(dv.io.camera.DAVIS.Davis346BiasCF.Refractory)

# Change bias values by 1'000'000 picoamperes (1 microampere).
defaultPrBpInt += 1000000
defaultPrSfBpInt += 1000000
defaultDiffBnInt += 1000000
defaultOnBnInt += 1000000
defaultOffBnInt += 1000000
defaultRefrBpInt += 1000000

# Set biases raw value
# Setting photoreceptor bias
capture.setDavis346BiasCurrent(dv.io.camera.DAVIS.Davis346BiasCF.Photoreceptor, defaultPrBpInt)
# Setting source follower bias
capture.setDavis346BiasCurrent(dv.io.camera.DAVIS.Davis346BiasCF.PhotoreceptorSourceFollower, defaultPrSfBpInt)
# Setting differential bias
capture.setDavis346BiasCurrent(dv.io.camera.DAVIS.Davis346BiasCF.Diff, defaultDiffBnInt)
# Setting on threshold bias
capture.setDavis346BiasCurrent(dv.io.camera.DAVIS.Davis346BiasCF.On, defaultOnBnInt)
# Setting off threshold bias
capture.setDavis346BiasCurrent(dv.io.camera.DAVIS.Davis346BiasCF.Off, defaultOffBnInt)
# Setting refractory period bias
capture.setDavis346BiasCurrent(dv.io.camera.DAVIS.Davis346BiasCF.Refractory, defaultRefrBpInt)
Crop events or frames to a region of interest
#include <dv-processing/io/camera/davis.hpp>

int main() {
	// Open a Davis camera
	dv::io::camera::DAVIS capture{};

	// Set ROI for events
	capture.setCropAreaEvents({0, 0, 200, 200});

	// Set ROI for frames
	capture.setCropAreaFrames({0, 0, 300, 150});

	return 0;
}
import dv_processing as dv

# Open a Davis camera
capture = dv.io.camera.DAVIS()

# Set ROI for events
capture.setCropAreaEvents((0, 0, 200, 200))

# Set ROI for frames
capture.setCropAreaFrames((0, 0, 300, 150))
Read events from a live camera
Incoming data from a camera can be read sequentially using dv::io::camera::CameraInputBase::getNextEventBatch(). The following is a minimal sample showing how to read events sequentially from a camera:
#include <dv-processing/io/camera/discovery.hpp>

#include <chrono>
#include <iostream>
#include <thread>

int main() {
	using namespace std::chrono_literals;

	// Open any camera
	auto capture = dv::io::camera::open();

	// Run the loop while camera is still connected
	while (capture->isRunning()) {
		// Read batch of events, check whether received data is correct.
		// The method does not wait for data to arrive, it returns immediately with
		// the latest available data or, if no data is available, returns a `std::nullopt`.
		if (const auto events = capture->getNextEventBatch(); events.has_value()) {
			// Print received packet information
			std::cout << *events << std::endl;
		}
		else {
			// No data has arrived yet, short sleep to reduce CPU load.
			std::this_thread::sleep_for(1ms);
		}
	}

	return 0;
}
import time

import dv_processing as dv

# Open any camera
capture = dv.io.camera.open()

# Run the loop while camera is still connected
while capture.isRunning():
    # Read batch of events
    events = capture.getNextEventBatch()

    # The method does not wait for data to arrive, it returns immediately with the
    # latest available data or, if no data is available, returns `None`
    if events is not None:
        # Print received packet time range
        print(f"Received events within time range [{events.getLowestTime()}; {events.getHighestTime()}]")
    else:
        # No data has arrived yet, short sleep to reduce CPU load
        time.sleep(0.001)
Read frames from a live camera
Incoming frames from a camera can be read sequentially, frame by frame, using dv::io::camera::CameraInputBase::getNextFrame(). The following is a minimal sample showing how to read frames sequentially from a camera:
#include <dv-processing/io/camera/discovery.hpp>

#include <opencv2/highgui.hpp>

#include <iostream>

int main() {
	// Open any camera
	auto capture = dv::io::camera::open();

	// Initiate a preview window
	cv::namedWindow("Preview", cv::WINDOW_NORMAL);

	// Run the loop while camera is still connected
	while (capture->isRunning()) {
		// Read a frame, check whether it is correct.
		// The method does not wait for a frame to arrive, it returns immediately with
		// the latest available frame or, if no data is available, returns a `std::nullopt`.
		if (const auto frame = capture->getNextFrame(); frame.has_value()) {
			std::cout << *frame << std::endl;

			// Show a preview of the image
			cv::imshow("Preview", frame->image);
		}
		cv::waitKey(2);
	}

	return 0;
}
import cv2 as cv
import dv_processing as dv

# Open any camera
capture = dv.io.camera.open()

# Initiate a preview window
cv.namedWindow("Preview", cv.WINDOW_NORMAL)

# Run the loop while camera is still connected
while capture.isRunning():
    # Read a frame from the camera
    frame = capture.getNextFrame()

    # The method does not wait for a frame to arrive, it returns immediately with the
    # latest available frame or, if no data is available, returns `None`
    if frame is not None:
        # Print the timestamp of the received frame
        print(f"Received a frame at time [{frame.timestamp}]")

        # Show a preview of the image
        cv.imshow("Preview", frame.image)
    cv.waitKey(2)
Read IMU data from a live camera
Incoming IMU data from a camera can be read sequentially using dv::io::camera::CameraInputBase::getNextImuBatch(). The following is a minimal sample showing how to read IMU data sequentially from a camera:
#include <dv-processing/io/camera/discovery.hpp>

#include <chrono>
#include <iostream>
#include <thread>

int main() {
	using namespace std::chrono_literals;

	// Open any camera
	auto capture = dv::io::camera::open();

	// Run the loop while camera is still connected
	while (capture->isRunning()) {
		// Read IMU measurement batch, check whether it is correct.
		// The method does not wait for data to arrive, it returns immediately with
		// the latest available IMU data or, if no data is available, returns a `std::nullopt`.
		if (const auto imuBatch = capture->getNextImuBatch(); imuBatch.has_value() && !imuBatch->empty()) {
			std::cout << "Received " << imuBatch->size() << " IMU measurements" << std::endl;
		}
		else {
			// No data has arrived yet, short sleep to reduce CPU load.
			std::this_thread::sleep_for(1ms);
		}
	}

	return 0;
}
import time

import dv_processing as dv

# Open any camera
capture = dv.io.camera.open()

# Run the loop while camera is still connected
while capture.isRunning():
    # Read a batch of IMU data from the camera
    imu_batch = capture.getNextImuBatch()

    # The method does not wait for data to arrive, it returns immediately with the
    # latest available data or, if no data is available, returns `None`
    if imu_batch is not None and len(imu_batch) > 0:
        # Print the time range of imu data
        print(f"Received imu data within time range [{imu_batch[0].timestamp}; {imu_batch[-1].timestamp}]")
    else:
        time.sleep(0.001)
Read triggers from a live camera
Note
To understand what triggers are and where they come from, read more about them on our documentation page.
Incoming trigger data from a camera can be read sequentially using dv::io::camera::CameraInputBase::getNextTriggerBatch(). The following is a minimal sample showing how to read trigger data sequentially from a camera:
#include <dv-processing/io/camera/davis.hpp>

#include <chrono>
#include <iostream>
#include <thread>

int main() {
	using namespace std::chrono_literals;

	// Open a DAVIS camera (DVXplorer also has triggers)
	dv::io::camera::DAVIS capture{};

	// Depending on the incoming signal, enable the detection of the desired type of pattern, here we enable everything.
	// Enable rising edge detection
	capture.setDetectorRisingEdges(true);
	// Enable falling edge detection
	capture.setDetectorFallingEdges(true);
	// Enable detector
	capture.setDetectorRunning(true);

	// Run the loop while camera is still connected
	while (capture.isRunning()) {
		// Read trigger batch, check whether it is correct.
		// The method does not wait for data to arrive, it returns immediately with
		// the latest available data or, if no data is available, returns a `std::nullopt`.
		if (const auto triggers = capture.getNextTriggerBatch(); triggers.has_value() && !triggers->empty()) {
			std::cout << "Received " << triggers->size() << " Triggers" << std::endl;
		}
		else {
			// No data has arrived yet, short sleep to reduce CPU load.
			std::this_thread::sleep_for(1ms);
		}
	}

	return 0;
}
import time

import dv_processing as dv

# Open any camera
capture = dv.io.camera.open()

# Depending on the incoming signal, enable the detection of the desired type of pattern, here we enable everything.
# Enable rising edge detection
capture.setDetectorRisingEdges(True)
# Enable falling edge detection
capture.setDetectorFallingEdges(True)
# Enable detector
capture.setDetectorRunning(True)

# Run the loop while camera is still connected
while capture.isRunning():
    # Read a batch of triggers from the camera
    triggers = capture.getNextTriggerBatch()

    # The method does not wait for data to arrive, it returns immediately with the
    # latest available data or, if no data is available, returns `None`
    if triggers is not None and len(triggers) > 0:
        # Print the time range of trigger data
        print(f"Received trigger data within time range [{triggers[0].timestamp}; {triggers[-1].timestamp}]")
    else:
        time.sleep(0.001)
Sample application - reading data from a live camera
An application reading multiple types of data from a live camera can be found among the code samples in the source code repository of the dv-processing library.
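For reference, such a combined reading loop can be assembled from the per-stream calls shown in the previous sections. The sketch below polls every stream in a single loop and is only an outline of what the full sample in the repository does; the availability checks from the capabilities section can be added if the camera does not provide all stream types:

#include <dv-processing/io/camera/discovery.hpp>

#include <chrono>
#include <iostream>
#include <thread>

int main() {
	using namespace std::chrono_literals;

	// Open any camera
	auto capture = dv::io::camera::open();

	// Run the loop while the camera is still connected
	while (capture->isRunning()) {
		bool dataReceived = false;

		// Poll each stream once per iteration; each call returns immediately
		if (const auto events = capture->getNextEventBatch(); events.has_value()) {
			std::cout << *events << std::endl;
			dataReceived = true;
		}
		if (const auto frame = capture->getNextFrame(); frame.has_value()) {
			std::cout << "Frame at " << frame->timestamp << std::endl;
			dataReceived = true;
		}
		if (const auto imuBatch = capture->getNextImuBatch(); imuBatch.has_value() && !imuBatch->empty()) {
			std::cout << "IMU measurements: " << imuBatch->size() << std::endl;
			dataReceived = true;
		}
		if (const auto triggers = capture->getNextTriggerBatch(); triggers.has_value() && !triggers->empty()) {
			std::cout << "Triggers: " << triggers->size() << std::endl;
			dataReceived = true;
		}

		// Nothing arrived on any stream, short sleep to reduce CPU load
		if (!dataReceived) {
			std::this_thread::sleep_for(1ms);
		}
	}

	return 0;
}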
From a file
Data from iniVation cameras is usually recorded using the AEDAT4 file format. The dv-processing library provides tools for reading such files. This section contains explanations and samples showing how data can be read from AEDAT4 files. More detailed information on the AEDAT4 file format can be found here.
Inspecting AEDAT4 files
The AEDAT4 file format supports recording multiple data streams into a single file; recordings from multiple cameras are also supported. The library provides a command-line utility for inspecting AEDAT4 files, dv-filestat, which reports the streams recorded in a file.
The utility provides information on the size, timestamps and duration of each stream. More information about the utility can be found here.
Opening a file
AEDAT4 files can be opened and read using the dv::io::MonoCameraRecording class. This class assumes that the recording was made with a single camera. The following is a minimal code sample that opens a recording and prints information about it.
A file can be opened by providing its path in the filesystem:
#include <dv-processing/io/mono_camera_recording.hpp>

#include <iostream>

int main() {
	// Open a file
	dv::io::MonoCameraRecording reader("path/to/file.aedat4");

	// Get and print the name of the camera that the data was recorded from
	std::cout << "Opened an AEDAT4 file which contains data from [" << reader.getCameraName() << "] camera"
			  << std::endl;

	return 0;
}
import dv_processing as dv

# Open a file
reader = dv.io.MonoCameraRecording("path/to/file.aedat4")

# Get and print the name of the camera that the data was recorded from
print(f"Opened an AEDAT4 file which contains data from [{reader.getCameraName()}] camera")
Checking available streams
Data recordings might contain various data streams. The dv::io::MonoCameraRecording class provides easy-to-use methods to inspect which data streams are available. The following sample code shows how to check for the existence of the various data streams:
#include <dv-processing/io/mono_camera_recording.hpp>

#include <iostream>

int main() {
	// Store the file path
	const std::string pathToFile = "path/to/file.aedat4";

	// Open a file
	dv::io::MonoCameraRecording reader(pathToFile);

	// Print file path and camera name
	std::cout << "Available streams in [" << pathToFile << "]:" << std::endl;

	// Check if event stream is available
	if (reader.isEventStreamAvailable()) {
		// Check the resolution of the event stream. Since the getEventResolution() method returns
		// a std::optional, we use the * operator to get the value. The method returns std::nullopt
		// only in case the stream is unavailable, which is already checked.
		const cv::Size resolution = *reader.getEventResolution();

		// Print that the stream is present and its resolution
		std::cout << " * Event stream with resolution " << resolution << std::endl;
	}

	// Check if frame stream is available
	if (reader.isFrameStreamAvailable()) {
		// Check the resolution of the frame stream. Since the getFrameResolution() method returns
		// a std::optional, we use the * operator to get the value. The method returns std::nullopt
		// only in case the stream is unavailable, which is already checked.
		const cv::Size resolution = *reader.getFrameResolution();

		// Print that the stream is available and its resolution
		std::cout << " * Frame stream with resolution " << resolution << std::endl;
	}

	// Check if IMU stream is available
	if (reader.isImuStreamAvailable()) {
		// Print that the IMU stream is available
		std::cout << " * IMU stream" << std::endl;
	}

	// Check if trigger stream is available
	if (reader.isTriggerStreamAvailable()) {
		// Print that the trigger stream is available
		std::cout << " * Trigger stream" << std::endl;
	}

	return 0;
}
import dv_processing as dv

# Store the file path
path_to_file = "path/to/file.aedat4"

# Open a file
reader = dv.io.MonoCameraRecording(path_to_file)

# Print file path and camera name
print(f"Checking available streams in [{path_to_file}] for camera name [{reader.getCameraName()}]:")

# Check if event stream is available
if reader.isEventStreamAvailable():
    # Check the resolution of event stream
    resolution = reader.getEventResolution()

    # Print that the stream is present and its resolution
    print(f" * Event stream with resolution [{resolution[0]}x{resolution[1]}]")

# Check if frame stream is available
if reader.isFrameStreamAvailable():
    # Check the resolution of frame stream
    resolution = reader.getFrameResolution()

    # Print that the stream is available and its resolution
    print(f" * Frame stream with resolution [{resolution[0]}x{resolution[1]}]")

# Check if IMU stream is available
if reader.isImuStreamAvailable():
    # Print that the IMU stream is available
    print(" * IMU stream")

# Check if trigger stream is available
if reader.isTriggerStreamAvailable():
    # Print that the trigger stream is available
    print(" * Trigger stream")
The dv-processing library also supports recordings of other data types. Type-agnostic access is available through templated methods in C++, while Python only provides a limited set of named methods (since templates are unavailable in Python). The following sample shows how to use the generic methods to check for the availability of a stream with a given name and data type:
#include <dv-processing/io/mono_camera_recording.hpp>

#include <iostream>

int main() {
	// Open a file
	dv::io::MonoCameraRecording reader("path/to/file.aedat4");

	// Store some stream name
	const std::string streamName = "poses";

	// Check if such a stream name is available and validate the data type of this stream
	if (reader.isStreamAvailable(streamName) && reader.isStreamOfDataType<dv::Pose>(streamName)) {
		std::cout << "The file contains a stream named [" << streamName << "] and of data type [dv::Pose]" << std::endl;
	}

	return 0;
}
Read events from a file
The following sample reads events in batches while the stream still has data available. When reading from a file, dv::io::MonoCameraRecording::getNextEventBatch() returns data until the end of the stream is reached; the dv::io::MonoCameraRecording::isRunning() method returns false once the end-of-file is reached for any stream. You can also use the dv::io::MonoCameraRecording::isRunning(streamName) overload to check a specific stream (a short sketch follows the samples below).
#include <dv-processing/io/mono_camera_recording.hpp>

#include <iostream>

int main() {
	// Open a file
	dv::io::MonoCameraRecording reader("path/to/file.aedat4");

	// Run the loop while data is available
	while (reader.isRunning()) {
		// Read batch of events, check whether received data is correct.
		if (const auto events = reader.getNextEventBatch(); events.has_value()) {
			// Print received event packet information
			std::cout << *events << std::endl;
		}
	}

	return 0;
}
import dv_processing as dv

# Open a file
reader = dv.io.MonoCameraRecording("path/to/file.aedat4")

# Run the loop while data is available
while reader.isRunning():
    # Read batch of events
    events = reader.getNextEventBatch()
    if events is not None:
        # Print received packet information
        print(f"{events}")
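As mentioned above, the stream-specific isRunning(streamName) overload is useful when the streams in a file end at different times. The following is a minimal sketch; the stream name "events" is an assumption (recordings usually name the event stream this way, but verify with the stream-inspection methods shown earlier):

#include <dv-processing/io/mono_camera_recording.hpp>

#include <iostream>

int main() {
	// Open a file
	dv::io::MonoCameraRecording reader("path/to/file.aedat4");

	// Hypothetical stream name; adjust to the names actually present in the file
	const std::string streamName = "events";

	// Keep reading only while this particular stream still has data,
	// independently of the other streams in the file
	while (reader.isRunning(streamName)) {
		if (const auto events = reader.getNextEventBatch(); events.has_value()) {
			std::cout << *events << std::endl;
		}
	}

	return 0;
}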
Read frames from a file
The following sample reads frames one by one while the stream still has data available. When reading from a file, dv::io::MonoCameraRecording::getNextFrame() returns a frame until the end of the stream is reached; the dv::io::MonoCameraRecording::isRunning() method returns false once the end-of-file is reached for any stream. You can also use the dv::io::MonoCameraRecording::isRunning(streamName) overload to check a specific stream.
#include <dv-processing/io/mono_camera_recording.hpp>

#include <opencv2/highgui.hpp>

#include <iostream>

int main() {
	// Open a file
	dv::io::MonoCameraRecording reader("path/to/file.aedat4");

	// Initiate a preview window
	cv::namedWindow("Preview", cv::WINDOW_NORMAL);

	// Variable to store the previous frame timestamp for correct playback
	std::optional<int64_t> lastTimestamp = std::nullopt;

	// Run the loop while data is available
	while (reader.isRunning()) {
		// Read a frame, check whether it is correct.
		// The method does not wait for a frame to arrive, it returns immediately with the
		// latest available frame or, if no data is available, returns a `std::nullopt`.
		if (const auto frame = reader.getNextFrame(); frame.has_value()) {
			// Print information about the received frame
			std::cout << *frame << std::endl;

			// Show a preview of the image
			cv::imshow("Preview", frame->image);

			// Calculate the delay between the last and the current frame, divide by 1000 to convert
			// microseconds to milliseconds
			const int delay = lastTimestamp.has_value() ? (frame->timestamp - *lastTimestamp) / 1000 : 2;

			// Perform the sleep
			cv::waitKey(delay);

			// Store timestamp for the next frame
			lastTimestamp = frame->timestamp;
		}
	}

	return 0;
}
import dv_processing as dv
import cv2 as cv

# Open a file
reader = dv.io.MonoCameraRecording("path/to/file.aedat4")

# Initiate a preview window
cv.namedWindow("Preview", cv.WINDOW_NORMAL)

# Variable to store the previous frame timestamp for correct playback
lastTimestamp = None

# Run the loop while data is available
while reader.isRunning():
    # Read a frame from the file
    frame = reader.getNextFrame()

    if frame is not None:
        # Print the timestamp of the received frame
        print(f"Received a frame at time [{frame.timestamp}]")

        # Show a preview of the image
        cv.imshow("Preview", frame.image)

        # Calculate the delay between the last and the current frame, divide by 1000 to convert
        # microseconds to milliseconds; waitKey expects an integer number of milliseconds
        delay = 2 if lastTimestamp is None else int((frame.timestamp - lastTimestamp) / 1000)

        # Perform the sleep
        cv.waitKey(delay)

        # Store timestamp for the next frame
        lastTimestamp = frame.timestamp
Read IMU data from a file
The following sample reads IMU data in batches while the stream still has data available. When reading from a file, dv::io::MonoCameraRecording::getNextImuBatch() returns IMU measurement batches until the end of the stream is reached; the dv::io::MonoCameraRecording::isRunning() method returns false once the end-of-file is reached for any stream. You can also use the dv::io::MonoCameraRecording::isRunning(streamName) overload to check a specific stream.
#include <dv-processing/io/mono_camera_recording.hpp>

#include <iostream>

int main() {
	// Open a file
	dv::io::MonoCameraRecording reader("path/to/file.aedat4");

	// Run the loop while data is available
	while (reader.isRunning()) {
		// Read IMU measurement batch, check whether it is correct.
		if (const auto imuBatch = reader.getNextImuBatch(); imuBatch.has_value() && !imuBatch->empty()) {
			// Print IMU batch information
			std::cout << "Received " << imuBatch->size() << " IMU measurements" << std::endl;
		}
	}

	return 0;
}
import dv_processing as dv

# Open a file
reader = dv.io.MonoCameraRecording("path/to/file.aedat4")

# Run the loop while the stream contains data
while reader.isRunning():
    # Read a batch of IMU data from the file
    imu_batch = reader.getNextImuBatch()
    if imu_batch is not None and len(imu_batch) > 0:
        # Print the info of the IMU data
        print(f"Received {len(imu_batch)} IMU measurements")
Read triggers from a file
The following sample reads triggers in batches while the stream still has data available. When reading from a file, dv::io::MonoCameraRecording::getNextTriggerBatch() returns batches of triggers until the end of the stream is reached; the dv::io::MonoCameraRecording::isRunning() method returns false once the end-of-file is reached for any stream. You can also use the dv::io::MonoCameraRecording::isRunning(streamName) overload to check a specific stream.
#include <dv-processing/io/mono_camera_recording.hpp>

#include <iostream>

int main() {
	// Open a file
	dv::io::MonoCameraRecording reader("path/to/file.aedat4");

	// Run the loop while data is available
	while (reader.isRunning()) {
		// Read trigger batch, check whether it is correct.
		if (const auto triggers = reader.getNextTriggerBatch(); triggers.has_value() && !triggers->empty()) {
			// Print the trigger batch information
			std::cout << "Received " << triggers->size() << " triggers" << std::endl;
		}
	}

	return 0;
}
import dv_processing as dv

# Open a file
reader = dv.io.MonoCameraRecording("path/to/file.aedat4")

# Run the loop while data is available
while reader.isRunning():
    # Read a batch of triggers from the file
    triggers = reader.getNextTriggerBatch()

    # Check whether the batch is valid and contains data
    if triggers is not None and len(triggers) > 0:
        # Print the trigger batch information
        print(f"Received {len(triggers)} triggers")
[Advanced] Reading custom data types
The previous samples show how to use named functions to read different data types. The C++ API provides templated methods to read any type of data. Below is a sample that shows how to read data using the generic templated API:
Note
Since templated methods are only available in C++, the generic reading methods are only available in the C++ API.
#include <dv-processing/data/timed_keypoint_base.hpp>
#include <dv-processing/io/mono_camera_recording.hpp>

#include <iostream>

int main() {
	// Open a file
	dv::io::MonoCameraRecording reader("path/to/file.aedat4");

	// Define and contain a stream name in a variable
	const std::string stream = "keypoints";

	// Check whether a timed-keypoint stream is available
	if (!reader.isStreamAvailable(stream) || !reader.isStreamOfDataType<dv::TimedKeyPointPacket>(stream)) {
		throw dv::exceptions::RuntimeError("Stream named 'keypoints' not found");
	}

	// Run the loop while data is available
	while (reader.isRunning()) {
		// Read timed keypoint batch, check whether it is correct.
		if (const auto keypoints = reader.getNextStreamPacket<dv::TimedKeyPointPacket>(stream); keypoints.has_value()) {
			// Print the number of keypoints read
			std::cout << "Read " << keypoints->elements.size() << " timed keypoints" << std::endl;
		}
	}

	return 0;
}
Sample application - reading data from a recorded AEDAT4 file
An application reading multiple types of data from an AEDAT4 file can be found in the source code repository of the dv-processing library.
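As with the live-camera case, such an application can be assembled from the per-stream read calls shown above. The following is only a compact sketch, assuming the file contains all four stream types (otherwise add the availability checks shown earlier):

#include <dv-processing/io/mono_camera_recording.hpp>

#include <iostream>

int main() {
	// Open a file
	dv::io::MonoCameraRecording reader("path/to/file.aedat4");

	// Run the loop while any stream still has data available
	while (reader.isRunning()) {
		if (const auto events = reader.getNextEventBatch(); events.has_value()) {
			std::cout << *events << std::endl;
		}
		if (const auto frame = reader.getNextFrame(); frame.has_value()) {
			std::cout << "Frame at " << frame->timestamp << std::endl;
		}
		if (const auto imuBatch = reader.getNextImuBatch(); imuBatch.has_value() && !imuBatch->empty()) {
			std::cout << "IMU measurements: " << imuBatch->size() << std::endl;
		}
		if (const auto triggers = reader.getNextTriggerBatch(); triggers.has_value() && !triggers->empty()) {
			std::cout << "Triggers: " << triggers->size() << std::endl;
		}
	}

	return 0;
}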