Hi,
We’re running four ZED X cameras on a Jetson AGX Orin for multi-camera SLAM and 3D reconstruction. For downstream photometric processing (exposure compensation, radiometric
calibration, and cross-camera color consistency) we need per-frame exposure, gain, and white balance metadata, none of which is currently accessible through the SDK during live capture.
What’s broken
getCameraSettings() consistently returns -1 or a sentinel value for all of the following, regardless of whether auto-control is enabled or disabled:
- EXPOSURE_TIME → always -1
- ANALOG_GAIN → always -1
- EXPOSURE (relative 0-100) → always -1
- GAIN (relative 0-100) → always -1
- WHITEBALANCE_TEMPERATURE → always returns 2800 (the minimum) in auto-WB mode, not the actual computed temperature
Only AUTO_EXPOSURE_TIME_RANGE returns a value ([28, 33333] µs) — the permitted range, not the current value in use.
We also tried SVO pre-recording as a workaround: EXPOSURE_TIME and ANALOG_GAIN do get populated during SVO playback, but WHITEBALANCE_TEMPERATURE still returns 2800 during SVO
playback, so WB is unrecoverable through any current API path.
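For reference, a minimal repro of the live-capture behavior described above (Python API, SDK 4.x, auto controls at their defaults; if there is an intended access path other than these calls, please point us to it):

```python
import pyzed.sl as sl

zed = sl.Camera()
init = sl.InitParameters()  # defaults: auto exposure/gain/WB enabled
if zed.open(init) != sl.ERROR_CODE.SUCCESS:
    raise SystemExit("failed to open camera")

if zed.grab(sl.RuntimeParameters()) == sl.ERROR_CODE.SUCCESS:
    # Each call returns (error_code, value); value is -1 in every case for us,
    # with auto control enabled or disabled.
    for setting in (sl.VIDEO_SETTINGS.EXPOSURE_TIME,
                    sl.VIDEO_SETTINGS.ANALOG_GAIN,
                    sl.VIDEO_SETTINGS.EXPOSURE,
                    sl.VIDEO_SETTINGS.GAIN):
        err, value = zed.get_camera_settings(setting)
        print(setting, err, value)

    # In auto-WB mode this returns 2800 (the range minimum), not the
    # temperature the auto algorithm actually converged to.
    err, wb = zed.get_camera_settings(sl.VIDEO_SETTINGS.WHITEBALANCE_TEMPERATURE)
    print(wb)

zed.close()
```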
What we need (in priority order)
- Fix getCameraSettings(EXPOSURE_TIME) and getCameraSettings(ANALOG_GAIN) to return the actual values in both auto and manual modes
- Fix getCameraSettings(WHITEBALANCE_TEMPERATURE) to return the actual auto-computed temperature during auto-WB mode (not 2800)
- Add per-frame metadata alongside grab results — e.g. a getFrameMetadata() method — so all three values are available atomically with each frame
- If none of the above, expose the underlying Argus ICaptureMetadata or the V4L2 file descriptor so we can read getSensorExposureTime(), getSensorAnalogGain(), and the ISP WB gains
directly
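To make the third request concrete, the shape we have in mind is roughly the following. All names here are hypothetical, not existing SDK API; the point is only that the three values arrive together, tagged to a specific frame:

```python
from dataclasses import dataclass

@dataclass
class FrameMetadata:
    """Hypothetical per-frame photometric metadata, returned atomically with grab()."""
    timestamp_ns: int       # timestamp matching the grabbed image
    exposure_time_us: int   # actual sensor exposure, in both auto and manual modes
    analog_gain: float      # actual sensor analog gain
    wb_temperature_k: int   # auto-computed (or locked) color temperature
    wb_gains_rgb: tuple     # ISP per-channel multipliers, e.g. (1.9, 1.0, 1.6)

# Usage we'd like (hypothetical call): meta = zed.get_frame_metadata()
meta = FrameMetadata(0, 4000, 2.5, 4800, (1.9, 1.0, 1.6))
```

Atomicity matters here: polling three separate settings between grabs can straddle an auto-exposure update, so the values may not describe the same frame.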
Why this should be straightforward
The ZED SDK already uses Argus internally. ICaptureMetadata provides all three values:
- getSensorExposureTime() — per-frame exposure in nanoseconds
- getSensorAnalogGain() — per-frame analog gain
- ISP white balance gains (as RGB channel multipliers, or the equivalent color temperature)
This data already exists in the pipeline — it just isn’t surfaced to developers.
Use case impact
Without this, we can’t do per-camera exposure compensation (frames from cameras with different exposures can’t be merged without ghosting) or shared white balance locking (each
camera converges to a different color temperature independently, causing visible color casts across the rig). Both are blockers for production-quality multi-camera capture.
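For context on the first blocker: once per-frame exposure and gain are available, cross-camera photometric alignment reduces to a single scale in linear space. A minimal sketch (numpy; assumes linear-light pixel values and a reference exposure/gain chosen per rig):

```python
import numpy as np

def normalize_exposure(frame_linear, exposure_us, gain, ref_exposure_us, ref_gain):
    """Rescale a linear-light frame to a common photometric reference so frames
    captured with different exposure*gain products can be merged directly."""
    scale = (ref_exposure_us * ref_gain) / (exposure_us * gain)
    return frame_linear * scale

# Two cameras see the same scene radiance but collect different amounts of light:
cam_a = np.array([0.20])  # 4000 us exposure, gain 1.0
cam_b = np.array([0.10])  # 2000 us exposure, gain 1.0 (half the light collected)
a = normalize_exposure(cam_a, 4000, 1.0, ref_exposure_us=4000, ref_gain=1.0)
b = normalize_exposure(cam_b, 2000, 1.0, ref_exposure_us=4000, ref_gain=1.0)
# After normalization both frames report the same value, so merging won't ghost.
```

Without the actual per-frame exposure and gain values, this scale factor cannot be computed at all, which is why the missing metadata is a hard blocker rather than an inconvenience.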