How to superimpose sensor data (CO₂, NO₂, O₃, etc.) onto ZEDfu 3D map

Hello,

I am currently working with a ZED 2i camera and using ZEDfu to generate 3D environment reconstructions. I export the map as an OBJ file together with the MTL and PNG texture. When I view the file in ZEDfu, the details and colors look correct. However, when I load the OBJ into Blender, many details appear missing or simplified compared to what I see in ZEDfu.

My research goal is to superimpose environmental sensor data (for example CO₂, NO₂, O₃, humidity, temperature) onto the 3D map generated by the ZED 2i. I would like to display this data as color-coded overlays or heatmaps inside the reconstructed environment.
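To make the idea more concrete, the kind of overlay I am imagining is roughly the Python sketch below, which colors the vertices of the exported mesh by an interpolated sensor value. The use of Open3D, the file name, the sample readings, and the inverse-distance interpolation are just my own assumptions for illustration, not an existing Stereolabs workflow:

```python
import numpy as np
import open3d as o3d

# Load the mesh exported from ZEDfu (file name is only an example).
mesh = o3d.io.read_triangle_mesh("zedfu_map.obj")
vertices = np.asarray(mesh.vertices)

# Hypothetical sensor samples: (x, y, z) position in map coordinates and a CO2 value in ppm.
sensor_positions = np.array([[0.0, 1.0, 2.0],
                             [3.0, 1.0, -1.0],
                             [-2.0, 0.5, 0.5]])
sensor_values = np.array([420.0, 900.0, 650.0])

# Inverse-distance-weighted interpolation of the readings onto every mesh vertex.
dists = np.linalg.norm(vertices[:, None, :] - sensor_positions[None, :, :], axis=2)
weights = 1.0 / np.maximum(dists, 1e-6) ** 2
vertex_values = (weights * sensor_values).sum(axis=1) / weights.sum(axis=1)

# Normalize and map to a simple blue-to-red heatmap, stored as per-vertex colors.
t = (vertex_values - vertex_values.min()) / (vertex_values.max() - vertex_values.min() + 1e-9)
colors = np.stack([t, np.zeros_like(t), 1.0 - t], axis=1)
mesh.vertex_colors = o3d.utility.Vector3dVector(colors)

o3d.visualization.draw_geometries([mesh])
```

This is an offline visualization, so it does not answer the real-time part of my question, but it shows the end result I am after.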

I would like to ask:

  1. Is it possible to directly overlay sensor data onto the 3D map inside ZEDfu or through another Stereolabs-supported workflow?

  2. If not, what is the recommended approach to achieve this visualization? Should I use the ZED SDK, Unity, or another pipeline instead of Blender?

  3. Why does the OBJ file exported from ZEDfu lose details when imported into Blender? Is there a recommended export format for preserving both geometry and textures?

Any advice or examples from the community on how to combine external sensor data with ZEDfu 3D reconstructions would be greatly appreciated.

Thank you very much,
Hong Wei Soon

Hi @Justinmars

This function is not available.

Unfortunately, I do not currently have a good answer to your question: the ZED SDK does not provide functions to achieve this result.

For the Blender issue, you should ask Blender support, providing the OBJ file so they can analyze it.
Have you tried the other available export formats?

Hi @Myzhar

Thank you for your help, I will check with the Blender team. Basically, my goal is to use the ZED 2i camera to create the 3D map, then superimpose the environmental data onto the 3D map in real time so that we know what is happening in the surroundings. Since you mentioned this function is not available in ZEDfu, I was thinking Unity could be a good option. I checked the ZED tutorials and saw there are examples for object detection and VR/AR, but could you provide more detail or guidance on how we can bring the ZEDfu map into Unity and overlay external sensor data (like CO₂, NO₂, O₃, humidity, etc.)?

Also, is there a recommended export format from ZEDfu for Unity (OBJ, FBX, glTF, PLY), and is it possible to get the camera or mapping trajectory to help align sensor readings with the map? Roughly what I had in mind for the trajectory part is sketched below.
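My rough idea is to run positional tracking while spatial mapping is active and tag each sensor reading with the current camera pose, so that the readings share the same world frame as the extracted mesh. This sketch uses the ZED Python API (pyzed), but it is untested and the `read_co2_sensor()` helper is a hypothetical placeholder for my external sensor driver:

```python
import pyzed.sl as sl

def read_co2_sensor():
    # Hypothetical placeholder for the external CO2 sensor driver.
    return 420.0

zed = sl.Camera()
init_params = sl.InitParameters(coordinate_units=sl.UNIT.METER)
if zed.open(init_params) != sl.ERROR_CODE.SUCCESS:
    raise RuntimeError("Failed to open the ZED 2i")

# Positional tracking reports the camera pose in the same world frame as the spatial map.
zed.enable_positional_tracking(sl.PositionalTrackingParameters())
zed.enable_spatial_mapping(sl.SpatialMappingParameters())

runtime = sl.RuntimeParameters()
pose = sl.Pose()
tagged_readings = []  # (x, y, z, co2_ppm) tuples in world/map coordinates

for _ in range(500):  # capture loop; in practice this runs for the whole survey
    if zed.grab(runtime) == sl.ERROR_CODE.SUCCESS:
        state = zed.get_position(pose, sl.REFERENCE_FRAME.WORLD)
        if state == sl.POSITIONAL_TRACKING_STATE.OK:
            t = pose.get_translation(sl.Translation()).get()
            tagged_readings.append((t[0], t[1], t[2], read_co2_sensor()))

# Extract and save the map so the tagged readings can be overlaid on it later.
mesh = sl.Mesh()
zed.extract_whole_spatial_map(mesh)
mesh.save("zed_map.obj")

zed.disable_spatial_mapping()
zed.disable_positional_tracking()
zed.close()
```

If this is a sensible direction, I would then import the saved mesh and the tagged readings into Unity (or another viewer) to render the overlay, but I am not sure whether this is the approach you would recommend.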

Any advice or reference project from the team would be really helpful, thanks again!