Can I Stream a Textured Mesh Using ZED SDK?

Hi,

I hope all is well.

We are currently using the ZED Carrier Board (ZED Box Mini) with JetPack 6.2 on a Jetson Orin NX (16 GB), ZED SDK 5.0.2 and the ZED Box Mini v1.3.0 driver. We are using the enableStreaming() feature (Local Video Streaming - Stereolabs) to offload expensive computations to a laptop.

Our goal is to use the spatial mapping feature to create a real-time, photogrammetry-like representation: a surface with colour mapping that is updated periodically as the detection system moves. Our idea is to request and retrieve the mesh asynchronously, then use an independent thread to apply a filter and a texture, and finally send the vertices, triangles, UVs, and texture to a visualization software that renders the result incrementally in chunks.
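To make the intended split concrete, here is a minimal sketch of the producer/consumer architecture described above, in plain Python. The ZED-specific calls are replaced by hypothetical placeholders (`retrieve_mesh_update`, `filter_and_texture`, `send_to_visualizer` are not SDK functions); only the threading/queue structure is the point.

```python
import queue
import threading

def retrieve_mesh_update():
    """Placeholder for the async mesh retrieval (e.g. the SDK's
    request/retrieve spatial-map calls). Returns one mesh update."""
    return {"vertices": [], "triangles": [], "uvs": []}

def filter_and_texture(update):
    """Placeholder for the filtering + texturing step."""
    update["textured"] = True
    return update

def send_to_visualizer(update, sent):
    """Placeholder for streaming a chunk to the visualization software."""
    sent.append(update)

def mapping_loop(out_q, n_updates):
    # Producer: periodically request/retrieve the mesh asynchronously.
    for _ in range(n_updates):
        out_q.put(retrieve_mesh_update())
    out_q.put(None)  # sentinel: mapping finished

def processing_loop(in_q, sent):
    # Consumer: filter, texture, and stream each update independently,
    # so the expensive work never blocks the mapping loop.
    while True:
        update = in_q.get()
        if update is None:
            break
        send_to_visualizer(filter_and_texture(update), sent)

updates = queue.Queue()
sent = []
producer = threading.Thread(target=mapping_loop, args=(updates, 3))
consumer = threading.Thread(target=processing_loop, args=(updates, sent))
producer.start(); consumer.start()
producer.join(); consumer.join()
print(len(sent))  # 3
```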

Is this architecture viable with the ZED SDK’s spatial mapping pipeline? Would you recommend using RTAB-Map instead of this approach? Any guidance or experience with similar setups would be greatly appreciated.

Thank you.

Hi @zonit

Yes, the described architecture is good.

We have not tested the latest RTAB-Map features, so we cannot say which approach is better without more information. I recommend testing both architectures and comparing them.
A report here would also be useful for other users.

Hi @Myzhar,

Thank you for the prompt response.

However, I am running into some issues implementing a continuous mapping system. Is there a way to retrieve only the new data added to the mesh, rather than the full mesh, when performing an asynchronous retrieval? Additionally, applyTexture() applies the texture to the entire mesh, which becomes expensive quite quickly. Is there a way to texture only the newly mapped areas, instead of retexturing the full mesh each time?

I am also considering disabling and re-enabling spatial mapping once the mesh reaches a certain size, but this approach feels hacky and would likely produce visible seams where consecutive meshes meet.

I mentioned RTAB-Map because I am unsure whether the ZED SDK alone is sufficient to incrementally obtain textured maps. Perhaps this is a feature that may be supported in the future, but is not currently available.

Thank you.

Yes. Are you using ZEDfu or the Spatial Mapping API?
I recommend exploring the code of the Spatial Mapping sample, which shows how to retrieve only the updated chunks.
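For reference, the pattern the sample relies on looks roughly like this. This is a plain-Python sketch, not the SDK itself: `Chunk` and the `has_been_updated` flag stand in for the per-chunk fields the mesh exposes after each retrieval.

```python
from dataclasses import dataclass, field

@dataclass
class Chunk:
    # Minimal stand-in for a mesh chunk: geometry plus the per-chunk
    # update flag that is checked after each asynchronous retrieval.
    vertices: list = field(default_factory=list)
    triangles: list = field(default_factory=list)
    has_been_updated: bool = False

def updated_chunk_indices(chunks):
    """Return the indices of chunks modified since the last retrieval,
    so only those need to be re-sent to the renderer."""
    return [i for i, c in enumerate(chunks) if c.has_been_updated]

# After an (assumed) async retrieval, only chunks 1 and 3 changed:
mesh_chunks = [Chunk(), Chunk(has_been_updated=True), Chunk(),
               Chunk(has_been_updated=True)]
print(updated_chunk_indices(mesh_chunks))  # [1, 3]
```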

Please add more information concerning this.

Hi @Myzhar,

Thank you again. I am using the Spatial Mapping API to generate textured meshes for incremental streaming. Geometry updates work well using has_been_updated(), but streaming the texture is where I am encountering issues.

From what I understand, applyTexture() generates a single global atlas, and each chunk’s UVs reference that atlas. I can use the updated chunks and their UVs to extract the relevant geometry for streaming. However, as the map grows, applying the texture becomes increasingly expensive, which is why I considered disabling and re-enabling spatial mapping after a certain point to reduce the processing load. I have not found a way to apply texturing only to new or updated data, and I wonder if I am missing something.
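In the meantime, one workaround I am experimenting with to bound the cost is throttling the full retexture so it only runs after a given number of geometry updates have accumulated, streaming untextured geometry in between. A sketch of that bookkeeping (pure Python, hypothetical names, not an SDK feature):

```python
class TextureThrottle:
    """Decide when to pay for a full retexture: only after a given
    number of geometry updates have accumulated."""

    def __init__(self, every_n_updates=10):
        self.every_n = every_n_updates
        self.pending = 0

    def on_geometry_update(self):
        # Called once per incremental mesh update that was streamed.
        self.pending += 1

    def should_retexture(self):
        # True only when enough updates have piled up; resets the count.
        if self.pending >= self.every_n:
            self.pending = 0
            return True
        return False

throttle = TextureThrottle(every_n_updates=3)
decisions = []
for _ in range(7):
    throttle.on_geometry_update()
    decisions.append(throttle.should_retexture())
print(decisions)  # [False, False, True, False, False, True, False]
```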

Does this logic sound correct to you?

Hi @Myzhar,

I came across this thread: Zed Explorer vs Zedfu. A member of Stereolabs stated at the time that “ZEDfu is just a fancy UI over our API,” and that the same results should be reproducible using the Spatial Mapping sample. However, ZEDfu appears to update and display textures incrementally without incurring the full cost of regenerating the entire texture atlas each time.

When using the API, applyTexture() appears to apply globally, and we have not found any mechanism to limit it to the updated chunks. Could you clarify how ZEDfu achieves this, and whether incremental texturing is available through the API?

Thank you.