Running a PyTorch Model in Python, Then Rendering the Results in Unity

Hello again,

Thanks so much for working on this cool platform.

Given that the Unity OpenCV custom object detection example is somewhat deprecated, I was wondering if it is possible to use the ZED Python examples to run custom object detection with a PyTorch model, then stream/render the results on the Unity end. Ultimately, I want to display the CV results in AR passthrough mode.

If you can provide any assistance with this, that’d be great! The Unity custom object detection example does not seem to have any PyTorch samples, unless I just missed it…

Hi @blueming,

There is currently no way to do that out-of-the-box with our samples.
If you want to implement it on your side, we will be glad to answer any questions you have, but we won’t be actively working on it in the short term. Sorry for the inconvenience.

That being said, our custom OD sample does feel like it could use an update, I’ll log it for internal discussion, thanks for the report!
(And thanks for the kind words! :smiley: )


I’ve been working on solving this problem. So far, I can stream video frames from Python to Unity using ZED’s native streaming solution. Then, based on a PyTorch model, I can perform instance segmentation (i.e., `prediction = inference_detector(model, img)`). Finally, I open a threaded socket connection in Python and send JSON packets containing information such as label, bbox, accuracy, and more to Unity.
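For reference, here is a minimal sketch of the Python sender side described above. The host/port, the detection fields, and the newline-delimited framing are all my own assumptions; `inference_detector` and `model` come from the user's own PyTorch setup and are only referenced in a comment:

```python
import json
import socket
import threading


def serialize_detections(detections):
    """Encode a list of detection dicts as one newline-delimited JSON packet."""
    return (json.dumps(detections) + "\n").encode("utf-8")


def serve(host="0.0.0.0", port=9000):
    """Accept one Unity client and stream detection packets to it."""
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind((host, port))
    server.listen(1)
    conn, _ = server.accept()
    try:
        # In the real loop: grab a frame, run
        #   prediction = inference_detector(model, img)
        # and convert the prediction to plain dicts before sending.
        detections = [{"class_name": "Tennis Ball", "score": 0.91,
                       "bbox": [663.1, 334.9, 769.7, 423.1],
                       "mask_id": "mask_0.png", "depth": 0.46}]
        conn.sendall(serialize_detections(detections))
    finally:
        conn.close()
        server.close()


if __name__ == "__main__":
    # Run the sender on a background thread so inference can continue.
    threading.Thread(target=serve, daemon=True).start()
```

Newline-delimited JSON keeps the Unity-side parsing simple: read from the TCP stream until `\n`, then deserialize that one line.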

Is there a way to render these results on the Unity end? Of course, I could implement my own solution (perhaps using PolygonCollider2D), but I am worried that this would be quite slow. I am wondering if the ZED-Unity SDK has some tools that can help me render instance segmentation results coming from Python.

Here’s a sample json that the Unity application receives:
[{"class_name": "Tennis Ball",
  "score": 0.9066461324691772,
  "bbox": [663.1264038085938, 334.9393615722656, 769.7103271484375, 423.10589599609375],
  "mask_id": "mask_0.png",
  "depth": 0.4628346264362335}]
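Since the packet carries both a 2D bbox and a depth value, one way to place the result in 3D (e.g., for AR passthrough) is to back-project the bbox center with a pinhole camera model. This is only a sketch: the intrinsics `fx`, `fy`, `cx`, `cy` below are made-up placeholders, and in practice they should come from the ZED camera's calibration:

```python
def bbox_center_to_3d(bbox, depth, fx, fy, cx, cy):
    """Back-project a bbox center (pixels) and depth (meters) to a
    camera-space 3D point using a simple pinhole model."""
    u = (bbox[0] + bbox[2]) / 2.0  # bbox is [x_min, y_min, x_max, y_max]
    v = (bbox[1] + bbox[3]) / 2.0
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)


# Example with the sample detection above and placeholder intrinsics
# for a 1280x720 image (not real ZED calibration values):
point = bbox_center_to_3d(
    [663.1264, 334.9394, 769.7103, 423.1059], 0.4628,
    fx=700.0, fy=700.0, cx=640.0, cy=360.0)
```

The resulting camera-space point can then be transformed by the camera pose on the Unity side to position a marker or label in the AR scene.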

Thanks so much! :))