Is it possible to use the Fusion Sender to stream video to Unreal via Live Link?


I want to display the video output of each of my cameras. I am using Live Link to stream the Fusion sender data into my Unreal scene.

I was thinking of using the Fusion module to create a C++ class and a custom Blueprint to output the camera material, but I read that the Fusion module is not supported in Unreal.

“For now, the Unreal Engine 5 and Unity plugins for ZED do not handle the Fusion module of the SDK. The Live Link implementations are the way to bring the Fusion into these engines.”

from ZED Live Link integrations - Stereolabs

Is there any way to output the camera feeds into a material or UI element?

Any help or direction would be appreciated.



Hi Alex,

The Live Link implementation does not use the video stream from the ZED in UE5; the video is processed in the sender application. Right now, you can’t access the video streams from the end application: the Live Link UE5 project receives skeleton data only.