SVO Recording and Playback for Prototyping in TouchDesigner

Hey,
to test the UX I want to set up a simple environment for prototyping in TouchDesigner.
this includes:

  • playback of “normal” video with objects and overlays
  • synced playback of object detection data

I have recorded SVO files of the scenes I want to test.
When I play back the SVO videos, apply object detection, and generate visible overlays of the detections in the video, the SVO playback speed varies.

assumptions:
1 - So I guess the SVO file playback speed is the capture rate of the images saved in the SVO file. Is that true?

2 - Setting the flag svo_realtime_mode to true causes the images’ timestamps to be used. But the documentation says “Enabling this parameter will bring the SDK closer (?!) to a real simulation when playing back a file by using the images’ timestamps.” So “closer” is not really linear/constant playback speed, right? What does “closer” mean?

What is important to me is that the playback, either linear (constant frame interval) or warped (capture rate), runs in sync with the data in TouchDesigner.

Another option I thought of would be to save the SVO video timestamps in every OSC message, and then have playback logic in TouchDesigner which runs along the OSC messages at TouchDesigner’s sample rate and jumps from frame to frame.
I would prefer to get “linear” data into TouchDesigner, though.
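For the timestamp-per-OSC-message idea, the TouchDesigner-side playback logic could boil down to a lookup from elapsed wall-clock time to the latest recorded frame timestamp. A minimal pure-Python sketch of that lookup (illustrative only; the function name and millisecond units are made up for this example):

```python
import bisect

def frame_for_time(frame_timestamps_ms, elapsed_ms):
    """Return the index of the latest frame whose timestamp (relative to
    the first frame) is not after `elapsed_ms` of playback time."""
    if not frame_timestamps_ms:
        raise ValueError("no frames")
    t0 = frame_timestamps_ms[0]
    rel = [t - t0 for t in frame_timestamps_ms]
    # rightmost frame with relative timestamp <= elapsed_ms
    i = bisect.bisect_right(rel, elapsed_ms) - 1
    return max(i, 0)

# frames captured at a slightly irregular rate (ms)
ts = [0, 33, 70, 100, 136]
print(frame_for_time(ts, 0))    # 0
print(frame_for_time(ts, 50))   # 1
print(frame_for_time(ts, 200))  # 4
```

Running this at TouchDesigner’s sample rate would naturally “jump from frame to frame” as described, holding each frame until the next timestamp is reached.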

Very curious about your answers! :slight_smile:

Hey there @D300

1 - So I guess the SVO file playback speed is the capture rate of the images saved in the SVO file. Is that true?

Not really. If real-time mode is not enabled, it is the rate at which you call the grab method.
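To picture the consequence of that (a pure-Python model, not actual SDK code): with real-time mode off, each grab just returns the next stored frame immediately, so total playback time depends only on how fast the application calls grab, not on the recorded timestamps.

```python
def playback_duration_s(num_frames: int, grab_rate_hz: float) -> float:
    """With real-time mode off, each grab() immediately returns the next
    stored frame, so playback time is num_frames / (grab calls per second),
    independent of the capture timestamps stored in the file."""
    return num_frames / grab_rate_hz

# A 10-second clip recorded at 30 FPS (300 frames):
print(playback_duration_s(300, 30.0))  # 10.0 s -> original speed
print(playback_duration_s(300, 60.0))  # 5.0 s  -> double speed
print(playback_duration_s(300, 15.0))  # 20.0 s -> half speed
```

This is why the overlay generation slowing down the grab loop makes the playback speed vary: the file plays exactly as fast as the loop runs.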

2 - Setting the flag svo_realtime_mode to true causes the images’ timestamps to be used. But the documentation says “Enabling this parameter will bring the SDK closer (?!) to a real simulation when playing back a file by using the images’ timestamps.” So “closer” is not really linear/constant playback speed, right? What does “closer” mean?

Basically, when calling grab, if the current wall-clock timestamp is greater than or equal to the next frame’s timestamp added to the wall-clock timestamp at which playback started, the frame is returned immediately.
If several frames are available (because the previous grab took too long, for example), the intermediate frames are dropped.
If the timestamp has not been reached yet, grab waits for it with very short sleeps.

It should be very close to linear, though, pretty much as close as a live feed can get.
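If it helps, that drop/wait behaviour can be modelled in a few lines of pure Python (illustrative only, not SDK code; timestamps here are assumed relative to the first frame, in milliseconds):

```python
def realtime_grab(frame_ts_ms, next_idx, start_wall_ms, now_wall_ms):
    """Toy model of a real-time-mode grab():
    - if one or more frames are due, return the latest one and drop the rest
    - otherwise report how long grab() would sleep before the next frame
    Returns (frame_index_or_None, new_next_idx, wait_ms)."""
    # a frame is "due" once wall clock >= playback start + its timestamp
    due = [i for i in range(next_idx, len(frame_ts_ms))
           if start_wall_ms + frame_ts_ms[i] <= now_wall_ms]
    if due:
        latest = due[-1]            # intermediate due frames are dropped
        return latest, latest + 1, 0
    wait = start_wall_ms + frame_ts_ms[next_idx] - now_wall_ms
    return None, next_idx, wait     # grab() would sleep until the frame is due

ts = [0, 33, 66, 100]               # recorded timestamps (ms, relative)
# playback started at wall time 1000 ms; we call grab at 1070 ms:
print(realtime_grab(ts, 1, 1000, 1070))  # (2, 3, 0): frame 1 was dropped
# next call at 1080 ms: frame 3 is due at 1100, so grab would wait 20 ms
print(realtime_grab(ts, 3, 1000, 1080))  # (None, 3, 20)
```

So “closer” means the output timing tracks the recorded timestamps as well as the grab loop allows: frames may be skipped when the loop falls behind, but on average playback follows the capture rate, like a live feed would.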