ZED360 completely stopped working

Hello! After a lot of work calibrating multiple cameras in one area, and after everything was working fine (I had even found the right procedure for calibrating multiple cameras), I took the PC to the final place where I would install the interactive setup, and I ran into a serious problem: the ZED360 software stopped working completely.

I brought two computers to this new place, thinking the problem was limited to the first one, but when I tried the same thing on the second computer I got the same result. At first the software was unable to detect two cameras (I have eight ZED 2 cameras and the same thing happened with all of them), and when it did detect them, it never showed them as it should, i.e. facing each other. The other issue was that the cameras could not see beyond 4 to 5 meters; past that distance they could not detect anything.

In the end I did not find a solution, but I was able to modify a calibration file that I had generated in my development space, which obviously used different measurements. That was not an exact fix, but at least it got me out of the jam.

Could it be that the camera sensors do not work properly with LED screens in front of them (there are several meters of modular LED panels)? The installation was in a corridor with a camera at each end. I should also mention that the screens formed the corridor, as if they were its walls.

The final product, in my opinion, ended up being a fiasco because this software stopped working.

Any possible solution?

I think an ideal solution would be some way, in Python, C++, etc., to perform a correct calibration regardless of the location. Do you have any libraries or modules that could be used to build one's own calibration tool?

It would also be interesting to understand how to generate or modify a calibration file to fit any space.

Maybe if one could provide the location of the cameras (X, Y, Z position), the distance between them, etc., there could be a way to generate a calibration file.

These are just ideas, but the truth is that we need a way to achieve this without depending on software that can fail with no way to get it working again.

Thank you.

My config:
OS: Windows 10
CPU: Intel(R) Core™ i7-5820K CPU @ 3.30GHz
GPU: NVIDIA GeForce RTX 2080
ZED SDK Version: 4.0.4
Camera: ZED2

Hi,

Sorry for your troubles. We are aware that our first version of ZED360 is not reliable enough and are working on it.
When you say that you made everything work fine once, I assume you had created a calibration file for your cameras with ZED360? Can you show it to me?
The configuration file is documented here: Fusion | Stereolabs
As you say, it contains the world rotations and translations of the cameras. You can build it yourself; however, it's a lot easier with ZED360. One constraint that can prevent a good calibration is the presence of multiple people, which should be avoided at all costs during calibration.
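As a rough illustration of what "building it yourself" could look like, here is a minimal Python sketch that writes the per-camera "world" blocks (rotation in radians, translation in meters) into a JSON file keyed by serial number. The serial numbers and poses below are placeholders, and any other per-camera sections (input type, etc.) are omitted; the authoritative format is the one in the Fusion documentation linked above.

```python
import json
import math

# Hypothetical serial numbers and poses; replace with your real values.
cameras = {
    "12345678": {"translation": [0.0, 1.5, 0.0],        # meters: x, y (height), z
                 "rotation":    [0.0, 0.0, 0.0]},       # radians
    "87654321": {"translation": [0.0, 1.5, 6.0],
                 "rotation":    [0.0, math.pi, 0.0]},   # yawed 180 deg, facing back
}

config = {}
for serial, pose in cameras.items():
    config[serial] = {
        # Only the "world" block shown in this thread is written here;
        # copy the remaining sections from a ZED360-generated file or the docs.
        "world": {
            "rotation": pose["rotation"],
            "translation": pose["translation"],
        }
    }

with open("calibration.json", "w") as f:
    json.dump(config, f, indent=4)
```

This only automates the bookkeeping; the hard part, as noted above, is measuring the poses accurately.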

Once you have your calibration file (and, if I understood correctly, you have it), you don't have to use ZED360 anymore; the tool is mainly there for the calibration. You can use your calibration file with our samples https://github.com/stereolabs/zed-sdk/tree/master/body%20tracking/multi-camera
… or with your own program, of course.

Then, about the screens: the NEURAL depth mode in particular will probably behave strangely in front of a TV. ULTRA, however, should be fine; it will just see a good old wall.

Hello, thank you for replying.

I'm attaching the calibration file that worked for me in my development space (the physical location).
To clarify: the space where I developed the application has completely different dimensions from the final space where the project was installed. The final space is much larger.

For that reason I needed to recalibrate, and I needed the ZED360 app to do it. It is also why I needed to understand whether there was some way to edit the file by adjusting certain values; the question was, which values? In the end I tried and tried until I managed to adjust the final file and it worked, although not 100% as it should. This is why I need to find a way to fix this for future developments.

test.json (1.1 KB)

This part of the file

        "world": {
            "rotation": [ // orientation of the camera in radians
                0,
                0,
                0
            ],
            "translation": [ // position of the camera in meters
                0,
                0,
                0
            ]
        }

contains the rotation and translation of the camera. You can absolutely write them yourself, but it requires high accuracy.
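Since hand-edited values are easy to get wrong (for instance, entering degrees where radians are expected), here is a small sanity check one could run on each camera entry before using the file. The bounds chosen are assumptions, not part of the documented format.

```python
import math

def check_world(entry):
    """Return a list of problems found in one camera's "world" block."""
    problems = []
    world = entry.get("world", {})
    for key in ("rotation", "translation"):
        values = world.get(key)
        if not (isinstance(values, list) and len(values) == 3):
            problems.append(f"{key} must be a list of 3 numbers")
            continue
        # Heuristic: any rotation component beyond 2*pi was probably degrees.
        if key == "rotation" and any(abs(v) > 2 * math.pi for v in values):
            problems.append("rotation values look too large to be radians")
    return problems

good = {"world": {"rotation": [0, math.pi, 0], "translation": [0, 1.5, 6]}}
bad = {"world": {"rotation": [0, 180, 0], "translation": [0, 1.5, 6]}}
print(check_world(good))  # []
print(check_world(bad))   # flags degrees passed where radians are expected
```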

Hello,

Yes, I understand this section, but we need more information about how to obtain the real values. For example: the distance between cameras, the height of each camera from the floor, and additional data for each camera (rotation, inclination, etc.). On top of all that, how can I set the initial rotation so that each camera faces into the space instead of back to back? That has been the big problem. Maybe a solution would be letting me enter some of this data manually.

Greetings !

These values define a reference frame in which all the cameras are positioned and oriented. For example, if you put one camera at (0, 0, 0), it will be at the center. The units are meters and radians. It is explained here: Unauthorized Access
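To illustrate the reference-frame idea, and the "facing into the space" question above, here is a sketch that computes the yaw (in radians) a camera would need in order to point at the center of the area from its position. The axis convention used here (yaw around the vertical Y axis, camera looking along +Z at zero rotation) is an assumption; verify it against the coordinate system you configure in the SDK.

```python
import math

def yaw_toward(camera_xz, target_xz):
    """Yaw in radians, around the vertical axis, that points a camera at
    target_xz, assuming zero yaw looks along +Z. The axis convention is an
    assumption -- check it against your SDK coordinate-system setting."""
    dx = target_xz[0] - camera_xz[0]
    dz = target_xz[1] - camera_xz[1]
    return math.atan2(dx, dz)

# Two cameras at the ends of a 6 m corridor, both aimed at its center:
center = (0.0, 3.0)
print(yaw_toward((0.0, 0.0), center))  # 0.0     -> already facing +Z
print(yaw_toward((0.0, 6.0), center))  # ~3.1416 -> turned 180 degrees
```

With this kind of helper, the second camera in a face-to-face corridor setup naturally gets a yaw of pi rather than 0, which avoids the "back to back" orientation described above.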

Hello again and thank for the information.

I have downloaded everything again for version 4.0.5 (zed-livelink and the ZED SDK), and I am hitting another of the issues from the end of the previous project: although ZED360 detected the camera(s) and let me run the calibration, when I saved the file it was never created, and, stranger still, it did not even give me an error.

This is really disappointing because I am now starting a new project and I cannot generate the calibration file.

At the very least, if it won't create the file, it could show me the contents in a window so that I could copy and paste them into a file myself, but I have no workaround at all.

Thanks.

ZED360 really needs more error handling. In the meantime, you can verify:

  • Were you trying to calibrate more than one camera? Calibrating only one will fail.
  • Did the cameras see only one person during the whole process?

I'm testing with two cameras.

Supposedly I can use up to four cameras, but the most I have tried is two. The strangest and most serious error is that although it lets me get to the end of the calibration, when it comes time to write the file, the process does not complete. It simply does not create the file.

Days ago (before delivering the previous project) the ZED360 app worked correctly. After many tests I managed to find a way to calibrate without problems, until one day it stopped working: it didn't calibrate properly, and it didn't save/create the calibration file either.

Today I have tried several versions of the SDK (from 4.0.2 to 4.0.5) and different computers (including two computers where we had never used the ZED360 app), and it no longer works.

As I was saying, this is now becoming serious, as I have no way to calibrate new spaces.

Thank you.

Hi, we'll fix these bugs and improve ZED360 quite soon. If you're interested, I can send you an EA installer when I have one.

However, we did not remove old SDK versions, so if you had one version that worked, it should still work. Make sure the cameras' fields of view overlap and that only one person walks through the space for a minute.

Excellent, I'll be waiting for this update… thanks!

Hi, I understand everything you are telling me, but, as I said previously, I now can't generate the calibration file because the ZED360 app won't write it to disk. That is the strangest part.

I hope you can help me with this problem.

Thanks

Hi, as I explained above, ZED360 does not let me create the calibration file. Although it gives me the option to save/create it, in the end the file is never created.

Do you have any way to show the information that would be saved to the file, so that I can at least create the file manually?

I seriously need to fix this problem, as I have a project coming up in a few days and still can't get this information out of the ZED360 app. I need the file in order to move forward with this development.

Is there another way to create this file?

Thank you.

Hi,

An example of the file, along with all the documentation you need, is here: Fusion | Stereolabs
You'll just need to set the positions and rotations yourself. If ZED360 did not write the file, it is highly probable that it could not compute those positions and orientations; it's not just a matter of writing the file.

I did not ask: are the cameras overlapping correctly? I mean, do they see the person at the same time? Could you describe your room setup a little?

Clearly, this is not a problem with the cameras, and even less so with camera overlap. As I mentioned in previous messages, the ZED360 tool used to work for me, but at a certain point it stopped working. What's more, the tool now fails on every computer, even on computers where it had never been used before. I'll also add that I have tried several versions of the ZED SDK (from 4.0.2 to 4.0.5), and none of these tests worked. I should mention, too, that I have tested with eight cameras, and with all of them I get the same bad result.

Regarding the example calibration file: how can I get a very accurate result when calibrating at least two cameras entirely by hand? I don't think calculating all the values is that simple. How do I calculate the world rotation or the world translation (the latter looks easier), and how do I keep them consistent across at least two cameras?

When I try to generate the calibration file, it doesn't work even with only one camera. How is it possible that the ZED360 tool suddenly stopped working, and on all computers? Is this a problem with the cameras? Then why do the cameras work with the other tools (ZED Explorer, etc.)?

Is it a problem with the firmware version (Camera: 1532, IMU: 778)?

It's most probably a problem with ZED360; as I said, it lacks robustness right now. It requires all senders and receivers to be on matching SDK versions and to sustain a good FPS, and it does not print the right error messages.

About manually building the file: those values are the camera's pose in the world, as if you were setting the coordinates of each camera yourself. A world position of (0, 0, 0) is the center; a world position of (1, 0, 0) is one meter away. The same goes for rotations.
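Following that logic, re-targeting an existing calibration file to a new room could look like the sketch below: load the JSON, overwrite each camera's "world" block with poses measured in the new space, and leave everything else untouched. The serial numbers and poses are placeholders, and only the "world" block shown earlier in this thread is modified.

```python
import math

# Poses measured in the new room (meters / radians); serials are placeholders.
new_poses = {
    "12345678": {"translation": [0.0, 1.4, 0.0], "rotation": [0.0, 0.0, 0.0]},
    "87654321": {"translation": [0.0, 1.4, 8.0], "rotation": [0.0, math.pi, 0.0]},
}

def retarget(config, poses):
    """Overwrite the "world" block of each listed camera; keep the rest intact."""
    for serial, pose in poses.items():
        if serial in config:
            config[serial].setdefault("world", {}).update(pose)
    return config

# A stand-in for a previously generated calibration (only one camera present):
old = {"12345678": {"world": {"rotation": [0, 0, 0], "translation": [0, 1.5, 0]}}}
patched = retarget(old, new_poses)
print(patched["12345678"]["world"]["translation"])  # [0.0, 1.4, 0.0]
```

This is essentially a cleaner version of the manual file editing described earlier in the thread; the accuracy still depends entirely on how well the new poses are measured.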

My remaining doubt concerns calibrating two or more cameras. I understand the world position of a single camera, but with two, don't they need to be calibrated against each other so that they share the same reference and agree on movements in the real working area?

Now, regarding the ZED360 application: it is working again. I have managed to calibrate two cameras, but sometimes, even though the cameras are physically facing each other, one of them ends up oriented backwards, which makes the same person's skeleton appear twice, in different places. Because of this, it is not possible to calibrate the person's location. Here is an image of what happens:

Another thing I have noticed is that fine movements of the hands and arms are not very accurate with Fusion; in fact, they are quite far from accurate. So, for the idea of a virtual touch wall (particles with Niagara) projected onto a physical wall, the sensation of touching the wall and triggering interactivity is unconvincing, because the arm/hand tracking is not precise enough.

I hope all this information is well received and that you can use this feedback to improve the ZED360 application, because otherwise it is quite complex (even done manually) to use multiple cameras to achieve better interactive results.

Thank you.

More Info.

Something is happening and I don't know if it is the new ZED360 (SDK 4.0.5) or something else, but when I walk along the X axis (the floor in the real world), the Manny avatar moves along the Z axis (up and down) and also does not follow the person's movements, i.e. it always stays in the T-pose.
image

I have a question: do you have any visual reference (an example image) of how the overlap between two cameras should look in order to run the calibration? I think this may be important.

Thank you.

Thank you for reporting all this. Hand tracking will be improved in the future with the (re)introduction of BODY_70, but it's not perfect right now.
About your calibration: even though you have a calibration file, in the end it seems wrong, since the cameras are not in the right positions and orientations. From our latest tests, the frame rate of senders and receivers is really important; make sure you reach the maximum you can.