I am using a Visionary-T Mini camera. After running the continuous streaming samples in both C++ and Python, I've noticed that the point cloud they save is not the same as the one generated by SOPAS: the X, Y, and Z coordinates differ for the same point, by a few centimeters in some cases. When I measure the real distances from the center of the sensor, SOPAS appears to be accurate. Do I need to perform any extra steps with the API to get the same point cloud as SOPAS? I am planning to use the camera in a robotics application where accurate measurements are crucial.
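For reference, my understanding is that the samples convert the radial distance map to Cartesian coordinates with a pinhole model plus radial distortion, roughly like the sketch below (this is my own reimplementation, not the vendor code; the parameter names `cx`, `cy`, `fx`, `fy`, `k1`, `k2`, and `f2rc` are assumed to come from the camera-parameters block of the frame):

```python
import numpy as np

def distance_to_point_cloud(dist_map, cx, cy, fx, fy,
                            k1=0.0, k2=0.0, f2rc=0.0):
    """Convert a radial distance map (same units as f2rc) to camera-frame
    XYZ points. A sketch of the pinhole + two-term radial-distortion model
    I believe the samples use; not the vendor implementation."""
    rows, cols = dist_map.shape
    c, r = np.meshgrid(np.arange(cols), np.arange(rows))
    # Normalized image-plane coordinates of each pixel.
    xp = (cx - c) / fx
    yp = (cy - r) / fy
    # Radial lens-distortion correction.
    r2 = xp * xp + yp * yp
    k = 1.0 + k1 * r2 + k2 * r2 * r2
    xd = xp * k
    yd = yp * k
    # Scale the unit viewing ray by the measured radial distance.
    s0 = 1.0 / np.sqrt(xd * xd + yd * yd + 1.0)
    x = xd * s0 * dist_map
    y = yd * s0 * dist_map
    # Shift from the focal point to the device reference point.
    z = s0 * dist_map - f2rc
    return np.stack((x, y, z), axis=-1)
```

If the transformation SOPAS applies differs from this (for example, an additional camera-to-world transform or a different reference point), that could explain the centimeter-level offsets I'm seeing.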