orbbec / pyorbbecsdk

OrbbecSDK python binding
https://orbbec.github.io/pyorbbecsdk/
Apache License 2.0

Frame Timestamp Mismatch in Bag Playback (Color Frame Offset by One Frame) #71

Open CBadam opened 1 month ago

CBadam commented 1 month ago

Description: I am using the pyorbbecsdk with a Femto Mega camera to record and play back data. The issue arises when recording data with recorder.py (with both depth and color sensors enabled) into a bag file and then playing it back using the playback.py example. I observe that the color frame and depth frame in each frameset do not share the same timestamp during playback.

It appears that the color frame corresponds to the depth frame of the next frameset, leading to mismatches between depth and color frames. This becomes a significant problem when trying to save colored point clouds, as the color and depth data from different frames do not align correctly, resulting in inaccurate visualizations.

Below is an example of how I am testing playback with playback.py:

from pyorbbecsdk import *

def playback_state_callback(state):
    if state == OBMediaState.OB_MEDIA_BEGIN:
        print("Bag player playing")
    elif state == OBMediaState.OB_MEDIA_END:
        print("Bag player stopped")
    elif state == OBMediaState.OB_MEDIA_PAUSED:
        print("Bag player paused")

ctx = Context()
ctx.set_logger_level(OBLogLevel.NONE)
pipeline = Pipeline("test4.bag")
playback = pipeline.get_playback()
playback.set_playback_state_callback(playback_state_callback)
device_info = playback.get_device_info()
camera_param = pipeline.get_camera_param()
pipeline.start()

index = 0
while True:
    frames = pipeline.wait_for_frames(1000)
    if frames is None:
        break

    depth_frame = frames.get_depth_frame()
    color_frame = frames.get_color_frame()

    if depth_frame is None or color_frame is None:
        continue

    print(f"index= {index}: color_timestamp={color_frame.get_timestamp()}")
    print(f"index= {index}: depth_timestamp={depth_frame.get_timestamp()}")

    index += 1

pipeline.stop()

Output Example:

index= 0: color_timestamp=2193546
index= 0: depth_timestamp=2193345
index= 1: color_timestamp=2193747
index= 1: depth_timestamp=2193546

As you can see, there is always a one-frame offset between the color and depth timestamps: each color timestamp matches the depth timestamp of the following frameset, so a color frame never lines up with the depth frame it is delivered with.
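To verify that the offset is exactly one frameset, the print loop in the playback script above can be replaced with a check that compares the timestamps shifted by one index. A minimal sketch (it reuses pipeline from the script above):

# Collect timestamps during playback, then check whether each color timestamp
# matches the depth timestamp of the following frameset.
depth_ts, color_ts = [], []

while True:
    frames = pipeline.wait_for_frames(1000)
    if frames is None:
        break
    depth_frame = frames.get_depth_frame()
    color_frame = frames.get_color_frame()
    if depth_frame is None or color_frame is None:
        continue
    depth_ts.append(depth_frame.get_timestamp())
    color_ts.append(color_frame.get_timestamp())

matches = sum(c == d for c, d in zip(color_ts[:-1], depth_ts[1:]))
print(f"color[i] == depth[i+1] for {matches} of {len(color_ts) - 1} consecutive framesets")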

Here is the code for how I recorded the data:

import cv2
import numpy as np
from pyorbbecsdk import *

ESC_KEY = 27

def main():
    ctx = Context()
    ip="192.168.1.10"
    port=8090
    device = ctx.create_net_device(ip, port)
    pipeline = Pipeline(device)
    config = Config()

    profile_list = pipeline.get_stream_profile_list(OBSensorType.DEPTH_SENSOR)
    profile = profile_list.get_default_video_stream_profile()
    config.enable_stream(profile)

    profile_list = pipeline.get_stream_profile_list(OBSensorType.COLOR_SENSOR)
    profile = profile_list.get_default_video_stream_profile()
    config.enable_stream(profile)

    pipeline.enable_frame_sync()
    pipeline.start(config)
    pipeline.start_recording("./test4.bag")

    index = 0
    while index < 20:
        frames = pipeline.wait_for_frames(1000)
        if frames is None:
            continue

        depth_frame = frames.get_depth_frame()
        color_frame = frames.get_color_frame()

        if depth_frame is None or color_frame is None:
            continue

        print(f"index= {index}: depth_timestamp={depth_frame.get_timestamp()}")
        print(f"index= {index}: color_timestamp={color_frame.get_timestamp()}")

        index += 1

    pipeline.stop_recording()
    pipeline.stop()

if __name__ == "__main__":
    main()

Additional Notes:

When I activate the infrared (IR) sensor alongside depth and color, the IR frame inherits the one-frame delay, and the depth and color frames align with each other. The issue seems to arise only during playback, as the timestamps match correctly during recording. I would greatly appreciate any insights or solutions to ensure that depth and color frames are synchronized during playback, allowing for proper point cloud generation.

Thank you!

CBadam commented 1 month ago

Actually, I did more tests, and I found that the last stream I enable in the recorder is the one whose timestamps do not match the other streams. If I do this in the recorder:

config.enable_stream(ir_profile)
config.enable_stream(depth_profile)
config.enable_stream(color_profile)

I get this in the playback:

index= 0: color_timestamp=1726842583477
index= 0: depth_timestamp=1726842583410
index= 0: ir_timestamp=1726842583410
index= 1: color_timestamp=1726842583543
index= 1: depth_timestamp=1726842583477
index= 1: ir_timestamp=1726842583477

zhonghong322 commented 1 month ago

I will reproduce and test the issue based on your feedback next week.

zhonghong322 commented 1 month ago

@CBadam This is a bug in the Orbbec SDK C++ library. Frame synchronization is not supported during playback. We will fix this issue in a future C++ version. After the fix is applied, you will be able to enable frame synchronization in your playback code to resolve the issue.

pipeline.enable_frame_sync()  # add this code
pipeline.start()
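In the context of the playback script from this issue, the change sits right before starting the pipeline (a sketch based on that script; it only takes effect once the fixed C++ library is released):

from pyorbbecsdk import *

pipeline = Pipeline("test4.bag")
playback = pipeline.get_playback()
playback.set_playback_state_callback(playback_state_callback)

pipeline.enable_frame_sync()  # frame sync during playback, available after the fix
pipeline.start()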

zhonghong322 commented 1 month ago

After the bug is fixed, I will notify you in this issue.

CBadam commented 1 month ago

@zhonghong322 Thank you for your response. I will be waiting for the bug fix. In the meantime, is there another way to generate a colored point cloud (a function I can pass the depth frame and the color frame to)? I need to generate point clouds, but since I am reading from a bag there is a mismatch between the color and depth timestamps, so frames.get_color_point_cloud(camera_param) generates a bad colored point cloud.
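One interim idea I am considering is to pair each depth frame with the color frame buffered from the previous frameset (their timestamps match in my recordings) and build the colored point cloud manually with numpy. A rough, untested sketch of that idea, reusing pipeline and camera_param from my playback script above: decode_color is a placeholder for whatever decoding the color format needs, I am assuming camera_param.depth_intrinsic exposes fx/fy/cx/cy, and the depth-to-color extrinsic is ignored, so it is only an approximation unless the two streams are aligned:

import numpy as np
from pyorbbecsdk import *

def colored_point_cloud(depth_frame, color_image, intrinsic, depth_scale):
    # Rough colored point cloud from a depth frame and an H x W x 3 color image
    # already decoded and resized to the depth resolution.
    h, w = depth_frame.get_height(), depth_frame.get_width()
    depth = np.frombuffer(depth_frame.get_data(), dtype=np.uint16).reshape(h, w)
    z = depth.astype(np.float32) * depth_scale  # depth in millimeters

    # Back-project every pixel with the pinhole model of the depth intrinsics
    # (assumption: intrinsic has fx, fy, cx, cy attributes).
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - intrinsic.cx) * z / intrinsic.fx
    y = (v - intrinsic.cy) * z / intrinsic.fy

    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    colors = color_image.reshape(-1, 3)
    valid = points[:, 2] > 0  # drop pixels with no depth
    return points[valid], colors[valid]

prev_color = None
while True:
    frames = pipeline.wait_for_frames(1000)
    if frames is None:
        break
    depth_frame = frames.get_depth_frame()
    color_frame = frames.get_color_frame()
    if depth_frame is None or color_frame is None:
        continue

    # The color frame matching this depth frame arrived in the previous frameset.
    if prev_color is not None and prev_color.get_timestamp() == depth_frame.get_timestamp():
        color_image = decode_color(prev_color)  # placeholder: decode to H x W x 3 RGB at depth resolution
        points, colors = colored_point_cloud(
            depth_frame, color_image,
            camera_param.depth_intrinsic,  # assumption: OBCameraParam exposes depth_intrinsic
            depth_frame.get_depth_scale())
        # ... save points/colors to a .ply file or visualize them
    prev_color = color_frame

Would something along these lines be a reasonable stopgap until the fix is released?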