Closed nchu-cha closed 3 years ago
Hi @nchu-cha Thanks very much for your questions.
1 and 2. The files that you mention are part of the MATLAB wrapper's operating files and are not meant to be run as example programs. The scripts in the wrapper's file list that can be run as examples have the word example in their name.
https://github.com/IntelRealSense/librealsense/tree/master/wrappers/matlab
These include:
advanced_mode_example.m, capture_example.m, depth_example.m, depth_view_example.m, pointcloud_example.m, rosbag_example.m
If you wish to perform alignment in the MATLAB wrapper, the script created by a RealSense user in https://github.com/IntelRealSense/librealsense/issues/3735 provides an example of doing so.
3 and 4. https://github.com/IntelRealSense/librealsense/issues/3735 also demonstrates defining custom stream configurations with cfg instructions and obtaining intrinsics when alignment is being used.
Thanks for your reply. I tried the approach in #3735, but I have some questions.
When I enter cfg = realsense.config(); it couldn't connect to the L515; it shows a config with no properties. When I change to using context instead, it can connect to the L515 camera. Can anyone tell me what the difference is between context and config?
After confirming the connection, I tried cfg.enable_stream(realsense.stream.depth,1280,720,realsense.format.z16); from #3735, but it shows "Dot indexing is not supported for variables of this type." Does that mean the L515 camera's resolution can't be changed from MATLAB?
Because I need to change my depth resolution to 1920x1080 and my RGB resolution to 1920x1080, I tried depth_img = permute(reshape(depth_data',[3,depth_color.get_width(),depth_color.get_height()]),[3 2 1]); and then called depth_color.get_width(1920) and depth_color.get_height(1080), which shows "too many input arguments". I think I need to convert the image resolution first. (When I use the Viewer, the depth resolution only goes up to 1024x768. Can I use MATLAB to set the depth resolution to 1920x1080?)
I found that there are many kinds of "Constructor" and "Destructor" functions in the RealSense MATLAB files. How can I know which function I need to use?
thank you
2 and 3. The L515 does not support 1280x720 depth (the reference script was designed for a RealSense 400 Series camera). L515 supports a maximum 1024x768 resolution for depth. RGB supports 1920x1080 though.
Thanks for your reply. I can now change the resolution of the depth image successfully.
I need the depth information. When using the Viewer it produces a RAW file; how can I get the RAW file from MATLAB?
I want to try rosbag_example, but it gave an error. Does that mean I need to add some parameters to it?
thank you
A .raw file would not work with the rosbag_example program. It is expecting a bag format file, which is like a video recording of camera data instead of a static image.
In the Viewer you can create a bag file by enabling streams in the RealSense Viewer and then left-clicking on the circular Record option beneath the camera's name in the Viewer's options side-panel. The Viewer will then record into the bag all streams that were enabled at the time that the Record option was activated.
Clicking on the Record option again should stop the recording and you will then have a bag file that you can load into programs such as rosbag_example.
You can also obtain pre-made sample bag files from the link below.
https://github.com/IntelRealSense/librealsense/blob/master/doc/sample-data.md
Can I use MATLAB code to save a bag file? I want to do everything in MATLAB.
https://github.com/IntelRealSense/librealsense/issues/6057 discusses creating a script in the MATLAB wrapper to record a bag file.
I used the method in #6057, as below, to create a bag file:
filepath = fullfile('D:\matlab_code\bag_file','testrecord.bag');
cfg.enable_record_to_file(filepath);
Then I want to read the information in the bag file, but it can't open the file. Can anyone help me solve the problem?
bag = rosbag("D:\matlab_code\bag_file\testrecord.bag")
--> I entered the command like this and got an error.
this is my MATLAB code:
clc; clear all; close all;
filepath = fullfile('D:\matlab_code\bag_file','testrecord.bag');
% set resolution
cfg = realsense.config();
cfg.enable_stream(realsense.stream.depth,1024,768,realsense.format.z16,30);
% cfg.enable_stream(realsense.stream.color,1920,1080,realsense.format.rgb8,30);
cfg.enable_record_to_file(filepath);
% Make Pipeline object to manage streaming
pipe = realsense.pipeline();
% Make Colorizer object to prettify depth output
colorizer = realsense.colorizer();
% Start streaming on an arbitrary camera with default settings
profile = pipe.start(cfg);
% Get streaming device's name
dev = profile.get_device();
name = dev.get_info(realsense.camera_info.name);
% Get frames. We discard the first couple to allow
% the camera time to settle
for i = 1:5
    fs = pipe.wait_for_frames();
end
% Stop streaming
pipe.stop();
% Select depth frame
depth = fs.get_depth_frame();
% Colorize depth frame
color = colorizer.colorize(depth);
% Get actual data and convert into a format imshow can use
% (Color data arrives as [R, G, B, R, G, B, ...] vector)
data = color.get_data();
img = permute(reshape(data',[3,color.get_width(),color.get_height()]),[3 2 1]);
% Display image
imshow(img);
title(sprintf("Colorized depth frame from %s", name));
bag = rosbag("D:\matlab_code\bag_file\testrecord.bag");
Reading a bag file in the MATLAB wrapper involves a different process to recording one with enable_record_to_file. It is done with cfg.enable_device_from_file, as demonstrated in the rosbag_example.m script.
https://github.com/IntelRealSense/librealsense/blob/master/wrappers/matlab/rosbag_example.m#L6
After stopping the pipeline with pipe.stop(), you should be able to change to a different cfg configuration than the cfg.enable_record_to_file one that the script started with (i.e set the cfg.enable_device_from_file instruction instead).
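To make that concrete, here is a minimal playback sketch assuming a bag file has already been recorded at the path used earlier in this thread (substitute your own path):

```matlab
% Minimal playback sketch for the MATLAB wrapper.
% Assumes 'testrecord.bag' was previously recorded with enable_record_to_file.
cfg = realsense.config();
cfg.enable_device_from_file('D:\matlab_code\bag_file\testrecord.bag');
pipe = realsense.pipeline();
profile = pipe.start(cfg);     % the pipeline now streams from the file
fs = pipe.wait_for_frames();   % frames are delivered from the recording
depth = fs.get_depth_frame();
pipe.stop();
```

No camera needs to be attached for playback; the file acts as the device.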
Does that mean that if I need a bag file I should use cfg.enable_record_to_file() to create it, and that after pipe.stop() I can use cfg.enable_device_from_file() to open my bag file?
I added the cfg.enable_device_from_file() instruction after pipe.stop(), but my MATLAB crashes.
I found that if I use bag = rosbag(filename); it can open a bag file that already exists, but it can't read the information immediately. (cfg.enable_record_to_file and rosbag can't use the same file.)
I looked at the information in the bag file, but I didn't see the distance in it. Where can I get the depth distance?
Yes, use cfg.enable_record_to_file() to create the bag and cfg.enable_device_from_file() to play it back.
To obtain a complete, readable bag, the pipeline needs to be stopped so that the bag is indexed upon closure of the recording process.
The SDK's C++ reference about record and playback recommends updating the device after closing recording to signify that a new pipeline is being used where playback will take place instead of record.
pipe->start(); // Resume streaming with default configuration
device = pipe->get_active_profile().get_device();
In regard to getting the distance, https://github.com/IntelRealSense/librealsense/issues/6612#issuecomment-645229665 provides references for doing this in the MATLAB wrapper using depth.get_distance
I looked at #6057, but I still don't know how to solve the problem. Can you help me?
this is my code:
filepath = fullfile('D:\matlab_code\bag_file','20120919_test3.bag');
% set resolution
cfg = realsense.config();
cfg.enable_stream(realsense.stream.depth,1024,768,realsense.format.z16,30);
% cfg.enable_stream(realsense.stream.color,1920,1080,realsense.format.rgb8,30);
cfg.enable_record_to_file(filepath);
% Make Pipeline object to manage streaming
pipe = realsense.pipeline();
% Make Colorizer object to prettify depth output
colorizer = realsense.colorizer();
% Start streaming on an arbitrary camera with default settings
profile = pipe.start(cfg);
% Get streaming device's name
dev = profile.get_device();
name = dev.get_info(realsense.camera_info.name);
% Get frames. We discard the first couple to allow
% the camera time to settle
for i = 1:5
    fs = pipe.wait_for_frames();
end
% Stop streaming
pipe.stop();
cfg.enable_device_from_file("D:\matlab_code\bag_file\20120919_test3.bag");
% Select depth frame
depth = fs.get_depth_frame();
% Colorize depth frame
color = colorizer.colorize(depth);
% Get actual data and convert into a format imshow can use
% (Color data arrives as [R, G, B, R, G, B, ...] vector)
data = color.get_data();
img = permute(reshape(data',[3,color.get_width(),color.get_height()]),[3 2 1]);
% Display image
imshow(img);
title(sprintf("Colorized depth frame from %s", name));
pipe->stop(); // Stop the pipeline that holds the file and the recorder
pipe = std::make_shared<rs2::pipeline>(); //Reset the shared pointer with a new pipeline
My MATLAB wrapper programming knowledge is limited though, so I do not know what the wrapper equivalent of pipe = std::make_shared<rs2::pipeline>();
would be, unfortunately.
There was a recent discussion that talked about doing simultaneous streaming and recording using multithreading. I pointed the RealSense user in that case to the C++ multithreading script at https://github.com/IntelRealSense/librealsense/issues/6865
Multithreading is not a simple subject though and I do not have knowledge about how it could be used in a MATLAB wrapper script.
The information I currently need is: 1. the RGB image, 2. the depth image, 3. the depth distance (in the Viewer this is a RAW file). I need these three pieces of information when I capture a photo from the L515 camera.
I need a bag file because the RGB and DEPTH images can't be aligned together, and the methods I found all said that the parameters in the bag file are required, so I tried to get a bag file and read the parameters from it.
Now I can capture the depth image and get the bag file, so I started capturing the RGB image to compare it with the depth image. Is there an example for capturing the RGB image?
All stream types that have been defined with cfg instructions should be captured in the bag file if they are enabled at the time that recording begins.
I researched your situation further and found a MATLAB script contributed by a RealSense user in https://github.com/IntelRealSense/librealsense/issues/6763#issuecomment-657217234 that aligns depth and color in real-time and then generates a point cloud. Adapting this script may avoid the need to record a bag file if you only needed a bag for the purpose of performing alignment.
I can capture the RGB image and depth image at the same time. Now I need the distance information for the whole depth image, so I tried the get_distance(this, x, y) instruction, but it didn't work and gave an error.
After getting the depth frame I try to get the depth distance. Is that the correct instruction for obtaining the distance information?
this is my code:
clc; clear all; close all;
tic
filepath = fullfile('D:\matlab_code\bag_file','20120919_test3.bag');
cfg = realsense.config();
% set resolution for RGB & DEPTH
cfg.enable_stream(realsense.stream.color,1920,1080,realsense.format.rgb8,30);
cfg.enable_stream(realsense.stream.depth,1024,768,realsense.format.z16,30);
cfg.enable_record_to_file(filepath); % change the filename later
% Make Pipeline object to manage streaming
pipe = realsense.pipeline();
% Make Colorizer object to prettify depth output
colorizer = realsense.colorizer();
% Start streaming on an arbitrary camera with default settings
profile = pipe.start(cfg);
% Get streaming device's name
dev = profile.get_device();
name = dev.get_info(realsense.camera_info.name);
% Get frames. We discard the first couple to allow
% the camera time to settle
for i = 1:5
    fs = pipe.wait_for_frames();
end
% Stop streaming
pipe.stop();
% Select color & depth frame
RGB_frame = fs.get_color_frame();
depth_frame = fs.get_depth_frame();
for y = 1:1024
for x = 1:768
depth_dis = depth_frame.get_distance(x,y); % ******
end
end
% Colorize depth frame
depth_color = colorizer.colorize(depth_frame);
% Get actual data and convert into a format imshow can use
% (Color data arrives as [R, G, B, R, G, B, ...] vector)
% depth_frame (colorized depth image data)
depth_data = depth_color.get_data();
depth_img = permute(reshape(depth_data',[3,depth_color.get_width(),depth_color.get_height()]),[3 2 1]);
% color_frame
RGB_data = RGB_frame.get_data();
RGB_img = permute(reshape(RGB_data',[3,RGB_frame.get_width(),RGB_frame.get_height()]),[3 2 1]);
% Display image
figure;imshow(depth_img);title(sprintf("Colorized depth frame from %s", name));
figure;imshow(RGB_img);title(sprintf("Colorized RGB frame from %s", name));
toc
What happens if you remove the % part of the line?
depth_dis = depth_frame.get_distance(x,y);
I got an error that says out of range for argument "y". I'm thinking the reason for the error is that the get_distance function takes its arguments as int64(x), int64(y); is the unit different?
If you cannot make get_distance work, the real-world distance in meters can alternatively be achieved by multiplying the 16-bit uint16_t raw depth pixel value by the depth unit scale of the RealSense camera model being used (which is 0.001 meters by default on 400 Series cameras and 0.000250 on L515).
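As a hedged sketch of that alternative (assuming depth_frame and dev exist as in the earlier scripts, and that get_data returns the raw depth values; if the wrapper returns bytes instead, a typecast to uint16 would be needed first):

```matlab
% Sketch: convert raw z16 depth values to meters using the depth scale.
depthSensor = dev.first('depth_sensor');
depthScale = depthSensor.get_depth_scale();   % 0.00025 m per unit on L515
raw = depth_frame.get_data();                 % raw depth pixel values
w = depth_frame.get_width();
h = depth_frame.get_height();
depth_m = double(reshape(raw,[w,h]))' * depthScale;  % h-by-w matrix of meters
```

This vectorized conversion also avoids the per-pixel get_distance loop entirely.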
If the SDK's Align processing block (align_to) is used then the SDK automatically adjusts the aligned image if depth and RGB are different resolutions.
Sorry that I can't describe my problem clearly.
Can MATLAB directly generate the RAW file?
I found the reason why it can't work. In get_distance, the x and y arguments are int64, and the distance can only be obtained for x: 1 to 1023, y: 1 to 767. That means I lose one column and one row of my distance data; how can I solve this problem? (My depth image resolution is 1024x768.)
In #6763 (comment) a point cloud player is created; can the L515 camera create one too? Since the L515 is a LiDAR camera, can a LiDAR create a point cloud player?
thank you
https://github.com/IntelRealSense/librealsense/issues/1485 discusses in detail how to write depth data in binary format, including how to approach the problem of doing so in the RealSense MATLAB wrapper. For example, one RealSense user in that discussion in https://github.com/IntelRealSense/librealsense/issues/1485#issuecomment-380487211 wrote a script for saving the data as a .txt file that MATLAB could read.
If you want to go even more directly than the MATLAB wrapper, the function imwrite in MATLAB itself offers the possibility of saving data in binary raw format, though it may not be the same as the .raw files that the RealSense SDK generates.
https://uk.mathworks.com/help/matlab/ref/imwrite.html
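As an illustration of saving from MATLAB itself (a generic binary dump, not necessarily the same layout as the SDK's own .raw files; dis_matrix here is assumed to be a distance matrix in meters, such as one filled with get_distance):

```matlab
% Sketch: write a distance matrix to a plain binary file with fwrite,
% or to a lossless 16-bit PNG in millimeters with imwrite.
fid = fopen('depth_distances.raw','w');
fwrite(fid, dis_matrix', 'single');   % transpose so rows are written in order
fclose(fid);
imwrite(uint16(dis_matrix * 1000), 'depth_mm.png');  % meters to millimeters
```

The PNG route keeps the data viewable in ordinary image tools while remaining lossless.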
In regard to the script with a point cloud player, aside from having to change the stream resolutions to be compatible with L515 I cannot see a reason why the script would not work on an L515.
Sincerely thank you for answering me patiently. I can now successfully align the RGB and DEPTH frames. Below is my program; anyone who needs it can refer to it.
Using get_distance to obtain the whole frame's distance information:
(My depth resolution is 1920x1080. get_distance indexes the depth distance from zero, so to fill dis_matrix I need to start the for loops from zero and add one to the coordinates.)
% get the distance
dis_matrix = zeros(1080,1920);
for y = 0:(depthHeight-1)
for x = 0:(depthWidth-1)
depth_dis = depth_frame.get_distance(x,y);
dis_matrix(y+1,x+1) = depth_dis;
end
end
Align RGB and DEPTH frames
(the lines marked with % ***** are the additions needed if alignment is required):
cfg = realsense.config();
% set resolution for RGB & DEPTH
cfg.enable_stream(realsense.stream.color,1920,1080,realsense.format.rgb8,30);
cfg.enable_stream(realsense.stream.depth,1024,768,realsense.format.z16,30);
pipe = realsense.pipeline();
colorizer = realsense.colorizer();
pcl_obj = realsense.pointcloud();
profile = pipe.start(cfg);
alignto = realsense.stream.color; % *****
alignedFs = realsense.align(alignto); % *****
% Get streaming device's name
dev = profile.get_device();
name = dev.get_info(realsense.camera_info.name);
% Get frames. We discard the first couple to allow
% the camera time to settle
for i = 1:5
fs = pipe.wait_for_frames();
end
% Stop streaming
pipe.stop();
% Select color & depth frame
RGB_frame = fs.get_color_frame();
% depth_frame = fs.get_depth_frame(); % original depth frame
aligned_frames = alignedFs.process(fs); % align image *****
depth_frame = aligned_frames.get_depth_frame(); % after align Depth image *****
% get depth image parameters
depthSensor = dev.first('depth_sensor'); % *****
depthScale = depthSensor.get_depth_scale(); % *****
depthWidth = depth_frame.get_width(); % *****
depthHeight = depth_frame.get_height(); % *****
depth_color = colorizer.colorize(depth_frame);
depth_data = depth_color.get_data();
depth_img = permute(reshape(depth_data',[3,depth_color.get_width(),depth_color.get_height()]),[3 2 1]);
figure;imshow(depth_img);title(sprintf("Colorized depth frame from %s", name));
Thanks so much for sharing your MATLAB wrapper script with the RealSense community, @nchu-cha :)
Can I have a preview before I capture the frame? I want to make sure that the image I capture is the one I want. Could I revise capture_example into a capture function?
thank you
You could have the program wait for a keypress before initiating capture, like in the C++ script at https://github.com/IntelRealSense/librealsense/issues/8007#issuecomment-745518890
In #8007 (comment), can it have a screen showing the live frame? I need to build a dataset for a market, and I hope that no people are in the frame when I capture it, so a viewer showing the live frame would be more convenient.
I'm looking at the code in capture_example and I don't understand the meaning of the function descriptions inside. Is there a more detailed explanation?
For tmProcess(app,~,~), does app mean the GUI in the picture? Thank you
RsPipeLine would be the definition for the pipeline, and RsColorizer would refer to the use of a colorizer component to color-shade the depth data based on the depth values.
I do not know what 'r' represents on the gyro and accel displays in the capture_example.m program, though. It is probably equivalent to 'n' in the gyro and accel readout in the Viewer.
I would assume that draw_motion_data refers to rendering the gyro and accel data on-screen, yes, especially as the draw instructions in the code are labelled as AccelAX / AccelTextAX and GyroAX / GyroTextAX.
I'm not sure but it looks like it may be code that decides what to do if a particular camera stream type is not detected as being supported (for example, if the color stream or the accel and gyro streams are not accessible).
Again, not sure. I would speculate that after checking whether a particular stream type is supported, it updates the screen with drawnow and then releases frames with delete(RsFrameSet) to free up memory capacity.
For the function StartUpFunc(app) in capture_example, does that mean that when I click the Start button it will execute all the instructions in the function?
I want to create a capture button, so I imitated the Stop button:
function onCaptureButton(app,~)
if strcmp(app.tmr.Running, 'on')
stop(app.tmr);
app.RsPipeLine.stop();
end
end
Then I created a Capture function like the one below, but when I click the capture button nothing happens; it doesn't show the frame. I don't know where the problem is. Can anyone help me?
function CaptureUpFunc(app)
%--time--%
now_time = datestr(now,30); %
time_str = [now_time(5:8),'_',now_time(10:15)]; %
filepath = fullfile('D:\matlab_code\dataset\',time_str); %
app.cfg = realsense.config();
app.cfg.enable_stream(realsense.stream.color,1920,1080,realsense.format.rgb8,30);
app.cfg.enable_stream(realsense.stream.depth,1024,768,realsense.format.z16,30);
app.pipe = realsense.pipeline();
app.colorizer = realsense.colorizer();
app.profile = app.pipe.start(app.cfg);
app.dev = app.profile.get_device();
name = app.dev.get_info(realsense.camera_info.name);
for i = 1:5
app.fs = app.pipe.wait_for_frames();
end
app.pipe.stop();
app.RGB_frame = app.fs.get_color_frame();
app.depth_frame = app.fs.get_depth_frame(); % *****
app.depth_color = app.colorizer.colorize(app.depth_frame);
app.depth_data = app.depth_color.get_data();
app.depth_img = permute(reshape(app.depth_data',[3,app.depth_color.get_width(),app.depth_color.get_height()]),[3 2 1]);
figure;imshow(app.depth_img);title(sprintf("Colorized depth frame from %s", name));
imwrite(app.depth_img,[filepath,'_DEPTH_img.png']);
end
thank you
A function is a list of instructions that is jumped to when the function name is called from elsewhere in the script and then the instructions are carried out in the order that they are listed in the function.
For example, if you created a Start button and set that start button to jump to a function named Start() that contained the start-up code, then that code would be run when the start button is clicked on.
Have you defined the new Capture button in the properties (Access = public) list?
https://github.com/IntelRealSense/librealsense/blob/master/wrappers/matlab/capture_example.m#L4
Have you also defined a list of settings for CaptureButton like the ones for the Start and Stop buttons?
Yes, I have defined the capture button and also defined a list of settings for it.
function onCaptureButton(app,~)
if strcmp(app.tmr.Running, 'on')
stop(app.tmr);
app.RsPipeLine.stop();
end
end
Does the function mean that when I click the capture button, if app.tmr.Running is 'on' then the timer will stop and the pipeline will also stop?
That is what your onCaptureButton() function looks like to me. If the button is pressed then the onCaptureButton is jumped to and the stop code currently within that function should cause the pipeline to be stopped. But because of the If logic statement that the stop code is within, the stop will only activate if the If condition's terms are met. In other words, the stop would only activate if the state app.tmr.Running is currently On. If it is Off then the If condition is not met and the stop code nested inside it would not be able to be triggered. So nothing would happen when the onCaptureButton button is pressed.
I changed the onCaptureButton() function to the below; I think that if I want to capture the frame, the pipeline needs to keep running.
function onCaptureButton(app,~)
if strcmp(app.tmr.Running, 'off')
app.RsPipeLine.start();
start(app.tmr);
end
end
And I changed my CaptureUpFunc(app) function too, but it can't save the frame that I want.
function CaptureUpFunc(app)
%--time--%
now_time = datestr(now,30); %
time_str = [now_time(5:8),'_',now_time(10:15)]; %
filepath = fullfile('D:\matlab_code\dataset\',time_str); %
cfg = realsense.config(); %
cfg.enable_stream(realsense.stream.color,1920,1080,realsense.format.rgb8,30); %
cfg.enable_stream(realsense.stream.depth,1024,768,realsense.format.z16,30); %
app.RsPipeLine.start(cfg); %
app.RsPipeLine = realsense.pipeline();
app.RsColorizer = realsense.colorizer();
RsFrameSet = app.RsPipeLine.wait_for_frames();
if app.SupportDepth ~=0
df = RsFrameSet.get_depth_frame();
dc = app.RsColorizer.colorize(df);
dd = dc.get_data();
d = permute(reshape(dd',[3,dc.get_width(),dc.get_height()]),[3 2 1]);
app.hD = imshow(d,'Parent',app.DepthAX);
imwrite(d,[filepath,'_DEPTH_img.png']); %
else
d = zeros(480,640,3,'uint8'); d(:)=240;
app.hD = imshow(d,'Parent',app.DepthAX);
hold(app.DepthAX,'on');
text(app.DepthAX,140,220,'The <Depth> property is not supported','Color','k','FontSize',16);
hold(app.DepthAX,'off');
end
end
I found that if I put the imwrite() call in the StartUpFunc(app) function, it saves the frame before I press the Start button, but I want to save the frame after I press the button. Do you know how to solve this?
Perhaps you could define a bool condition such as 'pause' that can be true or false, and set it to false by default in your script's start function. Then you could enclose wait_for_frames within an 'If pause = false' condition, so that new frames can only arrive if pause = false.
When the capture button is pressed, the onCaptureButton() function's code could set pause to 'true'. As soon as it is set to true, wait_for_frames should no longer be able to deliver new frames. The capture could then be saved and pause set to 'false' again immediately afterwards so that wait_for_frames can resume streaming frames.
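A rough sketch of that pause-flag idea (app.pause and the function bodies below are illustrative assumptions, not identifiers taken from capture_example.m):

```matlab
% Sketch: gate frame delivery with a boolean 'pause' property on the app.
function tmProcess(app,~,~)
    if ~app.pause                      % fetch frames only while not paused
        app.RsFrameSet = app.RsPipeLine.wait_for_frames();
        % ... render the frames to the preview axes here ...
    end
end

function onCaptureButton(app,~)
    app.pause = true;                  % freeze frame delivery
    % ... save the last-received frame set (e.g. with imwrite) ...
    app.pause = false;                 % resume the live preview
end
```

The flag lets the timer keep running, so the preview resumes without restarting the pipeline.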
Is @CaptureUpFunc definitely outside of the other functions (for example, not accidentally nested within @StartUpFunc)? You could make sure of this by finding the first line of a function and then pasting the code for @CaptureUpFunc directly above that line.
I changed to another way to solve the preview-screen problem: I use the RealSense Viewer to make sure the frame is what I want, then I close the Viewer and run the MATLAB capture code. But I found that if I open the RGB camera first, it shows no frames received, whereas if I turn on the depth sensor first and then the RGB camera it works normally. Do you know why the L515 camera acts like that? Before, I could turn on the RGB camera first and still get frames.
I have heard of this behaviour once before, with Depth first and RGB second working but not the other way around. I do not have an explanation for it though.
Thank you for your patient reply, I will close the issue
Thanks very much @nchu-cha for the update and for your patience too!
Issue Description
hello,
I'm trying to understand all the MATLAB functions. I tried align.m and camera_info.m, and I got some errors.
In align.m, an error shows "not enough input arguments" at line 6, narginchk(1,1);. Should I revise the (1,1) range?
function this = align(align_to)
narginchk(1, 1);
validateattributes(align_to, {'realsense.stream', 'numeric'}, {'scalar', 'nonnegative', 'real', 'integer', '<=', int64(realsense.stream.count)});
out = realsense.librealsense_mex('rs2::align', 'new', int64(align_to));
this = this@realsense.filter(out);
end
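For context, narginchk(1, 1) requires exactly one input argument, so "not enough input arguments" means align was called with none; constructing it with a stream type, as the alignment scripts elsewhere in this thread do, satisfies the check:

```matlab
% align takes exactly one argument: the stream to align to.
alignto = realsense.stream.color;        % align depth to the color stream
alignedFs = realsense.align(alignto);    % one argument, so narginchk(1,1) passes
```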
In camera_info.m, I get an error: "Cannot call the constructor of 'realsense.camera_info' outside of its enumeration block."
I want to set my depth resolution up to 1024x768 and my RGB resolution up to 1920x1080; how can I change the resolution using the MATLAB functions?
I have seen other people get the intrinsic and extrinsic parameters before aligning the images, can anyone teach me how to get these parameters?
Does anyone know how to solve it?
Thank you