Closed. RohanKurane closed this issue 7 years ago.
Has anyone had a chance to look at this?
@RohanKurane Are you zeroing the struct? Take a look at these samples: https://github.com/intel-aero/sample-apps/tree/master/capturev4l2 or https://github.com/zehortigoza/aero-optical-flow. The latter will not work for you as-is because your kernel doesn't have the additional changes to handle exposure (https://github.com/intel-aero/meta-intel-aero-base/pull/7), but you can just remove that part.
Note: I am running on the 1.3.1 kernel. The latest 1.4 kernel did not work well with my application. I got some errors for pthread.
@RohanKurane if you have info about the pthread problem with v1.4, please report it in a new issue. Thanks.
@zehortigoza - Zeroing out the struct did not make a difference. Still get the same error.
@anselmolsm - I will re-do it with 1.4 and post my results.
Any suggestions? I would like to get this working so I can capture images.
@RohanKurane Have you tried running the two projects I sent you? I tested both and they are working.
@zehortigoza - I really just need to understand why the VIDIOC_S_FMT ioctl is failing. I noticed the project does a CLEAR of the fmt structure; I do the same using memset(&fmt, 0, sizeof(fmt)). If that works, I should be able to get further. I have actually noticed it works about one time out of 10. To answer your question: no, I have not run your two projects.
@zehortigoza - The one time the ioctl is successful, the ioctl to turn streaming on fails with err = 22:
ioctl(ctx->fd, VIDIOC_STREAMON, &type)
enum v4l2_buf_type type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
Any idea why?
@RohanKurane I'm not an expert in V4L2, but if the device is not set up right it will fail when starting the stream. Please run the samples first to confirm that everything else is working, then compare your code with the working one.
@zehortigoza - I will. Thanks
FYI, I found this regarding the VIDIOC_STREAMON ioctl: "Specifically the capture hardware is disabled and no input buffers are filled (if there are any empty buffers in the incoming queue) until VIDIOC_STREAMON has been called. Accordingly the output hardware is disabled, no video signal is produced until VIDIOC_STREAMON has been called. The ioctl will succeed only when at least one output buffer is in the incoming queue."
@zehortigoza - I compiled capturev4l2 on my Ubuntu machine, copied the executable over to the Aero board, and ran the command:
root@intel-aero:~# C=10 INPUT=0 MODE=PREVIEW ./capture_examples --userp -d /dev/video0
Preview mode: 640x480, yuv420.
m_width 640, m_height 480, m_sizeimage 462848, m_padded_width:640, bytesperline 640
Saving file: Image-video2-640x480-0.yuv420
Saving file: Image-video2-640x480-1.yuv420
Saving file: Image-video2-640x480-2.yuv420
Saving file: Image-video2-640x480-3.yuv420
Saving file: Image-video2-640x480-4.yuv420
Saving file: Image-video2-640x480-5.yuv420
Saving file: Image-video2-640x480-6.yuv420
Saving file: Image-video2-640x480-7.yuv420
Saving file: Image-video2-640x480-8.yuv420
Saving file: Image-video2-640x480-9.yuv420
Preview: Time=336.619000ms FPS=29.707176
Now why are the VIDIOC_S_FMT and VIDIOC_STREAMON ioctls failing from my application?
Any advice? Would pasting my code here help?
Thanks, Rohan
Why are you running it on /dev/video0? It should be /dev/video2
If you know that it is working, you can now compare both code bases and check for uninitialized structs...
int er;
const int WIDTH = 640;
const int HEIGHT = 480;
const int SIZE = WIDTH * HEIGHT * 3 / 2; /* YUV420: 1.5 bytes per pixel */
const int BUFS_COUNT = 4;
struct v4l2_streamparm parm;
struct v4l2_requestbuffers req;
struct v4l2_format fmt;
struct v4l2_format fmt1;
enum v4l2_buf_type type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
struct v4l2_capability cap;
struct v4l2_cropcap cropcap;
struct v4l2_crop crop;
int channel = 1;
unsigned int page_size, buffer_size;
memset(&fmt, 0 , sizeof(fmt));
memset(&fmt1, 0 , sizeof(fmt1));
Context *ctx = (Context *)calloc(1, sizeof(Context)); /* calloc(nmemb, size) */
if (!ctx)
{
printf("Cannot create context for device %s\n", BOTTOM_DEVICE);
}
else
{
printf("Successfully created context for device %s\n", BOTTOM_DEVICE);
}
//assert(ctx);
ctx->fd = cryptomove_open(BOTTOM_DEVICE, O_RDWR | O_NONBLOCK, 0);
if (ctx->fd < 0)
{
printf("Cannot open device %s\n", BOTTOM_DEVICE);
goto end;
}
else
{
printf("Successfully opened device %s\n", BOTTOM_DEVICE);
if (ioctl(ctx->fd, VIDIOC_S_INPUT, &channel) < 0)
{
printf("Cannot set input channel for device %s\n", BOTTOM_DEVICE);
goto end;
}
memset(&parm, 0, sizeof(parm));
parm.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
//parm.parm.capture.capturemode = 0x8000; // Preview mode
parm.parm.capture.capturemode = 0x4000; // video mode
if (ioctl(ctx->fd, VIDIOC_S_PARM, &parm) < 0)
{
printf("Cannot set mode for device %s\n", BOTTOM_DEVICE);
goto end;
}
er = ioctl(ctx->fd, VIDIOC_QUERYCAP, &cap);
if ( er < 0 )
{
printf("Cannot get query caps for device %s: %d\n", BOTTOM_DEVICE, er);
}
else
{
if (!(cap.capabilities & V4L2_CAP_VIDEO_CAPTURE)) {
printf("Not a capture device for device %s: %d\n", BOTTOM_DEVICE, er);
}
if (!(cap.capabilities & V4L2_CAP_STREAMING)) {
printf("Not a streaming device for device %s: %d\n", BOTTOM_DEVICE, er);
}
}
memset(&cropcap, 0, sizeof(cropcap)); /* was uninitialized */
cropcap.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
if (0 == ioctl(ctx->fd, VIDIOC_CROPCAP, &cropcap)) { /* fd was missing */
crop.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
crop.c = cropcap.defrect; /* reset to default */
if (-1 == ioctl(ctx->fd, VIDIOC_S_CROP, &crop)) {
/* cropping not supported; ignore */
}
}
fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
fmt.fmt.pix.width = WIDTH;
fmt.fmt.pix.height = HEIGHT;
fmt1.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
er = ioctl(ctx->fd, VIDIOC_G_FMT, &fmt1);
if ( er < 0 )
{
printf("Cannot get video format for device %s: %d\n", BOTTOM_DEVICE, er);
}
else
{
printf("Get video format for device %s:%d\n", BOTTOM_DEVICE, fmt1.fmt.pix.pixelformat);
}
fmt.fmt.pix.pixelformat = fmt1.fmt.pix.pixelformat;
//fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_YUV420;
fmt.fmt.pix.field = V4L2_FIELD_INTERLACED;
if (ioctl(ctx->fd, VIDIOC_S_FMT, &fmt) < 0)
{
int errsv = errno;
printf("Cannot set video format for device %s: %d\n", BOTTOM_DEVICE, errsv);
goto end1;
}
page_size = getpagesize(); //usually 4K
buffer_size = (fmt.fmt.pix.sizeimage + page_size - 1) & ~(page_size - 1);
printf("page_size and buffer_size for device %s:%d:%d\n", BOTTOM_DEVICE, page_size, buffer_size);
memset (&req, 0, sizeof (req));
req.count = BUFS_COUNT;
req.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
req.memory = V4L2_MEMORY_USERPTR;
if (ioctl(ctx->fd, VIDIOC_REQBUFS, &req) < 0)
{
printf("Cannot set buffers for device %s\n", BOTTOM_DEVICE);
goto end1;
}
if (req.count != BUFS_COUNT)
{
printf("Incorrect buffers for device %s\n", BOTTOM_DEVICE);
goto end1;
}
for (int i = 0; i < BUFS_COUNT; ++i)
{
struct v4l2_buffer buf;
ctx->bufs_len[i] = buffer_size;
ctx->bufs[i] = memalign(page_size, buffer_size);
if (!ctx->bufs[i])
{
printf("Cannot allocate mem buffers for device %s\n", BOTTOM_DEVICE);
goto end1;
}
else
{
printf("allocated mem buffers for device %s with %d:%d:%p\n", BOTTOM_DEVICE, page_size, buffer_size, ctx->bufs[i]);
}
memset (&buf, 0, sizeof (buf));
buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
buf.memory = V4L2_MEMORY_USERPTR;
buf.index = i;
buf.m.userptr = (unsigned long)ctx->bufs[i];
buf.length = ctx->bufs_len[i];
if (ioctl(ctx->fd, VIDIOC_QBUF, &buf) < 0)
{
int errsv = errno;
printf("Error for device %s: %d\n", BOTTOM_DEVICE, errsv);
goto end1;
}
}
if (signal(SIGINT, sig_handler) == SIG_ERR)
{
printf("Cannot catch SIGINT\n");
}
if (ioctl(ctx->fd, VIDIOC_STREAMON, &type) < 0)
{
int errsv = errno;
printf("Cannot set streaming on for device %s: %d\n", BOTTOM_DEVICE, errsv);
goto end1;
}
printf(" Bottom facing camera has been successfully started\n");
@zehortigoza : just pasted the code above. I can try video2.
@zehortigoza
Today it does not work -
root@intel-aero:~# C=10 INPUT=0 MODE=PREVIEW ./capture_examples --userp -d /dev/video2
Preview mode: 640x480, yuv420.
VIDIOC_S_FMT error 22, Invalid argument
or
root@intel-aero:~# C=10 INPUT=0 MODE=PREVIEW ./capture_examples --userp -d /dev/video0
Preview mode: 640x480, yuv420.
VIDIOC_S_FMT error 22, Invalid argument
This is the same error in my app when I try to use the camera.
I have the R200 RealSense camera connected as well. Not the ov8858.
Do you have this working consistently ?
The OV7251 works every time here with the sample. Look at your dmesg output and check whether the OV7251 is detected.
@zehortigoza
It is. Here is the dmesg | grep ov7251 output:
root@intel-aero:~# dmesg | grep ov7251
[ 3.543782] ov7251 i2c-INT35AA:00: gmin: initializing atomisp module subdev data.PMIC ID 1
[ 3.572142] ov7251 i2c-INT35AA:00: camera pdata: port: 0 lanes: 1 order: 00000002
[ 3.574884] ov7251 i2c-INT35AA:00: sensor_revision = 0x2
[ 3.574890] ov7251 i2c-INT35AA:00: detect ov7251 success
[ 3.575010] input: ov7251 as /devices/virtual/input/input4
[ 3.575154] ov7251 i2c-INT35AA:00: register atomisp i2c module type 1
[ 3.585290] ov7251 i2c-INT35AA:01: gmin: initializing atomisp module subdev data.PMIC ID 1
[ 3.614141] ov7251 i2c-INT35AA:01: camera pdata: port: 1 lanes: 1 order: 00000002
[ 3.615250] ov7251 i2c-INT35AA:01: read from offset 0x300a error -121
[ 3.622942] ov7251 i2c-INT35AA:01: sensor_id_high = 0xffff
[ 3.630118] ov7251 i2c-INT35AA:01: ov7251_detect err s_config.
[ 3.637231] ov7251 i2c-INT35AA:01: sensor power-gating failed
Strange... now it works again, for both video0 and video2. See below. I notice this is using MODE=PREVIEW; when I tried MODE=VIDEO, I got the VIDIOC_S_FMT error.
root@intel-aero:~# C=10 INPUT=0 MODE=PREVIEW ./capture_examples --userp -d /dev/video0
Preview mode: 640x480, yuv420.
m_width 640, m_height 480, m_sizeimage 462848, m_padded_width:640, bytesperline 640
Saving file: Image-video2-640x480-0.yuv420
Saving file: Image-video2-640x480-1.yuv420
Saving file: Image-video2-640x480-2.yuv420
Saving file: Image-video2-640x480-3.yuv420
Saving file: Image-video2-640x480-4.yuv420
Saving file: Image-video2-640x480-5.yuv420
Saving file: Image-video2-640x480-6.yuv420
Saving file: Image-video2-640x480-7.yuv420
Saving file: Image-video2-640x480-8.yuv420
Saving file: Image-video2-640x480-9.yuv420
Preview: Time=347.092000ms FPS=28.810805
root@intel-aero:~# C=10 INPUT=0 MODE=PREVIEW ./capture_examples --userp -d /dev/video2
Preview mode: 640x480, yuv420.
m_width 640, m_height 480, m_sizeimage 462848, m_padded_width:640, bytesperline 640
Saving file: Image-video2-640x480-0.yuv420
Saving file: Image-video2-640x480-1.yuv420
Saving file: Image-video2-640x480-2.yuv420
Saving file: Image-video2-640x480-3.yuv420
Saving file: Image-video2-640x480-4.yuv420
Saving file: Image-video2-640x480-5.yuv420
Saving file: Image-video2-640x480-6.yuv420
Saving file: Image-video2-640x480-7.yuv420
Saving file: Image-video2-640x480-8.yuv420
Saving file: Image-video2-640x480-9.yuv420
Preview: Time=337.922000ms FPS=29.592628
For MODE=VIDEO:
root@intel-aero:~# C=10 INPUT=0 MODE=VIDEO ./capture_examples --userp -d /dev/video2
Video mode: 640x480, yuv420.
Setting fps to 30
VIDIOC_S_FMT error 22, Invalid argument
root@intel-aero:~# C=10 INPUT=1 MODE=VIDEO ./capture_examples --userp -d /dev/video0
Video mode: 640x480, yuv420.
Setting fps to 30
VIDIOC_S_FMT error 22, Invalid argument
Please note that I am running 1.3.1
root@intel-aero:~# aero-get-version.py
BIOS_VERSION = Aero-01.00.12_Prod
OS_VERSION = Poky Aero (Intel Aero Linux Distro) 1.3.1 (krogoth)"
AIRMAP_VERSION = 1.8
FPGA_VERSION = 0xa1
I had some issues with pthreads on 1.4 and my app. I have not had a chance to dig into it.
The error within my app (code pasted above) still exists. I get error 22 for VIDIOC_S_FMT, and when that passes, I fail on the VIDIOC_STREAMON ioctl.
@zehortigoza
It works now from my app. I had an error: I was setting the input channel to 1 instead of 0. It looks like INPUT=0 is the ov7251 camera and INPUT=1 is the ov8858? Can you please add some documentation for the INPUT field in capture.cpp? That is the only parameter missing an explanation. :) I can close this bug once you add that. Thanks for your help with this. I will open a different bug (if needed) for the 1.4 kernel pthread issue I ran into.
@RohanKurane There is something odd... the ov7251 should be /dev/video2 with id=1.
@zehortigoza
Are you sure? I have it as /dev/video2 but channel = 0. This is what I see in the capture.cpp file:
static int cam_input = 0; /* Camera ID selection */
@RohanKurane There are just 2 MIPI connections, one single-lane and one double-lane. Unless the ov8858 is disconnected, the ov7251 should be the one with id=1.
@zehortigoza I believe I have the ov7251 correctly connected to the single-lane MIPI interface for the VGA camera (far right of the compute board). What does the following line in capture.cpp refer to?
static int cam_input = 0; /* Camera ID selection */
A couple of side questions as well:
@RohanKurane cam_input will be set with the INPUT parameter value.
About the formats, I don't know which ones it supports; you can read the driver (atomisp) or keep testing. About the mode, this value is undocumented for us too; I only know that 0x8000 makes it work for capture and you can do a live stream with it.
@zehortigoza I suspect cam_input is the device id you are referring to. Correct?
root@intel-aero:~# C=10 INPUT=0 MODE=PREVIEW ./capture_examples --userp -d /dev/video2
Preview mode: 640x480, yuv420.
"About the mode, this value is undocumented for us too; I only know that 0x8000 makes it work for capture and you can do a live stream with it." --> So I can basically capture 1000 frames using 0x8000 and stream them one after another to show live streaming, I suppose.
INPUT=0 and cam_input are the same and will be passed to VIDIOC_S_INPUT. Like I said, on all other Aero boards the ov7251 has id = 1; maybe on your board the ov8858 is disconnected, causing this id change.
It will keep capturing the frames for you until you do a VIDIOC_STREAMOFF.
"INPUT=0 and cam_input are the same and will be passed to VIDIOC_S_INPUT. Like I said, on all other Aero boards the ov7251 has id = 1; maybe on your board the ov8858 is disconnected, causing this id change." --> Yes, the ov8858 is NOT connected. That's probably why I see id=0.
It will keep capturing the frames for you until you do a VIDIOC_STREAMOFF. --> Thanks.
Did you want to update the INPUT parameter in the capture.cpp usage function and close this out?
Something I ran across: if I stream for about a minute or so, at some point the select system call fails to return any readable file descriptors:
r = select(fd + 1, &fds, NULL, NULL, &tv)
Has anyone seen this? What is the largest number of frames you have experimented with when live streaming from the ov7251?
@RohanKurane I run into the same problem as you when there is more than just the OV7251 using poll/select. I moved the OV7251 to its own thread, and I'm also using the poll timeout to reconfigure the OV7251 when it gets stuck: https://github.com/zehortigoza/aero-optical-flow/blob/master/src/mainloop.cpp#L88
Sometimes it runs for about 2 minutes without problems, sometimes only 30 seconds...
Let me know if you find another workaround.
I read somewhere that the VGA camera might be more suitable for still image capture. Are we overusing it by trying to live stream? I will check your link.
@zehortigoza I looked at your code. Restarting the camera whenever you run into this no-file-descriptors situation looks kind of heavy. Do we know when this could happen? From the select documentation: On success, select() returns the number of file descriptors contained in the three returned descriptor sets (that is, the total number of bits set in readfds, writefds, and exceptfds), which may be zero if the timeout expires before anything interesting happens.
Maybe the timeout is the issue here? I reset it to 10 for every frame I try to read, because select() updates it.
Also, @zehortigoza, what are you using for streaming? Are you using gstreamer and QGC? I am having trouble getting the frames streamed. Here is the pipeline I create; QGC is running on my Ubuntu laptop at 192.168.8.2:
snprintf(str_pipeline, sizeof(str_pipeline),
         "appsrc name=mysource ! videoconvert ! "
         "video/x-raw,width=640,height=480,format=NV12 ! rtpjpegpay ! jpegenc ! "
         "videocrop top=0 left=0 right=40 bottom=10 ! "
         "udpsink host=%s port=5600", "192.168.8.2");
pipeline = gst_parse_launch( str_pipeline, &error );
appsrc = gst_bin_get_by_name( GST_BIN( pipeline ), "mysource" );
app_caps = gst_caps_new_simple( "video/x-raw", "format", G_TYPE_STRING, "I420",
"width", G_TYPE_INT, WIDTH, "height", G_TYPE_INT, HEIGHT, NULL );
gst_app_src_set_caps( GST_APP_SRC( appsrc ), app_caps );
gstbuffer = gst_buffer_new_wrapped_full((GstMemoryFlags)0, frame_ptr, gst_size, 0, gst_size, NULL, NULL); // frame_ptr and gst_size are the buffer and its size
g_signal_emit_by_name(appsrc, "push-buffer", gstbuffer, &ret);
@RohanKurane there is a bug in the atomisp driver or firmware that causes poll/select to hang when there is more than just the OV7251 in the file descriptor list; my workaround of detecting the timeout and reconfiguring the OV7251 makes it run like it should.
I'm not streaming; I was just getting the frames to do optical flow. Otavio was working on streaming with gstreamer and QGC. The OV7251 was even integrated, but when we did some tests before the Intel Aero 1.4 release it was broken and no one has had time to look at it yet. https://github.com/01org/camera-streaming-daemon
@zehortigoza Thanks
@otaviobp Can you tell me what problem you ran into when streaming using gstreamer and QGC? My problem seems to be that the frames don't make it over to my Ubuntu machine where QGC is installed. My pipeline is as pasted above (pasting here again):
snprintf(str_pipeline, sizeof(str_pipeline),
         "appsrc name=mysource ! videoconvert ! "
         "video/x-raw,width=640,height=480,format=NV12 ! rtpjpegpay ! jpegenc ! "
         "videocrop top=0 left=0 right=40 bottom=10 ! "
         "udpsink host=%s port=5600", "192.168.8.2");
pipeline = gst_parse_launch( str_pipeline, &error );
appsrc = gst_bin_get_by_name( GST_BIN( pipeline ), "mysource" );
app_caps = gst_caps_new_simple( "video/x-raw", "format", G_TYPE_STRING, "I420", "width", G_TYPE_INT, WIDTH, "height", G_TYPE_INT, HEIGHT, NULL );
gst_app_src_set_caps( GST_APP_SRC( appsrc ), app_caps );
gstbuffer = gst_buffer_new_wrapped_full((GstMemoryFlags)0, frame_ptr, gst_size, 0, gst_size, NULL, NULL); // frame_ptr and gst_size are the buffer and its size
g_signal_emit_by_name(appsrc, "push-buffer", gstbuffer, &ret);
Note: I am running on 1.3.1 and not 1.4
192.168.8.2 is ip of ubuntu machine where QGC is installed
Also, a similar pipeline for the RealSense R200 camera works fine. I can stream from the R200.
@zehortigoza @otaviobp
If there is an open bug for the ov7251 camera explaining the streaming problem, can you point me to it so I can follow it? I will also make a note in my code to wait for the fix.
@RohanKurane The fix would be update the atomisp driver https://github.com/intel-aero/meta-intel-aero/issues/64
@zehortigoza This seems to address the issue with the select system call we talked about yesterday. Is this also going to address the gstreamer and QGC issue, or is there a separate bug for it? Also, I am running 1.3.1 - do you anticipate backporting the fix to 1.3.1 as well?
@RohanKurane It should fix the select issue. And there are no backports planned; when fixed, it will go into the next Aero BSP release.
@zehortigoza Okay. Hopefully I do not run into the same issue in the next release as I did with v1.4. Is there an anticipated date for the next release?
Do you know if the gstreamer/QGC issue is being looked at?
@RohanKurane regarding the pthread issue you faced in v1.4, do you have any piece of code that we could use to verify that? v1.5 is expected for this week.
@RohanKurane Otávio was able to stream using the OV7251 in his tests, so I don't think there is a bug there, but the code currently in https://github.com/01org/camera-streaming-daemon is not working; maybe you should open an issue about that in camera-streaming-daemon.
@anselmolsm I will re-install 1.4v on my aero board and try and reproduce again.
@zehortigoza I will open the bug now.
Ok, so I'm closing this issue.
I am trying to program the VGA camera ov7251 within my application, which I have ported to the Intel Aero board. For that I am setting the format on the camera using the following lines of code:
fmt.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
fmt.fmt.pix.width = WIDTH;
fmt.fmt.pix.height = HEIGHT;
fmt.fmt.pix.pixelformat = V4L2_PIX_FMT_YUV420;
fmt.fmt.pix.field = V4L2_FIELD_INTERLACED;
if (ioctl(ctx->fd, VIDIOC_S_FMT, &fmt) < 0)
{
int errsv = errno;
printf("Cannot set video format for device %s: %d\n", BOTTOM_DEVICE, errsv);
}
The ioctl returns error = 22 (EINVAL).
I am following the code in https://github.com/otaviobp/camera-streaming-daemon/commit/bde69cb597d5f166c8717392ab8f5b0735b68089#diff-c23b02d2981b8df278455a66747be996
Please advise.
Note: I am running on the 1.3.1 kernel. The latest 1.4 kernel did not work well with my application. I got some errors for pthread.