eclipse-threadx / usbx

Eclipse ThreadX - USBX is a high-performance USB host, device, and on-the-go (OTG) embedded stack that is fully integrated with the Eclipse ThreadX RTOS.
https://github.com/eclipse-threadx/rtos-docs/blob/main/rtos-docs/usbx/index.md
MIT License

UVC #4

Open edbek opened 3 years ago

edbek commented 3 years ago

Is it possible to use this stack to interact with a webcam through UVC? If so, how?

xiaocq2001 commented 3 years ago

Unfortunately there is no online documentation for now. The basic UVC workflow is as follows:

  1. Initialize USBX with UVC

    /*========================================================================*/
    /*= Initialize USBX with UVC supported.  */
    /*========================================================================*/
    
    /* Initialize USBX system. */
    status = ux_system_initialize(memory_pointer, USBX_MEMORY_SIZE, usbx_cache_safe_memory, USBX_CACHE_SAFE_MEMORY_SIZE);
    if (status != UX_SUCCESS)
        error_handler();
    
    /* Initialize USBX Host Stack.  */
    status =  ux_host_stack_initialize(NULL);
    if (status != UX_SUCCESS)
        error_handler();
    
    /* Register video class.  */
    status =  ux_host_stack_class_register(_ux_system_host_class_video_name, _ux_host_class_video_entry);
    if (status != UX_SUCCESS)
        error_handler();
    
    /* Register EHCI HCD.  */
    status = ux_host_stack_hcd_register(_ux_system_host_hcd_ehci_name, _ux_hcd_ehci_initialize, EHCI_BASE, 0x0);
    if (status != UX_SUCCESS)
        error_handler();
  2. Wait for a UVC device connection

    /*========================================================================*/
    /*= Wait until UVC device is connected.  */
    /*========================================================================*/
    
    /* Find the main video container.  */
    status = ux_host_stack_class_get(_ux_system_host_class_video_name, &host_class);
    if (status != UX_SUCCESS)
        error_handler();
    
    /* We get the first instance of the video device.  */
    while (1)
    {
        status = ux_host_stack_class_instance_get(host_class, 0, (void **) &inst);
        if (status == UX_SUCCESS)
            break;
    
        tx_thread_sleep(10);
    }
    
    /* We still need to wait for the video status to be live */
    while (inst -> ux_host_class_video_state != UX_HOST_CLASS_INSTANCE_LIVE)
    {
        tx_thread_sleep(10);
    }
    video = inst;
  3. Set up the video parameters

    /* Set video parameters to MJPEG, W x H resolution, .. fps. */
    status = ux_host_class_video_frame_parameters_set(video,
                                                      UX_HOST_CLASS_VIDEO_VS_FORMAT_MJPEG,
                                                      CAMERA_RESOLUTION_WIDTH,
                                                      CAMERA_RESOLUTION_HEIGHT,
                                                      TEN_FRAMES_PER_SECOND);
    if (status != UX_SUCCESS)
        error_handler();
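
     After the parameters are set, the streaming buffers must be large enough for the negotiated payload size. A minimal sketch (ux_host_class_video_max_payload_get is the USBX call that returns this size; the max_payload variable is assumed here):

    /* Query the maximum payload transfer size negotiated for the current
       parameters; each streaming buffer must be at least this large.  */
    max_payload = ux_host_class_video_max_payload_get(video);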
  4. Start streaming

    
    /*========================================================================*/
    /*= Start UVC streaming.  */
    /*========================================================================*/
    
    /* Start video transfer. */
    status = ux_host_class_video_start(video);
    if (status != UX_SUCCESS)
        error_handler();

    #if HIGH_BANDWIDTH_EHCI /* Driver HCD must support adding a request list.  */
    
    /* Build the buffer list.  */
    for (i = 0; i < VIDEO_BUFFER_NB; i++)
        video_buffers[i] = video_buffer[i];
    
    /* Issue the transfer request list to start streaming.  */
    status = ux_host_class_video_transfer_buffers_add(video, video_buffers, VIDEO_BUFFER_NB);
    if (status != UX_SUCCESS)
        error_handler();
    
    #elif NORMAL_BANDWIDTH_OHCI /* Driver adds requests one by one.  */
    
    /* Add buffers to the video device one by one for streaming data. */
    for (buffer_index = 0; buffer_index < VIDEO_BUFFER_NB; buffer_index++)
    {
        status = ux_host_class_video_transfer_buffer_add(video,
                                                         video_buffer[buffer_index]);
        if (status != UX_SUCCESS)
            error_handler();
    }
    
    #endif

  5. Handle frame data and reuse frame buffers, assuming a transfer-done callback puts a semaphore (see the callback sketch after this list):

    /* Set the transfer callback (do this before starting the transfer). */
    ux_host_class_video_transfer_callback_set(video,
                                              video_transfer_done);
    
    /* Wait for transfer done and re-use buffers.  */
    buffer_index = 0;
    while (1)
    {
    
        /* Suspend here until a transfer callback is called. */
        status = tx_semaphore_get(&data_received_semaphore, TX_WAIT_FOREVER);
        if (status != UX_SUCCESS)
            error_handler();
    
        /* Data received. The callback function needs to obtain the actual
           number of bytes received, so the application routine can read the
           correct amount of data from the buffer. */
    
        /* The application can now consume video data while the video device
           stores data into the other buffers. */
    
        /* Add the buffer back for video transfer. */
        status = ux_host_class_video_transfer_buffer_add(video,
                                                         video_buffer[buffer_index]);
        if (status != UX_SUCCESS)
            error_handler();
    
        /* Increment buffer_index, wrapping to zero when it reaches the
           number of buffers. */
        buffer_index = (buffer_index + 1);
        if (buffer_index >= VIDEO_BUFFER_NB)
            buffer_index = 0;
    }
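
A minimal sketch of the assumed transfer-done callback (the name video_transfer_done and the semaphore are application-defined; the UX_TRANSFER parameter matches what ux_host_class_video_transfer_callback_set expects):

    /* Called by USBX when a transfer request completes; wake the
       application thread.  */
    VOID  video_transfer_done(UX_TRANSFER *transfer_request)
    {
        /* The actual number of bytes received is available in
           transfer_request -> ux_transfer_request_actual_length.  */
        tx_semaphore_put(&data_received_semaphore);
    }
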
edbek commented 3 years ago

Thank you for the detailed answer !

yuxin-azrtos commented 3 years ago

Close this issue. Feel free to reopen if you have questions.

bSquaredOlea commented 2 years ago

Are there any more complete examples for this yet? I used the above code to get to the point where I can set parameters, but my device doesn't stream.

xiaocq2001 commented 2 years ago

@bSquaredOlea There is no complete example project yet. Whether the stream works actually depends on your hardware and host controller driver, since isochronous transfer is quite different from bulk and interrupt transfers. If your HCD is not ready for isochronous transfer, there is no stream.

xiaocq2001 commented 2 years ago

You can try the following steps to build a video example that enumerates and starts streaming on a USB 2.0 high-speed webcam (tested with a "Microsoft LifeCam Studio(TM)"):

* Get MIMXRT1060 Examples
* Modifications in `sample_usbx_host_mass_storage.c`:

* Add global variables:
```c
UX_HOST_CLASS_VIDEO *video;

#pragma location="NonCacheable"
UCHAR video_buffer[UX_HOST_CLASS_VIDEO_TRANSFER_REQUEST_COUNT][3072];

/* This semaphore is used by the callback function to signal the application
   thread that video data is received and can be processed.  */
TX_SEMAPHORE data_received_semaphore;
```

* Add instance check function (before `demo_thread_entry`)
```c
static UINT  demo_class_video_check()
{

UINT status;
UX_HOST_CLASS               *host_class;
UX_HOST_CLASS_VIDEO         *inst;

    /* Find the main video container.  */
    status = ux_host_stack_class_get(_ux_system_host_class_video_name, &host_class);
    if (status != UX_SUCCESS)
        while(1); /* Error Halt  */

    /* We get the first instance of the video device.  */
    while (1)
    {
        status = ux_host_stack_class_instance_get(host_class, 0, (void **) &inst);
        if (status == UX_SUCCESS)
            break;

        tx_thread_sleep(10);
    }

    /* We still need to wait for the video status to be live */
    while (inst -> ux_host_class_video_state != UX_HOST_CLASS_INSTANCE_LIVE)
    {
        tx_thread_sleep(10);
    }

    video = inst;
    return(UX_SUCCESS);
}
```

* Add transfer done callback (the callback signature, a `UX_TRANSFER` pointer parameter, is assumed here):

```c
VOID  video_transfer_done(UX_TRANSFER *transfer_request)
{

UINT status;

    /* Signal the application thread that video data has been received.  */
    status = tx_semaphore_put(&data_received_semaphore);
    if (status != UX_SUCCESS)
        while(1); /* Error Halt.  */
}
```

* Add class registration (in `demo_thread_entry`)
```c
    /* Register video class.  */
    status =  ux_host_stack_class_register(_ux_system_host_class_video_name, _ux_host_class_video_entry);
    if (status != UX_SUCCESS)
        return;
```
yuxin-azrtos commented 2 years ago

@bSquaredOlea : does the sample code help?

bSquaredOlea commented 2 years ago

Thanks for your responses. I was able to get to the point where I started a transaction, but I never got any video data from the device (I could get metadata from the device, though). However, we are moving in a different direction now.

Thanks, Ben


xiaocq2001 commented 2 years ago

@bSquaredOlea Thanks for sharing the progress. For the transaction, please note that UVC transfers use isochronous endpoints, which work differently from control requests and bulk transfers, so only an HCD with isochronous transfer support can get video data from the device (this has been done in the EHCI HCD for the 1060). If you are working on some other chip, your HCD still needs isochronous transfer support to make things work.

xianghui-renesas commented 1 year ago

Hi, we can stream video through the Ethernet port to a PC. Is there a recommended PC-side application that can collect the stream from the Ethernet port and display the video? Thanks!

xiaocq2001 commented 1 year ago

Not for the raw video stream, no.

xianghui-renesas commented 1 year ago

Hi @xiaocq2001, thanks for the comment. What about MPEG format? We can output MPEG format to the PC. Can some webcam application be used to display the video collected through USBX, e.g. webcamiod? If not, what are the limiting factors? Thanks!

xiaocq2001 commented 1 year ago

@xianghui-renesas, I'm not sure a directly forwarded USB video stream can be recognized by webcamiod; maybe you can try. I think the video stream must be rearranged/repackaged by some web streaming protocol before a PC application can play it.

xianghui-renesas commented 1 year ago

Hi @xiaocq2001, thanks! I had a quick try with webcamiod and found it primarily looks for a USB video streaming device and is unaware of the host-side video packet format. I have a specific question on the definition of TEN_FRAMES_PER_SECOND: how does it relate to fps? Appreciate any comment you can provide. Thanks!

    status = ux_host_class_video_frame_parameters_set(video, UX_HOST_CLASS_VIDEO_VS_FORMAT_MJPEG, CAMERA_RESOLUTION_WIDTH, CAMERA_RESOLUTION_HEIGHT, TEN_FRAMES_PER_SECOND);

xiaocq2001 commented 1 year ago

Yes, that's the frame rate.

xianghui-renesas commented 1 year ago

Hi @xiaocq2001, thanks! How is an input of 333333 converted to 30 fps, and what is the unit of this argument in the API? Thanks!

    /* Set video parameters to MJPEG, 640x480 resolution, 30fps. */
    status = ux_host_class_video_frame_parameters_set(video, UX_HOST_CLASS_VIDEO_VS_FORMAT_MJPEG, 176, 144, 333333);

I can see 10000000/30 = 333333. It seems the unit of this argument is a tenth of a microsecond. Could you explain?

xiaocq2001 commented 1 year ago

Please refer to [Universal Serial Bus Device Class Definition for Video Devices: Frame Based Payload], where you can see that frame intervals are expressed in 100 ns units.
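
In other words, the last argument of ux_host_class_video_frame_parameters_set is a frame interval in 100 ns units, i.e. 10,000,000 / fps. A minimal sketch of the conversion (the macro name is illustrative, not a USBX definition):

    /* UVC frame intervals count 100 ns ticks: 10,000,000 ticks per second. */
    #define UVC_FPS_TO_FRAME_INTERVAL(fps)    (10000000u / (fps))

    /* Examples: 30 fps -> 333333 (truncated), 10 fps -> 1000000. */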

xianghui-renesas commented 1 year ago

Thanks @xiaocq2001 for the reference information. We are trying to stream the video to a PC through a UDP port. The PC-end application we are trying to use is VLC: https://docs.videolan.me/vlc-user/3.0/en/advanced/streaming/stream_over_udp.html Do you have any experience using the host video class with VLC? One of the video formats we identified using VLC is UX_HOST_CLASS_VIDEO_VS_FORMAT_H264; how do we set up the bandwidth? Also, a general question: how is the color channel format defined? We do not see information in the stack for this.

xiaocq2001 commented 1 year ago

Unfortunately, there is no H.264 format demo for now; maybe you can trace an existing H.264 camera for reference. There is also an H.264 format spec on usb.org (Universal Serial Bus Device Class Definition for Video Devices: H.264 Payload).

In general, USB bandwidth is set by selecting among the interface's different alternate settings.
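
A minimal sketch of what that means in USBX terms (streaming_interface is a hypothetical UX_INTERFACE pointer to the VideoStreaming alternate setting whose isochronous wMaxPacketSize matches the bandwidth you need; the video class normally performs this selection internally when streaming starts):

    /* Apply the chosen alternate setting to reserve its bandwidth.  */
    status = ux_host_stack_interface_setting_select(streaming_interface);
    if (status != UX_SUCCESS)
        error_handler();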

xianghui-renesas commented 1 year ago

Thanks @xiaocq2001, could you explain how the color channel encoding is defined in the USBX host video stack? If we collect images using the uncompressed format, what is the format of the data in the video buffer? For example, with the configuration below, how are the color coding and buffer data format defined?

    ux_host_class_video_frame_parameters_set(video, UX_HOST_CLASS_VIDEO_VS_FORMAT_UNCOMPRESSED, 160, 120, 333333);

The max payload for this setting is 384 (identified via ux_host_class_video_max_payload_get); can you help explain the format of this data so we can repackage it to send to the PC program?

xiaocq2001 commented 1 year ago

Check the uncompressed format spec in https://usb.org/sites/default/files/USB_Video_Class_1_5.zip.

The supported pixel codings are listed there, and the format in use is reported by a GUID in the format descriptor.

Each payload is composed of a header followed by the actual data.

You can refer to the spec for more details.
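
For orientation, a sketch of the payload header layout from that spec (the struct name and comments are illustrative, not USBX definitions):

    /* Every isochronous video packet starts with this header, followed by
       the optional PTS/SCR fields and then the actual video data.  */
    typedef struct UVC_PAYLOAD_HEADER_STRUCT
    {
        UCHAR  bHeaderLength;   /* Total header length in bytes.            */
        UCHAR  bmHeaderInfo;    /* Bit 0: FID - frame ID, toggles per frame.
                                   Bit 1: EOF - last payload of the frame.
                                   Bit 2: PTS - presentation time present.
                                   Bit 3: SCR - source clock present.
                                   Bit 6: ERR - error in this payload.
                                   Bit 7: EOH - end of header.              */
    } UVC_PAYLOAD_HEADER;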

BluesharkPD commented 1 year ago

Hi @xiaocq2001, is there any UVC host application running on an STM32H7 target?

xianghui-renesas commented 1 year ago

Hi @xiaocq2001, does the USBX host video stack support still image collection? Do you have example code if it is supported?

xiaocq2001 commented 1 year ago

@xianghui-renesas , still image collection is not supported currently.

xianghui-renesas commented 1 year ago

Hi @xiaocq2001, I tried to piece together the packets collected on the MCU to display them using a feature in our e2 studio IDE, and found the packets are out of order in the packet buffers. I have 96 packet buffers. If the MCU does not provide buffers fast enough for the frame rate, will it start to skip packets? Do you have experience with this? Thanks!

xianghui-renesas commented 1 year ago

Hi @xiaocq2001, your example so far uses a stream-based protocol. Does the stack support a frame-based protocol, and do you have an example of a frame-based implementation? I think it may be easier to look at the raw image in the MCU buffer with a frame-based implementation.

xiaocq2001 commented 1 year ago

@xianghui-renesas, do you mean how a video frame boundary is detected in the USB packets? For Motion-JPEG, if you check the spec section on the payload header carried in each USB packet, there is an EOF bit that indicates the end of a frame.
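
A minimal sketch of checking that bit on a received payload (video_buffer and buffer_index are from the demo above; the 0x02 mask for the EOF bit follows the payload header definition in the spec):

    UCHAR *payload     = video_buffer[buffer_index];
    UCHAR  header_len  = payload[0];           /* bHeaderLength */
    UCHAR  header_info = payload[1];           /* bmHeaderInfo  */

    /* Video data for this packet starts at payload + header_len. */

    if (header_info & 0x02)
    {
        /* EOF set: this payload completes the current video frame.  */
    }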

MaheshAvula-Alifsemi commented 12 months ago

Hi @xiaocq2001, @yuxin-azrtos, I am working on USB host isochronous support. I have a question about transactions per microframe: with a single transaction per microframe I can see the video streaming and it works fine, but multiple transactions per microframe does not work and I see no valid frames.

  1. I am using the demo app above. Do I need to change the application, or could you please suggest how I can approach this further?

I really appreciate your help on this.

Thanks Mahesh

xiaocq2001 commented 12 months ago

@Mahesh-dev-avula I think the application is fine for multiple transactions per microframe. Maybe you can check whether multiple transactions per microframe are supported by your host controller, or whether the host controller driver needs modification to support them.

MaheshAvula-Alifsemi commented 12 months ago

@xiaocq2001, thank you for responding to my query. Yes, my host controller supports multiple transactions per microframe; the same hardware works on Linux, and my driver code is also implemented based on the Linux (xHCI driver) reference. The difference I found between Linux and Azure RTOS is that Linux prepares multiple buffers at a time and then sends the command to the hardware, while in the RTOS the application requests only one buffer at a time, which works for a single transaction per microframe. I am thinking that maybe we need to prepare multiple buffers at a time for multiple transactions. Please correct me if I am wrong.

Thanks Mahesh

xiaocq2001 commented 12 months ago

For isochronous requests, the stack is supposed to support a request list as input; that is, requests are linked through their ux_transfer_request_next_transfer_request field, so multiple linked requests carrying multiple buffers can be accepted.

In the current EHCI implementation, a single buffer is used for multiple transactions; a maximum of 3072 (3 * 1024) bytes can be transferred in one request buffer.

MaheshAvula-Alifsemi commented 12 months ago

@xiaocq2001, thank you for responding to my query. Actually, my hardware doesn't support EHCI; I have implemented an xHCI driver and am testing with it. There are a few scenarios I have tested:

  1. When the UVC device sends a payload length of 1024 bytes or less, I request a 3072-byte payload buffer from my application and it works.
  2. When we increase the resolution to 1280x720 at 30 fps, the UVC device commits a payload length of more than 1024 bytes, i.e. three transactions per microframe. In this scenario my host controller receives only the first 1024 bytes and then throws DATA BUFFER ERROR and RING OVERRUN ERROR.

Any suggestion on how to resolve this issue?

Thanks Mahesh

xiaocq2001 commented 12 months ago

I haven't checked the xHCI spec in detail, but I see the following that may relate to high-bandwidth multiple-transaction support:

  1. Mult, Max Packet Size and Max Burst Size in 6.2.3 Endpoint Context.
  2. TRB Transfer Length in 6.4.1.3 Isoch TRB.

From the descriptions, TRB Transfer Length can be 3072, Max Burst Size shall be set to the number of additional transactions, and Mult shall be 0 for high-speed.
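
A sketch of deriving those fields for a high-speed, high-bandwidth isochronous endpoint (the ep_ctx structure is a placeholder for your xHCI driver's Endpoint Context accessors; the wMaxPacketSize decoding follows the USB 2.0 endpoint descriptor format, bits 10:0 = packet size, bits 12:11 = additional transactions per microframe):

    USHORT wMaxPacketSize  = endpoint_descriptor.wMaxPacketSize;
    UINT   max_packet_size = wMaxPacketSize & 0x07FF;       /* e.g. 1024   */
    UINT   additional_tx   = (wMaxPacketSize >> 11) & 0x3;  /* 0, 1 or 2   */

    ep_ctx.max_packet_size = max_packet_size;
    ep_ctx.max_burst_size  = additional_tx;  /* Additional transactions.   */
    ep_ctx.mult            = 0;              /* Shall be 0 for high-speed. */

    /* TRB Transfer Length can then cover up to
       (additional_tx + 1) * max_packet_size = 3072 bytes per microframe.  */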

MaheshAvula-Alifsemi commented 11 months ago

@xiaocq2001, thank you so much for your inputs; it's working fine after programming the Endpoint Context array.

I really appreciate your help. Thanks

Mahesh

chrisrhayden84 commented 5 months ago

Hi @xiaocq2001, is there any update on when/if still image collection and H.264 payload will be supported?

xiaocq2001 commented 4 months ago

For still images, you can extract a YUV frame from the isochronous video stream. For the H.264 payload, you can handle it in the application.
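
A minimal sketch of that extraction, accumulating payload data into a frame buffer until the EOF bit is seen (frame_buffer, frame_offset, and actual_length are hypothetical application variables; the header layout follows the UVC payload header spec discussed above):

    UCHAR *payload    = video_buffer[buffer_index];
    UCHAR  header_len = payload[0];

    /* Append this packet's video data to the frame under assembly.  */
    memcpy(frame_buffer + frame_offset, payload + header_len,
           actual_length - header_len);
    frame_offset += actual_length - header_len;

    if (payload[1] & 0x02)   /* EOF: one complete YUV frame assembled.  */
    {
        /* Process or forward frame_buffer, then reset for the next frame. */
        frame_offset = 0;
    }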