Open cumtchenchang opened 2 months ago
Hi, thank you for trying the sample. I'm wondering how this delay is measured. Which scenario are you setting up (e.g. is the Source Reader sending the samples to a file, displaying them on screen, etc.)? It would help to understand your scenario more.
OK, thank you for your response. When I connect a USB camera to Windows, the image timestamps are not stable; when I connect the same USB camera to Linux (Ubuntu), the image timestamps are stable.
@daoyuanli9

```cpp
int ReadCameraData(UINT32 width, UINT32 height)
{
    IMFAttributes* attributes = NULL;
    IMFActivate** devices = NULL;
    IMFActivate* deviceUsing = NULL;
    IMFMediaType* mediaType = NULL;
    IMFAttributes* attr = nullptr;
    IMFMediaSource* deviceSource = NULL;
    IMFSourceReader* reader = NULL;
    UINT32 count = 0;

    auto hr = CoInitializeEx(NULL, COINIT_APARTMENTTHREADED);
    if (FAILED(hr)) { goto failed_stop; }
    hr = MFCreateAttributes(&attributes, 1);
    if (FAILED(hr)) { goto failed_stop; }
    hr = attributes->SetGUID(MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE,
                             MF_DEVSOURCE_ATTRIBUTE_SOURCE_TYPE_VIDCAP_GUID);
    if (FAILED(hr)) { goto failed_stop; }
    hr = MFEnumDeviceSources(attributes, &devices, &count);
    if (FAILED(hr)) { goto failed_stop; }
    if (count < 1) { hr = -1; goto failed_stop; }
    deviceUsing = devices[0];

    hr = MFCreateDeviceSource(deviceUsing, &deviceSource);
    if (FAILED(hr)) { goto failed_stop; }
    hr = MFCreateAttributes(&attr, 1);
    if (FAILED(hr)) {
        attr = nullptr;
    } else {
        hr = attr->SetUINT32(MF_SOURCE_READER_ENABLE_ADVANCED_VIDEO_PROCESSING, 1);
    }
    hr = MFCreateSourceReaderFromMediaSource(deviceSource, attr, &reader);
    if (FAILED(hr)) { goto failed_stop; }
    hr = reader->GetNativeMediaType((DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM, 0, &mediaType);
    if (FAILED(hr)) { goto failed_stop; }
    hr = MFSetAttributeSize(mediaType, MF_MT_FRAME_SIZE, width, height);
    hr = mediaType->SetGUID(MF_MT_SUBTYPE, MFVideoFormat_YUY2);

    PROPVARIANT var;
    hr = mediaType->GetItem(MF_MT_FRAME_RATE_RANGE_MAX, &var);
    if (SUCCEEDED(hr)) {
        hr = mediaType->SetItem(MF_MT_FRAME_RATE, var);
    }
    hr = reader->SetCurrentMediaType((DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM, NULL, mediaType);
    if (FAILED(hr)) { goto failed_stop; }

    while (1) {
        DWORD index = 0;
        DWORD flag = 0;
        IMFSample* sample = NULL;
        LONGLONG timestamp = 0;

        hr = reader->ReadSample((DWORD)MF_SOURCE_READER_FIRST_VIDEO_STREAM,
                                0, &index, &flag, &timestamp, &sample);
        if (sample) {
            IMFMediaBuffer* buffer = NULL;
            sample->ConvertToContiguousBuffer(&buffer);
            if (buffer) {
                BYTE* channelBuf = NULL;
                DWORD channelLen = 0;
                buffer->Lock(&channelBuf, NULL, &channelLen);
                // ... use the frame data ...
                buffer->Unlock();
                buffer->Release();
            }
            sample->Release();
        }
    }

failed_stop:
    if (attr) attr->Release();
    if (devices) {
        for (UINT32 i = 0; i < count; i++) {
            devices[i]->Release();
        }
        CoTaskMemFree(devices);
    }
    if (mediaType) mediaType->Release();
    return hr;
}
```
Have you tried to follow the camera examples here? https://learn.microsoft.com/en-us/samples/microsoft/windows-universal-samples/camerastarterkit/
From your comment, it seems like the timestamp is from when the frames are captured.
@daoyuanli9 The timestamps themselves are fine; it is the arrival time that is unreasonable. What I'm trying to say is that the latency of fetching frames is inconsistent. Windows has this problem while Linux does not.
Wondering if you have tried adding MF_LOW_LATENCY attribute to the source reader? https://learn.microsoft.com/en-us/windows/win32/medfound/mf-low-latency
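For reference, enabling it looks roughly like the following sketch, applied to the attributes that are passed to `MFCreateSourceReaderFromMediaSource` (Windows-only fragment; error handling abbreviated, `deviceSource`/`reader` come from the surrounding setup):

```cpp
// Sketch: request low-latency mode on the source reader's attributes
// (MF_LOW_LATENCY is defined in mfapi.h, Windows 8 and later).
IMFAttributes* attr = nullptr;
HRESULT hr = MFCreateAttributes(&attr, 2);
if (SUCCEEDED(hr)) {
    // Hint the pipeline to minimize internal buffering.
    hr = attr->SetUINT32(MF_LOW_LATENCY, TRUE);
}
if (SUCCEEDED(hr)) {
    hr = MFCreateSourceReaderFromMediaSource(deviceSource, attr, &reader);
}
```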
Yes, we have tried that; the latency is still just as bad.
We develop on a Windows system with a USB camera connected, and there is a 10-25 ms delay in the image data when acquiring camera frames. Our analysis is that the Windows Media Foundation Source Reader introduces a certain delay when reading from UVC directly, and the delay changes periodically. Please give me a solution. Thank you!

![11](https://github.com/microsoft/media-foundation/assets/24315713/f8f4600e-ed03-4d6a-9207-b9f873d6cbea)