webcamoid / akvirtualcamera

akvirtualcamera, virtual camera for Mac and Windows
GNU General Public License v3.0

[question] Virtual devices not found in /dev/videoX #32

Closed kirk86 closed 1 year ago

kirk86 commented 2 years ago

Summary

I can't find any devices under /dev/video after successfully creating devices with AkVCamManager

Current Behavior

The devices are created successfully and show up in Zoom, but I can't see them in VLC.

Expected Behavior

After creating the devices I was expecting to see /dev/video0, /dev/video1 etc.

How should it have worked? After creating the devices, I expected to see them under /dev/video or something like /dev/media.

Steps to Reproduce

  1. macOS 11.5.1
  2. install AkVCamManager from the binary .pkg of the latest stable release
  3. create a config.ini that creates 2 devices (output, capture); an equivalent set of CLI commands is sketched after this list
  4. check /dev/video* for devices, but nothing shows up
  5. AkVCamManager devices
    +--------------------+--------------------------------+
    | Device             | Description                    |
    +--------------------+--------------------------------+
    | AkVCamVideoDevice0 | Virtual Camera (output device) |
    | AkVCamVideoDevice1 | Virtual Camera                 |
    +--------------------+--------------------------------+
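
For reference, the two devices from step 3 can also be created one at a time from the command line. The subcommand names below are my assumption based on the project README, so double-check them against AkVCamManager's built-in help before relying on them:

    # assumed subcommands; verify against the manager's built-in help
    AkVCamManager add-device "Virtual Camera (output device)"
    AkVCamManager add-device "Virtual Camera"
    AkVCamManager add-format AkVCamVideoDevice0 RGB24 640 480 30
    AkVCamManager update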

Your Environment

Am I missing something?

pinging @hipersayanX for visibility, thanks!

hipersayanX commented 2 years ago

After creating the devices I was expecting to see /dev/video0, /dev/video1 etc.

How should it have worked? After creating the devices, I expected to see them under /dev/video or something like /dev/media.

You are confusing GNU/Linux with Mac. /dev/video devices are exclusive to V4L2, which is not implemented on Mac. Mac uses AVFoundation for camera capture, which is a completely different API, so you will never see /dev/video or anything like that there.
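
If you want to confirm the virtual camera is visible to AVFoundation clients, a quick probe through OpenCV's AVFoundation backend is one way to do it. The index the camera ends up at is machine-dependent, so this sketch just scans a few:

    # Probe macOS capture devices through OpenCV's AVFoundation backend.
    # Virtual cameras appear as capture indices here, never as /dev/videoX.
    import cv2

    for index in range(5):
        cap = cv2.VideoCapture(index, cv2.CAP_AVFOUNDATION)
        if cap.isOpened():
            ok, _frame = cap.read()
            print(f"index {index}: device opened, frame grabbed: {ok}")
        else:
            print(f"index {index}: no device")
        cap.release()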

kirk86 commented 2 years ago

Thanks for the clarification!

I'm trying to use a program which reads from the virtual camera and does some processing and then pushes that onto the real device.

The issue is that the program was written with V4L2 in mind and in its config file I have the following lines:

virtual_video_device: "/dev/video0"
real_video_device: "/dev/video1"

Is there any way to replace /dev/video0 with the devices created by AkVCamManager? If so, how can I achieve that?

Thanks!

hipersayanX commented 2 years ago

I'm trying to use a program which reads from the virtual camera and does some processing and then pushes that onto the real device.

Hmmm... maybe you mean it the other way around?

I'm trying to use a program which reads from the real device and does some processing and then pushes that onto the virtual camera.

Because the first sentence makes no technical sense.

Is there any way to replace /dev/video0 with the devices created by AkVCamManager? If so, how can I achieve that?

I don't know what program you are talking about, but if you want to push frames from your program to the virtual camera, you can check the code examples here.
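
The general idea in those examples, translated to Python, is to start a streaming session with AkVCamManager and pipe raw frames into its standard input. A rough sketch; the exact stream arguments are my recollection of the manager's CLI, so verify them against its help output:

    # Rough sketch: pipe raw RGB24 frames into the virtual camera via
    # "AkVCamManager stream". The argument order (device, format, width,
    # height) is assumed; verify it against the manager's help output.
    import subprocess
    import numpy as np

    WIDTH, HEIGHT, FPS = 640, 480, 30

    proc = subprocess.Popen(
        ["AkVCamManager", "stream", "AkVCamVideoDevice0", "RGB24",
         str(WIDTH), str(HEIGHT)],
        stdin=subprocess.PIPE,
    )

    try:
        for _ in range(FPS * 5):            # ~5 seconds of a solid red frame
            frame = np.zeros((HEIGHT, WIDTH, 3), dtype=np.uint8)
            frame[..., 0] = 255             # red channel (RGB order)
            proc.stdin.write(frame.tobytes())
    finally:
        proc.stdin.close()                  # closing stdin ends the stream
        proc.wait()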

kirk86 commented 2 years ago

Hmmm... maybe you mean it the other way around?

Yes, you are correct, sorry about the wrong statement.

if you want to push frames from your program to the virtual camera, you can check the code examples here

Thanks for this I wasn't aware of it.

kirk86 commented 2 years ago

@hipersayanX I have a question, if you don't mind. Following the example code, I'm able to write to the virtual camera using AkVCamManager, but for some reason the content is not displayed properly.

For instance, I have the following image, which I'm reading from disk. The image is an array [width=700, height=561, channels=3], and I'm resizing it to 1024x720 to match the settings of the virtual camera.

[screenshot: source image read from disk]

But when I write this image frame to the virtual camera, I get the following output. Any ideas what I'm doing wrong here?

[screenshot: corrupted output shown by the virtual camera]

hipersayanX commented 2 years ago

If the scaling is correct, it could be that you are sending the frames in the wrong format. Each pixel is 24 bits long: 8 bits red, 8 bits green, 8 bits blue, in that order, with no padding bits. Each frame is read from the top-left corner to the bottom-right corner, lines must not be padded, and the width must be a multiple of 32. For example, if you want to send an RGB24 1024x720 frame, the line size should be 1024 * 24 / 8 == 3072 bytes and the frame size should be 3072 * 720 == 2211840 bytes. If everything is correct you should be able to see the frame on the other side; otherwise the code you are using for scaling is wrong.
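
To make the numbers concrete, here is the same arithmetic as a small Python/numpy sanity check:

    # Sanity-check the RGB24 layout described above for a 1024x720 frame.
    import numpy as np

    width, height = 1024, 720
    assert width % 32 == 0            # width must be a multiple of 32

    line_size = width * 24 // 8       # 3072 bytes per line, no padding
    frame_size = line_size * height   # 2211840 bytes per frame

    frame = np.zeros((height, width, 3), dtype=np.uint8)  # 8 bits per component
    data = frame.tobytes()            # rows top to bottom, pixels left to right

    assert len(data) == frame_size    # exactly what the camera expects
    print(line_size, frame_size)      # 3072 2211840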

kirk86 commented 2 years ago

Thanks, I tried all possible formats and only RGB24 seems stable, but the colours are distorted.

For example, if you want to send an RGB24 1024x720 frame, the line size should be 1024 * 24 / 8 == 3072 bytes and the frame size should be 3072 * 720 == 2211840 bytes.

I have no idea what that means. I'm reading an image using OpenCV, which returns a numpy array of size 1024x720x3 and type float32; then I convert it to uint8 and send it as bytes to the virtual camera. Am I doing something wrong?

Real image: [screenshot: source image]

This is what gets written to the virtual camera: [screenshot: output with distorted colours]

hipersayanX commented 2 years ago

That's because OpenCV orders the components as BGR instead of RGB; just swap the red and blue components.
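
In OpenCV that's a one-line fix before the frame is serialized, for example (the filename here is just a placeholder):

    # OpenCV loads images as BGR; the virtual camera expects RGB24,
    # so swap the channel order before converting to bytes.
    import cv2

    bgr = cv2.imread("input.png")               # placeholder filename; BGR, uint8
    rgb = cv2.cvtColor(bgr, cv2.COLOR_BGR2RGB)  # or equivalently: bgr[:, :, ::-1]
    data = rgb.tobytes()                        # raw RGB24 bytes for the camera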

kirk86 commented 2 years ago

Thanks for pointing that out, any ideas as to why the other formats produce artifacts?

hipersayanX commented 2 years ago

any ideas as to why the other formats produce artifacts?

Because you are sending the frames in the wrong format, only RGB24 is supported for now.

kirk86 commented 2 years ago

only RGB24 is supported for now

Most probably that's the case, since the output looks correct after I fixed the channel order following your suggestion.