automatic-ripping-machine / automatic-ripping-machine

Automatic Ripping Machine (ARM) Scripts
https://b3n.org/automatic-ripping-machine
MIT License
2.63k stars · 288 forks

Failed to eject and unable to insert a new disc #572

Open lawhazl opened 2 years ago

lawhazl commented 2 years ago

Describe the bug

After the first rip, the disc fails to eject cleanly and no new disc is accepted until the ARM container is restarted.

To Reproduce Steps to reproduce the behavior:

  1. Start up the ARM container.
  2. Insert any disc; ripping starts automatically.
  3. The disc is ejected (slot-loading drive).
  4. No new disc can be inserted until the ARM container is restarted.

Log shows:

DEBUG ARM: models.eject Unmounted disc /dev/sr0
DEBUG ARM: models.eject Failed to eject /dev/sr0

Environment

OS Distribution and version (cat /etc/lsb-release):

DISTRIB_ID=Ubuntu
DISTRIB_RELEASE=20.04
DISTRIB_CODENAME=focal
DISTRIB_DESCRIPTION="Ubuntu 20.04.4 LTS"

ARM installed on docker

ARM Release Version: v2.6.0

Repository: automaticrippingmachine/automatic-ripping-machine:latest

Log file

MRS_DOUBTFIRE.log

github-actions[bot] commented 2 years ago

If you're having issues, please remember to read the wiki and follow the instructions carefully.

Endolf commented 2 years ago

I'm seeing this issue too. The status light on the drive flashes constantly as though it's stuck trying to eject the disk, so when inserting a new one it just ejects again.

Running in a docker container in unraid with a verbatim 43888.

lawhazl commented 2 years ago

To get around this, I'm sending a JSON notification via webhook to NodeRED to restart the ARM container automatically after the rip and transcode are completed.
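A minimal sketch of that workaround, assuming the docker CLI is available on the host and the container is named `arm` (both assumptions; lawhazl triggers the restart from NodeRED rather than a script):

```shell
# Hypothetical sketch: bounce the ARM container once rip/transcode finish,
# since a restart is currently the only way to make the drive usable again.
# The container name "arm" is an assumption; adjust it to your setup.
restart_arm_container() {
    name="${1:-arm}"
    # Stopping and starting the container re-registers the optical drive
    # and clears the stuck-eject state.
    docker restart "$name"
}
```

In this setup the NodeRED flow would call something like this on receiving the webhook; a post-job notification hook or cron job could trigger it equally well.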

SoulOfSet commented 2 years ago

I saw this issue in the Docker container as well. After the first rip, the disc drive would immediately open again when I closed it, and I had to restart the container to get it to rip again. I ended up installing ARM directly instead of using the Docker container to get around this.

Running sudo setcd -f0 /dev/sr0 would let me close the CD drive again, but it still wouldn't pick up new discs to rip.

tylerwmarrs commented 1 year ago

Same issue with first rip and constant disc eject afterwards.

My setup:

Portainer Community Edition Docker Image : https://hub.docker.com/layers/automaticrippingmachine/automatic-ripping-machine/latest/images/sha256-8ce494e93afa9a0e7f6ba3d812b953d244019a4f559da5ae46512a6c3bdd2ee7?context=explore

When I have some time, I'll probably create a custom docker image based on the arm-dependencies base image. It looks like it is updated more frequently anyways.

https://hub.docker.com/r/automaticrippingmachine/arm-dependencies

shitwolfymakes commented 1 year ago

New versions have been released, so try updating. Closing as stale; feel free to reopen if the problem persists!

z4ch523 commented 1 year ago

I have the same problem if I open and close the tray via the drive icon under settings > general info > disk drives. This is the docker log from before I open the tray until it reopens after closing it via the GUI:

eject: device name is `/dev/sr0'
eject: expanded name is `/dev/sr0'
eject: `/dev/sr0' is not mounted
eject: `/dev/sr0' is not a mount point
eject: `/dev/sr0' is not a multipartition device
eject: trying to eject `/dev/sr0' using CD-ROM eject command
eject: CD-ROM eject command succeeded
[2023-03-02 21:33:12,344] DEBUG ARM: models.open_close Ejected disc /dev/sr0
fatal: could not read Username for 'https://github.com': No such device or address
[2023-03-02 21:33:12,649] DEBUG ARM: utils.git_check_updates 128
[2023-03-02 21:33:12,649] DEBUG ARM: utils.git_check_updates 4d0f1c4be943c5a623e69d1de42614a066efb206 commit    refs/remotes/origin/main
[2023-03-02 21:33:12,649] DEBUG ARM: utils.git_check_updates 4d0f1c4be943c5a623e69d1de42614a066efb206
[2023-03-02 21:33:12,649] DEBUG ARM: utils.git_check_updates True
[2023-03-02 21:33:12,654] DEBUG ARM: utils.arm_alembic_get Alembic Head is: 95623e8c5d58
[2023-03-02 21:33:12,657] DEBUG ARM: utils.arm_db_get Database Head is: 95623e8c5d58
[2023-03-02 21:33:12,657] DEBUG ARM: utils.arm_db_check Database is current. Head: 95623e8c5d58DB: 95623e8c5d58
[2023-03-02 21:33:12,661] DEBUG ARM: ServerUtil.get_cpu_util Server CPU Util: 4.1
[2023-03-02 21:33:12,669] DEBUG ARM: ServerUtil.get_cpu_temp Server CPU Temp:  0
[2023-03-02 21:33:12,669] DEBUG ARM: ServerUtil.get_memory Server Mem Free:  34.7
[2023-03-02 21:33:12,669] DEBUG ARM: ServerUtil.get_memory Server Mem Used:  9.4
[2023-03-02 21:33:12,670] DEBUG ARM: ServerUtil.get_memory Server Mem Percent:  25.2
[2023-03-02 21:33:12,670] DEBUG ARM: ServerUtil.get_disk_space Server /home/arm/media/transcode/ Space:  1663.6
[2023-03-02 21:33:12,670] DEBUG ARM: ServerUtil.get_disk_space Server /home/arm/media/transcode/ Percent:  79.0
[2023-03-02 21:33:12,671] DEBUG ARM: ServerUtil.get_disk_space Server /home/arm/media/completed/ Space:  1663.6
[2023-03-02 21:33:12,671] DEBUG ARM: ServerUtil.get_disk_space Server /home/arm/media/completed/ Percent:  79.0
[2023-03-02 21:33:12,674] DEBUG ARM: DriveUtils.drive_status_debug *********
[2023-03-02 21:33:12,674] DEBUG ARM: DriveUtils.drive_status_debug Name: Drive 1
[2023-03-02 21:33:12,674] DEBUG ARM: DriveUtils.drive_status_debug Type: CD/BluRay/DVD
[2023-03-02 21:33:12,675] DEBUG ARM: DriveUtils.drive_status_debug Mount: /dev/sr0
[2023-03-02 21:33:12,675] DEBUG ARM: DriveUtils.drive_status_debug Open: True
[2023-03-02 21:33:12,675] DEBUG ARM: DriveUtils.drive_status_debug Job Current: None
[2023-03-02 21:33:12,675] DEBUG ARM: DriveUtils.drive_status_debug Job Previous: 7
[2023-03-02 21:33:12,675] DEBUG ARM: DriveUtils.drive_status_debug Job - Status: success
[2023-03-02 21:33:12,675] DEBUG ARM: DriveUtils.drive_status_debug Job - Type: movie
[2023-03-02 21:33:12,675] DEBUG ARM: DriveUtils.drive_status_debug Job - Title: Inside-Man
[2023-03-02 21:33:12,675] DEBUG ARM: DriveUtils.drive_status_debug Job - Year: 2006
[2023-03-02 21:33:12,675] DEBUG ARM: DriveUtils.drive_status_debug *********
eject: device name is `/dev/sr0'
eject: expanded name is `/dev/sr0'
eject: `/dev/sr0' is not mounted
eject: `/dev/sr0' is not a mount point
eject: closing tray
fatal: could not read Username for 'https://github.com': No such device or address
[2023-03-02 21:33:17,813] DEBUG ARM: utils.git_check_updates 128
[2023-03-02 21:33:17,814] DEBUG ARM: utils.git_check_updates 4d0f1c4be943c5a623e69d1de42614a066efb206 commit    refs/remotes/origin/main
[2023-03-02 21:33:17,814] DEBUG ARM: utils.git_check_updates 4d0f1c4be943c5a623e69d1de42614a066efb206
[2023-03-02 21:33:17,814] DEBUG ARM: utils.git_check_updates True
[2023-03-02 21:33:17,818] DEBUG ARM: utils.arm_alembic_get Alembic Head is: 95623e8c5d58
[2023-03-02 21:33:17,821] DEBUG ARM: utils.arm_db_get Database Head is: 95623e8c5d58
[2023-03-02 21:33:17,822] DEBUG ARM: utils.arm_db_check Database is current. Head: 95623e8c5d58DB: 95623e8c5d58
[2023-03-02 21:33:17,825] DEBUG ARM: ServerUtil.get_cpu_util Server CPU Util: 3.3
[2023-03-02 21:33:17,834] DEBUG ARM: ServerUtil.get_cpu_temp Server CPU Temp:  0
[2023-03-02 21:33:17,834] DEBUG ARM: ServerUtil.get_memory Server Mem Free:  34.7
[2023-03-02 21:33:17,834] DEBUG ARM: ServerUtil.get_memory Server Mem Used:  9.4
[2023-03-02 21:33:17,835] DEBUG ARM: ServerUtil.get_memory Server Mem Percent:  25.2
[2023-03-02 21:33:17,835] DEBUG ARM: ServerUtil.get_disk_space Server /home/arm/media/transcode/ Space:  1663.6
[2023-03-02 21:33:17,835] DEBUG ARM: ServerUtil.get_disk_space Server /home/arm/media/transcode/ Percent:  79.0
[2023-03-02 21:33:17,836] DEBUG ARM: ServerUtil.get_disk_space Server /home/arm/media/completed/ Space:  1663.6
[2023-03-02 21:33:17,836] DEBUG ARM: ServerUtil.get_disk_space Server /home/arm/media/completed/ Percent:  79.0
[2023-03-02 21:33:17,841] DEBUG ARM: DriveUtils.drive_status_debug *********
[2023-03-02 21:33:17,841] DEBUG ARM: DriveUtils.drive_status_debug Name: Drive 1
[2023-03-02 21:33:17,841] DEBUG ARM: DriveUtils.drive_status_debug Type: CD/BluRay/DVD
[2023-03-02 21:33:17,841] DEBUG ARM: DriveUtils.drive_status_debug Mount: /dev/sr0
[2023-03-02 21:33:17,841] DEBUG ARM: DriveUtils.drive_status_debug Open: False
[2023-03-02 21:33:17,841] DEBUG ARM: DriveUtils.drive_status_debug Job Current: None
[2023-03-02 21:33:17,842] DEBUG ARM: DriveUtils.drive_status_debug Job Previous: 7
[2023-03-02 21:33:17,842] DEBUG ARM: DriveUtils.drive_status_debug Job - Status: success
[2023-03-02 21:33:17,842] DEBUG ARM: DriveUtils.drive_status_debug Job - Type: movie
[2023-03-02 21:33:17,842] DEBUG ARM: DriveUtils.drive_status_debug Job - Title: Inside-Man
[2023-03-02 21:33:17,842] DEBUG ARM: DriveUtils.drive_status_debug Job - Year: 2006
[2023-03-02 21:33:17,842] DEBUG ARM: DriveUtils.drive_status_debug *********
Mar  2 20:33:17 75d749b1e012 ARM: Entering docker wrapper
Mar  2 20:33:17 75d749b1e012 ARM: [ARM] Not CD, Blu-ray, DVD or Data. Bailing out on sr0
Mar  2 20:33:20 75d749b1e012 ARM: Entering docker wrapper
Mar  2 20:33:20 75d749b1e012 ARM: [ARM] Not CD, Blu-ray, DVD or Data. Bailing out on sr0
z4ch523 commented 1 year ago

I just realized it's the same behavior if I type the command eject -tv in the host console.

1337-server commented 1 year ago

Reopening as this is still unresolved.

I just realized it's the same behavior if I type the command eject -tv in the host console.

Does running eject -T /dev/srX or eject /dev/srX work ?

z4ch523 commented 1 year ago

No, whenever I open the drive via a command it will reopen after I close it again. I have to wait a little, then push the hardware button on the drive itself, and then it works.

jm2003uk commented 1 year ago

Just dropping a note to say I'm experiencing the same behaviour: external tray-loading drive, LED blinking rapidly, and about 8 log entries per second.

Mar 12 13:10:34 3eb3ccc41b5f ARM: Entering docker wrapper
Mar 12 13:10:34 3eb3ccc41b5f ARM: [ARM] Not CD, Blu-ray, DVD or Data. Bailing out on sr0
Mar 12 13:10:34 3eb3ccc41b5f ARM: Entering docker wrapper
Mar 12 13:10:34 3eb3ccc41b5f ARM: [ARM] Not CD, Blu-ray, DVD or Data. Bailing out on sr0
Mar 12 13:10:34 3eb3ccc41b5f ARM: Entering docker wrapper
Mar 12 13:10:34 3eb3ccc41b5f ARM: [ARM] Not CD, Blu-ray, DVD or Data. Bailing out on sr0
Mar 12 13:10:34 3eb3ccc41b5f ARM: Entering docker wrapper
Mar 12 13:10:34 3eb3ccc41b5f ARM: [ARM] Not CD, Blu-ray, DVD or Data. Bailing out on sr0
Mar 12 13:10:35 3eb3ccc41b5f ARM: Entering docker wrapper
Mar 12 13:10:35 3eb3ccc41b5f ARM: [ARM] Not CD, Blu-ray, DVD or Data. Bailing out on sr0
Mar 12 13:10:35 3eb3ccc41b5f ARM: Entering docker wrapper
Mar 12 13:10:35 3eb3ccc41b5f ARM: [ARM] Not CD, Blu-ray, DVD or Data. Bailing out on sr0
Mar 12 13:10:35 3eb3ccc41b5f ARM: Entering docker wrapper
Mar 12 13:10:35 3eb3ccc41b5f ARM: [ARM] Not CD, Blu-ray, DVD or Data. Bailing out on sr0
Mar 12 13:10:35 3eb3ccc41b5f ARM: Entering docker wrapper
Mar 12 13:10:35 3eb3ccc41b5f ARM: [ARM] Not CD, Blu-ray, DVD or Data. Bailing out on sr0

The only condition needed is for the container to be running and the drive to be opened, whether by entering a command, waiting for a disc to finish, or pressing the button on the front. The only way to close the tray again is by stopping/restarting the container.

As a side note, if the tray is already open when the container is started, it doesn't exhibit this behaviour. But if you close and then reopen it, it starts again.


Quick edit to add: if you restart udev from within the container, everything also seems to go back to normal, i.e. the tray can be closed. Could a quick and dirty fix be to restart udev automatically upon the drive ejecting?
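A rough sketch of that quick-and-dirty idea, run from the host. The container name `arm` and the `service udev restart` incantation are assumptions; the right command depends on the image's init setup.

```shell
# Hypothetical fix sketch: restart udev inside the ARM container right
# after a disc is ejected, so the tray can be closed again.
# "arm" and "service udev restart" are assumptions about the setup.
reset_udev_after_eject() {
    container="${1:-arm}"
    docker exec "$container" sh -c 'service udev restart'
}
```

If this clears the state reliably, ARM could call something like it from its eject path, though fixing the underlying udev interaction would be cleaner.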

1337-server commented 1 year ago

A button could be added. But I would prefer to fix the underlying issue instead if we can.

Have you tried it since it was updated to remove the --privileged flag? Also, are you passing in both device references for the drive? For example, --device /dev/sr0 and --device /dev/sg0.
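For reference, a docker run invocation along those lines (image tag taken from this thread; the device numbers are assumptions, so match /dev/srX and /dev/sgX to your hardware, e.g. via lsscsi -g):

```shell
# Sketch of the suggested flags: pass both device nodes for the optical
# drive instead of running --privileged. /dev/sr0 and /dev/sg0 are
# assumptions; check which sr/sg pair belongs to your drive.
docker run -d \
    --device /dev/sr0 \
    --device /dev/sg0 \
    automaticrippingmachine/automatic-ripping-machine:latest
```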

martinjuhasz commented 9 months ago

I'm running into the same problem.

@1337-server

This didn't work for me. Once I removed the privileged flag but added sr0 and sg0, I was not able to rip anything at all. Back on privileged, I can rip one CD, then it enters this dead loop.

microtechno9000 commented 7 months ago

To progress this issue further, could those with the issue please detail:

  1. System OS docker is running on
  2. Drive type, SATA/USB other?
  3. Occurrence, regular or intermittent
  4. Docker configuration for the drives and if privileged flags have been set

martinjuhasz commented 7 months ago

  1. System OS docker is running on: Unraid 6.12.6
  2. Drive type, SATA/USB/other? Verbatim USB BluRay burner
  3. Occurrence, regular or intermittent? Every time after the first rip
  4. Docker configuration for the drives and if privileged flags have been set: privileged set, but also the /dev/sr0 device forwarded

I have a friend with the same USB drive and Unraid who runs into the same problem.

microtechno9000 commented 6 months ago

I posted some information in #1060 testing a USB dvd drive with ARM, host OS Ubuntu 20.04 LTS

There may be an issue with Unraid; could those with the issue confirm the host OS?