Schaka / janitorr

Cleans your Radarr, Sonarr, Jellyseerr and Jellyfin before you run out of space
GNU General Public License v3.0

feign.FeignException$BadRequest: [400 Bad Request] during [POST] #53

Closed · OvernightSuccess123 closed this issue 2 months ago

OvernightSuccess123 commented 2 months ago

Hi Schaka,

I'm hoping to get some support on an issue I'm running into.

I keep seeing this error in the logs about 30 seconds after running sudo docker compose up -d.


:: Spring Boot :: (v3.3.2)

2024-08-15T19:01:06.415-04:00 INFO 1 --- [ main] c.g.s.janitorr.JanitorrApplicationKt : Starting JanitorrApplicationKt using Java 21.0.4 with PID 1 (/app/classes started by root in /)
2024-08-15T19:01:06.419-04:00 INFO 1 --- [ main] c.g.s.janitorr.JanitorrApplicationKt : No active profile set, falling back to 1 default profile: "default"
2024-08-15T19:01:07.543-04:00 INFO 1 --- [ main] o.s.b.w.embedded.tomcat.TomcatWebServer : Tomcat initialized with port 8978 (http)
2024-08-15T19:01:07.554-04:00 INFO 1 --- [ main] o.apache.catalina.core.StandardService : Starting service [Tomcat]
2024-08-15T19:01:07.554-04:00 INFO 1 --- [ main] o.apache.catalina.core.StandardEngine : Starting Servlet engine: [Apache Tomcat/10.1.26]
2024-08-15T19:01:07.593-04:00 INFO 1 --- [ main] o.a.c.c.C.[Tomcat].[localhost].[/] : Initializing Spring embedded WebApplicationContext
2024-08-15T19:01:07.594-04:00 INFO 1 --- [ main] w.s.c.ServletWebServerApplicationContext : Root WebApplicationContext: initialization completed in 1117 ms
2024-08-15T19:01:08.549-04:00 INFO 1 --- [ main] o.s.b.w.embedded.tomcat.TomcatWebServer : Tomcat started on port 8978 (http) with context path '/'
2024-08-15T19:01:08.565-04:00 INFO 1 --- [ main] c.g.s.janitorr.JanitorrApplicationKt : Started JanitorrApplicationKt in 2.488 seconds (process running for 2.732)
2024-08-15T19:01:08.605-04:00 INFO 1 --- [ scheduling-1] c.g.s.j.cleanup.AbstractCleanupSchedule : Not deleting Shows because minimum disk threshold was not reached.
2024-08-15T19:01:08.606-04:00 INFO 1 --- [ scheduling-1] c.g.s.j.cleanup.AbstractCleanupSchedule : Free disk space: 86.3505946715002%
2024-08-15T19:01:08.606-04:00 INFO 1 --- [ scheduling-1] c.g.s.j.cleanup.AbstractCleanupSchedule : Not deleting Movies because minimum disk threshold was not reached.
2024-08-15T19:01:08.606-04:00 INFO 1 --- [ scheduling-1] c.g.s.j.cleanup.AbstractCleanupSchedule : Free disk space: 86.3505946715002%
2024-08-15T19:01:18.802-04:00 INFO 1 --- [ scheduling-1] c.g.s.j.s.sonarr.SonarrRestService : Dry run - not deleting any TV shows without files or monitoring
2024-08-15T19:01:19.635-04:00 ERROR 1 --- [ scheduling-1] o.s.s.s.TaskUtils$LoggingErrorHandler : Unexpected error occurred in scheduled task

feign.FeignException$BadRequest: [400 Bad Request] during [POST] to [http://192.168.1.7:8096/Library/VirtualFolders?name=Shows%20%28Deleted%20Soon%29&collectionType=TvShows&paths=/leaving-soon/tv/tag-based&refreshLibrary=false] [MediaServerClient#createLibrary(String,String,AddLibraryRequest,List)]: [Error processing request.]
    at feign.FeignException.clientErrorStatus(FeignException.java:222) ~[feign-core-13.1.jar:na]
    at feign.FeignException.errorStatus(FeignException.java:203) ~[feign-core-13.1.jar:na]
    at feign.FeignException.errorStatus(FeignException.java:194) ~[feign-core-13.1.jar:na]
    at feign.codec.ErrorDecoder$Default.decode(ErrorDecoder.java:103) ~[feign-core-13.1.jar:na]
    at feign.InvocationContext.decodeError(InvocationContext.java:126) ~[feign-core-13.1.jar:na]
    at feign.InvocationContext.proceed(InvocationContext.java:72) ~[feign-core-13.1.jar:na]
    at feign.ResponseHandler.handleResponse(ResponseHandler.java:63) ~[feign-core-13.1.jar:na]
    at feign.SynchronousMethodHandler.executeAndDecode(SynchronousMethodHandler.java:114) ~[feign-core-13.1.jar:na]
    at feign.SynchronousMethodHandler.invoke(SynchronousMethodHandler.java:70) ~[feign-core-13.1.jar:na]
    at feign.ReflectiveFeign$FeignInvocationHandler.invoke(ReflectiveFeign.java:99) ~[feign-core-13.1.jar:na]
    at jdk.proxy2/jdk.proxy2.$Proxy81.createLibrary(Unknown Source) ~[na:na]
    at com.github.schaka.janitorr.mediaserver.AbstractMediaServerRestService.updateLeavingSoon(AbstractMediaServerRestService.kt:217) ~[classes/:na]
    at com.github.schaka.janitorr.cleanup.AbstractCleanupSchedule.deleteTvShows(AbstractCleanupSchedule.kt:103) ~[classes/:na]
    at com.github.schaka.janitorr.cleanup.AbstractCleanupSchedule$scheduleDelete$2.invoke(AbstractCleanupSchedule.kt:49) ~[classes/:na]
    at com.github.schaka.janitorr.cleanup.AbstractCleanupSchedule$scheduleDelete$2.invoke(AbstractCleanupSchedule.kt:49) ~[classes/:na]
    at com.github.schaka.janitorr.cleanup.AbstractCleanupSchedule.cleanupMediaType(AbstractCleanupSchedule.kt:76) ~[classes/:na]
    at com.github.schaka.janitorr.cleanup.AbstractCleanupSchedule.scheduleDelete(AbstractCleanupSchedule.kt:49) ~[classes/:na]
    at com.github.schaka.janitorr.cleanup.TagBasedCleanupSchedule.runSchedule(TagBasedCleanupSchedule.kt:52) ~[classes/:na]
    at java.base/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(Unknown Source) ~[na:na]
    at java.base/java.lang.reflect.Method.invoke(Unknown Source) ~[na:na]
    at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:354) ~[spring-aop-6.1.11.jar:6.1.11]
    at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:196) ~[spring-aop-6.1.11.jar:6.1.11]
    at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163) ~[spring-aop-6.1.11.jar:6.1.11]
    at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:768) ~[spring-aop-6.1.11.jar:6.1.11]
    at org.springframework.cache.interceptor.CacheInterceptor.lambda$invoke$0(CacheInterceptor.java:64) ~[spring-context-6.1.11.jar:6.1.11]
    at org.springframework.cache.interceptor.CacheAspectSupport.invokeOperation(CacheAspectSupport.java:416) ~[spring-context-6.1.11.jar:6.1.11]
    at org.springframework.cache.interceptor.CacheAspectSupport.evaluate(CacheAspectSupport.java:548) ~[spring-context-6.1.11.jar:6.1.11]
    at org.springframework.cache.interceptor.CacheAspectSupport.execute(CacheAspectSupport.java:433) ~[spring-context-6.1.11.jar:6.1.11]
    at org.springframework.cache.interceptor.CacheAspectSupport.execute(CacheAspectSupport.java:395) ~[spring-context-6.1.11.jar:6.1.11]
    at org.springframework.cache.interceptor.CacheInterceptor.invoke(CacheInterceptor.java:74) ~[spring-context-6.1.11.jar:6.1.11]
    at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:184) ~[spring-aop-6.1.11.jar:6.1.11]
    at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:768) ~[spring-aop-6.1.11.jar:6.1.11]
    at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:720) ~[spring-aop-6.1.11.jar:6.1.11]
    at com.github.schaka.janitorr.cleanup.TagBasedCleanupSchedule$$SpringCGLIB$$0.runSchedule() ~[classes/:na]
    at java.base/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(Unknown Source) ~[na:na]
    at java.base/java.lang.reflect.Method.invoke(Unknown Source) ~[na:na]
    at org.springframework.scheduling.support.ScheduledMethodRunnable.runInternal(ScheduledMethodRunnable.java:130) ~[spring-context-6.1.11.jar:6.1.11]
    at org.springframework.scheduling.support.ScheduledMethodRunnable.lambda$run$2(ScheduledMethodRunnable.java:124) ~[spring-context-6.1.11.jar:6.1.11]
    at io.micrometer.observation.Observation.observe(Observation.java:499) ~[micrometer-observation-1.13.2.jar:1.13.2]
    at org.springframework.scheduling.support.ScheduledMethodRunnable.run(ScheduledMethodRunnable.java:124) ~[spring-context-6.1.11.jar:6.1.11]
    at org.springframework.scheduling.support.DelegatingErrorHandlingRunnable.run(DelegatingErrorHandlingRunnable.java:54) ~[spring-context-6.1.11.jar:6.1.11]
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source) ~[na:na]
    at java.base/java.util.concurrent.FutureTask.runAndReset(Unknown Source) ~[na:na]
    at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(Unknown Source) ~[na:na]
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source) ~[na:na]
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) ~[na:na]
    at java.base/java.lang.VirtualThread.run(Unknown Source) ~[na:na]
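
(Editorial note: the failing call can be replayed outside Janitorr to confirm it is Jellyfin rejecting the path. A minimal sketch, assuming an API key created in Jellyfin's dashboard; JELLYFIN_API_KEY is a placeholder, and the endpoint and query string are copied verbatim from the trace above:)

# Replay the request Janitorr sends (sketch, not an official procedure)
curl -i -X POST \
  -H "X-Emby-Token: ${JELLYFIN_API_KEY}" \
  "http://192.168.1.7:8096/Library/VirtualFolders?name=Shows%20%28Deleted%20Soon%29&collectionType=TvShows&paths=/leaving-soon/tv/tag-based&refreshLibrary=false"
# A 400 "Error processing request." here, matching the trace, points at Jellyfin
# rejecting the request itself - typically because it cannot resolve the given path.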

I can post any file you think might be useful.

janitorr docker-compose

services:
  janitorr:
    container_name: janitorr
    image: ghcr.io/schaka/janitorr:stable
    environment:
      - PUID=1001
      - PGID=1000
      - TZ=America/New_York
    volumes:
      - ./config:/config
      - /srv/dev-disk-by-uuid-70da2aeb-c7f4-4e77-bf49-4000013a4216/data/media:/data

janitorr application.yml with private sections redacted

server:
  port: 8978

# File system access (same mapping as Sonarr, Radarr and Jellyfin) is required to delete TV shows by season and create "Leaving Soon" collections in Jellyfin.
# Currently, Jellyfin does not support an easy way to add only a few seasons or movies to a collection, so we need access to temporary symlinks.
# Additionally, checks to prevent deletion of media that is still seeding require file system access as well.
file-system:
  access: true
  validate-seeding: true # validates seeding by checking if the original file exists and skips deletion - turning this off will send a delete to the arrs even if a torrent may still be [...]
  leaving-soon-dir: "/leaving-soon" # A directory this container can write to and Jellyfin can find under the same path - this will contain new folders with symlinks to files for Jellyfin [...]
  from-scratch: true # Clean up entire "Leaving Soon" directory and rebuild from scratch - this can help with clearing orphaned data - turning this off can save resources (less writes to [...]
  free-space-check-dir: "/" # This is the default directory Janitorr uses to check how much space is left on your drives. By default, it checks the entire root - you may point it at a specific [...]

application:
  dry-run: true
  whole-tv-show: false # activating this will treat a whole show as recently downloaded/watched from a single episode, rather than that episode's season - shows will be deleted as a whole [...]
  whole-show-seeding-check: false # Turning this off disables the seeding check entirely if whole-tv-show is enabled. Activating this check will keep a whole TV show if any season is still [...]
  leaving-soon: 14d # 14 days before a movie is deleted, it gets added to a "Leaving Soon" type collection (i.e. movies that are 76 to 89 days old)
  exclusion-tag: "janitorr_keep" # Set this tag on your movies or TV shows in the arrs to exclude media from being cleaned up

  media-deletion:
    enabled: true
    movie-expiration:
      # Percentage of free disk space to expiration time - if the highest given number is not reached, nothing will be deleted
      # If filesystem access is not given, disk percentage can't be determined. As a result, Janitorr will always choose the largest expiration time.
      5: 15d # 15 days
      10: 30d # 1 month - if a movie's files on your system are older than this, they will be deleted
      15: 30d # 2 months
      20: 90d # 3 months
    season-expiration:
      5: 15d # 15 days
      10: 20d # 20 days - if a season's files on your system are older than this, they will be deleted
      15: 60d # 2 months
      20: 120d # 4 months

  tag-based-deletion:
    enabled: true
    minimum-free-disk-percent: 100
    schedules:
      - tag: 5 - demo
        expiration: 30d
      - tag: 10 - demo
        expiration: 7d

  episode-deletion: # This ignores Jellystat. Only grab history matters. It also doesn't clean up Jellyfin. There is NO seeding check either.
    enabled: true
    tag: janitorr_daily # Shows tagged with this will have all episodes of their LATEST season deleted by the below thresholds
    max-episodes: 10 # maximum (latest) episodes of this season to keep
    max-age: 30d # Maximum age to keep any episode at all - even the last 10 episodes would expire after 30 days in this example
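
(Editorial aside: the expiration maps pair free-disk-space thresholds with minimum file ages. The sketch below is one plausible reading, consistent with the in-file comments and with the log above, where 86.35% free disk produced no deletions; it is an interpretation, not official Janitorr documentation:)

# Assumed semantics of the "free disk % -> expiration" map (illustrative only):
#   free disk at or below  5%  -> movies older than 15d become eligible for deletion
#   free disk at or below 10%  -> movies older than 30d
#   free disk at or below 15%  -> movies older than 30d (per this file's values)
#   free disk at or below 20%  -> movies older than 90d
#   free disk above the highest key (20%) -> nothing is deleted
#   (matches the log: "Free disk space: 86.35...%" / "minimum disk threshold was not reached")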

jellyfin docker-compose

# This file is auto-generated by openmediavault (https://www.openmediavault.org)
# WARNING: Do not edit this file, your changes will get lost.

# Jellyfin - Media server and client app.
# https://hub.docker.com/r/linuxserver/jellyfin
services:
  jellyfin:
    image: lscr.io/linuxserver/jellyfin:latest
    network_mode: bridge
    container_name: jellyfin
    volumes:
      - ./config:/config
      - /srv/dev-disk-by-uuid-70da2aeb-c7f4-4e77-bf49-4000013a4216/data/media/tvshows:/data/tvshows
      - /srv/dev-disk-by-uuid-70da2aeb-c7f4-4e77-bf49-4000013a4216/data/media/movies:/data/movies
      - /srv/dev-disk-by-uuid-70da2aeb-c7f4-4e77-bf49-4000013a4216/data/media/music:/data/music
      - /srv/dev-disk-by-uuid-70da2aeb-c7f4-4e77-bf49-4000013a4216/data/media/books:/data/books
      - /srv/dev-disk-by-uuid-70da2aeb-c7f4-4e77-bf49-4000013a4216/data/media/leaving-soon:/data/leaving-soon
    environment:
      - PUID=1001
      - PGID=1000
      - TZ=America/New_York
      - JELLYFIN_PublishedServerUrl=192.168.1.7 #optional
      - NVIDIA_DRIVER_CAPABILITIES=all
      - NVIDIA_VISIBLE_DEVICES=all
    runtime: nvidia
    deploy:
      resources:
        reservations:
          devices:
            - capabilities: [gpu]
    devices:
      - /dev/nvidia-caps:/dev/nvidia-caps
      - /dev/nvidia0:/dev/nvidia0
      - /dev/nvidiactl:/dev/nvidiactl
      - /dev/nvidia-modeset:/dev/nvidia-modeset
      - /dev/nvidia-uvm:/dev/nvidia-uvm
      - /dev/nvidia-uvm-tools:/dev/nvidia-uvm-tools
    ports:
      - 8096:8096
      - 8920:8920 #optional
      - 7359:7359/udp #optional
      - 1900:1900/udp #optional
    restart: always
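
(Editorial note: with both compose files in view, a quick way to spot the mismatch is to check what each container can actually reach. A sketch using docker exec and ls, with the container names taken from the files above:)

# Which paths exist where? (illustrative diagnostic)
docker exec janitorr ls /leaving-soon/tv     # the path Janitorr passes to Jellyfin (from application.yml)
docker exec jellyfin ls /leaving-soon/tv     # does Jellyfin see that same path? (no such mount here)
docker exec jellyfin ls /data/leaving-soon   # the path Jellyfin's compose actually mounts
# Jellyfin only receives a path string; if that string doesn't resolve inside
# Jellyfin's own container, the 400 above is the expected result.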
Schaka commented 2 months ago

Usually that error means Jellyfin couldn't find the leaving-soon folder and thus didn't create it. The 400 here means you passed it an incorrect folder.

Does Jellyfin have access to /leaving-soon/tv/tag-based? It seems to me you called it /data/leaving-soon inside the Jellyfin container.

If I understand your mappings correctly, your leaving-soon-dir inside your application.yml should be:

leaving-soon-dir: "/data/leaving-soon"

The directory /leaving-soon isn't known to either container. Janitorr only knows /data/leaving-soon, and it looks like Jellyfin knows it under that name too.
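
(Editorial note: to make the fix concrete, here is how the three pieces line up under the mounts posted above; the only change is the leaving-soon-dir value, exactly as suggested:)

# janitorr compose mounts the host media root at /data inside the container:
#   - /srv/dev-disk-by-uuid-70da2aeb-c7f4-4e77-bf49-4000013a4216/data/media:/data
# jellyfin compose mounts the same host folder's leaving-soon subfolder at the same container path:
#   - /srv/dev-disk-by-uuid-70da2aeb-c7f4-4e77-bf49-4000013a4216/data/media/leaving-soon:/data/leaving-soon
# application.yml then points Janitorr at the one path both containers share:
file-system:
  leaving-soon-dir: "/data/leaving-soon"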

OvernightSuccess123 commented 2 months ago

Thanks kindly. I triple-checked against the guides to make sure the folder mapping was correct, and now it works. Please close.