MDeLuise / plant-it

🪴 Self-hosted, open source gardening companion app
https://plant-it.org
GNU General Public License v3.0
686 stars 25 forks

Unable to connect to Redis (Docker) #140

Closed WallK closed 7 months ago

WallK commented 7 months ago


Description

When I open the frontend in the browser and log in, I get an "Unable to connect to Redis" notification.

I'm not sure where "cache/:6379" and the like comes from. The Redis container uses the default config and the same port. Does the application expect the service name "cache" and nothing else?

Expected behaviour

No response

Steps to reproduce

No response

Local environment

It's a Docker deployment. Here's the relevant part of docker-compose.yaml:

  # === PLANT-IT ===
  plant-it-backend:
    container_name: plant-it-backend
    image: msdeluise/plant-it-backend:latest
    env_file: ./envs/plant-it-backend.env
    depends_on:
      - mariadb
      - redis-cache
    volumes:
      - "./volumes/plant-it/upload-dir:/upload-dir"
      - "./volumes/plant-it/certs:/certificates"
      # - "certs:/certificates" #bound to tmpfs or something similar?
    ports:
      - "3001:8080" #API port
    restart: unless-stopped

  plant-it-frontend:
    container_name: plant-it-frontend
    image: msdeluise/plant-it-frontend:latest
    env_file: ./envs/plant-it-frontend.env
    links:
      - plant-it-backend
    ports:
      - "3000:3000" #frontend port
    volumes:
      - "./volumes/plant-it/certs:/certificates"
      # - "certs:/certificates" #virtual fs of some kind?
    restart: unless-stopped

  #MySQL (mariadb) DB for inventree and plant-it
  mariadb:
    container_name: mariadb
    image: lscr.io/linuxserver/mariadb:latest
    env_file:
      - ./envs/mariadb.env
      - ./envs/plant-it-backend.env
    ports:
      - "3306:3306"
    volumes:
      - ./volumes/mariadb/config:/config
      - ./volumes/mariadb/db_backup:/backup
    restart: unless-stopped

  # redis acts as database cache manager for inventree and plant-it
  redis-cache:
    container_name: redis-cache
    image: redis:7.2.1
    depends_on:
      - mariadb
    # profiles:
    #     - redis
    env_file:
      - ./envs/inventree.env
    expose:
      - ${INVENTREE_CACHE_PORT:-6379}
    restart: always

Relevant part of plant-it-backend.env:

#
# Cache
#
CACHE_TTL=86400
CACHE_HOST=cache
CACHE_PORT=6379

Additional info

Part of the Redis log:

1:C 09 Apr 2024 13:04:22.516 * oO0OoO0OoO0Oo Redis is starting oO0OoO0OoO0Oo
1:C 09 Apr 2024 13:04:22.516 * Redis version=7.2.1, bits=64, commit=00000000, modified=0, pid=1, just started
1:C 09 Apr 2024 13:04:22.516 # Warning: no config file specified, using the default config. In order to specify a config file use redis-server /path/to/redis.conf
1:M 09 Apr 2024 13:04:22.519 * monotonic clock: POSIX clock_gettime
1:M 09 Apr 2024 13:04:22.528 * Running mode=standalone, port=6379.
1:M 09 Apr 2024 13:04:22.533 * Server initialized
1:M 09 Apr 2024 13:04:22.541 * Loading RDB produced by version 7.2.1
1:M 09 Apr 2024 13:04:22.541 * RDB age 2 seconds
1:M 09 Apr 2024 13:04:22.542 * RDB memory usage when created 0.93 Mb
1:M 09 Apr 2024 13:04:22.542 * Done loading RDB, keys loaded: 2, keys expired: 0.
1:M 09 Apr 2024 13:04:22.542 * DB loaded from disk: 0.009 seconds
1:M 09 Apr 2024 13:04:22.542 * Ready to accept connections tcp

Part of the backend container log (this is everything I can see in Portainer):

2024-04-09T13:07:21.524Z  INFO 10 --- [           main] c.g.mdeluise.plantit.ApplicationConfig   : UPDATE_EXISTING flag set to false. Skipping update of existing species.
2024-04-09T13:07:23.716Z ERROR 10 --- [   scheduling-1] o.s.s.s.TaskUtils$LoggingErrorHandler    : Unexpected error occurred in scheduled task
org.springframework.data.redis.RedisConnectionFailureException: Unable to connect to Redis
    at org.springframework.data.redis.connection.lettuce.LettuceConnectionFactory$ExceptionTranslatingConnectionProvider.translateException(LettuceConnectionFactory.java:1602) ~[spring-data-redis-3.0.0.jar!/:3.0.0]
    at org.springframework.data.redis.connection.lettuce.LettuceConnectionFactory$ExceptionTranslatingConnectionProvider.getConnection(LettuceConnectionFactory.java:1533) ~[spring-data-redis-3.0.0.jar!/:3.0.0]
    at org.springframework.data.redis.connection.lettuce.LettuceConnectionFactory$SharedConnection.getNativeConnection(LettuceConnectionFactory.java:1358) ~[spring-data-redis-3.0.0.jar!/:3.0.0]
    at org.springframework.data.redis.connection.lettuce.LettuceConnectionFactory$SharedConnection.getConnection(LettuceConnectionFactory.java:1341) ~[spring-data-redis-3.0.0.jar!/:3.0.0]
    at org.springframework.data.redis.connection.lettuce.LettuceConnectionFactory.getSharedConnection(LettuceConnectionFactory.java:1059) ~[spring-data-redis-3.0.0.jar!/:3.0.0]
    at org.springframework.data.redis.connection.lettuce.LettuceConnectionFactory.getConnection(LettuceConnectionFactory.java:398) ~[spring-data-redis-3.0.0.jar!/:3.0.0]
    at org.springframework.data.redis.cache.DefaultRedisCacheWriter.execute(DefaultRedisCacheWriter.java:272) ~[spring-data-redis-3.0.0.jar!/:3.0.0]
    at org.springframework.data.redis.cache.DefaultRedisCacheWriter.clean(DefaultRedisCacheWriter.java:189) ~[spring-data-redis-3.0.0.jar!/:3.0.0]
    at org.springframework.data.redis.cache.RedisCache.clear(RedisCache.java:190) ~[spring-data-redis-3.0.0.jar!/:3.0.0]
    at org.springframework.data.redis.cache.RedisCache.clear(RedisCache.java:178) ~[spring-data-redis-3.0.0.jar!/:3.0.0]
    at org.springframework.cache.interceptor.AbstractCacheInvoker.doClear(AbstractCacheInvoker.java:122) ~[spring-context-6.0.3.jar!/:6.0.3]
    at org.springframework.cache.interceptor.CacheAspectSupport.performCacheEvict(CacheAspectSupport.java:505) ~[spring-context-6.0.3.jar!/:6.0.3]
    at org.springframework.cache.interceptor.CacheAspectSupport.processCacheEvicts(CacheAspectSupport.java:493) ~[spring-context-6.0.3.jar!/:6.0.3]
    at org.springframework.cache.interceptor.CacheAspectSupport.execute(CacheAspectSupport.java:434) ~[spring-context-6.0.3.jar!/:6.0.3]
    at org.springframework.cache.interceptor.CacheAspectSupport.execute(CacheAspectSupport.java:345) ~[spring-context-6.0.3.jar!/:6.0.3]
    at org.springframework.cache.interceptor.CacheInterceptor.invoke(CacheInterceptor.java:64) ~[spring-context-6.0.3.jar!/:6.0.3]
    at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:184) ~[spring-aop-6.0.3.jar!/:6.0.3]
    at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:752) ~[spring-aop-6.0.3.jar!/:6.0.3]
    at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:703) ~[spring-aop-6.0.3.jar!/:6.0.3]
    at com.github.mdeluise.plantit.ApplicationConfig$$SpringCGLIB$$3.cacheEvict(<generated>) ~[classes!/:0.4.3]
    at java.base/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:103) ~[na:na]
    at java.base/java.lang.reflect.Method.invoke(Method.java:580) ~[na:na]
    at org.springframework.scheduling.support.ScheduledMethodRunnable.run(ScheduledMethodRunnable.java:84) ~[spring-context-6.0.3.jar!/:6.0.3]
    at org.springframework.scheduling.support.DelegatingErrorHandlingRunnable.run(DelegatingErrorHandlingRunnable.java:54) ~[spring-context-6.0.3.jar!/:6.0.3]
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:572) ~[na:na]
    at java.base/java.util.concurrent.FutureTask.runAndReset(FutureTask.java:358) ~[na:na]
    at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:305) ~[na:na]
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144) ~[na:na]
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642) ~[na:na]
    at java.base/java.lang.Thread.run(Thread.java:1583) ~[na:na]
Caused by: org.springframework.data.redis.connection.PoolException: Could not get a resource from the pool
    at org.springframework.data.redis.connection.lettuce.LettucePoolingConnectionProvider.getConnection(LettucePoolingConnectionProvider.java:105) ~[spring-data-redis-3.0.0.jar!/:3.0.0]
    at org.springframework.data.redis.connection.lettuce.LettuceConnectionFactory$ExceptionTranslatingConnectionProvider.getConnection(LettuceConnectionFactory.java:1531) ~[spring-data-redis-3.0.0.jar!/:3.0.0]
    ... 28 common frames omitted
Caused by: io.lettuce.core.RedisConnectionException: Unable to connect to cache/<unresolved>:6379
    at io.lettuce.core.RedisConnectionException.create(RedisConnectionException.java:78) ~[lettuce-core-6.2.2.RELEASE.jar!/:6.2.2.RELEASE]
    at io.lettuce.core.RedisConnectionException.create(RedisConnectionException.java:56) ~[lettuce-core-6.2.2.RELEASE.jar!/:6.2.2.RELEASE]
    at io.lettuce.core.AbstractRedisClient.getConnection(AbstractRedisClient.java:350) ~[lettuce-core-6.2.2.RELEASE.jar!/:6.2.2.RELEASE]
    at io.lettuce.core.RedisClient.connect(RedisClient.java:216) ~[lettuce-core-6.2.2.RELEASE.jar!/:6.2.2.RELEASE]
    at org.springframework.data.redis.connection.lettuce.StandaloneConnectionProvider.lambda$getConnection$1(StandaloneConnectionProvider.java:111) ~[spring-data-redis-3.0.0.jar!/:3.0.0]
    at java.base/java.util.Optional.orElseGet(Optional.java:364) ~[na:na]
    at org.springframework.data.redis.connection.lettuce.StandaloneConnectionProvider.getConnection(StandaloneConnectionProvider.java:111) ~[spring-data-redis-3.0.0.jar!/:3.0.0]
    at org.springframework.data.redis.connection.lettuce.LettucePoolingConnectionProvider.lambda$getConnection$0(LettucePoolingConnectionProvider.java:93) ~[spring-data-redis-3.0.0.jar!/:3.0.0]
    at io.lettuce.core.support.ConnectionPoolSupport$RedisPooledObjectFactory.create(ConnectionPoolSupport.java:211) ~[lettuce-core-6.2.2.RELEASE.jar!/:6.2.2.RELEASE]
    at io.lettuce.core.support.ConnectionPoolSupport$RedisPooledObjectFactory.create(ConnectionPoolSupport.java:201) ~[lettuce-core-6.2.2.RELEASE.jar!/:6.2.2.RELEASE]
    at org.apache.commons.pool2.BasePooledObjectFactory.makeObject(BasePooledObjectFactory.java:70) ~[commons-pool2-2.11.1.jar!/:2.11.1]
    at org.apache.commons.pool2.impl.GenericObjectPool.create(GenericObjectPool.java:571) ~[commons-pool2-2.11.1.jar!/:2.11.1]
    at org.apache.commons.pool2.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:298) ~[commons-pool2-2.11.1.jar!/:2.11.1]
    at org.apache.commons.pool2.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:223) ~[commons-pool2-2.11.1.jar!/:2.11.1]
    at io.lettuce.core.support.ConnectionPoolSupport$1.borrowObject(ConnectionPoolSupport.java:122) ~[lettuce-core-6.2.2.RELEASE.jar!/:6.2.2.RELEASE]
    at io.lettuce.core.support.ConnectionPoolSupport$1.borrowObject(ConnectionPoolSupport.java:117) ~[lettuce-core-6.2.2.RELEASE.jar!/:6.2.2.RELEASE]
    at org.springframework.data.redis.connection.lettuce.LettucePoolingConnectionProvider.getConnection(LettucePoolingConnectionProvider.java:99) ~[spring-data-redis-3.0.0.jar!/:3.0.0]
    ... 29 common frames omitted
Caused by: java.net.UnknownHostException: cache: Name or service not known
    at java.base/java.net.Inet6AddressImpl.lookupAllHostAddr(Native Method) ~[na:na]
    at java.base/java.net.Inet6AddressImpl.lookupAllHostAddr(Inet6AddressImpl.java:52) ~[na:na]
    at java.base/java.net.InetAddress$PlatformResolver.lookupByName(InetAddress.java:1211) ~[na:na]
    at java.base/java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1828) ~[na:na]
    at java.base/java.net.InetAddress$NameServiceAddresses.get(InetAddress.java:1139) ~[na:na]
    at java.base/java.net.InetAddress.getAllByName0(InetAddress.java:1818) ~[na:na]
    at java.base/java.net.InetAddress.getAllByName(InetAddress.java:1688) ~[na:na]
    at java.base/java.net.InetAddress.getByName(InetAddress.java:1568) ~[na:na]
    at io.netty.util.internal.SocketUtils$8.run(SocketUtils.java:156) ~[netty-common-4.1.86.Final.jar!/:4.1.86.Final]
    at io.netty.util.internal.SocketUtils$8.run(SocketUtils.java:153) ~[netty-common-4.1.86.Final.jar!/:4.1.86.Final]
    at java.base/java.security.AccessController.doPrivileged(AccessController.java:571) ~[na:na]
    at io.netty.util.internal.SocketUtils.addressByName(SocketUtils.java:153) ~[netty-common-4.1.86.Final.jar!/:4.1.86.Final]
    at io.netty.resolver.DefaultNameResolver.doResolve(DefaultNameResolver.java:41) ~[netty-resolver-4.1.86.Final.jar!/:4.1.86.Final]
    at io.netty.resolver.SimpleNameResolver.resolve(SimpleNameResolver.java:61) ~[netty-resolver-4.1.86.Final.jar!/:4.1.86.Final]
    at io.netty.resolver.SimpleNameResolver.resolve(SimpleNameResolver.java:53) ~[netty-resolver-4.1.86.Final.jar!/:4.1.86.Final]
    at io.netty.resolver.InetSocketAddressResolver.doResolve(InetSocketAddressResolver.java:55) ~[netty-resolver-4.1.86.Final.jar!/:4.1.86.Final]
    at io.netty.resolver.InetSocketAddressResolver.doResolve(InetSocketAddressResolver.java:31) ~[netty-resolver-4.1.86.Final.jar!/:4.1.86.Final]
    at io.netty.resolver.AbstractAddressResolver.resolve(AbstractAddressResolver.java:106) ~[netty-resolver-4.1.86.Final.jar!/:4.1.86.Final]
    at io.netty.bootstrap.Bootstrap.doResolveAndConnect0(Bootstrap.java:206) ~[netty-transport-4.1.86.Final.jar!/:4.1.86.Final]
    at io.netty.bootstrap.Bootstrap.access$000(Bootstrap.java:46) ~[netty-transport-4.1.86.Final.jar!/:4.1.86.Final]
    at io.netty.bootstrap.Bootstrap$1.operationComplete(Bootstrap.java:180) ~[netty-transport-4.1.86.Final.jar!/:4.1.86.Final]
    at io.netty.bootstrap.Bootstrap$1.operationComplete(Bootstrap.java:166) ~[netty-transport-4.1.86.Final.jar!/:4.1.86.Final]
    at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:590) ~[netty-common-4.1.86.Final.jar!/:4.1.86.Final]
    at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:557) ~[netty-common-4.1.86.Final.jar!/:4.1.86.Final]
    at io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:492) ~[netty-common-4.1.86.Final.jar!/:4.1.86.Final]
    at io.netty.util.concurrent.DefaultPromise.setValue0(DefaultPromise.java:636) ~[netty-common-4.1.86.Final.jar!/:4.1.86.Final]
    at io.netty.util.concurrent.DefaultPromise.setSuccess0(DefaultPromise.java:625) ~[netty-common-4.1.86.Final.jar!/:4.1.86.Final]
    at io.netty.util.concurrent.DefaultPromise.trySuccess(DefaultPromise.java:105) ~[netty-common-4.1.86.Final.jar!/:4.1.86.Final]
    at io.netty.channel.DefaultChannelPromise.trySuccess(DefaultChannelPromise.java:84) ~[netty-transport-4.1.86.Final.jar!/:4.1.86.Final]
    at io.netty.channel.AbstractChannel$AbstractUnsafe.safeSetSuccess(AbstractChannel.java:990) ~[netty-transport-4.1.86.Final.jar!/:4.1.86.Final]
    at io.netty.channel.AbstractChannel$AbstractUnsafe.register0(AbstractChannel.java:516) ~[netty-transport-4.1.86.Final.jar!/:4.1.86.Final]
    at io.netty.channel.AbstractChannel$AbstractUnsafe.access$200(AbstractChannel.java:429) ~[netty-transport-4.1.86.Final.jar!/:4.1.86.Final]
    at io.netty.channel.AbstractChannel$AbstractUnsafe$1.run(AbstractChannel.java:486) ~[netty-transport-4.1.86.Final.jar!/:4.1.86.Final]
    at io.netty.util.concurrent.AbstractEventExecutor.runTask(AbstractEventExecutor.java:174) ~[netty-common-4.1.86.Final.jar!/:4.1.86.Final]
    at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:167) ~[netty-common-4.1.86.Final.jar!/:4.1.86.Final]
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:470) ~[netty-common-4.1.86.Final.jar!/:4.1.86.Final]
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:569) ~[netty-transport-4.1.86.Final.jar!/:4.1.86.Final]
    at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997) ~[netty-common-4.1.86.Final.jar!/:4.1.86.Final]
    at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) ~[netty-common-4.1.86.Final.jar!/:4.1.86.Final]
    at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30) ~[netty-common-4.1.86.Final.jar!/:4.1.86.Final]
    ... 1 common frames omitted
MDeLuise commented 7 months ago

Hi @WallK, thanks for opening the issue! I think the problem resides in the plant-it-backend.env file, specifically here:

CACHE_HOST=cache

Please try changing that line to

CACHE_HOST=redis-cache

since your Redis service is named redis-cache.
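Alternatively, if you'd rather keep CACHE_HOST=cache, the compose service can be given an extra network alias so that both hostnames resolve (a sketch, assuming all services share the default compose network):

```yaml
  redis-cache:
    container_name: redis-cache
    image: redis:7.2.1
    networks:
      default:
        aliases:
          - cache   # lets other containers resolve this service as "cache" too
```

Either approach works; the important thing is that CACHE_HOST matches a name the backend container can actually resolve on its Docker network.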

WallK commented 7 months ago

Yep, I knew I'd missed changing something! Thank you!

It actually works now, custom plants are saved and shown!

While we're here: why does the app need the root DB password? To create the users and database if they're not present? Also, I'm still not sure how to add non-custom plants (: I added the Trefle key in the config and I can see Dactylis glomerata (I guess a demo plant added for me?), but how do I add a plant without clicking on Custom? I expected to be able to pick from a list or something. Was I mistaken?

Thank you again!

MDeLuise commented 7 months ago

I think MYSQL_ROOT_PASSWORD could be removed, since it isn't used in the codebase. I'll test this further and clean up the property if it's confirmed to be unused. Thanks for pointing that out!

Regarding the Trefle integration, it seems to be functioning as expected if you're able to see Dactylis glomerata without having added it manually. The system should display matching species when you search for a plant name.

Could you please provide more details on the problem you're experiencing?

WallK commented 7 months ago

Alright, my problem was that I was typing and erasing too fast, haha. The results do appear if I wait a bit! Thank you!

I think there should be some indication of this (onboarding, or just more directly worded docs), if that makes sense.

Ah, another thing, about backend startup time. It takes an unusually long time after a restart to get up and running: I see a 120-second "Waiting for DB" timeout (or something very similar), and then it takes another 46 seconds or so to fully start. Is it a race condition with the DB container? The backend has depends_on: - mariadb, so the DB should be somewhat up by that point. I think the DB isn't fast enough to finish initializing before the plant-it backend makes its first call, so the backend falls into that timeout script. Or is this all normal behavior? Should I create another issue for this? Sorry for piling on more comments about other stuff; it's just very tedious to fill out your bug report template (and I'm really not sure this is a bug rather than just a question).

Thanks again!

MDeLuise commented 7 months ago

I think there should be some indication (or onboarding, or just a bit more directly worded docs) on this, if it makes sense

I'm currently working on a big refactor of the frontend: I'm developing a mobile app that will be available in the coming weeks, so expect some improvements on that side!

Ah, another thing, about backend starting time

I'll look into the backend startup issue. Some changes are coming in the next release on the backend side as well, so hopefully this will be fixed. I'll post here when the new release is available, and if the problem persists I'll take a closer look. Thanks!
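In the meantime, a compose healthcheck might help as a workaround, so the backend only starts once MariaDB actually answers instead of merely being created. This is an untested sketch; the mariadb-admin ping command is an assumption about what the linuxserver image provides, so adjust it to whatever client binary the image ships:

```yaml
  mariadb:
    # ...existing configuration...
    healthcheck:
      test: ["CMD", "mariadb-admin", "ping", "-h", "127.0.0.1"]  # assumed client binary
      interval: 10s
      timeout: 5s
      retries: 10

  plant-it-backend:
    # ...existing configuration...
    depends_on:
      mariadb:
        condition: service_healthy   # wait for the healthcheck to pass, not just container start
      redis-cache:
        condition: service_started
```

With a plain depends_on list, Compose only waits for the container to start, not for the database inside it to be ready, which matches the race you're describing.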

Sorry for adding more comments about other stuff, it's just very tedious to fill you bug report template (and I'm really not sure it's a bug, this is just a "question")

You're welcome to provide as much feedback and as many questions as you like; I'm really happy to read them!