Closed LordNex closed 6 months ago
what type of storage is the DB saved on?
NAS mapped via SMB, though it was at one point on the local server. With the newer systems I've converted everything over to NAS, actually OpenMediaVault 6 as a VM inside ESXi 7.0.3 on my PowerEdge. I have a separate VM of Ubuntu Server LTS running Frigate. I've also been playing with utilizing CodeProject.AI's mesh ability, but it seems if I install them both on the Frigate box, whichever fires up first gets the TPU and the other doesn't start. I have one instance inside my HA install and was hoping to get it to load and run beside Frigate with the Coral TPU in a mesh, but even with mDNS and SSDP enabled for dual stack, I can't get Frigate to see either of the detectors. Which isn't that big of a deal, as I plan on installing either CodeProject.AI or CompreFace locally on a Jetson Nano as soon as I get an SSD to install it on instead of an SD card.
it is not recommended to run a DB on network storage for this exact reason
Well, I didn't move it there. All I did was follow the changes you documented for the docker-compose file to migrate to the newest settings. I let Frigate decide where to place things, but it seems that's where it ended up. Give me a bit and I'll put up my docker-compose file and you can look at it.
Oh, and technically they're on the same physical server. We're just talking about different VMs, so it's only processing the vSwitch traffic, which is all CPU and PCIe lane speed, I believe.
On a side note, I was able to get my Tuya-based pet feeder installed in Home Assistant, and sure enough I can get a stream out of it. I've run a port scan on it and it doesn't seem to be using any known ports that I'm aware of. My guess is that it's being proxied by the Tuya client, but I'm not sure how to get this stream over to the Frigate VM of Ubuntu Server 22 LTS. Frigate is in one VM, another is Home Assistant, and the third is running OpenMediaVault as a NAS. This is all running on a 40-core 13th-gen (I think) Dell PowerEdge R620 with 256 GB of RAM and 4 x 1.2 TB SAS drives in RAID 5. The host OS runs off dual redundant RAID 1 SD cards and is VMware ESXi 7.2 U3. All 4 Intel NICs are aggregated via LACP on a Layer 3 Aruba smart switch. So it shouldn't even be straining to talk to anything beyond bringing in the streams from the network cameras, which from what I can tell take about 5 Mbps aggregate in and out, since it should be using RTMP to send to HA and our devices. So far it's been working. I'm starting to get a little low on space, so I need to clean some stuff up on the server. A fresh install would probably do good too.
I'm building a new Gaming Rig at the moment so once that's up to speed I should be good.
Oh, and technically they're on the same physical server. We're just talking about different VMs, so it's only processing the vSwitch traffic, which is all CPU and PCIe lane speed, I believe.
right, but the issue with hosting the DB on network storage is not the speed, it is how the network storage protocols work.
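SQLite depends on file-locking semantics that SMB/NFS implementations frequently don't honor correctly, which is how a DB on network storage ends up malformed even when throughput is fine. The usual workaround is to keep the database on a local disk and point only the media directory at the NAS. A minimal sketch (the host paths here are placeholders, not taken from this thread):

```yaml
services:
  frigate:
    volumes:
      - /opt/frigate/config:/config      # local disk: config + frigate.db
      - /mnt/nas/frigate:/media/frigate  # NAS: recordings and clips only
```

Frigate's config also lets you pin the database location explicitly, so it can't drift onto the network share:

```yaml
database:
  path: /config/frigate.db
```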
I'm not sure how to get this stream over to the Frigate VM of Ubuntu Server 22 LTS
https://github.com/felipecrs/hass-expose-camera-stream-source
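Once the stream source is exposed to go2rtc, a common pattern is to let go2rtc pull the camera from Home Assistant and have Frigate consume the local restream over RTSP. A sketch, assuming the camera shows up in go2rtc under a name like `Pet-Feeder` (both names below are placeholders):

```yaml
go2rtc:
  streams:
    pet_feeder: hass:Pet-Feeder  # camera name as it appears in HA

cameras:
  pet_feeder:
    ffmpeg:
      inputs:
        # consume go2rtc's local RTSP restream
        - path: rtsp://127.0.0.1:8554/pet_feeder
          roles:
            - detect
```

This keeps a single connection to the (proxied) Tuya stream while Frigate and any viewers share the restream.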
Yeah, I have no problem running it locally. The only downside is that when I made the VM I made it only 64 gigs, so I can't expand it easily. Actually, I think it got mapped there during a beta swap one time, and at the time we were told to run it like that. And it worked, or I'd have known. But it can have its own partition as far as I'm concerned. That's the main reason I built this server: to tie all these things together. With as many PCIe x16 lanes as this has, I'm tempted to throw in a GPU and an M.2 card and have it run the main setup with the old NAS as a backup. But time and budget will tell.
OMV does pretty well, and it has direct access to the iDRAC and its resources to build the actual cluster. That's why I'm not running the host VM off of those drives.
I'd have to check, but it's either its own proprietary filesystem or exFAT, with both CIFS and SMB protocols active, and both can read the data. It doesn't matter whether I'm connecting from a Windows machine, Linux, or my iPhone. And from what I see in the hardware monitor, the load doesn't seem to be that much.
But we're running virtualization on virtualization here, so I'm not sure how that gets handled.
Thank you very kindly, sir. I will be putting that to use soon.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
Describe the problem you are having
I updated to the latest version and received the following error. I don't have a problem deleting and rebuilding the database, but I thought you might want to see it.
```
frigate  | s6-rc: info: service s6rc-fdholder: starting
frigate  | s6-rc: info: service s6rc-oneshot-runner: starting
frigate  | s6-rc: info: service s6rc-fdholder successfully started
frigate  | s6-rc: info: service s6rc-oneshot-runner successfully started
frigate  | s6-rc: info: service fix-attrs: starting
frigate  | s6-rc: info: service fix-attrs successfully started
frigate  | s6-rc: info: service legacy-cont-init: starting
frigate  | s6-rc: info: service legacy-cont-init successfully started
frigate  | s6-rc: info: service log-prepare: starting
frigate  | s6-rc: info: service log-prepare successfully started
frigate  | s6-rc: info: service nginx-log: starting
frigate  | s6-rc: info: service go2rtc-log: starting
frigate  | s6-rc: info: service frigate-log: starting
frigate  | s6-rc: info: service go2rtc-log successfully started
frigate  | s6-rc: info: service go2rtc: starting
frigate  | s6-rc: info: service nginx-log successfully started
frigate  | s6-rc: info: service frigate-log successfully started
frigate  | s6-rc: info: service go2rtc successfully started
frigate  | s6-rc: info: service go2rtc-healthcheck: starting
frigate  | s6-rc: info: service frigate: starting
frigate  | s6-rc: info: service go2rtc-healthcheck successfully started
frigate  | 2024-01-08 22:03:02.661738795 [INFO] Preparing Frigate...
frigate  | 2024-01-08 22:03:02.671158688 [INFO] Preparing new go2rtc config...
frigate  | s6-rc: info: service frigate successfully started
frigate  | s6-rc: info: service nginx: starting
frigate  | 2024-01-08 22:03:02.703596950 [INFO] Starting NGINX...
frigate  | 2024-01-08 22:03:02.705415780 [INFO] Starting Frigate...
frigate  | s6-rc: info: service nginx successfully started
frigate  | s6-rc: info: service legacy-services: starting
frigate  | s6-rc: info: service legacy-services successfully started
frigate  | 2024-01-08 22:03:03.342500397 [INFO] Starting go2rtc...
frigate  | 2024-01-08 22:03:03.429281817 22:03:03.429 INF go2rtc version 1.8.4 linux/amd64
frigate  | 2024-01-08 22:03:03.430149918 22:03:03.430 INF [api] listen addr=:1984
frigate  | 2024-01-08 22:03:03.430369749 22:03:03.430 INF [rtsp] listen addr=:8554
frigate  | 2024-01-08 22:03:03.430787775 22:03:03.430 INF [webrtc] listen addr=:8555
frigate  | 2024-01-08 22:03:04.492852448 [2024-01-08 22:03:04] frigate.app  INFO : Starting Frigate (0.13.0-49814b3)
frigate  | 2024-01-08 22:03:04.608574857 [2024-01-08 22:03:04] peewee_migrate.logs  INFO : Starting migrations
frigate  | 2024-01-08 22:03:04.708261497 [2024-01-08 22:03:04] peewee_migrate.logs  INFO : There is nothing to migrate
frigate  | 2024-01-08 22:03:05.109157122 database disk image is malformed
frigate  | 2024-01-08 22:03:06.142685591 [INFO] Service Frigate exited with code 1 (by signal 0)
frigate  | s6-rc: info: service legacy-services: stopping
frigate  | s6-rc: info: service legacy-services successfully stopped
frigate  | s6-rc: info: service nginx: stopping
frigate  | s6-rc: info: service go2rtc-healthcheck: stopping
frigate  | 2024-01-08 22:03:06.159429479 [INFO] The go2rtc-healthcheck service exited with code 256 (by signal 15)
frigate  | s6-rc: info: service go2rtc-healthcheck successfully stopped
frigate  | 2024-01-08 22:03:06.228774467 [INFO] Service NGINX exited with code 0 (by signal 0)
frigate  | s6-rc: info: service nginx successfully stopped
frigate  | s6-rc: info: service nginx-log: stopping
frigate  | s6-rc: info: service frigate: stopping
frigate  | s6-rc: info: service frigate successfully stopped
frigate  | s6-rc: info: service go2rtc: stopping
frigate  | s6-rc: info: service frigate-log: stopping
frigate  | s6-rc: info: service nginx-log successfully stopped
frigate  | 2024-01-08 22:03:06.532191079 exit with signal: terminated
frigate  | 2024-01-08 22:03:06.548768207 [INFO] The go2rtc service exited with code 0 (by signal 0)
frigate  | s6-rc: info: service go2rtc successfully stopped
frigate  | s6-rc: info: service go2rtc-log: stopping
frigate  | s6-rc: info: service frigate-log successfully stopped
frigate  | s6-rc: info: service go2rtc-log successfully stopped
frigate  | s6-rc: info: service log-prepare: stopping
frigate  | s6-rc: info: service s6rc-fdholder: stopping
frigate  | s6-rc: info: service log-prepare successfully stopped
frigate  | s6-rc: info: service legacy-cont-init: stopping
frigate  | s6-rc: info: service s6rc-fdholder successfully stopped
frigate  | s6-rc: info: service legacy-cont-init successfully stopped
frigate  | s6-rc: info: service fix-attrs: stopping
frigate  | s6-rc: info: service fix-attrs successfully stopped
frigate  | s6-rc: info: service s6rc-oneshot-runner: stopping
frigate  | s6-rc: info: service s6rc-oneshot-runner successfully stopped
frigate exited with code 0
```
Version
0.13.0-rc1
Frigate config file
Relevant log output
FFprobe output from your camera
Frigate stats
Operating system
Debian
Install method
Docker Compose
Coral version
USB
Network connection
Wired
Camera make and model
Amcrest AD410, Reolink RLC-520, Wyze Cam Pan with RTSP firmware, and attempting to get a Tuya pet feeder cam integrated as well
Any other information that may be helpful
I was also working on trying to get my Tuya pet feeder camera to feed in as well, which may have corrupted the DB, I'm not sure. I have it as an entity in HA and it shows up under go2rtc, but I'm not sure how to get the local stream from it. I commented out that entire camera but still get the error above. I'm going to try just deleting the frigate.db file and letting it rebuild. Worst case, I can always revert to a snapshot of the VM it's running in and re-upgrade.
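Before deleting the file, it may be worth confirming the corruption with SQLite's built-in integrity check. A quick sketch using Python's standard library (the DB path is an assumption about where the container volume maps, not from this thread):

```python
# Sketch: verify whether frigate.db is actually corrupt before deleting it.
# Runs SQLite's built-in PRAGMA integrity_check on the file.
import os
import sqlite3

def integrity_check(db_path: str) -> str:
    """Return 'ok' for a healthy SQLite database, otherwise a damage report."""
    con = sqlite3.connect(db_path)
    try:
        (result,) = con.execute("PRAGMA integrity_check;").fetchone()
        return result
    finally:
        con.close()

if __name__ == "__main__":
    db = "/opt/frigate/config/frigate.db"  # placeholder; adjust to your volume mapping
    if os.path.exists(db):
        print(integrity_check(db))
```

If the check reports anything other than `ok`, rebuilding (or restoring the VM snapshot) is the right call; recovering a malformed SQLite file in place is rarely worth the effort for Frigate's event data.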