roflcoopter / viseron

Self-hosted, local only NVR and AI Computer Vision software. With features such as object detection, motion detection, face recognition and more, it gives you the power to keep an eye on your home, office or any other place you want to monitor.
MIT License

Duplicate Key Violation Error in files_meta_path_key Causing Package Lock in Latest Dev Version #794

Open madman2012 opened 3 months ago

madman2012 commented 3 months ago

On latest dev, I am getting the following error:

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/src/viseron/domains/camera/fragmenter.py", line 276, in _handle_m4s
    self._write_files_metadata(file, extinf, program_date_time)
  File "/src/viseron/domains/camera/fragmenter.py", line 222, in _write_files_metadata
    session.execute(stmt)
  File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/session.py", line 2351, in execute
    return self._execute_internal(
  File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/session.py", line 2236, in _execute_internal
    result: Result[Any] = compile_state_cls.orm_execute_statement(
  File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/bulk_persistence.py", line 1283, in orm_execute_statement
    result = conn.execute(
  File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1418, in execute
    return meth(
  File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/sql/elements.py", line 515, in _execute_on_connection
    return connection._execute_clauseelement(
  File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1640, in _execute_clauseelement
    ret = self._execute_context(
  File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1846, in _execute_context
    return self._exec_single_context(
  File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1986, in _exec_single_context
    self._handle_dbapi_exception(
  File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 2353, in _handle_dbapi_exception
    raise sqlalchemy_exception.with_traceback(exc_info[2]) from e
  File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context
    self.dialect.do_execute(
  File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/default.py", line 924, in do_execute
    cursor.execute(statement, parameters)
sqlalchemy.exc.IntegrityError: (psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint "files_meta_path_key"
DETAIL: Key (path)=(/recordings/front_door/segments/front_door/1723162858.m4s) already exists.

[SQL: INSERT INTO files_meta (path, orig_ctime, meta) VALUES (%(path)s, %(orig_ctime)s, %(meta)s) RETURNING files_meta.id]
[parameters: {'path': '/recordings/front_door/segments/front_door/1723162858.m4s', 'orig_ctime': datetime.datetime(2024, 8, 9, 0, 20, 58), 'meta': '{"m3u8": {"EXTINF": 5.4895}}'}]
(Background on this error at: https://sqlalche.me/e/20/gkpj)

2024-08-08 20:23:52.020 EDT [1142] ERROR: duplicate key value violates unique constraint "files_meta_path_key"
2024-08-08 20:23:52.020 EDT [1142] DETAIL: Key (path)=(/recordings/front_door/segments/front_door/1723162858.m4s) already exists.
2024-08-08 20:23:52.020 EDT [1142] STATEMENT: INSERT INTO files_meta (path, orig_ctime, meta) VALUES ('/recordings/front_door/segmen
[2024-08-08 20:23:52] [ERROR ] [viseron.domains.camera.fragmenter.front_door] - Failed to process m4s file 1723162858.m4s, message repeated 167 times
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context
    self.dialect.do_execute(
  File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/default.py", line 924, in do_execute
    cursor.execute(statement, parameters)
psycopg2.errors.UniqueViolation: duplicate key value violates unique constraint "files_meta_path_key"
DETAIL: Key (path)=(/recordings/front_door/segments/front_door/1723162858.m4s) already exists.

The same UniqueViolation/IntegrityError pair repeats roughly once per second for this same segment, with the fragmenter counter climbing from "message repeated 167 times" to "message repeated 179 times".
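
If it helps with debugging, the row that keeps colliding can be inspected (and, if you want to unblock the segment, removed) directly in the database. Below is a minimal sketch using psycopg2; the connection settings are placeholders and the table/column names are taken from the log above, so treat it as a hypothetical workaround, not an official Viseron tool:

# Hypothetical diagnostic sketch; connection settings are placeholders.
import psycopg2

SEGMENT = "/recordings/front_door/segments/front_door/1723162858.m4s"

# Adjust host/dbname/user to your PostgreSQL instance (these are assumptions).
conn = psycopg2.connect(host="localhost", dbname="viseron", user="postgres")
with conn, conn.cursor() as cur:
    # Show the files_meta row that blocks re-insertion of this segment.
    cur.execute(
        "SELECT id, orig_ctime, meta FROM files_meta WHERE path = %s", (SEGMENT,)
    )
    print(cur.fetchall())
    # Uncomment to drop the stale row so the fragmenter can insert it again:
    # cur.execute("DELETE FROM files_meta WHERE path = %s", (SEGMENT,))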

here is my config:

ffmpeg:
  global:
    hwaccel_args: -hwaccel nvdec -c:v hevc_cuvid -c:v h264_cuvid -c:v hevc_nvenc -c:v h264_nvenc
  camera:

front_door:
  name: Front Door
  host: 10.0.1.188
  path: /cam/realmonitor?channel=1&subtype=0
  port: 554
  stream_format: rtsp
  username:
  password:
  fps: 15
  recorder:
    lookback: 10

back_yard:
  name: Back Yard
  host: 10.0.1.196
  path: /cam/realmonitor?channel=1&subtype=0
  port: 554
  stream_format: rtsp
  username: <scrubbed>
  password: <scrubbed>
  fps: 15
  codec: hevc_cuvid
  recorder:
    lookback: 10

drive_way:
  name: Drive Way
  host: 10.0.1.197
  path: /cam/realmonitor?channel=1&subtype=0
  port: 554
  stream_format: rtsp
  username: <scrubbed>
  password: <scrubbed>
  fps: 15
  codec: hevc_cuvid
  recorder:
    lookback: 10

garage:
  name: Garage
  host: 10.0.1.126
  path: /cam/realmonitor?channel=4&subtype=0
  port: 554
  stream_format: rtsp
  username: <scrubbed>
  password: <scrubbed>
  fps: 15
  codec: hevc_cuvid
  recorder:
    lookback: 10

mog2:
  motion_detector:
    cameras:

  front_door:
    trigger_recorder: true
    recorder_keepalive: true
    max_recorder_keepalive: 30
    fps: 5
    area: 0.05
    width: 300
    height: 300
    mask: []
    threshold: 25
    history: 500
    detect_shadows: true
    learning_rate: 0.01

  back_yard:
    trigger_recorder: true
    recorder_keepalive: true
    max_recorder_keepalive: 30
    fps: 5
    area: 0.05
    width: 300
    height: 300
    mask: []
    threshold: 25
    history: 500
    detect_shadows: true
    learning_rate: 0.01

  drive_way:
    trigger_recorder: true
    recorder_keepalive: true
    max_recorder_keepalive: 30
    fps: 5
    area: 0.05
    width: 300
    height: 300
    mask: []
    threshold: 25
    history: 500
    detect_shadows: true
    learning_rate: 0.01

  garage:
    trigger_recorder: true
    recorder_keepalive: true
    max_recorder_keepalive: 30
    fps: 5
    area: 0.05
    width: 300
    height: 300
    mask: []
    threshold: 30
    history: 500
    detect_shadows: true
    learning_rate: 0.01

darknet:
  object_detector:
    cameras:
      front_door:
        fps: 2
        scan_on_motion_only: true
        labels:

nvr:
  front_door: {}
  back_yard: {}
  drive_way: {}
  garage: {}

webserver: {}

logger:
  default_level: error
  logs:
    viseron.components.ffmpeg: error
    viseron.components.nvr: error
    viseron.components.darknet: error
    viseron.components.mog2: error
    viseron.components.storage: error
    sqlalchemy.engine: error

storage:
  recorder:
    tiers:

roflcoopter commented 3 months ago

Should be fixed now in dev. The files were not being removed after they were processed, but now they are. Depending on how many files you still have in the temp folder, you might get a sea of errors when restarting, but eventually Viseron will catch up and delete all the files.
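
For context, this kind of duplicate-safe metadata insert can also be expressed with SQLAlchemy's PostgreSQL dialect. The sketch below is only illustrative and is not the change that landed in dev: the files_meta table definition is a stand-in built from the column names visible in the logs, and Viseron's real model may differ.

# Illustrative sketch of a duplicate-tolerant insert into files_meta.
# The Table definition is an assumption based on the columns in the logs above.
from sqlalchemy import Column, DateTime, Integer, MetaData, String, Table
from sqlalchemy.dialects.postgresql import JSONB, insert

metadata = MetaData()
files_meta = Table(
    "files_meta",
    metadata,
    Column("id", Integer, primary_key=True),
    Column("path", String, unique=True),
    Column("orig_ctime", DateTime),
    Column("meta", JSONB),
)

def write_files_metadata(conn, path, orig_ctime, meta):
    """Insert segment metadata, skipping paths that already exist."""
    stmt = (
        insert(files_meta)
        .values(path=path, orig_ctime=orig_ctime, meta=meta)
        .on_conflict_do_nothing(index_elements=["path"])
    )
    conn.execute(stmt)

With ON CONFLICT DO NOTHING, re-processing a fragment whose path is already in files_meta becomes a no-op instead of tripping the files_meta_path_key constraint.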

Also, I think you have misunderstood the tiers config a bit. The storage > recorder > tiers config is global for all cameras, so with the config you have now, all your cameras will first move to /recordings/front_door, then to /recordings/back_yard, then to /recordings/drive_way, and so on.

This should be enough to save everything for 30 days:

storage:
  recorder:
    tiers:
      - path: /recordings
        events:
          max_age:
            days: 30
        continuous:
          max_age:
            days: 30

  snapshots:
    tiers:
      - path: /snapshots
        max_age:
          days: 30

madman2012 commented 3 months ago

Jesper - thank you for the clarification on the recorder settings.

I am getting this error now in Dev, but the container appears to continue to run.

[2024-08-09 11:04:58] [INFO ] [viseron.core] - -------------------------------------------
[2024-08-09 11:04:58] [INFO ] [viseron.core] - Initializing Viseron dev
[2024-08-09 11:04:59] [INFO ] [viseron.components] - Setting up component logger
[2024-08-09 11:05:00] [WARNING ] [tornado.access] - 404 GET /api/v1/camera/back_yard (10.0.1.57) 9.63ms
[2024-08-09 11:05:00] [WARNING ] [tornado.access] - 404 GET /api/v1/camera/drive_way (10.0.1.57) 9.69ms
[2024-08-09 11:05:00] [WARNING ] [tornado.access] - 404 GET /api/v1/camera/front_door (10.0.1.57) 9.10ms
[2024-08-09 11:05:00] [WARNING ] [tornado.access] - 404 GET /api/v1/camera/garage (10.0.1.57) 8.59ms
waiting for server to shut down.... done
server stopped
Waiting for PostgreSQL Server to stop...
/var/run/postgresql:5432 - no response
PostgreSQL Server has stopped!
[cont-finish.d] 10-postgres: exited 0.
[cont-finish.d] done.
[s6-finish] waiting for services.
[s6-finish] sending all processes the TERM signal.
[s6-init] making user provided files available at /var/run/s6/etc...exited 0.
[s6-init] ensuring user provided files have correct perms...exited 0.
[fix-attrs.d] applying ownership & permissions fixes...
[fix-attrs.d] done.
[cont-init.d] executing container initialization scripts...
[cont-init.d] 10-adduser: executing...
usermod: no changes
**** UID/GID ***
User uid: 0
User gid: 0
** Done **
[cont-init.d] 10-adduser: exited 0.
[cont-init.d] 20-gid-video-device: executing...
[cont-init.d] 20-gid-video-device: exited 0.
[cont-init.d] 30-edgetpu-permission: executing...
** Setting EdgeTPU permissions *
Coral Vendor IDs:
"1a6e"
"18d1"
No EdgeTPU USB device was found
No EdgeTPU PCI device was found
** Done **
[cont-init.d] 30-edgetpu-permission: exited 0.
[cont-init.d] 40-set-env-vars: executing...
** Checking for hardware acceleration platforms **
OpenCL is available!
VA-API cannot be used
CUDA is available!
PostgreSQL major version: 14
PostgreSQL bin: /usr/lib/postgresql/14/bin
* Done ***
[cont-init.d] 40-set-env-vars: exited 0.
[cont-init.d] 50-check-if-rpi: executing...
** Checking if we are running on an RPi **
Not running on any supported RPi
* Done ***
[cont-init.d] 50-check-if-rpi: exited 0.
[cont-init.d] 55-check-if-jetson: executing...
** Checking if we are running on a Jetson Board **
Not running on any supported Jetson board
* Done ***
[cont-init.d] 55-check-if-jetson: exited 0.
[cont-init.d] 60-ffmpeg-path: executing...
** Getting FFmpeg path *
FFmpeg path: /home/abc/bin/ffmpeg
***** Done *****
[cont-init.d] 60-ffmpeg-path: exited 0.
[cont-init.d] 70-gstreamer-path: executing...
* Getting GStreamer path *
GStreamer path: /usr/bin/gst-launch-1.0
*** Done *****
[cont-init.d] 70-gstreamer-path: exited 0.
[cont-init.d] 80-postgres: executing...
* Preparing PostgreSQL *
Database has already been initialized.
***** Done *****
[cont-init.d] 80-postgres: exited 0.
[cont-init.d] done.
[services.d] starting services
[services.d] done.
Starting PostgreSQL Server...
/var/run/postgresql:5432 - no response
Waiting for PostgreSQL Server to start...
/var/run/postgresql:5432 - accepting connections
PostgreSQL Server has started!
[2024-08-09 11:24:36] [ERROR ] [viseron.components.nvr.nvr.drive_way] - Failed to retrieve result for motion_detector, message repe
[2024-08-09 11:24:40] [ERROR ] [viseron.components.nvr.nvr.drive_way] - Failed to retrieve result for motion_detector, message repeated 3 times
2024-08-09 11:24:58.757 EDT [10010] ERROR: duplicate key value violates unique constraint "files_meta_path_key"
2024-08-09 11:24:58.757 EDT [10010] DETAIL: Key (path)=(/recordings/segments/drive_way/1723217092.m4s) already exists.
2024-08-09 11:24:58.757 EDT [10010] STATEMENT: INSERT INTO files_meta (path, orig_ctime, meta) VALUES ('/recordings/segments/drive_way/1723217092.m4s', '2024-08-09T15:24:52'::timestamp, '{"m3u8": {"EXTINF": 5.053646}}') RETURNING files_meta.id
[2024-08-09 11:24:58] [ERROR ] [viseron.domains.camera.fragmenter.drive_way] - Failed to process m4s file 1723217092.m4s
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context
    self.dialect.do_execute(
  File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/default.py", line 924, in do_execute
    cursor.execute(statement, parameters)
psycopg2.errors.UniqueViolation: duplicate key value violates unique constraint "files_meta_path_key"
DETAIL: Key (path)=(/recordings/segments/drive_way/1723217092.m4s) already exists.

The above exception was the direct cause of the following exception:

Traceback (most recent call last): File "/src/viseron/domains/camera/fragmenter.py", line 274, in _handle_m4s self._write_files_metadata(file, extinf, program_date_time) File "/src/viseron/domains/camera/fragmenter.py", line 220, in _write_files_metadata session.execute(stmt) File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/session.py", line 2351, in execute return self._execute_internal( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/session.py", line 2236, in _execute_internal result: Result[Any] = compile_state_cls.orm_execute_statement( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/bulk_persistence.py", line 1283, in orm_execute_statement result = conn.execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1418, in execute return meth( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/sql/elements.py", line 515, in _execute_on_connection return connection._execute_clauseelement( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1640, in _execute_clauseelement ret = self._execute_context( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1846, in _execute_context return self._exec_single_context( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1986, in _exec_single_context self._handle_dbapi_exception( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 2353, in _handle_dbapi_exception raise sqlalchemy_exception.with_traceback(exc_info[2]) from e File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context self.dialect.do_execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/default.py", line 924, in do_execute cursor.execute(statement, parameters) sqlalchemy.exc.IntegrityError: (psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint "files_meta_path_key" DETAIL: Key (path)=(/recordings/segments/drive_way/1723217092.m4s) already exists.

[SQL: INSERT INTO files_meta (path, orig_ctime, meta) VALUES (%(path)s, %(orig_ctime)s, %(meta)s) RETURNING files_meta.id] [parameters: {'path': '/recordings/segments/drive_way/1723217092.m4s', 'orig_ctime': datetime.datetime(2024, 8, 9, 15, 24, 52), 'meta': '{"m3u8": {"EXTINF": 5.053646}}'}] (Background on this error at: https://sqlalche.me/e/20/gkpj) [2024-08-09 11:25:07] [ERROR ] [viseron.components.nvr.nvr.drive_way] - Failed to retrieve result for motion_detector, message repe [2024-08-09 11:25:10] [ERROR ] [viseron.components.nvr.nvr.drive_way] - Failed to retrieve result for motion_detector, message repeated 3 times [2024-08-09 11:25:36] [ERROR ] [viseron.components.ffmpeg.stream.drive_way] - [rtsp @ 0x563f11ffc600] RTP: PT=62: bad cseq 0f2a expected=01aa [2024-08-09 11:33:31] [ERROR ] [viseron.components.nvr.nvr.drive_way] - Failed to retrieve result for motion_detector 2024-08-09 11:33:32.299 EDT [32103] ERROR: could not resize shared memory segment "/PostgreSQL.3947467608" to 4194304 bytes: No space left on device 2024-08-09 11:33:32.299 EDT [32103] STATEMENT: --sql WITH recording_files as ( SELECT f.id as file_id ,f.tier_id ,f.tier_path ,f.camera_identifier ,f.category ,f.subcategory ,f.path ,f.size ,r.id as recording_id ,r.created_at as recording_created_at ,meta.orig_ctime FROM files f JOIN files_meta meta ON f.path = meta.path LEFT JOIN recordings r ON f.camera_identifier = r.camera_identifier AND meta.orig_ctime BETWEEN r.start_time - INTERVAL '10 sec'

2024-08-09 11:33:32.300 EDT [32102] ERROR: could not resize shared memory segment "/PostgreSQL.977611276" to 4194304 bytes: No space left on device 2024-08-09 11:33:32.300 EDT [32102] STATEMENT: --sql WITH recording_files as ( SELECT f.id as file_id ,f.tier_id ,f.tier_path ,f.camera_identifier ,f.category ,f.subcategory ,f.path ,f.size ,r.id as recording_id ,r.created_at as recording_created_at ,meta.orig_ctime FROM files f JOIN files_meta meta ON f.path = meta.path LEFT JOIN recordings r ON f.camera_identifier = r.camera_identifier AND meta.orig_ctime BETWEEN r.start_time - INTERVAL '10 sec'

2024-08-09 11:33:32.305 EDT [32086] ERROR: could not resize shared memory segment "/PostgreSQL.3947467608" to 4194304 bytes: No space left on device 2024-08-09 11:33:32.305 EDT [32086] CONTEXT: parallel worker 2024-08-09 11:33:32.305 EDT [32086] STATEMENT: --sql WITH recording_files as ( SELECT f.id as file_id ,f.tier_id ,f.tier_path ,f.camera_identifier ,f.category ,f.subcategory ,f.path ,f.size ,r.id as recording_id ,r.created_at as recording_created_at ,meta.orig_ctime FROM files f JOIN files_meta meta ON f.path = meta.path LEFT JOIN recordings r ON f.camera_identifier = r.camera_identifier AND meta.orig_ctime BETWEEN r.start_time - INTERVAL '10 sec'

2024-08-09 11:33:32.326 EDT [516] LOG: background worker "parallel worker" (PID 32102) exited with exit code 1 2024-08-09 11:33:32.335 EDT [516] LOG: background worker "parallel worker" (PID 32103) exited with exit code 1 2024-08-09 11:33:32.388 EDT [32086] LOG: could not remove directory "base/pgsql_tmp/pgsql_tmp32086.0.sharedfileset": Directory not empty [2024-08-09 11:33:32] [ERROR ] [root] - Uncaught thread exception Traceback (most recent call last): File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context self.dialect.do_execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/default.py", line 924, in do_execute cursor.execute(statement, parameters) psycopg2.errors.DiskFull: could not resize shared memory segment "/PostgreSQL.3947467608" to 4194304 bytes: No space left on device CONTEXT: parallel worker

The above exception was the direct cause of the following exception:

Traceback (most recent call last): File "/usr/lib/python3.10/threading.py", line 1016, in _bootstrap_inner self.run() File "/usr/lib/python3.10/threading.py", line 953, in run self._target(*self._args, **self._kwargs) File "/src/viseron/components/storage/tier_handler.py", line 223, in _process_events self._on_created(event) File "/src/viseron/components/storage/tier_handler.py", line 256, in _on_created self.check_tier() File "/src/viseron/components/storage/tier_handler.py", line 182, in check_tier self._check_tier(self._storage.get_session) File "/src/viseron/components/storage/tier_handler.py", line 427, in _check_tier events_file_ids = list(self._get_events_file_ids(session)) File "/src/viseron/components/storage/tier_handler.py", line 396, in _get_events_file_ids return get_recordings_to_move( File "/src/viseron/components/storage/tier_handler.py", line 922, in get_recordings_to_move result = session.execute(stmt) File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/session.py", line 2351, in execute return self._execute_internal( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/session.py", line 2245, in _execute_internal result = conn.execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1418, in execute return meth( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/sql/elements.py", line 515, in _execute_on_connection return connection._execute_clauseelement( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1640, in _execute_clauseelement ret = self._execute_context( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1846, in _execute_context return self._exec_single_context( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1986, in _exec_single_context self._handle_dbapi_exception( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 2353, in _handle_dbapi_exception raise sqlalchemy_exception.with_traceback(exc_info[2]) from e File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context self.dialect.do_execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/default.py", line 924, in do_execute cursor.execute(statement, parameters) sqlalchemy.exc.OperationalError: (psycopg2.errors.DiskFull) could not resize shared memory segment "/PostgreSQL.3947467608" to 4194304 bytes: No space left on device CONTEXT: parallel worker

[SQL: --sql WITH recording_files as ( SELECT f.id as file_id ,f.tier_id ,f.tier_path ,f.camera_identifier ,f.category ,f.subcategory ,f.path ,f.size ,r.id as recording_id ,r.created_at as recording_created_at ,meta.orig_ctime FROM files f JOIN files_meta meta ON f.path = meta.path LEFT JOIN recordings r ON f.camera_identifier = r.camera_identifier AND meta.orig_ctime BETWEEN r.start_time - INTERVAL '%(lookback)s sec'

Should be fixed now in dev. The files were not being removed after they were processed, but now they are. Depending on how many files you still have in the temp folder, you might see a flood of errors when restarting, but eventually Viseron will catch up and delete all of them.
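
For illustration, a minimal sketch of the kind of cleanup described above, assuming the fragmenter loops over leftover .m4s files in a temp folder (the folder path and the handle_m4s callable are placeholders, not the actual Viseron code):

from pathlib import Path
from typing import Callable

def process_and_cleanup(temp_dir: str, handle_m4s: Callable[[Path], None]) -> None:
    """Process every pending .m4s fragment, then delete it so it is never re-processed."""
    for fragment in sorted(Path(temp_dir).glob("*.m4s")):
        handle_m4s(fragment)              # caller-supplied processing (write metadata, move the file, ...)
        fragment.unlink(missing_ok=True)  # remove the temp file once it has been processed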

Also, I think you have misunderstood the tiers config a bit. The storage > recorder > tiers config is global for all cameras, so with your current config every camera will first move its files to /recordings/front_door, then to /recordings/back_yard, then to /recordings/drive_way, and so on.

This should be enough to save everything for 30 days:

storage:
  recorder:
    tiers:


madman2012 commented 2 months ago

@roflcoopter Any ideas what is happening here? It happens on dev, and then Viseron stops running. I have to restart the container to get it recording again, but then it errors out and stops recording after 15-20 minutes.


self._objects_in_fov_setter(shared_frame, objects_in_fov)

File "/src/viseron/domains/object_detector/init.py", line 409, in _objects_in_fov_setter snapshot_path = self._camera.save_snapshot( File "/src/viseron/domains/camera/init.py", line 676, in save_snapshot decoded_frame = self.shared_frames.get_decoded_frame_rgb(shared_frame) File "/src/viseron/domains/camera/shared_frames.py", line 115, in get_decoded_frame_rgb return self._color_convert(shared_frame, COLOR_MODEL_RGB) File "/src/viseron/domains/camera/shared_frames.py", line 105, in _color_convert decoded_frame = self.get_decoded_frame(shared_frame).copy() File "/src/viseron/domains/camera/shared_frames.py", line 95, in get_decoded_frame return self._frames[shared_frame.name] KeyError: UUID('e7df6543-1fe6-40db-85e9-c00200f82d08') [2024-08-13 09:09:49] [ERROR ] [viseron.components.nvr.nvr.drive_way] - Failed to retrieve result for object_detector, message repeated 2 times [2024-08-13 09:09:50] [ERROR ] [viseron.watchdog.thread_watchdog] - Thread drive_way.object_detection is dead, restarting [2024-08-13 09:10:54] [ERROR ] [viseron.components.ffmpeg.camera.garage] - Frame reader process has exited [2024-08-13 09:10:59] [ERROR ] [viseron.components.ffmpeg.camera.garage] - Restarting frame pipe [2024-08-13 09:11:57] [ERROR ] [viseron.components.ffmpeg.camera.drive_way] - Did not receive a frame [2024-08-13 09:11:59] [ERROR ] [viseron.components.ffmpeg.camera.back_yard] - Did not receive a frame [2024-08-13 09:12:02] [ERROR ] [viseron.components.ffmpeg.camera.drive_way] - Restarting frame pipe [2024-08-13 09:12:04] [ERROR ] [viseron.components.ffmpeg.camera.back_yard] - Restarting frame pipe [2024-08-13 10:43:27] [ERROR ] [viseron.components.nvr.nvr.garage] - Failed to retrieve result for motion_detector [2024-08-13 10:44:02] [ERROR ] [viseron.components.ffmpeg.stream.drive_way] - [rtsp @ 0x55db98c0f600] RTP: PT=62: bad cseq 7e9b expected=6d19 [2024-08-13 10:44:03] [ERROR ] [viseron.components.ffmpeg.stream.drive_way] - [rtsp @ 0x55db98c0f600] RTP: PT=62: bad cseq 93d8 expected=81ef [2024-08-13 11:16:01] [ERROR ] [viseron.components.ffmpeg.stream.drive_way] - [rtsp @ 0x55db98c0f600] RTP: PT=62: bad cseq 787f expected=670e [2024-08-13 11:16:15] [ERROR ] [viseron.components.nvr.nvr.drive_way] - Failed to retrieve result for object_detector [2024-08-13 11:16:26] [ERROR ] [viseron.components.nvr.nvr.back_yard] - Failed to retrieve result for motion_detector [2024-08-13 11:16:33] [ERROR ] [viseron.components.ffmpeg.stream.back_yard] - [rtsp @ 0x55aa2089f600] RTP: PT=62: bad cseq f018 expected=e33f [2024-08-13 11:17:14] [ERROR ] [viseron.components.ffmpeg.stream.back_yard] - [rtsp @ 0x55aa2089f600] RTP: PT=62: bad cseq 1310 expected=0740 [2024-08-13 11:17:16] [ERROR ] [viseron.components.ffmpeg.stream.back_yard] - [rtsp @ 0x55aa2089f600] RTP: PT=62: bad cseq 292a expected=1d21 [2024-08-13 11:17:16] [ERROR ] [viseron.components.ffmpeg.stream.drive_way] - [rtsp @ 0x55db98c0f600] RTP: PT=62: bad cseq 04dc expected=e1ba [2024-08-13 11:17:17] [ERROR ] [viseron.components.ffmpeg.stream.drive_way] - [rtsp @ 0x55db98c0f600] RTP: PT=62: bad cseq 349c expected=086e [2024-08-13 11:17:19] [ERROR ] [viseron.components.ffmpeg.stream.back_yard] - [rtsp @ 0x55aa2089f600] RTP: PT=62: bad cseq 45e1 expected=2e57 [2024-08-13 11:21:44] [ERROR ] [viseron.domains.camera.fragmenter.back_yard] - Failed to get extinf for 1723562280.m4s 2024-08-13 11:22:23.111 EDT [10578] ERROR: duplicate key value violates unique constraint "files_meta_path_key" 2024-08-13 11:22:23.111 EDT [10578] DETAIL: Key 
(path)=(/recordings/segments/front_door/1723562540.m4s) already exists. 2024-08-13 11:22:23.111 EDT [10578] STATEMENT: INSERT INTO files_meta (path, orig_ctime, meta) VALUES ('/recordings/segments/front_door/1723562540.m4s', '2024-08-13T15:22:20'::timestamp, '{"m3u8": {"EXTINF": 2.643522}}') RETURNING files_meta.id [2024-08-13 11:22:23] [ERROR ] [viseron.domains.camera.fragmenter.front_door] - Failed to process m4s file 1723562540.m4s Traceback (most recent call last): File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context self.dialect.do_execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/default.py", line 924, in do_execute cursor.execute(statement, parameters) psycopg2.errors.UniqueViolation: duplicate key value violates unique constraint "files_meta_path_key" DETAIL: Key (path)=(/recordings/segments/front_door/1723562540.m4s) already exists.

The above exception was the direct cause of the following exception:

Traceback (most recent call last): File "/src/viseron/domains/camera/fragmenter.py", line 274, in _handle_m4s self._write_files_metadata(file, extinf, program_date_time) File "/src/viseron/domains/camera/fragmenter.py", line 220, in _write_files_metadata session.execute(stmt) File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/session.py", line 2351, in execute return self._execute_internal( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/session.py", line 2236, in _execute_internal result: Result[Any] = compile_state_cls.orm_execute_statement( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/bulk_persistence.py", line 1283, in orm_execute_statement result = conn.execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1418, in execute return meth( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/sql/elements.py", line 515, in _execute_on_connection return connection._execute_clauseelement( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1640, in _execute_clauseelement ret = self._execute_context( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1846, in _execute_context return self._exec_single_context( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1986, in _exec_single_context self._handle_dbapi_exception( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 2353, in _handle_dbapi_exception raise sqlalchemy_exception.with_traceback(exc_info[2]) from e File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context self.dialect.do_execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/default.py", line 924, in do_execute cursor.execute(statement, parameters) sqlalchemy.exc.IntegrityError: (psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint "files_meta_path_key" DETAIL: Key (path)=(/recordings/segments/front_door/1723562540.m4s) already exists.

[SQL: INSERT INTO files_meta (path, orig_ctime, meta) VALUES (%(path)s, %(orig_ctime)s, %(meta)s) RETURNING files_meta.id] [parameters: {'path': '/recordings/segments/front_door/1723562540.m4s', 'orig_ctime': datetime.datetime(2024, 8, 13, 15, 22, 20), 'meta': '{"m3u8": {"EXTINF": 2.643522}}'}] (Background on this error at: https://sqlalche.me/e/20/gkpj) [2024-08-13 11:22:34] [ERROR ] [viseron.components.ffmpeg.camera.front_door] - Frame reader process has exited [2024-08-13 11:22:40] [ERROR ] [viseron.components.ffmpeg.camera.front_door] - Restarting frame pipe [2024-08-13 11:22:51] [ERROR ] [viseron.components.ffmpeg.camera.garage] - Did not receive a frame [2024-08-13 11:22:56] [ERROR ] [viseron.components.ffmpeg.camera.garage] - Restarting frame pipe [2024-08-13 11:24:40] [ERROR ] [viseron.components.ffmpeg.camera.drive_way] - Did not receive a frame [2024-08-13 11:24:41] [ERROR ] [viseron.components.ffmpeg.camera.back_yard] - Did not receive a frame [2024-08-13 11:24:45] [ERROR ] [viseron.components.ffmpeg.camera.drive_way] - Restarting frame pipe [2024-08-13 11:24:46] [ERROR ] [viseron.components.ffmpeg.camera.back_yard] - Restarting frame pipe

roflcoopter commented 2 months ago

Strange, what does your config look like now? Can you try to stop Viseron, delete the postgres folder in your config folder, delete all the files in recordings, segments, thumbnails and snapshots to see if the issues persist?

I have an upcoming fix for the missing frame (the UUID KeyError)

Regarding 2024-08-09 11:33:32.299 EDT [32103] ERROR: could not resize shared memory segment "/PostgreSQL.3947467608" to 4194304 bytes: No space left on device, there is a fix here: https://github.com/roflcoopter/viseron/discussions/721#discussioncomment-10286213

madman2012 commented 2 months ago

I deleted the database and restarted it. It ran fine for a bit with h264_nvenc set for the h265 cameras. Then I changed the config to codec: copy for the h265 streams to save GPU resources, and after running the copy codec on the h265 streams for a while I got this error.

The other behavior is that event recordings do not play back when an h265 camera uses the copy codec, yet continuous recording plays back fine in h265 with the copy codec on those same cameras, which is a bit strange.

config:

ffmpeg:
  global:
    hwaccel_args: -hwaccel nvdec -c:v hevc_cuvid -c:v h264_cuvid -c:v hevc_nvenc -c:v h264_nvenc
  camera:

front_door:
  name: Front Door
  host: 10.0.1.188
  path: /cam/realmonitor?channel=1&subtype=0
  port: 554
  stream_format: rtsp
  username:
  password:
  fps: 15
  recorder:
    lookback: 5
    codec: copy

back_yard:
  name: Back Yard
  host: 10.0.1.196
  path: /cam/realmonitor?channel=1&subtype=0
  port: 554
  stream_format: rtsp
  username:
  password:
  fps: 15
  recorder:
    lookback: 5
    codec: h264_nvenc

drive_way:
  name: Drive Way
  host: 10.0.1.197
  path: /cam/realmonitor?channel=1&subtype=0
  port: 554
  stream_format: rtsp
  username:
  password:
  fps: 15
  recorder:
    lookback: 5
    codec: h264_nvenc

garage:
  name: Garage
  host: 10.0.1.126
  path: /cam/realmonitor?channel=4&subtype=0
  port: 554
  stream_format: rtsp
  username:
  password:
  fps: 15
  recorder:
    lookback: 10
    codec: h264_nvenc

mog2:
  motion_detector:
    cameras:

  front_door:
    trigger_recorder: true
    recorder_keepalive: true
    max_recorder_keepalive: 30
    fps: 5
    area: 0.05
    width: 300
    height: 300
    mask: []
    threshold: 25
    history: 500
    detect_shadows: true
    learning_rate: 0.01

  back_yard:
    trigger_recorder: true
    recorder_keepalive: true
    max_recorder_keepalive: 30
    fps: 5
    area: 0.05
    width: 300
    height: 300
    mask: []
    threshold: 20
    history: 500
    detect_shadows: true
    learning_rate: 0.01

  drive_way:
    trigger_recorder: true
    recorder_keepalive: true
    max_recorder_keepalive: 30
    fps: 5
    area: 0.05
    width: 300
    height: 300
    mask: []
    threshold: 20
    history: 500
    detect_shadows: true
    learning_rate: 0.01

  garage:
    trigger_recorder: true
    recorder_keepalive: true
    max_recorder_keepalive: 30
    fps: 5
    area: 0.05
    width: 300
    height: 300
    mask: []
    threshold: 25
    history: 500
    detect_shadows: true
    learning_rate: 0.01

darknet:
  object_detector:
    cameras:
      front_door:
        fps: 2
        scan_on_motion_only: true
        labels:

nvr:
  front_door: {}
  back_yard: {}
  drive_way: {}
  garage: {}

webserver: {}

logger:
  default_level: error
  logs:
    viseron.components.ffmpeg: error
    viseron.components.nvr: error
    viseron.components.darknet: error
    viseron.components.mog2: error
    viseron.components.storage: error
    sqlalchemy.engine: error

storage:
  recorder:
    tiers:

Errors:

Changed to h264



DETAIL: Key (path)=(/recordings/segments/front_door/1723674058.m4s) already exists.

The above exception was the direct cause of the following exception:

Traceback (most recent call last): File "/src/viseron/domains/camera/fragmenter.py", line 274, in _handle_m4s self._write_files_metadata(file, extinf, program_date_time) File "/src/viseron/domains/camera/fragmenter.py", line 220, in _write_files_metadata session.execute(stmt) File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/session.py", line 2351, in execute return self._execute_internal( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/session.py", line 2236, in _execute_internal result: Result[Any] = compile_state_cls.orm_execute_statement( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/bulk_persistence.py", line 1283, in orm_execute_statement result = conn.execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1418, in execute return meth( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/sql/elements.py", line 515, in _execute_on_connection return connection._execute_clauseelement( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1640, in _execute_clauseelement ret = self._execute_context( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1846, in _execute_context return self._exec_single_context( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1986, in _exec_single_context self._handle_dbapi_exception( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 2353, in _handle_dbapi_exception raise sqlalchemy_exception.with_traceback(exc_info[2]) from e File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context self.dialect.do_execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/default.py", line 924, in do_execute cursor.execute(statement, parameters) sqlalchemy.exc.IntegrityError: (psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint "files_meta_path_key" DETAIL: Key (path)=(/recordings/segments/front_door/1723674058.m4s) already exists.

[SQL: INSERT INTO files_meta (path, orig_ctime, meta) VALUES (%(path)s, %(orig_ctime)s, %(meta)s) RETURNING files_meta.id] [parameters: {'path': '/recordings/segments/front_door/1723674058.m4s', 'orig_ctime': datetime.datetime(2024, 8, 14, 22, 20, 58), 'meta': '{"m3u8": {"EXTINF": 3.048322}}'}] (Background on this error at: https://sqlalche.me/e/20/gkpj) 2024-08-14 18:31:26.908 EDT [17745] ERROR: duplicate key value violates unique constraint "files_meta_path_key" 2024-08-14 18:31:26.908 EDT [17745] DETAIL: Key (path)=(/recordings/segments/front_door/1723674681.m4s) already exists. 2024-08-14 18:31:26.908 EDT [17745] STATEMENT: INSERT INTO files_meta (path, orig_ctime, meta) VALUES ('/recordings/segments/front_door/1723674681.m4s', '2024-08-14T22:31:21'::timestamp, '{"m3u8": {"EXTINF": 4.908556}}') RETURNING files_meta.id [2024-08-14 18:31:26] [ERROR ] [viseron.domains.camera.fragmenter.front_door] - Failed to process m4s file 1723674681.m4s Traceback (most recent call last): File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context self.dialect.do_execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/default.py", line 924, in do_execute cursor.execute(statement, parameters) psycopg2.errors.UniqueViolation: duplicate key value violates unique constraint "files_meta_path_key" DETAIL: Key (path)=(/recordings/segments/front_door/1723674681.m4s) already exists.

The above exception was the direct cause of the following exception:

Traceback (most recent call last): File "/src/viseron/domains/camera/fragmenter.py", line 274, in _handle_m4s self._write_files_metadata(file, extinf, program_date_time) File "/src/viseron/domains/camera/fragmenter.py", line 220, in _write_files_metadata session.execute(stmt) File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/session.py", line 2351, in execute return self._execute_internal( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/session.py", line 2236, in _execute_internal result: Result[Any] = compile_state_cls.orm_execute_statement( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/bulk_persistence.py", line 1283, in orm_execute_statement result = conn.execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1418, in execute return meth( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/sql/elements.py", line 515, in _execute_on_connection return connection._execute_clauseelement( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1640, in _execute_clauseelement ret = self._execute_context( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1846, in _execute_context return self._exec_single_context( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1986, in _exec_single_context self._handle_dbapi_exception( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 2353, in _handle_dbapi_exception raise sqlalchemy_exception.with_traceback(exc_info[2]) from e File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context self.dialect.do_execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/default.py", line 924, in do_execute cursor.execute(statement, parameters) sqlalchemy.exc.IntegrityError: (psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint "files_meta_path_key" DETAIL: Key (path)=(/recordings/segments/front_door/1723674681.m4s) already exists.

[SQL: INSERT INTO files_meta (path, orig_ctime, meta) VALUES (%(path)s, %(orig_ctime)s, %(meta)s) RETURNING files_meta.id] [parameters: {'path': '/recordings/segments/front_door/1723674681.m4s', 'orig_ctime': datetime.datetime(2024, 8, 14, 22, 31, 21), 'meta': '{"m3u8": {"EXTINF": 4.908556}}'}] (Background on this error at: https://sqlalche.me/e/20/gkpj) [2024-08-14 18:40:51] [ERROR ] [viseron.components.ffmpeg.camera.drive_way] - Did not receive a frame [2024-08-14 18:40:56] [ERROR ] [viseron.components.ffmpeg.camera.drive_way] - Restarting frame pipe [2024-08-14 18:42:19] [ERROR ] [viseron.components.ffmpeg.camera.back_yard] - Did not receive a frame [2024-08-14 18:42:24] [ERROR ] [viseron.components.ffmpeg.camera.back_yard] - Restarting frame pipe

H265 PostgreSQL Server has started! [2024-08-14 10:07:05] [INFO ] [viseron.core] - ------------------------------------------- [2024-08-14 10:07:05] [INFO ] [viseron.core] - Initializing Viseron dev [2024-08-14 10:07:05] [INFO ] [viseron.components] - Setting up component logger [2024-08-14 10:07:41] [ERROR ] [viseron.components.nvr.nvr.back_yard] - Failed to retrieve result for motion_detector [2024-08-14 10:07:54] [ERROR ] [viseron.components.nvr.nvr.drive_way] - Failed to retrieve result for motion_detector [2024-08-14 10:08:30] [ERROR ] [viseron.components.nvr.nvr.garage] - Failed to retrieve result for motion_detector [2024-08-14 10:10:32] [ERROR ] [viseron.components.ffmpeg.camera.drive_way] - Frame reader process has exited [2024-08-14 10:10:33] [ERROR ] [viseron.components.ffmpeg.camera.back_yard] - Did not receive a frame [2024-08-14 10:10:37] [ERROR ] [viseron.components.ffmpeg.camera.drive_way] - Restarting frame pipe [2024-08-14 10:10:38] [ERROR ] [viseron.components.ffmpeg.camera.back_yard] - Restarting frame pipe [2024-08-14 10:14:56] [ERROR ] [viseron.components.ffmpeg.stream.back_yard] - [rtsp @ 0x56112d85b600] RTP: PT=62: bad cseq 6b52 expected=48c5 [2024-08-14 10:14:58] [ERROR ] [viseron.components.ffmpeg.stream.drive_way] - [rtsp @ 0x55bb36a21600] RTP: PT=62: bad cseq fc82 expected=ea1b [2024-08-14 10:14:58] [ERROR ] [viseron.components.ffmpeg.stream.drive_way] - [rtsp @ 0x55bb36a21600] RTP: PT=62: bad cseq 19d4 expected=0632 [2024-08-14 11:27:06] [ERROR ] [viseron.core] - child_process.darknet.object_detector.process did not exit in time [2024-08-14 11:27:06] [ERROR ] [viseron.core] - Forcefully kill child_process.darknet.object_detector.process [viseron-finish] Viseron exit code 100 /var/run/postgresql:5432 - accepting connections PostgreSQL Server has started! [2024-08-14 11:27:10] [INFO ] [viseron.core] - ------------------------------------------- [2024-08-14 11:27:10] [INFO ] [viseron.core] - Initializing Viseron dev [2024-08-14 11:27:10] [INFO ] [viseron.components] - Setting up component logger [viseron-finish] Viseron exit code 100 /var/run/postgresql:5432 - accepting connections PostgreSQL Server has started! [2024-08-14 11:44:12] [INFO ] [viseron.core] - ------------------------------------------- [2024-08-14 11:44:12] [INFO ] [viseron.core] - Initializing Viseron dev [2024-08-14 11:44:12] [INFO ] [viseron.components] - Setting up component logger [2024-08-14 11:44:17] [WARNING ] [tornado.access] - 404 GET /api/v1/recordings/front_door?latest=true (10.0.1.57) 18.49ms [2024-08-14 11:44:17] [WARNING ] [tornado.access] - 404 GET /api/v1/camera/front_door (10.0.1.57) 18.40ms 2024-08-14 11:48:36.254 EDT [4036] ERROR: duplicate key value violates unique constraint "files_meta_path_key" 2024-08-14 11:48:36.254 EDT [4036] DETAIL: Key (path)=(/recordings/segments/garage/1723650513.m4s) already exists. 
2024-08-14 11:48:36.254 EDT [4036] STATEMENT: INSERT INTO files_meta (path, orig_ctime, meta) VALUES ('/recordings/segments/garage/1723650513.m4s', '2024-08-14T15:48:33'::timestamp, '{"m3u8": {"EXTINF": 1.867578}}') RETURNING files_meta.id [2024-08-14 11:48:36] [ERROR ] [viseron.domains.camera.fragmenter.garage] - Failed to process m4s file 1723650513.m4s Traceback (most recent call last): File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context self.dialect.do_execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/default.py", line 924, in do_execute cursor.execute(statement, parameters) psycopg2.errors.UniqueViolation: duplicate key value violates unique constraint "files_meta_path_key" DETAIL: Key (path)=(/recordings/segments/garage/1723650513.m4s) already exists.

The above exception was the direct cause of the following exception:

Traceback (most recent call last): File "/src/viseron/domains/camera/fragmenter.py", line 274, in _handle_m4s self._write_files_metadata(file, extinf, program_date_time) File "/src/viseron/domains/camera/fragmenter.py", line 220, in _write_files_metadata session.execute(stmt) File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/session.py", line 2351, in execute return self._execute_internal( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/session.py", line 2236, in _execute_internal result: Result[Any] = compile_state_cls.orm_execute_statement( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/bulk_persistence.py", line 1283, in orm_execute_statement result = conn.execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1418, in execute return meth( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/sql/elements.py", line 515, in _execute_on_connection return connection._execute_clauseelement( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1640, in _execute_clauseelement ret = self._execute_context( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1846, in _execute_context return self._exec_single_context( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1986, in _exec_single_context self._handle_dbapi_exception( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 2353, in _handle_dbapi_exception raise sqlalchemy_exception.with_traceback(exc_info[2]) from e File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context self.dialect.do_execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/default.py", line 924, in do_execute cursor.execute(statement, parameters) sqlalchemy.exc.IntegrityError: (psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint "files_meta_path_key" DETAIL: Key (path)=(/recordings/segments/garage/1723650513.m4s) already exists.

[SQL: INSERT INTO files_meta (path, orig_ctime, meta) VALUES (%(path)s, %(orig_ctime)s, %(meta)s) RETURNING files_meta.id] [parameters: {'path': '/recordings/segments/garage/1723650513.m4s', 'orig_ctime': datetime.datetime(2024, 8, 14, 15, 48, 33), 'meta': '{"m3u8": {"EXTINF": 1.867578}}'}] (Background on this error at: https://sqlalche.me/e/20/gkpj)



madman2012 commented 2 months ago

I left h264 for the HEVC cameras and still get the following error. After the error, the web UI is not accessible anymore. It seems to run for a few hours and then stops...


[SQL: INSERT INTO files_meta (path, orig_ctime, meta) VALUES (%(path)s, %(orig_ctime)s, %(meta)s) RETURNING files_meta.id] [parameters: {'path': '/recordings/segments/front_door/1723827363.m4s', 'orig_ctime': datetime.datetime(2024, 8, 16, 16, 56, 3), 'meta': '{"m3u8": {"EXTINF": 6.182478}}'}] (Background on this error at: https://sqlalche.me/e/20/gkpj) [2024-08-16 12:56:24] [ERROR ] [viseron.components.nvr.nvr.garage] - Failed to retrieve result for motion_detector, message repeated 2 times [2024-08-16 12:56:30] [ERROR ] [root] - Uncaught thread exception Traceback (most recent call last): File "/usr/lib/python3.10/threading.py", line 1016, in _bootstrap_inner self.run() File "/usr/lib/python3.10/threading.py", line 953, in run self._target(*self._args, **self._kwargs) File "/src/viseron/domains/motion_detector/init.py", line 392, in _motion_detection self._filter_motion(shared_frame, contours) File "/src/viseron/domains/motion_detector/init.py", line 361, in _filter_motion self._motion_detected_setter( File "/src/viseron/domains/motion_detector/init.py", line 231, in _motion_detected_setter snapshot_path = self._camera.save_snapshot( File "/src/viseron/domains/camera/init.py", line 676, in save_snapshot decoded_frame = self.shared_frames.get_decoded_frame_rgb(shared_frame) File "/src/viseron/domains/camera/shared_frames.py", line 115, in get_decoded_frame_rgb return self._color_convert(shared_frame, COLOR_MODEL_RGB) File "/src/viseron/domains/camera/shared_frames.py", line 105, in _color_convert decoded_frame = self.get_decoded_frame(shared_frame).copy() File "/src/viseron/domains/camera/shared_frames.py", line 95, in get_decoded_frame return self._frames[shared_frame.name] KeyError: UUID('4158075b-74c0-4c3b-99b0-f8b64ce1227d') [2024-08-16 12:56:35] [ERROR ] [viseron.components.nvr.nvr.garage] - Failed to retrieve result for motion_detector, message repeate [2024-08-16 12:56:38] [ERROR ] [viseron.components.nvr.nvr.garage] - Failed to retrieve result for motion_detector, message repeate [2024-08-16 12:56:42] [ERROR ] [viseron.components.nvr.nvr.garage] - Failed to retrieve result for motion_detector, message repeate [2024-08-16 12:56:46] [ERROR ] [viseron.components.nvr.nvr.garage] - Failed to retrieve result for motion_detector, message repeate [2024-08-16 12:56:50] [ERROR ] [viseron.components.nvr.nvr.garage] - Failed to retrieve result for motion_detector, message repeated 6 times [2024-08-16 12:56:53] [ERROR ] [viseron.watchdog.thread_watchdog] - Thread garage.motion_detection is dead, restarting [2024-08-16 12:56:55] [ERROR ] [viseron.components.nvr.nvr.garage] - Failed to retrieve result for motion_detector [2024-08-16 12:57:07] [ERROR ] [viseron.components.ffmpeg.stream.drive_way] - [rtsp @ 0x5565bb169600] RTP: PT=62: bad cseq 2357 expected=d81d [2024-08-16 12:57:07] [ERROR ] [viseron.components.ffmpeg.stream.drive_way] - [rtsp @ 0x5565bb169600] RTP: PT=62: bad cseq 447a expected=30e4 [2024-08-16 12:57:34] [ERROR ] [viseron.components.ffmpeg.stream.garage] - [rtsp @ 0x557a61279600] RTP: PT=62: bad cseq 9aea expected=8e8c [2024-08-16 12:57:34] [ERROR ] [viseron.components.ffmpeg.stream.back_yard] - [rtsp @ 0x55779adfa600] RTP: PT=62: bad cseq 8c74 expected=347a [2024-08-16 12:57:47] [ERROR ] [viseron.components.ffmpeg.stream.drive_way] - [rtsp @ 0x5565bb169600] RTP: PT=62: bad cseq 9176 expected=7082 [2024-08-16 12:57:57] [ERROR ] [viseron.components.ffmpeg.stream.back_yard] - [rtsp @ 0x55779adfa600] RTP: PT=62: bad cseq f5c4 expected=b5bc [2024-08-16 12:59:28] [ERROR ] 
[viseron.components.ffmpeg.stream.drive_way] - [rtsp @ 0x5565bb169600] RTP: PT=62: bad cseq 677c expected=55d8 2024-08-16 13:12:54.314 EDT [18943] ERROR: duplicate key value violates unique constraint "files_meta_path_key" 2024-08-16 13:12:54.314 EDT [18943] DETAIL: Key (path)=(/recordings/segments/front_door/1723828367.m4s) already exists. 2024-08-16 13:12:54.314 EDT [18943] STATEMENT: INSERT INTO files_meta (path, orig_ctime, meta) VALUES ('/recordings/segments/front_door/1723828367.m4s', '2024-08-16T17:12:47'::timestamp, '{"m3u8": {"EXTINF": 5.976622}}') RETURNING files_meta.id [2024-08-16 13:12:54] [ERROR ] [viseron.domains.camera.fragmenter.front_door] - Failed to process m4s file 1723828367.m4s Traceback (most recent call last): File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context self.dialect.do_execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/default.py", line 924, in do_execute cursor.execute(statement, parameters) psycopg2.errors.UniqueViolation: duplicate key value violates unique constraint "files_meta_path_key" DETAIL: Key (path)=(/recordings/segments/front_door/1723828367.m4s) already exists.

The above exception was the direct cause of the following exception:

Traceback (most recent call last): File "/src/viseron/domains/camera/fragmenter.py", line 274, in _handle_m4s self._write_files_metadata(file, extinf, program_date_time) File "/src/viseron/domains/camera/fragmenter.py", line 220, in _write_files_metadata session.execute(stmt) File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/session.py", line 2351, in execute return self._execute_internal( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/session.py", line 2236, in _execute_internal result: Result[Any] = compile_state_cls.orm_execute_statement( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/bulk_persistence.py", line 1283, in orm_execute_statement result = conn.execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1418, in execute return meth( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/sql/elements.py", line 515, in _execute_on_connection return connection._execute_clauseelement( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1640, in _execute_clauseelement ret = self._execute_context( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1846, in _execute_context return self._exec_single_context( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1986, in _exec_single_context self._handle_dbapi_exception( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 2353, in _handle_dbapi_exception raise sqlalchemy_exception.with_traceback(exc_info[2]) from e File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context self.dialect.do_execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/default.py", line 924, in do_execute cursor.execute(statement, parameters) sqlalchemy.exc.IntegrityError: (psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint "files_meta_path_key" DETAIL: Key (path)=(/recordings/segments/front_door/1723828367.m4s) already exists.

[SQL: INSERT INTO files_meta (path, orig_ctime, meta) VALUES (%(path)s, %(orig_ctime)s, %(meta)s) RETURNING files_meta.id] [parameters: {'path': '/recordings/segments/front_door/1723828367.m4s', 'orig_ctime': datetime.datetime(2024, 8, 16, 17, 12, 47), 'meta': '{"m3u8": {"EXTINF": 5.976622}}'}] (Background on this error at: https://sqlalche.me/e/20/gkpj)


From: Matthew Leone @.> Sent: Wednesday, August 14, 2024 10:27 PM To: roflcoopter/viseron @.>; roflcoopter/viseron @.> Cc: Author @.> Subject: Re: [roflcoopter/viseron] Duplicate Key Violation Error in files_meta_path_key Causing Package Lock in Latest Dev Version (Issue #794)

I deleted the database restarted it. It ran fine for a bit with h264_nvenc set for the h265 cameras. Then I changed config to copy to for the h265 streams to save resources on GPU, then I got this error after running copy codec for a while on the h265 streams.

The other behavior is that the event recording does not playback when the h265 camera is in copy, however the continuous recording does playback fine in h265 with copy codec on the h265 cameras so that is a bit strange.

config:

ffmpeg: global: hwaccel_args: -hwaccel nvdec -c:v hevc_cuvid -c:v h264_cuvid -c:v hevc_nvenc -c:v h264_nvenc camera: front_door: name: Front Door host: 10.0.1.188 path: /cam/realmonitor?channel=1&subtype=0 port: 554 stream_format: rtsp username: password: fps: 15 recorder: lookback: 5 codec: copy

back_yard:
  name: Back Yard
  host: 10.0.1.196
  path: /cam/realmonitor?channel=1&subtype=0
  port: 554
  stream_format: rtsp
  username:
  password:
  fps: 15
  recorder:
    lookback: 5
    codec: h264_nvenc

drive_way:
  name: Drive Way
  host: 10.0.1.197
  path: /cam/realmonitor?channel=1&subtype=0
  port: 554
  stream_format: rtsp
  username:
  password:
  fps: 15
  recorder:
    lookback: 5
    codec: h264_nvenc

garage:
  name: Garage
  host: 10.0.1.126
  path: /cam/realmonitor?channel=4&subtype=0
  port: 554
  stream_format: rtsp
  username:
  password:
  fps: 15
  recorder:
    lookback: 10
    codec: h264_nvenc

mog2: motion_detector: cameras: front_door: trigger_recorder: true recorder_keepalive: true max_recorder_keepalive: 30 fps: 5 area: 0.05 width: 300 height: 300 mask: [] threshold: 25 history: 500 detect_shadows: true learning_rate: 0.01

  back_yard:
    trigger_recorder: true
    recorder_keepalive: true
    max_recorder_keepalive: 30
    fps: 5
    area: 0.05
    width: 300
    height: 300
    mask: []
    threshold: 20
    history: 500
    detect_shadows: true
    learning_rate: 0.01

  drive_way:
    trigger_recorder: true
    recorder_keepalive: true
    max_recorder_keepalive: 30
    fps: 5
    area: 0.05
    width: 300
    height: 300
    mask: []
    threshold: 20
    history: 500
    detect_shadows: true
    learning_rate: 0.01

  garage:
    trigger_recorder: true
    recorder_keepalive: true
    max_recorder_keepalive: 30
    fps: 5
    area: 0.05
    width: 300
    height: 300
    mask: []
    threshold: 25
    history: 500
    detect_shadows: true
    learning_rate: 0.01

darknet: object_detector: cameras: front_door: fps: 2 scan_on_motion_only: true labels:

nvr: front_door: {} back_yard: {} drive_way: {} garage: {}

webserver: {}

logger: default_level: error logs: viseron.components.ffmpeg: error viseron.components.nvr: error viseron.components.darknet: error viseron.components.mog2: error viseron.components.storage: error sqlalchemy.engine: error

storage: recorder: tiers:

Errors:

Changed to h264


text error warn system array login

DETAIL: Key (path)=(/recordings/segments/front_door/1723674058.m4s) already exists.

The above exception was the direct cause of the following exception:

Traceback (most recent call last): File "/src/viseron/domains/camera/fragmenter.py", line 274, in _handle_m4s self._write_files_metadata(file, extinf, program_date_time) File "/src/viseron/domains/camera/fragmenter.py", line 220, in _write_files_metadata session.execute(stmt) File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/session.py", line 2351, in execute return self._execute_internal( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/session.py", line 2236, in _execute_internal result: Result[Any] = compile_state_cls.orm_execute_statement( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/bulk_persistence.py", line 1283, in orm_execute_statement result = conn.execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1418, in execute return meth( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/sql/elements.py", line 515, in _execute_on_connection return connection._execute_clauseelement( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1640, in _execute_clauseelement ret = self._execute_context( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1846, in _execute_context return self._exec_single_context( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1986, in _exec_single_context self._handle_dbapi_exception( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 2353, in _handle_dbapi_exception raise sqlalchemy_exception.with_traceback(exc_info[2]) from e File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context self.dialect.do_execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/default.py", line 924, in do_execute cursor.execute(statement, parameters) sqlalchemy.exc.IntegrityError: (psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint "files_meta_path_key" DETAIL: Key (path)=(/recordings/segments/front_door/1723674058.m4s) already exists.

[SQL: INSERT INTO files_meta (path, orig_ctime, meta) VALUES (%(path)s, %(orig_ctime)s, %(meta)s) RETURNING files_meta.id] [parameters: {'path': '/recordings/segments/front_door/1723674058.m4s', 'orig_ctime': datetime.datetime(2024, 8, 14, 22, 20, 58), 'meta': '{"m3u8": {"EXTINF": 3.048322}}'}] (Background on this error at: https://sqlalche.me/e/20/gkpj) 2024-08-14 18:31:26.908 EDT [17745] ERROR: duplicate key value violates unique constraint "files_meta_path_key" 2024-08-14 18:31:26.908 EDT [17745] DETAIL: Key (path)=(/recordings/segments/front_door/1723674681.m4s) already exists. 2024-08-14 18:31:26.908 EDT [17745] STATEMENT: INSERT INTO files_meta (path, orig_ctime, meta) VALUES ('/recordings/segments/front_door/1723674681.m4s', '2024-08-14T22:31:21'::timestamp, '{"m3u8": {"EXTINF": 4.908556}}') RETURNING files_meta.id [2024-08-14 18:31:26] [ERROR ] [viseron.domains.camera.fragmenter.front_door] - Failed to process m4s file 1723674681.m4s Traceback (most recent call last): File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context self.dialect.do_execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/default.py", line 924, in do_execute cursor.execute(statement, parameters) psycopg2.errors.UniqueViolation: duplicate key value violates unique constraint "files_meta_path_key" DETAIL: Key (path)=(/recordings/segments/front_door/1723674681.m4s) already exists.

The above exception was the direct cause of the following exception:

Traceback (most recent call last): File "/src/viseron/domains/camera/fragmenter.py", line 274, in _handle_m4s self._write_files_metadata(file, extinf, program_date_time) File "/src/viseron/domains/camera/fragmenter.py", line 220, in _write_files_metadata session.execute(stmt) File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/session.py", line 2351, in execute return self._execute_internal( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/session.py", line 2236, in _execute_internal result: Result[Any] = compile_state_cls.orm_execute_statement( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/bulk_persistence.py", line 1283, in orm_execute_statement result = conn.execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1418, in execute return meth( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/sql/elements.py", line 515, in _execute_on_connection return connection._execute_clauseelement( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1640, in _execute_clauseelement ret = self._execute_context( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1846, in _execute_context return self._exec_single_context( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1986, in _exec_single_context self._handle_dbapi_exception( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 2353, in _handle_dbapi_exception raise sqlalchemy_exception.with_traceback(exc_info[2]) from e File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context self.dialect.do_execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/default.py", line 924, in do_execute cursor.execute(statement, parameters) sqlalchemy.exc.IntegrityError: (psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint "files_meta_path_key" DETAIL: Key (path)=(/recordings/segments/front_door/1723674681.m4s) already exists.

[SQL: INSERT INTO files_meta (path, orig_ctime, meta) VALUES (%(path)s, %(orig_ctime)s, %(meta)s) RETURNING files_meta.id] [parameters: {'path': '/recordings/segments/front_door/1723674681.m4s', 'orig_ctime': datetime.datetime(2024, 8, 14, 22, 31, 21), 'meta': '{"m3u8": {"EXTINF": 4.908556}}'}] (Background on this error at: https://sqlalche.me/e/20/gkpj) [2024-08-14 18:40:51] [ERROR ] [viseron.components.ffmpeg.camera.drive_way] - Did not receive a frame [2024-08-14 18:40:56] [ERROR ] [viseron.components.ffmpeg.camera.drive_way] - Restarting frame pipe [2024-08-14 18:42:19] [ERROR ] [viseron.components.ffmpeg.camera.back_yard] - Did not receive a frame [2024-08-14 18:42:24] [ERROR ] [viseron.components.ffmpeg.camera.back_yard] - Restarting frame pipe

H265 PostgreSQL Server has started! [2024-08-14 10:07:05] [INFO ] [viseron.core] - ------------------------------------------- [2024-08-14 10:07:05] [INFO ] [viseron.core] - Initializing Viseron dev [2024-08-14 10:07:05] [INFO ] [viseron.components] - Setting up component logger [2024-08-14 10:07:41] [ERROR ] [viseron.components.nvr.nvr.back_yard] - Failed to retrieve result for motion_detector [2024-08-14 10:07:54] [ERROR ] [viseron.components.nvr.nvr.drive_way] - Failed to retrieve result for motion_detector [2024-08-14 10:08:30] [ERROR ] [viseron.components.nvr.nvr.garage] - Failed to retrieve result for motion_detector [2024-08-14 10:10:32] [ERROR ] [viseron.components.ffmpeg.camera.drive_way] - Frame reader process has exited [2024-08-14 10:10:33] [ERROR ] [viseron.components.ffmpeg.camera.back_yard] - Did not receive a frame [2024-08-14 10:10:37] [ERROR ] [viseron.components.ffmpeg.camera.drive_way] - Restarting frame pipe [2024-08-14 10:10:38] [ERROR ] [viseron.components.ffmpeg.camera.back_yard] - Restarting frame pipe [2024-08-14 10:14:56] [ERROR ] [viseron.components.ffmpeg.stream.back_yard] - [rtsp @ 0x56112d85b600] RTP: PT=62: bad cseq 6b52 expected=48c5 [2024-08-14 10:14:58] [ERROR ] [viseron.components.ffmpeg.stream.drive_way] - [rtsp @ 0x55bb36a21600] RTP: PT=62: bad cseq fc82 expected=ea1b [2024-08-14 10:14:58] [ERROR ] [viseron.components.ffmpeg.stream.drive_way] - [rtsp @ 0x55bb36a21600] RTP: PT=62: bad cseq 19d4 expected=0632 [2024-08-14 11:27:06] [ERROR ] [viseron.core] - child_process.darknet.object_detector.process did not exit in time [2024-08-14 11:27:06] [ERROR ] [viseron.core] - Forcefully kill child_process.darknet.object_detector.process [viseron-finish] Viseron exit code 100 /var/run/postgresql:5432 - accepting connections PostgreSQL Server has started! [2024-08-14 11:27:10] [INFO ] [viseron.core] - ------------------------------------------- [2024-08-14 11:27:10] [INFO ] [viseron.core] - Initializing Viseron dev [2024-08-14 11:27:10] [INFO ] [viseron.components] - Setting up component logger [viseron-finish] Viseron exit code 100 /var/run/postgresql:5432 - accepting connections PostgreSQL Server has started! [2024-08-14 11:44:12] [INFO ] [viseron.core] - ------------------------------------------- [2024-08-14 11:44:12] [INFO ] [viseron.core] - Initializing Viseron dev [2024-08-14 11:44:12] [INFO ] [viseron.components] - Setting up component logger [2024-08-14 11:44:17] [WARNING ] [tornado.access] - 404 GET /api/v1/recordings/front_door?latest=true (10.0.1.57) 18.49ms [2024-08-14 11:44:17] [WARNING ] [tornado.access] - 404 GET /api/v1/camera/front_door (10.0.1.57) 18.40ms 2024-08-14 11:48:36.254 EDT [4036] ERROR: duplicate key value violates unique constraint "files_meta_path_key" 2024-08-14 11:48:36.254 EDT [4036] DETAIL: Key (path)=(/recordings/segments/garage/1723650513.m4s) already exists. 
2024-08-14 11:48:36.254 EDT [4036] STATEMENT: INSERT INTO files_meta (path, orig_ctime, meta) VALUES ('/recordings/segments/garage/1723650513.m4s', '2024-08-14T15:48:33'::timestamp, '{"m3u8": {"EXTINF": 1.867578}}') RETURNING files_meta.id [2024-08-14 11:48:36] [ERROR ] [viseron.domains.camera.fragmenter.garage] - Failed to process m4s file 1723650513.m4s Traceback (most recent call last): File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context self.dialect.do_execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/default.py", line 924, in do_execute cursor.execute(statement, parameters) psycopg2.errors.UniqueViolation: duplicate key value violates unique constraint "files_meta_path_key" DETAIL: Key (path)=(/recordings/segments/garage/1723650513.m4s) already exists.

The above exception was the direct cause of the following exception:

Traceback (most recent call last): File "/src/viseron/domains/camera/fragmenter.py", line 274, in _handle_m4s self._write_files_metadata(file, extinf, program_date_time) File "/src/viseron/domains/camera/fragmenter.py", line 220, in _write_files_metadata session.execute(stmt) File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/session.py", line 2351, in execute return self._execute_internal( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/session.py", line 2236, in _execute_internal result: Result[Any] = compile_state_cls.orm_execute_statement( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/bulk_persistence.py", line 1283, in orm_execute_statement result = conn.execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1418, in execute return meth( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/sql/elements.py", line 515, in _execute_on_connection return connection._execute_clauseelement( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1640, in _execute_clauseelement ret = self._execute_context( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1846, in _execute_context return self._exec_single_context( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1986, in _exec_single_context self._handle_dbapi_exception( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 2353, in _handle_dbapi_exception raise sqlalchemy_exception.with_traceback(exc_info[2]) from e File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context self.dialect.do_execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/default.py", line 924, in do_execute cursor.execute(statement, parameters) sqlalchemy.exc.IntegrityError: (psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint "files_meta_path_key" DETAIL: Key (path)=(/recordings/segments/garage/1723650513.m4s) already exists.

[SQL: INSERT INTO files_meta (path, orig_ctime, meta) VALUES (%(path)s, %(orig_ctime)s, %(meta)s) RETURNING files_meta.id] [parameters: {'path': '/recordings/segments/garage/1723650513.m4s', 'orig_ctime': datetime.datetime(2024, 8, 14, 15, 48, 33), 'meta': '{"m3u8": {"EXTINF": 1.867578}}'}] (Background on this error at: https://sqlalche.me/e/20/gkpj)


From: Jesper @.>
Sent: Wednesday, August 14, 2024 4:15 AM
To: roflcoopter/viseron @.>
Cc: madman2012 @.>; Author @.>
Subject: Re: [roflcoopter/viseron] Duplicate Key Violation Error in files_meta_path_key Causing Package Lock in Latest Dev Version (Issue #794)

Strange, what does your config look like now? Can you try to stop Viseron, delete the postgres folder in your config folder, and delete all the files in recordings, segments, thumbnails and snapshots to see if the issues persist?

I have an upcoming fix for the missing frame (the UUID KeyError)

Regarding 2024-08-09 11:33:32.299 EDT [32103] ERROR: could not resize shared memory segment "/PostgreSQL.3947467608" to 4194304 bytes: No space left on device, there is a fix here: #721 (reply in thread): https://github.com/roflcoopter/viseron/discussions/721#discussioncomment-10286213


madman2012 commented 2 months ago

Hi @roflcoopter,

I wanted to share what I've observed in case it helps with the next dev release.

The main issue I'm seeing is related to database inserts. There are a lot of errors like this:

sqlalchemy.exc.IntegrityError: (psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint "files_meta_path_key" DETAIL: Key (path)=(/recordings/segments/front_door/1723828367.m4s) already exists.

This seems to be happening in the fragmenter.py file, specifically in the _write_files_metadata method. From what I understand, it's trying to insert a record that already exists. Maybe there's a way to update the record instead of inserting a new one if it already exists? I am not sure why it's doing this after a database reset.
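
I am not much of a coder, but something like a PostgreSQL "upsert" is roughly what I have in mind. This is only a rough sketch with illustrative names (it assumes the FilesMeta model and session setup seen in the traceback), not a tested patch:

from sqlalchemy.dialects.postgresql import insert

def upsert_files_meta(session, path, orig_ctime, meta):
    """Insert segment metadata, or update the existing row if the path is already there."""
    # FilesMeta is assumed to be the ORM model mapped to the files_meta table
    stmt = insert(FilesMeta).values(path=path, orig_ctime=orig_ctime, meta=meta)
    stmt = stmt.on_conflict_do_update(
        index_elements=["path"],  # the column behind the files_meta_path_key constraint
        set_={"orig_ctime": orig_ctime, "meta": meta},
    )
    session.execute(stmt)
    session.commit()

That way a retry on the same .m4s file would just refresh the row instead of raising the unique constraint error.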

Another thing I noticed is that sometimes threads seem to die, especially for motion detection. For example:

[2024-08-16 12:56:53] [ERROR ] [viseron.watchdog.thread_watchdog] - Thread garage.motion_detection is dead, restarting

I'm not sure how to fix this, but perhaps there's a way to make these threads more resilient or restart them more smoothly?

Lastly, I sometimes get this error: KeyError: UUID('4158075b-74c0-4c3b-99b0-f8b64ce1227d')

This seems to be happening in the shared_frames.py file when trying to get a decoded frame. Maybe there's a way to handle this more gracefully if the frame isn't found?
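
Just to illustrate what I mean by "more gracefully" (the names here are hypothetical, since I don't know the real shared_frames internals):

import logging
from uuid import UUID

LOGGER = logging.getLogger(__name__)

def get_decoded_frame_safe(frames: dict, frame_id: UUID):
    """Return the decoded frame if it is still in memory, otherwise None so the caller can skip it."""
    try:
        return frames[frame_id]
    except KeyError:
        # The frame has already been evicted; skip this detection pass instead of crashing the thread
        LOGGER.debug("Frame %s is no longer available, skipping", frame_id)
        return None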

I hope this information is helpful! I'm really excited about Viseron and appreciate all your hard work. Let me know if you need any more details or if there's anything else I can do to help test.

roflcoopter commented 2 months ago

The motion detector thread dies because of the exception when trying to get the missing shared frame. The frames are kept in memory for about 2 seconds, so it seems the motion detector thread is running very slowly. There are safeguards in place to not remove a frame that is currently being processed, but they were not enabled for the motion detectors or object detectors. I have that fixed locally, so in the next release that should not be an issue.

Viseron periodically looks for new files to insert into the database, and my guess is that multiple checks are running concurrently. I have some fixes for that locally as well which I believe will help.
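
The general idea is to make sure only one of those periodic checks runs at a time, roughly like this (a simplified sketch with made-up helper names, not the actual code):

import threading

_scan_lock = threading.Lock()

def check_for_new_files():
    """Periodic job: skip this run if the previous scan has not finished yet."""
    if not _scan_lock.acquire(blocking=False):
        # A previous scan is still running; bail out instead of inserting the same paths twice
        return
    try:
        scan_and_insert_new_files()  # placeholder for the actual scan/insert logic
    finally:
        _scan_lock.release()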

I need a few more days to finish it up before I can push it, so please bear with me!

roflcoopter commented 2 months ago

Changes are now pushed to dev that hopefully fix this! If it's not too much hassle I would suggest starting fresh again and seeing if you still encounter issues. That way we can be sure that there are no strange leftovers from the previous runs.

madman2012 commented 1 month ago

Hi @roflcoopter

Thanks for such an awesome project. I am excited about it and still keep tinkering. I’m still encountering a few persistent issues with Viseron on the latest version of DEV that I hope you can help with. Below is a summary of the main errors and behaviors I'm experiencing, along with my configuration and Docker run command for your reference.

  1. Garbage Collection Error: The following error is causing problems during garbage collection: Exception ignored in garbage collection: Traceback (most recent call last): File "/usr/local/lib/python3.10/dist-packages/watchdog/utils/dirsnapshot.py", line 353, in walk entry = (p, self.stat(p)) TypeError: object.__new__() takes exactly one argument (the type to instantiate)

  2. Segmentation Fault (Signal 11) This error occurs and crashes the container: [viseron-finish] Viseron received signal 11

  3. File Handling & Metadata Insertion Failures I've encountered several errors related to file movement and metadata insertion. Here's a representative log:

[ERROR ] [viseron.components.storage.tier_handler.back_yard.tier_0] - Failed to move file /localrecord/segments/back_yard/1727405014.m4s to /nasrecord/segments/back_yard/1727405014.m4s: [Errno 2] No such file or directory: '/localrecord/segments/back_yard/1727405014.m4s'

2024-09-27 22:45:22.796 EDT [150213] ERROR: duplicate key value violates unique constraint "files_meta_path_key" DETAIL: Key (path)=(/nasrecord/segments/back_yard/1727405018.m4s) already exists.

This error is repeated for various file paths and seems to be related to attempting to insert metadata into the database, causing it to fail due to a unique constraint violation:

sqlalchemy.exc.IntegrityError: (psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint "files_meta_path_key"

Summary:

  1. Garbage Collection TypeError: Preventing proper cleanup of filesystem objects.
  2. Segmentation Fault (Signal 11): Causing container crashes.
  3. File Handling & Metadata Integrity: Repeated failures when moving files and inserting metadata, likely causing cascading issues with the system.

Here is my current configuration and the Docker run command (with passwords removed) for reference:

docker run -d \
  --name viseron \
  -v /var/tmp/viseron:/localrecord \
  -v /mnt/cctv/recordings:/nasrecord \
  -v /etc/localtime:/etc/localtime:ro \
  -v /dev/bus/usb:/dev/bus/usb \
  --device /dev/dri \
  --privileged \
  -p 8888:8888 \
  -e PUID=0 \
  -e PGID=0 \
  roflcoopter/viseron:dev

ffmpeg:
  camera:
    front_door:
      name: Front Door
      host: 10.0.1.188
      path: /cam/realmonitor?channel=1&subtype=0
      port: 554
      stream_format: rtsp
      username:
      password:
      fps: 15
      hwaccel_args:

edgetpu:
  object_detector:
    device: usb
    cameras:
      front_door:
        fps: 5
        scan_on_motion_only: false
        labels:

storage:
  recorder:
    tiers:

Let me know if I can provide any more information to help troubleshoot these issues!

madman2012 commented 1 month ago

Accidentally closed it. Sorry.

roflcoopter commented 1 month ago

Hmm, this is a hard one. I have not seen errors 1 and 2, and it's hard to figure out without more information.

Do you have a longer stacktrace of the first error so I can see what code in Viseron it originates from?

Would also need a crash dump to figure out what is causing the segmentation fault.
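
One low-effort way to get at least a Python-level traceback when signal 11 hits is the standard library faulthandler, for example by patching the entrypoint with something like this (or setting PYTHONFAULTHANDLER=1 on the container, which has the same effect):

import faulthandler
import sys

# Dump the Python traceback of every thread to stderr when a fatal signal such as SIGSEGV is received
faulthandler.enable(file=sys.stderr, all_threads=True)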

I do get sporadic errors for duplicates as well, so that is something I can work on by myself.

Thanks for your reports!

madman2012 commented 1 month ago

I ran the following log through o1 and this is what it came up with. If you tell me how to run the dumps, I can get you more details from my log. I can probably figure that out with o1 and drop more here. Wish I could help more on the code, but it sounds like an issue with tier_handler.py? The failing code lines are identified.

"It's all duplicate key stuff. Can you walk me through the fix step by step? 2024-10-04 16:55:55.159 EDT [39375] ERROR: duplicate key value violates unique constraint "files_meta_path_key" 2024-10-04 16:55:55.159 EDT [39375] DETAIL: Key (path)=(/nasrecord/segments/garage/1728062722.m4s) already exists. 2024-10-04 16:55:55.159 EDT [39375] STATEMENT: INSERT INTO files_meta (path, orig_ctime, meta) VALUES ('/nasrecord/segments/garage/1728062722.m4s', '2024-10-04T17:25:22'::timestamp, '{"m3u8": {"EXTINF": 4.023145}}') RETURNING files_meta.id [2024-10-04 16:55:55] [ERROR ] [viseron.components.storage.tier_handler.garage.tier_0] - Failed to insert metadata for /nasrecord/segments/garage/1728062722.m4s Traceback (most recent call last): File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context self.dialect.do_execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/default.py", line 924, in do_execute cursor.execute(statement, parameters) psycopg2.errors.UniqueViolation: duplicate key value violates unique constraint "files_meta_path_key" DETAIL: Key (path)=(/nasrecord/segments/garage/1728062722.m4s) already exists.

The above exception was the direct cause of the following exception:

Traceback (most recent call last): File "/src/viseron/components/storage/tier_handler.py", line 810, in move_file session.execute(ins) File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/session.py", line 2351, in execute return self._execute_internal( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/session.py", line 2236, in _execute_internal result: Result[Any] = compile_state_cls.orm_execute_statement( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/bulk_persistence.py", line 1283, in orm_execute_statement result = conn.execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1418, in execute return meth( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/sql/elements.py", line 515, in _execute_on_connection return connection._execute_clauseelement( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1640, in _execute_clauseelement ret = self._execute_context( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1846, in _execute_context return self._exec_single_context( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1986, in _exec_single_context self._handle_dbapi_exception( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 2353, in _handle_dbapi_exception raise sqlalchemy_exception.with_traceback(exc_info[2]) from e File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context self.dialect.do_execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/default.py", line 924, in do_execute cursor.execute(statement, parameters) sqlalchemy.exc.IntegrityError: (psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint "files_meta_path_key" DETAIL: Key (path)=(/nasrecord/segments/garage/1728062722.m4s) already exists.

[SQL: INSERT INTO files_meta (path, orig_ctime, meta) VALUES (%(path)s, %(orig_ctime)s, %(meta)s) RETURNING files_meta.id] [parameters: {'path': '/nasrecord/segments/garage/1728062722.m4s', 'orig_ctime': datetime.datetime(2024, 10, 4, 17, 25, 22), 'meta': '{"m3u8": {"EXTINF": 4.023145}}'}] (Background on this error at: https://sqlalche.me/e/20/gkpj) [2024-10-04 16:55:55] [ERROR ] [viseron.components.storage.tier_handler.garage.tier_0] - Failed to move file /localrecord/segments/garage/1728062722.m4s to /nasrecord/segments/garage/1728062722.m4s: [Errno 2] No such file or directory: '/localrecord/segments/garage/1728062722.m4s' [2024-10-04 16:56:06] [ERROR ] [viseron.components.ffmpeg.stream.garage] - [h264 @ 0x648f61051e00] out of range intra chroma pred mode [2024-10-04 16:56:56] [ERROR ] [viseron.components.ffmpeg.stream.garage] - [h264 @ 0x648f61051e00] negative number of zero coeffs at 133 3 2024-10-04 16:57:43.854 EDT [39057] ERROR: duplicate key value violates unique constraint "files_meta_path_key" 2024-10-04 16:57:43.854 EDT [39057] DETAIL: Key (path)=(/nasrecord/segments/back_yard/1728066316.m4s) already exists. 2024-10-04 16:57:43.854 EDT [39057] STATEMENT: INSERT INTO files_meta (path, orig_ctime, meta) VALUES ('/nasrecord/segments/back_yard/1728066316.m4s', '2024-10-04T18:25:16'::timestamp, '{"m3u8": {"EXTINF": 5.913477}}') RETURNING files_meta.id [2024-10-04 16:57:43] [ERROR ] [viseron.components.storage.tier_handler.back_yard.tier_0] - Failed to insert metadata for /nasrecord/segments/back_yard/1728066316.m4s Traceback (most recent call last): File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context self.dialect.do_execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/default.py", line 924, in do_execute cursor.execute(statement, parameters) psycopg2.errors.UniqueViolation: duplicate key value violates unique constraint "files_meta_path_key" DETAIL: Key (path)=(/nasrecord/segments/back_yard/1728066316.m4s) already exists.

The above exception was the direct cause of the following exception:

Traceback (most recent call last): File "/src/viseron/components/storage/tier_handler.py", line 810, in move_file session.execute(ins) File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/session.py", line 2351, in execute return self._execute_internal( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/session.py", line 2236, in _execute_internal result: Result[Any] = compile_state_cls.orm_execute_statement( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/bulk_persistence.py", line 1283, in orm_execute_statement result = conn.execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1418, in execute return meth( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/sql/elements.py", line 515, in _execute_on_connection return connection._execute_clauseelement( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1640, in _execute_clauseelement ret = self._execute_context( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1846, in _execute_context return self._exec_single_context( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1986, in _exec_single_context self._handle_dbapi_exception( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 2353, in _handle_dbapi_exception raise sqlalchemy_exception.with_traceback(exc_info[2]) from e File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context self.dialect.do_execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/default.py", line 924, in do_execute cursor.execute(statement, parameters) sqlalchemy.exc.IntegrityError: (psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint "files_meta_path_key" DETAIL: Key (path)=(/nasrecord/segments/back_yard/1728066316.m4s) already exists.

[SQL: INSERT INTO files_meta (path, orig_ctime, meta) VALUES (%(path)s, %(orig_ctime)s, %(meta)s) RETURNING files_meta.id] [parameters: {'path': '/nasrecord/segments/back_yard/1728066316.m4s', 'orig_ctime': datetime.datetime(2024, 10, 4, 18, 25, 16), 'meta': '{"m3u8": {"EXTINF": 5.913477}}'}] (Background on this error at: https://sqlalche.me/e/20/gkpj) [2024-10-04 16:57:44] [ERROR ] [viseron.components.storage.tier_handler.back_yard.tier_0] - Failed to move file /localrecord/segments/back_yard/1728066316.m4s to /nasrecord/segments/back_yard/1728066316.m4s: [Errno 2] No such file or directory: '/localrecord/segments/back_yard/1728066316.m4s' 2024-10-04 16:59:20.974 EDT [39375] ERROR: duplicate key value violates unique constraint "files_meta_path_key" 2024-10-04 16:59:20.974 EDT [39375] DETAIL: Key (path)=(/nasrecord/segments/drive_way/1728065801.m4s) already exists. 2024-10-04 16:59:20.974 EDT [39375] STATEMENT: INSERT INTO files_meta (path, orig_ctime, meta) VALUES ('/nasrecord/segments/drive_way/1728065801.m4s', '2024-10-04T18:16:41'::timestamp, '{"m3u8": {"EXTINF": 6.001693}}') RETURNING files_meta.id [2024-10-04 16:59:20] [ERROR ] [viseron.components.storage.tier_handler.drive_way.tier_0] - Failed to insert metadata for /nasrecord/segments/drive_way/1728065801.m4s Traceback (most recent call last): File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context self.dialect.do_execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/default.py", line 924, in do_execute cursor.execute(statement, parameters) psycopg2.errors.UniqueViolation: duplicate key value violates unique constraint "files_meta_path_key" DETAIL: Key (path)=(/nasrecord/segments/drive_way/1728065801.m4s) already exists.

The above exception was the direct cause of the following exception:

Traceback (most recent call last): File "/src/viseron/components/storage/tier_handler.py", line 810, in move_file session.execute(ins) File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/session.py", line 2351, in execute return self._execute_internal( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/session.py", line 2236, in _execute_internal result: Result[Any] = compile_state_cls.orm_execute_statement( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/bulk_persistence.py", line 1283, in orm_execute_statement result = conn.execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1418, in execute return meth( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/sql/elements.py", line 515, in _execute_on_connection return connection._execute_clauseelement( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1640, in _execute_clauseelement ret = self._execute_context( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1846, in _execute_context return self._exec_single_context( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1986, in _exec_single_context self._handle_dbapi_exception( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 2353, in _handle_dbapi_exception raise sqlalchemy_exception.with_traceback(exc_info[2]) from e File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context self.dialect.do_execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/default.py", line 924, in do_execute cursor.execute(statement, parameters) sqlalchemy.exc.IntegrityError: (psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint "files_meta_path_key" DETAIL: Key (path)=(/nasrecord/segments/drive_way/1728065801.m4s) already exists.

[SQL: INSERT INTO files_meta (path, orig_ctime, meta) VALUES (%(path)s, %(orig_ctime)s, %(meta)s) RETURNING files_meta.id] [parameters: {'path': '/nasrecord/segments/drive_way/1728065801.m4s', 'orig_ctime': datetime.datetime(2024, 10, 4, 18, 16, 41), 'meta': '{"m3u8": {"EXTINF": 6.001693}}'}] (Background on this error at: https://sqlalche.me/e/20/gkpj) [2024-10-04 16:59:21] [ERROR ] [viseron.components.storage.tier_handler.drive_way.tier_0] - Failed to move file /localrecord/segments/drive_way/1728065801.m4s to /nasrecord/segments/drive_way/1728065801.m4s: [Errno 2] No such file or directory: '/localrecord/segments/drive_way/1728065801.m4s' 2024-10-04 16:59:49.325 EDT [39057] ERROR: duplicate key value violates unique constraint "files_meta_path_key" 2024-10-04 16:59:49.325 EDT [39057] DETAIL: Key (path)=(/nasrecord/segments/back_yard/1728066426.m4s) already exists. 2024-10-04 16:59:49.325 EDT [39057] STATEMENT: INSERT INTO files_meta (path, orig_ctime, meta) VALUES ('/nasrecord/segments/back_yard/1728066426.m4s', '2024-10-04T18:27:06'::timestamp, '{"m3u8": {"EXTINF": 6.010417}}') RETURNING files_meta.id [2024-10-04 16:59:49] [ERROR ] [viseron.components.storage.tier_handler.back_yard.tier_0] - Failed to insert metadata for /nasrecord/segments/back_yard/1728066426.m4s Traceback (most recent call last): File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context self.dialect.do_execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/default.py", line 924, in do_execute cursor.execute(statement, parameters) psycopg2.errors.UniqueViolation: duplicate key value violates unique constraint "files_meta_path_key" DETAIL: Key (path)=(/nasrecord/segments/back_yard/1728066426.m4s) already exists.

The above exception was the direct cause of the following exception:

Traceback (most recent call last): File "/src/viseron/components/storage/tier_handler.py", line 810, in move_file session.execute(ins) File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/session.py", line 2351, in execute return self._execute_internal( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/session.py", line 2236, in _execute_internal result: Result[Any] = compile_state_cls.orm_execute_statement( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/bulk_persistence.py", line 1283, in orm_execute_statement result = conn.execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1418, in execute return meth( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/sql/elements.py", line 515, in _execute_on_connection return connection._execute_clauseelement( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1640, in _execute_clauseelement ret = self._execute_context( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1846, in _execute_context return self._exec_single_context( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1986, in _exec_single_context self._handle_dbapi_exception( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 2353, in _handle_dbapi_exception raise sqlalchemy_exception.with_traceback(exc_info[2]) from e File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context self.dialect.do_execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/default.py", line 924, in do_execute cursor.execute(statement, parameters) sqlalchemy.exc.IntegrityError: (psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint "files_meta_path_key" DETAIL: Key (path)=(/nasrecord/segments/back_yard/1728066426.m4s) already exists.

[SQL: INSERT INTO files_meta (path, orig_ctime, meta) VALUES (%(path)s, %(orig_ctime)s, %(meta)s) RETURNING files_meta.id] [parameters: {'path': '/nasrecord/segments/back_yard/1728066426.m4s', 'orig_ctime': datetime.datetime(2024, 10, 4, 18, 27, 6), 'meta': '{"m3u8": {"EXTINF": 6.010417}}'}] (Background on this error at: https://sqlalche.me/e/20/gkpj) [2024-10-04 16:59:49] [ERROR ] [viseron.components.storage.tier_handler.back_yard.tier_0] - Failed to move file /localrecord/segments/back_yard/1728066426.m4s to /nasrecord/segments/back_yard/1728066426.m4s: [Errno 2] No such file or directory: '/localrecord/segments/back_yard/1728066426.m4s' 2024-10-04 17:00:49.744 EDT [39057] ERROR: duplicate key value violates unique constraint "files_meta_path_key" 2024-10-04 17:00:49.744 EDT [39057] DETAIL: Key (path)=(/nasrecord/segments/back_yard/1728066472.m4s) already exists. 2024-10-04 17:00:49.744 EDT [39057] STATEMENT: INSERT INTO files_meta (path, orig_ctime, meta) VALUES ('/nasrecord/segments/back_yard/1728066472.m4s', '2024-10-04T18:27:52'::timestamp, '{"m3u8": {"EXTINF": 4.007031}}') RETURNING files_meta.id [2024-10-04 17:00:49] [ERROR ] [viseron.components.storage.tier_handler.back_yard.tier_0] - Failed to insert metadata for /nasrecord/segments/back_yard/1728066472.m4s Traceback (most recent call last): File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context self.dialect.do_execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/default.py", line 924, in do_execute cursor.execute(statement, parameters) psycopg2.errors.UniqueViolation: duplicate key value violates unique constraint "files_meta_path_key" DETAIL: Key (path)=(/nasrecord/segments/back_yard/1728066472.m4s) already exists.

The above exception was the direct cause of the following exception:

Traceback (most recent call last): File "/src/viseron/components/storage/tier_handler.py", line 810, in move_file session.execute(ins) File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/session.py", line 2351, in execute return self._execute_internal( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/session.py", line 2236, in _execute_internal result: Result[Any] = compile_state_cls.orm_execute_statement( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/bulk_persistence.py", line 1283, in orm_execute_statement result = conn.execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1418, in execute return meth( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/sql/elements.py", line 515, in _execute_on_connection return connection._execute_clauseelement( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1640, in _execute_clauseelement ret = self._execute_context( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1846, in _execute_context return self._exec_single_context( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1986, in _exec_single_context self._handle_dbapi_exception( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 2353, in _handle_dbapi_exception raise sqlalchemy_exception.with_traceback(exc_info[2]) from e File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context self.dialect.do_execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/default.py", line 924, in do_execute cursor.execute(statement, parameters) sqlalchemy.exc.IntegrityError: (psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint "files_meta_path_key" DETAIL: Key (path)=(/nasrecord/segments/back_yard/1728066472.m4s) already exists.

[SQL: INSERT INTO files_meta (path, orig_ctime, meta) VALUES (%(path)s, %(orig_ctime)s, %(meta)s) RETURNING files_meta.id] [parameters: {'path': '/nasrecord/segments/back_yard/1728066472.m4s', 'orig_ctime': datetime.datetime(2024, 10, 4, 18, 27, 52), 'meta': '{"m3u8": {"EXTINF": 4.007031}}'}] (Background on this error at: https://sqlalche.me/e/20/gkpj) [2024-10-04 17:00:50] [ERROR ] [viseron.components.storage.tier_handler.back_yard.tier_0] - Failed to move file /localrecord/segments/back_yard/1728066472.m4s to /nasrecord/segments/back_yard/1728066472.m4s: [Errno 2] No such file or directory: '/localrecord/segments/back_yard/1728066472.m4s' 2024-10-04 17:03:09.089 EDT [39056] ERROR: duplicate key value violates unique constraint "files_meta_path_key" 2024-10-04 17:03:09.089 EDT [39056] DETAIL: Key (path)=(/nasrecord/segments/garage/1728063122.m4s) already exists. 2024-10-04 17:03:09.089 EDT [39056] STATEMENT: INSERT INTO files_meta (path, orig_ctime, meta) VALUES ('/nasrecord/segments/garage/1728063122.m4s', '2024-10-04T17:32:02'::timestamp, '{"m3u8": {"EXTINF": 4.060645}}') RETURNING files_meta.id [2024-10-04 17:03:09] [ERROR ] [viseron.components.storage.tier_handler.garage.tier_0] - Failed to insert metadata for /nasrecord/segments/garage/1728063122.m4s Traceback (most recent call last): File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context self.dialect.do_execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/default.py", line 924, in do_execute cursor.execute(statement, parameters) psycopg2.errors.UniqueViolation: duplicate key value violates unique constraint "files_meta_path_key" DETAIL: Key (path)=(/nasrecord/segments/garage/1728063122.m4s) already exists.

The above exception was the direct cause of the following exception:

Traceback (most recent call last): File "/src/viseron/components/storage/tier_handler.py", line 810, in move_file session.execute(ins) File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/session.py", line 2351, in execute return self._execute_internal( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/session.py", line 2236, in _execute_internal result: Result[Any] = compile_state_cls.orm_execute_statement( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/orm/bulk_persistence.py", line 1283, in orm_execute_statement result = conn.execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1418, in execute return meth( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/sql/elements.py", line 515, in _execute_on_connection return connection._execute_clauseelement( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1640, in _execute_clauseelement ret = self._execute_context( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1846, in _execute_context return self._exec_single_context( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1986, in _exec_single_context self._handle_dbapi_exception( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 2353, in _handle_dbapi_exception raise sqlalchemy_exception.with_traceback(exc_info[2]) from e File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context self.dialect.do_execute( File "/usr/local/lib/python3.10/dist-packages/sqlalchemy/engine/default.py", line 924, in do_execute cursor.execute(statement, parameters) sqlalchemy.exc.IntegrityError: (psycopg2.errors.UniqueViolation) duplicate key value violates unique constraint "files_meta_path_key" DETAIL: Key (path)=(/nasrecord/segments/garage/1728063122.m4s) already exists.

[SQL: INSERT INTO files_meta (path, orig_ctime, meta) VALUES (%(path)s, %(orig_ctime)s, %(meta)s) RETURNING files_meta.id] [parameters: {'path': '/nasrecord/segments/garage/1728063122.m4s', 'orig_ctime': datetime.datetime(2024, 10, 4, 17, 32, 2), 'meta': '{"m3u8": {"EXTINF": 4.060645}}'}] (Background on this error at: https://sqlalche.me/e/20/gkpj) [2024-10-04 17:03:09] [ERROR ] [viseron.components.storage.tier_handler.garage.tier_0] - Failed to move file /localrecord/segments/garage/1728063122.m4s to /nasrecord/segments/garage/1728063122.m4s: [Errno 2] No such file or directory: '/localrecord/segments/garage/1728063122.m4s'

"Thank you for providing the error logs and the tier_handler.py code. I’ll guide you through the steps to identify and fix the duplicate key errors you’re experiencing in Viseron.

Understanding the Error

The error messages indicate that there’s a violation of a unique constraint in the database:

psycopg2.errors.UniqueViolation: duplicate key value violates unique constraint "files_meta_path_key" DETAIL: Key (path)=(/nasrecord/segments/garage/1728062722.m4s) already exists.

•   Explanation: The database table files_meta has a unique constraint on the path column, meaning no two records can have the same path value.
•   Cause: The code is attempting to insert a new record with a path that already exists in the files_meta table.

Additionally, there’s an error when moving files:

Failed to move file /localrecord/segments/garage/1728062722.m4s to /nasrecord/segments/garage/1728062722.m4s: [Errno 2] No such file or directory: '/localrecord/segments/garage/1728062722.m4s'

•   Explanation: The program is trying to move a file that doesn’t exist at the source location.

Step-by-Step Fix

  1. Locate the Problematic Code

In your tier_handler.py, the error occurs in the move_file function when inserting into the files_meta table. Here’s the relevant code snippet:

def move_file(
    get_session: Callable[..., Session],
    src: str,
    dst: str,
    logger: logging.Logger,
) -> None:
    """Move file from src to dst."""
    logger.debug("Moving file from %s to %s", src, dst)
    try:
        with get_session() as session:
            sel = select(FilesMeta).where(FilesMeta.path == src)
            res = session.execute(sel).scalar_one()
            ins = insert(FilesMeta).values(
                path=dst, meta=res.meta, orig_ctime=res.orig_ctime
            )
            session.execute(ins)
            session.commit()
    except IntegrityError:
        logger.error(f"Failed to insert metadata for {dst}", exc_info=True)

    try:
        os.makedirs(os.path.dirname(dst), exist_ok=True)
        shutil.copy(src, dst)
        os.remove(src)
    except FileNotFoundError as error:
        logger.error(f"Failed to move file {src} to {dst}: {error}")
        with get_session() as session:
            stmt = delete(Files).where(Files.path == src)
            session.execute(stmt)
            session.commit()
  2. Identify the Cause

    • Duplicate Insertion: The code attempts to insert a new record into files_meta with the destination path (dst), but if a record with that path already exists, it causes a UniqueViolation error.
    • File Not Found: If the source file (src) doesn’t exist, the FileNotFoundError is raised when trying to copy or remove the file.

  3. Modify the Code to Handle Duplicates

To prevent the duplicate key error, we need to check if a record with the destination path already exists before inserting. If it exists, we can update it or skip the insertion.

Here’s how you can modify the move_file function:

from sqlalchemy.exc import IntegrityError, NoResultFound

def move_file(
    get_session: Callable[..., Session],
    src: str,
    dst: str,
    logger: logging.Logger,
) -> None:
    """Move file from src to dst."""
    logger.debug("Moving file from %s to %s", src, dst)
    try:
        with get_session() as session:
            # Check if the destination path already exists in files_meta
            sel_dst = select(FilesMeta).where(FilesMeta.path == dst)
            try:
                res_dst = session.execute(sel_dst).scalar_one()
                # If it exists, update the existing record
                upd = (
                    update(FilesMeta)
                    .where(FilesMeta.path == dst)
                    .values(meta=res_dst.meta, orig_ctime=res_dst.orig_ctime)
                )
                session.execute(upd)
            except NoResultFound:
                # If it doesn't exist, insert a new record
                sel_src = select(FilesMeta).where(FilesMeta.path == src)
                res_src = session.execute(sel_src).scalar_one()
                ins = insert(FilesMeta).values(
                    path=dst, meta=res_src.meta, orig_ctime=res_src.orig_ctime
                )
                session.execute(ins)
            session.commit()
    except IntegrityError as e:
        logger.error(f"Integrity error when handling metadata for {dst}: {e}", exc_info=True)
        session.rollback()
    except NoResultFound as e:
        logger.error(f"No metadata found for source {src}: {e}", exc_info=True)
        session.rollback()

    try:
        os.makedirs(os.path.dirname(dst), exist_ok=True)
        shutil.copy(src, dst)
        os.remove(src)
    except FileNotFoundError as error:
        logger.error(f"Failed to move file {src} to {dst}: {error}")
        with get_session() as session:
            stmt = delete(Files).where(Files.path == src)
            session.execute(stmt)
            session.commit()
  4. Explanation of Changes

    • Check for Existing Destination Record:
      • Before inserting, we check if a record with dst already exists.
      • If it exists, we can choose to update it or skip the insertion.
    • Handle NoResultFound Exception:
      • When fetching the source metadata, if the source record doesn’t exist, we catch NoResultFound and log an error.
    • Handle IntegrityError:
      • If an IntegrityError occurs, we log the error and rollback the session to prevent the transaction from being left in an invalid state.
    • Improved Exception Handling:
      • Added more specific exception handling to provide clearer error messages and maintain database integrity.

  5. Verify File Existence Before Moving

Before attempting to move the file, ensure that the source file exists:

import os

def move_file(
    get_session: Callable[..., Session],
    src: str,
    dst: str,
    logger: logging.Logger,
) -> None:
    """Move file from src to dst."""
    if not os.path.exists(src):
        logger.error(f"Source file {src} does not exist. Skipping move.")
        with get_session() as session:
            stmt = delete(Files).where(Files.path == src)
            session.execute(stmt)
            session.commit()
        return

    # Rest of the code...

  6. Update Database After File Operations

Ensure that after moving the file, the Files table is updated with the new path:

# After os.remove(src)
with get_session() as session:
    # Update the path in the Files table
    upd_files = update(Files).where(Files.path == src).values(path=dst)
    session.execute(upd_files)
    session.commit()
  7. Rebuild and Test

    • Rebuild the Docker Image:

docker build -t viseron-dev .

•   Run the Docker Container:

docker run -d --name viseron-dev -p 8888:8888 viseron-dev

•   Monitor the Logs:

docker logs -f viseron-dev

•   Check for Errors:
•   Verify that the duplicate key errors no longer appear.
•   Ensure that files are being moved correctly and that their metadata is handled appropriately.
  8. Clean Up the Database (Optional)

If there are existing duplicate entries or inconsistencies, consider cleaning up the files_meta table:

•   Connect to the Database:

psql -U yourusername -h yourhost -p yourport yourdbname

•   Remove Duplicate Entries:

DELETE FROM files_meta
WHERE id IN (
    SELECT id FROM (
        SELECT id,
               ROW_NUMBER() OVER (PARTITION BY path ORDER BY id) AS rnum
        FROM files_meta
    ) t
    WHERE t.rnum > 1
);

Additional Tips

•   Database Indexes: Ensure that the files_meta table has appropriate indexes to optimize queries.
•   Logging: Enhance logging to provide more context when errors occur.
•   Exception Handling: Always handle exceptions gracefully to prevent the application from crashing.

Conclusion

By modifying the move_file function to handle cases where the destination path already exists in the files_meta table, you should resolve the duplicate key errors. Remember to test thoroughly after making changes to confirm that the issue is fixed and that no new issues have been introduced.

Feel free to ask if you need further assistance or clarification on any of the steps. Good luck!"