pedropombeiro opened 1 month ago
Hey there @kane610, mind taking a look at this issue as it has been labeled with an integration (unifi) you are listed as a code owner for? Thanks!
(message by CodeOwnersMention)
unifi documentation unifi source (message by IssueLinks)
@pedropombeiro this is a recorder issue not a unifi issue, duplicate of #123309
Thanks for the link @smarthomefamilyverrips, I'll subscribe to that issue then. Closing this one 👍
Hey there @home-assistant/core, mind taking a look at this issue as it has been labeled with an integration (recorder) you are listed as a code owner for? Thanks!
(message by CodeOwnersMention)
recorder documentation recorder source (message by IssueLinks)
This is not the same issue as in #123309, although both are related to migration.
@emontnemery OK, I did think they were the same 😀
@smarthomefamilyverrips no worries, they are very similar, but I'd like to keep them separated. Did you also have the database upgrade fail? Could you paste the error message?
I got a similar error on the states table:
Logger: homeassistant.components.recorder.core
Source: components/recorder/core.py:988
integration: Recorder (documentation, issues)
First occurred: 06:51:08 (1 occurrences)
Last logged: 06:51:08
Database error during schema migration
Traceback (most recent call last):
File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context
self.dialect.do_execute(
File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/default.py", line 924, in do_execute
cursor.execute(statement, parameters)
psycopg2.errors.UniqueViolation: could not create unique index "states_pkey"
DETAIL: Key (state_id)=(869247) is duplicated.
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/usr/src/homeassistant/homeassistant/components/recorder/core.py", line 988, in _migrate_schema
new_schema_status = migrator(
^^^^^^^^^
File "/usr/src/homeassistant/homeassistant/components/recorder/migration.py", line 335, in migrate_schema_live
schema_status = _migrate_schema(
^^^^^^^^^^^^^^^^
File "/usr/src/homeassistant/homeassistant/components/recorder/migration.py", line 301, in _migrate_schema
_apply_update(instance, hass, engine, session_maker, new_version, start_version)
File "/usr/src/homeassistant/homeassistant/components/recorder/util.py", line 703, in wrapper
job(instance, *args, **kwargs)
File "/usr/src/homeassistant/homeassistant/components/recorder/migration.py", line 710, in _apply_update
migrator_cls(instance, hass, engine, session_maker, old_version).apply_update()
File "/usr/src/homeassistant/homeassistant/components/recorder/migration.py", line 756, in apply_update
self._apply_update()
File "/usr/src/homeassistant/homeassistant/components/recorder/migration.py", line 1490, in _apply_update
_modify_columns(
File "/usr/src/homeassistant/homeassistant/components/recorder/migration.py", line 562, in _modify_columns
connection.execute(
File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1418, in execute
return meth(
^^^^^
File "/usr/local/lib/python3.12/site-packages/sqlalchemy/sql/elements.py", line 515, in _execute_on_connection
return connection._execute_clauseelement(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1640, in _execute_clauseelement
ret = self._execute_context(
^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1846, in _execute_context
return self._exec_single_context(
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1986, in _exec_single_context
self._handle_dbapi_exception(
File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 2353, in _handle_dbapi_exception
raise sqlalchemy_exception.with_traceback(exc_info[2]) from e
File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context
self.dialect.do_execute(
File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/default.py", line 924, in do_execute
cursor.execute(statement, parameters)
sqlalchemy.exc.IntegrityError: (psycopg2.errors.UniqueViolation) could not create unique index "states_pkey"
DETAIL: Key (state_id)=(869247) is duplicated.
[SQL: ALTER TABLE states ALTER event_id TYPE BIGINT, ALTER old_state_id TYPE BIGINT, ALTER attributes_id TYPE BIGINT, ALTER metadata_id TYPE BIGINT]
(Background on this error at: https://sqlalche.me/e/20/gkpj)
We need to see database dumps to understand what's happening. Please contact me on discord, I'm @emontnemery there too.
@pedropombeiro @nalabelle Have you manually altered the database at any point or used some tool to migrate from one database engine to another, for example migrating from SQLite to PostgreSQL?
@emontnemery I did move my database to PostgreSQL with Pgloader about 3 years ago 🤔
Here is my schema dump, just in case: schema_dump.sql
Yes, I migrated from sqlite to postgres. I won't have a chance to dig in the db until this weekend, but it sounds like you might be onto something.
@nalabelle Right. Your DB schema seems mostly OK, but you might have imported duplicate rows when migrating from sqlite. I'm not sure about the postgres tool you used for migration, but the MariaDB equivalent explicitly disables constraint checks when importing a dump. Also, the schema dump is the state after the failed migration, right?
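As an aside, a constraint-less import can leave exactly this kind of duplication behind, and it is easy to check for before attempting a migration. A minimal sketch (SQLite standing in for PostgreSQL; the table/column names and the duplicated key 869247 come from the traceback above, the rows themselves are invented):

```python
import sqlite3

# Toy reproduction: a table imported with constraint checks disabled can
# carry duplicated "primary key" values. The data here is made up.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE states (state_id INTEGER, state TEXT)")  # note: no PRIMARY KEY
conn.executemany(
    "INSERT INTO states VALUES (?, ?)",
    [(869246, "on"), (869247, "off"), (869247, "off"), (869248, "on")],
)

# The same GROUP BY ... HAVING check works unchanged on PostgreSQL.
dupes = conn.execute(
    "SELECT state_id, COUNT(*) FROM states "
    "GROUP BY state_id HAVING COUNT(*) > 1"
).fetchall()
print(dupes)  # [(869247, 2)]
```

Running the equivalent query against each table with a unique key would surface duplicates before the recorder's schema migration trips over them.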
If you have a backup from before the migration, the safest option is:
If you don't have a backup, you instead need to do:
ALTER TABLE ONLY public.events
ADD CONSTRAINT events_data_id_fkey FOREIGN KEY (data_id) REFERENCES public.event_data(data_id);
ALTER TABLE ONLY public.states
ADD CONSTRAINT states_attributes_id_fkey FOREIGN KEY (attributes_id) REFERENCES public.state_attributes(attributes_id);
ALTER TABLE ONLY public.states
ADD CONSTRAINT states_old_state_id_fkey FOREIGN KEY (old_state_id) REFERENCES public.states(state_id);
ALTER TABLE ONLY public.statistics
ADD CONSTRAINT statistics_metadata_id_fkey FOREIGN KEY (metadata_id) REFERENCES public.statistics_meta(id) ON DELETE CASCADE;
ALTER TABLE ONLY public.statistics_short_term
ADD CONSTRAINT statistics_short_term_metadata_id_fkey FOREIGN KEY (metadata_id) REFERENCES public.statistics_meta(id) ON DELETE CASCADE;
Note: There might be duplicates in multiple tables. Also, let me know if you need help with deleting duplicates.
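Along the same lines, ADD CONSTRAINT will also fail if any row references a missing parent, so it can help to look for orphaned references first. A sketch of that check (SQLite standing in for PostgreSQL; table and column names taken from the constraints above, data invented):

```python
import sqlite3

# Toy data: one valid reference and one orphaned attributes_id.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE state_attributes (attributes_id INTEGER, shared_attrs TEXT)")
conn.execute("CREATE TABLE states (state_id INTEGER, attributes_id INTEGER)")
conn.execute("INSERT INTO state_attributes VALUES (1, '{}')")
conn.executemany("INSERT INTO states VALUES (?, ?)", [(10, 1), (11, 167182)])

# Rows whose attributes_id points at nothing; these must be fixed (or the
# reference NULLed out) before the foreign key can be re-added.
orphans = conn.execute(
    """
    SELECT s.state_id, s.attributes_id
    FROM states AS s
    LEFT JOIN state_attributes AS a ON s.attributes_id = a.attributes_id
    WHERE s.attributes_id IS NOT NULL AND a.attributes_id IS NULL
    """
).fetchall()
print(orphans)  # [(11, 167182)]
```

The same LEFT JOIN pattern applies to each of the foreign keys listed above (old_state_id, metadata_id, and so on).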
Thanks for the instructions @emontnemery. It turned out there were 3 duplicates, and each time I missed one I had to go through the steps again and add it to the list to be deleted. In the end I ran this:
DELETE FROM states_meta WHERE entity_id IN (SELECT entity_id FROM states_meta GROUP BY entity_id HAVING COUNT(*) > 1)
The migration has succeeded, and everything is now working fine. Thanks a lot for helping me through this!
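One note on the query quoted above: it deletes every row whose entity_id is duplicated, including the last remaining copy. A variant that keeps one row per entity_id can be sketched like this (SQLite standing in for PostgreSQL; the data is invented, and any states rows pointing at the deleted metadata_id values would still need repointing):

```python
import sqlite3

# Toy states_meta table with one duplicated entity_id.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE states_meta (metadata_id INTEGER, entity_id TEXT)")
conn.executemany(
    "INSERT INTO states_meta VALUES (?, ?)",
    [(1, "sensor.a"), (2, "sensor.a"), (3, "sensor.b")],
)

# Delete all but the lowest-metadata_id row for each entity_id.
conn.execute(
    """
    DELETE FROM states_meta
    WHERE metadata_id NOT IN (
        SELECT MIN(metadata_id) FROM states_meta GROUP BY entity_id
    )
    """
)
rows = conn.execute(
    "SELECT metadata_id, entity_id FROM states_meta ORDER BY metadata_id"
).fetchall()
print(rows)  # [(1, 'sensor.a'), (3, 'sensor.b')]
```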
Great news @pedropombeiro!
I'm very late with asking, but do you have a backup of the database from before your manual changes, or even better from before the upgrade to 2024.8.0, which you would be able to share?
I'm afraid I don't.
When I restored my database from before the upgrade, the issues ended up being pretty apparent:
ERROR: could not create unique index "states_pkey"
DETAIL: Key (state_id)=(869247) is duplicated.
...
ERROR: insert or update on table "states" violates foreign key constraint "states_attributes_id_fkey"
DETAIL: Key (attributes_id)=(167182) is not present in table "state_attributes".
This was a backup from last Sunday, before the 2024.8 release and any of the migrations.
@nalabelle Would it be possible for you to share the database backup?
It's over half a GB and contains a bunch of personal data: when we're home, where we are, etc - whole point of keeping this stuff self-hosted and off the cloud 😄. Is there a way to get you the info you need without sharing all that?
The problem
I've tried upgrading to 2024.8.0, but started receiving 2 notifications at startup, regarding a database migration error. I'm at a loss on how to fix this situation.
What version of Home Assistant Core has the issue?
core-2024.8.0
What was the last working version of Home Assistant Core?
core-2024.7.x
What type of installation are you running?
Home Assistant Container
Integration causing the issue
UniFi Protect and UniFi Network
Link to integration documentation on our website
https://www.home-assistant.io/integrations/unifi
Diagnostics information
config_entry-unifi-733e9b927449b5799f3312bb99301803.json
config_entry-unifiprotect-84bdb37b1c82a7fc7ff9611bf1023fe8.json
Example YAML snippet
No response
Anything in the logs that might be useful for us?
Additional information
Logger: homeassistant.helpers.recorder
Source: helpers/recorder.py:101
First occurred: 22:57:24 (1 occurrences)
Last logged: 22:57:24
Error executing query
Traceback (most recent call last):
File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context
self.dialect.do_execute(
File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/default.py", line 924, in do_execute
cursor.execute(statement, parameters)
psycopg2.errors.UniqueViolation: could not create unique index "ix_states_meta_entity_id"
DETAIL: Key (entity_id)=(sensor.udm_pro_cpu_utilization_2) is duplicated.
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/usr/src/homeassistant/homeassistant/helpers/recorder.py", line 101, in session_scope
yield session
File "/usr/src/homeassistant/homeassistant/components/recorder/migration.py", line 562, in _modify_columns
connection.execute(
File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1418, in execute
return meth(
^^^^^
File "/usr/local/lib/python3.12/site-packages/sqlalchemy/sql/elements.py", line 515, in _execute_on_connection
return connection._execute_clauseelement(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1640, in _execute_clauseelement
ret = self._execute_context(
^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1846, in _execute_context
return self._exec_single_context(
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1986, in _exec_single_context
self._handle_dbapi_exception(
File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 2353, in _handle_dbapi_exception
raise sqlalchemy_exception.with_traceback(exc_info[2]) from e
File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/base.py", line 1967, in _exec_single_context
self.dialect.do_execute(
File "/usr/local/lib/python3.12/site-packages/sqlalchemy/engine/default.py", line 924, in do_execute
cursor.execute(statement, parameters)
sqlalchemy.exc.IntegrityError: (psycopg2.errors.UniqueViolation) could not create unique index "ix_states_meta_entity_id"
DETAIL: Key (entity_id)=(sensor.udm_pro_cpu_utilization_2) is duplicated.
[SQL: ALTER TABLE states_meta ALTER metadata_id TYPE BIGINT]
(Background on this error at: https://sqlalche.me/e/20/gkpj)