The issue is that right now the postgres backend (I assume all backends do the same) fetches all the users from the database and filters them in memory. There are two ways we could approach this:
1) Add more logic when constructing the DB query: if we are searching for a user by `cn`, we should let the database do the filtering and take advantage of its existing indexes
2) Add a jsonb index that allows us to perform more complex searches on arbitrary attributes
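To illustrate option (1), here is a minimal sketch of pushing the `cn` filter into the SQL query instead of filtering rows in memory. The table and column names (`users`, `name`, `uidnumber`) are assumptions for illustration, not glauth's actual schema:

```go
package main

import "fmt"

// buildUserQuery sketches approach (1): when the LDAP search filter
// targets cn, emit a parameterized WHERE clause so Postgres can use an
// index on the name column, instead of returning every row for
// in-memory filtering.
func buildUserQuery(cn string) (string, []interface{}) {
	if cn != "" {
		return "SELECT id, name, uidnumber FROM users WHERE name = $1",
			[]interface{}{cn}
	}
	// No cn in the filter: fall back to the current full scan.
	return "SELECT id, name, uidnumber FROM users", nil
}

func main() {
	q, args := buildUserQuery("alice")
	fmt.Println(q, args)
}
```

Using a parameterized query (`$1`) rather than string concatenation also avoids SQL injection through the LDAP filter value.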
Option (1) is easier to implement and should yield a speedup in most cases. Option (2) will be harder to implement, as we would have to check the database version, and it may not be used that much.
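For option (2), the idea would be a GIN index over a jsonb column of custom attributes, which lets Postgres answer containment queries (the `@>` operator) without a full scan. The column name `custattr` and the table layout below are assumptions for illustration only:

```go
package main

import "fmt"

// Sketch of approach (2): a GIN index on a hypothetical custattr jsonb
// column. jsonb_path_ops produces a smaller index that still supports
// the @> containment operator.
const createIndex = `CREATE INDEX IF NOT EXISTS users_custattr_gin
  ON users USING GIN (custattr jsonb_path_ops);`

// buildAttrQuery matches rows whose custattr jsonb contains the given
// attribute/value pair; the GIN index above accelerates this predicate.
func buildAttrQuery(attr, value string) (string, string) {
	// %q produces a double-quoted, escaped string, yielding valid jsonb
	// like {"department": "engineering"}.
	param := fmt.Sprintf(`{%q: %q}`, attr, value)
	return "SELECT id, name FROM users WHERE custattr @> $1", param
}

func main() {
	q, p := buildAttrQuery("department", "engineering")
	fmt.Println(createIndex)
	fmt.Println(q, p)
}
```

This is where the version check mentioned above comes in: jsonb itself requires Postgres 9.4+, and richer jsonpath queries require Postgres 12+.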
Once we have more real-world usage of glauth with a backend, we will need to evaluate whether to refactor the database logic.
This is a follow-up to the discussion at https://github.com/canonical/glauth-k8s-operator/issues/55#issuecomment-2283271731.