Closed salim-b closed 2 weeks ago
It appears that there is an issue with adding TLS-encrypted external PostgreSQL data sources in NocoDB v0.251.3. Here are a few steps and considerations that might help address the problem:
Check for Similar Issues: There was a similar issue reported and resolved by deleting the `/tmp/.s.PGSQL.5432` file [1].
Connection JSON Configuration: Ensure that the `ssl` configuration in the connection JSON is correctly set. Here is an example configuration:
```json
{
  "client": "pg",
  "connection": {
    "host": "<YOUR_HOST>",
    "port": "5432",
    "user": "<YOUR_DB_USER>",
    "password": "<YOUR_DB_PASSWORD>",
    "database": "<YOUR_DB_NAME>",
    "ssl": {
      "require": true,
      "rejectUnauthorized": false,
      "sslMode": "verify-full"
    }
  }
}
```
Make sure the `sslMode` parameter is correctly set to `"verify-full"` or `"require"` as needed [2].
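As a rough illustration of what these values mean downstream, here is how a libpq-style `sslmode` string could be mapped onto the `ssl` option that the node-postgres driver accepts. This is a sketch, not NocoDB's actual code; the helper name `sslFromMode` is invented for this example:

```typescript
// Hypothetical helper (not NocoDB code) illustrating how libpq-style
// sslmode values map onto the `ssl` option understood by node-postgres.
type SslOption = boolean | { rejectUnauthorized: boolean };

function sslFromMode(sslMode: string): SslOption {
  switch (sslMode) {
    case "disable":
      return false; // plain TCP, no TLS
    case "require":
      // TLS is used, but the server certificate is not verified
      return { rejectUnauthorized: false };
    case "verify-ca":
    case "verify-full":
      // TLS with certificate verification against the trusted CAs
      return { rejectUnauthorized: true };
    default:
      throw new Error(`unknown sslmode: ${sslMode}`);
  }
}

console.log(JSON.stringify(sslFromMode("verify-full"))); // {"rejectUnauthorized":true}
```

Note that `require` deliberately skips certificate verification, which is why `rejectUnauthorized: false` is only appropriate for that mode.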
Enable Debug Logs: To gain more insight into the connection error, you can enable specific logs by setting the `NC_AUTOMATION_LOG_LEVEL` environment variable:
- `NC_AUTOMATION_LOG_LEVEL=ALL`: Both error and success logs will be displayed and history of both types of logs will be inserted to meta database. **This option is only available for Enterprise Edition.**
- `NC_AUTOMATION_LOG_LEVEL=ERROR`: only error logs will be displayed and history of error logs will be inserted to meta database.
- `NC_AUTOMATION_LOG_LEVEL=OFF`: No logs will be displayed and no history will be inserted to meta database.
This might help identify the root cause of the issue [3].
Review Recent Changes: Since the issue did not exist in v0.251.2, reviewing the changes made in v0.251.3 related to PostgreSQL connections might provide some clues. Unfortunately, specific details about changes in v0.251.3 are not available in the provided context.
If the issue persists, it might be helpful to compare the connection handling code between v0.251.2 and v0.251.3 to identify any discrepancies.
Addendum: When fixing this bug, you should ideally add an automated test that adds a TLS-encrypted external PGSQL data source, so future regressions are detected earlier (i.e. before a stable release). Current QA is really frustrating, as even minor NocoDB version updates regularly introduce serious regressions (e.g. here or here).
I am experiencing this same issue attempting to connect to an Azure-hosted PostgreSQL database while testing the NocoDB Cloud environment. It consistently reports that the destination Postgres DB rejects the connection because of 'no encryption' when testing, regardless of the JSON configurations I apply.
@salim-b https://github.com/nocodb/nocodb/pull/9071 this seems to be the offender
Indeed, it's related to `SSLUsage.No` always being passed into the function. If you modify the connection JSON and add `SSLUsage.Preffered`, but at the connection level of the JSON, it will validate.
Using the GUI and then adding the line as suggested gives:
```json
{
  "title": "brainy_wren",
  "dataSource": {
    "client": "pg",
    "connection": {
      "host": "<REDACTED>.postgres.database.azure.com",
      "port": "5432",
      "user": "<REDACTED>",
      "password": "<REDACTED>",
      "database": "<REDACTED>"
    },
    "searchPath": [
      "public"
    ]
  },
  "sslUse": "Preferred",
  "SSLUsage": "Preferred",
  "extraParameters": [],
  "is_private": false
}
```
We are still getting the same error, 'no encryption'. Did I put it in the wrong location? I read the code here (https://github.com/nocodb/nocodb/blob/2f2395fb7268557ebb5fe5f375c35a3b623fffa9/packages/nocodb-sdk/src/lib/connectionConfigUtils.ts#L3) and believe that I have the JSON keys correct, but am not certain about the interactions.
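The suspected regression can be sketched like this. This is a hypothetical, heavily simplified illustration with invented names (it is not NocoDB's actual code), meant only to show why a hard-coded `SSLUsage.No` argument would override whatever the connection JSON requests:

```typescript
// Hypothetical sketch of the suspected bug; all names are invented.
enum SSLUsage {
  No = "No",
  Preferred = "Preferred",
}

interface ConnectionConfig {
  ssl?: Record<string, unknown>;
  sslUse?: string;
}

// Buggy shape: the `usage` argument wins unconditionally, so if the
// caller always passes SSLUsage.No, the ssl block in the JSON is dropped.
function applySsl(config: ConnectionConfig, usage: SSLUsage): ConnectionConfig {
  if (usage === SSLUsage.No) {
    const { ssl, ...rest } = config; // strips the ssl settings
    return rest;
  }
  return config;
}

// Fixed shape: fall back to what the connection JSON itself requests.
function applySslFixed(config: ConnectionConfig, usage: SSLUsage): ConnectionConfig {
  const effective =
    config.sslUse === SSLUsage.Preferred ? SSLUsage.Preferred : usage;
  return applySsl(config, effective);
}
```

Under this reading, adding `"sslUse": "Preferred"` at the right level of the JSON works around the hard-coded `SSLUsage.No`, which matches the behavior described above.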
From NocoDB v0.252.0 onwards, adding an external PostgreSQL data source via the meta API works again.
I can successfully add and use it by feeding the following JSON payload to the `POST /api/v2/meta/bases/{baseId}/sources` API endpoint:
```json
{
  "alias": "External Postgres DB",
  "type": "pg",
  "config": {
    "client": "pg",
    "connection": {
      "sslmode": "verify-full",
      "user": "REDACTED",
      "password": "REDACTED",
      "database": "REDACTED",
      "host": "REDACTED",
      "port": 5432,
      "ssl": {
        "ca": "",
        "cert": "",
        "key": ""
      }
    }
  },
  "inflection_column": "none",
  "inflection_table": "none"
}
```
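For reference, posting such a payload could look like the following sketch (Node 18+ with global `fetch`). The host, base id, and token are placeholders, and the `xc-token` header is an assumption about how your deployment authenticates API calls; adjust all of them for your setup:

```typescript
// Sketch of calling the NocoDB meta API with a source payload.
// Host, base id, and token are placeholders; the `xc-token` header
// is an assumption about your deployment's auth setup.
const host = "https://nocodb.example.com"; // placeholder
const baseId = "pxxxxxxxxxxxxxx";          // placeholder
const url = `${host}/api/v2/meta/bases/${baseId}/sources`;

async function addSource(payload: unknown, token: string): Promise<unknown> {
  const res = await fetch(url, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "xc-token": token, // assumed auth header
    },
    body: JSON.stringify(payload),
  });
  if (!res.ok) throw new Error(`meta API returned ${res.status}`);
  return res.json();
}
```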
Adding the data source via the UI is still impossible; the same error connection is insecure (try using `sslmode=require`) is displayed when hitting Test database connection, regardless of what I fill the SSL options with (I tried all sensible combos, including `"SSLUsage": "Preferred"` as suggested above, I think).
Editing the data source via the UI after it has been added via the API is also impossible. First, the following error is displayed in the bottom right when clicking Edit on the data connection:
Second, the same error connection is insecure (try using `sslmode=require`) is displayed when trying to test the connection (with or without altering any values).
I'm also experiencing this issue. We've been testing NocoDB in some non-production proofs of concept. Encountering bugs like this with something as fundamental as a connection string really undermines my and my team's confidence that it could be a suitable self-hosted replacement for Airtable.
@salim-b #9071 this seems to be the offender
Indeed, it's related to SSLUsage.No being passed on always into the function. If you modify the connection JSON, and add SSLUsage.Preffered, but at the connection level of the JSON, it will validate
@pranavxc Since you're the author of that PR, could you have a look what's going wrong here?
@salim-b Thanks for pointing this out. I'm looking into the issue, and we will add tests for these scenarios soon.
How is this even an issue? Postgres with SSL is pretty much the default in a cloud environment. I've tried with a CA cert; nothing happens. Manual config of the advanced settings just doesn't do anything, and using the Connection URL results in another error. This isn't some obscure data-store integration; it's one of the most common databases. Posting to a REST API to connect a database is a terrible workaround.
We made a fix and created a PR build; can someone help us verify it? Once verified, we will make a quick-fix release ASAP.
```shell
docker run -d -p 8888:8080 nocodb/nocodb-timely:0.255.1-pr-9379-20240829-0952
```
@pranavxc I can confirm that the latest release v0.255.2 which includes that PR fixes this issue. Thanks!
I can confirm it as well... I was able to set up an integration without editing JSON at all. Huzzah!
Please confirm that the bug report does not already exist
Steps to reproduce
Add an external PostgreSQL data source via the NocoDB UI to a server that requires TLS like Neon.tech, e.g. by supplying a connection string like
or
Click Test database connection. The following error is displayed:
The error message is obviously wrong, as I explicitly specified `sslmode=verify-full` (or `sslmode=require`), which is also reflected in the connection JSON (when clicking on SSL & Advanced parameters → Edit connection JSON). The exact same settings work fine on NocoDB v0.251.2 and below, so something must have broken recently in NocoDB's code.
Desired Behavior
Adding an external data source to a PostgreSQL server that requires TLS should work without error as with NocoDB v0.251.2.
(I didn't test other DBMS besides PGSQL, so I don't know whether TLS encryption is broken for them, too.)
Project Details
Node: v20.15.1 Arch: x64 Platform: linux Docker: false RootDB: sqlite3 PackageVersion: 0.251.3
Attachments
Relevant logs excerpt from the NocoDB server: