khushbu-elastic opened this issue 8 months ago
@danajuratoni Redis and Notion are not available as connectors in the 8.13.0 (latest) Elastic deployment BC1, so we will not be able to verify them there.
@danajuratoni @DianaJourdan In the 8.13.0 Elastic deployment BC2, an AuthorizationException error occurs while indexing with the Redis and Notion connectors when they are configured as connector clients. The error log is attached here: AuthorizationException.txt
PS: Indexing with these connectors succeeds when they are run as customized connectors.
For the Jira and Confluence connectors in the 8.13.0 Elastic deployment BC2, the overview page shows a "Documents added" count of 0 for the content sync, even though documents were indexed in Elasticsearch for that sync.
@danajuratoni @DianaJourdan
We have validated the following connectors, recorded demos for them, and uploaded the recordings to the drive:
- Dropbox
- Google Cloud Storage
- Azure Blob Storage
- Confluence Data Center
- Jira Data Center
- Redis (tested in BC4)
- Notion
We will keep updating this list as the remaining connectors are done.
NOTE:
For the other connectors: as @kajal-elastic has mentioned, there is an issue with the Redis and Notion connectors, and we are unable to verify the QA checklist for them. Please provide a solution. There is also an issue with the document counts for Jira and Confluence; please refer to the comments above for details.
For "Enable scheduling for frequency = every minute and save schedule; refresh the page and verify that the changes were stored": the scheduling UI only offers a frequency of every 5 minutes, so we instead verified it with a schedule set to a specific minute of every hour.
For "Verify that on the index list page index information is updated properly, showing the expected number of documents and new index size": the index size is not displayed on the Elasticsearch Indices page, so we verified it from the Index Management page, where it is shown.
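As an alternative to the Index Management page, the document count and store size can be read directly from Elasticsearch via the standard cat indices API (the index name below is a placeholder):

```
GET _cat/indices/<index-name>?v&h=index,docs.count,store.size
```

This returns one row per matching index with the document count and on-disk size, which can be compared against what the UI reports.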
The demos shared above for these connectors were recorded in BC2 of the 8.13.0 Elastic deployment.
@kajal-elastic
> In the 8.13.0 Elastic deployment BC2, an AuthorizationException error occurs while indexing with the Redis and Notion connectors when they are configured as connector clients. Error log: AuthorizationException.txt. PS: Indexing with these connectors succeeds when they are run as customized connectors.
Can you try this on BC3+? If the error persists, can you try disabling `native_connector_api_keys` (the feature behind the API keys with the `search-` prefix) for the connector? The following script would work:
```
POST .elastic-connectors/_update/<connector_id>
{
  "doc": {
    "features": {
      "native_connector_api_keys": {
        "enabled": false
      }
    }
  }
}
```
If the error persists, let me know.
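To confirm the flag was applied, the connector document can be fetched back with a standard document GET (the connector id is a placeholder; the field path mirrors the update above):

```
GET .elastic-connectors/_doc/<connector_id>
```

In the response, `_source.features.native_connector_api_keys.enabled` should now be `false`.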
@navarone-feekery Thanks for the update. This issue now appears to be resolved for the Redis and Notion connectors in BC3.
@danajuratoni @DianaJourdan We have uploaded the Notion demo to the Google Drive; the link is in the comment above. Please review it and update this issue's status accordingly.
Non-regression QA
Note: always test with the latest Build Candidate on Elastic Cloud, using the full Elastic stack
[ ] Start the whole stack from scratch and navigate to Enterprise Search
[ ] Check that no indices are shown in the top level Indices list
[ ] Click on "Create an Elasticsearch index" - a new page opens where you can select an ingestion method
[ ] Choose Connector -> Use a connector
[ ] Choose the connector you want to test and Continue
[ ] Create an index with a valid name and Universal language
[ ] Connector name and description are editable on the Configurations page
[ ] Connector can be deleted from the Indices page
[ ] Connector can be deleted from the Indices page and it can be recreated with the same name after
[ ] Pull the connectors repository and run `make install`, but do not run the connector yet
[ ] Verify that you are redirected to the "configuration" page, where you can create an API key and copy the connector id / the whole config section into the connector
[ ] Update the connector configuration with the api_key and connector_id, choose a service_type to test, and set it in the config
[ ] Start the connector by running `make run` - verify that it starts and does not actually do anything yet
[ ] Wait for the Kibana page with the connector configuration to update and verify that it is now possible to edit the connector configuration
[ ] Edit and save connector configuration, then reload the page and verify that configuration is properly saved
[ ] Click on "Set schedule and sync" and verify that you're redirected to the scheduling tab
[ ] Enable scheduling for frequency = every minute and save schedule; refresh the page and verify that the changes were stored
[ ] Switch to the connector and wait for a minute or two, verify that connector starts to ingest data
[ ] Verify that the data from the connector appears in the expected index
[ ] Verify that on the index list page index information is updated properly, showing expected number of documents and new index size
[ ] Verify that on the connector overview page "Document Count" is updated to reflect the number of documents in the index
[ ] Verify that you can see the ingested documents in the `documents` tab
[ ] Verify that the index mappings are correct on the `index mappings` tab
[ ] Record a short demo showing the connector's configuration and the documents that were ingested
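The document-count and mapping checks above can also be spot-checked from Kibana Dev Tools with standard Elasticsearch APIs (the index name is a placeholder for the index the connector writes to):

```
GET <index-name>/_count

GET <index-name>/_mapping
```

`_count` should match the "Document Count" shown on the connector overview page, and `_mapping` shows the field mappings that the `index mappings` tab renders.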