MaoZ1993 closed this issue 7 months ago
You can do that by using the config/addons/fidi/import_files folder for the data importer: you place the files there, and the cron timing is an add-on setting.
The tricky part is getting authentication stable for unattended imports, which seems to clash with subdomains if you have them.
Sorry, I'm a beginner. Yes, I have read the add-on documentation, but I'm still not sure where I can type in the command below once the config JSON file is ready.
docker run \
--rm \
-v $PWD:/import \
-e FIREFLY_III_ACCESS_TOKEN= \
-e IMPORT_DIR_WHITELIST=/import \
-e FIREFLY_III_URL= \
-e WEB_SERVER=false \
fireflyiii/data-importer:latest
The add-on is programmed to run this command automatically, pointing the import folder to the one mentioned before.
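To make that concrete, what the add-on's cron job effectively does is something like the sketch below. The guard variable and exact wiring are my assumptions for illustration, not the add-on's actual code; only the docker command itself comes from the thread above.

```shell
# Hypothetical sketch of the unattended run the add-on schedules via cron.
# RUN_IMPORT is an illustration-only guard so this is a no-op unless enabled;
# the add-on itself fills in the token and URL from its settings.
IMPORT_DIR=/config/addons_config/fireflyiii_data_importer/import_files

if [ "${RUN_IMPORT:-0}" = "1" ] && command -v docker >/dev/null 2>&1; then
  docker run \
    --rm \
    -v "$IMPORT_DIR":/import \
    -e FIREFLY_III_ACCESS_TOKEN="$FIREFLY_III_ACCESS_TOKEN" \
    -e IMPORT_DIR_WHITELIST=/import \
    -e FIREFLY_III_URL="$FIREFLY_III_URL" \
    -e WEB_SERVER=false \
    fireflyiii/data-importer:latest
else
  echo "import skipped (runs inside the add-on on its cron schedule)"
fi
```

So there is nowhere you need to type the command yourself: dropping files in the import folder is enough, and the schedule comes from the add-on options.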
Thanks, I have put the JSON files into the directory /config/addons_config/fireflyiii_data_importer/import_files, but it is still not importing. My config.yaml remains at the defaults; is there a step I am missing?
Addon Logs:
Starting...
/etc/cont-init.d/00-banner.sh: executing
-----------------------------------------------------------
Add-on: Firefly iii Data Importer
Data importer for Firefly III (separate addon)
-----------------------------------------------------------
Add-on version: version-v1.0.2
You are running the latest version of this add-on.
System: Home Assistant OS 8.5 (amd64 / generic-x86-64)
Home Assistant Core: 2023.2.5
Home Assistant Supervisor: 2023.01.1
-----------------------------------------------------------
Please, share the above information when looking for help
or support in, e.g., GitHub, forums
https://github.com/alexbelgium/hassio-addons
-----------------------------------------------------------
/etc/cont-init.d/00-global_var.sh: executing
CONFIG_LOCATION='/config/addons_config/fireflyiii_data_importer/config.yaml'
FIREFLY_III_ACCESS_TOKEN='eyJ0eXAiOiJKV1QiLCJhbGciOiJSUzI1NiJ9.eyJhdWQiOiIzIiwianRpIjoiYWI1OTA1YmE2NzAzMjM4YWY5YTllNjZmY2ZkMmE2YTcxNjg4MTUwZmJiOWE1NGNkOTNjYzBjNzZhMTc2ZjhlZTc0NmRkNzEzZmI4YmIyMDQiLCJpYXQiOjE2NzAyNDEzODIuMDI1MjI2LCJuYmYiOjE2NzAyNDEzODIuMDI1MjI4LCJleHAiOjE3MDE3NzczODEuOTk0NzQsInN1YiI6IjEiLCJzY29wZXMiOltdfQ.reu90gFmj9YnjBdhwgeDPYDQ5wQuwWVMlLRqJuDb1-wciB1cAKK7rWkL7hW3tAvcrsuwXMR0iyIua5-7vxi1X7oj8r3hAfjKnl_1gwN5lUu4yxuu2OOnq4UyIq1_w1ZOJWswVV0PPtp0-UmsEnHlVae725C4vmB7rvgJ8bE7PXlEHZUiSIJchxZJSbrOJii8abQlIV46691Kd-_7e5k6ISUsTGGiTorMALFKloCGGcFYnTEcxXEgDl6NWS8WVCH0cfyo8OEyvFYLBgmYICMZObdlP0hzTBIZZzCyokcxu8vsEbqdKFeKg1DQC9gWAQwUQrGAY_UpdOej14PoyckzN8XB4XGiGLaJwyDqTz79ncRS_F2F9f2EeB4OQ42R4w8FLhXMtNYSRm9DWWTkFZV5xwQX-3oKCudtjm_eivCzRH3SVmKY3lVFGJGmA9izEFGKW-k-43HJU0O73uAENsjoyKpTKv5CQDYjSJiks-TCocsZ_nUV3mSx5UN0wMTgR7H6FJBourd_NXYTHwOPiMlFfIOZf-hLSEln7YSSaouSjcz9k0I7-B9sX6JYq0M9HkYqdNGQHxNLiFW4SpA3cVgwiBnsdWJU2HY0AlJVebcd-s3eetn-HIdxsP-rRRuzpTuncGmDHeobpKrie6gO8iUBftolVek9hygFE-Sz4lEQI6Q'
FIREFLY_III_URL='http://192.168.2.74:3473'
SPECTRE_APP_ID='NxhsPYgHxV_EYtI6Pi53pn8qJsXK4rgwTwq3fkzOEk0'
SPECTRE_SECRET='_l3YC1-ej5rbHgU5HKZrujNQRiH4J7j5q6bkF0G9ZQs'
Updates='hourly'
Timezone set from Etc/UTC to Australia/Sydney
/etc/cont-init.d/01-custom_script.sh: executing
[16:57:06] INFO: Execute /config/addons_autoscripts/fireflyiii-data-importer.sh if existing
[16:57:06] INFO: ... no script found
/etc/cont-init.d/20-folders.sh: executing
/etc/cont-init.d/90-config_yaml.sh: executing
Using config file found in /config/addons_config/fireflyiii_data_importer/config.yaml
Config file is a valid yaml
[16:57:07] INFO: Starting the app with the variables in /config/addons_config/fireflyiii_data_importer/config.yaml
/etc/cont-init.d/98-fix.sh: executing
Fix for Nordigen
/etc/cont-init.d/99-run.sh: executing
[16:57:07] INFO: hourly updates
Starting periodic command scheduler: cron.
[16:57:07] INFO: Automatic updates were requested. The files in /config/addons_config/fireflyiii_data_importer/import_files will be imported hourly.
[16:57:07] INFO: Please wait while the app is loading !
Now in entrypoint.sh for Firefly III Data Importer
Script: 1.6 (2022-02-17)
User: 'root'
Group: 'root'
Working dir: '/var/www/html'
Build number: 190
Build date: 27-01-2023 05:47:24 CET
Now parsing _FILE variables.
done!
Firefly III data importer v1.0.2
PHP: cli 8.2.1 Linux
Will now run Apache web server:
[Thu Feb 23 16:57:11.026054 2023] [mpm_prefork:notice] [pid 284] AH00163: Apache/2.4.54 (Debian) configured -- resuming normal operations
[Thu Feb 23 16:57:11.026094 2023] [core:notice] [pid 284] AH00094: Command line: 'apache2 -D FOREGROUND'
127.0.0.1 - - [23/Feb/2023:16:58:04 +1100] "GET /robots.txt HTTP/1.1" 200 236 "-" "Firefly III Health Checker/1.0"
127.0.0.1 - - [23/Feb/2023:16:59:05 +1100] "GET /robots.txt HTTP/1.1" 200 236 "-" "Firefly III Health Checker/1.0"
You need to wait. Please share updates after the hour has passed, as per your configuration:
[16:57:07] INFO: hourly updates
Starting periodic command scheduler: cron.
[16:57:07] INFO: Automatic updates were requested. The files in /config/addons_config/fireflyiii_data_importer/import_files will be imported hourly.
[Thu Feb 23 16:57:11.026054 2023] [mpm_prefork:notice] [pid 284] AH00163: Apache/2.4.54 (Debian) configured -- resuming normal operations
[Thu Feb 23 16:57:11.026094 2023] [core:notice] [pid 284] AH00094: Command line: 'apache2 -D FOREGROUND'
127.0.0.1 - - [23/Feb/2023:16:58:04 +1100] "GET /robots.txt HTTP/1.1" 200 236 "-" "Firefly III Health Checker/1.0"
127.0.0.1 - - [23/Feb/2023:16:59:05 +1100] "GET /robots.txt HTTP/1.1" 200 236 "-" "Firefly III Health Checker/1.0"
Is there a way I can trace the previous log? It is flooded by the health-check messages. Once I put the JSON files into the import folder and the auto import triggers, FiDi cannot be opened anymore; it bounces back a "500 Server Error". Restarting the add-on fixes it.
127.0.0.1 - - [24/Feb/2023:20:24:54 +1100] "GET /robots.txt HTTP/1.1" 200 236 "-" "Firefly III Health Checker/1.0"
127.0.0.1 - - [24/Feb/2023:20:25:54 +1100] "GET /robots.txt HTTP/1.1" 200 236 "-" "Firefly III Health Checker/1.0"
127.0.0.1 - - [24/Feb/2023:20:26:54 +1100] "GET /robots.txt HTTP/1.1" 200 236 "-" "Firefly III Health Checker/1.0"
127.0.0.1 - - [24/Feb/2023:20:27:54 +1100] "GET /robots.txt HTTP/1.1" 200 236 "-" "Firefly III Health Checker/1.0"
127.0.0.1 - - [24/Feb/2023:20:28:54 +1100] "GET /robots.txt HTTP/1.1" 200 236 "-" "Firefly III Health Checker/1.0"
127.0.0.1 - - [24/Feb/2023:20:29:54 +1100] "GET /robots.txt HTTP/1.1" 200 236 "-" "Firefly III Health Checker/1.0"
127.0.0.1 - - [24/Feb/2023:20:30:54 +1100] "GET /robots.txt HTTP/1.1" 200 236 "-" "Firefly III Health Checker/1.0"
127.0.0.1 - - [24/Feb/2023:20:31:54 +1100] "GET /robots.txt HTTP/1.1" 200 236 "-" "Firefly III Health Checker/1.0"
127.0.0.1 - - [24/Feb/2023:20:32:54 +1100] "GET /robots.txt HTTP/1.1" 200 236 "-" "Firefly III Health Checker/1.0"
192.168.2.234 - - [24/Feb/2023:20:33:21 +1100] "GET / HTTP/1.1" 500 6834 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/110.0.0.0 Safari/537.36"
127.0.0.1 - - [24/Feb/2023:20:33:54 +1100] "GET /robots.txt HTTP/1.1" 200 236 "-" "Firefly III Health Checker/1.0"
192.168.2.234 - - [24/Feb/2023:20:34:13 +1100] "-" 408 0 "-" "-"
192.168.2.234 - - [24/Feb/2023:20:34:16 +1100] "GET / HTTP/1.1" 500 6834 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/110.0.0.0 Safari/537.36"
Is there a way I can trace the previous log?
The only way I can think of is installing Portainer and checking the container's logs through it. Other than that, try restarting the add-on, and use a stopwatch on your phone so you are ready to come back and check once the hour has passed.
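If you can reach docker directly (e.g. through the Portainer console or SSH on the host), tracing older logs could look roughly like this. The container name is a made-up placeholder; list the real one first.

```shell
# Hypothetical: inspect the add-on container's logs directly, filtering out
# the health-checker noise that floods the Home Assistant log viewer.
FILTER='Firefly III Health Checker'

if command -v docker >/dev/null 2>&1; then
  # Find the actual container name (placeholder below is NOT the real one).
  docker ps --format '{{.Names}}' | grep -i firefly || true
  # Show the last 500 lines, minus health-check requests.
  docker logs --tail 500 addon_fireflyiii_data_importer 2>&1 | grep -v "$FILTER" || true
else
  echo "docker not available in this shell"
fi
```

This is only a sketch of the Portainer approach mentioned above, not a feature of the add-on itself.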
To be honest, I'm having difficulties with the auto-import as well. It used to work fine for me, but FiDi is limited as soon as your subdomains don't match: https://github.com/firefly-iii/firefly-iii/discussions/5998
It's confusing: the auto import seems to work, but the webpage still crashes with a "500 Server Error".
[2023-02-25 20:17:27] production.DEBUG: Going to map data for this line.
[2023-02-25 20:17:27] production.DEBUG: No base url in getBaseUrl(), will return config variable.
[2023-02-25 20:17:27] production.DEBUG: No access token in hasAccessToken(), will return config variable.
[2023-02-25 20:17:27] production.DEBUG: Submitting to Firefly III: {"apply_rules":true,"error_if_duplicate_hash":true,"transactions":[{"type":"deposit","date":"2023-02-17 00:00:00","amount":"2002","description":"Salary CHINA UNICOM (AU","order":0,"currency_code":"AUD","tags":["normal","posted","paycheck"],"category_name":"paycheck","external_id":"948944877880613452","interal_reference":"910991635024714271","notes":"","destination_id":13,"source_name":"(unknown source account)"}]}
[2023-02-25 20:17:27] production.ERROR: Submission error: 0 ["Duplicate of transaction #1486."]
[2023-02-25 20:17:27] production.DEBUG: This is a duplicate transaction error
[2023-02-25 20:17:27] production.DEBUG: Add error on index #0 (line no. 1): transactions.0.description: Duplicate of transaction #1486. (original value: "Salary CHINA UNICOM (AU")
[2023-02-25 20:17:27] production.DEBUG: Now in App\Services\Shared\Import\Status\SubmissionStatusManager::storeSubmissionStatus(conv-tBRB4uJtdxC3): submission_running
[2023-02-25 20:17:27] production.ERROR: transactions.0.description: Duplicate of transaction #1486. (original value: "Salary CHINA UNICOM (AU")
[2023-02-25 20:17:27] production.DEBUG: Group is empty, may not have been stored correctly.
[2023-02-25 20:17:27] production.DEBUG: Now submitting transaction 2/6
[2023-02-25 20:17:27] production.DEBUG: Duplicate detection method is "classic", so this method is skipped (return true).
[2023-02-25 20:17:27] production.DEBUG: Transaction #2 is unique.
[2023-02-25 20:17:27] production.DEBUG: Going to map data for this line.
[2023-02-25 20:17:27] production.DEBUG: No base url in getBaseUrl(), will return config variable.
[2023-02-25 20:17:27] production.DEBUG: No access token in hasAccessToken(), will return config variable.
[2023-02-25 20:17:27] production.DEBUG: Submitting to Firefly III: {"apply_rules":true,"error_if_duplicate_hash":true,"transactions":[{"type":"withdrawal","date":"2023-02-18 00:00:00","amount":"1000.00000000000000","description":"Transfer to xx6176 CommBank app","order":0,"currency_code":"AUD","tags":["transfer","posted","transfer"],"category_name":"transfer","external_id":"949504922481597137","interal_reference":"910991635024714271","notes":"","source_id":13,"destination_name":"(unknown destination account)"}]}
[2023-02-25 20:17:27] production.ERROR: Submission error: 0 ["Duplicate of transaction #1495."]
[2023-02-25 20:17:27] production.DEBUG: This is a duplicate transaction error
[2023-02-25 20:17:27] production.DEBUG: Add error on index #1 (line no. 2): transactions.0.description: Duplicate of transaction #1495. (original value: "Transfer to xx6176 CommBank app")
[2023-02-25 20:17:27] production.DEBUG: Now in App\Services\Shared\Import\Status\SubmissionStatusManager::storeSubmissionStatus(conv-tBRB4uJtdxC3): submission_running
[2023-02-25 20:17:27] production.ERROR: transactions.0.description: Duplicate of transaction #1495. (original value: "Transfer to xx6176 CommBank app")
[2023-02-25 20:17:27] production.DEBUG: Group is empty, may not have been stored correctly.
[2023-02-25 20:17:27] production.DEBUG: Now submitting transaction 3/6
[2023-02-25 20:17:27] production.DEBUG: Duplicate detection method is "classic", so this method is skipped (return true).
[2023-02-25 20:17:27] production.DEBUG: Transaction #3 is unique.
[2023-02-25 20:17:27] production.DEBUG: Going to map data for this line.
[2023-02-25 20:17:27] production.DEBUG: No base url in getBaseUrl(), will return config variable.
[2023-02-25 20:17:27] production.DEBUG: No access token in hasAccessToken(), will return config variable.
[2023-02-25 20:17:27] production.DEBUG: Submitting to Firefly III: {"apply_rules":true,"error_if_duplicate_hash":true,"transactions":[{"type":"deposit","date":"2023-02-20 00:00:00","amount":"209","description":"Direct Credit 511037 CHINA UNICOM (AU","order":0,"currency_code":"AUD","tags":["normal","posted","phone"],"category_name":"phone","external_id":"951131824728316750","interal_reference":"910991635024714271","notes":"","destination_id":13,"source_name":"(unknown source account)"}]}
[2023-02-25 20:17:27] production.ERROR: Submission error: 0 ["Duplicate of transaction #1515."]
[2023-02-25 20:17:27] production.DEBUG: This is a duplicate transaction error
[2023-02-25 20:17:27] production.DEBUG: Add error on index #2 (line no. 3): transactions.0.description: Duplicate of transaction #1515. (original value: "Direct Credit 511037 CHINA UNICOM (AU")
[2023-02-25 20:17:27] production.DEBUG: Now in App\Services\Shared\Import\Status\SubmissionStatusManager::storeSubmissionStatus(conv-tBRB4uJtdxC3): submission_running
[2023-02-25 20:17:27] production.ERROR: transactions.0.description: Duplicate of transaction #1515. (original value: "Direct Credit 511037 CHINA UNICOM (AU")
[2023-02-25 20:17:27] production.DEBUG: Group is empty, may not have been stored correctly.
[2023-02-25 20:17:27] production.DEBUG: Now submitting transaction 4/6
[2023-02-25 20:17:27] production.DEBUG: Duplicate detection method is "classic", so this method is skipped (return true).
[2023-02-25 20:17:27] production.DEBUG: Transaction #4 is unique.
[2023-02-25 20:17:27] production.DEBUG: Going to map data for this line.
[2023-02-25 20:17:27] production.DEBUG: No base url in getBaseUrl(), will return config variable.
[2023-02-25 20:17:27] production.DEBUG: No access token in hasAccessToken(), will return config variable.
[2023-02-25 20:17:27] production.DEBUG: Submitting to Firefly III: {"apply_rules":true,"error_if_duplicate_hash":true,"transactions":[{"type":"deposit","date":"2023-02-20 00:00:00","amount":"384.98","description":"Direct Credit 511037 CHINA UNICOM (AU","order":0,"currency_code":"AUD","tags":["normal","posted","phone"],"category_name":"phone","external_id":"951131824745093967","interal_reference":"910991635024714271","notes":"","destination_id":13,"source_name":"(unknown source account)"}]}
[2023-02-25 20:17:27] production.ERROR: Submission error: 0 ["Duplicate of transaction #1516."]
[2023-02-25 20:17:27] production.DEBUG: This is a duplicate transaction error
[2023-02-25 20:17:27] production.DEBUG: Add error on index #3 (line no. 4): transactions.0.description: Duplicate of transaction #1516. (original value: "Direct Credit 511037 CHINA UNICOM (AU")
[2023-02-25 20:17:27] production.DEBUG: Now in App\Services\Shared\Import\Status\SubmissionStatusManager::storeSubmissionStatus(conv-tBRB4uJtdxC3): submission_running
[2023-02-25 20:17:27] production.ERROR: transactions.0.description: Duplicate of transaction #1516. (original value: "Direct Credit 511037 CHINA UNICOM (AU")
[2023-02-25 20:17:27] production.DEBUG: Group is empty, may not have been stored correctly.
[2023-02-25 20:17:27] production.DEBUG: Now submitting transaction 5/6
[2023-02-25 20:17:27] production.DEBUG: Duplicate detection method is "classic", so this method is skipped (return true).
[2023-02-25 20:17:27] production.DEBUG: Transaction #5 is unique.
[2023-02-25 20:17:27] production.DEBUG: Going to map data for this line.
[2023-02-25 20:17:27] production.DEBUG: No base url in getBaseUrl(), will return config variable.
[2023-02-25 20:17:27] production.DEBUG: No access token in hasAccessToken(), will return config variable.
[2023-02-25 20:17:27] production.DEBUG: Submitting to Firefly III: {"apply_rules":true,"error_if_duplicate_hash":true,"transactions":[{"type":"withdrawal","date":"2023-02-23 00:00:00","amount":"884.49000000000000","description":"Transfer To Fanyu Meng NetBank CUAU Reimbursement","order":0,"currency_code":"AUD","tags":["transfer","posted","transfer"],"category_name":"transfer","external_id":"953293777039006073","interal_reference":"910991635024714271","notes":"","source_id":13,"destination_name":"(unknown destination account)"}]}
[2023-02-25 20:17:28] production.ERROR: Submission error: 0 ["Duplicate of transaction #1528."]
[2023-02-25 20:17:28] production.DEBUG: This is a duplicate transaction error
[2023-02-25 20:17:28] production.DEBUG: Add error on index #4 (line no. 5): transactions.0.description: Duplicate of transaction #1528. (original value: "Transfer To Fanyu Meng NetBank CUAU Reimbursement")
[2023-02-25 20:17:28] production.DEBUG: Now in App\Services\Shared\Import\Status\SubmissionStatusManager::storeSubmissionStatus(conv-tBRB4uJtdxC3): submission_running
[2023-02-25 20:17:28] production.ERROR: transactions.0.description: Duplicate of transaction #1528. (original value: "Transfer To Fanyu Meng NetBank CUAU Reimbursement")
[2023-02-25 20:17:28] production.DEBUG: Group is empty, may not have been stored correctly.
[2023-02-25 20:17:28] production.DEBUG: Now submitting transaction 6/6
[2023-02-25 20:17:28] production.DEBUG: Duplicate detection method is "classic", so this method is skipped (return true).
[2023-02-25 20:17:28] production.DEBUG: Transaction #6 is unique.
[2023-02-25 20:17:28] production.DEBUG: Going to map data for this line.
[2023-02-25 20:17:28] production.DEBUG: No base url in getBaseUrl(), will return config variable.
[2023-02-25 20:17:28] production.DEBUG: No access token in hasAccessToken(), will return config variable.
[2023-02-25 20:17:28] production.DEBUG: Submitting to Firefly III: {"apply_rules":true,"error_if_duplicate_hash":true,"transactions":[{"type":"deposit","date":"2023-02-18 00:00:00","amount":"1000","description":"Transfer from xx6863 CommBank app","order":0,"currency_code":"AUD","tags":["transfer","posted","transfer"],"category_name":"transfer","external_id":"950394469205679083","interal_reference":"910991635611916917","notes":"","destination_id":14,"source_name":"(unknown source account)"}]}
[2023-02-25 20:17:28] production.ERROR: Submission error: 0 ["Duplicate of transaction #1496."]
[2023-02-25 20:17:28] production.DEBUG: This is a duplicate transaction error
[2023-02-25 20:17:28] production.DEBUG: Add error on index #5 (line no. 6): transactions.0.description: Duplicate of transaction #1496. (original value: "Transfer from xx6863 CommBank app")
[2023-02-25 20:17:28] production.DEBUG: Now in App\Services\Shared\Import\Status\SubmissionStatusManager::storeSubmissionStatus(conv-tBRB4uJtdxC3): submission_running
[2023-02-25 20:17:28] production.ERROR: transactions.0.description: Duplicate of transaction #1496. (original value: "Transfer from xx6863 CommBank app")
[2023-02-25 20:17:28] production.DEBUG: Group is empty, may not have been stored correctly.
[2023-02-25 20:17:28] production.INFO: Done submitting 6 transactions to your Firefly III instance.
[2023-02-25 20:17:28] production.DEBUG: Now in setSubmissionStatus(submission_done)
[2023-02-25 20:17:28] production.DEBUG: Found "conv-tBRB4uJtdxC3" in the session
[2023-02-25 20:17:28] production.DEBUG: Now in startOrFindJob(conv-tBRB4uJtdxC3)
[2023-02-25 20:17:28] production.DEBUG: Try to see if file exists for job conv-tBRB4uJtdxC3.
[2023-02-25 20:17:28] production.DEBUG: Status file exists for job conv-tBRB4uJtdxC3.
[2023-02-25 20:17:28] production.DEBUG: Now in App\Services\Shared\Import\Status\SubmissionStatusManager::storeSubmissionStatus(conv-tBRB4uJtdxC3): submission_done
Import index 0: transactions.0.description: Duplicate of transaction #1486. (original value: "Salary CHINA UNICOM (AU")
Import index 1: transactions.0.description: Duplicate of transaction #1495. (original value: "Transfer to xx6176 CommBank app")
Import index 2: transactions.0.description: Duplicate of transaction #1515. (original value: "Direct Credit 511037 CHINA UNICOM (AU")
Import index 3: transactions.0.description: Duplicate of transaction #1516. (original value: "Direct Credit 511037 CHINA UNICOM (AU")
Import index 4: transactions.0.description: Duplicate of transaction #1528. (original value: "Transfer To Fanyu Meng NetBank CUAU Reimbursement")
Import index 5: transactions.0.description: Duplicate of transaction #1496. (original value: "Transfer from xx6863 CommBank app")
Done!
[2023-02-25 20:17:28] production.DEBUG: Created event ImportedTransactions with filtering (2)
[2023-02-25 20:17:28] production.DEBUG: Array contains 0 line(s)
[2023-02-25 20:17:28] production.DEBUG: Array contains 0 line(s)
[2023-02-25 20:17:28] production.DEBUG: Array contains 6 line(s)
[2023-02-25 20:17:28] production.DEBUG: Now in sendReportOverMail
[2023-02-25 20:17:28] production.INFO: No mailer configured, will not mail.
127.0.0.1 - - [25/Feb/2023:20:17:33 +1100] "GET /robots.txt HTTP/1.1" 200 236 "-" "Firefly III Health Checker/1.0"
192.168.2.234 - - [25/Feb/2023:20:18:27 +1100] "GET / HTTP/1.1" 500 6834 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/110.0.0.0 Safari/537.36"
192.168.2.234 - - [25/Feb/2023:20:18:27 +1100] "GET /favicon.ico HTTP/1.1" 200 280 "http://homeassistant.local:3474/" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/110.0.0.0 Safari/537.36"
192.168.2.234 - - [25/Feb/2023:20:18:28 +1100] "GET / HTTP/1.1" 500 6834 "-" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/110.0.0.0 Safari/537.36"
127.0.0.1 - - [25/Feb/2023:20:18:33 +1100] "GET /robots.txt HTTP/1.1" 200 236 "-" "Firefly III Health Checker/1.0"
I've seen similar behaviour too. I think it is because there is some sort of authentication bound to the IPs/hostnames of the connections. If the import occurs automatically, it is a docker-to-docker connection. If it is done manually through the website, it is a your-computer-to-Firefly connection.
I can't tell much more than that; I think it is an upstream Firefly issue rather than the HA add-on implementation. You (we?) might have more luck opening an issue ticket directly in the original FiDi repository.
Hi, have you made it work? I'm setting up Firefly III myself (which will help with support!) and I'm currently puzzled by the auto-import behaviour.
Sure. Since my last update I've been learning more about how to run it. My stable usage of the auto-import (FiDi for short) is with the following parameters in the add-on:
FIREFLY_III_URL: http://your-fireflyiii-container-id:8080
NORDIGEN_ID: your personal gocardless id
NORDIGEN_KEY: your personal gocardless secret key
Updates: daily
silent: false
FIREFLY_III_ACCESS_TOKEN: generate a Personal Access Token in your Firefly3 account
Note that above I use a Personal Access Token, so you do NOT need to fill in anything for FF3_Client_id, nor generate an OAuth client. I only use that for multi-account setups, which are more advanced to set up (https://github.com/orgs/firefly-iii/discussions/5998#discussioncomment-5629623).
and then in the config.yaml file in the addons folder:
TZ: Europe/Madrid
MAIL_MAILER: log
VERIFY_TLS_SECURITY: false
VANITY_URL: https://subdomainforyourfirefly3container.domain.dom -> I'm running a reverse proxy with nginx. That's the URL I ALWAYS use when personally navigating to my Firefly III instance. This is important because when you navigate to FiDi in your browser, it reads your active session on that URL from cookies; i.e. if you navigate to FF3 through IP:port, manual import might fail.
TRUSTED_PROXIES: 172.30.x.y/24 -> your internal docker network, mostly needed for the reverse proxy connection to work.
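Put together, writing those settings into the add-on's config.yaml could look like this sketch. All values are placeholders for your own setup, and /tmp stands in for the real config location.

```shell
# Sketch: assembling the config.yaml entries discussed above.
# On HA this file lives at
# /config/addons_config/fireflyiii_data_importer/config.yaml;
# /tmp is used here only so the sketch is runnable anywhere.
CONF=/tmp/config.yaml

cat > "$CONF" <<'EOF'
TZ: Europe/Madrid
MAIL_MAILER: log
VERIFY_TLS_SECURITY: false
VANITY_URL: https://firefly.example.com
TRUSTED_PROXIES: 172.30.0.0/24
EOF

grep -q 'VANITY_URL' "$CONF" && echo "config written"
```

The key point is that VANITY_URL must match the URL you actually browse to, as explained above.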
Thanks very much, I'll test and update the documentation. Currently I'm stuck, as it seems the app expects a config ".json" file matching the .csv file, so I'll have to investigate a bit further.
the app expects a config ".json" file matching the .csv file.
That's correct. On your first run you need to proceed manually: follow the guided UI steps to set up your 'configuration' file. Once done, at the last steps, you'll see an extra button saying "download configuration file"; that's your .json.
https://docs.firefly-iii.org/how-to/data-importer/import/json/
After that, assuming you manually run the import and confirm it runs fine and the settings are as expected (things like the mapping of file columns to Firefly III fields, or the date format), you can just add the .json file to the import folder of the add-on, together with an updated CSV in the same folder, and the add-on will auto-import it for you. The filename should be the same, just with a different file extension.
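The expected layout can be sketched like this: one .json configuration next to one .csv with the same basename. The filename is invented for illustration, and a temporary directory stands in for the add-on's import folder.

```shell
# Sketch of the import folder layout the add-on expects.
# A temp dir stands in for
# /config/addons_config/fireflyiii_data_importer/import_files.
IMPORT_DIR=$(mktemp -d)

touch "$IMPORT_DIR/commbank-2023-02.csv"    # the bank export
touch "$IMPORT_DIR/commbank-2023-02.json"   # the config downloaded from the manual run

# The importer pairs files by basename; verify each csv has its json twin.
for csv in "$IMPORT_DIR"/*.csv; do
  json="${csv%.csv}.json"
  [ -f "$json" ] && echo "ok: $(basename "$csv") + $(basename "$json")"
done
```

For each new month you would drop in a fresh CSV with a matching copy of the JSON, and the hourly/daily cron run picks the pair up.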
Thanks very much! It's working quite fine now
I thought I could throw in a bunch of CSV files and it would just import everything; thanks for your explanations. In the end, it's more logical too: once in routine mode (with all history uploaded; my bank only allowed a 1-year range max), there should be far fewer files to import automatically.
Perhaps I'll take the opportunity to update the add-on to use the new /config folder as per current Home Assistant guidelines, instead of the current /config/addons_config/fireflyiii_data_importer folder. I'll post here when I do that so that everyone is aware of the change.
If your bank is supported by old Nordigen (now GoCardless) I recommend setting it up so that you won't need to download anything: it will automagically connect every time, read the latest X days of data, and auto-import it. Then you just need to log in to FF3, check the latest tag on your imported data, and correct as needed, or set up automated rules to post-process it.
I only use CSV for banks without an API (such as Edenred, which isn't officially 'a bank', or American Express credit card balances).
PS: Updating the add-on for the new way of working of HA would definitely be interesting to ensure it remains "future proof".
PS2: FF3 supports different databases. Right now I'm using MariaDB, which is the default for the add-on setup, but that's the same one I use for the main HA data. I recently had to recover from a total loss of data through a backup, and with that setup I couldn't access my data until I had HA ready again. It also bundles the backups of both together, which I would prefer to keep separated for resiliency (say, daily backups of FF3 data but only monthly backups of HA).
Thanks very much! Indeed my bank looks supported, I'll try that!
Regarding the different database types, it is normally an option of the add-on, although I have never tried another one... It should even support SQLite, which would therefore be embedded in the add-on. As I have a Postgres add-on, that should be a possible option. Usually I use a weekly backup based on a Google Drive add-on (https://github.com/sabeechen/hassio-google-drive-backup) that allows customization of schedules, and therefore lets me be back up very quickly in case of issues! I guess if less than a week of data is lost, the Nordigen link would allow recovering it quickly?
EDIT: strangely, Nordigen supports my main account but not my savings account; anyway, it is already super helpful.
I've pushed a new version with the new /config folder (it can be customized in the options); all files should be transferred automatically.
Hi @alexbelgium, this latest update seems to have broken the import config folder for me; am I doing something wrong? I had my JSON files saved in /homeassistant/addons_config/fireflyiii_data_importer/ before, and now, after the update, the Firefly importer can't see any files (I did download them beforehand just in case, phew).
Any pointers?
Hi, all files have been migrated to the new place according to Home Assistant's new way of working! You can access them at the folder /addon_configs/xxx-firelyiii_data_updater, and you should find all your previous files there too!
Oh my god, actually you're right: there is a bug that deletes all configuration files instead of copying them! I'll pull the version ASAP and publish a new one.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
Hi, is everything working as it should, so I can close this?
Hi @alexbelgium. Yeah, I rolled back to the old version, restored the config files, and upgraded again; all files copied over this time. Thanks!!
Hi there
Is there a way to upload a pre-configured JSON file for automation purposes?
https://docs.firefly-iii.org/data-importer/usage/command_line/
The official documentation requires a particular mounted folder for the Docker installation, but in HASS we have no such option...
Thanks; if any additional info is required, I'm happy to help.