alexjustesen / speedtest-tracker

Speedtest Tracker is a self-hosted internet performance tracking application that runs speedtest checks against Ookla's Speedtest service.
https://speedtest-tracker.dev/
MIT License
2.7k stars 99 forks

[Feature Request] Cron Schedule from specific hour #552

Closed ShlomiD83 closed 5 months ago

ShlomiD83 commented 1 year ago

Hi, is it possible to configure a cron schedule that starts from a specific hour (e.g. 0 12/4 * * *)? At the moment it's not possible; an error pops up about a syntax error.

Thanks.

hackeysack01 commented 1 year ago

This is called "stepping" in the vixie cron docs: https://github.com/vixie/cron/blob/master/Documentation/Features.md

begna112 commented 11 months ago

Just wanted to add that this should also allow formats like 4/10 * * * * to run every 10 minutes, beginning at the 4th minute of the hour.

ArthurMitchell42 commented 10 months ago

Hi,

Is it possible to configure a cron schedule that starts from a specific hour (e.g. 0 12/4 * * *)?

At the moment it's not possible; an error pops up about a syntax error.

Thanks.

Plus one on this, please.

alexjustesen commented 10 months ago

Hi, is it possible to configure a cron schedule that starts from a specific hour (e.g. 0 12/4 * * *)? At the moment it's not possible; an error pops up about a syntax error.

Thanks.

@ShlomiD83 circling back on this one since cron came up on another issue recently. Could you do 0 */4 * * * to achieve every 4 hours?

ShlomiD83 commented 10 months ago

That's the way I have it set now. The request was to be able to run the test from a specific time and then every 4 hours after that. For example, start at 2 AM, count 4 hours from there, and test again.

ArthurMitchell42 commented 10 months ago

Hi Alex, the issue with starting at midnight and at multiples of hours after that is that the results are heavily affected by quite a few background tasks like backup processes, downloads, etc. These all start every 4 hours for me and can take up to 1.5 hours to finish. The ideal way around this would be to run the speed test every 4 hours but offset by 2 hours, so it runs at 02:00, 06:00, etc. Thanks, Arthur

alexjustesen commented 10 months ago

Thanks @ArthurMitchell42, I didn't really consider that use case but it makes complete sense.

I either don't have as much data as you or I'm spoiled with gigabit symmetric.

ArthurMitchell42 commented 10 months ago

Oh wow, you are lucky. The best I can hope for is about 73 Mb/s down and 19 up, even on a good day. πŸ˜€

alexjustesen commented 10 months ago

Note for later: the package below seems to have better support for testing cron expressions for the use case above, and it also supports time zones, which is a requirement of #929. I'll bring it into that PR and test it as a viable replacement.

https://github.com/poliander/cron

alexjustesen commented 10 months ago

@ShlomiD83 and @ArthurMitchell42, that other package ended up failing the validation criteria. While not ideal, for the time being you could pass the specific hours you want the tests to run, like so: 0 2,6,10,14,18,22 * * *
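For anyone generating such an explicit hours list, the arithmetic is just a stepped range. A minimal sketch (this `offset_hours` helper is hypothetical, not part of Speedtest Tracker):

```python
# Hypothetical helper: build the explicit hours field for "every N hours
# starting at hour H", e.g. start=2, every=4 -> "2,6,10,14,18,22".
def offset_hours(start: int, every: int) -> str:
    return ",".join(str(h) for h in range(start, 24, every))

print(f"0 {offset_hours(2, 4)} * * *")  # 0 2,6,10,14,18,22 * * *
```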

ArthurMitchell42 commented 10 months ago

Thanks, 2,6,10,14,18,22 works fine. Arthur

armond-avanes commented 7 months ago

@alexjustesen Is there any ETA for supporting x/y field format?

alexjustesen commented 7 months ago

@alexjustesen Is there any ETA for supporting x/y field format?

Not at the moment; I haven't been able to find a package that'll help with validating expressions in the formats above.

lnlyssg commented 6 months ago

I wonder if the issues reported in #1260, #1328, and #1319 are all linked to the fact that everyone is hitting the servers on the hour, every 30 minutes, etc.?

ykktrcb7 commented 6 months ago

#1328 happens all the time unfortunately 😞

ArthurMitchell42 commented 5 months ago

Regarding the low-rate issues, I wonder if adding a "random" variance on top of the cron schedule might help. For example, with a test every 4 hours, adding a pseudo-random offset of up to 60 minutes means the test would trigger at some point during the defined hour and would be substantially less likely to cause a storm of activity at the servers. This would be particularly effective if the randomness went down to seconds rather than whole minutes. What do we think?
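As a rough sketch of that idea (names and numbers are hypothetical; this is not how Speedtest Tracker actually schedules jobs), the jitter would sit between the cron trigger and the test itself:

```python
import random
import time

# Hypothetical wrapper: the cron tick fires on schedule, then we sleep a
# pseudo-random number of seconds before starting the test, so that many
# installs sharing the same schedule don't hit the test servers at once.
def run_with_jitter(run_test, max_jitter_seconds: int = 3600) -> None:
    time.sleep(random.randint(0, max_jitter_seconds))
    run_test()
```

With `max_jitter_seconds=3600`, a `0 */4 * * *` schedule would start each test at a random second within the scheduled hour, which is the per-second variance suggested above.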

hackeysack01 commented 5 months ago

Or just comply, as advertised, with the cron syntax that goes back to the 1980s.

This half-assed implementation caused me to stop using this software.

alexjustesen commented 5 months ago

TL;DR

Use the range syntax to achieve an offset schedule, for example 4-59/5 * * * *.

Deep Dive

Turns out the cron "standard" syntax is anything but consistent. Certain implementations (e.g. Java's) support non-standard uses of characters like /, while others are stricter about which syntax they consider "valid".

As for the syntax 4/5 * * * *: it means "run every 5 minutes, starting at 4 minutes past the hour". So this syntax includes an "offset" for the schedule and would run at :04, :09, :14, :19, etc.

If we use the range syntax 4-59/5 * * * *, it means "run every 5 minutes, on minutes 4 through 59 past the hour". This still includes an offset for the schedule, just less obviously, and would run at the same times: :04, :09, :14, :19, etc.


While I'm using minutes in the examples above, 0 12-23/4 * * * is also a valid expression, using a range for the hours.
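To make the equivalence concrete, here is a minimal sketch (illustrative only, not the app's actual parser) that expands a single cron field, showing that 4/5 and 4-59/5 cover the same minutes:

```python
# Sketch of cron field expansion: handles "*", "a-b", a bare start value,
# and an optional "/step" suffix, for one field of a cron expression.
def expand_field(field: str, lo: int, hi: int) -> list[int]:
    base, _, step = field.partition("/")
    step = int(step) if step else 1
    if base == "*":
        start, end = lo, hi
    elif "-" in base:
        a, b = base.split("-")
        start, end = int(a), int(b)
    else:
        # bare "4/5"-style offset: start at 4 and run to the field maximum
        start, end = int(base), hi
    return list(range(start, end + 1, step))

# The offset form and the range form produce the same minute set:
print(expand_field("4/5", 0, 59) == expand_field("4-59/5", 0, 59))  # True
print(expand_field("4-59/5", 0, 59)[:4])  # [4, 9, 14, 19]
```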

If you have trouble sleeping at night I suggest taking a look at https://en.wikipedia.org/wiki/Cron.

Since we finally have a well-documented resolution that passes validation for implementing schedules "from" a time (aka an offset), I'm going to close this one out.

lnlyssg commented 5 months ago

Could I suggest either a docs/FAQ update and/or removing the link to https://crontab.cronhub.io on the settings page? As soon as you open https://crontab.cronhub.io it presents */5 * * * *, which gives the syntax error.

alexjustesen commented 5 months ago

Could I suggest either a docs/FAQ update and/or removing the link to https://crontab.cronhub.io on the settings page? As soon as you open https://crontab.cronhub.io it presents */5 * * * *, which gives the syntax error.

Done, added to the FAQ and linked back to this issue and the comment above. I'm going to leave the link to cronhub.io because */5 * * * * ("every 5 minutes") is valid; 4/5 * * * * with the offset isn't.

lnlyssg commented 5 months ago

@alexjustesen Would https://crontab.guru be a better link than https://crontab.cronhub.io/? The former gives a warning for 4/5 * * * *: "Non standard! May not work with every cron."

alexjustesen commented 5 months ago

Happy to review a PR, it's probably a better fit.

TheGroundZero commented 3 months ago

Looks like https://github.com/alexjustesen/speedtest-tracker/issues/552#issuecomment-2028532010 no longer functions with the new config?

If I try e.g. 1/6 for hours I get a 500 error. Enabling debugging shows the issue is /6 being considered an invalid option. I had to swap to comma-separated values instead.

lnlyssg commented 3 months ago

It's working OK for me with SPEEDTEST_SCHEDULE=7-59/30 * * * *

TheGroundZero commented 3 months ago

OK, so it appears you need to specify a range. It was unclear to me from the example that a range is required and that the <offset>/<interval> syntax doesn't work.

SPEEDTEST_SCHEDULE=37 1/6 * * * fails with:

InvalidArgumentException Invalid CRON field value 1/6 at position 1

SPEEDTEST_SCHEDULE=37 0-23/6 * * * works fine.

mightywomble commented 2 months ago

Seeing as my bug report was closed, I'll comment here.

I have tried every SPEEDTEST_SCHEDULE in my Docker Compose file; none of them work.

All I'm looking for is a regular speed test that runs hourly. Can anyone help? Do I need to delete and recreate my config folder? I'm not sure this should be this hard.

services:
    speedtest-tracker:
        container_name: speedtest-tracker:latest
        ports:
            - 8080:80
            - 8443:443
        environment:
            - PUID=1000
            - PGID=1000
            - APP_KEY=************
            - DB_CONNECTION=sqlite
            - SPEEDTEST_SCHEDULE=7-59/30 * * * *
        volumes:
            - /opt/docker/speedtest:/config
            - /opt/docker/speedtest//ssl-keys:/config/keys
        image: lscr.io/linuxserver/speedtest-tracker:latest
        restart: unless-stopped

mpaw commented 2 months ago

@mightywomble SPEEDTEST_SCHEDULE=0 */1 * * * will run it every hour on the hour

mightywomble commented 2 months ago

Changed Docker Compose and restarted; made no difference. I will delete all the config, start again, and see if that makes a difference.

lnlyssg commented 2 months ago

@mightywomble What exactly is the issue? Is the test not running at all with the schedule?

mightywomble commented 2 months ago

Correct. I'm expecting the test above to run hourly; it's not running at all on a schedule. It runs OK manually.

lnlyssg commented 2 months ago

Do you see the "Next speedtest at:" info on the admin dashboard?

Screenshot 2024-07-09 at 10 23 46

Does the Results table show any tests other than the manual ones?

mightywomble commented 2 months ago

No, I don't see "Next speedtest at:". The Results table only shows manual runs.

mightywomble commented 2 months ago

I have deleted the previous volumes and started afresh with this

Using this docker-compose.yaml

services:
    speedtest-tracker:
        container_name: speedtest-tracker:latest
        ports:
            - 8080:80
            - 8443:443
        environment:
            - PUID=1000
            - PGID=1000
            - APP_KEY=base64:0C42pmY7frk9NRpniS+1yU=
            - DB_CONNECTION=sqlite
            - SPEEDTEST_SCHEDULE=0 */1 * * *
        volumes:
            - /opt/docker/speedtest:/config
            - /opt/docker/speedtest//ssl-keys:/config/keys
        image: lscr.io/linuxserver/speedtest-tracker:latest
        restart: unless-stopped

I ran

docker compose pull
docker compose start

This resulted in the following dashboard


mightywomble commented 2 months ago

Host OS - Debian 12 Docker Version - Docker version 27.0.3, build 7d4bcd8

Output from docker compose logs

speedtest-tracker  | ───────────────────────────────────────
speedtest-tracker  | 
speedtest-tracker  |       β–ˆβ–ˆβ•—     β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ•—β–ˆβ–ˆβ•— β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ•—
speedtest-tracker  |       β–ˆβ–ˆβ•‘     β–ˆβ–ˆβ•”β•β•β•β•β•β–ˆβ–ˆβ•‘β–ˆβ–ˆβ•”β•β•β•β–ˆβ–ˆβ•—
speedtest-tracker  |       β–ˆβ–ˆβ•‘     β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ•—β–ˆβ–ˆβ•‘β–ˆβ–ˆβ•‘   β–ˆβ–ˆβ•‘
speedtest-tracker  |       β–ˆβ–ˆβ•‘     β•šβ•β•β•β•β–ˆβ–ˆβ•‘β–ˆβ–ˆβ•‘β–ˆβ–ˆβ•‘   β–ˆβ–ˆβ•‘
speedtest-tracker  |       β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ•—β–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ•‘β–ˆβ–ˆβ•‘β•šβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ–ˆβ•”β•
speedtest-tracker  |       β•šβ•β•β•β•β•β•β•β•šβ•β•β•β•β•β•β•β•šβ•β• β•šβ•β•β•β•β•β•
speedtest-tracker  | 
speedtest-tracker  |    Brought to you by linuxserver.io
speedtest-tracker  | ───────────────────────────────────────
speedtest-tracker  | 
speedtest-tracker  | To support the app dev(s) visit:
speedtest-tracker  | speedtest-tracker: https://github.com/sponsors/alexjustesen
speedtest-tracker  | 
speedtest-tracker  | To support LSIO projects visit:
speedtest-tracker  | https://www.linuxserver.io/donate/
speedtest-tracker  | 
speedtest-tracker  | ───────────────────────────────────────
speedtest-tracker  | GID/UID
speedtest-tracker  | ───────────────────────────────────────
speedtest-tracker  | 
speedtest-tracker  | User UID:    1000
speedtest-tracker  | User GID:    1000
speedtest-tracker  | ───────────────────────────────────────
speedtest-tracker  | Linuxserver.io version: v0.20.6-ls30
speedtest-tracker  | Build-date: 2024-06-15T21:23:04+00:00
speedtest-tracker  | ───────────────────────────────────────
speedtest-tracker  |     
speedtest-tracker  | Setting resolver to  127.0.0.11
speedtest-tracker  | Setting worker_processes to 4
speedtest-tracker  | generating self-signed keys in /config/keys, you can replace these with your own keys if required
speedtest-tracker  | ..+......+++++++++++++++++++++++++++++++++++++++*........+.+.........+...+........+....+..+++++++++++++++++++++++++++++++++++++++*...+..+....+...+...+.................+.......+........+......+.+......+.....+.......+..+...+...+....+.....+...+.......+...+...+..+...+...................+......+...............+..+......+......+......+.......+......+...........+...+.+.........+......+......+.....+.+..+.+............+.....+.......+..+.........+......+......+...+....+..+.+............+.....+...+...+....+...+.....+...+......+....+......+.....+.+..+...+.......+.....+......+..........+............+...+...........+..................+.......+......+..+.+......+.....+.+.........+...+.....+.......++++++
speedtest-tracker  | .+......+.....+....+.....+......+.+..+.+.....+.+...+...+.....+.............+............+..+...+....+...+..+................+..+.+..+....+......+..+......+.......+........+.+++++++++++++++++++++++++++++++++++++++*.+...+...+..+++++++++++++++++++++++++++++++++++++++*.+...+.............+............+...+.........+........+.............+..+...+....+...+.....+....+.........+..+..........+.....+.........+.+.........+.........+.....++++++
speedtest-tracker  | -----
speedtest-tracker  | [custom-init] No custom files found, skipping...
speedtest-tracker  | [ls.io-init] done.
mightywomble commented 2 months ago

Manual Speedtest runs fine

mightywomble commented 2 months ago

sqlite3 dump

PRAGMA foreign_keys=OFF;
BEGIN TRANSACTION;
CREATE TABLE IF NOT EXISTS "migrations" ("id" integer primary key autoincrement not null, "migration" varchar not null, "batch" integer not null);
INSERT INTO migrations VALUES(1,'2014_10_12_000000_create_users_table',1);
INSERT INTO migrations VALUES(2,'2014_10_12_100000_create_password_resets_table',1);
INSERT INTO migrations VALUES(3,'2018_08_08_100000_create_telescope_entries_table',1);
INSERT INTO migrations VALUES(4,'2019_08_19_000000_create_failed_jobs_table',1);
INSERT INTO migrations VALUES(5,'2019_12_14_000001_create_personal_access_tokens_table',1);
INSERT INTO migrations VALUES(6,'2022_08_18_015337_create_jobs_table',1);
INSERT INTO migrations VALUES(7,'2022_08_31_202106_create_results_table',1);
INSERT INTO migrations VALUES(8,'2022_09_27_212959_create_admin_user',1);
INSERT INTO migrations VALUES(9,'2022_10_20_211143_create_sessions_table',1);
INSERT INTO migrations VALUES(10,'2022_10_21_104903_create_settings_table',1);
INSERT INTO migrations VALUES(11,'2022_10_21_130121_create_influxdb_settings',1);
INSERT INTO migrations VALUES(12,'2022_10_24_152031_create_notifications_table',1);
INSERT INTO migrations VALUES(13,'2022_10_24_153150_create_database_notifications_settings',1);
INSERT INTO migrations VALUES(14,'2022_10_24_153411_create_thresholds_settings',1);
INSERT INTO migrations VALUES(15,'2022_11_11_134355_create_mail_notification_settings',1);
INSERT INTO migrations VALUES(16,'2022_12_22_125055_create_telegram_notification_settings',1);
INSERT INTO migrations VALUES(17,'2023_01_05_205157_create_cache_table',1);
INSERT INTO migrations VALUES(18,'2023_01_12_135235_update_results_table',1);
INSERT INTO migrations VALUES(19,'2023_02_12_131620_add_comments_to_results_table',1);
INSERT INTO migrations VALUES(20,'2023_02_28_000000_create_one_time_operations_table',1);
INSERT INTO migrations VALUES(21,'2023_03_06_002044_add_verify_ssl_to_influx_db_settings',1);
INSERT INTO migrations VALUES(22,'2023_05_07_000000_rename_password_resets_table',1);
INSERT INTO migrations VALUES(23,'2023_09_11_144858_create_webhook_notification_settings',1);
INSERT INTO migrations VALUES(24,'2023_09_11_225054_add_role_to_users_table',1);
INSERT INTO migrations VALUES(25,'2024_02_07_173217_add_telegram_disable_notification_to_notification_settings',1);
INSERT INTO migrations VALUES(26,'2024_02_18_000000_create_data_migration_settings',1);
INSERT INTO migrations VALUES(27,'2024_02_18_000050_update_locked_default_on_settings_table',1);
INSERT INTO migrations VALUES(28,'2024_02_18_100000_results_bad_json_table',1);
INSERT INTO migrations VALUES(29,'2024_02_19_134641_create_job_batches_table',1);
INSERT INTO migrations VALUES(30,'2024_02_19_134706_create_imports_table',1);
INSERT INTO migrations VALUES(31,'2024_02_19_134707_create_exports_table',1);
INSERT INTO migrations VALUES(32,'2024_02_19_134708_create_failed_import_rows_table',1);
INSERT INTO migrations VALUES(33,'2024_02_22_144650_create_discord_notification_settings',1);
CREATE TABLE IF NOT EXISTS "users" ("id" integer primary key autoincrement not null, "name" varchar not null, "email" varchar not null, "email_verified_at" datetime, "password" varchar not null, "remember_token" varchar, "created_at" datetime, "updated_at" datetime, "role" varchar);
INSERT INTO users VALUES(1,'Admin','admin@example.com','2024-07-09 09:29:35','$2y$12$WeXmYgoPwihw2rKLXbQpVeG61IPIl9MGtvQMoFf81.TRJH1rOn6u2',NULL,'2024-07-09 09:29:35','2024-07-09 09:29:35','admin');
CREATE TABLE IF NOT EXISTS "password_reset_tokens" ("email" varchar not null, "token" varchar not null, "created_at" datetime);
CREATE TABLE IF NOT EXISTS "telescope_entries" ("sequence" integer primary key autoincrement not null, "uuid" varchar not null, "batch_id" varchar not null, "family_hash" varchar, "should_display_on_index" tinyint(1) not null default '1', "type" varchar not null, "content" text not null, "created_at" datetime);
CREATE TABLE IF NOT EXISTS "telescope_entries_tags" ("entry_uuid" varchar not null, "tag" varchar not null, foreign key("entry_uuid") references "telescope_entries"("uuid") on delete cascade, primary key ("entry_uuid", "tag"));
CREATE TABLE IF NOT EXISTS "telescope_monitoring" ("tag" varchar not null, primary key ("tag"));
CREATE TABLE IF NOT EXISTS "failed_jobs" ("id" integer primary key autoincrement not null, "uuid" varchar not null, "connection" text not null, "queue" text not null, "payload" text not null, "exception" text not null, "failed_at" datetime not null default CURRENT_TIMESTAMP);
CREATE TABLE IF NOT EXISTS "personal_access_tokens" ("id" integer primary key autoincrement not null, "tokenable_type" varchar not null, "tokenable_id" integer not null, "name" varchar not null, "token" varchar not null, "abilities" text, "last_used_at" datetime, "expires_at" datetime, "created_at" datetime, "updated_at" datetime);
CREATE TABLE IF NOT EXISTS "jobs" ("id" integer primary key autoincrement not null, "queue" varchar not null, "payload" text not null, "attempts" integer not null, "reserved_at" integer, "available_at" integer not null, "created_at" integer not null);
CREATE TABLE IF NOT EXISTS "sessions" ("id" varchar not null, "user_id" integer, "ip_address" varchar, "user_agent" text, "payload" text not null, "last_activity" integer not null, primary key ("id"));
INSERT INTO sessions VALUES('f890nfE5rXeRqegsG2Wpwwpp0gQMGglsXZjDdeYc',1,'172.18.0.1','Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/126.0.0.0 Safari/537.36 Edg/126.0.0.0','YTo3OntzOjY6Il90b2tlbiI7czo0MDoiV2Vjc0VBWDdZRWJ1R0ZlazYxZlRkVXd6ZGFWeU5IejhUVEZ3WjY3WiI7czozOiJ1cmwiO2E6MDp7fXM6OToiX3ByZXZpb3VzIjthOjE6e3M6MzoidXJsIjtzOjM5OiJodHRwczovL3NwZWVkdGVzdC5zYWZlaG9tZWxhbi5jb20vYWRtaW4iO31zOjY6Il9mbGFzaCI7YToyOntzOjM6Im9sZCI7YTowOnt9czozOiJuZXciO2E6MDp7fX1zOjUwOiJsb2dpbl93ZWJfNTliYTM2YWRkYzJiMmY5NDAxNTgwZjAxNGM3ZjU4ZWE0ZTMwOTg5ZCI7aToxO3M6MTc6InBhc3N3b3JkX2hhc2hfd2ViIjtzOjYwOiIkMnkkMTIkV2VYbVlnb1B3aWh3MnJLTFhiUXBWZUc2MUlQSWw5TUd0dlFNb0ZmODEuVFJKSDFyT242dTIiO3M6ODoiZmlsYW1lbnQiO2E6MDp7fX0=',1720517879);
CREATE TABLE IF NOT EXISTS "notifications" ("id" varchar not null, "type" varchar not null, "notifiable_type" varchar not null, "notifiable_id" integer not null, "data" text not null, "read_at" datetime, "created_at" datetime, "updated_at" datetime, primary key ("id"));
CREATE TABLE IF NOT EXISTS "cache" ("key" varchar not null, "value" text not null, "expiration" integer not null, primary key ("key"));
INSERT INTO cache VALUES('laravel_cache_settings.App\Settings\DataMigrationSettings','s:75:"O:34:"App\Settings\DataMigrationSettings":1:{s:17:"bad_json_migrated";b:1;}";',2035877375);
INSERT INTO cache VALUES('laravel_cache_287b58015ec6ed41cc45119562d7402bb1069aed:timer','i:1720517440;',1720517440);
INSERT INTO cache VALUES('laravel_cache_287b58015ec6ed41cc45119562d7402bb1069aed','i:2;',1720517440);
INSERT INTO cache VALUES('laravel_cache_settings.App\Settings\InfluxDbSettings','s:176:"O:29:"App\Settings\InfluxDbSettings":6:{s:10:"v2_enabled";b:0;s:6:"v2_url";N;s:6:"v2_org";N;s:9:"v2_bucket";s:17:"speedtest-tracker";s:8:"v2_token";N;s:13:"v2_verify_ssl";b:1;}";',2035877445);
INSERT INTO cache VALUES('laravel_cache_settings.App\Settings\NotificationSettings','s:701:"O:33:"App\Settings\NotificationSettings":20:{s:16:"database_enabled";b:0;s:25:"database_on_speedtest_run";b:0;s:29:"database_on_threshold_failure";b:0;s:12:"mail_enabled";b:0;s:21:"mail_on_speedtest_run";b:0;s:25:"mail_on_threshold_failure";b:0;s:15:"mail_recipients";N;s:16:"telegram_enabled";b:0;s:29:"telegram_disable_notification";b:0;s:25:"telegram_on_speedtest_run";b:0;s:29:"telegram_on_threshold_failure";b:0;s:19:"telegram_recipients";N;s:15:"webhook_enabled";b:0;s:24:"webhook_on_speedtest_run";b:0;s:28:"webhook_on_threshold_failure";b:0;s:12:"webhook_urls";N;s:15:"discord_enabled";b:0;s:24:"discord_on_speedtest_run";b:0;s:28:"discord_on_threshold_failure";b:0;s:16:"discord_webhooks";N;}";',2035877866);
CREATE TABLE IF NOT EXISTS "cache_locks" ("key" varchar not null, "owner" varchar not null, "expiration" integer not null, primary key ("key"));
CREATE TABLE IF NOT EXISTS "results_bad_json" ("id" integer primary key autoincrement not null, "ping" float, "download" integer, "upload" integer, "server_id" integer, "server_host" varchar, "server_name" varchar, "url" varchar, "scheduled" tinyint(1) not null default '0', "data" text, "created_at" datetime not null default CURRENT_TIMESTAMP, "successful" tinyint(1) not null default '1', "comments" text);
CREATE TABLE IF NOT EXISTS "operations" ("id" integer primary key autoincrement not null, "name" varchar not null, "dispatched" varchar check ("dispatched" in ('sync', 'async')) not null, "processed_at" datetime);
CREATE TABLE IF NOT EXISTS "settings" ("id" integer primary key autoincrement not null, "group" varchar not null, "name" varchar not null, "locked" tinyint(1) not null default '0', "payload" text not null, "created_at" datetime, "updated_at" datetime);
INSERT INTO settings VALUES(1,'influxdb','v2_enabled',0,'false','2024-07-09 09:29:35','2024-07-09 09:29:35');
INSERT INTO settings VALUES(2,'influxdb','v2_url',0,'null','2024-07-09 09:29:35','2024-07-09 09:29:35');
INSERT INTO settings VALUES(3,'influxdb','v2_org',0,'null','2024-07-09 09:29:35','2024-07-09 09:29:35');
INSERT INTO settings VALUES(4,'influxdb','v2_bucket',0,'"speedtest-tracker"','2024-07-09 09:29:35','2024-07-09 09:29:35');
INSERT INTO settings VALUES(5,'influxdb','v2_token',0,'null','2024-07-09 09:29:35','2024-07-09 09:29:35');
INSERT INTO settings VALUES(6,'notification','database_enabled',0,'false','2024-07-09 09:29:35','2024-07-09 09:29:35');
INSERT INTO settings VALUES(7,'notification','database_on_speedtest_run',0,'false','2024-07-09 09:29:35','2024-07-09 09:29:35');
INSERT INTO settings VALUES(8,'notification','database_on_threshold_failure',0,'false','2024-07-09 09:29:35','2024-07-09 09:29:35');
INSERT INTO settings VALUES(9,'threshold','absolute_enabled',0,'false','2024-07-09 09:29:35','2024-07-09 09:29:35');
INSERT INTO settings VALUES(10,'threshold','absolute_download',0,'0','2024-07-09 09:29:35','2024-07-09 09:29:35');
INSERT INTO settings VALUES(11,'threshold','absolute_upload',0,'0','2024-07-09 09:29:35','2024-07-09 09:29:35');
INSERT INTO settings VALUES(12,'threshold','absolute_ping',0,'0','2024-07-09 09:29:35','2024-07-09 09:29:35');
INSERT INTO settings VALUES(13,'notification','mail_enabled',0,'false','2024-07-09 09:29:35','2024-07-09 09:29:35');
INSERT INTO settings VALUES(14,'notification','mail_on_speedtest_run',0,'false','2024-07-09 09:29:35','2024-07-09 09:29:35');
INSERT INTO settings VALUES(15,'notification','mail_on_threshold_failure',0,'false','2024-07-09 09:29:35','2024-07-09 09:29:35');
INSERT INTO settings VALUES(16,'notification','mail_recipients',0,'null','2024-07-09 09:29:35','2024-07-09 09:29:35');
INSERT INTO settings VALUES(17,'notification','telegram_enabled',0,'false','2024-07-09 09:29:35','2024-07-09 09:29:35');
INSERT INTO settings VALUES(18,'notification','telegram_on_speedtest_run',0,'false','2024-07-09 09:29:35','2024-07-09 09:29:35');
INSERT INTO settings VALUES(19,'notification','telegram_on_threshold_failure',0,'false','2024-07-09 09:29:35','2024-07-09 09:29:35');
INSERT INTO settings VALUES(20,'notification','telegram_recipients',0,'null','2024-07-09 09:29:35','2024-07-09 09:29:35');
INSERT INTO settings VALUES(21,'influxdb','v2_verify_ssl',0,'true','2024-07-09 09:29:35','2024-07-09 09:29:35');
INSERT INTO settings VALUES(22,'notification','webhook_enabled',0,'false','2024-07-09 09:29:35','2024-07-09 09:29:35');
INSERT INTO settings VALUES(23,'notification','webhook_on_speedtest_run',0,'false','2024-07-09 09:29:35','2024-07-09 09:29:35');
INSERT INTO settings VALUES(24,'notification','webhook_on_threshold_failure',0,'false','2024-07-09 09:29:35','2024-07-09 09:29:35');
INSERT INTO settings VALUES(25,'notification','webhook_urls',0,'null','2024-07-09 09:29:35','2024-07-09 09:29:35');
INSERT INTO settings VALUES(26,'notification','telegram_disable_notification',0,'false','2024-07-09 09:29:35','2024-07-09 09:29:35');
INSERT INTO settings VALUES(27,'data_migration','bad_json_migrated',0,'true','2024-07-09 09:29:35','2024-07-09 09:29:35');
INSERT INTO settings VALUES(29,'notification','discord_enabled',0,'false','2024-07-09 09:29:35','2024-07-09 09:29:35');
INSERT INTO settings VALUES(30,'notification','discord_on_speedtest_run',0,'false','2024-07-09 09:29:35','2024-07-09 09:29:35');
INSERT INTO settings VALUES(31,'notification','discord_on_threshold_failure',0,'false','2024-07-09 09:29:35','2024-07-09 09:29:35');
INSERT INTO settings VALUES(32,'notification','discord_webhooks',0,'null','2024-07-09 09:29:35','2024-07-09 09:29:35');
CREATE TABLE IF NOT EXISTS "results" ("id" integer primary key autoincrement not null, "service" varchar not null default 'ookla', "ping" float, "download" integer, "upload" integer, "comments" text, "data" text, "status" varchar not null, "scheduled" tinyint(1) not null default '0', "created_at" datetime, "updated_at" datetime);
INSERT INTO results VALUES(1,'ookla',7.2460000000000004405,8719830,4437271,NULL,'{"type":"result","timestamp":"2024-07-09T09:37:46Z","ping":{"jitter":0.624,"latency":7.246,"low":7.051,"high":8.866},"download":{"bandwidth":8719830,"bytes":124794720,"elapsed":15003,"latency":{"iqm":29.522,"low":8.014,"high":301.131,"jitter":8.724}},"upload":{"bandwidth":4437271,"bytes":40986720,"elapsed":9310,"latency":{"iqm":51.914,"low":22.995,"high":348.846,"jitter":8.355}},"packetLoss":0,"isp":"Zen Internet","interface":{"internalIp":"172.20.0.2","name":"eth0","macAddr":"02:42:AC:14:00:02","isVpn":false,"externalIp":"88.98.84.166"},"server":{"id":40788,"host":"speedtest02a.web.zen.net.uk","port":8080,"name":"Zen Internet","location":"London","country":"United Kingdom","ip":"51.148.82.21"},"result":{"id":"1aeae31e-4713-424b-9794-8a118dcff734","url":"https:\/\/www.speedtest.net\/result\/c\/1aeae31e-4713-424b-9794-8a118dcff734","persisted":true}}','completed',0,'2024-07-09 09:37:18','2024-07-09 09:37:46');
CREATE TABLE IF NOT EXISTS "job_batches" ("id" varchar not null, "name" varchar not null, "total_jobs" integer not null, "pending_jobs" integer not null, "failed_jobs" integer not null, "failed_job_ids" text not null, "options" text, "cancelled_at" integer, "created_at" integer not null, "finished_at" integer, primary key ("id"));
CREATE TABLE IF NOT EXISTS "imports" ("id" integer primary key autoincrement not null, "completed_at" datetime, "file_name" varchar not null, "file_path" varchar not null, "importer" varchar not null, "processed_rows" integer not null default '0', "total_rows" integer not null, "successful_rows" integer not null default '0', "user_id" integer not null, "created_at" datetime, "updated_at" datetime, foreign key("user_id") references "users"("id") on delete cascade);
CREATE TABLE IF NOT EXISTS "exports" ("id" integer primary key autoincrement not null, "completed_at" datetime, "file_disk" varchar not null, "file_name" varchar, "exporter" varchar not null, "processed_rows" integer not null default '0', "total_rows" integer not null, "successful_rows" integer not null default '0', "user_id" integer not null, "created_at" datetime, "updated_at" datetime, foreign key("user_id") references "users"("id") on delete cascade);
CREATE TABLE IF NOT EXISTS "failed_import_rows" ("id" integer primary key autoincrement not null, "data" text not null, "import_id" integer not null, "validation_error" text, "created_at" datetime, "updated_at" datetime, foreign key("import_id") references "imports"("id") on delete cascade);
DELETE FROM sqlite_sequence;
INSERT INTO sqlite_sequence VALUES('migrations',33);
INSERT INTO sqlite_sequence VALUES('users',1);
INSERT INTO sqlite_sequence VALUES('results_bad_json',0);
INSERT INTO sqlite_sequence VALUES('settings',32);
INSERT INTO sqlite_sequence VALUES('results',1);
INSERT INTO sqlite_sequence VALUES('jobs',1);
CREATE UNIQUE INDEX "users_email_unique" on "users" ("email");
CREATE INDEX "password_resets_email_index" on "password_reset_tokens" ("email");
CREATE UNIQUE INDEX "telescope_entries_uuid_unique" on "telescope_entries" ("uuid");
CREATE INDEX "telescope_entries_batch_id_index" on "telescope_entries" ("batch_id");
CREATE INDEX "telescope_entries_family_hash_index" on "telescope_entries" ("family_hash");
CREATE INDEX "telescope_entries_created_at_index" on "telescope_entries" ("created_at");
CREATE INDEX "telescope_entries_type_should_display_on_index_index" on "telescope_entries" ("type", "should_display_on_index");
CREATE INDEX "telescope_entries_tags_tag_index" on "telescope_entries_tags" ("tag");
CREATE UNIQUE INDEX "failed_jobs_uuid_unique" on "failed_jobs" ("uuid");
CREATE INDEX "personal_access_tokens_tokenable_type_tokenable_id_index" on "personal_access_tokens" ("tokenable_type", "tokenable_id");
CREATE UNIQUE INDEX "personal_access_tokens_token_unique" on "personal_access_tokens" ("token");
CREATE INDEX "jobs_queue_index" on "jobs" ("queue");
CREATE INDEX "sessions_user_id_index" on "sessions" ("user_id");
CREATE INDEX "sessions_last_activity_index" on "sessions" ("last_activity");
CREATE INDEX "notifications_notifiable_type_notifiable_id_index" on "notifications" ("notifiable_type", "notifiable_id");
CREATE UNIQUE INDEX "settings_group_name_unique" on "settings" ("group", "name");
COMMIT;
svenvg93 commented 2 months ago

Environment variables are only applied when the container is created; just pulling and starting won't pick them up. Can you run docker compose down && docker compose up -d?

mightywomble commented 2 months ago

Not my first rodeo. Please see above: I deleted everything and started again. No joy.

lnlyssg commented 2 months ago

I've just copied your compose file from above and, except for having to change the container name (`Error response from daemon: Invalid container name (speedtest-tracker:latest), only [a-zA-Z0-9][a-zA-Z0-9.-] are allowed`) and the APP_KEY, it started up OK and shows the next speedtest at 12:00 UTC:

Screenshot 2024-07-09 at 12 50 20

mightywomble commented 2 months ago

So I've just built the Dockerfile locally and run from the image it created; that works. If I pull it from the linuxserver.io repo, it doesn't. No idea why, and no time to investigate right now.

OddSquirrel commented 2 months ago

Funnily enough, this works for me and runs a speedtest every hour on the hour:

SPEEDTEST_SCHEDULE=0 * * * *

FWIW, I don't see the graphs anymore either, but the results are all in the database. No idea when that happened, since I don't check the dashboard all the time, but probably after one of the recent updates.