dresden-elektronik / deconz-rest-plugin-v2

deCONZ REST-API version 2 development repository.
BSD 3-Clause "New" or "Revised" License

DDF bundle store #16

Closed Zehir closed 9 months ago

Zehir commented 1 year ago

DDF bundle store

Important: PRE-ALPHA — this is only a collection of thoughts to figure out how it could be handled.

deCONZ community store

The directory structure will be like the current /devices directory of the rest-plugin repo. The bundles will be created by a GitHub Action, but where do we put the artifacts: in a release when a file gets updated, or in another branch like gh-pages?

A store will need a way to get the list of available DDFs; this could be a JSON file.

The DDFs need to be filterable by vendor, model ID, status (gold, ...), and signature type (deconz (official), others).

This store can be hosted on GitHub. Any user who wants to run their own store can fork the repo.

We could make a GitHub bot that warns when a JS script is changed but the JSON file that uses it is not, to flag possible side effects.

User store

This store will accept already-bundled DDFs to make them available to other users.

It probably can't be hosted on GitHub.

manup commented 1 year ago

Hi, here are some thoughts. For me the most difficult part is finding a solution without a backend/database, but that might just be because I'm not familiar with the GitHub API :)

The directory structure will be like the current /devices directory of the rest-plugin repo.

I suggest we move the DDF content, which is currently in https://github.com/dresden-elektronik/deconz-rest-plugin/tree/master/devices, to its own repository, perhaps https://github.com/deconz-community/ddf, to have it cleanly separated from the C++ REST-API plugin (and the store).

How to get DDFs from here to the store

Creating a bundle by script, GitHub Action or manually on a local machine is the same:

$ ddf-bundle create path/to/philips/acmee2000.json

Which creates the acmee2000_ae4cff.ddf bundle file.
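Presumably the `_ae4cff` suffix is derived from the bundle content, so the same content always yields the same file name. A hedged sketch of how such a hash-based name could be built (the real ddf-bundle tool may hash differently; the 6-hex-digit suffix is an assumption for illustration):

```python
import hashlib
from pathlib import Path

def bundle_filename(ddf_path: str, bundle_bytes: bytes) -> str:
    """Derive a name like 'acmee2000_ae4cff.ddf' from the bundle content.
    The short 6-digit suffix is an assumption; the actual tool may use a
    different hash or length."""
    digest = hashlib.sha256(bundle_bytes).hexdigest()[:6]
    stem = Path(ddf_path).stem  # 'acmee2000'
    return f"{stem}_{digest}.ddf"

print(bundle_filename("path/to/philips/acmee2000.json", b"RIFF...bundle data"))
```

Because the name is a pure function of the content, re-creating a bundle from unchanged sources produces an identical file, which is what makes the dedup cases below work.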

The important point to note is that a bundle is always unique and can't be overwritten or destroyed, history is always preserved and users can easily switch between versions.

Interesting cases

Q: What happens when a user modifies a bundle and uploads it to the store? A: Search files are recreated and the modified version shows up, side by side with prior bundles for the device, ideally with a testing label.

Q: What happens when a user uploads a bundle which already exists (no modifications)? A: Nothing, as the bundle hash doesn't change.

Q: What happens if a user's bundle is uploaded to the store and tested, and later on the user creates a PR to https://github.com/deconz-community/ddf; once merged, wouldn't the GitHub Action re-upload / duplicate the bundle? A: Nothing happens; regardless of the source, the bundle has the same hash.

Q: What happens if someone uploads a malware.ddf ? A: If the file isn't in the DDF bundle format it is rejected. Simple checks like RIFF format and valid data to be JSON/JS or whatever we allow are sufficient. The Javascript in a DDF always only runs in deCONZ sandboxed JS engine.
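The RIFF sanity check mentioned here is cheap to sketch. Assuming the bundle is a plain RIFF container (the exact chunk layout of a .ddf is not specified in this thread, so only the generic header is checked):

```python
import struct

def looks_like_ddf_bundle(data: bytes) -> bool:
    """Minimal sanity check: reject anything that isn't a well-formed
    RIFF container. Deeper chunk validation (JSON/JS payloads) would
    follow the same pattern but depends on the real bundle spec."""
    if len(data) < 12 or data[:4] != b"RIFF":
        return False
    (size,) = struct.unpack("<I", data[4:8])
    # the RIFF size field counts everything after the first 8 bytes
    return size == len(data) - 8

print(looks_like_ddf_bundle(b"RIFF" + struct.pack("<I", 4) + b"DDFB"))  # → True
```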

Q: As Thomas pointed out, how can we search for DDFs/bundles which use a certain attribute like state/on? A: There are multiple ways. If it's desired to keep searchability through GitHub, each bundle can automatically be unpacked into a <bundle-hash>/... directory and the files committed; due to the hash there aren't collisions. Searching is then the same as currently. Note that in Git there aren't actually duplicated files between directories which contain the same file.

Personally I think this shouldn't be part of the store, to avoid making the store more complex just for developer searches. E.g. a 10-line script could do the job, like dev-download-and-unpack-ddf-bundles.py, and grep -rn 'state/on' shows all bundles with that item.
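The grep half of that workflow is easy to sketch in Python as well. This stand-in assumes the bundles were already downloaded and unpacked into some local directory (paths and layout are hypothetical):

```python
# Walk a directory of already-unpacked bundles and report which ones
# reference a given item such as 'state/on'.
from pathlib import Path

def bundles_using_item(root: str, item: str) -> list[str]:
    hits = []
    for path in Path(root).rglob("*.json"):
        try:
            text = path.read_text(encoding="utf-8")
        except OSError:
            continue  # skip unreadable files
        if item in text:
            hits.append(str(path))
    return sorted(hits)
```

This keeps the search tooling on the developer's machine, so the store itself stays a dumb file host.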

More interesting cases

Q: How to get local user bundles which were uploaded to the store for testing in the official DDF repository at https://github.com/deconz-community/ddf? A: The user creates a PR in the repository. If the user doesn't have GitHub this could be done by any developer, e.g. unpack bundle, git add . && git commit. As mentioned above the bundle hash always stays the same, when the GitHub Action pushes to the store nothing happens as the content is already there.

The important point here is that users could already test the bundle, it could also be marked as stable by signature. All without having to wait for the PR to be merged, which would be the bottleneck. PRs to be merged need to be reviewed, but even if they touch files of other DDFs, that would result in new bundles — these would be marked as testing automatically, not overwriting stable bundles.

JSON and Queries

While I understand the reasoning behind using JSON files to index bundles and search or filter them, note that it's more powerful and faster to query against a database. Just like the JSON files, a SQLite database file could be created automatically; it can be used in a browser (https://sql.js.org/#/?id=inside-the-browser) as well.

For example one ddf-store.db file to be downloaded by the browser, and queries like:

SELECT * FROM bundles INNER JOIN signatures USING (hash) WHERE modelid = 'acmee2000' AND signature = 'stable'

(deCONZ will internally use something like that)
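A minimal sketch of how such a ddf-store.db could look and be queried, using Python's sqlite3; the two-table schema (bundles, signatures, keyed by bundle hash) is an assumption for illustration, not a finalized design:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE bundles (hash TEXT PRIMARY KEY, modelid TEXT, file BLOB);
    CREATE TABLE signatures (hash TEXT, signature TEXT);
""")
con.execute("INSERT INTO bundles VALUES ('ae4cff', 'acmee2000', x'00')")
con.execute("INSERT INTO signatures VALUES ('ae4cff', 'stable')")

# the same kind of query the store page (or deCONZ) would run
rows = con.execute(
    "SELECT modelid FROM bundles INNER JOIN signatures USING (hash) "
    "WHERE modelid = ? AND signature = ?",
    ("acmee2000", "stable"),
).fetchall()
print(rows)  # → [('acmee2000',)]
```

The same .db file served over HTTP can be opened by sql.js in the browser, so one artifact covers both clients.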

Features which aren't so obvious

Details aside I really do like what this would implicitly bring us.

Zehir commented 1 year ago

Hi, here are some thoughts. For me the most difficult part is finding a solution without a backend/database, but that might just be because I'm not familiar with the GitHub API :)

The directory structure will be like the current /devices directory of the rest-plugin repo.

I suggest we move the DDF content, which is currently in https://github.com/dresden-elektronik/deconz-rest-plugin/tree/master/devices, to its own repository, perhaps https://github.com/deconz-community/ddf, to have it cleanly separated from the C++ REST-API plugin (and the store).

Yes, I would like to do that too.

How to get DDFs from here to the store

Creating a bundle by script, GitHub Action or manually on a local machine is the same:

$ ddf-bundle create path/to/philips/acmee2000.json

Which creates the acmee2000_ae4cff.ddf bundle file.

  • This can be done by a GitHub Action for all DDFs in the repo, which are then pushed to the store.
  • For simplicity, the store only needs to save the bundle file with its hash as the file name.
  • If the bundle already exists, do nothing.
  • If it doesn't exist, save it and re-create the JSON files needed to do all searches over all bundle files. We can create as many JSON files as we like to simplify searching (a database would be much simpler here, but we can have JSON files to represent queries).
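The ingestion rule in those bullets fits in a few lines. A hedged sketch, with the file layout (`<hash>.ddf` plus an `index.json`) assumed for illustration:

```python
import hashlib
import json
from pathlib import Path

def ingest_bundle(store_dir: str, bundle: bytes) -> bool:
    """Save a bundle under its content hash; skip exact duplicates;
    rebuild the search index. Returns True if the store changed."""
    store = Path(store_dir)
    store.mkdir(parents=True, exist_ok=True)
    digest = hashlib.sha256(bundle).hexdigest()
    target = store / f"{digest}.ddf"
    if target.exists():
        return False  # bundle already in the store: do nothing
    target.write_bytes(bundle)
    # re-create the JSON index over all bundle files
    index = sorted(p.name for p in store.glob("*.ddf"))
    (store / "index.json").write_text(json.dumps(index))
    return True
```

Because the file name is the hash, re-uploading the same bundle from any source (user upload, DDF repo Action) is a no-op, which is exactly the behavior the Q&A above relies on.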

A GitHub Action can bundle all DDFs on the main branch to a build branch, or publish them inside a Release thread.

The important point to note is that a bundle is always unique and can't be overwritten or destroyed, history is always preserved and users can easily switch between versions.

What happens with a DDF that turns out unstable or invalid after bundling? I think we should be able to delete those too.

Interesting cases

Q: What happens when a user modifies a bundle and uploads it to the store? A: Search files are recreated and the modified version shows up, side by side with prior bundles for the device, ideally with a testing label.

I think I would like to have a separate repo for the user uploads. This is an easy way to separate them, and they will use another workflow. It's not a big issue to load 2 catalog files, and most users will probably use only verified DDFs. Edit: maybe not, see below.

We need a way to know that a new bundle is a modified version of an existing one and not a brand-new one. Maybe with a unique ID in the DDF file's desc sector.

Q: What happens when a user uploads a bundle which already exists (no modifications)? A: Nothing, as the bundle hash doesn't change.

Q: What happens if a user's bundle is uploaded to the store and tested, and later on the user creates a PR to https://github.com/deconz-community/ddf; once merged, wouldn't the GitHub Action re-upload / duplicate the bundle? A: Nothing happens; regardless of the source, the bundle has the same hash.

I would like bundle uploads to be PR-only; the PR conversation will handle all beta versions. When a bundle is considered stable, the PR is merged and the new bundle becomes part of the main process.

There will be 2 pipelines, one for creating DDFs and one for editing DDFs. From the user's UI there won't be a visible difference.

Q: What happens if someone uploads a malware.ddf ? A: If the file isn't in the DDF bundle format it is rejected. Simple checks like RIFF format and valid data to be JSON/JS or whatever we allow are sufficient. The Javascript in a DDF always only runs in deCONZ sandboxed JS engine.

It's easy to check whether the data is valid JSON, but for other files I don't know if I can check that something is a valid Markdown or JavaScript file.

Q: As Thomas pointed out, how can we search for DDFs/bundles which use a certain attribute like state/on? A: There are multiple ways. If it's desired to keep searchability through GitHub, each bundle can automatically be unpacked into a <bundle-hash>/... directory and the files committed; due to the hash there aren't collisions. Searching is then the same as currently. Note that in Git there aren't actually duplicated files between directories which contain the same file.

Git supports symlinks, but where should I store the first file: in one of the DDFs that uses that file, or in a generic folder?

Personally I think this shouldn't be part of the store, to avoid making the store more complex just for developer searches. E.g. a 10-line script could do the job, like dev-download-and-unpack-ddf-bundles.py, and grep -rn 'state/on' shows all bundles with that item.

More interesting cases

Q: How to get local user bundles which were uploaded to the store for testing in the official DDF repository at https://github.com/deconz-community/ddf? A: The user creates a PR in the repository. If the user doesn't have GitHub this could be done by any developer, e.g. unpack bundle, git add . && git commit. As mentioned above the bundle hash always stays the same, when the GitHub Action pushes to the store nothing happens as the content is already there.

The important point here is that users could already test the bundle, it could also be marked as stable by signature. All without having to wait for the PR to be merged, which would be the bottleneck. PRs to be merged need to be reviewed, but even if they touch files of other DDFs, that would result in new bundles — these would be marked as testing automatically, not overwriting stable bundles.

Maybe create an issue with the DDF attached and a DDF-Upload label. A GitHub Action could unpack it on a new branch and make a PR from that branch to main. Then the discussion happens on that PR and not on the issue. That PR can also be used to upload changes, and the GitHub features show what's new, what changed, ...

JSON and Queries

While I understand the reasoning behind using JSON files to index bundles and search or filter them, note that it's more powerful and faster to query against a database. Just like the JSON files, a SQLite database file could be created automatically; it can be used in a browser (https://sql.js.org/#/?id=inside-the-browser) as well.

For example one ddf-store.db file to be downloaded by the browser, and queries like:

SELECT * FROM bundles INNER JOIN signatures USING (hash) WHERE modelid = 'acmee2000' AND signature = 'stable'

I don't know if GitHub will allow having a SQLite file in the repo.

(deCONZ will internally use something like that)

Yes, that's obvious.

Features which aren't so obvious

Details aside I really do like what this would implicitly bring us.

  • New or updated DDFs are completely separate from deCONZ releases (awesome).

So the deCONZ release won't have any DDFs? Or only a curated list that is used by most people?

  • Nothing can break (1), no forced global DDF updates for users as we have now (a deCONZ release is take it or leave it).

I think we should bundle some DDFs in the release too and propose that the user update from the bundled version. Maybe that's extra work for not much benefit.

  • Users can decide on a per DDF / device base if they want to use a newer DDF.

In general newer is better.

  • No PR review bottlenecks.

Exactly

  • Nothing can break (2): no accidental change to one DDF would break another. If a PR gets merged with a shared file, it automatically creates new bundles for the other DDFs using that file (with state testing again).

I think it will be easier to not have shared files. We can duplicate them at first if needed.

  • Distribution and search-ability for DDF bundles is really easy.
  • Switching between versions of a bundle becomes a one click operation for the user.

👍

manup commented 1 year ago

How to get DDFs from here to the store

Creating a bundle by script, GitHub Action or manually on a local machine is the same:

$ ddf-bundle create path/to/philips/acmee2000.json

Which creates the acmee2000_ae4cff.ddf bundle file.

  • This can be done by a GitHub Action for all DDFs in the repo, which are then pushed to the store.
  • For simplicity, the store only needs to save the bundle file with its hash as the file name.
  • If the bundle already exists, do nothing.
  • If it doesn't exist, save it and re-create the JSON files needed to do all searches over all bundle files. We can create as many JSON files as we like to simplify searching (a database would be much simpler here, but we can have JSON files to represent queries).

A GitHub Action can bundle all DDFs on the main branch to a build branch, or publish them inside a Release thread.

I'm not 100% sure here, but the idea was that the .ddf bundle isn't stored in the DDF repository but pushed/uploaded to the store repository. A Release thread could be one option, or simply a directory bundles/<bundle-hash>.ddf.

The important point to note is that a bundle is always unique and can't be overwritten or destroyed, history is always preserved and users can easily switch between versions.

What happens with a DDF that turns out unstable or invalid after bundling? I think we should be able to delete those too.

Yeah, devs with admin rights can delete a .ddf file (perhaps later on from an admin UI); the rest is refreshed automatically when the JSON files are re-generated.

Interesting cases

Q: What happens when a user modifies a bundle and uploads it to the store? A: Search files are recreated and the modified version shows up, side by side with prior bundles for the device, ideally with a testing label.

I think I would like to have a separate repo for the user uploads. This is an easy way to separate them, and they will use another workflow. It's not a big issue to load 2 catalog files, and most users will probably use only verified DDFs. Edit: maybe not, see below.

Could be separated, but I think we can keep it simple here: from the store's perspective it's just a bunch of files, regardless of where they are uploaded from. The signatures already provide everything we need to know what's custom user stuff and what's official, stable, etc.

We need a way to know that a new bundle is a modified version of an existing one and not a brand-new one. Maybe with a unique ID in the DDF file's desc sector.

I thought so too, but realized we already have all this data :) Consider the unique ID to be {modelid, manufacturername}; each bundle referring to this tuple is related and can be ordered by the timestamp it contains. Notably, this still works if a bundle for multiple model IDs is split up or extended.
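That identity rule can be sketched directly: group bundles by the (modelid, manufacturername) tuple and order each group by its embedded timestamp. The record shape below is an assumption for illustration:

```python
from collections import defaultdict

def latest_per_device(bundles: list[dict]) -> dict:
    """Group bundles by (modelid, manufacturername) and pick the newest
    bundle in each group by its embedded timestamp."""
    groups = defaultdict(list)
    for b in bundles:
        groups[(b["modelid"], b["manufacturername"])].append(b)
    return {k: max(v, key=lambda b: b["timestamp"]) for k, v in groups.items()}

bundles = [
    {"modelid": "acmee2000", "manufacturername": "Philips", "timestamp": 1, "hash": "ae4cff"},
    {"modelid": "acmee2000", "manufacturername": "Philips", "timestamp": 2, "hash": "b91d00"},
]
print(latest_per_device(bundles)[("acmee2000", "Philips")]["hash"])  # → b91d00
```

No extra unique ID is needed: two bundles for the same tuple are versions of each other, and a bundle covering several model IDs simply appears in several groups.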

Q: What happens when a user uploads a bundle which already exists (no modifications)? A: Nothing, as the bundle hash doesn't change. Q: What happens if a user's bundle is uploaded to the store and tested, and later on the user creates a PR to https://github.com/deconz-community/ddf; once merged, wouldn't the GitHub Action re-upload / duplicate the bundle? A: Nothing happens; regardless of the source, the bundle has the same hash.

I would like bundle uploads to be PR-only; the PR conversation will handle all beta versions. When a bundle is considered stable, the PR is merged and the new bundle becomes part of the main process.

Tricky, if I understand this right. I'd like it to be possible to upload a bundle even before a PR, otherwise we get back the PR review bottleneck we currently suffer from. The PRs are useful to finally get the sources into the DDF repository, which should then also include the review process and discussion. Note this is purely "optional": the store also works with no DDF repository at all :) As a mental model, the DDF repository could contain only the good stuff.

There will be 2 pipelines, one for creating DDF and one for editing DDF. From a user UI he won't see the difference.

Q: What happens if someone uploads a malware.ddf ? A: If the file isn't in the DDF bundle format it is rejected. Simple checks like RIFF format and valid data to be JSON/JS or whatever we allow are sufficient. The Javascript in a DDF always only runs in deCONZ sandboxed JS engine.

It's easy to check whether the data is valid JSON, but for other files I don't know if I can check that something is a valid Markdown or JavaScript file.

Overall I think simple checks are more than enough here; nothing in the bundle gets executed other than the JavaScript, which runs in a sandboxed environment (no filesystem/network access). If we see something fishy we can delete the bundle, same as we do with malicious wiki entries.

deCONZ already contains code to verify/test whether a DDF is ok (the DDF loader); we could separate this out and use it in the pipeline to avoid duplicated work.

Q: As Thomas pointed out, how can we search for DDFs/bundles which use a certain attribute like state/on? A: There are multiple ways. If it's desired to keep searchability through GitHub, each bundle can automatically be unpacked into a <bundle-hash>/... directory and the files committed; due to the hash there aren't collisions. Searching is then the same as currently. Note that in Git there aren't actually duplicated files between directories which contain the same file.

Git supports symlinks, but where should I store the first file: in one of the DDFs that uses that file, or in a generic folder?

The bundles would each be unpacked into their own folder, where the folder name is the bundle hash. When such a folder is committed to a repository, nothing extra like symlinks needs to be done: Git stores files via content hashes, so two folders with the same file actually point to the same data blob (the file content).

... but as mentioned before I think this shouldn't be part of the store.
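The deduplication claim is easy to verify: a Git blob's object id is SHA-1 over a "blob <size>" header plus the content, independent of the file's path, so identical files in different bundle folders map to the same object. A small sketch:

```python
import hashlib

def git_blob_id(content: bytes) -> str:
    """Compute a Git blob object id the same way 'git hash-object' does."""
    header = b"blob %d\x00" % len(content)
    return hashlib.sha1(header + content).hexdigest()

# the same file committed under two different bundle-hash folders
a = git_blob_id(b'{"modelid": "acmee2000"}\n')
b = git_blob_id(b'{"modelid": "acmee2000"}\n')
print(a == b)  # → True
```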

Personally I think this shouldn't be part of the store, to avoid making the store more complex just for developer searches. E.g. a 10-line script could do the job, like dev-download-and-unpack-ddf-bundles.py, and grep -rn 'state/on' shows all bundles with that item.

More interesting cases

Q: How to get local user bundles which were uploaded to the store for testing in the official DDF repository at https://github.com/deconz-community/ddf? A: The user creates a PR in the repository. If the user doesn't have GitHub this could be done by any developer, e.g. unpack bundle, git add . && git commit. As mentioned above the bundle hash always stays the same, when the GitHub Action pushes to the store nothing happens as the content is already there. The important point here is that users could already test the bundle, it could also be marked as stable by signature. All without having to wait for the PR to be merged, which would be the bottleneck. PRs to be merged need to be reviewed, but even if they touch files of other DDFs, that would result in new bundles — these would be marked as testing automatically, not overwriting stable bundles.

Maybe create an issue with the DDF attached and a DDF-Upload label. A GitHub Action could unpack it on a new branch and make a PR from that branch to main. Then the discussion happens on that PR and not on the issue. That PR can also be used to upload changes, and the GitHub features show what's new, what changed, ...

Yeah, some streamlined process here would be nice.

JSON and Queries

While I understand the reasoning behind using JSON files to index bundles and search or filter them, note that it's more powerful and faster to query against a database. Just like the JSON files, a SQLite database file could be created automatically; it can be used in a browser (https://sql.js.org/#/?id=inside-the-browser) as well. For example, one ddf-store.db file to be downloaded by the browser, and queries like:

SELECT * FROM bundles INNER JOIN signatures USING (hash) WHERE modelid = 'acmee2000' AND signature = 'stable'

I don't know if GitHub will allow having a SQLite file in the repo.

As far as I know everything can be stored; it could also be kept in a Release thread.

(deCONZ will internally use something like that)

Yes, that's obvious.

Features which aren't so obvious

Details aside I really do like what this would implicitly bring us.

  • New or updated DDFs are completely separate from deCONZ releases (awesome).

So the deCONZ release won't have any DDFs? Or only a curated list that is used by most people?

For simplicity I'd propose that when a release is built, it just includes whatever is in the store and marked as official / stable (signed). deCONZ releases are built by scripts in a Docker environment; these could query the store the same way the store HTML page does.

  • Nothing can break (1), no forced global DDF updates for users as we have now (a deCONZ release is take it or leave it).

I think we should bundle some DDFs in the release too and propose that the user update from the bundled version. Maybe that's extra work for not much benefit.

From a UI perspective a user sees "there's an update for this device", later on perhaps with the option to enable auto-updates for a given modelid+manufacturername for bundles marked stable.

If a setup pairs a new device with a not-before-used bundle (which is part of the release), it should be assigned automatically. From there on it works like the Google/Apple app stores, just with bundles instead of apps: to update, press the update button :) plus the ability to switch to prior versions.

  • Users can decide on a per DDF / device base if they want to use a newer DDF.

In general newer is better.

DDFs already reduced breaking changes a lot compared to what we had with the C++-only approach, but they can still happen. That's why I like that the user is in full control and can decide to update or not (and revert).

Zehir commented 1 year ago

How to get DDFs from here to the store

Creating a bundle by script, GitHub Action or manually on a local machine is the same:

$ ddf-bundle create path/to/philips/acmee2000.json

Which creates the acmee2000_ae4cff.ddf bundle file.

  • This can be done by a GitHub Action for all DDFs in the repo, which are then pushed to the store.
  • For simplicity, the store only needs to save the bundle file with its hash as the file name.
  • If the bundle already exists, do nothing.
  • If it doesn't exist, save it and re-create the JSON files needed to do all searches over all bundle files. We can create as many JSON files as we like to simplify searching (a database would be much simpler here, but we can have JSON files to represent queries).

We will see which way is easiest.

A GitHub Action can bundle all DDFs on the main branch to a build branch, or publish them inside a Release thread.

I'm not 100% sure here, but the idea was that the .ddf bundle isn't stored in the DDF repository but pushed/uploaded to the store repository. A Release thread could be one option, or simply a directory bundles/<bundle-hash>.ddf.

The important point to note is that a bundle is always unique and can't be overwritten or destroyed, history is always preserved and users can easily switch between versions.

What happens with a DDF that turns out unstable or invalid after bundling? I think we should be able to delete those too.

Yeah, devs with admin rights can delete a .ddf file (perhaps later on from an admin UI); the rest is refreshed automatically when the JSON files are re-generated.

Interesting cases

Q: What happens when a user modifies a bundle and uploads it to the store? A: Search files are recreated and the modified version shows up, side by side with prior bundles for the device, ideally with a testing label.

I think I would like to have a separate repo for the user uploads. This is an easy way to separate them, and they will use another workflow. It's not a big issue to load 2 catalog files, and most users will probably use only verified DDFs. Edit: maybe not, see below.

Could be separated, but I think we can keep it simple here: from the store's perspective it's just a bunch of files, regardless of where they are uploaded from. The signatures already provide everything we need to know what's custom user stuff and what's official, stable, etc.

When a user exports a DDF from the REST API, will it be signed or unsigned?

I think the easier way to handle signatures is for the store to sign them, with different properties based on the user who edited / pushed the DDF content. This prevents leaking keys: there is only one unique key, and you check the signature metadata to know who made it, who approved it, ... In the repo all that data would be stored somewhere like a signature.json in the folder, which can't be modified by a PR unless it's made by us.
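A hedged sketch of that store-side signing idea: the store holds the only key and attaches metadata saying who made and approved a bundle. HMAC over the metadata stands in here for whatever real (likely asymmetric) signature scheme would actually be used; key, field names, and statuses are all assumptions:

```python
import hashlib
import hmac
import json

STORE_KEY = b"store-secret-key"  # hypothetical, known only to the store

def sign_bundle(bundle: bytes, author: str, status: str) -> dict:
    """Attach store-signed metadata (author, status) to a bundle hash."""
    meta = {
        "hash": hashlib.sha256(bundle).hexdigest(),
        "author": author,
        "status": status,  # e.g. 'stable' or 'testing'
    }
    payload = json.dumps(meta, sort_keys=True).encode()
    meta["signature"] = hmac.new(STORE_KEY, payload, hashlib.sha256).hexdigest()
    return meta

def verify(meta: dict) -> bool:
    """Recompute the signature over everything except the signature itself."""
    payload = json.dumps(
        {k: v for k, v in meta.items() if k != "signature"}, sort_keys=True
    ).encode()
    expected = hmac.new(STORE_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(meta["signature"], expected)
```

Because only the store can produce valid signatures, a signature.json in the repo can be trusted even if the rest of a PR came from an outside contributor; tampering with author or status invalidates it.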

We need a way to know that a new bundle is a modified version of an existing one and not a brand-new one. Maybe with a unique ID in the DDF file's desc sector.

I thought so too, but realized we already have all this data :) Consider the unique ID to be {modelid, manufacturername}; each bundle referring to this tuple is related and can be ordered by the timestamp it contains. Notably, this still works if a bundle for multiple model IDs is split up or extended.

Yes, but no, because of this:

And when it has multiple model IDs, will it be manuA_modelA_manuB_modelB? That will lead to long names and could be an issue if the path gets really long.

Q: What happens when a user uploads a bundle which already exists (no modifications)? A: Nothing, as the bundle hash doesn't change. Q: What happens if a user's bundle is uploaded to the store and tested, and later on the user creates a PR to https://github.com/deconz-community/ddf; once merged, wouldn't the GitHub Action re-upload / duplicate the bundle? A: Nothing happens; regardless of the source, the bundle has the same hash.

I would like bundle uploads to be PR-only; the PR conversation will handle all beta versions. When a bundle is considered stable, the PR is merged and the new bundle becomes part of the main process.

Tricky, if I understand this right. I'd like it to be possible to upload a bundle even before a PR, otherwise we get back the PR review bottleneck we currently suffer from. The PRs are useful to finally get the sources into the DDF repository, which should then also include the review process and discussion. Note this is purely "optional": the store also works with no DDF repository at all :) As a mental model, the DDF repository could contain only the good stuff.

There are multiple paths we can take:

Store everything on the main branch

I need to check the GitHub Probot API to see if we can upload a file from an HTTP call.

If I can (I mean, if the bot can):

Why this? Because we can limit user contributions on the repo without giving write access. And the purpose of the PR is that we can add comments on it to send commands to the bot or contribute to it, like @bot I validate that DDF.

Store verified DDFs on the main branch and user uploads on open PRs

Like before, but don't merge the PR until it's approved; the catalog will still be updated so these DDFs can be found from the store. The .ddf will be made by the bot and stored as a comment in the PR thread, on each change to the PR. The issue is that GitHub doesn't accept .ddf files, but it does accept them if renamed to .txt, or when the .ddf is inside a .zip file; the zip is lighter and can be unzipped in the store UI before sending it to the REST API. https://gildas-lormeau.github.io/zip.js/demos/demo-read-file.html
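On the bot/Action side, the zip workaround amounts to wrapping the .ddf so GitHub accepts the attachment and unwrapping it again (the browser side would use zip.js as linked above). A sketch, with file names assumed for illustration:

```python
import io
import zipfile

def wrap_ddf(name: str, bundle: bytes) -> bytes:
    """Wrap a .ddf bundle in an in-memory .zip for upload."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.writestr(name, bundle)
    return buf.getvalue()

def unwrap_ddf(zip_bytes: bytes) -> dict:
    """Extract all files (a zip may contain multiple DDFs)."""
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        return {n: zf.read(n) for n in zf.namelist()}

data = wrap_ddf("acmee2000_ae4cff.ddf", b"RIFF...bundle")
print(unwrap_ddf(data)["acmee2000_ae4cff.ddf"] == b"RIFF...bundle")  # → True
```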

In case we can't upload from a web request

We can create an issue with the .ddf in a zip (which may contain multiple DDFs), create the PR from that, and then close the issue.

There will be 2 pipelines, one for creating DDFs and one for editing DDFs. From the user's UI there won't be a visible difference.

Q: What happens if someone uploads a malware.ddf ? A: If the file isn't in the DDF bundle format it is rejected. Simple checks like RIFF format and valid data to be JSON/JS or whatever we allow are sufficient. The Javascript in a DDF always only runs in deCONZ sandboxed JS engine.

It's easy to check whether the data is valid JSON, but for other files I don't know if I can check that something is a valid Markdown or JavaScript file.

Overall I think simple checks are more than enough here; nothing in the bundle gets executed other than the JavaScript, which runs in a sandboxed environment (no filesystem/network access). If we see something fishy we can delete the bundle, same as we do with malicious wiki entries.

deCONZ already contains code to verify/test whether a DDF is ok (the DDF loader); we could separate this out and use it in the pipeline to avoid duplicated work.

Ok, seems fair enough.

Q: As Thomas pointed out, how can we search for DDFs/bundles which use a certain attribute like state/on? A: There are multiple ways. If it's desired to keep searchability through GitHub, each bundle can automatically be unpacked into a <bundle-hash>/... directory and the files committed; due to the hash there aren't collisions. Searching is then the same as currently. Note that in Git there aren't actually duplicated files between directories which contain the same file.

Git supports symlinks, but where should I store the first file: in one of the DDFs that uses that file, or in a generic folder?

The bundles would each be unpacked into their own folder, where the folder name is the bundle hash. When such a folder is committed to a repository, nothing extra like symlinks needs to be done: Git stores files via content hashes, so two folders with the same file actually point to the same data blob (the file content).

... but as mentioned before I think this shouldn't be part of the store.

Yes, I know, but I don't want the store to become a black box that is just a hosting service. And when somebody wants to contribute to it, it will be easier if everything is already unpacked.

Personally I think this shouldn't be part of the store, to avoid making the store more complex just for developer searches. E.g. a 10-line script could do the job, like dev-download-and-unpack-ddf-bundles.py, and grep -rn 'state/on' shows all bundles with that item.

More interesting cases

Q: How to get local user bundles which were uploaded to the store for testing in the official DDF repository at https://github.com/deconz-community/ddf? A: The user creates a PR in the repository. If the user doesn't have GitHub this could be done by any developer, e.g. unpack bundle, git add . && git commit. As mentioned above the bundle hash always stays the same, when the GitHub Action pushes to the store nothing happens as the content is already there. The important point here is that users could already test the bundle, it could also be marked as stable by signature. All without having to wait for the PR to be merged, which would be the bottleneck. PRs to be merged need to be reviewed, but even if they touch files of other DDFs, that would result in new bundles — these would be marked as testing automatically, not overwriting stable bundles.

Maybe create an issue with the DDF attached and a DDF-Upload label. A GitHub Action could unpack it on a new branch and make a PR from that branch to main. Then the discussion happens on that PR and not on the issue. That PR can also be used to upload changes, and the GitHub features show what's new, what changed, ...

Yeah, some streamlined process here would be nice.

JSON and Queries

While I understand the reasoning behind using JSON files to index bundles and search or filter them, note that it's more powerful and faster to query a database. Just like the JSON files, an SQLite database file could be created automatically, and it can be used in a browser as well: https://sql.js.org/#/?id=inside-the-browser. For example, one ddf-store.db file to be downloaded by the browser, with queries like:

SELECT * FROM bundles INNER JOIN signatures ON bundles.hash = signatures.hash WHERE modelid = 'acmee2000' AND signature = 'stable'
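The same query works unchanged against Python's built-in sqlite3, which is what a store-building script would use to generate ddf-store.db. A small end-to-end sketch (the schema, table, and column names are all hypothetical, not a defined format):

```python
import sqlite3

# Hypothetical index schema for ddf-store.db; every name here is an assumption.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE bundles (hash TEXT PRIMARY KEY, modelid TEXT, path TEXT);
CREATE TABLE signatures (hash TEXT, signature TEXT);
INSERT INTO bundles VALUES ('ae4cff', 'acmee2000', 'philips/acmee2000.json');
INSERT INTO signatures VALUES ('ae4cff', 'stable');
""")

# Find all stable-signed bundles for a given model id.
rows = con.execute("""
    SELECT b.hash, b.modelid
    FROM bundles b
    INNER JOIN signatures s ON s.hash = b.hash
    WHERE b.modelid = 'acmee2000' AND s.signature = 'stable'
""").fetchall()
print(rows)  # → [('ae4cff', 'acmee2000')]
```

In the browser, sql.js would run the identical SQL against the downloaded database file.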

I don't know if GitHub allows keeping an SQLite file in the repo.

As far as I know anything can be stored; it could also be kept in a Release Thread.

Because it will need preview data, I would like to keep it in the repo to have a stable URL, rather than pointing at something like the latest release.

(deCONZ will internally use something like that)

Yes, that's obvious.

Features which aren't so obvious

Details aside I really do like what this would implicitly bring us.

  • New or updated DDFs are completely separate from deCONZ releases (awesome).

So the deCONZ release won't ship any DDFs? Or only a curated list that is used by most people?

For simplicity I'd propose that when a release is built, it just includes whatever is in the store and marked as official / stable (signed). deCONZ releases are built by scripts in a Docker environment; these could query the store in the same way the store HTML page does.

Yes, but that could become hundreds of GET requests to fetch them all. Or maybe clone the repo and build what's needed; we will figure this out when the store is ready. Or store a zip with all stable DDFs somewhere, updated whenever one of the stable DDFs changes, which would also help when a user wants to update multiple stable DDFs at once.

  • Nothing can break (1), no forced global DDF updates for users as we have now (a deCONZ release is take it or leave it).

I think we should bundle some DDFs in the release too and offer the user the option to update from the bundle. Though maybe that's extra work for not much benefit.

From a UI perspective a user sees "there's an update for this device(s)", later on perhaps with the option to enable auto updates for a given modelid+manufacturername for bundles marked stable.

Auto updates would mean that it's deCONZ that does the update, not the UI.

If a setup pairs a new device with a bundle not used before (but which is part of the release), it should be assigned automatically. From there on it works like the Google/Apple App Store, just with bundles instead of apps: to update, press the update button :) plus the ability to switch back to prior versions.

  • Users can decide on a per DDF / device base if they want to use a newer DDF.

In general newer is better.

DDFs improved the breaking-changes situation a lot compared to what we had with the C++-only approach, but breakage can still happen. That's why I like that the user is in full control and can decide whether to update or not (and revert).

Yes, just display the update, let the user decide what they want to do, and add a .md file with a warning if there is something to warn about. It would be a two-click update: one click to see the details of the update and one to apply it. Or only one click, but pause if there is a warning message about a possible breaking change when updating from < 1.x of the DDF version. For that we will need some convention for the markdown notes / changelog.

Like a changelog in this style: https://keepachangelog.com/en/1.0.0/ and have a ### breaking_changes heading somewhere.
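The UI-side check for such a convention would be trivial. A sketch, assuming the heading convention above (the exact heading text is not decided, so the pattern accepts a couple of spellings):

```python
import re

def has_breaking_changes(notes: str) -> bool:
    """Return True if a bundle's markdown changelog contains a
    breaking-changes heading, e.g. '### Breaking Changes' or
    '### breaking_changes' (heading convention is an assumption)."""
    pattern = r"^###\s*breaking[ _]changes\b"
    return re.search(pattern, notes, re.IGNORECASE | re.MULTILINE) is not None

notes = """# Changelog
## [2.0.0]
### Breaking Changes
- 'state/on' item renamed
"""
assert has_breaking_changes(notes)
assert not has_breaking_changes("## [1.0.1]\n### Fixed\n- typo\n")
```

The update UI could then pause on the warning only when this returns True, keeping the happy path at one click.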

Zehir commented 1 year ago

And also, with the bot we could add a "Report an issue" button on the DDF page in the UI and have a pipeline like this: image

It would ping whoever contributed to that DDF. Yes, to upload a DDF you will need a GitHub account, but that's not an issue for me. The button would attach some data about the installation, like the deCONZ version, ... and if it turns out to be a core issue we will be able to transfer it to the REST-API repo.

Zehir commented 9 months ago

The store is now part of the WIP repo here: https://github.com/deconz-community/ddf-tools