Yukaii closed this issue 1 year ago
TLDR: Designing an architecture that sticks to the free tier all the way through is really hard.
Generate a personal access token with the `repo`, `read:org`, and `read:packages` scopes, copy it to the clipboard, then log in with it:

```shell
pbpaste | gh auth login --with-token
```
```shell
gh api \
  -H "Accept: application/vnd.github+json" \
  "/orgs/BlastLauncher/packages?package_type=npm"
# It takes 0.8 seconds to load; not sure if it's due to network latency...

gh api \
  -H "Accept: application/vnd.github+json" \
  "/orgs/BlastLauncher/packages/npm/todo-list/versions"
```
So there's no API to list all packages together with their versions Q_Q. For batch-upgrading extensions, I'll still need an API server to cache the package state from GitHub.
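Since the versions endpoint has to be hit once per package, listing everything is an N+1 enumeration. A minimal sketch of what the caching server would have to do on each refresh (the `ORG` value and helper names are just for illustration):

```shell
# Sketch: enumerate all npm packages in the org, then fetch each
# package's versions with a separate API call (this per-package
# fan-out is what motivates caching the results server-side).
ORG="BlastLauncher"

list_packages() {
  gh api -H "Accept: application/vnd.github+json" \
    "/orgs/$ORG/packages?package_type=npm" --jq '.[].name'
}

list_versions() {
  gh api -H "Accept: application/vnd.github+json" \
    "/orgs/$ORG/packages/npm/$1/versions" --jq '.[].name'
}

list_all() {
  # One request per package: N+1 calls for N packages
  for pkg in $(list_packages); do
    for ver in $(list_versions "$pkg"); do
      echo "$pkg@$ver"
    done
  done
}
```

Running `list_all` after `gh auth login` prints one `name@version` line per published version.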
Some resources that may be useful when creating the CLI with Node.js:
In the first thread, I said I would still need to track the last synced commit. It turns out that's not needed anymore, because the Raycast action uses tj-actions/changed-files, which has a since_last_remote_commit
option (search for INPUT_SINCE_LAST_REMOTE_COMMIT
in diff-sha.sh
). But we still need to deal with merging the commits into the fork: something like cherry-picking every commit in between onto the fork repo and pushing them at once.
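The cherry-pick idea can be sketched as a tiny helper (a rough sketch, not the actual sync logic; the ref names are assumptions, and conflict handling is left out):

```shell
# Replay every commit after `last_synced` from the upstream branch onto
# the current branch, so they can then be pushed to the fork in one go.
sync_commits() {
  local upstream_ref="$1"   # e.g. upstream/main
  local last_synced="$2"    # last commit already present on the fork
  # git expands the range oldest-to-newest, which is the order we want
  git cherry-pick "${last_synced}..${upstream_ref}"
}

# Usage (after `git remote add upstream https://github.com/raycast/extensions.git`):
#   git fetch upstream main
#   sync_commits upstream/main "$LAST_SYNCED"
#   git push origin main
```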
The first part is completed in sync-upstream.yml.
The second part is to build a blast-cli
that can build and publish changed Raycast extensions to GitHub Packages.
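Publishing to GitHub Packages means pointing the npm scope at GitHub's registry; a minimal `.npmrc` sketch (the `@blastlauncher` scope is an assumption for illustration):

```
# .npmrc — route the org scope to GitHub Packages
@blastlauncher:registry=https://npm.pkg.github.com
//npm.pkg.github.com/:_authToken=${GITHUB_TOKEN}
```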
Steps
Use a workflow to sync the repository to the forked repository blast/extension
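A minimal sketch of what such a scheduled sync workflow could look like (names, schedule, and steps are illustrative, not the actual sync-upstream.yml; conflict handling is omitted):

```yaml
name: sync-upstream
on:
  schedule:
    - cron: "0 * * * *"   # hourly; illustrative
  workflow_dispatch:

jobs:
  sync:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
        with:
          fetch-depth: 0   # full history so commits can be replayed
      - name: Replay upstream commits onto the fork
        run: |
          git config user.name "github-actions[bot]"
          git config user.email "github-actions[bot]@users.noreply.github.com"
          git remote add upstream https://github.com/raycast/extensions.git
          git fetch upstream main
          # Cherry-pick the new commits and push them at once
          git cherry-pick HEAD..upstream/main
          git push origin HEAD
```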
Possible solutions...
The Raycast team runs the "publish" action on every new commit to the main branch. Since it is a built-in action event on GitHub and is run by the workflow inside the repository, they can ensure the action runs for every commit. [^1]

If I manually or automatically mirror the raycast/extensions repository, the synchronization status (i.e., the last commit that was synced) must be stored somewhere, and generating the differences (i.e., determining which extensions need to be updated) will be more complex.
Alternatively, if I fork the raycast/extensions repository and replace the actions it runs, and handle git merging within the GitHub action context, I would still need to track the last synced commit [^2], then push each commit in between one by one to the forked repository, ensuring that every "publish" action is enqueued.
Additionally, if the synchronization process is run periodically as a workflow, I would need to ensure that only one synchronization workflow run is active at a time. [^3]

[^1]: However, there may be a race condition where, for example, commit 1 changes extension A and commit 2 also changes extension A. In this case, the publish action triggered by commit 1 might run after the action triggered by commit 2, causing the latest published extension to be reverted back to commit 1. But since pull requests are merged manually, this scenario is unlikely to occur. (maybe?)
[^3]: GitHub Actions can be configured to run only one job at a time (via a concurrency group), so it will be fine
[^2]: Can it be stored as JSON artifacts?
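For reference, the single-run guarantee mentioned in [^3] maps to a `concurrency` block in the workflow file, something like:

```yaml
# In the sync workflow: only one run in this group at a time;
# a newly queued run waits instead of cancelling the active one
concurrency:
  group: sync-upstream
  cancel-in-progress: false
```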