Currently, I just traverse out of the current project and into a shared resource repo, e.g.:

```
../../../../shared-openapi/
```

The IDE can resolve this just fine, but it can be a lot of `../../../`. Any shorthand version of this, though, would break the IDE's ability to resolve the files.. rock + hard place.
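For context, the refs end up looking something like this (paths hypothetical):

```yaml
schema:
  $ref: ../../../../shared-openapi/components/schemas/user.yml
```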
Yeah, personally that would be fine with me, but not everyone on the team clones all the projects.
We also considered cloning / fetching the raw file in a prebuild step, but it would be cool to have something that just auto-updates when things change.
Not sure if it should even be built into the core, but this can be achieved with a simple helper which calls `await exec('git fetch ...')` to take advantage of the local git config - it just means copying the helper into any project which reuses models. Will post it here in case it ends up being included by default, but I was hoping the json ref parser would handle it automagically (we tried - it doesn't).
You can fetch over a URL though... https://swagger.io/docs/specification/using-ref/ - but I guess going via https is the issue.
I think https is fine (didn't test), but with private repos the internal resolver doesn't know how to auth, so I believe it will fetch and receive a 404 instead of a model.
Yeaap.. 'twas what I was thinking.. it would need to be over SSH to work, then the user would pass keys, but that's not supported based on the docs link above.
But then what you're essentially talking about is git submodules, so not sure if it's worth writing a special helper for that.. git already does all this.
An update of the git submodules could easily be added to the build script in the package.json file, but... then I suppose this is pushing pointless boilerplate work onto the (unknown skill level) developer.. i.e. a larger scope for error.

... I can see this problem coming down the road at me in the next few months too 🤔 and I always lean on the side of "standardise it to reduce the chance of human error"... A simple update-all-git-submodules step would be trivial to add.. with an inclusion/exclusion CLI flag. Something like the sketch below.
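A minimal sketch of that wiring, assuming the shared specs live in a submodule and that the boats invocation is just illustrative:

```json
{
  "scripts": {
    "prebuild": "git submodule update --init --remote",
    "build": "boats -i ./src/index.yml -o ./build/api.yml"
  }
}
```

npm runs the `prebuild` script automatically before `build`, so the submodule would stay current on every build without anyone having to remember it.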
Trying to avoid git submodules - with a helper you could do something like this:
```js
// helpers/fetchGitFile.js
import { exec as execCb } from 'child_process';
import { promisify } from 'util';
const exec = promisify(execCb);

// file is something like build/src/components/schemas/somethingModel.yml
export const fetchGitFile = async (repo, file) => {
  const { stdout } = await exec(`git show origin/master:${file}`);
  return stdout;
};
```
Except instead of showing master of the current repo, I think you can use `ls-remote` or something similar. Then the "shared" model could do:

```yaml
type: object
schema:
  <$ fetchGitFile('patco/sweetModels', 'build/src/components/schemas/somethingElse.yml') $>
```
Since `exec('git ...')` will use the user's git config, it will have access to the remote file. I think you can also use the ssh protocol to view the single raw file if that doesn't work.
Would save storing things in a shared project or moving files around when it turns out something else needs them - it would behave the same way as a relative path into every repo you own, without having to clone and whatnot.
I'm certain it's possible; the main question is which command can / should be used. It also means copying the helpers folder across projects, but I end up doing that anyway for things like mixins or other little scripts.
EDIT: a quick google suggests this command might work too:

```sh
# https://stackoverflow.com/questions/1125476/retrieve-a-single-file-from-a-repository
# probably not supported on windows but... you know...
git archive --remote=git://git.foo.com/project.git HEAD:path/to/directory filename | tar -x
```
Interesting - GitHub doesn't allow it:

```sh
# fails with all kinds of fun stuff
git archive --remote=git@github.com:p-mcgowan/playground.git HEAD readme.md
```
However, on GitLab (and apparently Bitbucket) it works:

```sh
# works on my machine - creates the file on disk
git archive --remote=git@gitlab.com:org/path/to/repo.git HEAD readme.md | tar -x
```
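For what it's worth, that archive variant could be wrapped in the same helper shape - a sketch, with `fetchGitArchiveFile` as a hypothetical name, assuming tar's `-O` (`--to-stdout`) flag so the contents land on stdout instead of on disk:

```js
// helpers/fetchGitArchiveFile.js (hypothetical)
import { exec as execCb } from 'child_process';
import { promisify } from 'util';
const exec = promisify(execCb);

// Works against remotes that allow `git archive` (gitlab, apparently bitbucket);
// github rejects it, as noted above.
export const fetchGitArchiveFile = async (repo, file, ref = 'HEAD') => {
  const { stdout } = await exec(`git archive --remote=${repo} ${ref} ${file} | tar -xO`);
  return stdout;
};
```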
This also works for GitHub, but it's not pretty and doesn't avoid the clone (though `--depth 1` makes it definitely faster than a full one):

```sh
git clone --bare --no-checkout --depth 1 git@github.com:user/repo.git && cd repo.git && git show HEAD:readme.md
```
And apparently using svn works on GitHub...

```sh
# https://stackoverflow.com/questions/9609835/git-export-from-github-remote-repository
svn export https://github.com/user/project/trunk
```
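Presumably a single file can be exported the same way, since GitHub's svn bridge maps the default branch to `trunk` (untested here):

```sh
svn export https://github.com/user/project/trunk/readme.md
```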
Or, if you like the hacky solutions - just add the upstream as a temporary remote:

```sh
git remote add tmp-upstream git@github.com:private/repo.git
git fetch tmp-upstream
git show tmp-upstream/master:readme.md
git remote remove tmp-upstream
```
But that requires a full fetch
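If that dance ended up being the least-bad option for GitHub, it could at least hide behind the same helper shape - a sketch, with `fetchViaTmpRemote` as a hypothetical name; the try/finally keeps the temp remote from leaking if a step fails:

```js
// helpers/fetchViaTmpRemote.js (hypothetical)
import { exec as execCb } from 'child_process';
import { promisify } from 'util';
const exec = promisify(execCb);

export const fetchViaTmpRemote = async (repo, file, branch = 'master') => {
  await exec(`git remote add tmp-upstream ${repo}`);
  try {
    // still a full fetch, as noted - `--depth 1` might soften the blow
    await exec(`git fetch tmp-upstream ${branch}`);
    const { stdout } = await exec(`git show tmp-upstream/${branch}:${file}`);
    return stdout;
  } finally {
    // always clean up the temp remote, even if the fetch or show fails
    await exec('git remote remove tmp-upstream');
  }
};
```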
Hmm.. I'm not opposed to adding this sort of thing.. however, the dev experience would not be much fun, especially as the IDE would have no chance of resolving the refs without a custom boats IDE plugin.
No, that's way too rough for the core - I was hoping it would be simpler. This kind of hack is best served from a helper file, I think, as it both probably doesn't work on Windows and isn't super clean even on *nix. If GitHub allowed it, there would be a single more or less clean solution, but that threw a spanner in the works.
Yeap.. there's always symlinks too.. they get rid of the annoyance of npm packages or git submodules.. but they don't play too well on Windows 🤔
Well, if the whole project were a monorepo, or everyone cloned all the microservices locally, it wouldn't be an issue, but it would have been nice to find a more portable solution. For now, since this project is on GitLab, I think we will give the archive method a shot, but I'd love to find something more portable some time down the road.
Could probably close this, but would be cool if it eventually had a nice, clean solution.
Agreed - this doesn't feel like a boats solution at the moment (unless a helper is used).
In our microservice arch, we often find we are copy-pasting the same models from one place to another.
We should be able to cut down on a lot of error-prone duplication by specifying a URL (either a raw git URL or something else) so that on rebuilding / re-boats-ing the project, it automatically updates the schemas.
This way, we can have one definition from some external service, and just have to rebuild to get the latest, updated, and accurate models from shared services.