rothgar opened this issue 9 years ago
Is your intent to have Hound automatically discover the git repos and keep them up to date with polling as usual?
That would be ideal. Auto discovery would be important, although I've read that polling with file:// already has issues.
Use the file:// protocol. This allows you to index any local folder, so you can clone the repository locally and then reference the files directly. The downside here is that the polling to keep the repo up to date will not work.
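For reference, a file:// repo uses the same config.json shape as any other entry; a minimal sketch, with an illustrative name and path:

{
    "dbpath" : "data",
    "repos" : {
        "my-local-repo" : {
            "url" : "file:///home/user/code/my-local-repo"
        }
    }
}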
An alternative would be better GitLab support to auto-discover/poll all repos for a team, org, etc. I'm looking to index 700+ repos, which would obviously be cumbersome to input manually. I saw some of the generation scripts, but those would require cron jobs and service restarts whenever a new repo was added.
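For what it's worth, the repo list for a GitLab group can be pulled from its API and turned into config.json entries; a rough sketch, assuming GitLab's v4 API, a group called mygroup, and jq installed (with 700+ repos you would also need to walk the paginated responses, since per_page caps at 100):

# Print one config.json repo entry per project in the group.
curl -s --header "PRIVATE-TOKEN: $GITLAB_TOKEN" \
  "https://gitlab.example.com/api/v4/groups/mygroup/projects?per_page=100" |
  jq -r '.[] | "\"\(.path)\" : { \"url\" : \"\(.ssh_url_to_repo)\" },"'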
+1 for automatic discovery of all repositories from local FS recursively!
It might be a CLI argument like houndd --discover file://path/to/dir, which would generate the default config.json in the existing format with a listing of all repos found.
The same approach might later work for discovering all repositories of a particular GitHub org, e.g. houndd --discover https://github.com/YourOrganization/
This is similar to #13 - it would be good to come up with a solution that solves both cases.
As a note to other users: locally, I have searched my "code" directory for git URLs like this:
find $HOME/code -name .git -type d -prune | xargs -n1 -P4 -I '{}' git --git-dir='{}' config --get 'remote.origin.url' | sort
which outputs something like this:
git@github.com:andxyz/.dotfiles.git
git@github.com:mislav/dotfiles.git
Then, with some text manipulation, I created the required config.json file.
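For anyone curious, that text manipulation can also be a one-liner; a sketch that assumes remote URLs ending in name.git (the trailing comma on the last entry, plus the surrounding header and footer, still need to be added by hand):

find $HOME/code -name .git -type d -prune |
  xargs -n1 -P4 -I '{}' git --git-dir='{}' config --get 'remote.origin.url' |
  sort -u |
  awk '{url=$0; name=url; sub(/.*[\/:]/, "", name); sub(/\.git$/, "", name);
        printf "    \"%s\" : { \"url\" : \"%s\" },\n", name, url}'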
+1, this would be really useful. I was surprised this option doesn't already exist. Here is my Bash script as a workaround:
echo '{' > config.json
echo '"max-concurrent-indexers" : 2,' >> config.json
echo '"dbpath" : "data",' >> config.json
echo '"repos" : {' >> config.json
first=true
for f in $(find /home/developer -maxdepth 2 -type d); do
  # Match bare repos (HEAD at the top level) as well as working clones
  # (HEAD inside .git/); for a clone, use the parent directory for the
  # name and URL so entries don't all end up named ".git".
  if [ -f "$f/HEAD" ]; then
    if [ "$(basename "$f")" = ".git" ]; then
      f=$(dirname "$f")
    fi
    bn=$(basename "$f")
    if [ "$first" = "true" ]; then
      first=false
    else
      echo "," >> config.json
    fi
    echo "\"$bn\" : {\"url\" : \"file://$f\"}" >> config.json
  fi
done
echo '}}' >> config.json
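Note that houndd only picks up config.json on startup (hence the cron/restart complaint above), so restart it after regenerating the file; it's also worth sanity-checking the output first, e.g.:

python -m json.tool config.json > /dev/null && echo "config.json is valid JSON"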
If I have a folder containing all my repos (local, or on shared network space using GitLab or similar), where each subdirectory is a different git repo, I would hope to be able to use a config along the following lines to index all of them at once, including newly added repos. This would help with managing dozens or hundreds of repos that would otherwise need to be listed in config.json by hand.
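For illustration only, the kind of entry being wished for might look like this (hypothetical syntax; Hound does not support wildcards in repo URLs today):

"repos" : {
    "*" : { "url" : "file:///home/developer/repos/*" }
}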