Closed. mihaigalos closed this 1 month ago.
You need https://github.com/bakhirev/assayo-crawler (or https://hub.docker.com/r/bakhirev/assayo-crawler). At the moment I haven't translated the docs into English. It is a NodeJS server: it walks through a list of repositories and collects one log file (or produces many log files).
Please let us know if you managed to solve the problem with the Crawler service. I'm working on improving it.
Hi @bakhirev,
Google Translate works for websites, too. So no language problem. 😄
I just want to read an input folder containing some subfolders, generate a log which I can then upload to the UI. Seems the crawler wants me to create a JSON of where to download from, etc.
I mean I already have the folders and I mount them to the input via docker:
```bash
pushd $(mktemp -d)
docker run --rm -it --name assayo-crawler -p 8091:80 \
  --mount type=bind,source=/Users/foo/projectsNG,destination=/usr/src/assayo_crawler/input \
  --mount type=bind,source=$(pwd),destination=/usr/src/assayo_crawler/output \
  bakhirev/assayo-crawler
```
Can you simplify the crawler to look in the input and create the log without any clicking in the crawler UI?
> Seems the crawler wants me to create a JSON of where to download from, etc.

Yes.
> Can you simplify the crawler to look in the input and create the log without any clicking in the crawler UI?

Yes, I can. I've taken this task on, but not for this month. I'm currently editing a Python package, reviewing edits from other people, and redesigning the site. Maybe in November I'll get back to working on the Crawler and do this.
It seems to me that for this case it's worth writing a bash command that works without the Crawler. I'll think about it this weekend and see if there's a way to do it in a couple of lines.
@mihaigalos, specially for you, my friend )) Download this txt file, rename it from `.txt` to `.js`, and save it as `/Users/foo/mihaigalos.js`. Then run:

```bash
node mihaigalos.js
```

The log will be written to `/Users/foo/log.txt`.
The script searches only one level of subdirectories. If you need it to be recursive, write to me. Did that solve the problem?
Hi,
Awesome stuff. This works well for a monorepo. Any plans to support a multi-repo setup? I tried the following, but maybe that's not the right approach here; it doesn't work:
```bash
#!/bin/bash

run_for_one_repo() {
  repo=$1
  subfolder="projects/$repo"
  git remote add "$repo" ../"$repo"
  git fetch "$repo"
  # Fetch LFS objects
  git checkout "$repo/main" -- .gitattributes
  git lfs fetch "$repo" --all
  # Using subtree to add and merge, preserving full history
  git subtree add --prefix="$subfolder" "$repo/main"
}

# Define the list of repositories
repos=(
  'foo-one'
  'bar-two'
)

git init
git lfs install

for repo in "${repos[@]}"; do
  run_for_one_repo "$repo"
done
```