Ellis-808 opened this issue 1 year ago
Had the same problem
Well, you would need a script that fetches all the .txt files (via wget or git clone), loops through them, concatenates them into one .txt file, and finally pushes that file to a repository.
Just a little sketch:
#!/bin/bash
set -euo pipefail
# Set variables
REPO_PATH="/path/to/your/repo"
GITHUB_USER="your-username"
GITHUB_REPO_URL="https://github.com/$GITHUB_USER/your-repo-name.git"
GITHUB_SOURCE_REPOS=("source-repo-1" "source-repo-2" "source-repo-3")
OUTPUT_FILE="all_files.txt"
# Change to the repository directory
cd "$REPO_PATH"
# Start from an empty output file so reruns don't duplicate entries
: > "$OUTPUT_FILE"
# Loop through each source repository and append its .txt files to the output file
for source_repo in "${GITHUB_SOURCE_REPOS[@]}"; do
    git clone --depth 1 "https://github.com/$GITHUB_USER/$source_repo.git" temp_dir
    cat temp_dir/*.txt >> "$OUTPUT_FILE"
    rm -rf temp_dir
done
# Add the output file to the Git staging area
git add "$OUTPUT_FILE"
# Commit and push the output file to the target GitHub repository
git commit -m "Added all .txt files to $OUTPUT_FILE"
git push "$GITHUB_REPO_URL" master
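If cloning every repository feels too heavy, the wget approach works too: pull each repository's raw domain.txt straight from raw.githubusercontent.com and de-duplicate the result. Just a rough sketch; the branch name ("main") and the exact file path are assumptions you would adjust to the real repositories:
#!/bin/bash
set -euo pipefail
GITHUB_USER="your-username"
GITHUB_SOURCE_REPOS=("source-repo-1" "source-repo-2" "source-repo-3")
OUTPUT_FILE="all_domains.txt"
# Start from an empty output file so reruns don't duplicate entries
: > "$OUTPUT_FILE"
for source_repo in "${GITHUB_SOURCE_REPOS[@]}"; do
    # -qO- writes the downloaded file to stdout; append it to the combined list.
    # Branch ("main") and file name ("domain.txt") are placeholders here.
    wget -qO- "https://raw.githubusercontent.com/$GITHUB_USER/$source_repo/main/domain.txt" >> "$OUTPUT_FILE"
done
# Drop duplicate domains before publishing the combined list
sort -u "$OUTPUT_FILE" -o "$OUTPUT_FILE"
The git add / commit / push steps would then be the same as in the clone version above.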
I use pihole to filter DNS requests on my network, and I was wondering whether it would be possible to publish a publicly available, maintained list of all the domains discovered across all published investigations. Currently I pull the domain lists using the raw.githubusercontent link to each domain.txt file in every repository, which works, but it would be convenient to only have to use one list containing all discovered domains that is periodically updated as new investigations are published.
Thank you for the consideration!
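For the periodic-update part, even a simple cron entry on whichever machine maintains the combined list would be enough. A minimal sketch, assuming the combining script above is saved at the placeholder path /usr/local/bin/combine_domains.sh and is executable:
# Run the combine-and-push script daily at 03:00; paths are placeholders.
0 3 * * * /usr/local/bin/combine_domains.sh >> /var/log/combine_domains.log 2>&1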