# rcmltb

An rclone Telegram bot to mirror-leech and copy between many cloud services.

Telegram channel: https://t.me/rcmltb
License: GNU General Public License v3.0

An Rclone Mirror-Leech Telegram Bot to transfer files to and from many clouds. Based on mirror-leech-telegram-bot, with rclone support plus other features and changes added on top of the base code.

NOTE: The base repository has recently added its own rclone implementation.

Features:

- Rclone
- Others

Commands for the bot (set through @BotFather):

mirror - or /m Mirror to selected cloud
mirror_batch - or /mb Mirror Telegram files/links in batch to cloud
mirror_select - or /ms Select a fixed cloud/folder for mirror
leech - or /l Leech from cloud/link to Telegram
leech_batch - or /lb Leech Telegram files/links in batch to Telegram
ytdl - or /y Mirror yt-dlp supported link
ytdl_leech - or /yl Leech yt-dlp supported link
files - or /bf Bot configuration files
debrid - Debrid Manager
rcfm - Rclone File Manager
copy - Copy from cloud to cloud
clone - Clone gdrive link file/folder
count - Count file/folder from gdrive link
user_setting - User settings
own_setting - Owner settings
rss - RSS feed
tmdb - Search titles
cleanup - Clean cloud trash
cancel_all - Cancel all tasks
storage - Cloud details
serve - Serve cloud as web index
sync - Sync two clouds
torrsch - Search for torrents
status - Status message of tasks
stats - Bot stats
shell - Run commands in shell
log - Bot log
ip - Show IP
ping - Ping bot
restart - Restart bot

## How to deploy?

### 1. Installing requirements

Clone the repository:

```shell
git clone https://github.com/Sam-Max/rcmltb rcmltb/ && cd rcmltb
```

Install Docker by following the official Docker docs, then install Python and the CLI requirements:

```shell
sudo pacman -S docker python
pip3 install -r requirements-cli.txt
```
### 2. Set up config file

- Mandatory Fields
- Optional Fields

### 3. Deploying with Docker

Build and run the Docker image:

```shell
sudo docker build . -t rcmltb
sudo docker run -p 80:80 -p 8080:8080 rcmltb
```

To list, stop, and clean up containers and images:

```shell
sudo docker ps
sudo docker stop <container-id>
sudo docker container prune
sudo docker image prune -a
```
### 4. Deploying using docker-compose

NOTE: If you want to use a port other than 80 (torrent file selection) or 8080 (rclone serve), change it in docker-compose.yml.

```shell
sudo apt install docker-compose
```
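The port mappings live under the service's `ports` key. A minimal docker-compose.yml sketch (the service name and layout here are assumptions — check the repo's actual file before editing):

```yaml
services:
  rcmltb:
    build: .
    ports:
      - "80:80"      # torrent file selection
      - "8080:8080"  # rclone serve
```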

## Generate Database

1. Go to `https://mongodb.com/` and sign up.
2. Create a Shared Cluster (free).
3. Add a `username` and `password` for your db and click on `Add my current IP Address`.
4. Click on `Connect`, then press `Connect your application`.
5. Choose `Driver` **python** and `version` **3.6 or later**.
6. Copy your `connection string` and replace `<password>` with the password of your user, then press close.
7. Go to the `Network Access` tab, click the edit button, click `Allow access from anywhere`, and confirm.
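If your password contains special characters, it must be percent-encoded before it goes into the connection string. A small Python sketch of building the URI (the helper name and host are hypothetical, not part of the bot):

```python
from urllib.parse import quote_plus

def build_mongo_uri(user: str, password: str, host: str) -> str:
    # Percent-encode the password so characters like '@' or ':'
    # don't break the URI.
    return (
        f"mongodb+srv://{user}:{quote_plus(password)}@{host}"
        "/?retryWrites=true&w=majority"
    )

print(build_mongo_uri("botuser", "p@ss:word", "cluster0.example.mongodb.net"))
# → mongodb+srv://botuser:p%40ss%3Aword@cluster0.example.mongodb.net/?retryWrites=true&w=majority
```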

------

## How to create rclone config file

**Check this YouTube video (not mine, credits to the author):**
<p><a href="https://www.youtube.com/watch?v=Sp9lG_BYlSg"> <img src="https://img.shields.io/badge/See%20Video-black?style=for-the-badge&logo=YouTube" width="160"/></a></p>

**Notes**:
- When you create the rclone.conf file, add at least two remotes if you want to copy from cloud to cloud.
- On an Android phone, you can use the [RCX app](https://play.google.com/store/apps/details?id=io.github.x0b.rcx&hl=en_IN&gl=US) to create the rclone.conf file. Use the "Export rclone config" option in the app menu to get the config file.
- Rclone supported providers:
  > 1Fichier, Amazon Drive, Amazon S3, Backblaze B2, Box, Ceph, DigitalOcean Spaces, Dreamhost, **Dropbox**,   Enterprise File Fabric, FTP, GetSky, Google Cloud Storage, **Google Drive**, Google Photos, HDFS, HTTP, Hubic, IBM COS S3, Koofr, Mail.ru Cloud, **Mega**, Microsoft Azure Blob Storage, **Microsoft OneDrive**, **Nextcloud**, OVH, OpenDrive, Oracle Cloud Storage, ownCloud, pCloud, premiumize.me, put.io, Scaleway, Seafile, SFTP, **WebDAV**, Yandex Disk, etc. **Check all providers on official site**: [Click here](https://rclone.org/#providers).
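For cloud-to-cloud copy you need at least two remotes defined in the same file. A minimal rclone.conf sketch (remote names and values are placeholders — generate real entries with `rclone config`):

```ini
[gdrive]
type = drive
scope = drive
token = {"access_token":"...","refresh_token":"..."}

[onedrive]
type = onedrive
token = {"access_token":"..."}
drive_id = ...
drive_type = personal
```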

## Getting Google OAuth API credential file and token.pickle

**NOTES**
- You need an OS with a browser.
- Windows users should install python3 and pip; guides for installing and using both are easy to find online.
- You can ONLY open the link generated by `generate_drive_token.py` in a local browser.

1. Visit the [Google Cloud Console](https://console.developers.google.com/apis/credentials)
2. Go to the OAuth Consent tab, fill it, and save.
3. Go to the Credentials tab and click Create Credentials -> OAuth Client ID
4. Choose Desktop and Create.
5. Publish your OAuth consent screen App to prevent **token.pickle** from expiring
6. Use the download button to download your credentials.
7. Move that file to the root of rclone-tg-bot, and rename it to **credentials.json**
8. Visit [Google API page](https://console.developers.google.com/apis/library)
9. Search for Google Drive API and enable it
10. Finally, run the script to generate the **token.pickle** file for Google Drive:

```shell
pip3 install google-api-python-client google-auth-httplib2 google-auth-oauthlib
python3 generate_drive_token.py
```

------

## Bittorrent Seed

- Using the `-d` argument alone will fall back to the global options of aria2c or qbittorrent.

## Qbittorrent

- Global options: `GlobalMaxRatio` and `GlobalMaxSeedingMinutes` in qbittorrent.conf; `-1` means no limit, but you can still cancel seeding manually.
  - **NOTE**: Don't change `MaxRatioAction`.
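For reference, these options live in qBittorrent's configuration file. A sketch of the relevant section (key paths vary between qBittorrent versions, so verify against your own qBittorrent.conf rather than copying this verbatim):

```ini
[BitTorrent]
; -1 disables the corresponding limit
Session\GlobalMaxRatio=-1
Session\GlobalMaxSeedingMinutes=-1
```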

## Using Service Accounts to avoid user rate limit [For Google Drive Remotes]

> For Service Accounts to work, you must set `USE_SERVICE_ACCOUNTS = "True"` in the config file or environment variables.
>**NOTE**: Using Service Accounts is only recommended for Team Drive.

### 1. Generate Service Accounts. [What is Service Account?](https://cloud.google.com/iam/docs/service-accounts)

**Warning**: Abuse of this feature is not the aim of this project, and we do **NOT** recommend creating a lot of projects; just one project with 100 SAs gives you plenty of use. Overuse might also get your projects banned by Google.

>**NOTE**: If you have created SAs in the past with this script, you can also just re-download the keys by running:

```shell
python3 gen_sa_accounts.py --download-keys $PROJECTID
```

>**NOTE:** 1 Service Account can copy around 750 GB a day, and 1 project can make 100 Service Accounts, so you can copy up to 75 TB a day.

#### Two methods to create service accounts
Choose one of these methods

##### 1. Create Service Accounts in an existing Project (Recommended Method)

- List your project IDs:

```shell
python3 gen_sa_accounts.py --list-projects
```

- Enable the required services automatically:

```shell
python3 gen_sa_accounts.py --enable-services $PROJECTID
```

- Create Service Accounts in the current project:

```shell
python3 gen_sa_accounts.py --create-sas $PROJECTID
```

- Download the Service Account keys into the accounts folder:

```shell
python3 gen_sa_accounts.py --download-keys $PROJECTID
```

##### 2. Create Service Accounts in a New Project

```shell
python3 gen_sa_accounts.py --quick-setup 1 --new-only
```

A folder named accounts will be created containing the keys for the Service Accounts.

### 2. Add Service Accounts

#### Two methods to add service accounts
Choose one of these methods

##### 1. Add Them To Google Group then to Team Drive (Recommended)
- Enter the accounts folder:

```shell
cd accounts
```

- Grab the emails from all accounts into an emails.txt file, which will be created in the accounts folder.
- For Windows, using PowerShell:

```powershell
$emails = Get-ChildItem .\*.json | Get-Content -Raw | ConvertFrom-Json | Select -ExpandProperty client_email >> emails.txt
```

- For Linux:

```shell
grep -oPh '"client_email": "\K[^"]+' *.json > emails.txt
```

- Leave the accounts folder:

```shell
cd ..
```

Then add the emails from emails.txt to a Google Group, add this Google Group to your Shared Drive and promote it to manager, and delete the emails.txt file from the accounts folder.
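As a cross-platform alternative to the PowerShell and grep one-liners above, the same extraction can be sketched in Python (the helper name is hypothetical, not part of the repo's scripts):

```python
import glob
import json
import os

def collect_emails(accounts_dir: str) -> list[str]:
    """Collect client_email from every service-account key file."""
    emails = []
    for path in sorted(glob.glob(os.path.join(accounts_dir, "*.json"))):
        with open(path) as f:
            emails.append(json.load(f)["client_email"])
    return emails

if __name__ == "__main__":
    # Write one email per line, like the shell one-liners do.
    with open("emails.txt", "w") as out:
        out.write("\n".join(collect_emails("accounts")) + "\n")
```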

##### 2. Add Them To Team Drive Directly
- Run:

```shell
python3 add_to_team_drive.py -d SharedTeamDriveSrcID
```

------

## Yt-dlp and Aria2c Authentication Using .netrc File
To use your premium accounts in yt-dlp, or for protected Index Links, create a `.netrc` file (not `netrc`). This file will be hidden, so enable viewing hidden files to edit it after creation. Use the following format:

Format:

```
machine host login username password my_password
```

Example:

```
machine instagram login user.name password mypassword
```


**Instagram Note**: You must log in even if you only want to download public posts, and after the first try you must confirm that it was you who logged in from a different IP (you can confirm from the phone app).

**Youtube Note**: For `youtube` authentication use [cookies.txt](https://github.com/ytdl-org/youtube-dl#how-do-i-pass-cookies-to-youtube-dl) file.

With Aria2c you can also use the bot's built-in feature, with or without a username. Here is an example for an index link without a username:

```
machine example.workers.dev password index_password
```

Here, host is the name of the extractor (e.g. instagram, Twitch). Multiple accounts for different hosts can be added, each separated by a new line.

-----

## Donations

[![ko-fi](https://ko-fi.com/img/githubbutton_sm.svg)](https://ko-fi.com/sammax09)

-----

## Bot Screenshot: 

<img src="https://github.com/Sam-Max/rcmltb/raw/master/screenshot.png" alt="button menu example">

-----