| | all | active | active, ping < 1000 ms | active, ping < 1500 ms | active, ping < 1000 ms and no Google 403 |
| --- | --- | --- | --- | --- | --- |
| Row URLs | subscription link | subscription link | ask me to provide if you need it | ask me to provide if you need it | ask me to provide if you need it |
| Xray JSON | ask me to provide if you need it | JSON configs | JSON configs | JSON configs | JSON configs |
| Xray JSON FULL | ask me to provide if you need it | JSON config file | JSON config file | JSON config file | JSON config file |
| Clash Meta | provider link | ask me to provide if you need it | provider link | provider link | ask me to provide if you need it |
This project grabs fresh proxies from Telegram channels and tests them by real-delay ping. A Telegram bot listens to the channels you provide (even private ones), parses new proxies, and commits them to your repo.
The main differences between this project and others are:

- It uses the Telegram MTProto API framework and acts exactly like your real account. On the first run it will ask you to sign in to your account, then it listens to the channels provided in the env. Because of that, it can grab proxies from anywhere you have joined with your real account, even private channels (PVs) :)
To get started with the MTProto API, you need an `api_id` and an `api_hash`, which I couldn't obtain easily. If you can't create an `api_id` yourself either, ask someone to create one for you. (God bless Telegram.)

On each new message that contains a proxy URL, `proxies_row_url.txt` is updated in real time.
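The extraction step can be sketched with a simple regex over the message text. This is a minimal sketch, not the project's exact code: the scheme list and pattern are assumptions.

```python
import re

# Illustrative pattern: grab URLs for common proxy schemes from a message.
# The scheme list here is an assumption, not the project's exact one.
PROXY_RE = re.compile(r"\b(?:vless|vmess|trojan|ss)://\S+")

def extract_proxies(message_text: str) -> list[str]:
    """Return every proxy URL found in a Telegram message."""
    return PROXY_RE.findall(message_text)

text = "fresh servers:\nvless://uuid@example.com:443?security=tls#node1\nenjoy"
print(extract_proxies(text))
```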
You can use `proxies_row_url.txt` as a subscription link in your client app (V2rayNG, V2rayN, etc.) :)

This part of the project decodes a proxy URL and converts it into a Python class that can be manipulated. Because of this part, only `vless` is supported for now. I couldn't find such a parser anywhere. Did really nobody write one before? :| If you know of a repo that has done this, let me know, thanks.
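As a rough sketch of what such a class can look like, the standard library's `urllib.parse` already does most of the work. The `VlessProxy` dataclass and its field names here are hypothetical, not the project's actual implementation:

```python
from dataclasses import dataclass
from urllib.parse import urlparse, parse_qs, unquote

@dataclass
class VlessProxy:
    # Hypothetical minimal model of a vless:// URL
    uuid: str
    host: str
    port: int
    params: dict
    remark: str

def parse_vless(url: str) -> VlessProxy:
    u = urlparse(url)
    if u.scheme != "vless":
        raise ValueError(f"not a vless URL: {url}")
    return VlessProxy(
        uuid=u.username or "",
        host=u.hostname or "",
        port=u.port or 443,
        params={k: v[0] for k, v in parse_qs(u.query).items()},
        remark=unquote(u.fragment),
    )

p = parse_vless("vless://1234-abcd@example.com:443?security=tls&type=ws#my%20node")
print(p.host, p.port, p.params["type"], p.remark)
```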
Relying only on the ping of the server on a given port is not a correct way to test grabbed proxies. Instead, this part temporarily runs an xray-core instance, which tries to GET a simple HTML page over a real connection through the proxy. Nobody seems to have written this before either. Come on, guys.
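The idea can be sketched as follows. Assume xray-core has already been started with a local HTTP/SOCKS inbound; the checker then times a real GET through that inbound. The function name, test URL, and port below are illustrative assumptions, not the project's actual code:

```python
import time
import urllib.request

def real_delay_ms(proxy_url: str,
                  test_url: str = "http://cp.cloudflare.com/",
                  timeout: float = 5.0):
    """Time a real GET through the proxy; return milliseconds, or None on failure."""
    opener = urllib.request.build_opener(
        urllib.request.ProxyHandler({"http": proxy_url, "https": proxy_url})
    )
    start = time.monotonic()
    try:
        with opener.open(test_url, timeout=timeout) as resp:
            resp.read()  # force the body through the tunnel
    except OSError:
        return None  # dead proxy, refused connection, or timeout
    return (time.monotonic() - start) * 1000

# e.g. with xray-core exposing an HTTP inbound on 127.0.0.1:10809 (illustrative):
# delay = real_delay_ms("http://127.0.0.1:10809")
```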
`checkProxies.py` is run every hour by the GitHub runner and checks which of the grabbed proxies are active. It then sorts those proxies by real-delay ping and saves them as JSON in `proxies_active.txt`.
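The sort-and-save step reduces to something like this. The result shape and field names are illustrative, not the project's actual schema:

```python
import json

# Illustrative checker output: (proxy_url, real_delay_ms), None meaning dead
results = [
    ("vless://a@host1:443", 812.5),
    ("vless://b@host2:443", 240.1),
    ("vless://c@host3:443", None),
]

# Drop dead proxies, sort the rest by real delay, and serialize as JSON
active = sorted((r for r in results if r[1] is not None), key=lambda r: r[1])
payload = [{"url": url, "real_delay_ms": delay} for url, delay in active]
print(json.dumps(payload, indent=2))
```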
`cleanProxiesRowUrl.py` is run every 12 hours by the GitHub runner and removes all proxy URLs that are not present in `proxies_active.txt`, so the URL list always stays clean.
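The cleanup itself is a simple membership filter. The function and variable names here are illustrative:

```python
def clean_raw_urls(raw_urls, active_urls):
    """Keep only the URLs that are still active, preserving the original order."""
    active = set(active_urls)
    return [u for u in raw_urls if u in active]

raw = ["vless://a@h:1", "vless://b@h:2", "vless://c@h:3"]
print(clean_raw_urls(raw, {"vless://b@h:2"}))
```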
To commit new proxies to your repo, you need a GitHub token, so that the bot is able to commit new changes such as newly grabbed proxies.
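Wired together, the scheduled check and the token-based commit might look like this GitHub Actions workflow. This is a sketch, assuming the default `GITHUB_TOKEN` with `contents: write` permission; the schedule, commit message, and committed file are illustrative:

```yaml
name: check-proxies
on:
  schedule:
    - cron: "0 * * * *"  # every hour
permissions:
  contents: write
jobs:
  check:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: python checkProxies.py
      - run: |
          git config user.name "github-actions"
          git config user.email "github-actions@users.noreply.github.com"
          git add proxies_active.txt
          git commit -m "update active proxies" || echo "nothing to commit"
          git push
```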
I also explained how to get an `api_id`
here