Open rfve opened 1 year ago
Yes, there is a limit set by Telegram. I am not sure, but if you can track the last scraped member's id and iterate from that id in the next cycle of scraping, then it might be possible.
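Roughly, the idea could look like this (just a sketch; `seen_ids.json` is a hypothetical progress file, and I have not verified that resuming this way actually gets past the cap):

```python
import json
import os

SEEN_FILE = 'seen_ids.json'  # hypothetical file that persists progress

# Load the ids collected in earlier scraping cycles, if any.
if os.path.exists(SEEN_FILE):
    with open(SEEN_FILE) as f:
        seen = set(json.load(f))
else:
    seen = set()

# `channel` and a connected sync Telethon `client` are assumed
# to come from the surrounding script.
new_members = []
for user in client.iter_participants(channel):
    if user.id in seen:
        continue  # already scraped in a previous cycle
    seen.add(user.id)
    new_members.append(user)

# Save progress so the next cycle skips everyone collected so far.
with open(SEEN_FILE, 'w') as f:
    json.dump(list(seen), f)
```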
https://github.com/LonamiWebs/Telethon/issues/580
How do I apply the solution here?
Yeah, this can be one solution. According to the thread you provided, you can implement the solution in scraper.py at line 85.
I don't know Python very well. Can you update the code?
Replace line 85 in the scraper.py file with this code:
```python
queryKey = ['a', 'b', 'c', 'd', 'e', 'f', 'g', 'h', 'i', 'j', 'k', 'l', 'm',
            'n', 'o', 'p', 'q', 'r', 's', 't', 'u', 'v', 'w', 'x', 'y', 'z']
all_participants = []
channel = 'someGroupName'

# Search the member list once per letter; each one-letter query is
# subject to its own result cap.
for key in queryKey:
    offset = 0
    limit = 100
    while True:
        participants = client(GetParticipantsRequest(
            channel, ChannelParticipantsSearch(key), offset, limit, hash=0))
        if not participants.users:
            break
        for user in participants.users:
            try:
                # Keep the user only under the key that matches the first
                # letter of their first name, to avoid duplicates.
                if re.findall(r"\b[a-zA-Z]", user.first_name)[0].lower() == key:
                    all_participants.append(user)
            except (IndexError, TypeError):
                pass  # first_name is None or has no ASCII letters
        offset += len(participants.users)
        print(offset)
```
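The idea behind this approach is that each single-letter search is a separate query with its own result cap, so the 26 searches together can return far more members than one query alone, while the first-letter check keeps a member from being appended under more than one key.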
Unfortunately, it did not work. :(
The error message is this:
```
Traceback (most recent call last):
  File "C:\tg\TelegramScraper-master\scraper.py", line 94, in
```
Add this import: `from telethon.tl.functions.channels import GetParticipantsRequest`, and it should solve the error.
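For completeness, if the snippet above is used as-is, the top of scraper.py needs both the request and the search filter:

```python
from telethon.tl.functions.channels import GetParticipantsRequest
from telethon.tl.types import ChannelParticipantsSearch
```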
I've reorganized the code. This part needs to be corrected so it comes from the main query: `channel = 'hotdeals'`.
Secondly, it does not write the scraped members to the members.csv file. What could be the reason for this?
The code is:
Once all the members are scraped, try printing the all_participants list with `print(all_participants)`, and also the length of this list with `print(f'Scraped Total Members : {len(all_participants)}')`. If there are members in this list, then we can track down why they are not being written to the CSV file.
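For reference, here is a minimal sketch of dumping the list to members.csv once it is confirmed non-empty (the column layout here is my assumption; the actual columns in scraper.py may differ):

```python
import csv

# Minimal sketch: write the scraped Telethon User objects to members.csv.
with open('members.csv', 'w', encoding='utf-8', newline='') as f:
    writer = csv.writer(f)
    writer.writerow(['username', 'user id', 'name'])
    for user in all_participants:
        name = ' '.join(filter(None, [user.first_name, user.last_name]))
        writer.writerow([user.username or '', user.id, name])
```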
I have updated these changes here: https://notepad.link/tQKMg
https://snipboard.io/rPMEke.jpg
Unfortunately, it didn't work.
Is there a solution to this situation?
Are you there?
Hey, sorry, I am a little busy. But I didn't find any solution for this.
Only 10k members are scraped. Do you have a solution to scrape more members?