Open · denvyy opened this issue 1 year ago
I'm working with a lot of twitter accounts and threads, so I need to set up a proxy for every session. How can I do it? Maybe I can edit the library, I don't know.
Hi! Did you manage to do this?
Hi, you can do something like this:
from twitter.account import Account, Client
proxies = {
    'http://': f'http://{proxy_username}:{proxy_password}@{ip}:{port}',
    'https://': f'http://{proxy_username}:{proxy_password}@{ip}:{port}',
}
client = Client(proxies=proxies, cookies={'ct0': ct0, 'auth_token': auth_token})
account = Account(session=client)
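If you are working with many accounts, the same idea should scale by giving each account its own proxied Client. A minimal sketch, assuming Client here is the httpx client imported by twitter.account; the per-account values below are placeholders:

from twitter.account import Account, Client

# hypothetical per-account records: (ct0, auth_token, proxy URL)
accounts_data = [
    ('ct0_a', 'auth_token_a', 'http://user:pass@1.2.3.4:8080'),
    ('ct0_b', 'auth_token_b', 'http://user:pass@5.6.7.8:8080'),
]

accounts = []
for ct0, auth_token, proxy_url in accounts_data:
    # one proxied session per account keeps every session isolated
    proxies = {'http://': proxy_url, 'https://': proxy_url}
    client = Client(proxies=proxies, cookies={'ct0': ct0, 'auth_token': auth_token})
    accounts.append(Account(session=client))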
Thank you, thank you, thank you!
I'm also using lots of accounts. As suggested by @nedozrel, would editing the above not break anything if I apply the same for the Scraper and Search classes? E.g.
client = Client(proxies=proxies, cookies={'ct0': ct0, 'auth_token': auth_token})
scraper = Scraper(session=client)
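One quick way to check that nothing breaks and that such a session really routes through the proxy is to hit an IP-echo service with the same client before scraping; api.ipify.org and the handle below are just placeholders, not part of the library:

# verify the proxied session's exit IP, then scrape with it
print('exit IP:', client.get('https://api.ipify.org').text)
scraper = Scraper(session=client)
users = scraper.users(['some_username'])  # placeholder handle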
Does any of you get the error
httpx.RemoteProtocolError: Server disconnected without sending a response.
when trying to use the proxy for Account?
The output:
mentions = account.dm_history()
^^^^^^^^^^^^^^^^^^^^
File "/home/vakandi/.local/lib/python3.11/site-packages/twitter/account.py", line 705, in dm_history
inbox = self.dm_inbox()
^^^^^^^^^^^^^^^
File "/home/vakandi/.local/lib/python3.11/site-packages/twitter/account.py", line 658, in dm_inbox
r = self.session.get(
^^^^^^^^^^^^^^^^^
File "/home/vakandi/.local/lib/python3.11/site-packages/httpx/_client.py", line 1054, in get
return self.request(
^^^^^^^^^^^^^
File "/home/vakandi/.local/lib/python3.11/site-packages/httpx/_client.py", line 827, in request
return self.send(request, auth=auth, follow_redirects=follow_redirects)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/vakandi/.local/lib/python3.11/site-packages/httpx/_client.py", line 914, in send
response = self._send_handling_auth(
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/vakandi/.local/lib/python3.11/site-packages/httpx/_client.py", line 942, in _send_handling_auth
response = self._send_handling_redirects(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/vakandi/.local/lib/python3.11/site-packages/httpx/_client.py", line 979, in _send_handling_redirects
response = self._send_single_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/vakandi/.local/lib/python3.11/site-packages/httpx/_client.py", line 1015, in _send_single_request
response = transport.handle_request(request)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/vakandi/.local/lib/python3.11/site-packages/httpx/_transports/default.py", line 232, in handle_request
with map_httpcore_exceptions():
File "/usr/lib/python3.11/contextlib.py", line 158, in __exit__
self.gen.throw(typ, value, traceback)
File "/home/vakandi/.local/lib/python3.11/site-packages/httpx/_transports/default.py", line 86, in map_httpcore_exceptions
raise mapped_exc(message) from exc
httpx.RemoteProtocolError: Server disconnected without sending a response.
Code working perfectly:
proxies = {
    'http://': f'http://{proxy_username}:{proxy_password}@{proxy_host}:{proxy_port}',
    'https://': f'http://{proxy_username}:{proxy_password}@{proxy_host}:{proxy_port}'
}
client = Client(proxies=proxies, cookies={'ct0': ct0, 'auth_token': auth})
scraper = Scraper(session=client)
users = scraper.users([user_to_scrape])
Not working code:
proxies = {
    'http://': f'http://{proxy_username}:{proxy_password}@{proxy_host}:{proxy_port}',
    'https://': f'http://{proxy_username}:{proxy_password}@{proxy_host}:{proxy_port}'
}
client = Client(proxies=proxies, cookies={'ct0': ct0, 'auth_token': auth})
account = Account(session=client)
dms = account.dm_history()
#or notifications, or anything related to Account.
The error only occurs while using Account with proxies.
I found a fix. Since I already use the library inside a QThread, it patches the global httpx package to use the proxy; I guess it can also work in a normal thread or function. Remember to set the original AsyncClient back to remove the proxy configuration after the action is performed.
My fix:
import httpx
from twitter.account import Account

# self.proxy_* are attributes of the QThread this code runs in (see above)
proxy_url = f"socks5://{self.proxy_username}:{self.proxy_password}@{self.proxy_host}:{self.proxy_port}"

# Patch httpx.AsyncClient globally within this thread to use the proxy
original_async_client_init = httpx.AsyncClient.__init__

def patched_async_client_init(self, *args, **kwargs):
    # here self is the AsyncClient instance being constructed
    kwargs['proxies'] = proxy_url
    original_async_client_init(self, *args, **kwargs)

httpx.AsyncClient.__init__ = patched_async_client_init

account = Account(cookies={"ct0": ct0, "auth_token": auth_token})
account.reply(message, tweet_id)
And after the action is finished:
httpx.AsyncClient.__init__ = original_async_client_init
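Since forgetting to restore the original __init__ would leave every AsyncClient in the process proxied, it may be safer to wrap the same monkey-patch in a try/finally. A sketch of that idea (run_with_proxy is a hypothetical helper name, not part of the library):

import httpx
from twitter.account import Account

def run_with_proxy(proxy_url, ct0, auth_token, message, tweet_id):
    # temporarily patch httpx.AsyncClient so the library's internal clients use the proxy
    original_init = httpx.AsyncClient.__init__

    def patched_init(client_self, *args, **kwargs):
        kwargs['proxies'] = proxy_url
        original_init(client_self, *args, **kwargs)

    httpx.AsyncClient.__init__ = patched_init
    try:
        account = Account(cookies={'ct0': ct0, 'auth_token': auth_token})
        return account.reply(message, tweet_id)
    finally:
        # always restore the original, even if the reply raises
        httpx.AsyncClient.__init__ = original_init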
How I did my test:
I added a function in "scraper.py" to perform an IP-check request and see whether my proxy IP shows up. I use a proxy that gives a new IP on every request, so I made 4 requests to make sure everything works fine:
async def _test_connection(self, client: AsyncClient) -> str:
    r = await client.get('https://api.ipify.org')
    return r.text
Then I updated the "_process" function to perform the IP test just before running the called process (scraping actions):
async def _process(self, operation: tuple, queries: list[dict], **kwargs):
    headers = self.session.headers if self.guest else get_headers(self.session)
    cookies = self.session.cookies
    async with AsyncClient(headers=headers, cookies=cookies, timeout=20) as c:
        ip = await self._test_connection(c)
        # display the ip address
        print(f"IP Address: {ip}")
    async with AsyncClient(limits=Limits(max_connections=MAX_ENDPOINT_LIMIT), headers=headers, cookies=cookies, timeout=20) as c:
        tasks = (self._paginate(c, operation, **q, **kwargs) for q in queries)
        if self.pbar:
            return await tqdm_asyncio.gather(*tasks, desc=operation[-1])
        return await asyncio.gather(*tasks)
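With those changes in place, reproducing the four-request check is just a matter of calling any scraping method a few times; with a rotating proxy the printed address should change on every call. Assuming scraper is a Scraper instance built as in the working example above (the handle is a placeholder):

# each call goes through _process, which now prints the exit IP first
for _ in range(4):
    scraper.users(['some_username'])  # placeholder handle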