taux1c / onlyfans-scraper

A tool that allows you to save to disk all content you are subscribed to on OnlyFans, including content you have unlocked or that has been sent to you in messages.
MIT License
290 stars · 34 forks

Every request returns HTTP 400 Bad Request #58

Open ChristophvdF opened 1 year ago

ChristophvdF commented 1 year ago

OS: Win 10

How did you install?: pip

Traceback:

PS C:\Users\djlee> onlyfans-scraper.exe
Using profile: main_profile
Status - DOWN
? What would you like to do? Download content from a user
? Choose one of the following options: Print a list of my subscriptions
Traceback (most recent call last):
  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File "C:\Python311\Scripts\onlyfans-scraper.exe\__main__.py", line 7, in <module>
  File "C:\Python311\Lib\site-packages\onlyfans_scraper\scraper.py", line 415, in main
    process_prompts()
  File "C:\Python311\Lib\site-packages\onlyfans_scraper\scraper.py", line 218, in process_prompts
    subscribe_count = process_me(headers)
                      ^^^^^^^^^^^^^^^^^^^
  File "C:\Python311\Lib\site-packages\onlyfans_scraper\scraper.py", line 197, in process_me
    my_profile = me.scrape_user(headers)
                 ^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Python311\Lib\site-packages\onlyfans_scraper\api\me.py", line 26, in scrape_user
    r.raise_for_status()
  File "C:\Python311\Lib\site-packages\httpx\_models.py", line 749, in raise_for_status
    raise HTTPStatusError(message, request=request, response=self)
httpx.HTTPStatusError: Client error '400 Bad Request' for url 'https://onlyfans.com/api2/v2/users/me'
For more information check: https://httpstatuses.com/400

Python Version: 3.11.3

Program Version: 2.0.1

PIP Freeze:

aiofiles==22.1.0
aiohttp==3.8.4
aiohttp-socks==0.7.1
aiosignal==1.3.1
alabaster==0.7.13
alembic==1.10.3
anyio==3.7.0
astroid==2.15.3
async-timeout==4.0.2
attrs==23.1.0
Babel==2.12.1
beautifulsoup4==4.12.2
build==0.10.0
CacheControl==0.12.11
certifi==2022.12.7
charset-normalizer==3.1.0
cleo==2.0.1
colorama==0.4.6
crashtest==0.4.1
dill==0.3.6
distlib==0.3.6
docutils==0.18.1
dulwich==0.21.5
filelock==3.12.0
frozenlist==1.3.3
greenlet==2.0.2
h11==0.14.0
h2==4.1.0
hpack==4.0.0
html5lib==1.1
httpcore==0.16.3
httpx==0.23.3
hyperframe==6.0.1
idna==3.4
imagesize==1.4.1
importlib-metadata==6.6.0
inquirerpy==0.3.4
installer==0.7.0
jaraco.classes==3.2.3
Jinja2==3.1.2
jsonschema==4.17.3
keyring==23.13.1
lazy-object-proxy==1.9.0
lockfile==0.12.2
lxml==4.9.2
Mako==1.2.4
MarkupSafe==2.1.2
mergedeep==1.3.4
more-itertools==9.1.0
msgpack==1.0.5
multidict==6.0.4
mypy==0.991
mypy-extensions==1.0.0
onlyfans-scraper==2.0.1
orjson==3.8.10
packaging==23.1
pexpect==4.8.0
pfzy==0.3.4
pkginfo==1.9.6
platformdirs==3.5.1
poetry==1.5.0
poetry-core==1.6.0
poetry-plugin-export==1.3.1
prompt-toolkit==3.0.38
psycopg2==2.9.6
ptyprocess==0.7.0
Pygments==2.15.1
pyproject_hooks==1.0.0
pyrsistent==0.19.3
PySocks==1.7.1
python-dateutil==2.8.2
python-socks==2.2.0
pywin32-ctypes==0.2.0
PyYAML==6.0
rapidfuzz==2.15.1
requests==2.28.2
requests-toolbelt==1.0.0
revolution==0.3.0
rfc3986==1.5.0
shellingham==1.5.0.post1
six==1.16.0
sniffio==1.3.0
snowballstemmer==2.2.0
soupsieve==2.4.1
Sphinx==5.3.0
sphinx-autoapi==2.1.0
sphinx-rtd-theme==1.2.0
sphinxcontrib-applehelp==1.0.4
sphinxcontrib-devhelp==1.0.2
sphinxcontrib-htmlhelp==2.0.1
sphinxcontrib-jquery==4.1
sphinxcontrib-jsmath==1.0.1
sphinxcontrib-qthelp==1.0.3
sphinxcontrib-serializinghtml==1.1.5
SQLAlchemy==2.0.10
tomlkit==0.11.8
tqdm==4.64.1
trove-classifiers==2023.5.24
typing_extensions==4.5.0
ultima-scraper-api==1.0.5
ultima-scraper-collection==1.0.3
ultima-scraper-renamer==1.0.0
Unidecode==1.3.6
urllib3==1.26.15
user-agent==0.1.10
virtualenv==20.23.0
wcwidth==0.2.6
webencodings==0.5.1
websockets==11.0.2
win32-setctime==1.1.0
wrapt==1.15.0
yarl==1.9.1
zipp==3.15.0

Steps to recreate:

  1. Enter credentials
  2. Start the application
  3. Select 'Download content from a user'
  4. Select 'Print a list of my subscriptions'
  5. Observe the error
tdex89 commented 1 year ago

Same error I'm getting too.

fseovpn commented 1 year ago

Yeah, running into the same issue. I know my auth.json file information is correct; honestly, it had me thinking something was bugged in my browser, so I even cleared my cookies and cache and still ran into the same problem.
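Since several commenters suspected their auth.json before concluding the API itself had changed, a quick sanity check can at least rule out a malformed or incomplete file. This is only a sketch: the key names below are hypothetical placeholders, and the real schema depends on your onlyfans-scraper version's auth.json template.

```python
import json
from pathlib import Path

# Hypothetical field names for illustration only; consult your
# version's auth.json template for the actual schema.
REQUIRED_KEYS = {"sess", "auth_id", "user_agent", "x-bc"}

def validate_auth(path: Path) -> list[str]:
    """Return the required keys that are missing or empty."""
    data = json.loads(path.read_text())
    auth = data.get("auth", data)  # some versions nest values under "auth"
    return sorted(k for k in REQUIRED_KEYS
                  if not str(auth.get(k, "")).strip())

# Example: a file whose session cookie was left blank.
sample = Path("auth_example.json")
sample.write_text(json.dumps({"auth": {
    "sess": "", "auth_id": "12345",
    "user_agent": "Mozilla/5.0", "x-bc": "abc"}}))
print(validate_auth(sample))  # → ['sess']
```

If the check passes and every key matches what the browser currently holds, the 400 most likely comes from a server-side change rather than local configuration, which matches what this thread eventually concluded.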

StPatty33 commented 1 year ago

Same error. I redid the auth.json file and all the data was the same as what's currently in my browser. It lists everything as "DOWN" when I first run onlyfans-scraper as well, but when I try to continue to, say, print the list of subs, the errors given are very different from the ones you get when the auth file has invalid data in it.

As an aside, in the last month or so I've also noticed that a couple of my subs have videos that won't download now; they give errors instead. I also have other software, like Internet Download Manager, that can normally download vids directly from sites, and for these same undownloadable vids it doesn't even bring up the little button to download them directly from the site like it normally would. So it looks like OF is maybe rolling out something new that all these downloaders can't process (yet). [Screenshot attached: Screenshot 2023-07-09 122326]

mbarberry commented 1 year ago

Looks like they might have changed their API slightly. They also made a public release of a new privacy policy.

StPatty33 commented 1 year ago

> Looks like they might have changed their API slightly. They also made a public release of a new privacy policy.

Yeah and it sounds like the videos that won't download now are due to a new DRM they announced a couple months ago that content creators can opt into. They're doing everything they can to lose money it sounds like. I think most people that scrape do it so they can keep what they've paid for, not spread it around elsewhere. It's one thing to ask people to buy porn, it's another to ask them to rent it.

dvkrav commented 1 year ago

Same issue on Mac.

Using profile: main_profile
Status - DOWN
? What would you like to do? Download content from a user
? Choose one of the following options: Print a list of my subscriptions
Traceback (most recent call last):
  File "/Library/Frameworks/Python.framework/Versions/3.10/bin/onlyfans-scraper", line 8, in <module>
    sys.exit(main())
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/onlyfans_scraper/scraper.py", line 415, in main
    process_prompts()
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/onlyfans_scraper/scraper.py", line 280, in process_prompts
    loop()
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/onlyfans_scraper/scraper.py", line 218, in process_prompts
    subscribe_count = process_me(headers)
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/onlyfans_scraper/scraper.py", line 197, in process_me
    my_profile = me.scrape_user(headers)
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/onlyfans_scraper/api/me.py", line 26, in scrape_user
    r.raise_for_status()
  File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/site-packages/httpx/_models.py", line 749, in raise_for_status
    raise HTTPStatusError(message, request=request, response=self)
httpx.HTTPStatusError: Client error '400 Bad Request' for url 'https://onlyfans.com/api2/v2/users/me'
For more information check: https://httpstatuses.com/400

jamesle3199 commented 1 year ago

Same same with me. RIP to onlyfans-scraper.

Saintralphlauron commented 1 year ago

Can it scrape even without a subscription?


taux1c commented 1 year ago

This repo is outdated (no longer works) and I am no longer maintaining it.