DIGITALCRIMINAL / ArchivedUltimaScraper

Scrape content from OnlyFans and Fansly
GNU General Public License v3.0

Repeatedly getting "unexpected control character in string: line 4 column 67 (char 109)" when running the script #842

Closed: darkstarlogin closed this issue 1 year ago

darkstarlogin commented 1 year ago

I see that there may be something else going on with OnlyFans as far as rate limiting goes, but I'm having a problem even running the script, and thought I might bend people's ear here before I await a new version of the scraper.

Just upgraded the Mac I've been running this on to Ventura 13.2, and I'm running the script locally via Homebrew with Python 3.10.10 and Poetry 1.3.2. I decided on cleanly installing a lot of stuff, so I'm setting up my digitalcriminals scraper fresh. Running updater.py is fine, and running the script for setup goes fine. During setup I lightly edit config.json ('"messages": false,' which I'm sure many others do), then add my OnlyFans cookie/user-agent, etc., all fairly normal (no Fansly setup). Then I'm prompted, as expected, to "Choose sites", and when I select "1" for OnlyFans, that's when the error occurs. Each time I'm told "unexpected control character in string: line 4 column 67 (char 109)" -- and I admit this is the point at which my knowledge is not enough. The traceback of the error is below, if anyone knows of some simple error I'm making:

```
Traceback (most recent call last):
  File "/Volumes/ext10TB/OnlyFans-master/start_us.py", line 82, in <module>
    asyncio.run(main())
  File "/usr/local/Cellar/python@3.10/3.10.10/Frameworks/Python.framework/Versions/3.10/lib/python3.10/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/Cellar/python@3.10/3.10.10/Frameworks/Python.framework/Versions/3.10/lib/python3.10/asyncio/base_events.py", line 649, in run_until_complete
    return future.result()
  File "/Volumes/ext10TB/OnlyFans-master/start_us.py", line 59, in main
    api = await main_datascraper.start_datascraper(
  File "/Volumes/ext10TB/OnlyFans-master/ultima_scraper/datascraper/main_datascraper.py", line 134, in start_datascraper
    await default(datascraper)
  File "/Volumes/ext10TB/OnlyFans-master/ultima_scraper/datascraper/main_datascraper.py", line 49, in default
    await main_helper.process_profiles(api, global_settings)
  File "/Users/lusernom/Library/Caches/pypoetry/virtualenvs/ultima-scraper-I59rjUUB-py3.10/lib/python3.10/site-packages/ultima_scraper_api/helpers/main_helper.py", line 613, in process_profiles
    temp_json_auth = import_json(user_auth_filepath)
  File "/Users/lusernom/Library/Caches/pypoetry/virtualenvs/ultima-scraper-I59rjUUB-py3.10/lib/python3.10/site-packages/ultima_scraper_api/helpers/main_helper.py", line 394, in import_json
    json_file = orjson.loads(o_file.read())
orjson.JSONDecodeError: unexpected control character in string: line 4 column 67 (char 109)
```
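In case it helps with diagnosis: orjson raises this class of error when a JSON string value contains a raw, unescaped control character -- e.g. a stray newline or tab that rode along when the cookie was pasted. A minimal sketch (the auth blob here is made up) that reproduces the same failure:

```python
import orjson

# A made-up auth blob with a literal newline (a raw control character)
# pasted into the middle of the cookie value -- invisible in most editors.
bad = b'{\n  "auth": {\n    "id": 12345,\n    "cookie": "sess=abc\ndef"\n  }\n}'

try:
    orjson.loads(bad)
except orjson.JSONDecodeError as e:
    print(e)  # reports the offending line/column/char offset, as above
```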

Klutronic commented 1 year ago

I am getting a similar error on Windows. If I need to break this out into another issue, that's fine, but I think it's related. Mine is below:

```
C:\Tools\OnlyFans-master>poetry run python .\start_us.py
Choose Sites: 0 = All | 1 = OnlyFans | 2 = Fansly
1
Traceback (most recent call last):
  File "C:\Tools\OnlyFans-master\start_us.py", line 82, in <module>
    asyncio.run(main())
  File "C:\Users\Klu\AppData\Local\Programs\Python\Python311\Lib\asyncio\runners.py", line 190, in run
    return runner.run(main)
           ^^^^^^^^^^^^^^^^
  File "C:\Users\Klu\AppData\Local\Programs\Python\Python311\Lib\asyncio\runners.py", line 118, in run
    return self._loop.run_until_complete(task)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Klu\AppData\Local\Programs\Python\Python311\Lib\asyncio\base_events.py", line 653, in run_until_complete
    return future.result()
           ^^^^^^^^^^^^^^^
  File "C:\Tools\OnlyFans-master\start_us.py", line 59, in main
    api = await main_datascraper.start_datascraper(
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Tools\OnlyFans-master\ultima_scraper\datascraper\main_datascraper.py", line 134, in start_datascraper
    await default(datascraper)
  File "C:\Tools\OnlyFans-master\ultima_scraper\datascraper\main_datascraper.py", line 49, in default
    await main_helper.process_profiles(api, global_settings)
  File "C:\Users\Klu\AppData\Local\pypoetry\Cache\virtualenvs\ultima-scraper-htpMx2sF-py3.11\Lib\site-packages\ultima_scraper_api\helpers\main_helper.py", line 613, in process_profiles
    temp_json_auth = import_json(user_auth_filepath)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\Klu\AppData\Local\pypoetry\Cache\virtualenvs\ultima-scraper-htpMx2sF-py3.11\Lib\site-packages\ultima_scraper_api\helpers\main_helper.py", line 394, in import_json
    json_file = orjson.loads(o_file.read())
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^
orjson.JSONDecodeError: unexpected character: line 4 column 15 (char 55)
```

darkstarlogin commented 1 year ago

That seems like almost the same issue -- I don't know what version of Python you're using, but it's close to what I'm getting.

For now, I'm just waiting for the next release. From the sounds of other reports, even if I got past the errors and the script worked, OnlyFans might be rate-limiting things anyway.

Klutronic commented 1 year ago

I agree. It looks like another issue was put in for the same thing you and I are seeing:

#855

I also tried Docker. I can get it to run, but it never downloads anything, so that could be the rate limiting. It doesn't even create the .sites folder, so who knows.

DIGITALCRIMINAL commented 1 year ago

Remove anything in the JSON file that looks like one of these control characters:

https://www.geeksforgeeks.org/control-characters/
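In other words, strip any raw control characters out of the file. A rough sketch of one way to do that (the path here is an assumption -- point it at whichever auth.json your setup actually loads):

```python
import pathlib
import re

import orjson

# Assumed location -- adjust to your own profile's auth.json.
auth_path = pathlib.Path("profiles/default/auth.json")
text = auth_path.read_text(encoding="utf-8")

# Drop every ASCII control character (U+0000-U+001F and U+007F). This also
# removes the file's own newlines, but single-line JSON is still valid JSON.
cleaned = re.sub(r"[\x00-\x1f\x7f]", "", text)

orjson.loads(cleaned)  # raises again if something else is still broken
auth_path.write_text(cleaned, encoding="utf-8")
```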

Klutronic commented 1 year ago

I ran updater.py (for some reason I had to run it twice) and it's working as expected now. It downloaded updates for all of the models I'm subbed to. Thank you @DIGITALCRIMINALS

darkstarlogin commented 1 year ago

> Remove anything in the JSON file that looks like one of these control characters:
>
> https://www.geeksforgeeks.org/control-characters/

Thanks -- I did look through the JSON file for anything like that, but found nothing.
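That said, control characters are invisible in most editors, so eyeballing the file can easily miss them. A quick scan along these lines (same assumed path as above) would print the exact line and column of any control byte:

```python
import pathlib

# Assumed path -- point this at the auth.json the scraper is loading.
auth_path = pathlib.Path("profiles/default/auth.json")

# Iterating over a bytes line yields integers in Python 3.
for line_no, line in enumerate(auth_path.read_bytes().splitlines(), start=1):
    for col, byte in enumerate(line, start=1):
        if byte < 0x20 or byte == 0x7F:
            print(f"line {line_no}, column {col}: control byte {hex(byte)}")
```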

And I've since tried a fresh start with the latest code, and am running into all-new, all-different problems just running updater.py. I think I'm personally retiring from scraping -- I'm letting the couple of subscriptions I have time out, and I'll be happy to save the HD space. Thanks for creating this, though!