UltimaHoarder / UltimaScraper

Scrape all the media from an OnlyFans account - Updated regularly
GNU General Public License v3.0

AttributeError: 'AuthModel' object has no attribute 'auth_details' #2054

Closed · gri1n closed this issue 5 months ago

gri1n commented 6 months ago

After working perfectly for about five months, I started getting this error a few days ago.

Traceback (most recent call last):
  File "/home/bioboy/Programs/UltimaScraper/start_us.py", line 62, in <module>
    asyncio.run(main())
  File "/usr/lib/python3.10/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/lib/python3.10/asyncio/base_events.py", line 649, in run_until_complete
    return future.result()
  File "/home/bioboy/Programs/UltimaScraper/start_us.py", line 44, in main
    _api = await USR.start(
  File "/home/bioboy/Programs/UltimaScraper/ultima_scraper/ultima_scraper.py", line 50, in start
    await self.start_datascraper(datascraper)
  File "/home/bioboy/Programs/UltimaScraper/ultima_scraper/ultima_scraper.py", line 88, in start_datascraper
    setup, _subscriptions = await datascraper.account_setup(
  File "/home/bioboy/.cache/pypoetry/virtualenvs/ultima-scraper-6sfAQqlk-py3.10/lib/python3.10/site-packages/ultima_scraper_collection/modules/module_streamliner.py", line 415, in account_setup
    auth.auth_details.email
AttributeError: 'AuthModel' object has no attribute 'auth_details'. Did you mean: 'get_auth_details'?

Up until this error appeared, I hadn't changed anything since setting it up months ago.
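
For illustration, here is a minimal sketch of the failure mode using a stand-in class (not the project's actual code): module_streamliner.py reads auth.auth_details as a plain attribute, but the AuthModel it receives only exposes a get_auth_details() accessor, so the lookup fails. Python 3.10's traceback adds the "Did you mean: 'get_auth_details'?" hint precisely because that accessor exists.

class AuthModel:
    # Stand-in: accessor method only, no auth_details attribute.
    def __init__(self, details):
        self._details = details

    def get_auth_details(self):
        return self._details

auth = AuthModel(details=None)
auth.get_auth_details()  # works: the accessor exists
try:
    auth.auth_details    # what module_streamliner.py line 415 effectively does
except AttributeError as exc:
    print(exc)           # 'AuthModel' object has no attribute 'auth_details'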

gri1n commented 6 months ago

I tried the fixes in these archived posts without any success.

https://github.com/DIGITALCRIMINAL/ArchivedUltimaScraper/issues/1008
https://github.com/DIGITALCRIMINAL/ArchivedUltimaScraper/issues/1054

slippypete commented 6 months ago

Started getting this exact error today.

gri1n commented 6 months ago

I had this same problem when I first started using this scraper months ago and fixed it with the solution linked below, but that fix no longer works.

https://github.com/DIGITALCRIMINAL/ArchivedUltimaScraper/issues/1008#issuecomment-1635093886

fl4shforward commented 6 months ago

In your config.json, change the dynamic rules URL to https://raw.githubusercontent.com/DATAHOARDERS/dynamic-rules/main/onlyfans.json. This should solve the issue. [EDIT] Don't forget to set "active" back to true in your auth.json too.
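
For anyone who wants to script it, here is a small sketch that applies both changes at once. The file locations and the auth.json layout below are assumptions based on a default install, not something confirmed in this thread; adjust the paths to wherever your config.json and profile auth.json actually live.

import json
from pathlib import Path

# Assumed default locations; adjust to your install.
config_path = Path("__settings__/config.json")
auth_path = Path("__user_data__/profiles/default/auth.json")

# Point dynamic_rules_link at the DATAHOARDERS rules file.
config = json.loads(config_path.read_text())
config["settings"]["dynamic_rules_link"] = (
    "https://raw.githubusercontent.com/DATAHOARDERS/dynamic-rules/main/onlyfans.json"
)
config_path.write_text(json.dumps(config, indent=2))

# Flip "active" back to true (assumed layout: {"auth": {"active": ...}}).
auth = json.loads(auth_path.read_text())
auth["auth"]["active"] = True
auth_path.write_text(json.dumps(auth, indent=2))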

slippypete commented 6 months ago

In your config.json, change the dynamic rules URL to https://raw.githubusercontent.com/DATAHOARDERS/dynamic-rules/main/onlyfans.json. This should solve the issue.

Short version: this worked, thanks!

Long version: my auth.json keeps getting set to "active": false, and while it's false the scraper shows I have 0 subscriptions. I have to change it back to "active": true, and then, with the URL change above, it works.

betoalanis commented 6 months ago

In your config.json, change the dynamic rules URL to https://raw.githubusercontent.com/DATAHOARDERS/dynamic-rules/main/onlyfans.json. This should solve the issue. [EDIT] Don't forget to set "active" back to true in your auth.json too.

Thank you very much, this solves it :D

gri1n commented 6 months ago

Works, thanks!

gri1n commented 6 months ago

Updating the dynamic_rules_link fixed the problem, but it has now returned. How often does this link need to be updated?

fl4shforward commented 6 months ago

Updating the dynamic_rules_link fixed the problem, but it has now returned. How often does this link need to be updated?

@gri1n double-check your settings; I just ran it with the same settings as my previous post and it still works.

[Screenshot: working settings, 2024-01-05 at 17:19:52]

gri1n commented 5 months ago

@fl4shforward Here's my config:

{
  "info": {
    "version": 8.0
  },
  "settings": {
    "auto_site_choice": "onlyfans",
    "export_type": "json",
    "max_threads": 1,
    "min_drive_space": 0,
    "helpers": {
      "renamer": true,
      "reformat_media": true,
      "downloader": true,
      "delete_empty_directories": false
    },
    "webhooks": {
      "global_webhooks": [],
      "global_status": true,
      "auth_webhook": {
        "succeeded": {
          "webhooks": [],
          "status": null,
          "hide_sensitive_info": true
        },
        "failed": {
          "webhooks": [],
          "status": null,
          "hide_sensitive_info": true
        }
      },
      "download_webhook": {
        "succeeded": {
          "webhooks": [],
          "status": null,
          "hide_sensitive_info": true
        },
        "failed": {
          "webhooks": [],
          "status": null,
          "hide_sensitive_info": true
        }
      }
    },
    "exit_on_completion": true,
    "infinite_loop": false,
    "loop_timeout": 0,
    "dynamic_rules_link": "https://raw.githubusercontent.com/DATAHOARDERS/dynamic-rules/main/onlyfans.json",
    "proxies": [],
    "cert": "",
    "random_string": "481f15c8286c11ee84e08539aef36ec3",
    "tui": {
      "active": false,
      "host": "localhost",
      "port": 2112,
      "api_key": "481f15c9286c11ee84e08539aef36ec3"
    }
  },
  "supported": {
    "onlyfans": {
      "settings": {
        "auto_profile_choice": [ "default" ],
        "auto_model_choice": false,
        "auto_api_choice": true,
        "auto_media_choice": "0",
        "browser": {
          "auth": true
        },
        "jobs": {
          "scrape": {
            "subscriptions": true,
            "messages": false,
            "paid_contents": true
          },
          "metadata": {
            "posts": true,
            "comments": true
          }
        },
        "download_directories": [
          "/data/OnlyFans"
        ],
        "file_directory_format": "{model_username}/{media_type}",
        "filename_format": "{filename}.{ext}",
        "metadata_directories": [
          "__user_data__/sites"
        ],
        "metadata_directory_format": "{site_name}/{model_username}/Metadata",
        "delete_legacy_metadata": false,
        "text_length": 255,
        "video_quality": "source",
        "overwrite_files": false,
        "date_format": "%d-%m-%Y",
        "ignored_keywords": [],
        "ignore_type": "",
        "blacklists": [],
        "webhook": true
      }
    },
    "fansly": {
      "settings": {
        "auto_profile_choice": [],
        "auto_model_choice": false,
        "auto_api_choice": true,
        "auto_media_choice": "",
        "browser": {
          "auth": true
        },
        "jobs": {
          "scrape": {
            "subscriptions": true,
            "messages": false,
            "paid_contents": true
          },
          "metadata": {
            "posts": true,
            "comments": true
          }
        },
        "download_directories": [
          "__user_data__/sites"
        ],
        "file_directory_format": "{site_name}/{model_username}/{api_type}/{value}/{media_type}",
        "filename_format": "{filename}.{ext}",
        "metadata_directories": [
          "__user_data__/sites"
        ],
        "metadata_directory_format": "{site_name}/{model_username}/Metadata",
        "delete_legacy_metadata": false,
        "text_length": 255,
        "video_quality": "source",
        "overwrite_files": false,
        "date_format": "%d-%m-%Y",
        "ignored_keywords": [],
        "ignore_type": "",
        "blacklists": [],
        "webhook": true
      }
    }
  }
}

The error is identical to the one in my original post.

dairyroach commented 5 months ago

I had an issue open because of this same error. Getting a new cookie for my auth.json file fixed it for me.

People seem to be hitting this seemingly at random ("it worked last time I used it, then started giving me this error out of nowhere"). I think that's because, for at least some of us, the cookie is expiring and the site is blocking the scraper, while the error text gives no real indication of what went wrong.
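
One way to test that theory is to dump the expiry times of the OnlyFans cookies in your browser profile. A rough sketch using the third-party browser_cookie3 package (pip install browser-cookie3), shown here for Firefox; there is a matching chrome() helper:

import datetime
import browser_cookie3

# Read the OnlyFans cookies from the default Firefox profile.
jar = browser_cookie3.firefox(domain_name="onlyfans.com")

for cookie in jar:
    # cookie.expires is a Unix timestamp, or None for session cookies.
    expiry = (
        datetime.datetime.fromtimestamp(cookie.expires)
        if cookie.expires
        else "session"
    )
    print(f"{cookie.name}: expires {expiry}")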

gri1n commented 5 months ago

That didn't fix it. What did was logging in to OnlyFans with Firefox (instead of Chrome) and then using those cookies in auth.json.
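
If you want to script that step, the same browser_cookie3 package can build the cookie string from a Firefox profile. One assumption here: your auth.json stores the whole cookie header as a single "cookie" string, which is typical of a default install but not confirmed in this thread.

import browser_cookie3

# Pull the OnlyFans cookies straight out of the default Firefox profile.
jar = browser_cookie3.firefox(domain_name="onlyfans.com")

# Join them into one string to paste into auth.json's "cookie" field.
cookie_string = "; ".join(f"{c.name}={c.value}" for c in jar)
print(cookie_string)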

tiphae commented 5 months ago

Same as the comment above: I couldn't get it to work using the Chrome info, but getting the credentials from Firefox worked just fine.