Open Scripter17 opened 1 year ago
You can use `gallery-dl --write-pages --filter 'abort()' https://twitter.com/elonmusk/photo` to save user data as TXT files in your working directory:

- `--write-pages` dumps the raw API responses
- `/photo` prevents additional API requests
- `--filter 'abort()'` aborts the current download job before downloading the user's profile picture
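As a rough sketch of how those dumped pages could then be post-processed: the helper below scans the `.txt` dumps in a directory for a GraphQL user object. Note that the `{"data": {"user": {"result": ...}}}` layout is my assumption about Twitter's GraphQL responses, not something gallery-dl documents or guarantees.

```python
import json
from pathlib import Path


def find_user_objects(directory="."):
    """Scan --write-pages dumps (*.txt) in `directory` and yield any
    GraphQL user objects found in them.

    ASSUMPTION: responses look like {"data": {"user": {"result": {...}}}};
    files that are not JSON objects are skipped silently.
    """
    for path in Path(directory).glob("*.txt"):
        try:
            data = json.loads(path.read_text(encoding="utf-8"))
        except (ValueError, OSError):
            continue  # not a JSON dump; skip
        if not isinstance(data, dict):
            continue  # JSON, but not an object; skip
        user = data.get("data", {}).get("user", {}).get("result")
        if user:
            yield path.name, user
```

For example, `for name, user in find_user_objects(): print(name, user.get("rest_id"))` would list which dump files contained a user object.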
Implementing this is not possible with the current code base. You'll have to wait for v2 for that, or write a small Python script that uses gallery-dl's TwitterAPI code.
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
import sys

from gallery_dl.util import dump_json
from gallery_dl.extractor import twitter

# Build a throwaway extractor instance so TwitterAPI has something to hook into
extr = twitter.TwitterTweetExtractor.from_url("twitter.com/i/status/1")
api = twitter.TwitterAPI(extr)

# Look up the user whose screen name is given as the first CLI argument
# and dump the result as JSON to stdout
name = sys.argv[1]
user = api.user_by_screen_name(name)
dump_json(user)
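Since the goal in this thread is pulling a user's alt/other website accounts, the dict returned by `user_by_screen_name()` could be post-processed along these lines. The `legacy.entities` layout below is my assumption about Twitter's GraphQL user object, not something gallery-dl promises:

```python
def extract_links(user):
    """Collect expanded URLs from a Twitter GraphQL user object.

    ASSUMPTION: links live under legacy.entities.{url,description}.urls,
    each item carrying an "expanded_url" key (typical GraphQL layout,
    unverified against every API version).
    """
    links = []
    entities = user.get("legacy", {}).get("entities", {})
    for section in entities.values():
        for item in section.get("urls", []):
            url = item.get("expanded_url")
            if url:
                links.append(url)
    return links
```

Feeding the `user` dict from the script above into `extract_links(user)` would then yield the profile's bio and website links, if the assumed layout holds.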
For the time being, you can use /USER/profile as the URL together with --dump-json to get user profile information and nothing else (https://github.com/mikf/gallery-dl/commit/c2bf4ff99cd84f7f94db36d1f619563f07b96ba9).
I might change the name and URL path, but I'm not really sure what would fit best here. (https://github.com/mikf/gallery-dl/issues/5262#issuecomment-2217989092)
I'm trying to automatically get people's alt/other website accounts, but when I do
gallery-dl --dump-json https://twitter.com/username
it gets all the tweets it would when downloading https://twitter.com/username
Basically the idea is that --dump-user-json returns just the "user" part of what is written to metadata files. Using --range 0 is usually good enough, but would fail if the user never posted anything.
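One way to make that approach robust to empty timelines is to fall back gracefully when no entries come back. The sketch below assumes the parsed --dump-json output is a JSON list whose entries end with a metadata dict containing a "user" key; the exact dump format may differ, so treat this as illustrative only:

```python
import json


def first_user(dump_text):
    """Return the "user" dict from the first metadata entry of a
    gallery-dl --dump-json dump, or None if there are no entries
    (e.g. the user never posted anything).

    ASSUMPTION: the dump is a JSON list and each entry is itself a list
    whose last element is the metadata dict.
    """
    try:
        entries = json.loads(dump_text)
    except ValueError:
        return None
    for entry in entries:
        if isinstance(entry, list) and entry and isinstance(entry[-1], dict):
            user = entry[-1].get("user")
            if user:
                return user
    return None
```

A caller can then check for `None` instead of having the whole pipeline fail on accounts with zero posts.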