davidteather / TikTok-Api

The Unofficial TikTok API Wrapper In Python
https://davidteather.github.io/TikTok-Api
MIT License
4.77k stars 965 forks

[BUG] Can't scrape user data #1076

Open MONNNNNNNSTER opened 12 months ago

MONNNNNNNSTER commented 12 months ago
from TikTokApi import TikTokApi
import asyncio
import os
import json

#ms_token = os.environ.get(
 #   "ms_token", "bhzYCR6imjxwZW5yDu1L3iMUCdb50-Qf-EbqYgN3zh5rgNbRWB96p2Z4tFAHEcw-5jVjykJDmhK94Foz2qtIRS1WATADJCj_X34vsiEz8x2PMkkd3FZIkK2WAk_L3MtCqjdvo3Q="
#)  # set your own ms_token, think it might need to have visited a profile

async def user_example(username, count):
    videos_data = []
    ms_token = None  # anonymous session; TikTok may require a real msToken cookie value
    async with TikTokApi() as api:
        await api.create_sessions(ms_tokens=[ms_token], num_sessions=1, sleep_after=3)
        user = api.user(username)

        # Fetch all videos (assuming a user won't have more than 100 videos for simplicity)
        async for video_obj in user.videos(count=100):  
            video = video_obj.as_dict  
            video_id = video['id']
            username = video['author']['uniqueId']
            views = video['stats']['playCount']
            friendly_url = f"https://www.tiktok.com/@{username}/video/{video_id}"

            videos_data.append({
                "views": views,
                "url": friendly_url
            })

    # Sorting the videos_data list by views in descending order
    videos_data.sort(key=lambda x: x['views'], reverse=True)

    # Keep only the top 'count' videos
    top_videos = videos_data[:count]

    # Save to JSON file
    with open('utils/videosLists.json', 'w', encoding='utf-8') as file:
        json.dump(top_videos, file, ensure_ascii=False, indent=4)

    return top_videos

if __name__ == "__main__":
    videos_sorted_data = asyncio.run(user_example("faitesentrerlesvictimes", 10))
    for video in videos_sorted_data:
        print(video)
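Not part of the original report, but worth noting for anyone reproducing it: the script passes ms_token = None, which gives an anonymous session. A minimal sketch of reading the token from the environment instead, as the commented-out lines at the top of the script suggest (the ms_token environment variable name and the load_ms_token helper are my own choices; the value itself would be the msToken cookie from a logged-in tiktok.com browser session):

```python
import os

def load_ms_token():
    # Reads the ms_token environment variable; the value is the msToken
    # cookie from tiktok.com in a logged-in browser session. Returns None
    # (anonymous session) when unset, which is more likely to yield empty
    # responses from TikTok.
    token = os.environ.get("ms_token")
    if not token:
        print("warning: ms_token not set; TikTok may return empty responses")
    return token or None
```

In the script above you would then use `ms_token = load_ms_token()` before calling `create_sessions`.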

Expected behavior

Trying to get the top 10 videos of an account. It worked perfectly 2 weeks ago; now I get this error:

Error Trace (if any)

PS C:\Users\axelp\autosoustitre> python.exe .\scrapTiktok.py
TikTokApi.user(username='faitesentrerlesvictimes', user_id='None', sec_uid='None')
Traceback (most recent call last):
  File "C:\Users\axelp\autosoustitre\scrapTiktok.py", line 46, in <module>
    videos_sorted_data = asyncio.run(user_example("faitesentrerlesvictimes", 10))
  File "C:\Users\axelp\AppData\Local\Programs\Python\Python310\lib\asyncio\runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "C:\Users\axelp\AppData\Local\Programs\Python\Python310\lib\asyncio\base_events.py", line 649, in run_until_complete
    return future.result()
  File "C:\Users\axelp\autosoustitre\scrapTiktok.py", line 17, in user_example
    videos_list = [video async for video in user.videos(count=30)]
  File "C:\Users\axelp\autosoustitre\scrapTiktok.py", line 17, in <listcomp>
    videos_list = [video async for video in user.videos(count=30)]
  File "C:\Users\axelp\AppData\Local\Programs\Python\Python310\lib\site-packages\TikTokApi\api\user.py", line 112, in videos
    await self.info(**kwargs)
  File "C:\Users\axelp\AppData\Local\Programs\Python\Python310\lib\site-packages\TikTokApi\api\user.py", line 76, in info
    resp = await self.parent.make_request(
  File "C:\Users\axelp\AppData\Local\Programs\Python\Python310\lib\site-packages\TikTokApi\tiktok.py", line 429, in make_request
    raise EmptyResponseException()
TypeError: TikTokException.__init__() missing 2 required positional arguments: 'raw_response' and 'message'
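Reading the bottom of the trace: the TypeError is raised while the library is constructing EmptyResponseException (the bare `raise EmptyResponseException()` at tiktok.py line 429 omits the `raw_response` and `message` arguments the exception class requires), so the real failure is an empty response from TikTok, which the TypeError then masks. Since that is what actually reaches the caller, a hedged retry sketch can keep the script alive across transient empty responses (pure asyncio, no TikTokApi imports; `call_with_retries` and the factory pattern are my own names, not library API):

```python
import asyncio

async def call_with_retries(coro_factory, attempts=3, delay=5.0):
    # Retries an async call a few times, pausing between attempts.
    # coro_factory must be a zero-argument callable that returns a fresh
    # coroutine on each call. TypeError is caught because that is what the
    # library surfaces here in place of EmptyResponseException.
    last_exc = None
    for attempt in range(attempts):
        try:
            return await coro_factory()
        except TypeError as exc:
            last_exc = exc
            if attempt < attempts - 1:
                await asyncio.sleep(delay)
    raise last_exc
```

For example, the video-collection loop could be wrapped as `await call_with_retries(lambda: collect_videos(user))`, where `collect_videos` is whatever coroutine drives `user.videos(...)`.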


If someone has the same issue, or if you could help me fix this, that would be great :)
calvin5walters commented 12 months ago

Getting same issue

MONNNNNNNSTER commented 11 months ago

up

MaximilienHe commented 11 months ago

Up! It seems that TikTok changed something on their website; I can't find any working external services (video downloaders, etc.) either. Any idea how long it will take to be patched?

ketanmalempati commented 11 months ago

Anyone got it working?

reversecoderslab commented 9 months ago

Have a look at my API: https://rapidapi.com/reversecoders/api/tiktok4free

It was updated today, and you can scrape all user-relevant data again.