vosmiic / jellyfin-ani-sync

Synchronize anime watch status between Jellyfin and anime tracking sites.
GNU General Public License v3.0

Anilist API Manual Sync erroring out. #109

Closed · macluxHD closed 3 months ago

macluxHD commented 4 months ago

As already mentioned in my PR #106, users with larger lists do encounter this error. During testing I used this user's list: https://anilist.co/user/PoXonn/

After looking into it a bit further, it is indeed due to some kind of rate limit. It is somewhat strange, though: if we look at the headers returned just before the rate-limited call, the X-RateLimit-Remaining value decreases after each request, and it clearly says there should be 60 requests remaining. Yet during testing the error always occurred after 30 requests. { X-RateLimit-Limit: 90, X-RateLimit-Remaining: 60 }

But looking at the response to the next API call, we are clearly hitting a rate limit. My guess is that we are simply calling the API too fast. { StatusCode: 429, ReasonPhrase: 'Too Many Requests' }
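
For anyone who wants to reproduce the observation, here is a small diagnostic sketch (not plugin code) that prints those headers after a single unauthenticated call to the public AniList GraphQL endpoint; the query is just a minimal valid one:

using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class RateLimitProbe {
    static async Task Main() {
        using var client = new HttpClient();
        var content = new StringContent(
            "{\"query\":\"{ Media(id: 1) { id } }\"}", Encoding.UTF8, "application/json");
        HttpResponseMessage response =
            await client.PostAsync("https://graphql.anilist.co", content);
        // TryGetValues returns false if the server omitted the header.
        foreach (var name in new[] { "X-RateLimit-Limit", "X-RateLimit-Remaining" })
            if (response.Headers.TryGetValues(name, out var values))
                Console.WriteLine($"{name}: {string.Join(",", values)}");
        Console.WriteLine($"{(int)response.StatusCode} {response.ReasonPhrase}");
    }
}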

So one solution would be simply to increase the amount of time slept between calls. During testing I used 2 seconds, which resolved the issue; I no longer hit the rate limit.
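
For illustration, a minimal sketch of that pacing (FetchPageAsync is a made-up stand-in for the plugin's existing per-page AniList call, not its real method name):

using System;
using System.Threading.Tasks;

class PacedFetch {
    // Hypothetical stand-in for the plugin's existing paged AniList request.
    static Task FetchPageAsync(int page) {
        Console.WriteLine($"fetching page {page}");
        return Task.CompletedTask;
    }

    static async Task Main() {
        for (int page = 1; page <= 5; page++) {
            await FetchPageAsync(page);
            // 2 seconds between calls was enough to avoid the 429s during testing.
            await Task.Delay(TimeSpan.FromSeconds(2));
        }
    }
}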

Another solution could be to use a different query, which I found while looking through the documentation. With the MediaListCollection query, the API responds with the whole list in one call, so we would only need to call the API once. I propose we replace the current GraphQL query with this one:

query ($status: MediaListStatus, $userId: Int) {
  MediaListCollection(userId: $userId, status: $status, type: ANIME) {
    lists {
      entries {
        media {
          siteUrl
        }
        completedAt {
          day
          month
          year
        }
        progress
      }
    }
  }
}

This would speed up retrieval of the user's list and would not trigger any rate limits.

macluxHD commented 4 months ago

Actually, looking at the documentation for this query again, my previous statement that the whole list is returned by the proposed query is not true. The limit is just much larger: 500 entries per chunk. So to retrieve the whole list we would still need to step through these chunks; the updated query would then look like this:

query ($status: MediaListStatus, $userId: Int, $chunk: Int, $chunkSize: Int) {
  MediaListCollection(userId: $userId, status: $status, chunk: $chunk, perChunk: $chunkSize, type: ANIME) {
    hasNextChunk
    lists {
      entries {
        media {
          siteUrl
        }
        completedAt {
          day
          month
          year
        }
        progress
      }
    }
  }
}

We would then use hasNextChunk to check whether we have fetched every chunk, incrementing the chunk value the same way the current page value is incremented.
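
As a rough sketch of that loop (the user id, the status value, and the general shape are illustrative, not the plugin's actual code):

using System;
using System.Net.Http;
using System.Net.Http.Json;
using System.Text.Json;
using System.Threading.Tasks;

class ChunkedFetch {
    const string Query = @"query ($status: MediaListStatus, $userId: Int, $chunk: Int, $chunkSize: Int) {
      MediaListCollection(userId: $userId, status: $status, chunk: $chunk, perChunk: $chunkSize, type: ANIME) {
        hasNextChunk
        lists { entries { media { siteUrl } progress } }
      }
    }";

    static async Task Main() {
        using var client = new HttpClient();
        bool hasNextChunk = true;
        // AniList chunks are 1-based; keep requesting until hasNextChunk is false.
        for (int chunk = 1; hasNextChunk; chunk++) {
            var payload = new {
                query = Query,
                variables = new { status = "COMPLETED", userId = 123456, chunk, chunkSize = 500 }
            };
            using HttpResponseMessage response =
                await client.PostAsJsonAsync("https://graphql.anilist.co", payload);
            using JsonDocument doc =
                JsonDocument.Parse(await response.Content.ReadAsStringAsync());
            JsonElement collection =
                doc.RootElement.GetProperty("data").GetProperty("MediaListCollection");
            hasNextChunk = collection.GetProperty("hasNextChunk").GetBoolean();
            // ... merge collection.GetProperty("lists") into the local list here ...
        }
    }
}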

Although slower than previously thought, this would still speed up retrieval of the anime list and result in fewer queries.

vosmiic commented 4 months ago

Great, thanks for investigating this. I'll make the change and see how I get on. I'll probably also increase the thread sleep time to 2 seconds; since this is supposed to work in the background, it shouldn't matter how long it takes.

I'll also look into adding some re-attempt logic: if we get back a 429, we can attempt the call again after a certain amount of time.
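
Something along these lines, assuming we respect a Retry-After header when the server provides one (helper names here are made up, not the plugin's real API):

using System;
using System.Net;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class RetryOn429 {
    // An HttpRequestMessage cannot be resent, so the caller supplies a factory
    // that builds a fresh request for each attempt.
    static async Task<HttpResponseMessage> SendWithRetryAsync(
            HttpClient client, Func<HttpRequestMessage> makeRequest, int maxAttempts = 3) {
        for (int attempt = 1; ; attempt++) {
            HttpResponseMessage response = await client.SendAsync(makeRequest());
            if (response.StatusCode != (HttpStatusCode)429 || attempt == maxAttempts)
                return response;
            // Wait for the server-suggested interval if present, else a fixed fallback.
            TimeSpan delay = response.Headers.RetryAfter?.Delta ?? TimeSpan.FromSeconds(60);
            await Task.Delay(delay);
        }
    }

    static async Task Main() {
        using var client = new HttpClient();
        HttpResponseMessage response = await SendWithRetryAsync(client,
            () => new HttpRequestMessage(HttpMethod.Post, "https://graphql.anilist.co") {
                Content = new StringContent("{\"query\":\"{ Media(id: 1) { id } }\"}",
                    Encoding.UTF8, "application/json")
            });
        Console.WriteLine($"{(int)response.StatusCode} {response.ReasonPhrase}");
    }
}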

vosmiic commented 3 months ago

I have created a PR for the fixes related to this; are you able to test it locally? I have tested it and all seems to be working fine, but it always helps to get a second opinion. The 429 handling isn't included in the PR because I didn't think it was directly related to this issue, but I am planning to work on it separately.

You can either build it using Docker or simply build it manually after pulling it down. I can also build it for you and send you a binary if that helps. Thanks.

vosmiic commented 3 months ago

That PR has been merged now. If you want to run the latest version of the plugin, you will need to either build it yourself or let Docker build it for you. These fixes will be included in the next version of the plugin, but I'm not sure when that will be released; there are a few other changes I want to add before creating a new release.