yashsehgal opened 1 month ago
Hey Yash, this looks interesting and I wanted to contribute.
I had some questions/thoughts about it. What are your thoughts regarding these?
Hi @louremipsum - Thanks for showing interest in this one. This is one of the really important fixes as too many API calls are happening as of now inside the hook 😅
The questions/thoughts you mentioned are really good. Sharing my responses for the same.
Caching with a single source of truth would be a good strategy to reduce API calls (although we would have to decide when to revalidate data, or maybe adjust the scope of caching: things that are more likely to change would need short-lived caching, while requests that don't change so often, like the user profile, might be cached for longer).
This is a valid point. We should make a list of categorised data items that need / do not need caching, as caching unnecessarily will increase the complexity as well.
I have personally never worked with caching, so I would like you to lead this thread (and guide me as well), and we can continue with the fixes after that.
I looked through the repository and was not able to find anything regarding rate limiting, which I think would also be an important aspect of this, along with handling rate-limiting errors separately.
You're right. As of now, there's no rate limiting applied to any method or API call. However, it's really important and good to have.
Custom pagination support / batch fetching when fetching repos or other large datasets from the GH API
Again, a good point! As of now, there is no such pagination applied to the API calls. Only one method for fetching repos has it, i.e. 100 rows per page. This can be improved and made configurable, something like:
```js
const { getRepositories } = useGitHub({
  username: "...",
  personalAccessToken: "..."
});

const allRepos = getRepositories({
  rowsPerPage: N // can be any number
}).all();
```
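For context, an `.all()` helper like this would presumably have to walk GitHub's `page` parameter under the hood, since the REST API caps `per_page` at 100. A minimal sketch of that batching (where `fetchAllRepos` and its parameters are assumed names for illustration, not the hook's current API):

```ts
// Sketch only: batch-fetch repos page by page until a short page is returned.
// The endpoint and per_page/page parameters come from GitHub's REST API;
// fetchAllRepos and rowsPerPage are illustrative names.
async function fetchAllRepos(
  username: string,
  token: string,
  rowsPerPage: number = 100 // GitHub caps per_page at 100
): Promise<unknown[]> {
  const all: unknown[] = [];
  let page = 1;

  while (true) {
    const res = await fetch(
      `https://api.github.com/users/${username}/repos?per_page=${rowsPerPage}&page=${page}`,
      { headers: { Authorization: `Bearer ${token}` } }
    );
    const batch: unknown[] = await res.json();
    all.push(...batch);

    // A batch smaller than the page size means we've reached the last page.
    if (batch.length < rowsPerPage) break;
    page++;
  }

  return all;
}
```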
Potential use of GraphQL also?
There are some applications of the GraphQL API in the code, e.g. fetching pinned repos. Would you like to elaborate on this one more?
Great to see the questions coming in; they're really helpful for shaping the coding patterns for the next versions. TYSM! ✨
haha, thanks for your thoughts. My response to each of your points:
I have personally never worked with caching, so I would like you to lead this thread (and guide me as well), and we can continue with the fixes after that.
I have also worked on only very simple caching (caching an Auth0 M2M token to reduce API calls going to Auth0 directly), but it was a fairly simple use case, so maybe we can discuss thoughts and figure it out together? Also, I have some initial ideas for the list of what to cache and what not to cache:
This can be a starting point for caching; it doesn't have to be strictly this, but it's up for further discussion.
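Just to make that split concrete, a per-category TTL map could look something like the sketch below; the categories and durations here are placeholders I'm assuming for illustration, not a decided list:

```ts
// Illustrative sketch of a per-category cache policy; the categories and
// TTL values are assumptions to make the idea concrete, not agreed numbers.
type CacheCategory = "userProfile" | "pinnedRepos" | "repoList" | "followers";

const CACHE_TTL_MS: Record<CacheCategory, number> = {
  userProfile: 60 * 60 * 1000, // rarely changes -> cache longer
  pinnedRepos: 30 * 60 * 1000,
  repoList: 5 * 60 * 1000,     // changes more often -> shorter TTL
  followers: 5 * 60 * 1000,
};

interface CacheEntry<T> {
  value: T;
  fetchedAt: number;
}

function isFresh<T>(entry: CacheEntry<T>, category: CacheCategory): boolean {
  return Date.now() - entry.fetchedAt < CACHE_TTL_MS[category];
}
```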
You're right. As of now, there's no rate limiting applied to any method or API call. However, it's really important and good to have.
There is the rate limiting which GitHub does (primary and secondary), and further rate limiting which can be done inside the hook itself. We would also have to take GitHub's rate limit seriously, since according to them they may ban the client if it keeps making requests after being rate limited.
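For reference, GitHub reports the primary quota through the `x-ratelimit-remaining` and `x-ratelimit-reset` response headers (secondary limits usually come with a `retry-after` header), so a minimal sketch of how the hook could stop hammering the API might look like this (`githubFetch` is an assumed wrapper name, not existing code):

```ts
// Sketch: wrap fetch so the hook can react to GitHub's primary rate limit
// (x-ratelimit-remaining / x-ratelimit-reset headers) instead of blindly
// firing more requests once the quota is exhausted.
class RateLimitError extends Error {
  constructor(public resetAt: Date) {
    super(`GitHub rate limit hit; resets at ${resetAt.toISOString()}`);
  }
}

async function githubFetch(url: string, token: string): Promise<Response> {
  const res = await fetch(url, {
    headers: { Authorization: `Bearer ${token}` },
  });

  const remaining = Number(res.headers.get("x-ratelimit-remaining"));
  const reset = Number(res.headers.get("x-ratelimit-reset")); // unix seconds

  if ((res.status === 403 || res.status === 429) && remaining === 0) {
    // Stop calling until the reset time instead of retrying immediately.
    throw new RateLimitError(new Date(reset * 1000));
  }
  return res;
}
```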
There are some applications of the GraphQL API in the code, e.g. fetching pinned repos. Would you like to elaborate on this one more?
Honestly, I was not sure about this one because I haven't explored the whole repo yet, but it's one of the things I read about on the GH blog: GraphQL is useful for combining multiple queries into one request when fetching data. So maybe (I'm kinda reaching and thinking out loud) the hook can offer two options: serving individual data via REST, or, if the user knows they are going to need some set of data every time, using the GraphQL API to fetch multiple pieces of info in one request.
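As a rough illustration of that idea (`fetchProfileBundle` is a hypothetical helper, and the exact fields are just examples), a single request to the GraphQL endpoint at https://api.github.com/graphql could bundle data that would otherwise be several REST calls:

```ts
// Sketch: one GraphQL request that bundles profile info and pinned repos.
// fetchProfileBundle is an illustrative helper name, not part of the hook today.
async function fetchProfileBundle(login: string, token: string) {
  const query = `
    query($login: String!) {
      user(login: $login) {
        name
        followers { totalCount }
        pinnedItems(first: 6, types: [REPOSITORY]) {
          nodes {
            ... on Repository { name stargazerCount primaryLanguage { name } }
          }
        }
      }
    }`;

  const res = await fetch("https://api.github.com/graphql", {
    method: "POST",
    headers: { Authorization: `Bearer ${token}` },
    body: JSON.stringify({ query, variables: { login } }),
  });
  return (await res.json()).data.user;
}
```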
Update: I have added the recurring meeting details for everyone, for all the upcoming Wednesdays and Fridays during the month of October. Please check the details in the issue description as well.
I see how you're planning to lay out the patterns for managing caches for different methods. As you have considerable experience with the same (caching and related things), can we work on a POC for this one? A POC/dev version for v1.3, where we can test how things are working in terms of performance; it should also have a good DX.
I would love to assign this one to you, and we can start working on building a POC that will focus on the following points:
- Having a better DX for all the methods + sub-methods.
- Is it going to be a single config that can be set as true at the time of hook initialisation,

```js
const gh = useGitHub({
  username: "...",
  personalAccessToken: "...",
  enableCache: true
});
```

or is it going to be a config object that can be passed inside the respective methods?
```js
const gh = useGitHub({
  username: "...",
  personalAccessToken: "...",
});

const allRepos = gh.getRepositories({ enableCache: true });
const user = gh.getUserInfo({ enableCache: false });
const pinnedRepos = gh.getRepositories({ enableCache: true }).pinned();
```
I know the latter one looks a bit too customisable, but it can be really helpful. What are your thoughts on this?
Let's also have a decent demo prepared for testing the before/after performance of the hook. That will be a good start to see what things are improving and how much better the performance is getting.
For the POC, we can build a simple profile card component, inside which we can show the user's basic info, all the pinned repos, the language distribution, and a dropdown component with a list of languages; toggling it will give us a list of repos with the selected language. That will surely break the API limit 😁
Also, when starting to write the final update for this one, let's plan and break it into sub-features/fixes or PRs for better testing.
Is it going to be a single config that can be set as true at the time of hook initialisation, or is it going to be a config object that can be passed inside the respective methods?

- Option 1: There can be a default config for caching which can be set by using {enableCache: true} at the time of hook initialisation, for people who don't want to think too much while caching and just want it to work.
- Option 2: The user can enable caching per method with more parameters like swr: boolean or revalidateAfter: number, or maybe a function to manually re-fetch when the user wants, and such. (Will think about it and inform you here.)

Let's also have a decent demo prepared for testing the before/after performance of the hook. That will be a good start to see what things are improving and how much better the performance is getting. For the POC, we can build a simple profile card component, inside which we can show the user's basic info, all the pinned repos, the language distribution, and a dropdown component with a list of languages; toggling it will give us a list of repos with the selected language. That will surely break the API limit

Maybe I should make that component before testing and prepare/think about some performance metrics: off the top of my head, I can think of number of API calls, time to fetch, etc.

Although I'm a bit busy right now, maybe I'll try to make it by Wednesday. Until then, if I find time, I will research more on definite ways to improve performance.
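To make those two metrics measurable in the demo, one rough option (just a sketch; `instrumentedFetch` and the `metrics` object are names I'm assuming here, not existing code) is to wrap fetch so every call is counted and timed:

```ts
// Sketch: wrap fetch in the demo to log the two metrics mentioned above
// (number of API calls, time to fetch). All names are illustrative.
const metrics = { apiCalls: 0, totalFetchMs: 0 };

async function instrumentedFetch(
  input: RequestInfo | URL,
  init?: RequestInit
): Promise<Response> {
  metrics.apiCalls++;
  const start = performance.now();
  try {
    return await fetch(input, init);
  } finally {
    metrics.totalFetchMs += performance.now() - start;
  }
}
```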
Is it going to be a single config that can be set as true at the time of hook initialisation, or is it going to be a config object that can be passed inside the respective methods?
- Option 1: There can be a default config for caching which can be set by using {enableCache: true} at the time of hook initialisation, for people who don't want to think too much while caching and just want it to work.
- Option 2: The user can enable caching per method with more parameters like swr: boolean or revalidateAfter: number, or maybe a function to manually re-fetch when the user wants, and such. (Will think about it and inform you here.)
Both of the options you have mentioned sound good as a starting point. This will give devs more flexibility. We can include these patterns in the demo for testing.
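To keep the two patterns side by side while building the demo, here's a rough sketch of how the config shapes could look; every field name here (enableCache, swr, revalidateAfter) is only the proposal from this thread, not an implemented API:

```ts
// Rough shape for the two options discussed above; all fields are proposals
// from this thread, not the hook's current API.
interface UseGitHubConfig {
  username: string;
  personalAccessToken: string;
  enableCache?: boolean; // Option 1: one global default at initialisation
}

interface MethodCacheConfig {
  enableCache?: boolean;    // Option 2: per-method override
  swr?: boolean;            // serve stale data, revalidate in the background
  revalidateAfter?: number; // ms after which cached data is re-fetched
}

// Example of how a method signature could accept the per-method config.
declare function getRepositories(config?: MethodCacheConfig): Promise<unknown[]>;
```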
Let's also have a decent demo prepared for testing the before/after performance of the hook. That will be a good start to see what things are improving and how much better the performance is getting. For the POC, we can build a simple profile card component, inside which we can show the user's basic info, all the pinned repos, the language distribution, and a dropdown component with a list of languages; toggling it will give us a list of repos with the selected language. That will surely break the API limit
Maybe I should make that component before testing and prepare/think about some performance metrics: off the top of my head, I can think of number of API calls, time to fetch, etc.
Sounds promising, we can surely continue with these 👍🏽
Although I'm a bit busy right now, maybe I'll try to make it by Wednesday. Until then, if I find time, I will research more on definite ways to improve performance.
Sure, no hurry. Start whenever you're available 😊 We can keep updating this thread as we find and think of more solutions.
Also, adding to the discussion: we can work on the API calls happening inside the hook. As of today, all the API calls (for user info, the followers and following lists, the list of repos, the list of pinned repos, etc.) are happening at the time of hook initialisation. This is surely not a good approach and consumes a lot of tokens.
As a solution, we can create getters for all the data retrieval functions, so that the APIs are called only when their respective method is invoked. We can think more about what the coding pattern will look like. Thanks!
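As a rough sketch of that getter idea (the names here, like `lazy` and `useGitHubLazy`, are purely illustrative and not the hook's current code), the hook could return memoized async getters so each endpoint is hit only on first use:

```ts
// Sketch: return lazy, memoized getters instead of fetching everything at
// hook initialisation. Each API call happens only when its getter is invoked,
// and repeated calls reuse the already-started promise.
import { useMemo } from "react";

function lazy<T>(loader: () => Promise<T>): () => Promise<T> {
  let cached: Promise<T> | undefined;
  return () => (cached ??= loader());
}

export function useGitHubLazy(username: string, token: string) {
  return useMemo(() => {
    const headers = { Authorization: `Bearer ${token}` };
    return {
      getUserInfo: lazy(() =>
        fetch(`https://api.github.com/users/${username}`, { headers }).then((r) => r.json())
      ),
      getRepositories: lazy(() =>
        fetch(`https://api.github.com/users/${username}/repos`, { headers }).then((r) => r.json())
      ),
    };
  }, [username, token]);
}
```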
As of now, the number of API fetches happening inside the hook for all the methods, i.e. userInfo, getRepositories, and their respective sub-methods, is really high. I have some ideas as solutions to this issue, which will help reduce the API calls and the overall dependency on them across all methods.