Closed: @Muhammad-Owais-Warsi closed this pull request 8 months ago.
@Muhammad-Owais-Warsi is attempting to deploy a commit to the itisohm's Team on Vercel.
A member of the Team first needs to authorize it.
Hey @Muhammad-Owais-Warsi, there's a problem with the current backend. There's a local array used to loop through the repos that need to be checked for GFIs. If we deploy the backend, the original array will be wiped, which isn't a problem as of now but might become one in the future. Shall we tackle that first?
Hey @ItIsOHM, I really want to contribute to this issue, but could you please explain further? What you mentioned above isn't very clear. My questions are:
- What is the current problem you are facing?
- What are the changes you want?

What are your thoughts on this?
Hey @Muhammad-Owais-Warsi, sorry for the late reply. What I meant was that in the current backend deployed on AWS EB, we have a local array that stores the repos that need to be checked. If we deploy a new version of the backend, that local array will get wiped. This would lead to the cron job not taking the older repos into account, so they won't be checked. To counter this, we need to fetch the list of repos to be checked at the start of every backend session, so that we always have a full, fresh list.
I hope I was clearer this time. Let me know if you want further clarification, happy to help :)
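For illustration, a minimal sketch of that startup fetch (assuming an Express app and a hypothetical `loadReposFromDb()` helper; none of these names are from the actual repo):

```ts
import express from "express";

const app = express();

// In-memory list the cron job loops over; a redeploy starts this empty.
let reposToCheck: string[] = [];

// Hypothetical helper standing in for the real DB query.
async function loadReposFromDb(): Promise<string[]> {
  return []; // placeholder: would return every subscribed repo URL
}

app.listen(3000, async () => {
  // Rehydrate at the start of every backend session so a fresh deploy
  // always begins with the full list of repos to check.
  reposToCheck = await loadReposFromDb();
  console.log(`Loaded ${reposToCheck.length} repos to check`);
});
```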
Okay, so by looking at your code, I think we should save the repoUrl in the subSchema that you have made, and check that. I think that's the only way to prevent it.
@Muhammad-Owais-Warsi how about we save the repos we need to check in some local data structure every time the session has a fresh start? That'd reduce the calls made to the DB imo
If we create some local data structure, it will also get wiped once the code is deployed. Instead, we can add a check on whether the length of the local array is 0 or more: if it is 0, we fetch the data from the DB, and if not, we run our normal code. What do you think about this?
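Something like this, maybe (a minimal sketch assuming Prisma, with a hypothetical `subscription` model that has a `repoUrl` field; the model and field names are placeholders, not our actual code):

```ts
import { PrismaClient } from "@prisma/client";

const prisma = new PrismaClient();

// Local array; a redeploy wipes it.
let reposToCheck: string[] = [];

async function getReposToCheck(): Promise<string[]> {
  if (reposToCheck.length === 0) {
    // Array was wiped by a fresh deploy: rebuild it from the DB.
    const subs = await prisma.subscription.findMany({
      select: { repoUrl: true },
    });
    reposToCheck = subs.map((s) => s.repoUrl);
  }
  // Non-empty: run the normal code against the existing array.
  return reposToCheck;
}
```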
Yes, that works imo. We can fetch the data every time the backend is deployed, with a check on whether the array length is lower than the number of unique repos in the DB.
That's great...👍
Done :)
Hey @Muhammad-Owais-Warsi! The changes LGTM. I'll test this code out on my machine, and if it works fine I'll merge it. Also, is there a way to auto-deploy the backend on AWS EB on merge, like the Vercel thing?
Sorry, but I don't have much knowledge about cloud services. But the AWS docs say that we have to create a CI/CD pipeline for automatic deployment.
I see! Thanks for the suggestion, I'll see to that :)
Hey, is the code working fine?
Hi @Muhammad-Owais-Warsi, sorry for the late reply, I wasn't able to look into this project for a couple of weeks. I tested out your changes and there's a small issue: when you're fetching the repos for the repoToCheck array, you're fetching every repo, due to which we have some duplications too. Instead of that, try using the findUnique method.
Also, I have pushed a new commit that'll help in raising the rate limit, so that should be helpful too.
I think there will be no duplications, because the data from the DB is fetched only when the array length is 0.
No, by duplication I mean that when the repo array length is zero and you fetch the repos, it fetches all the repo entries. So suppose four people have subscribed to a repo: that repo gets added to the array four times, which leads to unnecessary API calls.
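A rough sketch of what I mean (assuming the Prisma setup from the earlier sketch; note that Prisma's findUnique fetches a single row by a unique key, so this sketch does the actual deduplication with findMany and distinct instead, and the model and field names are still placeholders):

```ts
import { PrismaClient } from "@prisma/client";

const prisma = new PrismaClient();

async function getUniqueReposToCheck(): Promise<string[]> {
  // `distinct` collapses the one-row-per-subscriber duplicates, so a repo
  // with four subscribers shows up once instead of four times.
  const rows = await prisma.subscription.findMany({
    select: { repoUrl: true },
    distinct: ["repoUrl"],
  });
  return rows.map((r) => r.repoUrl);
}
```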
Alright, now I understand. Thanks for letting me know, I'll make the correction. Also, I want to tell you that I just tested out your currently hosted website: when I enter any repo URL, it just gives me the following message: "Couldn't find the repo, please check the repo URL again." So what's the matter?
Hey @ItIsOHM, I have made the suitable changes, you can check them.
Ohhhh right!! Thanks, I'll make the change.
Also, be sure to resolve the conflicts.
@Muhammad-Owais-Warsi, as I said, please resolve the conflicts. I pushed some changes a few days ago regarding the rate-limit bug, so please fetch the latest version of the repo and commit changes from that.
Updated the Backend Code. Used import syntax instead of require().
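For example, roughly this kind of change (using express as an illustrative module name):

```ts
// Before (CommonJS):
// const express = require("express");

// After (ES modules):
import express from "express";
```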