Open Tipmethewink opened 1 year ago
I hope this gets merged in, this is really useful. I just realized the implication: I'm currently on a project and thought it would pull all the issues, not just the first page capped at 20.
I want all the pages, so that I have a copy of all my GitLab issues available to me in Obsidian. If you specify a maximum number of pages, what's the criterion for the pages it returns? The most recent, the open issues, the highest-priority issues? Selfishly, I created the branch with my use case in mind, and it doesn't seem to be a problem for Obsidian or GitLab (I use the cloud version).
On Thu, 23 May 2024 at 18:41, Ben Roberts commented on this pull request:
In src/gitlab-api.ts (https://github.com/benr77/obsidian-gitlab-issues/pull/22#discussion_r1612089191):
```diff
 return requestUrl(params)
   .then((response: RequestUrlResponse) => {
     if (response.status !== 200) {
       throw new Error(response.text);
     }
-    return response.json as Promise<…>;
-  });
+    if ("x-next-page" in response.headers && response.headers["x-next-page"] != "") {
+      return GitlabApi.load(url, gitlabToken, response.headers["x-next-page"]).then((j) => {
+        return response.json.concat(j) as Promise<…>;
+      }) as Promise<…>;
+    } else {
+      return response.json as Promise<…>;
+    }
+  });
```
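For illustration, the recursive "follow x-next-page until it is empty" pattern quoted above can be sketched in a self-contained form. The HTTP layer is stubbed with an in-memory set of pages; `Page`, `fetchPage`, and `loadAll` are hypothetical names for this sketch, not identifiers from the plugin.

```typescript
interface Page {
  items: number[];
  nextPage: string; // mirrors the "x-next-page" header; "" when exhausted
}

// Fake paginated endpoint: three pages of results.
const pages: Record<string, Page> = {
  "1": { items: [1, 2], nextPage: "2" },
  "2": { items: [3, 4], nextPage: "3" },
  "3": { items: [5], nextPage: "" },
};

function fetchPage(page: string): Promise<Page> {
  return Promise.resolve(pages[page]);
}

// Recurse while the next-page marker is non-empty, concatenating
// each page's items onto the recursive result.
function loadAll(page: string): Promise<number[]> {
  return fetchPage(page).then((resp) => {
    if (resp.nextPage !== "") {
      return loadAll(resp.nextPage).then((rest) => resp.items.concat(rest));
    }
    return resp.items;
  });
}

loadAll("1").then((all) => console.log(all)); // [1, 2, 3, 4, 5]
```

The recursion terminates only when a page reports an empty next-page marker, and every page fetched so far is held in memory while the chain unwinds.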
Isn't this just going to keep retrieving page after page of results? What happens if there are thousands of results?
Is it not better to specify the maximum number of results in the main filter query?
— View it on GitHub: https://github.com/benr77/obsidian-gitlab-issues/pull/22#pullrequestreview-2074646047
I didn't mean limit the number of pages; I meant tell GitLab to send the number you want in a single page of results. So if you know you have e.g. 500 issues and you do want them all, then set a per_page=500 parameter on the filter. However, I've just checked the GitLab docs and per_page has a maximum value of 100, so I'm not sure this helps here.
If we allow thousands of results to be downloaded, memory and performance are going to become a problem. How can we set an upper limit?
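One possible answer, sketched below with hypothetical names (`loadCapped`, `buildUrl`, and `MAX_PER_PAGE` are not part of the plugin): request the documented maximum of per_page=100 and stop following x-next-page after a configurable number of pages. This sketch assumes the page number can be treated as a plain integer, with 0 meaning "no next page".

```typescript
const MAX_PER_PAGE = 100; // GitLab's documented per_page ceiling

// Illustrates how per_page and page would appear in the query string.
function buildUrl(base: string, page: number): string {
  return `${base}?per_page=${MAX_PER_PAGE}&page=${page}`;
}

// Stop either when the server reports no next page (nextPage === 0 here)
// or when maxPages requests have been issued, whichever comes first.
async function loadCapped(
  fetchPage: (url: string) => Promise<{ items: number[]; nextPage: number }>,
  base: string,
  maxPages: number
): Promise<number[]> {
  const all: number[] = [];
  let page = 1;
  while (page !== 0 && maxPages-- > 0) {
    const resp = await fetchPage(buildUrl(base, page));
    all.push(...resp.items);
    page = resp.nextPage;
  }
  return all;
}
```

With maxPages exposed as a plugin setting, a user with thousands of issues gets a predictable memory ceiling of at most maxPages × 100 issues instead of an unbounded recursive fetch.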
Retrieves all pages of issues if a paginated response is returned. Also allows a project-specific URL to be used for the GitLab URL.