ash-project / ash_json_api

The JSON:API extension for the Ash Framework
https://hexdocs.pm/ash_json_api
MIT License

Update keyset & offset pagination with tests #92

Closed · gordoneliel closed this 1 year ago

gordoneliel commented 1 year ago

What's new

Simplifies keyset & offset pagination link generation. Pagination links now include next and prev regardless of whether there are records in either direction. If there are no results in a given direction, the next and/or prev link will have a nil value.
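For illustration, this is a sketch of the intended shape of the top-level JSON:API links object when the current page is the last one, written as an Elixir map for brevity (hypothetical URLs and truncated tokens, not copied from the PR):

```elixir
# Hypothetical example: on the last page, "next" is still present but nil.
%{
  "self" => "http://example.com/posts?page[limit]=5&page[after]=g2wAAA...",
  "first" => "http://example.com/posts?page[limit]=5",
  "prev" => "http://example.com/posts?page[limit]=5&page[before]=g2wAAA...",
  "next" => nil
}
```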

Contributor checklist

zachdaniel commented 1 year ago

This is looking great! Had a few comments, want to make sure that we have the right behavior on next/prev page in conjunction with before and after. IIRC more?: true when using before means "there is more stuff before the first thing on this page", and that things before the first item on the page should be considered the "next" page.
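For what it's worth, here is a hypothetical sketch of that rule as I'm describing it (not the PR's actual code); it assumes an Ash keyset page with :results and :more?, and that each record carries its keyset in __metadata__:

```elixir
defmodule NextLinkSketch do
  # With ?page[before]=..., more?: true means there are records *before* the
  # first result on this page, so the "next" link continues in that direction.
  def next_keyset(%{more?: true, results: [first | _]}, %{"before" => _}),
    do: {:before, first.__metadata__.keyset}

  # With ?page[after]=... (or no direction given), "next" continues past the last result.
  def next_keyset(%{more?: true, results: [_ | _] = results}, _params),
    do: {:after, List.last(results).__metadata__.keyset}

  # No more records in that direction: the next link gets a nil value.
  def next_keyset(_page, _params), do: nil
end
```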

gordoneliel commented 1 year ago

@zachdaniel Added all the changes. Before we merge, one thing I noticed was that I get this error sometimes on test runs:

[error] FrameworkError: Framework Error | something went wrong. Error messaging is incomplete so far: %Ash.Error.Page.InvalidKeyset{value: "g2wAAAACdAAAAA13C21pY3Jvc2Vjb25kaAJiAA05 mEGdwZzZWNvbmRhBXcIY2FsZW5kYXJ3E0VsaXhpci5DYWxlbmRhci5JU093BW1vbnRoYQh3Cl9fc3RydWN0X193D0VsaXhpci5EYXRlVGltZXcDZGF5YRN3BHllYXJiAAAH53cGbWludXRlYRp3BGhvdXJhA3cJdGltZV96b25lbQAAAAdFdGMvVVRDdwl6b25lX2FiYnJtAAAAA1VUQ3cKdXRjX29mZnNldGEAdwpzdGRfb2Zmc2V0YQBtAAAAJGNmNzZkYWI1LWRlYjEtNGEyYi1hMDkyLTc0YzMwMDRiNWJlN2o=", key: :before, changeset: nil, query: nil, error_context: [], vars: [], path: [], stacktrace: #Stacktrace<>, class: :invalid}

Happened on this particular test: test "next, prev, first & self links are present" on test/spec_compliance/fetching_data/pagination/keyset_test.exs:17

Code looks alright I think, not sure why it's getting an invalid keyset when I use that same keyset to manually paginate with the Ash Api before using it on the conn.

There is also a space in the middle of the keyset value; not sure if that is how it comes out after encoding?

zachdaniel commented 1 year ago

Something definitely seems strange there... can you reproduce this by running the tests repeatedly? If so, we probably need to address that before merging.

gordoneliel commented 1 year ago

@zachdaniel I am able to reproduce it. I added a run_tests.sh bash script that runs the tests multiple times; it seems to catch it in one of the re-runs. Do you mind taking a look at this when you get the chance? Not sure what's going on here...

gordoneliel commented 1 year ago

I printed out the cursor before using it in the API request, and it looks like there is a space in the token that Ash receives from the GET request. Notice the + appears as an empty space. Maybe something to do with encoding and decoding the token?

g2wAAAACdAAAAA13C21pY3Jvc2Vjb25kaAJiAAp+pmEGdwZzZWNvbmRhDHcIY2FsZW5kYXJ3E0VsaXhpci5DYWxlbmRhci5JU093BW1vbnRoYQh3Cl9fc3RydWN0X193D0VsaXhpci5EYXRlVGltZXcDZGF5YRd3BHllYXJiAAAH53cGbWludXRlYSl3BGhvdXJhAHcJdGltZV96b25lbQAAAAdFdGMvVVRDdwl6b25lX2FiYnJtAAAAA1VUQ3cKdXRjX29mZnNldGEAdwpzdGRfb2Zmc2V0YQBtAAAAJGNjODJlODQ1LTAyNzctNDU2My04NzJiLTk4NDM5NGFjMzZhN2o=
17:41:12.702 [error] FrameworkError: Framework Error | something went wrong. Error messaging is incomplete so far: %Ash.Error.Page.InvalidKeyset{value: "g2wAAAACdAAAAA13C21pY3Jvc2Vjb25kaAJiAAp pmEGdwZzZWNvbmRhDHcIY2FsZW5kYXJ3E0VsaXhpci5DYWxlbmRhci5JU093BW1vbnRoYQh3Cl9fc3RydWN0X193D0VsaXhpci5EYXRlVGltZXcDZGF5YRd3BHllYXJiAAAH53cGbWludXRlYSl3BGhvdXJhAHcJdGltZV96b25lbQAAAAdFdGMvVVRDdwl6b25lX2FiYnJtAAAAA1VUQ3cKdXRjX29mZnNldGEAdwpzdGRfb2Zmc2V0YQBtAAAAJGNjODJlODQ1LTAyNzctNDU2My04NzJiLTk4NDM5NGFjMzZhN2o=", key: :after, changeset: nil, query: nil, error_context: [], vars: [], path: [], stacktrace: #Stacktrace<>, class: :invalid}
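For reference, this is standard form/query decoding at work: the keyset is standard Base64, which can contain + and /, and query-string decoding treats a literal + as an encoded space. A quick illustration with Elixir's standard library, using a shortened stand-in token rather than the one above:

```elixir
# URI.decode_www_form/1 applies the same "+" -> space rule used when decoding
# form/query values, so an un-encoded Base64 keyset gets corrupted in transit.
iex> URI.decode_www_form("g2wAAAp+pg==")
"g2wAAAp pg=="
```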
gordoneliel commented 1 year ago

@zachdaniel Was able to figure it out! Needed to URI-encode the query params for keyset pagination, as they contained non-URI characters. Tests are working as expected now!
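For illustration, the round trip with percent-encoding, using Elixir's standard library on the same stand-in token (not necessarily the exact call used in the PR):

```elixir
# Percent-encode the keyset before putting it in a link, so "+" survives decoding.
iex> URI.encode_www_form("g2wAAAp+pg==")
"g2wAAAp%2Bpg%3D%3D"

iex> URI.decode_www_form("g2wAAAp%2Bpg%3D%3D")
"g2wAAAp+pg=="
```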

zachdaniel commented 1 year ago

Thank you! Sorry I've been a bit slow in responding, been super busy the last few days :)