You're correct in assuming the `for` loop iterating through `hits` is redundant. It's probably there because I forgot I wasn't using MongoDB.
As for async Meilisearch - is it necessary? The following is from the FastAPI docs:
> You can mix `def` and `async def` in your path operation functions as much as you need and define each one using the best option for you. FastAPI will do the right thing with them.
>
> Anyway, in any of the cases above, FastAPI will still work asynchronously and be extremely fast.
Forgive my amateur understanding of asynchronous Python if this is not the case, but won't FastAPI run asynchronously regardless of whether Meilisearch has an asynchronous client (especially with multiple workers)? If there is a benefit, though (even a nominal one), I'd be willing to merge this.
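For reference, a minimal sketch of the two styles the docs describe (the app and route names here are hypothetical, not from this project):

```python
from fastapi import FastAPI

app = FastAPI()

@app.get("/sync")
def sync_route():
    # A plain `def` path operation: FastAPI runs it in a thread pool,
    # so blocking work here does not stall the event loop.
    return {"style": "def"}

@app.get("/async")
async def async_route():
    # An `async def` path operation runs on the event loop itself, so
    # any blocking call inside it holds up other requests.
    return {"style": "async def"}
```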
Thank you for contributing!
For a "regular" def
FastAPI run it in a thread pool, while it awaits an async def
. I have not looked through the FastAPI code to verify this, but the "normal" way to do this would be to look at the return value and if it is a coroutine then await, otherwise take the alternate path (thread pool in this case). If this is correct this means that search_filters
is not run in the thread pool because it is an async def
, but there is also nothing to await so it just blocks.
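A rough sketch of that dispatch (an illustration, not FastAPI's actual source; it checks the function rather than its return value, and assumes Starlette's `run_in_threadpool` for the thread-pool path):

```python
import asyncio

from starlette.concurrency import run_in_threadpool

async def call_endpoint(func, *args, **kwargs):
    # Sketch of the dispatch described above: coroutine functions are
    # awaited on the event loop; plain functions go to the thread pool.
    if asyncio.iscoroutinefunction(func):
        # An `async def` endpoint is awaited directly, so a blocking
        # call inside it (like a sync Meilisearch search) stalls the loop.
        return await func(*args, **kwargs)
    # A plain `def` endpoint runs in the thread pool instead.
    return await run_in_threadpool(func, *args, **kwargs)
```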
I have not done any specific FastAPI benchmarking, but typically awaiting an `async def` will be faster than a thread pool. I do have some benchmarks you can see here, and the async search was significantly faster when handling multiple requests.
That said, another option would be to change `search_filters` from an `async def` to a `def`. Under light load, my guess is you wouldn't see much difference, if any.
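A sketch of that alternative with the official sync client (the route path, `client` setup, and index name are assumptions, not this repo's exact code):

```python
import meilisearch
from fastapi import FastAPI

app = FastAPI()
client = meilisearch.Client("http://127.0.0.1:7700", "masterKey")

@app.get("/search")
def search_filters(query: str):
    # As a plain `def`, this runs in FastAPI's thread pool, so the
    # blocking sync search does not block the event loop.
    return client.index("companies").search(query)
```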
Understood, thank you! I will run a test and merge this.
I got two different errors.
The first was easy to fix. In `_prepare_meilisearch`, the `create_index` function only takes a string for the `primary_key`. This is different from the main Meilisearch library, which takes an object.
The second problem was that `task_uid` was not an available property of the task variable. Looking at the code, I saw that the intention was basically to `await` the `create_index` function, so I just did that. It looks a little rough, but it works.
Here is the (minified) code before and after.
Before:

```python
def _prepare_meilisearch():
    # `indexes` and `client` are defined earlier (omitted in this
    # minified snippet)
    if not indexes or "companies" not in [index.uid for index in indexes]:
        task = client.create_index("companies", {"primaryKey": "cik"})
        client.wait_for_task(task.task_uid, timeout=None)

_prepare_meilisearch()
```
After:

```python
import asyncio

async def _prepare_meilisearch():
    if not indexes or "companies" not in [index.uid for index in indexes]:
        # The async client's create_index takes primary_key as a string;
        # awaiting it replaces the manual wait_for_task call above.
        await client.create_index("companies", primary_key="cik")

asyncio.new_event_loop().run_until_complete(_prepare_meilisearch())
```
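For what it's worth, `asyncio.run` would be a slightly tidier equivalent here, assuming no event loop is already running at this point; it creates the loop, runs the coroutine, and closes the loop for you:

```python
import asyncio

# Equivalent to new_event_loop().run_until_complete(...) above, but
# asyncio.run also closes the loop when the coroutine finishes.
asyncio.run(_prepare_meilisearch())
```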
I'll add these commits and merge the whole thing, but I just want to make sure I'm following best practices.
The `search_filters` route is async; however, the call to Meilisearch for the search is not. This makes the call to Meilisearch async in order to not block the event loop.
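A minimal sketch of that change, assuming the async Meilisearch client; the route path, index name, and connection details are placeholders rather than this repo's exact code:

```python
from fastapi import FastAPI
from meilisearch_python_async import Client

app = FastAPI()
client = Client("http://127.0.0.1:7700", "masterKey")

@app.get("/search")
async def search_filters(query: str):
    # Awaiting the async client keeps the event loop free to serve
    # other requests while Meilisearch runs the search.
    results = await client.index("companies").search(query)
    return results.hits
```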