-
I skipped implementing the [batching section of the JSON-RPC specification](https://www.jsonrpc.org/specification#batch), but it would be nice to implement that for some of the more bursty RPCs. I thi…
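For context, a JSON-RPC 2.0 batch is simply a JSON array of request objects sent in one message, with the server replying with an array of responses matched back by `id`. A minimal sketch using only the stdlib `json` module (the method names are illustrative, not this project's actual RPC surface):

```python
import json

# A batch is an array of ordinary request objects; notifications (no "id")
# get no response, and responses may arrive in any order, so clients match
# replies to requests by "id".
batch = [
    {"jsonrpc": "2.0", "id": 1, "method": "get_status", "params": []},
    {"jsonrpc": "2.0", "id": 2, "method": "get_block", "params": [42]},
]
payload = json.dumps(batch)
# `payload` is what would be sent in a single round trip instead of two.
```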
-
### 📚 Describe the documentation issue
It is unclear what to do regarding batching when using a custom feature map and adjacency matrix. For instance, the docs help when dealing with the batching…
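For graphs given as a raw feature matrix plus adjacency matrix, batching typically means concatenating the node features and placing each adjacency matrix on the block diagonal of one larger matrix, so a single disconnected graph represents the whole batch. A sketch of that idea in plain `torch` (the function name is mine, not the library's API):

```python
import torch

def batch_graphs(feature_mats, adj_mats):
    # Concatenate per-graph node features into one (N_total, D) matrix and
    # build a block-diagonal (N_total, N_total) adjacency, so no edges
    # cross graph boundaries.
    x = torch.cat(feature_mats, dim=0)
    n = sum(a.shape[0] for a in adj_mats)
    adj = torch.zeros(n, n)
    offset = 0
    for a in adj_mats:
        k = a.shape[0]
        adj[offset:offset + k, offset:offset + k] = a
        offset += k
    return x, adj

x, adj = batch_graphs([torch.ones(2, 3), torch.ones(3, 3)],
                      [torch.eye(2), torch.eye(3)])
```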
-
When we run `examples/1_SimpleNet/simplenet.py`, the final thing that's executed is effectively
```python
a = [0.0, 1.0, 2.0, 3.0, 4.0]
model(torch.Tensor(a))
```
This would also work with batching e…
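A minimal sketch of what the batched call could look like, assuming the model accepts a 2-D `(batch, features)` input:

```python
import torch

# Stack inputs along a new leading dimension so the model sees shape
# (batch, 5) instead of (5,). Two copies of the same input, purely for
# illustration.
a = [0.0, 1.0, 2.0, 3.0, 4.0]
batch = torch.tensor([a, a])  # shape (2, 5)
# model(batch) would then return one output row per input
```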
-
Thanks for the amazing work! It really is super fast at bs=1.
Can batched use cases, or dynamic batching, be supported?
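By dynamic batching I mean the usual pattern of collecting incoming requests until a batch fills up or a short deadline passes, then running them together. A stdlib-only sketch of that collection step (the parameter names are illustrative, not any library's API):

```python
import queue
import time

def collect_batch(q, max_size=8, max_wait=0.05):
    # Block for the first request, then keep pulling until the batch is
    # full or max_wait has elapsed since the first arrival.
    batch = [q.get()]
    deadline = time.monotonic() + max_wait
    while len(batch) < max_size:
        remaining = deadline - time.monotonic()
        if remaining <= 0:
            break
        try:
            batch.append(q.get(timeout=remaining))
        except queue.Empty:
            break
    return batch
```

The collected batch would then be stacked into one tensor and run through the model in a single forward pass.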
-
Hey! Thanks for the amazing work.
I was trying to run the code in batches of 8 and ran into an issue with `torch.stack`, since the images for Corvi2023 are not all resized to the same dimensions. I was wondering …
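One common workaround when images can't be resized is to zero-pad each one to the largest height and width in the batch before stacking — a sketch, assuming the model tolerates zero padding on the right and bottom (resizing with a torchvision transform would be the alternative):

```python
import torch
import torch.nn.functional as F

def pad_to_max(images):
    # Pad each (C, H, W) tensor on the right/bottom so all match the
    # largest H and W in the batch, then stack into (B, C, H, W).
    # F.pad's last-dim-first tuple is (left, right, top, bottom).
    max_h = max(img.shape[1] for img in images)
    max_w = max(img.shape[2] for img in images)
    padded = [F.pad(img, (0, max_w - img.shape[2], 0, max_h - img.shape[1]))
              for img in images]
    return torch.stack(padded)

batch = pad_to_max([torch.zeros(3, 4, 6), torch.zeros(3, 5, 5)])
```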
-
### Component(s)
processor/interval
### Is your feature request related to a problem? Please describe.
The `intervalprocessor` exports all metrics strictly on the interval. With sufficient scale, thi…
-
During my work with draw calls (https://github.com/david-vanderson/dvui/issues/83#issuecomment-2267644939), I created the following new API design.
Currently:
1. dvui allocates temporary memory
…
-
https://github.com/graphql/graphql-over-http/blob/main/rfcs/Batching.md
Could potentially even use Parallel.ForEach on the collection (may depend on how context is injected)
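The Parallel.ForEach idea — executing each operation in the batch concurrently and returning the results in request order — can be sketched with a thread pool; the executor function here is a hypothetical stand-in for a real GraphQL executor:

```python
from concurrent.futures import ThreadPoolExecutor

def execute(op):
    # Stand-in for the real per-operation GraphQL executor; in practice
    # this is where per-request context injection would matter.
    return {"data": {"echo": op["query"]}}

def execute_batch(ops):
    # Run the batched operations concurrently; pool.map preserves the
    # input order, so responses line up with requests.
    with ThreadPoolExecutor() as pool:
        return list(pool.map(execute, ops))

results = execute_batch([{"query": "a"}, {"query": "b"}])
```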
-
### Description
As the project develops, many of our tools work with lists of Audio objects with the goal to be that they can be optimized into Pydra workflows and have easy to use pipelines, especia…
-
Hi,
Thank you for the great work you've done on this model! Is there any way to batch the model using `funasr`? I've been trying to batch with padding and set the `padding_mask` to mask out the un…
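For reference, the padding setup I'm describing looks roughly like this in plain `torch` — whether the model expects `True` at padded positions (as here) or at valid positions is exactly the kind of convention I'm unsure about:

```python
import torch
from torch.nn.utils.rnn import pad_sequence

# Two variable-length feature sequences of shape (T_i, D); values are
# placeholders for real audio features.
seqs = [torch.ones(5, 4), torch.ones(3, 4)]
lengths = torch.tensor([s.shape[0] for s in seqs])

padded = pad_sequence(seqs, batch_first=True)  # (B, T_max, D), zero-padded
# True where a position is padding, built by comparing each time index
# against the sequence's true length.
padding_mask = torch.arange(padded.shape[1])[None, :] >= lengths[:, None]
```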