Closed: shrivaths16 closed this pull request 4 months ago.
The `get_items_for_inference` method in `dialog.py` and the `ItemsForInference` class in `runners.py` were updated to include a `batch_size` parameter. This ensures that the batch size is considered during the instantiation of `ItemsForInference`, enhancing the flexibility and efficiency of inference operations.
| File | Change Summary |
|---|---|
| sleap/gui/learning/dialog.py | Updated `get_items_for_inference` method to include `batch_size` parameter in `ItemsForInference`. |
| sleap/gui/learning/runners.py | Added `batch_size` attribute to `ItemsForInference` class and updated `from_video_frames_dict` method to accept `batch_size`. |
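The change summarized above can be sketched as follows. This is a minimal illustration, not the actual SLEAP implementation: the class and method names (`ItemsForInference`, `from_video_frames_dict`, `get_items_for_inference`) come from the summary, but the field types, defaults, and surrounding structure here are assumptions.

```python
# Hypothetical sketch of threading a batch_size parameter from the GUI
# dialog into the inference item container, as described in this PR.
# Real SLEAP signatures may differ.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class ItemForInference:
    """One video with the frames queued for inference (simplified)."""
    video: str
    frames: List[int]


@dataclass
class ItemsForInference:
    """Collection of inference items, now carrying the requested batch size."""
    items: List[ItemForInference] = field(default_factory=list)
    batch_size: int = 4  # new attribute added by this PR (default is a guess)

    @classmethod
    def from_video_frames_dict(
        cls, video_frames: Dict[str, List[int]], batch_size: int = 4
    ) -> "ItemsForInference":
        # from_video_frames_dict now accepts batch_size and forwards it
        # to the constructor instead of relying on a fixed value.
        items = [ItemForInference(v, f) for v, f in video_frames.items()]
        return cls(items=items, batch_size=batch_size)


def get_items_for_inference(
    video_frames: Dict[str, List[int]], batch_size: int
) -> ItemsForInference:
    # Dialog-side helper: the batch size chosen in the GUI is passed along
    # at instantiation time.
    return ItemsForInference.from_video_frames_dict(
        video_frames, batch_size=batch_size
    )


items = get_items_for_inference({"clip.mp4": [0, 1, 2]}, batch_size=16)
print(items.batch_size)  # → 16
```

Passing the batch size at instantiation (rather than hard-coding it downstream) is what lets the GUI expose it as a user-tunable speed/memory trade-off.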
All modified and coverable lines are covered by tests :white_check_mark:

Project coverage is 74.09%. Comparing base (7ed1229) to head (aee00de). Report is 5 commits behind head on develop.
Description
Adds an option to choose the batch size for inference in the GUI, providing an easy way to speed up inference.
Types of changes
Does this address any currently open issues?
[list open issues here]
Outside contributors checklist
Thank you for contributing to SLEAP!
:heart: