Closed Oufattole closed 2 weeks ago
[!IMPORTANT]
Review skipped
Auto reviews are disabled on base/target branches other than the default branch. Please check the settings in the CodeRabbit UI or the `.coderabbit.yaml` file in this repository. To trigger a single review, invoke the `@coderabbitai review` command. You can disable this status message by setting `reviews.review_status` to `false` in the CodeRabbit configuration file.
:warning: Please install the Codecov GitHub app to ensure uploads and comments are reliably processed by Codecov.
Attention: Patch coverage is 72.80164% with 133 lines in your changes missing coverage. Please review.
Project coverage is 74.12%. Comparing base (de8082e) to head (2e3f0fd). Report is 101 commits behind head on dev.
When we flatten to 2D tensors, the sequence length goes over the max_seq_len. We need to handle this more elegantly in load_subject: this subsection of the function should be modified to correctly flatten and then subsample down to max_seq_len. I'll add a doctested public function for handling it; a sketch of the idea is below.
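A minimal sketch of what such a flatten-and-subsample helper could look like. The name, signature, and random-subsampling strategy here are assumptions for illustration; the actual `subsample_subject_data` may truncate or window instead.

```python
import numpy as np


def subsample_to_max_seq_len(
    tokens: np.ndarray, max_seq_len: int, rng: np.random.Generator | None = None
) -> np.ndarray:
    """Subsample an already-flattened 1D token sequence to at most max_seq_len entries.

    Hypothetical helper illustrating the idea described above; not the repo's API.

    >>> subsample_to_max_seq_len(np.arange(10), 4, np.random.default_rng(0)).shape
    (4,)
    >>> subsample_to_max_seq_len(np.arange(3), 4).shape
    (3,)
    """
    if len(tokens) <= max_seq_len:
        return tokens
    rng = rng if rng is not None else np.random.default_rng()
    # Keep a sorted random subset so the relative order of events is preserved.
    keep = np.sort(rng.choice(len(tokens), size=max_seq_len, replace=False))
    return tokens[keep]
```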
This doctest is flaky, and I'm relatively confident it is due to nested-ragged-tensors: subsample_subject_data.
@mmcdermott could you take a look?
This resolves issue #104. Will merge after #103. This implements generation analysis capabilities and sampling features for the model evaluation pipeline.
Changes
- Sampling with temperature control (via the `temperature` parameter) and a configurable sample count via `num_samples` (see the sketch below)
- Fixed a case where `global_end` was incorrect
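As a rough illustration of how `temperature` and `num_samples` typically interact in generation sampling (the function below is hypothetical and not this PR's actual API):

```python
import torch


def sample_next_tokens(
    logits: torch.Tensor, temperature: float = 1.0, num_samples: int = 1
) -> torch.Tensor:
    """Draw num_samples next-token indices from temperature-scaled logits.

    Illustrative only; the PR's generation code may differ. logits is expected
    to have shape (batch, vocab_size); the result has shape (batch, num_samples).
    """
    # Lower temperature sharpens the distribution, higher temperature flattens it.
    probs = torch.softmax(logits / max(temperature, 1e-6), dim=-1)
    return torch.multinomial(probs, num_samples, replacement=True)


# Example: 3 samples per sequence at temperature 0.8.
logits = torch.randn(2, 100)
samples = sample_next_tokens(logits, temperature=0.8, num_samples=3)  # shape (2, 3)
```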