Closed: DOH-SNC1303 closed this issue 6 months ago
Additional Checklist of Updates:
- TRUE/FALSE (instead of specifying the definitions).
- Manual_Review renamed to Validation_Review.
- Output Folder now nests: 1) a Matched_Elements subfolder, and 2) a Validation_Review subfolder.
- Removed the 0_Raw_Data subfolder under Validation_Review (just use the files from Output).
- All_(Def1) and All_(Def2) data sets written to Output to replace 0_Raw_Data.
- ReviewScale variables checked to ensure they are robust against unusual user input scales (such as 1-7).

We are factoring the Review Category labels (True Positive, False Positive) before feeding the reviewer data to `caret::confusionMatrix()`. Under this format, "Uncertain" Review Category labels are set to NA within the factor and, I believe, dropped from the query metric calculations.
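A minimal base-R sketch of that NA behavior (hypothetical `reviewer_labels` vector; `table()` is used here because, like `caret::confusionMatrix()`, it excludes NA entries by default):

```r
# Hypothetical reviewer labels, including "Uncertain"
reviewer_labels <- c("True Positive", "False Positive", "Uncertain",
                     "True Positive", "Uncertain", "False Positive")

# Factor with only the two levels we score; "Uncertain" has no
# matching level, so those entries become NA within the factor
review_factor <- factor(reviewer_labels,
                        levels = c("True Positive", "False Positive"))

sum(is.na(review_factor))  # the two "Uncertain" records became NA

# NA entries are excluded from the counts by default, so the
# "Uncertain" records silently drop out of downstream metrics
table(review_factor)
```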
@DOH-SNC1303
Question: Is there a way to incorporate "Uncertain" review labels into the query performance metrics? I have really only seen 2x2 setups (TP/FP and TN/FN), so dropping the "Uncertain" review categories seems like an acceptable approach...
Considerations for additional SRS variables:
12/14/2023:
Added

```r
# Create Matched Elements Subfolder
# (assumes output_folder already ends with a trailing slash)
fs::dir_create(paste0(output_folder, "Matched_Elements"))
```

to support Matched_Elements subfolder creation (where data is written).
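A path-safe variant, as a sketch: `fs::path()` inserts the separator itself, so it does not matter whether `output_folder` carries a trailing slash, and `fs::dir_create()` is a no-op when the folder already exists (`tempdir()` stands in for the real output folder here):

```r
library(fs)

output_folder <- tempdir()  # stand-in for the real output folder

# fs::path() joins components with "/" regardless of trailing slashes
matched_dir <- path(output_folder, "Matched_Elements")
dir_create(matched_dir)     # safe to call repeatedly

dir_exists(matched_dir)     # TRUE
```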
Add a tool that will support inter-rater/consensus manual review of a sample of pulled records.
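A minimal sketch of drawing such a sample to route to multiple reviewers (hypothetical `pulled_records` data frame and sample size; base R only):

```r
set.seed(42)  # reproducible sample for review

# Hypothetical pulled records
pulled_records <- data.frame(record_id = 1:500)

# Simple random sample of records for inter-rater/consensus review
n_review <- 50
review_sample <- pulled_records[sample(nrow(pulled_records), n_review), ,
                                drop = FALSE]

nrow(review_sample)  # 50 records, each to be scored by every reviewer
```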