Closed: cuent closed this issue 3 years ago
@cuent Thanks for letting us know. We are taking a look at this and will get back to you ASAP.
@cuent At the moment, we don't support this feature. Perhaps you can try a workaround where you use the model to create a list or dict of predictions beforehand and then look up the appropriate value inside the LF.
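A minimal sketch of that workaround, assuming a pandas DataFrame `df` with `id` and `text` columns and an already-loaded `model` (these names and the 0.5 threshold are illustrative, not from the original report):

```python
from snorkel.labeling import labeling_function

ABSTAIN, NEGATIVE, POSITIVE = -1, 0, 1

# Run inference once, up front, so the model is loaded a single time
# and the LF itself only does a dictionary lookup.
scores = {
    row.id: float(model.predict([row.text])[0])  # `model`, `id`, `text` are assumed
    for row in df.itertuples()
}

@labeling_function()
def lf_model_score(x):
    score = scores.get(x.id)
    if score is None:
        return ABSTAIN
    return POSITIVE if score > 0.5 else NEGATIVE
```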
Thanks, @vkrishnamurthy11. I created a REST service for inference and wrapped it with a labeling function. I'll close the issue. Thanks for your answer :)
@cuent Can you please share the code for what worked, i.e., the labeling function and how you wrapped the REST service?
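A rough sketch of the pattern cuent describes: inference runs behind an HTTP endpoint, and the LF only issues a request, so the TF model (and its thread locks) never ends up in the pickled closure. The endpoint URL, payload shape, and threshold below are assumptions, not the actual code from this thread:

```python
import requests

from snorkel.labeling import labeling_function

ABSTAIN, NEGATIVE, POSITIVE = -1, 0, 1

@labeling_function()
def lf_rest_prediction(x):
    # The worker only holds a URL string; the model lives in the
    # REST service's process, so nothing unpicklable is captured here.
    resp = requests.post(
        "http://localhost:5000/predict",  # assumed endpoint
        json={"text": x.text},            # assumed payload shape
    )
    score = resp.json()["score"]          # assumed response shape
    return POSITIVE if score > 0.5 else NEGATIVE
```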
Issue description
I built a binary classifier and created a `LabelingFunction` that assigns a label if the prediction score is above a threshold. I load the model as a preprocessing step and run inference with a TF model and transformers. However, when I execute `PandasParallelLFApplier`, either it loads the model several times and I run out of memory (if the inference is inside `get_label`), or it fails to pickle with `TypeError: can't pickle _thread.RLock objects` (if the inference is outside the `get_label` function, as in the example). If I want to select a classifier, what is the proper way to do inference?
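For context, a minimal sketch of the failure mode described above, assuming Snorkel v0.9's dask-based parallel applier: the model is loaded at module level, so each worker process has to pickle the LF's closure, model included. The model choice, threshold, and column name are illustrative:

```python
import pandas as pd
from transformers import pipeline

from snorkel.labeling import labeling_function
from snorkel.labeling.apply.dask import PandasParallelLFApplier

ABSTAIN, POSITIVE = -1, 1

# Loaded outside the LF: the pipeline (and its underlying model) is
# captured in the LF's closure, and serializing it for the worker
# processes raises "TypeError: can't pickle _thread.RLock objects".
clf = pipeline("sentiment-analysis")

@labeling_function()
def lf_clf(x):
    return POSITIVE if clf(x.text)[0]["score"] > 0.5 else ABSTAIN

df = pd.DataFrame({"text": ["great product", "terrible service"]})
applier = PandasParallelLFApplier([lf_clf])
L_train = applier.apply(df, n_parallel=2)  # fails as described above
```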