bwenyenye closed this issue 10 months ago
What if you try lowering your threshold? (Assuming you do have a decent amount of training data being used.)
Also, I have found that 'dirty' data in the training data can cause the process to halt with that error: it could be records with special characters, or nulls when you haven't specified 'has missing'.
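Something like this is the kind of cleanup I mean, as a rough sketch (the preprocess helper and field names are made up for illustration; 'has missing': True is dedupe's variable-definition option for fields that may be null):

import re

def preprocess(value):
    """Normalize one raw field value before handing it to dedupe."""
    if value is None:
        return None
    # Drop stray special characters that tend to break training data.
    value = re.sub(r"[^\w\s&'.-]", "", value)
    # Collapse whitespace and lowercase for consistent comparisons.
    value = " ".join(value.lower().split())
    # Map empty strings to None; dedupe only tolerates None for a field
    # whose variable definition includes 'has missing': True.
    return value or None

# Hypothetical variable definitions passed when constructing the deduper.
fields = [
    {"field": "title", "type": "String", "has missing": True},
    {"field": "artist", "type": "String", "has missing": True},
]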
I have tried everything, including lowering the threshold and using clean data; nothing has worked so far. However, I may have traced the error to this line of code: b_data = deduper.fingerprinter(full_data)
The lines prior to that, including full_data = ((row['donor_id'], row) for row in read_cur), are working as expected, and I am able to print them out and inspect them. However, b_data is a generator object, and I have not been able to print out its contents for inspection. I have tried converting it to a list and even using the next() function. Any tips on how I can print it out to inspect its contents? Thank you.
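For concreteness, this is the sort of inspection I have been attempting (a sketch; I am assuming dedupe 2.x, where the fingerprinter should yield (block_key, record_id) tuples):

from itertools import islice, tee

# Duplicate the generator so peeking does not consume the copy that
# later gets written to the blocking_map table.
b_data, b_preview = tee(deduper.fingerprinter(full_data))

# Print the first ten items. Each should be a (block_key, record_id)
# tuple; if nothing prints at all, the blocking stage produced no keys,
# which would explain a BlockingError further downstream.
for item in islice(b_preview, 10):
    print(item)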
@fgregg, please help with this. Thanks!
Please provide a reproducible example.
That is not reproducible. I need the data, the training data, and the settings file.
If you cannot share those because of privacy issues, we offer consulting services.
Okay, thank you. One more question: when I print out full_data, this is the output I get:

('r2FEdTZzbOM', ('r2FEdTZzbOM', 'country side', 'sarkodie feat. black sherif'))
('GgeTnpTkzI0', ('GgeTnpTkzI0', 'loaded', 'tiwa savage, asake'))
('aP-MBSrzFNo', ('aP-MBSrzFNo', "'letter from overseas'", 'larry gaaga & black sherif'))
('we5gSjpX03U', ('we5gSjpX03U', 'single', 'kuami eugene'))
('E1YDr0PYg34', ('E1YDr0PYg34', 'nirvana', 'kwesi arthur x kofi mole'))
('o_oenl2Be-w', ('o_oenl2Be-w', '2 sugar', 'wizkid'))
('DPBRGWUgQsA', ('DPBRGWUgQsA', 'soweto', 'victony & tempoe'))
('5BfoawaaARc', ('5BfoawaaARc', 'shatta montez', 'shatta wale'))
('9zEPGHPZCF8', ('9zEPGHPZCF8', 'red flags', 'ruger'))
('1FbzbsWSN88', ('1FbzbsWSN88', 'midnight', 'larruso'))

This is the expected format, right?
Yes.
I am getting the following BlockingError: "No records have been blocked together. Is the data you are trying to match like the data you trained on? If so, try adding more training data."
I have traced the issue to this line of code: clustered_dupes = deduper.cluster(deduper.score(record_pairs(read_cur)), threshold=0.5)
I am using the PostgreSQL example.
Here is the full traceback:

---------------------------------------------------------------------------
BlockingError                             Traceback (most recent call last)
Cell In[3], line 398
    396 print('clustering...')
    397 record_pairs_data = record_pairs(read_cur)
--> 398 clustered_dupes = deduper.cluster(deduper.score(record_pairs_data), threshold=0.5)
    421 print('writing results...')
    422 with write_con:

File ~/dss_home/code-envs/python/pyconda_38/lib/python3.8/site-packages/dedupe/api.py:125, in IntegralMatching.score(self, pairs)
    116 """
    117 Scores pairs of records. Returns pairs of tuples of records id and
    118 associated probabilities that the pair of records are match
    (...)
    122
    123 """
    124 try:
--> 125     matches = core.scoreDuplicates(
    126         pairs, self.data_model.distances, self.classifier, self.num_cores
    127     )
    128 except RuntimeError:
    129     raise RuntimeError(
    130         """
    131         You need to either turn off multiprocessing or protect
    (...)
    134         https://docs.python.org/3/library/multiprocessing.html#the-spawn-and-forkserver-start-methods"""
    135     )

File ~/dss_home/code-envs/python/pyconda_38/lib/python3.8/site-packages/dedupe/core.py:126, in scoreDuplicates(record_pairs, featurizer, classifier, num_cores)
    124 first, record_pairs = peek(record_pairs)
    125 if first is None:
--> 126     raise BlockingError(
    127         "No records have been blocked together. "
    128         "Is the data you are trying to match like "
    129         "the data you trained on? If so, try adding "
    130         "more training data."
    131     )
    133 record_pairs_queue: _Queue = Queue(2)
    134 exception_queue: _Queue = Queue()

BlockingError: No records have been blocked together. Is the data you are trying to match like the data you trained on? If so, try adding more training data.
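For context, the code around that line follows the shape of the PostgreSQL example, roughly like this (a sketch; the songs table, its columns, and the body of record_pairs are my stand-ins, not the exact example code, and the real example uses separate cursors rather than reusing read_cur):

# Blocking -> scoring -> clustering flow, loosely following the dedupe
# PostgreSQL example. read_cur/write_cur are psycopg2 cursors.

# 1. Blocking: the fingerprinter yields (block_key, record_id) pairs,
#    which get bulk-inserted into a blocking_map table.
full_data = ((row['donor_id'], row) for row in read_cur)
b_data = deduper.fingerprinter(full_data)
write_cur.executemany(
    "INSERT INTO blocking_map (block_key, donor_id) VALUES (%s, %s)",
    b_data,
)

# 2. Candidate pairs: a self-join on block_key finds records that share
#    at least one key, then joins back to fetch the full records.
read_cur.execute("""
    SELECT DISTINCT a.donor_id, a.title, a.artist,
                    b.donor_id, b.title, b.artist
    FROM blocking_map bm_a
    JOIN blocking_map bm_b
      ON bm_a.block_key = bm_b.block_key
     AND bm_a.donor_id < bm_b.donor_id
    JOIN songs a ON a.donor_id = bm_a.donor_id
    JOIN songs b ON b.donor_id = bm_b.donor_id
""")

def record_pairs(result_set):
    # Reshape each joined row into the ((id, record), (id, record))
    # form that deduper.score() expects.
    for id_a, title_a, artist_a, id_b, title_b, artist_b in result_set:
        yield (
            (id_a, {'title': title_a, 'artist': artist_a}),
            (id_b, {'title': title_b, 'artist': artist_b}),
        )

# 3. Scoring and clustering. score() raises BlockingError immediately if
#    the pair generator is empty, i.e. if step 2 returned no rows because
#    step 1 produced no shared block keys.
clustered_dupes = deduper.cluster(
    deduper.score(record_pairs(read_cur)),
    threshold=0.5,
)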