fastai / course-nlp

A Code-First Introduction to NLP course
https://www.fast.ai/2019/07/08/fastai-nlp/

Kernel dies in the 3-logreg-nb-imdb.ipynb notebook of "A Code-first Introduction to NLP" #31

Open jcatanza opened 4 years ago

jcatanza commented 4 years ago

Kernel dies at this step in the 3-logreg-nb-imdb.ipynb notebook, between the second Naive Bayes section and the second Binarized Naive Bayes section (running Windows 10 Home edition, 64-bit):

    # per-class term counts: sum the sparse count matrix over the negative / positive reviews
    p0 = np.squeeze(np.array(xx[neg].sum(0)))
    p1 = np.squeeze(np.array(xx[pos].sum(0)))
MicPie commented 4 years ago

I was able to solve it by first generating a dense matrix, then converting it to a np.array, and then taking the sum:

    p0 = np.squeeze(np.array(xx[neg].todense()).sum(0))
    p1 = np.squeeze(np.array(xx[pos].todense()).sum(0))
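For what it's worth, scipy's .toarray() should do the same in one step, since it returns a plain ndarray rather than the np.matrix that .todense() produces (a minimal sketch, assuming xx is a scipy.sparse matrix as in the notebook):

    p0 = xx[neg].toarray().sum(0)  # .sum(0) on an ndarray is already 1-D, so no np.squeeze needed
    p1 = xx[pos].toarray().sum(0)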

There is also an error when saving the data: in the second write block, itongram needs to be replaced with ngramtoi:

    with open('itongram.pickle', 'wb') as handle:
        pickle.dump(itongram, handle, protocol=pickle.HIGHEST_PROTOCOL)

    with open('ngramtoi.pickle', 'wb') as handle:
        pickle.dump(ngramtoi, handle, protocol=pickle.HIGHEST_PROTOCOL)  # was itongram in the notebook
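A quick round-trip check confirms that each file now holds the right mapping (a small sketch; assumes the pickles were just written as above):

    with open('ngramtoi.pickle', 'rb') as handle:
        assert pickle.load(handle) == ngramtoi  # would fail if itongram had been dumped instead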

After this I also encountered another kernel crash, in the following lines in the section "Using my ngrams, binarized:":

    m2 = LogisticRegression(C=0.1, dual=True)
    m2.fit(trn_x_ngram_sgn, y.items)

which can also be solved by passing trn_x_ngram_sgn.todense() instead.
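Spelled out, the workaround might look like this (a sketch; note that dual=True is only supported by the liblinear solver, so on newer scikit-learn versions solver='liblinear' has to be set explicitly):

    from sklearn.linear_model import LogisticRegression
    import numpy as np

    m2 = LogisticRegression(C=0.1, dual=True, solver='liblinear')
    # np.asarray avoids handing scikit-learn an np.matrix, which newer versions reject
    m2.fit(np.asarray(trn_x_ngram_sgn.todense()), y.items)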

However, I am not sure if this is the correct way to solve it, since with bigger arrays we may run into memory problems from generating the dense arrays (too early)?
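If memory does become an issue, one option for the count sums might be to densify only a chunk of rows at a time (a sketch; assumes xx is a scipy.sparse CSR matrix and that neg/pos are boolean row masks, so adapt the np.flatnonzero call if they are already index arrays):

    import numpy as np

    def chunked_col_sum(m, row_mask, chunk=10_000):
        """Column sums over the selected rows, densifying only `chunk` rows at a time."""
        rows = np.flatnonzero(row_mask)
        total = np.zeros(m.shape[1])
        for start in range(0, len(rows), chunk):
            total += m[rows[start:start + chunk]].toarray().sum(0)
        return total

    p0 = chunked_col_sum(xx, neg)
    p1 = chunked_col_sum(xx, pos)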

There is also another error in the second-to-last section, "Log-count ratio", which I am currently trying to fix.

I still have to look into this, but wanted to share my preliminary findings with you.