Closed oldbighorn closed 2 years ago
Edit: fixed with a for-loop:
```python
import numpy as np

batchsize = 500
relevances = np.zeros(sentences.shape)
for i in range(0, sentences.shape[0], batchsize):
    batch = sentences[i:i+batchsize]
    relevances[i:i+batchsize, :, :], _ = lrp_model.lrp(batch)
```
I'm working with the example file and implemented it to explain clickstream sequences. Now I want to explain about 1 million sequences batch-wise. The batch-wise explanation works for at most roughly 10,000 sequences; with more, it crashes due to memory issues. I'm using Google Colab with 25 GB of memory. Is it possible to explain more sequences without running out of memory?
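One way to get past the RAM limit is to stop accumulating all relevance scores in memory and instead write each batch to a disk-backed array via `np.memmap`, so only one batch of relevances is resident at a time. Below is a minimal sketch of that idea; `lrp_stub` is a hypothetical stand-in for `lrp_model.lrp` (assumed to return a `(relevances, extra)` tuple), and the array sizes are toy values, not the real ~1M-sequence workload.

```python
import os
import tempfile
import numpy as np

def lrp_stub(batch):
    # Hypothetical stand-in for lrp_model.lrp: returns (relevances, extra).
    return batch * 0.5, None

# Toy dimensions for illustration; the real case has ~1M sequences.
n_seq, seq_len, n_feat = 10_000, 20, 8
sentences = np.random.rand(n_seq, seq_len, n_feat).astype(np.float32)

batchsize = 500
path = os.path.join(tempfile.mkdtemp(), "relevances.dat")

# Disk-backed output array: batches are flushed to disk instead of
# accumulating in RAM, so peak memory stays near one batch.
relevances = np.memmap(path, dtype=np.float32, mode="w+",
                       shape=(n_seq, seq_len, n_feat))
for i in range(0, n_seq, batchsize):
    batch = sentences[i:i+batchsize]
    relevances[i:i+batchsize], _ = lrp_stub(batch)
relevances.flush()
```

After the loop, `relevances` can be reopened later with `np.memmap(path, mode="r", ...)` and sliced like a normal array without loading the whole file.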