Madhu009 / Deep-math-machine-learning.ai

A blog about machine learning and deep learning algorithms and the math behind them, with machine learning algorithms written from scratch.
https://medium.com/deep-math-machine-learning-ai

Top 10 Similar terms #3

Open Mustyy opened 5 years ago

Mustyy commented 5 years ago

First of all, thank you for the tutorial. I wanted to ask if it is possible to use the final embeddings to test a word and return the top 10 similar terms.

e.g. top 10 similar words given an input word:

```python
word = "external"
word_vec = final_embeddings[dictionary[word]]
# negate so that argsort (ascending) puts the highest dot products first
sim = np.dot(word_vec, -final_embeddings.T).argsort()[0:10]
for idx in range(10):
    print(reverse_dictionary[sim[idx]])
```

Madhu009 commented 5 years ago

An embedding vector represents the word, so similar words have similar embeddings (we can use a distance metric to measure how close they are).

This thread covers different ways to handle it, please check it out: https://stackoverflow.com/questions/40074412/word2vec-get-nearest-words
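
For reference, here is a minimal sketch (not from the notebook itself) of such a lookup. It assumes the variables `final_embeddings`, `dictionary`, and `reverse_dictionary` from the tutorial code, and that `final_embeddings` is L2-normalized so a dot product equals cosine similarity; if it is not normalized, divide by the row norms first.

```python
import numpy as np

def most_similar(word, top_k=10):
    # look up the embedding of the query word
    word_vec = final_embeddings[dictionary[word]]
    # cosine similarity with every word (dot product, assuming normalized rows)
    sims = np.dot(final_embeddings, word_vec)
    # sort by descending similarity; index 0 is the query word itself, so skip it
    nearest = (-sims).argsort()[1:top_k + 1]
    return [reverse_dictionary[idx] for idx in nearest]

print(most_similar("external"))
```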

Mustyy commented 5 years ago

Hey @Madhu009, thanks for the reply. I understand embedding vectors; I was just wondering if there was a quick workaround for the code so that I can plug in a word and return the top 10 similar terms. I tried using TensorBoard but it wasn't successful either. Currently researching other methods too.