a1da4 / paper-survey

Summary of machine learning papers

Reading: Improving Word Representations via Global Context and Multiple Word Prototypes #202

Open a1da4 opened 3 years ago

a1da4 commented 3 years ago

0. Paper

@inproceedings{huang-etal-2012-improving,
  title = "Improving Word Representations via Global Context and Multiple Word Prototypes",
  author = "Huang, Eric and Socher, Richard and Manning, Christopher and Ng, Andrew",
  booktitle = "Proceedings of the 50th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
  month = jul,
  year = "2012",
  address = "Jeju Island, Korea",
  publisher = "Association for Computational Linguistics",
  url = "https://aclanthology.org/P12-1092",
  pages = "873--882",
}

1. What is it?

They used not only information from context words (local context) but also information from the entire document (global context) to learn word vectors that account for polysemous words.
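A minimal sketch of the scoring idea: the score of a word in context is the sum of a local score, computed from the concatenated context-window vectors, and a global score, computed from the word vector concatenated with an average of the document's word vectors. Layer sizes, the uniform document average (the paper weights words), and all variable names here are my illustrative assumptions, not the paper's exact architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
d, window, hidden = 50, 4, 100  # embedding size, context size, hidden units (assumed)

def score(word_vec, context_vecs, doc_vecs, params):
    """Score a word in context: local (window) score + global (document) score."""
    W1, b1, u1, W2, b2, u2 = params
    # Local score: concatenated window (context words + target) through a tanh layer
    local_in = np.concatenate(list(context_vecs) + [word_vec])
    s_local = u1 @ np.tanh(W1 @ local_in + b1)
    # Global score: target concatenated with a document average (here: uniform;
    # the paper uses an idf-style weighted average)
    doc_avg = doc_vecs.mean(axis=0)
    s_global = u2 @ np.tanh(W2 @ np.concatenate([word_vec, doc_avg]) + b2)
    return s_local + s_global

params = (
    rng.normal(size=(hidden, (window + 1) * d)), np.zeros(hidden), rng.normal(size=hidden),
    rng.normal(size=(hidden, 2 * d)), np.zeros(hidden), rng.normal(size=hidden),
)
word = rng.normal(size=d)
context = rng.normal(size=(window, d))
doc = rng.normal(size=(200, d))
print(score(word, context, doc, params))
```

The sketch only shows forward scoring; during training the model learns to score the observed word above corrupted alternatives.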

2. What is amazing compared to previous works?

3. Where is the key to technologies and techniques?


Following Reisinger and Mooney (2010), their model learns multi-prototype word vectors: one vector per sense cluster of each word.
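The multi-prototype step can be sketched as follows, under simplifying assumptions: each occurrence of a word is represented by the average of its context-word vectors, the occurrences are clustered, and each cluster becomes one prototype. This sketch uses plain k-means and unweighted averages, whereas the paper uses spherical k-means on idf-weighted averages; the function and variable names are hypothetical.

```python
import numpy as np

def multi_prototype(occurrence_contexts, k=3, iters=20, seed=0):
    """Cluster per-occurrence context vectors into k sense prototypes.

    occurrence_contexts: list of (n_context_words, d) arrays, one per occurrence.
    Returns a (k, d) prototype matrix and a sense label for each occurrence.
    """
    # One vector per occurrence: the average of its context-word vectors
    X = np.stack([c.mean(axis=0) for c in occurrence_contexts])
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):  # plain k-means (the paper: spherical k-means)
        dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels

# Toy usage: 30 occurrences, 5 context words each, 10-dimensional vectors
rng = np.random.default_rng(1)
contexts = [rng.normal(size=(5, 10)) for _ in range(30)]
prototypes, senses = multi_prototype(contexts, k=3)
print(prototypes.shape, senses.shape)
```

Each occurrence is then relabeled with its cluster id, so a separate vector can be learned per sense.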

4. How did they evaluate it?

4.1 Nearest neighbors

Table 2 shows that each prototype captures a distinct sense of the target word.


4.2 Similarity tasks

  1. WordSim-353: From Table 3, their model outperformed the previous single-prototype method (C&W).

  2. Word Similarity in Context (a new dataset introduced in this paper): From Table 5, their model achieved state-of-the-art performance.

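Word-similarity benchmarks like those above are scored by Spearman correlation between the model's cosine similarities and human ratings. A minimal self-contained sketch (rank-based Pearson without tie correction; in practice `scipy.stats.spearmanr` would be used):

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

def spearman(x, y):
    """Spearman correlation: Pearson correlation of ranks (no tie correction)."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    return np.corrcoef(rx, ry)[0, 1]

# Toy check: model scores that preserve the human ordering give rho = 1.0
human = np.array([9.5, 7.0, 3.2, 1.1])   # hypothetical human ratings
model = np.array([0.9, 0.6, 0.3, 0.1])   # hypothetical cosine similarities
print(spearman(human, model))  # → 1.0
```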

5. Is there a discussion?

6. Which paper should I read next?

a1da4 commented 3 years ago

#203

Solves the clustering-sensitivity problem with a probabilistic model based on Skip-gram.

a1da4 commented 3 years ago

#204

A new method using word sense disambiguation.

a1da4 commented 3 years ago

#207

Multi-Sense Skip-Gram