ymcui / Chinese-BERT-wwm

Pre-Training with Whole Word Masking for Chinese BERT(中文BERT-wwm系列模型)
https://ieeexplore.ieee.org/document/9599397
Apache License 2.0

The way to get Chinese Word Embedding. #208

Closed qhd1996 closed 2 years ago

qhd1996 commented 2 years ago

For some reason I need to get Chinese word embeddings based on a PLM. Could you offer some ways to obtain them from the existing pre-trained models? Thanks :)

ymcui commented 2 years ago

Please see https://github.com/google-research/bert#using-bert-to-extract-fixed-feature-vectors-like-elmo. With that script, you can extract word representations from any layer of BERT.
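As an alternative to that TensorFlow script, here is a minimal PyTorch sketch using the Hugging Face transformers library (not something this repo documents, just one common way to do it); the model name hfl/chinese-bert-wwm-ext and the character-averaging step are illustrative assumptions:

```python
# Minimal sketch: extract per-token hidden states from a Chinese BERT-wwm
# checkpoint with Hugging Face transformers (assumed installed).
import torch
from transformers import BertTokenizer, BertModel

MODEL_NAME = "hfl/chinese-bert-wwm-ext"  # assumed checkpoint name on the HF hub

tokenizer = BertTokenizer.from_pretrained(MODEL_NAME)
model = BertModel.from_pretrained(MODEL_NAME, output_hidden_states=True)
model.eval()

text = "使用语言模型来预测下一个词的概率。"
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# outputs.hidden_states is a tuple of (num_layers + 1) tensors, each of
# shape (batch_size, seq_len, hidden_size): index 0 is the embedding
# layer, index -1 is the last transformer layer.
last_layer = outputs.hidden_states[-1]  # (1, seq_len, 768)
token_vectors = last_layer[0]           # per-token (per-character) vectors

# Chinese BERT tokenizes at the character level, so to get a word vector
# one simple heuristic is to average the vectors of the characters that
# make up the word (an assumption here, not the only option).
print(token_vectors.shape)
```

Any layer index into `outputs.hidden_states` can be used instead of the last one, mirroring what the linked extract-features script exposes.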

qhd1996 commented 2 years ago

> Please see https://github.com/google-research/bert#using-bert-to-extract-fixed-feature-vectors-like-elmo. With that script, you can extract word representations from any layer of BERT.

Thank you very much!

stale[bot] commented 2 years ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

stale[bot] commented 2 years ago

Closing the issue, since no updates have been observed. Feel free to re-open if you need any further assistance.