Closed 3 years ago
Hello, thanks for opening an issue! We try to keep the github issues for bugs/feature requests. Could you ask your question on the forum instead?
Thanks!
cc @Rocketknight1
I asked my question on the forum as you suggested, but no one replied. I hope you can answer it here. Thank you! https://discuss.huggingface.co/t/how-to-extract-the-encoded-data-of-feed-forward-layer-in-tfbertmodel/9320
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.
Please note that issues that do not follow the contributing guidelines are likely to be ignored.
env: TF 2.2
model: TFBertModel.from_pretrained('hfl/chinese-bert-wwm-ext')
I'm working on an information extraction project. First, I predict the "subject" with a BERT-CRF model, then use tf.gather() to pick the positions corresponding to the "subject" out of the shared BERT encoding layer, and then predict the "object". But I can't extract BERT's feed-forward layer output.
I want to use the output of the feed-forward layer of the BERT model as the shared encoding layer, but I can't find a corresponding method. I want to obtain an output similar to the following:
Tensor("Transformer-11-FeedForward-Add/add:0", shape=(None, None, 768), dtype=float32)
I tried "model.trainable_weights[-5]", but the extracted output is obviously not what I need (trainable_weights holds weight matrices, not layer activations), and I don't want to use "model(ids, masks, tokens)[0]" directly, because BERT's last layer output has already been passed through LayerNormalization.