-
We want to collect experiments here that compare BERT, ELMo, and Flair embeddings. So if you have any findings on which embedding type works best on which kind of task, we would be more than happy if yo…
-
## Feature description
Hi! I am having an issue with serialization of `Doc` objects that have `user_hooks`.
What I am trying to do is compute the `vectors` property of `Doc` objects by using a…
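For reference, a minimal sketch of this kind of setup, assuming spaCy v3, a vectors-capable model such as `en_core_web_md`, and a hypothetical custom component that installs the hook (the hook function itself is just a placeholder):

```python
import numpy as np
import spacy
from spacy.language import Language
from spacy.tokens import Doc

def custom_doc_vector(doc: Doc) -> np.ndarray:
    # Placeholder: stand-in for whatever actually computes the document vector.
    return np.mean([token.vector for token in doc], axis=0)

@Language.component("vector_hook")
def vector_hook(doc: Doc) -> Doc:
    # Override how doc.vector is computed via a user hook.
    doc.user_hooks["vector"] = custom_doc_vector
    return doc

nlp = spacy.load("en_core_web_md")
nlp.add_pipe("vector_hook")

doc = nlp("Serialization of docs with user_hooks is the tricky part.")
print(doc.vector.shape)

# The hook is a plain Python callable attached at runtime, so it does not
# survive a to_bytes()/from_bytes() round trip on its own.
data = doc.to_bytes()
```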
-
Hello, I built a very simple deep neural network that uses the Universal Sentence Encoder layer from TensorFlow Hub (v5) as the first layer, for multi-label classification (in my case, predicting one or mo…
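A minimal sketch of such a model, assuming the v5 module is the TF2 SavedModel `universal-sentence-encoder-large/5` and using a placeholder label count:

```python
import tensorflow as tf
import tensorflow_hub as hub

NUM_LABELS = 5  # placeholder for the actual number of labels

model = tf.keras.Sequential([
    # USE (TF2 SavedModel) takes a batch of raw strings and returns 512-d embeddings.
    hub.KerasLayer(
        "https://tfhub.dev/google/universal-sentence-encoder-large/5",
        input_shape=[], dtype=tf.string, trainable=False),
    tf.keras.layers.Dense(256, activation="relu"),
    # Multi-label: an independent sigmoid per label instead of a softmax.
    tf.keras.layers.Dense(NUM_LABELS, activation="sigmoid"),
])
model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["binary_accuracy"])
```

With sigmoid outputs and binary cross-entropy, each label is predicted independently, which is what allows one example to carry several labels at once.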
-
**System information**
- colab script
# Install the latest Tensorflow version.
!pip3 install --upgrade tensorflow-gpu
# Install TF-Hub.
!pip3 install tensorflow-hub
!pip3 install seaborn
- Pyth…
-
I have this code for a semantic search engine built using a pre-trained BERT model. I want to convert this model to TFLite for deploying it to Google ML Kit. I want to know how to convert it. …
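A minimal sketch of the usual conversion path, assuming the model has already been exported as a SavedModel (`saved_model_dir` is a placeholder path); BERT graphs usually need the Select TF ops fallback because not all of their ops have TFLite builtins:

```python
import tensorflow as tf

# Convert an exported SavedModel to a .tflite flatbuffer.
converter = tf.lite.TFLiteConverter.from_saved_model("saved_model_dir")
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,   # ops with native TFLite kernels
    tf.lite.OpsSet.SELECT_TF_OPS,     # fall back to TF ops where needed
]
tflite_model = converter.convert()

with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```

Whether the resulting file is accepted by ML Kit also depends on ML Kit's own custom-model constraints, so the conversion above is only the TensorFlow side of the story.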
-
Hey everyone.
I'm trying to understand the differences between using several word embeddings (BERT, XLM, ...) with this framework and using the embeddings in another framework, e.g. Hugging Face.
To be more…
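As a point of reference, a minimal sketch of the framework side of that comparison, assuming "this framework" refers to Flair and a version that exposes `TransformerWordEmbeddings` (which wraps the same Hugging Face models under the hood):

```python
from flair.data import Sentence
from flair.embeddings import TransformerWordEmbeddings

# Flair loads the Hugging Face checkpoint and attaches one vector per token.
embedding = TransformerWordEmbeddings("bert-base-uncased")

sentence = Sentence("Berlin is a city in Germany .")
embedding.embed(sentence)

for token in sentence:
    print(token.text, tuple(token.embedding.shape))
```

Differences against calling the Hugging Face model directly typically come from choices such as which layers are pooled and how subword pieces are aggregated back to word-level vectors, which Flair handles inside the embedding class.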
-
**System information**
- Have I written custom code: No
- OS Platform and Distribution: Windows 10 / Google Colab
- TensorFlow version (use command below): tensorflow==2.0.0
- Python version: Py…
-
Hello,
Consider the below code:
```
import tensorflow as tf

g = tf.Graph()
with g.as_default():
    # We will be feeding 1D tensors of text into the graph.
    text_input = tf.placeholder(dtype=tf.string, shape=[None])
…
-
Hi, Team.
I just noticed that [`universal-sentence-encoder-multilingual-qa`](https://aihub.cloud.google.com/p/products%2F558c8a34-563c-481c-baca-887e082794be) and [`universal-sentence-encoder-qa`…
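For context, a minimal sketch of how these QA modules are typically queried through their dual-encoder signatures, assuming the multilingual module at version 3 on TF Hub (the handle and example strings are illustrative):

```python
import tensorflow as tf
import tensorflow_hub as hub
import tensorflow_text  # noqa: F401  (registers the SentencePiece ops the module needs)

module = hub.load(
    "https://tfhub.dev/google/universal-sentence-encoder-multilingual-qa/3")

# Questions and candidate answers go through separate encoder signatures.
question_embeddings = module.signatures["question_encoder"](
    tf.constant(["How old is the universe?"]))["outputs"]
response_embeddings = module.signatures["response_encoder"](
    input=tf.constant(["The universe is about 13.8 billion years old."]),
    context=tf.constant(["Estimates come from cosmic microwave background data."]))["outputs"]

# Relevance scores are dot products between question and response embeddings.
scores = tf.matmul(question_embeddings, response_embeddings, transpose_b=True)
print(scores.numpy())
```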