-
### Bug Description
Hello
I am developing a chat engine using Redis and PG Vector; however, contextual information is left out when I use Redis as chat memory:
```python
OPENAI_EMBED…
```
-
Is there a way to extend sparknlp and create a custom embedder similar to `BertEmbeddings`? There are some interesting models on TF Hub that I would like to try.
-
| | Plugin Name | Description |
|------|:---------------------------…
-
Curated Weibo Content
-
BERT Score can be defined as:
> BERT Score assesses the quality of generated text by comparing it to a reference or a ground truth text. It considers the semantic similarity between t…
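Concretely, the semantic-similarity comparison above is usually written as a greedy token matching over contextual embeddings. A sketch of the standard formulation, with $x_i$ the (pre-normalized) embeddings of the reference tokens and $\hat{x}_j$ those of the candidate tokens, so the inner product is cosine similarity:

$$
R_{\mathrm{BERT}} = \frac{1}{|x|}\sum_{x_i \in x}\max_{\hat{x}_j \in \hat{x}} x_i^{\top}\hat{x}_j,\qquad
P_{\mathrm{BERT}} = \frac{1}{|\hat{x}|}\sum_{\hat{x}_j \in \hat{x}}\max_{x_i \in x} x_i^{\top}\hat{x}_j,\qquad
F_{\mathrm{BERT}} = 2\,\frac{P_{\mathrm{BERT}}\cdot R_{\mathrm{BERT}}}{P_{\mathrm{BERT}} + R_{\mathrm{BERT}}}
$$

Recall matches each reference token to its closest candidate token, precision does the reverse, and the F1 combines the two.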
-
# REQUEST
I hope this isn't too overwhelming a request. This would make TxtAI's graph act more like other graph DBs I've used and would open up a lot of possibilities for more sophisticated and a…
-
I really like the concept of the "[skill recipe](https://github.com/microsoft/autogen/blob/main/notebook/agentchat_teaching.ipynb)" in AutoGen, but I think it can be taken much further. One of the key…
-
### Describe the bug
When instantiating a TransformerWordEmbeddings object with use_context=True set as the only non-default option (other than the model) and applying it to the same input string,…
-
## ❓ Questions & Help
How can we get the word embedding vector in GPT-2? I followed the guidance for BERT (`model.embeddings.word_embeddings.weight`), but it shows that 'GPT2LMHeadModel' object has …
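The BERT-style attribute path fails because GPT-2 does not expose an `.embeddings` module; its token embedding table lives at `model.transformer.wte`. A minimal sketch (using a randomly initialized model built from `GPT2Config` so nothing is downloaded; the same attributes apply to a pretrained checkpoint):

```python
# Sketch: accessing the word embedding matrix of GPT-2 in Hugging Face
# Transformers. A random-weight model from the default config is used here
# purely to avoid a checkpoint download.
from transformers import GPT2Config, GPT2LMHeadModel

config = GPT2Config()
model = GPT2LMHeadModel(config)

# GPT2LMHeadModel has no .embeddings attribute (that is a BERT convention);
# the token embedding table is model.transformer.wte
wte = model.transformer.wte.weight

# Equivalently, via the model-agnostic accessor:
wte_generic = model.get_input_embeddings().weight

# Shape is (vocab_size, n_embd)
print(wte.shape)
```

`get_input_embeddings()` is the portable way to do this across architectures, since each model family names its embedding module differently.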
-
- [x] how to manage NLTK on the GitHub runtime machine when tests are run?
- [ ] currently the filebuffer collects the content of a file before sending it for processing. It is able to know if the file…