-
I'm trying to use the `text-davinci-003` model with the following `config.cfg`:
```
[nlp]
lang = "en"
pipeline = ["llm"]
[components]
[components.llm]
factory = "llm"
[components.llm.tas…
-
Hello, I'm at the section "Testing with the BeagleBone Black" on page 156 and my BeagleBone can't seem to connect to the NFS server. I'm sure that the server itself is working since I ran the previous…
-
In a Chinese question-answering system, how do I make sure the answers are complete? A few questions about the use of llama_index:
1: If I do not load all the md files at once (there are many md files entered into th…
-
I use the command:
alpaca_eval --model_outputs '/home/zhoudong/repos/alpaca_eval/alpaca_data/Infini-Megrez-7b-20231114-v2.json' --annotators_config 'text_davinci_003' --reference_outputs '/home/zho…
-
Question:
```
# set maximum input size
max_input_size = 4096
# set number of output tokens
num_outputs = 648
# set maximum chunk overlap
max_chunk_overlap = 20
# set chunk size limit
chunk_size_li…
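
# --- Hedged sketch (not the asker's truncated code): in older llama_index
# releases these settings are typically passed straight to PromptHelper, whose
# legacy signature is (max_input_size, num_output, max_chunk_overlap,
# chunk_size_limit=None). The chunk_size_limit value here is an assumed
# placeholder, since the original line is cut off.
from llama_index import PromptHelper

prompt_helper = PromptHelper(max_input_size, num_outputs, max_chunk_overlap,
                             chunk_size_limit=600)  # 600 is an assumed value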
-
I have a Flutter module added to my native Android project. Everything works OK, but when I go back to my Android app and reopen the FlutterActivity (reusing the Flutter engine), I get this exce…
-
### Build Date
PixelExperience_Plus_davinci-13.0-20230713-2143-OFFICIAL.zip
### Device
davinci
### Version
thirteen_plus
### Describe the Bug
NFC Service is crashing.
NFC not working.
vendor/…
-
```
# max LLM token input size
max_input_size = 1024
# set number of output tokens
num_output = 256
# set maximum chunk overlap
max_chunk_overlap = 20
prompt_helper = PromptHelper(max_input_s…
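
# --- Hedged sketch of how this construction usually continues (assuming an
# early-2023 llama_index, roughly 0.5.x; not the asker's original code). The
# legacy PromptHelper signature is
# (max_input_size, num_output, max_chunk_overlap, chunk_size_limit=None).
from langchain.llms import OpenAI
from llama_index import (GPTSimpleVectorIndex, LLMPredictor, PromptHelper,
                         ServiceContext, SimpleDirectoryReader)

prompt_helper = PromptHelper(max_input_size, num_output, max_chunk_overlap)

# Cap the model's completion length to the same num_output budget.
llm_predictor = LLMPredictor(
    llm=OpenAI(temperature=0, model_name="text-davinci-003", max_tokens=num_output)
)

service_context = ServiceContext.from_defaults(
    llm_predictor=llm_predictor, prompt_helper=prompt_helper
)

documents = SimpleDirectoryReader("data").load_data()  # "data" is an assumed path
index = GPTSimpleVectorIndex.from_documents(documents, service_context=service_context)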
-
### Bug Description
Construction:
```
max_input_size = 4096
chunk_size_limit = 512
chunk_overlap_ratio = 0.1
llm_predictor = LLMPredictor(llm=ChatOpenAI(temperature=0, model_…
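
# --- Hedged sketch of how a PromptHelper is typically built from these values
# (assuming a llama_index release around 0.6, where PromptHelper takes
# chunk_overlap_ratio and names the context-size argument context_window;
# older releases call it max_input_size). num_output=256 is an assumed value,
# not taken from the truncated report.
from llama_index import PromptHelper, ServiceContext

prompt_helper = PromptHelper(
    context_window=max_input_size,
    num_output=256,
    chunk_overlap_ratio=chunk_overlap_ratio,
    chunk_size_limit=chunk_size_limit,
)

# Wire the (truncated) llm_predictor above and the prompt_helper into a
# ServiceContext so indexes and queries pick them up.
service_context = ServiceContext.from_defaults(
    llm_predictor=llm_predictor, prompt_helper=prompt_helper
)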
-
I've tested several times with different prompts, and it seems there's a limit to the response text. "Total embedding token usage" is always less than 38 tokens. I don't know if the two are related.
…