Closed: AlephNotation closed this issue 2 months ago
If we can't get tiny models to `.browse()`, do we feel OK about them doing `.get()` and `.do()`? It would be nice to incorporate this, even with limitations.
Current status (over phone): `page.get()` seems fine; not so sure about anything else.

Been doing some experiments with `examples/findEmail.ts` and node-llama-cpp, and I needed to set the context and batch size in `generateObjectLocal.ts`:
```ts
const context = new LlamaContext({ model, contextSize: 32168, batchSize: 32168 });
```
Otherwise, it would throw:

```
/Users/maru/git/hdr/nolita/node_modules/node-llama-cpp/llama/llama.cpp/src/llama.cpp:14466: GGML_ASSERT(n_tokens_all <= cparams.n_batch) failed
```
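The assert says llama.cpp was handed more tokens in one evaluation than the configured batch allows (`n_tokens_all <= cparams.n_batch`). Raising `batchSize` is one fix; the invariant itself can be sketched with a hypothetical chunking helper (this is not the node-llama-cpp API, just an illustration of splitting a long prompt into batch-sized pieces):

```typescript
// Hypothetical helper: split a token sequence into chunks no larger than
// batchSize, so no single evaluation violates n_tokens <= n_batch.
function chunkTokens(tokens: number[], batchSize: number): number[][] {
  if (batchSize < 1) throw new Error("batchSize must be >= 1");
  const chunks: number[][] = [];
  for (let i = 0; i < tokens.length; i += batchSize) {
    chunks.push(tokens.slice(i, i + batchSize));
  }
  return chunks;
}
```

Bumping both `contextSize` and `batchSize` to cover the whole page dump sidesteps the need to chunk at all, at the cost of memory.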
The model I picked was capybarahermes-2.5-mistral-7b. Once I set `contextSize` and `batchSize`, it did get the emails off the page. If you try to get it to browse, though, it won't generate a command:
```
issues: [
  {
    code: 'too_small',
    minimum: 1,
    type: 'array',
    inclusive: true,
    exact: false,
    message: 'Array must contain at least 1 element(s)',
    path: [ 'command' ]
  }
],
addIssue: [Function (anonymous)],
addIssues: [Function (anonymous)],
errors: [
  {
    code: 'too_small',
    minimum: 1,
    type: 'array',
    inclusive: true,
    exact: false,
    message: 'Array must contain at least 1 element(s)',
    path: [ 'command' ]
  }
]
}
```
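That dump is Zod's `too_small` issue: the generated object's `command` array came back empty, while the schema requires at least one element (the equivalent of `z.array(...).min(1)`). A minimal, dependency-free sketch of that check, with hypothetical names (not the project's actual schema code):

```typescript
// Shape of a single validation issue, mirroring the fields in the dump above.
type Issue = {
  code: string;
  minimum: number;
  type: string;
  message: string;
  path: string[];
};

// Hypothetical validator: reject a model output whose command array is empty,
// producing the same 'too_small' issue Zod reports.
function validateCommand(output: { command: unknown[] }): Issue[] {
  const issues: Issue[] = [];
  if (output.command.length < 1) {
    issues.push({
      code: "too_small",
      minimum: 1,
      type: "array",
      message: "Array must contain at least 1 element(s)",
      path: ["command"],
    });
  }
  return issues;
}
```

So the model isn't erroring at the inference layer here; it simply never emits a browse command, and the schema check catches the empty array.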
@matildepark good to go?
adds ollama models