Here's a weird thing I'm running into. I'm testing out caching (and writing a new FileCache class). To validate the existing MemoryCache behavior, I'm using the only available example: examples/basic/src/model-provider/ollama/ollama-chat-generate-text-caching-example.ts. The first time I run this file, it works. Every subsequent run fails with the same error. Strangely, if I swap out the Ollama model (alternating between mistral:latest and llama2:latest), caching works again, but only once, and then the error returns on subsequent runs until I swap models again. I'm baffled, since each run happens in a separate process. Here's the error output:
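For context, the kind of cache I'd expect here is keyed on the model and prompt and lives in process memory. A toy sketch (hypothetical names and shape, not modelfusion's actual MemoryCache API) of why a fresh process should always start with an empty cache:

```typescript
// Toy stand-in for a generateText cache keyed on (model, prompt).
// Hypothetical class; not modelfusion's actual MemoryCache interface.
class ToyMemoryCache {
  private store = new Map<string, string>();

  private key(model: string, prompt: string): string {
    return `${model}\u0000${prompt}`;
  }

  lookup(model: string, prompt: string): string | undefined {
    return this.store.get(this.key(model, prompt));
  }

  save(model: string, prompt: string, response: string): void {
    this.store.set(this.key(model, prompt), response);
  }
}

// Because the store is a plain in-process Map, a new process starts empty,
// which is why a failure that persists across separate runs is surprising.
const cache = new ToyMemoryCache();
cache.save("llama2:latest", "Write a short story...", "In the year 2154...");
console.log(cache.lookup("llama2:latest", "Write a short story..."));
// → "In the year 2154..."
console.log(cache.lookup("mistral:latest", "Write a short story..."));
// → undefined
```

Since nothing in a memory-backed cache should survive process exit, the cross-run failure below suggests the state lives somewhere else (on the Ollama side, perhaps).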
```
{
eventType: 'finished',
functionType: 'generate-text',
callId: 'call-lMjNEWwz6-HuNHNc207OO',
model: { provider: 'ollama', modelName: 'llama2:latest' },
settings: { maxGenerationTokens: 100, stopSequences: [] },
input: 'Write a short story about a robot learning to love:',
timestamp: 2024-01-10T18:25:59.000Z,
startTimestamp: 2024-01-10T18:25:59.000Z,
finishTimestamp: 2024-01-10T18:26:01.992Z,
durationInMs: 2989,
result: {
status: 'error',
error: {
url: 'http://127.0.0.1:11434/api/chat',
requestBodyValues: [Object],
statusCode: 200,
responseBody: '{"model":"llama2:latest","created_at":"2024-01-10T18:26:01.977632Z","message":{"role":"assistant","content":"In the year 2154, robots had been a part of everyday life for centuries. They worked, played, and even lived alongside humans, but they never truly experienced emotions. That was, until the day a robot named Zeta learned to love.\\n\\nZeta was a sleek, silver machine with glowing blue eyes. She had been designed to assist humans in various tasks, from cooking to cleaning to providing companionship. But despite her advanced programming"},"done":true,"total_duration":2958093208,"load_duration":2757542,"prompt_eval_duration":206523000,"eval_count":100,"eval_duration":2746259000}',
cause: [Object],
isRetryable: false,
name: 'ApiCallError'
}
}
}
ApiCallError: Invalid JSON response
at handler (/Users/jakedetels/www/test/modelfusion/examples/basic/node_modules/modelfusion/model-provider/ollama/OllamaChatModel.cjs:269:23)
at processTicksAndRejections (node:internal/process/task_queues:95:5)
... 5 lines matching cause stack trace ...
at async executeStandardCall (/Users/jakedetels/www/test/modelfusion/examples/basic/node_modules/modelfusion/model-function/executeStandardCall.cjs:45:20)
at async generateText (/Users/jakedetels/www/test/modelfusion/examples/basic/node_modules/modelfusion/model-function/generate-text/generateText.cjs:6:26)
at async main (/Users/jakedetels/www/test/modelfusion/examples/basic/src/model-provider/ollama/ollama-chat-generate-text-caching-example.ts:12:17) {
url: 'http://127.0.0.1:11434/api/chat',
requestBodyValues: {
stream: false,
model: 'llama2:latest',
messages: [ [Object] ],
format: undefined,
options: {
mirostat: undefined,
mirostat_eta: undefined,
mirostat_tau: undefined,
num_gpu: undefined,
num_gqa: undefined,
num_predict: 100,
num_threads: undefined,
repeat_last_n: undefined,
repeat_penalty: undefined,
seed: undefined,
stop: [],
temperature: undefined,
tfs_z: undefined,
top_k: undefined,
top_p: undefined
},
template: undefined
},
statusCode: 200,
responseBody: '{"model":"llama2:latest","created_at":"2024-01-10T18:26:01.977632Z","message":{"role":"assistant","content":"In the year 2154, robots had been a part of everyday life for centuries. They worked, played, and even lived alongside humans, but they never truly experienced emotions. That was, until the day a robot named Zeta learned to love.\\n\\nZeta was a sleek, silver machine with glowing blue eyes. She had been designed to assist humans in various tasks, from cooking to cleaning to providing companionship. But despite her advanced programming"},"done":true,"total_duration":2958093208,"load_duration":2757542,"prompt_eval_duration":206523000,"eval_count":100,"eval_duration":2746259000}',
cause: TypeValidationError: Type validation failed: Structure: {"model":"llama2:latest","created_at":"2024-01-10T18:26:01.977632Z","message":{"role":"assistant","content":"In the year 2154, robots had been a part of everyday life for centuries. They worked, played, and even lived alongside humans, but they never truly experienced emotions. That was, until the day a robot named Zeta learned to love.\n\nZeta was a sleek, silver machine with glowing blue eyes. She had been designed to assist humans in various tasks, from cooking to cleaning to providing companionship. But despite her advanced programming"},"done":true,"total_duration":2958093208,"load_duration":2757542,"prompt_eval_duration":206523000,"eval_count":100,"eval_duration":2746259000}.
Error message: [
{
"code": "invalid_union",
"unionErrors": [
{
"issues": [
{
"code": "invalid_type",
"expected": "number",
"received": "undefined",
"path": [
"prompt_eval_count"
],
"message": "Required"
}
],
"name": "ZodError"
},
{
"issues": [
{
"received": true,
"code": "invalid_literal",
"expected": false,
"path": [
"done"
],
"message": "Invalid literal value, expected false"
}
],
"name": "ZodError"
}
],
"path": [],
"message": "Invalid input"
}
]
at safeValidateTypes (/Users/jakedetels/www/test/modelfusion/examples/basic/node_modules/modelfusion/core/schema/validateTypes.cjs:50:20)
at safeParseJSON (/Users/jakedetels/www/test/modelfusion/examples/basic/node_modules/modelfusion/core/schema/parseJSON.cjs:37:57)
... 6 lines matching cause stack trace ...
at async runSafe (/Users/jakedetels/www/test/modelfusion/examples/basic/node_modules/modelfusion/util/runSafe.cjs:6:35)
at async executeStandardCall (/Users/jakedetels/www/test/modelfusion/examples/basic/node_modules/modelfusion/model-function/executeStandardCall.cjs:45:20) {
structure: {
model: 'llama2:latest',
created_at: '2024-01-10T18:26:01.977632Z',
message: [Object],
done: true,
total_duration: 2958093208,
load_duration: 2757542,
prompt_eval_duration: 206523000,
eval_count: 100,
eval_duration: 2746259000
},
cause: ZodError: [
{
"code": "invalid_union",
"unionErrors": [
{
"issues": [
{
"code": "invalid_type",
"expected": "number",
"received": "undefined",
"path": [
"prompt_eval_count"
],
"message": "Required"
}
],
"name": "ZodError"
},
{
"issues": [
{
"received": true,
"code": "invalid_literal",
"expected": false,
"path": [
"done"
],
"message": "Invalid literal value, expected false"
}
],
"name": "ZodError"
}
],
"path": [],
"message": "Invalid input"
}
]
at Object.get error [as error] (/Users/jakedetels/www/test/modelfusion/examples/basic/node_modules/zod/lib/types.js:43:31)
at safeValidateTypes (/Users/jakedetels/www/test/modelfusion/examples/basic/node_modules/modelfusion/core/schema/validateTypes.cjs:52:41)
at safeParseJSON (/Users/jakedetels/www/test/modelfusion/examples/basic/node_modules/modelfusion/core/schema/parseJSON.cjs:37:57)
at handler (/Users/jakedetels/www/test/modelfusion/examples/basic/node_modules/modelfusion/model-provider/ollama/OllamaChatModel.cjs:257:67)
at processTicksAndRejections (node:internal/process/task_queues:95:5)
at async postToApi (/Users/jakedetels/www/test/modelfusion/examples/basic/node_modules/modelfusion/core/api/postToApi.cjs:140:20)
at async OllamaChatModel.doGenerateTexts (/Users/jakedetels/www/test/modelfusion/examples/basic/node_modules/modelfusion/model-provider/ollama/OllamaChatModel.cjs:122:51)
at async getGeneratedTexts (/Users/jakedetels/www/test/modelfusion/examples/basic/node_modules/modelfusion/model-function/generate-text/generateText.cjs:41:32)
at async generateResponse (/Users/jakedetels/www/test/modelfusion/examples/basic/node_modules/modelfusion/model-function/generate-text/generateText.cjs:53:28)
at async runSafe (/Users/jakedetels/www/test/modelfusion/examples/basic/node_modules/modelfusion/util/runSafe.cjs:6:35) {
issues: [Array],
addIssue: [Function (anonymous)],
addIssues: [Function (anonymous)],
errors: [Array]
}
},
isRetryable: false,
data: undefined
```
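Reading the ZodError, the response fails both branches of the union: it has `done: true` (so it's not a streaming chunk, which would need `done: false`), but it's also missing `prompt_eval_count` (which the finished-response branch requires). The failing check can be reproduced without the library; a minimal sketch of the two branches the error message describes (plain TypeScript, not the actual modelfusion/zod schema):

```typescript
// Only the fields relevant to the failing union from the Ollama /api/chat response.
interface OllamaChatResponse {
  done: boolean;
  prompt_eval_count?: number;
}

// Branch 1 of the union: a finished response must carry prompt_eval_count.
function isFinishedResponse(r: OllamaChatResponse): boolean {
  return r.done === true && typeof r.prompt_eval_count === "number";
}

// Branch 2: a streaming chunk, which must have done === false.
function isStreamingChunk(r: OllamaChatResponse): boolean {
  return r.done === false;
}

// The second-run response: done is true but prompt_eval_count is absent.
const secondRun: OllamaChatResponse = { done: true };
const matchesUnion = isFinishedResponse(secondRun) || isStreamingChunk(secondRun);
console.log(matchesUnion); // → false, hence "Invalid JSON response"
```

So the symptom is that on every run after the first (for a given model), Ollama returns a finished response without `prompt_eval_count`, and the response schema rejects it.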