JBGruber / rollama

https://jbgruber.github.io/rollama/
GNU General Public License v3.0

! Unexpected content type "text/plain". #3

Closed: Edouard-Legoupil closed this issue 10 months ago

Edouard-Legoupil commented 10 months ago

Hello,

First, congratulations on the documentation! Really extensive and nicely made. Always an engaging sign!

Testing the package, installed with Ollama running on Linux...

ping_ollama()
▶ Ollama is running at http://localhost:11434!

When I try any query:

query("why is the sky blue?")

I get this error:

httr2::resp_body_json(httr2::req_perform(httr2::req_error(httr2::req_body_json(httr2::req_url_path_append(httr2::request(server), …:

! Unexpected content type "text/plain".

More below

.Last.error <callr_error/rlib_error_3_0/rlib_error/error> Error: ! in callr subprocess.


Backtrace:

  1. rollama::query("why is the sky blue?")
  2. rollama:::build_req(model = model, msg = msg, server = server, images = images, …
  3. rp$get_result()
  4. callr:::rp_get_result(self, private)
  5. callr:::get_result(out, private$options)
  6. callr:::throw(callr_remote_error(remerr, output), parent = fix_msg(remerr[[3]]))

    Subprocess backtrace: …

Any hints?

Thanks! Edouard

JBGruber commented 10 months ago

Hi,

Thanks for filing the issue. It does not really make sense to me at this point, but can you please try the following and post the output:

reprex::reprex({
  library(rollama)
  options("rollama_verbose" = FALSE)
  query("why is the sky blue?")
  system2("curl", "-V", stdout = TRUE)
}, session_info = TRUE)

FYI: options("rollama_verbose" = FALSE) gives you better error messages because callr is not involved anymore (I realise this is confusing, but do not know at the moment how to do it better).
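Another way to take R out of the equation entirely: rollama's query() appears to go through Ollama's /api/chat endpoint, so calling it directly from a terminal should reproduce the problem if it is server-side. A sketch (the model name "llama2" and the default server address are assumptions):

```shell
# Build the same kind of chat request rollama would send.
payload='{"model": "llama2", "stream": false,
          "messages": [{"role": "user", "content": "why is the sky blue?"}]}'
# Sanity-check that the payload is well-formed JSON before sending it.
echo "$payload" | python3 -m json.tool > /dev/null && echo "payload ok"
# Needs a running Ollama server at the default address; guarded so a dead
# server does not abort the script.
curl -s http://localhost:11434/api/chat -d "$payload" || true
```

If this returns JSON in the terminal but rollama still errors, the problem is on the R side; if it fails the same way, it is an Ollama/server issue.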

JBGruber commented 10 months ago

Very odd, I just encountered the same issue. But it has nothing to do with rollama. You can test that by using:

curl http://localhost:11434/api/generate -d '{
  "model": "llama2",
  "prompt": "What color is the sky at different times of the day? Respond using JSON",
  "format": "json",
  "stream": false
}'

In my case this suddenly returns curl: (52) Empty reply from server.

And if you get an empty reply like me, also try:

docker exec -it ollama ollama run llama2
# or if you run ollama without docker
ollama run llama2

Both fail for me at the moment. Interestingly, other calls to the API, for example to list available models, succeed.
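For completeness, the model-listing check can be done from a terminal too (default server address assumed); this endpoint kept working for me even while generation returned an empty reply:

```shell
# List installed models via the read-only /api/tags endpoint; guarded so a
# dead server does not abort the script.
curl -s http://localhost:11434/api/tags || true
```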

Update:

It seems another process was hogging my vRAM. By turning off the GPU, you can force the model to use your system memory (RAM). So this works for me:

query("why is the sky blue?", model_params = list(num_gpu = 0))

This could be solved much better upstream! There are no logs or errors indicating that this is an OOM error.
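For anyone hitting this outside R: the same workaround should be expressible through the raw API, assuming Ollama's "options" request field accepts num_gpu (sketch, model name assumed):

```shell
# Same num_gpu = 0 workaround, but via the raw /api/generate endpoint.
payload='{"model": "llama2", "prompt": "why is the sky blue?",
          "stream": false, "options": {"num_gpu": 0}}'
# Sanity-check the payload before sending.
echo "$payload" | python3 -m json.tool > /dev/null && echo "payload ok"
# Needs a running Ollama server; guarded against a dead server.
curl -s http://localhost:11434/api/generate -d "$payload" || true
```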

Edouard-Legoupil commented 10 months ago

On #1:

library(rollama)
options("rollama_verbose" = FALSE)
query("why is the sky blue?")

> Error in httr2::resp_body_json():
> ! Unexpected content type "text/plain".
> • Expecting type "application/json" or suffix "json".
> Backtrace:
>     ▆
>  1. └─rollama::query("why is the sky blue?")
>  2. └─rollama:::build_req(...)
>  3. └─rollama:::make_req(req_data, server, "/api/chat")
>  4. └─httr2::resp_body_json(...)
>  5. └─httr2::resp_check_content_type(...)
>  6. └─httr2:::check_content_type(...)
>  7. └─cli::cli_abort(...)
>  8. └─rlang::abort(...)

system2("curl", "-V", stdout = TRUE)

> [1] "curl 7.68.0 (x86_64-pc-linux-gnu) libcurl/7.68.0 OpenSSL/1.1.1f zlib/1.2.11 brotli/1.0.7 libidn2/2.2.0 libpsl/0.21.0 (+libidn2/2.2.0) libssh/0.9.3/openssl/zlib nghttp2/1.40.0 librtmp/2.3"
> [2] "Release-Date: 2020-01-08"
> [3] "Protocols: dict file ftp ftps gopher http https imap imaps ldap ldaps pop3 pop3s rtmp rtsp scp sftp smb smbs smtp smtps telnet tftp "
> [4] "Features: AsynchDNS brotli GSS-API HTTP2 HTTPS-proxy IDN IPv6 Kerberos Largefile libz NTLM NTLM_WB PSL SPNEGO SSL TLS-SRP UnixSockets"

Edouard-Legoupil commented 10 months ago

Also tried the following.

FYI - I am using this - https://github.com/docker/genai-stack to launch my ollama

options("rollama_verbose" = FALSE)
ping_ollama()
▶ Ollama is running at http://localhost:11434!
query("why is the sky blue?", model_params = list(num_gpu = 0))
Error in httr2::resp_body_json():
! Unexpected content type "text/plain".
• Expecting type "application/json" or suffix "json".
Run rlang::last_trace() to see where the error occurred.

JBGruber commented 10 months ago

I read somewhere that Ollama does not work with old versions of curl (I can't find the source). The one I'm using is almost 4 years newer than yours (curl 8.5.0 (x86_64-pc-linux-gnu), Release-Date: 2023-12-06), so I would suggest updating it. Then I would be interested to see what this does (in a terminal, not R):

curl http://localhost:11434/api/generate -d '{
  "model": "llama2",
  "prompt": "What color is the sky at different times of the day? Respond using JSON",
  "format": "json",
  "stream": false
}'

If this also fails after updating curl, this is an Ollama problem that I can't really help with.
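If it does come to that, the Ollama server logs would be the place to look. A sketch (the container name "ollama" matches the docker command above; the systemd unit name is an assumption for a non-Docker Linux install):

```shell
# If Ollama runs in Docker (as in the genai-stack setup):
docker logs ollama 2>&1 | tail -n 50 || true
# If Ollama runs as a systemd service on Linux:
journalctl -u ollama --no-pager 2>/dev/null | tail -n 50 || true
```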

Edouard-Legoupil commented 10 months ago

Yes, I had to do a big LTS update! Thanks, now it works!