ManuelSpinola opened 3 weeks ago
When running local models with Ollama, it can sometimes take a long time for the model to be ready to be used. What model were you trying to use? Maybe try a smaller one like phi-3.
Thank you.
I tried, but still no response at all.
What happens if you use openai? Do you get a response?
Related to ollama: have you pulled any model? Do you get responses when using ollama directly in the terminal?
Thank you.
I write in the terminal:
ollama pull phi
the Background Jobs screen looks like:
Loading required package: shiny
Listening on http://127.0.0.1:7905
ℹ Fetching models for openai service...
✔ Got models!
ℹ Fetching models for ollama service...
✔ Got models!
and when I run gptstudio:::gptstudio_chat(), nothing happens.
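One way to narrow this down is to bypass gptstudio entirely and query the Ollama server over its HTTP API from the same R session. A minimal sketch, assuming Ollama is running on its default port 11434 and that the `phi` model has already been pulled ({httr2} is already installed as a gptstudio dependency):

```r
# Ask the local Ollama server directly, without gptstudio in the middle.
# Assumes `ollama serve` is running on the default port 11434
# and that the "phi" model has already been pulled.
library(httr2)

resp <- request("http://localhost:11434/api/generate") |>
  req_body_json(list(model = "phi", prompt = "Say hello", stream = FALSE)) |>
  req_perform()

resp_body_json(resp)$response
```

If this call also hangs or errors, the problem is with the Ollama server itself rather than with gptstudio.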
Hi, it is working with Ollama now. I followed the example "Local Models with Ollama" on the gptstudio site: https://michelnivard.github.io/gptstudio/articles/ollama.html.
What I did differently this time was to save the configuration as default, as shown in the example.
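For anyone landing here later: "save as default" in the chat app's settings panel is what points gptstudio at Ollama persistently. A rough console-side sketch of the same idea is below; the `gptstudio.*` option names are an assumption based on the package's configuration conventions, so verify them against your installed version (for example with `gptstudio::gptstudio_sitrep()`):

```r
# Hypothetical console equivalent of "save as default" in the settings panel.
# The option names below are assumptions -- check your installed version.
options(
  gptstudio.service = "ollama",
  gptstudio.model   = "phi"
)
gptstudio:::gptstudio_chat()
```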
I'm sorry we weren't able to provide better help for this. This was hard to reproduce.
I'm glad it is working for you now. Feel free to close the issue if you think you've reached a solution; otherwise, please provide further details.
Confirm setup

- I have installed the latest version of {gptstudio} (`pak::pak("MichelNivard/gptstudio")`) and tested if the problem remains.
- I have the {reprex} and {sessioninfo} packages to be able to run this issue's code snippet: `pak::pak(c("reprex", "sessioninfo"))`.

What happened?
Sorry for this basic question, but I cannot make gptstudio work with any API. For example, with Ollama:
ℹ Fetching models for ollama service...
✔ Got models!
I write a prompt:
make a plot with ggplot2
Nothing happens, no response at all.
Relevant log output
Session info
Created on 2024-06-12 with reprex v2.1.0
``` r
sessioninfo::session_info()
#> ─ Session info ───────────────────────────────────────────────────────────────
#>  setting  value
#>  version  R version 4.4.0 (2024-04-24)
#>  os       macOS Sonoma 14.5
#>  system   aarch64, darwin20
#>  ui       X11
#>  language (EN)
#>  collate  en_US.UTF-8
#>  ctype    en_US.UTF-8
#>  tz       America/Costa_Rica
#>  date     2024-06-12
#>  pandoc   3.1.11 @ /Applications/RStudio.app/Contents/Resources/app/quarto/bin/tools/aarch64/ (via rmarkdown)
#>
#> ─ Packages ───────────────────────────────────────────────────────────────────
#>  package     * version    date (UTC) lib source
#>  assertthat    0.2.1      2019-03-21 [1] CRAN (R 4.4.0)
#>  cli           3.6.2      2023-12-11 [1] CRAN (R 4.4.0)
#>  curl          5.2.1      2024-03-01 [1] CRAN (R 4.4.0)
#>  digest        0.6.35     2024-03-11 [1] CRAN (R 4.4.0)
#>  evaluate      0.23       2023-11-01 [1] CRAN (R 4.4.0)
#>  fastmap       1.2.0      2024-05-15 [1] CRAN (R 4.4.0)
#>  fs            1.6.4      2024-04-25 [1] CRAN (R 4.4.0)
#>  glue          1.7.0      2024-01-09 [1] CRAN (R 4.4.0)
#>  gptstudio     0.4.0.9000 2024-06-12 [1] Github (MichelNivard/gptstudio@1f32da7)
#>  htmltools     0.5.8.1    2024-04-04 [1] CRAN (R 4.4.0)
#>  htmlwidgets   1.6.4      2023-12-06 [1] CRAN (R 4.4.0)
#>  httpuv        1.6.15     2024-03-26 [1] CRAN (R 4.4.0)
#>  httr2         1.0.1      2024-04-01 [1] CRAN (R 4.4.0)
#>  jsonlite      1.8.8      2023-12-04 [1] CRAN (R 4.4.0)
#>  knitr         1.46       2024-04-06 [1] CRAN (R 4.4.0)
#>  later         1.3.2      2023-12-06 [1] CRAN (R 4.4.0)
#>  lifecycle     1.0.4      2023-11-07 [1] CRAN (R 4.4.0)
#>  magrittr      2.0.3      2022-03-30 [1] CRAN (R 4.4.0)
#>  mime          0.12       2021-09-28 [1] CRAN (R 4.4.0)
#>  promises      1.3.0      2024-04-05 [1] CRAN (R 4.4.0)
#>  R6            2.5.1      2021-08-19 [1] CRAN (R 4.4.0)
#>  rappdirs      0.3.3      2021-01-31 [1] CRAN (R 4.4.0)
#>  Rcpp          1.0.12     2024-01-09 [1] CRAN (R 4.4.0)
#>  reprex        2.1.0      2024-01-11 [1] CRAN (R 4.4.0)
#>  rlang         1.1.3      2024-01-10 [1] CRAN (R 4.4.0)
#>  rmarkdown     2.27       2024-05-17 [1] CRAN (R 4.4.0)
#>  rstudioapi    0.16.0     2024-03-24 [1] CRAN (R 4.4.0)
#>  sessioninfo   1.2.2      2021-12-06 [1] CRAN (R 4.4.0)
#>  shiny         1.8.1.1    2024-04-02 [1] CRAN (R 4.4.0)
#>  withr         3.0.0      2024-01-16 [1] CRAN (R 4.4.0)
#>  xfun          0.44       2024-05-15 [1] CRAN (R 4.4.0)
#>  xtable        1.8-4      2019-04-21 [1] CRAN (R 4.4.0)
#>  yaml          2.3.8      2023-12-11 [1] CRAN (R 4.4.0)
#>
#> [1] /Library/Frameworks/R.framework/Versions/4.4-arm64/Resources/library
#>
#> ──────────────────────────────────────────────────────────────────────────────
```