Closed m-saenger closed 2 years ago
Finally I used the `psql` command line tool as a workaround to access the Postgres DB from within a future. There is obviously some overhead (a connection is opened and closed for each request), but speed is excellent and memory usage acceptable. `data.table::fread` can read the `psql` output directly into a data.table (via its `cmd` argument). Working example: https://myweather.ch/map
Minimal example:
```r
library(future)
library(ambiorix)
library(data.table)

plan(multisession)

app <- Ambiorix$new()

app$get("/async", function(req, res){
  future({
    q <- "SELECT something FROM somewhere;"
    cmd <- sprintf("psql -h somehost -d somedb -U someuser -w -c %s -AF,", shQuote(q))
    dat <- data.table::fread(cmd = cmd)
    # Fix column types if necessary
    res$send(dat)
  })
})

app$start()
```
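Since `psql` emits plain text, `fread`'s type guessing may not recover every column type (timestamps in particular usually arrive as character strings). A minimal sketch of the "fix column types" step, using hypothetical column names `ts` and `value`:

```r
library(data.table)

# Hypothetical result as fread might return it: a timestamp and a
# numeric column both read in as character strings.
dat <- data.table(ts = "2024-01-01 12:00:00", value = "3.14")

# Coerce to the intended types in place.
dat[, ts := as.POSIXct(ts, tz = "UTC")]
dat[, value := as.numeric(value)]
```

`fread`'s `colClasses` argument can achieve the same at read time if the column layout is known in advance.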
@m-saenger I'm sorry I never came back to you on this issue. I set up my company last year and it left me very little time for all open-source projects.
Has this been resolved? Have you found a solution?
No worries. I found a workaround and closed this issue a couple of days ago. See https://github.com/JohnCoene/ambiorix/issues/41#issuecomment-1019335032.
Project: The goal is an API which queries a database. Result set length varies a lot, and so does the response time. To avoid fast queries being blocked by slow ones, the API should be async.

Problem: `future` does not allow external pointers (such as DB connections, `pool` connections, etc.) to be exported to workers, so I had to open a new DB connection (within the `future`) for every request, which slows down response time significantly. I am looking for a solution based on this example with non-blocking `query_db()` functionality within the future. Any help would be greatly appreciated!
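For reference, the per-request pattern described above looks roughly like the following. This is a sketch, not the poster's actual code: it assumes a DBI/RPostgres setup, and the host, database, and user names are placeholders.

```r
library(future)
plan(multisession)

# Hypothetical handler: a connection is an external pointer and
# cannot be exported to a worker, so each future must open (and
# close) its own connection -- the per-request overhead this
# issue is about.
query_async <- function(sql) {
  future({
    con <- DBI::dbConnect(RPostgres::Postgres(),
                          host   = "somehost",
                          dbname = "somedb",
                          user   = "someuser")
    on.exit(DBI::dbDisconnect(con))
    DBI::dbGetQuery(con, sql)
  })
}
```

Pooling does not help here, because a `pool` object is itself an external pointer and hits the same export restriction.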