The proposed change will only solve the problem on unified systems that have both the database and R running on the same machine. I'm running into this same error when I try to load a data frame from R running on one Linux box into PostgreSQL running on another Linux box. Permissions are only part of the problem in this case; the main one is the availability of that local /tmp dir on the database machine.
Original comment by dschr...@gmail.com on 13 May 2010 at 3:33
For the case of multiple systems, I created a shared folder on the host computer that was mounted to the same directory on the client computers via Samba, and then changed the tempfiles to save there. Very kludgy, but it solved my limited needs with limited work.
The best solution would probably use the same syntax as pg_dump and pg_dumpall, with the data coming in over stdin all at once, e.g.:
COPY tablename FROM STDIN;
That solution should be even faster than the copy method because the data wouldn't have to be written to disk twice (first as a tempfile, then again into Postgres). However, the whole upload would need to occur in one transaction.
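The payload streamed to the server by COPY ... FROM STDIN is PostgreSQL's COPY text format: tab-separated columns, newline-terminated rows, \N for NULL, with tabs, newlines, and backslashes escaped inside values. A minimal Python sketch of that serialization step (an illustration of the format, not the package's actual code):

```python
# Serialize rows into PostgreSQL's COPY text format, the payload that
# would be streamed to the server by "COPY tablename FROM STDIN".
def to_copy_text(rows):
    out = []
    for row in rows:
        fields = []
        for value in row:
            if value is None:
                fields.append(r"\N")  # COPY's NULL marker
            else:
                s = str(value)
                # Escape characters that are structural in the COPY format.
                s = (s.replace("\\", "\\\\")
                       .replace("\t", "\\t")
                       .replace("\n", "\\n"))
                fields.append(s)
        out.append("\t".join(fields) + "\n")
    return "".join(out)

payload = to_copy_text([(1, "alice", None), (2, "bob", 3.5)])
```

Serializing in memory like this is what removes the round trip through a temp file on disk.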
Original comment by rmcge...@gmail.com on 15 May 2010 at 1:14
A mutually mounted shared folder seems like it could work. I could just use a temporary directory in the user's home dir like /home/fred/tmp/ ... it might be nice to have the "/tmp" in the code above read from an environment variable, or from a new parameter to dbWriteTable() (say something like 'tmp.dir'), instead.
Original comment by dschr...@gmail.com on 17 May 2010 at 8:11
For example, the following edit works for me:
[someuser@linuxmachine]$ diff PostgreSQLSupport.R PostgreSQLSupport-edit.R
633,636c633,640
< if(as.character(Sys.info()["sysname"]) %in% c("Linux", "Darwin"))
< fn <- tempfile("rsdbi","/tmp")
< else
< fn <- tempfile("rsdbi")
---
> tmp.dir <- Sys.getenv('R_DB_TMP')
> if(tmp.dir == '')
> tmp.dir <- '/tmp'
> if(!file.exists(tmp.dir))
> dir.create(tmp.dir)
> fn <- tempfile("rsdbi",tmp.dir)
> Sys.chmod(fn, mode="0744")
as long as I'm sure to assign this environment variable before I make a call to dbWriteTable:
Sys.setenv(R_DB_TMP = '/home/someuser/tmp')
Original comment by dschr...@gmail.com on 17 May 2010 at 9:12
As stated in the comments in the source and the TODO file, the bulk copy operation should be done over the socket rather than by passing a pathname. How to achieve this? There is a hint in psql's \copy operation, which does something very similar to what we want. The essence is to use "COPY tablename FROM STDIN" and send the data with repeated calls to PQputCopyData(), terminated by a single call to PQputCopyEnd(). Proof-of-concept code without sufficient error handling is attached. (The patch is relative to RPostgreSQL_0.1-6.)
Also, I believe issue 9 is the same problem.
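The PQputCopyData / PQputCopyEnd sequence amounts to: issue COPY ... FROM STDIN, push the serialized rows over the already-open connection in chunks, then signal end-of-data once. A pure-Python sketch of that call pattern, where the hypothetical `send` callback stands in for the real libpq calls (this illustrates the control flow only, not libpq itself):

```python
# Illustration of the COPY-over-socket call pattern: the serialized rows
# are pushed in fixed-size chunks (repeated PQputCopyData calls in libpq),
# then a single end-of-data signal is sent (PQputCopyEnd).
# `send` is a hypothetical stand-in for the real libpq call.
def stream_copy(payload: bytes, send, chunk_size: int = 8192) -> int:
    """Send `payload` in chunks; return the number of data chunks sent."""
    chunks = 0
    for i in range(0, len(payload), chunk_size):
        send(payload[i:i + chunk_size])  # one PQputCopyData per chunk
        chunks += 1
    send(None)                           # end-of-data, like PQputCopyEnd
    return chunks

sent = []
n = stream_copy(b"1\talice\n2\tbob\n", sent.append, chunk_size=4)
```

Because everything travels over the existing database connection, no pathname on the server and no shared /tmp directory is needed, which is exactly what resolves the multi-machine case above.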
Original comment by tomoa...@kenroku.kanazawa-u.ac.jp on 11 Sep 2010 at 7:19
This should have been fixed as of r144.
Original comment by tomoa...@kenroku.kanazawa-u.ac.jp on 13 Oct 2010 at 2:45
Original issue reported on code.google.com by rmcge...@gmail.com on 22 Feb 2010 at 6:52