ropensci / writexl

Portable, light-weight data frame to xlsx exporter for R
https://docs.ropensci.org/writexl

Issue in write_xlsx? #37

Closed KyleHaynes closed 5 years ago

KyleHaynes commented 5 years ago

When writing out a lot of xlsx files, writexl::write_xlsx will eventually fail and the R session will become unusable.

Code used to produce the error:

install.packages("writexl")

# package ‘writexl’ successfully unpacked and MD5 sums checked
# The downloaded binary packages are in  C:\Users\XXXXXX\downloaded_packages

# repeatedly overwrite the same xlsx file; fails part-way through the loop
for (i in 1:1000) {
    print(i)
    writexl::write_xlsx(iris, "c:/temmp/asdsad.xlsx")
}

For me it fails after 510 writes. The same issue occurs when you write a lot of worksheets by passing a list of data frames (a similar number: ~500-600 sheets).
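
A minimal sketch of that second case (iris as filler data; the sheet count, list names, and output path below are placeholders, not the exact code that failed):

library(writexl)

# build a named list of ~600 data frames; each element becomes one worksheet
sheets <- setNames(
    replicate(600, iris, simplify = FALSE),
    paste0("sheet_", seq_len(600))
)

# on the affected machine this fails with the same "failed to create workbook" error
writexl::write_xlsx(sheets, "c:/temp/many_sheets.xlsx")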

Error:

Error: Error in writexl: failed to create workbook


I'm on Windows 10 with 32 GB of RAM.

sessionInfo()
R version 3.6.0 (2019-04-26)
Platform: x86_64-w64-mingw32/x64 (64-bit)
Running under: Windows 10 x64 (build 16299)

Matrix products: default

locale:
[1] LC_COLLATE=English_Australia.1252  LC_CTYPE=English_Australia.1252   
[3] LC_MONETARY=English_Australia.1252 LC_NUMERIC=C                      
[5] LC_TIME=English_Australia.1252    

attached base packages:
[1] stats     graphics  grDevices utils     datasets  methods   base     

loaded via a namespace (and not attached):
[1] compiler_3.6.0

Once I force the error, I can't access anything:

sessionInfo()
Error in gzfile(file, "rb") : cannot open the connection
In addition: Warning message:
In gzfile(file, "rb") :
  cannot open compressed file 'C:/Program Files/R/R-3.6.0/library/stats/Meta/package.rds', probable reason 'Too many open files'

Task Manager doesn't indicate a lack of RAM, CPU, or disk:
[Task Manager screenshot omitted]

I have access to RStudio Server (running Red Hat with 100 GB of RAM and ~16 cores). I can't reproduce the error there even when looping a million times. Perhaps it's a Windows issue?

Let me know if you require any further info.

KyleHaynes commented 5 years ago

I can reproduce this on a colleague's computer (same machine, only 16 GB of RAM).

I can't replicate it on my home PC (Windows 10, far lower spec).

Closing issue.

segoldma commented 4 years ago

I just got the error message when passing a list of data frames into writexl::write_xlsx. One of my list elements had a lengthy name (i.e., the name I wanted to assign to the tab in the resulting .xlsx file). After shortening the 35-character name, the function worked without error. I imagine this is a limit imposed by Excel.
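
A minimal sketch of that workaround, assuming dfs is the named list of data frames being passed in (the name dfs and the output path are placeholders; the 31-character cap is Excel's documented sheet-name limit):

library(writexl)

# truncate sheet names to stay under Excel's 31-character limit,
# leaving room for make.unique()'s ".1"/".2" suffixes
names(dfs) <- make.unique(substr(names(dfs), 1, 28))
write_xlsx(dfs, "output.xlsx")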

gfleetwood commented 4 years ago

I'm running into a similar issue when trying to write 27 xlsx files in a for loop. 24 of them write just fine. Another worked when I ran it manually, and the rest didn't work until I changed the write directory.

jeroen commented 4 years ago

@gfleetwood what error did you get? Are you also using very long file names or is it another issue?

gfleetwood commented 4 years ago

@jeroen No error, it just didn't write to the folder. The file names are relatively long (plus I'm passing a full path) and are constructed using glue. But all the file names are basically the same length, so it's not clear why it would work for most and not for a few.

Here's an example of my write process:

library(writexl)
library(glue)

path <- "~/../Downloads/FOLDER/FOLDER/2020-04-13_abcdefgh.xlsx"
sheet_name <- "April"
df_list <- setNames(list(df), sheet_name)  # df is the data frame being exported
write_xlsx(df_list, glue(path), format_headers = FALSE)
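
One hedged way to surface that kind of silent failure is to check for the file right after the call; this reuses the placeholder path above:

out <- glue(path)
write_xlsx(df_list, out, format_headers = FALSE)
# write_xlsx raised no error, so verify the file actually landed on disk
if (!file.exists(out)) warning("no file created at: ", out)
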
KyleHaynes commented 4 years ago

@gfleetwood, just curious: are you on Windows? If so, does the total folder + file path exceed 255 characters? That's the default limit in Windows...

https://stackoverflow.com/a/265782/2449656
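
A quick way to check from R (the paths vector below is a placeholder for the real output paths):

# flag anything over the classic Windows MAX_PATH budget (~255-260 characters)
paths <- c("C:/some/deeply/nested/output/file.xlsx")  # placeholder
full  <- normalizePath(paths, winslash = "\\", mustWork = FALSE)
data.frame(path = full, chars = nchar(full), too_long = nchar(full) > 255)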

gfleetwood commented 4 years ago

@KyleHaynes I'm on Windows, but it's not a problem in this case. The longest path is 190 characters, and it happens to be one of the three that aren't working. The other two are 161 and 164 characters, yet numerous files whose paths were 161/163 characters worked fine.