Can reproduce this on a colleague's computer (same machine, only 16 GB of RAM).
Can't replicate it on my home PC (Windows 10, far lower spec).
Closing issue.
I just got the error message when passing a list of data frames into writexl::write_xlsx. One of my list elements had a lengthy name (i.e., the name I wanted assigned to the tab in the resulting .xlsx file). By shortening the 35-character name, the function worked without error. I imagine this is a limit imposed by Excel.
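For what it's worth, Excel does cap worksheet names at 31 characters. A minimal sketch of the workaround (the list and its long name here are illustrative):

```r
library(writexl)

df_list <- list(iris)
names(df_list) <- "a sheet name that is definitely longer than thirty-one characters"

# Excel limits worksheet names to 31 characters, so truncate the
# list names (which become the sheet/tab names) before writing.
names(df_list) <- substr(names(df_list), 1, 31)
write_xlsx(df_list, "iris.xlsx")
```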
I'm running into a similar issue when trying to write 27 xlsx files in a for loop. 24 of them write just fine. Another worked when I ran it manually, and the others didn't work until I changed the write directory.
@gfleetwood what error did you get? Are you also using very long file names or is it another issue?
@jeroen No error. It just didn't write to the folder. The file names are relatively long (plus I'm passing a full path) and are constructed using glue. But all the file names are basically the same length, so it's not clear why it would work for most and not for a few.
Here's an example of my write process:
```r
library(writexl)
library(glue)

# df is an existing data frame to be written out
path <- "~/../Downloads/FOLDER/FOLDER/2020-04-13_abcdefgh.xlsx"
sheet_name <- "April"
df_list <- setNames(list(df), sheet_name)
write_xlsx(df_list, glue(path), format_headers = FALSE)
```
@gfleetwood, just curious: are you on Windows? If so, does the total folder + file path exceed 255 characters? That's the default limit in Windows.
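A quick way to check (a sketch; the `paths` vector stands in for your actual output paths):

```r
# Flag any output paths at or beyond the default Windows path limit.
# These two paths are placeholders, not the real ones.
paths <- c("C:/data/2020-04-13_abcdefgh.xlsx", "C:/data/other.xlsx")
paths[nchar(normalizePath(paths, mustWork = FALSE)) >= 255]
```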
@KyleHaynes I'm on Windows but it's not a problem in this case. The longest path is 190 characters. It happens to be one of the three that aren't working. The other two are 161 and 164 chars, but numerous files worked whose paths were 161/163 chars.
When writing out a lot of xlsx files, writexl::write_xlsx will eventually fail and the R session will become unusable. For me it fails after 510 write-outs. The same issue occurs when you attempt to write a lot of worksheets as nested data.frames in a list (similar amount: ~500-600).

Code used to produce the error:
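(a minimal sketch of the kind of loop involved; the data frame, file names, and counts below are illustrative)

```r
library(writexl)

df <- data.frame(x = 1:10, y = letters[1:10])

# Variant 1: many separate files -- fails after roughly 500 writes.
for (i in seq_len(600)) {
  write_xlsx(df, file.path(tempdir(), sprintf("test_%03d.xlsx", i)))
}

# Variant 2: many worksheets in a single file, passed as a named
# list of data.frames (similar threshold, ~500-600 sheets).
sheets <- setNames(rep(list(df), 600), sprintf("sheet_%03d", seq_len(600)))
write_xlsx(sheets, file.path(tempdir(), "many_sheets.xlsx"))
```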
I'm on Windows 10 with 32 GB of RAM.
Once I force the error, I can't access anything.
Task Manager doesn't indicate a lack of RAM, CPU, or disk.
I have access to RStudio Server (running Red Hat with 100 GB of RAM and ~16 cores). I can't reproduce the error there even when looping a million times. Perhaps a Windows issue?
Let me know if you require any further info.