Larswad opened 5 years ago
Additional note: I discovered that I didn't emulate the NAND properly (AND'ing the data to write with the flash memory contents, then writing). I fixed that so write always ANDs (erase still writes 1's, no AND'ing of course). That had my hopes up for a while.
But that only postponed the problem a little. I still get SPIFFS_ERR_FULL when copying a large file over an old one, and I can even trigger it with only a couple of large files present, when trying to write smaller files that should clearly fit.
Closed by mistake. It is still a problem.
First I want to mention that I think spiffs is great; for embedded SPI flashes there aren't many free alternatives out there that are this good! Even if it is (for the moment) a flat structure, it covers most use cases for a simple embedded system device.
But I think something goes wrong when writing or overwriting files that are considerably large relative to the memory size, as the memory gets close to full, even after deleting enough that there should clearly be space. I have written a command-line interface (using GNU readline) and a C++ wrapper class that together emulate the behaviour of an SPI flash (of arbitrary size) in a simple file container (using the callback interface for read, write, erase). The wrapper provides a delegated interface to the spiffs C functions.
Anyhow, this gives me the opportunity to do some heavy testing of spiffs (copying files to and from the "outer" file system, removing files, overwriting, etc.).
Say I have a 1 MB flash mounted, physical erase and block size 4 kB, page size 256, formatted, all good. First I write a couple of pretty large files of around 350 kB, then fill up with a couple of smaller ones (say ~45 kB) until getting close to full. This is what happens: SPIFFS_ERR_FULL is reported before the filesystem is actually full, with something like 130 kB left. Erasing the last small file doesn't really help; erasing a larger file DOES help (if I write a smaller one after that).
In fact, I can get into this problem from a freshly formatted state by first copying a larger file, say 378 kB, then copying another one of 378 kB. Now if I overwrite the first or the second a couple of times, I run into the same problem.
Things that I guess contribute to this:
However, something is still funky with the housekeeping of free space. I can delete files so that there is clearly room for another large file (leaving only ONE 379 kB file), and it still fails to copy another one. Garbage collection and check operations make no difference in this case. This does not happen if I write smaller files and delete or overwrite them without filling up the whole space (which will be my use case). It only becomes a problem when getting close to full, or when overwriting large files. Still, this worries me that something is wrong. Now, things may be by design here, so if I have misunderstood some basic behaviour of spiffs, I am sorry for that.
Here is a sequence of operations from my command line interface, describing one scenario of overwriting:
What can be seen here is that there is clearly room for that 375 kB file, yet SPIFFS_ERR_FULL is still reported. Is there something in my flash emulation that I haven't considered? Does spiffs rely on re-programming cells multiple times for housekeeping before erasing (meaning, AND-ing the bits written with the previous content, assuming flash is initialized to all ones)?
Any ideas?