gtatters / Thermimage

R Package for working with radiometric thermal image files and data
GNU General Public License v3.0

Error in writeBin & total frames #9

Closed shaktalerv closed 4 years ago

shaktalerv commented 4 years ago

I get the following error message:

Error in writeBin(temperature, wb, size = 4, endian = "little") : only 2^31-1 bytes can be written in a single writeBin() call

n.frames returns 5711L

Data image

To Reproduce

Following the steps of "Importing Thermal Videos", at writeFlirBin(bindata=alldata, templookup, w, h, Interval, rootname="rootseq20180328")


I tried to convert a 3.5 GB SEQ file with 5711 frames, but the total converted with convertflirVID() is 5584. Why?

The objective is to convert the SEQ file into PNG or TIFF images containing radiance (RAW) or temperature. Moreover, each output image should store the capture date and time, focal length, and the rest of the EXIF summary in its details, like a JPG photo from an RGB camera with a geo-tag (I have the GNSS info and concatenate it with the EXIF info).

Step by step I've been able to make some progress on my project.

THANKS a lot!

gtatters commented 4 years ago

So, sadly, I don't think R can handle files/variables that large. The writeBin() documentation states that 2^31-1 bytes is the maximum on 32-bit systems.
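For context, the 2^31-1 byte cap applies to a single writeBin() call, so a generic workaround in any language is to write the buffer in chunks. A minimal sketch of that idea, in Python for illustration (whether chunking fits writeFlirBin()'s output format is a separate question, and the chunk size here is an arbitrary choice):

```python
import struct

def write_floats_chunked(values, path, chunk_elems=10_000_000):
    """Write 32-bit little-endian floats in fixed-size chunks so that
    no single write call has to handle the entire buffer at once."""
    with open(path, "wb") as fh:
        for start in range(0, len(values), chunk_elems):
            chunk = values[start:start + chunk_elems]
            fh.write(struct.pack("<%df" % len(chunk), *chunk))

# tiny demonstration: 3 floats written in chunks of 2
write_floats_chunked([1.0, 2.5, 3.25], "demo.bin", chunk_elems=2)
```

In R, the same idea would be a loop of writeBin() calls over slices of the temperature vector against an open connection, rather than one call on the whole vector.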

Maybe you want to try the other function, convertflirVID() that simply calls bash/terminal language to convert the video into a series of png or tiff images.

I feel your pain, but it might be easier in the long run to convert the file to a raw png (i.e. where the png file simply contains the raw sensor data), and then you apply the raw2temp() conversion in another platform (not sure what your final purpose is). You could, in principle, derive a customised lookup table or regression equation that would allow you to do that.
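The lookup-table suggestion can be sketched as: precompute the temperature for every possible 16-bit raw sensor value once, then convert raw PNG pixels by plain indexing. A Python sketch with a hypothetical linear calibration standing in for the real raw2temp() (which uses the Planck constants stored in the FLIR metadata):

```python
def build_lookup(raw2temp, max_raw=2**16 - 1):
    """Precompute temperature (deg C) for every possible raw value."""
    return [raw2temp(r) for r in range(max_raw + 1)]

# hypothetical stand-in calibration, NOT the real FLIR conversion
def fake_raw2temp(raw):
    return 0.04 * raw - 273.15

table = build_lookup(fake_raw2temp)

# converting raw pixel values is then just list indexing
pixels = [6829, 7000, 7150]
temps = [table[p] for p in pixels]
```

A fitted regression equation would replace the stand-in function here; the table only has to be built once per calibration.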

shaktalerv commented 4 years ago

Maybe you want to try the other function, convertflirVID() that simply calls bash/terminal language to convert the video into a series of png or tiff images.

Yes, but the total number of frames converted is 5584, while n.frames is 5711 and the data.frame of times has 5711 rows. Files are missing. Why?

gtatters commented 4 years ago

Harder to tell, unfortunately, without seeing the file. These functions are hacks, and sometimes the SEQ files have frames that were dropped during capture, perhaps? Exiftool probably can't tell. If you want to send the file to me via WeTransfer or Google Drive, I can look at it offline later.

shaktalerv commented 4 years ago

Warning: Invalid FLIR record - G:
Warning: Unsupported FLIR FFF version - G:/

I get this message during the convertflirVID() process, but the .fff files from the warning are still converted.

Uploading the file....

gtatters commented 4 years ago

it's quite possible there are dropped frames in the file

shaktalerv commented 4 years ago

the file https://drive.google.com/file/d/1AGnjTnpwNxDRwfgX19yFW_UsZWj4W_3e/view?usp=sharing

thanks

gtatters commented 4 years ago

Sorry, where is your 5584 number coming from? I find 5711 frames. You might need to supply more info.

shaktalerv commented 4 years ago

It is 5585 PNG files converted with convertflirVID().

gtatters commented 4 years ago

Are you using the latest version of Thermimage? I uploaded a new perl script a few versions ago. Do you have the output from when you set verbose=TRUE? That could help.

shaktalerv commented 4 years ago

are you using the latest version of thermimage?

Yes, version 4.0.1

I use this code: convertflirVID(f, exiftoolpath="installed", perlpath="installed", fr=30, res.in=res.in, res.out=res.in, outputcompresstype="tiff", outputfilenameroot=NULL, outputfiletype="png", outputfolder="output Rr3", verbose=T). Warning examples:

Warning: Unsupported FLIR FFF version - G:/temp/frame05205.fff
Warning: Unsupported FLIR FFF version - G:/temp/frame05206.fff
Warning: Invalid FLIR record - G:/temp/frame00028.fff

gtatters commented 4 years ago

So, from running tests on the file, there are some errors in the file itself. The perl script extracts all \x46\x46\x46\x00 headers; these are the FFF headers. Technically, each FFF file generated should have image data in it, but for some reason your file has additional headers scattered throughout the SEQ file. If these were captured in ThermaCAM Researcher Pro I'm not too surprised, since really large files in SEQ format (in my experience) often have errors in them. I don't know how FLIR deals with them, but I can't readily solve those issues with this package.

gtatters commented 4 years ago

With verbose=T on, you can see the terminal commands that are being called, and perhaps you could recreate this step by step: generate the FFF files in the temp folder and sort them by size. All the valid frames should be identical in size, and all the incorrect FFF files will be smaller. You could delete the small files and then run the rest of the commands that verbose=T outputs. An awkward solution, but the easiest one. I can confirm that the file has 5711 frames from opening it in ThermaCAM Researcher Pro.
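The sort-by-size cleanup described above can be scripted. A hedged sketch, in Python for illustration, with a hypothetical folder layout: it assumes valid frames all share the most common file size, and anything smaller is a header-only, corrupt .fff:

```python
import os
from collections import Counter

def find_undersized(folder, ext=".fff"):
    """Return filenames smaller than the most common (modal) size;
    these are presumed to be the corrupt, header-only frames."""
    sizes = {name: os.path.getsize(os.path.join(folder, name))
             for name in sorted(os.listdir(folder)) if name.endswith(ext)}
    if not sizes:
        return []
    modal_size = Counter(sizes.values()).most_common(1)[0][0]
    return [name for name, size in sizes.items() if size < modal_size]
```

Deleting the files this returns, then re-running the remaining commands shown by verbose=T, would mirror the manual procedure.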

gtatters commented 4 years ago

I have an idea, but I'll have to add functionality to convertflirVID. It looks like the program used to create the SEQ file is purposely adding FFF headers inappropriately. I'll need to add a new option and it might take a few days to get it online.

If I split the SEQ file based on: \x46 \x46 \x46 \x00 \x43 \x41 \x4D using my perl script, it creates 5711 frames, so I think I can fix the R function, just need some time.
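The difference between splitting on the short FFF magic and the full seven-byte header can be sketched in a few lines (Python here for illustration; the package itself does this splitting with a perl script, and the toy byte stream below is made up):

```python
def split_frames(stream, magic=b"\x46\x46\x46\x00\x43\x41\x4D"):
    """Split a raw SEQ byte stream at each occurrence of the frame
    header magic, returning one bytes object per frame."""
    starts = []
    pos = stream.find(magic)
    while pos != -1:
        starts.append(pos)
        pos = stream.find(magic, pos + 1)
    return [stream[s:e]
            for s, e in zip(starts, starts[1:] + [len(stream)])]

# toy stream: two real frames plus a stray FFF\x00 inside the data
toy = (b"\x46\x46\x46\x00\x43\x41\x4Ddata1"
       b"\x46\x46\x46\x00junk"
       b"\x46\x46\x46\x00\x43\x41\x4Ddata2")
print(len(split_frames(toy)))                        # 2 with the full header
print(len(split_frames(toy, b"\x46\x46\x46\x00")))   # 3 with the short magic
```

The stray short magic inside frame data is exactly what produces extra, undersized .fff files when the short pattern is used as the split parameter.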

gtatters commented 4 years ago

I think the fix for this is beyond my abilities, although there were two things happening.

First was what I suspected: splitting your files based on what I assumed was a safe search parameter ( \x46 \x46 \x46) creates problems. So in the next update I will add the ability to specify the SEQ header ( \x46 \x46 \x46 \x00 \x43 \x41 \x4D) as the split parameter. This works more effectively with these .SEQ files.

Second issue was file size related. It seems related to file size / memory limits of terminal system calls from within R/Rstudio rather than anything in my scripts, since I can split the files successfully in perl (from the terminal window) without using R at all, and get no errors:

perl /Library/Frameworks/R.framework/Versions/3.6/Resources/library/Thermimage/perl/split.pl -i '201603020002.SEQ' -o temp -b frame -p seq -x fff

This creates the appropriate number of .fff files that all still contain image data, time stamps, etc (probably even GPS data but I haven't looked).

But when I ran your 3 Gb file, I started getting this error using convertflirVID function:

panic: sv_setpvn called with negative strlen 

Turns out, others have found similar issues with large files and perl: https://github.com/phageParser/phageParser/issues/46

If I break your .SEQ file up into files of at most 2 GB, the error in R goes away.

The only real solution would be to edit your files in a Hex editor to cut them down to more manageable sizes first. Probably <2Gb. Hex Fiend is really good and you could search for the FFF strings around the 2 Gb mark, cut and save the file into two parts and run the Rscript on the parts.
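The hex-editor step could also be scripted: search backwards from the size limit for the last frame-header magic and cut there, so that each part starts on a frame boundary. A sketch under the same assumption that \x46\x46\x46\x00\x43\x41\x4D marks each frame (toy sizes here; for the real file the limit would be around 2 * 1024**3 bytes, and a 3 GB file would be streamed rather than read whole):

```python
MAGIC = b"\x46\x46\x46\x00\x43\x41\x4D"

def split_point(path, limit, magic=MAGIC):
    """Offset of the last frame header starting at or before `limit`,
    i.e. where to cut the file so part 2 begins on a frame."""
    with open(path, "rb") as fh:
        head = fh.read(limit + len(magic))
    return head.rfind(magic)

def split_file(path, limit, magic=MAGIC):
    """Cut the file into two byte strings at that frame boundary.
    (Reads the whole file for simplicity; fine for a sketch only.)"""
    cut = split_point(path, limit, magic)
    with open(path, "rb") as fh:
        data = fh.read()
    return data[:cut], data[cut:]
```

Each part could then be saved and run through the R script separately, matching the cut-and-save procedure described above.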

Alternatively, you could do all the fff file extractions in terminal using the perl script that comes with Thermimage. I have some of this outlined in the bash repository on github: https://github.com/gtatters/ThermimageBash

I will still work on a minor update to convertflirVID to allow the SEQ split specification, but I can’t fix the file size limit for the internal perl call.

gtatters commented 4 years ago

Updated to v4.1.0 to allow specifying the split pattern, but this is not likely to work on files larger than 2 GB. I will close this issue for now. I will delete my local copy of your SEQ file, particularly if the data are sensitive.

shaktalerv commented 4 years ago

I will try your proposed workaround and cut my files into smaller files (<1 GB).

I will update to v4.1.0 and report back.

I will delete my local copy of your SEQ file, particularly if the data are sensitive.

Thanks a lot!

gtatters commented 4 years ago

Great, good luck. Hopefully it helps... at least until the next glitch arises. At least I can confirm that running perl in shell/terminal does seem to work properly on the file, so the issue is related to limits within R (and my ability to troubleshoot them).