hugomflavio / actel

Standardised analysis of acoustic telemetry data from fish moving through receiver arrays
https://hugomflavio.github.io/actel-website

Data does not match to any of the supported hydrophone file formats #102

Closed prisrcunhavt closed 1 month ago

prisrcunhavt commented 2 months ago

I have been trying to run actel with my Venmo acoustic data, but the software does not recognize it. I know that, for now, the package is only designed for data generated by VEMCO- and THELMA-manufactured receivers. However, I would like to check whether I can edit my dataset to match actel's requirements. Currently, my data has this format:

```r
> str(det)
'data.frame':	259215 obs. of 10 variables:
 $ Timestamp         : POSIXct, format: "2014-12-28 09:05:00" "2014-12-28 09:09:00" "2014-12-28 09:11:00" ...
 $ Receiver          : int 124595 124595 124595 124595 124595 124595 124595 124595 124595 124595 ...
 $ Transmitter       : chr "A69-1601-14922" "A69-1601-14922" "A69-1601-14922" "A69-1601-14922" ...
 $ Signal            : int 14922 14922 14922 14922 14922 14922 14922 14922 14922 14922 ...
 $ Transmitter.Serial: logi NA NA NA NA NA NA ...
 $ Sensor.Value      : logi NA NA NA NA NA NA ...
 $ Sensor.Unit       : logi NA NA NA NA NA NA ...
 $ Station.Name      : chr "Estirão da gata" "Estirão da gata" "Estirão da gata" "Estirão da gata" ...
 $ Latitude          : logi NA NA NA NA NA NA ...
 $ Longitude         : logi NA NA NA NA NA NA ...
```

hugomflavio commented 2 months ago

Hi! Sorry for the delay, this past week I was busy at the OTN symposium :)

Before you try to convert the data into actel's standard format, I am curious as to why your vemco data files could not be processed. Could you provide some more details on what happened when you tried to import those files?

prisrcunhavt commented 2 months ago

Thank you for your response! No need to apologize for the delay; I actually owe you an apology for my late reply. I typically only work on this side project on Thursdays and Fridays.

I wanted to clarify that the data I have is from Venmo, not VEMCO, which may be contributing to the issue. I attempted to align the columns with your manual, but when I ran the explore function, I encountered the following messages:

```r
> exp.results <- explore(tz = 'Brazil/West', report = TRUE) # tz = Time Zone
M: Importing data. This process may take a while.
M: No Code.space column was found in the biometrics. Assigning code spaces based on detections.
Warning: Long group names detected. To improve graphic rendering, consider keeping group names under six characters.
M: Number of target tags: 15.
M: Compiling detections...
Warning: File 'detections/VUE_Export2.csv' does not match to any of the supported hydrophone file formats!
         If your file corresponds to a hydrophone log and actel did not recognize it, please get in contact through www.github.com/hugomflavio/actel/issues/new
M: One file was excluded from further analyses.
Error: No valid detection files were found.
M: The analysis errored. You can recover the latest job log (including your comments and decisions) by running recoverLog().
```

From what I understand, some of these messages are just warnings, such as the missing Code.space and the long group names. However, my main concern is the warning regarding the unsupported hydrophone file format.

Do you have any suggestions on how to resolve this? Your insights would be greatly appreciated!

Thanks again for your help!

hugomflavio commented 2 months ago

hm... I have never heard of "Venmo" acoustic data. Googling it didn't help due to the obvious "Venmo" namesake :)

Your file is called "VUE_Export" though, which is something that usually comes out of a vemco/innovasea source... Could you post here a sample of what the file looks like? A screenshot of the first rows would do.

prisrcunhavt commented 2 months ago

I apologize for the confusion. I initially received the information from a collaborator who mistakenly referred to it as Venmo, but upon reviewing her dissertation, I confirmed that it should be VEMCO. It seems it was just an autocorrect error.

I can certainly send a screenshot for your reference.

The original file I received looks like this:

Screenshot 2024-10-03 at 1 18 33 PM

I’ve updated it to match the column names in the manual, and it now looks like this:

Screenshot 2024-10-03 at 1 20 16 PM

hugomflavio commented 2 months ago

hah, ok, that makes more sense now :)

Just to be sure, did you try running the analysis with the original file, before you updated it? actel should be able to read the original csv file.

prisrcunhavt commented 2 months ago

Yes, I did. When I try the original one, it gives me this error:

```
Error: Something went wrong when processing file 'detections/VUE_Export.csv'.
       If you are absolutely sure this file is ok, contact the developer.
Original error: Importing timestamps failed
```

Then I tried two transformations, saving a new file in the detections folder:

1) Editing the date column format:

```r
VUE$Date.and.Time..UTC. <- as.POSIXct(VUE$Date.and.Time..UTC., format = "%m/%d/%y %H:%M")
```

But the same error continued.

2) Editing the date column name:

```r
names(VUE)[names(VUE) == "Date.and.Time..UTC."] <- "Timestamp"
```

But then it runs into the same warning as the edited file:

```
Warning: File 'detections/VUE_Export.csv' does not match to any of the supported hydrophone file formats!
         If your file corresponds to a hydrophone log and actel did not recognize it, please get in contact through www.github.com/hugomflavio/actel/issues/new
```

hugomflavio commented 2 months ago

Looking at the screenshot you sent of the original file, I can see why actel is confused. The timestamps were indeed saved in an unusual format. Your first attempt was in the right direction, but it mustn't have been enough to fully fix the timestamps (I'd have to try it myself to be sure).
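For reference, here is a rough base-R sketch (not actel code, and the sample values are made up) of what a full timestamp fix could look like, assuming the original column uses the `%m/%d/%y %H:%M` layout from your screenshot:

```r
# Hypothetical sketch: convert VUE-style timestamps into the
# "YYYY-MM-DD HH:MM:SS" text layout that standard detection files use.
raw <- c("12/28/14 09:05", "12/28/14 09:09")  # made-up sample values

# parse in UTC so no local-time shift sneaks in
ts <- as.POSIXct(raw, format = "%m/%d/%y %H:%M", tz = "UTC")

# write the values back out as plain text; format() pads the missing
# seconds with ":00"
fixed <- format(ts, "%Y-%m-%d %H:%M:%S")
```

One step that is easy to miss: after converting the column in memory, the reformatted text still has to be written back into the csv (e.g. with `write.csv`) before the file is re-imported; changing the object alone doesn't change the file on disk.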

This being said, you are quite close to getting the standard format right. From that second screenshot, you need to split the "Transmitter" column into two columns (the code space and the signal).

actel provides helper functions for this. Try running:

```r
# assuming your object is called "VUE"
VUE$Signal <- extractSignals(VUE$Transmitter)
VUE$CodeSpace <- extractCodeSpaces(VUE$Transmitter)
```
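If you're curious what the split amounts to (or can't use the helpers for some reason), it can be sketched in base R, assuming transmitters follow the usual "codespace-signal" layout like "A69-1601-14922":

```r
# Hypothetical base-R equivalent of the split, assuming the
# "A69-1601-14922" layout (this is a sketch, not the actel source):
transmitters <- c("A69-1601-14922", "A69-1601-14930")

# signal: everything after the last dash, as a number
signal <- as.numeric(sub(".*-", "", transmitters))

# code space: everything before the last dash
code_space <- sub("-[0-9]+$", "", transmitters)
```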

I am concerned that the lack of seconds might still cause some issues later on, but we can cross that bridge once actel recognizes the input as a standard format file.

Let me know how it goes!

hugomflavio commented 2 months ago

edit: I just noticed you already have the Signal column, so you're only missing the CodeSpace one.

prisrcunhavt commented 2 months ago

I attempted to edit the original file based on the codes you provided, resulting in the following format:

Screenshot 2024-10-03 at 4 03 15 PM

actel now recognizes the file, but it raises an error indicating that the 'Receiver' column is not numeric:

```
Error: The file 'detections/VUE_Export.csv' was recognized as a standard detections file, but the 'Receiver' column is not numeric.
       Please include only the receiver serial numbers in the 'Receiver' column.
```

I then tried to modify the receiver column to retain only the numerical values:

```r
VUE$Receiver <- substr(VUE$Receiver, nchar(VUE$Receiver) - 5, nchar(VUE$Receiver))
VUE$Receiver <- as.numeric(VUE$Receiver)
```

Which led to this table:

Screenshot 2024-10-03 at 4 06 28 PM

However, after this adjustment, actel still does not recognize the data:

```
Error: Something went wrong when processing file 'detections/VUE_Export.csv'.
       If you are absolutely sure this file is ok, contact the developer.
Original error: Importing timestamps failed
```

Do you think it is the lack of seconds that is preventing it?

hugomflavio commented 2 months ago

I just tried importing a file where all the detections had "00" seconds and it worked fine, so it must be something else. The "Importing timestamps failed" error pops up when any of the values in Timestamp becomes NA when converted to a POSIX object.

Assuming your `VUE$Timestamp` is in POSIX format, could you run `any(is.na(VUE$Timestamp))` to confirm? If that returns `TRUE`, you can then run `which(is.na(VUE$Timestamp))` to find out which rows are NA. From there, we can try to understand what happened to those timestamps.
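To illustrate with made-up values how a single mismatched row produces the NA (and hence the import error):

```r
# Hypothetical example: the second value doesn't match the expected
# format, so as.POSIXct() silently turns it into NA.
raw <- c("12/28/14 09:05", "2014-12-28 09:05")
ts <- as.POSIXct(raw, format = "%m/%d/%y %H:%M", tz = "UTC")

any(is.na(ts))    # one row failed to parse
which(is.na(ts))  # points at the offending row
```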

prisrcunhavt commented 1 month ago

I was able to make it work by removing the NAs from the timestamp column. Thank you so much. I am now trying to run some analyses and might bother you again in the future, but for now the issue is solved. Thank you very much for all the patience and support.

hugomflavio commented 1 month ago

Excellent! Don't hesitate to reach out if you run into any other issue :) Good luck!