rethomics / damr

Read TriKinetics' DAM data in R
http://rethomics.github.io

Can not cast an empty data.table #21

Closed lauraferg closed 5 years ago

lauraferg commented 5 years ago

Hi there,

I'm having an issue at the loading step dt <- load_dam(metadata) (with my own data) where I get this error:

Error in dcast.data.table(dt, id + t ~ data_type, value.var = "value") : Can not cast an empty data.table

Any chance anyone has encountered this before? My code up until this point is:

library(damr)        # link_dam_metadata(), load_dam()
library(data.table)  # fread()

DATA_DIR <- "C:/Users/lafergus/Downloads/Monitor files"
list.files(DATA_DIR, pattern = "*.txt|*.csv")
setwd(DATA_DIR)
metadata <- fread("metadata4.csv")
metadata
metadata <- link_dam_metadata(metadata, result_dir = DATA_DIR)
metadata
dt <- load_dam(metadata)

and I've attached my metadata (copied from Excel into a text editor) and DAM files in case that helps. I've been reading my metadata file in as a .csv file (i.e. saving it from .xlsx to .csv), so I wonder whether the conversion might be where the error is occurring, but I haven't had any luck rectifying it from that angle.

Thanks in advance for any insights!

Laura

metadata4.txt

Monitor3.txt
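One way to check whether the Excel-to-CSV export is the culprit is to inspect the parsed metadata columns directly. This is a minimal sketch, assuming data.table's fread as in the code above and the plain "YYYY-MM-DD HH:MM:SS" datetime text format used in the damr tutorials:

library(data.table)
metadata <- fread("metadata4.csv")
str(metadata)                  # start_datetime / stop_datetime should come through as character columns
head(metadata$start_datetime)  # expected to look like "2018-12-20 08:00:00"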

qgeissmann commented 5 years ago

Hey, thanks for reporting your issue. Just to be sure it is not empty, what is the output of:

metadata <- link_dam_metadata(metadata, result_dir = DATA_DIR)
metadata

Do you manage to load another dataset? Can you check that you have data in the date range you are extracting? Also, from a quick look, it seems that your data is sampled every hour; is that normal?
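A quick way to check which dates the raw monitor file actually covers is to read it directly. This is a sketch under the assumption that Monitor3.txt uses the usual tab-separated DAMSystem3 layout: no header row, with the date (e.g. "20 Dec 18") in the second column and the time in the third:

library(data.table)
raw <- fread("Monitor3.txt", header = FALSE)
# combine the date and time columns and report the span of readings in the file
range(as.POSIXct(paste(raw$V2, raw$V3), format = "%d %b %y %H:%M:%S"))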

lauraferg commented 5 years ago

Hi - thanks for the quick reply! The output definitely isn't empty; it spits back what is in the file:

id file_info      start_datetime       stop_datetime region_id Individual machine_name
 1: 2018-12-20 08:00:00|Monitor3.txt|01    <list> 2018-12-20 08:00:00 2018-12-21 08:00:00         1          1  machine_003
 2: 2018-12-20 08:00:00|Monitor3.txt|02    <list> 2018-12-20 08:00:00 2018-12-21 08:00:00         2         17  machine_003
 3: 2018-12-20 08:00:00|Monitor3.txt|03    <list> 2018-12-20 08:00:00 2018-12-21 08:00:00         3          2  machine_003
 4: 2018-12-20 08:00:00|Monitor3.txt|04    <list> 2018-12-20 08:00:00 2018-12-21 08:00:00         4         18  machine_003
 5: 2018-12-20 08:00:00|Monitor3.txt|05    <list> 2018-12-20 08:00:00 2018-12-21 08:00:00         5          3  machine_003
 6: 2018-12-20 08:00:00|Monitor3.txt|06    <list> 2018-12-20 08:00:00 2018-12-21 08:00:00         6         19  machine_003
 7: 2018-12-20 08:00:00|Monitor3.txt|07    <list> 2018-12-20 08:00:00 2018-12-21 08:00:00         7          4  machine_003
 8: 2018-12-20 08:00:00|Monitor3.txt|08    <list> 2018-12-20 08:00:00 2018-12-21 08:00:00         8         20  machine_003
 9: 2018-12-20 08:00:00|Monitor3.txt|09    <list> 2018-12-20 08:00:00 2018-12-21 08:00:00         9          5  machine_003
10: 2018-12-20 08:00:00|Monitor3.txt|10    <list> 2018-12-20 08:00:00 2018-12-21 08:00:00        10         21  machine_003
11: 2018-12-20 08:00:00|Monitor3.txt|11    <list> 2018-12-20 08:00:00 2018-12-21 08:00:00        11          6  machine_003
12: 2018-12-20 08:00:00|Monitor3.txt|12    <list> 2018-12-20 08:00:00 2018-12-21 08:00:00        12         22  machine_003
13: 2018-12-20 08:00:00|Monitor3.txt|13    <list> 2018-12-20 08:00:00 2018-12-21 08:00:00        13          7  machine_003
14: 2018-12-20 08:00:00|Monitor3.txt|14    <list> 2018-12-20 08:00:00 2018-12-21 08:00:00        14         23  machine_003
15: 2018-12-20 08:00:00|Monitor3.txt|15    <list> 2018-12-20 08:00:00 2018-12-21 08:00:00        15          8  machine_003
16: 2018-12-20 08:00:00|Monitor3.txt|16    <list> 2018-12-20 08:00:00 2018-12-21 08:00:00        16         24  machine_003
17: 2018-12-20 08:00:00|Monitor3.txt|17    <list> 2018-12-20 08:00:00 2018-12-21 08:00:00        17         25  machine_003
18: 2018-12-20 08:00:00|Monitor3.txt|18    <list> 2018-12-20 08:00:00 2018-12-21 08:00:00        18          9  machine_003
19: 2018-12-20 08:00:00|Monitor3.txt|19    <list> 2018-12-20 08:00:00 2018-12-21 08:00:00        19         26  machine_003
20: 2018-12-20 08:00:00|Monitor3.txt|20    <list> 2018-12-20 08:00:00 2018-12-21 08:00:00        20         10  machine_003
21: 2018-12-20 08:00:00|Monitor3.txt|21    <list> 2018-12-20 08:00:00 2018-12-21 08:00:00        21         27  machine_003
22: 2018-12-20 08:00:00|Monitor3.txt|22    <list> 2018-12-20 08:00:00 2018-12-21 08:00:00        22         11  machine_003
23: 2018-12-20 08:00:00|Monitor3.txt|23    <list> 2018-12-20 08:00:00 2018-12-21 08:00:00        23         28  machine_003
24: 2018-12-20 08:00:00|Monitor3.txt|24    <list> 2018-12-20 08:00:00 2018-12-21 08:00:00        24         12  machine_003
25: 2018-12-20 08:00:00|Monitor3.txt|25    <list> 2018-12-20 08:00:00 2018-12-21 08:00:00        25         29  machine_003
26: 2018-12-20 08:00:00|Monitor3.txt|26    <list> 2018-12-20 08:00:00 2018-12-21 08:00:00        26         13  machine_003
27: 2018-12-20 08:00:00|Monitor3.txt|27    <list> 2018-12-20 08:00:00 2018-12-21 08:00:00        27         30  machine_003
28: 2018-12-20 08:00:00|Monitor3.txt|28    <list> 2018-12-20 08:00:00 2018-12-21 08:00:00        28         14  machine_003
29: 2018-12-20 08:00:00|Monitor3.txt|29    <list> 2018-12-20 08:00:00 2018-12-21 08:00:00        29         31  machine_003
30: 2018-12-20 08:00:00|Monitor3.txt|30    <list> 2018-12-20 08:00:00 2018-12-21 08:00:00        30         15  machine_003
31: 2018-12-20 08:00:00|Monitor3.txt|31    <list> 2018-12-20 08:00:00 2018-12-21 08:00:00        31         32  machine_003
32: 2018-12-20 08:00:00|Monitor3.txt|32    <list> 2018-12-20 08:00:00 2018-12-21 08:00:00        32         16  machine_003
                                     id file_info      start_datetime       stop_datetime region_id Individual machine_name
    condition
 1:     Maple
 2:    Spruce
 3:     Maple
 4:    Spruce
 5:     Maple
 6:    Spruce
 7:     Maple
 8:    Spruce
 9:     Maple
10:    Spruce
11:     Maple
12:    Spruce
13:     Maple
14:    Spruce
15:     Maple
16:    Spruce
17:    Spruce
18:     Maple
19:    Spruce
20:     Maple
21:    Spruce
22:     Maple
23:    Spruce
24:     Maple
25:    Spruce
26:     Maple
27:    Spruce
28:     Maple
29:    Spruce
30:     Maple
31:    Spruce
32:     Maple
    condition

I do have data in the range I'm extracting, and yes, it should be every hour (this was an experiment that spanned three months). I realise the file you're seeing makes it look like I'm only considering one day; that's just the result of me trying to see if I could get any date range to work!

I'll see if I can load another data set...

qgeissmann commented 5 years ago

Ok, it could be due to the hourly sampling, which is a bit "exotic" :). If you still can't sort it out, I will try to reproduce it sometime next week, since you provided the data. In any case, let me know; I am sure we can find a solution!

lauraferg commented 5 years ago

Okay, thank you very much! The damr_tutorial data does load fine; it's just my own data that I can't load. I'll update if I get anywhere with it!

lauraferg commented 5 years ago

Hi there! I was wondering if by any chance you'd had time to try to reproduce the error I was encountering here. I haven't had any further luck resolving the issue on my end, unfortunately! Thanks!

qgeissmann commented 5 years ago

Hi, thanks for reminding me! I had a quick look, and it is actually the problem that was fixed by #19. The fix is just not on CRAN yet. If you install damr from GitHub (using devtools, see here), it should solve the problem. Meanwhile, I will push the next version to CRAN and update this issue when it is accepted! Let me know if it works. Sorry for the delay!
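For reference, installing the development version from GitHub amounts to something like the following minimal sketch, assuming devtools is already installed (the repository is rethomics/damr):

devtools::install_github("rethomics/damr")  # development version including the #19 fix
library(damr)                               # reload the package before retrying load_dam()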

lauraferg commented 5 years ago

No worries at all! Thanks so much for the quick reply and the help - I'll give that fix a try!

qgeissmann commented 5 years ago

No worries, I sent the package to CRAN a minute ago, so it should be reviewed and available within hours/days.

lauraferg commented 5 years ago

It worked! Thanks again!