Open muhammad-binkhalid opened 2 years ago
Hi @mbk7162 , Thanks. Can you please try using the dev version and let me know? Also, can you please share the code?
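In case it helps, a minimal sketch of installing the development version with remotes (the GitHub repository path asitav-sen/KoboconnectR is an assumption on my part):
install.packages("remotes") # Skip if remotes is already installed
remotes::install_github("asitav-sen/KoboconnectR") # Install the development version from GitHub (repository path assumed)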
A problem was found while using kobo_df_download; I get the following message:
Error in read.table(file = file, header = header, sep = sep, quote = quote, : no lines available in input
Hi @joelsonkos , This means that after downloading the file to temp, R is unable to read it. There could be many reasons. You may try changing the 'fsep' parameter from ";" to "," in case the separator is "," (a sketch of this appears further down in this comment). It can also happen due to extra quotes in the data. It is difficult to identify the exact reason without running the code for this specific case. However, there is a workaround.
Option 1
KoboconnectR::kobotools_kpi_data(assetid= "assetid", url="kobo.humanitarianresponse.info", uname="username", pwd="password")
This will download the data in JSON format. The main data is usually inside results.
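For example, a minimal sketch of turning that JSON into a data frame, assuming the returned object is a parsed list whose results element holds one flat named list per submission (dplyr::bind_rows is used here purely for illustration):
raw<-KoboconnectR::kobotools_kpi_data(assetid= "assetid", url="kobo.humanitarianresponse.info", uname="username", pwd="password") # Download submissions as parsed JSON
df<-dplyr::bind_rows(raw$results) # Stack the submissions into a data frame; the column set depends on the form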
Option 2
new_export_url<-KoboconnectR::kobo_export_create(url="kobo.humanitarianresponse.info", uname="", pwd="",
assetid="", type= "csv", all="false", lang="_default",
hierarchy="false", include_grp="true",grp_sep="/") # Create export
df<-httr::GET(new_export_url, httr::authenticate(user="userid", password ="password")) # Download
df<-httr::content(df, as = "raw") # Extract raw content (as = "raw" so writeBin receives a raw vector)
writeBin(df, "data.csv") # Write to a local file
read.csv("data.csv", sep=";")
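Alternatively, if you prefer to stay with kobo_df_download itself, a minimal sketch of retrying with the 'fsep' parameter set to "," (connection arguments as in the examples above; this is an illustration, not a guaranteed fix):
KoboconnectR::kobo_df_download(url="kobo.humanitarianresponse.info", uname="username", pwd="password", assetid="assetid", fsep=",") # Retry the direct download with "," as the field separator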
Check the encoding or change the 'type' or other parameters. Please let me know how it goes.
Hi, Thank you for creating this package. I am trying to use it to download data from kobotools into R and save it as xls later on. I want to be able to download the data in long format in R (maybe by creating separate dta files). But when I try to run the function kobo_df_download, it gives the following error: "Downloading: 7.9 kB Export instruction sent successfully. Waiting for result. Downloading: 7.9 kB Execution in Progress...Timeout[1] "export creation was not successful" NULL"
I have tried increasing the sleep time as well, but it does not solve the problem.
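For reference, roughly the kind of call I am running, with the sleep argument raised above its default (the argument name and its exact effect are my reading of the error messages, so treat this as indicative only):
KoboconnectR::kobo_df_download(url="kobo.humanitarianresponse.info", uname="username", pwd="password", assetid="assetid", sleep=30) # Same call as before, but allowing more time for the export to complete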