Open murphyke opened 8 years ago
I tried the data-models 32-bit process and it doesn't recognize my CSV files even though they are listed as type "Microsoft comma separated values file".
```
C:\oracle_output>data-models-packer.exe -out Colorado_082916.zip C:\oracle_output\CSV
Please provide site name: Colorado
Please provide common data model name: 2.3
Please provide common data model name: pedsnet
Please provide model version: 2.3
Please provide model version: 2.3.0
Please provide etl code URL: https://github.com/PEDSnet/Colorado/tree/master/ETL/CDMV2_3
2016/08/30 09:59:09 packer: error creating or verifying metadata file: non-csv file found: C:\oracle_output\CSV\CARE_SITE.CSV
```
Try using a ".csv" extension (lowercase).
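For anyone hitting the same error: the check is a simple case-sensitive extension comparison, so uppercase `.CSV` files need to be renamed. A minimal sketch of the renaming logic (an illustrative helper, not part of the packer itself):

```go
package main

import (
	"fmt"
	"path/filepath"
	"strings"
)

// lowerCSVExt returns the file name with an upper- or mixed-case ".CSV"
// extension lowered to ".csv", plus a flag indicating whether a rename
// is needed. Pair it with os.Rename over a directory listing to fix a
// whole folder of exports.
func lowerCSVExt(name string) (string, bool) {
	ext := filepath.Ext(name)
	if strings.EqualFold(ext, ".csv") && ext != ".csv" {
		return strings.TrimSuffix(name, ext) + ".csv", true
	}
	return name, false
}

func main() {
	for _, n := range []string{"CARE_SITE.CSV", "person.csv"} {
		fixed, changed := lowerCSVExt(n)
		fmt.Println(n, "->", fixed, changed)
	}
}
```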
Thanks, that seems to be the issue... rerunning now.
So the metadata file was created, but the zip file doesn't look right... it said it wasn't a valid zip file, and the file is ~37 GB while the ones I've done manually are around 7 GB.
Sorry for the trouble... This is clearly beta software. Can you post the exact command you are using?
Here's what I ran:
```
C:\oracle_output>data-models-packer.exe -out Colorado_082916.zip C:\oracle_output\CSV
Please provide site name: Colorado
Please provide common data model name: 2.3
Please provide common data model name: pedsnet
Please provide model version: 2.3
Please provide model version: 2.3.0
Please provide etl code URL: https://github.com/PEDSnet/Colorado/tree/master/ETL/CDMV2_3
```
That looks exactly like it should. Implementing the compression was a challenge and I'm not surprised that I got it wrong. I also don't have time to fix it right now, so I would say that you should zip it manually and send it over. At least you got a metadata file out of it?
I actually think the default format is .tar.gz if you don't specify the -comp
switch. However, even in that case, you won't be able to read the file with gunzip
due to a bug, although the packer should itself be able to unpack the file successfully. As for the huge file size difference, I'm not sure.
The packer will be replaced after this data cycle by a new tool, so we'll make sure any ideas for improvements make it into that tool.
Getting the metadata file created was a HUGE help, thanks! I'll zip it manually and send it over tomorrow.
Colorado requested this. It looks like the `gox`-based cross-compilation can be replaced by the native cross-compilation support available since Go 1.5.