MathewBiddle opened 4 years ago
$ nccopy -d1 upper_ches_his.nc upper_ches_his_d1.nc
$ nccopy -d1 -s upper_ches_his.nc upper_ches_his_d1s.nc
$ gzip -k upper_ches_his_d1.nc
$ ls -lh
-rwxrw-r--@ 1 mbiddle staff 18G Jul 17 2019 upper_ches_his.nc*
-rw-r--r-- 1 mbiddle staff 6.7G Aug 6 21:48 upper_ches_his_d1.nc
-rw-r--r-- 1 mbiddle staff 6.7G Aug 6 21:55 upper_ches_his_d1s.nc
-rw-r--r-- 1 mbiddle staff 6.7G Aug 6 21:48 upper_ches_his_d1.nc.gz
Once I compressed all the data, we're looking at:
data$ du -sh *
1.8M CBIBS_NCEI
28K Flats_wind_obs_2013
13G Full_20110719T23_20111101_final
13G Full_20110719T23_20111101_final_noveg
72K SWAN_20130705_20130715_FRICTION_NOVEG_30SEC_KOMAN_pt4+Bathy
or
$ du -sh data/
25G data/
$ tar -zcvf data.tar.gz data/
a data
a data/.DS_Store
a data/CBIBS_NCEI
a data/Full_20110719T23_20111101_final_noveg
a data/Flats_wind_obs_2013
a data/SWAN_20130705_20130715_FRICTION_NOVEG_30SEC_KOMAN_pt4+Bathy
a data/Full_20110719T23_20111101_final
a data/Full_20110719T23_20111101_final/mid_water_currents.csv
a data/Full_20110719T23_20111101_final/tripod_wave.pts
a data/Full_20110719T23_20111101_final/upper_ches_his_d1.nc
a data/Full_20110719T23_20111101_final/upper_ches_bry_d1.nc
a data/Full_20110719T23_20111101_final/upper_ches_avg_d1.nc
a data/Full_20110719T23_20111101_final/river_frc_d1.nc
a data/SWAN_20130705_20130715_FRICTION_NOVEG_30SEC_KOMAN_pt4+Bathy/tripod_wave.pts
a data/Flats_wind_obs_2013/sftripod1_advo_diwasp_MKS_LWT_lowpass_results.mat
a data/Full_20110719T23_20111101_final_noveg/mid_water_currents.csv
a data/Full_20110719T23_20111101_final_noveg/tripod_wave.pts
a data/Full_20110719T23_20111101_final_noveg/upper_ches_his_d1.nc
a data/Full_20110719T23_20111101_final_noveg/upper_ches_avg_d1.nc
a data/Full_20110719T23_20111101_final_noveg/river_frc_d1.nc
a data/CBIBS_NCEI/S_2011.nc
$
$ du -sh data.tar.gz
25G data.tar.gz
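Same story as the gzip test above: the netCDF files inside are already deflated, so `tar -z` can't shrink the 25G any further. If the goal is just to move the archive around in pieces, `split` will chop it into chunks that `cat` reassembles byte-for-byte. Sketch on a small dummy file; for the real archive it would be something like `split -b 90m data.tar.gz data.tar.gz.part_` to keep each piece under GitHub's 100MB cap:

```shell
# Split a file into fixed-size chunks and verify a lossless round trip.
head -c 1048576 /dev/urandom > big.bin   # 1 MiB stand-in for data.tar.gz
split -b 256k big.bin big.bin.part_      # four 256 KiB pieces
cat big.bin.part_* > rejoined.bin        # shell glob sorts parts in order
cmp big.bin rejoined.bin && echo "round trip OK"
```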
Not sure what to do now... thumb drive, maybe?
Individual files are limited to 100MB! See https://docs.github.com/en/github/managing-large-files/working-with-large-files
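For the per-file cap specifically, Git LFS is the usual route (LFS storage and bandwidth have their own quotas, though). All `git lfs track "*.nc"` does is record a filter rule in `.gitattributes`; written by hand below so the demo runs even without git-lfs installed:

```shell
# Record the rule Git LFS would write, then confirm git applies it to
# .nc paths. With git-lfs installed, `git lfs install` + `git lfs track`
# would do the same and handle uploads on push.
git init -q lfs-demo && cd lfs-demo
printf '%s\n' '*.nc filter=lfs diff=lfs merge=lfs -text' > .gitattributes
git check-attr filter -- upper_ches_his.nc   # reports: filter: lfs
```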
The raw history files are 18.94GB. Try compression alone, then shuffle + compression, then shuffle + compression followed by external compression (gzip?).