ccri / cloud-local

Install script for a local 1 node cloud...no excuses folks

Hadoop download not working #59

Closed ericstalbot closed 7 years ago

ericstalbot commented 7 years ago

It appears that many (all?) mirrors do not have Hadoop 2.7.3 available.
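For reference, releases that have rotated off the mirrors are usually still kept on the Apache archive, so a manual fallback along these lines may work (the exact archive path is an assumption, verify it before relying on it):

```bash
# Possible manual workaround: fetch the release directly from the Apache
# archive, which retains old versions after mirrors drop them.
# The exact URL layout is an assumption -- check it before scripting this.
wget -c -P pkg https://archive.apache.org/dist/hadoop/common/hadoop-2.7.3/hadoop-2.7.3.tar.gz
```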

djr7m commented 7 years ago

I should really hack in a shared download area on Synology... at least that way local LAN work would get instant downloads and coverage for version gaps like this. Then we can always add a real mirror or web-cache-proxy later. (I wonder if it was pulled because of the Apache Struts security problem or something?)

djr7m commented 7 years ago

see also issue #52

djr7m commented 7 years ago

ok @ericstalbot here is an immediate hack to help... change your "cloud-local/pkg" folder to be a symbolic link into /net/synds1/volume2/projects2/cloud-local/packages:

    cd cloud-local
    rm pkg
    ln -s /net/synds1/volume2/projects2/cloud-local/packages pkg

This should allow the "wget -c" download to see the packages already in place and not bother downloading them. Also, anything new you download should land in there, saving others the download as well.

At the moment I have removed write permissions on the tarballs I put in place just to make sure no one purges them by accident. Next step is to do some actual cloud-local script mods to avoid that type of problem in the future.
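A minimal sketch of the kind of script change this could mean, assuming a shared-cache location passed in via an environment variable (the `CL_PKG_CACHE` variable and `fetch_pkg` helper are hypothetical names, not existing cloud-local code):

```bash
# Hypothetical helper for cloud-local: look in a shared package cache
# before hitting the network. CL_PKG_CACHE and fetch_pkg are assumptions,
# not names that exist in the current scripts.
fetch_pkg() {
  local url="$1"
  local tarball
  tarball="$(basename "$url")"

  if [[ -n "${CL_PKG_CACHE:-}" && -f "${CL_PKG_CACHE}/${tarball}" ]]; then
    # Reuse the copy already sitting in the shared cache.
    cp -n "${CL_PKG_CACHE}/${tarball}" "pkg/${tarball}"
  else
    # Fall back to the normal resumable download.
    wget -c -P pkg "$url"
    # Seed the cache so the next person skips the download entirely.
    if [[ -n "${CL_PKG_CACHE:-}" ]]; then
      cp -n "pkg/${tarball}" "${CL_PKG_CACHE}/"
    fi
  fi
}
```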

djr7m commented 7 years ago

The default Hadoop version was changed in commit df61481af35d82383cd3bb7d31c6d815f006fea9. cloud-local is also now more proxy aware as of a8d7608bb862e4306f3710484f401094b3117f4f and 5cc425ec4cd81cea5dc6dcfef1f2adf9f1e5c97d.

CCRi now has an internal caching proxy, squid.ccri.com, on port 3128.
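For anyone pointing downloads at that proxy by hand, the standard proxy environment variables that wget and curl honor should be enough; a sketch, assuming the host and port above (whether cloud-local picks these up itself is what the proxy-awareness commits address):

```bash
# Route command-line downloads through the internal caching proxy.
# These are the standard proxy environment variables; host and port
# come from the comment above.
export http_proxy=http://squid.ccri.com:3128
export https_proxy=http://squid.ccri.com:3128
```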