big-data-europe / docker-hadoop-spark-workbench

[EXPERIMENTAL] This repo includes deployment instructions for running HDFS/Spark inside Docker containers. It also includes spark-notebook and the HDFS FileBrowser.

How to add a file from local file system? #54

Open alexeytochin opened 6 years ago

alexeytochin commented 6 years ago

I tried

docker exec -it namenode hadoop fs -put README.md /tmp/README.md

and got:

put: `README.md': No such file or directory

earthquakesan commented 6 years ago

Hi @Lehis, hadoop fs -put runs inside the container, so README.md is resolved against the container's filesystem, not your host. You need to mount your local folder into the docker container. For example:

docker run -v /local/folder/to/mount:/data --network mynetwork --env-file hadoop.env bde2020/hadoop-base:tag hdfs dfs -copyFromLocal /data/myfile /
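For instance, with the images from this repo you could copy a README.md from the current host directory into HDFS like this (the network name and image tag below are placeholders; check docker network ls and your docker-compose.yml for the actual values):

docker run --rm -v $(pwd):/data --network dockerhadoopsparkworkbench_default --env-file hadoop.env bde2020/hadoop-base:&lt;tag&gt; hdfs dfs -copyFromLocal /data/README.md /

The container exits after the copy, and the file ends up at /README.md in HDFS.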

If you want to use docker exec, then you first need to copy your README.md into your container:

docker cp README.md namenode:/data/README.md
docker exec -it namenode hadoop fs -put /data/README.md /tmp/README.md
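If /tmp does not yet exist in HDFS, create it first with hdfs dfs -mkdir -p /tmp. Afterwards you can verify the upload by listing the directory (hdfs dfs and hadoop fs are interchangeable for these commands):

docker exec -it namenode hdfs dfs -ls /tmp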