How to know the public IP of master and slaves of EC2 machines launched by flintrock?

Closed: xiandong79 closed this issue 6 years ago.

In spark-ec2, we could get the public IP of the EC2 machines from configuration like the following:
[ -z "$HADOOP_HOME" ] && export HADOOP_HOME=/root/ephemeral-hdfs/
# base dir for DataSet
HDFS_URL="hdfs://${master}:9000"
SPARK_HADOOP_FS_LOCAL_BLOCK_SIZE=536870912
# DATA_HDFS="hdfs://${master}:9000/SparkBench", "file:///home/`whoami`/SparkBench"
DATA_HDFS="hdfs://${master}:9000/SparkBench"
#Local dataset optional
#DATASET_DIR=/home/`whoami`/SparkBench/dataset
SPARK_VERSION=1.6.1 #1.4.0
[ -z "$SPARK_HOME" ] && export SPARK_HOME=/root/spark/
flintrock describe <cluster_name> will give you the DNS names of all the nodes in the cluster. From there it's pretty easy to get the IP addresses. You can also find similar information under spark/conf and hadoop/conf when you flintrock login <cluster_name>.
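For example, a minimal sketch run from the local machine (the cluster name my-cluster and the DNS name below are placeholders): EC2 public DNS names already encode the public IP (ec2-54-12-34-56.compute-1.amazonaws.com maps to 54.12.34.56), and a plain DNS lookup from outside AWS resolves to that same address.

# List the cluster's nodes; the output includes the master and slave
# public DNS names ("my-cluster" is a placeholder cluster name).
flintrock describe my-cluster

# Resolve one of those DNS names to its public IP (from outside AWS),
# e.g. with dig; the placeholder name below maps to 54.12.34.56.
dig +short ec2-54-12-34-56.compute-1.amazonaws.com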