Closed madiot closed 5 years ago
Hi @madiot, sorry about the late reply.
You simply have to add it to your blueprint, like any other service, under the host group of your choice in the `blueprint_dynamic` variable:
- LIVY2_SERVER
It's not by default in the examples, sorry.
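For example, a minimal sketch of how that could look in the `blueprint_dynamic` variable (the host group name `hdp-master` and the other services listed are purely illustrative; keep whatever services your host group already has and just append `LIVY2_SERVER`):

```yaml
blueprint_dynamic:
  - host_group: "hdp-master"          # hypothetical host group name
    clients: ['ZOOKEEPER_CLIENT', 'HDFS_CLIENT', 'YARN_CLIENT', 'SPARK2_CLIENT']
    services:
      - NAMENODE
      - RESOURCEMANAGER
      - SPARK2_JOBHISTORYSERVER
      - LIVY2_SERVER                  # this is the Livy Server for Spark2
```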
About those weird paths: that's a bit annoying; Ambari does that sometimes when it "automatically" discovers the paths.
You can specify static paths in the blueprint, under `hdfs-site`
and `yarn-site`,
like in these examples:
https://github.com/objectrocket/ansible-hadoop/blob/master/playbooks/roles/ambari-server/templates/blueprint-multi-node-1-master.j2#L106
https://github.com/objectrocket/ansible-hadoop/blob/master/playbooks/roles/ambari-server/templates/blueprint-multi-node-1-master.j2#L315
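For reference, the relevant blueprint fragment could look roughly like this (a sketch only; the directory paths shown are illustrative examples, so adjust them to your actual mount points):

```json
"configurations": [
  {
    "hdfs-site": {
      "dfs.namenode.name.dir": "/hadoop/hdfs/namenode",
      "dfs.datanode.data.dir": "/hadoop/hdfs/data"
    }
  },
  {
    "yarn-site": {
      "yarn.nodemanager.local-dirs": "/hadoop/yarn/local",
      "yarn.nodemanager.log-dirs": "/hadoop/yarn/log"
    }
  }
]
```

Pinning these properties in the blueprint prevents Ambari from auto-discovering and filling in unexpected directories during deployment.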
Hi @madiot, do you need further help on this, or can we close it?
Specifying the paths in the blueprint is not yet a feature I've added, but it will be worked on. Also check the good discussion here: https://github.com/hortonworks/ansible-hortonworks/pull/72
Please reopen it if needed.
Hello,
We used this Ansible script to install an on-premise (static) HDP 2.6.5 cluster with Spark2 only (no Spark). It all went well, but in the Spark2 service we noticed that no "Livy Server for Spark2" was installed, even though the livy2_2_6_5_0_292 system package is installed.
As a workaround, we installed the Spark service from Ambari and fixed the issue where the HDFS and YARN configs got polluted by some unexpected paths, such as:
HDFS's namenode directory
YARN's yarn.nodemanager.local-dirs
Anyway, after manually fixing the above paths from Ambari, Spark/Livy was fine. Then, when reinstalling Spark2 from Ambari, we were presented with a choice of where to put the Livy for Spark2 server, and none of the NN/DN nodes was selected. We opted to put it on one of the NNs, and eventually it got set up correctly.
The overall question we have is: how can we specify in the Ansible scripts and/or blueprints that the Livy server for Spark or Spark2 should be installed?
Looking forward to your feedback.
Best regards