Default provisioning of Stream4Flow uses a Vagrant box (ubuntu/xenial64) that has been updated recently and no longer allows password SSH authentication. You need to log in to the server via vagrant ssh sparkMaster and upload your SSH key, or change the SSH configuration to allow password login (open /etc/ssh/sshd_config and set PasswordAuthentication yes).
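For reference, a minimal sketch of both options; the spark user and the 192.168.0.100 address come from this thread, while the authorized_keys path is an assumption to adjust to your setup:
# log in to the VM without a password
vagrant ssh sparkMaster
# option A: append your host's public key (the contents of your local ~/.ssh/id_rsa.pub);
# the target user and path are assumptions for the default deployment
echo "ssh-rsa AAAA... user@host" | sudo tee -a /home/spark/.ssh/authorized_keys
# option B: allow password logins instead, then restart the SSH service
sudo sed -i 's/^#\?PasswordAuthentication .*/PasswordAuthentication yes/' /etc/ssh/sshd_config
sudo service ssh restart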
Thanks for the question, we will add this information to the Readme file.
Hi,
I have tried as per your suggestions, and it has started to work. I'm really thankful for your timely reply.
I have run the sparkMaster and it's running. Softflowd is running as well, but graphs are not generated in the Stream4Flow web interface. I have tried for more than two days and I'm unable to identify the issue.
It would be really helpful if you could let me know how the results or graphs are generated and what the issue might be.
Thanks, Aakansha Soni
It looks like you are not running the analysis application. See https://github.com/CSIRT-MU/Stream4Flow#run-an-example-application-protocols_statistics. You can check whether it is running in the Spark Web Interface.
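As a quick check, a sketch assuming the default sparkMaster address used in this thread and the standard Spark master web UI port 8080 (adjust both if your deployment differs):
# the Spark master web UI lists registered applications; a running
# protocols_statistics job should show up in the output
curl -s http://192.168.0.100:8080 | grep -i protocols_statistics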
Hi,
I have run the application. I'm able to see the process and jobs in the Spark web interface, but I'm not able to see graphs in the Stream4Flow web interface.
Thanks, Aakansha Soni
Do you see any input data going into the application in the Spark Web Interface, in the application tab Streaming? [image: screenshot_20180514_145542] https://user-images.githubusercontent.com/11919446/39998626-ee969360-5786-11e8-89af-c45d6ff61866.png
Hi, I am getting a screen like this. Please help.
Regards, Aakansha Soni
Look into Kibana and check if you can see data there:
- consumer IP address, port 5601
- default address is http://192.168.0.3:5601/
- index name: spark-*
You should see something like this screenshot: [image: screenshot_20180514_154206] https://user-images.githubusercontent.com/11919446/40000942-6ea95eb0-578d-11e8-9d14-1dc1a8044819.png
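If Kibana reports that the index is not found, a quick way to confirm whether any spark-* indices exist at all is to query Elasticsearch directly; a sketch assuming Elasticsearch runs on the consumer (the same 192.168.0.3 address) at its default port 9200:
# list all indices known to Elasticsearch; spark-* entries should appear here
curl 'http://192.168.0.3:9200/_cat/indices?v'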
Hi, sorry for troubling you so much, but even after giving the index name as spark-*, it is giving an "index not found" error.
Regards, Aakansha Soni
Hi,
I think you are correct. I'm unable to add spark-* as the index pattern in my Kibana. Attaching all the required screenshots. Please let me know if I am missing anything. Thank you.
Regards Aakansha Soni
Hi, it is really hard to help you without any screenshots or error messages. Can you please check whether Kafka receives input and output data? Use the following commands on the producer and send us several lines of their output.
/opt/kafka/bin/kafka-console-consumer.sh --zookeeper producer:2181 --topic ipfix.entry
/opt/kafka/bin/kafka-console-consumer.sh --zookeeper producer:2181 --topic results.output
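If neither consumer prints anything, it can also help to confirm that the topics exist at all; a sketch using the standard Kafka tooling, assuming kafka-topics.sh sits in the same /opt/kafka/bin directory as the consumer script above:
# list all topics registered in Zookeeper; ipfix.entry and results.output should be present
/opt/kafka/bin/kafka-topics.sh --zookeeper producer:2181 --list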
Hi,
I ran the commands as you explained. Neither command provides any output. I have attached screenshots of both commands. Regards, Aakansha Soni
Hi,
We do not see any screenshots from you. Please put screenshots directly into the issue as I did (not via email). Please provide us with the following screenshots:
1) Screenshots of the outputs of the following commands, including errors:
/opt/kafka/bin/kafka-console-consumer.sh --zookeeper producer:2181 --topic ipfix.entry
/opt/kafka/bin/kafka-console-consumer.sh --zookeeper producer:2181 --topic results.output
2) Screenshot of the command used to start the Spark application.
3) Screenshot from the Spark Web Interface, application tab Streaming.
4) Screenshot of the command used to run Softflowd.
Without these screenshots, we unfortunately cannot help you, as we do not have sufficient information for troubleshooting.
I do not see any update. Closing the issue.
The screenshots are not available within the issue. Please put them directly into the issue as we did (not via email).
Hi,
Apologies for not being available for so many days. Attaching all the required screenshots.
The last screenshot is from after executing these commands:
/opt/kafka/bin/kafka-console-consumer.sh --zookeeper producer:2181 --topic ipfix.entry
/opt/kafka/bin/kafka-console-consumer.sh --zookeeper producer:2181 --topic results.output
Thank you.
I just deployed the producer only and performed the same commands as you, and everything works... Can you please check with tcpdump on the producer that the exported flows were received (see the sketch after the commands below)? Note that softflowd needs approximately 5 minutes to export flows (it shows "EXPIRED" messages). Also check the status of the IPFIXcol, Kafka, Zookeeper, and TCP normalizer services on the producer (they need to be active/running):
sudo service ipfixcol status
sudo service kafka-broker status
sudo service zookeeper status
sudo service tcpnormalizer@default status
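For the tcpdump check mentioned above, a minimal sketch; the interface name and port are assumptions, so use the interface on which the producer receives flows and the port configured as the softflowd export target (IPFIX commonly uses 4739, NetFlow exports often use 2055):
# watch for incoming flow-export packets on the producer; replace the
# interface and port with the ones used in your softflowd/IPFIXcol setup
sudo tcpdump -i eth0 -nn udp port 4739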
I do not see any update. Closing the issue.
I have installed Vagrant and Ansible. I have done vagrant up and provisioning. All four VMs are up and running, but when I try to SSH to spark@192.168.0.100, I get the error "Permission denied (publickey)".
Could anyone help me with this issue?