faucetsdn / poseidon

Poseidon is a python-based application that leverages software defined networks (SDN) to acquire and then feed network traffic to a number of machine learning techniques. The machine learning algorithms classify and predict the type of device.
Apache License 2.0

Build Not Finishing After Docker Containers Removed #622

Closed scottkelso closed 6 years ago

scottkelso commented 6 years ago

I initially got everything running, but it didn't seem to pick up the environment variables, as I kept seeing the error below repeated.

ERROR - UpdateSwitchState:228 - Could not establish connection to https://CONTROLLER_IP:8443/api/v1/ because 'NoneType' object has no attribute 'get_endpoints'.

Notice that https://CONTROLLER_IP:8443/api/v1/ looks like it's coming from .plugin_config.yml instead of the controller_uri from the parameters I exported...

export controller_uri=143.117.69.165
export controller_type=faucet
export controller_log_file=/var/log/faucet/faucet.log
export controller_config_file=/etc/faucet/faucet.yaml
export controller_mirror_ports='{"sw1":3}'
export collector_nic=enp0s25 
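
As a quick, generic check that these variables are visible in the shell that later runs ./helpers/run (the grep pattern here is only illustrative):

env | grep -E 'controller_|collector_nic'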

I decided to stop and remove all of the containers (with the following script) and then re-run ./helpers/run, to double-check that I had definitely exported the correct parameters.

echo "Stopping and removing all docker instances...."
for LOOP in `docker ps | awk '{print $1}' | sed '/CONTAINER/d'`
do
docker stop $LOOP
docker rm $LOOP
done

Ever since I did this, ./helpers/run spins up the containers but then seems to get stuck, outputting only a fraction of what it did before, with not much error output to go on.

./helpers/run 
ea8b574ae2a68de19ed5b6fd92796d444c0fe89138507b62b5cbc5c802b3ab4d
waiting for required containers to build and start (this might take a little while)...done.
2018-06-18T14:23:34+00:00 172.17.0.1 core[1371]: 1:C 18 Jun 13:23:34.693 # oO0OoO0OoO0Oo Redis is starting oO0OoO0OoO0Oo
2018-06-18T14:23:34+00:00 172.17.0.1 core[1371]: 1:C 18 Jun 13:23:34.693 # Redis version=4.0.9, bits=64, commit=00000000, modified=0, pid=1, just started
2018-06-18T14:23:34+00:00 172.17.0.1 core[1371]: 1:C 18 Jun 13:23:34.693 # Warning: no config file specified, using the default config. In order to specify a config file use redis-server /path/to/redis.conf
2018-06-18T14:23:34+00:00 172.17.0.1 core[1371]: 1:M 18 Jun 13:23:34.694 * Running mode=standalone, port=6379.
2018-06-18T14:23:34+00:00 172.17.0.1 core[1371]: 1:M 18 Jun 13:23:34.694 # WARNING: The TCP backlog setting of 511 cannot be enforced because /proc/sys/net/core/somaxconn is set to the lower value of 128.
2018-06-18T14:23:34+00:00 172.17.0.1 core[1371]: 1:M 18 Jun 13:23:34.694 # Server initialized
2018-06-18T14:23:34+00:00 172.17.0.1 core[1371]: 1:M 18 Jun 13:23:34.694 # WARNING overcommit_memory is set to 0! Background save may fail under low memory condition. To fix this issue add 'vm.overcommit_memory = 1' to /etc/sysctl.conf and then reboot or run the command 'sysctl vm.overcommit_memory=1' for this to take effect.
2018-06-18T14:23:34+00:00 172.17.0.1 core[1371]: 1:M 18 Jun 13:23:34.694 # WARNING you have Transparent Huge Pages (THP) support enabled in your kernel. This will create latency and memory usage issues with Redis. To fix this issue run the command 'echo never > /sys/kernel/mm/transparent_hugepage/enabled' as root, and add it to your /etc/rc.local in order to retain the setting after a reboot. Redis must be restarted after THP is disabled.
2018-06-18T14:23:34+00:00 172.17.0.1 core[1371]: 1:M 18 Jun 13:23:34.694 * Ready to accept connections
2018-06-18T14:23:38+00:00 172.17.0.1 core[1371]: + [[ -z redis ]]
2018-06-18T14:23:38+00:00 172.17.0.1 core[1371]: + [[ -z  ]]
2018-06-18T14:23:38+00:00 172.17.0.1 core[1371]: REMOTE_REDIS_PSWD not set. Please SET
2018-06-18T14:23:38+00:00 172.17.0.1 core[1371]: REMOTE_REDIS_HOST=redis REMOTE_REDIS_PORT=6379
2018-06-18T14:23:38+00:00 172.17.0.1 core[1371]: + export REMOTE_REDIS_PORT=6379
2018-06-18T14:23:38+00:00 172.17.0.1 core[1371]: + [[ -z  ]]
2018-06-18T14:23:38+00:00 172.17.0.1 core[1371]: + echo REMOTE_REDIS_PSWD not set. Please SET
2018-06-18T14:23:38+00:00 172.17.0.1 core[1371]: + export REMOTE_REDIS_PSWD=
2018-06-18T14:23:38+00:00 172.17.0.1 core[1371]: + [[ -z  ]]
2018-06-18T14:23:38+00:00 172.17.0.1 core[1371]: + export DASH_PREFIX=/rq
2018-06-18T14:23:38+00:00 172.17.0.1 core[1371]: + export RQ_DASHBOARD_SETTINGS=/rq_dash_settings.py
2018-06-18T14:23:38+00:00 172.17.0.1 core[1371]: + echo REMOTE_REDIS_HOST=redis REMOTE_REDIS_PORT=6379
2018-06-18T14:23:38+00:00 172.17.0.1 core[1371]: + rq-dashboard
2018-06-18T14:23:38+00:00 172.17.0.1 core[1371]: RQ Dashboard, version 0.3.3
2018-06-18T14:23:38+00:00 172.17.0.1 core[1371]:  * Serving Flask app "rq_dashboard.app" (lazy loading)
2018-06-18T14:23:38+00:00 172.17.0.1 core[1371]:  * Environment: production
2018-06-18T14:23:38+00:00 172.17.0.1 core[1371]:    WARNING: Do not use the development server in a production environment.
2018-06-18T14:23:38+00:00 172.17.0.1 core[1371]:    Use a production WSGI server instead.
2018-06-18T14:23:38+00:00 172.17.0.1 core[1371]:  * Debug mode: off
2018-06-18T14:23:38+00:00 172.17.0.1 core[1371]:  * Running on http://0.0.0.0:9181/ (Press CTRL+C to quit)
2018-06-18T14:23:42+00:00 172.17.0.1 core[1371]: 13:23:42 RQ worker 'rq:worker:6f783c34f86e.1' started, version 0.11.0
2018-06-18T14:23:42+00:00 172.17.0.1 core[1371]: 13:23:42 *** Listening on default...
2018-06-18T14:23:42+00:00 172.17.0.1 core[1371]: 13:23:42 Cleaning registries for queue: default
2018-06-18T14:23:46+00:00 172.17.0.1 core[1371]: 13:23:46 RQ worker 'rq:worker:09f35e4b6da6.1' started, version 0.11.0
2018-06-18T14:23:46+00:00 172.17.0.1 core[1371]: 13:23:46 *** Listening on default...
2018-06-18T14:23:46+00:00 172.17.0.1 core[1371]: 13:23:46 Cleaning registries for queue: default
2018-06-18T14:23:50+00:00 172.17.0.1 core[1371]: 13:23:50 RQ worker 'rq:worker:bc4fbfa61eff.1' started, version 0.11.0
2018-06-18T14:23:50+00:00 172.17.0.1 core[1371]: 13:23:50 *** Listening on default...
2018-06-18T14:23:50+00:00 172.17.0.1 core[1371]: 13:23:50 Cleaning registries for queue: default
2018-06-18T14:23:54+00:00 172.17.0.1 core[1371]: 13:23:54 RQ worker 'rq:worker:dc702fc3f5c5.1' started, version 0.11.0
2018-06-18T14:23:54+00:00 172.17.0.1 core[1371]: 13:23:54 *** Listening on default...
2018-06-18T14:23:54+00:00 172.17.0.1 core[1371]: 13:23:54 Cleaning registries for queue: default
2018-06-18T14:24:02+00:00 172.17.0.1 core[1371]: [2018-06-18 13:24:02 +0000] [8] [INFO] Starting gunicorn 19.8.1
2018-06-18T14:24:02+00:00 172.17.0.1 core[1371]: [2018-06-18 13:24:02 +0000] [8] [INFO] Listening at: http://0.0.0.0:8080 (8)
2018-06-18T14:24:02+00:00 172.17.0.1 core[1371]: [2018-06-18 13:24:02 +0000] [8] [INFO] Using worker: gevent
2018-06-18T14:24:02+00:00 172.17.0.1 core[1371]: [2018-06-18 13:24:02 +0000] [11] [INFO] Booting worker with pid: 11
2018-06-18T14:24:02+00:00 172.17.0.1 core[1371]: [2018-06-18 13:24:02 +0000] [12] [INFO] Booting worker with pid: 12
2018-06-18T14:24:02+00:00 172.17.0.1 core[1371]: [2018-06-18 13:24:02 +0000] [13] [INFO] Booting worker with pid: 13
2018-06-18T14:24:02+00:00 172.17.0.1 core[1371]: /usr/local/lib/python3.6/site-packages/gunicorn/workers/ggevent.py:65: MonkeyPatchWarning: Monkey-patching ssl after ssl has already been imported may lead to errors, including RecursionError on Python 3.6. Please monkey-patch earlier. See https://github.com/gevent/gevent/issues/1016
2018-06-18T14:24:02+00:00 172.17.0.1 core[1371]:   monkey.patch_all(subprocess=True)
2018-06-18T14:24:02+00:00 172.17.0.1 core[1371]: /usr/local/lib/python3.6/site-packages/gunicorn/workers/ggevent.py:65: MonkeyPatchWarning: Monkey-patching ssl after ssl has already been imported may lead to errors, including RecursionError on Python 3.6. Please monkey-patch earlier. See https://github.com/gevent/gevent/issues/1016
2018-06-18T14:24:02+00:00 172.17.0.1 core[1371]:   monkey.patch_all(subprocess=True)
2018-06-18T14:24:02+00:00 172.17.0.1 core[1371]: /usr/local/lib/python3.6/site-packages/gunicorn/workers/ggevent.py:65: MonkeyPatchWarning: Monkey-patching ssl after ssl has already been imported may lead to errors, including RecursionError on Python 3.6. Please monkey-patch earlier. See https://github.com/gevent/gevent/issues/1016
2018-06-18T14:24:02+00:00 172.17.0.1 core[1371]:   monkey.patch_all(subprocess=True)
2018-06-18T14:24:02+00:00 172.17.0.1 core[1371]: [2018-06-18 13:24:02 +0000] [14] [INFO] Booting worker with pid: 14
2018-06-18T14:24:02+00:00 172.17.0.1 core[1371]: /usr/local/lib/python3.6/site-packages/gunicorn/workers/ggevent.py:65: MonkeyPatchWarning: Monkey-patching ssl after ssl has already been imported may lead to errors, including RecursionError on Python 3.6. Please monkey-patch earlier. See https://github.com/gevent/gevent/issues/1016
2018-06-18T14:24:02+00:00 172.17.0.1 core[1371]:   monkey.patch_all(subprocess=True)
2018-06-18T14:24:02+00:00 172.17.0.1 plugin[1371]: [2018-06-18 13:24:02 +0000] [1] [INFO] Starting gunicorn 19.8.1
2018-06-18T14:24:02+00:00 172.17.0.1 plugin[1371]: [2018-06-18 13:24:02 +0000] [1] [INFO] Listening at: http://0.0.0.0:8000 (1)
2018-06-18T14:24:02+00:00 172.17.0.1 plugin[1371]: [2018-06-18 13:24:02 +0000] [1] [INFO] Using worker: gevent
2018-06-18T14:24:02+00:00 172.17.0.1 plugin[1371]: [2018-06-18 13:24:02 +0000] [9] [INFO] Booting worker with pid: 9
2018-06-18T14:24:02+00:00 172.17.0.1 plugin[1371]: [2018-06-18 13:24:02 +0000] [10] [INFO] Booting worker with pid: 10
2018-06-18T14:24:02+00:00 172.17.0.1 plugin[1371]: [2018-06-18 13:24:02 +0000] [11] [INFO] Booting worker with pid: 11
2018-06-18T14:24:02+00:00 172.17.0.1 plugin[1371]: /usr/local/lib/python3.6/site-packages/gunicorn/workers/ggevent.py:65: MonkeyPatchWarning: Monkey-patching ssl after ssl has already been imported may lead to errors, including RecursionError on Python 3.6. Please monkey-patch earlier. See https://github.com/gevent/gevent/issues/1016
2018-06-18T14:24:02+00:00 172.17.0.1 plugin[1371]:   monkey.patch_all(subprocess=True)
2018-06-18T14:24:02+00:00 172.17.0.1 plugin[1371]: /usr/local/lib/python3.6/site-packages/gunicorn/workers/ggevent.py:65: MonkeyPatchWarning: Monkey-patching ssl after ssl has already been imported may lead to errors, including RecursionError on Python 3.6. Please monkey-patch earlier. See https://github.com/gevent/gevent/issues/1016
2018-06-18T14:24:02+00:00 172.17.0.1 plugin[1371]:   monkey.patch_all(subprocess=True)
2018-06-18T14:24:02+00:00 172.17.0.1 plugin[1371]: /usr/local/lib/python3.6/site-packages/gunicorn/workers/ggevent.py:65: MonkeyPatchWarning: Monkey-patching ssl after ssl has already been imported may lead to errors, including RecursionError on Python 3.6. Please monkey-patch earlier. See https://github.com/gevent/gevent/issues/1016
2018-06-18T14:24:02+00:00 172.17.0.1 plugin[1371]:   monkey.patch_all(subprocess=True)
2018-06-18T14:24:02+00:00 172.17.0.1 plugin[1371]: [2018-06-18 13:24:02 +0000] [23] [INFO] Booting worker with pid: 23
2018-06-18T14:24:02+00:00 172.17.0.1 plugin[1371]: /usr/local/lib/python3.6/site-packages/gunicorn/workers/ggevent.py:65: MonkeyPatchWarning: Monkey-patching ssl after ssl has already been imported may lead to errors, including RecursionError on Python 3.6. Please monkey-patch earlier. See https://github.com/gevent/gevent/issues/1016
2018-06-18T14:24:02+00:00 172.17.0.1 plugin[1371]:   monkey.patch_all(subprocess=True)
2018-06-18T14:24:06+00:00 172.17.0.1 plugin[1371]: INFO: Accepting connections at http://localhost:5000

I'm currently running Ubuntu 16.04, hosting Faucet locally and a Mininet Docker container, all of which I am trying to connect Poseidon to. I'm also seeing the above error on Ubuntu 18.04. Any advice would be much appreciated. Thanks.

cglewis commented 6 years ago

If you run docker ps -a, what do you see?

The .plugin_config.yml overrides config/poseidon.config, and the environment variables override .plugin_config.yml.

It looks like the Poseidon container isn't running, judging from your output, which is why I'm curious what your docker ps -a shows.

The exports you have above look right to me. I'd recommend pulling the latest from the poseidon repo - I recently added the container name/id to the logging, so it's easier to see which plugins or core tools are acting up in the logs.
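
As a side note, one way to verify which configuration values the Poseidon container actually picked up is to dump its environment. This is a generic Docker check, with <poseidon-container> standing in for the container name or ID reported by docker ps, and the grep pattern is only illustrative:

docker exec <poseidon-container> env | grep -iE 'controller|collector'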

scottkelso commented 6 years ago

Thanks for your quick response. docker ps -a shows the following...

CONTAINER ID        IMAGE                                        COMMAND                  CREATED             STATUS                  PORTS                                                                                        NAMES
f98f55d8c553        cyberreboot-poseidon--poseidon-master:HEAD   "tini -- /usr/bin/py…"   About an hour ago   Up About an hour        0.0.0.0:9304->9304/tcp                                                                       cyberreboot-poseidon--poseidon-master-HEAD
a3089d72fd34        cyberreboot/vent-file-drop:master            "python3 /file-drop/…"   About an hour ago   Up About an hour                                                                                                     cyberreboot-vent-file-drop-master
2efe54ec9237        cyberreboot-crviz-master:HEAD                "serve -s build -l 5…"   About an hour ago   Up About an hour        0.0.0.0:32791->5000/tcp                                                                      cyberreboot-crviz-master-HEAD
0abd9b20684d        cyberreboot-poseidon-api-master:HEAD         "gunicorn -b :8000 -…"   About an hour ago   Up About an hour        0.0.0.0:32790->8000/tcp                                                                      cyberreboot-poseidon-api-master-HEAD
8e8c38f96a2f        cyberreboot/vent-network-tap:master          "/network-tap/startu…"   About an hour ago   Up About an hour        0.0.0.0:32789->8080/tcp                                                                      cyberreboot-vent-network-tap-master
dc702fc3f5c5        cyberreboot/vent-rq-worker:master            "rqworker -c settings"   About an hour ago   Up About an hour                                                                                                     cyberreboot-vent-rq-worker-master4
bc4fbfa61eff        cyberreboot/vent-rq-worker:master            "rqworker -c settings"   About an hour ago   Up About an hour                                                                                                     cyberreboot-vent-rq-worker-master3
09f35e4b6da6        cyberreboot/vent-rq-worker:master            "rqworker -c settings"   About an hour ago   Up About an hour                                                                                                     cyberreboot-vent-rq-worker-master2
6f783c34f86e        cyberreboot/vent-rq-worker:master            "rqworker -c settings"   About an hour ago   Up About an hour                                                                                                     cyberreboot-vent-rq-worker-master
20a9033ac288        cyberreboot/vent-rq-dashboard:master         "/run.sh"                About an hour ago   Up About an hour        0.0.0.0:32788->9181/tcp                                                                      cyberreboot-vent-rq-dashboard-master
31c931b19311        cyberreboot/vent-redis:master                "docker-entrypoint.s…"   About an hour ago   Up About an hour        6379/tcp                                                                                     cyberreboot-vent-redis-master
6cd63889a26a        cyberreboot/vent-syslog:master               "/usr/sbin/syslog-ng…"   About an hour ago   Up About an hour        0.0.0.0:514->514/tcp                                                                         cyberreboot-vent-syslog-master
094215ff3a07        cyberreboot/vent-rabbitmq:master             "docker-entrypoint.s…"   About an hour ago   Up About an hour        4369/tcp, 5671/tcp, 15671/tcp, 25672/tcp, 0.0.0.0:5672->5672/tcp, 0.0.0.0:32787->15672/tcp   cyberreboot-vent-rabbitmq-master
ea8b574ae2a6        cyberreboot/vent                             "vent "                  About an hour ago   Up About an hour                                                                                                     vent
5f98c9182239        iwaseyusuke/mininet                          "/ENTRYPOINT.sh"         4 hours ago         Up 4 hours              6633/tcp, 6640/tcp, 6653/tcp, 0.0.0.0:52022->22/tcp                                          angry_jang
6fbaeab5bda7        iwaseyusuke/mininet                          "/ENTRYPOINT.sh"         2 days ago          Exited (0) 2 days ago                                                                                                clever_goldstine
6667fbb98249        poseidon                                     "tini -- /usr/bin/py…"   2 days ago          Exited (1) 2 days ago                                                                                                compassionate_rosalind
6b1463b41049        37caddda6c97                                 "tini -- /usr/bin/py…"   2 days ago          Exited (1) 2 days ago                                                                                                boring_bose
e8a1a36ecc76        37caddda6c97                                 "tini -- /usr/bin/py…"   2 days ago          Exited (1) 2 days ago                                                                                                boring_ride
a7a81275fdf8        37caddda6c97                                 "tini -- /usr/bin/py…"   2 days ago          Exited (1) 2 days ago                                                                                                pedantic_lamarr

I see what you mean. The Poseidon image hasn't been spun up. I will try another clean pull from the repository now.

cglewis commented 6 years ago

Actually, it looks like it's up and running:

f98f55d8c553        cyberreboot-poseidon--poseidon-master:HEAD   "tini -- /usr/bin/py…"   About an hour ago   Up About an hour        0.0.0.0:9304->9304/tcp                                                                       cyberreboot-poseidon--poseidon-master-HEAD

I'm guessing it's waiting to connect to Faucet?

cglewis commented 6 years ago

Although I still would have expected it to output some setup information, which I don't see in your logs. Strange. Let me try to reproduce on my end.

j543453 commented 6 years ago

What are the main indicators that Poseidon is up and running? @cglewis

My logs are as follows:

waiting for required containers to build and start (this might take a little while)...done.
2018-06-18T14:55:29+00:00 172.17.0.1 core[1473]: 1:C 18 Jun 13:55:29.159 # oO0OoO0OoO0Oo Redis is starting oO0OoO0OoO0Oo
2018-06-18T14:55:29+00:00 172.17.0.1 core[1473]: 1:C 18 Jun 13:55:29.159 # Redis version=4.0.9, bits=64, commit=00000000, modified=0, pid=1, just started
2018-06-18T14:55:29+00:00 172.17.0.1 core[1473]: 1:C 18 Jun 13:55:29.159 # Warning: no config file specified, using the default config. In order to specify a config file use redis-server /path/to/redis.conf
2018-06-18T14:55:29+00:00 172.17.0.1 core[1473]: 1:M 18 Jun 13:55:29.160 * Running mode=standalone, port=6379.
2018-06-18T14:55:29+00:00 172.17.0.1 core[1473]: 1:M 18 Jun 13:55:29.160 # WARNING: The TCP backlog setting of 511 cannot be enforced because /proc/sys/net/core/somaxconn is set to the lower value of 128.
2018-06-18T14:55:29+00:00 172.17.0.1 core[1473]: 1:M 18 Jun 13:55:29.160 # Server initialized
2018-06-18T14:55:29+00:00 172.17.0.1 core[1473]: 1:M 18 Jun 13:55:29.160 # WARNING overcommit_memory is set to 0! Background save may fail under low memory condition. To fix this issue add 'vm.overcommit_memory = 1' to /etc/sysctl.conf and then reboot or run the command 'sysctl vm.overcommit_memory=1' for this to take effect.
2018-06-18T14:55:29+00:00 172.17.0.1 core[1473]: 1:M 18 Jun 13:55:29.160 # WARNING you have Transparent Huge Pages (THP) support enabled in your kernel. This will create latency and memory usage issues with Redis. To fix this issue run the command 'echo never > /sys/kernel/mm/transparent_hugepage/enabled' as root, and add it to your /etc/rc.local in order to retain the setting after a reboot. Redis must be restarted after THP is disabled.
2018-06-18T14:55:29+00:00 172.17.0.1 core[1473]: 1:M 18 Jun 13:55:29.160 * Ready to accept connections
2018-06-18T14:55:32+00:00 172.17.0.1 core[1473]: REMOTE_REDIS_PSWD not set. Please SET
2018-06-18T14:55:32+00:00 172.17.0.1 core[1473]: REMOTE_REDIS_HOST=redis REMOTE_REDIS_PORT=6379
2018-06-18T14:55:32+00:00 172.17.0.1 core[1473]: + [[ -z redis ]]
2018-06-18T14:55:32+00:00 172.17.0.1 core[1473]: + [[ -z  ]]
2018-06-18T14:55:32+00:00 172.17.0.1 core[1473]: + export REMOTE_REDIS_PORT=6379
2018-06-18T14:55:32+00:00 172.17.0.1 core[1473]: + [[ -z  ]]
2018-06-18T14:55:32+00:00 172.17.0.1 core[1473]: + echo REMOTE_REDIS_PSWD not set. Please SET
2018-06-18T14:55:32+00:00 172.17.0.1 core[1473]: + export REMOTE_REDIS_PSWD=
2018-06-18T14:55:32+00:00 172.17.0.1 core[1473]: + [[ -z  ]]
2018-06-18T14:55:32+00:00 172.17.0.1 core[1473]: + export DASH_PREFIX=/rq
2018-06-18T14:55:32+00:00 172.17.0.1 core[1473]: + export RQ_DASHBOARD_SETTINGS=/rq_dash_settings.py
2018-06-18T14:55:32+00:00 172.17.0.1 core[1473]: + echo REMOTE_REDIS_HOST=redis REMOTE_REDIS_PORT=6379
2018-06-18T14:55:32+00:00 172.17.0.1 core[1473]: + rq-dashboard
2018-06-18T14:55:32+00:00 172.17.0.1 core[1473]: 13:55:32 RQ worker 'rq:worker:2d98d24d2551.1' started, version 0.11.0
2018-06-18T14:55:32+00:00 172.17.0.1 core[1473]: 13:55:32 *** Listening on default...
2018-06-18T14:55:32+00:00 172.17.0.1 core[1473]: 13:55:32 Cleaning registries for queue: default
2018-06-18T14:55:33+00:00 172.17.0.1 core[1473]: RQ Dashboard, version 0.3.3
2018-06-18T14:55:33+00:00 172.17.0.1 core[1473]:  * Serving Flask app "rq_dashboard.app" (lazy loading)
2018-06-18T14:55:33+00:00 172.17.0.1 core[1473]:  * Environment: production
2018-06-18T14:55:33+00:00 172.17.0.1 core[1473]:    WARNING: Do not use the development server in a production environment.
2018-06-18T14:55:33+00:00 172.17.0.1 core[1473]:    Use a production WSGI server instead.
2018-06-18T14:55:33+00:00 172.17.0.1 core[1473]:  * Debug mode: off
2018-06-18T14:55:33+00:00 172.17.0.1 core[1473]:  * Running on http://0.0.0.0:9181/ (Press CTRL+C to quit)
2018-06-18T14:55:33+00:00 172.17.0.1 core[1473]: 13:55:33 RQ worker 'rq:worker:a17965f8ea96.1' started, version 0.11.0
2018-06-18T14:55:33+00:00 172.17.0.1 core[1473]: 13:55:33 *** Listening on default...
2018-06-18T14:55:33+00:00 172.17.0.1 core[1473]: 13:55:33 Cleaning registries for queue: default
2018-06-18T14:55:36+00:00 172.17.0.1 core[1473]: 13:55:36 RQ worker 'rq:worker:cfac7400307c.1' started, version 0.11.0
2018-06-18T14:55:36+00:00 172.17.0.1 core[1473]: 13:55:36 *** Listening on default...
2018-06-18T14:55:36+00:00 172.17.0.1 core[1473]: 13:55:36 Cleaning registries for queue: default
2018-06-18T14:55:36+00:00 172.17.0.1 core[1473]: 13:55:36 RQ worker 'rq:worker:14429052eaa0.1' started, version 0.11.0
2018-06-18T14:55:36+00:00 172.17.0.1 core[1473]: 13:55:36 *** Listening on default...
2018-06-18T14:55:36+00:00 172.17.0.1 core[1473]: 13:55:36 Cleaning registries for queue: default
2018-06-18T14:55:38+00:00 172.17.0.1 core[1473]: [2018-06-18 13:55:38 +0000] [8] [INFO] Starting gunicorn 19.8.1
2018-06-18T14:55:38+00:00 172.17.0.1 core[1473]: [2018-06-18 13:55:38 +0000] [8] [INFO] Listening at: http://0.0.0.0:8080 (8)
2018-06-18T14:55:38+00:00 172.17.0.1 core[1473]: [2018-06-18 13:55:38 +0000] [8] [INFO] Using worker: gevent
2018-06-18T14:55:38+00:00 172.17.0.1 core[1473]: [2018-06-18 13:55:38 +0000] [11] [INFO] Booting worker with pid: 11
2018-06-18T14:55:38+00:00 172.17.0.1 core[1473]: /usr/local/lib/python3.6/site-packages/gunicorn/workers/ggevent.py:65: MonkeyPatchWarning: Monkey-patching ssl after ssl has already been imported may lead to errors, including RecursionError on Python 3.6. Please monkey-patch earlier. See https://github.com/gevent/gevent/issues/1016
2018-06-18T14:55:38+00:00 172.17.0.1 core[1473]:   monkey.patch_all(subprocess=True)
2018-06-18T14:55:38+00:00 172.17.0.1 core[1473]: [2018-06-18 13:55:38 +0000] [12] [INFO] Booting worker with pid: 12
2018-06-18T14:55:38+00:00 172.17.0.1 core[1473]: [2018-06-18 13:55:38 +0000] [13] [INFO] Booting worker with pid: 13
2018-06-18T14:55:39+00:00 172.17.0.1 core[1473]: [2018-06-18 13:55:39 +0000] [14] [INFO] Booting worker with pid: 14
2018-06-18T14:55:39+00:00 172.17.0.1 core[1473]: /usr/local/lib/python3.6/site-packages/gunicorn/workers/ggevent.py:65: MonkeyPatchWarning: Monkey-patching ssl after ssl has already been imported may lead to errors, including RecursionError on Python 3.6. Please monkey-patch earlier. See https://github.com/gevent/gevent/issues/1016
2018-06-18T14:55:39+00:00 172.17.0.1 core[1473]:   monkey.patch_all(subprocess=True)
2018-06-18T14:55:39+00:00 172.17.0.1 core[1473]: /usr/local/lib/python3.6/site-packages/gunicorn/workers/ggevent.py:65: MonkeyPatchWarning: Monkey-patching ssl after ssl has already been imported may lead to errors, including RecursionError on Python 3.6. Please monkey-patch earlier. See https://github.com/gevent/gevent/issues/1016
2018-06-18T14:55:39+00:00 172.17.0.1 core[1473]:   monkey.patch_all(subprocess=True)
2018-06-18T14:55:39+00:00 172.17.0.1 core[1473]: /usr/local/lib/python3.6/site-packages/gunicorn/workers/ggevent.py:65: MonkeyPatchWarning: Monkey-patching ssl after ssl has already been imported may lead to errors, including RecursionError on Python 3.6. Please monkey-patch earlier. See https://github.com/gevent/gevent/issues/1016
2018-06-18T14:55:39+00:00 172.17.0.1 core[1473]:   monkey.patch_all(subprocess=True)
2018-06-18T14:55:40+00:00 172.17.0.1 plugin[1473]: [2018-06-18 13:55:40 +0000] [1] [INFO] Starting gunicorn 19.8.1
2018-06-18T14:55:40+00:00 172.17.0.1 plugin[1473]: [2018-06-18 13:55:40 +0000] [1] [INFO] Listening at: http://0.0.0.0:8000 (1)
2018-06-18T14:55:40+00:00 172.17.0.1 plugin[1473]: [2018-06-18 13:55:40 +0000] [1] [INFO] Using worker: gevent
2018-06-18T14:55:40+00:00 172.17.0.1 plugin[1473]: [2018-06-18 13:55:40 +0000] [9] [INFO] Booting worker with pid: 9
2018-06-18T14:55:40+00:00 172.17.0.1 plugin[1473]: [2018-06-18 13:55:40 +0000] [10] [INFO] Booting worker with pid: 10
2018-06-18T14:55:40+00:00 172.17.0.1 plugin[1473]: /usr/local/lib/python3.6/site-packages/gunicorn/workers/ggevent.py:65: MonkeyPatchWarning: Monkey-patching ssl after ssl has already been imported may lead to errors, including RecursionError on Python 3.6. Please monkey-patch earlier. See https://github.com/gevent/gevent/issues/1016
2018-06-18T14:55:40+00:00 172.17.0.1 plugin[1473]:   monkey.patch_all(subprocess=True)
2018-06-18T14:55:40+00:00 172.17.0.1 plugin[1473]: /usr/local/lib/python3.6/site-packages/gunicorn/workers/ggevent.py:65: MonkeyPatchWarning: Monkey-patching ssl after ssl has already been imported may lead to errors, including RecursionError on Python 3.6. Please monkey-patch earlier. See https://github.com/gevent/gevent/issues/1016
2018-06-18T14:55:40+00:00 172.17.0.1 plugin[1473]:   monkey.patch_all(subprocess=True)
2018-06-18T14:55:40+00:00 172.17.0.1 plugin[1473]: [2018-06-18 13:55:40 +0000] [15] [INFO] Booting worker with pid: 15
2018-06-18T14:55:40+00:00 172.17.0.1 plugin[1473]: [2018-06-18 13:55:40 +0000] [31] [INFO] Booting worker with pid: 31
2018-06-18T14:55:40+00:00 172.17.0.1 plugin[1473]: /usr/local/lib/python3.6/site-packages/gunicorn/workers/ggevent.py:65: MonkeyPatchWarning: Monkey-patching ssl after ssl has already been imported may lead to errors, including RecursionError on Python 3.6. Please monkey-patch earlier. See https://github.com/gevent/gevent/issues/1016
2018-06-18T14:55:40+00:00 172.17.0.1 plugin[1473]:   monkey.patch_all(subprocess=True)
2018-06-18T14:55:40+00:00 172.17.0.1 plugin[1473]: /usr/local/lib/python3.6/site-packages/gunicorn/workers/ggevent.py:65: MonkeyPatchWarning: Monkey-patching ssl after ssl has already been imported may lead to errors, including RecursionError on Python 3.6. Please monkey-patch earlier. See https://github.com/gevent/gevent/issues/1016
2018-06-18T14:55:40+00:00 172.17.0.1 plugin[1473]:   monkey.patch_all(subprocess=True)
2018-06-18T14:55:41+00:00 172.17.0.1 plugin[1473]: 2018-06-18 13:55:41,335 - INFO - Config:51  - From the Environment
2018-06-18T14:55:41+00:00 172.17.0.1 plugin[1473]: 2018-06-18 13:55:41,335 - DEBUG - Monitor_Helper_Base:45  - set_owner = Config
2018-06-18T14:55:41+00:00 172.17.0.1 plugin[1473]: 2018-06-18 13:55:41,336 - DEBUG - Monitor_Helper_Base:45  - set_owner = Config
2018-06-18T14:55:41+00:00 172.17.0.1 plugin[1473]: 2018-06-18 13:55:41,336 - DEBUG - Monitor_Helper_Base:45  - set_owner = Config
2018-06-18T14:55:41+00:00 172.17.0.1 plugin[1473]: 2018-06-18 13:55:41,426 - DEBUG - EndpointWrapper:51  - Endpoint_Wrapper:config:{'config': 'True', 'vent_ip': 'VENT_IP', 'vent_port': '8080'}
2018-06-18T14:55:41+00:00 172.17.0.1 plugin[1473]: 2018-06-18 13:55:41,426 - DEBUG - Monitor_Helper_Base:45  - set_owner = NorthBoundControllerAbstraction
2018-06-18T14:55:41+00:00 172.17.0.1 plugin[1473]: 2018-06-18 13:55:41,427 - DEBUG - poseidonMonitor:400 - Monitor:config:{'config': 'True', 'logging_file': 'logging.json', 'logger_level': 'INFO', 'reinvestigation_frequency': '900', 'max_concurrent_reinvestigations': '2', 'scan_frequency': '5', 'rabbit_server': 'RABBIT_SERVER', 'rabbit_port': '5672', 'collector_nic': 'enp7s0f0', 'collector_interval': '900', 'vent_ip': 'vent_ip', 'vent_port': '8080', 'FA_RABBIT_ENABLED': 'True', 'FA_RABBIT_HOST': 'RABBIT_SERVER', 'FA_RABBIT_PORT': '5672', 'FA_RABBIT_EXCHANGE': 'topic_recs', 'FA_RABBIT_EXCHANGE_TYPE': 'topic', 'FA_RABBIT_ROUTING_KEY': 'FAUCET.Event'}
2018-06-18T14:55:41+00:00 172.17.0.1 plugin[1473]: 2018-06-18 13:55:41,427 - INFO - Config:61  - Config:configure
2018-06-18T14:55:41+00:00 172.17.0.1 plugin[1473]: 2018-06-18 13:55:41,427 - INFO - Monitor_Helper_Base:55  - Handle_SectionConfig configure()
2018-06-18T14:55:41+00:00 172.17.0.1 plugin[1473]: 2018-06-18 13:55:41,427 - INFO - Monitor_Helper_Base:55  - Handle_FieldConfig configure()
2018-06-18T14:55:41+00:00 172.17.0.1 plugin[1473]: 2018-06-18 13:55:41,427 - INFO - Monitor_Helper_Base:55  - Handle_FullConfig configure()
2018-06-18T14:55:41+00:00 172.17.0.1 plugin[1473]: 2018-06-18 13:55:41,427 - INFO - Monitor_Helper_Base:55  - Update_Switch_State configure()
2018-06-18T14:55:46+00:00 172.17.0.1 plugin[1473]: 2018-06-18 13:55:46,456 - INFO - EndpointWrapper:140 - ====START
2018-06-18T14:55:46+00:00 172.17.0.1 plugin[1473]: 2018-06-18 13:55:46,456 - INFO - EndpointWrapper:103 - *******KNOWN*********
2018-06-18T14:55:46+00:00 172.17.0.1 plugin[1473]: 2018-06-18 13:55:46,456 - INFO - EndpointWrapper:135 - None
2018-06-18T14:55:46+00:00 172.17.0.1 plugin[1473]: 2018-06-18 13:55:46,457 - INFO - EndpointWrapper:103 - *******UNKNOWN*********
2018-06-18T14:55:46+00:00 172.17.0.1 plugin[1473]: 2018-06-18 13:55:46,457 - INFO - EndpointWrapper:135 - None
2018-06-18T14:55:46+00:00 172.17.0.1 plugin[1473]: 2018-06-18 13:55:46,458 - INFO - EndpointWrapper:103 - *******MIRRORING*********
2018-06-18T14:55:46+00:00 172.17.0.1 plugin[1473]: 2018-06-18 13:55:46,458 - INFO - EndpointWrapper:135 - None
2018-06-18T14:55:46+00:00 172.17.0.1 plugin[1473]: 2018-06-18 13:55:46,459 - INFO - EndpointWrapper:103 - *******SHUTDOWN*********
2018-06-18T14:55:46+00:00 172.17.0.1 plugin[1473]: 2018-06-18 13:55:46,459 - INFO - EndpointWrapper:135 - None
2018-06-18T14:55:46+00:00 172.17.0.1 plugin[1473]: 2018-06-18 13:55:46,459 - INFO - EndpointWrapper:103 - *******REINVESTIGATING*********
2018-06-18T14:55:46+00:00 172.17.0.1 plugin[1473]: 2018-06-18 13:55:46,460 - INFO - EndpointWrapper:135 - None
2018-06-18T14:55:46+00:00 172.17.0.1 plugin[1473]: 2018-06-18 13:55:46,460 - INFO - EndpointWrapper:144 - ****************
2018-06-18T14:55:46+00:00 172.17.0.1 plugin[1473]: 2018-06-18 13:55:46,460 - INFO - EndpointWrapper:145 - ====STOP

I have traffic being mirrored to the port that Poseidon should be monitoring; however, I see no indication of this in Poseidon's logs. My Docker containers appear to be up according to docker ps -a, and there is definitely traffic arriving on the interface.

What should I be looking for?

cglewis commented 6 years ago

OK, that looks correct - basically everything after INFO - Config:51 is Poseidon output (the latest version will improve the logging output to make that clear). At this point it's just waiting to learn about new devices on the network from Faucet.

j543453 commented 6 years ago

At that point I have traffic running to the interface that Poseidon is listening on - is there nothing in the debug output that indicates Poseidon is picking up this traffic?

That log was taken while two clients were communicating on my switch and the traffic was being mirrored to the interface Poseidon is listening on, but I couldn't spot anything that would indicate Poseidon knows this.

scottkelso commented 6 years ago

Ah, I see which Poseidon you are referring to @cglewis. Same output as before, unfortunately, and the Faucet logs don't show any activity. Again, I haven't seen logs like @toiletduck123's (everything after INFO - Config:51) since I removed all the containers to re-run ./helpers/run.

The only other output I get is the following, repeated...

2018-06-18T16:26:45+00:00 172.17.0.1 core[1371]: 1:M 18 Jun 15:26:45.344 * 100 changes in 300 seconds. Saving...
2018-06-18T16:26:45+00:00 172.17.0.1 core[1371]: 1:M 18 Jun 15:26:45.345 * Background saving started by pid 13
2018-06-18T16:26:45+00:00 172.17.0.1 core[1371]: 13:C 18 Jun 15:26:45.411 * DB saved on disk
2018-06-18T16:26:45+00:00 172.17.0.1 core[1371]: 13:C 18 Jun 15:26:45.412 * RDB: 0 MB of memory used by copy-on-write
2018-06-18T16:26:45+00:00 172.17.0.1 core[1371]: 1:M 18 Jun 15:26:45.445 * Background saving terminated with success

If not, might this export be wrong? collector_nic=enp0s25 is my machine's local interface; the machine hosts Faucet locally, plus Poseidon and Mininet in Docker containers. If the export is correct, then maybe Faucet simply isn't mirroring traffic.

cglewis commented 6 years ago

The collector_nic should be whatever interface name is on the other side of the port you have configured Faucet to mirror to (from your env: export controller_mirror_ports='{"sw1":3}', so port 3 as you mention).
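
A quick way to sanity-check the mirror itself, independent of Poseidon, is to watch that interface directly and confirm the mirrored traffic shows up (substitute whichever interface you set as collector_nic, i.e. the one on the other side of the mirror port):

sudo tcpdump -n -i <collector_nic> -c 20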

The output should be pretty obvious when it's learning things from Faucet - note that it will only pick up things learned at L3 (L2 capture is still a TODO: #559).

Note that if you're using RabbitMQ for Faucet events, it also has to be enabled on the Faucet side (https://github.com/faucetsdn/faucet/tree/master/adapters)

cglewis commented 6 years ago

@scottkelso I'm also seeing the lack of logs and am able to reproduce it on Ubuntu 16.04 - debugging it now, and will update shortly on status. Thanks for finding this!

cglewis commented 6 years ago

@scottkelso I found the issue, I believe. Syslog wasn't accepting enough TCP connections - I've pushed up a change to fix that, so you should just need to remove your containers and run ./helpers/run again.

@toiletduck123 the logs should look something like this when it's working and detecting devices from Faucet:

2018-06-18T12:56:54+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:56:54,154 - INFO - Config:51  - From the Environment
2018-06-18T12:56:54+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:56:54,157 - DEBUG - Monitor_Helper_Base:45  - set_owner = Config
2018-06-18T12:56:54+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:56:54,157 - DEBUG - Monitor_Helper_Base:45  - set_owner = Config
2018-06-18T12:56:54+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:56:54,157 - DEBUG - Monitor_Helper_Base:45  - set_owner = Config
2018-06-18T12:56:54+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:56:54,398 - DEBUG - EndpointWrapper:51  - Endpoint_Wrapper:config:{'config': 'True', 'vent_ip': 'VENT_IP', 'vent_port': '8080'}
2018-06-18T12:56:54+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:56:54,398 - DEBUG - Monitor_Helper_Base:45  - set_owner = NorthBoundControllerAbstraction
2018-06-18T12:56:54+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:56:54,399 - DEBUG - poseidonMonitor:400 - Monitor:config:{'config': 'True', 'logging_file': 'logging.json', 'logger_level': 'INFO', 'reinvestigation_frequency': '900', 'max_concurrent_reinvestigations': '10', 'scan_frequency': '5', 'rabbit_server': 'RABBIT_SERVER', 'rabbit_port': '5672', 'collector_nic': 's1-eth3', 'collector_interval': '900', 'vent_ip': 'vent_ip', 'vent_port': '8080', 'FA_RABBIT_ENABLED': 'True', 'FA_RABBIT_HOST': '192.168.0.8', 'FA_RABBIT_PORT': '5672', 'FA_RABBIT_EXCHANGE': 'topic_recs', 'FA_RABBIT_EXCHANGE_TYPE': 'topic', 'FA_RABBIT_ROUTING_KEY': 'FAUCET.Event'}
2018-06-18T12:56:54+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:56:54,399 - INFO - Config:61  - Config:configure
2018-06-18T12:56:54+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:56:54,399 - INFO - Monitor_Helper_Base:55  - Handle_SectionConfig configure()
2018-06-18T12:56:54+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:56:54,399 - INFO - Monitor_Helper_Base:55  - Handle_FieldConfig configure()
2018-06-18T12:56:54+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:56:54,400 - INFO - Monitor_Helper_Base:55  - Handle_FullConfig configure()
2018-06-18T12:56:54+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:56:54,400 - INFO - Monitor_Helper_Base:55  - Update_Switch_State configure()
2018-06-18T12:56:59+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:56:59,442 - INFO - EndpointWrapper:140 - ====START
2018-06-18T12:56:59+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:56:59,442 - INFO - EndpointWrapper:103 - *******KNOWN*********
2018-06-18T12:56:59+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:56:59,443 - INFO - EndpointWrapper:135 - None
2018-06-18T12:56:59+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:56:59,443 - INFO - EndpointWrapper:103 - *******UNKNOWN*********
2018-06-18T12:56:59+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:56:59,444 - INFO - EndpointWrapper:135 - None
2018-06-18T12:56:59+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:56:59,444 - INFO - EndpointWrapper:103 - *******MIRRORING*********
2018-06-18T12:56:59+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:56:59,445 - INFO - EndpointWrapper:135 - None
2018-06-18T12:56:59+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:56:59,445 - INFO - EndpointWrapper:103 - *******SHUTDOWN*********
2018-06-18T12:56:59+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:56:59,445 - INFO - EndpointWrapper:135 - None
2018-06-18T12:56:59+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:56:59,446 - INFO - EndpointWrapper:103 - *******REINVESTIGATING*********
2018-06-18T12:56:59+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:56:59,446 - INFO - EndpointWrapper:135 - None
2018-06-18T12:56:59+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:56:59,446 - INFO - EndpointWrapper:144 - ****************
2018-06-18T12:56:59+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:56:59,446 - INFO - EndpointWrapper:145 - ====STOP
2018-06-18T12:57:34+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:34,638 - INFO - EndpointWrapper:140 - ====START
2018-06-18T12:57:34+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:34,639 - INFO - EndpointWrapper:103 - *******KNOWN*********
2018-06-18T12:57:34+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:34,639 - INFO - EndpointWrapper:135 - None
2018-06-18T12:57:34+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:34,640 - INFO - EndpointWrapper:103 - *******UNKNOWN*********
2018-06-18T12:57:34+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:34,640 - INFO - EndpointWrapper:131 - U:a7383b21cb273f7902296a78a2869614ce9433f0:UNKNOWN->NONE:{'mac': '00:00:00:00:00:02', 'ip': '10.0.0.2', 's': '1', 'p': '2', 'v': 'VLAN8'}
2018-06-18T12:57:34+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:34,673 - INFO - EndpointWrapper:131 - U:9200f6e0192d4e4197210d7180519129d9b71c4a:UNKNOWN->NONE:{'mac': '00:00:00:00:00:01', 'ip': '10.0.0.1', 's': '1', 'p': '1', 'v': 'VLAN8'}
2018-06-18T12:57:34+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:34,683 - INFO - EndpointWrapper:103 - *******MIRRORING*********
2018-06-18T12:57:34+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:34,684 - INFO - EndpointWrapper:135 - None
2018-06-18T12:57:34+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:34,684 - INFO - EndpointWrapper:103 - *******SHUTDOWN*********
2018-06-18T12:57:34+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:34,684 - INFO - EndpointWrapper:135 - None
2018-06-18T12:57:34+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:34,685 - INFO - EndpointWrapper:103 - *******REINVESTIGATING*********
2018-06-18T12:57:34+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:34,685 - INFO - EndpointWrapper:135 - None
2018-06-18T12:57:34+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:34,686 - INFO - EndpointWrapper:144 - ****************
2018-06-18T12:57:34+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:34,686 - INFO - EndpointWrapper:145 - ====STOP
2018-06-18T12:57:35+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:35,502 - INFO - EndpointWrapper:140 - ====START
2018-06-18T12:57:35+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:35,503 - INFO - EndpointWrapper:103 - *******KNOWN*********
2018-06-18T12:57:35+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:35,504 - INFO - EndpointWrapper:135 - None
2018-06-18T12:57:35+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:35,504 - INFO - EndpointWrapper:103 - *******UNKNOWN*********
2018-06-18T12:57:35+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:35,504 - INFO - EndpointWrapper:131 - U:a7383b21cb273f7902296a78a2869614ce9433f0:UNKNOWN->MIRRORING:{'mac': '00:00:00:00:00:02', 'ip': '10.0.0.2', 's': '1', 'p': '2', 'v': 'VLAN8'}
2018-06-18T12:57:35+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:35,516 - INFO - EndpointWrapper:131 - U:9200f6e0192d4e4197210d7180519129d9b71c4a:UNKNOWN->MIRRORING:{'mac': '00:00:00:00:00:01', 'ip': '10.0.0.1', 's': '1', 'p': '1', 'v': 'VLAN8'}
2018-06-18T12:57:35+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:35,538 - INFO - EndpointWrapper:103 - *******MIRRORING*********
2018-06-18T12:57:35+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:35,539 - INFO - EndpointWrapper:135 - None
2018-06-18T12:57:35+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:35,539 - INFO - EndpointWrapper:103 - *******SHUTDOWN*********
2018-06-18T12:57:35+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:35,539 - INFO - EndpointWrapper:135 - None
2018-06-18T12:57:35+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:35,539 - INFO - EndpointWrapper:103 - *******REINVESTIGATING*********
2018-06-18T12:57:35+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:35,539 - INFO - EndpointWrapper:135 - None
2018-06-18T12:57:35+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:35,540 - INFO - EndpointWrapper:144 - ****************
2018-06-18T12:57:35+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:35,540 - INFO - EndpointWrapper:145 - ====STOP
2018-06-18T12:57:36+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:36,596 - INFO - EndpointWrapper:140 - ====START
2018-06-18T12:57:36+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:36,597 - INFO - EndpointWrapper:103 - *******KNOWN*********
2018-06-18T12:57:36+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:36,597 - INFO - EndpointWrapper:135 - None
2018-06-18T12:57:36+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:36,597 - INFO - EndpointWrapper:103 - *******UNKNOWN*********
2018-06-18T12:57:36+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:36,597 - INFO - EndpointWrapper:131 - U:9200f6e0192d4e4197210d7180519129d9b71c4a:UNKNOWN->MIRRORING:{'mac': '00:00:00:00:00:01', 'ip': '10.0.0.1', 's': '1', 'p': '1', 'v': 'VLAN8'}
2018-06-18T12:57:36+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:36,638 - INFO - EndpointWrapper:103 - *******MIRRORING*********
2018-06-18T12:57:36+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:36,639 - INFO - EndpointWrapper:131 - M:a7383b21cb273f7902296a78a2869614ce9433f0:MIRRORING->NONE:{'mac': '00:00:00:00:00:02', 'ip': '10.0.0.2', 's': '1', 'p': '2', 'v': 'VLAN8'}
2018-06-18T12:57:36+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:36,648 - INFO - EndpointWrapper:103 - *******SHUTDOWN*********
2018-06-18T12:57:36+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:36,648 - INFO - EndpointWrapper:135 - None
2018-06-18T12:57:36+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:36,648 - INFO - EndpointWrapper:103 - *******REINVESTIGATING*********
2018-06-18T12:57:36+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:36,648 - INFO - EndpointWrapper:135 - None
2018-06-18T12:57:36+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:36,649 - INFO - EndpointWrapper:144 - ****************
2018-06-18T12:57:36+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:36,649 - INFO - EndpointWrapper:145 - ====STOP
2018-06-18T12:57:36+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:36,649 - INFO - EndpointWrapper:140 - ====START
2018-06-18T12:57:36+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:36,649 - INFO - EndpointWrapper:103 - *******KNOWN*********
2018-06-18T12:57:36+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:36,649 - INFO - EndpointWrapper:135 - None
2018-06-18T12:57:36+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:36,649 - INFO - EndpointWrapper:103 - *******UNKNOWN*********
2018-06-18T12:57:36+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:36,649 - INFO - EndpointWrapper:131 - U:9200f6e0192d4e4197210d7180519129d9b71c4a:UNKNOWN->MIRRORING:{'mac': '00:00:00:00:00:01', 'ip': '10.0.0.1', 's': '1', 'p': '1', 'v': 'VLAN8'}
2018-06-18T12:57:36+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:36,677 - INFO - EndpointWrapper:103 - *******MIRRORING*********
2018-06-18T12:57:36+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:36,677 - INFO - EndpointWrapper:131 - M:a7383b21cb273f7902296a78a2869614ce9433f0:MIRRORING->NONE:{'mac': '00:00:00:00:00:02', 'ip': '10.0.0.2', 's': '1', 'p': '2', 'v': 'VLAN8'}
2018-06-18T12:57:36+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:36,686 - INFO - EndpointWrapper:103 - *******SHUTDOWN*********
2018-06-18T12:57:36+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:36,687 - INFO - EndpointWrapper:135 - None
2018-06-18T12:57:36+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:36,687 - INFO - EndpointWrapper:103 - *******REINVESTIGATING*********
2018-06-18T12:57:36+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:36,687 - INFO - EndpointWrapper:135 - None
2018-06-18T12:57:36+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:36,687 - INFO - EndpointWrapper:144 - ****************
2018-06-18T12:57:36+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:36,687 - INFO - EndpointWrapper:145 - ====STOP
2018-06-18T12:57:37+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:37,706 - INFO - EndpointWrapper:140 - ====START
2018-06-18T12:57:37+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:37,706 - INFO - EndpointWrapper:103 - *******KNOWN*********
2018-06-18T12:57:37+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:37,707 - INFO - EndpointWrapper:135 - None
2018-06-18T12:57:37+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:37,709 - INFO - EndpointWrapper:103 - *******UNKNOWN*********
2018-06-18T12:57:37+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:37,710 - INFO - EndpointWrapper:135 - None
2018-06-18T12:57:37+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:37,710 - INFO - EndpointWrapper:103 - *******MIRRORING*********
2018-06-18T12:57:37+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:37,710 - INFO - EndpointWrapper:131 - M:a7383b21cb273f7902296a78a2869614ce9433f0:MIRRORING->NONE:{'mac': '00:00:00:00:00:02', 'ip': '10.0.0.2', 's': '1', 'p': '2', 'v': 'VLAN8'}
2018-06-18T12:57:37+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:37,724 - INFO - EndpointWrapper:131 - M:9200f6e0192d4e4197210d7180519129d9b71c4a:MIRRORING->NONE:{'mac': '00:00:00:00:00:01', 'ip': '10.0.0.1', 's': '1', 'p': '1', 'v': 'VLAN8'}
2018-06-18T12:57:37+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:37,734 - INFO - EndpointWrapper:103 - *******SHUTDOWN*********
2018-06-18T12:57:37+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:37,734 - INFO - EndpointWrapper:135 - None
2018-06-18T12:57:37+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:37,734 - INFO - EndpointWrapper:103 - *******REINVESTIGATING*********
2018-06-18T12:57:37+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:37,734 - INFO - EndpointWrapper:135 - None
2018-06-18T12:57:37+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:37,735 - INFO - EndpointWrapper:144 - ****************
2018-06-18T12:57:37+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:57:37,735 - INFO - EndpointWrapper:145 - ====STOP

The logs will only update when states within Poseidon change.

j543453 commented 6 years ago

Thanks very much for the updates!

How would I go about replicating those logs - what could I do to trigger state changes within Poseidon? (What is considered a state change? Might mine not be detecting devices from Faucet?)

Why would mine just hang on this:

2018-06-18T12:56:59+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:56:59,442 - INFO - EndpointWrapper:140 - ====START
2018-06-18T12:56:59+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:56:59,442 - INFO - EndpointWrapper:103 - *******KNOWN*********
2018-06-18T12:56:59+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:56:59,443 - INFO - EndpointWrapper:135 - None
2018-06-18T12:56:59+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:56:59,443 - INFO - EndpointWrapper:103 - *******UNKNOWN*********
2018-06-18T12:56:59+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:56:59,444 - INFO - EndpointWrapper:135 - None
2018-06-18T12:56:59+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:56:59,444 - INFO - EndpointWrapper:103 - *******MIRRORING*********
2018-06-18T12:56:59+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:56:59,445 - INFO - EndpointWrapper:135 - None
2018-06-18T12:56:59+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:56:59,445 - INFO - EndpointWrapper:103 - *******SHUTDOWN*********
2018-06-18T12:56:59+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:56:59,445 - INFO - EndpointWrapper:135 - None
2018-06-18T12:56:59+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:56:59,446 - INFO - EndpointWrapper:103 - *******REINVESTIGATING*********
2018-06-18T12:56:59+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:56:59,446 - INFO - EndpointWrapper:135 - None
2018-06-18T12:56:59+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:56:59,446 - INFO - EndpointWrapper:144 - ****************
2018-06-18T12:56:59+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/abb090cb7aa4/plugin[1428]: 2018-06-18 19:56:59,446 - INFO - EndpointWrapper:145 - ====STOP

cglewis commented 6 years ago

You're going to want to look at your Faucet logs and see if you're getting L2 Learn messages like this (L3 src/dst must have real IPs for Poseidon to do anything with them):

Jun 18 19:57:37 faucet.valve INFO     DPID 1 (0x1) L2 learned 00:00:00:00:00:02 (L2 type 0x0800, L3 src 10.0.0.2, L3 dst 10.0.0.1) on Port 2 on VLAN 8 (1 hosts total)
Jun 18 19:57:37 faucet.valve INFO     DPID 1 (0x1) L2 learned 00:00:00:00:00:01 (L2 type 0x0800, L3 src 10.0.0.1, L3 dst 10.0.0.2) on Port 1 on VLAN 8 (2 hosts total)

If you're getting those, then you'll need to make sure that you're capturing the events from Faucet. You can start Poseidon in debug mode (open up .plugin_config.yml and change logger_level to DEBUG) to get a lot more output, which might be helpful in debugging this. You can also start a RabbitMQ listener to see if the messages are getting sent from Faucet as expected (see the example consumer): https://github.com/faucetsdn/faucet/tree/master/adapters/vendors/rabbitmq
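
For reference, the exact nesting of .plugin_config.yml in your checkout may differ (the surrounding structure here is an assumption), but the line you're changing should end up reading:

logger_level: DEBUG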

j543453 commented 6 years ago

I am indeed getting L2 Learn messages, so Faucet seems to be working correctly!

I will run Poseidon tomorrow morning in debug mode and see what it shows, thanks very much for the help.

j543453 commented 6 years ago

Update on the logs with logger_level set to DEBUG and the latest git pull:

2018-06-19T09:36:00+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:00,008 - DEBUG - poseidonMonitor:225 - starting thread_worker
2018-06-19T09:36:00+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:00,008 - DEBUG - poseidonMonitor:584 - ***************CTRL_C:{'STOP': False}
2018-06-19T09:36:01+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:01,009 - DEBUG - poseidonMonitor:233 - scheduler woke st_worker
2018-06-19T09:36:01+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:01,009 - DEBUG - poseidonMonitor:586 - woke from sleeping
2018-06-19T09:36:01+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:01,010 - DEBUG - poseidonMonitor:584 - ***************CTRL_C:{'STOP': False}
2018-06-19T09:36:02+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:02,011 - DEBUG - poseidonMonitor:233 - scheduler woke st_worker
2018-06-19T09:36:02+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:02,011 - DEBUG - poseidonMonitor:586 - woke from sleeping
2018-06-19T09:36:02+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:02,011 - DEBUG - poseidonMonitor:584 - ***************CTRL_C:{'STOP': False}
2018-06-19T09:36:03+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:03,012 - DEBUG - poseidonMonitor:233 - scheduler woke st_worker
2018-06-19T09:36:03+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:03,013 - DEBUG - poseidonMonitor:586 - woke from sleeping
2018-06-19T09:36:03+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:03,013 - DEBUG - poseidonMonitor:584 - ***************CTRL_C:{'STOP': False}
2018-06-19T09:36:04+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:04,014 - DEBUG - poseidonMonitor:233 - scheduler woke st_worker
2018-06-19T09:36:04+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:04,014 - DEBUG - poseidonMonitor:586 - woke from sleeping
2018-06-19T09:36:04+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:04,015 - DEBUG - poseidonMonitor:584 - ***************CTRL_C:{'STOP': False}
2018-06-19T09:36:05+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:05,015 - DEBUG - poseidonMonitor:233 - scheduler woke st_worker
2018-06-19T09:36:05+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:05,015 - DEBUG - poseidonMonitor:68  - kick
2018-06-19T09:36:05+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:05,016 - DEBUG - faucet:92  - get_endpoints found:
2018-06-19T09:36:05+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:05,016 - DEBUG - poseidonMonitor:586 - woke from sleeping
2018-06-19T09:36:05+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:05,016 - DEBUG - poseidonMonitor:584 - ***************CTRL_C:{'STOP': False}
2018-06-19T09:36:05+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:05,016 - DEBUG - UpdateSwitchState:235 - MACHINES:[]
2018-06-19T09:36:05+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:05,016 - INFO - EndpointWrapper:140 - ====START
2018-06-19T09:36:05+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:05,016 - INFO - EndpointWrapper:103 - *******KNOWN*********
2018-06-19T09:36:05+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:05,016 - INFO - EndpointWrapper:135 - None
2018-06-19T09:36:05+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:05,017 - INFO - EndpointWrapper:103 - *******UNKNOWN*********
2018-06-19T09:36:05+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:05,017 - INFO - EndpointWrapper:135 - None
2018-06-19T09:36:05+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:05,017 - INFO - EndpointWrapper:103 - *******MIRRORING*********
2018-06-19T09:36:05+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:05,017 - INFO - EndpointWrapper:135 - None
2018-06-19T09:36:05+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:05,017 - INFO - EndpointWrapper:103 - *******SHUTDOWN*********
2018-06-19T09:36:05+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:05,017 - INFO - EndpointWrapper:135 - None
2018-06-19T09:36:05+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:05,017 - INFO - EndpointWrapper:103 - *******REINVESTIGATING*********
2018-06-19T09:36:05+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:05,018 - INFO - EndpointWrapper:135 - None
2018-06-19T09:36:05+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:05,018 - INFO - EndpointWrapper:144 - ****************
2018-06-19T09:36:05+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:05,018 - INFO - EndpointWrapper:145 - ====STOP
2018-06-19T09:36:06+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:06,017 - DEBUG - poseidonMonitor:586 - woke from sleeping
2018-06-19T09:36:06+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:06,018 - DEBUG - poseidonMonitor:584 - ***************CTRL_C:{'STOP': False}
2018-06-19T09:36:06+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:06,039 - DEBUG - poseidonMonitor:233 - scheduler woke st_worker
2018-06-19T09:36:07+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:07,019 - DEBUG - poseidonMonitor:586 - woke from sleeping
2018-06-19T09:36:07+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:07,019 - DEBUG - poseidonMonitor:584 - ***************CTRL_C:{'STOP': False}
2018-06-19T09:36:07+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:07,041 - DEBUG - poseidonMonitor:233 - scheduler woke st_worker
2018-06-19T09:36:08+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:08,021 - DEBUG - poseidonMonitor:586 - woke from sleeping
2018-06-19T09:36:08+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:08,021 - DEBUG - poseidonMonitor:584 - ***************CTRL_C:{'STOP': False}
2018-06-19T09:36:08+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:08,042 - DEBUG - poseidonMonitor:233 - scheduler woke st_worker
2018-06-19T09:36:09+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:09,022 - DEBUG - poseidonMonitor:586 - woke from sleeping
2018-06-19T09:36:09+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:09,023 - DEBUG - poseidonMonitor:584 - ***************CTRL_C:{'STOP': False}
2018-06-19T09:36:09+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:09,044 - DEBUG - poseidonMonitor:233 - scheduler woke st_worker
2018-06-19T09:36:10+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:10,024 - DEBUG - poseidonMonitor:586 - woke from sleeping
2018-06-19T09:36:10+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:10,025 - DEBUG - poseidonMonitor:584 - ***************CTRL_C:{'STOP': False}
2018-06-19T09:36:10+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:10,045 - DEBUG - poseidonMonitor:233 - scheduler woke st_worker
2018-06-19T09:36:10+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:10,045 - DEBUG - poseidonMonitor:68  - kick
2018-06-19T09:36:10+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:10,045 - DEBUG - faucet:92  - get_endpoints found:
2018-06-19T09:36:10+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:10,046 - DEBUG - UpdateSwitchState:235 - MACHINES:[]
2018-06-19T09:36:11+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:11,026 - DEBUG - poseidonMonitor:586 - woke from sleeping
2018-06-19T09:36:11+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:11,026 - DEBUG - poseidonMonitor:584 - ***************CTRL_C:{'STOP': False}
2018-06-19T09:36:11+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:11,057 - DEBUG - poseidonMonitor:233 - scheduler woke st_worker
2018-06-19T09:36:12+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:12,028 - DEBUG - poseidonMonitor:586 - woke from sleeping
2018-06-19T09:36:12+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:12,028 - DEBUG - poseidonMonitor:584 - ***************CTRL_C:{'STOP': False}
2018-06-19T09:36:12+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:12,058 - DEBUG - poseidonMonitor:233 - scheduler woke st_worker
2018-06-19T09:36:12+00:00 172.17.0.1 core[1473]: [2018-06-19 08:36:12 +0000] [8] [INFO] Starting gunicorn 19.8.1
2018-06-19T09:36:12+00:00 172.17.0.1 core[1473]: [2018-06-19 08:36:12 +0000] [8] [INFO] Listening at: http://0.0.0.0:8080 (8)
2018-06-19T09:36:12+00:00 172.17.0.1 core[1473]: [2018-06-19 08:36:12 +0000] [8] [INFO] Using worker: gevent
2018-06-19T09:36:12+00:00 172.17.0.1 core[1473]: [2018-06-19 08:36:12 +0000] [11] [INFO] Booting worker with pid: 11
2018-06-19T09:36:12+00:00 172.17.0.1 core[1473]: /usr/local/lib/python3.6/site-packages/gunicorn/workers/ggevent.py:65: MonkeyPatchWarning: Monkey-patching ssl after ssl has already been imported may lead to errors, including RecursionError on Python 3.6. Please monkey-patch earlier. See https://github.com/gevent/gevent/issues/1016
2018-06-19T09:36:12+00:00 172.17.0.1 core[1473]:   monkey.patch_all(subprocess=True)
2018-06-19T09:36:12+00:00 172.17.0.1 core[1473]: [2018-06-19 08:36:12 +0000] [12] [INFO] Booting worker with pid: 12
2018-06-19T09:36:12+00:00 172.17.0.1 core[1473]: /usr/local/lib/python3.6/site-packages/gunicorn/workers/ggevent.py:65: MonkeyPatchWarning: Monkey-patching ssl after ssl has already been imported may lead to errors, including RecursionError on Python 3.6. Please monkey-patch earlier. See https://github.com/gevent/gevent/issues/1016
2018-06-19T09:36:12+00:00 172.17.0.1 core[1473]:   monkey.patch_all(subprocess=True)
2018-06-19T09:36:12+00:00 172.17.0.1 core[1473]: [2018-06-19 08:36:12 +0000] [15] [INFO] Booting worker with pid: 15
2018-06-19T09:36:12+00:00 172.17.0.1 core[1473]: [2018-06-19 08:36:12 +0000] [21] [INFO] Booting worker with pid: 21
2018-06-19T09:36:12+00:00 172.17.0.1 core[1473]: /usr/local/lib/python3.6/site-packages/gunicorn/workers/ggevent.py:65: MonkeyPatchWarning: Monkey-patching ssl after ssl has already been imported may lead to errors, including RecursionError on Python 3.6. Please monkey-patch earlier. See https://github.com/gevent/gevent/issues/1016
2018-06-19T09:36:12+00:00 172.17.0.1 core[1473]:   monkey.patch_all(subprocess=True)
2018-06-19T09:36:12+00:00 172.17.0.1 core[1473]: /usr/local/lib/python3.6/site-packages/gunicorn/workers/ggevent.py:65: MonkeyPatchWarning: Monkey-patching ssl after ssl has already been imported may lead to errors, including RecursionError on Python 3.6. Please monkey-patch earlier. See https://github.com/gevent/gevent/issues/1016
2018-06-19T09:36:12+00:00 172.17.0.1 core[1473]:   monkey.patch_all(subprocess=True)
2018-06-19T09:36:13+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:13,029 - DEBUG - poseidonMonitor:586 - woke from sleeping
2018-06-19T09:36:13+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:13,030 - DEBUG - poseidonMonitor:584 - ***************CTRL_C:{'STOP': False}
2018-06-19T09:36:13+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:13,059 - DEBUG - poseidonMonitor:233 - scheduler woke st_worker
2018-06-19T09:36:14+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:14,031 - DEBUG - poseidonMonitor:586 - woke from sleeping
2018-06-19T09:36:14+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:14,031 - DEBUG - poseidonMonitor:584 - ***************CTRL_C:{'STOP': False}
2018-06-19T09:36:14+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:14,061 - DEBUG - poseidonMonitor:233 - scheduler woke st_worker
2018-06-19T09:36:15+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:15,033 - DEBUG - poseidonMonitor:586 - woke from sleeping
2018-06-19T09:36:15+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:15,033 - DEBUG - poseidonMonitor:584 - ***************CTRL_C:{'STOP': False}
2018-06-19T09:36:15+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:15,062 - DEBUG - poseidonMonitor:233 - scheduler woke st_worker
2018-06-19T09:36:15+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:15,062 - DEBUG - poseidonMonitor:68  - kick
2018-06-19T09:36:15+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:15,063 - DEBUG - faucet:92  - get_endpoints found:
2018-06-19T09:36:15+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:15,063 - DEBUG - UpdateSwitchState:235 - MACHINES:[]
2018-06-19T09:36:16+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:16,034 - DEBUG - poseidonMonitor:586 - woke from sleeping
2018-06-19T09:36:16+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:16,034 - DEBUG - poseidonMonitor:584 - ***************CTRL_C:{'STOP': False}
2018-06-19T09:36:16+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:16,070 - DEBUG - poseidonMonitor:233 - scheduler woke st_worker
2018-06-19T09:36:17+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:17,035 - DEBUG - poseidonMonitor:586 - woke from sleeping
2018-06-19T09:36:17+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:17,036 - DEBUG - poseidonMonitor:584 - ***************CTRL_C:{'STOP': False}
2018-06-19T09:36:17+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:17,072 - DEBUG - poseidonMonitor:233 - scheduler woke st_worker
2018-06-19T09:36:18+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:18,037 - DEBUG - poseidonMonitor:586 - woke from sleeping
2018-06-19T09:36:18+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:18,037 - DEBUG - poseidonMonitor:584 - ***************CTRL_C:{'STOP': False}
2018-06-19T09:36:18+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:18,073 - DEBUG - poseidonMonitor:233 - scheduler woke st_worker
2018-06-19T09:36:19+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:19,038 - DEBUG - poseidonMonitor:586 - woke from sleeping
2018-06-19T09:36:19+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:19,038 - DEBUG - poseidonMonitor:584 - ***************CTRL_C:{'STOP': False}
2018-06-19T09:36:19+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:19,074 - DEBUG - poseidonMonitor:233 - scheduler woke st_worker
2018-06-19T09:36:20+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:20,039 - DEBUG - poseidonMonitor:586 - woke from sleeping
2018-06-19T09:36:20+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:20,039 - DEBUG - poseidonMonitor:584 - ***************CTRL_C:{'STOP': False}
2018-06-19T09:36:20+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:20,075 - DEBUG - poseidonMonitor:233 - scheduler woke st_worker
2018-06-19T09:36:20+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:20,076 - DEBUG - poseidonMonitor:68  - kick

It shows the same INFO lines for START, KNOWN, UNKNOWN, MIRRORING, etc. once and then continuously repeats the DEBUG lines as shown.

Am I right in thinking it isn't picking up any machines from Faucet, meaning Faucet is not sending messages?

I will try RabbitMQ now to see what it says.

j543453 commented 6 years ago

RabbitMQ continually repeats:

faucet_1            | 172.18.0.7 - - [19/Jun/2018 12:42:32] "GET /metrics HTTP/1.1" 200 7659 0.022145
faucet_1            | 172.18.0.7 - - [19/Jun/2018 12:42:47] "GET /metrics HTTP/1.1" 200 7659 0.015152
faucet_1            | 172.18.0.7 - - [19/Jun/2018 12:43:02] "GET /metrics HTTP/1.1" 200 7659 0.011069
faucet_1            | 172.18.0.7 - - [19/Jun/2018 12:43:17] "GET /metrics HTTP/1.1" 200 7646 0.010862
rabbitmq_adapter_1  | Starting with UID=0 GID=0
rabbitmq_adapter_1  | Not connecting to any RabbitMQ, host is None.
rabbitmq_adapter_1  | Starting with UID=0 GID=0
rabbitmq_adapter_1  | Not connecting to any RabbitMQ, host is None.
rabbitmq_adapter_1  | Starting with UID=0 GID=0
rabbitmq_adapter_1  | Not connecting to any RabbitMQ, host is None.
rabbitmq_adapter_1  | Starting with UID=0 GID=0
rabbitmq_adapter_1  | Not connecting to any RabbitMQ, host is None.
rabbitmq_adapter_1  | Starting with UID=0 GID=0
rabbitmq_adapter_1  | Not connecting to any RabbitMQ, host is None.
rabbitmq_adapter_1  | Starting with UID=0 GID=0
rabbitmq_adapter_1  | Not connecting to any RabbitMQ, host is None.
rabbitmq_adapter_1  | Starting with UID=0 GID=0
rabbitmq_adapter_1  | Not connecting to any RabbitMQ, host is None.
rabbitmq_adapter_1  | Starting with UID=0 GID=0

Is this indicating a Faucet problem?

I can confirm Poseidon is SSHing into the RPi with Faucet on it.

cglewis commented 6 years ago

When you start Faucet, are you using docker-compose, and are you supplying the arguments for RabbitMQ?

FAUCET_EVENT_SOCK=1
FAUCET_CONFIG_STAT_RELOAD=1
FA_RABBIT_HOST=<whatever the host IP that's running poseidon>
FA_RABBIT_ENABLED=True

Also, if you are using Rabbit, make sure that when you start Poseidon it also has the environment variable for it set:

rabbit_enabled=True

I can see from your logs here:

2018-06-19T09:36:15+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:15,063 - DEBUG - faucet:92  - get_endpoints found:

That shows it's not getting any endpoints from Faucet, which is why nothing else in Poseidon is happening.

j543453 commented 6 years ago

I have been using:

cd /home/pi/faucet && FAUCET_CONFIG_STAT_RELOAD=1 docker-compose -f docker-compose-pi.yaml up --build -d

and changed it to

cd /home/pi/faucet && FAUCET_CONFIG_STAT_RELOAD=1 FAUCET_EVENT_SOCK=1 FA_RABBIT_HOST=192.168.1.10 FA_RABBIT_ENABLED=True docker-compose -f docker-compose-pi.yaml up --build -d

However, this caused the build to hang at Step 3/6 and not progress (I waited for 30 minutes). Is this an incorrect way to define the arguments?

cglewis commented 6 years ago

You need to additionally supply the RabbitMQ docker-compose file, like so (https://github.com/faucetsdn/faucet/blob/master/adapters/vendors/rabbitmq/README.rst):

cd /home/pi/faucet && FAUCET_CONFIG_STAT_RELOAD=1 FAUCET_EVENT_SOCK=1 FA_RABBIT_HOST=192.168.1.10 FA_RABBIT_ENABLED=True docker-compose -f docker-compose-pi.yaml -f adapters/vendors/rabbitmq/docker-compose-pi.yaml up --build -d

Not sure why it would be hanging; it didn't hang for me doing it that way.

j543453 commented 6 years ago

Thanks very much, the build has worked perfectly after adding in the RabbitMQ docker-compose file.

I am still getting the same output from Poseidon, and the output from RabbitMQ is very similar to before:

rabbitmq_adapter_1  | Starting with UID=0 GID=0
rabbitmq_adapter_1  | Connected to RabbitMQ at 192.168.1.10:5672
rabbitmq_adapter_1  | Failed to connect to the socket because: [Errno 22] Invalid argument
rabbitmq_adapter_1  | Starting with UID=0 GID=0
rabbitmq_adapter_1  | Connected to RabbitMQ at 192.168.1.10:5672
rabbitmq_adapter_1  | Failed to connect to the socket because: [Errno 22] Invalid argument
rabbitmq_adapter_1  | Starting with UID=0 GID=0
rabbitmq_adapter_1  | Connected to RabbitMQ at 192.168.1.10:5672
rabbitmq_adapter_1  | Failed to connect to the socket because: [Errno 22] Invalid argument
rabbitmq_adapter_1  | Starting with UID=0 GID=0
rabbitmq_adapter_1  | Connected to RabbitMQ at 192.168.1.10:5672
rabbitmq_adapter_1  | Failed to connect to the socket because: [Errno 22] Invalid argument
rabbitmq_adapter_1  | Starting with UID=0 GID=0
rabbitmq_1          | 2018-06-20 09:16:31.227 [info] <0.937.0> accepting AMQP connection <0.937.0> (172.18.0.1:51980 -> 172.18.0.7:5672)
rabbitmq_1          | 2018-06-20 09:16:31.242 [info] <0.937.0> connection <0.937.0> (172.18.0.1:51980 -> 172.18.0.7:5672): user 'guest' authenticated and granted access to vhost '/'
rabbitmq_adapter_1  | Connected to RabbitMQ at 192.168.1.10:5672
rabbitmq_adapter_1  | Failed to connect to the socket because: [Errno 22] Invalid argument
rabbitmq_1          | 2018-06-20 09:16:31.321 [warning] <0.937.0> closing AMQP connection <0.937.0> (172.18.0.1:51980 -> 172.18.0.7:5672, vhost: '/', user: 'guest'):
rabbitmq_1          | client unexpectedly closed TCP connection
faucet_1            | (1) accepted ('172.18.0.3', 54886)
faucet_1            | 172.18.0.3 - - [20/Jun/2018 09:16:31] "GET /metrics HTTP/1.1" 200 8098 0.016454
rabbitmq_1          | 2018-06-20 09:16:36.306 [info] <0.952.0> accepting AMQP connection <0.952.0> (172.18.0.1:51988 -> 172.18.0.7:5672)
rabbitmq_1          | 2018-06-20 09:16:36.327 [info] <0.952.0> connection <0.952.0> (172.18.0.1:51988 -> 172.18.0.7:5672): user 'guest' authenticated and granted access to vhost '/'
rabbitmq_1          | 2018-06-20 09:16:36.422 [warning] <0.952.0> closing AMQP connection <0.952.0> (172.18.0.1:51988 -> 172.18.0.7:5672, vhost: '/', user: 'guest'):
rabbitmq_1          | client unexpectedly closed TCP connection
faucet_rabbitmq_adapter_1 exited with code 0

I am really struggling to debug this because I don't understand from the documentation what it is meant to be doing. You suggested I use RabbitMQ to see if the messages are getting sent from Faucet as expected. What does this RabbitMQ output indicate?

What messages should Faucet be sending, and is RabbitMQ the only way I can see them?

I can confirm that Poseidon is SSHing into the Raspberry Pi containing Faucet and that it is searching in the correct file locations, e.g. /etc/faucet/faucet.yaml.

gregs5 commented 6 years ago

Hey @toiletduck123, thanks for the feedback and hanging in there with us on these issues. @cglewis will need to reply re: the RabbitMQ output as I don't actually know the answer, but one thing we've been discussing internally is trying to re-engineer some of the deployment and logging processes.

Poseidon and Vent are a little more complicated than some stock *nix services because of diverse container use and a lot of moving parts, but would it be helpful if we thought about sending start/stop/critical error messages to something like a main log, and more detailed logging to application-specific logs? And ideally (if/where possible), some sort of "I'm up!" message per component?

We've obviously got a lot more work to do, but we're trying to brainstorm on what might be easily achievable in the near-term. Thoughts? (Thanks!)

j543453 commented 6 years ago

Hey gregs5!

No, no, thank you guys for hanging in there with me trying to understand everything going on, I really appreciate your help and patience.

I think that would definitely be helpful (the main log and application-specific logs). When I started using Poseidon and Faucet, I couldn't really tell whether they were actually running or what they were doing; the DEBUG logging level did help.

However, on top of this I am very new to Docker, how containers work, and how they are meant to be run/built, so I am sure plenty of this misunderstanding is due to my lack of knowledge. I will have a further think about any achievable short-term improvements and get back to you :)

Essentially my problem at the moment is that I don't understand what/how Poseidon and Faucet should be communicating, so I don't even know where to begin to find out why it isn't picking up any endpoints. Some documentation explaining how they should be communicating may help me narrow down where the problem could be. After exporting the correct arguments, ensuring the connection is working, and confirming the laptop with Poseidon can SSH into the Pi, I am afraid I don't really know where to go from here!

Thanks

gregs5 commented 6 years ago

Yeah, totally understand. And FWIW, you're not alone in being new to Docker. Our original intention behind using Vent and Poseidon together (they are/were separate projects) was to remove the user/implementer from needing to mess with Docker directly, but that (obviously!) hasn't turned out as planned (yet). We'll get there though.

In the meantime, let's wait to hear what Charlie says re: the RabbitMQ side. Taking a complete guess, this: "Failed to connect to the socket because: [Errno 22] Invalid argument" doesn't sound good, but I have no idea what to suggest re: fixing it! We'll see what Charlie says...

In the meantime, one thing you'll want to think about for the workstations/devices you want to profile is what kind of traffic you'll want to generate from them. The ML portion of this project works off of 15-minute captures sent to the ML classifiers. That's not to say you need an endpoint going crazy for 15 minutes, but if it "talks" a bunch, that's good. If it's silent... not so good. FWIW, when I'm working with the ML portion of the project I'll capture off of a desktop/laptop and just surf news sites for a little bit; that generates plenty of web traffic, and that at least is enough for the ML models to take a guess. But first things first - let's get the pieces talking together!
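
If it helps to have a concrete way to keep an endpoint "talking" during a capture, one rough option (purely illustrative - the URL and interval are arbitrary placeholders) is a simple fetch loop on the device being profiled:

$ while true; do curl -s https://www.example.com > /dev/null; sleep 10; done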

cglewis commented 6 years ago

@toiletduck123 those logs look like Faucet is successfully connecting to RabbitMQ (RabbitMQ logs are not super intuitive) - we should check if it's getting the events - so let's step back and make sure we can see events from Faucet in RabbitMQ.

Sorry the documentation is a bit rough - we'll work on improving that. So for getting Faucet event messages into RabbitMQ we're going to use this: https://github.com/faucetsdn/faucet/tree/master/adapters/vendors/rabbitmq

First of all, I haven't tried running RabbitMQ directly on an RPi, so I'm going to walk through a setup where the RabbitMQ server is not on the same host as Faucet (assuming Faucet is running on the RPi) - this is much the same setup as when Poseidon is on a different system, not on the RPi.

So, we're going to deploy the docker-compose file for the RabbitMQ server on a different system (this would be the Poseidon system when we get there): https://github.com/faucetsdn/faucet/blob/master/adapters/vendors/rabbitmq/docker-compose-rabbitmq.yaml

We'll call the system running Faucet on the RPi F and the system running the RabbitMQ server (Poseidon) R.

On system R we'll start the RabbitMQ server doing the following (make sure any Poseidon containers that may already be running have been stopped/removed):

$ git clone https://github.com/faucetsdn/faucet
$ cd faucet
$ docker-compose -f adapters/vendors/rabbitmq/docker-compose-rabbitmq.yaml up

If that works, you should see output like this (snipped to keep it short):

$ docker-compose -f adapters/vendors/rabbitmq/docker-compose-rabbitmq.yaml up
Creating rabbitmq_rabbitmq_1 ... done
Attaching to rabbitmq_rabbitmq_1
rabbitmq_1  | 2018-06-20 16:01:25.006 [info] <0.33.0> Application lager started on node rabbit@2908d1dab409
<snip>
rabbitmq_1  | 2018-06-20 16:01:30.476 [info] <0.5.0> Server startup complete; 3 plugins started.
rabbitmq_1  |  * rabbitmq_management
rabbitmq_1  |  * rabbitmq_web_dispatch
rabbitmq_1  |  * rabbitmq_management_agent

That is now running in the foreground (note we didn't pass -d to docker-compose), so you'll see logs as they happen; if you close or Ctrl-C that terminal, it will kill that service.

Now that we have the RabbitMQ server running on system R, let's go to system F.

We need to start Faucet with the event sock enabled, specifically for an RPi:

$ export FAUCET_EVENT_SOCK=1
$ export FAUCET_CONFIG_STAT_RELOAD=1
$ export FA_RABBIT_HOST=192.168.0.7 #the IP address of system `R`
$ git clone https://github.com/faucetsdn/faucet
$ cd faucet
$ docker-compose -f docker-compose-pi.yaml -f adapters/vendors/rabbitmq/docker-compose-pi.yaml up --build -d

After which on system F you'll see:

Creating faucet_faucet_1   ... done
Creating faucet_influxdb_1 ... done
Creating faucet_gauge_1            ... done
Creating faucet_rabbitmq_adapter_1 ... done
Creating faucet_prometheus_1       ... done
Creating faucet_grafana_1          ... done

If you then also do a docker ps, everything should have a status of up like this:

$ docker ps
CONTAINER ID        IMAGE                           COMMAND                  CREATED             STATUS              PORTS                                              NAMES
0665f9bd2762        grafana/grafana:5.1.3           "/run.sh"                2 minutes ago       Up 2 minutes        0.0.0.0:3000->3000/tcp                             faucet_grafana_1
1961e76696fb        prom/prometheus:v2.2.1          "/bin/prometheus --c…"   2 minutes ago       Up 2 minutes        0.0.0.0:9090->9090/tcp                             faucet_prometheus_1
5b7777a54073        faucet/event-adapter-rabbitmq   "/usr/local/bin/entr…"   2 minutes ago       Up 2 minutes                                                           faucet_rabbitmq_adapter_1
5c76db03c851        faucet/gauge:latest             "/usr/local/bin/entr…"   2 minutes ago       Up 2 minutes        0.0.0.0:6654->6653/tcp, 0.0.0.0:32809->9303/tcp    faucet_gauge_1
fdf33ea443ec        influxdb:1.3-alpine             "/entrypoint.sh infl…"   2 minutes ago       Up 2 minutes        0.0.0.0:32808->8083/tcp, 0.0.0.0:32807->8086/tcp   faucet_influxdb_1
b138d9576da1        faucet/faucet:latest            "/usr/local/bin/entr…"   2 minutes ago       Up 2 minutes        0.0.0.0:6653->6653/tcp, 0.0.0.0:32806->9302/tcp    faucet_faucet_1

If you look back on system R you will see the same logs you have above coming in:

rabbitmq_1  | 2018-06-20 16:09:35.703 [info] <0.725.0> accepting AMQP connection <0.725.0> (172.18.0.1:53520 -> 172.18.0.2:5672)
rabbitmq_1  | 2018-06-20 16:09:35.714 [info] <0.725.0> connection <0.725.0> (172.18.0.1:53520 -> 172.18.0.2:5672): user 'guest' authenticated and granted access to vhost '/'
rabbitmq_1  | 2018-06-20 16:09:35.741 [warning] <0.725.0> closing AMQP connection <0.725.0> (172.18.0.1:53520 -> 172.18.0.2:5672, vhost: '/', user: 'guest'):
rabbitmq_1  | client unexpectedly closed TCP connection
rabbitmq_1  | 2018-06-20 16:09:37.098 [info] <0.740.0> accepting AMQP connection <0.740.0> (172.18.0.1:53522 -> 172.18.0.2:5672)
rabbitmq_1  | 2018-06-20 16:09:37.107 [info] <0.740.0> connection <0.740.0> (172.18.0.1:53522 -> 172.18.0.2:5672): user 'guest' authenticated and granted access to vhost '/'
rabbitmq_1  | 2018-06-20 16:09:37.134 [warning] <0.740.0> closing AMQP connection <0.740.0> (172.18.0.1:53522 -> 172.18.0.2:5672, vhost: '/', user: 'guest'):
rabbitmq_1  | client unexpectedly closed TCP connection
rabbitmq_1  | 2018-06-20 16:09:38.348 [info] <0.755.0> accepting AMQP connection <0.755.0> (172.18.0.1:53524 -> 172.18.0.2:5672)
rabbitmq_1  | 2018-06-20 16:09:38.358 [info] <0.755.0> connection <0.755.0> (172.18.0.1:53524 -> 172.18.0.2:5672): user 'guest' authenticated and granted access to vhost '/'
rabbitmq_1  | 2018-06-20 16:09:38.383 [warning] <0.755.0> closing AMQP connection <0.755.0> (172.18.0.1:53524 -> 172.18.0.2:5672, vhost: '/', user: 'guest'):
rabbitmq_1  | client unexpectedly closed TCP connection

At this point Faucet is connected to RabbitMQ, so let's use the consumer from the link at the top of this comment to get events and make sure everything is working as expected.

In another terminal on system R, go back to your faucet directory and do the following:

$ pip3 install -r adapters/vendors/rabbitmq/requirements.txt
$ python3 adapters/vendors/rabbitmq/example_consumer.py

At this point it will look like it's doing nothing, and that's because Faucet isn't connected to any switches yet for events to happen.
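
If you'd rather poke at it without the stock example_consumer.py, a roughly equivalent minimal consumer is sketched below. This is just an illustration, not the project's script - it assumes a recent pika (1.x call signatures), the topic_recs exchange and FAUCET.Event routing key that show up in the Poseidon config later in this thread, and that you run it on system R where the RabbitMQ server is listening on localhost:

import json
import pika

# Connect to the RabbitMQ server started from docker-compose-rabbitmq.yaml
connection = pika.BlockingConnection(pika.ConnectionParameters(host='localhost'))
channel = connection.channel()

# Faucet's rabbitmq adapter publishes to a topic exchange ('topic_recs' by default)
# using the routing key 'FAUCET.Event'
channel.exchange_declare(exchange='topic_recs', exchange_type='topic')
result = channel.queue_declare(queue='', exclusive=True)
channel.queue_bind(exchange='topic_recs', queue=result.method.queue,
                   routing_key='FAUCET.Event')

def on_message(ch, method, properties, body):
    # Each message body is a JSON-encoded Faucet event (L2_LEARN, L2_EXPIRE, etc.)
    print(method.routing_key, json.loads(body))

channel.basic_consume(queue=result.method.queue, on_message_callback=on_message,
                      auto_ack=True)
channel.start_consuming()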

Back on system F let's take a quick look at the faucet logs:

$ tail /var/log/faucet/faucet.log 
Jun 20 16:07:36 faucet INFO     will automatically reload new config on changes
Jun 20 16:07:36 faucet INFO     Reloading configuration
Jun 20 16:07:36 faucet INFO     configuration /etc/faucet/faucet.yaml changed, analyzing differences
Jun 20 16:07:36 faucet INFO     Add new datapath DPID 1 (0x1)
Jun 20 16:07:37 faucet INFO     event client connected
Jun 20 16:08:13 faucet INFO     will automatically reload new config on changes
Jun 20 16:08:13 faucet INFO     Reloading configuration
Jun 20 16:08:13 faucet INFO     configuration /etc/faucet/faucet.yaml changed, analyzing differences
Jun 20 16:08:13 faucet INFO     Add new datapath DPID 1 (0x1)
Jun 20 16:08:14 faucet INFO     event client connected

Note, the last line is what tells you from Faucet's perspective that it's connected to RabbitMQ.

Let's go ahead and use mininet to connect a switch. First off though, here is the /etc/faucet/faucet.yaml I'm using:

$ cat /etc/faucet/faucet.yaml 
dps:
  switch-1:
    dp_id: 0x1
    hardware: Open vSwitch
    interfaces:
      1:
        native_vlan: 8
      2:
        native_vlan: 8
      3:
        description: mirror
        output_only: true
vlans:
  8:

To start mininet on system F we'll do the following:

$ sudo mn --topo single,3 --mac --switch ovsk --controller=remote,ip=127.0.0.1:6653
*** Creating network
*** Adding controller
*** Adding hosts:
h1 h2 h3 
*** Adding switches:
s1 
*** Adding links:
(h1, s1) (h2, s1) (h3, s1) 
*** Configuring hosts
h1 h2 h3 
*** Starting controller
c0 
*** Starting 1 switches
s1 ...
*** Starting CLI:
mininet> 

Once that happens, you should see two things happen in the logs. The Faucet log file on system F should do something like this:

Jun 20 16:18:40 faucet.valve INFO     DPID 1 (0x1) switch-1 all ports changed
Jun 20 16:19:43 faucet.valve INFO     DPID 1 (0x1) switch-1 Cold start configuring DP
Jun 20 16:19:43 faucet.valve INFO     DPID 1 (0x1) switch-1 Configuring VLAN 8 vid:8 ports:Port 1,Port 2
Jun 20 16:19:43 faucet.valve INFO     DPID 1 (0x1) switch-1 Port 1 configured
Jun 20 16:19:43 faucet.valve INFO     DPID 1 (0x1) switch-1 Port 2 configured
Jun 20 16:19:43 faucet.valve INFO     DPID 1 (0x1) switch-1 Port 3 configured
Jun 20 16:19:44 faucet.valve INFO     DPID 1 (0x1) switch-1 L2 learned 00:00:00:00:00:01 (L2 type 0x86dd, L3 src None, L3 dst None) on Port 1 on VLAN 8 (1 hosts total)
Jun 20 16:19:44 faucet.valve INFO     DPID 1 (0x1) switch-1 L2 learned 00:00:00:00:00:02 (L2 type 0x86dd, L3 src None, L3 dst None) on Port 2 on VLAN 8 (2 hosts total)

and the example_consumer.py program on system R should have output something like this:

$ python3 adapters/vendors/rabbitmq/example_consumer.py 
 [X] 2018-06-20 16:18:55.344989 UTC 'FAUCET.Event':b'{"version": 1, "time": 1529511520.909817, "dp_id": 1, "dp_name": "switch-1", "event_id": 1, "CONFIG_CHANGE": {"restart_type": "none"}}'
 [X] 2018-06-20 16:19:58.295413 UTC 'FAUCET.Event':b'{"version": 1, "time": 1529511583.8685849, "dp_id": 1, "dp_name": "switch-1", "event_id": 2, "DP_CHANGE": {"reason": "cold_start"}}'
 [X] 2018-06-20 16:19:58.295987 UTC 'FAUCET.Event':b'{"version": 1, "time": 1529511583.8716958, "dp_id": 1, "dp_name": "switch-1", "event_id": 3, "PORTS_STATUS": {"1": true, "2": true, "3": true}}'
 [X] 2018-06-20 16:19:58.604069 UTC 'FAUCET.Event':b'{"version": 1, "time": 1529511584.195025, "dp_id": 1, "dp_name": "switch-1", "event_id": 4, "L2_LEARN": {"port_no": 1, "previous_port_no": null, "vid": 8, "eth_src": "00:00:00:00:00:01", "eth_type": 34525, "l3_src_ip": null, "l3_dst_ip": null}}'
 [X] 2018-06-20 16:19:58.764123 UTC 'FAUCET.Event':b'{"version": 1, "time": 1529511584.3553207, "dp_id": 1, "dp_name": "switch-1", "event_id": 5, "L2_LEARN": {"port_no": 2, "previous_port_no": null, "vid": 8, "eth_src": "00:00:00:00:00:02", "eth_type": 34525, "l3_src_ip": null, "l3_dst_ip": null}}'

If all of that worked, then it's ready to work with Poseidon. The only changes needed are the following. Keep system F the way it is, except that if you're still going to use mininet, then either Poseidon needs to run on the same system, or the ports of that software switch need to be exposed to real NICs that Poseidon on the other system can get to - ideally a real switch. Note Poseidon can't run on an RPi at this time.
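
For completeness, if you stay with mininet/OVS and want the mirror port tied to a real NIC that the Poseidon host can reach, one way (illustrative only - it assumes you build the topology with just two mininet hosts, e.g. --topo single,2, so OpenFlow port 3 is free, and eth1 is a placeholder for whatever spare interface you actually have) is to add that interface to the OVS bridge with the matching OpenFlow port number:

$ sudo ovs-vsctl add-port s1 eth1 -- set interface eth1 ofport_request=3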

On system R kill both the RabbitMQ server and the example_consumer.py program. Then perform the following:

$ export controller_type=faucet
$ export controller_uri=192.168.0.8
$ export controller_mirror_ports='{"switch-1":3}' # note the 'output_only' for port 3 on switch-1 in the faucet.yaml
$ export collector_nic=s1-eth3  # this is the NIC that needs to be accessible on system `R` that is getting traffic from the `switch-1 port 3` connected to Faucet on system `F`
$ git clone https://github.com/cyberreboot/poseidon
$ cd poseidon
$ ./helpers/run
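
A quick sanity check that the collector NIC is actually seeing the mirrored traffic (before digging any further into Poseidon) is to sniff it directly - assuming tcpdump is installed, something like:

$ sudo tcpdump -ni s1-eth3 -c 10

Once mirroring is in place and the hosts start talking (the ping described below, for example), copies of their packets should show up there; if nothing ever does, the mirror/NIC plumbing is the first thing to check.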

Once Poseidon is up, you'll need to get Faucet to learn at L3 - note above it only learned at L2. To do that I simply wait for Faucet to expire the L2 learned hosts - watch the faucet.log for this:

Jun 20 16:33:47 faucet.valve INFO     DPID 1 (0x1) switch-1 0 recently active hosts on VLAN 8, expired [00:00:00:00:00:02 on Port 2]

At which point, in mn I do xterm h1 and from that terminal do ping 10.0.0.2 (the default IP of h2); back in the faucet.log you should see it learn both hosts with L2 and L3 like this:

Jun 20 16:35:26 faucet.valve INFO     DPID 1 (0x1) switch-1 L2 learned 00:00:00:00:00:01 (L2 type 0x0806, L3 src 10.0.0.1, L3 dst 10.0.0.2) on Port 1 on VLAN 8 (1 hosts total)
Jun 20 16:35:26 faucet.valve INFO     DPID 1 (0x1) switch-1 L2 learned 00:00:00:00:00:02 (L2 type 0x0806, L3 src 10.0.0.2, L3 dst 10.0.0.1) on Port 2 on VLAN 8 (2 hosts total)

You don't want to do the ping to learn at L3 until Poseidon is up, though; otherwise Faucet will learn about the hosts and send the event before Poseidon is ready to receive it.
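
If you don't have an X display handy for xterm, the same L3 learn can be triggered straight from the mininet prompt (standard mininet CLI, nothing Poseidon-specific):

mininet> h1 ping -c 3 10.0.0.2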

Assuming you do that L3 learn step once Poseidon is up and running (otherwise you'll have to wait for a re-learn; I think the default is 5 minutes), you'll see output from Poseidon that looks like this (I'm using DEBUG mode):

2018-06-20T09:41:08+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:08,274 - DEBUG - poseidonMonitor:400 - Monitor:config:{'config': 'True', 'logging_file': 'logging.json', 'logger_level': 'DEBUG', 'reinvestigation_frequency': '900', 'max_concurrent_reinvestigations': '2', 'scan_frequency': '5', 'rabbit_server': 'RABBIT_SERVER', 'rabbit_port': '5672', 'collector_nic': 's1-eth3', 'collector_interval': '900', 'vent_ip': 'vent_ip', 'vent_port': '8080', 'FA_RABBIT_ENABLED': 'True', 'FA_RABBIT_HOST': 'RABBIT_SERVER', 'FA_RABBIT_PORT': '5672', 'FA_RABBIT_EXCHANGE': 'topic_recs', 'FA_RABBIT_EXCHANGE_TYPE': 'topic', 'FA_RABBIT_ROUTING_KEY': 'FAUCET.Event'}
2018-06-20T09:41:08+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:08,274 - DEBUG - poseidonMonitor:312 - handler Config
2018-06-20T09:41:08+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:08,275 - INFO - Config:61  - Config:configure
2018-06-20T09:41:08+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:08,275 - DEBUG - Config:72  - mod_name:Config |mod_configuration: {'config': 'True'}
2018-06-20T09:41:08+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:08,275 - DEBUG - Monitor_Action_Base:87  - about to configure Handle_SectionConfig
2018-06-20T09:41:08+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:08,275 - INFO - Monitor_Helper_Base:55  - Handle_SectionConfig configure()
2018-06-20T09:41:08+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:08,275 - DEBUG - Monitor_Action_Base:87  - about to configure Handle_FieldConfig
2018-06-20T09:41:08+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:08,276 - INFO - Monitor_Helper_Base:55  - Handle_FieldConfig configure()
2018-06-20T09:41:08+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:08,277 - DEBUG - Monitor_Action_Base:87  - about to configure Handle_FullConfig
2018-06-20T09:41:08+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:08,278 - INFO - Monitor_Helper_Base:55  - Handle_FullConfig configure()
2018-06-20T09:41:08+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:08,278 - DEBUG - poseidonMonitor:323 - handler NorthBoundControllerAbstraction
2018-06-20T09:41:08+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:08,279 - DEBUG - Monitor_Action_Base:63  - NorthBoundControllerAbstraction Base:configure
2018-06-20T09:41:08+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:08,279 - DEBUG - Monitor_Action_Base:66  - NorthBoundControllerAbstraction configure:owner
2018-06-20T09:41:08+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:08,280 - DEBUG - Monitor_Action_Base:76  - NorthBoundControllerAbstraction,NorthBoundControllerAbstraction:{'config': 'True'}
2018-06-20T09:41:08+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:08,280 - DEBUG - Monitor_Action_Base:87  - about to configure Update_Switch_State
2018-06-20T09:41:08+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:08,280 - INFO - Monitor_Helper_Base:55  - Update_Switch_State configure()
2018-06-20T09:41:08+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:08,281 - DEBUG - poseidonMonitor:334 - ----------------------
2018-06-20T09:41:08+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:08,320 - DEBUG - Rabbit_Base:60  - connected to RABBIT_SERVER rabbitmq...
2018-06-20T09:41:08+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:08,321 - DEBUG - Rabbit_Base:76  - array adding key:poseidon.algos.# to rabbitmq channel
2018-06-20T09:41:08+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:08,323 - DEBUG - Rabbit_Base:76  - array adding key:poseidon.action.# to rabbitmq channel
2018-06-20T09:41:08+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:08,326 - DEBUG - Rabbit_Base:92  - about to start channel <BlockingChannel impl=<Channel number=1 OPEN conn=<SelectConnection OPEN socket=('172.17.0.15', 50934)->('172.17.0.3', 5672) params=<ConnectionParameters host=RABBIT_SERVER port=5672 virtual_host=/ ssl=False>>>>
2018-06-20T09:41:08+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:08,353 - DEBUG - Rabbit_Base:60  - connected to RABBIT_SERVER rabbitmq...
2018-06-20T09:41:08+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:08,353 - DEBUG - Rabbit_Base:76  - array adding key:FAUCET.Event.# to rabbitmq channel
2018-06-20T09:41:08+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:08,355 - DEBUG - Rabbit_Base:92  - about to start channel <BlockingChannel impl=<Channel number=1 OPEN conn=<SelectConnection OPEN socket=('172.17.0.15', 50938)->('172.17.0.3', 5672) params=<ConnectionParameters host=RABBIT_SERVER port=5672 virtual_host=/ ssl=False>>>>
2018-06-20T09:41:08+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:08,357 - DEBUG - poseidonMonitor:225 - starting thread_worker
2018-06-20T09:41:08+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:08,358 - DEBUG - poseidonMonitor:584 - ***************CTRL_C:{'STOP': False}
2018-06-20T09:41:09+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:09,359 - DEBUG - poseidonMonitor:233 - scheduler woke st_worker
2018-06-20T09:41:09+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:09,360 - DEBUG - poseidonMonitor:586 - woke from sleeping
2018-06-20T09:41:09+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:09,360 - DEBUG - poseidonMonitor:584 - ***************CTRL_C:{'STOP': False}
2018-06-20T09:41:10+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:10,361 - DEBUG - poseidonMonitor:233 - scheduler woke st_worker
2018-06-20T09:41:10+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:10,362 - DEBUG - poseidonMonitor:586 - woke from sleeping
2018-06-20T09:41:10+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:10,362 - DEBUG - poseidonMonitor:584 - ***************CTRL_C:{'STOP': False}
2018-06-20T09:41:11+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:11,363 - DEBUG - poseidonMonitor:233 - scheduler woke st_worker
2018-06-20T09:41:11+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:11,364 - DEBUG - poseidonMonitor:586 - woke from sleeping
2018-06-20T09:41:11+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:11,364 - DEBUG - poseidonMonitor:584 - ***************CTRL_C:{'STOP': False}
2018-06-20T09:41:12+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:12,363 - DEBUG - poseidonMonitor:233 - scheduler woke st_worker
2018-06-20T09:41:12+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:12,366 - DEBUG - poseidonMonitor:586 - woke from sleeping
2018-06-20T09:41:12+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:12,366 - DEBUG - poseidonMonitor:584 - ***************CTRL_C:{'STOP': False}
2018-06-20T09:41:13+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:13,364 - DEBUG - poseidonMonitor:233 - scheduler woke st_worker
2018-06-20T09:41:13+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:13,366 - DEBUG - poseidonMonitor:68  - kick
2018-06-20T09:41:13+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:13,367 - DEBUG - faucet:92  - get_endpoints found:
2018-06-20T09:41:13+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:13,367 - DEBUG - poseidonMonitor:586 - woke from sleeping
2018-06-20T09:41:13+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:13,369 - DEBUG - poseidonMonitor:584 - ***************CTRL_C:{'STOP': False}
2018-06-20T09:41:13+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:13,368 - DEBUG - UpdateSwitchState:235 - MACHINES:[]
2018-06-20T09:41:13+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:13,371 - INFO - EndpointWrapper:140 - ====START
2018-06-20T09:41:13+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:13,371 - INFO - EndpointWrapper:103 - *******KNOWN*********
2018-06-20T09:41:13+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:13,372 - INFO - EndpointWrapper:135 - None
2018-06-20T09:41:13+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:13,372 - INFO - EndpointWrapper:103 - *******UNKNOWN*********
2018-06-20T09:41:13+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:13,372 - INFO - EndpointWrapper:135 - None
2018-06-20T09:41:13+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:13,372 - INFO - EndpointWrapper:103 - *******MIRRORING*********
2018-06-20T09:41:13+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:13,372 - INFO - EndpointWrapper:135 - None
2018-06-20T09:41:13+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:13,373 - INFO - EndpointWrapper:103 - *******SHUTDOWN*********
2018-06-20T09:41:13+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:13,373 - INFO - EndpointWrapper:135 - None
2018-06-20T09:41:13+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:13,373 - INFO - EndpointWrapper:103 - *******REINVESTIGATING*********
2018-06-20T09:41:13+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:13,373 - INFO - EndpointWrapper:135 - None
2018-06-20T09:41:13+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:13,373 - INFO - EndpointWrapper:144 - ****************
2018-06-20T09:41:13+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:13,374 - INFO - EndpointWrapper:145 - ====STOP
2018-06-20T09:41:14+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:14,370 - DEBUG - poseidonMonitor:586 - woke from sleeping
2018-06-20T09:41:14+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:14,371 - DEBUG - poseidonMonitor:584 - ***************CTRL_C:{'STOP': False}
2018-06-20T09:41:14+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:14,410 - DEBUG - poseidonMonitor:233 - scheduler woke st_worker
2018-06-20T09:41:15+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:15,373 - DEBUG - poseidonMonitor:586 - woke from sleeping
2018-06-20T09:41:15+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:15,374 - DEBUG - poseidonMonitor:584 - ***************CTRL_C:{'STOP': False}
2018-06-20T09:41:15+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:15,411 - DEBUG - poseidonMonitor:233 - scheduler woke st_worker
2018-06-20T09:41:16+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:16,375 - DEBUG - poseidonMonitor:586 - woke from sleeping
2018-06-20T09:41:16+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:16,376 - DEBUG - poseidonMonitor:584 - ***************CTRL_C:{'STOP': False}
2018-06-20T09:41:16+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:16,411 - DEBUG - poseidonMonitor:233 - scheduler woke st_worker
2018-06-20T09:41:16+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:16,846 - DEBUG - poseidonMonitor:213 - got a message: FAUCET.Event:b'{"version": 1, "time": 1529512427.0744884, "dp_id": 1, "dp_name": "switch-1", "event_id": 11, "L2_EXPIRE": {"port_no": 2, "vid": 8, "eth_src": "00:00:00:00:00:02"}}':<class 'bytes'>
2018-06-20T09:41:16+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:16,847 - DEBUG - poseidonMonitor:213 - got a message: FAUCET.Event:b'{"version": 1, "time": 1529512526.1242833, "dp_id": 1, "dp_name": "switch-1", "event_id": 12, "L2_LEARN": {"port_no": 1, "previous_port_no": null, "vid": 8, "eth_src": "00:00:00:00:00:01", "eth_type": 2054, "l3_src_ip": "10.0.0.1", "l3_dst_ip": "10.0.0.2"}}':<class 'bytes'>
2018-06-20T09:41:16+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:16,848 - DEBUG - poseidonMonitor:213 - got a message: FAUCET.Event:b'{"version": 1, "time": 1529512832.165341, "dp_id": 1, "dp_name": "switch-1", "event_id": 14, "L2_EXPIRE": {"port_no": 1, "vid": 8, "eth_src": "00:00:00:00:00:01"}}':<class 'bytes'>
2018-06-20T09:41:16+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:16,848 - DEBUG - poseidonMonitor:213 - got a message: FAUCET.Event:b'{"version": 1, "time": 1529512526.1286383, "dp_id": 1, "dp_name": "switch-1", "event_id": 13, "L2_LEARN": {"port_no": 2, "previous_port_no": null, "vid": 8, "eth_src": "00:00:00:00:00:02", "eth_type": 2054, "l3_src_ip": "10.0.0.2", "l3_dst_ip": "10.0.0.1"}}':<class 'bytes'>
2018-06-20T09:41:16+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:16,849 - DEBUG - poseidonMonitor:213 - got a message: FAUCET.Event:b'{"version": 1, "time": 1529512832.1655495, "dp_id": 1, "dp_name": "switch-1", "event_id": 15, "L2_EXPIRE": {"port_no": 2, "vid": 8, "eth_src": "00:00:00:00:00:02"}}':<class 'bytes'>
2018-06-20T09:41:17+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:17,377 - DEBUG - poseidonMonitor:586 - woke from sleeping
2018-06-20T09:41:17+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:17,378 - DEBUG - poseidonMonitor:553 - rabbit_message:b'{"version": 1, "time": 1529512427.0744884, "dp_id": 1, "dp_name": "switch-1", "event_id": 11, "L2_EXPIRE": {"port_no": 2, "vid": 8, "eth_src": "00:00:00:00:00:02"}}'
2018-06-20T09:41:17+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:17,378 - DEBUG - poseidonMonitor:556 - routing_key:FAUCET.Event
2018-06-20T09:41:17+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:17,379 - DEBUG - poseidonMonitor:563 - FAUCET Event:{'version': 1, 'time': 1529512427.0744884, 'dp_id': 1, 'dp_name': 'switch-1', 'event_id': 11, 'L2_EXPIRE': {'port_no': 2, 'vid': 8, 'eth_src': '00:00:00:00:00:02'}}
2018-06-20T09:41:17+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:17,379 - DEBUG - poseidonMonitor:599 - 
2018-06-20T09:41:17+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 
2018-06-20T09:41:17+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 
2018-06-20T09:41:17+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: **********************
2018-06-20T09:41:17+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:17,381 - DEBUG - poseidonMonitor:600 - faucet_event:[{'version': 1, 'time': 1529512427.0744884, 'dp_id': 1, 'dp_name': 'switch-1', 'event_id': 11, 'L2_EXPIRE': {'port_no': 2, 'vid': 8, 'eth_src': '00:00:00:00:00:02'}}]
2018-06-20T09:41:17+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:17,382 - DEBUG - poseidonMonitor:601 - **********************
2018-06-20T09:41:17+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 
2018-06-20T09:41:17+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 
2018-06-20T09:41:17+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 
2018-06-20T09:41:17+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:17,382 - DEBUG - poseidonMonitor:584 - ***************CTRL_C:{'STOP': False}
2018-06-20T09:41:17+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:17,413 - DEBUG - poseidonMonitor:233 - scheduler woke st_worker
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:18,384 - DEBUG - poseidonMonitor:586 - woke from sleeping
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:18,384 - DEBUG - poseidonMonitor:553 - rabbit_message:b'{"version": 1, "time": 1529512526.1242833, "dp_id": 1, "dp_name": "switch-1", "event_id": 12, "L2_LEARN": {"port_no": 1, "previous_port_no": null, "vid": 8, "eth_src": "00:00:00:00:00:01", "eth_type": 2054, "l3_src_ip": "10.0.0.1", "l3_dst_ip": "10.0.0.2"}}'
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:18,385 - DEBUG - poseidonMonitor:556 - routing_key:FAUCET.Event
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:18,386 - DEBUG - poseidonMonitor:563 - FAUCET Event:{'version': 1, 'time': 1529512526.1242833, 'dp_id': 1, 'dp_name': 'switch-1', 'event_id': 12, 'L2_LEARN': {'port_no': 1, 'previous_port_no': None, 'vid': 8, 'eth_src': '00:00:00:00:00:01', 'eth_type': 2054, 'l3_src_ip': '10.0.0.1', 'l3_dst_ip': '10.0.0.2'}}
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:18,386 - DEBUG - poseidonMonitor:599 - 
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: **********************
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:18,387 - DEBUG - poseidonMonitor:600 - faucet_event:[{'version': 1, 'time': 1529512427.0744884, 'dp_id': 1, 'dp_name': 'switch-1', 'event_id': 11, 'L2_EXPIRE': {'port_no': 2, 'vid': 8, 'eth_src': '00:00:00:00:00:02'}}, {'version': 1, 'time': 1529512526.1242833, 'dp_id': 1, 'dp_name': 'switch-1', 'event_id': 12, 'L2_LEARN': {'port_no': 1, 'previous_port_no': None, 'vid': 8, 'eth_src': '00:00:00:00:00:01', 'eth_type': 2054, 'l3_src_ip': '10.0.0.1', 'l3_dst_ip': '10.0.0.2'}}]
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:18,387 - DEBUG - poseidonMonitor:601 - **********************
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:18,388 - DEBUG - poseidonMonitor:584 - ***************CTRL_C:{'STOP': False}
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:18,414 - DEBUG - poseidonMonitor:233 - scheduler woke st_worker
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:18,415 - DEBUG - poseidonMonitor:68  - kick
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:18,415 - DEBUG - faucet:79  - faucet messages: [{'version': 1, 'time': 1529512427.0744884, 'dp_id': 1, 'dp_name': 'switch-1', 'event_id': 11, 'L2_EXPIRE': {'port_no': 2, 'vid': 8, 'eth_src': '00:00:00:00:00:02'}}, {'version': 1, 'time': 1529512526.1242833, 'dp_id': 1, 'dp_name': 'switch-1', 'event_id': 12, 'L2_LEARN': {'port_no': 1, 'previous_port_no': None, 'vid': 8, 'eth_src': '00:00:00:00:00:01', 'eth_type': 2054, 'l3_src_ip': '10.0.0.1', 'l3_dst_ip': '10.0.0.2'}}]
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:18,416 - DEBUG - faucet:82  - l2 faucet message: {'version': 1, 'time': 1529512427.0744884, 'dp_id': 1, 'dp_name': 'switch-1', 'event_id': 11, 'L2_EXPIRE': {'port_no': 2, 'vid': 8, 'eth_src': '00:00:00:00:00:02'}}
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:18,416 - DEBUG - parser:170 - got faucet message for l2_expire: {'version': 1, 'time': 1529512427.0744884, 'dp_id': 1, 'dp_name': 'switch-1', 'event_id': 11, 'L2_EXPIRE': {'port_no': 2, 'vid': 8, 'eth_src': '00:00:00:00:00:02'}}
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:18,416 - DEBUG - faucet:82  - l2 faucet message: {'version': 1, 'time': 1529512427.0744884, 'dp_id': 1, 'dp_name': 'switch-1', 'event_id': 11, 'L2_EXPIRE': {'port_no': 2, 'vid': 8, 'eth_src': '00:00:00:00:00:02'}}
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:18,416 - DEBUG - parser:170 - got faucet message for l2_expire: {'version': 1, 'time': 1529512427.0744884, 'dp_id': 1, 'dp_name': 'switch-1', 'event_id': 11, 'L2_EXPIRE': {'port_no': 2, 'vid': 8, 'eth_src': '00:00:00:00:00:02'}}
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:18,416 - DEBUG - faucet:82  - l2 faucet message: {'version': 1, 'time': 1529512526.1242833, 'dp_id': 1, 'dp_name': 'switch-1', 'event_id': 12, 'L2_LEARN': {'port_no': 1, 'previous_port_no': None, 'vid': 8, 'eth_src': '00:00:00:00:00:01', 'eth_type': 2054, 'l3_src_ip': '10.0.0.1', 'l3_dst_ip': '10.0.0.2'}}
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:18,416 - DEBUG - parser:151 - got faucet message for l2_learn: {'version': 1, 'time': 1529512526.1242833, 'dp_id': 1, 'dp_name': 'switch-1', 'event_id': 12, 'L2_LEARN': {'port_no': 1, 'previous_port_no': None, 'vid': 8, 'eth_src': '00:00:00:00:00:01', 'eth_type': 2054, 'l3_src_ip': '10.0.0.1', 'l3_dst_ip': '10.0.0.2'}}
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:18,416 - DEBUG - faucet:92  - get_endpoints found:
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:18,416 - DEBUG - faucet:112 - 00:00:00:00:00:01:[{'ip-address': '10.0.0.1', 'ip-state': 'L2 learned', 'mac': '00:00:00:00:00:01', 'segment': '1', 'port': '1', 'tenant': 'VLAN8', 'active': 1}]
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:18,416 - DEBUG - UpdateSwitchState:235 - MACHINES:[{'ip-address': '10.0.0.1', 'mac': '00:00:00:00:00:01', 'segment': '1', 'port': '1', 'tenant': 'VLAN8', 'active': 1, 'name': None}]
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:18,417 - DEBUG - UpdateSwitchState:204 - ***** detected new address {'ip-address': '10.0.0.1', 'mac': '00:00:00:00:00:01', 'segment': '1', 'port': '1', 'tenant': 'VLAN8', 'active': 1, 'name': None}
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:18,417 - INFO - EndpointWrapper:140 - ====START
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:18,417 - INFO - EndpointWrapper:103 - *******KNOWN*********
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:18,417 - INFO - EndpointWrapper:135 - None
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:18,417 - INFO - EndpointWrapper:103 - *******UNKNOWN*********
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:18,417 - INFO - EndpointWrapper:131 - U:9200f6e0192d4e4197210d7180519129d9b71c4a:UNKNOWN->NONE:{'mac': '00:00:00:00:00:01', 'ip': '10.0.0.1', 's': '1', 'p': '1', 'v': 'VLAN8'}
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:18,418 - DEBUG - EndpointWrapper:88  - vent update payload: {'id': '9200f6e0192d4e4197210d7180519129d9b71c4a', 'metadata': "{'prev_state': 'None', 'state': 'UNKNOWN', 'next_state': 'NONE', 'transition_time': '1529512878.4170702', 'endpoint_data': {'ip-address': '10.0.0.1', 'mac': '00:00:00:00:00:01', 'segment': '1', 'port': '1', 'tenant': 'VLAN8', 'active': 1, 'name': None}}"}
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:18,442 - DEBUG - EndpointWrapper:96  - collector update response: (True, 'successfully updated filter: 9200f6e0192d4e4197210d7180519129d9b71c4a')
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:18,442 - INFO - EndpointWrapper:103 - *******MIRRORING*********
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:18,443 - INFO - EndpointWrapper:135 - None
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:18,443 - INFO - EndpointWrapper:103 - *******SHUTDOWN*********
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:18,443 - INFO - EndpointWrapper:135 - None
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:18,443 - INFO - EndpointWrapper:103 - *******REINVESTIGATING*********
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:18,444 - INFO - EndpointWrapper:135 - None
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:18,444 - INFO - EndpointWrapper:144 - ****************
2018-06-20T09:41:18+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:18,444 - INFO - EndpointWrapper:145 - ====STOP
2018-06-20T09:41:19+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:19,389 - DEBUG - poseidonMonitor:586 - woke from sleeping
2018-06-20T09:41:19+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:19,390 - DEBUG - poseidonMonitor:553 - rabbit_message:b'{"version": 1, "time": 1529512832.165341, "dp_id": 1, "dp_name": "switch-1", "event_id": 14, "L2_EXPIRE": {"port_no": 1, "vid": 8, "eth_src": "00:00:00:00:00:01"}}'
2018-06-20T09:41:19+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:19,391 - DEBUG - poseidonMonitor:556 - routing_key:FAUCET.Event
2018-06-20T09:41:19+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:19,391 - DEBUG - poseidonMonitor:563 - FAUCET Event:{'version': 1, 'time': 1529512832.165341, 'dp_id': 1, 'dp_name': 'switch-1', 'event_id': 14, 'L2_EXPIRE': {'port_no': 1, 'vid': 8, 'eth_src': '00:00:00:00:00:01'}}
2018-06-20T09:41:19+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:19,392 - DEBUG - poseidonMonitor:599 - 
2018-06-20T09:41:19+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 
2018-06-20T09:41:19+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 
2018-06-20T09:41:19+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: **********************
2018-06-20T09:41:19+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:19,392 - DEBUG - poseidonMonitor:600 - faucet_event:[{'version': 1, 'time': 1529512832.165341, 'dp_id': 1, 'dp_name': 'switch-1', 'event_id': 14, 'L2_EXPIRE': {'port_no': 1, 'vid': 8, 'eth_src': '00:00:00:00:00:01'}}]
2018-06-20T09:41:19+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 2018-06-20 16:41:19,392 - DEBUG - poseidonMonitor:601 - **********************
2018-06-20T09:41:19+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 
2018-06-20T09:41:19+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 
2018-06-20T09:41:19+00:00 172.17.0.1 cyberreboot-poseidon--poseidon-master-HEAD/59f75f1133bf/plugin[1428]: 

That's it. After it mirrors traffic, it will put the endpoint from an unknown state into a mirroring state, which will spin up a bunch of containers to do the ML portion; those will all get logged in that same place (they'll have their own container names and IDs in the logs to make it easy to keep them separated). The ML results will then get fed back with predictions of the role and behavior of the endpoint, which will again show up in the logs there.
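If you want to keep an eye on those ML containers as they come and go, one low-tech way is to poll docker ps and tail whichever container you're interested in. The names below are placeholders; the actual container names depend on your vent setup:

$ watch -n 5 'docker ps --format "{{.Names}}\t{{.Status}}"'
$ docker logs -f <one-of-the-names-from-the-list>

They're short-lived, so don't be surprised if they disappear from docker ps once their stage finishes.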

As @gregs5 mentioned, we're working on making this deployment easier and more straightforward in the near term, so apologies for it being rough around the edges; I hope this helps a bit.

cglewis commented 6 years ago

also, @scottkelso I realize your issue kind of got hijacked, but let me know if you're still running into issues and how we can help.

scottkelso commented 6 years ago

Don't worry @cglewis - @toiletduck123 and I are working on the same project essentially. He is trying to create the setup physically and I am trying to do it virtually with mininet.

The reason I haven't replied is that I am trying to figure out how to get the mininet switch to mirror traffic outside of mininet, to the host machine that Poseidon and Faucet are both running on. I am also using docker-mininet.

Once I confirm that traffic is getting to the host machine, then I can begin verifying that Poseidon is up and running correctly.

j543453 commented 6 years ago

Thank you very much for taking the time to write out that comment, it was extremely helpful and Poseidon has now detected the endpoints! Rabbitmq worked perfectly on the other machine (Poseidon machine) and the consumer gave the desired output.

I believe my problem was simply the order in which I carried out the setup, like you said, "You don't want to do the ping to learn at L3 until Poseidon is up though, otherwise Faucet will learn about them and send the event before Poseidon is ready to receive it."

I had traffic flowing from the get-go (before Poseidon was running), as I assumed I would want the traffic flowing while Poseidon starts up so it would detect it.

Again, thank you, onto the Machine Learning!

j543453 commented 6 years ago

One thing I notice as I try to replicate this process numerous times: when I run Faucet, turn the switch on, and then connect the two clients, the L3 src and dst only sometimes equal None in the Faucet logs. Many times they show an IP address or a hex number, so I have to keep unplugging and replugging until they are None. What would the explanation be for this?

edit: Trying the whole thing from the beginning, and now rabbitmq isn't picking anything up when I build Faucet, and Poseidon shows no endpoints. Something is being very temperamental on my side and I am trying to figure out what it is. The Faucet logs look fine.

cglewis commented 6 years ago

Great, glad to hear you got it up and running!

So, as for the L3 src/dst, that would be a better question for the faucetsdn/faucet repo. There is an issue open for supporting L2 learning with Poseidon here: #559, which would alleviate this issue. However, the learning and expiration events are just consumed as whatever Faucet sends. I believe the default expiration in Faucet is 5 minutes (there's a setting you can change in faucet.yaml, I believe). Upon expiration it relearns when new traffic from that host comes across, which may or may not be a packet with L3. If it's a packet at only L2, Faucet will learn the L2 and mark the L3 as None, or it could also learn it as a broadcast and set the L3 to 0.0.0.0. Poseidon is currently set to ignore all of those cases since it is focused on identifying roles of devices, where a device is an L3 IP address.
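If the expiry interval itself is getting in your way while testing, I believe the relevant knob is the dp-level timeout in faucet.yaml (seconds before an inactive host is expired). Treat the value below as a sketch and check the Faucet docs for your version:

$ cat /etc/faucet/faucet.yaml
dps:
  switch-1:
    dp_id: 0x1
    hardware: Open vSwitch
    timeout: 600          # hypothetical value: expire inactive hosts after ~10 minutes instead of the ~5 minute default
    interfaces:
      1:
        native_vlan: 8
      2:
        native_vlan: 8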

cglewis commented 6 years ago

For this particular issue (see the original problem at the top), has it been resolved? @scottkelso @toiletduck123 you're welcome to open more issues, I just want to verify that the original problem has been solved. Thanks!

scottkelso commented 6 years ago

Both @toiletduck123 and I are confident in saying yes, the original problem of Poseidon not building is resolved. I would, however, be grateful if you could tell me how your mininet h3 (the host your ovsk switch is mirroring traffic to from port 3) exposes that mirrored traffic to a machine with Poseidon running on it (specifically the machine which is hosting mininet).

cglewis commented 6 years ago

@scottkelso sure no problem. So assuming you have Poseidon running on the same host that you're running mininet on, if you start mininet with the mn command I mentioned above (sudo mn --topo single,3 --mac --switch ovsk --controller=remote,ip=127.0.0.1:6653) you'll see in your ifconfig something like the following:

s1-eth1   Link encap:Ethernet  HWaddr 8e:5d:f2:3d:0b:ad  
          inet6 addr: fe80::8c5d:f2ff:fe3d:bad/64 Scope:Link
          UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
          RX packets:6 errors:0 dropped:0 overruns:0 frame:0
          TX packets:13 errors:0 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:1000 
          RX bytes:556 (556.0 B)  TX bytes:1555 (1.5 KB)

s1-eth2   Link encap:Ethernet  HWaddr d6:28:91:2f:dd:bc  
          inet6 addr: fe80::d428:91ff:fe2f:ddbc/64 Scope:Link
          UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
          RX packets:7 errors:0 dropped:0 overruns:0 frame:0
          TX packets:11 errors:0 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:1000 
          RX bytes:646 (646.0 B)  TX bytes:1127 (1.1 KB)

s1-eth3   Link encap:Ethernet  HWaddr ea:4f:7a:13:4d:51  
          inet6 addr: fe80::e84f:7aff:fe13:4d51/64 Scope:Link
          UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
          RX packets:8 errors:0 dropped:0 overruns:0 frame:0
          TX packets:14 errors:0 dropped:0 overruns:0 carrier:0
          collisions:0 txqueuelen:1000 
          RX bytes:732 (732.0 B)  TX bytes:1757 (1.7 KB)

It created 3 new interfaces, one for each host on that switch: s1 is the switch, and eth1-3 are the hosts. So now, if you tell Faucet to set port 3 to be output_only in the faucet.yaml and you tell Poseidon to use port 3 as the mirror port for that switch (export controller_mirror_ports='{"switch-1":3}'), then when Faucet sends a learn event on, say, port 1 or port 2, Poseidon will tell Faucet to reconfigure port 3 to mirror the port the new event was learned on (look at your faucet.yaml once Poseidon has put an endpoint into the mirroring state and you'll see that Poseidon changed it):

For example you'll see faucet.yaml change to something like this:

$ cat /etc/faucet/faucet.yaml
dps:
  switch-1:
    dp_id: 0x1
    hardware: Open vSwitch
    interfaces:
      1:
        native_vlan: 8
      2:
        native_vlan: 8
      3:
        description: mirror
        mirror:
        - 2
        - 1
        output_only: true
vlans:
  8:

Since that traffic is now being mirrored to port 3 (host 3 on mininet), it will be seen on that interface from ifconfig, so we can have Poseidon hook into that interface to save off pcaps from the mirror and do the analysis. So in this case you want s1-eth3, and you set it with: export collector_nic=s1-eth3.
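A quick way to sanity check that the mirror is actually delivering packets before pointing Poseidon at it (assuming tcpdump is available on the mininet host):

$ sudo tcpdump -i s1-eth3 -nn -c 10

If you generate some traffic between h1 and h2 in mininet while an endpoint is in the mirroring state, you should see it show up on that interface.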

Note as in the above walk-through the exports have to be done before Poseidon starts for Poseidon to pick them up.

You can find and inspect the pcaps and all of the stages of data transformations and analysis that Poseidon captures in the vent_files directory which is stored in /opt by default.
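For example, to see what has been captured so far (path assumed to be the default /opt/vent_files mentioned above):

$ ls -lt /opt/vent_files | head
$ find /opt/vent_files -name '*.pcap' -mmin -10    # pcaps written in the last 10 minutes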

scottkelso commented 6 years ago

Ah yes, my collector_nic would have been wrong. I was initially seeing the same repeated endpoint output as @toiletduck123 but have managed to get it working. See steps below.

2018-06-19T09:36:15+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:15,063 - DEBUG - faucet:92  - get_endpoints found:
2018-06-19T09:36:15+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:15,063 - DEBUG - UpdateSwitchState:235 - MACHINES:[]
2018-06-19T09:36:16+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:16,034 - DEBUG - poseidonMonitor:586 - woke from sleeping
2018-06-19T09:36:16+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:16,034 - DEBUG - poseidonMonitor:584 - ***************CTRL_C:{'STOP': False}
2018-06-19T09:36:16+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:16,070 - DEBUG - poseidonMonitor:233 - scheduler woke st_worker
2018-06-19T09:36:17+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:17,035 - DEBUG - poseidonMonitor:586 - woke from sleeping
2018-06-19T09:36:17+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:17,036 - DEBUG - poseidonMonitor:584 - ***************CTRL_C:{'STOP': False}
2018-06-19T09:36:17+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:17,072 - DEBUG - poseidonMonitor:233 - scheduler woke st_worker
2018-06-19T09:36:18+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:18,037 - DEBUG - poseidonMonitor:586 - woke from sleeping
2018-06-19T09:36:18+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:18,037 - DEBUG - poseidonMonitor:584 - ***************CTRL_C:{'STOP': False}
2018-06-19T09:36:18+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:18,073 - DEBUG - poseidonMonitor:233 - scheduler woke st_worker
2018-06-19T09:36:19+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:19,038 - DEBUG - poseidonMonitor:586 - woke from sleeping
2018-06-19T09:36:19+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:19,038 - DEBUG - poseidonMonitor:584 - ***************CTRL_C:{'STOP': False}
2018-06-19T09:36:19+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:19,074 - DEBUG - poseidonMonitor:233 - scheduler woke st_worker
2018-06-19T09:36:20+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:20,039 - DEBUG - poseidonMonitor:586 - woke from sleeping
2018-06-19T09:36:20+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:20,039 - DEBUG - poseidonMonitor:584 - ***************CTRL_C:{'STOP': False}
2018-06-19T09:36:20+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:20,075 - DEBUG - poseidonMonitor:233 - scheduler woke st_worker
2018-06-19T09:36:20+00:00 172.17.0.1 plugin[1473]: 2018-06-19 08:36:20,076 - DEBUG - poseidonMonitor:68  - kick

I was still getting this output even when I did the steps in the following order...

The only difference here is that I haven't completed all of the RabbitMQ steps, because I already have a natively installed and running instance of Faucet, Gauge, Prometheus & Grafana-server (hence the Faucet restart command systemctl restart faucet.service). I wasn't sure how to export these Faucet configurations to this native Faucet...

$ export FAUCET_EVENT_SOCK=1
$ export FAUCET_CONFIG_STAT_RELOAD=1
$ export FA_RABBIT_HOST=192.168.0.7 #unnecessary because not running rabbitmq 

so I stopped them and did the docker-compose command as you said above instead...

$ systemctl stop faucet.service gauge.service prometheus.service grafana-server.service
$ cd faucet/
$ docker-compose -f docker-compose.yaml -f adapters/vendors/rabbitmq/docker-compose.yaml up --build -d

This eventually worked, as long as I still did everything in the particular order defined above.
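In case it's useful to anyone following along: once that docker-compose command returns, a quick way to confirm the rabbitmq adapter actually came up alongside Faucet is to list the compose services. Service names will vary with your checkout, so treat this as a sketch rather than the exact output to expect:

$ cd faucet/
$ docker-compose -f docker-compose.yaml -f adapters/vendors/rabbitmq/docker-compose.yaml ps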

Again thank you for your help and patience @cglewis!

cglewis commented 6 years ago

Yeah, that's a good question regarding putting environment variables into Faucet when it's running as a service; I suspect you'd have to edit the systemd unit for faucet.service to include the environment variables on service start. Going to go ahead and close this issue - thanks for sticking with it @scottkelso @jaiken06, and feel free to reach out if you run into other issues.
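For anyone who does want to keep running Faucet as a systemd service, a minimal (untested) sketch of that override would be something like the following, using the same environment variables mentioned earlier in this thread:

$ sudo systemctl edit faucet.service
# in the override file that opens, add:
[Service]
Environment=FAUCET_EVENT_SOCK=1
Environment=FAUCET_CONFIG_STAT_RELOAD=1
$ sudo systemctl daemon-reload
$ sudo systemctl restart faucet.service

That said, the docker-compose route described above is probably the simpler path.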