TU-Berlin-SNET / tresor-tracking

This project is used to manage feedback on the TRESOR ecosystem, giving partners issues-only access.

Install TRESOR Ecosystem via Docker following the readme: Problems #1

Closed ilke-zilci closed 8 years ago

ilke-zilci commented 9 years ago

................

Seems to be stuck at: esfilter_1 | 2015-04-07T14:26:47.433Z - info: SUCESS node-es-filter-proxy with target http://elasticsearch:9200 now listening on 0.0.0.0:4004

Should I try to "Setup test services in the TRESOR broker"? Or should I wait until the process ends (the process is "docker-compose up"; CPU usage is 0%)?

More at: https://github.com/cyclone-project/cyclone-tracking/issues/60

omer-ilhan commented 9 years ago

At that point the build process has finished and the ecosystem seems to be up and running as it should. Could you elaborate why you think it's stuck?

As for the warnings, the debconf warnings can be ignored but I could not replicate the "cannot parse SRV response" warning. Maybe only a temporary problem with the internet connection. Does it still happen?

lodygens commented 9 years ago

I supposed it was stuck because I thought it was the installation procedure only... But if you say it is running, I will now check all of that. Thank you for your answer.

lodygens commented 9 years ago

Here I am:

On terminal 1

$> docker-compose up

blabla...

esfilter_1 | 2015-04-07T14:26:47.433Z - info: SUCESS node-es-filter-proxy with target http://elasticsearch:9200 now listening on 0.0.0.0:4004

So, I assume this is running.

On another terminal (terminal 2), I run:

$> docker exec -i -t tresorecosystem_broker_1 /bin/bash -c "source /usr/local/rvm/scripts/rvm && RAILS_ENV=production rake tresor:setup_environment"

rake aborted!
Moped::Errors::ConnectionFailure: Could not connect to a primary node for replica set #<Moped::Cluster:53721300 @seeds=[]>
/usr/local/rvm/gems/ruby-2.1.5/gems/moped-2.0.0/lib/moped/cluster.rb:248:in `with_primary'
/usr/local/rvm/gems/ruby-2.1.5/gems/moped-2.0.0/lib/moped/read_preference/primary.rb:55:in `block in with_node'
/usr/local/rvm/gems/ruby-2.1.5/gems/moped-2.0.0/lib/moped/read_preference/selectable.rb:65:in `call'
/usr/local/rvm/gems/ruby-2.1.5/gems/moped-2.0.0/lib/moped/read_preference/selectable.rb:65:in `with_retry'
/usr/local/rvm/gems/ruby-2.1.5/gems/moped-2.0.0/lib/moped/read_preference/selectable.rb:71:in `rescue in with_retry'
/usr/local/rvm/gems/ruby-2.1.5/gems/moped-2.0.0/lib/moped/read_preference/selectable.rb:64:in `with_retry'
/usr/local/rvm/gems/ruby-2.1.5/gems/moped-2.0.0/lib/moped/read_preference/primary.rb:54:in `with_node'
/usr/local/rvm/gems/ruby-2.1.5/gems/moped-2.0.0/lib/moped/database.rb:72:in `command'
/usr/local/rvm/gems/ruby-2.1.5/gems/moped-2.0.0/lib/moped/query.rb:41:in `count'
/usr/local/rvm/gems/ruby-2.1.5/bundler/gems/mongoid-8cb17e983997/lib/mongoid/persistable/deletable.rb:142:in `delete_all'
/root/tresor-broker/lib/tasks/tresor.rake:38:in `map'
/root/tresor-broker/lib/tasks/tresor.rake:38:in `block (2 levels) in <top (required)>'
/usr/local/rvm/gems/ruby-2.1.5/bin/ruby_executable_hooks:15:in `eval'
/usr/local/rvm/gems/ruby-2.1.5/bin/ruby_executable_hooks:15:in `<main>'

Moped::Errors::ConnectionFailure: Could not connect to a primary node for replica set #<Moped::Cluster:53721300 @seeds=[]>
/usr/local/rvm/gems/ruby-2.1.5/gems/moped-2.0.0/lib/moped/cluster.rb:248:in `with_primary'
/usr/local/rvm/gems/ruby-2.1.5/gems/moped-2.0.0/lib/moped/read_preference/primary.rb:55:in `block in with_node'
/usr/local/rvm/gems/ruby-2.1.5/gems/moped-2.0.0/lib/moped/read_preference/selectable.rb:65:in `call'
/usr/local/rvm/gems/ruby-2.1.5/gems/moped-2.0.0/lib/moped/read_preference/selectable.rb:65:in `with_retry'
/usr/local/rvm/gems/ruby-2.1.5/gems/moped-2.0.0/lib/moped/read_preference/primary.rb:54:in `with_node'
/usr/local/rvm/gems/ruby-2.1.5/gems/moped-2.0.0/lib/moped/database.rb:72:in `command'
/usr/local/rvm/gems/ruby-2.1.5/gems/moped-2.0.0/lib/moped/query.rb:41:in `count'
/usr/local/rvm/gems/ruby-2.1.5/bundler/gems/mongoid-8cb17e983997/lib/mongoid/persistable/deletable.rb:142:in `delete_all'
/root/tresor-broker/lib/tasks/tresor.rake:38:in `map'
/root/tresor-broker/lib/tasks/tresor.rake:38:in `block (2 levels) in <top (required)>'
/usr/local/rvm/gems/ruby-2.1.5/bin/ruby_executable_hooks:15:in `eval'
/usr/local/rvm/gems/ruby-2.1.5/bin/ruby_executable_hooks:15:in `<main>'
Tasks: TOP => tresor:setup_environment
(See full trace by running task with --trace)

Back on terminal 1, we can see:

esfilter_1 | 2015-04-07T14:26:47.433Z - info: SUCESS node-es-filter-proxy with target http://elasticsearch:9200 now listening on 0.0.0.0:4004

broker_1 | W, [2015-04-08T09:29:34.242795 #27469] WARN -- : Overwriting existing field _id in class Client.
broker_1 | W, [2015-04-08T09:29:34.252244 #27469] WARN -- : Overwriting existing field _id in class Provider.
broker_1 | W, [2015-04-08T09:29:34.256553 #27469] WARN -- : Overwriting existing field _id in class ServiceBooking.
broker_1 | W, [2015-04-08T09:29:34.280163 #27469] WARN -- : Overwriting existing field _id in class SDL::Base::Type::Service.
broker_1 | W, [2015-04-08T09:29:34.424397 #27469] WARN -- : Overwriting existing field identifier in class SDL::Base::Type::SpecificLocation.
broker_1 | W, [2015-04-08T09:29:34.426655 #27469] WARN -- : Overwriting existing field identifier in class SDL::Base::Type::UnspecificLocation.
broker_1 | W, [2015-04-08T09:29:34.527043 #27469] WARN -- : Overwriting existing field identifier in class SDL::Base::Type::RestInterface.
broker_1 | W, [2015-04-08T09:29:34.528404 #27469] WARN -- : Overwriting existing field identifier in class SDL::Base::Type::SoapInterface.
broker_1 | W, [2015-04-08T09:29:34.529849 #27469] WARN -- : Overwriting existing field identifier in class SDL::Base::Type::XmlrpcInterface.
broker_1 | W, [2015-04-08T09:29:34.703291 #27469] WARN -- : Overwriting existing field identifier in class SDL::Base::Type::ImmediateBooking.
broker_1 | W, [2015-04-08T09:29:34.705333 #27469] WARN -- : Overwriting existing field identifier in class SDL::Base::Type::SynchronousBooking.
broker_1 | I, [2015-04-08T09:29:34.710032 #27469] INFO -- : Loaded compendium.
broker_1 | W, [2015-04-08T09:29:35.416532 #27469] WARN -- : MOPED: Could not resolve IP for: mongodb:27017 runtime: n/a
broker_1 | W, [2015-04-08T09:29:35.417257 #27469] WARN -- : MOPED: Retrying connection attempt 1 more time(s). runtime: n/a
broker_1 | W, [2015-04-08T09:29:35.671875 #27469] WARN -- : MOPED: Could not resolve IP for: mongodb:27017 runtime: n/a

lodygens commented 9 years ago

It is assumed that we have a resolvable address, "mongodb", but we don't.

Shouldn't such an address have been set up by the installation procedure ("docker-compose up")?

omer-ilhan commented 9 years ago

Absolutely, the address is created but the necessary port is not exposed. I am committing a fix now.

Thank you for the feedback!
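
For illustration, a fix of this kind would live in docker-compose.yml. A minimal sketch, assuming the broker service gets a link to the mongodb service and mongodb's port is exposed (the actual commit may look different):

    mongodb:
      image: mongo:2.6
      command: mongod --smallfiles
      expose:
        - "27017"

    broker:
      build: components/tresor-broker    # illustrative path, not taken from the repository
      links:
        - mongodb                        # makes the hostname "mongodb" resolvable inside the broker container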

lodygens commented 9 years ago

Thanks. What should I do now? Reinstall everything from the beginning ("docker-compose up")?

omer-ilhan commented 9 years ago

Exactly, just do docker-compose up to build and run. Docker should automatically reuse already existing containers so it should take considerably less time than on the first run.

More information regarding docker-compose at: docs.docker.com/compose
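
A minimal sketch of that workflow, assuming the compose file lives in the tresor-ecosystem folder:

    $> cd tresor-ecosystem         # folder name as used elsewhere in this thread
    $> docker-compose up           # recreates/reuses the containers and attaches to their output
    # in a second terminal, verify that all containers (including mongodb) are running:
    $> docker-compose ps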

lodygens commented 9 years ago

I still have connection errors. (Please re-open the ticket until everything is solved.)

$> docker exec -i -t tresorecosystem_broker_1 /bin/bash -c "source /usr/local/rvm/scripts/rvm && RAILS_ENV=production rake tresor:setup_environment"

rake aborted!
Moped::Errors::ConnectionFailure: Could not connect to a primary node for replica set #<Moped::Cluster:41508760 @seeds=[]>
/usr/local/rvm/gems/ruby-2.1.5/gems/moped-2.0.0/lib/moped/cluster.rb:248:in `with_primary'
/usr/local/rvm/gems/ruby-2.1.5/gems/moped-2.0.0/lib/moped/read_preference/primary.rb:55:in `block in with_node'
/usr/local/rvm/gems/ruby-2.1.5/gems/moped-2.0.0/lib/moped/read_preference/selectable.rb:65:in `call'
/usr/local/rvm/gems/ruby-2.1.5/gems/moped-2.0.0/lib/moped/read_preference/selectable.rb:65:in `with_retry'
/usr/local/rvm/gems/ruby-2.1.5/gems/moped-2.0.0/lib/moped/read_preference/selectable.rb:71:in `rescue in with_retry'
/usr/local/rvm/gems/ruby-2.1.5/gems/moped-2.0.0/lib/moped/read_preference/selectable.rb:64:in `with_retry'
/usr/local/rvm/gems/ruby-2.1.5/gems/moped-2.0.0/lib/moped/read_preference/primary.rb:54:in `with_node'
/usr/local/rvm/gems/ruby-2.1.5/gems/moped-2.0.0/lib/moped/database.rb:72:in `command'
/usr/local/rvm/gems/ruby-2.1.5/gems/moped-2.0.0/lib/moped/query.rb:41:in `count'
/usr/local/rvm/gems/ruby-2.1.5/bundler/gems/mongoid-8cb17e983997/lib/mongoid/persistable/deletable.rb:142:in `delete_all'
/root/tresor-broker/lib/tasks/tresor.rake:38:in `map'
/root/tresor-broker/lib/tasks/tresor.rake:38:in `block (2 levels) in <top (required)>'
/usr/local/rvm/gems/ruby-2.1.5/bin/ruby_executable_hooks:15:in `eval'
/usr/local/rvm/gems/ruby-2.1.5/bin/ruby_executable_hooks:15:in `<main>'

Moped::Errors::ConnectionFailure: Could not connect to a primary node for replica set #<Moped::Cluster:41508760 @seeds=[]>
/usr/local/rvm/gems/ruby-2.1.5/gems/moped-2.0.0/lib/moped/cluster.rb:248:in `with_primary'
/usr/local/rvm/gems/ruby-2.1.5/gems/moped-2.0.0/lib/moped/read_preference/primary.rb:55:in `block in with_node'
/usr/local/rvm/gems/ruby-2.1.5/gems/moped-2.0.0/lib/moped/read_preference/selectable.rb:65:in `call'
/usr/local/rvm/gems/ruby-2.1.5/gems/moped-2.0.0/lib/moped/read_preference/selectable.rb:65:in `with_retry'
/usr/local/rvm/gems/ruby-2.1.5/gems/moped-2.0.0/lib/moped/read_preference/primary.rb:54:in `with_node'
/usr/local/rvm/gems/ruby-2.1.5/gems/moped-2.0.0/lib/moped/database.rb:72:in `command'
/usr/local/rvm/gems/ruby-2.1.5/gems/moped-2.0.0/lib/moped/query.rb:41:in `count'
/usr/local/rvm/gems/ruby-2.1.5/bundler/gems/mongoid-8cb17e983997/lib/mongoid/persistable/deletable.rb:142:in `delete_all'
/root/tresor-broker/lib/tasks/tresor.rake:38:in `map'
/root/tresor-broker/lib/tasks/tresor.rake:38:in `block (2 levels) in <top (required)>'
/usr/local/rvm/gems/ruby-2.1.5/bin/ruby_executable_hooks:15:in `eval'
/usr/local/rvm/gems/ruby-2.1.5/bin/ruby_executable_hooks:15:in `<main>'
Tasks: TOP => tresor:setup_environment

Cheers.

lodygens commented 9 years ago

This is due to the fact that "mongodb" is not a resolvable address (as has been the case since the beginning of this issue):

Jeu avr 09 12:45:58 $> docker-compose up Recreating tresorecosystem_mongodb_1... Recreating tresorecosystem_kibana_1... Recreating tresorecosystem_elasticsearch_1... Recreating tresorecosystem_esfilter_1... Recreating tresorecosystem_logstash_1... Recreating tresorecosystem_pdp_1... Recreating tresorecosystem_broker_1... Recreating tresorecosystem_pap_1... Recreating tresorecosystem_proxytls_1... Recreating tresorecosystem_proxyhttp_1... Attaching to tresorecosystem_kibana_1, tresorecosystem_elasticsearch_1, tresorecosystem_esfilter_1, tresorecosystem_logstash_1, tresorecosystem_pdp_1, tresorecosystem_broker_1, tresorecosystem_pap_1, tresorecosystem_proxytls_1, tresorecosystem_proxyhttp_1 kibana_1 | kibana-dyn-config provider listening on 0.0.0.0:3003 kibana_1 | Expecting elasticsearch at http://localhost:4004 pap_1 | AH00558: apache2: Could not reliably determine the server's fully qualified domain name, using 172.17.0.38. Set the 'ServerName' directive globally to suppress this message elasticsearch_1 | [2015-04-09 10:48:50,456][INFO ][node ] [Burstarr] version[1.5.0], pid[1], build[5448160/2015-03-23T14:30:58Z] elasticsearch_1 | [2015-04-09 10:48:50,457][INFO ][node ] [Burstarr] initializing ... elasticsearch_1 | [2015-04-09 10:48:50,471][INFO ][plugins ] [Burstarr] loaded [], sites [] esfilter_1 | 2015-04-09T10:48:49.967Z - info: SUCESS node-es-filter-proxy with target http://elasticsearch:9200 now listening on 0.0.0.0:4004 broker_1 | --> Downloading a Phusion Passenger agent binary for your platform broker_1 | broker_1 | --> Installing Nginx 1.6.2 engine broker_1 | broker_1 | -------------------------- broker_1 | broker_1 | --> Compiling passenger_native_support.so for the current Ruby interpreter... broker_1 | (set PASSENGER_COMPILE_NATIVE_SUPPORT_BINARY=0 to disable) broker_1 | Compilation succesful. The logs are here: broker_1 | /tmp/passenger_native_support-fde86j.log broker_1 | --> passenger_native_support.so successfully loaded. broker_1 | =============== Phusion Passenger Standalone web server started =============== broker_1 | PID file: /root/tresor-broker/passenger.3000.pid broker_1 | Log file: /root/tresor-broker/log/passenger.3000.log broker_1 | Environment: production broker_1 | Accessible via: http://0.0.0.0:3000/ broker_1 | broker_1 | You can stop Phusion Passenger Standalone by pressing Ctrl-C. broker_1 | Problems? Check https://www.phusionpassenger.com/documentation/Users%20guide%20Standalone.html#troubleshooting broker_1 | =============================================================================== broker_1 | App 173 stdout: elasticsearch_1 | [2015-04-09 10:49:06,580][INFO ][node ] [Burstarr] initialized elasticsearch_1 | [2015-04-09 10:49:06,583][INFO ][node ] [Burstarr] starting ... pdp_1 | 2015-04-09 10:49:06,611 ERROR TcpSocketManager (TCP:logstash:9400) java.net.ConnectException: Connection refused elasticsearch_1 | [2015-04-09 10:49:07,419][INFO ][transport ] [Burstarr] bound_address {inet[/0:0:0:0:0:0:0:0:9300]}, publish_address {inet[/172.17.0.28:9300]} elasticsearch_1 | [2015-04-09 10:49:07,508][INFO ][discovery ] [Burstarr] elasticsearch/eb19A10vTr6Lk1WHtPj50A pdp_1 | pdp1 | . 
____ pdp1 | /\ / **'_ () _ \ \ \ \ pdp1 | ( ( )** | ' | '| | ' \/ ` | \ \ \ \ pdp1 | \/ )| |)| | | | | || (| | ) ) ) ) pdp1 | ' |____| .**|| ||| |_**, | / / / / pdp1 | =========||==============|__/=//// pdp_1 | :: Spring Boot :: (v1.1.9.RELEASE) pdp_1 | pdp_1 | Starting ContextHandler on a5d81c099fc1 with PID 1 (/opt/tresor-pdp.jar started by root in /opt) pdp_1 | 2015-04-09 10:49:08,193 ERROR Unable to write to stream TCP:logstash:9400 for appender LOGSTASH pdp_1 | 2015-04-09 10:49:08,194 ERROR An exception occurred processing Appender LOGSTASH org.apache.logging.log4j.core.appender.AppenderLoggingException: Error writing to TCP:logstash:9400 socket not available pdp_1 | at org.apache.logging.log4j.core.net.TcpSocketManager.write(TcpSocketManager.java:120) pdp_1 | at org.apache.logging.log4j.core.appender.OutputStreamManager.write(OutputStreamManager.java:136) pdp_1 | at org.apache.logging.log4j.core.appender.AbstractOutputStreamAppender.append(AbstractOutputStreamAppender.java:106) pdp_1 | at org.apache.logging.log4j.core.config.AppenderControl.callAppender(AppenderControl.java:97) pdp_1 | at org.apache.logging.log4j.core.config.LoggerConfig.callAppenders(LoggerConfig.java:428) pdp_1 | at org.apache.logging.log4j.core.config.LoggerConfig.log(LoggerConfig.java:407) pdp_1 | at org.apache.logging.log4j.core.config.LoggerConfig.log(LoggerConfig.java:365) pdp_1 | at org.apache.logging.log4j.core.Logger.logMessage(Logger.java:112) pdp_1 | at org.apache.logging.slf4j.Log4jLogger.log(Log4jLogger.java:374) pdp_1 | at org.apache.commons.logging.impl.SLF4JLocationAwareLog.info(SLF4JLocationAwareLog.java:159) pdp_1 | at org.springframework.boot.StartupInfoLogger.logStarting(StartupInfoLogger.java:52) pdp_1 | at org.springframework.boot.SpringApplication.logStartupInfo(SpringApplication.java:583) pdp_1 | at org.springframework.boot.SpringApplication.run(SpringApplication.java:308) pdp_1 | at org.springframework.boot.SpringApplication.run(SpringApplication.java:952) pdp_1 | at org.springframework.boot.SpringApplication.run(SpringApplication.java:941) pdp_1 | at org.snet.tresor.pdp.contexthandler.ContextHandler.main(ContextHandler.java:13) pdp_1 | at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) pdp_1 | at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) pdp_1 | at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) pdp_1 | at java.lang.reflect.Method.invoke(Method.java:606) pdp_1 | at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:53) pdp_1 | at java.lang.Thread.run(Thread.java:745) pdp_1 | pdp_1 | Refreshing org.springframework.boot.context.embedded.AnnotationConfigEmbeddedWebApplicationContext@5a5a7c64: startup date [Thu Apr 09 10:49:08 UTC 2015]; root of context hierarchy pdp_1 | 2015-04-09 10:49:08,266 ERROR Unable to write to stream TCP:logstash:9400 for appender LOGSTASH pdp_1 | 2015-04-09 10:49:08,267 ERROR An exception occurred processing Appender LOGSTASH org.apache.logging.log4j.core.appender.AppenderLoggingException: Error writing to TCP:logstash:9400 socket not available pdp_1 | at org.apache.logging.log4j.core.net.TcpSocketManager.write(TcpSocketManager.java:120) pdp_1 | at org.apache.logging.log4j.core.appender.OutputStreamManager.write(OutputStreamManager.java:136) pdp_1 | at org.apache.logging.log4j.core.appender.AbstractOutputStreamAppender.append(AbstractOutputStreamAppender.java:106) pdp_1 | at 
org.apache.logging.log4j.core.config.AppenderControl.callAppender(AppenderControl.java:97) pdp_1 | at org.apache.logging.log4j.core.config.LoggerConfig.callAppenders(LoggerConfig.java:428) pdp_1 | at org.apache.logging.log4j.core.config.LoggerConfig.log(LoggerConfig.java:407) pdp_1 | at org.apache.logging.log4j.core.config.LoggerConfig.log(LoggerConfig.java:365) pdp_1 | at org.apache.logging.log4j.core.Logger.logMessage(Logger.java:112) pdp_1 | at org.apache.logging.slf4j.Log4jLogger.log(Log4jLogger.java:374) pdp_1 | at org.apache.commons.logging.impl.SLF4JLocationAwareLog.info(SLF4JLocationAwareLog.java:159) pdp_1 | at org.springframework.context.support.AbstractApplicationContext.prepareRefresh(AbstractApplicationContext.java:510) pdp_1 | at org.springframework.boot.context.embedded.AnnotationConfigEmbeddedWebApplicationContext.prepareRefresh(AnnotationConfigEmbeddedWebApplicationContext.java:175) pdp_1 | at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:449) pdp_1 | at org.springframework.boot.context.embedded.EmbeddedWebApplicationContext.refresh(EmbeddedWebApplicationContext.java:109) pdp_1 | at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:691) pdp_1 | at org.springframework.boot.SpringApplication.run(SpringApplication.java:320) pdp_1 | at org.springframework.boot.SpringApplication.run(SpringApplication.java:952) pdp_1 | at org.springframework.boot.SpringApplication.run(SpringApplication.java:941) pdp_1 | at org.snet.tresor.pdp.contexthandler.ContextHandler.main(ContextHandler.java:13) pdp_1 | at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) pdp_1 | at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) pdp_1 | at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) pdp_1 | at java.lang.reflect.Method.invoke(Method.java:606) pdp_1 | at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:53) pdp_1 | at java.lang.Thread.run(Thread.java:745) pdp_1 | broker_1 | W, [2015-04-09T10:49:09.769590 #173] WARN -- : Overwriting existing field _id in class Client. broker_1 | W, [2015-04-09T10:49:09.777560 #173] WARN -- : Overwriting existing field _id in class Provider. broker_1 | W, [2015-04-09T10:49:09.783053 #173] WARN -- : Overwriting existing field _id in class ServiceBooking. broker_1 | W, [2015-04-09T10:49:09.801756 #173] WARN -- : Overwriting existing field _id in class SDL::Base::Type::Service. broker_1 | W, [2015-04-09T10:49:09.924020 #173] WARN -- : Overwriting existing field identifier in class SDL::Base::Type::SpecificLocation. broker_1 | W, [2015-04-09T10:49:09.926378 #173] WARN -- : Overwriting existing field identifier in class SDL::Base::Type::UnspecificLocation. broker_1 | W, [2015-04-09T10:49:10.060167 #173] WARN -- : Overwriting existing field identifier in class SDL::Base::Type::RestInterface. broker_1 | W, [2015-04-09T10:49:10.061542 #173] WARN -- : Overwriting existing field identifier in class SDL::Base::Type::SoapInterface. broker_1 | W, [2015-04-09T10:49:10.062673 #173] WARN -- : Overwriting existing field identifier in class SDL::Base::Type::XmlrpcInterface. broker_1 | W, [2015-04-09T10:49:10.189991 #173] WARN -- : Overwriting existing field identifier in class SDL::Base::Type::ImmediateBooking. broker_1 | W, [2015-04-09T10:49:10.191890 #173] WARN -- : Overwriting existing field identifier in class SDL::Base::Type::SynchronousBooking. 
broker_1 | I, [2015-04-09T10:49:10.196188 #173] INFO -- : Loaded compendium. elasticsearch_1 | [2015-04-09 10:49:11,322][INFO ][cluster.service ] [Burstarr] new_master [Burstarr][eb19A10vTr6Lk1WHtPj50A][1009b8cd3d0e][inet[/172.17.0.28:9300]], reason: zen-disco-join (elected_as_master) elasticsearch_1 | [2015-04-09 10:49:11,436][INFO ][http ] [Burstarr] bound_address {inet[/0:0:0:0:0:0:0:0:9200]}, publish_address {inet[/172.17.0.28:9200]} elasticsearch_1 | [2015-04-09 10:49:11,436][INFO ][node ] [Burstarr] started pdp_1 | JSR-330 'javax.inject.Inject' annotation found and supported for autowiring pdp_1 | Apr 09, 2015 10:49:11 AM org.hibernate.validator.internal.util.Version pdp_1 | INFO: HV000001: Hibernate Validator 5.0.3.Final broker_1 | W, [2015-04-09T10:49:11.797344 #195] WARN -- : MOPED: Could not resolve IP for: mongodb:27017 runtime: n/a broker_1 | I, [2015-04-09T10:49:11.824093 #195] INFO -- : Started HEAD "/" for 127.0.0.1 at 2015-04-09 10:49:11 +0000 broker_1 | App 195 stdout: pdp_1 | Bean 'org.springframework.boot.autoconfigure.security.SecurityAutoConfiguration' of type [class org.springframework.boot.autoconfigure.security.SecurityAutoConfiguration$$EnhancerBySpringCGLIB$$3cde7e64] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) pdp_1 | Bean 'securityProperties' of type [class org.springframework.boot.autoconfigure.security.SecurityProperties] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) pdp_1 | Bean 'org.springframework.boot.actuate.autoconfigure.ManagementServerPropertiesAutoConfiguration' of type [class org.springframework.boot.actuate.autoconfigure.ManagementServerPropertiesAutoConfiguration$$EnhancerBySpringCGLIB$$fe8601c2] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) pdp_1 | Bean 'org.hibernate.validator.internal.constraintvalidators.NotNullValidator' of type [class org.hibernate.validator.internal.constraintvalidators.NotNullValidator] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) pdp_1 | Bean 'managementServerProperties' of type [class org.springframework.boot.actuate.autoconfigure.ManagementServerProperties] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) pdp_1 | Bean 'org.springframework.boot.actuate.autoconfigure.ManagementSecurityAutoConfiguration$ManagementSecurityPropertiesConfiguration' of type [class org.springframework.boot.actuate.autoconfigure.ManagementSecurityAutoConfiguration$ManagementSecurityPropertiesConfiguration$$EnhancerBySpringCGLIB$$9f4b0ab4] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) pdp_1 | Bean 'org.springframework.security.config.annotation.configuration.ObjectPostProcessorConfiguration' of type [class org.springframework.security.config.annotation.configuration.ObjectPostProcessorConfiguration$$EnhancerBySpringCGLIB$$ab483d34] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) pdp_1 | Bean 'objectPostProcessor' of type [class org.springframework.security.config.annotation.configuration.AutowireBeanFactoryObjectPostProcessor] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) pdp_1 | Bean 
'org.springframework.boot.autoconfigure.security.AuthenticationManagerConfiguration' of type [class org.springframework.boot.autoconfigure.security.AuthenticationManagerConfiguration$$EnhancerBySpringCGLIB$$923a6da2] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) pdp_1 | Bean 'org.springframework.security.config.annotation.authentication.configuration.AuthenticationConfiguration' of type [class org.springframework.security.config.annotation.authentication.configuration.AuthenticationConfiguration$$EnhancerBySpringCGLIB$$a089d8e9] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) pdp_1 | Bean 'org.springframework.security.access.expression.method.DefaultMethodSecurityExpressionHandler@2e6f4093' of type [class org.springframework.security.access.expression.method.DefaultMethodSecurityExpressionHandler] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) pdp_1 | Bean 'org.springframework.security.config.annotation.method.configuration.GlobalMethodSecurityConfiguration' of type [class org.springframework.security.config.annotation.method.configuration.GlobalMethodSecurityConfiguration$$EnhancerBySpringCGLIB$$337d9466] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) pdp_1 | Bean 'methodSecurityMetadataSource' of type [class org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) pdp_1 | Bean 'metaDataSourceAdvisor' of type [class org.springframework.security.access.intercept.aopalliance.MethodSecurityMetadataSourceAdvisor] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) elasticsearch_1 | [2015-04-09 10:49:13,723][INFO ][gateway ] [Burstarr] recovered [2] indices into cluster_state logstash_1 | {:timestamp=>"2015-04-09T10:49:14.061000+0000", :message=>"Using milestone 2 input plugin 'tcp'. This plugin should be stable, but if you see strange behavior, please let us know! For more information on plugin milestones, see http://logstash.net/docs/1.4.2/plugin-milestones", :level=>:warn} logstash_1 | {:timestamp=>"2015-04-09T10:49:14.512000+0000", :message=>"Using milestone 1 input plugin 'log4j'. This plugin should work, but would benefit from use by folks like you. Please let us know if you find bugs or have suggestions on how to improve this plugin. For more information on plugin milestones, see http://logstash.net/docs/1.4.2/plugin-milestones", :level=>:warn} pdp_1 | Server initialized with port: 8080 pdp_1 | jetty-8.1.15.v20140411 logstash_1 | {:timestamp=>"2015-04-09T10:49:14.579000+0000", :message=>"Using milestone 2 filter plugin 'json'. This plugin should be stable, but if you see strange behavior, please let us know! 
For more information on plugin milestones, see http://logstash.net/docs/1.4.2/plugin-milestones", :level=>:warn} pdp_1 | Initializing Spring embedded WebApplicationContext pdp_1 | Root WebApplicationContext: initialization completed in 6421 ms pdp_1 | Mapped "{[/dump],methods=[GET],params=[],headers=[],consumes=[],produces=[],custom=[]}" onto public java.lang.Object org.springframework.boot.actuate.endpoint.mvc.EndpointMvcAdapter.invoke() pdp1 | Mapped "{[/metrics/{name:.}],methods=[GET],params=[],headers=[],consumes=[],produces=[],custom=[]}" onto public java.lang.Object org.springframework.boot.actuate.endpoint.mvc.MetricsMvcEndpoint.value(java.lang.String) pdp_1 | Mapped "{[/metrics],methods=[GET],params=[],headers=[],consumes=[],produces=[],custom=[]}" onto public java.lang.Object org.springframework.boot.actuate.endpoint.mvc.EndpointMvcAdapter.invoke() pdp_1 | Mapped "{[/configprops],methods=[GET],params=[],headers=[],consumes=[],produces=[],custom=[]}" onto public java.lang.Object org.springframework.boot.actuate.endpoint.mvc.EndpointMvcAdapter.invoke() pdp_1 | Mapped "{[/trace],methods=[GET],params=[],headers=[],consumes=[],produces=[],custom=[]}" onto public java.lang.Object org.springframework.boot.actuate.endpoint.mvc.EndpointMvcAdapter.invoke() pdp_1 | Mapped "{[/mappings],methods=[GET],params=[],headers=[],consumes=[],produces=[],custom=[]}" onto public java.lang.Object org.springframework.boot.actuate.endpoint.mvc.EndpointMvcAdapter.invoke() pdp_1 | Mapped "{[/info],methods=[GET],params=[],headers=[],consumes=[],produces=[],custom=[]}" onto public java.lang.Object org.springframework.boot.actuate.endpoint.mvc.EndpointMvcAdapter.invoke() pdp1 | Mapped "{[/env/{name:.}],methods=[GET],params=[],headers=[],consumes=[],produces=[],custom=[]}" onto public java.lang.Object org.springframework.boot.actuate.endpoint.mvc.EnvironmentMvcEndpoint.value(java.lang.String) pdp_1 | Mapped "{[/env],methods=[GET],params=[],headers=[],consumes=[],produces=[],custom=[]}" onto public java.lang.Object org.springframework.boot.actuate.endpoint.mvc.EndpointMvcAdapter.invoke() pdp_1 | Mapped "{[/autoconfig],methods=[GET],params=[],headers=[],consumes=[],produces=[],custom=[]}" onto public java.lang.Object org.springframework.boot.actuate.endpoint.mvc.EndpointMvcAdapter.invoke() pdp_1 | Mapped "{[/health],methods=[],params=[],headers=[],consumes=[],produces=[],custom=[]}" onto public java.lang.Object org.springframework.boot.actuate.endpoint.mvc.HealthMvcEndpoint.invoke() pdp_1 | Mapped "{[/beans],methods=[GET],params=[],headers=[],consumes=[],produces=[],custom=[]}" onto public java.lang.Object org.springframework.boot.actuate.endpoint.mvc.EndpointMvcAdapter.invoke() pdp_1 | Creating filter chain: Ant [pattern='/css/'], [] pdp_1 | Creating filter chain: Ant [pattern='/js/'], [] pdp_1 | Creating filter chain: Ant [pattern='/images/'], [] pdp_1 | Creating filter chain: Ant [pattern='//favicon.ico'], [] pdp_1 | Creating filter chain: Ant [pattern='/info'], [] pdp_1 | Creating filter chain: Ant [pattern='/health'], [] pdp1 | Creating filter chain: Ant [pattern='/policy/*'], [org.springframework.security.web.context.request.async.WebAsyncManagerIntegrationFilter@336ff06c, org.springframework.security.web.context.SecurityContextPersistenceFilter@179f6076, org.springframework.security.web.header.HeaderWriterFilter@3a033047, org.springframework.security.web.authentication.logout.LogoutFilter@3041819a, org.springframework.security.web.authentication.www.BasicAuthenticationFilter@25187150, 
org.springframework.security.web.savedrequest.RequestCacheAwareFilter@52eefcae, org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter@5c97f922, org.springframework.security.web.authentication.AnonymousAuthenticationFilter@7a115cd5, org.springframework.security.web.session.SessionManagementFilter@5fda5fa6, org.springframework.security.web.access.ExceptionTranslationFilter@19d07227] pdp1 | Creating filter chain: OrRequestMatcher [requestMatchers=[Ant [pattern='/dump'], Ant [pattern='/dump/'], Ant [pattern='/dump.'], Ant [pattern='/metrics'], Ant [pattern='/metrics/'], Ant [pattern='/metrics.'], Ant [pattern='/configprops'], Ant [pattern='/configprops/'], Ant [pattern='/configprops.'], Ant [pattern='/trace'], Ant [pattern='/trace/'], Ant [pattern='/trace.'], Ant [pattern='/mappings'], Ant [pattern='/mappings/'], Ant [pattern='/mappings.'], Ant [pattern='/env'], Ant [pattern='/env/'], Ant [pattern='/env.'], Ant [pattern='/autoconfig'], Ant [pattern='/autoconfig/'], Ant [pattern='/autoconfig.'], Ant [pattern='/beans'], Ant [pattern='/beans/'], Ant [pattern='/beans._']]], [org.springframework.security.web.context.request.async.WebAsyncManagerIntegrationFilter@44844212, org.springframework.security.web.context.SecurityContextPersistenceFilter@3a192cf2, org.springframework.security.web.header.HeaderWriterFilter@6a38ceb8, org.springframework.security.web.authentication.logout.LogoutFilter@366df14a, org.springframework.security.web.authentication.www.BasicAuthenticationFilter@4bc63fc3, org.springframework.security.web.savedrequest.RequestCacheAwareFilter@4ad5e1aa, org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter@1fd28648, org.springframework.security.web.session.SessionManagementFilter@378bc883, org.springframework.security.web.access.ExceptionTranslationFilter@21b0fb27, org.springframework.security.web.access.intercept.FilterSecurityInterceptor@64cdcc2d] pdp_1 | Mapping servlet: 'dispatcherServlet' to [/] pdp1 | Mapping filter: 'metricFilter' to: [/] pdp1 | Mapping filter: 'springSecurityFilterChain' to: [/] pdp1 | Mapping filter: 'applicationContextIdFilter' to: [/] pdp_1 | Mapping filter: 'webRequestLoggingFilter' to: [/*] pdp_1 | Initialized FileBasedClientIdServiceIdPolicyStore with path /opt/policies pdp_1 | Initialized WeekdayPIP pdp_1 | Initialized LocationPIP with url http://ls.snet.tu-berlin.de:8080/pe/api/v2/pdp and authentication pdp_1 | Initialized PIPAttributeFinderModule, number of pips: 2 pdp_1 | Mapped "{[/policy/{clientId}],methods=[GET],params=[],headers=[],consumes=[],produces=[application/json],custom=[]}" onto public org.springframework.http.ResponseEntity org.snet.tresor.pdp.contexthandler.controller.ClientIdServiceIdPolicyStoreController.getPolicies(java.lang.String,org.springframework.security.core.userdetails.UserDetails) throws com.fasterxml.jackson.core.JsonProcessingException pdp_1 | Mapped "{[/policy/{clientId}/{serviceId}],methods=[PUT],params=[],headers=[],consumes=[application/xacml+xml],produces=[],custom=[]}" onto public org.springframework.http.ResponseEntity org.snet.tresor.pdp.contexthandler.controller.ClientIdServiceIdPolicyStoreController.putPolicy(java.lang.String,java.lang.String,java.lang.String,org.springframework.security.core.userdetails.UserDetails) pdp_1 | Mapped "{[/policy/{clientId}/{serviceId}],methods=[DELETE],params=[],headers=[],consumes=[],produces=[],custom=[]}" onto public org.springframework.http.ResponseEntity 
org.snet.tresor.pdp.contexthandler.controller.ClientIdServiceIdPolicyStoreController.deletePolicy(java.lang.String,java.lang.String,org.springframework.security.core.userdetails.UserDetails) pdp_1 | Mapped "{[/policy/{clientId}/{serviceId}],methods=[GET],params=[],headers=[],consumes=[],produces=[application/xacml+xml],custom=[]}" onto public org.springframework.http.ResponseEntity org.snet.tresor.pdp.contexthandler.controller.ClientIdServiceIdPolicyStoreController.getPolicy(java.lang.String,java.lang.String,org.springframework.security.core.userdetails.UserDetails) pdp_1 | Mapped "{[/],methods=[GET],params=[],headers=[],consumes=[],produces=[application/xml],custom=[]}" onto public java.lang.String org.snet.tresor.pdp.contexthandler.controller.HomeController.getHomeDocument() pdp_1 | Mapped "{[/pdp],methods=[POST],params=[],headers=[],consumes=[application/xacml+xml],produces=[application/xacml+xml],custom=[]}" onto public org.springframework.http.ResponseEntity org.snet.tresor.pdp.contexthandler.controller.PDPController.getXACMLDecision(java.lang.String) pdp_1 | Mapped "{[/pdp],methods=[POST],params=[],headers=[],consumes=[application/samlassertion+xml],produces=[application/samlassertion+xml],custom=[]}" onto public org.springframework.http.ResponseEntity org.snet.tresor.pdp.contexthandler.controller.PDPController.getXACMLSAMLDecision(java.lang.String) pdp_1 | Looking for @ControllerAdvice: org.springframework.boot.context.embedded.AnnotationConfigEmbeddedWebApplicationContext@5a5a7c64: startup date [Thu Apr 09 10:49:08 UTC 2015]; root of context hierarchy pdp_1 | Registering beans for JMX exposure on startup pdp_1 | Registering beans for JMX exposure on startup pdp_1 | Starting beans in phase 0 pdp_1 | Located managed bean 'requestMappingEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=requestMappingEndpoint] pdp_1 | Located managed bean 'environmentEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=environmentEndpoint] pdp_1 | Located managed bean 'healthEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=healthEndpoint] pdp_1 | Located managed bean 'beansEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=beansEndpoint] pdp_1 | Located managed bean 'infoEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=infoEndpoint] pdp_1 | Located managed bean 'metricsEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=metricsEndpoint] pdp_1 | Located managed bean 'traceEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=traceEndpoint] pdp_1 | Located managed bean 'dumpEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=dumpEndpoint] pdp_1 | Located managed bean 'autoConfigurationAuditEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=autoConfigurationAuditEndpoint] pdp_1 | Located managed bean 'shutdownEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=shutdownEndpoint] pdp_1 | Located managed bean 'configurationPropertiesReportEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=configurationPropertiesReportEndpoint] pdp_1 | Initializing Spring FrameworkServlet 'dispatcherServlet' pdp_1 | FrameworkServlet 'dispatcherServlet': initialization started 
pdp_1 | FrameworkServlet 'dispatcherServlet': initialization completed in 41 ms pdp_1 | Started SelectChannelConnector@0.0.0.0:8080 pdp_1 | Jetty started on port: 8080 pdp_1 | Started ContextHandler in 13.091 seconds (JVM running for 25.588) broker_1 | W, [2015-04-09T11:31:50.007836 #1292] WARN -- : Overwriting existing field _id in class Client. broker_1 | W, [2015-04-09T11:31:50.016248 #1292] WARN -- : Overwriting existing field _id in class Provider. broker_1 | W, [2015-04-09T11:31:50.021832 #1292] WARN -- : Overwriting existing field _id in class ServiceBooking. broker_1 | W, [2015-04-09T11:31:50.040472 #1292] WARN -- : Overwriting existing field _id in class SDL::Base::Type::Service. broker_1 | W, [2015-04-09T11:31:50.181810 #1292] WARN -- : Overwriting existing field identifier in class SDL::Base::Type::SpecificLocation. broker_1 | W, [2015-04-09T11:31:50.183692 #1292] WARN -- : Overwriting existing field identifier in class SDL::Base::Type::UnspecificLocation. broker_1 | W, [2015-04-09T11:31:50.285543 #1292] WARN -- : Overwriting existing field identifier in class SDL::Base::Type::RestInterface. broker_1 | W, [2015-04-09T11:31:50.286157 #1292] WARN -- : Overwriting existing field identifier in class SDL::Base::Type::SoapInterface. broker_1 | W, [2015-04-09T11:31:50.287995 #1292] WARN -- : Overwriting existing field identifier in class SDL::Base::Type::XmlrpcInterface. broker_1 | W, [2015-04-09T11:31:50.456375 #1292] WARN -- : Overwriting existing field identifier in class SDL::Base::Type::ImmediateBooking. broker_1 | W, [2015-04-09T11:31:50.457492 #1292] WARN -- : Overwriting existing field identifier in class SDL::Base::Type::SynchronousBooking. broker_1 | I, [2015-04-09T11:31:50.462805 #1292] INFO -- : Loaded compendium. broker_1 | W, [2015-04-09T11:31:51.118468 #1292] WARN -- : MOPED: Could not resolve IP for: mongodb:27017 runtime: n/a broker_1 | W, [2015-04-09T11:31:51.119187 #1292] WARN -- : MOPED: Retrying connection attempt 1 more time(s). runtime: n/a broker_1 | W, [2015-04-09T11:31:51.376140 #1292] WARN -- : MOPED: Could not resolve IP for: mongodb:27017 runtime: n/a

omer-ilhan commented 9 years ago

I am trying to replicate the issue now so to summarize: in terminal 1 you deploy with docker-compose up and in terminal 2 you run docker exec -i -t tresorecosystem_broker_1 /bin/bash -c "source /usr/local/rvm/scripts/rvm && RAILS_ENV=production rake tresor:setup_environment" as described in Setup test services in the TRESOR broker. Is that correct?
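
Restating those reproduction steps as commands (assuming the repository checkout is named tresor-ecosystem):

    # terminal 1: deploy the ecosystem
    $> cd tresor-ecosystem
    $> docker-compose up

    # terminal 2: once the containers are up, set up the broker test services
    $> docker exec -i -t tresorecosystem_broker_1 /bin/bash -c "source /usr/local/rvm/scripts/rvm && RAILS_ENV=production rake tresor:setup_environment"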

Also, could you post the contents of your docker-compose.yml, which is in the tresor-ecosystem folder? The first 10-20 lines should suffice.

Thanks.

lodygens commented 9 years ago

Yes, that is correct.

Here is the yml header:

$> head -40 docker-compose.yml
mongodb:
  image: mongo:2.6
  command: mongod --smallfiles
  volumes:

elasticsearch:
  image: dockerfile/elasticsearch
  volumes:

logstash:
  build: components/tresor-logging/logstash
  volumes:

esfilter:
  build: components/tresor-logging/node-es-filter-proxy
  links:

omer-ilhan commented 9 years ago

Thank you. It seems like that is the old docker-compose.yml. Please update (e.g. git pull) and then deploy.
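
A sketch of the suggested update-and-redeploy sequence:

    $> cd tresor-ecosystem
    $> git pull                    # fetch the updated docker-compose.yml and components
    $> docker-compose build        # rebuild images whose sources changed
    $> docker-compose up           # recreate the containers and attach to their output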

lodygens commented 9 years ago

Oops! Yes, I forgot to update. Let's see... :)

lodygens commented 9 years ago

As I mentioned, it does not seem to be a question of an exposed port, but rather that mongodb is not a resolvable address:

broker_1 | W, [2015-04-09T13:56:31.436782 #194] WARN -- : MOPED: Could not resolve IP for: mongodb:27017 runtime: n/a
broker_1 | I, [2015-04-09T13:56:31.452958 #194] INFO -- : Started HEAD "/" for 127.0.0.1 at 2015-04-09 13:56:31 +0000
broker_1 | [ 2015-04-09 13:56:31.4737 133/7fd1b96b8700 Ser/Server.h:931 ]: [Client 1-1] Disconnecting client with error: client socket write error: Broken pipe (errno=32)

lodygens commented 9 years ago

With docker-compose.yml updated:

$> head -40 docker-compose.yml
mongodb:
  image: mongo:2.6
  command: mongod --smallfiles
  volumes:

mathiasslawik commented 9 years ago

Some suggestions: Compare Docker and Docker-Compose versions.

Check if you can resolve any name from a running container. You can enter a running container with docker exec -i -t... (see the command sketch after these suggestions).

Is the mongodb container even running? (check with docker ps)

Maybe the MongoDB data files are corrupted. Delete the data/mongodb folder and try again.

Did you wait until all containers are up before issuing the broker command?
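
Some of those checks as concrete commands (container and folder names are taken from this thread; whether getent is available inside the broker image is an assumption):

    # is the mongodb container running at all?
    $> docker ps | grep mongo

    # can the broker container resolve the name "mongodb"?
    $> docker exec -i -t tresorecosystem_broker_1 /bin/bash -c "getent hosts mongodb || cat /etc/hosts"

    # compare Docker and Docker-Compose versions
    $> docker version
    $> docker-compose --version

    # if the MongoDB data files might be corrupted, remove them and redeploy
    $> rm -rf data/mongodb
    $> docker-compose up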

mathiasslawik commented 9 years ago

From the console output it looks like Docker-Compose does not recreate the mongodb container, as it is not listed in the output.

Comment out everything except the mongodb container in the yml file and try to run docker-compose again. It should show you the error.
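
A minimal sketch of such a reduced docker-compose.yml (the volume entry is illustrative; keep whatever the original file contains), followed by running compose again:

    mongodb:
      image: mongo:2.6
      command: mongod --smallfiles
      volumes:
        - ./data/mongodb:/data/db    # illustrative mapping

    $> docker-compose up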

On 09.04.2015 at 13:33, Oleg Lodygensky notifications@github.com wrote:

This is due to the fact that "mongodb" is not a resolved address (as from the beginning of this issue)

org.springframework.security.web.savedrequest.RequestCacheAwareFilter@4ad5e1ahttps://github.com/org.springframework.security.web.savedrequest.RequestCacheAwareFilter/tresor-tracking/commit/4ad5e1aa, org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter@1fd2864https://github.com/org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter/tresor-tracking/commit/1fd28648, org.springframework.security.web.session.SessionManagementFilter@378bc88https://github.com/org.springframework.security.web.session.SessionManagementFilter/tresor-tracking/commit/378bc883, org.springframework.security.web.access.ExceptionTranslationFilter@21b0fb2https://github.com/org.springframework.security.web.access.ExceptionTranslationFilter/tresor-tracking/commit/21b0fb27, org.springframework.security.web.access.intercept.FilterSecurityInterceptor@64cdcc2] pdp_1 | Mapping servlet: 'dispatcherServlet' to [/] pdp_1 | Mapping filter: 'metricFilter' to: [/] pdp_1 | Mapping filter: 'springSecurityFilterChain' to: [/] pdp_1 | Mapping filter: 'applicationContextIdFilter' to: [/] pdp1 | Mapping filter: 'webRequestLoggingFilter' to: [/] pdp_1 | Initialized FileBasedClientIdServiceIdPolicyStore with path /opt/policies pdp_1 | Initialized WeekdayPIP pdp_1 | Initialized LocationPIP with url http://ls.snet.tu-berlin.de:8080/pe/api/v2/pdp and authentication pdp_1 | Initialized PIPAttributeFinderModule, number of pips: 2 pdp_1 | Mapped "{[/policy/{clientId}],methods=[GET],params=[],headers=[],consumes=[],produces=[application/json],custom=[]}" onto public org.springframework.http.ResponseEntity org.snet.tresor.pdp.contexthandler.controller.ClientIdServiceIdPolicyStoreController.getPolicies(java.lang.String,org.springframework.security.core.userdetails.UserDetails) throws com.fasterxml.jackson.core.JsonProcessingException pdp_1 | Mapped "{[/policy/{clientId}/{serviceId}],methods=[PUT],params=[],headers=[],consumes=[application/xacml+xml],produces=[],custom=[]}" onto public org.springframework.http.ResponseEntity org.snet.tresor.pdp.contexthandler.controller.ClientIdServiceIdPolicyStoreController.putPolicy(java.lang.String,java.lang.String,java.lang.String,org.springframework.security.core.userdetails.UserDetails) pdp_1 | Mapped "{[/policy/{clientId}/{serviceId}],methods=[DELETE],params=[],headers=[],consumes=[],produces=[],custom=[]}" onto public org.springframework.http.ResponseEntity org.snet.tresor.pdp.contexthandler.controller.ClientIdServiceIdPolicyStoreController.deletePolicy(java.lang.String,java.lang.String,org.springframework.security.core.userdetails.UserDetails) pdp_1 | Mapped "{[/policy/{clientId}/{serviceId}],methods=[GET],params=[],headers=[],consumes=[],produces=[application/xacml+xml],custom=[]}" onto public org.springframework.http.ResponseEntity org.snet.tresor.pdp.contexthandler.controller.ClientIdServiceIdPolicyStoreController.getPolicy(java.lang.String,java.lang.String,org.springframework.security.core.userdetails.UserDetails) pdp_1 | Mapped "{[/],methods=[GET],params=[],headers=[],consumes=[],produces=[application/xml],custom=[]}" onto public java.lang.String org.snet.tresor.pdp.contexthandler.controller.HomeController.getHomeDocument() pdp_1 | Mapped "{[/pdp],methods=[POST],params=[],headers=[],consumes=[application/xacml+xml],produces=[application/xacml+xml],custom=[]}" onto public org.springframework.http.ResponseEntity org.snet.tresor.pdp.contexthandler.controller.PDPController.getXACMLDecision(java.lang.String) pdp_1 | Mapped 
"{[/pdp],methods=[POST],params=[],headers=[],consumes=[application/samlassertion+xml],produces=[application/samlassertion+xml],custom=[]}" onto public org.springframework.http.ResponseEntity org.snet.tresor.pdp.contexthandler.controller.PDPController.getXACMLSAMLDecision(java.lang.String) pdp_1 | Looking for @ControllerAdvice: org.springframework.boot.context.embedded.AnnotationConfigEmbeddedWebApplicationContext@5a5a7c6https://github.com/org.springframework.boot.context.embedded.AnnotationConfigEmbeddedWebApplicationContext/tresor-tracking/commit/5a5a7c64: startup date [Thu Apr 09 10:49:08 UTC 2015]; root of context hierarchy pdp_1 | Registering beans for JMX exposure on startup pdp_1 | Registering beans for JMX exposure on startup pdp_1 | Starting beans in phase 0 pdp_1 | Located managed bean 'requestMappingEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=requestMappingEndpoint] pdp_1 | Located managed bean 'environmentEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=environmentEndpoint] pdp_1 | Located managed bean 'healthEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=healthEndpoint] pdp_1 | Located managed bean 'beansEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=beansEndpoint] pdp_1 | Located managed bean 'infoEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=infoEndpoint] pdp_1 | Located managed bean 'metricsEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=metricsEndpoint] pdp_1 | Located managed bean 'traceEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=traceEndpoint] pdp_1 | Located managed bean 'dumpEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=dumpEndpoint] pdp_1 | Located managed bean 'autoConfigurationAuditEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=autoConfigurationAuditEndpoint] pdp_1 | Located managed bean 'shutdownEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=shutdownEndpoint] pdp_1 | Located managed bean 'configurationPropertiesReportEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=configurationPropertiesReportEndpoint] pdp_1 | Initializing Spring FrameworkServlet 'dispatcherServlet' pdp_1 | FrameworkServlet 'dispatcherServlet': initialization started pdp_1 | FrameworkServlet 'dispatcherServlet': initialization completed in 41 ms pdp_1 | Started SelectChannelConnector@0.0.0.0:8080 pdp_1 | Jetty started on port: 8080 pdp_1 | Started ContextHandler in 13.091 seconds (JVM running for 25.588) broker_1 | W, [2015-04-09T11:31:50.007836 #1292] WARN -- : Overwriting existing field _id in class Client. broker_1 | W, [2015-04-09T11:31:50.016248 #1292] WARN -- : Overwriting existing field _id in class Provider. broker_1 | W, [2015-04-09T11:31:50.021832 #1292] WARN -- : Overwriting existing field _id in class ServiceBooking. broker_1 | W, [2015-04-09T11:31:50.040472 #1292] WARN -- : Overwriting existing field _id in class SDL::Base::Type::Service. broker_1 | W, [2015-04-09T11:31:50.181810 #1292] WARN -- : Overwriting existing field identifier in class SDL::Base::Type::SpecificLocation. 
broker_1 | W, [2015-04-09T11:31:50.183692 #1292] WARN -- : Overwriting existing field identifier in class SDL::Base::Type::UnspecificLocation. broker_1 | W, [2015-04-09T11:31:50.285543 #1292] WARN -- : Overwriting existing field identifier in class SDL::Base::Type::RestInterface. broker_1 | W, [2015-04-09T11:31:50.286157 #1292] WARN -- : Overwriting existing field identifier in class SDL::Base::Type::SoapInterface. broker_1 | W, [2015-04-09T11:31:50.287995 #1292] WARN -- : Overwriting existing field identifier in class SDL::Base::Type::XmlrpcInterface. broker_1 | W, [2015-04-09T11:31:50.456375 #1292] WARN -- : Overwriting existing field identifier in class SDL::Base::Type::ImmediateBooking. broker_1 | W, [2015-04-09T11:31:50.457492 #1292] WARN -- : Overwriting existing field identifier in class SDL::Base::Type::SynchronousBooking. broker_1 | I, [2015-04-09T11:31:50.462805 #1292] INFO -- : Loaded compendium. broker_1 | W, [2015-04-09T11:31:51.118468 #1292] WARN -- : MOPED: Could not resolve IP for: mongodb:27017 runtime: n/a broker_1 | W, [2015-04-09T11:31:51.119187 #1292] WARN -- : MOPED: Retrying connection attempt 1 more time(s). runtime: n/a broker_1 | W, [2015-04-09T11:31:51.376140 #1292] WARN -- : MOPED: Could not resolve IP for: mongodb:27017 runtime: n/a

lodygens commented 9 years ago

Dear all, I am not aware of the docker and docker-compose versions; I am only following https://github.com/TU-Berlin-SNET/tresor-ecosystem, where it says to launch a docker-compose command.

I propose to touch nothing; in particular, I will not modify the yml file.

My workflow suggestion, as beta tester:
1. I follow https://github.com/TU-Berlin-SNET/tresor-ecosystem
2. if no bug, go to 7
3. I report bug(s)
4. you correct
5. I update my local copy
6. go to 1
7. done

:)

mathiasslawik commented 9 years ago

Hi,

Unfortunately, MongoDB does not support running via Boot2Docker on Windows and OS X (see here), because it cannot store its databases on the VirtualBox shared folders that Boot2Docker uses.

You can put the MongoDB databases in the VM: enter the Boot2Docker VM via boot2docker ssh, create an empty folder, e.g. /root/mongodb-data, and change this line in docker-compose.yml:

- data/mongodb:/data/db

to

- /root/mongodb-data:/data/db
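
For reference, the whole change could look roughly like this. This is only a sketch: the folder name /root/mongodb-data is the one suggested above, and the mongodb service is assumed to otherwise keep the stock image and command settings (mongo:2.6 with --smallfiles):

$> boot2docker ssh
docker@boot2docker:~$ sudo mkdir -p /root/mongodb-data    # sudo, since /root belongs to root inside the VM
docker@boot2docker:~$ exit

mongodb:                          # excerpt of docker-compose.yml after the edit
  image: mongo:2.6
  command: mongod --smallfiles
  volumes:
    - /root/mongodb-data:/data/db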

Best regards, Mathias

lodygens commented 9 years ago

Here is what I have.

$> boot2docker ssh
docker@boot2docker:~$ pwd
/home/docker
docker@boot2docker:~$ ls -l
total 4
---------- 1 docker staff 29 Jan 1 1970 boot2docker, please format-me
drwxr-sr-x 2 docker staff 40 Apr 13 13:59 mongodb-data/
docker@boot2docker:~$ exit

$> head docker-compose.yml
mongodb:
  image: mongo:2.6
  command: mongod --smallfiles
  volumes:

elasticsearch:
  image: dockerfile/elasticsearch
pc-89155 : ~/Stratuslab/Cyclone/Tresor/tresor-ecosystem Lun avr 13 16:28:00 $>

lodygens commented 9 years ago

And docker-compose gives

Lun avr 13 16:10:01 $> docker-compose up Recreating tresorecosystem_mongodb_1... Recreating tresorecosystem_kibana_1... Recreating tresorecosystem_elasticsearch_1... Recreating tresorecosystem_esfilter_1... Recreating tresorecosystem_logstash_1... Recreating tresorecosystem_pdp_1... Recreating tresorecosystem_broker_1... Recreating tresorecosystem_pap_1... Recreating tresorecosystem_proxytls_1... Recreating tresorecosystem_proxyhttp_1... Attaching to tresorecosystem_kibana_1, tresorecosystem_elasticsearch_1, tresorecosystem_esfilter_1, tresorecosystem_logstash_1, tresorecosystem_pdp_1, tresorecosystem_broker_1, tresorecosystem_pap_1, tresorecosystem_proxytls_1, tresorecosystem_proxyhttp_1 kibana_1 | kibana-dyn-config provider listening on 0.0.0.0:3003 kibana_1 | Expecting elasticsearch at http://localhost:4004 esfilter_1 | 2015-04-13T14:10:22.282Z - info: SUCESS node-es-filter-proxy with target http://elasticsearch:9200 now listening on 0.0.0.0:4004 elasticsearch_1 | [2015-04-13 14:10:34,800][INFO ][node ] [Batragon] version[1.5.0], pid[1], build[5448160/2015-03-23T14:30:58Z] elasticsearch_1 | [2015-04-13 14:10:34,801][INFO ][node ] [Batragon] initializing ... elasticsearch_1 | [2015-04-13 14:10:34,937][INFO ][plugins ] [Batragon] loaded [], sites [] pap_1 | AH00558: apache2: Could not reliably determine the server's fully qualified domain name, using 172.17.0.17. Set the 'ServerName' directive globally to suppress this message broker_1 | --> Downloading a Phusion Passenger agent binary for your platform broker_1 | broker_1 | --> Installing Nginx 1.6.2 engine broker_1 | broker_1 | -------------------------- broker_1 | broker_1 | --> Compiling passenger_native_support.so for the current Ruby interpreter... broker_1 | (set PASSENGER_COMPILE_NATIVE_SUPPORT_BINARY=0 to disable) pdp_1 | 2015-04-13 14:11:07,348 ERROR TcpSocketManager (TCP:logstash:9400) java.net.ConnectException: Connection refused pdp_1 | pdp1 | . 
____ pdp1 | /\ / **'_ () _ \ \ \ \ pdp1 | ( ( )** | ' | '| | ' \/ ` | \ \ \ \ pdp1 | \/ )| |)| | | | | || (| | ) ) ) ) pdp1 | ' |____| .**|| ||| |_**, | / / / / pdp1 | =========||==============|__/=//// pdp_1 | :: Spring Boot :: (v1.1.9.RELEASE) pdp_1 | pdp_1 | Starting ContextHandler on 82220b81f00b with PID 1 (/opt/tresor-pdp.jar started by root in /opt) pdp_1 | 2015-04-13 14:11:10,920 ERROR Unable to write to stream TCP:logstash:9400 for appender LOGSTASH pdp_1 | 2015-04-13 14:11:10,921 ERROR An exception occurred processing Appender LOGSTASH org.apache.logging.log4j.core.appender.AppenderLoggingException: Error writing to TCP:logstash:9400 socket not available pdp_1 | at org.apache.logging.log4j.core.net.TcpSocketManager.write(TcpSocketManager.java:120) pdp_1 | at org.apache.logging.log4j.core.appender.OutputStreamManager.write(OutputStreamManager.java:136) pdp_1 | at org.apache.logging.log4j.core.appender.AbstractOutputStreamAppender.append(AbstractOutputStreamAppender.java:106) pdp_1 | at org.apache.logging.log4j.core.config.AppenderControl.callAppender(AppenderControl.java:97) pdp_1 | at org.apache.logging.log4j.core.config.LoggerConfig.callAppenders(LoggerConfig.java:428) pdp_1 | at org.apache.logging.log4j.core.config.LoggerConfig.log(LoggerConfig.java:407) pdp_1 | at org.apache.logging.log4j.core.config.LoggerConfig.log(LoggerConfig.java:365) pdp_1 | at org.apache.logging.log4j.core.Logger.logMessage(Logger.java:112) pdp_1 | at org.apache.logging.slf4j.Log4jLogger.log(Log4jLogger.java:374) pdp_1 | at org.apache.commons.logging.impl.SLF4JLocationAwareLog.info(SLF4JLocationAwareLog.java:159) pdp_1 | at org.springframework.boot.StartupInfoLogger.logStarting(StartupInfoLogger.java:52) pdp_1 | at org.springframework.boot.SpringApplication.logStartupInfo(SpringApplication.java:583) pdp_1 | at org.springframework.boot.SpringApplication.run(SpringApplication.java:308) pdp_1 | at org.springframework.boot.SpringApplication.run(SpringApplication.java:952) pdp_1 | at org.springframework.boot.SpringApplication.run(SpringApplication.java:941) pdp_1 | at org.snet.tresor.pdp.contexthandler.ContextHandler.main(ContextHandler.java:13) pdp_1 | at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) pdp_1 | at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) pdp_1 | at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) pdp_1 | at java.lang.reflect.Method.invoke(Method.java:606) pdp_1 | at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:53) pdp_1 | at java.lang.Thread.run(Thread.java:745) pdp_1 | pdp_1 | Refreshing org.springframework.boot.context.embedded.AnnotationConfigEmbeddedWebApplicationContext@2adc675: startup date [Mon Apr 13 14:11:11 UTC 2015]; root of context hierarchy pdp_1 | 2015-04-13 14:11:11,006 ERROR Unable to write to stream TCP:logstash:9400 for appender LOGSTASH pdp_1 | 2015-04-13 14:11:11,007 ERROR An exception occurred processing Appender LOGSTASH org.apache.logging.log4j.core.appender.AppenderLoggingException: Error writing to TCP:logstash:9400 socket not available pdp_1 | at org.apache.logging.log4j.core.net.TcpSocketManager.write(TcpSocketManager.java:120) pdp_1 | at org.apache.logging.log4j.core.appender.OutputStreamManager.write(OutputStreamManager.java:136) pdp_1 | at org.apache.logging.log4j.core.appender.AbstractOutputStreamAppender.append(AbstractOutputStreamAppender.java:106) pdp_1 | at 
org.apache.logging.log4j.core.config.AppenderControl.callAppender(AppenderControl.java:97) pdp_1 | at org.apache.logging.log4j.core.config.LoggerConfig.callAppenders(LoggerConfig.java:428) pdp_1 | at org.apache.logging.log4j.core.config.LoggerConfig.log(LoggerConfig.java:407) pdp_1 | at org.apache.logging.log4j.core.config.LoggerConfig.log(LoggerConfig.java:365) pdp_1 | at org.apache.logging.log4j.core.Logger.logMessage(Logger.java:112) pdp_1 | at org.apache.logging.slf4j.Log4jLogger.log(Log4jLogger.java:374) pdp_1 | at org.apache.commons.logging.impl.SLF4JLocationAwareLog.info(SLF4JLocationAwareLog.java:159) pdp_1 | at org.springframework.context.support.AbstractApplicationContext.prepareRefresh(AbstractApplicationContext.java:510) pdp_1 | at org.springframework.boot.context.embedded.AnnotationConfigEmbeddedWebApplicationContext.prepareRefresh(AnnotationConfigEmbeddedWebApplicationContext.java:175) pdp_1 | at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:449) pdp_1 | at org.springframework.boot.context.embedded.EmbeddedWebApplicationContext.refresh(EmbeddedWebApplicationContext.java:109) pdp_1 | at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:691) pdp_1 | at org.springframework.boot.SpringApplication.run(SpringApplication.java:320) pdp_1 | at org.springframework.boot.SpringApplication.run(SpringApplication.java:952) pdp_1 | at org.springframework.boot.SpringApplication.run(SpringApplication.java:941) pdp_1 | at org.snet.tresor.pdp.contexthandler.ContextHandler.main(ContextHandler.java:13) pdp_1 | at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) pdp_1 | at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) pdp_1 | at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) pdp_1 | at java.lang.reflect.Method.invoke(Method.java:606) pdp_1 | at org.springframework.boot.loader.MainMethodRunner.run(MainMethodRunner.java:53) pdp_1 | at java.lang.Thread.run(Thread.java:745) pdp_1 | elasticsearch_1 | [2015-04-13 14:11:11,847][INFO ][node ] [Batragon] initialized elasticsearch_1 | [2015-04-13 14:11:11,849][INFO ][node ] [Batragon] starting ... elasticsearch_1 | [2015-04-13 14:11:13,277][INFO ][transport ] [Batragon] bound_address {inet[/0:0:0:0:0:0:0:0:9300]}, publish_address {inet[/172.17.0.7:9300]} elasticsearch_1 | [2015-04-13 14:11:13,349][INFO ][discovery ] [Batragon] elasticsearch/bPua7auZQlazq3qVljIf8g pdp_1 | JSR-330 'javax.inject.Inject' annotation found and supported for autowiring pdp_1 | Apr 13, 2015 2:11:15 PM org.hibernate.validator.internal.util.Version pdp_1 | INFO: HV000001: Hibernate Validator 5.0.3.Final pdp_1 | Bean 'org.springframework.security.config.annotation.configuration.ObjectPostProcessorConfiguration' of type [class org.springframework.security.config.annotation.configuration.ObjectPostProcessorConfiguration$$EnhancerBySpringCGLIB$$76fe7841] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) broker_1 | Compilation succesful. The logs are here: broker_1 | /tmp/passenger_native_support-vp6ayz.log broker_1 | --> passenger_native_support.so successfully loaded. 
pdp_1 | Bean 'objectPostProcessor' of type [class org.springframework.security.config.annotation.configuration.AutowireBeanFactoryObjectPostProcessor] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) pdp_1 | Bean 'org.springframework.security.access.expression.method.DefaultMethodSecurityExpressionHandler@202fec2b' of type [class org.springframework.security.access.expression.method.DefaultMethodSecurityExpressionHandler] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) pdp_1 | Bean 'org.springframework.boot.autoconfigure.security.SecurityAutoConfiguration' of type [class org.springframework.boot.autoconfigure.security.SecurityAutoConfiguration$$EnhancerBySpringCGLIB$$894b971] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) broker_1 | =============== Phusion Passenger Standalone web server started =============== broker_1 | PID file: /root/tresor-broker/passenger.3000.pid broker_1 | Log file: /root/tresor-broker/log/passenger.3000.log broker_1 | Environment: production broker_1 | Accessible via: http://0.0.0.0:3000/ broker_1 | broker_1 | You can stop Phusion Passenger Standalone by pressing Ctrl-C. broker_1 | Problems? Check https://www.phusionpassenger.com/documentation/Users%20guide%20Standalone.html#troubleshooting broker_1 | =============================================================================== pdp_1 | Bean 'securityProperties' of type [class org.springframework.boot.autoconfigure.security.SecurityProperties] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) pdp_1 | Bean 'org.springframework.boot.actuate.autoconfigure.ManagementServerPropertiesAutoConfiguration' of type [class org.springframework.boot.actuate.autoconfigure.ManagementServerPropertiesAutoConfiguration$$EnhancerBySpringCGLIB$$ca3c3ccf] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) pdp_1 | Bean 'org.hibernate.validator.internal.constraintvalidators.NotNullValidator' of type [class org.hibernate.validator.internal.constraintvalidators.NotNullValidator] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) pdp_1 | Bean 'managementServerProperties' of type [class org.springframework.boot.actuate.autoconfigure.ManagementServerProperties] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) pdp_1 | Bean 'org.springframework.boot.actuate.autoconfigure.ManagementSecurityAutoConfiguration$ManagementSecurityPropertiesConfiguration' of type [class org.springframework.boot.actuate.autoconfigure.ManagementSecurityAutoConfiguration$ManagementSecurityPropertiesConfiguration$$EnhancerBySpringCGLIB$$6b0145c1] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) pdp_1 | Bean 'org.springframework.boot.autoconfigure.security.AuthenticationManagerConfiguration' of type [class org.springframework.boot.autoconfigure.security.AuthenticationManagerConfiguration$$EnhancerBySpringCGLIB$$5df0a8af] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) pdp_1 | Bean 'org.springframework.security.config.annotation.authentication.configuration.AuthenticationConfiguration' of type [class 
org.springframework.security.config.annotation.authentication.configuration.AuthenticationConfiguration$$EnhancerBySpringCGLIB$$6c4013f6] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) pdp_1 | Bean 'org.springframework.security.config.annotation.method.configuration.GlobalMethodSecurityConfiguration' of type [class org.springframework.security.config.annotation.method.configuration.GlobalMethodSecurityConfiguration$$EnhancerBySpringCGLIB$$ff33cf73] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) pdp_1 | Bean 'methodSecurityMetadataSource' of type [class org.springframework.security.access.method.DelegatingMethodSecurityMetadataSource] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) pdp_1 | Bean 'metaDataSourceAdvisor' of type [class org.springframework.security.access.intercept.aopalliance.MethodSecurityMetadataSourceAdvisor] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying) pdp_1 | Server initialized with port: 8080 pdp_1 | jetty-8.1.15.v20140411 elasticsearch_1 | [2015-04-13 14:11:17,198][INFO ][cluster.service ] [Batragon] new_master [Batragon][bPua7auZQlazq3qVljIf8g][82026eb87503][inet[/172.17.0.7:9300]], reason: zen-disco-join (elected_as_master) pdp_1 | Initializing Spring embedded WebApplicationContext pdp_1 | Root WebApplicationContext: initialization completed in 6216 ms elasticsearch_1 | [2015-04-13 14:11:17,356][INFO ][http ] [Batragon] bound_address {inet[/0:0:0:0:0:0:0:0:9200]}, publish_address {inet[/172.17.0.7:9200]} elasticsearch_1 | [2015-04-13 14:11:17,357][INFO ][node ] [Batragon] started pdp_1 | Mapped "{[/trace],methods=[GET],params=[],headers=[],consumes=[],produces=[],custom=[]}" onto public java.lang.Object org.springframework.boot.actuate.endpoint.mvc.EndpointMvcAdapter.invoke() pdp_1 | Mapped "{[/configprops],methods=[GET],params=[],headers=[],consumes=[],produces=[],custom=[]}" onto public java.lang.Object org.springframework.boot.actuate.endpoint.mvc.EndpointMvcAdapter.invoke() pdp1 | Mapped "{[/metrics/{name:.}],methods=[GET],params=[],headers=[],consumes=[],produces=[],custom=[]}" onto public java.lang.Object org.springframework.boot.actuate.endpoint.mvc.MetricsMvcEndpoint.value(java.lang.String) pdp_1 | Mapped "{[/metrics],methods=[GET],params=[],headers=[],consumes=[],produces=[],custom=[]}" onto public java.lang.Object org.springframework.boot.actuate.endpoint.mvc.EndpointMvcAdapter.invoke() pdp_1 | Mapped "{[/info],methods=[GET],params=[],headers=[],consumes=[],produces=[],custom=[]}" onto public java.lang.Object org.springframework.boot.actuate.endpoint.mvc.EndpointMvcAdapter.invoke() pdp_1 | Mapped "{[/beans],methods=[GET],params=[],headers=[],consumes=[],produces=[],custom=[]}" onto public java.lang.Object org.springframework.boot.actuate.endpoint.mvc.EndpointMvcAdapter.invoke() pdp_1 | Mapped "{[/dump],methods=[GET],params=[],headers=[],consumes=[],produces=[],custom=[]}" onto public java.lang.Object org.springframework.boot.actuate.endpoint.mvc.EndpointMvcAdapter.invoke() pdp1 | Mapped "{[/env/{name:.}],methods=[GET],params=[],headers=[],consumes=[],produces=[],custom=[]}" onto public java.lang.Object org.springframework.boot.actuate.endpoint.mvc.EnvironmentMvcEndpoint.value(java.lang.String) pdp_1 | Mapped "{[/env],methods=[GET],params=[],headers=[],consumes=[],produces=[],custom=[]}" onto public 
java.lang.Object org.springframework.boot.actuate.endpoint.mvc.EndpointMvcAdapter.invoke() pdp_1 | Mapped "{[/health],methods=[],params=[],headers=[],consumes=[],produces=[],custom=[]}" onto public java.lang.Object org.springframework.boot.actuate.endpoint.mvc.HealthMvcEndpoint.invoke() pdp_1 | Mapped "{[/mappings],methods=[GET],params=[],headers=[],consumes=[],produces=[],custom=[]}" onto public java.lang.Object org.springframework.boot.actuate.endpoint.mvc.EndpointMvcAdapter.invoke() pdp_1 | Mapped "{[/autoconfig],methods=[GET],params=[],headers=[],consumes=[],produces=[],custom=[]}" onto public java.lang.Object org.springframework.boot.actuate.endpoint.mvc.EndpointMvcAdapter.invoke() logstash_1 | {:timestamp=>"2015-04-13T14:11:19.190000+0000", :message=>"Using milestone 2 input plugin 'tcp'. This plugin should be stable, but if you see strange behavior, please let us know! For more information on plugin milestones, see http://logstash.net/docs/1.4.2/plugin-milestones", :level=>:warn} broker_1 | App 172 stdout: pdp_1 | Creating filter chain: Ant [pattern='/css/'], [] pdp_1 | Creating filter chain: Ant [pattern='/js/'], [] pdp_1 | Creating filter chain: Ant [pattern='/images/'], [] pdp_1 | Creating filter chain: Ant [pattern='//favicon.ico'], [] pdp_1 | Creating filter chain: Ant [pattern='/info'], [] pdp_1 | Creating filter chain: Ant [pattern='/health'], [] pdp1 | Creating filter chain: Ant [pattern='/policy/*'], [org.springframework.security.web.context.request.async.WebAsyncManagerIntegrationFilter@36b6447b, org.springframework.security.web.context.SecurityContextPersistenceFilter@7a87a785, org.springframework.security.web.header.HeaderWriterFilter@1ae894e0, org.springframework.security.web.authentication.logout.LogoutFilter@6bfc4d30, org.springframework.security.web.authentication.www.BasicAuthenticationFilter@71047cc7, org.springframework.security.web.savedrequest.RequestCacheAwareFilter@41577f32, org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter@52945e57, org.springframework.security.web.authentication.AnonymousAuthenticationFilter@784a034c, org.springframework.security.web.session.SessionManagementFilter@6bf26637, org.springframework.security.web.access.ExceptionTranslationFilter@1bba2502] pdp1 | Creating filter chain: OrRequestMatcher [requestMatchers=[Ant [pattern='/trace'], Ant [pattern='/trace/'], Ant [pattern='/trace.'], Ant [pattern='/configprops'], Ant [pattern='/configprops/'], Ant [pattern='/configprops.'], Ant [pattern='/metrics'], Ant [pattern='/metrics/'], Ant [pattern='/metrics.'], Ant [pattern='/beans'], Ant [pattern='/beans/'], Ant [pattern='/beans.'], Ant [pattern='/dump'], Ant [pattern='/dump/'], Ant [pattern='/dump.'], Ant [pattern='/env'], Ant [pattern='/env/'], Ant [pattern='/env.'], Ant [pattern='/mappings'], Ant [pattern='/mappings/'], Ant [pattern='/mappings.'], Ant [pattern='/autoconfig'], Ant [pattern='/autoconfig/'], Ant [pattern='/autoconfig._']]], [org.springframework.security.web.context.request.async.WebAsyncManagerIntegrationFilter@75e528fd, org.springframework.security.web.context.SecurityContextPersistenceFilter@38a3d255, org.springframework.security.web.header.HeaderWriterFilter@376f1fcc, org.springframework.security.web.authentication.logout.LogoutFilter@79bd8a7f, org.springframework.security.web.authentication.www.BasicAuthenticationFilter@546f68f, org.springframework.security.web.savedrequest.RequestCacheAwareFilter@b45e780, 
org.springframework.security.web.servletapi.SecurityContextHolderAwareRequestFilter@1a618a48, org.springframework.security.web.session.SessionManagementFilter@6090a682, org.springframework.security.web.access.ExceptionTranslationFilter@cd41408, org.springframework.security.web.access.intercept.FilterSecurityInterceptor@22e413c5] logstash_1 | {:timestamp=>"2015-04-13T14:11:19.674000+0000", :message=>"Using milestone 1 input plugin 'log4j'. This plugin should work, but would benefit from use by folks like you. Please let us know if you find bugs or have suggestions on how to improve this plugin. For more information on plugin milestones, see http://logstash.net/docs/1.4.2/plugin-milestones", :level=>:warn} logstash_1 | {:timestamp=>"2015-04-13T14:11:19.951000+0000", :message=>"Using milestone 2 filter plugin 'json'. This plugin should be stable, but if you see strange behavior, please let us know! For more information on plugin milestones, see http://logstash.net/docs/1.4.2/plugin-milestones", :level=>:warn} pdp_1 | Mapping servlet: 'dispatcherServlet' to [/] pdp1 | Mapping filter: 'metricFilter' to: [/] pdp1 | Mapping filter: 'springSecurityFilterChain' to: [/] pdp1 | Mapping filter: 'applicationContextIdFilter' to: [/] pdp_1 | Mapping filter: 'webRequestLoggingFilter' to: [/*] pdp_1 | Initialized FileBasedClientIdServiceIdPolicyStore with path /opt/policies elasticsearch_1 | [2015-04-13 14:11:20,360][INFO ][gateway ] [Batragon] recovered [2] indices into cluster_state pdp_1 | Initialized WeekdayPIP pdp_1 | Initialized LocationPIP with url http://ls.snet.tu-berlin.de:8080/pe/api/v2/pdp and authentication pdp_1 | Initialized PIPAttributeFinderModule, number of pips: 2 pdp_1 | Mapped "{[/policy/{clientId}/{serviceId}],methods=[DELETE],params=[],headers=[],consumes=[],produces=[],custom=[]}" onto public org.springframework.http.ResponseEntity org.snet.tresor.pdp.contexthandler.controller.ClientIdServiceIdPolicyStoreController.deletePolicy(java.lang.String,java.lang.String,org.springframework.security.core.userdetails.UserDetails) pdp_1 | Mapped "{[/policy/{clientId}/{serviceId}],methods=[PUT],params=[],headers=[],consumes=[application/xacml+xml],produces=[],custom=[]}" onto public org.springframework.http.ResponseEntity org.snet.tresor.pdp.contexthandler.controller.ClientIdServiceIdPolicyStoreController.putPolicy(java.lang.String,java.lang.String,java.lang.String,org.springframework.security.core.userdetails.UserDetails) pdp_1 | Mapped "{[/policy/{clientId}],methods=[GET],params=[],headers=[],consumes=[],produces=[application/json],custom=[]}" onto public org.springframework.http.ResponseEntity org.snet.tresor.pdp.contexthandler.controller.ClientIdServiceIdPolicyStoreController.getPolicies(java.lang.String,org.springframework.security.core.userdetails.UserDetails) throws com.fasterxml.jackson.core.JsonProcessingException pdp_1 | Mapped "{[/policy/{clientId}/{serviceId}],methods=[GET],params=[],headers=[],consumes=[],produces=[application/xacml+xml],custom=[]}" onto public org.springframework.http.ResponseEntity org.snet.tresor.pdp.contexthandler.controller.ClientIdServiceIdPolicyStoreController.getPolicy(java.lang.String,java.lang.String,org.springframework.security.core.userdetails.UserDetails) pdp_1 | Mapped "{[/],methods=[GET],params=[],headers=[],consumes=[],produces=[application/xml],custom=[]}" onto public java.lang.String org.snet.tresor.pdp.contexthandler.controller.HomeController.getHomeDocument() pdp_1 | Mapped 
"{[/pdp],methods=[POST],params=[],headers=[],consumes=[application/samlassertion+xml],produces=[application/samlassertion+xml],custom=[]}" onto public org.springframework.http.ResponseEntity org.snet.tresor.pdp.contexthandler.controller.PDPController.getXACMLSAMLDecision(java.lang.String) pdp_1 | Mapped "{[/pdp],methods=[POST],params=[],headers=[],consumes=[application/xacml+xml],produces=[application/xacml+xml],custom=[]}" onto public org.springframework.http.ResponseEntity org.snet.tresor.pdp.contexthandler.controller.PDPController.getXACMLDecision(java.lang.String) pdp_1 | Looking for @ControllerAdvice: org.springframework.boot.context.embedded.AnnotationConfigEmbeddedWebApplicationContext@2adc675: startup date [Mon Apr 13 14:11:11 UTC 2015]; root of context hierarchy pdp_1 | Registering beans for JMX exposure on startup pdp_1 | Registering beans for JMX exposure on startup pdp_1 | Starting beans in phase 0 pdp_1 | Located managed bean 'requestMappingEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=requestMappingEndpoint] pdp_1 | Located managed bean 'environmentEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=environmentEndpoint] pdp_1 | Located managed bean 'healthEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=healthEndpoint] pdp_1 | Located managed bean 'beansEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=beansEndpoint] pdp_1 | Located managed bean 'infoEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=infoEndpoint] pdp_1 | Located managed bean 'metricsEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=metricsEndpoint] pdp_1 | Located managed bean 'traceEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=traceEndpoint] pdp_1 | Located managed bean 'dumpEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=dumpEndpoint] pdp_1 | Located managed bean 'autoConfigurationAuditEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=autoConfigurationAuditEndpoint] pdp_1 | Located managed bean 'shutdownEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=shutdownEndpoint] pdp_1 | Located managed bean 'configurationPropertiesReportEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=configurationPropertiesReportEndpoint] pdp_1 | Initializing Spring FrameworkServlet 'dispatcherServlet' pdp_1 | FrameworkServlet 'dispatcherServlet': initialization started pdp_1 | FrameworkServlet 'dispatcherServlet': initialization completed in 51 ms pdp_1 | Started SelectChannelConnector@0.0.0.0:8080 pdp_1 | Jetty started on port: 8080 pdp_1 | Started ContextHandler in 14.231 seconds (JVM running for 49.041) broker_1 | W, [2015-04-13T14:11:34.049268 #172] WARN -- : Overwriting existing field _id in class Client. broker_1 | W, [2015-04-13T14:11:34.064520 #172] WARN -- : Overwriting existing field _id in class Provider. broker_1 | W, [2015-04-13T14:11:34.070755 #172] WARN -- : Overwriting existing field _id in class ServiceBooking. broker_1 | W, [2015-04-13T14:11:34.109366 #172] WARN -- : Overwriting existing field _id in class SDL::Base::Type::Service. 
broker_1 | W, [2015-04-13T14:11:34.243490 #172] WARN -- : Overwriting existing field identifier in class SDL::Base::Type::SpecificLocation. broker_1 | W, [2015-04-13T14:11:34.245689 #172] WARN -- : Overwriting existing field identifier in class SDL::Base::Type::UnspecificLocation. broker_1 | W, [2015-04-13T14:11:34.369435 #172] WARN -- : Overwriting existing field identifier in class SDL::Base::Type::RestInterface. broker_1 | W, [2015-04-13T14:11:34.372348 #172] WARN -- : Overwriting existing field identifier in class SDL::Base::Type::SoapInterface. broker_1 | W, [2015-04-13T14:11:34.373604 #172] WARN -- : Overwriting existing field identifier in class SDL::Base::Type::XmlrpcInterface. broker_1 | W, [2015-04-13T14:11:34.492299 #172] WARN -- : Overwriting existing field identifier in class SDL::Base::Type::ImmediateBooking. broker_1 | W, [2015-04-13T14:11:34.494083 #172] WARN -- : Overwriting existing field identifier in class SDL::Base::Type::SynchronousBooking. broker_1 | I, [2015-04-13T14:11:34.498348 #172] INFO -- : Loaded compendium. broker_1 | W, [2015-04-13T14:11:36.551378 #194] WARN -- : MOPED: Could not resolve IP for: mongodb:27017 runtime: n/a broker_1 | I, [2015-04-13T14:11:36.576350 #194] INFO -- : Started HEAD "/" for 127.0.0.1 at 2015-04-13 14:11:36 +0000 broker_1 | App 194 stdout: broker_1 | [ 2015-04-13 14:11:36.5999 133/7f5695f78700 Ser/Server.h:931 ]: [Client 1-1] Disconnecting client with error: client socket write error: Broken pipe (errno=32)

mathiasslawik commented 9 years ago

Hi,

Your MongoDB instance is still not starting. :worried:

Could you please run docker-compose run mongodb in the tresor-ecosystem directory and post its output?

Thanks in advance. :+1:

Best regards, Mathias

lodygens commented 9 years ago

In one terminal :

Mar avr 14 10:25:55 $> docker-compose run mongodb
2015-04-14T08:35:51.619+0000 [initandlisten] MongoDB starting : pid=1 port=27017 dbpath=/data/db 64-bit host=84462993842f
2015-04-14T08:35:51.620+0000 [initandlisten] db version v2.6.9
2015-04-14T08:35:51.620+0000 [initandlisten] git version: df313bc75aa94d192330cb92756fc486ea604e64
2015-04-14T08:35:51.620+0000 [initandlisten] build info: Linux build20.nj1.10gen.cc 2.6.32-431.3.1.el6.x86_64 #1 SMP Fri Jan 3 21:39:27 UTC 2014 x86_64 BOOST_LIB_VERSION=1_49
2015-04-14T08:35:51.620+0000 [initandlisten] allocator: tcmalloc
2015-04-14T08:35:51.620+0000 [initandlisten] options: { storage: { smallFiles: true } }
2015-04-14T08:35:51.621+0000 [initandlisten] journal dir=/data/db/journal
2015-04-14T08:35:51.621+0000 [initandlisten] recover : no journal files present, no recovery needed
2015-04-14T08:35:51.647+0000 [initandlisten] allocating new ns file /data/db/local.ns, filling with zeroes...
2015-04-14T08:35:51.658+0000 [FileAllocator] allocating new datafile /data/db/local.0, filling with zeroes...
2015-04-14T08:35:51.658+0000 [FileAllocator] creating directory /data/db/_tmp
2015-04-14T08:35:51.659+0000 [FileAllocator] done allocating datafile /data/db/local.0, size: 16MB, took 0 secs
2015-04-14T08:35:51.659+0000 [initandlisten] build index on: local.startup_log properties: { v: 1, key: { _id: 1 }, name: "_id_", ns: "local.startup_log" }
2015-04-14T08:35:51.660+0000 [initandlisten] added index to empty collection
2015-04-14T08:35:51.662+0000 [initandlisten] waiting for connections on port 27017

lodygens commented 9 years ago

In another terminal :

$> docker-compose up
bla bla ...
pdp_1 | Located managed bean 'configurationPropertiesReportEndpoint': registering with JMX server as MBean [org.springframework.boot:type=Endpoint,name=configurationPropertiesReportEndpoint]
pdp_1 | Initializing Spring FrameworkServlet 'dispatcherServlet'
pdp_1 | FrameworkServlet 'dispatcherServlet': initialization started
pdp_1 | FrameworkServlet 'dispatcherServlet': initialization completed in 61 ms
pdp_1 | Started SelectChannelConnector@0.0.0.0:8080
pdp_1 | Jetty started on port: 8080
pdp_1 | Started ContextHandler in 15.447 seconds (JVM running for 18.856)
broker_1 | App 197 stdout:
broker_1 | [ 2015-04-14 08:37:09.3267 136/7ffb5b3ac700 Ser/Server.h:931 ]: [Client 1-1] Disconnecting client with error: client socket write error: Broken pipe (errno=32)
broker_1 | W, [2015-04-14T08:37:09.291252 #197] WARN -- : MOPED: Could not resolve IP for: mongodb:27017 runtime: n/a
broker_1 | I, [2015-04-14T08:37:09.302226 #197] INFO -- : Started HEAD "/" for 127.0.0.1 at 2015-04-14 08:37:09 +0000
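
Since mongodb starts fine on its own but the broker still cannot resolve mongodb:27017 when the full stack is up, a quick check (a sketch only; the container name tresorecosystem_broker_1 is taken from the docker-compose output above) is to inspect the hosts entries inside the running broker container:

$> docker exec -i -t tresorecosystem_broker_1 cat /etc/hosts
# with a working container link there should be a line like "172.17.0.x  mongodb";
# if that entry is missing, the mongodb container has most likely exited before the broker was started.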