ontoportal / ontoportal-project

OntoPortal Alliance centralized repository for the management of the OntoPortal project
https://ontoportal.org/

Proxy error with http requests #34

Closed Bouchemel-Nasreddine closed 5 months ago

Bouchemel-Nasreddine commented 7 months ago

Hey, I have been trying to renew the license of IndustryPortal. When entering the key and clicking "Renew", it returns "something went wrong...". When checking the logs on the server, I noticed that the request sent never receives a response, which causes a timeout (for the license case it happens here):

[Mon Nov 20 14:38:07.959412 2023] [proxy_http:error] [pid 30609:tid 140530960807680] (70007)The timeout specified has expired: [client 192.168.44.5:53370] AH01102: error reading status line from remote server industryportal.enit.fr:8080
[Mon Nov 20 14:38:07.959568 2023] [proxy:error] [pid 30609:tid 140530960807680] [client 192.168.44.5:53370] AH00898: Error reading from remote server returned by /admin/update_info

This is the case with all requests made through LinkedData::Client::HTTP.get(). The same thing happens when I manually send the request that was issued (http://data.industryportal.enit.fr/admin/update_info?action=update_info&controller=admin&ncbo_cache_buster=1700490849.9199219&apikey=): it shows the same behavior.
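For completeness, the manual request can be reproduced from the command line like this (a minimal sketch; the API key is left empty exactly as in the URL above, and the 70 s cap is only there so the 60 s proxy timeout has time to surface):

# Reproduce the failing request and watch it hit the proxy timeout (~60 s)
$ curl -v --max-time 70 "http://data.industryportal.enit.fr/admin/update_info?action=update_info&controller=admin&ncbo_cache_buster=1700490849.9199219&apikey="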

Additional context: a logical solution in this case would be to configure the LinkedData HTTP client that is used with proxy settings.

syphax-bouazzouni commented 7 months ago

Can you show us your API logs? They can be found in /srv/ontoportal/ontoportal_api/current/logs. Look into the files appliance.log or unicorn.stderr.log and search for /admin/update_info.
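For example, something like this should surface the relevant entries (a sketch assuming the log paths above):

# Show the most recent log entries mentioning the failing endpoint
$ cd /srv/ontoportal/ontoportal_api/current/logs
$ grep -n "/admin/update_info" appliance.log unicorn.stderr.log | tail -n 20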

Bouchemel-Nasreddine commented 7 months ago

In the appliance.log file I get this:

193.50.189.11, 193.50.189.11 - nass [20/Nov/2023:15:43:54 +0000] "GET /admin/update_info?ncbo_cache_buster=1700494974.1846325 HTTP/1.0" 200 190 60.0781

In the unicorn.stderr.log file I get:

193.50.189.11, 193.50.189.11 - nass [20/Nov/2023:15:47:49 +0000] "GET /admin/update_info?action=update_info&controller=admin&ncbo_cache_buster=1700495209.6361096 HTTP/1.0" 200 190 60.0468
193.50.189.11, 193.50.189.11 - nass [20/Nov/2023:15:48:11 +0000] "GET /admin/update_info?ncbo_cache_buster=1700495231.237694 HTTP/1.0" 200 190 60.0920
syphax-bouazzouni commented 7 months ago

The configuration of the appliance proxy is in the folder /etc/httpd/conf.d. In one of the .conf files you should see something like this:

<VirtualHost *:443>
  ServerName data.stageportal.lirmm.fr
  ProxyPreserveHost On
  AllowEncodedSlashes On
  SSLProxyEngine on
  SSLProxyVerify none
  SSLProxyCheckPeerCN off
  SSLProxyCheckPeerName off
  SSLProxyCheckPeerExpire off

  ProxyPass / https://stageportal.lirmm.fr:8443/ nocanon
  ProxyPassReverse / https://stageportal.lirmm.fr:8443/ nocanon

  Include /etc/letsencrypt/options-ssl-apache.conf
  SSLCertificateFile /etc/letsencrypt/live/services.stageportal.lirmm.fr/cert.pem
  SSLCertificateKeyFile /etc/letsencrypt/live/services.stageportal.lirmm.fr/privkey.pem
</VirtualHost>

This defines the proxy that forwards requests for data.stageportal.lirmm.fr to stageportal.lirmm.fr:8443 (in your case, replace stageportal.lirmm.fr with industryportal.enit.fr).
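If it is not obvious which .conf file holds that block, a quick way to locate it (assuming the standard /etc/httpd/conf.d layout mentioned above):

# Find the virtual host files that define the reverse proxy
$ grep -rl ProxyPass /etc/httpd/conf.d/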

The issue here, I think, is that when you called /admin/update_info it didn't respond and reached the proxy timeout (60 s).

So the real question is: does the endpoint /admin/update_info actually work? We had a similar issue recently, I think (#22), but I don't remember what the solution was (@jvendetti and @alexskr).

Anyway, I don't think the issue is in the proxy. You (@Bouchemel-Nasreddine) can try to set a larger timeout to see the real issue behind it, and give us the log.
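For what it's worth, a larger backend timeout can be set in the virtual host shown above; a minimal sketch, to be adapted to the industryportal host (the 300 s value is arbitrary, only meant to outlast whatever the backend is doing):

  # Raise the default 60 s backend timeout for this virtual host
  ProxyTimeout 300
  # or, equivalently, per mapping:
  ProxyPass / http://industryportal.enit.fr:8080/ nocanon timeout=300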

syphax-bouazzouni commented 7 months ago

Set as not related, but saved for later. From @Bouchemel-Nasreddine: I'm not sure if this is related, but in the unicorn error file, at every single request on the portal, I get another warning about an unknown keyword:

cache error: unknown keyword: :namespace
/srv/ontoportal/.bundle/ruby/2.7.0/gems/redis-client-0.14.1/lib/redis_client/config.rb:21:in `initialize'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/redis-client-0.14.1/lib/redis_client/config.rb:184:in `initialize'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/redis-client-0.14.1/lib/redis_client.rb:143:in `new'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/redis-client-0.14.1/lib/redis_client.rb:143:in `config'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/redis-5.0.6/lib/redis/client.rb:23:in `config'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/redis-5.0.6/lib/redis.rb:157:in `initialize_client'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/redis-5.0.6/lib/redis.rb:73:in `initialize'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/redis-store-1.1.7/lib/redis/store.rb:10:in `initialize'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/redis-store-1.1.7/lib/redis/store/factory.rb:27:in `new'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/redis-store-1.1.7/lib/redis/store/factory.rb:27:in `create'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/redis-store-1.1.7/lib/redis/store/factory.rb:10:in `create'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/redis-rack-cache-1.2.4/lib/rack/cache/redis_metastore.rb:26:in `initialize'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/redis-rack-cache-1.2.4/lib/rack/cache/redis_metastore.rb:17:in `new'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/redis-rack-cache-1.2.4/lib/rack/cache/redis_metastore.rb:17:in `resolve'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/rack-cache-1.13.0/lib/rack/cache/storage.rb:38:in `create_store'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/rack-cache-1.13.0/lib/rack/cache/storage.rb:18:in `resolve_metastore_uri'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/rack-cache-1.13.0/lib/rack/cache/context.rb:35:in `metastore'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/rack-cache-1.13.0/lib/rack/cache/context.rb:176:in `lookup'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/rack-cache-1.13.0/lib/rack/cache/context.rb:67:in `call!'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/rack-cache-1.13.0/lib/rack/cache/context.rb:52:in `call'
/srv/ontoportal/ontologies_linked_data/lib/ontologies_linked_data/security/access_denied_middleware.rb:10:in `call'
/srv/ontoportal/ontologies_linked_data/lib/ontologies_linked_data/security/authorization.rb:45:in `call'
/srv/ontoportal/ontologies_api/releases/20220313093013/lib/rack/param_translator.rb:47:in `call'
/srv/ontoportal/.bundle/ruby/2.7.0/bundler/gems/rack-post-body-to-params-0fd30e710386/lib/rack/post-body-to-params.rb:144:in `call'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/rack-accept-0.4.5/lib/rack/accept/context.rb:22:in `call'
/srv/ontoportal/ontologies_api/releases/20220313093013/lib/rack/slice_detection.rb:36:in `call'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/rack-cors-1.0.6/lib/rack/cors.rb:98:in `call'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/rack-cors-1.0.6/lib/rack/cors.rb:98:in `call'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/rack-1.6.13/lib/rack/static.rb:124:in `call'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/rack-1.6.13/lib/rack/commonlogger.rb:33:in `call'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/sinatra-1.4.8/lib/sinatra/base.rb:219:in `call'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/rack-protection-1.5.5/lib/rack/protection/xss_header.rb:18:in `call'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/rack-protection-1.5.5/lib/rack/protection/json_csrf.rb:18:in `call'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/rack-protection-1.5.5/lib/rack/protection/base.rb:49:in `call'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/rack-protection-1.5.5/lib/rack/protection/base.rb:49:in `call'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/rack-protection-1.5.5/lib/rack/protection/frame_options.rb:31:in `call'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/rack-1.6.13/lib/rack/logger.rb:15:in `call'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/rack-1.6.13/lib/rack/commonlogger.rb:33:in `call'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/sinatra-1.4.8/lib/sinatra/base.rb:219:in `call'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/sinatra-1.4.8/lib/sinatra/base.rb:212:in `call'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/rack-1.6.13/lib/rack/head.rb:13:in `call'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/rack-1.6.13/lib/rack/methodoverride.rb:22:in `call'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/sinatra-1.4.8/lib/sinatra/base.rb:182:in `call'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/sinatra-1.4.8/lib/sinatra/base.rb:2013:in `call'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/sinatra-1.4.8/lib/sinatra/base.rb:1487:in `block in call'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/sinatra-1.4.8/lib/sinatra/base.rb:1787:in `synchronize'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/sinatra-1.4.8/lib/sinatra/base.rb:1487:in `call'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/unicorn-5.8.0/lib/unicorn/http_server.rb:634:in `process_client'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/unicorn-worker-killer-0.4.5/lib/unicorn/worker_killer.rb:53:in `process_client'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/unicorn-worker-killer-0.4.5/lib/unicorn/worker_killer.rb:93:in `process_client'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/unicorn-5.8.0/lib/unicorn/http_server.rb:732:in `worker_loop'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/unicorn-5.8.0/lib/unicorn/http_server.rb:548:in `spawn_missing_workers'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/unicorn-5.8.0/lib/unicorn/http_server.rb:144:in `start'
/srv/ontoportal/.bundle/ruby/2.7.0/gems/unicorn-5.8.0/bin/unicorn:128:in `<top (required)>'
/srv/ontoportal/.bundle/ruby/2.7.0/bin/unicorn:23:in `load'
/srv/ontoportal/.bundle/ruby/2.7.0/bin/unicorn:23:in `<top (required)>'
/usr/local/rbenv/versions/2.7.0/lib/ruby/gems/2.7.0/gems/bundler-2.3.15/lib/bundler/cli/exec.rb:58:in `load'
/usr/local/rbenv/versions/2.7.0/lib/ruby/gems/2.7.0/gems/bundler-2.3.15/lib/bundler/cli/exec.rb:58:in `kernel_load'
/usr/local/rbenv/versions/2.7.0/lib/ruby/gems/2.7.0/gems/bundler-2.3.15/lib/bundler/cli/exec.rb:23:in `run'
/usr/local/rbenv/versions/2.7.0/lib/ruby/gems/2.7.0/gems/bundler-2.3.15/lib/bundler/cli.rb:483:in `exec'
/usr/local/rbenv/versions/2.7.0/lib/ruby/gems/2.7.0/gems/bundler-2.3.15/lib/bundler/vendor/thor/lib/thor/command.rb:27:in `run'
/usr/local/rbenv/versions/2.7.0/lib/ruby/gems/2.7.0/gems/bundler-2.3.15/lib/bundler/vendor/thor/lib/thor/invocation.rb:127:in `invoke_command'
/usr/local/rbenv/versions/2.7.0/lib/ruby/gems/2.7.0/gems/bundler-2.3.15/lib/bundler/vendor/thor/lib/thor.rb:392:in `dispatch'
/usr/local/rbenv/versions/2.7.0/lib/ruby/gems/2.7.0/gems/bundler-2.3.15/lib/bundler/cli.rb:31:in `dispatch'
/usr/local/rbenv/versions/2.7.0/lib/ruby/gems/2.7.0/gems/bundler-2.3.15/lib/bundler/vendor/thor/lib/thor/base.rb:485:in `start'
/usr/local/rbenv/versions/2.7.0/lib/ruby/gems/2.7.0/gems/bundler-2.3.15/lib/bundler/cli.rb:25:in `start'
/usr/local/rbenv/versions/2.7.0/lib/ruby/gems/2.7.0/gems/bundler-2.3.15/exe/bundle:48:in `block in <top (required)>'
/usr/local/rbenv/versions/2.7.0/lib/ruby/gems/2.7.0/gems/bundler-2.3.15/lib/bundler/friendly_errors.rb:117:in `with_friendly_errors'
/usr/local/rbenv/versions/2.7.0/lib/ruby/gems/2.7.0/gems/bundler-2.3.15/exe/bundle:36:in `<top (required)>'
/usr/local/rbenv/versions/2.7.0/bin/bundle:23:in `load'
/usr/local/rbenv/versions/2.7.0/bin/bundle:23:in `<main>'
jvendetti commented 7 months ago

Issue 22 that you referenced above required code changes to address two issues:

  1. The appliance ID was absent from the Administration Console if the response from the /admin/update_info endpoint contained an error code. Fixed by this commit: https://github.com/ncbo/bioportal_web_ui/commit/9bf940509e17578de001352ea1a22e0db2b5a46d.
  2. The VirtualApplianceIdValidator prevented license validation if the response from the /admin/update_info endpoint contained any error codes. Addressed in this commit: https://github.com/ncbo/bioportal_web_ui/commit/5d55a8ce64e88451d5ae30560076cb9d1ffd5dd1.

According to @alexskr, appliance versions 3.2.0 and above contain these fixes: https://github.com/ontoportal/ontoportal-project/issues/22#issuecomment-1801095027.

It's not evident to me that this problem report is the same as issue 22. If you enter http://data.industryportal.enit.fr/admin/update_info in a browser without any parameters tacked on, what JSON response do you see?

Bouchemel-Nasreddine commented 7 months ago

Following the steps mentioned in #22, I've been able to renew it, though the problem is still there.

@jvendetti When doing it in the browser (with the API key, because without it the endpoint just returns a 403 access denied), it returns the same response mentioned earlier, and I think this answers @syphax-bouazzouni's question about whether it worked in the backend or not.

@syphax-bouazzouni For the conf file you provided, I already have the reverse proxy configuration there:

<VirtualHost *:80>
  ServerName data.industryportal.test.enit.fr
  ProxyPreserveHost On
  AllowEncodedSlashes On
  ProxyPass / http://industryportal.test.enit.fr:8080/ nocanon
  ProxyPassReverse / http://industryportal.test.enit.fr:8080/ nocanon
</VirtualHost>

Now I'm not sure how the update manager works, but my guess is that it tries to reach a certain URL to verify the submitted key. If that is the case and the URL is external, it will not be reachable unless the request is configured to go through the proxy.
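One way to check this hypothesis from the appliance itself (a sketch; proxy.example.com:3128 is only a placeholder for the enterprise proxy address):

# Does an outbound request work without any proxy?
$ curl -sI https://ontoportal.org | head -n 1
# And when forced through the enterprise proxy?
$ curl -sI -x http://proxy.example.com:3128 https://ontoportal.org | head -n 1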

syphax-bouazzouni commented 6 months ago

We figured out a solution with @Bouchemel-Nasreddine.

@Bouchemel-Nasreddine can you put the solution here, and close this issue? Thanks

Bouchemel-Nasreddine commented 5 months ago

Finally, the problem was indeed the enterprise proxy preventing requests from reaching the internet. The solution was to define a system-wide proxy configuration. To do so:

$ cd /etc/profile.d
$ nano proxy.sh (or any other editor)

Put the following in the file:

export http_proxy=http://username:password@your_proxy_server:your_proxy_port
export https_proxy=http://username:password@your_proxy_server:your_proxy_port

# Specify addresses to bypass the proxy, optional
export no_proxy="localhost,0.0.0.0"

Save and close, then make the file executable:

$ chmod +x proxy.sh

The system will now automatically source the proxy.sh script during user logins, setting the defined proxy variables.
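To confirm the settings are picked up after logging back in, a quick sanity check could be (nothing OntoPortal-specific here):

# The variables should be set in any new login shell
$ echo "$http_proxy" "$https_proxy" "$no_proxy"
# And outbound requests should now go through the enterprise proxy
$ curl -sI https://ontoportal.org | head -n 1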