nasa-jpl-memex / memex-explorer

Viewers for statistics and dashboarding of Domain Search Engine data
BSD 2-Clause "Simplified" License

ddt integration - proxy configuration #599

Open ahmadia opened 9 years ago

ahmadia commented 9 years ago

We are not correctly mapping requests from:

http://explorer.continuum.io/ddt

in such a way that the backing server's resources are served correctly. For example, root-level resources are still coming from http://explorer.continuum.io/ when they need to be proxied to localhost:8084

Everything works as expected on:

http://explorer.continuum.io:8084/ or http://explorer.continuum.io:8084/seedcrawler (I've aliased seedcrawler to the index)

If I load: http://explorer.continuum.io/ddt

It tries to load:

http://explorer.continuum.io/getAvailableCrawlers
http://explorer.continuum.io/getAvailableProjectionAlgorithms

Instead of:

http://explorer.continuum.io/ddt/getAvailableCrawlers
http://explorer.continuum.io/ddt/getAvailableProjectionAlgorithms

@amfarrell, you've got experience with nginx proxy pass configuration, can you have a go at figuring out how to do this correctly? Note that the tool itself still isn't working properly, but I'd like us to be able to expose it at: http://explorer.continuum.io/ddt without exposing the 8084 port.

If this is a cherrypy configuration issue, we can also try to modify that as well, but it would be more complicated.
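For reference, a minimal nginx sketch of the mapping being asked for (the surrounding server block and headers are assumptions, not the actual production config; the trailing slash on proxy_pass is what strips the /ddt/ prefix before the request reaches the upstream):

```nginx
# Proxy /ddt/ on port 80 to the app on port 8084.
# /ddt/seedcrawler -> http://127.0.0.1:8084/seedcrawler
location /ddt/ {
    proxy_pass http://127.0.0.1:8084/;   # trailing slash strips the /ddt/ prefix
    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;
}
```

Note this only maps incoming requests; as the rest of this thread works out, it does not fix absolute paths that the app itself generates in its responses.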

ahmadia commented 9 years ago

@amfarrell - Note that the production server is a bit of a sticky mess right now. I think you know what I did with the docker move, I also switched our branch over to a new one and merged in @brittainhard's first pass at integration.

I don't care if we have some manual steps now, so long as we keep track of them (nginx configuration needs to be copied over, server needs to be restarted, after a provision).

amfarrell commented 9 years ago

oh, wow. I just realized what this is. This is the issue I thought we would run into when we decided to serve things at explorer.continuum.io/ddt/ rather than ddt.explorer.continuum.io. At the time we agreed we would solve this by using the HTML link-rewriting capability of whatever webserver we were using. I guess it is time to look into that.

ahmadia commented 9 years ago

This is well out of my bailiwick, but can't we fix this with a root directive for the ddt location?

amfarrell commented 9 years ago

Nope, that sets the filesystem location which nginx looks in for static file responses. We need http://wiki.nginx.org/HttpSubModule
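A hypothetical sketch of what using the sub module would look like (the directive names are real nginx directives; the specific patterns and location block are guesses, and multiple sub_filter rules per location require nginx 1.9.4+):

```nginx
location /ddt/ {
    proxy_pass http://127.0.0.1:8084/;
    sub_filter_once off;                  # rewrite every occurrence, not just the first
    sub_filter 'href="/' 'href="/ddt/';  # rewrite absolute links in proxied HTML
    sub_filter 'src="/'  'src="/ddt/';
    proxy_set_header Accept-Encoding "";  # sub_filter cannot rewrite compressed bodies
}
```

One caveat: sub_filter rewrites response bodies as they pass through nginx, so it would not touch request URLs that client-side JavaScript constructs at runtime.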

ahmadia commented 9 years ago

Okay, this seems relevant: http://serverfault.com/questions/281741/multiple-sub-filter-rules-for-a-reverse-proxy-on-nginx

amfarrell commented 9 years ago

This app serves a JS file, "dataaccess.js", and on line 128 of that file it calls runQuery('/getAvailableCrawlers', {}, onAvailableCrawlersLoaded); where runQuery is defined as

var runQuery = function(query, args, onCompletion, doneCb) { $.post(query, args, onCompletion); };

This means that the application itself is making the request for an absolute path. There isn't a way to re-write this on the proxy side. Instead, we need the application to request a relative path, or to prepend the query with the value of an environment variable named URL_BASE. The former is preferable.
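A minimal sketch of the second option. URL_BASE and resolveQuery are illustrative names, not identifiers that exist in dataaccess.js:

```javascript
// Hypothetical sketch: resolve the app's internal query paths against a
// configurable base path so the app works when proxied under /ddt.
var URL_BASE = '/ddt'; // in practice, injected from configuration/environment

function resolveQuery(urlBase, query) {
  if (!urlBase) {
    return query; // no base configured: keep the original absolute path
  }
  // join base and query path with exactly one slash
  return urlBase.replace(/\/+$/, '') + query;
}

// The original call on line 128 would then become something like:
// runQuery(resolveQuery(URL_BASE, '/getAvailableCrawlers'), {}, onAvailableCrawlersLoaded);
```

Switching the hard-coded paths to relative ones (e.g. 'getAvailableCrawlers' resolved against the current page URL) would avoid needing any configuration at all, which is why it is the preferred fix.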

ahmadia commented 9 years ago

Awesome. Can you provide a pull request directly to https://github.com/ViDA-NYU/domain_discovery_tool with this and any other suggested fixes and cc me? We can build and test a Docker image that addresses this issue.

ahmadia commented 9 years ago

This has been pushed to the DDT folks.

brittainhard commented 9 years ago

@ahmadia can this be closed?

ahmadia commented 9 years ago

No, unfortunately.