lukejacksonn / servor

Dependency free file server for single page app development
MIT License

Proxy support #42

Open subhero24 opened 4 years ago

subhero24 commented 4 years ago

I am using servor for SPA development, and would like to proxy my /api requests to my Node API backend. Does servor support proxying requests?

analytik commented 4 years ago

Hi. I have a similar use case. I already have a fairly simple Express instance that serves both html/js/css files and API requests. What I'd like to do is, when env === dev, serve the static files and the fallback through servor.

lukejacksonn commented 4 years ago

Hey all 👋 servor does not currently support this behaviour. It is really meant to be a local replica of a static CDN-like server (albeit with a few handy development features). I have never used servor in production; rather, I push the sites/apps I develop with it to something like Netlify or GitHub Pages.

That said, servor does have a Node API which you could use to combine it with another server such as express. That kind of defeats the purpose though, I guess.
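
Roughly, combining the two might look something like this (the express side is purely illustrative, and you would still end up with two servers on two ports):

const servor = require('servor');
const express = require('express');

(async () => {
  // Serve the SPA (with live reload) through servor's Node API on :8080
  await servor({ root: './site', fallback: 'index.html', reload: true, port: 8080 });

  // Run the API separately with express on :3000
  const api = express();
  api.get('/api/hello', (req, res) => res.json({ hello: 'world' }));
  api.listen(3000);
})();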

Have you got an example of how the API might look if we were to include such functionality in servor, and how you would want to configure it, etc.?

analytik commented 4 years ago

Thanks for the fast response! I'll see if I can make a prototype and figure out what I really want to do. I guess for me, the functionality I'm looking for is live reload. This is probably not a universal problem, but my setup is basically mixing node.js URLs with an HTML5 app. Sorry to subhero24 for maybe derailing his topic - in my case I am OK with serving Vue and the API from the same server, although ideally there would obviously be different microservices for those.

My router currently looks like this:

const {join} = require("path");

module.exports = function (express, router, debugController) {
    return function (app, rootPath) {
        debugController.setRouting(router);
        router.get('/', function (req, res) {
            return res.sendFile(`${rootPath}/site/index.html`);
        });
        router.get('/static/:page', function (req, res) {
            return res.sendFile(`${rootPath}/site/${req.params.page}.html`);
        });
        for (const route of getVueRoutes()) {
            // console.log(`-------------VUE- ${route} ---------`);
            router.get(route, function (req, res) {
                return res.sendFile(`${rootPath}/site/index.html`);
            });
        }
        app.use(router);
        app.use(express.static(join(rootPath, "site")));
    }
};

function getVueRoutes() {
    const requireModule = require("esm")(module);
    const routes = requireModule('../site/routes.esm.js');
    // console.dir(routes.default.routes);
    return routes.default.routes.map(vueRouteObject => vueRouteObject.path).filter(foo => foo !== '*');
}

Here express is a configured express instance, router is an express-promise-router, and debugController is just a controller whose setRouting adds a few API URLs. For the root, and for the Vue routes, I serve the main Vue app, index.html (ideally there would be some clever isomorphic functionality here, but whatever).

Then I can serve some plain HTML static files, and everything else (CSS, JS, assets) is served from the 'site' folder (whose name is invisible to browsers).

This is pretty clunky, but it does what I need. Mostly I want to see how far I can go with buildless Vue while compensating for some of the lost benefits of a build process. If I manage to make a usable snippet, I'll share it here today, or during the weekend.

analytik commented 4 years ago

OK, that wasn't actually that hard. Not sure if this helps anyone, but... shrug :D I tried.

Of course, for this to be sharable it would need wrapping into a neat configurable module, basically passing along the router object, base path, index file path, etc.
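
A very rough shape for such a wrapper might be (names are hypothetical, and it leans on the helpers defined in the snippet below):

const { join } = require('path');

// Hypothetical factory: attach the SPA index route and the dev live-reload
// endpoint to an existing express router. serveIndex, liveReloadController and
// watchChanges are the helpers from the snippet below; the env === dev guard
// around the watcher is omitted here for brevity.
module.exports = function servorExpress({ router, rootPath, siteDir = 'site' }) {
    router.get('/', serveIndex(rootPath));
    router.get('/livereload', liveReloadController);
    watchChanges(join(rootPath, siteDir));
    return router;
};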

To actually respond to the original question by @subhero24, see the proxy snippet at the end of this comment.

Please note that fs in this case is really bluebird.promisifyAll(require('fs')).
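
For reference, without the DI container that's roughly:

// bluebird's promisifyAll adds *Async variants, so fs.readFileAsync exists
// alongside the normal fs.watch used below.
const fs = require('bluebird').promisifyAll(require('fs'));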

const {join} = require('path');
const fs = require('../container.js').get('fs');
const clients = [];

module.exports = function (express, router, debugController) {
    return function (app, rootPath) {
        debugController.setRouting(router);
        router.get('/', serveIndex(rootPath));
        router.get('/static/:page', function (req, res) {
            return res.sendFile(`${rootPath}/site/${req.params.page}.html`);
        });
        router.get('/livereload', liveReloadController);
        for (const route of getVueRoutes()) {
            // console.log(`-------------VUE- ${route} ---------`);
            router.get(route, serveIndex(rootPath));
        }
        app.use(router);
        app.use(express.static(join(rootPath, 'site')));
        if (process.env.ENV === 'dev' || process.env.NODE_ENV === 'development') {
            console.log('Starting up file watcher for Servor.');
            watchChanges(join(rootPath, 'site'));
        }
    }
};

function getVueRoutes() {
    const requireModule = require('esm')(module);
    const routes = requireModule('../site/routes.esm.js');
    // console.dir(routes.default.routes);
    return routes.default.routes.map(vueRouteObject => vueRouteObject.path).filter(foo => foo !== '*');
}

// servor stuff from here on:

const liveReloadScript = `<head>
  <script>
    const source = new EventSource('/livereload');
    source.onmessage = e => location.reload(true);
    console.log('[servor-express] listening for file changes');
  </script>
`;

function serveIndex(rootPath) {
    if (process.env.ENV === 'dev' || process.env.NODE_ENV === 'development') {
        return async function (req, res) {
            let content = await fs.readFileAsync(`${rootPath}/site/index.html`, 'utf8');
            content = content.replace('<head>', liveReloadScript);
            return res.send(content).end();
        }
    }
    return function (req, res) {
        return res.sendFile(`${rootPath}/site/index.html`);
    }
}

function sendMessage (res, channel, data) {
    res.write(`event: ${channel}\nid: 0\ndata: ${data}\n`);
    res.write('\n\n');
}

function watchChanges(path) {
    fs.watch(path, {recursive: true}, () => {
        while (clients.length > 0) {
            sendMessage(clients.pop(), 'message', 'reload');
        }
    });
}

function liveReloadController(req, res) {
    res.writeHead(200, {
        Connection: 'keep-alive',
        'Content-Type': 'text/event-stream',
        'Cache-Control': 'no-cache',
        'Access-Control-Allow-Origin': '*'
    });
    sendMessage(res, 'connected', 'ready');
    setInterval(sendMessage, 60000, res, 'ping', 'waiting');
    clients.push(res);
}

how 2 proxy:

// rp = request-promise
async function proxy(req, res) {
    try {
        const resp = await rp({
            url: `http://127.0.0.1:80${req.path}`,
            method: req.method,
            resolveWithFullResponse: true,
            headers: req.headers,
            simple: false
        });
        // Set the status before sending; calling .status() after .send() has no effect.
        return res
            .status(resp.statusCode)
            .append('content-type', resp.headers['content-type'] || 'application/json; charset=utf-8')
            .send(resp.body);
    }
    catch (err) {
        console.error(err);
        return res
            .status(err && err.statusCode || 500)
            .send(err && err.body || 'Server error');
    }
}
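
To wire that into the router above, something like this should do (illustrative only; it assumes the backend on 127.0.0.1:80 expects the /api prefix, and note that req.path drops the query string, so req.originalUrl may be the better choice if your API uses query parameters):

// Forward every /api/... request to the backend, whatever the HTTP method:
router.all('/api/*', proxy);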

chiefjester commented 4 years ago

Hi @lukejacksonn 👋🏻

I'm just dropping in here to support this issue and add a use case. Ideally, you should just drop files on a CDN. In a microservice world, it's better to have a relative URL, as opposed to calling the API on an external origin, because you skip the whole tango dance with CORS. And let's be honest, CORS is just a pain to configure 🤣.

We've been using this approach for years now, and hooking sites up to Cloudflare makes websites hosted on Nginx get served in a CDN-like fashion.

Ideally, the API would look like this if we're using a config. If we were to use the CLI, I propose we use -P like http-server does.

{
  "proxy": [
    { "path": "api/**", "destination": "https://xxx.api/" }
  ]
}
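
For comparison, http-server's flag mentioned above looks roughly like this (it proxies any request it can't resolve locally to the given URL):

http-server ./site -P http://localhost:3000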

With the rise of browsers blocking third-party scripts by default, I think this approach will only become more popular.

And I think this is going to become the norm: Netlify, Firebase, Zeit, and Cloudflare all have a convenient way to proxy a relative API path to a microservice/external URL. And they're all backed by a CDN.

lukejacksonn commented 4 years ago

Thanks @thisguychris, that does make things easier to comprehend. So essentially you want to reserve a specific route that makes a request to another specified server with the same path sans the /api? Something like:

if (url.match(config.proxy.path)) {
  fetch(config.proxy.destination + url.replace(config.proxy.path, ''))
    .then(response => response.json())
    .then(json => res.send(json))
}

But probably as a raw stream rather than parsed json or whatever?
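
For what it's worth, a raw streaming version could be done with just Node's built-in http module, something like this (a rough sketch only, with a single destination assumed; nothing like this exists in servor yet):

const http = require('http');

// Hypothetical helper: pipe a matched request straight through to the
// destination and stream the response back, without parsing the body.
function streamProxy(req, res, destination) {
  // Resolve e.g. '/api/users?page=2' against 'http://127.0.0.1:3000'
  const target = new URL(req.url, destination);
  const upstream = http.request(
    target,
    { method: req.method, headers: req.headers },
    (proxied) => {
      // Forward status code and headers, then pipe the body through untouched.
      res.writeHead(proxied.statusCode, proxied.headers);
      proxied.pipe(res);
    }
  );
  upstream.on('error', () => {
    res.writeHead(502);
    res.end('Bad gateway');
  });
  // Stream any request body (POST/PUT payloads) to the destination as well.
  req.pipe(upstream);
}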

chiefjester commented 4 years ago

@lukejacksonn yes! something like that 😊

subhero24 commented 4 years ago

@lukejacksonn that would be perfect!

danbroooks commented 4 years ago

@lukejacksonn +1 for this

lukejacksonn commented 4 years ago

I'm going to work on this along with some other PRs and issues. It might make it into the next release... not 100% on it yet though. I think it will rely mostly on implementing support for picking up a servor config file, so that we don't have to keep adding CLI flags.
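
Purely as a strawman (file name and shape are hypothetical, nothing is decided yet), such a config file could simply reuse the proxy block proposed above, e.g. a servor.config.json containing:

{
  "proxy": [
    { "path": "api/**", "destination": "https://xxx.api/" }
  ]
}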