redis / ioredis


Package size #508

Closed · adjourn closed this issue 7 years ago

adjourn commented 7 years ago

Now that serverless computing is gaining popularity & size matters more than ever, should we reevaluate this client from a size & modularity perspective (e.g. lodash)? I tried browsing the dependencies and files to figure it out, but this codebase is too alien for me to evaluate accurately. It doesn't seem to concern many people; I didn't find any discussion about this except https://github.com/luin/ioredis/issues/286 & https://github.com/luin/ioredis/pull/494.


My tests with Rollup.js & an example backend (bundled to a single flat file including all dependencies; a config sketch follows the figures below):

Normal

  1. Without Redis clients: ~520KB
  2. With redis client: ~798KB
  3. With ioredis client: ~1591KB

Uglified

  1. Without Redis clients: ~220KB
  2. With redis client: ~347KB
  3. With ioredis client: ~544KB

Uglified + zipped

  1. Without Redis clients: ~60KB
  2. With redis client: ~100KB
  3. With ioredis client: ~160KB
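
Roughly the kind of Rollup config used to produce a single flat bundle like the ones measured above (a minimal sketch written against a recent Rollup; the entry point, output path & plugin options are placeholders, not the exact setup):

```js
// rollup.config.js — sketch only; paths and options are assumptions.
import { nodeResolve } from '@rollup/plugin-node-resolve';
import commonjs from '@rollup/plugin-commonjs';
import json from '@rollup/plugin-json';

export default {
  input: 'src/index.js',            // hypothetical backend entry point
  output: {
    file: 'dist/bundle.js',         // single flat file, all dependencies inlined
    format: 'cjs',
  },
  plugins: [
    nodeResolve({ preferBuiltins: true }), // pull deps from node_modules, keep Node core modules external
    commonjs(),                            // convert CommonJS deps (redis, ioredis) to ES modules
    json(),                                // in case any dependency requires a .json file
  ],
};
```

The commonjs & json plugins are the important parts here, since the Redis clients and most of their dependencies ship as CommonJS.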

I can easily live with 160KB, but that's ~2/3 the size of the whole backend code + all other dependencies. This example backend isn't some simple REST proxy, it handles GraphQL queries, JWT tokens, password hashing, business logic & everything else a typical backend does. 2/3 seems a bit harsh, don't you think?

I can totally understand if this package targets "traditional" backends & size is not a concern, just let us know. If so, it's a bit of a shame, because there are only 2 good (popular might be a better word) Node.js Redis clients, & of these 2 only this one supports clusters.
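
(For reference, the cluster support I mean is ioredis's Redis.Cluster API; a minimal sketch, with placeholder node addresses:)

```js
const Redis = require('ioredis');

// Connect to a Redis Cluster by listing a few startup nodes; ioredis
// discovers the rest of the topology itself. Addresses are placeholders.
const cluster = new Redis.Cluster([
  { host: '10.0.0.1', port: 6379 },
  { host: '10.0.0.2', port: 6379 },
]);

cluster
  .set('foo', 'bar')
  .then(() => cluster.get('foo'))
  .then(console.log);
```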

tuananh commented 7 years ago

Why do you worry about the size? You're not going to build and upload it manually, are you?

adjourn commented 7 years ago

@tuananh

Not sure if I understand your question, but here's how it usually works in "serverless" (e.g. AWS Lambda):

Code is run in an isolated container. If the container has been dropped (hasn't been triggered for several minutes, e.g. by an API call from an actual user), the platform has to reinitialize the container, then download, unzip, parse & run the code again. This is called a cold start & can take second(s) if the zip file is too big.

Less code == faster start == better user experience == cheaper because it's usually priced per 100ms.
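
To make it concrete, here's roughly the pattern in a Lambda function (a minimal sketch; the handler shape, env variable & key name are placeholders): everything at module scope, including requiring & parsing ioredis and opening the connection, runs once per cold start, while warm invocations just reuse it.

```js
const Redis = require('ioredis');

// Module scope: executed on cold start only. This is where a larger bundle
// costs time — the whole file has to be downloaded, unzipped & parsed first.
const redis = new Redis(process.env.REDIS_URL);

// Handler: executed on every invocation; warm containers reuse the
// already-parsed code and the open connection above.
exports.handler = async (event) => {
  const hits = await redis.incr('hits');
  return { statusCode: 200, body: JSON.stringify({ hits }) };
};
```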


And yes, I'm currently building & uploading the code manually to the serverless provider, but only because I'm testing & trying things out. I would probably have an automated dev pipeline in production.

tuananh commented 7 years ago

A 300KB difference isn't much compared with your code (500KB) + the container base size (a few MB at least, assuming it's Alpine-based).

> If container has been dropped

In this case, the response is slow anyway. And I don't think Lambda (for example) will download & unzip the code every single time the function executes.

adjourn commented 7 years ago

@tuananh

No, it only downloads & unzips on a cold start, & you don't get too many cold starts if traffic is medium-to-high.

You're probably right that there are no huge time differences, but take https://github.com/luin/ioredis/pull/494 for example: such a small change & it already cuts ioredis's share of the bundle by ~25-50% (~25% smaller if only uglified, ~50% smaller if also zipped).

tuananh commented 7 years ago

Fair enough.

Btw, looks like @luin hears you :) https://github.com/luin/ioredis/pull/494

adjourn commented 7 years ago

Nice! The package codebase itself is relatively small, & the author(s)/maintainer(s) probably know best if & which dependencies are crucial. Plus there's definitely some extra code to support older versions of Node; I'm not going to protest against that.

Anyway, I feel a lot better using this package now that #494 is merged, so I'm going to close this issue. It doesn't look like it's something people are much worried about anyway.