testjavascript / nodejs-integration-tests-best-practices

✅ Beyond the basics of Node.js testing. Including a super-comprehensive best practices list and an example app (March 2024)

Speed database for local env setup #3

Closed mikicho closed 4 years ago

mikicho commented 4 years ago

@jhenaoz @Thormod @goldbergyoni

I want to focus on MySQL and Postgres, which are the most popular database engines these days. We can add more databases later (MongoDB is a good candidate IMO).

Also, because of WSL2, I think we can recommend Linux practices for Windows users too. WDYT?

The challenge:

Optional Solutions

Postgres:

Tasks

jhenaoz commented 4 years ago

I suggest running the app and all the commands inside a Docker image; that way we don't need to worry about any OS-specific config.

mikicho commented 4 years ago

@jhenaoz Thanks, indeed this is preferable. I'm just not sure about the performance drawbacks (if any) on Mac, so I want to do a small benchmark to check it out. My guess is there is a small, negligible performance drawback.

goldbergyoni commented 4 years ago

@mikicho Sounds like a super-strategic topic. It could be interesting to benchmark the differences. I also wonder whether we can map to RAM on Mac/Windows in an automated way, so a developer just pulls, types 'npm t', and the magic happens, or whether a developer would have to manually point their Docker daemon to some specific local folder.

> I suggest running the app and all the commands inside a Docker image; that way we don't need to worry about any OS-specific config.

@jhenaoz Did you mean the infra (e.g. DB, MQ) or also the code under test?

mikicho commented 4 years ago

@jhenaoz Another thing: there are options that are available only on Linux, for example tmpfs.

@jhenaoz @goldbergyoni @Thormod In component testing, do we run schema migrations at the beginning of every run? Just truncate all the tables? Leave the tables intact?

jhenaoz commented 4 years ago

@mikicho, how about running schema migrations in some global setup for the whole test execution, and truncating tables for data clean-up in each test case?
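Juan's split (migrate once globally, truncate per test) could be sketched roughly as below. The Jest wiring in the comment and the `buildTruncateSql` helper are illustrative assumptions, not code from the repo:

```javascript
// jest.config.js (sketch): run migrations once, before the whole suite
//   module.exports = { globalSetup: './test/global-setup.js' };

// Hypothetical helper: builds a single statement that empties all the test
// tables quickly (Postgres syntax; RESTART IDENTITY also resets sequences).
function buildTruncateSql(tables) {
  return `TRUNCATE TABLE ${tables.map((t) => `"${t}"`).join(', ')} RESTART IDENTITY CASCADE;`;
}

module.exports = { buildTruncateSql };
```

A `beforeEach` could then execute `buildTruncateSql(['users', 'articles'])` against the test database, keeping per-test cleanup to one round trip.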

goldbergyoni commented 4 years ago

@jhenaoz @mikicho Interesting stuff, my 2.5 cents:

Schema - Migration on global setup indeed sounds very sensible, like Juan wrote 🚀. Maybe in the dev env there is no need to seed on every run, but rather only when explicitly asked? This will improve performance for local dev, given that in 99% of the executions we don't need to migrate (no schema changes).

Clean-up - If we clean up after every test, then processA.test1.afterEach will delete processB.test12's data during execution, won't it? Maybe clean up only at the end, in globalTeardown, and every test should take care to act on its own records only?

For example:

```javascript
test('When deleting a user, then his articles are deleted', () => {
  // Arrange
  const userToAdd = { name: `Joe ${uuid()}` };
  ...
});
```
mikicho commented 4 years ago

@goldbergyoni @jhenaoz Just saying there is another option, which is to open a transaction and roll it back after each test; it's efficient and clean. Having said that, I agree with you: in dev there's no need to clean after each test (just once in a while to keep the tables small), and I don't think we need to clean up at all in global teardown.
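The rollback-per-test idea could be wrapped like this. `beginTransaction`/`rollback` here are placeholders for whatever the project's DB client actually exposes (e.g. knex transactions), not a concrete API:

```javascript
// Hypothetical pattern: run each test body inside a transaction that is
// always rolled back, so nothing the test writes survives it.
async function withRollback(db, testBody) {
  const trx = await db.beginTransaction(); // placeholder API
  try {
    return await testBody(trx);
  } finally {
    await trx.rollback(); // discard all writes, whether the test passed or failed
  }
}

module.exports = { withRollback };
```

The test body would receive `trx` and do all of its queries through it, which is exactly the coupling to the DAL that the counter-arguments below point at.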

goldbergyoni commented 4 years ago

@mikicho Yes, good point, transactions are also an option (when working with a supportive DB + a single DB + a local process), and it's good that we cover everything before making decisions.

Curious to hear @jhenaoz and @Thormod thoughts on this.

Thormod commented 4 years ago

Several comments about this discussion:

goldbergyoni commented 4 years ago

While I believe using transactions is a valid option, it's not my preferred one because:

  1. Works only with DBs that support transactions. Also, if there is a cache layer or a 2nd DB, there might be partial data, as they don't participate in the transaction
  2. Gets hairy when the code under test already has a transaction; then you get a transaction inside a transaction, OMG
  3. Won't work for remote tests
  4. Performance penalty and locking
  5. White-box approach - instead of focusing on the API, the tester might dig into the DAL
  6. Can't see the data in the DB for troubleshooting purposes

My preferred option is, as always, to do it like production works - generate unique rows for every test and don't delete them. It isn't perfect, but maybe better.

mikicho commented 4 years ago

@goldbergyoni @jhenaoz Shouldn't we close this in favor of #6? (Or maybe the opposite, to preserve the conversation above.)

goldbergyoni commented 4 years ago

@mikicho Yes