ryanmcdermott / clean-code-javascript

:bathtub: Clean Code concepts adapted for JavaScript
MIT License
91.65k stars 12.29k forks

Suggestion: Prefer iterators dependent on the variable's name #49

Open cookiengineer opened 7 years ago

cookiengineer commented 7 years ago

I've found that for most complex code structures where I need nested loops for performance, I can read them more easily once I choose iterator names based on the variable's (object's or array's) name. I typically use the leading character as the iterator (e.g. dataset => d) and a trailing l to imply the length (e.g. dataset => dl).

While I know that functional programming is easier to read (e.g. using Object.values(dataset).forEach((chunk, c) => {})), classic loops are hardly avoidable in timing-relevant methods and complex sorting algorithms that need performant behaviour (e.g. an update or render loop in a game engine).

Anyways, here's an example. Let me know what you think.

BAD example

const dataset = { foo: [1,2,3,4], bar: [1,2,3,4] };

for (let i in dataset) {
    let chunk = dataset[i];
    for (let j = 0, l = chunk.length; j < l; j++) {
        console.log(chunk[j]); // you get the point
    }
}

GOOD example

const dataset = { foo: [1,2,3,4], bar: [1,2,3,4] };

for (let d in dataset) {
    let chunk = dataset[d];
    for (let c = 0, cl = chunk.length; c < cl; c++) {
        console.log(chunk[c]); // you get the point
    }
}
vsemozhetbyt commented 7 years ago

It may not fully match Avoid Mental Mapping (and Don't over-optimize). Maybe something like datasetPart and chunkElement?

And I don't know if this is in the book (I have not read it, unfortunately). Do the repository creators accept tips not directly connected with the book?

cookiengineer commented 7 years ago

It's about the naming of the iterator and length variables, not the chunk or the element itself, so mental mapping does not apply (unless you want to name them dataset_iterator and dataset_length, which would be a bit tedious to type often).

vsemozhetbyt commented 7 years ago

Ah, sorry, I misread the second loop part.

cookiengineer commented 7 years ago

No worries :)

ryanmcdermott commented 7 years ago

Interesting, can you open a PR for that @cookiengineer? Thanks!

cookiengineer commented 7 years ago

Yep, will do one when I have enough time. Probably tomorrow.

cookiengineer commented 7 years ago

@ryanmcdermott Added a pull request :)

ProLoser commented 7 years ago

I think this is relevant to mental mapping. Is there a significant reason not to opt for forEach() to help reduce mental mapping (aside from some performance)? You could use terms like group and property when using for-in to help avoid mental mapping:

const dataset = { foo: [1,2,3,4], bar: [1,2,3,4] };

for (let group in dataset) {
    dataset[group].forEach( item => console.log( item ) );
}

Which can again be cleaned up (if we're flexible) to:

const dataset = { foo: [1,2,3,4], bar: [1,2,3,4] };

for (let group in dataset) {
    // note: forEach also passes index and array to console.log
    dataset[group].forEach( console.log.bind(console) );
}
cookiengineer commented 7 years ago

When using splice() there's no way to get around a classic loop: forEach will continue and skip the entry right after the deleted one. The only workaround would be to set array[i] = null, which causes unboxing, boxing, and an unnecessary GC run that could be avoided.
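A minimal sketch of the skipping behaviour described above (the values array and the visited log are hypothetical names for illustration): splicing inside forEach shifts later elements left, so the element that moves into the removed slot is never visited.

```javascript
const values = [1, 2, 3, 4];
const visited = [];

values.forEach((value, v) => {
    visited.push(value);
    if (value === 2) {
        values.splice(v, 1); // remove the 2; the 3 shifts into index 1
    }
});

console.log(visited); // [1, 2, 4] – the 3 was skipped

// a classic loop can compensate by re-checking the current slot:
const values2 = [1, 2, 3, 4];
for (let v = 0; v < values2.length; v++) {
    if (values2[v] === 2) {
        values2.splice(v, 1);
        v--; // re-check the element that shifted into this slot
    }
}
```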

While I agree and also prefer forEach for the sake of readability, I have to say that forEach should not be used in a game engine's update loop running at 60+ FPS.

In my pull request ( #127 ) I kind of infringed on the avoid over-optimization part... so I guess the forEach solution should be the one in the README, though I still won't agree from the performance side of things.

Newcomers will run into big trouble when using functional idioms (filter/forEach/map/etc.) in the update loops of their programs once those loops run on an interval. And I'm not talking about millisecond optimizations here; I'm talking about garbage beyond 10 MB/min, which is easily reached and will lead to program (or server) crashes. The "thinking in garbage" idea isn't covered in this guide, and I think it's still important for writing good ES code.
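To illustrate the garbage concern above, here is a sketch (with hypothetical entities that each expose an enabled flag and an update() method): the functional version allocates a throwaway array from filter() on every call, which at 60 calls per second is 60 dead arrays per second for the GC to reclaim, while the classic loop allocates nothing per frame.

```javascript
// functional style: filter() creates an intermediate array each frame
function updateFunctional(entities) {
    entities
        .filter(entity => entity.enabled)
        .forEach(entity => entity.update());
}

// classic loop: same behaviour, zero per-frame allocations
function updateClassic(entities) {
    for (let e = 0, el = entities.length; e < el; e++) {
        if (entities[e].enabled) {
            entities[e].update();
        }
    }
}
```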

However, before I fix my pull request I'd like a vote on how people see this topic and what they prefer or would suggest. As stated, I always use lazy caching for projecting data structures and performant update loops, and I think that should be embraced.

ProLoser commented 7 years ago

What's "lazy caching"?

Hmm... I like don't over-optimize, but yes, it's understandable that you need optimization in game engines. Perhaps that could be a note or addendum? Although from my perspective, for the sake of speed, don't over-optimize on your first pass of development (even in a game) and then come back and make another optimization pass over your code. Not that I've ever developed a game before...

I feel like code is always going to need optimization, and you just don't want to get bogged down in it early on (especially at the detriment of your code's readability or refactorability, which are crucial to the long-term lifespan of code).

Overall, I see no problems with your example of using relevant letters, but I don't really like the use of 1-letter variables at all. And I try to avoid nesting loops deeply enough to need more than i, j, and k.

Personally, I think it's not important to make sure your 1-letter variable is a relevant letter (since you're already using a much-harder-to-grok/scan variable name); instead, if you do use a 1-letter variable, try to immediately swap over to a full-identifier object/variable so that its use is extremely short. If you DO need the index throughout, then just call it index or chunkIndex? I use index all the time as an argument for forEach() or related functions.
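A sketch of the pattern described above (the chunks array and the seen log are hypothetical names): the 1-letter index is used for exactly one line to grab a descriptively named reference, and inside forEach the index parameter is simply called index.

```javascript
const chunks = [[1, 2], [3, 4]];
const seen = [];

for (let c = 0; c < chunks.length; c++) {
    const chunk = chunks[c]; // swap to a full identifier immediately
    chunk.forEach((value, index) => {
        seen.push(`${index}:${value}`);
    });
}

console.log(seen); // ["0:1", "1:2", "0:3", "1:4"]
```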

cookiengineer commented 7 years ago

What's "lazy caching"?

Lazy caching is the idea of offloading hard computation (map/reduce) out of the hot computation loop and into the parts of the code where the data actually changes.

let main = {};

main.update = function () {
    // hot path: plain loop, no per-frame allocations
    for (let d = 0, dl = this.dataset.length; d < dl; d++) {
        this.dataset[d].update();
    }
};

main.setDataset = function (dataset) {
    // the expensive projection happens once, when the data changes
    this.dataset = dataset.filter(val => val.enabled === true);
};

If you DO need to use the index throughout, then just call it index or chunkIndex

My suggestion was primarily not about naming it index; since index1, index2, index3, chunkIndex, or dataSetWithMuchCamelIndex are awful to read, I wanted to suggest an alternative using the leading letters of the array it relates to, in order to reduce mental-mapping overhead.

ProLoser commented 7 years ago

What about dataItem instead of d? I'm still not sure which part of your example is the lazy caching.

I do agree about too many variables with bad names, but I feel like that's a symptom of a bigger overall problem with your code organization, and changing i to d isn't really where the real impact could be had.