d3x0r / JSON6

JSON for Humans (ES6)

Complete coverage #34

Closed brettz9 closed 4 years ago

brettz9 commented 4 years ago

~With the exception of this single uncovered else (which I think can be removed for the reason given there),~ the remaining uncovered items seem less about JSON6-specific feature handling and more about the project's general-purpose private utilities (status, stack, and queue).

However, I think it would be really reassuring as to data integrity if such an appealing project could reach 100% coverage, ensuring things work as expected (at least as currently specified) and providing a form of documentation through the tests--and then set the thresholds to 100% as well.

~Looks like there are just 9 uncovered lines left now (and 8 uncovered else paths).~

Update: After recent commits and #36, the only uncovered items now are these else paths: 186, 193-201 (193, 199, 201)

d3x0r commented 4 years ago

These lines mean either a) the linked list thing is entirely broken... or b) normal usage just doesn't encounter these conditions. That doesn't mean the conditions are unreachable, but reaching them would probably require modifying the code; they're the sort of error a programmer making a change might run into. Normally the stack is balanced, with the same number of pushes as pops, but the list supports getting an extra pop/unshift.
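A hypothetical illustration of the kind of guard being described (this is a toy, not JSON6's linked list): a tiny stack whose pop() tolerates one pop too many. With balanced push/pop the empty-check branch is never taken, so coverage flags it.

```javascript
// Toy stack: the empty-check in pop() is a guard against programmer error
// and is only reachable via an unbalanced extra pop.
class TinyStack {
  constructor() { this.items = []; }
  push(v) { this.items.push(v); }
  pop() {
    if (this.items.length === 0) return undefined; // otherwise-uncovered branch
    return this.items.pop();
  }
}

const s = new TinyStack();
s.push(1); s.push(2);
console.log(s.pop()); // → 2
console.log(s.pop()); // → 1
console.log(s.pop()); // → undefined (the branch balanced usage never hits)
```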

The buffer tracking doesn't actually stack like the original version; it used to collect the text segments and then handle several buffers at once (or parts of several), but with the current version, each buffer added either results in 0 values... hmm.

        parser.write( '1' );
        parser.write( '2' );
        parser.write( '3' );
        parser.write( '4' );
        parser.write( '5' );
        parser.write( ' ' );

... but I suppose all of those got collected into the internal val buffer...
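That collection into a single internal value buffer can be sketched with a toy accumulator (an illustration only, not JSON6's actual implementation): digit chunks are appended until a whitespace delimiter completes the value, so the six writes above yield a single value.

```javascript
// Toy sketch: digits arrive one write() at a time; only a delimiter
// (space) completes the pending value, mirroring the sequence above.
function makeNumberStream(onValue) {
  let buf = '';
  return {
    write(chunk) {
      for (const ch of chunk) {
        if (ch === ' ') {
          if (buf !== '') { onValue(Number(buf)); buf = ''; }
        } else {
          buf += ch;
        }
      }
    }
  };
}

const seen = [];
const parser = makeNumberStream(v => seen.push(v));
for (const c of ['1', '2', '3', '4', '5', ' ']) parser.write(c);
console.log(seen); // → [ 12345 ]
```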

http://github.com/d3x0r/JSON6/blob/master/lib/json6.js#L193-L201 https://github.com/d3x0r/JSON6/blob/master/lib/json6.js#L186

... I don't know, yet, how I want to solve the above...

btw - I often run npm pack before doing npm publish; can you apply checks to pack as well?

d3x0r commented 4 years ago

The other thing I wanted to mention: while initially tying up loose ends and benchmarking to make sure my changes weren't regressions in that respect, there were a couple of places where changing 'let' to 'var' increased speed significantly. And while, yes, that sort of behavior should also be reported to the JS engines, there can also be something about such a variable that makes it improbable for the engine to deal with. shrug

The point really is: there's no database/timeline of the benchmark results to know what the current state is (although of course one can check out things in the past); but that also means the built code might be slower than the original library too...

Mostly just FYI / me expressing my observations/concerns.

brettz9 commented 4 years ago

Re: the timeline of benchmark results, the global Mocha determination of what "slow" is could be changed, but that is probably too broad. You could instead peg the per-test thresholds to results you found satisfactory (with the this.slow() feature, or, if you wanted to be stricter, this.timeout()), with the inconvenience that if the addition of new features necessitated slower performance, the test thresholds would need to be updated.
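The per-test pegging idea can be sketched without Mocha at all; in Mocha the same thing is expressed declaratively with this.slow(ms) (warn) or this.timeout(ms) (fail). The budget numbers here are arbitrary assumptions, and JSON.parse stands in for the library's parser.

```javascript
// Standalone sketch: fail if a benchmarked operation exceeds a fixed budget.
function assertFasterThan(budgetMs, fn) {
  const start = process.hrtime.bigint();
  fn();
  const elapsedMs = Number(process.hrtime.bigint() - start) / 1e6;
  if (elapsedMs > budgetMs) {
    throw new Error(`took ${elapsedMs.toFixed(2)}ms, budget ${budgetMs}ms`);
  }
  return elapsedMs;
}

// Usage: the suite fails if parsing regresses past the chosen budget.
const elapsed = assertFasterThan(500, () => JSON.parse('[1, 2, 3]'));
console.log(elapsed < 500); // → true
```

The caveat from the thread applies: a fixed millisecond budget is only meaningful when runs happen on the same hardware.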

Re: let to var, I think that is due to Buble compiling let into a pre-ES2015 form. I think that problem would go away if we switched to Babel and set the Node targets to a higher minimum, so that the build would only polyfill features needed by the targeted Node version rather than polyfilling full support for features like let, which has been around for quite a while. If you wanted to drop pre-Node-10 support, for example (or even pre-Node-8 or pre-Node-6)--as many projects are doing now that pre-Node-10 versions have reached End-of-Life status--then from what I have seen, Babel won't need to transpile let into anything different, so the performance should not change.
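A minimal Babel config expressing that idea might look like this (the exact minimum Node version is a project decision; "10" here is an assumption):

```json
{
  "presets": [
    ["@babel/preset-env", {
      "targets": { "node": "10" }
    }]
  ]
}
```

With a Node 10+ target, @babel/preset-env leaves let/const untouched, so the built output keeps the engine's native block scoping.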

d3x0r commented 4 years ago

Re: the let slowdown... that was just running the raw .js, non-built/transpiled version; I'm just saying a transpiler can tend to broad-sweep replace 'var' with 'let', which might be detrimental. If anything, it was something V8 itself was doing; and, unfortunately, there's no Node-on-Firefox to use SpiderMonkey to see if it's a common behavior. Then again, maybe it's more of a logic issue, though I can't imagine how.

This is also partially why I've been hesitant to take on such broad-sweeping linting; though I do like the tests--thank you very much for formalizing them; it was worth the pain just to be able to revisit and prove my changes didn't break anything.

Edit: and it was a common slowdown on Chrome and Node.

Edit2: There's no reasonable value where I can say 'more than N is bad'. I have a very fast processor; Travis has a very shared virtual CPU :) It really needs to always be run on the same box to get apples-to-apples comparisons; any slight difference and you're comparing apples to potatoes, or the other way around to oranges ;) I'm not set up to guarantee that.

d3x0r commented 4 years ago

so. 100 100 100 100 coverage :)

brettz9 commented 4 years ago

That'd be nice!

If the statements are known to be unreachable but are there as guards for future changes, one can add // istanbul ignore next (or, if placed above an if, one can be more precise and use // istanbul ignore if or // istanbul ignore else).

However, I think these statements should really only be added as a last resort. (I did add one such statement when checking for the existence of exports, but that is of course because the line is for browser detection, and Node will automatically have such a variable defined.)
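That exports check can be sketched roughly as follows; the module shape and the global name are hypothetical, and only the placement of the ignore hint is the point.

```javascript
// Hypothetical UMD-ish tail of a module: under Node's test runner,
// `exports` is always defined, so the else branch would never be covered.
const api = { parse: s => s.trim() };

/* istanbul ignore else -- browser-only branch, unreachable under Node */
if (typeof exports !== 'undefined') {
  Object.assign(exports, api);
} else {
  // browser: attach to a global (name is an assumption for illustration)
  globalThis.JSON6ish = api;
}
```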

d3x0r commented 4 years ago

I didn't really need the ignores. ("Would be nice"? Did I miss something?)

There is a condition in a streaming process where more data could be added to the stream while processing the callback for the data already found in the stream; that is the more real-world example, but I did add an internal parser test using write_(), which queues multiple packets, and got the testing of the buffer holders working. (It also had a bug that was causing unshift() to always have to allocate a node, making the whole thing, in effect, guaranteed to execute.) But I'm fine leaving it without checking the result; really, there will always be a shift() before an unshift() (or a throw, which invalidates the whole thing and loses partial data buffers).

brettz9 commented 4 years ago

Oh, sorry, I am the one who missed it. Awesome, thank you! I plan to submit a PR now to get the final threshold (branches) enforcing 100% as well.

brettz9 commented 4 years ago

Just wanted to thank you again for all your diligence in reviewing and getting the project all up to 100% coverage. Very reassuring, and I hope it further encourages adoption!

d3x0r commented 4 years ago

Thanks; I apologize for not being able to dedicate the time earlier :) I hope so also.

d3x0r commented 4 years ago

Edit: I grabbed all these automated tests and threw them against JSOX; all was well there, except a lot of bad tests had to be deleted, because unquoted partial keywords would just be their string... (The npm package page for json-6 shows me a failing status badge from Travis; it would have gotten that on the latest publish.)