bestiejs / json3

A JSON polyfill. No longer maintained.
https://bestiejs.github.io/json3

Optimize lex() #27

Closed oxyc closed 11 years ago

oxyc commented 11 years ago

In browsers, string operations are quite expensive, so for hot code it's a good idea to work with character codes instead.

Since character codes make the code a lot harder to read, and json3 is pure awesome as it is, I only changed the most costly operations and kept the rest as before.
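The idea can be sketched roughly like this (illustrative only; the `classify*` names and the exact set of punctuator codes are mine, not JSON 3's actual internals):

```javascript
// String-based branching: every comparison works on single-character
// strings, which older engines handle relatively slowly in hot loops.
function classifySlow(source, index) {
  var ch = source.charAt(index);
  if (ch == "{" || ch == "}" || ch == "[" || ch == "]" ||
      ch == ":" || ch == ",") {
    return "punctuator";
  }
  return "other";
}

// Code-based branching: `charCodeAt` returns an integer, so the
// branches compare plain numbers and no string values are involved.
function classifyFast(source, index) {
  var code = source.charCodeAt(index);
  switch (code) {
    case 123: // {
    case 125: // }
    case 91:  // [
    case 93:  // ]
    case 58:  // :
    case 44:  // ,
      return "punctuator";
    default:
      return "other";
  }
}
```

Both functions classify the same characters the same way; only the comparison mechanism differs, which is why the change is invisible in behavior but visible in the benchmarks below.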

commit 65d583c7272967159cabc09ecec421eb781cdfdf
Author: Oskar Schöldström <public@oxy.fi>
Date:   Sat Mar 23 13:45:24 2013 +0200

    Optimize for the common case where a string is valid

    before x 108,854 ops/sec ±0.50% (93 runs sampled)
    after x 129,460 ops/sec ±0.42% (101 runs sampled)

commit dabbd930c819df00605d351ac89aebb7b2fb9892
Author: Oskar Schöldström <public@oxy.fi>
Date:   Sat Mar 23 13:25:03 2013 +0200

    Optimize token type branching in lex()

    before x 70,844 ops/sec ±0.68% (99 runs sampled)
    after x 108,854 ops/sec ±0.50% (93 runs sampled)

The performance is measured with node; I actually have no idea what the gain is in browsers that will use these functions. If anyone has access to proper VMs, the changes can be benchmarked on http://jsperf.com/json3/20 (these are slightly more optimized, as lex() only uses char codes).
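For anyone wanting to reproduce numbers like these locally: the `x N ops/sec ±M% (K runs sampled)` lines above are Benchmark.js output. A much cruder stand-in using plain Node timers (not the harness actually used here) might look like:

```javascript
// Crude ops/sec estimate: run fn in a tight loop for durationMs and
// count iterations. Benchmark.js does this far more rigorously, with
// warm-up, multiple samples, and a statistical margin of error.
function opsPerSec(fn, durationMs) {
  var ops = 0;
  var start = Date.now();
  while (Date.now() - start < durationMs) {
    fn();
    ops++;
  }
  return Math.round(ops / ((Date.now() - start) / 1000));
}

var sample = '{"a": [1, 2, 3], "b": "text"}';
console.log(opsPerSec(function () { JSON.parse(sample); }, 500) + " ops/sec");
```

Numbers from a loop like this are only good for rough before/after comparisons on the same machine; for publishable figures, use Benchmark.js or jsPerf as above.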

Edit: Err, okay, I'm new to pull requests; this did not go how it was supposed to go :D But I'm guessing you can figure it out @kitcambridge

oxyc commented 11 years ago

I updated the jsperf tests with these two commits. http://jsperf.com/json3/21

There's still some performance to be gained by converting everything in lex() to character codes. If you decide you want this, I can convert the rest as well.

ghost commented 11 years ago

This is amazing—thank you so much, @oxyc! Merged into the v3.2.5 branch. I'll have access to my test VMs on Monday, and will run the jsPerf tests then.

ghost commented 11 years ago

I went ahead and converted the remaining strings to character codes, as you suggested. Thanks again!