rcdmk / aspJSON

A fast classic ASP JSON parser and encoder for easy JSON manipulation to work with the new JavaScript MV* libraries and frameworks.
MIT License

LoadRecordSet timing out within ArrayPush > Ubound Array #74

Open RaviRamDhali opened 6 years ago

RaviRamDhali commented 6 years ago

I have been running into some very slow results, and even timeouts, when calling LoadRecordSet.

Here are some details:

Total number of records: 100 rows
Total number of columns per row: 60

Some testing shows:

| Records | Columns per row | Time |
| --- | --- | --- |
| 100 | 10 | 1 sec |
| 100 | 30 | 3 sec |
| 100 | 50 | 8 - 12 sec **** |
| 100 | 60 | 20 sec OR timeout **** |

I was able to narrow the bottleneck down to the public function ArrayPush(), which rebuilds the array on every call.

Any thoughts on getting around this bottleneck?
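For context, here is a minimal sketch of the push-per-element pattern that causes this (illustrative only, not the library's actual code). Because `ReDim Preserve` copies the whole array each time, pushing n items costs O(n²) copies overall, which matches the super-linear slowdown in the timings above:

```vbs
' Illustrative sketch: growing an array one element at a time.
' ReDim Preserve copies every existing element on each call, so
' n pushes cost O(n^2) copies in total.
Function ArrayPushNaive(arr, value)
    Dim newSize
    If IsArray(arr) Then
        newSize = UBound(arr) + 1
    Else
        newSize = 0
    End If
    ReDim Preserve arr(newSize) ' full copy of the existing elements
    arr(newSize) = value
    ArrayPushNaive = arr
End Function
```

With 100 rows of 60 columns, that is 6,000 pushes, each one copying everything pushed so far.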


rcdmk commented 6 years ago

Hi. Good catch. I can definitely improve that; I'll just have to use a different approach for this case.

RaviRamDhali commented 6 years ago

@rcdmk I am also trying to help find a solution. There is some good testing at https://stackoverflow.com/questions/4605270/add-item-to-array-in-vbscript

rcdmk commented 6 years ago

Due to the nature of arrays in VBScript, I may have to use another data structure for this. I'm not really happy with the overhead of the current method, but it's the best way I've found to do it natively. I'm still researching solutions. The link you provided gave me some ideas, but I have no real solution yet.

SeanLMcCullough commented 6 years ago

You could consider using the GetRows method; it will give you an array with very high performance. That way you could do some fairly simple native array manipulation to load it in.

I've found that the communication overhead of ADODB when iterating over large recordsets (especially ones with many columns) imposes a significant performance penalty when using MoveNext. I believe ADODB requests each record from the database one at a time rather than fetching the whole set up front. My rule of thumb: if you've written your query properly and you're not going to exit early, there is no need for ADODB's row-by-row overhead.
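A minimal sketch of the GetRows approach (the connection string and query are placeholders):

```vbs
' Sketch: fetch the whole recordset in one call instead of a MoveNext loop.
' Connection string and query are placeholders.
Dim conn, rs, data, row, col
Set conn = Server.CreateObject("ADODB.Connection")
conn.Open "your-connection-string"
Set rs = conn.Execute("SELECT * FROM your_table")

If Not rs.EOF Then
    data = rs.GetRows() ' 2-D array indexed as data(column, row)
End If
rs.Close

For row = 0 To UBound(data, 2)
    For col = 0 To UBound(data, 1)
        ' data(col, row) holds the value for this column and row
    Next
Next
```

Note that GetRows returns a column-major array (the first subscript is the field, the second is the record), and calling it on an empty recordset raises an error, hence the EOF check.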

rcdmk commented 6 years ago

That is a good point, and indeed I always use GetRows in my own apps. In this particular case, I had to get the column names for the properties, and those are not available in the GetRows result array.

Thinking about it a bit more, I could get the field names from the recordset first and then use GetRows to fetch the results and iterate them by index.

I'll try that in the next release.
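A sketch of that idea, assuming `rs` is an already-open ADODB recordset (variable names are illustrative):

```vbs
' Sketch: read the column names from the recordset first, then fetch
' the data with GetRows and pair names with values by index.
Dim fieldNames(), i, data, row
ReDim fieldNames(rs.Fields.Count - 1)
For i = 0 To rs.Fields.Count - 1
    fieldNames(i) = rs.Fields(i).Name
Next

If Not rs.EOF Then data = rs.GetRows()

For row = 0 To UBound(data, 2)
    For i = 0 To UBound(fieldNames)
        ' pair fieldNames(i) with data(i, row),
        ' e.g. add the property fieldNames(i) with value data(i, row)
    Next
Next
```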

Thanks.

JohnSTRO-pi commented 4 years ago

@rcdmk Have you had any success with the GetRows() implementation? I'm trying to read a recordset from the results of a DB query and convert it into JSON. It has approximately 4,000 rows, so GetRows() would be the fastest way to do that, I think. I'm trying to do it in JavaScript. Please let me know if you have a working solution.

Thank you | john

RonaldZielaznicki commented 3 years ago

Hey folks. I've managed to make the loading less painful. With about 100 rows and 4 columns, the load time drops to under a second; I'm looking at half a second to load the entire recordset into the JSON object.

I did this by applying many of the suggestions above to the code. I've also added two new functions, ArrayConcat and addMultiple, which let us send in whole arrays and do one ReDim rather than several.

However, the problem I'm running up against is serializing the JSON. I did some research, and it turns out VBScript string concatenation is really slow. I saw two methods for dealing with this: one is to build an array and Join it, and the other is to use an ADODB.Stream. I opted for the second, but serialization is still taking longer than I'd like.

I've gotten it down to a little under two seconds to serialize the recordset using the ADODB.Stream. Does anyone have suggestions on how to make it faster?
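For comparison, here is a minimal sketch of the array-and-Join alternative mentioned above: collect fragments in a growing buffer and concatenate once at the end (the class name and sizes are illustrative, not from the repo):

```vbs
' Sketch of an array-and-Join string builder: O(n) amortized appends,
' one final concatenation via Join.
Class StringBuilder
    Private buffer()
    Private count

    Private Sub Class_Initialize()
        ReDim buffer(15) ' initial capacity, grown geometrically
        count = 0
    End Sub

    Public Sub Append(s)
        If count > UBound(buffer) Then
            ReDim Preserve buffer(UBound(buffer) * 2 + 1) ' double capacity
        End If
        buffer(count) = s
        count = count + 1
    End Sub

    Public Function ToString()
        If count = 0 Then
            ToString = ""
        Else
            ReDim Preserve buffer(count - 1) ' trim unused slots before Join
            ToString = Join(buffer, "")
        End If
    End Function
End Class
```

Whether this beats ADODB.Stream likely depends on fragment sizes and counts, so it would be worth benchmarking both against the same recordset.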

Obligatory code to review: https://github.com/RonaldZielaznicki/aspJSON/blob/Showcase/jsonObject.class.asp

rcdmk commented 1 year ago

Hello all,

I didn't have a lot of time to work on this, but I'm back and trying a few things.

I've made a few optimizations on the current development branch (4999ef7), if you would like to test them out. ArrayPush is no longer used; instead, I've implemented buffering logic to avoid resizing the array on every push.
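The buffering idea can be sketched like this (illustrative only, not the branch's actual code): keep a logical length separate from the array's capacity and only resize when the capacity is exhausted, doubling it each time, so n pushes cost O(n) amortized copies instead of O(n²):

```vbs
' Sketch of buffered pushes: capacity grows geometrically, so most
' pushes do no copying at all.
Dim items(), itemCount
ReDim items(15) ' capacity, not logical length
itemCount = 0

Sub BufferedPush(value)
    If itemCount > UBound(items) Then
        ReDim Preserve items(UBound(items) * 2 + 1) ' occasional resize
    End If
    items(itemCount) = value
    itemCount = itemCount + 1
End Sub

Sub TrimToSize()
    ' Shrink to the logical length once pushing is finished.
    If itemCount > 0 Then ReDim Preserve items(itemCount - 1)
End Sub
```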

As for concatenation, VBScript is very limited there, but I'll give it another try with a custom string builder. In the past, I tried building an array and joining the result and got no performance improvement (it was worse in some cases), but I'll see if the new optimizations help with that too.

Best, Ricardo