grzegorzgolec opened this issue 10 years ago
No, currently this is not supported; see also #658. I don't think you'll get a noticeable performance increase just by switching to arrays (it would be interesting to measure real-world numbers, though). It's relatively easy to implement; feel free to submit a PR.
This is also something we'd be interested in. When changing some other parts of our pipeline from objects to arrays, we got about a 65% reduction in time spent loading / processing / storing data. The primary reasons are that objects generate a lot more garbage, are much larger when converted to JSON, and are much less efficient to iterate over than arrays. Not all of these apply directly to this module, but I would expect a pretty measurable performance boost, certainly in our app-side code handling the results, and hopefully also in the node-mysql-side code if it has to do a lot less work marshaling data around.
It would indeed improve performance here, especially because it would remove the need for the module to look up the field name for each column in each row and to add a new property to the object for each column. In fact, since we know how many fields come back, we could pre-allocate each row's array with the necessary number of elements, removing any potential array re-allocation.
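The pre-allocation idea above could be sketched roughly like this (illustrative only, not the module's actual parser code; `makeRowParser` is a made-up name):

```javascript
// Sketch of the pre-allocation idea: the field count is known from the
// result-set header before any rows arrive, so each row array can be
// created at its final length and filled in place.
function makeRowParser(fieldCount) {
  return function parseRow(values) {
    // values: the decoded column values for one row
    const row = new Array(fieldCount); // pre-allocated, never re-grown
    for (let i = 0; i < fieldCount; i++) {
      row[i] = values[i];
    }
    return row;
  };
}
```

Compared with building an object per row, this skips both the per-column property name lookup and any hidden-class churn.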
This would be a great feature. node-pg has this: a config option called rowMode: 'array'.
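For comparison, node-pg's version of this looks roughly as follows (rowMode is the documented pg query-config option; the wrapper function name is my own and the fake client in any test is a stand-in for a real pg Client):

```javascript
// Hedged sketch: with node-pg, passing rowMode: 'array' in the query
// config makes result.rows contain plain arrays instead of objects
// keyed by column name.
function queryAsArrays(client, text, callback) {
  client.query({ text, rowMode: 'array' }, callback);
}
```

With a real pg Client, `queryAsArrays(client, 'SELECT id, name FROM users', cb)` would yield rows like `[1, 'Andrew']` instead of `{ id: 1, name: 'Andrew' }`.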
PRs to implement are welcome and will speed up the time-to-release significantly :)!
Kinda like fetch_row vs fetch_assoc. The array form is beneficial in some cases because it can be passed directly to the grid views of certain frameworks; right now the result set needs post-processing for that.
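Until the module supports this natively, the post-processing mentioned above can be done app-side with the fields metadata that node-mysql already passes to the query callback (the helper name `toArrayRows` is illustrative):

```javascript
// Hedged sketch: convert "fetch_assoc"-style object rows into
// "fetch_row"-style arrays, using the fields metadata from the
// query callback to fix the column order.
function toArrayRows(rows, fields) {
  const names = fields.map(field => field.name);
  return rows.map(row => names.map(name => row[name]));
}
```

This still pays the cost of building the objects first, so it only helps the consuming code (e.g. a grid view), not the driver itself.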
Subscribing because interested.
Agree that this is highly desirable, and it has to be more efficient the more data there is. By calling connection.query(sql, function (err, rows, fields) { ... }) we already have the fields metadata, so outputting the field name for every field of every record must be slower than just sending a simple JS array. pg and Firebird (and maybe others) have this already.
Hi @tomcon, you're welcome to work on a pull request to implement this feature! We would love it!
+1 Would be great to have :-) The JSON file would be a lot smaller (20-30 percent smaller in my app). That's a lot when you are on a bad mobile connection most of the time.
EDIT: Have now converted some result sets from Object to Array, and one result set actually shrank from 7151 bytes to 3459 bytes. That's more than a 50% reduction in file size.
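The size difference is easy to reproduce on any sample data: serializing rows as arrays drops the repeated property names from every record. A rough sketch (the sample data and helper name here are made up; the 7151/3459-byte figures above come from the commenter's own result set, not this snippet):

```javascript
// Hedged sketch: compare the JSON payload size of object rows vs
// the same rows flattened to arrays in a fixed column order.
function jsonSizes(objectRows, fieldNames) {
  const arrayRows = objectRows.map(row => fieldNames.map(name => row[name]));
  return {
    asObjects: JSON.stringify(objectRows).length,
    asArrays: JSON.stringify(arrayRows).length,
  };
}
```

The saving grows with the number of rows, since the column names are repeated once per record in the object form but serialized zero times in the array form (they would travel separately, via the fields metadata).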
Hi @Stavanger75, you're welcome to work on a pull request to implement this feature! We would love it!
Hi @dougwilson, thanks for the invite. It looks like @ifsnow has already completed the task. That's great :-)
Indeed, @ifsnow submitted it a few days after my invite :)
Is there any update on this option?
For anyone reading this and waiting, you can use the mostly compatible mysql2 library in the meantime: https://github.com/sidorares/node-mysql2/blob/master/documentation/Extras.md#receiving-rows-as-array-of-columns-instead-of-hash-with-column-name-as-key
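Per the mysql2 docs linked above, the switch is the rowsAsArray query option (documented; the wrapper function name here is illustrative, and the fake connection in any test stands in for a real one):

```javascript
// Hedged sketch: with mysql2, passing rowsAsArray: true in the query
// options makes each row come back as a plain array instead of an
// object keyed by column name.
function queryRowsAsArrays(connection, sql, callback) {
  connection.query({ sql, rowsAsArray: true }, callback);
}
```

With a real connection (e.g. `const conn = require('mysql2').createConnection({ /* host, user, database */ })`), `queryRowsAsArrays(conn, 'SELECT id, name FROM users', cb)` would yield rows like `[199, 'Andrew']`.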
Is there an option for getting records from MySQL as a flat array rather than an object? Something like this:

{ contractor_id: 199, contractor_name: ... }

represented as [199, "Andrew", "Banks", 88.92, 55, 15].

I want to speed up reading 75k records; right now it takes 35 seconds. I am wondering whether an array representation would be transferred to Node faster than that.