Hi @Romaboy
I'm not quite sure what you want to achieve with Postgres's row_to_json() or with your jsonArrayString method.
If you have an API endpoint and want to return a json string, you could do the following:
const users = await User.where(conditions) // converts every row to a `User` record
return JSON.stringify(users) // returns the json string
This has the additional benefit that you could easily modify every attribute's value via convertOutput.
If you just want the raw objects without all the record overhead, do the following:
const users = await User.where(conditions).asRaw() // does NOT create `User` records
return JSON.stringify(users) // returns the json string
The above method is definitely faster, because it uses less memory, but with asRaw() no attribute conversion will be made either. You'll get the raw pg/knex output. asRaw() will be called automatically if you do a .select('some_function()') or use aggregate functions.
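For example, something like this (a rough sketch of that behaviour; count(*) is just an illustration):

// selecting an SQL function: openrecord applies asRaw() automatically,
// so you get the plain pg/knex rows instead of `User` records
const counts = await User.select('count(*)')
console.log(counts) // e.g. [{ count: '42' }]

// a normal query still returns converted `User` records
const users = await User.where(conditions)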
Thanks, glad you like it!
Thanks for the fast reply. What I want to achieve:
SELECT row_to_json(users.*) FROM users
This will return a JSON string like '[{name: "username"}]', which can be passed straight to the client. One action, just a query.
const users = await User.where(conditions).asRaw() // does NOT create `User` records
return JSON.stringify(users) // returns the json string
Here several steps happen. In a real API usually no one cares whether it responds in 50ms or 200ms, but I'm building a Node.js app just for fun and want it to avoid the redundant operations of converting the DB response to JS objects and then from objects back to a JSON string.
let result = await store.models.user.select('row_to_json(users.*)').asRaw()
console.log(result)
result = await store.models.user.select('row_to_json(users.*)')
console.log(result)
asRaw seems not to work here, as both console.log calls give the same parsed JavaScript object.
Moved forward: now I can define a static method which returns the JSON, but I can't define a scope which returns a plain value. In ActiveRecord, static methods can be used as scopes and return a simple value.
Hmm, I got your point.
The pg client does the basic result serialization and will automatically return a JS object. See the following example:
const { Client } = require('pg')
const client = new Client()
await client.connect()
const result = await client.query('SELECT row_to_json(users.*) FROM users')
console.log(result.rows) // [{row_to_json: {id: 1, login: '...', ...}}]
That's the same with openrecord:
const result = await User.select('row_to_json(users.*)')
If you need the JSON as a string, simply add an explicit type cast to the query:
const result = await User.select('row_to_json(users.*)::text')
console.log(result) // [{row_to_json: '{"id":1,"login":"..."}'}]
If you take a look at the Postgres docs, you will see that row_to_json returns json. That's the reason you need to typecast it to text.
openrecord automatically calls asRaw because of the SQL function call. So your last example queries are 100% identical.
To simplify the select('row_to_json(users.*)::text') call, you can do the following:
class BaseModel extends require('openrecord/store/postgres').BaseModel {
  static definition() {
    this.scope('jsonArrayString')
  }

  static jsonArrayString() {
    this.select(`row_to_json(${this.definition.tableName}.*)::text`)
  }
}
The scope function modifies the jsonArrayString method to be chainable.
And use it like this:
const jsonString = await User.where(conditions).jsonArrayString()
If you need to select only specific columns, you'll need to implement that inside jsonArrayString.
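For example, a variant that accepts column names could look roughly like this (just a sketch, untested; json_build_object and passing arguments to the scope are assumptions):

static jsonArrayString(...columns) {
  const table = this.definition.tableName
  if (columns.length === 0) {
    // all columns, as above
    return this.select(`row_to_json(${table}.*)::text`)
  }
  // build a JSON object from the given columns only
  // (column names are interpolated directly, so they must be trusted identifiers)
  const pairs = columns.map(col => `'${col}', ${table}.${col}`).join(', ')
  return this.select(`json_build_object(${pairs})::text`)
}

Usage would then be something like await User.where(conditions).jsonArrayString('id', 'login'), but again, this is untested.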
Yes, thanks. More precisely, I ended up with:
COALESCE(json_agg(row_to_json(${this.definition.tableName}.*)), '[]')::text
-- COALESCE returns an empty array if there are no matching records
-- json_agg aggregates all rows into a JSON array
-- row_to_json converts each row
Now I'm trying to make this method return the string value itself instead of { coalesce: '[json array]' }.
Okay, I got a simple all-columns jsonAgg helper, but I can't do the more important thing with it: a subquery. Here is why I tried to implement this strange thing:
let start = Date.now()
for (let i = 0; i < 100; i++)
  await User.jsonAgg()
console.log(Date.now() - start)

start = Date.now()
for (let i = 0; i < 100; i++)
  JSON.stringify(await User)
console.log(Date.now() - start)
100 users in the table, ten string columns, 100 iterations: jsonAgg finishes here in 0.2 seconds, JSON.stringify in 1 second.
Here is the code of my BaseModel for the other models:
const {BaseModel} = require('openrecord/store/postgres')

module.exports = class Base extends BaseModel {
  static definition() {
    // return only the first column of the first row instead of records
    this.scope('singleValue', function() {
      this.singleResult().setInternal('select_value', true)
    })

    // aggregate all rows into a single JSON array string ('[]' if empty)
    this.scope('jsonAgg', function() {
      const selection = `coalesce(json_agg(row_to_json(${this.definition.tableName}.*)), '[]')::text`
      return this.select(selection).singleValue()
    })

    // unwrap { someColumn: value } to just the value when singleValue() was used
    this.afterFind(function(data) {
      const selectValue = this.getInternal('select_value')
      if (selectValue) {
        for (const key in data.result) {
          data.result = data.result[key]
          break
        }
      }
    })
  }
}
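With this base model the scopes can be used roughly like this (a sketch; the exact return shape depends on the afterFind hook above):

// jsonAgg returns a plain JSON string, e.g. '[{"id":1,"login":"..."}]',
// or '[]' when there are no matching rows
const json = await User.jsonAgg()
console.log(typeof json) // 'string'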
I wish that Model.jsonAgg() produced SELECT *some magic* FROM table as it is in the code above, and that Model.select('name').where({name: 'name'}) produced
SELECT *some magic* FROM (
  SELECT name FROM table WHERE name = 'name'
) t
(I built such helpers with ActiveRecord, so I miss them here), but it appears to be too hard; a functional-style helper would be easier.
Not easy, but I got a working prototype:
The query object given to the beforeFind hook is a knex query. It's not the most elegant solution, but at the moment the beforeFind hook can't just return a brand new query object, so we have to modify the existing one.
this.beforeFind(function(query) {
  const connection = this.store.connection
  this.asRaw()

  query.modify(function(queryBuilder) {
    queryBuilder
      .from(queryBuilder.clone().as('json_subquery')) // clone the current query
      .clearSelect() // remove all selects
      .clearWhere() // remove all conditions
      .clearOrder() // remove all sorting
      .clearHaving() // remove all havings
      .select(
        connection.raw(
          `coalesce(json_agg(row_to_json(json_subquery.*)), '[]')::text as row`
        )
      )
  })

  // undocumented knex functions!!
  query._clearGrouping('group') // remove all grouping - there is currently no native knex method?!
  Object.assign(query._single, {
    // remove limit and offset
    limit: undefined,
    offset: undefined
  })
}, -1000) // run after all other `beforeFind` hooks
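For illustration, with this hook in place a scoped query would be wrapped roughly like this (a sketch of the intended result; the users table and the exact SQL knex generates are assumptions):

const rows = await User.select('name').where({name: 'name'})
// roughly: SELECT coalesce(json_agg(row_to_json(json_subquery.*)), '[]')::text as row
//          FROM (SELECT name FROM users WHERE name = 'name') as json_subquery
console.log(rows) // e.g. [{ row: '[{"name":"name"}]' }]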
You'll need to add some logic to switch between the normal mode and the JSON subquery one, like you did in your last example. I think this could be implemented directly in openrecord (PG only).
What do you think?
Does this solve your problem?
Yes, exactly, thanks a lot!
Now I modified my base model to have this definition:
this.scope('jsonAgg', function() {
  this.setInternal('jsonAgg', true)
})

this.beforeFind(function(query) {
  if (!this.getInternal('jsonAgg')) return
  // here your code
})
And benchmarked the selection:
let start = Date.now()
for (let i = 0; i < 100; i++)
  await User.select('a', 'b', 'c', 'd', 'e').jsonAgg()
console.log(Date.now() - start)

start = Date.now()
for (let i = 0; i < 100; i++)
  JSON.stringify(await User.select('a', 'b', 'c', 'd', 'e'))
console.log(Date.now() - start)
jsonAgg rules here: on the first request they're almost the same, then it takes 0.17s vs 1.03s for JSON.stringify.
Strange result, it looks odd: it shows that most of the time is spent inside the JS libraries (knex, OpenRecord) and JSON.stringify, not in the database query.
Awesome! If you have a flamegraph or similar benchmarking results regarding time spent, please attach them to this issue. I'm very interested in reducing overhead.
Thanks, Philipp
ActiveRecord has useful methods select_value, select_values, select_rows and others. I'd like to use row_to_json from Postgres as it is the most efficient way to return results from an API. I'd like to add some helper kind of this:
(not sure yet how a static method could be integrated with OpenRecord models)
And then use it like:
The select_value method should take the first column of the first row of the result, without parsing, and just return it.
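For illustration only, such a helper might look roughly like this inside a model's definition() (all names here are hypothetical; this is not the snippet referenced above):

// hypothetical sketch of a select_value-style helper; names are assumed
this.scope('selectValue', function() {
  this.asRaw()
  this.singleResult().setInternal('select_value', true)
})

this.afterFind(function(data) {
  if (!this.getInternal('select_value')) return
  // return the first column of the first row as-is, without further parsing
  const firstKey = Object.keys(data.result || {})[0]
  data.result = firstKey === undefined ? null : data.result[firstKey]
})

// usage, e.g.: const count = await User.select('count(*)').selectValue()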
Maybe I could try to implement it and make a PR. Would it be useful, what do you think?
By the way, OpenRecord looks like the most convenient library among other ORMs for Node.js, especially for RoR developers, thanks a lot!