Closed: JonForest closed this issue 7 years ago
This is great stuff. Super useful to know. Thanks @JonForest
The main improvement I could see would be to drop the raw querying via the connector and integrate knex, giving us (a) consistency, (b) promises, (c) better, more manageable syntax than raw SQL, and (d) additional parameter safety?? It should be a simple thing to read datasources.json for the connection details and put the knex instance on app in a boot script or whatever.
Thanks a lot for posting this. This is helpful
The blog post was written up here: https://www.mediasuite.co.nz/blog/using-native-sql-loopback/ Closed! :)
I've been working on a blog post for the soon-to-be Mediasuite blog. This is what I've got so far, but it's not been polished in the slightest. I'd love some feedback and some technical validation.
The examples in here are basically ripped from some code I wrote in My Worksites. I'm going to replace them with code from this repo: https://github.com/mediasuitenz/blog-books-and-authors
All the excitement, such that it is, happens in here: https://github.com/mediasuitenz/blog-books-and-authors/blob/master/server/models/book.js
This blog post will undergo some serious polishing before it goes live, but before I do that, I'd like to ensure it isn't the proverbial turd.
Using Native SQL in Loopback
The Case For Native SQL
Sometimes you want to use a join. I know, I know, we can get a super set and filter it down, but relational databases are really good at this kind of thing. They’ve been doing it for a long, long time. As swift as the Node runtime has become, it can’t, and shouldn’t, compete.
We want to limit our query by values on other models, but Loopback does not give us the facility to do this. There are a couple of options. 1) Create a view and map a Loopback model to it. It’s a good strategy, and one we use for some search features, but it’s not practical if you want to use the results to populate an existing model already mapped to a database table. 2) Write SQL, execute it, and hydrate the Loopback models with the results by hand.
This blog post is going to examine the second of these options. Some of the code in here is specific to the PostgreSQL connector, though the same pattern will be appropriate for any relational database.
Getting Access to the Connector
To access the methods required, we need to get hold of the database connector.
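Something along these lines (a fragment, assuming a conventional LoopBack 2/3 app with a datasource named `db` in datasources.json):

```javascript
// `app` is the booted LoopBack application object
const connector = app.dataSources.db.connector;
```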
If we deconstruct this a little bit, what are we doing?

- `app` is the main Loopback object
- `dataSources` represents the data sources that are declared in the `datasources.json` (or equivalent js file)
- `db` is the name of our chosen datasource, in this case the PostgreSQL database
- `connector` is the object that contains the useful methods for executing the SQL and managing the results

In fact, the `connector` object is so useful, and I don’t want to keep typing it all the time, that we should create a reference to it that’s easier to type.

// bit about the bind if needed

https://docs.strongloop.com/display/public/LB/Connecting+models+to+data+sources - basics of setting up a data source
https://docs.strongloop.com/display/public/LB/Environment-specific+configuration - how to set up environment-specific data sources, and a bit more detail on what the key/values in the datasource object are used for.
Further reading for interest
https://docs.strongloop.com/display/public/LB/Advanced+topics%3A+data+sources - setting up programmatic data sources
https://docs.strongloop.com/display/public/LB/PostgreSQL+connector - PostgreSQL connector info
Running the Query
Now that we’ve got access to the connector, this bit is really easy.
For the sake of this example, let’s assume a relatively simple query
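For example (table and column names here are assumptions loosely based on the books-and-authors example, not necessarily the repo’s actual schema):

```javascript
// A join that Loopback's filter syntax can't express directly: books
// restricted by a value on the related author table, using a $1
// placeholder for the PostgreSQL connector's parameter binding.
const sql = `
  SELECT book.*, author.*
  FROM book
  JOIN author ON author.id = book.authorid
  WHERE author.name = $1
`;
```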
Then we can use this as simply as:
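Sketched with a stubbed connector standing in for `app.dataSources.db.connector`, so the shape is visible in isolation (the stub and row data are illustrative):

```javascript
// Stub standing in for app.dataSources.db.connector.
const connector = {
  execute(sql, params, cb) {
    // pretend the database returned one row
    process.nextTick(() => cb(null, [{ id: 1, title: 'Dune' }]));
  }
};

const sql = 'SELECT book.* FROM book WHERE book.id = $1';
connector.execute(sql, [1], (err, resultObjects) => {
  if (err) throw err;
  // resultObjects is an array of plain objects, one per row
  console.log(resultObjects[0].title); // Dune
});
```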
Loopback is still heavily callback based, and since we’re quite deep into its API there is no promise equivalent of this method.
Hydrating the Models
We now have `resultObjects`, an array of objects, with each object representing one row of results from the query. Each row contains all the fields from, in this example, the `worksite` and `worksitemeta` tables.

Warning: If you have columns of the same name in each table, only one field will be present in the javascript object. One table’s data will be lost. This is a real problem you’ll have to manage and design for.
We have to turn the data we get back from the server into the correct structure for hydrating the Loopback model. We can do that using:
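A sketch of the idea; `fromRow` comes from loopback-connector’s SQL base class, and the stub below only imitates its column-to-property mapping (the model and column names are assumptions):

```javascript
// Stub imitating the connector's fromRow: the real method uses the model
// definition's data-mapping settings to translate column names into model
// property names.
const connector = {
  fromRow(model, rowData) {
    const columnToProperty = { sitename: 'siteName', createdat: 'createdAt' };
    const data = {};
    for (const column of Object.keys(rowData)) {
      data[columnToProperty[column] || column] = rowData[column];
    }
    return data;
  }
};

const row = { id: 1, sitename: 'Depot', createdat: '2016-01-01' };
const worksiteData = connector.fromRow('Worksite', row);
// worksiteData is now keyed by property names, ready to hydrate a model
```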
The `fromRow` method performs the mapping from the model property key to the field name. From here, we simply create new instances of the models we want using this data.

Putting these bits of code together, we can end up with something like this:
https://docs.strongloop.com/display/public/LB/Model+definition+JSON+file - see “Data Mapping Properties” for how the data mapping is managed. It’s normally not required to manually add these details, though here is where you can implement non-default options.
What About Related Models
Well, that’s a good question. When you query with Loopback, if you want the related model you can issue the query with an `include` instruction, but you can’t do that here. You’ll note in the SQL there were two models’ worth of data returned. If you have unique field names, it is as simple as passing the one results object (`worksiteRaw` in this example) into the `fromRow` method for each model. You then need to link the two explicitly, e.g.
And that’s it! Your models are hydrated, linked, and ready to be sent back via your API.
We Suffer In Testing
The downside of this approach, rather obviously, is that it relies on a PostgreSQL backend. Unit testing of Loopback apps predominantly uses the Memory connector, which means that code paths that require SQL population can’t be tested in this fashion.
The only solution would be to create a unit-test-specific database and use that connector when in the node ‘test’ environment. While this would solve the problem, at Media Suite we’ve built some cool little hacks to make running the unit tests insanely fast - even with a full set-up and tear-down of fixture data for each test. For instance, our test suite currently rips through 390 tests in somewhere around 12 seconds on my Macbook. If we were to move our tests to Postgres, we could easily be back around the eight-minute mark.
This remains an outstanding area of investigation.