Most applications work with databases that have many tables. Because Scala macros do not support parametric polymorphism, there is a lot of repetitive code that needs to be written for every table. This becomes an issue when a database contains dozens or hundreds of tables.
I can imagine several possible approaches for generating this boilerplate, described at the end of this document. Each approach would start the same way, by processing case classes that match the regexes specified by the Config file, and generating code. Each table's queries would be generated into a unique type.
The config file would need parameters to control the generated code:
quill {
  table-generation {
    package-name { // regex matches fully qualified case class names to packages for DAO types
      "model.persistence.user.*" : "model.persistence.user.dao"
      ".*?xxx.*" : "model.persistence.xxx.dao"
      ".*" : "model.persistence.dao"
    }
    generate-within = trait // or class, or abstract class
    default-cache = strong // or soft, or none
    cache-override {
      "user.*" : soft
    }
    execution-context = model.myExecutionContext // default value: scala.concurrent.ExecutionContext.Implicits.global
  }
}
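The package-name section maps regexes to target packages, so resolution is an ordered first-match-wins lookup against the fully qualified case class name. A minimal sketch of that lookup (the DaoPackageResolver name and its API are hypothetical, not part of the proposal):

```scala
import scala.util.matching.Regex

// Hypothetical helper: resolve the target package for a generated DAO
// trait by matching the fully qualified case class name against the
// ordered regex -> package entries from the package-name config section.
object DaoPackageResolver {
  // Order matters: the first matching regex wins; ".*" is the catch-all.
  val packageMappings: Seq[(Regex, String)] = Seq(
    "model.persistence.user.*".r -> "model.persistence.user.dao",
    ".*?xxx.*".r                 -> "model.persistence.xxx.dao",
    ".*".r                       -> "model.persistence.dao"
  )

  def packageFor(fqcn: String): String =
    packageMappings
      .collectFirst { case (regex, pkg) if regex.pattern.matcher(fqcn).matches => pkg }
      .getOrElse("model.persistence.dao")
}
```

For example, `DaoPackageResolver.packageFor("model.persistence.user.User")` yields `model.persistence.user.dao`, while `model.persistence.Course` falls through to the `.*` catch-all.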
For example, given the above config file and tables called course and user, the following traits would be generated: model.persistence.dao.CourseDaoLike and model.persistence.user.dao.UserDaoLike. The traits would follow the default Quill naming convention, so the corresponding domain classes would be model.persistence.Course and model.persistence.user.User.
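The trait-naming convention amounts to stripping the package prefix from the case class name and appending a suffix. A one-line sketch (the helper name is an assumption, not part of an existing API):

```scala
// Hypothetical helper: derive the generated trait's simple name from a
// fully qualified case class name, per the <Table>DaoLike convention.
def daoTraitName(fqcn: String): String =
  fqcn.substring(fqcn.lastIndexOf('.') + 1) + "DaoLike"
```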
This is what the contents of UserDao.scala might look like:
package model.persistence.dao

import model.User
import model.persistence.Types.{IdLong, IdOptionLong}
import model.persistence.{CachedPersistence, Id, QuillImplicits, SoftCacheLike}
import scala.concurrent.ExecutionContext
import scala.language.postfixOps

trait UserDaoLike extends CachedPersistence[Long, Option[Long], User]
    with SoftCacheLike[Long, Option[Long], User]
    with QuillImplicits {
  import model.persistence.QuillConfiguration.ctx._

  implicit val ec: ExecutionContext = model.myExecutionContext

  val queryAll: Quoted[EntityQuery[User]] = quote { query[User] }

  override val _findAll: List[User] = run { quote { query[User] } }

  val queryById: IdOptionLong => Quoted[EntityQuery[User]] =
    (id: IdOptionLong) =>
      quote { query[User].filter(_.id == lift(id)) }

  val _deleteById: IdOptionLong => Unit =
    (id: IdOptionLong) => {
      run { quote { queryById(id).delete } }
      ()
    }

  val _findById: IdOptionLong => Option[User] =
    (id: IdOptionLong) =>
      run { quote { queryById(id) } }.headOption

  val _insert: User => User =
    (user: User) => {
      val id: IdOptionLong = try {
        run { quote { query[User].insert(lift(user)).returning(_.id) } }
      } catch {
        case e: Throwable =>
          Logger.error(e.getMessage) // assumes a Logger is in scope
          throw e
      }
      user.setId(id)
    }

  val _update: User => User =
    (user: User) => {
      run { quote { queryById(user.id).update(lift(user)) } } // returns the affected row count, not an id
      user
    }

  @inline override def findById(id: IdOptionLong): Option[User] =
    id.value.map(theCache.get).getOrElse { run { queryById(id) }.headOption }
}
The above trait could be used to create the DAO for User:
object Users extends UserDaoLike {
  // User-specific DAO code goes here
}
The possible approaches that I can think of for implementing the above are:
1. Generate Scala code into files. The common code would be placed in target/scala-X.YY/quill-cache. The top-level code would be generated into src/main/scala/packageName, so it would be checked in as part of the project.
2. Emit code at compile time using a macro that drives the Quill macros. How feasible is this?
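Approach 1 reduces to writing generated source strings into a package-shaped directory tree. A minimal sketch, assuming the generator has already produced the source text (the writeGenerated name and signature are hypothetical):

```scala
import java.nio.file.{Files, Path}

// Hypothetical sketch of approach 1: write a generated trait's source
// to root/<package dirs>/<TraitName>.scala, creating directories as needed.
// `root` would be src/main/scala for checked-in code, or
// target/scala-X.YY/quill-cache for the common generated code.
def writeGenerated(root: Path, packageName: String, traitName: String, source: String): Path = {
  val dir = packageName.split('.').foldLeft(root)(_ resolve _)
  Files.createDirectories(dir)
  Files.write(dir.resolve(s"$traitName.scala"), source.getBytes("UTF-8"))
}
```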