ThoughtWorksInc / DeepLearning.scala

A simple library for creating complex neural networks
http://deeplearning.thoughtworks.school/
Apache License 2.0

General LA, Data Flow and Reactive Programming #5

Open sirinath opened 7 years ago

sirinath commented 7 years ago

Given what you are implementing for deep learning, perhaps with a bit more flexibility you could make this library such that it can also be used to code application logic in linear algebra (LA) / Data Flow / Reactive paradigms. Is it possible to give this flexibility?

Atry commented 7 years ago

Sorry but I did not get your point. Could you paste some code or pseudocode?

sirinath commented 7 years ago
object A {
   val a: Matrix = Matrix(3, 3)
   val b: Matrix = Matrix(3, 3)
   val c: Matrix = a * b // c should stay in sync whenever a or b changes
}

A.a.append(myVector)

The above is similar to https://github.com/lihaoyi/scala.rx, but over vectors with a time dimension.
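A hand-rolled sketch of that reactive behavior (scala.rx's real API uses `Var`/`Rx` blocks; the `Cell` class and `lift2` helper below are hypothetical, for illustration only, and an elementwise product stands in for matrix multiplication):

```scala
object ReactiveSketch {
  // A mutable cell that notifies observers when its value changes.
  final class Cell[A](initial: A) {
    private var value: A = initial
    private var observers: List[A => Unit] = Nil
    def now: A = value
    def update(a: A): Unit = { value = a; observers.foreach(_(a)) }
    def onChange(f: A => Unit): Unit = observers ::= f
  }

  // Derive a cell that recomputes whenever either input changes.
  def lift2[A, B, C](x: Cell[A], y: Cell[B])(f: (A, B) => C): Cell[C] = {
    val out = new Cell(f(x.now, y.now))
    x.onChange(_ => out.update(f(x.now, y.now)))
    y.onChange(_ => out.update(f(x.now, y.now)))
    out
  }

  val a = new Cell(Vector(1.0, 2.0))
  val b = new Cell(Vector(3.0, 4.0))
  // c stays in sync with a and b, analogous to `val c = a * b` above.
  val c = lift2(a, b)((xs, ys) => xs.zip(ys).map { case (p, q) => p * q })
}
```

Appending along the time dimension then becomes something like `a.update(a.now :+ 5.0)`, after which `c.now` has already been recomputed automatically.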

object A {
   val a: Table = Table("Col A", "Col B")
   val b: Table = Table("Col A", "Col C")
   val c: Table = a * b // c should stay in sync whenever a or b changes
}

A.a.append(myRow)

A.a.latest("Col A")

The above is inspired by http://flix.github.io/.

Atry commented 7 years ago

I guess you mean that you want to let the same DSL syntax run on different kernels. To achieve that goal, some implicit abstract factories are required to create the DSL's ASTs, as in the tagless-final approach described at http://okmij.org/ftp/tagless-final/ . The current DeepLearning.scala codebase does not use this approach.
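For reference, a minimal tagless-final sketch in Scala (the `MatrixDsl` trait and both interpreters below are hypothetical, not part of DeepLearning.scala): one `program` is written once against an abstract factory and then run under two different kernels.

```scala
object TaglessFinalDemo {
  // Abstract factory for the DSL's constructors; Repr is the kernel's AST/value type.
  trait MatrixDsl[Repr] {
    def lit(rows: Int, cols: Int, fill: Double): Repr
    def mul(a: Repr, b: Repr): Repr
  }

  // One program, written against the abstract factory rather than a concrete AST.
  def program[Repr](implicit dsl: MatrixDsl[Repr]): Repr = {
    import dsl._
    mul(lit(2, 2, 1.0), lit(2, 2, 2.0))
  }

  // Kernel 1: pretty-print the expression.
  object Printer extends MatrixDsl[String] {
    def lit(rows: Int, cols: Int, fill: Double): String = s"Matrix($rows x $cols, $fill)"
    def mul(a: String, b: String): String = s"($a * $b)"
  }

  type Dense = Vector[Vector[Double]]

  // Kernel 2: evaluate eagerly on dense matrices.
  object Eval extends MatrixDsl[Dense] {
    def lit(rows: Int, cols: Int, fill: Double): Dense = Vector.fill(rows, cols)(fill)
    def mul(a: Dense, b: Dense): Dense =
      Vector.tabulate(a.length, b.head.length) { (i, j) =>
        a(i).indices.map(k => a(i)(k) * b(k)(j)).sum
      }
  }
}
```

Swapping the kernel is just a matter of passing a different factory: `program(TaglessFinalDemo.Printer)` yields a string rendering of the expression, while `program(TaglessFinalDemo.Eval)` computes the dense result.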