XtractOpen / Meganet.jl

A fresh approach to deep learning written in Julia
http://www.xtract.ai/

started to add the tmp variable is in/out variable to single and doub… #33

Closed: eldadHaber closed this 6 years ago

eldadHaber commented 6 years ago

Made changes to single and doubleSymLayer to support tmp as an input and output. Changed the ResNN to support this change.

This should propagate up ...

coveralls commented 6 years ago


Coverage increased (+0.08%) to 78.217% when pulling b744b4b738294a12557716170a73ebdacf74fb93 on eldadHaber:pull-request/b744b4b7 into bffc199dd23dd1a7c19dffda2261550a304709d6 on XtractOpen:master.

eldadHaber commented 6 years ago

Good points

Can you do the change? We can discuss after how to extend it.

E

On Jan 30, 2018, at 9:44 PM, Keegan Lensink notifications@github.com wrote:

@klensink commented on this pull request.

In src/integrators/ResNN.jl:

@@ -61,7 +67,7 @@ function apply(this::ResNN{T},theta::Array{T},Y0::Array{T},doDerivative=true) w

 Ydata = zeros(T,0,nex)
 for i=1:this.nt
-    Z,dummy,tmp[i,2] = apply(this.layer,theta[:,i],Y,doDerivative)
+    Z,dummy,tmp[i,2] = apply(this.layer,theta[:,i],Y,doDerivative,tmp[i,2])

tmp[i,2] creates a subarray that copies the data out of tmp into a temporary variable. I suspect this means you would not really be saving many memory allocations by passing the input like this.

I suspect you would need to either pass view(tmp, i, 2) to create a reference to that section in memory, or pass tmp, i, 2 as three input arguments and modify tmp in place within the function's scope, to accomplish this with no allocations.

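[Editor's note: a minimal, self-contained Julia sketch of the two alternatives raised in the review above: a slice such as A[:, 2] allocates a copy, a view aliases the parent array, and the three-argument variant writes its result in place. The work! helper below is hypothetical, not Meganet API, and the undef constructor syntax is post-1.0 (the PR predates it).]

A = ones(4, 4)

s = A[:, 2]             # slicing allocates a new Vector and copies the column
s[1] = 0.0              # mutating the copy...
@assert A[1, 2] == 1.0  # ...leaves the parent array untouched

v = view(A, :, 2)       # a view is a lightweight reference into A, no copy
v[1] = 0.0              # writing through the view...
@assert A[1, 2] == 0.0  # ...mutates the parent array

# The in-place alternative: pass the container plus indices and store directly,
# so the callee allocates nothing for the hand-off.
function work!(tmp::Array{Any,2}, i::Int, j::Int, Y::Array)
    tmp[i, j] = 2 .* Y  # hypothetical computation written straight into tmp
    return nothing
end

tmp = Array{Any}(undef, 2, 2)
work!(tmp, 1, 2, ones(3))
@assert tmp[1, 2] == [2.0, 2.0, 2.0]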

klensink commented 6 years ago

Yup, I'll test it out tomorrow and accept the PR once I get it working.


eldadHaber commented 6 years ago

Not sure if Array{Array{T},1} or Array{Array{T},2}

In reply to @klensink's review comment in src/integrators/ResNN.jl:

@@ -53,6 +53,12 @@ function apply(this::ResNN{T},theta::Array{T},Y0::Array{T},doDerivative=true) w

 nex = div(length(Y0),nFeatIn(this))
 Y = reshape(Y0,:,nex)
 tmp = Array{Any}(this.nt+1,2)

@eldadHaber Does tmp need to be an Array{Any}? It looks like tmp could be an Array{Array{T}, 1}

klensink commented 6 years ago

From how you defined it here (tmp = Array{Any}(this.nt+1,2)), it looks like it would be an Array{Array{T}, 2}. I tried to quickly edit my comment above, but it looks like you got this by email, so you saw my typo.

I was more looking to confirm whether the elements of tmp will be arrays.
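[Editor's note: a two-line check of the typo being discussed: constructing with two sizes gives a two-dimensional container. Written with the post-1.0 undef syntax; the code in the PR predates it.]

tmp = Array{Any}(undef, 4, 2)  # two sizes -> a 2-d container
@assert tmp isa Array{Any,2}   # i.e. Matrix{Any}, not Array{Any,1}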

eldadHaber commented 6 years ago

In general yes


lruthotto commented 6 years ago

I'm skeptical about making tmp an Array{Array{T}, 2}. Note that in the second column, we'll have temp variables from the layers. For some layers, this will itself be an Array{T,2}. So, on the ResNN level, I don't think we can constrain the type.

lruthotto commented 6 years ago

Might be worth trying this, though. I'd expect the tests to fail if you add the constraint.
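[Editor's note: a neutral sketch of what an Array{Array{T},2} container would and would not accept, which is the crux of whether the constraint holds. Any Array{T,N} element is fine; anything whose element type is not exactly T is rejected at runtime. The values are illustrative, not Meganet layer temps; undef is post-1.0 syntax.]

T = Float64
tmp = Array{Array{T}}(undef, 2, 2)

tmp[1, 1] = rand(T, 5)     # ok: Vector{T} is Array{T,1} <: Array{T}
tmp[1, 2] = rand(T, 3, 3)  # ok: Matrix{T} is Array{T,2} <: Array{T}

# A layer whose temp storage is itself a collection of arrays would break
# the constraint, which is presumably what would make the tests fail:
# tmp[2, 1] = [rand(T, 3), rand(T, 3)]  # would throw a MethodError:
#                                       # Vector{Vector{T}} has element type
#                                       # Vector{T}, not T, so no convert exists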

DavidBegert commented 6 years ago

Closing as outdated.