Closed eldadHaber closed 6 years ago
Good points
Can you do the change? We can discuss after how to extend it.
E
On Jan 30, 2018, at 9:44 PM, Keegan Lensink notifications@github.com wrote:
@klensink commented on this pull request.
In src/integrators/ResNN.jl:
@@ -61,7 +67,7 @@ function apply(this::ResNN{T},theta::Array{T},Y0::Array{T},doDerivative=true) w
Ydata = zeros(T,0,nex)
for i=1:this.nt
- Z,dummy,tmp[i,2] = apply(this.layer,theta[:,i],Y,doDerivative)
+ Z,dummy,tmp[i,2] = apply(this.layer,theta[:,i],Y,doDerivative,tmp[i,2])
tmp[i, 2] creates a sub array that copies the data out of tmp into a temporary variable. I suspect that this means you would not really be saving many memory allocations by passing the input like this.
I suspect you would need to either pass view(tmp, i, 2) to create a reference to that section in memory, or pass `tmp, i, 2` as three input arguments and modify tmp in place within the function's scope to accomplish this with no allocations.
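The copy-vs-view distinction raised in the review comment can be sketched as follows (a minimal standalone example, not the Meganet.jl code): slicing with getindex allocates a fresh copy, while view creates a SubArray that aliases the parent's memory.

```julia
# Sketch of the copy-vs-view behaviour described above (not Meganet.jl code).
A = zeros(3, 4)

row = A[1, :]            # getindex: allocates a new Vector and copies the data
row[1] = 5.0
@assert A[1, 1] == 0.0   # parent untouched -- we mutated a copy

v = view(A, 1, :)        # SubArray: references A's memory, no data copy
v[1] = 5.0
@assert A[1, 1] == 5.0   # parent updated through the view
```

This is why passing view(tmp, i, 2) (or the indices themselves) avoids the allocation that tmp[i, 2]-style slicing would incur.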
Yup, I'll test it out tomorrow and accept the PR once I get it working.
Not sure if Array{Array{T},1} or Array{Array{T},2}
From: Keegan Lensink Sent: Wednesday, January 31, 2018 10:06 AM To: XtractOpen/Meganet.jl Cc: Eldad Haber; Mention Subject: Re: [XtractOpen/Meganet.jl] started to add the tmp variable is in/out variable to single and doub… (#33)
@klensink commented on this pull request.
In src/integrators/ResNN.jl:
@@ -53,6 +53,12 @@ function apply(this::ResNN{T},theta::Array{T},Y0::Array{T},doDerivative=true) w
nex = div(length(Y0),nFeatIn(this))
Y = reshape(Y0,:,nex)
tmp = Array{Any}(this.nt+1,2)
@eldadHaber Does tmp need to be an Array{Any}? It looks like tmp could be an Array{Array{T}, 1}
From how you defined it here (tmp = Array{Any}(this.nt+1,2)) it looks like it would be an Array{Array{T}, 2}. I tried to quickly edit my comment above, but it looks like you got this by email so you saw my typo. I was more looking to confirm if the elements of tmp will be arrays?
In general yes
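The motivation behind the question can be sketched like this (an illustrative example, not the Meganet.jl source; modern constructor syntax, whereas the thread's Array{Any}(this.nt+1,2) is the pre-0.7 form): an Any-typed container accepts anything, so the compiler cannot infer what comes back out of it.

```julia
# Sketch: an Any-typed container holds arrays but enforces nothing,
# and its element type is opaque to the compiler.
tmp_any = Matrix{Any}(undef, 2, 2)
tmp_any[1, 1] = rand(3)         # elements can be arrays...
tmp_any[1, 2] = "not an array"  # ...but nothing stops non-arrays

@assert eltype(tmp_any) == Any
```

Constraining the element type, as suggested above, is what lets accesses stay type-stable.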
I'm skeptical about making tmp an Array{Array{T}, 2}. Note that in the second column, we'll have temp variables from the layers. For some layers, this will itself be an Array{T,2}. So, on the ResNN level, I don't think we can constrain the type.
Might be worth trying this though. I'd expect the tests to fail if you add the constraint.
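One point worth noting when trying this (a sketch under the assumption that all the layer temporaries really are arrays of T; modern constructor syntax): Array{T} with the dimension parameter left free is abstract, so a single Array{Array{T},2} container can hold temporaries of mixed rank, including the Array{T,2} case raised above.

```julia
# Sketch of the typed-container alternative: Array{T} (rank left free)
# is abstract, so one container accepts vectors and matrices alike.
T = Float64
nt = 3
tmp = Array{Array{T}}(undef, nt + 1, 2)

tmp[1, 1] = rand(T, 5)       # a Vector{Float64} fits
tmp[1, 2] = rand(T, 4, 2)    # so does a Matrix{Float64}

@assert eltype(tmp) == Array{T}
@assert tmp[1, 2] isa Array{T, 2}
```

The constraint would only break if some layer stores a non-array temporary (e.g. a tuple or nothing), which is exactly what running the tests would reveal.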
Closing as outdated.
Made changes to single and doubleSymLayer to support tmp as input and output. Changed the ResNN to support this change.
This should propagate up ...
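The "tmp as in/out variable" pattern the PR describes can be sketched as follows (a hypothetical example; the function name, shapes, and signature are assumptions, not the actual Meganet.jl API): the caller preallocates tmp once, and each call writes into that storage instead of allocating.

```julia
using LinearAlgebra  # for mul!

# Hypothetical sketch of the in/out tmp pattern: tmp is allocated once
# by the caller and reused as scratch storage on every call.
function apply!(theta::Matrix{T}, Y::Matrix{T}, tmp::Matrix{T}) where T
    mul!(tmp, theta, Y)   # writes theta*Y into tmp, no fresh allocation
    return tmp
end

theta = rand(4, 4)
Y = rand(4, 8)
tmp = similar(Y)          # allocated once, up front
apply!(theta, Y, tmp)
@assert tmp ≈ theta * Y
```

Propagating this upward means each outer level (e.g. the ResNN loop over time steps) threads its slice of the preallocated storage down into the layer calls.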