Hi there,
I'm having difficulties imposing a hard-constraint Neumann boundary condition in DeepXDE.
Concretely, say I want to solve some PDE on the space-time domain $(x,t)\in[-1,1]\times[0,1]$ with Neumann boundary condition $\partial_x u(1,t)=0$. One way to hard-constrain this boundary condition is by transforming the output of a neural network $u_{\mathrm{NN}}=u_{\mathrm{NN}}(x,t)$ as follows:
$$u_{\mathrm{transformed}}(x,t) := x\cdot \bigl(u_{\mathrm{NN}}(x,t)-u_{\mathrm{NN}}(1,t)-\partial_x u_{\mathrm{NN}}(1,t)\bigr).$$
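As a quick sanity check of this construction (independent of DeepXDE), here is a minimal sketch using a hypothetical smooth stand-in for $u_{\mathrm{NN}}$ and finite differences; since the last two terms are constant in $x$, the product rule gives $\partial_x u_{\mathrm{transformed}}(1,t)=0$ for any choice of network:

```python
import numpy as np

def u_nn(x, t):
    # Hypothetical stand-in for the network output u_NN(x, t);
    # any smooth function works for checking the construction.
    return np.sin(2.0 * x) * np.exp(-t) + x * t

def du_nn_dx(x, t, h=1e-6):
    # Central finite difference for the x-derivative of u_NN.
    return (u_nn(x + h, t) - u_nn(x - h, t)) / (2.0 * h)

def u_transformed(x, t):
    # u_transformed(x, t) = x * (u_NN(x, t) - u_NN(1, t) - d/dx u_NN(1, t))
    return x * (u_nn(x, t) - u_nn(1.0, t) - du_nn_dx(1.0, t))

def du_transformed_dx(x, t, h=1e-6):
    # Central finite difference for the x-derivative of u_transformed.
    return (u_transformed(x + h, t) - u_transformed(x - h, t)) / (2.0 * h)

# The Neumann condition holds by construction: the x-derivative
# at x = 1 vanishes (up to finite-difference error) for every t.
for t in (0.0, 0.3, 0.7, 1.0):
    print(abs(du_transformed_dx(1.0, t)))
```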
Is there a way to implement this transformation in DeepXDE using an output_transform function (in particular, the evaluation of the NN and its $x$-derivative at $(1,t)$), or should I use callbacks for this task?
Any help would be very much appreciated.
Best, Christopher