**Closed** · jlchan closed this pull request 1 year ago
Merging #59 (0adca35) into main (963eb3f) will decrease coverage by 2.10%. The diff coverage is 0.00%.
```diff
@@            Coverage Diff             @@
##             main      #59      +/-   ##
==========================================
- Coverage   96.78%   94.69%    -2.10%
==========================================
  Files          22       23       +1
  Lines        2492     2547      +55
==========================================
  Hits         2412     2412
- Misses         80      135      +55
```
| Impacted Files | Coverage Δ | |
|---|---|---|
| src/RefElemData_TensorProductWedge.jl | 0.00% <0.00%> (ø) | |
| src/StartUpDG.jl | 100.00% <ø> (ø) | |
Closing in favor of https://github.com/jlchan/StartUpDG.jl/pull/93
@Davknapp this is the draft I have for a tensor product wedge. The setup only involves changing the reference element constructor; everything else should remain the same. Here's an example of how I call it:
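The original example was not captured in this thread; below is a hypothetical sketch of such a call, based on the `TensorProductWedge` constructor as it later landed in the merged version of this work. Polynomial degrees and variable names are illustrative assumptions, not taken from the PR:

```julia
using StartUpDG

# Hypothetical sketch: build separate line and triangle reference elements,
# then combine them into a tensor product wedge reference element.
line = RefElemData(Line(), 2)   # degree 2 in the extruded (vertical) direction
tri  = RefElemData(Tri(), 3)    # degree 3 on the triangular base

# The wedge constructor only changes here; downstream usage stays the same.
rd = RefElemData(Wedge(), TensorProductWedge(tri, line))
```

Note the degrees in the two directions can differ, which is the main point of the tensor product construction.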
Trixi.jl specializes `RefElemData` for FDSBP types using `DerivativeOperator`s from @ranocha's SummationByPartsOperators.jl package (see https://github.com/trixi-framework/Trixi.jl/blob/6e5512e8f4b8b2c3bf6811c6a3b0e548de4a91f6/src/solvers/dgmulti/sbp.jl#L66-L120). The intent is to use these with the `TensorProductWedge` type to construct discretization matrices. There are two major changes needed for this PR:
- `RefElemData` currently assumes all differentiation matrices have the same type; however, it might make sense to allow the tensor product wedge to have differentiation matrices of different types (e.g., a sparse differentiation matrix in the vertical direction). This would be a breaking change; see https://github.com/jlchan/StartUpDG.jl/issues/58.
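To illustrate the breaking change being discussed, here is a minimal sketch (a hypothetical container, not the actual StartUpDG internals) of differentiation matrices carrying a separate type parameter per direction, so that dense operators on the triangular base can coexist with a sparse operator in the vertical direction:

```julia
using LinearAlgebra, SparseArrays

# Hypothetical container: one type parameter per differentiation matrix,
# so each direction can store whatever matrix type suits it.
struct WedgeDiffMatrices{Tr, Ts, Tt}
    Dr::Tr   # differentiation w.r.t. r (triangular base)
    Ds::Ts   # differentiation w.r.t. s (triangular base)
    Dt::Tt   # differentiation w.r.t. t (vertical / extruded direction)
end

# Dense matrices in the plane; a sparse central-difference stencil vertically.
Dr = rand(6, 6)
Ds = rand(6, 6)
Dt = spdiagm(-1 => fill(-0.5, 5), 1 => fill(0.5, 5))

D = WedgeDiffMatrices(Dr, Ds, Dt)
typeof(D.Dt) <: SparseMatrixCSC   # the vertical operator keeps its sparse type
```

Under the current single-type assumption, storing these in one homogeneous field would force the sparse matrix to a common (dense) type, which is why relaxing it is a breaking change.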