acroy / Expokit.jl

Julia implementation of EXPOKIT routines

Roadmap #9

Open · mforets opened this issue 7 years ago

mforets commented 7 years ago

as a step forward what do you think about:

CC: @acroy, @garrison

acroy commented 7 years ago

Sounds great! I would probably add some benchmarks, but this could also be done later on.

mforets commented 7 years ago

cool. let me add those new issues.

Jutho commented 7 years ago

How about giving these functions a bit more of a descriptive name, in line with Julia's style conventions? E.g. using a keyword argument for the algorithm or so.

mforets commented 7 years ago

+1. suggestion:

acroy commented 7 years ago

I am not sure about that; expmv, phimv and padm are calculating different things (i.e. solve different ODEs) in different ways. At least expmv (or expv) is used in other languages as well and thus people "know what they get". Extending Base.expm is probably reasonable.

How about having a "matrix exponential" type? Has that been discussed already?

Jutho commented 7 years ago

I agree that a suitable design for the interface is non-trivial. You cannot extend Base.expm by just a keyword argument.

A matrix exponential type sounds like a nice Julian way to go. You can then define A_mul_B!(w, expA, v; alg = ...) for algorithms that only act on a vector, and functions like full(expA) for algorithms that compute the full exponential.
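For concreteness, a rough sketch of what such a lazy wrapper could look like (the MatrixExp name and the alg keyword are illustrative only, and this uses the 0.6-era names A_mul_B!, expm and full from this discussion, which later Julia renames to mul!, exp and Matrix):

```julia
import Base: A_mul_B!, full
using Expokit: expmv!

# Lazy stand-in for expm(t*A); nothing is computed at construction time.
struct MatrixExp{T<:AbstractMatrix}
    A::T
    t::Float64
end

# Apply expm(t*A) to a vector without ever forming the full exponential.
function A_mul_B!(w::AbstractVector, E::MatrixExp, v::AbstractVector; alg = :krylov)
    alg == :krylov || throw(ArgumentError("unsupported algorithm: $alg"))
    return expmv!(w, E.t, E.A, v)
end

# Dense fallback for algorithms that compute the full exponential.
full(E::MatrixExp) = expm(E.t * full(E.A))
```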

acroy commented 7 years ago

Ah, you are right of course about Base.expm (apparently, I got a bit rusty).

The behaviour you describe is exactly what I had in mind. I guess the new type (say MatrixExp) would be a subtype of AbstractMatrix, or is there any catch (I'm partly thinking of LinearMaps.jl)?

Jutho commented 7 years ago

Good question. I guess an AbstractMatrix does require that you can access individual entries more or less efficiently, though there is no strict definition I think. It could be implemented similarly as a LinearMap, or even as a specific subtype of it (but then you need to add a dependency, and that's something I am also always rather hesitant about).

acroy commented 7 years ago

I would like to avoid dependencies at this stage. Later on one could still tie into your package or whatever.

Access to individual entries could be slow. (One can get an entire column via exp(A)*[...,0,1,0,...] etc.) In the manual I found the statement

The AbstractArray type includes anything vaguely array-like, and implementations of it might be quite different from conventional arrays. For example, elements might be computed on request rather than stored. [...] It is recommended that these operations [getindex and setindex] have nearly constant time complexity, or technically Õ(1) complexity, as otherwise some array functions may be unexpectedly slow.

I am not sure if the latter can be guaranteed in our case.
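To make the cost concrete, a hypothetical getindex for such a type would boil down to something like the following, i.e. one full Krylov solve per requested entry (or per column), which is nowhere near the recommended Õ(1). The helper name expA_entry is purely illustrative:

```julia
using Expokit: expmv

# Hypothetical entry access for a lazy expm(t*A): apply the exponential to the
# j-th unit vector to obtain column j, then read off row i.
function expA_entry(t, A, i, j)
    n = size(A, 1)
    ej = zeros(n); ej[j] = 1.0      # unit vector e_j
    col = expmv(t, A, ej)           # one Krylov run per column of expm(t*A)
    return col[i]
end
```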

Jutho commented 7 years ago

That's what I am also thinking; they shouldn't be subtypes of AbstractArray.

ChrisRackauckas commented 7 years ago

Thanks for reviving this project! We are creating some exponential integrators and hit a wall since this area needs some Julia development.

expmv is a sufficiently well-known name (it's the name of the Higham algorithm at least) that it shouldn't be avoided. phimv is related and sufficiently clear that it's the phi matrix vector product. I would avoid the unnecessary allocation though and make the Pade part algorithm=:pade.

Like matrix exponentials and all of that, it should be a function, and the matrix exponential type whose A_mul_B! calls expmv should be added separately, so that users can easily add this kind of overload to their own abstract linear maps.
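A hedged sketch of that split, staying with the 0.6-era names (expmv_alg! and the :krylov/:pade symbols are illustrative, not existing API; padm is the package's Pade routine):

```julia
using Expokit

# Algorithm keyword on the function itself: :krylov uses the in-place routine
# and avoids temporaries, :pade forms the dense exponential first.
function expmv_alg!(w, t, A, v; algorithm = :krylov)
    if algorithm == :krylov
        return Expokit.expmv!(w, t, A, v)
    elseif algorithm == :pade
        w .= Expokit.padm(full(t * A)) * v
        return w
    else
        throw(ArgumentError("unknown algorithm: $algorithm"))
    end
end
```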

Does anyone see a license at all for ExpoKit? I can't find it, so I'd be very careful and not look at their code.

It would be nice if the Higham algorithm was offered as well (http://epubs.siam.org/doi/abs/10.1137/100788860). I would like it if there was sufficient ability in one package to pull off this kind of benchmarking (http://profs.sci.univr.it/~caliari/pdf/preCKOR13.pdf).

Anyways, if this project is revived (looks like it just was a few days ago!) we will likely abandon our efforts over at https://github.com/JuliaDiffEq/Expmv.jl/issues/1 and use this.

acroy commented 6 years ago

@ChrisRackauckas : I was also very happy that @mforets woke me from my "hibernation" ...

Regarding the license: Before I started this package a few years ago I was in contact with R.B. Sidje and asked him about it. It seems there is no strict license and he is OK with us using his code as a starting point.

The Higham algorithm was actually implemented by @marcusps in his package Expmv.jl. He also did some benchmarking, but it seems not much happened recently. There is also #1 on having the algorithm in this package.

ChrisRackauckas commented 6 years ago

Expmv.jl allocates a ton (no in-place form either) and doesn't have the norm 1 estimation function, and it seems that development has stopped, so I was hoping you guys would pick it up. Looks like that is on the list as #1.

mforets commented 6 years ago

that sounds good to me :) within 4 weeks or so i propose to focus on adding functionality and benchmarks. the interface for inclusion in JuliaDiffEq can be revised afterwards.

ChrisRackauckas commented 6 years ago

Well since you gave me a timeframe I'll hold you to it 😄

marcusps commented 6 years ago

@ChrisRackauckas @acroy Expmv.jl has a norm 1 estimation function in a branch, it just hasn't been fully tested. Development has stopped, true -- I just don't have the bandwidth at the moment.

mforets commented 6 years ago

in connection to the discussion about the interface, have you checked MappedArrays.jl? can the lazy matrix exponential be a subclass of a MappedArray?

Jutho commented 6 years ago

I think MappedArrays is for lazily mapping a function elementwise to an arbitrary array, which is thus quite the opposite of applying a function to a matrix as an operator (however that is implemented).
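A tiny illustration of the difference (using MappedArrays.jl's mappedarray; expm here is the 0.6-era name for the matrix exponential):

```julia
using MappedArrays: mappedarray

A = [0.0 1.0; -1.0 0.0]

elementwise = mappedarray(exp, A)   # lazily applies exp to each entry, like exp.(A)
operator    = expm(A)               # matrix exponential, i.e. the operator exp(A)

elementwise == exp.(A)                          # true
operator    ≈ [cos(1) sin(1); -sin(1) cos(1)]   # true: a rotation by 1 radian
```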

ChrisRackauckas commented 6 years ago

I think this is ready to register. If you are willing to register it, I can review the repo right now and I'll accept it in METADATA.

ChrisRackauckas commented 6 years ago

I just reviewed it. If you register now, just ping me on the METADATA PR and I'll merge.

acroy commented 6 years ago

Done.

ChrisRackauckas commented 6 years ago

It would be nice if Al-Mohy & Higham was added to the roadmap. There's some starter code around Julia to pull from, it would be good to have a complete version for the benchmarks #21. @mforets you've done a fantastic job, and I hope we get at least this one more thing out of you :).

mforets commented 6 years ago

i'm up for it 👌

after reading the paper i'll experiment with the code in @marcusps's ExpmV.jl. i see norm1est and normest1 branches. do you have any comments about that? which one should i look at? and less important for now, but for organizational purposes: where should i send my PR? is here ok? thanks

acroy commented 6 years ago

I am not against including AH in this repo if the license allows it. (I think Marcus mentioned that the MATLAB code is under a BSD license and his code is as well.) If the license is an issue, I think it might be good to start directly from the description in the paper.

ChrisRackauckas commented 6 years ago

Any movement on the AH part?

mforets commented 6 years ago

Nope, sorry.. i've just relocated, among other things, so no free time to code from my side yet.

ChrisRackauckas commented 6 years ago

It's all good. Thank you for all of the contributions you've done so far!