Yes, this would be nice. We should definitely start looking at nalgebra to decide what kinds of structures are needed. I'm guessing these would be approximate, given the limitations of floating-point numbers?
Well, I think the vector types defined in nalgebra should be exact vector spaces as long as you provide them with an exact field. Of course, floats are not an exact field, but the vectors can be used with other scalar types. I think we should not limit this to approximations, even though approximations will probably be used far more often.
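To make the exact/approximate distinction concrete, here is a small standalone sketch. It assumes the `num-rational` crate purely for illustration; it is not part of this proposal. It shows that `f64` addition violates associativity, while exact rationals satisfy it, so exactness of the vector space comes down to exactness of the scalar type:

```rust
// Assumes the `num-rational` crate as an example of an exact scalar type;
// it is only used here for illustration.
use num_rational::Ratio;

fn main() {
    // f64 is only an approximate field: addition is not associative.
    assert_ne!((0.1_f64 + 0.2) + 0.3, 0.1 + (0.2 + 0.3));

    // Exact rationals satisfy the same law exactly, so a vector type
    // parametrized over them would form an exact vector space.
    let r = |n: i64, d: i64| Ratio::new(n, d);
    assert_eq!(
        (r(1, 10) + r(2, 10)) + r(3, 10),
        r(1, 10) + (r(2, 10) + r(3, 10))
    );
}
```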
There's another complication from the uniqueness requirement. A type should be capable of being a vector space in more than one sense. Consider, for example, an n-dimensional vector space over the complex numbers: it is also a 2n-dimensional vector space over the reals. My intuitive solution would be to parametrize the VectorSpace trait over the underlying scalar field, but I have a feeling that this is similar to the parametrization over operator symbols discussed in #18.
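As a rough illustration of what parametrizing over the scalar field could look like (the `VectorSpace` trait and its methods below are hypothetical placeholders, not settled API), a single type can then carry several vector-space structures:

```rust
use std::ops::Add;

/// Hypothetical trait (placeholder name): `Self` is a vector space over
/// the scalar field `F`.
trait VectorSpace<F>: Add<Output = Self> + Sized {
    fn zero() -> Self;
    fn scale(self, scalar: F) -> Self;
}

#[derive(Clone, Copy, PartialEq, Debug)]
struct Complex {
    re: f64,
    im: f64,
}

impl Add for Complex {
    type Output = Complex;
    fn add(self, rhs: Complex) -> Complex {
        Complex { re: self.re + rhs.re, im: self.im + rhs.im }
    }
}

// One-dimensional vector space over the complex numbers themselves.
impl VectorSpace<Complex> for Complex {
    fn zero() -> Complex {
        Complex { re: 0.0, im: 0.0 }
    }
    fn scale(self, s: Complex) -> Complex {
        Complex {
            re: self.re * s.re - self.im * s.im,
            im: self.re * s.im + self.im * s.re,
        }
    }
}

// The same type viewed as a two-dimensional vector space over the reals.
impl VectorSpace<f64> for Complex {
    fn zero() -> Complex {
        Complex { re: 0.0, im: 0.0 }
    }
    fn scale(self, s: f64) -> Complex {
        Complex { re: self.re * s, im: self.im * s }
    }
}
```

With this shape the complex and the real structures of the same data coexist, and the ambiguity is resolved by the type parameter rather than by requiring a unique impl.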
I have started working on implementing Module-like structures.
Do we need separate RightModule and LeftModule traits? And if we have those, do we also want a Bimodule trait?
edit: Also, how should the module-like structures be documented? The current style used for the other structures doesn't work as neatly here, because modules are composite systems.
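For discussion, here is one possible shape for these traits, purely as a hedged sketch; the names `LeftModule`, `RightModule`, `Bimodule` and their methods are placeholders rather than settled API:

```rust
use std::ops::Add;

/// Hypothetical: an additive structure acted on by the ring `R` from the left.
trait LeftModule<R>: Add<Output = Self> + Sized {
    fn left_mul(self, r: R) -> Self;
}

/// Hypothetical: the same, with `R` acting from the right.
trait RightModule<R>: Add<Output = Self> + Sized {
    fn right_mul(self, r: R) -> Self;
}

/// Hypothetical: both actions at once. The compatibility law
/// (r · m) · s == r · (m · s) cannot be expressed in the type system,
/// so it would have to live in the documentation (or in quickcheck-style
/// law tests).
trait Bimodule<R, S>: LeftModule<R> + RightModule<S> {}

#[derive(Clone, Copy, Debug)]
struct Vec2 {
    x: f64,
    y: f64,
}

impl Add for Vec2 {
    type Output = Vec2;
    fn add(self, rhs: Vec2) -> Vec2 {
        Vec2 { x: self.x + rhs.x, y: self.y + rhs.y }
    }
}

// Over a commutative ring like f64 the two actions coincide.
impl LeftModule<f64> for Vec2 {
    fn left_mul(self, r: f64) -> Vec2 {
        Vec2 { x: r * self.x, y: r * self.y }
    }
}

impl RightModule<f64> for Vec2 {
    fn right_mul(self, r: f64) -> Vec2 {
        Vec2 { x: self.x * r, y: self.y * r }
    }
}

impl Bimodule<f64, f64> for Vec2 {}
```

For a commutative scalar ring the two actions coincide and the `Bimodule` impl is trivial; the interesting cases are exactly the ones where left and right actions differ.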
Should this be closed? Or is there still stuff that needs to be done?
I think this can be closed. Other vector spaces (e.g. normed, finite, etc.) will be the topic of another issue.
These can be done as soon as there are fields. Should also enable nalgebra to begin to migrate to the abstractions used here.

cc @sebcrozet