dstl / Stone-Soup

A software project to provide the target tracking community with a framework for the development and testing of tracking algorithms.
https://stonesoup.rtfd.io
MIT License

Implementing multiple models (GPB, IMM) #129

Open sgmranso opened 5 years ago

sgmranso commented 5 years ago

I'm keen to look at implementing support in Stone-Soup for proposing multiple models algorithms, such as GPB and IMM. I'm fairly familiar with the theory, but less so with the Stone-Soup repository and the GitHub platform. Some support/collaboration with this would be appreciated.

DaveKirkland commented 5 years ago

Hi, I was also interested in implementing the IMM. I had mentioned this to the Stone-Soup group. I am somewhat familiar with the Stone-Soup design, or at least portions of it. Have you seen the Jupyter notebooks for the tutorials? There you can see how straightforward it is to switch between the Kalman and Unscented predictors/updaters. Some of the Kalman code has been modified since I looked at it, but I believe that is mostly behind-the-scenes work. I have a few ideas on how the IMM and GPB would likely fit into the existing structure, but I've been sidetracked by other work lately and haven't had a chance to take a serious look at it.

sglvladi commented 5 years ago

Hi! As part of another Dstl project, I recently had a look at and provisionally implemented an IMM predictor/updater pair that can be used in place of the existing Kalman/ExtendedKalman/UnscentedKalman predictors/updaters.

As a small teaser, here is a summary of the approach I followed:

Similar to @DaveKirkland, I am currently busy (mainly writing up my thesis), but I would be available to have a look at this in more detail starting from the beginning of September.

P.S. Apologies for closing/reopening the issue; I accidentally pressed the wrong button.

DaveKirkland commented 5 years ago

@sglvladi

As a small teaser, here is a summary of the approach I followed:

  • The class instantiation interface is not the same as the Predictors/Updaters we have had so far, i.e. they don't expect a Transition/Measurement model, but rather i) a list of Predictors/Updaters, each parameterised with the desired models, and ii) a matrix that specifies the model transition probabilities.

I was thinking along the same lines for passing in the models and transition matrix. We could probably have an abstract base class and then write IMM, GPB1 and GPB2 subclasses. The functions inside these classes would be responsible for doing the prediction/update for each model (this part might fit in the base class) and combining them; each subclass would combine according to its own algorithm. I'd have to take more time to see how that fits with the existing SS design.
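To make the shape of this concrete, here is a rough sketch of that layout. All names here are hypothetical stand-ins, not the actual Stone-Soup API: an abstract base class holds one per-model predictor plus the model transition matrix, and each subclass supplies its own combination rule.

```python
# Illustrative sketch only: names and signatures are hypothetical,
# not Stone-Soup classes.
from abc import ABC, abstractmethod

import numpy as np


class MultipleModelPredictor(ABC):
    """Base class: one single-model predictor per motion model, plus a
    Markov matrix of model transition probabilities."""

    def __init__(self, predictors, transition_probs):
        self.predictors = list(predictors)
        self.transition_probs = np.asarray(transition_probs, dtype=float)
        n = len(self.predictors)
        assert self.transition_probs.shape == (n, n)

    def predict_each(self, states):
        # Run every per-model predictor; common to IMM and GPB variants.
        return [p(s) for p, s in zip(self.predictors, states)]

    @abstractmethod
    def combine(self, predictions, mode_probs):
        """Algorithm-specific merging of the per-model predictions."""


class IMMPredictor(MultipleModelPredictor):
    def combine(self, predictions, mode_probs):
        # Moment-match: weighted mean of the per-model predicted means.
        w = np.asarray(mode_probs, dtype=float)
        return np.einsum('i,ij->j', w, np.stack(predictions))


# Toy usage with two stand-in "predictors" acting on a 1-D state.
cv = lambda x: x + 1.0   # stand-in for a CV-model predictor
ca = lambda x: x * 2.0   # stand-in for a CA-model predictor
imm = IMMPredictor([cv, ca], [[0.9, 0.1], [0.1, 0.9]])
preds = imm.predict_each([np.array([1.0]), np.array([1.0])])
combined = imm.combine(preds, [0.5, 0.5])  # -> array([2.0])
```

The GPB1/GPB2 subclasses would differ only in their `combine` implementations, which is the part each algorithm owns.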

I'm not sure about the GaussianMixtureState and how it fits in with the IMM type models. The SS mixture model is used in the GMPHD filter. The intent was to make it reusable.

sglvladi commented 5 years ago

@DaveKirkland

I was thinking along the same lines for passing in the models and transition matrix. Probably have an abstract base class and then write IMM, GPB1 and GPB2 subclasses. The functions inside these classes would be responsible for doing the prediction/update for each model (this part might fit in the base class) and combining them; each subclass would combine according to its own algorithm. I'd have to take more time to look and see how that fits with the existing SS design.

This sounds like a good plan. I haven't gone to this extent in my implementation, as I haven't really looked at the GPB variants in detail, but from discussions with @simonrmaskell I understand that the part of predicting and updating the underlying densities should be pretty much the same. What varies is when/how you merge the densities, which should be specific to each algorithm.

I'm not sure about the GaussianMixtureState and how it fits in with the IMM type models. The SS mixture model is used in the GMPHD filter. The intent was to make it reusable.

I am certainly not implying that it is not reusable, on the contrary it most certainly is, but given time restrictions it was easier to write something simpler that suited our needs for that project. When it comes to how it is used in the IMM case, I view the state propagated by IMM as a mixture of densities (assume WeightedGaussian for now) for each model. Not sure if this is the same for GPB, but based on this resource I suspect it will be similar.
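For illustration, the kind of state object being described might look like this minimal sketch. `WeightedGaussian` and `GaussianMixtureState` here are stand-ins, not the existing Stone-Soup classes:

```python
# Illustrative sketch: the IMM state viewed as a mixture of one weighted
# Gaussian per motion model. Class names are hypothetical.
from dataclasses import dataclass

import numpy as np


@dataclass
class WeightedGaussian:
    mean: np.ndarray
    covar: np.ndarray
    weight: float  # mode probability of the associated model


@dataclass
class GaussianMixtureState:
    components: list  # one WeightedGaussian per model

    def merge(self):
        """Moment-matched single Gaussian (e.g. for output estimates)."""
        w = np.array([c.weight for c in self.components])
        means = np.stack([c.mean for c in self.components])
        mean = w @ means
        covar = sum(
            c.weight * (c.covar + np.outer(c.mean - mean, c.mean - mean))
            for c in self.components)
        return mean, covar
```

The `merge` step is the standard moment-matching used when collapsing a Gaussian mixture, so it should carry over to the GPB variants as well.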

DaveKirkland commented 5 years ago

@sdhiscocks @sglvladi A problem I've been thinking about: if we pass in a CV and a CA model, we have to expand and contract the state vectors and covariance matrices (because the models have different dimensions). The IMM/GPB object probably doesn't know what models were passed in. One solution is to also pass in a function that handles the mixing, but that function is very specific to the types of models passed in. How do we generalize it so that we can combine CV with coordinated turn, or another model that gets created? I'm not sure how to capture this in the object-oriented framework.

One example: users could code their own function, since they would know which models they are using, but then users end up doing a lot of the work. We could try to provide a library of functions, but this gets unwieldy. You can't handle all the combinations, e.g. CV1, CV2, CA, where CV1 and CV2 are constant velocity models with different process noises. Also, when a user adds a new model, this library of functions becomes useless.
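One possible way to square this circle, sketched here purely as an illustration (nothing below is existing Stone-Soup code), is to let users register a conversion function per (source, target) model pair, so the IMM/GPB machinery stays model-agnostic and new combinations can be added without touching it:

```python
# Hypothetical design sketch: a registry of user-supplied state-conversion
# rules keyed by (source model, target model).
import numpy as np


class StateConverter:
    def __init__(self):
        self._rules = {}

    def register(self, src, dst, fn):
        """fn maps (mean, covar) in model `src` space to `dst` space."""
        self._rules[(src, dst)] = fn

    def convert(self, src, dst, mean, covar):
        if src == dst:
            return mean, covar
        try:
            return self._rules[(src, dst)](mean, covar)
        except KeyError:
            raise NotImplementedError(
                f"no conversion registered for {src!r} -> {dst!r}")


# Example rule: lift a CV state [x, vx] into CA space [x, vx, ax].
# The accel_var value is a placeholder; as noted later in this thread,
# padding with zero variance would understate uncertainty and bias things.
def cv_to_ca(mean, covar, accel_var=10.0):
    m = np.append(mean, 0.0)      # unknown acceleration: zero mean...
    P = np.zeros((3, 3))
    P[:2, :2] = covar
    P[2, 2] = accel_var           # ...but large, not zero, variance
    return m, P


conv = StateConverter()
conv.register("CV", "CA", cv_to_ca)
m, P = conv.convert("CV", "CA", np.array([1.0, 2.0]), np.eye(2))
```

Default rules (like `cv_to_ca` above) could ship as examples, while users with unusual model pairs register their own without the library needing to enumerate every combination.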

Lyudmil - have you had similar design issues in your other implementation?

DaveKirkland commented 5 years ago

I'm not sure about the GaussianMixtureState and how it fits in with the IMM type models. The SS mixture model is used in the GMPHD filter. The intent was to make it reusable.

I am certainly not implying that it is not reusable, on the contrary it most certainly is, but given time restrictions it was easier to write something simpler that suited our needs for that project. When it comes to how it is used in the IMM case, I view the state propagated by IMM as a mixture of densities (assume WeightedGaussian for now) for each model. Not sure if this is the same for GPB, but based on this resource I suspect it will be similar.

From what I recall, yes, GPB1, GPB2, and IMM would all behave similarly. GPB1 conditions on a model history of one step, GPB2 on a history of two steps, and the IMM lies somewhere between the two.
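As a rough rule of thumb for that trade-off (standard accounting, shown here only for orientation):

```python
# For r motion models: GPB1 and IMM each run r filters per update cycle,
# while GPB2 runs r**2. This is why the IMM is usually described as giving
# near-GPB2 accuracy at GPB1 cost.
def filters_per_cycle(r, algorithm):
    return {"GPB1": r, "IMM": r, "GPB2": r * r}[algorithm]


for alg in ("GPB1", "IMM", "GPB2"):
    print(alg, filters_per_cycle(3, alg))  # 3, 3, 9 for three models
```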

DaveKirkland commented 5 years ago

@sgmranso @sglvladi I've taken a look at Lyudmil's IMM branch. I've updated my fork to the latest master and merged in the IMM branch. I created a local branch on my fork called mixture_models for my work on this - I haven't pushed the merged IMM code yet. Note that merging the IMM branch with the Stone-Soup master was painless.

I have an idea how to implement the model conversions in a way that is somewhat extensible and reusable. I'm going to test it out using the IMM code and then see about implementing the GPB1 and GPB2 or at least see about getting the framework to support them.

simonrmaskell commented 5 years ago

If we get the implementation correct, we should find that IMM, GPB1, GPB2, GPB3 (an algorithm that people have thought of but rarely, if ever, implemented) and IMM2 (an algorithm that I don't think has been invented yet) fall out as parameter settings. In fact, we should also discover that we can do PDAF, PDAF2, (TO)-MHT, IMM-PDAF, GPB3-PDAF2, IMM2-MHT, etc. The key to doing this is to make the algorithms explicitly described in terms of the rules that they use to combine mixture components, and implicit in terms of the implementation of those rules. I sense there's a discussion needed that would sensibly involve Duncan (whose GitHub handle I don't know), as well as the other people contributing to and mentioned in this thread. Is that something we should do as part of a sprint in October?

DaveKirkland commented 5 years ago

@simonrmaskell Right now, I'm trying to address how to handle the conversion of models with different dimensions. That's something we need to handle for IMM, GPB1, and GPB2. I think we should get a handle on how to implement these first, so that we have an idea of what the issues are. Trying to get a fully parameterized version right away is going to be difficult. I'm still trying to get a handle on the algorithmic variants of IMM, GPB1, and GPB2 and on getting them into the Stone-Soup framework.

simonrmaskell commented 5 years ago

Changing dimensions between models is an orthogonal design choice to the logic used to manage the hypotheses. My real point was that there is significant expertise in the team on how to turn these algorithmic choices into parameter settings: the first time I solved that problem was in about 2001, so I'm confident we can do it again here!! I guess my only plea is that we avoid populating Stone Soup with monolithic implementations of the vast array of algorithms that we could describe much more succinctly with some sensibly defined parameters: please don't start writing code for GPB1, GPB2, IMM, PDAF, IMM-PDAF etc etc
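To illustrate one hedged reading of that suggestion: keep a single engine that applies a merging rule to the tree of model-history hypotheses, and let a parameter (here, how much model history a merge key retains) select the algorithm. Everything below is an illustrative sketch, not Stone-Soup code:

```python
# Hypothetical sketch: express the algorithm family as one mixture-reduction
# rule parameterised by history depth, rather than as monolithic classes.
import numpy as np


def reduce_mixture(hypotheses, history_depth):
    """hypotheses: {model_history_tuple: (weight, mean_array)}.

    Moment-match (merge) all hypotheses that share the last `history_depth`
    model indices. history_depth=0 merges everything each step (GPB1-like);
    history_depth=1 keeps one hypothesis per current model (IMM/GPB2-like);
    larger depths keep correspondingly more of the hypothesis tree.
    """
    groups = {}
    for hist, (w, m) in hypotheses.items():
        key = hist[len(hist) - history_depth:] if history_depth > 0 else ()
        acc = groups.setdefault(key, [0.0, np.zeros_like(m)])
        acc[0] += w
        acc[1] = acc[1] + w * m
    # Normalise each group's accumulated mean by its total weight.
    return {k: (w, m / w) for k, (w, m) in groups.items()}
```

Under this view, PDAF-style association hypotheses could in principle be folded into the same key/merge machinery, which is presumably what lets the named combinations (IMM-PDAF etc.) fall out as parameter settings.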

DaveKirkland commented 5 years ago

We should probably set up a meeting with the people who have expertise in how to generalize these algorithms. Perhaps you could prepare some materials on how you solved this issue before. We have to see how that fits with the existing Stone-Soup framework, or whether and how it needs to be updated.

Lyudmil has done a specific IMM implementation that I'm using to look at the mixed state issue. His implementation has already identified some issues:

  1. GMM models not being reused between IMM, GMPHD etc
  2. IMMUpdater and IMMPredictor do not inherit from the existing Updater/Predictor classes

DavidFCrouse commented 5 years ago

In that regard, note that the functions multipleModelUpdate and multipleModelPred in the Tracker Component Library handle the case where different states have different dimensionalities. The technique for the IMM is discussed in:

[2] T. Yuan, Y. Bar-Shalom, P. Willett, E. Mozeson, S. Pollak, and D. Hardiman, "A multiple IMM estimation approach with unbiased mixing for thrusting projectiles," IEEE Transactions on Aerospace and Electronic Systems, no. 4, pp. 3250-3267, Oct. 2012.

I also generalized it to the GPB2, which is implemented in the same file. The main thing is that you don’t just mix in zero (i.e. to extend lower dimensional models), because you will bias stuff.
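A toy scalar example of the bias being warned about (the numbers are made up for illustration; the Yuan et al. paper cited above derives the principled unbiased mixing):

```python
# Toy illustration: "CA" carries [velocity, acceleration]; "CV" carries
# velocity only. Mixing after naively zero-padding the CV state biases
# the combined acceleration estimate low.
import numpy as np

true_accel = 2.0
ca_state = np.array([5.0, true_accel])   # [velocity, acceleration]
cv_state = np.array([5.0])               # velocity only

# Naive lift: pad the missing acceleration with zero, then mix 50/50.
cv_lifted_naive = np.array([cv_state[0], 0.0])
mixed_naive = 0.5 * ca_state + 0.5 * cv_lifted_naive
# mixed_naive[1] == 1.0: biased low relative to true_accel == 2.0.

# One less-biased lift: borrow the other model's acceleration estimate
# (just one possible scheme; see the cited paper for a proper treatment).
cv_lifted = np.array([cv_state[0], ca_state[1]])
mixed = 0.5 * ca_state + 0.5 * cv_lifted
# mixed[1] == 2.0: the acceleration estimate is preserved.
```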

Dr. David F. Crouse
Radar Division, Code 5344, Naval Research Laboratory
Phone: (202) 404-8106, E-Mail: david.crouse@nrl.navy.mil


simonrmaskell commented 5 years ago

@DaveKirkland I've prepared some slides previously and presented them to @paul-a-thomas, @sgmranso and @dcrew-dstl (Declan, not Duncan as mis-stated above (oops!)). I'd happily share those slides and/or present them again, but I think we need to discuss (at the next Stone Soup telecon I suggest), who does which bit of this. It would be silly to be duplicating effort.

DaveKirkland commented 5 years ago

In that regard, note that the functions multipleModelUpdate and multipleModelPred in the Tracker Component Library handle the case where different states have different dimensionalities. The technique for the IMM is discussed in: [2] T. Yuan, Y. Bar-Shalom, P. Willett, E. Mozeson, S. Pollak, and D. Hardiman, "A multiple IMM estimation approach with unbiased mixing for thrusting projectiles," IEEE Transactions on Aerospace and Electronic Systems, no. 4, pp. 3250-3267, Oct. 2012. I also generalized it to the GPB2, which is implemented in the same file. The main thing is that you don't just mix in zero (i.e. to extend lower dimensional models), because you will bias stuff.

Hi David, thanks for the pointer to the article. At this point I don't need the specifics of how to handle the conversion. I'm trying to make sure Stone-Soup has the infrastructure that allows users to implement their own. I think we could implement some default methods (like the ones in the paper) that would show users how to implement their own versions should they need them.

If the methods have additional requirements, e.g. access to different model parameters, then we need to ensure the infrastructure can handle that. So we need to understand what additional parameters are involved in the model switching, beyond the states and covariances.