IDSIA / crema

Crema: Credal Models Algorithms
https://crema-toolbox.readthedocs.io/
GNU Lesser General Public License v3.0

Posterior query without non-negative constraints (ApproxLP) #25

Closed rcabanasdepaz closed 4 years ago

rcabanasdepaz commented 4 years ago

When running a posterior query without non-negative constraints and without the epsilon perturbation of 0.0 values, inference can fail. The reason is that the factor in the denominator contains zeros. This produces many NaN values, which cannot be handled by the optimiser.
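
The failure can be reproduced outside the library: under IEEE 754 arithmetic, a zero in the denominator of a posterior ratio yields NaN (0/0) or Infinity (c/0), neither of which an LP-based optimiser can process. A minimal standalone sketch (plain Java arithmetic, not crema code):

```java
public class ZeroDenominator {
    public static void main(String[] args) {
        // A posterior is a ratio: p(x | e) = p(x, e) / p(e).
        // If the denominator factor contains a 0.0 entry, the
        // division produces NaN (0.0/0.0) or Infinity (c/0.0).
        double joint = 0.0;
        double evidence = 0.0;
        System.out.println(joint / evidence);  // prints NaN
        System.out.println(0.3 / evidence);    // prints Infinity

        // Perturbing zeros with a small epsilon keeps values finite:
        double eps = 1e-7;
        System.out.println(joint / (evidence + eps)); // prints 0.0
    }
}
```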

One possible solution is to keep the non-negative constraints and perturb them by adding a small epsilon value. However, this can make inference intractable in large networks. A solution is needed that allows omitting the non-negative constraints.

Code example:

    double eps = 0.0000001;
    String prj_folder = ".";

    SparseModel model = (SparseModel) IO.read(prj_folder + "/models/chain3-nonmarkov.uai");
    for (int v : model.getVariables()) {
        SeparateHalfspaceFactor f = (SeparateHalfspaceFactor) model.getFactor(v);
        f = f.mergeCompatible();
        f = f.removeNormConstraints();
        //f = f.removeNonNegativeConstraints();   // Not working WITH this
        f = f.getPerturbedZeroConstraints(eps);   // Not working WITHOUT this

        model.setFactor(v, f);
    }

    for (int v : model.getVariables()) {
        ((SeparateHalfspaceFactor) model.getFactor(v)).printLinearProblem();
    }

    CredalApproxLP inf = new CredalApproxLP(model);
    SparseModel infModel = (SparseModel) inf.getInferenceModel(1, ObservationBuilder.observe(2, 0));

    System.out.println(
        inf.query(1, ObservationBuilder.observe(0, 0))
    );
rcabanasdepaz commented 4 years ago

This has been solved (only for posterior queries) in the following way:

The modified factors are not normalised to sum to 1.
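
Dropping the normalisation constraint is safe for posterior queries because a posterior is a ratio: any global scale on the factor values cancels, and the result can be renormalised once at the end. A hypothetical sketch of this invariance (generic Java, not crema's API):

```java
import java.util.Arrays;

public class UnnormalisedPosterior {
    // Posterior over the states of x for fixed evidence e, computed
    // from unnormalised factor values f(x, e); the global scale cancels.
    static double[] posterior(double[] unnormalised) {
        double sum = 0.0;
        for (double v : unnormalised) sum += v;
        double[] post = new double[unnormalised.length];
        for (int i = 0; i < unnormalised.length; i++)
            post[i] = unnormalised[i] / sum;
        return post;
    }

    public static void main(String[] args) {
        double[] f = {1.0, 3.0};       // unnormalised factor values
        double[] scaled = {2.0, 6.0};  // same values scaled by 2
        // Both yield the identical posterior [0.25, 0.75]
        System.out.println(Arrays.toString(posterior(f)));
        System.out.println(Arrays.toString(posterior(scaled)));
    }
}
```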