parthnatekar / Loopy-Belief-Propagation

An implementation of loopy belief propagation for binary image denoising. Both sequential and parallel updates are implemented.

Implementation on an Ising Model #1

Open cxiaab opened 2 years ago

cxiaab commented 2 years ago

Hello, I am new to belief propagation. I saw your program and I think it is exactly what I need (a parallel loopy BP implementation).

However, I am not using parallel loopy BP to denoise pictures; I want to do Bayesian inference on an Ising model, whose factor graph is shown below. I wonder if your code can be modified to apply to my case? That is, I need to input a factor graph like the one in the picture, but in your code I can only input an image. Could you please give me some hints on what I should modify in your program?

[image: factor graph of global part HMM]
parthnatekar commented 2 years ago

Hi,

Without knowing the specifics of your problem: an image is basically a factor graph similar to the one you have shown. All you need to do is load your data as a numpy array instead of loading an image. You can update the preprocess function to read whatever type of input data you have and convert it to a numpy array; I have currently implemented only images and CSV files. Once you do this, the rest of the code should run without further changes.
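To make the suggestion concrete, here is a minimal sketch of what an extended preprocess could look like. The function name mirrors the repo's `preprocess`, but the signature and the supported extensions are assumptions for illustration, not taken from the actual code:

```python
import numpy as np

def preprocess(path):
    """Hypothetical sketch: load array-like data into a numpy array
    instead of an image. Extend the branches for your own format."""
    if path.endswith(".csv"):
        # plain numeric CSV -> 2-D numpy array
        return np.loadtxt(path, delimiter=",")
    elif path.endswith(".npy"):
        # pre-saved numpy array
        return np.load(path)
    else:
        raise ValueError("unsupported input type: " + path)
```

The rest of the pipeline then only ever sees a numpy array, regardless of where the data came from.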

Just run the code and look at the factor messages, clique messages, and image variables once the program converges (or after the set number of iterations completes). These are your hidden and observed variables.

cxiaab commented 2 years ago

Thank you for your kind reply and suggestions. I have checked the preprocess function, but I am still a little confused.

Let me explain my problem in detail first. In my case, as you can see, there are two kinds of factor nodes: unary factor nodes (like h_B,1,1, ...) and pairwise factor nodes (the factor nodes connecting two variable nodes). Together with the variable nodes, they form a Markov random field. The unary factor nodes take in some input message, and the pairwise factor nodes contain the transition probabilities (like p(s2=1|s1=1), p(s2=1|s1=0), ...). Starting from the input messages at the unary factor nodes, I want to run message passing and calculate the final marginal probability of each variable node.
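The setup described above (unary factors feeding in evidence, pairwise factors holding transition probabilities, marginals out) can be sketched with plain sum-product message passing. This is a minimal self-contained example on a hypothetical 3-variable binary chain, not the repo's code; all values are assumed for illustration:

```python
import numpy as np

# Unary factors: one length-2 vector per variable (like the h_B-style
# factor nodes), giving the evidence for states 0 and 1.
unary = [np.array([0.9, 0.1]),
         np.array([0.5, 0.5]),
         np.array([0.2, 0.8])]

# Pairwise factor shared by each edge: T[i, j] = p(s_next = j | s_prev = i).
T = np.array([[0.8, 0.2],
              [0.3, 0.7]])

n = len(unary)
# Forward pass: alpha[i] is the message arriving at node i from the left.
alpha = [np.ones(2) for _ in range(n)]
for i in range(1, n):
    alpha[i] = (alpha[i - 1] * unary[i - 1]) @ T
# Backward pass: beta[i] is the message arriving at node i from the right.
beta = [np.ones(2) for _ in range(n)]
for i in range(n - 2, -1, -1):
    beta[i] = T @ (beta[i + 1] * unary[i + 1])

# Marginal at each node: product of its unary factor and both incoming
# messages, then normalized.
marginals = [a * u * b for a, u, b in zip(alpha, unary, beta)]
marginals = [m / m.sum() for m in marginals]
```

On a chain this forward-backward schedule is exact; on a loopy graph the same update rule is simply iterated until (approximate) convergence, which is what loopy BP does.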

From my understanding, it seems your input is always a 0/1 matrix. 1) I am not sure what the parameters of your Ising model are. Are they theta and gamma? What do they mean in your Markov random field? 2) It seems to me that in my case there is no such 0/1 matrix input; instead, my input should be the messages taken in by the unary factor nodes, which are values in (0, 1). I don't know how to map my input to your input.
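One hedged way to bridge the gap in question 2): if each unary input is a probability p in (0, 1) that the variable equals 1, it maps naturally to a length-2 unary factor [1 - p, p]. A stack of such factors can then play the role the 0/1 observation matrix plays in the image pipeline. The values below are hypothetical, purely for illustration:

```python
import numpy as np

# Assumed example inputs: per-variable probabilities that s_i = 1,
# i.e. the messages fed in by the unary factor nodes.
p_inputs = np.array([0.95, 0.30, 0.50])

# Convert to unary factors: unary_factors[i] == [p(s_i = 0), p(s_i = 1)].
unary_factors = np.stack([1.0 - p_inputs, p_inputs], axis=1)
```

The resulting `unary_factors` array is then what message passing would consume in place of thresholded pixel observations.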