JohannesBuchner / PyMultiNest

Pythonic Bayesian inference and visualization for the MultiNest Nested Sampling Algorithm and PyCuba's cubature algorithms.
http://johannesbuchner.github.io/PyMultiNest/

Why does the eigenvalue decomposition in the nested sampling program differ from the actual eigenvalue decomposition result? #234

Closed chelinwang closed 1 year ago

chelinwang commented 1 year ago

Hello! So sorry to bother you again.

Here is part of my Python program (structural damage identification):

gaussdistribution = scipy.stats.norm(0, 0.1)

def prior(cube, ndim, nparams):
    # prepare the output array, which has the same shape
    for n in range(ndim):
        cube[n] = gaussdistribution.ppf(cube[n])
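As a side note on the prior transform above: MultiNest supplies each `cube[n]` uniformly in [0, 1), and `ppf` (the inverse CDF) maps it onto the N(0, 0.1) prior. A quick standalone check (values below are standard Gaussian quantiles, not taken from the program):

```python
import scipy.stats

# MultiNest draws each cube[n] uniformly in [0, 1); the ppf (inverse CDF)
# maps that unit-interval draw onto the N(0, 0.1) prior.
gaussdistribution = scipy.stats.norm(0, 0.1)

u = 0.5                       # median of the unit interval
x = gaussdistribution.ppf(u)  # maps to the Gaussian median, 0.0
```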

beta = 1e4
Nm = MODE_D.shape[1]
# parameter cube is the stiffness reduction factor (SRF)
def loglike(cube, ndim, nparams):
    FEM_D_UP = FEM_A # updating the damaged finite element
    for i in range(ndim):
        FEM_D_UP['INERT'][i] = FEM_D_UP['INERT'][i] * (1 + cube[i]) # using cube(SRF) to update the moment of inertia
    M_D_UP = mglobal(FEM_D_UP, MASS_CEN) # global M matrix
    K_D_UP = kglobal(FEM_D_UP)                       # global K matrix
    MODE_D_UP, EIGV_D_UP = eigen(M_D_UP, K_D_UP) # the user-defined function eigen() handles the eigenvalue decomposition, returning modes and eigenvalues
    MODE_D_UP_R = MODE_D_UP[IDEX]
    b = 0
    for n in range(Nm):
        b = b + beta / 2 * ((EIGV_D[n] - EIGV_D_UP[n]) / EIGV_D[n]) ** 2 
    return Nm / 2 * np.log(beta / 2 / np.pi) - b  # the log likelihood
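The `eigen()` helper is user-defined and not shown. For reference, here is a minimal sketch of how such a structural eigensolver might look, assuming SciPy is available; the function name, argument order, and return order are taken from the call site above, and everything else is an assumption:

```python
import numpy as np
from scipy.linalg import eigh

def eigen(M, K):
    """Solve the generalized eigenproblem K v = w M v.

    Hypothetical re-implementation of the user's eigen() helper:
    returns (mode shapes, eigenvalues) in the order used at the
    call site above. eigh() returns eigenvalues in ascending order
    and M-orthonormal eigenvectors.
    """
    w, v = eigh(K, M)
    return v, w
```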

# number of dimensions our problem has
X = []
for i in range(NELEM):
    X.append("x" + str(i + 1))
parameters = X

n_params = len(parameters)
# name of the output files
prefix = "detection/1-"

# run MultiNest
result = pymultinest.run(loglike, prior, n_params, init_MPI=True, n_live_points=20000, outputfiles_basename=prefix,
                         resume=False, verbose=True)

# make marginal plots by running:
# $ python multinest_marginals.py detection/1-
# For that, we need to store the parameter names:
import json

with open('%sparams.json' % prefix, 'w') as f:
    json.dump(parameters, f, indent=2)

Given a parameter cube, such as cube=[-0.0763321420369039,0.07388077242023468,0.07784486474094585,0.09779638222088866,0.07046151366758578,0.1017270656774393,-0.033529074173203686,0.08533544428787862,-0.10355496386907588,0.11778798383326926,0.11507137091229327,0.06117944944684767,-0.014982852186534832,0.022343356934935974,-0.034266593911591765,-0.07259655919541884,-0.024551947006594316,0.05694131087676666,-0.11758489135821706,-0.12833658082789784,-0.07837470352038545,-0.086866064530991,-0.13763013614490505,0.0007307396731106368,0.007162400677896131,0.04356593391282579,-0.11493038693052178,0.03456109449405794,-0.08935114511836799,-0.030116820591353146,0.1406800976317542,-0.2049899090132768,0.321386040731783,0.1681322741276427,-0.05732639745036984,0.07520587025493286,0.010880457345982633,0.07249344102782417,0.017287422530331137,0.11565282930240917,-0.013333761328725536,0.1417238947361464,-0.15573234679455694,-0.02129943466156303,-0.18594164289049064,0.0886081898222741,-0.1571540488118369,-0.2758838241210148,-0.050259501081894226,0.08707568444481852,0.07660409871820528,-0.09000561103789823,0.0884263231885546,0.12042308475731478,-0.20829475749426268,-0.04503498622109375,0.0220891144628388,-0.23210361825908143,-0.12129360256451567,0.14315202586090675,0.0692919538499062,-0.01576412112321897,-0.05189337052513007,-0.015008870929086005,-0.05682859924745766,-0.1369860437317299,-0.037864063406824855,0.10482763161523599,0.11569197353917807,0.03773539778004938,0.09790434373381106,0.022866846589447255,0.06860531450414119,0.2626536858910593,0.005277221640287488,0.04348057352204104,-0.09697130176009279,0.05563702832812211,-0.033152641471756554,-0.11203986576911228,0.13772491977904622,0.055614146342794296,0.09622820640557801,0.03906015616688166,-0.08221874663168939,-0.04346466126491563,-0.0841374547547891,0.029719067091466253,-0.09398245520418648,0.18955540211846011,0.014545836177974766,-0.016986311647110904,-0.14289284732870727,0.14885239420357568,0.07023462759363999,0.09438240917287849,-0.06164829204624135,-0.07558136795471554,-0.018141537623398003,0.03292099489098453] # ndim = 100

In the `loglike` function shown above, the eigenvalue decomposition returns eigenvalue array = [1.89186781, 19.18632987, 67.94419213, 393.66654696, 574.89971165, 835.39448324], but the actual eigenvalue array is [4.91302311e+02, 1.88075345e+04, 1.49310198e+05, 5.42765067e+05, 1.48914363e+06, 3.42779637e+06].

By the way, the eigenvalue decomposition function itself is correct when called outside the nested sampling program. Moreover, for any given parameter cube, the eigenvalue decomposition inside the nested sampling program differs from the actual eigenvalue decomposition result.
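One Python pitfall worth checking here (an assumption about the cause, not a confirmed diagnosis): `FEM_D_UP = FEM_A` does not copy the model, it binds a second name to the same object, so every `loglike` call rescales the same `INERT` values again and the model drifts between likelihood evaluations. A minimal sketch with a hypothetical stand-in for the finite-element model dictionary:

```python
import copy

# Hypothetical stand-in for the finite-element model dictionary.
FEM_A = {'INERT': [1.0, 1.0]}

def update_no_copy(cube):
    FEM_D_UP = FEM_A              # same object as FEM_A, no copy!
    for i, c in enumerate(cube):
        FEM_D_UP['INERT'][i] *= (1 + c)
    return FEM_D_UP['INERT']

update_no_copy([0.1, 0.1])                    # mutates FEM_A in place
after_two_calls = update_no_copy([0.1, 0.1])  # starts from the mutated model

# With a deep copy, each call starts from the pristine model:
FEM_B = {'INERT': [1.0, 1.0]}

def update_with_copy(cube):
    FEM_D_UP = copy.deepcopy(FEM_B)  # independent copy on every call
    for i, c in enumerate(cube):
        FEM_D_UP['INERT'][i] *= (1 + c)
    return FEM_D_UP['INERT']

update_with_copy([0.1, 0.1])
fresh = update_with_copy([0.1, 0.1])  # FEM_B is untouched
```

In the no-copy version the scaling compounds (1.0 → 1.1 → 1.21), while the deep-copy version returns 1.1 every time.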

I would appreciate it if you could answer my questions! Thank you!