ChristopherRabotin opened this issue 4 months ago
Multivariate dispersions after a state space transformation form an N-dimensional ellipsoid, so the 3-sigma bounds test may not be entirely valid.
A better approach is to compute the covariance of the dispersed samples and check that it matches the input covariance for the items that were to be dispersed.
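The sketch below draws correlated samples through a Cholesky factor of the input covariance and then compares the sample covariance of the draws against that input: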
```rust
use nalgebra::{Cholesky, DMatrix, DVector};
use rand::prelude::*;
use rand_distr::StandardNormal;

fn multivariate_normal(mean: &DVector<f64>, cov: &DMatrix<f64>, num_samples: usize) -> Vec<DVector<f64>> {
    // Step 1: Cholesky decomposition of the covariance matrix
    let chol = Cholesky::new(cov.clone()).expect("Covariance matrix is not positive definite");
    let l = chol.l();

    // Step 2: Generate samples from a standard normal distribution
    let mut rng = rand::thread_rng();
    let standard_normal_samples: Vec<DVector<f64>> = (0..num_samples)
        .map(|_| {
            DVector::from_iterator(mean.len(), (0..mean.len()).map(|_| rng.sample(StandardNormal)))
        })
        .collect();

    // Step 3: Transform the standard normal samples with the Cholesky factor
    // (borrow `l` so the closure can be called for every sample)
    let transformed_samples: Vec<DVector<f64>> = standard_normal_samples
        .iter()
        .map(|sample| &l * sample)
        .collect();

    // Step 4: Add the mean vector to the transformed samples
    transformed_samples.iter().map(|sample| sample + mean).collect()
}

fn calculate_sample_covariance(samples: &[DVector<f64>], mean: &DVector<f64>) -> DMatrix<f64> {
    let num_samples = samples.len() as f64;
    let mut covariance = DMatrix::zeros(mean.len(), mean.len());
    for sample in samples {
        let diff = sample - mean;
        covariance += &diff * diff.transpose();
    }
    // Divides by N (population form); dividing by N - 1 would give the unbiased estimator.
    covariance / num_samples
}

fn main() {
    // Example mean vector and covariance matrix
    let mean = DVector::from_vec(vec![1.0, 2.0, 3.0]);
    let cov = DMatrix::from_row_slice(3, 3, &[
        1.0, 0.5, 0.2,
        0.5, 1.0, 0.3,
        0.2, 0.3, 1.0,
    ]);
    let num_samples = 1000;

    // Generate samples
    let samples = multivariate_normal(&mean, &cov, num_samples);

    // Calculate the sample mean
    let sample_mean = samples
        .iter()
        .fold(DVector::zeros(mean.len()), |acc, sample| acc + sample)
        / num_samples as f64;
    println!("Sample mean:\n{}", sample_mean);

    // Calculate the sample covariance matrix
    let sample_cov = calculate_sample_covariance(&samples, &sample_mean);
    println!("Sample covariance:\n{}", sample_cov);

    // Compare the sample covariance matrix to the original covariance matrix
    println!("Original covariance matrix:\n{}", cov);
    let cov_diff = (sample_cov - cov).norm();
    println!("Difference between sample covariance and original covariance: {}", cov_diff);
}
```
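Note that with only 1000 draws the sample covariance entries agree with the input only up to the usual Monte Carlo error (roughly a few percent here), so the unit test should compare against a tolerance rather than expect an exact match.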
This work should also make a simple but important change: if there is only one input variable and that variable is a settable parameter, then the draw should be univariate. This may lead to renaming the `MultivariateNormal` structure to `RandomVariable`, or something with a more representative name.
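A possible shape for that change, as a sketch only (the `RandomVariable` enum, its variants, and the method names below are hypothetical illustrations, not Nyx's actual API): a single type that falls back to a univariate normal draw when only one parameter is dispersed.

```rust
use nalgebra::{Cholesky, DMatrix, DVector};
use rand::Rng;
use rand_distr::{Distribution, Normal};

/// Hypothetical replacement for `MultivariateNormal`: uses a univariate draw
/// when only one settable parameter is dispersed, otherwise a correlated
/// multivariate draw through a Cholesky factor.
enum RandomVariable {
    Univariate { mean: f64, std_dev: f64 },
    Multivariate { mean: DVector<f64>, sqrt_cov: DMatrix<f64> },
}

impl RandomVariable {
    /// Build from a mean and covariance; falls back to univariate for a 1x1 input.
    fn new(mean: DVector<f64>, cov: DMatrix<f64>) -> Option<Self> {
        if mean.len() == 1 {
            Some(Self::Univariate {
                mean: mean[0],
                std_dev: cov[(0, 0)].sqrt(),
            })
        } else {
            let sqrt_cov = Cholesky::new(cov)?.l();
            Some(Self::Multivariate { mean, sqrt_cov })
        }
    }

    /// Draw one sample; returned as a vector in both cases for a uniform API.
    fn sample<R: Rng>(&self, rng: &mut R) -> DVector<f64> {
        match self {
            Self::Univariate { mean, std_dev } => {
                let normal = Normal::new(*mean, *std_dev).expect("invalid standard deviation");
                DVector::from_vec(vec![normal.sample(rng)])
            }
            Self::Multivariate { mean, sqrt_cov } => {
                let z = DVector::from_iterator(
                    mean.len(),
                    (0..mean.len()).map(|_| rng.sample(rand_distr::StandardNormal)),
                );
                sqrt_cov * z + mean
            }
        }
    }
}
```

Whether the fallback decision lives in a constructor like this or at the call site is a design choice; the point is that a one-parameter dispersion never needs the full Cholesky machinery.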
High level description
Nyx uses the hyperdual representation of an orbit to compute the partial derivatives of the orbital elements with respect to the Cartesian elements. At the moment, the multivariate unit tests attempt to validate this in the `disperse_keplerian` and `disperse_raan_only` unit tests. However, these tests don't meet the success criteria that are coded up: maybe the criteria are wrong? I have high confidence that the computation of the partials of the orbital elements with respect to the Cartesian state is correct for several reasons:
I'm also reasonably confident that the multivariate distribution is correctly implemented for these reasons:
This leads me to believe that the issue may lie in the validity of the pseudo-inverse of the Jacobian, or in the test itself. The test counts how many of the drawn samples fall outside the 3-sigma bounds and expects that count to be around 0.3% of the 1000 samples, but that check fails.
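To make the ellipsoid point from the top of the issue concrete, here is a diagnostic sketch (this is not the check used in the current test, and the function below is a hypothetical helper): it classifies samples by Mahalanobis distance, i.e. against the k-sigma ellipsoid of the covariance rather than against per-axis k-sigma bounds.

```rust
use nalgebra::{DMatrix, DVector};

/// Fraction of samples whose Mahalanobis distance from `mean` exceeds `k`,
/// i.e. samples outside the k-sigma ellipsoid of `cov`.
/// Illustrative only; not the check implemented in the Nyx test.
fn fraction_outside_k_sigma_ellipsoid(
    samples: &[DVector<f64>],
    mean: &DVector<f64>,
    cov: &DMatrix<f64>,
    k: f64,
) -> f64 {
    let cov_inv = cov.clone().try_inverse().expect("covariance must be invertible");
    let outside = samples
        .iter()
        .filter(|s| {
            let diff = *s - mean;
            // Squared Mahalanobis distance: diff^T * cov^{-1} * diff
            let d2 = (diff.transpose() * &cov_inv * &diff)[(0, 0)];
            d2.sqrt() > k
        })
        .count();
    outside as f64 / samples.len() as f64
}
```

For a univariate normal, about 0.27% of draws fall beyond 3 sigma, but the fraction outside the 3-sigma ellipsoid grows with the dimension (roughly a few percent already at three dimensions), so if the 0.3% expectation is applied to a multi-dimensional dispersion as a whole, it can fail even when the sampler and the partials are correct.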
Test plans