// The estimate::fit call below comes from the arima crate, so import its estimation module
use arima::estimate;

// Define a struct for OHLC candles
#[derive(Debug)]
struct OhlcCandle {
    open: f64,
    high: f64,
    low: f64,
    close: f64,
}

// Function to calculate HLC3 values ((high + low + close) / 3) from OHLC candles
fn hlc3(candles: &[OhlcCandle]) -> Vec<f64> {
    candles.iter().map(|c| (c.high + c.low + c.close) / 3.0).collect()
}

fn main() {
    // Provide your OHLC candles data
    let ohlc_candles = vec![
        OhlcCandle { open: 100.0, high: 105.0, low: 99.0, close: 104.0 },
        // ... more OHLC candles ...
    ];

    // Calculate HLC3 values from the OHLC candles
    let hlc3_values = hlc3(&ohlc_candles);

    // Estimate ARIMA model parameters based on the HLC3 values
    // ar - order of AR coefficients (e.g., 2)
    // d  - order of differencing (e.g., 0)
    // ma - order of MA coefficients (e.g., 1)
    let ar_order = 2;
    let diff_order = 0;
    let ma_order = 1;
    let coef = estimate::fit(&hlc3_values, ar_order, diff_order, ma_order).unwrap();
    println!("Estimated parameters: {:?}", coef);
}
From ChatGPT. I guess I misunderstood the sim part: handling real OHLC data means the sim/RNG-generated data wouldn't be needed.
The next part becomes figuring out (through grid search + mean absolute error, I'm guessing) which ar + d + ma parameters work best.
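For what it's worth, here is a minimal sketch of that grid search, reusing the estimate::fit call from the snippet above and assuming the returned coefficients are laid out as [intercept, AR..., MA...]. The one-step-ahead forecast only uses the intercept and AR terms (the MA contribution is skipped for brevity), and differencing is done by hand before fitting, so treat it as scaffolding rather than a faithful ARIMA forecaster; the helpers difference, ar_one_step_forecasts, and mean_absolute_error are made up for the example.

use arima::estimate;

// Apply d rounds of first differencing.
fn difference(series: &[f64], d: usize) -> Vec<f64> {
    let mut out = series.to_vec();
    for _ in 0..d {
        out = out.windows(2).map(|w| w[1] - w[0]).collect();
    }
    out
}

// One-step-ahead forecasts using only the intercept and AR coefficients
// (MA terms are ignored to keep the sketch short).
fn ar_one_step_forecasts(series: &[f64], intercept: f64, ar: &[f64]) -> Vec<f64> {
    let p = ar.len();
    (p..series.len())
        .map(|t| {
            let mut y = intercept;
            for (i, phi) in ar.iter().enumerate() {
                y += phi * series[t - 1 - i];
            }
            y
        })
        .collect()
}

fn mean_absolute_error(actual: &[f64], predicted: &[f64]) -> f64 {
    actual.iter().zip(predicted).map(|(a, p)| (a - p).abs()).sum::<f64>() / actual.len() as f64
}

fn main() {
    let hlc3_values: Vec<f64> = vec![/* ... HLC3 values from the candles ... */];

    let mut best: Option<((usize, usize, usize), f64)> = None;
    for p in 1..=3 {
        for d in 0..=1 {
            for q in 0..=2 {
                // Difference by hand and pass d = 0 to fit, so the fitted
                // coefficients refer to the same series we forecast below.
                let series = difference(&hlc3_values, d);
                if series.len() <= p + q + 2 {
                    continue;
                }
                let coef = match estimate::fit(&series, p, 0, q) {
                    Ok(c) => c,
                    Err(_) => continue, // skip combinations that fail to fit
                };
                // Assumed coefficient layout: [intercept, ar_1..ar_p, ma_1..ma_q]
                let intercept = coef[0];
                let ar = &coef[1..1 + p];
                let forecasts = ar_one_step_forecasts(&series, intercept, ar);
                let mae = mean_absolute_error(&series[p..], &forecasts);
                if best.map_or(true, |(_, b)| mae < b) {
                    best = Some(((p, d, q), mae));
                }
            }
        }
    }
    println!("Best (p, d, q) by MAE: {:?}", best);
}

A proper version would score each (p, d, q) on held-out data (walk-forward) or with an information criterion like AIC, but the loop structure stays the same.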
Sorry for the thinking-out-loud spam. I was hoping it'd help somebody else who might land on this and have a very basic/lacking understanding of the math/concepts at play here.
I want to feed it close_difference_percent, I'm pretty sure, then get it to spit out an estimate. What are your thoughts?
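In case it's useful, a minimal sketch of that idea, reusing the OhlcCandle struct and estimate::fit call from the snippet above: close_difference_percent here is just the percent change between consecutive closes, the (2, 0, 1) order is a placeholder for whatever the grid search picks, and the final "estimate" only applies the intercept and AR terms (again assuming the [intercept, AR..., MA...] layout), so it is a rough illustration rather than a proper forecast.

use arima::estimate;

// Same struct as in the snippet above.
struct OhlcCandle { open: f64, high: f64, low: f64, close: f64 }

// Percent change between consecutive closes.
fn close_difference_percent(candles: &[OhlcCandle]) -> Vec<f64> {
    candles
        .windows(2)
        .map(|w| (w[1].close - w[0].close) / w[0].close * 100.0)
        .collect()
}

fn main() {
    let ohlc_candles: Vec<OhlcCandle> = vec![/* ... your candles ... */];
    let series = close_difference_percent(&ohlc_candles);
    assert!(series.len() > 2, "need more candles than the AR order");

    // Use whichever (p, d, q) the grid search picked; (2, 0, 1) is just a placeholder.
    let coef = estimate::fit(&series, 2, 0, 1).unwrap();

    // Rough one-step estimate: intercept + AR terms applied to the latest values
    // (MA contribution ignored), assuming coef = [intercept, ar_1, ar_2, ma_1].
    let n = series.len();
    let next = coef[0] + coef[1] * series[n - 1] + coef[2] * series[n - 2];
    println!("Estimated next close_difference_percent: {:.4}", next);
}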