dragonwong opened this issue 1 month ago
Correlation coefficient:
$$r=\frac{\sum\left[\left(x_i-\overline{x}\right)\left(y_i-\overline{y}\right)\right]}{\sqrt{\sum\left(x_i-\overline{x}\right)^2\sum\left(y_i-\overline{y}\right)^2}}$$
When |r| ≥ 0.8, the correlation is considered high; when 0.5 ≤ |r| < 0.8, moderate; when 0.3 ≤ |r| < 0.5, low; when |r| < 0.3, the correlation is negligible and the variables can be treated as uncorrelated.
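The thresholds above can be wrapped in a small helper. This is a sketch of my own; the function name and labels are not from the original post:

```javascript
// Hypothetical helper: map |r| to the strength labels described above.
function correlationStrength(r) {
  const abs = Math.abs(r);
  if (abs >= 0.8) return 'high';
  if (abs >= 0.5) return 'moderate';
  if (abs >= 0.3) return 'low';
  return 'negligible';
}
```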
Simple linear regression equation $y=\beta_0+\beta_1 x$, with coefficients:
$$\begin{cases} \beta_1 = \dfrac{n\sum x_i y_i - \sum x_i \sum y_i}{n\sum x_i^2 - \left(\sum x_i\right)^2} \\ \beta_0 = \bar{y} - \beta_1\bar{x} \end{cases}$$
```javascript
// Sample data: eight paired observations.
const variable1 = [3471994, 3522934, 3435623, 3368437, 3412836, 3590946, 3485158, 3181179];
const variable2 = [27.7, 27.8, 28.27, 27.58, 28.75, 34.3, 28.55, 26.74];

const sum = arr => arr.reduce((a, b) => a + b, 0);
const mean1 = sum(variable1) / variable1.length;
const mean2 = sum(variable2) / variable2.length;

// Deviations from the mean for each series.
const differences1 = variable1.map(value => value - mean1);
const differences2 = variable2.map(value => value - mean2);

// Cross products and squared deviations.
const differences = differences1.map((value, index) => value * differences2[index]);
const squaredDifferences1 = differences1.map(value => value ** 2);
const squaredDifferences2 = differences2.map(value => value ** 2);

// Numerator and denominator of the correlation formula.
const covariance = sum(differences);
const variance1 = sum(squaredDifferences1);
const variance2 = sum(squaredDifferences2);

const correlationCoefficient = covariance / Math.sqrt(variance1 * variance2);
console.log('Correlation coefficient:', correlationCoefficient);

// Regression coefficients from the closed-form formulas above.
const b1 = (variable1.length * sum(variable1.map((value, index) => value * variable2[index])) - sum(variable1) * sum(variable2))
  / (variable1.length * sum(variable1.map(value => value ** 2)) - sum(variable1) ** 2);
const b0 = mean2 - b1 * mean1;
console.log('b0:', b0);
console.log('b1:', b1);
```
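With $\beta_0$ and $\beta_1$ in hand, predicting $y$ for a new $x$ is just evaluating the fitted line. A minimal sketch; the `predict` helper and the coefficients below are illustrative, not values from the post:

```javascript
// Hypothetical helper: evaluate the fitted regression line at x.
const predict = (b0, b1, x) => b0 + b1 * x;

// Illustrative coefficients, not the ones computed above.
console.log(predict(1.5, 2, 10)); // 21.5
```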