I have an instance of JupyterLab running locally inside a Docker container via JupyterHub, and attempting to build a bar graph with a relatively small amount of data causes the kernel to panic.
thread '<unnamed>' panicked at 'called `Option::unwrap()` on a `None` value', /home/username/.cargo/registry/src/github.com-1ecc6299db9ec823/evcxr-0.10.0/src/eval_context.rs:801:34
Here are my imports:
:dep sqlx = { version = "*", features = [ 'runtime-async-std-native-tls', 'mysql', 'decimal' ] }
:dep serde = { version = "*", features = [ "derive" ] }
:dep dotenv
:dep rust_decimal
:dep plotly = { version = "*", features = ["kaleido"] }
:dep itertools-num
:dep rand_distr
use sqlx::{MySqlPool, FromRow};
use serde::Serialize;
use std::fs::File;
use std::io::prelude::*;
use itertools_num::linspace;
use plotly::common::{
ColorScale, ColorScalePalette, DashType, Fill, Font, Line, LineShape, Marker, Mode, Title,
};
use plotly::layout::{Axis, BarMode, Layout, Legend, TicksDirection};
use plotly::{Bar, NamedColor, Plot, Rgb, Rgba, Scatter};
use rand_distr::{Distribution, Normal, Uniform};
Here is the code I am using:
// x_axis_data and y_axis_data are Vecs containing 12 entries each:
let x_axis_data = vec![9, 34082, 14212, 6881, 4176, 2661, 1862, 1326, 1001, 770, 623, 483];
let y_axis_data = vec![0, 0, 50000, 100000, 150000, 200000, 250000, 300000, 350000, 400000, 450000, 500000];

let trace = Bar::new(x_axis_data, y_axis_data);
// I never get here...
let mut plot = Plot::new();
plot.add_trace(trace);
let layout = Layout::new().height(800);
plot.set_layout(layout);
plot.lab_display();
Running something similar in Python handles upwards of 50,000 entries without issue, so I doubt the amount of data is the problem. It seems odd that this would crash the kernel the way it does. Perhaps I am missing something? Any and all help is welcome! Thanks!
I'm afraid I can't reproduce the error. Would you be able to verify if the problem persists on the most up to date versions of each of your dependencies?