Open yim0331 opened 2 years ago
@sibyjackgrove this is the test result
- Using Pickle file (default dashboard run):
- Using Dask option (using the `-d true -o folder_name` options):
@yim0331 Thank you for summarizing. I am puzzled as to why the peak memory usage is so high for Dask.
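One way to narrow down where the Dask peak comes from, independent of the dashboard itself, is to instrument the load step with the standard-library `tracemalloc` module. This is a generic sketch: `load_results` is a hypothetical stand-in for whatever function the dashboard actually uses to read the result file, not a real tdcosim API.

```python
import pickle
import tracemalloc

def load_results(path):
    # Hypothetical stand-in for the dashboard's result loader;
    # substitute the actual tdcosim load call here.
    with open(path, "rb") as f:
        return pickle.load(f)

def peak_memory_of(fn, *args):
    """Run fn(*args) and return the peak Python-level allocation in bytes."""
    tracemalloc.start()
    try:
        fn(*args)
        _, peak = tracemalloc.get_traced_memory()
    finally:
        tracemalloc.stop()
    return peak
```

Calling `peak_memory_of(load_results, ...)` for both the pickle path and the Dask output folder would show whether the high peak occurs during the load itself or later, during filtering. Note that `tracemalloc` only sees allocations made through Python's allocator, so native allocations (e.g. inside Dask workers) would need an OS-level tool such as `psutil` instead.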
Describe the bug
Dashboard data filters do not work with a large data set. The dashboard server responds with a 500 error caused by `numpy.core._exceptions._ArrayMemoryError`:
To Reproduce
Steps to reproduce the behavior: install tdcosim in 64-bit Python 3 and run the dashboard with the 5 GB pickle file. FYI, the 5 GB data file is available in the D://dashboard/ folder on the testing server machine.
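`_ArrayMemoryError` is what NumPy raises when an array allocation fails. A likely trigger here is that boolean-mask filtering materializes a full copy of the selected data, so a filter over a 5 GB array can briefly need the original and the copy in RAM at once. A small sketch of the allocation behaviour (the array here is tiny, but the same copy-vs-view distinction applies at 5 GB scale):

```python
import numpy as np

data = np.arange(1_000_000, dtype=np.float64)  # ~8 MB here; imagine 5 GB
mask = data % 2 == 0

filtered = data[mask]   # boolean indexing allocates a NEW array (a copy)
view = data[::2]        # basic slicing returns a view, no extra allocation

assert filtered.base is None   # owns its own memory
assert view.base is data       # shares memory with `data`
```

If the dashboard's filters work this way, the 500 error is consistent with the copy exhausting memory; a chunked loader (which is what the Dask option presumably provides) avoids it by filtering partition by partition instead of over one monolithic array.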
Expected behavior
The dashboard data filter works without error.