Closed mmcky closed 5 years ago
The first task would be to take the `execute_nb.py`
script from `quantecon.build.lectures`
and move it upstream into the extension. I think an appropriate `conf.py`
setting would be:
`execute_notebooks = True/False`
New configuration values are set up here: https://github.com/QuantEcon/sphinxcontrib-jupyter/blob/master/sphinxcontrib/jupyter/__init__.py
These can then pass through to the writer context. We will need to write a function for executing notebooks (first up) that we can dispatch to a dask worker. I am thinking this interface will be useful as a way of dispatching notebook execution, and various other workloads that require execution, to a waiting client: http://docs.dask.org/en/latest/futures.html. There is also the delayed framework (http://docs.dask.org/en/latest/delayed.html), but I think Futures is what we want.
These dashboards are also very helpful when diagnosing how dask
is working: http://distributed.dask.org/en/latest/diagnosing-performance.html
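As a sketch of the dispatch pattern: dask's futures interface deliberately mirrors the standard library's `concurrent.futures`, so the shape below (written with the stdlib so it runs without a dask cluster) carries over by swapping the executor for a `distributed.Client`. The `execute_notebook` body is a placeholder, not the real implementation:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def execute_notebook(path):
    # Placeholder: a real worker would run the notebook (e.g. via
    # nbconvert's ExecutePreprocessor) and return an execution report.
    return {"notebook": path, "status": "ok"}

def execute_all(paths, max_workers=4):
    # dask equivalent: client = distributed.Client();
    # futures = client.map(execute_notebook, paths); client.gather(futures)
    results = []
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = [pool.submit(execute_notebook, p) for p in paths]
        for fut in as_completed(futures):
            results.append(fut.result())
    return results
```

With dask, the same submit/gather shape additionally gives access to the scheduler's dashboard for monitoring the running tasks.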
Thanks @AakashGfude for implementing this. Can you let me know how I can open the dask
dashboard to monitor execution, as that would be helpful for the documentation?
We should add execution control to the notebooks that includes:

- dask for processing notebooks in parallel. QuantEcon currently uses multiprocessing, but it has some limitations when parsing tracebacks and errors; dask should be more flexible for managing the task set
- `_build/jupyter/reports/`
- `_build/jupyter/coverage`
- a sphinx cache to execute only those notebooks that have changed (this may come for free if we execute from within the ecosystem itself .. i.e. as a process during the writer stage)
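For the cache idea above, a minimal sketch of content-based change detection (helper names are hypothetical; a real implementation would more likely hook into Sphinx's own environment and outdated-document machinery during the writer stage):

```python
import hashlib
import json
from pathlib import Path

def _digest(path):
    # Hash the notebook source so unchanged files can be skipped.
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

def notebooks_to_execute(paths, cache_file):
    """Return only the notebooks whose content changed since the last build."""
    cache_path = Path(cache_file)
    old = json.loads(cache_path.read_text()) if cache_path.exists() else {}
    new = {str(p): _digest(p) for p in paths}
    changed = [p for p in new if old.get(p) != new[p]]
    cache_path.write_text(json.dumps(new))
    return changed
```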