Open tkoskela opened 7 months ago
I tried passing `chunksize=1` to `process_map`, as the error message suggested, and got:
```
Writing graphs: 0%| | 0/1112 [00:00<?, ?it/s]
concurrent.futures.process._RemoteTraceback:
"""
Traceback (most recent call last):
  File "/usr/lib/python3.10/multiprocessing/queues.py", line 244, in _feed
    obj = _ForkingPickler.dumps(obj)
  File "/usr/lib/python3.10/multiprocessing/reduction.py", line 51, in dumps
    cls(buf, protocol).dump(obj)
RecursionError: maximum recursion depth exceeded while pickling an object
"""
```
Are my graphs too large? Can I do anything about this?

Here is a link to my ford config.
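For context, the failure mode can be reproduced in isolation: pickle recurses once per object reference, so any chain of links deeper than the interpreter's recursion limit fails the same way. A minimal sketch (the `Node` class here is hypothetical, not ford's actual graph classes):

```python
import pickle

# A minimal linked structure; ford's graph nodes presumably reference
# each other in a similar (much richer) way.
class Node:
    def __init__(self, nxt=None):
        self.next = nxt

# Build the chain iteratively, far deeper than the default
# recursion limit (~1000), so construction itself is fine.
head = None
for _ in range(50000):
    head = Node(head)

# Pickling, however, recurses once per link and blows the stack.
try:
    pickle.dumps(head)
except RecursionError as e:
    print("pickling failed:", e)
```

Note that `chunksize=1` cannot help here: the error comes from pickling a single item, not from how items are batched across worker processes.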
I think the second bug is a duplicate of #517 but with a public repo, so I have some chance of being able to debug it.
I've not managed to track this down yet, but this SO question/answer looks promising: https://stackoverflow.com/questions/63876046/cannot-pickle-object-maximum-recursion-depth-exceeded
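A common fix for this error is raising `sys.setrecursionlimit`, which only buys a deeper stack. If we control the classes being pickled, a more robust option is to serialize the linked structure iteratively via `__getstate__`/`__setstate__`, so pickle never recurses link by link. A sketch with hypothetical `Node`/`Chain` classes (not ford's real ones):

```python
import pickle

class Node:
    def __init__(self, value, nxt=None):
        self.value = value
        self.next = nxt

class Chain:
    """Wrapper that pickles a linked chain as a flat list of values,
    sidestepping pickle's per-link recursion."""
    def __init__(self, head=None):
        self.head = head

    def __getstate__(self):
        # Walk the links with a loop instead of letting pickle recurse.
        values, node = [], self.head
        while node is not None:
            values.append(node.value)
            node = node.next
        return values

    def __setstate__(self, values):
        # Rebuild the chain iteratively from the flat list.
        head = None
        for v in reversed(values):
            head = Node(v, head)
        self.head = head

# Build a chain far deeper than the default recursion limit (~1000).
head = None
for v in range(5000):
    head = Node(v, head)

# Round-trips without ever recursing in pickle.
chain = pickle.loads(pickle.dumps(Chain(head)))
```

The same idea applies to any deeply self-referential object: convert it to a flat representation on the way out and reconstruct the links on the way in.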
I'm setting up ford in my fortran project. The html documentation is getting produced fine. I have set

```
graph: true
```

to produce dependency graphs, but they aren't getting rendered well in the html output. I tried setting `graph_dir:` to save copies of the graphs and I got the warning and error below. I'm using ford version 7.0.3 and Python 3.10.12.
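For reference, the graph-related settings in the project file look like this (the `graph_dir` path here is illustrative, not the reporter's actual value):

```
graph: true
graph_dir: ./graphs
```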