fastai / nbdev

Create delightful software with Jupyter Notebooks
https://nbdev.fast.ai/
Apache License 2.0

`nbdev_export` fails #1259

Open RFraser-mtrac opened 1 year ago

RFraser-mtrac commented 1 year ago

I've used the old version of nbdev for a few years now, and `nbdev_build_lib` would work fine when building the Python code. I've upgraded to the new version, but since then `nbdev_export` fails with this error:

```
nbdev_export
Traceback (most recent call last):
  File "C:\Anaconda\lib\runpy.py", line 197, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "C:\Anaconda\lib\runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "C:\Anaconda\Scripts\nbdev_export.exe\__main__.py", line 7, in <module>
  File "C:\Anaconda\lib\site-packages\fastcore\script.py", line 119, in _f
    return tfunc(**merge(args, args_from_prog(func, xtra)))
  File "C:\Anaconda\lib\site-packages\nbdev\doclinks.py", line 137, in nbdev_export
    for f in files: nb_export(f)
  File "C:\Anaconda\lib\site-packages\nbdev\export.py", line 59, in nb_export
    mm.make(cells, all_cells, lib_path=lib_path)
  File "C:\Anaconda\lib\site-packages\nbdev\maker.py", line 201, in make
    _all = self.make_all(all_cells)
  File "C:\Anaconda\lib\site-packages\nbdev\maker.py", line 100, in make_all
    all_assigns = assigns.filter(lambda o: getattr(_targets(o)[0],'id',None)=='_all_')
  File "C:\Anaconda\lib\site-packages\fastcore\foundation.py", line 160, in filter
    return self._new(filter_ex(self, f=f, negate=negate, gen=gen, **kwargs))
  File "C:\Anaconda\lib\site-packages\fastcore\basics.py", line 642, in filter_ex
    return list(res)
  File "C:\Anaconda\lib\site-packages\nbdev\maker.py", line 100, in <lambda>
    all_assigns = assigns.filter(lambda o: getattr(_targets(o)[0],'id',None)=='_all_')
  File "C:\Anaconda\lib\site-packages\nbdev\maker.py", line 90, in _targets
    def _targets(o): return [o.target] if isinstance(o, ast.AnnAssign) else o.targets
AttributeError: 'AugAssign' object has no attribute 'targets'
```

After a few hours of digging into what could be causing this, I've pinpointed the following piece of code as the trigger:

```python
#| export
from functools import partial
import concurrent.futures

all_functions = []
```

```python
#| export
table = x
all_functions += [partial(
    snowdb2.spark_copy_db2_to_snow,
    schemas, table,
    partition_cols=partition_cols, num_partitions=num_partitions
)]
```

```python
#| export
table = y
all_functions += [partial(
    snowdb2.spark_copy_db2_to_snow,
    schemas, table,
    partition_cols=partition_cols, num_partitions=num_partitions
)]
```

```python
#| export
with concurrent.futures.ThreadPoolExecutor(max_workers=num_partitions) as executor:
    # Start the load operations and mark each future with its URL
    future_to_url = {executor.submit(f): f for f in all_functions}
    for future in concurrent.futures.as_completed(future_to_url):
        url = future_to_url[future]
```

So it seems the issue might be related to the use of `partial`. Any advice on how to fix this? I was wondering if we could add a try/except block to maker.py so that it doesn't hard-fail but gives a warning instead.
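For what it's worth, the traceback actually points at the `+=` lines rather than `partial`: in Python's `ast` module, an augmented assignment like `all_functions += [...]` parses as an `ast.AugAssign` node, which has a single `.target` attribute, while `_targets` in maker.py falls through to `o.targets`, which only exists on plain `ast.Assign` nodes. A minimal sketch of the mismatch, using only the stdlib `ast` module (this is not nbdev code itself):

```python
import ast

# `+=` parses as an AugAssign node, which has `.target` (singular) ...
aug = ast.parse("all_functions += [f]").body[0]
print(type(aug).__name__)        # AugAssign
print(hasattr(aug, "targets"))   # False -- this is what maker.py trips over
print(hasattr(aug, "target"))    # True

# ... while a plain assignment parses as Assign, which has `.targets`
plain = ast.parse("all_functions = all_functions + [f]").body[0]
print(type(plain).__name__)      # Assign
print(hasattr(plain, "targets")) # True
```

So, assuming nothing else in the notebook depends on the `+=` form, rewriting those cells as `all_functions = all_functions + [...]` may sidestep the crash until maker.py handles `AugAssign` nodes.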

hamelsmu commented 1 year ago

Can you provide a minimal reproducible example in a new project that we can use to replicate this, and share it here?

RFraser-mtrac commented 1 year ago

00_core.zip

Please see attached. There's a small mistake in the attachment (`number_of_partitions=2`); the correct version is here:

```python
#| export
with concurrent.futures.ThreadPoolExecutor(max_workers=2) as executor:
    future_to_url = {executor.submit(f): f for f in all_functions}
    for future in concurrent.futures.as_completed(future_to_url):
        url = future_to_url[future]
```

RFraser-mtrac commented 1 year ago

Hi @hamelsmu, have you had any time to look into this one?