fastaidocsprint / fastai

Documentation Sprint for the fastai deep learning library
http://fastaidocsprint.github.io/fastai
Apache License 2.0

docment GAN #72

Closed: tmabraham closed this 2 years ago

tmabraham commented 2 years ago

Okay, here's my first docment PR. Let me know if there are any issues; I would appreciate any feedback you have.

I'll note that I took out the after_epoch and show_img arguments since they weren't doing anything. I will take a little time to debug and get them working. Shall I make an issue in the main fastai repo? I can work on fixing that after the docment sprint.

review-notebook-app[bot] commented 2 years ago

Check out this pull request on ReviewNB

See visual diffs & provide feedback on Jupyter Notebooks.

Powered by ReviewNB

tmabraham commented 2 years ago

This PR would finish https://github.com/muellerzr/fastai-docment-sprint/issues/20.

tmabraham commented 2 years ago

Finished the changes based on @warner-benjamin's and @muellerzr's suggestions.

I will note a very weird issue we discussed in the Discord voice channel.

In the code, there is the following function:

#export
def generate_noise(
    fn, # Dummy argument so it works with `DataBlock`
    size=100 # Size of returned noise vector
) -> InvisibleTensor: 
    "Generate noise vector."
    return cast(torch.randn(size), InvisibleTensor)
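For reference, the first argument really is a dummy, so the function can be called with anything (a quick sketch):

# fn is ignored entirely; only size matters
noise = generate_noise('anything.jpg')
noise.shape, type(noise)  # (torch.Size([100]), InvisibleTensor)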

I have added some docments but no type annotations. However, once I specifically add a type annotation to size, the full notebook no longer runs without errors:

#export
def generate_noise(
    fn, # Dummy argument so it works with `DataBlock`
    size:int=100 # Size of returned noise vector
) -> InvisibleTensor: 
    "Generate noise vector."
    return cast(torch.randn(size), InvisibleTensor)

What happens is that generate_noise is used as get_x for a GAN DataBlock. The DataBlock machinery passes a Path to generate_noise, but since that's a dummy argument, the function returns the noise vector as our x, which is what we want. When I add the type annotation, however, the Path itself becomes the x, as I discovered through dblock.summary.
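For anyone who wants to reproduce this, here is a rough sketch of the wiring (not the exact notebook code; it assumes a folder of images at images/ and generate_noise as defined above):

from fastai.vision.all import *

path = Path('images/')                    # assumption: any folder with a few images works
dblock = DataBlock(
    blocks=(TransformBlock, ImageBlock),
    get_x=generate_noise,                 # receives a Path, ignores it, returns the noise vector
    get_items=get_image_files,
    splitter=IndexSplitter([]),           # no validation set
    item_tfms=Resize(64),
)
dblock.summary(path)  # with size un-annotated, x is the noise vector; with size:int, the Path shows up as x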

It is a little concerning that code can break due to type annotations, so as a side note, we should all make sure the tests pass for all of these PRs (they must be enabled for first-time contributors).

tmabraham commented 2 years ago

@warner-benjamin Updated; let me know if there's anything else that needs changing...

warner-benjamin commented 2 years ago

One other thought: maybe the model training should be behind an nbdev #slow tag so we are not replicating it every time tests are run on GH Actions. It would also be preferable not to check the UserWarnings into the documentation.

tmabraham commented 2 years ago

> One other thought: maybe the model training should be behind an nbdev #slow tag so we are not replicating it every time tests are run on GH Actions. It would also be preferable not to check the UserWarnings into the documentation.

There is an all_slow tag at the top of the notebook, so that's not needed. I'm not sure about the UserWarnings, though. Technically, there is a fastai PR to fix this, which we can resolve after this sprint.
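For anyone following along, this is roughly how the nbdev test flags look in a notebook (assuming tst_flags includes slow in settings.ini):

# first code cell of the notebook:
#all_slow
# -> nbdev_test_nbs skips every cell of this notebook unless the slow flag is passed

# alternatively, flag individual cells:
#slow
learn.fit(1)  # hypothetical training cell; only this cell would be skipped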

warner-benjamin commented 2 years ago

Alright. Resolve the CI errors and then it looks good to me.

tmabraham commented 2 years ago

@warner-benjamin updated...