FluxML / fluxml.github.io

Flux Website
https://fluxml.ai
MIT License

Create Machine Learning Tutorials with Flux.jl #107

Open logankilpatrick opened 2 years ago

logankilpatrick commented 2 years ago

Hello, prospective Hacktoberfest contributor! The FluxML community would welcome new tutorials for the Flux website, which can generally be found at: https://fluxml.ai/tutorials.html

You can find the source code for the tutorials here: https://github.com/FluxML/fluxml.github.io/tree/main/tutorials/_posts. They are just markdown files.

What we are looking for

We are open to pull requests that cover a tutorial topic not already addressed by the existing tutorials. But there is no need to reinvent the wheel here: if you have a favorite tutorial that you want to try to re-create using Flux, we would love to help and to see it!

Find out more about contributing to this site here: https://github.com/FluxML/fluxml.github.io/blob/main/CONTRIBUTING.md, and about more general ways of contributing (which may not be open Hacktoberfest issues, though we can happily turn them into issues if that helps you) here: https://github.com/FluxML/Flux.jl/blob/master/CONTRIBUTING.md

Another good starting place would be the Model Zoo: https://github.com/FluxML/model-zoo where we have a bunch of existing models but usually without tutorials built around them.

Dsantra92 commented 2 years ago

I would love to see some of my favourite TensorFlow and PyTorch tutorials re-created in Flux. I propose to write Flux versions of these tutorials:

I would love to hear your feedback on these tutorials.

P.S.: These are the tutorials I can think of off the top of my head. If these PRs go well, I would love to add more tutorials in the future.

logankilpatrick commented 2 years ago

This would be incredible! Let us know how we can help.

Fernando23296 commented 2 years ago

I propose to write a Flux version of this tutorial:

What do you think?

logankilpatrick commented 2 years ago

@Fernando23296 yes! Let's do it.

logankilpatrick commented 2 years ago

@Fernando23296 there is still time if you want to try to get a tutorial created during Hacktoberfest!

kailukowiak commented 2 years ago

You never realize how much you rely on tutorials to build novel models until you try to build one without any similar examples 🙈

logankilpatrick commented 2 years ago

@kailukowiak I highly encourage you to use other popular tutorials (in TF, Keras, PyTorch, etc.) if you are trying to make a Flux version. This should help a ton!
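
Much of the translation tends to be mechanical. As a rough sketch (sizes are made up, and constructor details vary a bit across Flux versions):

using Flux

# PyTorch: nn.Linear(57, 128) followed by nn.LogSoftmax
# Flux (roughly):
model = Chain(
    Dense(57, 128),   # weight matrix plus bias, like nn.Linear
    logsoftmax,       # from NNlib, re-exported by Flux
)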

kailukowiak commented 2 years ago

@logankilpatrick I've been working on converting this tutorial to Julia. I've got a working version on the CPU, but my rnn uses a loop with integer indexing.

# One RNN step: concatenate the input with the previous hidden state,
# then project to the output and to the next hidden state.
function rnn(input, hidden, model)
    combined = [input; hidden] |> model.device
    out = model.in2out(combined)
    hidden = model.in2hidden(combined)
    return out, hidden
end

# Run the RNN over the columns of X and return the output for the last step.
function predict(X, model::Model)
    hidden = zeros(model.hidden_size) |> model.device   # initial hidden state
    local ŷ
    for i = 1:size(X, 2)
        ŷ, hidden = rnn(X[:, i], hidden, model)
    end
    return ŷ
end
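
Concretely, the failing step is the gradient call. With a stand-in loss (the real loss and the data X, y are omitted above), and assuming Model is registered with Flux.@functor so its layers are visible to Flux.params, it looks roughly like:

using Flux

Flux.@functor Model   # assumption: exposes in2out / in2hidden as trainable

# Stand-in loss; logitcrossentropy is a guess at what the tutorial uses.
loss(X, y) = Flux.Losses.logitcrossentropy(predict(X, model), y)

grads = gradient(() -> loss(X, y), Flux.params(model))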

Taking the gradient throws an error, and it fails outright if I set CUDA.allowscalar(false). I could use the built-in Flux functions, but I wanted to stay closer to the method in the tutorial, since I think it gives good intuition into what's actually going on. Do you have any idea how I could get around the issues with integer slicing a GPU array?
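
For reference, the built-in route would look roughly like the sketch below (layer sizes are placeholders, and the RNN constructor signature varies across Flux versions). Flux's recurrent wrapper threads the hidden state for you, which sidesteps the manual indexing loop:

using Flux

input_size, hidden_size, output_size = 57, 128, 18   # placeholder sizes

rnn_layer = Flux.RNN(input_size, hidden_size)   # stateful: carries its own hidden state
in2out    = Dense(hidden_size, output_size)

function predict_builtin(X)
    Flux.reset!(rnn_layer)     # zero the hidden state before each new sequence
    local ŷ
    for x in eachcol(X)        # column views avoid integer indexing on GPU arrays
        ŷ = in2out(rnn_layer(x))
    end
    return ŷ
end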

ToucheSir commented 2 years ago

Hi @kailukowiak, this kind of question is better suited to Discourse (we try to keep the issue tracker for bugs and feature requests). If you wouldn't mind opening a thread there (there's a GitHub login option) and letting us know, we can pick things up there.