Hi @AdityaKane2001, that is awesome!
The original point of this repository was to have a simple, yet decently performant, deep learning library to write a tutorial about (similar to https://sidsite.com/posts/autodiff/). That target has been met, and the tutorial is pending. But the tutorial doesn't have to be tightly coupled to this repo, so SmallPebble's development can continue. The core point can now be: to have a concise, relatively simple, hackable deep learning library, with good-enough performance.
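The "simple but see-through" core referred to here is essentially a small reverse-mode autodiff engine, along the lines of the linked post. A minimal, self-contained sketch of the idea (plain Python, illustrative only, not SmallPebble's actual code):

```python
from collections import defaultdict

class Variable:
    """A value plus the local derivatives needed for reverse-mode autodiff."""
    def __init__(self, value, local_gradients=()):
        self.value = value
        self.local_gradients = local_gradients  # pairs of (child Variable, local derivative)

def add(a, b):
    return Variable(a.value + b.value, [(a, 1.0), (b, 1.0)])

def mul(a, b):
    return Variable(a.value * b.value, [(a, b.value), (b, a.value)])

def get_gradients(variable):
    """Walk the graph backwards, accumulating d(variable)/d(child) for every child."""
    gradients = defaultdict(float)
    def compute(var, path_value):
        for child, local_gradient in var.local_gradients:
            value_of_path = path_value * local_gradient  # chain rule along this path
            gradients[child] += value_of_path            # sum over all paths to the child
            compute(child, value_of_path)
    compute(variable, 1.0)
    return gradients

# y = a * (a + b)  =>  dy/da = 2a + b,  dy/db = a
a, b = Variable(3.0), Variable(2.0)
y = mul(a, add(a, b))
gradients = get_gradients(y)
print(gradients[a], gradients[b])  # 8.0 3.0
```

SmallPebble applies the same mechanism to NumPy (or CuPy) arrays and adds the ops, such as matmul and conv2d, needed for neural networks.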
Here are some suggestions for contributing:
Other ideas and suggestions are also welcome.
@sradc The thing is, I don't think this will fare well against TensorFlow or PyTorch. Those are written in C++ with a Python API. Regarding new features, do you think it's okay if the library is sort of a "see-through" deep learning library? Since the code you have written so far is fairly simple, a good interface with friendly ops would make this a great place to see how big libraries work at their core. Please share your thoughts on this.
I don't think this will fare well against TensorFlow or PyTorch. Those are written in C++ with a Python API.
I totally agree, but I would like to quantify the difference.
do you think it's okay if the library is sort of a "see-through" deep learning library? Since the code you have written so far is fairly simple, a good interface with friendly ops would make this a great place to see how big libraries work at their core
Sounds good! Do you have any examples in mind?
I think we can implement some of the following features:
As you can probably tell by now, I am drawing a lot of ideas from TensorFlow/Keras. This is because I personally like and use TF :), and also because it is very fluid and abstract, which makes it convenient to use and understand. TL;DR, I think making this a small-scale Keras may be a start, and we can change and expand as we go on.
This may not come across as a good idea 😅, but I think borrowing their core, foundational ideas and making them transparent and easy to understand is a good objective.
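To make the "small-scale Keras" idea a little more concrete, here is a purely hypothetical sketch of a layer-style wrapper over SmallPebble's ops. The Dense class is invented for illustration and is not existing SmallPebble API; it assumes sp.Variable wraps a NumPy array and sp.matmul works like np.matmul:

```python
import numpy as np
import smallpebble as sp

class Dense:
    """Hypothetical Keras-style fully connected layer (bias omitted for brevity)."""

    def __init__(self, n_in, n_out):
        # The parameter is an sp.Variable so gradients can flow back to it.
        self.weights = sp.Variable(np.random.randn(n_in, n_out) * np.sqrt(2 / n_in))

    def __call__(self, x):
        # Forward pass: a single matmul recorded in SmallPebble's autodiff graph.
        return sp.matmul(x, self.weights)

layer = Dense(784, 128)
x = sp.Variable(np.random.random([32, 784]))  # a batch of 32 flattened 28x28 inputs
y = layer(x)  # an sp.Variable that gradients can be propagated back through
```

Sequential-style model containers and optimizer wrappers could be layered on in the same spirit, while keeping the underlying ops visible.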
Thanks for the ideas, here are my initial opinions:
And 3 and 4 are great ideas. I think the way to add them is to first decide on models/use-cases, and then implement the features needed to build them.
As to borrowing from Keras, I think that's fine on a case-by-case basis (e.g. I used the same API as tf.nn.conv2d in sp.conv2d).
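For reference, tf.nn.conv2d takes (input, filters, strides, padding) with NHWC images and HWIO filters. Below is a sketch of the corresponding SmallPebble call; the keyword names and defaults are assumptions from memory rather than verified against the repo:

```python
import numpy as np
import smallpebble as sp

# NHWC images and HWIO kernels, the same layout tf.nn.conv2d expects.
images = sp.Variable(np.random.random([4, 28, 28, 3]))   # batch, height, width, channels
kernels = sp.Variable(np.random.random([5, 5, 3, 16]))   # k_height, k_width, in_channels, out_channels

# Assumed keyword names/defaults; check sp.conv2d's docstring for the exact signature.
result = sp.conv2d(images, kernels, padding="SAME", strides=[1, 1])  # expected shape: (4, 28, 28, 16)
```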
@sradc About modularizing, I think it will make things easier as we expand the project. I have experienced the downside of not doing this on a couple of projects 😅. Also, if we modularize correctly, we will be able to freeze the core and work on bigger, more robust features.
About 3 and 4, sure. We can decide on features that we want and proceed accordingly.
Apart from this, I'm interested in making (and am trying to make) a similar library that works like TensorFlow, i.e. with a C++ backend and a Python API. That's just a personal thought; please let me know if you'd be interested.
Sorry for the delay.
Having had time to reflect, I'm ok with modularizing.
I'm interested in making (and am trying to make) a similar library that works like TensorFlow, i.e. with a C++ backend and a Python API. That's just a personal thought; please let me know if you'd be interested.
That sounds really cool, I might be interested, but can't commit to it. If you go ahead I'll watch with interest.
My own focus is now more on reading up on, and implementing, cutting-edge models.
Thanks for the reply.
Actually, I am also tied up in various commitments right now. I am happy to tell you that I have been selected for Google Summer of Code (GSoC) at TensorFlow this summer. I'll be working on implementing research models.
That said, this is really one of those projects that I want to contribute to. I'll try my best to do so as and when possible.
Apart from that, my best wishes to you for your future projects! I'll be in touch through LinkedIn. Thanks for answering my queries in such a calm manner. Hoping to collaborate in the future!
Congratulations, that is a fantastic achievement!
SmallPebble will very much welcome your contributions, once you get the time.
Best wishes, hope to collaborate, and good luck at GSoC!
Thanks a lot!
Closing the issue. Will reopen when possible.
Thanks again.
@sradc This is the best repository I have seen in weeks, for two reasons: one, I wanted to know more about automatic differentiation, and two, it is to the point. I found this through Aurélien Géron's retweet. I want to contribute to this, but I am hazy on where to start and on what the core point of this repo is. Could you please tell me where I can start?