-
Hi!
Very cool project.
There are some potential improvements for the sequential model in [Improved Recurrent Neural Networks for Session-based Recommendations](https://arxiv.org/abs/1606.08117). …
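If it helps, the technique I remember best from that paper is data augmentation via session prefixes: every prefix of a session becomes its own training example. A minimal sketch of that idea, with made-up session data (the helper name is mine, purely illustrative):

```python
def augment_with_prefixes(session, min_len=2):
    """Yield every prefix of a session as a separate training example.

    For a session [a, b, c, d] this produces ([a], b), ([a, b], c),
    ([a, b, c], d): the prefix is the input sequence, the final item
    is the prediction target.
    """
    for end in range(min_len, len(session) + 1):
        yield session[:end - 1], session[end - 1]

# e.g. list(augment_with_prefixes(["a", "b", "c", "d"]))
# -> [(["a"], "b"), (["a", "b"], "c"), (["a", "b", "c"], "d")]
```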
-
https://arxiv.org/pdf/1611.01578.pdf
Neural networks are powerful and flexible models that work well for many difficult learning tasks in image, speech and natural language understanding. Despite t…
-
Ref: "GRUV: Algorithmic Music Generation using Recurrent Neural Networks"
http://cs224d.stanford.edu/reports/NayebiAran.pdf from Stanford NLP 2015 reports (http://cs224d.stanford.edu/reports.html), wi…
-
https://www.sciencedirect.com/science/article/pii/S0169207019301153
-
List out all the popular techniques being used in NLP, for example attention and CNNs replacing recurrent networks.
-
So I was looking at potentially moving from the tensorflow bindings to tract for some audio-based neural networks I have. The preprocessing in the graph computes the MEL features from the audio, so it needs t…
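For what it's worth, if the mel front end cannot stay inside the graph, one option is to compute it on the host and feed the resulting tensor to the inference runtime. A rough Python sketch of that preprocessing (librosa; all parameter values are placeholders, not what your model actually expects):

```python
import librosa
import numpy as np

def mel_features(path, sr=16_000, n_fft=512, hop_length=160, n_mels=64):
    """Compute log-mel features on the host, before handing the tensor
    to the inference runtime. Parameter values here are placeholders."""
    audio, sr = librosa.load(path, sr=sr)
    mel = librosa.feature.melspectrogram(
        y=audio, sr=sr, n_fft=n_fft, hop_length=hop_length, n_mels=n_mels)
    return np.log(mel + 1e-6).T.astype(np.float32)   # (frames, n_mels)
```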
-
#### Issue Description
Adding support for Convolutional Recurrent Neural Networks will enable interesting capabilities such as music tagging.
We also want the ability to import open source model…
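For context, a minimal PyTorch-style sketch of what a CRNN music tagger typically looks like (purely illustrative, not tied to this library's API; all layer sizes are arbitrary):

```python
import torch
import torch.nn as nn

class CRNN(nn.Module):
    """Minimal convolutional-recurrent network for multi-label music tagging."""
    def __init__(self, n_mels=96, n_tags=50):
        super().__init__()
        # 2-D convolutions over the (mel, time) spectrogram
        self.conv = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.BatchNorm2d(32), nn.ReLU(),
            nn.MaxPool2d((2, 2)),
            nn.Conv2d(32, 64, 3, padding=1), nn.BatchNorm2d(64), nn.ReLU(),
            nn.MaxPool2d((2, 2)),
        )
        # GRU summarises the remaining time axis
        self.gru = nn.GRU(input_size=64 * (n_mels // 4), hidden_size=128,
                          batch_first=True)
        self.head = nn.Linear(128, n_tags)

    def forward(self, x):               # x: (batch, 1, n_mels, time)
        x = self.conv(x)                # (batch, 64, n_mels/4, time/4)
        x = x.permute(0, 3, 1, 2)       # (batch, time/4, 64, n_mels/4)
        x = x.flatten(2)                # (batch, time/4, 64 * n_mels/4)
        _, h = self.gru(x)              # h: (1, batch, 128)
        return self.head(h[-1])         # tag logits: (batch, n_tags)

logits = CRNN()(torch.randn(2, 1, 96, 256))  # e.g. 96 mel bands, 256 frames
```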
-
### 🚀 The feature, motivation and pitch
I attempted to construct a small-scale natural language processing model using `torch.nn.GRU(num_layers=N, bidirectional=True)`. During training, the loss cons…
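Since the excerpt cuts off before the symptom, here is only a generic, self-contained sketch of wiring up `torch.nn.GRU` with `num_layers=N` and `bidirectional=True`, including the dimension bookkeeping (doubled feature size, layout of the final hidden state) that is easy to get wrong; all sizes and names are illustrative:

```python
import torch
import torch.nn as nn

class BiGRUClassifier(nn.Module):
    """Illustrative bidirectional multi-layer GRU text classifier."""
    def __init__(self, vocab_size=10_000, emb_dim=128, hidden=256,
                 num_layers=2, n_classes=5):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        self.gru = nn.GRU(emb_dim, hidden, num_layers=num_layers,
                          bidirectional=True, batch_first=True)
        # bidirectional => the feature dimension of the output is 2 * hidden
        self.fc = nn.Linear(2 * hidden, n_classes)

    def forward(self, tokens):                 # tokens: (batch, seq_len)
        out, h = self.gru(self.emb(tokens))    # out: (batch, seq_len, 2*hidden)
        # h: (num_layers * 2, batch, hidden); the last two entries are the
        # final forward and backward states of the top layer
        top = torch.cat([h[-2], h[-1]], dim=-1)
        return self.fc(top)

model = BiGRUClassifier()
logits = model(torch.randint(0, 10_000, (4, 32)))   # (4, n_classes)
```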
-
Reading the documentation, I found that a general introduction to reservoir computing is either missing or not directly accessible from "Getting Started".
If not already done, I propose to wri…
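To make the proposal concrete, such an introduction could be built around an example as small as the following echo state network sketch (plain NumPy; the sizes, scalings, and toy sine task are arbitrary choices of mine, not anything from this project):

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny echo state network: fixed random reservoir, trained linear readout.
n_in, n_res = 1, 200
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))      # keep spectral radius below 1

def run_reservoir(u):
    """Collect reservoir states for an input sequence u of shape (T, n_in)."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ u_t)        # simple (non-leaky) state update
        states.append(x.copy())
    return np.array(states)

# Train the readout to predict the next input value (ridge regression).
u = np.sin(np.linspace(0, 20 * np.pi, 2000))[:, None]
X, Y = run_reservoir(u[:-1]), u[1:]
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ Y)
pred = X @ W_out                               # one-step-ahead prediction
```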
-
I propose imitating @drasmuss's docstring standard for defining networks, as seen in his HRL code. This is somewhat related to #231; a sketch of such a docstring is given after the list below.
_The standard is as follows:_
- List parameters as specified by [Nump…
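To illustrate, here is a sketch of what a network-building function documented in that style might look like (assuming, for the sake of the example, a Nengo-style builder; the integrator recipe and all names are mine, purely for illustration):

```python
import nengo

def make_integrator(n_neurons, dimensions, tau=0.1, net=None):
    """Create an integrator network (illustrative example of the format).

    Parameters
    ----------
    n_neurons : int
        Number of neurons in the recurrent ensemble.
    dimensions : int
        Dimensionality of the integrated signal.
    tau : float, optional
        Synaptic time constant of the recurrent connection (default 0.1).
    net : nengo.Network, optional
        Network to build into; a new one is created if not given.

    Returns
    -------
    net : nengo.Network
        Network with ``net.input`` and ``net.ensemble`` attributes set.
    """
    if net is None:
        net = nengo.Network(label="Integrator")
    with net:
        net.input = nengo.Node(size_in=dimensions)
        net.ensemble = nengo.Ensemble(n_neurons, dimensions)
        nengo.Connection(net.ensemble, net.ensemble, synapse=tau)
        nengo.Connection(net.input, net.ensemble, transform=tau, synapse=tau)
    return net
```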