flrngel/understanding-ai: personal repository
Understanding Artificial Intelligence
Notes
09/16/2023 - https://github.com/pablovela5620/arxiv-researcher
08/29/2023 - https://github.com/OpenGVLab/all-seeing
04/18/2023 - https://github.com/jdagdelen/hyperDB
04/18/2023 - https://github.com/showlab/Image2Paragraph
Eliminating All Bad Local Minima from Loss Landscapes Without Even Adding an Extra Unit
Massively Multilingual Sentence Embeddings for Zero-Shot Cross-Lingual Transfer and Beyond
MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications
Subspace match probably does not accurately assess the similarity of learned representations
Representation Learning with Contrastive Predictive Coding
Neural Discrete Representation Learning
A Quantum Many-body Wave Function Inspired Language Modeling Approach
Self Imitation Learning
Local Sparsity Control for Naive Bayes with Extreme Misclassification Costs
What You Get Is What You See: A Visual Markup Decompiler
A Simple Method for Commonsense Reasoning
Relational recurrent neural networks
Language Modeling with Gated Convolutional Networks
A Hybrid Convolutional Variational Autoencoder for Text Generation
Asynchronous Methods for Deep Reinforcement Learning
Dual Learning for Machine Translation
An Empirical Evaluation of Generic Convolutional and Recurrent Networks for Sequence Modeling
Neural Machine Translation in Linear Time
Learning to Generate Reviews and Discovering Sentiment
Zero-Shot Question Generation from Knowledge Graphs for Unseen Predicates and Entity Types
Neural Voice Cloning with a Few Samples
Diversity Is All You Need: Learning Skills without a Reward Function
Non-Autoregressive Neural Machine Translation
Generating Wikipedia by Summarizing Long Sequences
Zero-Shot Super-Resolution using Deep Internal Learning
Convolutional Sequence to Sequence Learning
Bi-Directional Block Self-Attention for Fast and Memory-Efficient Sequence Modeling
Implemented Papers
Representation Learning with Contrastive Predictive Coding
Attention Is All You Need
A Structured Self-Attentive Sentence Embedding
Training RNNs as Fast as CNNs (Simple Recurrent Unit)
#TagSpace: Semantic Embeddings from Hashtags
Personal Research
Character-based Temporal Convolutional Networks + Attention Layer