NicolaBernini / PapersAnalysis

Analysis, summaries, cheatsheets about relevant papers

Quantum reservoir processing - Analysis #16

Open · NicolaBernini opened 5 years ago

NicolaBernini commented 5 years ago

Overview

Read-through and analysis related to the paper Quantum reservoir processing

NicolaBernini commented 5 years ago

Reservoir Computing - Basic Elements

Architecture

(Figure: ResComp1, reservoir computing architecture diagram)

Neural Networks can be thought of as composed of 2 main architectural elements: a first stage that maps the input into a (typically higher dimensional) feature space, and a Readout stage that solves the actual task, e.g. classification, on top of those features.

The traditional shallow Neural Networks used in classical Machine Learning do not show this distinction very clearly, as all of their layers are trained jointly and the two roles are not separated into distinct components.

The Deep Neural Networks used in Deep Learning show this distinction clearly, e.g. a stack of convolutional layers acting as a feature extractor followed by a fully connected classification head.

In Reservoir Computing this distinction is also explicit: the Reservoir provides the (fixed) feature mapping and the Readout is the only part that gets trained, as in the sketch below.
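To make the split concrete, here is a minimal NumPy sketch of a classical (non-quantum) reservoir: the input and recurrent weights are drawn at random and kept fixed, so the Reservoir only provides a high dimensional feature map of the input sequence. All names, sizes and the spectral-radius heuristic below are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of the Reservoir / Readout split in classical reservoir
# computing; all sizes and constants are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

n_in, n_res = 1, 100                                  # assumed dimensions
W_in = rng.uniform(-0.5, 0.5, size=(n_res, n_in))     # fixed, never trained
W_res = rng.uniform(-0.5, 0.5, size=(n_res, n_res))   # fixed, never trained

# Common heuristic: rescale the recurrent weights to spectral radius < 1
# so the reservoir dynamics stay stable (fading memory).
W_res *= 0.9 / max(abs(np.linalg.eigvals(W_res)))

def run_reservoir(inputs, x0=None):
    """Drive the fixed reservoir with an input sequence and collect its states."""
    x = np.zeros(n_res) if x0 is None else x0
    states = []
    for u in inputs:
        x = np.tanh(W_in @ np.atleast_1d(u) + W_res @ x)
        states.append(x)
    return np.array(states)    # shape (T, n_res): high dimensional features

# Example: feed a sine wave through the fixed reservoir.
states = run_reservoir(np.sin(0.1 * np.arange(200)))
```

Only the Readout that sits on top of these states is trained; the reservoir weights above are never touched.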

Underlying Idea

The underlying idea seems to be very similar to the SVM Kernel Trick: the Readout performs a classification, i.e. it essentially looks for linear separations in its input space, and this job is made easier when that space is high dimensional, which is exactly what the Reservoir output space is.

The idea seems to be that, if the Reservoir output space is high dimensional enough, there is no need to train the Reservoir at all, i.e. no need to fit this mapping to the data: training can focus only on the linear discriminator acting in this space, which makes it much cheaper and easier (see the sketch below).
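Since only the Readout is trained, learning collapses to a linear problem on the reservoir states. The sketch below shows a closed-form ridge-regression fit of such a linear Readout; the states and targets are random stand-ins for data collected with a reservoir like the one sketched above, and the regularisation value is an arbitrary assumption.

```python
# Sketch of training only the linear Readout on top of reservoir states,
# via closed-form ridge regression; the data here are random stand-ins.
import numpy as np

rng = np.random.default_rng(1)

T, n_res, n_out = 500, 100, 1
X = rng.normal(size=(T, n_res))   # reservoir states (high dimensional features)
Y = rng.normal(size=(T, n_out))   # training targets

lam = 1e-3                        # ridge regularisation strength (assumed)
# W_out = (X^T X + lam * I)^{-1} X^T Y : the only parameters that get fitted.
W_out = np.linalg.solve(X.T @ X + lam * np.eye(n_res), X.T @ Y)

Y_pred = X @ W_out                # the Readout is just a linear map
```

Because the fit is a single linear solve, it is far cheaper than backpropagating through the (here untrained) Reservoir.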

Work in progress