E-mail draft
Hi, Alexey
In my opinion the necessary things were added, but the explanation is probably not clear. Below we give a broader explanation.
Of course, a bi-dimensional and a two-dimensional map are the same thing. Projecting data onto a two-dimensional feature map (see Figure https://codesachin.files.wordpress.com/2015/11/kohonen1.gif) is a significant part of SOFM. The two-dimensional feature map is usually visualized as a U-matrix for cluster detection (see Visual Explorations in Finance with Self-Organizing Maps).
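For illustration, a minimal sketch (not taken from our code; the array names and shapes are illustrative) of how a U-matrix can be computed from a trained two-dimensional grid of weights:

```python
import numpy as np

def u_matrix(weights):
    """Average distance from each neuron's weight vector to those of its
    grid neighbours (up/down/left/right); larger values mark cluster borders."""
    rows, cols, _ = weights.shape
    umat = np.zeros((rows, cols))
    for i in range(rows):
        for j in range(cols):
            dists = []
            for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < rows and 0 <= nj < cols:
                    dists.append(np.linalg.norm(weights[i, j] - weights[ni, nj]))
            umat[i, j] = np.mean(dists)
    return umat
```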
=====
What does the "topological distances between neurons given" sentence mean?
The difference between SOFM and other algorithms is the feature map. The map defines the placement of the neurons and sets a topological distance between them. Commonly the neurons are placed on a 2D grid, and the topological distance is the distance on that grid. It is not a distance between the neurons' weight vectors, such as a Euclidean or Hamming distance.
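For example, a minimal sketch (the array names and grid size are illustrative, not from our code) contrasting the topological distance on the grid with the distance between weight vectors:

```python
import numpy as np

# Hypothetical 10x10 SOFM with 3-dimensional weight vectors.
grid_shape = (10, 10)
weights = np.random.rand(*grid_shape, 3)

def topological_distance(pos_a, pos_b):
    """Distance between two neurons on the feature map itself,
    i.e. between their (row, col) grid coordinates."""
    return np.linalg.norm(np.subtract(pos_a, pos_b))

def weight_distance(pos_a, pos_b):
    """Distance between the neurons' weight vectors in data space;
    this is NOT the topological distance used by the neighbourhood function."""
    return np.linalg.norm(weights[pos_a] - weights[pos_b])

print(topological_distance((0, 0), (0, 3)))  # always 3.0, fixed by the grid
print(weight_distance((0, 0), (0, 3)))       # depends on the learned weights
```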
=====
It is a common abbreviation in the SOFM literature; searching for "best matching unit" in Google gives many papers about SOFM.
I think we should keep the phrase "also called the best matching unit", since it is a common term, but we can remove the other mentions of it.
[Fixed in new version]
=====
It is a kernel function which gives the learning rate for the neighbours of the best matching unit. Commonly a Gaussian function is used. The neighbour function (kernel function) is an inseparable part of SOFM.
Unfortunately, I did not have a clear understanding of the significance of the kernel function earlier, and we used a Dirac delta function as the kernel in the toy example. That is not a good choice.
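For example, a minimal sketch of a single SOFM update step with a Gaussian neighbourhood function (the function names and parameters are illustrative, not from the manuscript); letting sigma shrink towards zero effectively recovers the Dirac-delta kernel, where only the best matching unit is updated:

```python
import numpy as np

def sofm_update(weights, x, lr=0.5, sigma=1.0):
    """One SOFM step: find the BMU for sample x, then pull every neuron
    towards x with a rate scaled by a Gaussian of its topological
    (grid) distance to the BMU."""
    rows, cols, _ = weights.shape
    # Best matching unit: neuron whose weight vector is closest to x.
    dists = np.linalg.norm(weights - x, axis=-1)
    bmu = np.unravel_index(np.argmin(dists), (rows, cols))
    for i in range(rows):
        for j in range(cols):
            grid_dist2 = (i - bmu[0]) ** 2 + (j - bmu[1]) ** 2
            h = np.exp(-grid_dist2 / (2 * sigma ** 2))  # Gaussian kernel
            weights[i, j] += lr * h * (x - weights[i, j])
    return weights
```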
=====
See point 4. The topological neighbours are the neurons closest to the winner neuron on the feature map.
=====
Regarding the comment that the measurement process is quite essential here and should be displayed: I am not sure. We present a gate creating the final state (10) from (4). One can extend the gate with other stages. Measurement is a trivial and optional stage.
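As a schematic illustration only (the real gate taking (4) to (10) is the one in the manuscript; the circuit below is just a placeholder), in Qiskit the measurement can be appended as a separate, trivial final stage:

```python
from qiskit import QuantumCircuit

# Placeholder for the gate preparing the final state; the real
# construction is the one given in the manuscript.
prep = QuantumCircuit(2, name="prepare_final_state")
prep.h(0)
prep.cx(0, 1)

# Measurement is appended only if one actually wants to sample the state.
measured = prep.copy()
measured.measure_all()
```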
Why did you remove Fig. 3a? In my opinion the classical result is redundant. We present the IBM Q experiment result, and with the crosses we already see all the errors of the result. The previous figure was made by me because I did not have time, and now it looks the way I initially intended. We no longer need to compare the images bit by bit.
Of course, we can include the other figure too.
[Fixed in new version]
=====
Note that the reduction is optional and obvious, see for example schuld2014. In my opinion we should not justify the optimised scheme by it.
But Marek and I added the comparison from the previous version.
[Fixed in new version]
=====
Current introduction to SOFM:
But in the previous version, the SOFM explanation (Section 2) and the toy example (Section 3) simply forgot about the topology.
=====
Sincerely, Marek and Ilia
@kephircheek I think it's very well written. Only one point is missing:
- Regarding the notation near expressions (2) and (3), it seems like Tim, Marek, and I prefer the new notation; of course we are OK with bringing the previous notation back, we let you and Tim decide.
If you include it, then it's OK to send. I think this response is actually very helpful; it makes things a bit more general with SOFM, e.g. there can be different distance metrics (neighbor functions) and thus different definitions of a neighbor (topological neighbor), etc. Very helpful 👍🏻
e-mail sent, closing!
Alexey left some comments in the main text. I gathered them and created issues for most of them; all those issues are in the following milestone:
https://github.com/kephircheek/qasofm/milestone/3
The only issues which are not in the milestone are
I think for those we also need Tim's comment; what if we change them and then he says that he liked the previous version more, and then we have to change back? Regarding that, I propose to write:
Let's make this one quick: just respond to the issues; for changes which are really minor we can still update the manuscript (things like changing one word, etc.). I suggest we close it up in one day so that we don't delay Alexey even more.