murrayds / sci-mobility-emb

Embedding of scientific mobility across institutions, cities, regions, and countries

Visualization embeddings #34

Closed jisungyoon closed 4 years ago

jisungyoon commented 4 years ago

This issue relates to Fig. 3, the visualization of the embedding (maybe). I tried a figure colored by continent, but I don't think it gives enough information.


What's your thought about this figure? @yy @murrayds

murrayds commented 4 years ago

> Another thing: I used window_size=2 and embedding_dim=200 for this visualization. Do we need to change the embedding? I saw you recently changed the dim to 128 and 256 @murrayds

I'm re-running the workflow now to make sure everything works. I considered switching to powers of 2 this time, but it is probably best to be consistent, so I'll revert to the old embedding dimensions when re-running the code.

jisungyoon commented 4 years ago

Then I will stop drawing the figure for now. When can we get the results with the powers-of-2 dims?

murrayds commented 4 years ago

> Then I will stop drawing the figure for now. When can we get the results with the powers-of-2 dims?

Quick question about neural networks: is it considered better to use powers-of-2 dimensions, or are linearly spaced dimensions fine?

Basically, if we build 5 sets of models, should we use 32-64-128-256, or use 50-100-150-200-250?

jisungyoon commented 4 years ago

> Quick question about neural networks: is it considered better to use powers-of-2 dimensions, or are linearly spaced dimensions fine?
>
> Basically, if we build 5 sets of models, should we use 32-64-128-256, or use 50-100-150-200-250?

Powers-of-2 dimensions feel more natural to me, and that is the usual way to do it. But we are not writing a CS paper, so multiples of 50 are fine.

murrayds commented 4 years ago

> Powers-of-2 dimensions feel more natural to me, and that is the usual way to do it. But we are not writing a CS paper, so multiples of 50 are fine.

OK, I think I will use a linear scale, 50-100-150-200-250, because that will make it more intuitive to compare performance between models. I will begin re-running the code tomorrow morning.
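The agreed-upon linear sweep could be wired into the workflow as a small parameter grid. A minimal sketch, with illustrative names only (the repo's actual workflow step that consumes these configs is not shown here):

```python
from itertools import product

# Hypothetical sweep configuration: the linear dimension scale agreed on above,
# plus the window size mentioned earlier in the thread.
embedding_dims = [50, 100, 150, 200, 250]  # linear scale instead of powers of 2
window_sizes = [2]                         # window size used for the figure

# One config dict per model to train.
configs = [
    {"embedding_dim": dim, "window_size": win}
    for dim, win in product(embedding_dims, window_sizes)
]

for cfg in configs:
    # A real pipeline would call its training step here with these parameters.
    print(f"would train: dim={cfg['embedding_dim']}, window={cfg['window_size']}")
```

Keeping the grid in one place like this makes it easy to swap back to a powers-of-2 scale later by editing a single list.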

In the meantime, I created an "archive" folder on the Dropbox that contains the old models for this figure.