Thinking-with-Deep-Learning-Spring-2022 / Readings-Responses

You can post your reading responses in this repository.

Network & Table Learning - Orientation #6

Open lkcao opened 2 years ago

lkcao commented 2 years ago

Post your question here about the orienting readings: “Network Learning” & “Knowledge and Table Learning”, Thinking with Deep Learning, chapters 11 & 12.

JadeBenson commented 2 years ago

Thank you! I was interested in the Cora example, where the articles could be represented by a network of citations but results improved when text from the abstracts was included. I was wondering whether this has been applied in other settings too. For example, with geospatial data, could we think of maps not only as networks but as networks whose nodes also carry important features about those locations (like their label, population, proximity to other sites, etc.)?
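For concreteness, here is a minimal sketch (assuming the StellarGraph library; the node names and feature columns such as `population` are made up for illustration) of a graph whose nodes carry feature vectors alongside the edge list, which is the same pattern as Cora's citation edges plus abstract features:

```python
import pandas as pd
from stellargraph import StellarGraph

# Hypothetical node features: one row per node, columns are attributes.
nodes = pd.DataFrame(
    {"population": [1200, 800, 4300], "n_hospitals": [1, 0, 3]},
    index=["site_a", "site_b", "site_c"],
)

# Hypothetical edges: e.g. roads or citations connecting the nodes.
edges = pd.DataFrame(
    {"source": ["site_a", "site_a", "site_b"],
     "target": ["site_b", "site_c", "site_c"]}
)

graph = StellarGraph(nodes, edges)
print(graph.info())  # summary: node count, edge count, feature dimension
```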

borlasekn commented 2 years ago

I am interested in some of the practicalities behind DeepWalk and random walks. I've heard of these before, but only learned about them in passing. I was wondering whether, when building these walks, you take into account some probability of moving from one node to another. If so, do these walks build probability models? Thanks!
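A minimal sketch (not the book's code) of what a probability-weighted walk can look like: at each step the next node is drawn with probability proportional to edge weight, which is the simplest version of the transition model behind DeepWalk/node2vec-style walks.

```python
import random
import networkx as nx

def weighted_random_walk(G, start, length):
    """Walk `length` nodes, choosing neighbors with probability ~ edge weight."""
    walk = [start]
    for _ in range(length - 1):
        current = walk[-1]
        neighbors = list(G.neighbors(current))
        if not neighbors:
            break
        weights = [G[current][n].get("weight", 1.0) for n in neighbors]
        walk.append(random.choices(neighbors, weights=weights, k=1)[0])
    return walk

G = nx.karate_club_graph()  # small built-in example graph
print(weighted_random_walk(G, start=0, length=10))
```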

isaduan commented 2 years ago

I am still a bit confused about why we move from shallow networks to deep ones. If they are to represent the non-regularity of data or non-Euclidean geometries, what does it mean for a 'relation' and an 'edge' to be non-Euclidean? Is there a concrete real-world example you can give to help us understand this?

thaophuongtran commented 2 years ago

Question for "Chapter 11: Network Learning": Since I'm not very familiar with network or graph data, I have questions about some terms and definitions: the distinction between shallow and deep graph representations, and the eigenvector representation of network/graph data. Thank you!
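For the eigenvector part, here is a minimal sketch (my own illustration, not the chapter's code) of a spectral embedding: each node is represented by its entries in the low-order eigenvectors of the graph Laplacian, which is one classic "shallow" representation.

```python
import numpy as np
import networkx as nx

G = nx.karate_club_graph()
L = nx.laplacian_matrix(G).toarray().astype(float)

eigenvalues, eigenvectors = np.linalg.eigh(L)   # eigh: symmetric matrix
embedding = eigenvectors[:, 1:3]                # skip the trivial 0-eigenvector

print(embedding.shape)  # (n_nodes, 2): a 2-d embedding per node
```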

sabinahartnett commented 2 years ago

I am really interested in network diffusion. It makes sense to me that contagion is easy to track when it involves a single status (for example, in a pandemic: health status, whether healthy, infected, or recovering), but how can we visualize/interpret more complex statuses, such as contagions in language/rhetoric? (Since this is a common phenomenon in online communities, I imagine there is work on this.)
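As a baseline for thinking about this, here is a toy sketch (my own construction, with an assumed adoption probability of 0.2) of a diffusion process whose "status" is a categorical label such as a phrase, rather than a binary healthy/infected state:

```python
import random
import networkx as nx

G = nx.karate_club_graph()
status = {n: "baseline" for n in G.nodes}
status[0] = "new_slang"  # seed node introduces a new phrase

for step in range(10):
    for node in G.nodes:
        neighbor = random.choice(list(G.neighbors(node)))
        if random.random() < 0.2:          # adoption probability (assumed)
            status[node] = status[neighbor]

print(sum(s == "new_slang" for s in status.values()), "adopters after 10 steps")
```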

pranathiiyer commented 2 years ago

  1. I would like to know how we generate and structure the data used to produce random walks, and how that structure is taken into account when a model such as word2vec is later applied to the generated walks (see the sketch below this list).
  2. Adding on to Isabella's question, how do deep graph models actually process images and text differently from CNNs or other models?
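On question 1, a minimal sketch of the DeepWalk recipe, assuming gensim 4's Word2Vec API: generate several random walks per node, treat each walk as a "sentence" of node IDs, and fit skip-gram word2vec on those sentences (the `workers` argument only parallelizes training).

```python
import random
import networkx as nx
from gensim.models import Word2Vec

def random_walk(G, start, length=10):
    walk = [start]
    for _ in range(length - 1):
        walk.append(random.choice(list(G.neighbors(walk[-1]))))
    return [str(n) for n in walk]  # word2vec expects string "tokens"

G = nx.karate_club_graph()
walks = [random_walk(G, n) for n in G.nodes for _ in range(20)]  # 20 walks/node

model = Word2Vec(sentences=walks, vector_size=64, window=5,
                 min_count=1, sg=1, workers=2)  # sg=1: skip-gram
print(model.wv[str(0)].shape)  # 64-d embedding for node 0
```
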
linhui1020 commented 2 years ago

I am really inspired to see that not only links but also nodes can be predicted. I wonder how social networks of urban systems (e.g., networks of hospital care, networks of schools, etc.) within a neighborhood could be used as features of the neighborhood and further used in deep learning. And can we regard subgraphs of a social network as many features of an instance?

ShiyangLai commented 2 years ago

I am really interested in multilayer networks and heterogeneous graph learning. Is there any up-to-date architecture for these types of more complex graph formats?

Yaweili19 commented 2 years ago

I really liked these chapters and the included method talks. However, what I've found challenging so far in the course is that I seldom see graph datasets, so I begin to question feasibility whenever I try to use network methods. Aside from social media, what are the important sources of network data? Do data producers (companies) generate them?

zihe-yan commented 2 years ago

It's nice to see the analogy between graph data and text data in the implementation; I'm genuinely impressed by how the transition is made. I'm interested in community detection, and something about it has been troubling me. I've read that detection may not work well on graphs that are rich in semantic context. For example, we may want to take the semantic meaning of movie dialog and embed it into the graph (usually as edges, I guess?). What are some possible ways to process the data while retaining as much semantic information as we can in such tasks?
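One possible pattern, sketched here with made-up similarity scores: encode semantic similarity (e.g., cosine similarity between line embeddings) as edge weights, then run a standard weighted community detection algorithm such as networkx's greedy modularity.

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

G = nx.Graph()
# Hypothetical dialog graph: characters as nodes, edge weight = how
# semantically similar their lines are (scores invented for illustration).
G.add_weighted_edges_from([
    ("alice", "bob", 0.9), ("bob", "carol", 0.8),
    ("carol", "dave", 0.2), ("dave", "erin", 0.85),
])

communities = greedy_modularity_communities(G, weight="weight")
print([sorted(c) for c in communities])
```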

yhchou0904 commented 2 years ago

Random walks are a common method in network analysis. I am curious about the reasons for using them, along with their pros and cons. There are many algorithms for traversing a graph and finding a path between arbitrary nodes, so what do random walks bring us? I can understand that they are useful when we only want a subsample or a local pattern of a network, but what kind of benefit or improvement do we gain? Also, would the choice of starting point affect the result? Is this the reason why we need multiple workers?

BaotongZh commented 2 years ago

Chapter 11 introduced very useful techniques and various models for dealing with network data. I would like to learn more about data preprocessing strategies for network data.

yujing-syj commented 2 years ago

My question is about deep graph modeling. Since each deep graph layer is based on a node's neighborhood, I am wondering whether there is any rule of thumb for choosing the depth of the neural network. Could you also introduce more examples of applying neural networks to social network data?

ValAlvernUChic commented 2 years ago

I was wondering whether any research has done network analysis based on probability distributions from language models, i.e., has anything tracked how language is used between different people and then mapped influences/contagions of language between those people?

Hongkai040 commented 2 years ago

For DeepWalk, "it reasons only from training cases to test cases, which means that the model must be relearned every time a node is added or removed." I am wondering whether adding a node to a large network really influences the structure of the network that much. I also have a question about GraphSAGE: "the length of the layer_sizes list must be equal to the length of num_samples." If we have a large network, does that mean we have to create a super large GraphSAGE model for it? Will the size of the network influence the performance of GraphSAGE?
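For reference, a minimal sketch assuming the StellarGraph API the quote appears to come from (the toy graph and hyperparameters are made up): `layer_sizes` and `num_samples` have one entry per hop, so model size is governed by depth and per-hop sample counts rather than by the total number of nodes in the graph.

```python
import pandas as pd
from stellargraph import StellarGraph
from stellargraph.mapper import GraphSAGENodeGenerator
from stellargraph.layer import GraphSAGE

# Tiny toy graph with 1-d node features, just to make the shapes concrete.
nodes = pd.DataFrame({"x": [0.1, 0.2, 0.3, 0.4]}, index=["a", "b", "c", "d"])
edges = pd.DataFrame({"source": ["a", "a", "b", "c"],
                      "target": ["b", "c", "d", "d"]})
G = StellarGraph(nodes, edges)

# One entry per hop: sample 10 neighbors at hop 1, 5 at hop 2.
generator = GraphSAGENodeGenerator(G, batch_size=2, num_samples=[10, 5])

model = GraphSAGE(layer_sizes=[32, 32],  # must match len(num_samples)
                  generator=generator, bias=True, dropout=0.5)
x_inp, x_out = model.in_out_tensors()  # Keras input/output tensors
```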

min-tae1 commented 2 years ago

Network diffusion seems to be an interesting task, but I am not sure how it could be applied in the social sciences. Is there an example that shows how it could be performed in the realm of social research?

javad-e commented 2 years ago

I was wondering whether we could spend some time explaining different methods for combining network data from different domains. For example, in a social network study, we might want to consider, for each individual, both their Facebook friends and their Twitter followers. How can we define edges across different domains?

chentian418 commented 2 years ago

I was wondering if we could have some graphical illustrations of the different kinds of networks and tables we are learning about. For example, in network clustering and classification, how do we label and classify the nodes, and what shape should the training network data take?

y8script commented 2 years ago

As for graph neural networks, is it meaningful to convert image data to graph data in order to emphasize a certain aspect of the image, like the relationships between adjacent pixels?
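A toy sketch (my own example) of that conversion: each pixel becomes a node carrying its intensity as a feature, and edges connect adjacent pixels in the grid.

```python
import numpy as np
import networkx as nx

image = np.random.rand(8, 8)              # hypothetical 8x8 grayscale image
G = nx.grid_2d_graph(8, 8)                # nodes are (row, col) pairs

for (r, c) in G.nodes:
    G.nodes[(r, c)]["intensity"] = float(image[r, c])

print(G.number_of_nodes(), "pixel nodes,", G.number_of_edges(), "adjacency edges")
```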

Emily-fyeh commented 2 years ago

I would also like to explore more applications of deep network embeddings in social science research. For example, given the existing applications to social networks and citation networks, how can deep network embeddings outperform non-deep network methods in terms of interpretability and accuracy?

hsinkengling commented 2 years ago

The PCA representation of Wikipedia link embeddings in Figure 11-4 seems eerily similar to a PCA of k-means clusters of documents. I'm curious whether text co-occurrence (n-grams) can, through some algorithm, become network data, or whether this kind of data transformation would be impractical or unproductive for some reason.
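Mechanically, the transformation is straightforward; here is a toy sketch (my own example) that turns bigram co-occurrence counts into a weighted word network.

```python
from collections import Counter
import networkx as nx

docs = ["deep learning for networks", "networks of citations and text",
        "text and deep learning"]

pair_counts = Counter()
for doc in docs:
    tokens = doc.split()
    for w1, w2 in zip(tokens, tokens[1:]):      # adjacent-word (bigram) pairs
        pair_counts[tuple(sorted((w1, w2)))] += 1

G = nx.Graph()
for (w1, w2), count in pair_counts.items():
    G.add_edge(w1, w2, weight=count)

print(G.number_of_nodes(), "word nodes;", G.number_of_edges(), "co-occurrence edges")
```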