uchicago-computation-workshop / Spring2024

Spring Workshop 2024, Thursdays 9:30-11:50am

Questions for Uri Hasson concerning his talk on "Deep language models as a cognitive model for natural language processing in the human brain." #4

Open jamesallenevans opened 1 month ago

jamesallenevans commented 1 month ago

Post your questions for Uri Hasson about his talk and paper: Deep language models as a cognitive model for natural language processing in the human brain. Naturalistic experimental paradigms in cognitive neuroscience arose from a pressure to test, in real-world contexts, the validity of models we derive from highly controlled laboratory experiments. In many cases, however, such efforts led to the realization that models (i.e., explanatory principles) developed under particular experimental manipulations fail to capture many aspects of reality (variance) in the real world. Recent advances in artificial neural networks provide an alternative computational framework for modeling cognition in natural contexts. In this talk, I will ask whether the human brain's underlying computations are similar or different from the underlying computations in deep neural networks, focusing on the underlying neural process that supports natural language processing in adults and language development in children. I will provide evidence for some shared computational principles between deep language models and the neural code for natural language processing in the human brain. This indicates that, to some extent, the brain relies on overparameterized optimization methods to comprehend and produce language. At the same time, I will present evidence that the brain differs from deep language models as speakers try to convey new ideas and thoughts. Finally, I will discuss our ongoing attempt to use deep acoustic-to-speech-to-language models to model language acquisition in children.
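The "shared computational principles" the abstract refers to center on continuous next-word prediction: a deep language model assigns a probability to every candidate next word given the preceding context, and listeners' brains appear to compute a similar pre-onset prediction. The following is a minimal sketch of that objective using a toy bigram model; the corpus, function names, and probability model are invented for illustration only and stand in for the GPT-2-style deep language models the paper actually analyzes.

```python
import math
from collections import Counter, defaultdict

# Toy bigram model standing in for a deep language model: both assign a
# probability to every candidate next word given the preceding context.
corpus = "the brain predicts the next word before the next word arrives".split()

# Count word-to-word transitions observed in the corpus.
transitions = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    transitions[prev][nxt] += 1

def next_word_distribution(prev):
    """P(next word | previous word) as a dict of probabilities."""
    counts = transitions[prev]
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

def surprisal(prev, word):
    """-log2 P(word | context): high for unexpected words."""
    return -math.log2(next_word_distribution(prev).get(word, 1e-9))

dist = next_word_distribution("the")
# After "the", this toy model expects "next" (seen twice) over
# "brain" (seen once), so "brain" carries higher surprisal.
```

In the paper itself, the analogous word-by-word probabilities come from an autoregressive deep language model and are related to neural responses recorded before and after each word's onset; the sketch above only illustrates the shape of the computation being compared.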

MaxwelllzZ commented 2 weeks ago

Thank you, Prof. Hasson. Regarding children's language development specifically, how are deep acoustic-to-speech-to-language models being used to study language acquisition? What insights have these models provided so far, and what are the potential implications for understanding developmental language disorders?

YucanLei commented 2 weeks ago

How can we distinguish and exploit the similarities and differences in language processing between the brain and deep neural networks when applying these findings in practice?

hantaoxiao commented 2 weeks ago

Could you elaborate on the specific computational principles that are shared between deep language models and the neural processes of the human brain? How do these principles align or diverge? Where do you find that deep neural networks fail to capture the nuances of natural language processing as performed by the human brain, especially in conveying new ideas and thoughts?

QichangZheng commented 2 weeks ago

The study highlights the limitations of large-scale language models in capturing the nuances of children's speech, raising questions about potential improvements through focusing on phonetic and prosodic features. Adapting these models to different languages could help investigate variations in task performance based on linguistic differences. Furthermore, linking individual language processing to collective cultural behaviors using language models could provide valuable insights. What specific modifications to these models would be most effective in enhancing their applicability across these diverse contexts?

xinyi030 commented 2 weeks ago

Thanks for sharing your work. You mentioned shared computational principles between deep language models and natural language processing in the human brain, which I find really interesting. I have a question: How do you define 'overparameterized optimization methods' in the context of both deep language models and the brain's language processing mechanisms? I wonder if you could also elaborate on specific neural processes that support natural language processing in adults, and how they compare to those in deep language models?

66Alexa commented 2 weeks ago

Thanks for sharing! My questions are: How do you envision the role of deep language models in studying language acquisition in children? What are the potential benefits and limitations of using acoustic-to-speech-to-language models in this context?

DonnieTang1 commented 2 weeks ago

Thanks for sharing! My question is: what are the significant differences in how the brain and deep language models handle the generation of novel ideas and thoughts, and how might this impact our understanding of language development in children?

yunshu3112 commented 2 weeks ago

In the paper "Shared computational principles for language processing in humans and deep language models," you mention three fundamental computational principles for processing natural narratives. I wonder how these principles were identified, and what the broader discussion is on how DLMs and human brains can be compared.

Vindmn1234 commented 2 weeks ago

Thanks for the illustration. I'm curious how you address the potential differences in the underlying principles of neural noise and error handling between human brains and deep neural networks, especially when modeling complex cognitive processes like language development in children?

yuanninghuang commented 2 weeks ago

How do you envision the integration of evolutionary principles into the design and optimization of artificial neural networks to enhance their adaptability and robustness in complex and dynamic environments?

Yuxin-Ji commented 2 weeks ago

Thanks for sharing your work! I wonder how generalizable your findings are to children exposed to different language environments. There has been research demonstrating the effects of different linguistic and environmental input on the learning process, such as different cultures of infant-directed speech, bilingual and multilingual families, and children whose native language is a sign language. For signers in particular, the learning mode differs from "acoustic-to-speech-to-language" and is more of a "visual-to-signs-to-language" pathway.

JerryCG commented 2 weeks ago

Hi Uri,

It is interesting to see that brains work in some ways similarly to how deep neural networks work. Given these similarities, is it possible to speed up the brain's processing efficiency to catch up with networks?

Best, Jerry Cheng (chengguo)

vigiwang commented 2 weeks ago

Hi Professor Hasson, I think your work makes a great contribution to our understanding of the substitutability of human labor by AI algorithms, thanks for sharing! It has been widely discussed that the adoption of AI algorithms will devalue human labor in areas where AI can mimic human processing. Do you think there is any field in which it will be extremely hard for AI to transcend humans?

WonjeYun commented 2 weeks ago

Thank you for sharing your work. Understanding the human brain in terms of ANNs seems to be an interesting line of research. I understand that there have been efforts to understand natural phenomena with neural networks. However, I am not yet convinced that a similar structure, or the ability to mimic human brain activity, implies that a neural network can be used to investigate natural behavior. How would you add robustness to this argument?

yunfeiavawang commented 2 weeks ago

I appreciate your sharing. In one of your studies, brain responses and predictions are mostly studied in English-speaking environments. When applied to languages with radically different syntactic or morphological patterns, how might these computational principles change or adapt?

ChrisZhang6888 commented 2 weeks ago

Hi Professor Hasson, thanks for sharing! Based on your works, could you elucidate how these deep language models might facilitate the progression towards robots exhibiting self-aware behaviors? Additionally, what significant advancements are necessary to bridge the gap from current AI technologies to this level of sophistication?

Joycepeiting commented 2 weeks ago

Hi Professor Hasson, I would like to know whether the method can be adapted to different language systems, or to circumstances where two or more language systems coexist (as some young children are raised in multilingual environments).

MaoYingrong commented 1 week ago

Thank you for sharing your work! I'm wondering about the applications of these similarity findings. I thought many deep-learning models were designed to simulate how the human brain processes information; how do these findings facilitate the study of children's language development? In addition, is there individual variation among human brains, such that different people may rely on different methods to process and produce language?

franciszz992 commented 1 week ago

Thank you for sharing your research, Professor Hasson. Some might say both the human brain and deep neural networks are black boxes for the observer: what appears to be a similar outcome might be due to very distinct yet unobservable differences in process. How would you respond to this? Why do you think it is possible to compare two black boxes?

nalinbhatt commented 1 week ago

Hi Prof. Hasson. As per the reading, there are overlaps between DLMs' pattern recognition and how our brains perform similar computations. The authors mention that this is not sufficient and that more research is needed to understand the formation of innovative thoughts. I am curious whether the latter could be modeled by augmenting DLMs (which inform our understanding of pattern recognition in brains) with a psycholinguistic approach, or with something entirely tangential, to understand complex thought formation?

Ry-Wu commented 1 week ago

Hi professor Hasson, thank you for sharing your interesting study! It was very inspiring to learn about the innovative research. I wonder how the shared computational principles impact our understanding of language acquisition in children, particularly in terms of the differences between ANNs and neural processes in the developing brain?

ZenthiaSong commented 1 week ago

Hello, thank you for sharing your research! You mention that while there are shared computational principles, there are differences when it comes to conveying new ideas and thoughts. Can you elaborate on these differences? What aspects of human language processing do current models fail to capture?

yiang-li commented 5 days ago

Thanks for sharing the research. How do the computational constraints and capabilities of the human brain compare to those of current deep learning models, especially regarding the processing of natural language?

boki2924 commented 5 days ago

Thank you for sharing! What specific computational principles shared between deep language models and the neural processes of natural language processing in the human brain have been identified, and how do these principles inform our understanding of language development and acquisition in children?

Adrianne-Li commented 5 days ago

Hello Professor Hasson,

I am deeply interested in your work on the parallels between deep learning architectures like LSTMs and Transformers, and human cognitive processes in natural language processing. Given the rapid advancements and the diverse methodologies in NLP, how should we approach the integration of these complex computational models into cognitive neuroscience studies? Furthermore, in your research, are we focusing more on discovering core similarities between neural networks and the human brain, or are there other aspects of this interdisciplinary field that you find particularly promising?

Adrianne(zhuyin) Li (CNetID: zhuyinl)

schen115 commented 5 days ago

Hi professor Hasson, thanks for sharing such fantastic work with us! I was curious that given the shared computational principles between the human brain and autoregressive deep language models, how might this alignment influence the development of educational tools or therapeutic interventions for language-related disorders?

naivetoad commented 2 days ago

What evidence supports the claim that the human brain and deep neural networks share some computational principles in natural language processing, and how do they differ when conveying new ideas?