Computational-Content-Analysis-2020 / Readings-Responses

Repository for organising "exemplary" readings and posting responses.

Discovering Higher-Level Patterns - Timmermans & Tavory 2012 #26

Open jamesallenevans opened 4 years ago

jamesallenevans commented 4 years ago

Timmermans, Stefan and Iddo Tavory. 2012. “Theory Construction in Qualitative Research: From Grounded Theory to Abductive Analysis.” Sociological Theory 30(3): 167–186.

tzkli commented 4 years ago

This paper seeks to strike a balance between atheist empiricism and theoretical monotheism. The idea is very attractive, but I'm wondering how we can operationalize it in treating text as data. This reminds me of how Prof. Evans divides the field of text analysis into research with the purpose of discovering theory and research aimed at confirming theory. Evans and Timmermans et al. both argue that these two processes should be interactive. But it is less clear how. That some models we use in Machine Learning are black boxes complicates the problem. For example, how should we balance the interpretability of a model with the discovery of novelties when using unsupervised machine learning techniques? How should we use the results to guide our supervised models?

lkcao commented 4 years ago

This paper introduces a new principle in qualitative research besides deduction and induction: abduction, which is characterized by an iterative analysis moving between theory and data. However, as we can read in this piece, it is mainly a principle for qualitative research. I guess quantitative research is different, since we often have more data, but our data are not so flexible and fluid (as what we might collect in the field). So, is there any difference between abduction in qualitative research and abduction in quantitative research?

katykoenig commented 4 years ago

This paper argues for the use of abduction in constructing theories, at one point noting that abduction "should be understood as a continuous process of conjecturing about the world that is shaped by the solutions a researcher has 'ready-to-hand.'" Would this imply that more diverse research teams are beneficial to theory construction using abduction?

laurenjli commented 4 years ago

This paper was an interesting introduction to the methods behind theory construction with qualitative data. I liked the idea of defamiliarization periods and of viewing abduction as a recursive process. But with that comes the more general question of knowing, or having a framework for deciding, when you've reached the "base" case. How does this method conclude if there is ultimately no theory to construct?

di-Tong commented 4 years ago

It is interesting to compare this paper with the computational grounded theory piece by Laura Nelson: while (1) Timmermans & Tavory cast a critical eye on the inductive stance of grounded theory and (2) Nelson does not refer to abductive analysis in her framework, they seem to point to the same theory-producing processes, though in slightly different orders and operating on different data (qualitative vs. massive-scale quantitative). The first step of Nelson's computational grounded theory, inductive computational exploration of text through unsupervised machine learning, seems to resolve the inner contradiction of grounded theory that theoretical sensitivity is both required and to be avoided, since unsupervised learning can perform inductive reasoning without the interference of pre-existing theories or the researcher's subjective bias. Moreover, given the aim and capacity of computational methods to detect novel and surprising patterns, this step also involves abductive elements in terms of producing new hypotheses and theories based on surprising research evidence.

Timmermans & Tavory argue: 'Abduction suggested explanations, which were then formalized into deductions, while induction confirmed them through empirical testing: "Abduction seeks a theory. Induction seeks for facts".' I wonder how we can better implement the causal inference part of abductive analysis with computational methods. How can we identify and validate the underlying mechanisms of the novel patterns detected by unsupervised learning?
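A minimal sketch of what that first inductive step could look like in code, under the assumption that the unsupervised learner is a topic model; the toy corpus, vectorizer settings, and number of topics are all hypothetical and not Nelson's actual pipeline:

```python
# Minimal sketch: "inductive computational exploration" of text via an
# unsupervised topic model. Corpus and parameters are invented for illustration.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "the engineers normalized the deviance in launch decisions",
    "managers framed risk as acceptable under schedule pressure",
    "field notes describe ritual and belief in the community",
    "interviews reveal kinship ties shaping economic exchange",
]

# Bag-of-words representation: no theory enters at this stage.
vectorizer = CountVectorizer(stop_words="english")
X = vectorizer.fit_transform(docs)

# Fit the unsupervised model; theory re-enters only afterwards, when the
# researcher inspects the topics and asks which of them are surprising.
lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(X)

terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top_terms = [terms[i] for i in weights.argsort()[::-1][:5]]
    print(f"topic {k}: {top_terms}")
```

On this reading, the abductive moment is not in the code at all: it is in the researcher's confrontation of the printed topics with existing theory.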

arun-131293 commented 4 years ago

Timmermans & Tavory's paper is situated in the "meta-theoretical debate about the relation between data and theory," as others have alluded to above. It's a paper about how to construct theory as much as about how to organize a research project that informs theory construction. They criticize the data-dominant approach that inductive grounded-theory proponents practice and advocate an approach that "privileges" abduction while leaving space for inductive approaches. Specifically, they advocate focusing on "anomalous and surprising empirical findings" in relation to "multiple existing sociological theories" to develop "new concepts [that] account for puzzling empirical materials".

Since this is a sociological paper, it is understandable that it does not place importance on the role of deduction in theory construction. In the hard sciences, like physics, theories that are built deductively are routinely privileged over inductive theories, even if the former don't account for observed data while the latter do. For instance, Galileo's theory that objects fall at a rate independent of their mass was accepted, and its importance understood, even though the phenomenon wasn't experimentally observable and the discrepancy with the actual data was unexplained at the time. But sociology has no strong first principles like those of physics with which to conduct the kind of deductive thought experiments routine in the latter. How does this affect the role of deduction in sociology and the humanities in general?

arun-131293 commented 4 years ago

> This paper seeks to strike a balance between atheist empiricism and theoretical monotheism. The idea is very attractive, but I'm wondering how we can operationalize it in treating text as data. This reminds me of how Prof. Evans divides the field of text analysis into research with the purpose of discovering theory and research aimed at confirming theory. Evans and Timmermans et al. both argue that these two processes should be interactive. But it is less clear how. That some models we use in Machine Learning are black boxes complicates the problem. For example, how should we balance the interpretability of a model with the discovery of novelties when using unsupervised machine learning techniques? How should we use the results to guide our supervised models?

Usually, supervised machine learning work done in computer science departments is "theory atheist," even when dealing with problems of cognitive science (like object recognition or language learning), linguistics (machine translation), or political science/communication (fake news detection), for which abundant theories are available from the respective departments. Supervised learning in these cases is constrained only by data, not by theory. This can be both an advantage and a disadvantage: the advantage is descriptive models that "work well" (for instance, Google Translate); the disadvantage is the lack of explainability of the logic of the model. However, some work is being done to address such deficiencies, with computer science departments increasingly reaching out to other departments to use theory to constrain learning. Here is an example from PNAS of how computer scientists are working with cognitive scientists to improve object recognition in images.
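As a rough, hypothetical sketch of what "using theory to constrain learning" can look like in a simple text-classification setting: the toy texts, labels, and lexicon below are all invented, and the only point is that a theory-derived feature enters alongside the purely data-driven ones.

```python
# Hypothetical sketch: injecting a theory-derived signal into an otherwise
# "theory atheist" supervised text classifier. All data here are made up.
import numpy as np
from scipy.sparse import csr_matrix, hstack
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

texts = [
    "shocking secret cure doctors hate",
    "senate passes budget after long debate",
    "you won't believe this miracle trick",
    "court upholds ruling on trade policy",
]
labels = [1, 0, 1, 0]  # 1 = fake/clickbait, 0 = not

# Purely data-driven features.
vec = TfidfVectorizer()
X_words = vec.fit_transform(texts)

# Theory-derived feature: counts of sensationalist cues from a (hypothetical)
# lexicon motivated by communication research, appended as an extra column.
sensational_lexicon = {"shocking", "secret", "miracle", "trick", "believe"}
lexicon_counts = np.array(
    [[sum(word in sensational_lexicon for word in t.split())] for t in texts]
)

X = hstack([X_words, csr_matrix(lexicon_counts)])
clf = LogisticRegression().fit(X, labels)
print(clf.predict(X))
```

Richer versions of the same idea constrain the model itself (priors, architectures, loss penalties) rather than just the features, but the division of labor is the same: data fit the model, theory shapes the hypothesis space.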

wunicoleshuhui commented 4 years ago

I found the emphasis of abductive analysis on looking for interesting data patterns to develop new theoretical frameworks very inspiring, and this aspect reminds me of the idea of paradigm shifts. However, how do we become certain that we have exhausted the possible explanatory variables for such data under existing theoretical frameworks before developing changes to those frameworks or proposing new paradigms?

ckoerner648 commented 4 years ago

This article illustrates how important it is to question the concepts that we, consciously or not, bring to our own research, and how flipping around possible theoretical explanations and further defining and operationalizing the concepts we have in mind can lead to new and better theoretical insights. Timmermans and Tavory (2012) argue that an abductive research process can help do this. They illustrate their argument with Diane Vaughan's highly cited study of "The Challenger Launch Decision." Vaughan applied neither a solely inductive nor an exclusively deductive research method. Instead, she used a mixed approach and analyzed her empirical case with a "layering of preexisting theoretical tools [in an] ongoing back and forth between analysis and data-gathering." Following that iterative process, Vaughan's study showed how an apparently logical explanation that convinced many at first can turn out to be wrong.

rachel-ker commented 4 years ago

The emphasis on rethinking and defamiliarizing, on considering multiple theoretical viewpoints and different interpretations, suggests that diversity in thinking is critical to this abductive approach. In addition to collaborative relationships among scholars, would it also be helpful to have a participatory methodology to gain insights from a wider group of the population?

bjcliang-uchi commented 4 years ago

The authors write that

> Although some abductions are productive, there are still many more dead ends and false starts than good ideas that culminate in theory construction. ...

they therefore conclude that

> There is little methodological value in gathering confirming cases; the strategy is to look for negative cases or alternative explanations to account for the phenomena.

I am still struggling to understand how they can compress the normally decade-long generation process of a complicated grounded theory into the almost mutually exclusive categories of "abduction," "induction," and "deduction," especially when most of these processes are not published (for example, a university's oral tradition or a piece of childhood memory). This echoes the questions people have asked here: how is it possible to quantitatively measure the differences among the three rationales rigorously, or at least to apply the conclusion in real life?

luxin-tian commented 4 years ago

The discussions above offer really good reflections on the different methodologies of scientific theory construction. Social science can differ from natural science in a dramatic way, as man-made factors can make it impossible to find the "first principles" needed for deductive reasoning. However, as is mentioned in this paper, deduction's value in generating new theory should be integrated with that of inductive reasoning into abduction. Steven N. S. Cheung, an economist specializing in transaction costs and property rights, suggests similar ideas in his notable book Economic Explanation: it is somewhat tautological to judge a theory by discussing whether the delineated premises of the deduction hold, and it is of equal importance to observe and understand reality to avoid "empty talk" that deviates from the real world. I would like to see more discussion of these ideas in other social science fields such as economics.

deblnia commented 4 years ago

Grounded theory has always seemed to me to better capture the realities of research and the scientific process, the theory-ladenness of observation, and generally, I'm really fond of Diane Vaughan's The Challenger Launch Decision. At the risk of sounding naive, though, I'm just not sure that Timmermans and Tavory do enough to distinguish this GT from plain old living. Is grounded theory not the method by which most people, by default, go about living their lives? How does this qualify as a legitimate academic epistemology if it does not preserve the distinction between the sacred and the profane?

YanjieZhou commented 4 years ago

Introducing a new idea for research, or more specifically for qualitative research, is really impressive. But I am still wondering: in the form of iteration over data and theories, can abduction qualify as a useful research method in other areas, like psychology? And regarding quantitative research, does abduction have its own value for scientific application?

sunying2018 commented 4 years ago

I'm interested in the recursive process of double-fitting data and theories in abductive analysis. This article mentions two complementary ways; the first is that sharing research among a community of inquiry stimulates the articulation and refinement of theoretical constructs. But consensus cannot necessarily be reached in this type of research collaborative. Though the article lists some points on this, I am still confused about how articulating and refining theories works in this case.

skanthan95 commented 4 years ago

Like @laurenjli, I'm interested in how we decide that we've reached a sound abductive inference (after we've iteratively refit data and theories). What's an example content analysis context for this?

Peirce came up with two definitions of abduction: one in the 19th century (an inversion of the deductive syllogism), and one at the turn of the 20th century ("process by which any hypothesis or conjecture that explains some surprising fact is set forth").

Guy Deutscher argues that computational language analysts use the outdated, overly restrictive 19th-century Peircean definition of abduction, while the authors of the present article use the updated 20th-century definition. Does the outdated definition have properties that make it better suited for content analysis, or are the analysts using this definition simply misunderstanding or conflating Peirce's two definitions?

alakira commented 4 years ago

The proposed sequence of steps (the method) feels like a regular research process that we already know and perform to some extent. For example, though with less emphasis on developing new theories, this kind of process was demonstrated by Popper's hypothetico-deductive model. I wonder why the authors refer to a more abstract modern philosophy rather than to the deep historical debate about scientific methodologies.

Furthermore, as a big fan of Popper, I find it a bit confusing that the authors refer to him as "a logical empiricist philosopher of science." He criticized those very philosophers from the standpoint of critical rationalism.

yirouf commented 4 years ago

The paper talks about grounded theory. The authors propose abduction, a creative inferential process aimed at producing new hypotheses and theories based on surprising research evidence. It is interesting to see that abductive analysis arises from actors' social and intellectual positions but can be further aided by methodological data analysis. Information can be derived from the process of revisiting, defamiliarization, and alternative casing. From this perspective, I wonder whether this could offer a new methodology for understanding theories in other fields, such as social network theory in psychology.

jsmono commented 4 years ago

The paper is inspiring to read, as the authors advocate a new way of conducting research. Their target audience is qualitatively oriented scholars, but the approach can also benefit scholars applying quantitative methods, since it is applicable in many fields outside of social science. However, when applying it to quantitative study, I am curious how the computer can re-enter or defamiliarize the data. Or, in this case, should researchers bring more data to the sets to accomplish this process crucial to abductive analysis?

gracefulghost31 commented 4 years ago

I found the authors’ claim about abduction as a guideline for theorizing to be a very natural extension of the availability of computational resources. While it might be conceptually viable, without the capability to datify the material it can be challenging to carry out the iterative finding and testing process. As classmates have noted, the abductive analysis cycles the authors propose characterize the gist of computational social science, or the integration of machine learning and causal inference, which has a fundamental limit: the credibility of causal assumptions that must be assessed substantively. Relatedly, I'd like to get perspective on what the preferable standard should be, if there is a normative one, for propelling the process of knowledge generation. Although Timmermans & Tavory (2012) put an emphasis on “generating creative and novel insights,” the claim that “there is little methodological value in gathering confirming cases” (181) does seem to undermine considerations of validity, and even to contradict the methodological heuristics they propose.

On a slightly different note, I am genuinely curious about what qualifies as a distinction between qualitative and quantitative research.

heathercchen commented 4 years ago

I am wondering whether grounded theory can be applied to general empirical research in the social sciences. In the field of education, grounded theory is used especially for analyzing qualitative data, such as texts extracted from person-to-person interviews. Why, in detail, has grounded theory now failed in qualitative research?

cindychu commented 4 years ago

This article brought up the importance of a theory-driven abduction approach in social science research, especially questioning the current ‘bottom-up’ investigation trend led by grounded theory. I am very interested, however, in how content analysis could better serve the ‘abduction’ approach. We know that for induction, content analysis is very helpful for bottom-up observations, and for deduction, it can assist in gathering relevant features to test hypotheses; but what might be a good example of content analysis used from an ‘abduction’ perspective?

yaoxishi commented 4 years ago

This paper argues for using abduction to form new theory based on surprising observations from the data; it is through the engagement of data with a multiplicity of theorizations that we can make the most of the possibility of generative abduction. I am wondering, in the computational text analysis world, how algorithms could help researchers with this process: is abduction mainly the researchers' responsibility, or can computational methods also be adapted to assist?

rkcatipon commented 4 years ago

I found myself wondering how much of Timmermans & Tavory's approach to theory formation through iterative abduction has been adopted. I noticed in the other comments here that the questions seem to center on practical implementation as well.

As we read through these articles in class, it seems to me that computational social science has remained more descriptive in its analysis, such as describing the lifecycle of users or whether a German artist was mentioned or not in books during WWII. In their comparison of reasoning forms, the authors push for abduction as the only reasoning form that conjectures a "cause and effect, hidden from view..." (171), and yet causal mechanisms remain elusive in social science research. Assuming that, as a field, we are taking an iterative approach to theory formation through a continual dialogue with surprising data findings, why are we not creating more causal theories?

I might be misunderstanding the reading here, so really would love to hear other people's opinions on this as well!

HaoxuanXu commented 4 years ago

This paper introduces an interesting approach to theory creation. It adds more weight to data-informed decisions. It would be interesting to know if it can be applied in more qualitative fields and how it should interact with existing theories.

acmelamed commented 4 years ago

The introduction of abductive methodology to computational science is a very interesting concept, and this paper advocates well for it. In fact, Peirce's semiotic theory likely has unexplored relevance to content analysis far beyond what is discussed here. For instance, his taxonomy of legisigns could form the basis of a strong grounded topic model design. I am curious to see which other researchers have taken inspiration from Peirce and practically employed his specific theories.

ziwnchen commented 4 years ago

This paper rethinks the relationship between data and theory construction and develops a new framework for building theory. I find this topic of great importance, as it is something I often wonder about. As @clk16 pointed out, is it possible to extend the abduction framework to quantitative research? It is widely acknowledged that data scientists from a STEM background usually neglect social theories, or deliberately choose theories they like, when explaining patterns found through data mining. A robust way to infer theory from big data would therefore be very useful.

kdaej commented 4 years ago

This article discusses the importance of abduction for generating a hypothesis in research. While only using deduction and induction for inference may stop us from asking different questions and coming up with other possible explanations, I think abduction also needs to be checked for its own assumptions. From my understanding, abduction seems to rely on how researchers make sense of the evidence rather than the evidence itself. This process can be easily influenced by what researchers believe about the world and logic, which is difficult to check. I wonder how this can be resolved more systematically.

meowtiann commented 4 years ago

Alternative casing and defamiliarization are very interesting concepts. Defamiliarization sounds similar to reflexivity in interviews: interviewers should be reflexive about their social or cultural background when conducting interviews, which helps them be more aware of the context around the topic of the interview. Alternative casing is to consider alternative hypotheses or theories; in some sense, it is to jump out of the box of one single theory and be eclectic and open to various theories, or at least to justify why one theory is better by comparing a range of theories. I feel this also means not being bounded or constrained by one context but looking at data more acceptingly.

However, both sound like general research guidance to me. Even works based on inductive reasoning still need to engage with data and adopt deductive reasoning, and any incoherence with old theory needs deductive reasoning to be turned into new theory. Starting from data and deductive reasoning to make up a totally new theory seems unlikely for most researchers. I believe a mix of both, in various ways, is far more common.

sanittawan commented 4 years ago

I am curious about the research process of abduction detailed by the authors at the end of the article. I understand the value of abduction: it could lead to interesting hypotheses or even theories. But since the process is iterative, how can researchers resist the temptation to cherry-pick evidence and mold theories to fit what they found?

adarshmathew commented 4 years ago

I can't help but draw a parallel between abduction and Bayesian thinking -- theory as priors, being tested against evidence to create a (coherent) posterior. I'd be curious to know how this works with very strong or very weak priors/theory.
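A toy numerical sketch of that parallel, with entirely made-up numbers: under a Beta-Binomial model, the same evidence barely moves a strong prior (an entrenched theory) but substantially revises a weak one.

```python
# Toy Beta-Binomial sketch of "theory as prior". Numbers are invented.
def posterior_mean(prior_a, prior_b, successes, failures):
    # Beta(a, b) prior + binomial evidence -> Beta(a + s, b + f) posterior
    return (prior_a + successes) / (prior_a + prior_b + successes + failures)

evidence = (7, 3)        # 7 surprising confirmations, 3 disconfirmations

weak_theory = (1, 1)     # vague prior: mean 0.5, easily revised
strong_theory = (50, 50) # entrenched theory: mean 0.5, hard to revise

print(posterior_mean(*weak_theory, *evidence))    # ~0.67: belief shifts a lot
print(posterior_mean(*strong_theory, *evidence))  # ~0.52: belief barely moves
```

On this analogy, abduction's "surprising fact" is an observation with low likelihood under the current posterior, and a very strong theory-prior may never let the surprise register.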

chun-hu commented 4 years ago

An interesting read on abductive and inductive theorizing. I'm wondering how we can apply abductive theories in more qualitative fields, e.g., psychology. Personally, I find it hard to distinguish these research processes in practice.

cytwill commented 4 years ago

I agree with the point that the abuse of abductive research is likely to produce plausible but inaccurate interpretations of our data. As for the three steps mentioned in the article (revisiting, defamiliarization, and alternative casing), though they give us some direction for improving abductive analysis, I think they are not always easy to put into practice. For example, when revisiting the data, how can we judge whether the former insights are better or worse than the current ones? If there are no reliable metrics, we may fall into the trap of subjectivity. Also, for alternative casing, the authors emphasize the importance of each case in the data. But in many analyses we would exclude outliers or anomalies beforehand, so how can we balance the tradeoff between the information lost from special cases and generality?

Lizfeng commented 4 years ago

The authors propose a strong argument against inductive grounded theory, which has driven the data analysis process for a century. They argue that grounded theory yields very little theoretical novelty in comparison to abductive research, and that the emphasis on both theoretical sensitivity and methodological heuristics makes abductive analysis better. However, I have one major concern. If I understand it correctly, inductive research is more closely related to frequentist statistics, while abductive research shares the spirit of Bayesian statistics. If that is the case, I don't think inductive and abductive methods are opposed to each other; instead, they should complement each other.

VivianQian19 commented 4 years ago

Timmermans et al. argue that, unlike in a grounded theory approach, in abductive analysis researchers should “enter the field with the deepest and broadest theoretical base possible and develop their theoretical repertoires throughout the research process” (180). I remember that in one of the exemplary readings, Nelson proposes a computational grounded theory that formulates three steps combining different computational content analysis approaches, the first of which uses exploratory analysis such as topic modeling to find patterns. I wonder how an abductive analysis can be translated to computational content analysis. And can it be that these two approaches are not exclusive, but that their applications depend on different research questions?