uchicago-computation-workshop / Winter2024

Winter Computational Social Science Workshop

Questions for Sandra González-Bailón's talk on "The Diffusion and Reach of Information on Social Media". #5

Open jamesallenevans opened 5 months ago

jamesallenevans commented 5 months ago

Share your questions regarding the 2/15 talk by Sandra González-Bailón about The Diffusion and Reach of Information on Social Media. Social media create the possibility for rapid, viral spread of content. We analyze the virality of and exposure to information on Facebook during the US 2020 Presidential election by examining the diffusion trees of the approximately 1B posts that were reshared at least once by US-based adults (from July 1 2020 to February 1 2021). Only N ∼ 12.1 million posts (1.2%) were reshared more than 100 times, involving N ∼ 114 million adult U.S. users and accumulating ∼ 55% of all views. We differentiate broadcasting versus peer-to-peer diffusion to show that: (1) Facebook is predominantly a broadcasting (rather than viral) medium of exposure; (2) Pages (not Groups) are the key engine for high-reach broadcasting; (3) misinformation (as identified by Meta’s Third Party Fact Checkers) reverses these trends: this type of content relies on viral spread through long, narrow, and slower chains of resharing activity; and (4) a very small minority of users (older and more conservative) power the spread of misinformation, triggering very deep cascades that accumulate large numbers of views. The paper will be shared by email (or by request to jevans@uchicago.edu), but cannot be posted online.
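To make the broadcast-versus-viral distinction concrete, here is a minimal illustrative sketch (not code from the paper; it assumes the `networkx` library and two toy cascades) of how a reshare tree's depth, widest generation, and structural virality (mean pairwise distance between nodes, following Goel et al.) can be summarized:

```python
# Illustrative sketch (not the paper's code): summarizing the shape of a
# reshare cascade to distinguish broadcast-like from viral-like diffusion.
# Node 0 is the original post; each edge points from a post to its reshare.
import networkx as nx
from collections import Counter

def cascade_shape(edges, root=0):
    """Return depth, widest generation, and structural virality of a diffusion tree."""
    tree = nx.DiGraph(edges)
    depths = nx.shortest_path_length(tree, source=root)  # hops from the root
    depth = max(depths.values())                         # longest reshare chain
    breadth = max(Counter(depths.values()).values())     # size of the widest generation
    # Structural virality: mean pairwise distance between nodes in the tree.
    virality = nx.average_shortest_path_length(tree.to_undirected())
    return depth, breadth, virality

# A broadcast-like cascade: one post reshared directly by many users.
broadcast = [(0, i) for i in range(1, 11)]
# A viral-like cascade: a long, narrow chain of peer-to-peer reshares.
viral = [(i, i + 1) for i in range(10)]

print(cascade_shape(broadcast))  # shallow and wide, low structural virality
print(cascade_shape(viral))      # deep and narrow, high structural virality
```

Broadcast-like trees are shallow and wide; the long, narrow, slower chains the abstract associates with misinformation are deep and narrow, with higher structural virality.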

bhavyapan commented 5 months ago

Thanks for sharing your work! I found the concept of diffusion trees, as well as your study design, very interesting. In economics, for example, I could see such methodologies advancing research on game-theoretic models of bargaining in networks, helping predict the behavior of agents depending on the type of network or the modes of interaction on social media. I'm wondering what some other applications of such computational tools in the social sciences might be.

XiaotongCui commented 5 months ago

Interesting research! What are the similarities and differences between the propagation patterns of misinformation on Facebook and those found in previous research on Twitter? Do these differences reflect variation in algorithms and user behavior across platforms?

kiddosso commented 5 months ago

Thanks for sharing your insightful research! Your use of data and your research design are very impressive. I wonder how broadly your findings apply. In other words, could the same pattern of misinformation spread also be observed on other social media platforms? If so, what are the implications?

ethanjkoz commented 5 months ago

I found this paper fascinating. Others have pointed out a curiosity similar to mine: how does misinformation diffusion change by platform? This question stems from the discussion on page 4, where the affordances of social media diffusion are discussed. Because others have already raised these questions, I will propose an extension: does the choice of topic (the US 2020 election) lend itself to being spread predominantly through broadcasting as opposed to virality? Are there other topics (political or not) that might instead spread virally? Furthermore, it would be interesting to see whether the same mechanisms driving the diffusion of misinformation on Facebook in 2020 will be present for the 2024 election.

shaangao commented 5 months ago

Interesting work! I wonder if there is a systematic mechanistic account for this disparity in modes of transmission between misinformation and other information. E.g., what properties in (mis)information make it more prone to viral spread? And subsequently, how can we leverage these properties to aid the spread of helpful information and curb the spread of misinformation?

C-y22 commented 5 months ago

One intriguing aspect of this research is the exploration of whether certain populations are more susceptible to misinformation and echo-chamber effects, as well as the investigation into potential asymmetries in information diffusion patterns between right-leaning and left-leaning actors. This prompts further reflection on the implications of these findings for understanding the impact of social media on political discourse and the formation of public opinion. I wonder how the identified structural properties of diffusion trees on social media platforms, as discussed in the paper, contribute to our understanding of the spread of misinformation and political content.

yunfeiavawang commented 5 months ago

This paper is super impressive! As illustrated in the paper, the retransmission of misinformation relies heavily on virality rather than broadcasting. I am curious about the identity of the accounts that reshare misinformation posts. Intuitively, this group of online users will have specific demographic features, or they may be social bots deployed strategically by political or commercial entities. I was wondering whether there is a possibility of further investigating the properties of these resharers based on the current study.

Dededon commented 5 months ago

Hi Professor Sandra, thank you for the presentation! I'm interested in how we can get access to Facebook API resources, and how to design a good research question based on an API-provided dataset.

xiaowei-v commented 5 months ago

The finding about the pattern of viral diffusion of misinformation is interesting. I wonder what the fundamental drive is for people to pass on this type of information in this way, and what the data can tell us about why people behave in certain patterns (especially patterns that are similar across multiple platforms).

ksheng-UChicago commented 5 months ago

Thank you, Professor González-Bailón, for sharing your work and research. I think the demonstration that misinformation follows viral spread is understandable. I think the platforms can either eliminate the source of misinformation, eliminate the viral spread during the process, or educate the audience (label the misinformation). Amongst the three possible ways, which one do you think would be the most ethical and effective way?

yuzhouw313 commented 5 months ago

Hello Professor González-Bailón, thank you for presenting your work! I am interested in hearing more about how you think the characteristics of diffusion trees (compared to other structures, or even the broader social network), specifically their shape and the number of exposures, influence the effectiveness of information spread on social media, and what implications this has for distinguishing between broadcast and viral diffusion mechanisms.

beilrz commented 5 months ago

Hello Professor González-Bailón,

Thanks for presenting your research. I think the findings are very interesting, and I was wondering whether it would be possible to identify misinformation from the network structure alone. Is the difference between the diffusion patterns of misinformation and factual information significant enough that we would not have to rely on the content of a message to flag it as possible misinformation?
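As a purely hypothetical illustration of what a structure-only classifier could look like (synthetic toy data with made-up feature distributions, scikit-learn assumed; not the paper's method and not evidence that this would work in practice):

```python
# Hypothetical sketch: could cascade structure alone separate misinformation?
# The three features (depth, max breadth, hours until half the reshares) and
# their distributions are invented; real labels would come from fact-checkers.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200
# Toy data echoing the paper's qualitative finding: misinformation cascades
# tend to be deeper, narrower, and slower than other reshared content.
depth         = np.r_[rng.poisson(3, n), rng.poisson(8, n)]
breadth       = np.r_[rng.poisson(50, n), rng.poisson(10, n)]
hours_to_half = np.r_[rng.exponential(2, n), rng.exponential(10, n)]
X = np.c_[depth, breadth, hours_to_half]
y = np.r_[np.zeros(n), np.ones(n)]  # 0 = other content, 1 = misinformation

clf = LogisticRegression(max_iter=1000)
print(cross_val_score(clf, X, y, cv=5).mean())  # structure-only accuracy on toy data
```

Whether these structural signals are separable enough in real cascades, without looking at message content, is exactly the open question.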

AnniiiinnA commented 5 months ago

Hello Professor González-Bailón, thank you for sharing this amazing work! I found the different models of exposure presented in the article particularly interesting, especially the conclusion that misinformation spreads in a viral rather than broadcast mode. I didn't see an analysis of the content of the pages in the article, and I'm wondering whether the posts were categorized according to the form of their content (e.g., with or without emoji, purely images or a combination of graphics and text, including meme images or not, etc.) when this study was conducted. Does the type of post also affect the mode of spread as well as the audience?

HongzhangXie commented 5 months ago

Thank you very much for sharing this interesting study. The research finds that the spread of misinformation primarily originates from a minority of users. So, for those users who seldom spread misinformation, is it because they have fewer opportunities to come across this information, or is it because they are able to identify the errors in the information and stop its propagation?

hchen0628 commented 5 months ago

Thank you very much for the exciting and enlightening presentation. This article comprehensively discusses the spread patterns and inherent logic of false information, thoroughly describing the characteristics and behaviors of the users involved in this process. I've noticed that some other studies have focused on the role of bots in propaganda and in spreading false information. How do you think these bots relate to the findings of this research?

jiayan-li commented 5 months ago

Thank you for sharing! The study's findings suggest that misinformation tends to spread differently than other types of content on social media platforms. Specifically, misinformation appears to rely less on broad broadcasting and more on peer-to-peer diffusion through long and narrow paths. How do the findings impact strategies for mitigating the spread of misinformation on social media platforms?

secorey commented 5 months ago

Hi Dr. González-Bailón, thanks for coming to present your research. I was interested in the finding that a small number of older, more conservative users drive the majority of the spread of misinformation on Facebook. Could you break down how this works? Do these users have large networks of friends or followers?

yunshu3112 commented 5 months ago

Hi Sandra, thank you for introducing this impressive work. Regarding the diffusion of misinformation in the older population, do you think there might be a reciprocal effect in the younger population (e.g., teenagers)? I am very curious about how social media affect the lives of secondary or even primary school students, since they have access to the Internet at a startlingly young age.

zhian21 commented 5 months ago

Thanks for sharing this interesting paper. The study explores the dynamics of information diffusion on Facebook, with a particular focus on the 2020 U.S. Presidential election, revealing two critical insights: (1) the significant role of Facebook Pages in broadcasting information, which underscores the influence of organized entities in shaping the social media information landscape, and (2) the finding that misinformation is primarily propagated through long, narrow chains of resharing by a small minority of users, highlighting the challenges in controlling false information once it gains momentum. Given these insights, how can social media platforms like Facebook effectively curb the spread of misinformation propagated by influential Pages and a minority of users, without stifling free speech or limiting the reach of legitimate information?

alejandrosarria0296 commented 5 months ago

Very interesting work! You mention at the end of the paper a transition away from user-curated social media feeds (like Facebook and Twitter circa 2020) in favor of algorithmically curated feeds (e.g., TikTok and Reels). Do you think that this transition also implies a shift in the study of misinformation spread that de-centers users and focuses on how misinformation may exploit algorithms? How would we study individual sensibilities to misinformation in such a context? Thanks again for sharing your work! I'm excited to hear your talk.

fabrice401 commented 5 months ago

Thank you for sharing this interesting work! After reviewing the paper, I have two questions:

  1. Given the study's findings that misinformation spreads more through peer-to-peer diffusion rather than broadcasting, how can social media platforms effectively design interventions to curb the spread of misinformation without infringing on user privacy and freedom of expression? This question considers the delicate balance between controlling misinformation and maintaining an open, democratic platform for discourse.
  2. Considering the role of older and more conservative users in the spread of misinformation, as indicated in the study, what targeted educational or informational strategies could be implemented to specifically address and mitigate the susceptibility to misinformation among these demographic groups? This question explores potential methods for increasing digital literacy and critical thinking skills among populations more likely to spread misinformation.

volt-1 commented 5 months ago

Thank you for sharing. Based on the finding that Facebook primarily serves as a broadcasting medium rather than relying on viral spread, how do the recommendation algorithms of Facebook and other social media platforms balance user interests with information diversity? Particularly in the context of politically sensitive content, does the algorithm tend to reinforce users' existing political views, thereby promoting a unidirectional flow of information instead of providing diverse perspectives?

QIXIN-ACT commented 5 months ago

In the current digital age, there's a significant emphasis on addressing misinformation on social media platforms. However, these platforms are also crucial for disseminating information that may be of an urgent nature. This raises the question: how can social media balance the need to prevent misinformation while ensuring timely access to vital information? Moreover, it's worth considering whether social media companies will take active steps to combat misinformation. There's a perspective that misinformation could inadvertently benefit these platforms because of its ability to engage users and increase popularity.

natashacarpcast commented 5 months ago

Thank you for the exciting research!

Do you have any hypotheses regarding the mechanisms underlying the involvement of older/conservative individuals in spreading misinformation? I've been thinking about it and have some thoughts...

In the era before the internet, newspapers and television served as primary sources of information on current events. Given that these traditional mediums are typically more regulated and structured, it's reasonable that individuals would place greater trust in the information they provide. Considering that older individuals spent much of their lives relying on these mediums, is it possible that they are simply accustomed to believing what they read or watch? In contrast, younger generations, who have grown up in the age of the internet, might be more aware of the prevalence of misinformation due to their familiarity with online platforms.

Does this hypothesis sound reasonable to you? Do you have any other ideas?

Thank you!

fvescia commented 5 months ago

Thank you for sharing your work! Could you talk a bit about how you approached deciding what to put in the body of the paper vs. what to put in the supplemental materials? Were your decisions informed by where/how you hope to ultimately publish the work? If so, how?

anzhichen1999 commented 5 months ago

Thanks for sharing! After reading the paper, I want to ask how the implementation of sequential sampling and the adjustment of sampling probabilities, as described in the paper, impact the representativeness and bias of the final sample, particularly in relation to user engagement levels, geographic distribution, and race/ethnicity composition.
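For intuition on why adjusting sampling probabilities matters for representativeness, here is a generic sketch (simulated data and a hypothetical design in which high-engagement users are oversampled; this is not the paper's actual sampling procedure) showing inverse-probability weighting correcting the resulting bias:

```python
# Generic illustration (not the paper's procedure): when units are sampled with
# unequal inclusion probabilities, weighting each sampled unit by 1/p recovers
# an approximately unbiased estimate of the population mean.
import numpy as np

rng = np.random.default_rng(1)
population = rng.normal(50, 15, 100_000)  # some user-level quantity of interest
# Hypothetical design: users with larger values are more likely to be sampled.
incl_prob = np.clip(population / population.max(), 0.01, 1.0)
sampled = rng.random(population.size) < incl_prob

naive = population[sampled].mean()            # biased toward oversampled users
weights = 1.0 / incl_prob[sampled]            # inverse-probability weights
weighted = np.average(population[sampled], weights=weights)

print(f"true mean {population.mean():.2f}, naive {naive:.2f}, weighted {weighted:.2f}")
```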

Hai1218 commented 5 months ago

Hi Prof. González-Bailón - my question is a bit outside of your paper. I am particularly interested in your experience collaborating with researchers from industry. Given the often differing goals between academia, which focuses on theoretical advancements and knowledge dissemination, and industry, which prioritizes product development and profitability, how have you navigated the challenges of aligning these objectives in your collaborations? Specifically, when dealing with proprietary algorithms that are central to a partnership but might limit open academic inquiry, how have you managed to balance the industry's need for confidentiality with the academic mandate for publication and sharing of research findings? Could you share strategies or examples where you successfully addressed these conflicting priorities?

lguo7 commented 5 months ago

Thank you for sharing your exciting work! The study highlights a significant reliance on Third-Party Fact Checkers (3PFCs) for labeling misinformation. What are the limitations of this approach in the context of rapidly evolving information landscapes, and how could the process be improved to better detect and mitigate misinformation, especially when it goes unlabelled or is politically nuanced?

PaulaTepkham commented 5 months ago

Thank you so much for the informative paper. I am intrigued by the data and analysis. Given that the main motivating question is 'what type of spreading patterns (broadcast vs. viral) were more likely to be seen on Facebook during the 2020 election – and which surfaces and types of content generated different types of diffusion behavior?', are there any differences between critical, specific moments and normal times? Even during typical periods, some content suddenly goes viral, and I think it would be really interesting for content creators to know how that virality works.

yuy123337 commented 5 months ago

Hi Professor González-Bailón! I am wondering how misinformation is identified and categorized within the research framework, and what defines an authoritative source, given the diverse nature of information, ranging from subjective to objective and spanning various ideological perspectives. Additionally, is there a discernible pattern in the spread of misinformation on social media platforms other than Facebook?

Caojie2001 commented 5 months ago

Thank you for sharing your interesting research. I wonder whether the means of information dissemination vary across segments of the population. The paper mentions variation between different age groups; in your research, did you observe similar variation among other population segments? Also, for students like us who may not be able to access the official APIs of online social platforms, do you have any alternative suggestions for related research?

zihua-uc commented 5 months ago

Thanks for the interesting paper!

I was wondering how the data on shares were recorded. Does a "share" count only when we click the "share" button on a post? What about screenshots of a page, and how different groups of people might use screenshots to share? I am asking because I sometimes share screenshots instead of links, since my friends can view what I'm sharing instantly rather than having to click through.

yuhanwang7 commented 5 months ago

Thank you for sharing this insightful research. It's fascinating to see the observed pattern indicating that misinformation tends to spread virally, which contrasts with the broadcasting nature of Facebook. This raises the question: why does misinformation spread differently and cause significant issues? What mechanisms underlie its propagation, allowing it to be driven by a relatively small subset of individuals while still exerting a substantial influence? Additionally, I wonder whether the dynamics of misinformation on Instagram, which relies less on text and more on visual content compared to Facebook, would be similar.

Jessieliao2001 commented 5 months ago

Thank you for your wonderful research! I have a question: how do the virality and exposure dynamics of information on social media platforms like Facebook during significant political events, such as the US 2020 Presidential election, influence macroeconomic stability through changes in consumer confidence and investment behavior?

Weiranz926 commented 5 months ago

Thank you for sharing! I am wondering how the characteristics of the users who predominantly share misinformation (specifically older and more conservative users, as mentioned) influence strategies for combating misinformation on social media platforms.

Zhuojun1 commented 5 months ago

Thanks for sharing! I wondered: after analyzing the diffusion of information on Facebook during the 2020 U.S. Presidential election, how can we evaluate the effectiveness of Meta’s Third Party Fact Checkers in reducing the spread and impact of misinformation?

JerryCG commented 5 months ago

Dear Sandra,

I am looking forward to your exciting talk! I wonder what policy implications we can draw from the diffusion and broadcasting patterns found here. How can we most effectively, efficiently, and accurately spread important news about a government policy, for example?

Best, Jerry Cheng (chengguo)

zhuoqingli526 commented 5 months ago

Thanks for sharing! Your research is quite enlightening. I noticed that the posts analyzed in your study include various forms of content such as text, URLs, links, images, and videos. I'm curious whether the form of content impacts whether it's more likely to be spread through broadcast-style or viral diffusion. For instance, do short videos, with their visual appeal and ease of consumption, tend to favor viral spread? Additionally, I'm interested in whether your research considered the role or issue of bot forwarding or automated accounts in the dissemination of information.

lbitsiko commented 5 months ago

I was wondering whether you could give us some insights on cross-platform differences, particularly considering platform-specific mechanisms for the dissemination of content (e.g., the content affinities you mention).

HamsterradYC commented 5 months ago

Thanks for sharing! Your study highlights a limitation regarding the complex interplay among source following distributions (or network sizes), posting activity, algorithmic ranking, and policy interventions, which your current dataset does not allow you to parse in detail; even so, you have managed to measure the resulting diffusion patterns with broader data than those used in previous research. Considering that many studies of social media are constrained by these conditions, I'm curious about your perspective on the directions or methodologies we might pursue to dissect these intricate interactions more finely and overcome these limitations. How can future research better parse and understand these dynamics, perhaps through novel data collection methods or analytical approaches?

isaduan commented 5 months ago

Thanks for sharing your research with us! I wonder to what extent you think the findings are mediated by the platform's own strategies and technical stacks in combating misinformation? How do researchers navigate this mediation factor when conducting similar research?

zimoma0819 commented 5 months ago

Thank you for sharing this research! Nowadays, with the rising popularity of short-video platforms like TikTok, short videos have the advantage of conveying information in a very brief period: people can watch dozens of videos within 10 minutes, and a single video from a well-known influencer can accumulate millions of views and likes in a short time. This shows how rapidly information spreads through short videos. I'm curious: if misinformation spreads in the form of short videos, would that speed lead to even more widespread dissemination of such misinformation?

MaoYingrong commented 5 months ago

Thank you for sharing this research! I'm wondering, if the results show that Facebook is predominantly a broadcasting (rather than viral) medium of exposure, whether this to some extent argues against the growing concern about information cocoons.

erikaz1 commented 5 months ago

Thank you for sharing your findings from this large-scale study. I found the last paragraph particularly interesting. It briefly mentions Facebook's plans to shift its recommendation algorithm beyond a social-network model toward a "content affinity" one, like TikTok's. Would this model be more compatible with Facebook's broadcasting goals, or would it encourage more "viral"/deep spread of news? What were Facebook's intentions when announcing such a shift, and what did they envision would be gained (or lost) in the process (what does the future of social networking look like, and what is driving these recommendation-system shifts)?

wenyizhaomacss commented 5 months ago

Thank you for sharing your work. Could you discuss any methodological challenges you faced in distinguishing between different types of content spread (e.g., genuine versus misinformation) and how you addressed potential biases in data collection or analysis? Furthermore, although the limitation has been discussed, are there currently any viable strategies to mitigate the influence of unique algorithmic content distribution mechanisms across platforms on the identified patterns of spread?

grawayt commented 5 months ago

In your paper, you classify political ideology along a one-dimensional scale (liberal to conservative). Would it also be possible to classify ideology along multiple dimensions, for example socially liberal to socially conservative and economically liberal to economically conservative?

ecg1331 commented 5 months ago

Thank you for sharing your research!

I thought it was interesting that you mentioned that you did “not have access to individual-level data and we only offer in-depth analyses for the posts that were shared (both privately and publicly) at least k = 100 times by U.S. users” (7) in order to protect user privacy.

I am curious if you thought about keeping individual profiles in your study for even more insights into the diffusion of information (like possibly identifying key players in certain spreading patterns) or if the ethical implications of a possible privacy violation were too great to even consider it.

nourabdelbaki commented 5 months ago

Thank you for sharing your research! It is very insightful. The paper identifies misinformation as relying more on "viral" spread through narrow, deep cascades. Does this suggest that misinformation is inherently more engaging or attention-grabbing than other types of content? Does it raise concerns about the platform's potential role in amplifying misinformation? Moreover, I wonder if we can extend the study to possibly analyze how specific design choices and algorithms influence the spread of different types of content, including misinformation?

saniazeb8 commented 5 months ago

Hi,

Thank you for sharing your work. It’s such an interesting and valuable contribution in today’s world, where political ideology and volatility shape major economic outcomes. However, I am intrigued to know more about how you measured and modelled these dynamics. Looking forward to the talk.

essicaJ commented 5 months ago

Hello, professor. I really enjoyed reading your work. You mentioned that a very small minority of older and more conservative users power the spread of misinformation. Could you elaborate on how the behaviors of these users differ from those of the broader population on Facebook in terms of content creation and engagement? Thanks!