Bennett, W. L. (2012). The Personalization of Politics: Political Identity, Social Media, and Changing Patterns of Participation. The ANNALS of the American Academy of Political and Social Science, 644(1), 20–39. https://doi.org/10.1177/0002716212451428
Examines how social media change the way people participate in political activities, and the advantages and disadvantages of social media as a new form of organization and coordination.
Bennett, W. L., & Iyengar, S. (2008). A New Era of Minimal Effects? The Changing Foundations of Political Communication. Journal of Communication, 58(4), 707–731. https://doi.org/10.1111/j.1460-2466.2008.00410.x
Traces the history of political communication and how technological development changes individuals' behavior and the media landscape of political information, focusing especially on fragmentation and selective exposure to political news.
Bennett, W. L., & Segerberg, A. (2012). The Logic of Connective Action: Digital media and the personalization of contentious politics. Information, Communication & Society, 15(5), 739–768. https://doi.org/10.1080/1369118X.2012.670661
Analyzes the difference between collective and connective action, elaborating in detail how connective action is constructed via digital technologies.
Berman, R., & Katona, Z. (2020). Curation Algorithms and Filter Bubbles in Social Networks. Marketing Science, 39(2), 296–316. https://doi.org/10.1287/mksc.2019.1208
Models the effect of curation algorithms on users and on content quality. Discusses the role of algorithms and encourages digital platforms to aim at higher content quality rather than better taste matching.
Bruns, A. (2019). Filter bubble. Internet Policy Review, 8(4). https://doi.org/10.14763/2019.4.1426
Strongly criticizes the concept of the filter bubble with empirical evidence: even hyperpartisan individuals engage in mainstream discussions and retain a full picture of what is happening.
Dubois, E., & Blank, G. (2018). The echo chamber is overstated: The moderating effect of political interest and diverse media. Information, Communication & Society, 21(5), 729–745. https://doi.org/10.1080/1369118X.2018.1428656
Uses empirical evidence to show that the echo chamber is overstated; specifically, individuals with higher political interest and more diverse media diets are less likely to fall into echo chambers.
Fletcher, R., & Nielsen, R. K. (2017). Are News Audiences Increasingly Fragmented? A Cross-National Comparative Analysis of Cross-Platform News Audience Fragmentation and Duplication. Journal of Communication, 67(4), 476–498. https://doi.org/10.1111/jcom.12315
Provides cross-country and cross-media evidence that the echo chamber is not a reality. Fragmentation relates closely to the echo chamber and filter bubble ideas, in which individuals share opinions with like-minded people; duplication refers to the situation where most people browse multiple media outlets and these outlets have diverse audiences.
Freelon, D. G., Kriplean, T., Morgan, J., Bennett, W. L., & Borning, A. (2012). Facilitating Diverse Political Engagement with the Living Voters Guide. Journal of Information Technology & Politics, 9(3), 279–297. https://doi.org/10.1080/19331681.2012.665755
Draws on the Living Voters Guide's data to analyze individuals' deliberative behaviors on the platform and their political implications. Proposes ways to promote positive online behaviors.
Guo, L., Rohde, J. A., & Wu, H. D. (2020). Who is responsible for Twitter's echo chamber problem? Evidence from 2016 U.S. election networks. Information, Communication & Society, 23(2), 234–251. https://doi.org/10.1080/1369118X.2018.1499793
Uses social network analysis to examine online networks around the 2016 U.S. election, focusing on opinion leaders' influence, and clearly differentiates homophily from homogeneity.
Jamieson, K. H. (2008). Echo chamber: Rush Limbaugh and the conservative media establishment. Oxford University Press.
Proposes the concept of the echo chamber and applies the term to analyze the conservative media landscape.
Jones-Jang, S. M., & Chung, M. (2022). Can we blame social media for polarization? Counter-evidence against filter bubble claims during the COVID-19 pandemic. New Media & Society. Advance online publication. https://doi.org/10.1177/14614448221099591
Examines the moderating role of social media during the COVID-19 pandemic, finding that social media use can weaken the link between political identity and affective partisan polarization, as well as the link between misguided beliefs about child vaccines and vaccine hesitancy. These results do not support the filter bubble thesis.
Kanai, A., & McGrane, C. (2021). Feminist filter bubbles: Ambivalence, vigilance and labour. Information, Communication & Society, 24(15), 2307–2322. https://doi.org/10.1080/1369118X.2020.1760916
Conducts semi-structured interviews with women and offers a new perspective on the echo chamber: personal vulnerability should be taken into account when discussing filter bubbles. Asks whether the filter bubble can serve as a safe space and what its limitations are, and provides firsthand accounts of how users feel inside filter bubbles.
Makhortykh, M., & Wijermars, M. (2021). Can Filter Bubbles Protect Information Freedom? Discussions of Algorithmic News Recommenders in Eastern Europe. Digital Journalism, 1–25. https://doi.org/10.1080/21670811.2021.1970601
Brings the concept of filter bubbles to Eastern European countries and examines their situations regarding personalization. Relies on document analysis, which limits the findings.
Möller, J., Trilling, D., Helberger, N., & van Es, B. (2018). Do not blame it on the algorithm: An empirical assessment of multiple recommender systems and their impact on content diversity. Information, Communication & Society, 21(7), 959–977. https://doi.org/10.1080/1369118X.2018.1444076
Defends the function of algorithms in the debate over whether algorithms limit users' exposure. From this study's view, recommender algorithms do not decrease online content diversity.
Pariser, E. (2012). The filter bubble: What the Internet is hiding from you. Penguin Books.
Proposes the idea of the filter bubble, elaborating on its formation and potential risks and suggesting how to address the problem.
Pariser, E. (2019). Restoring Dignity to Technology: How to Design Tools to Set Right What Has Gone Wrong Online. TIME Magazine, 193(3), 34–35.
Further discusses the harms of personalization and holds technology giants accountable for the problems of the present.
Sunstein, C. R. (2001). Republic.com. Princeton University Press.
Covers the role of filtering, the situation of fragmentation, the consumer-versus-citizen discussion, and the downsides of private communities, explaining the mechanisms behind them. Main argument: complete personalization will lead to a range of problems.
Sunstein, C. R. (2005). Group judgments: Statistical means, deliberation, and information markets. New York University Law Review (1950), 80(3), 962-1049.
Discusses how groups aggregate information from their members, the importance of deliberation, information markets, and how to ensure free expression within groups.
Sunstein, C. R. (2006). Infotopia. Oxford University Press, Incorporated.
Discusses how Wikipedia, open-source software, and blogs show new ways to collaborate online and converge diverse perspectives, and how that can help build a better information environment. Also introduces prediction markets; in general, addresses how to gain information from many minds.
Turow, J. (2020). Personalization. In Keywords for Media Studies (Vol. 5, pp. 135–137). New York University Press. https://doi.org/10.18574/nyu/9781479817474.003.0048
Introduces the history of personalization and current discussions of the issue.
Wieland, M., von Nordheim, G., & Kleinen-von Königslöw, K. (2021). One Recommender Fits All? An Exploration of User Satisfaction With Text-Based News Recommender Systems. Media and Communication, 9(4), 208–221. https://doi.org/10.17645/mac.v9i4.4241
Conducts a survey to analyze individuals' satisfaction with text-based news recommender systems that were specially designed for the study. The dependent variable, satisfaction, is measured by how individuals perceive the content selected by each system.
Willson, M. (2014). The politics of social filtering. Convergence: The International Journal of Research into New Media Technologies, 20(2), 218–232. https://doi.org/10.1177/1354856513479761
Explains what social filtering is and how to analyze it from the perspective of social network sites. Proposes a new framework with detailed attributes.
Willson, M. (2017). Algorithms (and the) everyday. Information, Communication & Society, 20(1), 137–150. https://doi.org/10.1080/1369118X.2016.1200645
By framing algorithms within everyday life and analyzing the interactions between algorithms and individuals, this study proposes a new structure for understanding algorithms and identifies future directions for exploring the everyday consequences of these technologies.
Zuiderveen Borgesius, F. J., Trilling, D., Möller, J., Bodó, B., de Vreese, C. H., & Helberger, N. (2016). Should we worry about filter bubbles? Internet Policy Review, 5(1). https://doi.org/10.14763/2016.1.401
Draws clear distinctions between self-selected and pre-selected personalization, summarizes concerns about both types, engages with both Pariser and Sunstein, and concludes that personalization technology is, at present, not advanced enough to cause severe problems.