References
[1] Z. Tufekci, "YouTube's Algorithmic Radicalization Problem," Wired, 2018.
[2] S. Zuboff, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. PublicAffairs, 2019.
[3] M. Eslami et al., "User Attitudes Toward Algorithmic Opacity," Proc. ACM Hum.-Comput. Interact., 2019.
[4] E. Pariser, The Filter Bubble: How the New Personalized Web Is Changing What We Read. Penguin Books, 2011.
[5] A. Alter, Irresistible: The Rise of Addictive Technology, 2017.
[6] C. O'Neil, Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. Crown Publishing Group, 2016.
[7] D. Lazer et al., "Social Media and Political Polarization," Science, 2020.
[8] H. Allcott et al., "The Welfare Effects of Social Media," Am. Econ. Rev., 2020.
[9] J. A. Konstan et al., "Recommender Systems: From Algorithms to User Experience," User Model. User-Adap. Inter., 2022.
[10] T. Gillespie, Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media (Chinese ed.). China Machine Press, 2018.
[11] Social Media and Wellbeing Consortium, "Algorithmic amplification effects on mental health: Longitudinal findings from the Social Media and Wellbeing Study," Technical report, 2023. [Online]. Available: https://www.smws.org/report/2023-mental-health
[12] European Commission, Regulation (EU) 2025/217 on transparency-by-design requirements for algorithmic systems, Official Journal of the European Union, 2025. [Online]. Available: https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:32025R0217