Research Article
Open Access
CC BY

From Algorithmic Gazing to Cognitive Closure: The Invisible Discipline of the Information Cocoon on Human Cognitive Constructs

Yuyue Xie 1*
1 Macau University of Science and Technology
*Corresponding author: 1220020688@student.must.edu.mo
Published on 19 August 2025

Abstract

Algorithmic technology has now penetrated nearly every aspect of daily life, yet academic assessments of algorithmic threats remain relatively optimistic: there is still a gap in research on exactly how the irreversible impact of the algorithmically constructed information cocoon is exerted on the public, and on where people's perception of algorithms falls short. Through an in-depth analysis of algorithmic technology, the information cocoon, filter bubbles, cognitive construction and algorithmic resistance behaviour, this paper explores the deep logic behind their interplay and examines, starting from the commercial nature of algorithms, how the information cocoon disciplines human cognitive construction. It then explains how algorithms build a personalised filtering system through the joint action of content-based filtering and collaborative filtering, and analyses the three major paradoxes of the “closed-loop” mechanism between algorithms and users. Finally, it discusses the adverse effects of the cocoon effect on the construction of human cognition and how individuals can resist it. From a logical point of view, algorithms cannot achieve invisible regulation without a push from humans themselves.
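
To make the filtering mechanism mentioned above concrete, the sketch below shows in minimal Python how content-based filtering and collaborative filtering can be blended into one personalised ranking. It is an illustrative toy, not the recommender systems analysed in the paper: the item feature vectors, the user feedback data and the blending weight alpha are all assumptions invented for demonstration.

```python
# Illustrative toy only (not the paper's model): blending a content-based
# signal with a collaborative signal into one personalised ranking.
import math

# Item feature vectors, e.g. topic weights inferred from content (invented data).
items = {
    "a": [1.0, 0.0, 0.2],
    "b": [0.9, 0.1, 0.0],
    "c": [0.0, 1.0, 0.3],
    "d": [0.1, 0.9, 0.8],
}

# Sparse user-item feedback such as clicks or likes (invented data).
ratings = {
    "u1": {"a": 1.0, "b": 1.0},
    "u2": {"a": 1.0, "b": 0.8, "d": 0.2},
    "u3": {"c": 1.0, "d": 0.9},
}

def cosine(x, y):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(xi * yi for xi, yi in zip(x, y))
    nx = math.sqrt(sum(xi * xi for xi in x))
    ny = math.sqrt(sum(yi * yi for yi in y))
    return dot / (nx * ny) if nx and ny else 0.0

def content_score(user, item):
    """Content-based signal: similarity of the item to a profile
    averaged from the items the user has already consumed."""
    liked = [items[i] for i in ratings[user]]
    profile = [sum(col) / len(liked) for col in zip(*liked)]
    return cosine(profile, items[item])

def collaborative_score(user, item):
    """Collaborative signal: ratings of the item by other users,
    weighted by user-user similarity over their rating vectors."""
    def vec(u):
        return [ratings[u].get(i, 0.0) for i in items]
    num = den = 0.0
    for other in ratings:
        if other != user and item in ratings[other]:
            sim = cosine(vec(user), vec(other))
            num += sim * ratings[other][item]
            den += abs(sim)
    return num / den if den else 0.0

def recommend(user, alpha=0.5):
    """Rank unseen items by a weighted blend of the two signals."""
    unseen = [i for i in items if i not in ratings[user]]
    blended = {i: alpha * content_score(user, i)
                  + (1 - alpha) * collaborative_score(user, i)
               for i in unseen}
    return sorted(blended.items(), key=lambda kv: kv[1], reverse=True)

print(recommend("u1"))  # ranked (item, score) pairs for items u1 has not seen
```

Even at this scale, the cocoon-forming tendency discussed in the abstract is visible: both signals are driven by what the user has already consumed, so items resembling past behaviour are systematically ranked first.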

Keywords:

Algorithmic Technology, Information Cocoon, Filtering Bubbles, Cognitive Construction, Algorithmic Resistance Behavior

Cite this article

Xie, Y. (2025). From Algorithmic Gazing to Cognitive Closure: The Invisible Discipline of the Information Cocoon on Human Cognitive Constructs. Lecture Notes in Education Psychology and Public Media, 116, 24-31.

Data availability

The datasets used and/or analyzed during the current study will be available from the authors upon reasonable request.

About volume

Volume title: Proceeding of ICIHCS 2025 Symposium: Exploring Community Engagement: Identity, (In)equality, and Cultural Representation

ISBN: 978-1-80590-331-4 (Print) / 978-1-80590-332-1 (Online)
Editors: Enrique Mallen, Nafhesa Ali
Conference date: 29 September 2025
Series: Lecture Notes in Education Psychology and Public Media
Volume number: Vol.116
ISSN: 2753-7048 (Print) / 2753-7056 (Online)