The Destructive Effect of Social Media Algorithms


Before the age of social media, information was for the most part accessible only to a limited number of people. With the advancement of technology and the growing affordability of mobile phones, social media is now easily accessible to anyone with a mobile phone and an internet connection. In 2020 alone, users spent an average of 145 minutes on social media every day, more than 50% above the 2012 figure (Tankovska, 2021). As a result, sharing and accessing information can be done effortlessly, sometimes without any verification. This essay describes two ways in which social media algorithms damage the process of building a solid knowledge foundation in society today, drawing on statistics and expert opinion. The first is that they distort the distribution of information in a way that forms echo chambers. The second is that they reduce readers' critical thinking ability.

Looking at the most popular social media platforms, such as Instagram, Facebook, and TikTok, all of them have one element in common: they use personalised algorithms that can narrow the range of ideas users encounter and lead to cognitive bias. TikTok has in fact explained the algorithm behind its For You page on its website. The company states that the For You feed is created by "a complex set of weighted signals", including hashtags, video duration, device type, and so on. This constructs a shield that limits our access to information beyond what we have already shown interest in, which can make us an easy target for polarisation. Although social media algorithms can lead to reading more information, that information circulates only within a community, indicating that what we draw from it is actually limited to material relevant to our existing knowledge (TikTok, 2020; Menczer & Hills, 2020).
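The "complex set of weighted signals" idea can be illustrated with a minimal ranking sketch. The signal names and weight values below are purely illustrative assumptions, not TikTok's actual signals or weights; the point is only that a weighted sum over engagement signals decides what a user sees next, which is how personalisation narrows the feed.

```python
# Hypothetical sketch of "weighted signals" feed ranking.
# Signal names and weights are invented for illustration;
# they are not TikTok's real internals.

def score_video(signals: dict, weights: dict) -> float:
    """Combine a video's engagement signals into one relevance score."""
    return sum(weights.get(name, 0.0) * value for name, value in signals.items())

def rank_feed(videos: list, weights: dict) -> list:
    """Order candidate videos by descending weighted score."""
    return sorted(videos, key=lambda v: score_video(v["signals"], weights), reverse=True)

# Assumed example weights: finishing a video counts far more than device type.
weights = {"watched_to_end": 3.0, "liked_hashtag": 2.0, "same_device_type": 0.5}

videos = [
    {"id": "a", "signals": {"watched_to_end": 1.0}},
    {"id": "b", "signals": {"liked_hashtag": 1.0, "same_device_type": 1.0}},
]

feed = rank_feed(videos, weights)  # video "a" scores 3.0, "b" scores 2.5
```

Because the weights reward signals tied to past behaviour, content resembling what a user already engaged with keeps rising to the top, which is the mechanism behind the narrowing effect described above.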

This indeed happened during the Spanish general election in April 2019. A social network analysis of related tweets containing one or more links found that link sharing rose to 8.72 million, with more than 80% of links read within a single community only. As this data shows, link sharing mostly happened within the same community, and information that contradicts a community's views has a minimal chance of being shared (Morales-i-Gras, 2020). In other words, the information we obtain is limited to what is relevant to our existing knowledge, a phenomenon known as an echo chamber.

Within echo chambers, users may also fall for false information. Since the information we read on social media may not have been verified, and we may know nothing about the topic, there is a possibility that we construct a faulty knowledge foundation. During the early stages of the COVID-19 pandemic alone, Facebook reported removing around 7 million posts containing fake preventive measures and exaggerated cures (Reuters, 2020). This indicates that social media carry an alarming amount of false information. On social media, anyone can share anything; with no control or verification, false information can spread instantly. The same report also suggests that Facebook uses algorithms to detect and remove fake information, yet verification takes time. That time gap creates a loophole in which the information spreads and generates more posts to be flagged or removed. During this gap, someone might share the false information with family or community members, and since we tend to believe information given by people we trust, there is a high chance we will come to believe the fake news, which is harmful and destructive to acquiring a knowledge foundation.

Besides creating echo chamber environments, the same algorithms reduce readers' critical thinking ability. Information overload is one of the biggest factors: rapid exposure to new information leaves less time for deep thought and reflection. In his book "The Shallows", Carr (2010) argues that the fast pace of internet communication, including tweets, comments, and short messages, is causing a decline in reflective thought. A study by Trapnell and Sinclair (2013) involving 2,314 students suggested a steady decline in reflective thought from 2007 to 2010. In the same article, they noted that students with high activity rates on social media tended to have shallower goals and thinking.

Reflection matters because it grants our brain time to digest a topic and connect with it more deeply, which is essential for achieving a solid knowledge foundation. With this algorithm design, users themselves must set aside time for reflection. That is possible to some extent, but the algorithm is designed to keep users online and reading, so it is a challenging and difficult task for the majority of people as we keep scrolling. As a result, we read more than we should.

Reading a wealth of information and resources is indeed important to some extent. Nevertheless, information overload can lead to stress. In a Forbes article, Marr (2015) notes that technology evolves and changes within a short span of time, leading to ever more data being processed, while our brain is designed to evolve at a steadier rate. This imbalance forces our brain to work harder to take in more information, resulting in the malfunction we know as stress. Stress makes us less focused, so the information we have read may not be fully absorbed. It also reflects the lack of reflection that underpins critical thinking, since we form deep understanding during deep thought. The result is a distortion in the process of knowledge acquisition.

Social media have a destructive effect on the knowledge acquisition process. Their ability to create a new library of information makes it easier to find what we want to know. However, with advancements in technology, the algorithms have been tweaked in ways that lead to polarisation. Information, including false information, may travel only within a community, forming echo chambers. And with rapid exposure to new information, critical thinking ability is reduced through a lack of reflective thought, creating a less stable foundation of knowledge.

References 

Carr, N. G. (2010). The shallows: What the internet is doing to our brains. W. W. Norton & Company.

Marr, B. (2015, November 25). Why too much data is stressing us out. Forbes. https://www.forbes.com/sites/bernardmarr/2015/11/25/why-too-much-data-is-stressing-us-out/?sh=5bcd79eaf763 

Menczer, F., & Hills, T. (2020, December 1). Information overload helps fake news spread, and social media knows it. Scientific American. https://www.scientificamerican.com/article/information-overload-helps-fake-news-spread-and-social-media-knows-it/

Morales-i-Gras, J. (2020, June 15). Cognitive biases in link sharing behavior and how to get rid of them: Evidence from the 2019 Spanish general election Twitter conversation. Social Media + Society, 6(2), 2056305120928458. https://doi.org/10.1177/2056305120928458

Reuters. (2020, August 12). Facebook removes seven million posts for sharing false information on coronavirus. NBC News. https://www.nbcnews.com/tech/tech-news/facebook-removes-seven-million-posts-sharing-false-information-coronav-rcna77

Tankovska, H. (2021, February 8). Daily time spent on social networking by internet users worldwide from 2012 to 2020. Statista. https://www.statista.com/statistics/433871/daily-social-media-usage-worldwide/

TikTok. (2020, June 19). How TikTok recommends videos #ForYou [Press release]. https://newsroom.tiktok.com/en-us/how-tiktok-recommends-videos-for-you 

Trapnell, P., & Sinclair, L. (2013). Texting frequency and the moral shallowing hypothesis. Department of Psychology, The University of Winnipeg. https://news.uwinnipeg.ca/wp-content/uploads/2013/04/texting-study.pdf
