Social media matters. The President of the United States uses it regularly to announce his plans and spread his message to his political supporters. Many of us here at Manual use social media daily to communicate with friends and family and to keep up with sports, scheduled events, and news. However, these platforms may be dividing us more than almost any other technology in history.
I recently viewed the Netflix documentary “The Social Dilemma.” It does a wonderful job of presenting problems that social media contributes to, beyond stereotypical claims such as “It’ll rot your brain.” For example, social media is believed to have contributed to the rise in teen suicide rates, especially among girls. The film goes on to discuss multiple reasons why this increase is occurring. For one, human brains are not designed to receive such instantaneous and overwhelming feedback. An image posted on social media can bring the user an influx of comments almost immediately, or, perhaps worse, can go unnoticed. Some children are tying more and more of their self-worth to the comments on their posts, or losing self-esteem when they don’t get enough likes on photos. Social media in its current form is arguably having an adverse effect on the mental health of many American children.
But there is another glaring problem posed by how social media currently works. Most of these platforms, Twitter, Snapchat, and the like, are now run by algorithms. The content you see is determined by code, not by a human being. This may not surprise you, but the implications are massive. For one, these algorithms are designed to get users to spend as much time as possible on the site. They pull from massive sets of data to build a profile of who you are, then serve you the posts that appeal most to your tastes. This is part of why social media can be so addictive. A computer with far more processing power than any human brain, one that knows and remembers just about everything you’ve ever done online, is a difficult opponent to beat, to say the least. The companies behind these platforms want a monopoly on our free time because it benefits them. But one simple fact compounds this problem: the computer does not know what is true in the real world, and it has no way to learn or to tell.
If this does not worry you, consider an example. Depending on where you are in the United States, you will get different Google autofill results for certain searches. If you search “Climate change is” here in Louisville, you get variations of “real,” “happening,” “accelerating,” etc. In other places, this same search will yield “Climate change is fake” or “a hoax.” These same biases can be seen on our social media. For example, I recently witnessed a debate between two Manual students about mail-in voting. One argued that the system was necessary and would make the election safer; the other argued that it would cause widespread voter fraud, and that mail-in ballots were being dumped and sold by mailmen across the country. I tried to look up the second student’s claims and do some research on the topic, and I got no relevant results. Now, to be clear, there are some recorded incidents of ballots being tampered with, lost, or dumped, but they appear to be few in number and isolated, not part of some large scheme to sway the election.
My point, and a point illustrated in the aforementioned documentary, is that across the internet, and especially on social media, we are being shown different, perhaps entirely contradictory news when we search for current events. An example from the movie puts this well: imagine that Wikipedia showed you different facts based on your political affiliation and where you lived. That is what is happening. Social media presents us with opposing views that are both stated to be the truth. How are we meant to have meaningful political discussions, or come to any consensus, when the facts we work from do not agree? In fact, studies have found that false information spreads on these sites about six times faster than the truth. Think about it. The algorithm’s goal is to get you engaged with the site. It has no concept of truth. Therefore, it shows you whatever is most likely to keep you online, regardless of whether or not it is true, and often it is not. There are some people who go to social media for political debate and want to have their ideas challenged. They are not most people. Most people go to social media to find people who agree with them and data that supports their beliefs. They do not want those beliefs challenged, because that would require them to do some real introspection, and maybe admit that they have made mistakes, something that is anathema to many people in America today.
So what can be done about it? How do we make sure that the information presented to us as truth really is the truth, and that social media does not continue to damage our children and divide our country? On the first front, some propose stricter age limits on social media. Social media has proven to be addictive and habit-forming, and restricting access until a later age means that kids would be less impressionable when they join.

On the second front, the answer is possibly more complex still. Everyone is entitled to their opinion, but the line between opinion and lies is thin. To a certain degree, some people will believe what they want to believe, but a good first step might be this: modify the algorithm to reduce the effect of the echo chamber. Many people on social media build a bubble of like-minded people and spend almost all of their time online inside it. Platforms could offer these bubbles information that challenges their claims but, importantly, does not dismiss those claims as baseless. Even when the information people trust is false, very few people will respond well to being told that all their sources are wrong and they’re an idiot. Another helpful step might be to create a better space for public forum on social media: a place where people can debate and discuss freely with others who do not always agree with them. The technology to determine which way a person leans politically clearly already exists, and applying it to a new portion of a website is much easier than building it from scratch. Ultimately, we need to create an online environment that is constructive and communal. People from all walks of life should be able to communicate easily, and the systems in place need to foster compromise and self-confidence, rather than the kind of criticism that does not drive people to examine their own biases and faults.