More concerning news on social media and how it influences us
I’m very concerned about social media, where it is taking us, and how hard it is to extract ourselves from it, even if we want to. Worse, many of the platforms seek to get their users hooked, and the data suggest they are downright harmful in a number of ways. I’m starting to think about ways to reduce how much I rely on these platforms.
Today, I want to look briefly at some aspects of Facebook, TikTok, and YouTube, and then make a few remarks.
The latest on Facebook
I particularly wanted to highlight a recent series from the Wall Street Journal called The Facebook Files, which looked in depth at some of the harm Facebook has done, often knowingly. Here’s a brief summary from the WSJ:
Facebook Inc. knows, in acute detail, that its platforms are riddled with flaws that cause harm, often in ways only the company fully understands. That is the central finding of a Wall Street Journal series, based on a review of internal Facebook documents, including research reports, online employee discussions and drafts of presentations to senior management. Time and again, the documents show, Facebook’s researchers have identified the platform’s ill effects. Time and again, despite congressional hearings, its own pledges and numerous media exposés, the company didn’t fix them. The documents offer perhaps the clearest picture thus far of how broadly Facebook’s problems are known inside the company, up to the chief executive himself.
The articles made these key points, among others:
- Instagram (owned by Facebook) is particularly harmful for teen girls, with internal studies showing Instagram is actively harmful to many users. It increases depression, suicidal thoughts, and tendencies toward eating disorders. But Facebook has minimized the attention given to these risks.
- Facebook made changes to its algorithms to promote “meaningful social interactions” between people and boost engagement, but these changes “made angry voices louder”. The WSJ writes the changes were “making Facebook, and those who used it, angrier”, yet Facebook has resisted fixing the problem because doing so would hurt user engagement. (Of course, Facebook also previously, and famously, experimented with adjusting people’s news feeds to manipulate their emotional states.) The larger picture here is that Facebook is manipulating us, our emotional states, and the types of interactions we have with other people, in ways we don’t understand and which may promote conflict and division between groups.
- Facebook says its rules apply equally to everyone, but it has a special system for high-profile users which exempts them from certain checks and allows them to promote content it would otherwise prohibit.
Of course, there has also been a lot of news recently on Facebook’s “censorship” of certain ideas and voices, but I won’t get into the details of that today. While at some level Facebook (and Twitter, etc.) can run their businesses how they like, the concern is that these social media giants control so many of the means of communication that they can easily exclude certain voices or ideas from public discourse, which in the long run could mark the end of democracy. But that’s a story for another day. Nor will I get into privacy concerns today.
Ultimately, though, Facebook seeks to earn money, not to benefit its users. Its algorithms seek engagement by any means necessary, except when public outcry becomes enough to drive it to change its policies (and even then, it may not enforce those policies evenly). Why, then, should we expect it to be a healthy place for us? It’s designed to suck us in and keep us scrolling that infinite feed, offering us new things which will infuriate or delight us.
Another recent story touched on TikTok
Another recent story dealt with TikTok, looking at “How TikTok serves up sex and drug videos to minors”. Essentially, a simple series of experiments showed how teens who indicate an interest in particular topics – even if they don’t click on any of the content – can quickly get sucked into more and more extreme content along those same lines, with it coming to dominate their feeds. The article says:
An earlier video investigation by the Journal found that TikTok only needs one important piece of information to figure out what a user wants: the amount of time you linger over a piece of content. Every second you hesitate or re-watch, the app tracks you.
Through that one powerful signal, TikTok can learn your most hidden interests and emotions, and drive users of any age deep into rabbit holes of content—in which feeds are heavily dominated by videos about a specific topic or theme. It’s an experience that other social-media companies like YouTube have struggled to stop.
(emphasis mine)
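To make the dynamic described in that quote a little more concrete, here is a minimal, purely hypothetical sketch of how a feed that weights content by a single signal (watch time) can spiral into a “rabbit hole”. The topic names, numbers, and functions are my own invention for illustration; this is not TikTok’s code or its actual algorithm.

```python
# Toy illustration (not TikTok's actual system) of how a single engagement
# signal, watch time, can come to dominate a feed. All names and numbers
# here are hypothetical.
import random
from collections import defaultdict

TOPICS = ["cooking", "sports", "extreme_content", "music"]

watch_time = defaultdict(float)  # seconds the user has lingered on each topic

def record_view(topic: str, seconds: float) -> None:
    """Log how long the user lingered on a video about this topic."""
    watch_time[topic] += seconds

def next_feed(n: int = 10) -> list[str]:
    """Pick the next n videos, weighting topics by accumulated watch time."""
    # Small baseline weight so unseen topics still appear occasionally.
    weights = [watch_time[t] + 1.0 for t in TOPICS]
    return random.choices(TOPICS, weights=weights, k=n)

# Simulate a user who merely lingers a bit longer on one kind of video.
for _ in range(50):
    for video in next_feed():
        # Lingering 8 seconds instead of 2 is enough; no clicks or likes needed.
        record_view(video, 8.0 if video == "extreme_content" else 2.0)

print(next_feed(20))  # after a while, one topic dominates the feed
```

Even in this toy model, the feedback loop is the point: the user never clicks or likes anything, yet simply lingering a little longer on one kind of video is enough for it to crowd out everything else.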
Accounts which indicate an interest in a particular topic can become dominated by that topic, having effects like making it seem like drug use is normal and so on. This is the social media equivalent of the Bible’s “bad company corrupts good character” (1 Cor. 15:33); presumably it takes far more effort to resist doing wrong if we believe we’re surrounded by people who are engaged in that activity than if we aren’t, so being immersed in a steady diet of that particular sin presents an unusually harmful environment.
Because of the sheer volume of content posted to TikTok, there is no way the platform can police it as fast as it is posted, so in many cases the videos served up are so extreme that they are already banned under TikTok’s own rules but simply haven’t been taken down yet.
While this is bad enough for adult users, TikTok does the same even for 13- to 18-year-olds, meaning that even early teens are subject to these influences. TikTok also serves videos labeled “adults only” to minors in this age bracket.
My own experience with YouTube
I want to relate a brief anecdote from my own experience with YouTube. I use it only rarely; a friend used to send me chess videos, and I occasionally use it to stream worship songs or look up home repair tips. During the Olympics, I watched some replays of key men’s and women’s track events (I’m a runner). Recently, on several occasions when I opened the YouTube app to pull up a worship video, it immediately began auto-playing other women’s track and field events – especially those featuring particularly scantily clad women whose attire verged on a “wardrobe malfunction”. I quickly exited, but it was obvious what was going on: YouTube knows I’m a male who has (apparently) shown an interest in women’s (and men’s) track and field events, so it wants to show me what it thinks will best engage “typical” men watching these events: athletic women showing a bit more than they had intended.
At some level, I was horrified – I’m actively trying to avoid this content (Matthew 5:27-30) yet it’s thrust right in front of me by these algorithms. Yes, I understand exactly how it can happen. YouTube wants to make money through advertising and user engagement, and it’ll make the most money if it can offer me exactly what I want to see and get me hooked on it. And statistically, probably most guys with my other interests would want to watch those videos. But it alarms me, in part because if it’s this easy for me to end up watching content I’m trying to avoid when I’m trying to avoid it, how hard will it be for people who aren’t trying as hard? What about college students who are trying to steer clear of lust, or my sons as they begin to come of age? Here’s this engine trying to find ways to engage us, even if that means appealing to our worst desires.
12 Ways Your Phone Is Changing You
Recently, I read 12 Ways Your Phone Is Changing You by Tony Reinke (with a foreword by John Piper), which I very much enjoyed; I may write a review at a later date (but here’s a fairly good summary/review). Some aspects of this book stuck with me and relate to the present discussion:
- We password protect our phones in part because they are a window into our deepest desires and longings. If someone gets our phone and gets onto our social media, they can see our secret interests. If that’s scary, it may mean we need to re-evaluate our media use (and our interests) before God.
- Social media often provokes us to be harsh with one another. Or we get involved in the wrong way – intervening when we shouldn’t, or staying out of things we ought to get involved with.
- Social media can promote loneliness; the device that can, in principle, connect us to so many people often causes us to neglect those we are closest to, or real-world interactions altogether.
- Our phones feed our desire for immediacy, excitement, recognition and applause.
The book was a sobering reminder to me to be careful to use social media only when it’s the best tool for the job – not to fill time or to try to find connection. Not only are the algorithms behind it designed from an unbiblical worldview and primarily to earn money, but they easily divert us away from what is truly most important to us. That tyranny of “just a little more scrolling” can divert so much time and energy away from real life and from real people who might need us.
Social media raises some concerns, but what should we do?
At some level, social media seems to be here to stay. It’s ubiquitous, even though platforms may come and go. It’s hard to simply say no to it. For example, some businesses and organizations use it as a primary means of staying in touch or connecting people. It’s hard to beat the convenience of being able to post a single message or tweet and have everyone who cares be able to find it and re-share it, etc.
At the same time, there are legitimate reasons to eschew these platforms. I know a number of people who have deleted their Facebook, Instagram, or other social media apps (or deleted their accounts entirely) because of their concerns. Some have deleted them because of the temptations these platforms present – temptations to lust, discontent, excessive spending, or other sins. Others have deleted them because of the amount of time they were sucking up, or out of privacy concerns or other reasons. I have the greatest admiration for those who carefully and prayerfully decide to make such changes.
The platforms themselves, of course, aren’t the problem – it’s the human heart. Since these platforms and algorithms are designed to identify what we want and to offer it up to us, they can be particularly dangerous – but the technology itself is not the problem. Given that, I think each of us ought to consider carefully how these platforms are affecting us and whether we are using them to glorify God or whether they are leading us into temptation and sin, and decide how to approach them in view of that.
As for myself, I’m planning to try to find more ways to communicate with my community outside of social media. Probably this means I’m going to rely more on Slack and/or Mattermost, as well as possibly Discord, or even texts and e-mail. I’d like to better separate the ways I communicate with people from the platforms which are also offering me advertising and content they select, so I can focus on what God wants me to devote my attention to – not on things pre-selected for me by these platforms.
I will likely try harder to avoid platforms where I am the “product” in some sense (like Facebook and Twitter), except when I really need to use them for some reason, and I want to do more to support people who are trying to get away from these platforms entirely. I don’t want to end up in a position where I’m promoting a platform and technology that I know will lead some into harm.
Christians aren’t called to flee from the world; rather, we are to engage with it and seek to bring glory to God. However, with respect to social media, I don’t think engagement with the world necessarily means we should try to redeem Facebook, TikTok, YouTube, etc., as a whole for our use. The underlying algorithms may simply be designed on fundamentally unbiblical assumptions about what people desire – and that may mean they pose too much potential harm for us to encourage sinners, even redeemed ones, to engage with them en masse. And I’m particularly concerned about young people, because these platforms can have a powerful influence that easily goes unnoticed.
I hope you’ll join with me in wrestling through and discussing these issues. I’d love to hear your thoughts.