Bye, Facebook


After fifteen years on the platform, it’s time I said farewell to Facebook.



Note: The following post originally appeared on my Facebook page. It is reproduced here with light modification.

Some days you wake up and just make a decision. Today I’ve decided to permanently step away from Facebook. This may come as a shock to some of you, considering how diligently I’ve used the site over the last decade. But it’s something I’ve contemplated and wrestled with on and off for the last few years. There’s no single catalyst behind this decision, really, apart from a few friends I respect having recently done the same. Rather, it’s the culmination of two different crosscurrents pushing me in a new direction.

The first is that I earnestly view Facebook as a net harm to society rather than a social good. There is a paradox inherent to social networks specifically, and the internet more generally: even as information technology on this grand scale has enabled us to be more open and inclusive and to spread the ideas and values we cherish, it has also led to more closed and siloed communities — to hermetic, algorithmically generated echo chambers that amplify groupthink. As much as we depend on it today, the internet is to a great extent a manifestation of how people use confirmation bias to validate their worldviews. And I see Facebook as the living apotheosis of this phenomenon.

In retrospect, it seems self-evident that Facebook and Twitter haven’t been good for human unity, and we’re recognizing some of their more pernicious effects on connected society. We unleashed these great technological behemoths without putting the proper safeguards in place to prevent exploitation by bad actors, and without having the self-awareness and foresight to ensure that these spaces be used to uplift our society rather than tear it down. Though I’ve felt (and argued) this for some time, I nevertheless rationalized remaining on here despite my many misgivings.

I’ve come to expect the divisiveness and aggressive polarization embedded in American society, but I’ve concluded that I don’t need around-the-clock reminders of these dispiriting signs of decay, to which Facebook gleefully pays homage. By Facebook’s own admission and internal research, its algorithms prioritize overall engagement over social harmony — something that should be obvious to anyone who’s spent more than five minutes on here. Those of us who assumed this problem would improve over time were as naïve as we were mistaken.

The second reason for leaving is that I’d like to extend my activism and passions toward new horizons. While I’ve gained tremendously from the many conversations I’ve had here over the years, it’s time to move on. I’m honestly not sure what else there is to gain from my 157th thread on climate change or biblical inerrancy or why authoritarianism is bad. I think there are other ways to utilize my talents, and other mediums geared toward furthering the causes I care about. I don’t know what these look like yet, but I look forward to figuring it out in the months ahead. I also plan to spend more time on so-called “deep work” — with books and published research as opposed to news and social media.

At the same time, I’ve managed to build up a respectable following and have contributed a decent amount of viral content. I’d rather not see that content vanish into thin air (nor, presumably, would the people who have shared, engaged with, and otherwise benefited from it). So rather than deactivate or delete my account, I’ll leave it up, along with all of my posts, shares, and comments. I just won’t be posting anything new from here on out. If you’d like to say hi, I’ll still be on Messenger for the time being, so if you have questions or just want to chat, reach out to me there. You can also email me if you’d like.

I do want to say thank you to all the wonderfully passionate and brilliant people I’ve met on here — far too many to name. Some of you I’ve had the pleasure of meeting in person, while others I’ve only engaged with virtually. I cherish the many deep and thoughtful conversations we’ve had together; I wouldn’t be who I am today without your knowledge, insight, and wit. I hope you all continue to be a light to those in your lives, as I aim to do the same. I hope that all of us will continue to use our passions to make the world a better place than it is today. As the late anthropologist Margaret Mead wrote, “Never doubt that a small group of thoughtful, committed citizens can change the world. Indeed, it is the only thing that ever has.”

Veni, vidi, vici ✌️

Update 1.24.2021: FiveThirtyEight published an article this week that goes into how the widespread problem of misinformation on Facebook can be traced to its core design. Its algorithms prey on our obsession with drama and controversy to drive engagement and clicks. In short, divisiveness keeps the lights on at Facebook, and its higher-ups have known this for years.

The company thus finds itself in the precarious position of balancing its desire for constant revenue growth — which, according to its own internal data, is fueled by polarizing, sensational content, of which misinformation and science denial form a large part — against any ethical obligations it may have to society. The context of Kaleigh Rogers’ article is anti-vaxx sentiment and false claims around Covid-19 vaccines, but the issue of misaligned incentives is industry-wide and applies equally to the spread of other types of misinformation. Excerpt:

“Our research shows how seamlessly old narratives can be repurposed to fit new contexts,” said Rory Smith, a research manager at First Draft and a co-author of the report. “When demand for information about a topic is high but the supply of credible information is low, you get a data deficit, and that deficit will quickly be filled up with misinformation.”

[…]

“The anti-vaxx movement has done so well on Facebook in part because it is controversial, and controversy helps make Facebook a lot of money. In 2019, 98 percent of Facebook’s revenue was from advertising — $20 billion in all. Facebook’s advertising is so valuable because it can be microtargeted, based on the data Facebook collects on its users. To collect more and better data (and to expose users to more ads), Facebook needs its users to be active and engaged: liking posts, sharing links, joining groups and commenting. One surefire way to keep people engaged is to expose them to content that provokes an emotional response, like a post claiming the vaccine you’re planning to give your toddler will cause him or her to develop autism.”

[…]

“A Wall Street Journal investigation last year uncovered how teams within Facebook tasked with addressing the site’s disinformation crisis cited the platform’s design as the root of the problem. An internal company presentation from 2018 included slides that said Facebook’s algorithms ‘exploit the human brain’s attraction to divisiveness,’ and, if not altered, would surface ‘more and more divisive content in an effort to gain user attention and increase time on the platform.’”

[…]

“Social media preys on the most primal parts of your brain. The algorithm maximizes your attention by hitting you repeatedly with content that triggers your strongest emotions — it aims to provoke, shock and enrage,” Kendall said in his opening statement. “This is not by accident. It’s an algorithmically optimized playbook to maximize user attention — and profits.”

[…]

“Industry researchers believe there are other efforts Facebook could make to reduce the impact of the anti-vaxx movement on the site. Last year, nonprofit research group Ranking Digital Rights released a report on how algorithmically driven advertising structures have exacerbated the disinformation epidemic by increasing its spread, and recommended social media sites look at changing these systems — rather than moderating content — to curb the spread. People will always post nonsense on the internet. The platforms we use don’t need to be designed to lead people to it.”


Feature image credits: AFP/Alastair Pike; Richard Drew/AP Photo