Why I’m Deleting Facebook Tomorrow
Tomorrow, I will delete my Facebook account. Before I do, I want to share why I made this decision and why I believe it matters.
I quit Facebook in 2020 because I was deeply concerned that Mark Zuckerberg and Meta’s leadership weren’t serious about addressing the harm created on their platform. Over time, it became clear to me that Facebook’s priorities were growth and power, not the safety or well-being of its users.
I didn’t plan to become a public critic when I left the company. Leadership dismissed my concerns, and I second-guessed myself until January 6, 2021. Watching the storming of the Capitol, I knew Facebook had contributed to spreading “Stop the Steal” propaganda and enabling this attempted coup.
On January 11, a Meta leader deflected responsibility, claiming: “I think these events were largely organized on platforms that don’t have our abilities to stop hate, don’t have our standards, and don’t have our transparency.” That statement was a turning point for me. It was disingenuous. Facebook didn’t clearly understand how its platform contributes to hate, division, and misinformation. Worse, this was just another example of its leadership dodging accountability.
Later, it became clear that I was right. Since then, Facebook's role in enabling January 6 has been well documented publicly and acknowledged internally.
A Pattern of Neglect and Harm
The latest news that Meta is ending its fact-checking program and scaling back user protections is the final straw for me. This is the same company whose platform contributed to genocide in Myanmar, harmed teenage users, and enabled political and racial polarization across the globe. Instead of addressing these harms, Meta has consistently prioritized its growth over public safety.
One moment that stands out is the summer of 2020. As political and racial tensions in the U.S. surged, I saw those same divisions mirrored in the data from CrowdTangle, a tool Facebook had purchased to help media companies analyze the performance of their content. CrowdTangle became an invaluable resource for researchers and journalists to identify and understand harmful content on Facebook.
Rather than use CrowdTangle to improve transparency and mitigate these issues, Facebook leadership sought to discredit the tool when it revealed inconvenient truths. They rejected strong internal proposals for greater transparency and doubled down on growth at all costs. By August 2024, Meta had shut down CrowdTangle entirely, replacing it with the Meta Content Library—a platform with limited functionality and accessibility, restricted to academic researchers and nonprofits, leaving journalists and watchdogs in the dark.
This move was part of a broader pattern: when transparency threatens profits, Meta chooses secrecy.
A Platform of Division and Death
Meta’s latest actions reinforce its transformation into the world’s largest platform for division and harm. Early in my career there, I believed in the original mission, “to give people the power to share and make the world more open and connected,” and was even more excited about the 2017 revision, “to give people the power to build community and bring the world closer together.” But my experience taught me that the company’s focus has always been on power—the power to grow, dominate, and influence—while the promise of “bringing the world closer together” remains little more than a marketing slogan.
Today, Meta’s tools are indeed building communities—but they are communities of distrust, hate, and division.
While legislation could rein in platforms like Facebook, I don’t expect meaningful action in the U.S. anytime soon. Meta is already ingratiating itself with policymakers, including the incoming president, while bending to political pressures to weaken safeguards against misinformation.
For now, I am encouraged by international efforts, such as the EU’s Digital Services Act (DSA) and Digital Markets Act (DMA), which hold tech companies to higher standards of transparency and accountability. These regulations represent a critical step in ensuring digital platforms serve the public good.
The Power of Choosing to Leave
The question we all face is: do we support Meta, or do we walk away? Do you choose to work there? Do you keep your Facebook, Instagram, and Threads accounts? While one person leaving might seem insignificant, collective action can have a real impact.
For me, the choice is clear. I no longer want to support a company whose values and actions are deeply misaligned with mine. If you feel the same, I encourage you to consider your role in enabling Meta’s power.
You can find me on Signal and Bluesky: btb.bsky.social.