Being Able to Point at Some Issue and Say "I'm Helping!"

Facebook has started targeting users [1][2] who have engaged with "extremist content" on the platform, attaching a sticky warning to the top of the feed whenever Facebook is opened. One of two warnings is shown, depending on the user:

  • "You may have been exposed to harmful extremist content recently. Violent groups try to manipulate your anger and disappointment. You can take action now to protect yourself and others. Get support from experts."
  • "Are you concerned that someone you know is becoming an extremist? We care about preventing extremism on Facebook. Others in your situation have received confidential support. Hear stories and get advice from people who escaped violent extremist groups."

... I think this is bad and dangerous. It seems more like manipulating users with misleading soundbites than like "helping extremists," and I think it may in fact have the opposite effect.

Always Has Been.jpeg

This is mistake theory looking at... not even necessarily conflict theory, but self-necessitating bureaucracy in action. You see this as Facebook failing, but from a 'meshing of interests' perspective, all it means is that Facebook needs to act further to counter extremism, because it is identifying extremists who are radicalizing. Whether they are radicalizing in response to Facebook is irrelevant to whether Facebook will be pressured - or will seek - to take further interventions against the problem. The American bias toward 'don't just stand there, do something' and the impulse to never let a crisis go to waste combine to validate further political initiatives from Facebook, initiatives that keep it in good standing with a ruling-party power base insisting it must do more.

I believe the technical term is 'self-licking ice cream cone.'

If you look at this from a stated-goals metric of 'is this countering extremism,' yeah, it's stupid and probably counterproductive. People are allergic to unsubtle propaganda that pretends to be subtle, and this comes across pretty badly to anyone who doesn't already see Facebook as an authority figure.

But if you look at this from an annual-review-seeker perspective of 'what measurable metric can I point to in order to say I was helping?', this is fine. Facebook employee/boss initiatives don't get graded by how effective they are at what they claim to do; they get evaluated by being able to point at some issue and say 'I'm helping!' It's the citation, not the effect, that matters, and even when effects do matter, it's the effects cared about, not the effects claimed.

If you look at this from a 'does this work as an appeal to the political elites with power over Facebook's fortunes' standpoint, it's anything but an obvious loser. A major social media company repeats ruling-party rhetoric that validates dominant political pieties and vilifies the relatively impotent opposition? Supports narratives that various political opponents are exceptionally dangerous and unwell, and should be marginalized? No government turns against friendly corporations that toe the ruling party's line, and the Biden Administration has certainly been willing to go to bat for American digital media companies against, say, European attempts to levy higher taxes on American digital firms. So did Trump, mind you, but Trump's not in office with power right now, and so his social media access doesn't need to be protected.

If Mark Zuckerberg wants to ingratiate himself with the ruling power circles of the United States, and does so by making the ruling party's political opponents sound crazy, paranoid, and implicitly evil, what's the actual crime? What even makes it exceptional, when the likes of the New York Times have been doing the same for decades?