On Principles of Discourse

Wall Street Journal investigation confirms Google operates censorship blacklist

It's not a violation of free speech, as many like to claim whenever this comes up.

Whether or not it's a problem has always been a separate question.

So, how do you feel about the separate question?

I think it's clear Google should not be doing this, but AFAICT there are only two positions here: the principled one (Google should not be tweaking its search results to a political agenda) and the unprincipled one (Google should only be tweaking its search results to my political agenda).

The "free speech" thing is just a semantic argument, assuming one takes the principled view.

First of all, those aren't principled and unprincipled; they're just two different principles.

There's this tendency around here to act as though all politics is cynical and self-interested greed and tribalism, but most people really do have principled reasons for their beliefs - no matter how they came by those beliefs in the first place.

'Everyone should be able to say anything they want on any platform they want and never suffer for it' is a principle, but so is 'We should make people feel included and not tolerate bigotry' or 'We should protect people's privacy and safety' or 'Private citizens can use their property however they want, including businesses moderating their online platforms' or etc etc etc.

These are all principled stances that the people advocating them really believe in.

Now, it's true that 'everyone can say whatever they want anywhere with no consequences' is a simple, easy-to-arbitrate principle, whereas most of the other principles involve gray areas and subjective judgements that people may disagree on. And yes, I've noticed there's a strong desire among many people for simple, black-and-white rules, with zero ambiguity or subjective judgement. We see this in discussions of free speech, we see this in discussions of sexual/romantic norms, we see this in discussions of affirmative action and discrimination, etc.

And certainly there's a lot to be said in favor of using such rules: they're easy to use and easy to verify, they let people know what to expect and plan accordingly, and most of all, yes, people do exploit rules that involve subjective judgements or gray areas towards their own ends.

However, these are all practical concerns about which type of rule is most useful in a given situation, not evidence of a moral difference between types of rules, or of a difference between principled vs unprincipled or consistent vs hypocritical or anything like that.

Simplistic universal rules are often valuable for the benefits I listed, but they're also often impractical or suboptimal. Life is pretty much always too complicated for simple, black-and-white, zero-ambiguity rules with no gray areas, rules that can be stated in a sentence or two and correctly resolve every relevant real-world situation. Sometimes we adopt rules that are sort of like that because they're better than the alternatives, but they're not more virtuous in any a priori way.


Second, you obviously know there are a million different principles that people could and do use to decide what should be censored, and none of them are charitably steelmanned as 'Google should only be tweaking its search results to my political agenda'. Yes, you can find a way to paint a lot of them that way, with clever and uncharitable rhetoric; but let's get real, the 'no moderation at all' stance pretty clearly favors one side of the political spectrum these days and primarily gets made by people on that side, so it's easy to paint everyone with the 'only pushing my political agenda' brush if we're going to be that uncharitable and cynical about it.


Finally, the initial question: what do I think?

I care a lot about the difference between the government doing something and private organizations doing something, in a way that a lot of people here seem not to, as far as I can tell.

My image of the world is that the government is given a monopoly on coercion - they are the only ones allowed to initiate violence and coercive power over someone, and they use laws and violence to prevent anyone else from doing that - and in exchange they are democratically held accountable to the citizenry for how they use that power, to make sure it is not abused. This is not a perfect system, but the alternative is private citizens having free access to violence and coercion with no accountability, and that's a life of daily terror and exploitation for most people in the world. The state-monopoly-on-coercion thing is the best we've come up with for maintaining polite civilization for most people most of the time, and until we get a better system up and running I care very much about preserving and recognizing it.

So people call me stupid or dishonest for acting as if there's a hard-line difference between government censorship and private companies moderating their private platforms, and for wanting 'the right to free speech' to apply to the former and not the latter. But I think it's a crucially important distinction, and we blur or ignore it at our peril.

People here have talked about the dangers in redefining 'rape' to mean an increasingly long list of less-and-less objectionable things, in that it reduces the vigilance and violence towards 'actual' rapists. While I have nuanced feelings about that, I think it recognizes a real problem with blurring your terminology until it is toothless and no longer able to serve its intended function, and I think the people who want to call platform moderation a violation of free speech are dancing down that very dangerous slope.

When stories were going around about the White House censoring or suppressing reports from climate scientists and telling them what words they were allowed to use, I thought 'Ok, we're completely fucked now,' but most of the 'free speech absolutists' were more angry about Alex Jones being deplatformed or Milo being uninvited from a college or w/e.

That was a soft case, because the scientists were government employees, so the government was only censoring its own speech - which is a gray area, but ok, sure, you can decide that's a principle if you want. But I'm pretty terrified that if the White House did start telling news organizations or media platforms that they had to start censoring people talking about climate change, people would still treat that as the same thing as private deplatformings and start saying 'haha libs hoisted by your own petard' or 'your rules applied fairly' or w/e, instead of recognizing it as a categorically different and massively more terrifying thing.

So that's why I'm pretty strong on painting a bright line on this distinction and not letting anyone pretend it doesn't exist.


All of that said though, I'm not blind to the fact that capitalism has fucked up the 'monopoly on coercion' system quite a bit, and many large multinational corporations in practice have a lot of coercive power, even if they don't have standing armies.

And in principle I am just as committed to stopping them from misusing their coercive power as I am to stopping the government from misusing it, and therefore I am in fact pretty sympathetic to the idea that some platforms are so huge and monopolistic that they need to be regulated like a utility, and something like net neutrality/content-agnosticism is probably the best way to do it.

(although good luck coming up with a principle that does that, but still stops child porn, and also isn't ideologically exploitable later).

What I'd prefer over that, though, is that we take the coercive power away from the private corporations instead of telling them what they can and can't do with it. I recognize that network effects make this difficult: you can't just 'break up' Facebook like it was Ma Bell, because a cluster of tiny fragmented social media platforms isn't useful and a new behemoth will just emerge to fill the void. I've talked about dream scenarios like fully distributed social media platforms, where encrypted bits of the platform are kept on and served from every user's computer and there's no central ownership or moderation, and people can just set their own personal filters. Maybe there are better options than that, I don't know.
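
Just to make that last idea a little concrete, here's a toy sketch of what I mean by 'personal filters' living entirely on the user's own machine. To be clear, this isn't any real platform or library - the post format, the rule helpers, all the names are made up for illustration, and the genuinely hard parts (the distributed encrypted storage, discovery, spam at scale) are waved away:

    # Hypothetical sketch: client-side 'personal filters' for a distributed feed.
    # Nothing here is a real platform or library; the Post format and rule
    # helpers are invented purely to illustrate filtering at the edge.
    from dataclasses import dataclass, field
    from typing import Callable, Iterable, List

    @dataclass
    class Post:
        author: str
        text: str
        tags: List[str] = field(default_factory=list)

    # A filter is just a predicate the user picks; no central moderator applies it.
    FilterRule = Callable[[Post], bool]

    def block_authors(blocked: set) -> FilterRule:
        return lambda post: post.author not in blocked

    def block_tags(blocked: set) -> FilterRule:
        return lambda post: not blocked.intersection(post.tags)

    def apply_filters(feed: Iterable[Post], rules: List[FilterRule]) -> List[Post]:
        # Keep only the posts that pass every rule this user configured locally.
        return [post for post in feed if all(rule(post) for rule in rules)]

    # One user's locally chosen filters; a different user keeps a different list.
    my_rules = [block_authors({"spambot9000"}), block_tags({"ads"})]
    feed = [
        Post("alice", "hello world"),
        Post("spambot9000", "BUY NOW", ["ads"]),
        Post("bob", "hot take", ["politics"]),
    ]
    print(apply_filters(feed, my_rules))  # only alice's and bob's posts survive

The point being that the filtering happens at the edge, after the content has already been retrieved, so there's no central chokepoint where anyone's agenda - mine, Google's, or the government's - gets applied to everyone at once.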