For big tech platforms, one of the more urgent questions to arise during the pandemic’s early months was how the forced closure of offices would change their approach to content moderation. Facebook, YouTube, and Twitter all rely on huge numbers of third-party contract workers to police their networks, and traditionally those workers have sat side by side in large offices. When tech companies shuttered their offices, they closed down most of their content moderation facilities as well.
Happily, they continued to pay their moderators — even those who could no longer work, because their jobs required them to use secure facilities. But with usage of social networks surging and an election on the horizon, the need for moderation had never been greater. And so Silicon Valley largely shifted moderation duties to automated systems.
The question was whether it would work — and this week, we began to get some details.
Compare people’s opinions of cigarette companies with their opinions of Johnson & Johnson, the manufacturer of Tylenol. Because of how Johnson & Johnson handled the Tylenol tampering incident, Americans believe what that company says – whereas no one believes a word out of the tobacco companies.
There are always multiple factors favoring any course of action. With so many examples of product defects being swept under the rug, it is always refreshing to find a company that addresses issues head-on. The tobacco industry is the opposite: a huge part of its current problems comes from the fact that no one believes a word of any study financed by anyone other than public health agencies (and even those are often unreliable because they use industry data). Consider Philip Morris’s concealment of cigarette filter contamination issues (U.S. Newswire, 2002); the automatic assumption is that the company is lying about not knowing that its …