Social media platforms have a responsibility to bring us into the future without dragging us back to darker times.
July 5, 2019

It seems SOME countries' governments actually prioritize regulating social media platforms and the kinds of speech they allow. Just not the United States' government. Germany, for example, has fined Facebook $2 million for underreporting the amount of hate speech — anti-Semitism, in particular — on its site. As Ali Velshi pointed out, Germany is more vigilant than most when it comes to policing anti-Semitism because of the Holocaust, but Velshi gave Europe credit for generally being ahead of the U.S. on this issue.

Rashad Robinson, president of Color of Change, was Velshi's guest, and the two discussed the difference between the ways Europe and the U.S. hold platforms like Facebook accountable. The gravity became clear when Robinson connected the dots between the rot that grows from a lack of accountability and the secret Facebook groups in which law enforcement officials post disgusting, abhorrent, racist, and sexist comments.

ROBINSON: The fine, as you said, is small. But the symbol of the fine, the symbol of holding an institution like Facebook accountable, is serious. And unfortunately, we've not had that level of oversight from the FTC here, from Congress, from the authorities that are supposed to be stepping in when institutions, communications platforms, and others do not.

VELSHI: Why do you think that is? Is it a will thing? Is it an understanding?

ROBINSON: It's a couple of things. First of all, it is certainly a will thing. It's a will thing when it comes to Congress. It's become politicized on both sides. And I don't want to create false equivalencies, because the Republicans controlled the levers of power and could've stepped in after the 2016 election and done a lot more. But the FTC lacks the type of teeth that it needs, and so what we've seen is them sort of win settlements, or give fines, and not be able to implement or hold accountable once Facebook then goes back and violates some of the things that they were called out on in the first place.

So, what about, let's say, groups of law enforcement officials who publish racist, sexist, violent posts in private Facebook groups? Should Facebook be held accountable for those groups and what is published there? Or at the very least, should it be held responsible for monitoring those groups so that hate crimes cannot be facilitated or joked about? What is its "duty to report"?

How about the secret FB group of nearly 10,000 Customs and Border Protection agents sharing memes of Trump sexually assaulting AOC? Calling for other agents to throw food at the congressional delegations when they visited the concentration camps in which they're torturing children? Should Facebook be held responsible for allowing such things on its platform? The head of the Department of Homeland Security is investigating, but only because someone in the group leaked the posts to ProPublica, which published the report about the group.

Robinson had this to say:

ROBINSON: Yeah, Facebook does have a role in that. Because Facebook, these private groups — they mine the data, and they work to make money off of that data. So they've created these private groups. And then they have community standards about hate. About what you can say and what you can do on their platform.

VELSHI: And that applies even if your group is private?

ROBINSON: It should, it does apply, and it needs to apply, no matter how powerful the people who are doing it are. The trick is, it's okay to have traffic rules, right?

VELSHI: Yeah.

ROBINSON: But the traffic rules only matter if they apply to all of us. And the fact of the matter is, these private groups can't exist on their platform and allow for the type of sowing and creation of hate, and then nothing happens. And Facebook has a role in this. I also think the government has a role. The fact that people felt so comfortable in a space — law enforcement, law enforcement figures that get a gun and badge and are supposed to uphold the laws of this country — felt so comfortable to say some of the most hateful and derogatory things about people in peril and elected officials of color and women — 10,000 in the group.

VELSHI: Did Facebook make them say these things? People had racist, sexist, misogynist things to say. Do we win if we get it off of Facebook?

ROBINSON: No, but I think we have to recognize the ways technology and new tools allow for information to move quicker, right? It's like lighting a match is one thing. Lighting a match next to lighter fluid is a different thing. In many ways, whether it's YouTube, Facebook, or Twitter, these platforms can be the lighter fluid. That means we have to keep an eye on it. And we have to pay special attention to how this information can spread in different ways. Yeah, someone can say hateful things out on the street. And that's freedom of speech, but it's, you know, not without consequences for their job or workplace or whatever.

This is what people who scream about censorship don't understand. You can say whatever the hell you want. But you aren't free from the consequences of that speech. And when law enforcement officials feel safe and comfortable saying the most dangerous, abusive, vile, and violent things about minorities and women to 10,000 other people they don't even know, there should damn well be consequences. If Facebook doesn't dole them out, then Facebook should face consequences, too.

