Posted on 08/17/2022 11:57:35 AM PDT by ransomnote
We all want to be able to speak our minds online—to be heard by our friends and talk (back) to our opponents. At the same time, we don’t want to be exposed to speech that is inappropriate or crosses a line. Technology companies address this conundrum by setting standards for free speech, a practice protected under federal law. They hire in-house moderators to examine individual pieces of content and remove them if posts violate predefined rules set by the platforms.
The approach clearly has problems: harassment, misinformation about topics like public health, and false descriptions of legitimate elections run rampant. But even if content moderation were implemented perfectly, it would still miss a whole host of issues that are often portrayed as moderation problems but really are not. To address those non-speech issues, we need a new strategy: treat social media companies as potential polluters of the social fabric, and directly measure and mitigate the effects their choices have on human populations. That means establishing a policy framework—perhaps through something akin to an Environmental Protection Agency or Food and Drug Administration for social media—that can be used to identify and evaluate the societal harms generated by these platforms. If those harms persist, that group could be endowed with the ability to enforce those policies. But to transcend the limitations of content moderation, such regulation would have to be motivated by clear evidence and be able to have a demonstrable impact on the problems it purports to solve.
Moderation (whether automated or human) can potentially work for what we call “acute” harms: those caused directly by individual pieces of content. But we need this new approach because there are also a host of “structural” problems—issues such as discrimination, reductions in mental health, and declining civic trust—that manifest in broad ways across the product rather than through any individual piece of content. A famous example of this kind of structural issue is Facebook’s 2012 “emotional contagion” experiment, which showed that users’ affect (their mood as measured by their behavior on the platform) shifted measurably depending on which version of the product they were exposed to.
MORE AT LINK: https://www.technologyreview.com/2022/08/09/1057171/social-media-polluting-society-moderation-alone-wont-fix-the-problem/?utm_source=pocket-newtab
I believe MIT's article is trying to make the case for deplatforming websites that go against 'the narrative'. Basically, it's an argument for more efficient and thorough censorship, written up as if we, the readers, are clamoring for someone to save us from 'wrong thoughts'. This is the state of collapsed/captured Communist academia.
No - moderation IS the problem.
Social media continues to try to control the narrative which cuts DOWN on interactivity and promotes more us vs them thoughts AND actions.
Cutting people off the internet outright is against the First Amendment.
Deal with it.
It isn’t “polluting society” half as bad as the DNC’s “mainstream propaganda media” is with its fake news and LIES.
So-called “social media” is not part of my life.
MIT, among the bastions of Delusional Lying Leftist Universities.
I’m sure MIT’s solution is more unconstitutional government.
These guys never quit. We’ve got to be at least as determined to recover our freedoms as they are to steal them.
Critical Race Theory and drag queen story hours are polluting society.
We need more Subject Matter Experts. That’ll put us on the right track.
This thoroughly dystopian article is adorned with a photograph of a dump truck shown pushing a pile of skulls. The skulls are depicted using business-style art (big blocks of simplified color). I think that’s kind of the art the movie Fahrenheit 451 would employ, or perhaps the movie Soylent Green. The irony seems to have escaped those producing this ‘piece’. I’ll link the image here:
https://wp.technologyreview.com/wp-content/uploads/2022/07/social-media-pollution.jpeg?fit=1080,607
This article is just another roundabout way of claiming censorship is necessary “for the public good”. Since the elite narratives are starting to fail as they come under scrutiny, the only thing they have left is silencing dissent.
This insanity is nothing less than tyranny, complete with government-approved censors to ensure that no one says anything that contradicts the official narrative. The next step is secret police to make those who oppose the government disappear. We’re already well on our way toward that as well.
“Everything in moderation. Especially moderation.” - Benjamin Franklin
The Left knows their survival depends on censorship.
...why not require a government “moderator” to be present any time a group of individuals gets together socially to discuss the issues of the day?
__________________
Well, depending on the size of the *group*, there’s no way to tell whether or not one of them is a thought nanny.
As for the secret police, that’s more than ‘well on the way’.
There’s a fine line here. Truth Social agreed to release users’ comments due to threats. Some moderation there might have been a good thing? I tried warning folks it was full of msm. Oh well. Not everything a person thinks should be shared globally. Loose lips sink ships. This 💩 will drip, drip, drip for all they can squeeze out of it. Maybe until 2024.
It's quite easy for Big Tech if they'd do it. Any post that threatens a person or property should be referred to law enforcement. Otherwise, let it pass no matter how wacky the moderator thinks it is.
Disclaimer: Opinions posted on Free Republic are those of the individual posters and do not necessarily represent the opinion of Free Republic or its management. All materials posted herein are protected by copyright law and the exemption for fair use of copyrighted works.