Posted on 08/02/2017 5:40:33 AM PDT by Drew68
A little over a month ago, we told you about the four new steps we're taking to combat terrorist content on YouTube: better detection and faster removal driven by machine learning, more experts to alert us to content that needs review, tougher standards for videos that are controversial but do not violate our policies, and more work in the counter-terrorism space.
We wanted to give you an update on these commitments:
Better detection and faster removal driven by machine learning: We've always used a mix of technology and human review to address the ever-changing challenges around controversial content on YouTube. We recently began developing and implementing cutting-edge machine learning technology designed to help us identify and remove violent extremism and terrorism-related content in a scalable way. We have started rolling out these tools and we are already seeing some positive progress:
Accuracy: The accuracy of our systems has improved dramatically due to our machine learning technology. While these tools aren't perfect, and aren't right for every setting, in many cases our systems have proven more accurate than humans at flagging videos that need to be removed.
Scale: With over 400 hours of content uploaded to YouTube every minute, finding and taking action on violent extremist content poses a significant challenge. But over the past month, our initial use of machine learning has more than doubled both the number of videos we've removed for violent extremism and the rate at which we've taken this kind of content down.
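The "technology and human review" mix described above can be pictured as a triage step: a model scores each upload, high-confidence cases are acted on automatically, and uncertain ones go to a human reviewer. The sketch below is purely illustrative — YouTube has not published its pipeline, and the function name and thresholds here are hypothetical assumptions.

```python
# Illustrative triage sketch (NOT YouTube's actual system).
# A classifier score routes an upload to automatic removal, human
# review, or no action. Threshold values are hypothetical.

def route_video(extremism_score: float,
                auto_remove_threshold: float = 0.95,
                review_threshold: float = 0.60) -> str:
    """Return the moderation action for a given model confidence score."""
    if extremism_score >= auto_remove_threshold:
        return "remove"        # high confidence: act automatically
    if extremism_score >= review_threshold:
        return "human_review"  # uncertain: queue for a reviewer
    return "no_action"

print(route_video(0.97))  # remove
print(route_video(0.70))  # human_review
print(route_video(0.10))  # no_action
```

The point of such a split is scale: the model handles the clear-cut volume, while human reviewers see only the ambiguous middle band.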
More experts: Of course, our systems are only as good as the data they're based on. Over the past weeks, we have begun working with more than 15 additional expert NGOs and institutions through our Trusted Flagger program, including the Anti-Defamation League, the No Hate Speech Movement, and the Institute for Strategic Dialogue. These organizations bring expert knowledge of complex issues like hate speech, radicalization, and terrorism that will help us better identify content that is being used to radicalize and recruit extremists. We will also regularly consult these experts as we update our policies to reflect new trends. And we'll continue to add more organizations to our network of advisors over time.
Tougher standards: We'll soon be applying tougher treatment to videos that aren't illegal but have been flagged by users as potential violations of our policies on hate speech and violent extremism. If we find that these videos don't violate our policies but contain controversial religious or supremacist content, they will be placed in a limited state. The videos will remain on YouTube behind an interstitial, won't be recommended, won't be monetized, and won't have key features including comments, suggested videos, and likes. We'll begin to roll this new treatment out to videos on desktop versions of YouTube in the coming weeks, and will bring it to mobile experiences soon thereafter. These new approaches entail significant new internal tools and processes, and will take time to fully implement.
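The "limited state" described above amounts to a bundle of feature flags: the video stays up, but discovery and engagement are switched off. A minimal sketch of that state, using hypothetical field names (this is not YouTube's internal schema):

```python
# Hypothetical model of the "limited state": the video remains viewable
# behind an interstitial, but recommendation, monetization, comments,
# suggestions, and likes are all disabled. Field names are illustrative.
from dataclasses import dataclass, replace

@dataclass
class VideoState:
    behind_interstitial: bool = False
    recommendable: bool = True
    monetized: bool = True
    comments_enabled: bool = True
    suggestions_enabled: bool = True
    likes_enabled: bool = True

def apply_limited_state(v: VideoState) -> VideoState:
    """Keep the video up but disable discovery and engagement features."""
    return replace(
        v,
        behind_interstitial=True,
        recommendable=False,
        monetized=False,
        comments_enabled=False,
        suggestions_enabled=False,
        likes_enabled=False,
    )

limited = apply_limited_state(VideoState())
print(limited.behind_interstitial, limited.monetized)  # True False
```

Modeling it as a derived state rather than deletion matches the stated policy: the content is retained but stripped of reach.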
Early intervention and expanding counter-extremism work: We've started rolling out features from Jigsaw's Redirect Method to YouTube. When people search for sensitive keywords on YouTube, they will be redirected towards a playlist of curated YouTube videos that directly confront and debunk violent extremist messages. We also continue to amplify YouTube voices speaking out against hate and radicalization through our YouTube Creators for Change program. Just last week, the U.K. chapter of Creators for Change, Internet Citizens, hosted a two-day workshop for 13-18 year-olds to help them find a positive sense of belonging online and learn skills for participating safely and responsibly on the internet. We also pledged to expand the program's reach to 20,000 more teens across the U.K.
And over the weekend, we hosted our latest Creators for Change workshop in Bandung, Indonesia, where creators teamed up with Indonesia's Maarif Institute to teach young people about the importance of diversity, pluralism, and tolerance.
Altogether, we have taken significant steps over the last month in our fight against online terrorism. But this is not the end. We know there is always more work to be done. With the help of new machine learning technology, deep partnerships, ongoing collaborations with other companies through the Global Internet Forum, and our vigilant community, we are confident we can continue to make progress against this ever-changing threat. We look forward to sharing more with you in the months ahead.
The YouTube Team
They will do this by burying them where they cannot be found, thumbed-up, commented upon, suggested or advertised on.
If you try searching for them, you will find a list of "approved" videos that will provide progressive education.
The system will utilize the assistance of 15 leftist NGOs to provide guidance on what constitutes "supremacist" content.
It's a brave new world folks!
Any conservative speech will be labeled “hate speech.” E.T. Williams “The Doctor of Common Sense” is already having problems with censorship. YouTube needs a good competitor.
C'mon, now. You don't trust the Anti-Defamation League to be fair and objective?
“Controversial” = conservative.
Lefties at work.
Again. Competition is needed—FAST.
I sure would like to get a look at what YT calls “standards” with regard to hate speech and violent extremism.
Perhaps this is why Jordan Peterson’s channel was locked up yesterday with no explanation given.
Or we throw sand into the gears.
One mechanism YouTube will utilize to enforce their new policy will be hiding videos that receive too many thumbs-down. So, much like SJWs can shout down conservative speakers on college campuses, they'll now be able to do the same thing to "offensive" YouTube videos by organizing large numbers of viewers to click the thumbs-down icon.
So what if everyone clicked thumbs-down on every video they watched? Wouldn't that render this mechanism useless?
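There is simple arithmetic behind that question: a dislike-based signal only carries information when dislikes are concentrated on some videos and not others. A toy sketch, assuming a plain dislike-ratio measure (hypothetical — YouTube has not published its actual criteria):

```python
# Toy model: if a platform ranked or buried videos by dislike ratio,
# that ratio only discriminates while viewers dislike selectively.
# If everyone dislikes everything, every ratio converges to the same
# value and the signal becomes useless.

def dislike_ratio(likes: int, dislikes: int) -> float:
    total = likes + dislikes
    return dislikes / total if total else 0.0

# Normal world: ratios differ, so a threshold can separate videos.
normal = [dislike_ratio(900, 100), dislike_ratio(100, 900)]
print(normal)  # [0.1, 0.9]

# "Thumbs-down everything" world: every video gets only dislikes,
# so every ratio is 1.0 regardless of content or popularity.
flooded = [dislike_ratio(0, 1000), dislike_ratio(0, 50)]
print(flooded)  # [1.0, 1.0]
```

In other words, blanket downvoting would flatten the distribution; a platform would then have to fall back on other signals (reports, watch behavior, classifier scores) rather than raw dislikes.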
I'm just brainstorming here before this thread dies due to disinterest.
Though she presents FACTS about Islam, I expect Pam Geller to be censored too.
I don't know about locking channels but it is my understanding that YouTube won't be actually removing any videos. Reason being, they get reposted and draw more attention to themselves, defeating the purpose of removing them in the first place.
Unlike the morons running the cable news channels, they're not idiots over at YouTube. The goal here is to silence objectionable content, not popularize it by turning it into forbidden fruit.
I’m a YouTube viewer, but not a member so I don’t give thumbs up or down or comment. But wouldn’t YouTube be able to discern if a member gave only thumbs down and then merely delete that member?
The past few months I've noticed that Google's search engine has deliberately deteriorated. Many searches no longer return the target of the search but the exact opposite. Search for global warming topics and you often end up with a list of websites attacking "deniers", instead of what you were looking for.
George Soros owns or funds 90% of the NGOs. NGO is a synonym for leftist front group.
My two older kids, 8 and 6, watch nothing on a TV screen except YouTube. They don't watch cable channels. Older kids are the same way. MSM cable news channels are today the abode of old people.
The libertarian-leaning right has embraced YouTube, taken it over really, as THE medium to get their message out. And people are finding them, younger people.
With the election of Trump, the rise of Europe's alt-right, and polling showing that the generation following millennials is abandoning "progressive" ideals, the powers behind Google and YouTube are nervous and feeling responsible.
They feel they have a social obligation to tamp this down and reel it in. This is a knee-jerk response that I dearly hope will have unforeseen consequences. This move needs to fail.
I trust it will. The people running YouTube are smart, but their ideals are old and stale, and there is a generation coming up behind them that is clever as well. And they're on the offensive and passionate.
Bannon is right. Make them a public utility.
And by controversial, they mean any video that advocates for whites, heterosexuals, Christians, or males.
JoMa
And by "terror content" they mean not Muslims.
A major change to YouTube that will silence conservative voices generates no interest on Free Republic.
Sad.
Crickets...