Facebook News Feed now downranks sites with stolen content

Facebook is demoting low-quality news publishers and other websites that scrape and illegally republish content from other sources with little or no modification. Today the company told TechCrunch exclusively that it will show links less prominently in the News Feed if they carry a combination of this new signal about content authenticity along with clickbait headlines or landing pages filled with low-quality ads. The move comes after Facebook surveys and in-person interviews found that users hate scraped content.

If improperly acquired intellectual property gets less distribution in the News Feed, it will receive less referral traffic, earn less advertising revenue, and there will be less incentive for scrapers to steal articles, photos and videos in the first place. That could create a ripple effect, improving content authenticity across the web.

And if the profile data stolen from 29 million users in Facebook's recent massive security breach ends up being published online, Facebook will already have an established policy for demoting the sites that republish it.

This is an example of the type of site that could be demoted by Facebook's latest News Feed change: "Latest Nigerian News" scraped one of my recent TechCrunch articles and surrounded it with tons of ads.


"As of today, we are launching an update so that people see fewer publications that adhere to low quality sites that, for the most part, copy and publish content from other sites without providing a unique value. "We are adjusting our Publishing Guidelines accordingly," Facebook wrote in an addendum to its May 2017 post on degraded sites filled with crappy ads. Facebook tells me that the new guidelines for the editor will warn the news media to add content or original value to the re-published content or invoke the wrath of the social network.

Personally, I think the importance of transparency around these issues warrants a new post on the Facebook blog, rather than just an update appended to the original post that it links to.

So how does Facebook determine whether content is stolen? Its systems compare the main text content of a page with all other text content to find potential matches. The degree of matching is used to predict that a site stole its content. Facebook then uses a combined classifier that merges this prediction with how clickbaity a site's headlines are, plus the quality and quantity of ads on the site.
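To make that description concrete, here is a minimal, hypothetical sketch in Python of the kind of pipeline Facebook describes: a word-shingle overlap score estimates whether a page's text matches existing content, and that prediction is combined with clickbait and ad-quality signals into a single demotion score. The function names, shingle size, thresholds and weights are my own illustrative assumptions, not Facebook's actual classifier.

```python
# Hypothetical sketch of the approach described above, not Facebook's real system.

def shingles(text: str, n: int = 5) -> set:
    """Return the set of n-word shingles for a block of text."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def match_score(page_text: str, corpus_texts: list) -> float:
    """Highest Jaccard overlap between the page and any known source text."""
    page = shingles(page_text)
    if not page:
        return 0.0
    best = 0.0
    for other in corpus_texts:
        other_sh = shingles(other)
        if not other_sh:
            continue
        overlap = len(page & other_sh) / len(page | other_sh)
        best = max(best, overlap)
    return best

def demotion_score(page_text: str, corpus_texts: list,
                   clickbait_score: float, ad_quality_score: float) -> float:
    """Combine the scraped-content prediction with clickbait and ad signals.
    The weights here are arbitrary placeholders for a learned classifier."""
    scraped = match_score(page_text, corpus_texts)
    return 0.5 * scraped + 0.3 * clickbait_score + 0.2 * (1.0 - ad_quality_score)

if __name__ == "__main__":
    original = "Facebook is demoting trashy news publishers that scrape content."
    copy = "Facebook is demoting trashy news publishers that scrape content. Click here!"
    score = demotion_score(copy, [original], clickbait_score=0.9, ad_quality_score=0.1)
    print(f"demotion score: {score:.2f}")  # higher = shown less prominently
```

In a real system the text matching would run against an index of the whole web rather than a small list, and the combination step would be a trained model rather than fixed weights, but the structure of the signals is the same as described above.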
