Facebook News Feed: Redirecting the stream

A simple question:

"If censorship is not practiced at Facebook then how come porn doesn't make it into the newsfeed?"

Facebook isn't 4chan; it censors content for the public good (the values of a civil society). Certain labels are applied to stories (pieces of content) and actions are taken on them. For example, any post labelled 'porn' is never shown.
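As a minimal sketch of that label-to-action idea (the label names and the Story record here are hypothetical illustrations, not Facebook's actual system), the rule could look like this:

```python
# A minimal sketch of label-driven moderation. Label names and the Story
# record are hypothetical; the point is just "label applied -> action taken".
from dataclasses import dataclass, field

@dataclass
class Story:
    id: str
    labels: set = field(default_factory=set)

# Labels that are never shown in the feed (censored for civil-society values).
BLOCKED_LABELS = {"porn"}

def visible_in_feed(story: Story) -> bool:
    """A story is shown only if none of its labels are blocked."""
    return not (story.labels & BLOCKED_LABELS)

print(visible_in_feed(Story("s1", {"porn"})))   # False - never shown
print(visible_in_feed(Story("s2", {"news"})))   # True
```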

The system of censorship within Facebook is already active and healthy. With that in mind, I believe the methods Facebook has proposed (below) fall short of a viable solution for the revised News Feed:

  1. Boost certain publishers — ones whose content is “trustworthy, informative, and local.”
  2. Use reader surveys to determine which sources are trustworthy.

One, defining 'trustworthy' is fraught with bias, because user-generated (perhaps even crowdsourced) opinions are no more valuable than the opinion of a well-informed individual (or expert). Two, I think the approach of 'boosting' also leads to problems of favouritism. Three, surveys sound boring.

Rupert Murdoch also chimes in:

if Facebook truly valued “trustworthy” publishers, it should pay them carriage fees.

This is a good point. There needs to be a tighter scope to avoid an imbalance between platform and media. Furthermore, one of the main values technology provides is the ability to conduct data collection and analysis - I believe the solution should remain in that domain.

I also want to reframe the problem, starting with news and journalism in general. Frankly speaking, news publishers have a problem: the industry is struggling to keep its integrity. Partisanship in the US is at an all-time high - CNN versus Fox, for example. Many outlets amplify filter bubbles and erode the foundations of holding the powerful accountable. Breitbart and Hannity sit at the conservative end of the spectrum, Media Matters at the liberal end.

I also think the challenge is not solely about journalistic integrity under political influence; news publishers are also looking for ways to win advertisers back. Digital advertising platforms such as AdWords and Facebook Ad Manager are far superior, delivering higher ROI for advertisers through improved segmentation and lower delivery cost.

I remain skeptical of the business interests at play for journalists at news publishers (and in fact at any declining newsroom); I worry they need to make Facebook the scapegoat for partisanship and fake news - "the vector":

"it's not our fault we reported the election wrong or wrote so much crap about Trump, it's Facebooks problem for allowing our bullshit to spread."

It is really an industry-wide fault.

I believe the issues with the News Feed also relate to the problems of partisan journalism. I am suggesting we might be able to fix both at once and, in the process, address a serious social illness: rankism.

Rankism is one of the outcomes of extreme partisanship - it gives people power to demean others. The more polarised people become, the more they hate and denigrate each other - and rankism spreads. As Robert Fuller suggests, the remedy for rankism is dignity.

The hope##

I believe Facebook can be the key vector, revamped to provide tools that help users better understand journalism and the effects of polarisation. The end goal is to help users understand the dynamics of civil society and enjoy a dignified news feed.

What I am proposing##

I strongly believe the solution is not for Facebook to become a media company or a censorship agency, but to become an educational agent in the content landscape.

By asking the question - what hinders civil society? - we have the foundation for a solution. This is well addressed in the article by Cass R. Sunstein, a professor at Harvard Law School.

I propose the objectives as follows:

  1. Define what extreme partisanship is.
  2. Label it the way NewsWhip and PEW Research do - on a spectrum from liberal to conservative (sketched after this list).
  3. Change how these partisan labels are applied to stories and publishers in the feed.
  4. Provide users with a stream of stories that is safe and clearly labelled.
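To make objectives 2 and 3 concrete, here is a rough sketch assuming a NewsWhip/Pew-style scale and a hypothetical publisher-to-label mapping (the outlet names and ratings are placeholders, not real data):

```python
# A rough sketch of objectives 2-3: a partisanship scale and a hypothetical
# publisher-to-label mapping used to label each story. All ratings are
# illustrative placeholders; a bipartisan agency would supply the real ones.
from enum import Enum
from typing import Optional

class Partisanship(Enum):
    LEFT = "left"
    CENTRE = "centre"
    RIGHT = "right"
    FAR_RIGHT = "far right"
    EXTREME = "extreme"

PUBLISHER_RATINGS = {
    "example-left-outlet": Partisanship.LEFT,
    "example-centre-outlet": Partisanship.CENTRE,
    "example-right-outlet": Partisanship.RIGHT,
}

def label_story(publisher: str) -> Optional[Partisanship]:
    """Attach the publisher's partisanship label to a story, if known."""
    return PUBLISHER_RATINGS.get(publisher)
```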

Some ideas to mull over##

This initially applies only to content from news publishers, brands, or pages. Eventually, as the system becomes more intelligent, steps are taken to label and identify user accounts as well; for example, an account posting upwards of 20 articles per day is flagged for analysis.
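A small sketch of that flagging heuristic, taking the 20-articles-per-day figure above as the cutoff (the exact threshold is of course open to tuning):

```python
# A small sketch of the account-flagging heuristic described above.
# The threshold is an assumption taken from the 20-articles-per-day example.
FLAG_THRESHOLD_PER_DAY = 20

def should_flag_for_analysis(articles_posted_today: int) -> bool:
    """Flag accounts that post an unusually high volume of articles."""
    return articles_posted_today >= FLAG_THRESHOLD_PER_DAY

print(should_flag_for_analysis(23))  # True - mark for analysis
print(should_flag_for_analysis(3))   # False
```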

We all know we can't edit the source content (censor) or 'shadow' content (what Twitter is doing). However, Facebook can:

  1. Define the type of content in the stream
  2. Redirect content in the stream
  3. Crowdsource the categorisation and labelling of content (Why not? It's a civic duty - I would invest in this for the sake of others, especially after a crappy experience with the News Feed lately.)
  4. Define what is 'shown first' (items 3 and 4 are sketched after this list)
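Here is a rough sketch of items 3 and 4, assuming crowdsourced label votes are aggregated by simple majority and that labelled content is reordered rather than removed (the vote threshold and priority values are assumptions):

```python
# A rough sketch of items 3-4: aggregate crowdsourced label votes by simple
# majority, then redirect the stream so trusted content is 'shown first'.
# The minimum vote count and the priority ordering are assumptions.
from collections import Counter
from typing import Optional

def aggregate_label(votes: list[str], min_votes: int = 10) -> Optional[str]:
    """Return the majority crowdsourced label once enough votes arrive."""
    if len(votes) < min_votes:
        return None
    label, count = Counter(votes).most_common(1)[0]
    return label if count > len(votes) / 2 else None

# Lower number = shown earlier; nothing is deleted, only reordered.
SHOWN_FIRST_PRIORITY = {"trusted": 0, "centre": 1, "left": 2, "right": 2,
                        "far right": 3, "extreme": 4, "untrusted": 5}

def redirect_stream(stories: list[dict]) -> list[dict]:
    """Reorder the stream by label priority without censoring anything."""
    return sorted(stories, key=lambda s: SHOWN_FIRST_PRIORITY.get(s["label"], 3))
```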

Most importantly, the lens through which we view and categorise or label content needs some additional parameters. As mentioned before, I would advocate that these labels come from a bipartisan agency such as PEW Research.

Filter options##

  • Trusted (Accredited Publisher or Journalist)
  • Partisanship Badge US: left, centre, right, far right, extreme
  • Untrusted
  • Explicit content: porn (already censored out)
  • Explicit content: hate speech (should be censored out; it is unempathetic)
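As a minimal sketch of how these filter options might sit with each user (the field names and defaults are my assumptions, and stories are assumed to already carry the labels described earlier):

```python
# A minimal sketch of the filter options above as per-user preferences.
# Field names and defaults are assumptions; stories are assumed to carry
# the labels described earlier ("trusted", "untrusted", badges, etc.).
from dataclasses import dataclass, field

@dataclass
class FeedFilters:
    trusted_only: bool = False   # only accredited publishers or journalists
    hidden_badges: set = field(default_factory=lambda: {"extreme"})  # partisanship badges to hide
    hide_untrusted: bool = True
    always_blocked: set = field(default_factory=lambda: {"porn", "hate speech"})

def passes_filters(story: dict, filters: FeedFilters) -> bool:
    """Decide whether a labelled story survives the user's filter settings."""
    labels = set(story.get("labels", []))
    if labels & filters.always_blocked:
        return False
    if filters.trusted_only and "trusted" not in labels:
        return False
    if filters.hide_untrusted and "untrusted" in labels:
        return False
    return not (labels & filters.hidden_badges)
```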

The simple idea is to provide tools that help users identify partisan content... and to help them identify their own bias. I would highly recommend:

  1. Publisher accreditation (yes, publishers apply to be accredited).
  2. A team, or a partnership with technology such as NewsWhip, to help identify and define the partisanship of story/post sources.
  3. A tool to help people understand their own current political typology - a great way to get users thinking about partisanship and bias, and deciding what they want in their news feed diet.
  4. Combat 'The Colorado Effect' - always ensure that views not necessarily their own (but respectfully written) make it into the News Feed (a rough sketch follows this list).
  5. Provide a space (or feed) for those who want to pursue their partisan agenda, but with warnings about the filter-bubble effect.
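On point 4, here is a rough sketch of one way to counter the 'Colorado Effect': reserving a fixed share of the feed for respectfully written stories from outside the user's own leaning (the 20% quota and feed size are assumptions for illustration only):

```python
# A rough sketch of point 4: reserve a share of the feed for respectfully
# written stories from outside the user's own leaning. The 20% quota and
# default feed size are assumptions for illustration only.
OPPOSING_SHARE = 0.2

def mix_in_opposing_views(own_leaning: list[dict],
                          opposing_respectful: list[dict],
                          feed_size: int = 20) -> list[dict]:
    """Build a feed that always includes some opposing, respectful views."""
    n_opposing = min(int(feed_size * OPPOSING_SHARE), len(opposing_respectful))
    return opposing_respectful[:n_opposing] + own_leaning[:feed_size - n_opposing]
```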

Importantly, news publishers and users will be able to understand the slant of their content and engage in debate - which is what a well-functioning democracy is all about, right? The discussion will centre on bias, influence, and transparency. The hope is that this discussion helps to make the world more open and connected.

Summary##

I am not advocating for direct censorship. I am not advocating for Facebook to become a media company. I am suggesting that Facebook categorises its content and enables a filter function for users (starting with publishers and moving through the user base in an organised manner). The result will be a news feed enhanced with a labelling system and an accreditation system that identifies stories or publishers as partisan or extremist. In doing so, the stories in a user's news feed will be more aligned with their interests - making users more satisfied. Lastly, it will create a platform for debate and challenge notions of transparency; it will enable users to understand how the stories they share relate to their friends on Facebook and the social groups they are part of.