THE FACEBOOK CONUNDRUM

WHY ZUCKERBERG AND COMPANY CAN’T – OR WON’T – ELIMINATE FAKE NEWS AND RIGHT WING BULLSHIT


The short answer to this issue is actually quite simple, since it lies at the very core of how Facebook is “managed”: if an issue isn’t data-driven or can’t be resolved through analyzing data, Facebook engineers are basically lost.  You know of the recent controversy over the photograph of the young naked Vietnamese girl running from napalm, which so aptly illustrates the problem of what’s appropriate to post and what’s not.  Humans at Facebook’s offices around the world are generally not involved in such decisions about what appears in News Feed.  Unless Facebook users complain in sufficient numbers or the press picks up on something (as with this photograph, which Facebook initially removed but then reposted), there is little human intervention.  The past year has been rough sledding for Facebook and Mark Zuckerberg due to the controversies over “Fake News” and offensive right wing posts proliferating on the social media platform, to say nothing of the “Russian trolls” issue. 

From Sunday's New York Times Magazine:  “But the solution to the broader misinformation dilemma – the pervasive climate of rumor, propaganda and conspiracy theories that Facebook may have inadvertently  incubated (“The Pope Supports Trump,” for example) – may require something that Facebook has never done: ignoring the likes and dislikes of its users.  [NOTE:  There is no “Dislike” button on Facebook as far as I know but it might be a good idea.]   Facebook believes the pope-endorses-Trump type of made-up news stories are only a tiny minority of pieces that appear in News Feed; they account for a fraction of 1 percent of the posts, according to Facebook.  The question the company faces now is whether the misinformation problem resembles “clickbait” [which Facebook has reduced through the application of algorithms] at all.  Facebook’s entire project, when it comes to news, rests on the assumption that people’s individual preferences ultimately coincide with the public good, and that if it doesn’t appear that way at first, you’re not delving deeply enough into the data.  By contrast, decades of social science research shows that most of us simply prefer stuff that feels true to our worldview even if it isn’t true at all and that the mining of all those preference signals is likely to lead us deeper into bubbles rather than out of them.”

Facebook's operational modus operandi applies data analysis to issues that don't lend themselves to such analysis.  This approach inappropriately brings scientific means – hard science, not social science – to bear on phenomena that are inherently psychological and emotional in nature.  In a way, it resembles searching for black holes with a pinhole camera.  Our political preferences are not determined by the makeup of our DNA nor by the chemical balances or imbalances in our brain matter.  True, a whole host of influences shapes our politics – our parents' politics, where we grew up, where we live, our life experiences – but none of these influences can be said to directly “pre-determine” our politics.  The social sciences can, indeed, predict with a high degree of accuracy whether we are left-leaners or right-leaners based on such observable factors, and it is data analysis that figures this out.  Facebook's algorithms use data gleaned from other observable and measurable events and characteristics – how many Facebook users “Like” a particular story or how many times a post is “Shared” – but they do not and cannot determine the core reasons why someone “Likes” and/or “Shares” a given post.   Sure, gather sufficient data about a particular user’s preferences and you can predict with a high degree of accuracy the kind of posts he or she will “Like,” and all that data can be grouped across a broad range of users to create cohort groups (liberal-leaning, conservative-leaning, middle-of-the-roaders), but these methods cannot tell us a thing about the reasons why.  Was Facebook’s “Anonymous Male-X” user abused by his violent, right wing, Nazi father as a child, and as a result leans far to the left?  Or did he follow in his father’s footsteps and become a member of the Westboro Baptist Church?  Who knows?  The reasons why lie deep within his psyche and may not even be known to him, much less to Facebook’s algorithms.   
Think also about how many Facebook users are listed on local and state Criminal Child Sex Offender lists.  Such lists may be public but Facebook’s engineers and algorithms won’t know this since convicted child sex abusers are unlikely to include such information as part of their profiles.  So you may have befriended a sexual predator and you’ll never know it.

Those of us who lived through the Vietnam War are intimately familiar with the photo of Phan Thi Kim Phuc fleeing a napalm attack.  It must have appeared on the front page of every newspaper in the country at the time and was featured on every nightly newscast.  We know it.  We were horrified by it.  But it is a legitimate piece of America’s history, just as the horror of the Twin Towers’ collapse on 9-11 is too.  Neither – as offensive as both might be to some folks – can simply be excised from the public sphere (and Facebook is certainly at the core of this) without rewriting history.  The fact that the offending photo was of a child only makes it even more horrifying.  That Millennials or other Americans are either too young to know of it or too ignorant, and therefore might object to its publication, is no reason for Facebook to remove it.   My own experience suggests that it’s not easy alerting Facebook to “troublesome issues” like Fake News.  I attempted to communicate with the Facebook staff requesting that the “Pizzagate” page be removed since it was so obviously fake and was also pretty damned offensive.  But no reply.  Five times.  No reply.  Why?  Maybe I was alone in voicing my complaints, and Facebook’s algorithms only respond when some mysterious percentage of users complain about a given post or Facebook page.

Last year Facebook did institute efforts to curb misinformation by “flagging” posts that Snopes and PolitiFact had deemed problematic or counterfactual, and Facebook provided such a warning (at least according to the Times Magazine article; I’ve seen none).  Facebook immediately ran headlong into the fury of The Daily Caller and Breitbart, which claimed that Facebook had “teamed up with liberal hacks motivated by partisanship.”   This is right wing pushback in action, something the left has little experience exercising.  And herein lies Facebook’s essential problem, and it's a problem in our larger society as well.  Any effort to “censor” offensive right wing bullshit and lies runs directly into the court ruling (a Florida appeals court, not the Supreme Court, as is often claimed) that Fox News doesn’t have to tell the truth.   We on the left enjoy no such privilege.  


"If Facebook were to take more significant action, like hiring human editors, creating a reputational system or paying journalists, the company would instantly become something it has long resisted: a media company rather than a neutral tech platform" according to the article. 


Facebook has travelled light years from its original purpose of linking college students together on an online social media platform.   It is the planet’s predominant social “hook-up” site today, which is what Mark Zuckerberg intends.  But it has become much more.  In his words:  “There’s a social infrastructure that needs to get built for modern problems in order for humanity to get to the next level.  Having more people not oriented just toward short-term things but toward building the long-term social infrastructure that needs to get built across all these things in order to enable people to come together is going to be a really important thing over the next decades.” 

No small vision is this.  It is idealistic and remarkably Utopian in its claim of advancing humanity through social media.  Through Facebook.  (I’m not sure I buy into this theme as a practicality or even a desirable goal.)  But Zuckerberg’s insistence on seeing his creation as simply a techie mechanism to link one person to another – an all-inclusive, everything-you-need-to-know-in-one-place platform based on data analytics and algorithms – is fundamentally flawed.   Hook-up sites like Christian Mingle, eHarmony and Grindr – all pursuing much the same goal as Facebook’s original function – don’t wield the kind of power that a worldwide organization like today’s Facebook exercises.  Do you care if eHarmony is matching only conservatives and not liberals?  No.  You don’t, and neither do I.  Do you care that Grindr is a social platform that links gay men to each other?  No.  Why?  Because Grindr has a singular purpose that really doesn’t impact my life.  At its base, it’s a dating site interested only in linking one man to another for sex.  These sites do not give us news about the protests against Ann Coulter at Berkeley nor the U.S. missile strike against Syria. 

Facebook’s next big leap into the future of online social media?   The ability to digitally alter your photos and videos.   Then you and I can make and post fake pics and fake videos, somewhat like the doctored videos that led a dozen states to defund Planned Parenthood and restrict its activities after Jason Chaffetz’s ridiculous Congressional hearing over the issue.  As fake as those videos may be, they have caused the loss of health services around the country.  So what happens when any of us has the same ability to create fake news just like The Center for Medical Progress (a dystopian title if I’ve ever heard one) did? 


We’ve already had real-time shootings, murders and suicides on Facebook.  Sure, they occupy super-minuscule places in Facebook’s data banks, but these are events that generated record “views” from the Facebook Community, as Zuckerberg calls us.  I’m not down with this, and I think that until Facebook puts humans rather than algorithms in charge, even if this means Facebook becomes a media company, the fake news and censorship issues are not going away.  They will get magnified. 

Oh, and by the way, Facebook is already a media company anyway. 



Thanks.  Stay Sharp.  Stay Focused.  Keep Resisting.

PS: The info for this piece comes from the New York Times Magazine as does the photo below.

CAN FACEBOOK FIX ITSELF?

