It should come as no shock that last week’s presidential election was as shocking a political result as many of us have seen in our lifetimes. From the most sophisticated data analytics operations within both parties to old-school “gut instinct” pundits and prognosticators, nearly everyone not on President-Elect Trump’s payroll thought Secretary Clinton had the election in the bag. Obviously, they were wrong, as conventional “wisdom” so often is, and here we are.
In the wake of such an unexpected, and crushing, defeat, fingers are pointing all over the place. Secretary Clinton’s landed on FBI Director Comey. Others have landed on her campaign’s failure to visit her “firewall” states often enough, or to court white, working-class voters. Regardless of which culprit you find most compelling, or which combination thereof, two of our tech giants also found themselves in the crosshairs…
Facebook and Google.
Facebook has been skewered across the media for its role in allowing fake news stories to propagate and spread on its platform. Because Facebook does not grant outside access to its algorithms and resulting data, it’s tough to know how much the social media echo chamber effect and the spread of these fake stories actually influenced the election. But, considering some of these outright false stories were shared and read by more than 1 million people, it’s tough to argue they had no effect whatsoever (even though Zuckerberg tried that tack early on, to little positive or believable effect).
Google found itself similarly embroiled in controversy this week when it turned out a fake news story had infiltrated its top results for which candidate won the popular vote. (Secretary Clinton did, as it turns out; an obscure right-wing blog claimed that distinction for Trump.)
One of the most consequential stances both companies have adopted through the years is that they’re platforms, not media companies. Facebook doesn’t write the content; it just provides the platform for you (or publishers) to distribute it. Google doesn’t write the news; it just surfaces the results of your searches for the news. There are a lot of reasons why both companies might have taken this stance, but the one I find most compelling is this:
If you’re a publisher or a media company and you knowingly publish something false or defamatory, you’re exposed to potentially huge lawsuits for libel or defamation. The writers and publishers of that content are culpable for the words they write and publish. But the courts have held time and again that simply providing access to information is not the same as endorsing the information presented.
Just as you can’t sue your ISP for the information flowing through its pipes, you can’t sue Facebook for your crazy uncle’s erroneous political post.
But here’s the problem: consumers don’t really buy that. At least, not anymore.
It’s one thing to consistently prioritize one source of valid information over another; for instance, you can’t always favor a conservative search result over a neutral or liberal one and maintain credibility or objectivity as a platform. But you certainly can filter out stories or sources known to be false. Facts aren’t partisan; that’s what makes them facts.
In today’s corporate climate, consumers expect brands to act with social responsibility. Younger consumers, especially, demand that companies act with environmental responsibility.
Facebook and Google have achieved enormous feats — they’re indispensable to modern life. But, consumers also expect companies with that level of power to wield that power responsibly. If they’re not also improving our lives and our society, why continue bestowing that level of power onto them?
Twitter has long been criticized for its lackadaisical response to bullying and trolling on its platform, and I think that criticism has hurt Twitter’s growth and engagement. People don’t want to be on social networks replete with trolls and bullies around every corner. Likewise, most individuals seeking news from Facebook or Google want real and accurate reporting.
There are legal hurdles to Facebook and Google taking more active editorial roles. There are questions about what their culpability would then look like, how they would differentiate between verified news sources and false ones, and which publishers would get preferential treatment and which ones wouldn’t.
But, there’s a lesson for every business here. Consumers expect more out of us now than ever before. Consumers want authentic and transparent brands. They want to feel good about buying your product or growing your revenues. Partners want to feel confident your company is both good and responsible. And when you’ve earned a user’s trust (or in the case of Facebook and Google, their time), citizens the world over expect you to handle that power responsibly and in service of a greater good. Tone-deaf deflection of blame or abdication of responsibility isn’t a good look on anybody. But, if you and your company hold yourselves to the same high standard your users or partners do? It works out better for everyone. Trust me.
Jeff Francis is a veteran entrepreneur and co-founder of Dallas-based digital product studio ENO8. Jeff and his business partner, Rishi Khanna, created ENO8 to empower companies of all sizes to design, develop and deliver innovative, impactful digital products. With more than 18 years working with early-stage startups, Jeff has a passion for creating and growing new businesses from the ground up, and has honed a unique ability to assist companies with aligning their technology product initiatives with real business outcomes.