God I hate tech hyperbole. And I literally hate the use of “literally” to mean its opposite … but let me explain how bad technology product management can literally lead to the end of the world.
Donald Trump represents an existential threat to humanity. To put such a man at the helm of the world’s most powerful nation is like handing over the controls of a nuclear submarine to a petulant baby. That’s a poor simile only because it’s not figurative at all but a nearly literal description.
What can I do about this? I’m just one person, one vote. Moreover, I’m in California, which will surely vote Clinton anyway, so my vote won’t sway the outcome. I could advocate, I could preach to everyone around me, but really most of the people in physical proximity to me already agree with me.
What about technology? I’m in the center of Silicon Valley, I know me some techmology, can’t I do something wizardly to extend the power of a single voice? Nope. I mean, I can write this little essay, and maybe my fifty readers will like it, but those fifty people and everyone they’ll share it with already agree with me.
But about twenty miles from here, there are a couple of dozen people who literally hold the fate of our political conversation in their hands. In fact, it’s been in their hands for quite some time now, and they’ve made decisions which, only in retrospect, appear to have been disastrous for our nation’s politics.
At Facebook, the News Feed is the main stream of information that people see when they use the service. It has become the single most important source of news and conversation for many if not most Americans. It is designed to show people information that they want, which largely means showing people what they already agree with, from people who they already are inclined to sympathize with.
At Google, the search results page answers billions of queries each day, from billions of people. The results are carefully shaped not just with regard to each query, but as much as possible conformed for the particular user, so that the user sees results they are more likely to want to click, which in essence means showing them information they already agree with.
I’m not the first to note that the creation of these echo chambers only serves to reinforce existing biases, and isolate people from diverse opinions that could broaden their horizons and enrich our society. I might be among the first to charge that the product managers who now lead Facebook News Feed and Google Search are failing at their jobs.
On its face that’s a ridiculous statement, as we are talking about two of the most successful products in history, literal world changers. And who could argue against the general strategy of conforming experience to user tastes? But there comes a time in the life cycle of even massively successful products when the product has attained a use and effect that were never anticipated through all of that prior success. Product managers who do not grapple with what their products have become, in all dimensions, are not doing their jobs well.
News Feed and Search are unique in the landscape of all products. These are no longer simply things that people use, and therefore need to be designed to be as pleasant and popular as possible. These products now form the infrastructure of political conversation; they have become the backbone of our polity; they are the means by which citizens of our nation engage with each other on the essential ideas of community. The success of these products must now be judged on how well they serve beneficial outcomes in our society, especially our politics.
There are plenty of people at Facebook and Google who are deeply invested in denying this responsibility, which is so self-evident to all of the rest of us mere users. They would like to say that their products are designed to be “neutral,” to simply follow algorithms that have no sense of society or humanity. They want to hide their power behind obfuscating explanations of math and probability.
Some of this may be a difference in perspective. Some of this may be benign short-sightedness. But some of it is moral cowardice. I hate to make such an inflammatory charge, but when you have the ability to shape a product in a way that would reduce the likelihood of a fascist taking the reins of a country with the firepower to end life as we know it, and you deny that you have this power, I have a hard time calling this anything other than what it is.
Facebook and Google know that their products contribute to a stifled political conversation that only hardens lines of hate and allows well-meaning people to isolate themselves in their own safe spaces. Will they continue to build their products in a way that divides our society? Or will they take real moral responsibility for how their products shape our political conversation, and make their products a conduit for uncomfortable ideas that could improve our world? Will they break down the barriers between hardened positions, expose ignorance to truth, measure hatred and inject love? Or will they claim that these goals are too soft, and anyway achieving them is too hard?
When Philip Morris discovered that their product was killing their customers, they hid the evidence for as long as they could, and they denied the truth even after it was apparent to everyone else, all so they could squeeze out the last dollars from their death-dealing empire. When Coca-Cola realized that sugary drinks were contributing to unprecedented rates of obesity, they diversified their product lines to include healthy drinks as well as sugar bombs – not exactly admirable, but at least preparing for a shift where people who could watch out for themselves would continue to contribute to the company’s bottom line. At this point, I would be okay with lesser evils, but I would prefer to see moral courage. Product managers at Facebook News Feed and Google Search: Do Your Jobs.