Can Silicon Valley navigate its moral choices?
OH, HOW we miss the trolley problem.
There's a runaway trolley plunging towards a widow and five orphans, but if you pull the lever to divert it, you'll hit Elon Musk. Which do you choose?
Not long ago, that diabolical choice was the hardest moral question a technologist was likely to face over the course of a typical day. (And it was utterly hypothetical, unless you were building autonomous vehicles.) Today, the average person in the tech space is likely to be pressed far more often on far more realistic moral quandaries. Such as: Which authoritarian regimes is it okay to make rich by letting them invest in your company?
Is it worse to pay teens for their mobile activity data or to sell them lollipop-flavoured nicotine?
If you refuse to work with the Defense Department, is it OK to work with Beijing?
Welcome to your new ethics test, Silicon Valley: an adaptive quiz that becomes harder the more questions you get right. And with technology companies facing recurring media and regulatory scrutiny, company founders are finding the ethical chasm difficult to avoid. If Silicon Valley once converged on a moral cohesion of sorts - with progressive values and craftsmanship acting as a loose ethical framework - it is now becoming harder to sidestep the thornier debates over privacy, addiction and growing geopolitical discord.
Consider the outcry last week when TechCrunch reported that Facebook had been paying people (including teens) US$20 a month to download an app that monitors a user's mobile and Web activity. Apple, long a privacy advocate and favoured troll of Facebook, reacted swiftly by cutting off "Facebook Research" and the company's internal apps. Tech Twitter, however, seemed largely perplexed: In the future, won't we all sell our data rather than give it away for free?
Addiction has become another ethical landmine where dopamine hits - and how one administers them - are the key to a company's growth. E-cigarette maker Juul Labs, founded in 2017 and now the fastest-growing startup in history, with a valuation of US$38 billion, is largely responsible for a grave new statistic: About 20 per cent of teens have admitted to vaping in school. In many ways, that shouldn't surprise us. Juul is the logical extension of the Silicon Valley growth-hacking playbook: Design a flawless product, add a dopamine response, stir in some influencers and watch your product, game or app go viral.
Technologists are generally fine with designing elegant software that maximises engagement or obsession, even when it's linked to increased prevalence of teen depression. But a screenless hardware device that delivers dopamine on demand? That crosses the ethical chasm. Most institutional investors have shunned Juul despite its adherence to the Valley growth playbook, but some investors are becoming more open to other vices: Marijuana and other legal cannabinoids have sucked up nearly US$1 billion in private investment dollars since 2012, raising the question: What kind of dopamine hits aren't we comfortable with?
Even the virtuous founders who avoid addictive products and invasive data-gathering have to worry about who's getting on their cap table. A year ago, Silicon Valley's gender problem was so severe that accepting money from a firm that lacked diversity was heresy in many quarters. Now, founders have to wonder whether that seemingly diverse venture capital fund is backed by regimes where women, minorities and dissidents are killed for expressing themselves. The killing of Post contributing columnist Jamal Khashoggi popularised the question, "Who are your limited partners?" But, by and large, it's still easier to defend taking money from questionable sources than from a firm that looks plucked straight from an Ivy League rowing regatta, circa 1969.
In the 1980s, when finance suffered its own moral crisis, business schools reacted by requiring students to take courses on Kant and other philosophers that had little to do with daily management worries. That approach looks even less relevant now, when the dominant industry - technology - is decentralised and able to mint US$40 billion companies in just 18 months.
Some investors and watchdogs are trying to encourage start-up leaders to recognise their moral burdens from the moment of inception. The Time Well Spent initiative is an example of Silicon Valley's ethical conscience, encouraging founders to monitor how product architecture affects the daily lives of their users. My own firm has published questions that we now ask all founders concerning how they can build "minimum virtuous products" instead of minimum viable ones. But, of course, the morality of a company often depends on the morality of the people in charge.
So, yes, we all miss the relative simplicity of the trolley problem. By the time Silicon Valley converges on a moral compass that both Middle America and Washington can stomach, the trolley will be fully autonomous and on its third voyage to Mars. WP