“God is an algorithm.” So said famous author and speaker Yuval Noah Harari in a recent podcast I listened to. Yuval has listed the three main challenges that mankind will have to wrestle with in the coming decades as nuclear war, climate change, and technological disruption. This is Yuval’s list, and you may disagree or have other topics at the top of your own, eradicating malaria for instance.
What is thought-provoking to me as a trained engineer, a venture capital investor, and, more broadly, an optimist about the positive change technological innovation can bring to mankind, is that last topic, technological disruption. Listen more to Yuval here.
I have witnessed close up the boom and bust of the first Internet wave, and I have co-developed an investment firm that has backed nearly 100 technology companies in the US, China, India, and Europe.
I spent 12 years with Nokia, from 1990 onwards, and left the company in 2002, shortly before its market capitalisation peaked at $245 billion, making it one of the world’s largest companies at the time.
Little did I imagine that 7 of the 10 largest companies today in terms of market capitalisation would be technology companies. The other three are: JP Morgan, a bank that is very much a technology company; Johnson & Johnson, the healthcare and consumer goods company; and evergreen Berkshire Hathaway.
When the algorithms in the systems being developed by either the tech giants of today or the VC-backed companies of tomorrow decide which patient is prioritised for a liver transplant, which cars have priority in traffic jams, who gets a mortgage, who gets admitted to university, who gets social benefits, who gets prosecuted for tax evasion, and so on, you start to see the sinister shape of Yuval’s phrase “God is an algorithm.”
Yuval was putting a finger on the potential dangers of the development of AI, software, processing power, and billions of hyper-connected devices. Our systems of corporate governance, societal organisation, and liberal economics, together with the continued drive for effectiveness, are likely to promote making humans redundant wherever possible. Step by step, industry by industry.
His point is that, if we do not find meaningful ways for humans to remain useful in society, we could end up with what he calls a “useless” class of people: a class that is irrelevant to the interests of companies.
This may be a dystopian vision of Yuval’s, but, as he is not a technologist, I take heed of his views; as always, it is more likely that an industry outsider will be able to tell us where we are heading rather than the “expert” insiders wedded to the tech industry. In fact, we may be better off having outsiders setting the guardrails for our industry.
Lina Khan, a brilliant young lawyer, may be such a person. She was thrust into the limelight in 2016 when, while still at Yale Law School, she wrote a landmark paper, “Amazon’s Antitrust Paradox”. The paper essentially reframed what monopoly power is and how it should be viewed under antitrust legislation, in this case US antitrust legislation.
In my view, it is unavoidable that we in the technology industry are going to be subject to substantially more regulation and government intervention. This is not necessarily a bad thing; many industries, such as finance, telecommunications, transportation, health care, etc., function under regulation and supervision.
When deeply human and ethical choices are involved, concerning our health, access to education, or our personal data, as I outlined above, it is inevitable that regulation will come into play. The EU’s GDPR is a good example of a privacy protection regulation that is now being followed in many other jurisdictions around the world.
What has brought me to write this note is the apparent lack of recognition by our industry that we have to take ownership of and responsibility for the dialogue about the impact we have on society. Many entrepreneurs are already working to shape regulations relevant to their business and lobbying politicians to implement favourable tax treatment of stock option plans, for instance. Some entrepreneurs sit on handpicked governmental advisory bodies and are working on many other important topics.
However, we need to work on shifting the dialogue away from these “tactical” conversations to the issues that are really important for our mission. We need to help educate not only politicians but also policymakers, lobbyists, influencers, trade unions, educators, and journalists about how hyper-connectivity and algorithmic support can create augmented intelligence with humans in the driver’s seat. We can continue to make our world a better place.
It is easy to imagine the next populist movement, not against globalisation and immigration, but against technology and the displacement of workers and disruption of the established order that it creates. It is a narrative in the making, and much too close for comfort for us all.
Facebook was famously slow to wake up to the responsibility of connecting billions of people on its platforms. To bring these thoughts together, you can also watch here how Mark Zuckerberg sat down with Yuval in April this year.
Engage in the dialogue where you can. Make a difference – and let us make sure the human is in charge.