Victor Luca, 26-Dec-19.
The Doomsday Clock reads 100 seconds to midnight, a setting decided by the Bulletin of the Atomic Scientists.
Does it matter to anyone reading these lines that, after their passing, human life as we know it on this earth could cease to exist? After all, when we are gone, we are gone, so does what happens afterwards matter anymore?
Existential risks are those that threaten the entire future of humanity. Believe it or not, there are teams of academics who think, study and obsess about such things. Cambridge University’s Centre for the Study of Existential Risk (https://www.cser.ac.uk/) has compiled a list of global existential risks. Oxford University’s Future of Humanity Institute and the Global Challenges Foundation have compiled comparable lists. I have condensed these risks down to the following:
1) Climate Change, 2) Nuclear War, 3) Artificial Intelligence, 4) Advanced Biotechnologies, 5) Naturally Occurring Pandemics, 6) Super-eruptions or impacts of outer space objects, 7) Cosmic Radiation.
Risks 1-5 in this list are termed endogenous or anthropogenic risks since they are largely due to our activities, while risks 6 and 7 are exogenous risks since they are not within our control. Exogenous risks have been present since before we, Homo sapiens, first walked the earth over 300,000 years ago. The probability that these risks cause our demise as a species is considered extremely small. Anthropogenic or endogenous risks are, to a large degree, a result of our breathtakingly rapid (exponential) scientific and technological development, and their probability is not negligible.
Citizens and national policy makers are generally not good at understanding, assessing and mitigating global catastrophic risks. Our feeble response to climate change is a case in point. Anthropogenic climate change has been discussed in the realm of science for more than one hundred years and has interested me for the last thirty years. Yet it is only very recently that climate change has entered widespread public discussion. Despite the seriousness of this endogenous risk, it fails to rally us into serious action. We posture and we vacillate, but we take no serious action. As I have said many times, we need to roll out a global Manhattan- or Sputnik-type Project response to climate change. Sadly, virtually nothing is happening! If we can pump trillions of dollars into keeping the global economic Ponzi scheme going, as happened after the Global Financial Crisis of 2008, then we should be able to find a few trillion dollars to keep the global mean temperature down, ice sheets and glaciers intact, the sea level where it is, storms and droughts at bay, and to maintain earth’s biodiversity and keep our environment pristine. I would consider declaring the Trump administration, on its own, a global existential risk. It has so far rolled back more than one hundred environmental regulations in the United States and shows no signs of stopping. Do your own research on this one, please.
Volcanic eruptions are phenomena that our community has had to come to grips with only very recently. Whakaari, aka White Island, is located 50 km from the mouth of the Whakatāne River and has bubbled away for at least the past 150,000 years. Whakaari was in more-or-less continuous eruption from December 1975 to September 2000, the longest historic eruption episode. There have also been other eruptive events, in 2012-2013 and 2016 (see https://www.geonet.org.nz/about/volcano/whiteisland). The remoteness of Whakaari has lulled us into a false sense of security, and maybe our attitude to the risks has been somewhat blasé. Yet even a minor eruptive event can have serious consequences for our community. Imagine what would happen if White Island turned into a super-volcano. Since a super-eruption is one of the two exogenous risks on the list above that we can do little or nothing about, my advice is to focus principally on the endogenous risks, which have a much higher probability of taking out much of human civilization and which we can do something about. Thus I ask: how prepared are we for climate change, which could wreak havoc on our community, or indeed the world?
Nuclear war has represented a potential threat to our existence since the first atomic bomb leveled Hiroshima. We have since come close to the outbreak of all-out nuclear war on many occasions, whether through accident, misunderstanding or competition between superpowers. I get the impression that we have largely dismissed the risk of nuclear war because, well, it hasn’t happened so far, so let’s sweep it under the carpet. The book “The Doomsday Machine: Confessions of a Nuclear War Planner” by Daniel Ellsberg gives a frightening bird’s-eye view of the machinations of the American military-industrial complex. Of course, there are many nations with such doomsday machines, although the Americans have by far the most powerful. The American military’s share of the last US budget was close to one trillion dollars.
To give an idea of just one of the threats that lurks in the murky depths, take the now-obsolete Russian Akula-class nuclear submarine (known as the Typhoon class in the West). This deadly weapon could stay submerged almost indefinitely were it not for the need to surface to replenish food stocks. An Akula-class sub carried twenty R-39 (NATO designation: Sturgeon) ballistic missiles, each in turn carrying ten independently targetable warheads of 100-200 kilotonnes. For reference, the Hiroshima bomb was about 15 kilotonnes. These 200 warheads could be launched in less time than it takes to order a pizza and could almost simultaneously knock out all major US cities. The Americans have the Ohio-class submarines and other super subs, which are smaller but stealthier and more capable, and they have more of them. I would hate to imagine what they have developed that is not even in the public domain. If nuclear war were to break out, we in Aotearoa would not be in the main path of any conflict, but that is not to say we would get away scot-free. Nuclear war would generate an enormous amount of radioactive fallout, which would undoubtedly smother us too and dramatically change the way we live. I will leave this fascinating subject for another time, since I am sure we all get the idea.
Suffice it to say for now that on 23 January of this year the Doomsday Clock was updated to 100 seconds to midnight. Midnight is essentially the end of organized human life. To find out more about the Doomsday Clock, refer to the Bulletin of the Atomic Scientists, which maintains it.
The Clock has become a universally recognized indicator of the world’s vulnerability to catastrophe from nuclear weapons, climate change, and disruptive technologies in other domains.
Artificial Intelligence (AI) is something that few of us have probably thought much about in the context of global existential threats. I have been hearing about AI since I was at university, and I now think the time has come to take this technology seriously. The Terminator series of science-fiction films gave us a taste of what to watch out for. The films depict a world in which machines produced by a global corporation try to take over the world of humans. The producers adopt an optimistic trajectory to each film’s conclusion, in which humans always manage to wrest back control of their world. Since I first heard the term AI, science and technology have moved forward in leaps and bounds. The growth of scientific knowledge has gone exponential, a term I tried to explain in a previous article in this paper. Today, the autonomous vehicle is here! Google’s self-driving car has clocked well over 1.6 million kilometers on America’s roadways without a single accident. Few humans could boast such a record.
The centerpiece of self-driving technology is AI: a silicon brain that receives inputs from a barrage of highly sensitive sensors and is able to ‘think’ and take action (drive). Although cybernetic organisms (cyborgs) such as the Cyberdyne Systems Model 101 (T-800) or the liquid-metal T-1000 that have entertained us since 1984 may be a far stretch, the leap from self-driving cars to advanced robots is a relatively small one. Self-driving vehicles are going to replace an entire workforce, and the robots of the future could make many more of us redundant. When robots with super-intelligence learn to build more advanced versions of themselves, why would they need us? In the Terminator films the humans come out on top; American films typically end on a high note. But the risk that AI contributes to terminating human civilization is not insignificant, and it is not a given that we would beat the machines. We should take all necessary measures not to be hoist with our own petard.
When I first started writing this article in December of last year, I was not going to mention the threat of global pandemics. Since then, the coronavirus has unfortunately become a household word. It was subsequently declared a global health emergency by the World Health Organization (WHO), and the rest is history, although we will be dealing with the aftermath for a long time to come. So it would have been remiss of me not to at least mention pandemics. As a result of COVID-19 we became better acquainted with the exponential growth that I wrote about toward the beginning of the year, and the term “flattening the curve” entered common usage. In retrospect we were very lucky to come through relatively unscathed, at least in terms of health, and that our national healthcare system was not tested to any significant degree. Had the virus been slightly more contagious and lethal, however, we might not have been so lucky. So it is fortunate that we have a reasonably well-functioning health system. Of course, I am being sarcastic.
In this era in which truth is losing its meaning, and social media adds more confusion than clarity, we should perhaps heed the adage “believe nothing of what you read and only half of what you see”. We must be skeptical, take a serious interest in what is going on around us, and then make decisions based on the science, data and evidence. We must be proactive rather than reactive. A focus on short-term economic rationalism and political expediency can have devastating consequences for society at large. As a society we must, as former Australian Democrats leader Don Chipp used to say, “keep the bastards honest”.
In the face of all of the above, I would still advocate not losing hope. After all, how many times have the All Blacks pulled a game back from the brink?