EA - The most important climate change uncertainty by cwa
Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: The most important climate change uncertainty, published by cwa on July 26, 2022 on The Effective Altruism Forum.

Short summary: we don't know how much warming will make civilization collapse, and it might not be unreasonable to think of climate change as a major x-risk on these grounds.

Background/Epistemic status: I’m a climate scientist who has been in and around the Effective Altruism community for quite a few years and has spent some time thinking about worst-case outcomes. The following content was partially inspired by this forum comment by SammyDMartin; it includes many of the same points, but also takes things further and potentially in some different directions. I claim no certainty (or even high confidence) in anything said here; the main goal is to encourage discussion.

Introduction

Outside of the EA world, discussion of climate change as a top-priority emergency and/or existential threat is ubiquitous: from politicians to scientists to popular movements and books. EA literature typically takes a different view: climate change is bad but not an existential threat, and there are many other threats that could cause more damage (like unaligned AI or engineered pandemics) and deserve our marginal resources more.

The disconnect between these two views on climate change has bothered me for a long time. Not because I necessarily agree strongly with one side or the other, but because I just haven’t had a good model for why this disconnect exists. I don’t think the answer is quite as simple as “the people who think climate change is a top priority haven’t thought seriously about existential risks and cause prioritization”; anecdotally, I’ve met many people within EA who have, and who still have a nagging feeling that EA is somehow downplaying the risks from climate change. So what is going on?

This post examines two distinct, related, and potentially provocative claims that I think cut to the heart of the disconnect:

1. Most of the subjective existential risk from climate change comes from the uncertainty in how much damage is caused by a given amount of warming.
2. It is not obviously unreasonable to think the existential risk from climate change is as large as that from AI or pandemics.

The focus is on the importance of climate change as a cause area, not neglectedness or tractability; I’ll revisit this omission a bit at the end. The discussion is also purposely aimed at a longtermist/existential-risk-prioritizing audience; the importance of climate change under other moral views can and should be discussed, but I will not do so here.

Background

Following, e.g., Halstead, it is instructive to split the question of climate change damages into three numbered questions:

1. How much greenhouse gas will humanity emit?
2. How much warming will there be per unit of CO2-equivalent greenhouse gas?
3. How much damage will be caused (to humanity) by a given amount of warming?

Question 1 can in principle be addressed by economic and political projections, thinking about the economics of decarbonization, and so on. Question 2 is addressed by climate science: climate modeling, looking at past climate changes, and so on. Question 3 has typically been addressed by economic modeling (more on this later) and by considering hard limits for habitability by humans.
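To make this decomposition concrete, here is a minimal Monte Carlo sketch (in Python) of how the three questions combine into a single probability of a catastrophic outcome. Every distribution and parameter value below is an illustrative placeholder rather than an actual IPCC or economic estimate; the point is only the structure, in which Question 3 enters as an uncertain damage threshold rather than a fixed cutoff.

```python
# Toy Monte Carlo combining the three questions.
# All distributions and parameters are illustrative placeholders, not real estimates.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Question 1: cumulative emissions this century, in GtCO2-equivalent (placeholder spread).
emissions = np.maximum(rng.normal(loc=4500, scale=1000, size=n), 0.0)

# Question 2: warming per 1000 GtCO2-eq, a crude stand-in for the
# "climate sensitivity" uncertainty (placeholder lognormal).
warming_per_1000gt = rng.lognormal(mean=np.log(0.45), sigma=0.3, size=n)

warming = emissions / 1000.0 * warming_per_1000gt  # toy warming, in degrees C

# Question 3: the warming level at which damage becomes catastrophic.
# This is the key unknown; treat it as uncertain rather than fixing it at, say, 10 C.
threshold = rng.lognormal(mean=np.log(8.0), sigma=0.4, size=n)

p_catastrophe = float(np.mean(warming >= threshold))
print(f"P(warming exceeds the damage threshold) ~ {p_catastrophe:.5f}")
```

With a fixed threshold (say exactly 10 °C), the same calculation collapses to a single exceedance probability on the warming distribution, which is roughly what the Question-2-focused analyses discussed next compute.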
Much existing EA analysis of climate change as an existential risk focuses on Question 2. This makes sense in part because it is the best quantified of the three: scientists and the IPCC publish probability distributions for this quantity, known as the “climate sensitivity”. Conditional on a given carbon emission scenario, we can then work out the probability of a certain amount of warming by a certain date (e.g. 2100). If you think that some extreme level of warming will cause existential catastrophe (say 6, 10, or 14 °C, whichever you fancy), then the existe...
