
Frontline Impact Defense

Climate change won’t cause human extinction, extinction rhetoric discourages action

Rick Newman, January 6, 2022, Let’s stop making this climate change mistake,
https://finance.yahoo.com/news/lets-stop-making-this-climate-change-mistake-
151934749.html
The Earth is ablaze, apparently. The New York Times recently published “Postcards from a world on fire,” a detailed accounting of
climate change disruptions in each of 193 countries. Atop the multimedia version of the feature, a spinning globe spews flame and
smoke, like the Twin Towers before they collapsed on 9-11. Climate change reportage routinely declares we are
destroying the planet, wrecking the Earth and imperiling the world, as if the entire geologic mass is about
to go poof. The countdown is on for the number of years—50? 30? 10?—we have to save the planet. Stay ahead of the market
These characterizations are not quite right—and overstating the consequences of a warming climate
may already be undermining efforts to take needed action. A warming climate is undoubtedly changing the
planet in ways dangerous to humans and other living things. But the Earth isn’t on fire, and the planet itself is not
endangered. What we’re damaging is our own habitat, and those of other species. The planet will carry on one way
or another. “We’re riding this planet right now,” says Bob Bunting, CEO of the Climate Adaptation Center in Sarasota and former
lead forecaster for the government weather agency NOAA. “It remains to be seen how permanent we are. The planet will evolve
with or without us. The planet doesn’t care whether we’re part of it or not.” We tend to anthropomorphize Earth—“Mother
Nature”—yet humans have only been part of the planet for a tiny portion of its existence. And the Earth has been as warm as it is
now at least three times during the last 400,000 years, according to data from Columbia University’s Earth Institute. Species
have come and gone, but a warming climate has never threatened the Earth itself. What's different
now is record levels of carbon in the atmosphere, suggesting temperatures will eventually hit unprecedented levels. Whether
humans will survive that is the real question. It might seem like innocent hyperbole or dramatic license to say we’re wrecking the
planet when we’re really damaging just a specific part of it that happens to be vital to us. After all, if we go extinct, the planet will
cease to exist, for humans. At that point, who cares if it continues to circle the sun without us. Yet existential
alarmism is
counterproductive when public support is crucial to addressing a problem as vast as climate
change. Most people, if told the planet is on fire, can look around and plainly see that it’s not.
Others may feel a sense of dread and think it’s pointless to do anything, if we’re really
doomed. Even people who know climate change is making floods, fires, droughts and storms worse can rightfully ask how urgent
the problem really is and how much climate activists exaggerate. For all the people killed and displaced by
freakish weather, there are many more who still don’t feel any direct impact from a warming
planet—and might even think a shorter winter in northern climes would be welcome. Most
Americans recognize that climate change is a serious problem and many consider it a crisis. But that’s not the same as resolving to
take action. Economists almost universally agree that one of the most effective ways to trigger a green-energy transformation would
be to enact a carbon tax that makes fossil fuels increasingly expensive, and renewables ever cheaper by comparison. Yet that has
proven politically impossible. President Biden is pushing for a huge green-energy transformation, but his plan doesn’t include a
carbon tax, because you simply can’t win elections by promising to raise the cost of fueling cars and heating homes. In Washington
state, one of the most liberal and environmentally aware, voters nixed carbon tax initiatives in 2016 and 2018.
Some voters say they’re willing to sacrifice to help deal with a warming planet, but that hasn’t yet translated into political action.
Biden’s Build Back Better legislation includes several hundred billion dollars in green-energy investments, but that hasn’t passed yet,
and may never. Aside from that, U.S. efforts to address climate change have been modest at best: tax incentives for electric vehicles,
a bit of infrastructure funding, on-off-and-on-again increases in fuel-efficiency standards. Not much, given the scale of the problem.
Keeping global temperatures at manageable levels is going to be really expensive. The International Energy Agency says it will take
$5 trillion in global energy investment per year by 2030. The International Renewable Energy Agency estimates a total need for $131
trillion in global energy investment by 2050. If the U.S. contributed according to its proportion of global GDP, that would be $21
trillion during the next 30 years or so, or $700 billion every year above what we’re spending now. Some of that would be private
investment, but it would require policy changes likely to increase the return on renewables while lowering the return on carbon.
Hence the political barriers. It would also require some amount of taxpayer funding way higher than anybody is seriously talking
about now.
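
The $21 trillion and $700 billion figures above can be reproduced with simple arithmetic. A minimal sketch, assuming a US share of global GDP of roughly 16 percent (a figure inferred by working backward from the card's numbers, not stated in the article):

```python
# Back-of-envelope reproduction of the investment arithmetic above (a sketch, not from
# the source). The 16% US share of global GDP is inferred from the card's $21 trillion
# figure; it is closer to a PPP-based share than a market-exchange-rate share.
global_need_by_2050 = 131e12   # IRENA estimate of total global energy investment (USD)
us_gdp_share = 0.16            # assumed/implied US share of global GDP
years = 30                     # "during the next 30 years or so"

us_total = global_need_by_2050 * us_gdp_share
print(f"US total: ${us_total / 1e12:.0f} trillion")            # ~$21 trillion
print(f"Per year: ${us_total / years / 1e9:.0f} billion/year") # ~$700 billion/year
```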

Other ex risks comparatively outweigh.


Hilton, 22 -- Research Analyst at 80,000 Hours & Policy Advisor in the Cabinet Office and
Treasury and Department for International Trade.

[Benjamin Hilton, "Climate change poses a tremendous challenge and will cause immense
suffering, but does it threaten humanity's very survival?," 80,000 Hours, 5-18-2022,
https://80000hours.org/problem-profiles/climate-change/, accessed 5-23-2022; AD]
Reasons not to work on climate change

It’s not as neglected as other issues

Climate change as a whole gets a lot of attention and funding. In particular, it gets much more attention than
many other pressing global issues.

The US federal budget included about $23 billion of climate change spending in the 2021 fiscal
year. The UK spent about £4 billion in the 2021–22 financial year. And several hundred million dollars are
spent each year by foundations.24 Philanthropic spending on climate change is $5 to $10 billion a year. On top of this,
many businesses and universities around the world work on general climate change research or
technologies designed to reduce emissions. The Climate Policy Initiative counted over $600
billion in climate-related spending in 2020.

In comparison, biosecurity in general receives around $3 billion per year, preventing catastrophic pandemics
in particular receives around $1 billion, and reducing risks from artificial intelligence receives
between $10 million and $50 million.25
If a problem is less neglected, it will be harder for an additional person to make as much of a difference working on it.
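
The funding comparison above is easiest to read as ratios. A minimal sketch using only the rough annual figures quoted in the card:

```python
# Ratio check on the neglectedness comparison (sketch; figures are the rough annual
# estimates quoted above, not independently verified).
climate = 600e9   # Climate Policy Initiative count of climate-related spending, 2020
others = {
    "biosecurity (general)": 3e9,
    "preventing catastrophic pandemics": 1e9,
    "AI risk reduction (midpoint of $10M-$50M)": 30e6,
}
for area, dollars in others.items():
    print(f"climate spending is ~{climate / dollars:,.0f}x spending on {area}")
# ~200x biosecurity, ~600x pandemic prevention, ~20,000x AI risk reduction
```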

Other existential threats seem considerably greater

Another consideration is that, for all that climate change is a serious problem, there appear to be other risks that pose
larger threats to humanity’s long-term thriving.

Experts studying risks of human extinction usually think nuclear war, great power conflict in general,
and certain dangerous advances in machine learning or biotechnology all have a higher likelihood of
causing human extinction than climate change.
This seems roughly right to us.

You might think that, for the reasons discussed above, climate change is a substantial enough factor contributing to other risks to be
worth prioritising. That wouldn’t be an unreasonable view.

But these other, often more direct threats to humanity often also act as contributing factors. For example,
pandemics can increase geopolitical tensions, and thus risks of conflict. And
some of them — especially engineered
pandemics and risks from misaligned AI — seem to be direct extinction threats themselves on top of that.
No climate impact---bad studies and adaptation.
Nils P. Gleditsch 21, Research Professor at the Peace Research Institute Oslo, “This time is
different! Or is it? NeoMalthusians and environmental optimists in the age of climate change,”
Journal of Peace Research, pg. 5-6, 2021, SAGE. clarification denoted with brackets.
The most extreme contrarian position is, of course, to deny one or both key conclusions of the IPCC: the reality of global warming or
the human contribution to it. However, most environmental optimists accept these two key conclusions but raise other
problems with the panel’s discussion of the social effects of climate change and even more so with
popular interpretations of the panel reports. For instance, Hausfather & Peters (2020), by no means ‘climate deniers’,
decry the common use of choosing the high-risk [scenario] RCP8.5 to illustrate ‘business as
usual’ as misleading.

The causal chains from climate change to the proposed effects on human beings are long and
complex, and the uncertainty increases every step of the way. In the literature on the social effects
of climate change, including the IPCC reports, statements abound that something ‘may’ lead to
something else, or that a variable ‘is sensitive to’ another, without any guidelines for how to
translate this into probabilities (Gleditsch & Nordås, 2014: 87f). Uncritical use of the precautionary
principle, where any remotely possible calamity unwittingly becomes a probable event, is not
helpful.

Gleditsch & Nordås (2014: 85) note that while AR5 (IPCC, 2014) did not find strong evidence for a direct link
between climate change and conflict, it argued that climate change is likely to impact known conflict-
inducing factors like poverty and inconsistent political institutions and therefore might have an indirect effect on conflict. But
this assumes that correlations are transitive, which is not generally the case. If A correlates with B
and B with C, we know nothing about how A relates to C unless both correlations are extremely
high. The strongest case for the climate–conflict link is the effect of interaction between climate change and factors like poverty,
state failure, or ethnic polarization. It may be more cost-effective to try to deal with these other risk factors than with global
warming itself if the goal is to reduce the ‘risk multiplier’ effect of climate change on armed conflict.
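
Gleditsch's point that correlations are not transitive can be shown with a small simulation. A sketch with illustrative data (not from the article): A correlates with B and B with C, yet A and C are essentially uncorrelated.

```python
# Correlation is not transitive: A and B share one component, B and C share another,
# so corr(A,B) and corr(B,C) are both ~0.5 while corr(A,C) is ~0.
import numpy as np

rng = np.random.default_rng(0)
x, y, z, w = rng.standard_normal((4, 100_000))

a = x + y   # A overlaps with B through y
b = y + z   # B overlaps with C through z
c = z + w   # A and C share nothing

print(round(np.corrcoef(a, b)[0, 1], 2))  # ~0.5
print(round(np.corrcoef(b, c)[0, 1], 2))  # ~0.5
print(round(np.corrcoef(a, c)[0, 1], 2))  # ~0.0
```

Only when both pairwise correlations are very high (high enough that their squares sum to more than 1) is the A-C correlation forced to be positive, which is the "extremely high" caveat in the card.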

The articles in this special issue do not generally see scarcity by itself as necessarily resulting in strongly negative outcomes. Factors
like development, state failure, and previous overload on ecosystems continue to play an important role in that they interact with
climate change to produce conflict and other social outcomes. For instance, Ide, Kristensen & Bartusevičius (2021) conclude that
the impact of floods on political conflict is contingent on other factors such as population size and regime type. Moreover, most of
the articles do not assume that scarcities are likely to arise at the global level. They may be regional (mostly in Africa), national, or
local. Urban and rural areas may be affected by different scarcities. Climate change may also affect particularly strongly groups that
are already at an economic or political disadvantage. The effects can be alleviated and adaptations constructed at these levels.

The argument about how climate change may indirectly impact conflict leans heavily on the negative
economic consequences of climate change, but with little or no reference to the research that
explicitly deals with this topic . In fact, the relevant chapter in AR5 concluded that for most sectors of the economy, the
impact of climate change was likely to be dwarfed by other factors. Tol (2018) finds that the long-term global economic effects are
likely to be negative, but that a century
of climate change will have about the same impact on the
economy as the loss of one year of economic growth. Other economists are more cautious, but the dean of
climate change economics, William Nordhaus (2018: 345, 359), estimates that ‘damages are 2.1 percent of global income at 3°C
warming and 8.5 percent of income at 6°C’, while also warning that the longer the delay in taking decisive action, the harsher the
necessary countermeasures. Stern (2006) is more pessimistic, based mainly on a lower discount rate (the interest rate used to
calculate the present value of future cash flows) as are Wagner & Weitzman (2015). Heal (2017) argues that the Integrated
Assessment Models generally used
in the assessment of the economics of climate change are not
accurate enough to provide quantitative insights and should not be taken as serious forecasts. Yet,
all these economists take the basically optimistic view that climate change is manageable with appropriate policies for raising the
price on the emission of greenhouse gases. With a chapter heading from Wagner & Weitzman (2015: 17): ‘We can do this’.

This more optimistic assessment of climate change does not assume that the challenge will go away by itself or can be left to the
market. A plausible approach, favored by most economists,10 is the imposition of a robust and increasing price on carbon emissions
(whether as a carbon tax or through a cap and trade scheme) high enough to reduce the use of fossil fuels and encourage the search
for their replacement. More than 25 countries had such taxes by early 2018 (Metcalf, 2019), but generally not at a level seen as
necessary for limiting global warming to, say, 2°C. This approach relies on the use of the market mechanism, but with targets fixed by
public policy. Income from a carbon tax can be channeled back to the citizens to avoid increasing overall taxation. To speed up the
transition, funds can also be allocated to the research and development of cheaper and more efficient production of various forms
of fossil-free energy, including nuclear power (Goldstein & Qvist, 2019).

The response of the environmental optimists continues to emphasize the role of innovations; technological
innovations, such
as improvements in battery technology, the key element in the 2019 Nobel Prize in chemistry,11
but also social innovations, as exemplified by the experimental approach to the alleviation of
poverty, rewarded in the same year by the Nobel Prize in economics.12

While the most important countermeasures will be directed at the mitigation of climate change, there is also a strong case
for adaptation. If sea-level rise cannot be totally prevented, dikes and flood barriers will be cost-effective
and necessary, at least in high-value urban areas. If parts of Africa suffer from drought, there will be increased use
for new crops that are more suitable for a dry climate, possibly developed in part by GMO
technology. Industrialization in Africa can decrease the one-sided reliance on rain-fed agriculture, as
it has in other parts of the world, which have moved human resources from the primary sector to industry (and then to
services). Continuing urbanization will move millions out of the most vulnerable communities (Collier,
2010). While structural change failed to produce economic growth in Latin America and Africa after 1990, Africa has experienced a
turnaround in the new millennium (McMillan & Rodrik, 2014) and there are also potentials for increasing
productivity by structural change within agriculture in Africa (McCullough, 2017).

No warming impact and emissions are inevitable

a) Huge uncertainties---climate sensitivity models range from barely any warming to catastrophic with no gauge of certainty

b) Can’t be existential---the worst-case models assume impossible emissions levels with no mitigation or adaptation

c) Timeframe---impacts are slow which allows time to adapt and manage the consequences

d) Renewables worse---fast transition locks in natural gas as a bridge fuel which makes zero emissions impossible OR causes energy shortages because storage tech isn’t ready---that’s Curry.

Judith Curry 19, President of Climate Forecast Applications Network (CFAN), Professor Emerita
of Earth and Atmospheric Sciences at the Georgia Institute of Technology, Ph.D. in atmospheric
science from the University of Chicago, 2/9/19, “Statement to the Committee on Natural
Resources of the United States House of Representatives,”
https://curryja.files.wordpress.com/2019/02/curry-testimony-house-natural-resources.pdf
The urgency (?) of CO2 emissions reductions
In the decades since the 1992 UNFCCC Treaty, global CO2 emissions have continued to increase, especially in developing countries.
In 2010, the world’s governments agreed that emissions need to be reduced so that global temperature increases are limited to
below 2 degrees Celsius.17 The target of 2°C (and increasingly 1.5°C)18 remains the focal point of international climate agreements
and negotiations.

The original rationale for the 2°C target is the idea that ‘tipping points’ − abrupt or nonlinear transition to
a different climate state − become likely to occur once this threshold has been crossed , with consequences
that are largely uncontrollable and beyond our management. The IPCC AR5 considered a number of potential tipping points,
including ice sheet collapse, collapse of the Atlantic overturning circulation, and permafrost
carbon release. Every single catastrophic scenario considered by the IPCC AR5 (WGII, Table 12.4) has a
rating of very unlikely or exceptionally unlikely and/or has low confidence. The only tipping
point that the IPCC considers likely in the 21st century is disappearance of Arctic summer sea ice
(which is fairly reversible, since sea ice freezes every winter).

In the absence of tipping points on the timescale of the 21st century, the 2°C limit is more
usefully considered by analogy to a highway speed limit:19 driving at 10 mph under the speed limit is not
automatically safe, and exceeding the limit by 10 mph is not automatically dangerous, although the faster one travels the greater the
danger from an accident. Analogously, the 2°C (or 1.5°C) limit should not be taken literally as a real
danger threshold. An analogy for considering the urgency of emissions reductions is your 401K account: if you begin making
contributions early, it will be easier to meet your retirement goals.

Nevertheless, the 2°C and 1.5°C limits are used to motivate the urgency of action to reduce CO2
emissions. At a recent UN Climate Summit, (former) Secretary-General Ban Ki-moon warned that: “Without significant cuts in
emissions by all countries, and in key sectors, the window of opportunity to stay within less than 2 degrees [of warming] will soon
close forever.”20 Actually, this
window of opportunity may remain open for quite some time. The
implications of the lower values of climate sensitivity found by Lewis and Curry21 and other recent studies is
that human caused warming is not expected to exceed the 2°C ‘danger’ level in the 21st
century. Further, there is growing evidence that the RCP8.5 scenario for future greenhouse gas
concentrations, which drives the largest amount of warming in climate model simulations, is
impossibly high, requiring a combination of numerous borderline impossible socioeconomic
scenarios.22 A slower rate of warming means there is less urgency to phase out greenhouse gas
emissions now, and more time to find ways to decarbonize the economy affordably and with a
minimum of unintended consequences. It also allows for the flexibility to revise our policies as
further information becomes available.

Is it possible that something truly dangerous and unforeseen could happen to Earth’s climate during
the 21st century? Yes it is possible, but natural climate variability (including geologic processes) may be
a more likely source of possible undesirable change than manmade warming. In any event,
attempting to avoid such a dangerous and unforeseen climate by reducing fossil fuel emissions
will be futile if natural climate and geologic processes are dominant factors . Geologic processes are an
important factor in the potential instability of the West Antarctic ice sheet that could contribute to substantial sea level rise in the
21st century.23

Under the Paris Agreement, individual countries have submitted to the UNFCCC their Nationally Determined
Contributions (NDCs). Under the Obama Administration, the U.S. NDC had a goal of reducing emissions by 28% below 2005 levels
by 2025. Apart from considerations of feasibility and cost, it has been estimated24 using the EPA MAGICC model that this
commitment will prevent 0.03°C in warming by 2100. When combined with current commitments from other
nations, only a small fraction of the projected future warming will be ameliorated by these
commitments. If climate models are indeed running too hot,25 then the amount of warming prevented would be even smaller.
Even if emissions immediately went to zero and the projections of climate models are to be
believed, the impact on the climate would not be noticeable until the 2nd half of the 21st
century. Most of the expected benefits to the climate from the UNFCCC emissions reductions policy will be realized in the 22nd
century and beyond.
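
The scale of the 0.03°C figure can be checked with a crude back-of-envelope calculation. This is a sketch, not the EPA MAGICC calculation Curry cites; the emissions and climate-response numbers below are assumed round values:

```python
# Order-of-magnitude check on warming avoided by the US NDC (28% below 2005 levels).
us_2005_emissions = 6.0    # GtCO2 per year (assumed round value)
cut_fraction = 0.28        # NDC target relative to 2005
years_sustained = 75       # if the cut were held from 2025 through 2100
tcre = 0.45 / 1000         # assumed transient response: degC per GtCO2 of cumulative emissions

avoided_emissions = us_2005_emissions * cut_fraction * years_sustained  # ~126 GtCO2
print(f"Avoided warming by 2100: ~{avoided_emissions * tcre:.2f} degC") # ~0.06 degC
# Same order of magnitude as the 0.03 degC quoted above; the exact number depends on
# the model used and on how long the cut is assumed to be sustained.
```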

Attempting to use carbon dioxide as a control knob to regulate climate on decadal to century timescales
is arguably futile. The UNFCCC emissions reductions policies have brought us to a point between a rock
and a hard place, whereby the emissions reduction policy with its extensive costs and questions
of feasibility are inadequate for making a meaningful dent in slowing down the expected
warming in the 21st century. And the real societal consequences of climate change and extreme
weather events (whether caused by manmade climate change or natural variability) remain largely unaddressed.
This is not to say that a transition away from burning fossil fuels doesn’t make sense over the course of the 21st century. People
prefer ‘clean’ over ‘dirty’ energy – provided that all other things are equal, such as reliability, security, and economy. However,
assuming that current wind and solar technologies are adequate for providing the required
amount and density of electric power for an advanced economy is misguided.26
The recent record-breaking cold outbreak in the Midwest is a stark reminder of the challenges of providing a reliable power supply in
the face of extreme weather events, where an inadequate power supply not only harms the economy, but jeopardizes lives and
public safety. Last week, central Minnesota experienced a natural gas ‘brownout,’ as Xcel Energy advised customers to turn
thermostats down to 60 degrees and avoid using hot water.27 Why? Because the wind wasn’t blowing during an exceptionally cold
period. Utilities pair natural gas plants with wind farms, where the gas plants can be ramped up and down quickly when the wind
isn’t blowing. With bitter cold temperatures and no wind, there wasn’t enough natural gas.

A transition to an electric power system driven solely by wind and solar would require a massive
amount of energy storage. While energy storage technologies are advancing, massive
deployment of cost-effective energy storage technologies is well beyond current capabilities.28
An unintended consequence of rapid deployment of wind and solar energy farms may be that
natural gas power plants become increasingly entrenched in the power supply system.

Apart from energy policy, there are a number of land use practices related to croplands, grazing lands,
forests and wetlands that could increase the natural sequestration of carbon and have ancillary
economic and ecosystem benefits.29 These co-benefits include improved biodiversity, soil
quality, agricultural productivity and wildfire behavior modification.

In evaluating the urgency of CO2 emissions reductions, we need to be realistic about what reducing emissions will actually
accomplish. Drastic reductions of emissions in the U.S. will not reduce global CO2 concentrations if
emissions in the developing world, particularly China and India, continue to increase. If we believe
the climate model simulations, we would not expect to see any changes in extreme weather/climate
events until late in the 21st century. The greatest impacts will be felt in the 22nd century and beyond, in terms of
reducing sea level rise and ocean acidification.

Resilience, anti-fragility and thrivability

Given that emissions reductions policies are very costly, politically contentious and are not expected to change the climate in a
meaningful way in the 21st century, adaptation strategies are receiving increasing attention in
formulating responses to climate change.
The extreme damages from recent hurricanes plus the recent billion dollar disasters from floods, droughts and wildfires, emphasize
that the U.S. is highly vulnerable to current weather and climate disasters. Even worse disasters were encountered in the U.S. during
the 1930’s and 1950’s. Possible scenarios of incremental worsening of weather and climate extremes over the course of the 21st
century don’t change the fundamental storyline that many regions of the U.S. are not well adapted to the current weather and
climate variability, let alone the range that has been experienced over the past two centuries.
As a practical matter, adaptation has been driven by local crises associated with extreme weather and
climate events, emphasizing the role of ‘surprises’ in shaping responses. Advocates of adaptation to climate
change are not arguing for simply responding to events and changes after they occur; they are arguing for anticipatory
adaptation. However, in adapting to climate change, we need to acknowledge that we cannot know how the climate will evolve
in the 21st century, we are certain to be surprised and we will make mistakes along the way.

‘Resilience’ is the ability to ‘bounce back’ in the face of unexpected events. Resilience carries a connotation of returning to the
original state as quickly as possible. The difference in impact and recovery from Hurricane Sandy striking New York City in 2012
versus the impact of Tropical Cyclone Nargis striking Myanmar in 200830 reflects very different vulnerabilities and capacities for
bouncing back.

To increase our resilience to extreme weather and climate events, we can ‘bounce forward’ to reduce future vulnerability by
evolving our infrastructures, institutions and practices. Nassim Nicholas Taleb’s concept of antifragility31 focuses on learning from adversity,
and developing approaches that enable us to thrive from high levels of volatility, particularly unexpected extreme events. Anti-
fragility goes beyond ‘bouncing back’ to becoming even better as a result of encountering and overcoming challenges. Anti-fragile
systems are dynamic rather than static, thriving and growing in new directions rather than simply maintaining the status quo.

Strategies to increase antifragility include: economic development, reducing the downside from volatility, developing a range of
options, tinkering with small experiments, and developing and testing transformative ideas. Antifragility is consistent with
decentralized models of policy innovation that create flexibility and redundance in the face of volatility. This ‘innovation dividend’ is
analogous to biodiversity in the natural world, enhancing resilience in the face of future shocks.32

Similar to anti-fragility, the concept of ‘thrivability’ has been articulated by Jean Russell:33 “It isn’t enough to repair the damage our
progress has brought. It is also not enough to manage our risks and be more shock-resistant. Now is not only the time to course
correct and be more resilient. It is a time to imagine what we can generate for the world. Not only can we work to minimize our
footprint but we can also create positive handprints. It is time to strive for a world that thrives.”

A focus on policies that support resilience, anti-fragility and thrivability avoids the hubris of thinking we can predict the future
climate. The relevant questions then become:

• How can we best promote the development of transformative ideas and technologies?

• How much resilience can we afford?

The threats from climate change (whether natural or human caused) are fundamentally regional, associated not only with regional
changes to the weather/climate, but with local vulnerabilities and cultural values and perceptions. In the least developed countries,
energy poverty and survivability is of overwhelming concern, where there are severe challenges to meeting basic needs and their
idea of clean energy is something other than burning dung inside their dwelling for cooking and heating. In many less developed
countries, particularly in South Asia, an overwhelming concern is vulnerability to extreme weather events such as floods and
hurricanes that can set back the local economies for a generation. In the developed world, countries are relatively less vulnerable to
climate change and extreme weather events and have the luxury of experimenting with new ideas: entrepreneurs not only want to
make money, but also to strive for greatness and transform the infrastructure for society.

Extreme weather/climate events such as landfalling major hurricanes, floods, extreme heat waves and droughts become
catastrophes through a combination of large populations, large and exposed infrastructure in vulnerable locations, and human
modification of natural systems that can provide a natural safety barrier (e.g. deforestation, draining wetlands). Addressing
current adaptive deficits and planning for climate compatible development will increase societal
resilience to future extreme events that may possibly be more frequent or severe in the future.
Ways forward

Climate scientists have made a forceful argument for a future threat from manmade climate change. Based upon our
current assessment of the science, the threat does not seem to be an existential one on the time
scale of the 21st century, even in its most alarming incarnation. However, the perception of manmade
climate change as a near-term apocalypse and alignment with a range of other social objectives has narrowed
the policy options that we’re willing to consider.
No impact to warming.

--CO2 levels are historically low

--CO2 is not correlated with higher temperatures

--Humans and fossil fuels aren’t the primary cause of carbon concentrations

Jay Lehr 19, Ph.D. in Groundwater Hydrology from the University of Arizona, and Tom Harris,
Executive Director of the International Climate Science Coalition, “Global Warming Myth
Debunked: Humans Have Minimal Impact on Atmosphere’s Carbon Dioxide and Climate”,
Western Journal, 2-14, https://www.westernjournal.com/global-warming-myth-debunked-
humans-minimal-impact-atmospheres-carbon-dioxide-climate/ [language modified]

Global warming activists argue carbon-dioxide emissions are destroying the planet, but the
climate impacts of carbon dioxide are minimal, at worst. Activists would also have you believe fossil-fuel
emissions have driven carbon-dioxide concentrations to their highest levels in history. The Obama-era Environmental Protection
Agency went so far as to classify carbon dioxide as a toxic pollutant, and it established a radical goal of closing all of America’s coal-
fired power plants.

Claims of unprecedented carbon-dioxide levels ignore most of Earth’s 4.6-billion-year history.


Relative to Earth’s entire record, carbon-dioxide levels are at historically low levels; they only
appear high when compared to the dangerously low levels of carbon dioxide that occurred in
Earth’s very recent history. The geologic record reveals carbon dioxide has almost always been
in Earth’s atmosphere in much greater concentrations than it is today. For example, 600 million
years ago, when history’s greatest birth of new animal species occurred, atmospheric carbon-dioxide
concentrations exceeded 6,500 parts per million (ppm) — an amount that’s 17 times greater than it
is today.
Atmospheric carbon dioxide is currently only 410 parts per million. That means only 0.04 percent of our atmosphere is carbon
dioxide (compared to 0.03 percent one century ago). Only one molecule in 2,500 is carbon dioxide. Such levels certainly do not pose
a health risk, as carbon-dioxide levels in our naval submarines, which stay submerged for months at a time, contain an average
carbon-dioxide concentration of 5,000 ppm.

The geologic record is important because it reveals relationships between carbon-dioxide levels, climate, and life on Earth. Over
billions of years, the geologic record shows there is no long-term correlation between
atmospheric carbon-dioxide levels and Earth’s climate. There are periods in Earth’s history when
carbon dioxide concentrations were many times higher than they are today, yet temperatures
were identical to, or even colder than, modern times. The claim that fossil-fuel emissions
control atmospheric carbon-dioxide concentrations is also invalid, as atmospheric
concentrations have gone up and down in the geological record, even without human influence.
The absurdity of climate alarmism claims gets even stranger when you consider there are 7.5 billion people on our planet who,
together, exhale 2.7 billion tons of carbon dioxide each year, which is almost 10 percent of total fossil-fuel emissions every year.
However, we are but a single species. Combined, people and all domesticated animals contribute 10 billion tons.
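
The 2.7-billion-ton exhalation figure is itself a back-of-envelope calculation. A sketch, assuming roughly 1 kg of CO2 exhaled per person per day and about 37 GtCO2 per year of fossil-fuel emissions (both assumed round values, not from the card):

```python
# Reproducing the human-exhalation arithmetic quoted above.
population = 7.5e9
co2_per_person_per_day_kg = 1.0     # assumed round value
fossil_emissions_gt = 37.0          # assumed annual fossil CO2 emissions, GtCO2

exhaled_gt = population * co2_per_person_per_day_kg * 365 / 1e12   # kg -> Gt
print(f"Human exhalation: ~{exhaled_gt:.1f} GtCO2/yr")                        # ~2.7
print(f"Share of fossil emissions: ~{exhaled_gt / fossil_emissions_gt:.0%}")  # ~7%
```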

Further, 9
percent of carbon-dioxide emissions from all living things arise not from animals, but
from anaerobic bacteria and fungi. These organisms metabolize dead plant and animal matter in soil via decay
processes that recycle carbon dioxide back into the atmosphere. The grand total produced by all living things is estimated to be 440
billion tons per year, or 13 times the amount of carbon dioxide currently being produced by fossil-fuel emissions. Fossil-fuel
emissions are less than 10 percent of biological emissions. Are you laughing yet?
Every apocalyptic pronouncement you hear or read is [totally wrong] nothing short of insanity. Their
primary goal is not to save plants, humans, or animals, but rather to use climate “dangers” as a justification for centralizing power in
the hands of a select few.

Even extreme warming won’t cause extinction

Dr. Toby Ord 20, Senior Research Fellow in Philosophy at Oxford University, DPhil in Philosophy
from the University of Oxford, The Precipice: Existential Risk and the Future of Humanity,
Hachette Books, Kindle Edition, p. 110-112

But the purpose of this chapter is finding and assessing threats that pose a direct existential risk to humanity. Even
at such
extreme levels of warming, it is difficult to see exactly how climate change could do so. Major effects
of climate change include reduced agricultural yields, sea level rises, water scarcity, increased tropical
diseases, ocean acidification and the collapse of the Gulf Stream. While extremely important when assessing
the overall risks of climate change, none of these threaten extinction or irrevocable collapse.

Crops are very sensitive to reductions in temperature (due to frosts), but less sensitive to increases. By all appearances
we would still have food to support civilization.85 Even if sea levels rose hundreds of meters (over
centuries), most of the Earth’s land area would remain. Similarly, while some areas might conceivably become
uninhabitable due to water scarcity, other areas will have increased rainfall. More areas may become susceptible to
tropical diseases, but we need only look to the tropics to see civilization flourish despite this. The main
effect of a collapse of the system of Atlantic Ocean currents that includes the Gulf Stream is a 2°C cooling of Europe—something that
poses no permanent threat to global civilization.

From an existential risk perspective, a more serious concern is that the high temperatures (and the rapidity of their
change) might cause a large loss of biodiversity and subsequent ecosystem collapse. While the pathway
is not entirely clear, a large enough collapse of ecosystems across the globe could perhaps threaten human extinction. The idea that
climate change could cause widespread extinctions has some good theoretical support.86 Yet
the evidence is mixed. For
when we look at many of the past cases of extremely high global temperatures or extremely
rapid warming we don’t see a corresponding loss of biodiversity.87
[FOOTNOTE]

We don’t see such biodiversity loss in the 12°C warmer climate of the early Eocene, nor the
rapid global change of the PETM, nor in rapid regional changes of climate. Willis et al. (2010) state: “We
argue that although the underlying mechanisms responsible for these past changes in climate were very different (i.e. natural
processes rather than anthropogenic), the
rates and magnitude of climate change are similar to those
predicted for the future and therefore potentially relevant to understanding future biotic
response. What emerges from these past records is evidence for rapid community turnover,
migrations, development of novel ecosystems and thresholds from one stable ecosystem state
to another, but there is very little evidence for broad-scale extinctions due to a warming
world.” There are similar conclusions in Botkin et al. (2007), Dawson et al. (2011), Hof et al. (2011) and
Willis & MacDonald (2011). The best evidence of warming causing extinction may be from the end-Permian mass extinction,
which may have been associated with large-scale warming (see note 91 to this chapter).

[END FOOTNOTE]

So the most important known effect of climate change from the perspective of direct existential
risk is probably the most obvious: heat stress. We need an environment cooler than our body temperature to be able to rid
ourselves of waste heat and stay alive. More precisely, we need to be able to lose heat by sweating, which depends on the humidity
as well as the temperature.
A landmark paper by Steven Sherwood and Matthew Huber showed that with sufficient warming there would be parts of the world
whose temperature and humidity combine to exceed the level where humans could survive without air conditioning.88 With 12°C of
warming, a very large land area—where more than half of all people currently live and where much of our food is grown—would
exceed this level at some point during a typical year. Sherwood and Huber suggest that such areas would be uninhabitable. This may
not quite be true (particularly if air conditioning is possible during the hottest months), but their habitability is at least in question.
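
The Sherwood and Huber survivability limit is usually expressed as a wet-bulb temperature of about 35°C. The sketch below uses Stull's (2011) empirical approximation to show how air temperature and humidity combine; both the 35°C threshold and the formula are background assumptions, not taken from Ord's text.

```python
from math import atan, sqrt

def wet_bulb_c(temp_c: float, rh_pct: float) -> float:
    """Stull (2011) wet-bulb approximation (valid for roughly 5-99% relative humidity)."""
    return (temp_c * atan(0.151977 * sqrt(rh_pct + 8.313659))
            + atan(temp_c + rh_pct) - atan(rh_pct - 1.676331)
            + 0.00391838 * rh_pct ** 1.5 * atan(0.023101 * rh_pct)
            - 4.686035)

LIMIT_C = 35.0  # sustained wet-bulb temperatures above ~35 degC are considered unsurvivable

for temp, rh in [(45, 40), (45, 60), (50, 50)]:
    tw = wet_bulb_c(temp, rh)
    status = "exceeds" if tw > LIMIT_C else "stays below"
    print(f"{temp} degC at {rh}% humidity -> wet-bulb ~{tw:.1f} degC ({status} the limit)")
```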

However, substantial regions would also remain below this threshold. Even with an extreme
20°C of warming there would be many coastal areas (and some elevated regions) that would
have no days above the temperature/humidity threshold.89 So there would remain large areas in
which humanity and civilization could continue. A world with 20°C of warming would be an unparalleled human
and environmental tragedy, forcing mass migration and perhaps starvation too. This is reason enough to do our utmost to prevent
anything like that from ever happening. However, our present task is identifying existential risks to humanity and it is hard to
see how any realistic level of heat stress could pose such a risk. So the runaway and moist greenhouse effects
remain the only known mechanisms through which climate change could directly cause our extinction or irrevocable collapse.

This doesn’t rule out unknown mechanisms. We are considering large changes to the Earth that may even be unprecedented in size
or speed. It wouldn’t be astonishing if that directly led to our permanent ruin. The best argument against such unknown
mechanisms is probably that the PETM did not lead to a mass extinction, despite temperatures rapidly rising about 5°C, to reach a
level 14°C above pre-industrial temperatures.90 But this is tempered by the imprecision of paleoclimate data, the sparsity of the
fossil record, the smaller size of mammals at the time (making them more heat-tolerant), and a reluctance to rely on a single
example. Most importantly, anthropogenic warming could be over a hundred times faster than warming during the PETM, and rapid
warming has been suggested as a contributing factor in the end-Permian mass extinction, in which 96 percent of species went
extinct.91 In the end, we can say little more than that direct existential risk from climate change appears very
small, but cannot yet be ruled out.

Climate doesn’t cause extinction.


Kerr et al. 19 – Dr. Amber Kerr, Energy and Resources PhD at the University of California-
Berkeley, known agroecologist, former coordinator of the USDA California Climate Hub. Dr.
Daniel Swain, Climate Science PhD at UCLA, climate scientist, a research fellow at the National
Center for Atmospheric Research. Dr. Andrew King, Earth Sciences PhD, Climate Extremes
Research Fellow at the University of Melbourne. Dr. Peter Kalmus, Physics PhD from Columbia
University, climate scientist at NASA’s Jet Propulsion Lab. Professor Richard Betts, Chair in
Climate Impacts at the University of Exeter, a lead author on the Fourth Assessment Report of
the Intergovernmental Panel on Climate Change (IPCC) in Working Group 1. Dr. William
Huiskamp, Paleoclimatology PhD at the Climate Change Research Center, climate scientist at the
Potsdam Institute for Climate Impact Research. [Claim that human civilization could end in 30
years is speculative, not supported with evidence, 6-4-2019,
https://climatefeedback.org/evaluation/iflscience-story-on-speculative-report-provides-little-
scientific-context-james-felton/]

There is no scientific basis to suggest that climate breakdown will “annihilate intelligent life” (by
which I assume the report authors mean human extinction) by 2050.
However, climate breakdown does pose a grave threat to civilization as we know it, and the potential for mass suffering on a scale
perhaps never before encountered by humankind. This should be enough reason for action without any need for exaggeration or
misrepresentation!

A “Hothouse Earth” scenario plays out that sees Earth’s temperatures doomed to rise by a further 1°C (1.8°F) even if we
stopped emissions immediately.
Peter Kalmus, Data Scientist, Jet Propulsion Laboratory:

This word choice perhaps reveals a bias on the part of the author of the article. A temperature can’t be doomed. And while I
certainly do not encourage false optimism, assuming that humanity is doomed is lazy and counterproductive.

Fifty-five percent of the global population are subject to more than 20 days a year of lethal heat conditions beyond that
which humans can survive

Richard Betts, Professor, Met Office Hadley Centre & University of Exeter:

This is clearly from Mora et al (2017) although the report does not include a citation of the paper as the source of that statement.
The way it is written here (and in the report) is misleading because it gives the impression that everyone dies in those conditions.
That is not actually how Mora et al define “deadly heat” – they merely looked for heatwaves when somebody died (not everybody)
and then used that as the definition of a “deadly” heatwave.

North America suffers extreme weather events including wildfires, drought, and heatwaves. Monsoons in China fail, the
great rivers of Asia virtually dry up, and rainfall in central America falls by half.

Andrew King, Research fellow, University of Melbourne:

Projections of extreme events such as these are very difficult to make and vary greatly between
different climate models.
Deadly heat conditions across West Africa persist for over 100 days a year

Peter Kalmus, Data Scientist, Jet Propulsion Laboratory:

The deadly heat projections (this, and the one from the previous paragraph) come from Mora et al (2017)1.

It should be clarified that “deadly heat” here means heat and humidity beyond a two-dimension threshold where at least one person
in the region subject to that heat and humidity dies (i.e., not everyone instantly dies). That said, in my opinion, the projections in
Mora et al are conservative and the methods of Mora et al are sound. I did not check the claims in this report against Mora et al but I
have no reason to think they are in error.

1- Mora et al (2017) Global risk of deadly heat, Nature Climate Change

The knock-on consequences affect national security, as the scale of the challenges involved, such as pandemic disease
outbreaks, are overwhelming. Armed conflicts over resources may become a reality, and have the potential to escalate
into nuclear war. In the worst case scenario, a scale of destruction the authors say is beyond their capacity to model,
there is a ‘high likelihood of human civilization coming to an end’.

Willem Huiskamp, Postdoctoral research fellow, Potsdam Institute for Climate Impact Research:

This is a highly questionable conclusion. The reference provided in the report is for the “Global Catastrophic Risks 2018” report from
the “Global Challenges Foundation” and not peer-reviewed literature. (It is worth noting that this latter report also provides no peer-
reviewed evidence to support this claim).

Furthermore, if it is apparently beyond our capability to model these impacts, how can they assign a ‘high likelihood’ to this
outcome?

While it is true that warming of this magnitude would be catastrophic, making claims such as this without evidence serves only to
undermine the trust the public will have in the science.

Daniel Swain, Researcher, UCLA, and Research Fellow, National Center for Atmospheric Research:

It seems that the eye-catching headline-level claims in the report stem almost entirely from these
knock-on effects, which the authors themselves admit are “beyond their capacity to model.” Thus, from a
scientific perspective, the purported “high likelihood of civilization coming to an end by 2050” is
essentially personal speculation on the part of the report’s authors, rather than a clear conclusion drawn
from rigorous assessment of the available evidence.
Adaptation is guaranteed, zeroing the impact.
Lomborg ’21 [Dr. Bjorn; 2021; President of the Copenhagen Consensus Center, Former
Director of the Danish Government's Environmental Assessment Institute, PhD in Political
Science at the University of Copenhagen, M.A. in Political Science at the University of Aarhus, BA
from the University of Georgia; Wall Street Journal, “Climate Change Calls for Adaptation, Not
Panic,” https://www.wsj.com/articles/climate-change-adaptation-panic-exaggerating-disaster-
11634760376]

It’s easy to construct climate disasters. You just find a current, disconcerting trend and project it
into the future, while ignoring everything humanity could do to adapt. For instance, one widely
reported study found that heat waves could kill thousands more Americans by the end of the century if global warming
continues apace—but only if you assume people won’t use more air conditioning. Yes, the climate is likely to change,
but so is human behavior in response.

Adaptation doesn’t make the cost of global warming go away entirely, but it does reduce it
dramatically. Higher temperatures will shrink harvests if farmers keep growing the same crops, but they’re likely to
adapt by growing other varieties or different plants altogether. Corn production in North America has
shifted away from the Southeast toward the Upper Midwest, where farmers take advantage of longer growing seasons and less-
frequent extreme heat. When sea levels rise, governments build defenses—like the levees, flood walls
and drainage systems that protected New Orleans from much of Hurricane Ida’s ferocity this year.

Nonetheless, many in the media push unrealistic projections of climate catastrophes, while
ignoring adaptation. A new study documents how the biggest bias in studies on the rise of sea
levels is their tendency to ignore human adaptation, exaggerating flood risks in 2100 by as much
as 1,300 times. It is also evident in the breathless tone of most reporting: The Washington Post frets that sea level rise could
“make 187 million people homeless,” CNN fears an “underwater future,” and USA Today agonizes over tens of trillions of dollars in
projected annual flood damage. All three rely on studies that implausibly
assume no society across the world will
make any adaptation whatever for the rest of the century. This isn’t reporting but scaremongering.
You can see how far from reality these sorts of projections are in one heavily cited study, depicted in the graph nearby. If you assume
no society will adapt to any sea-level rise between now and 2100, you’ll find that vast areas of the world will be routinely flooded,
causing $55 trillion in damage annually in 2100 (expressed in 2005 dollars), or about 5% of global gross domestic product. But as the
study emphasizes, “in reality, societies are likely to adapt.”

By raising the height of dikes, the study shows that humanity can negate almost all that terrible projected
damage by 2100. Only 15,000 people would be flooded every year, which is a remarkable improvement compared with the 3.4
million people flooded in 2000. The total cost of damage, investments in new dikes, and maintenance costs of existing dikes will fall
sixfold between now and 2100 to 0.008% of world GDP.

Adaptation is much more effective than climate regulations at staving off flood risks. Compare the two types of policies in isolation.
Without any climate mitigation to help, dikes would still safeguard more than 99.99% of the flood victims you’d see if global
warming continued on current trends. Instead of 187 million people flooded in 2100, there would be only 15,000. Climate policy
achieves much less on its own. Without adaptation, even stringent regulations that keep the global temperature rise below 2
degrees Celsius would reduce the number of flood victims only down to 85 million a year by the end of the century.

Stringent climate policy still has only a mild effect when used in concert with dikes: Instead of the 15,000 flood victims you’d get with
only adaptation, you’d have 10,000. And getting there would cost hundreds of trillions of dollars, which is hardly mitigated by the
$40 billion drop in total flood damage and dike costs climate regulations would achieve. As I’ve explained in these pages before, this
kind of policy has a high human cost: the tens of millions of people pricey climate regulations relegate to poverty.

You don’t have to portend doom to take climate change seriously. Ignoring the benefits of adaptation may make
for better headlines, but it badly misinforms readers.
The ev is cherry-picked anecdotes or predictive models off by several orders of
magnitude.
Zycher ’21 [Dr. Benjamin; 2021; Senior Fellow at the American Enterprise Institute, Doctorate
in Economics from UCLA, Master in Public Policy from the University of California, Berkeley, and
Bachelor of Arts in Political Science from UCLA, Former Senior Economist at the RAND
Corporation, Former Adjunct Professor of Economics at the University of California, Los Angeles
(UCLA) and at the California State University Channel Islands, and Former Senior Economist at
the Jet Propulsion Laboratory; California Institute of Technology, “The Case for Climate Change
Realism,” https://www.aei.org/articles/the-case-for-climate-change-realism/]
CLIMATE TRENDS

Beyond exhibiting extreme overconfidence in a cherry-picked analysis of climate-change causes, politicians and activists frequently ground their alarmism in frightening predictions about
consequences that are likewise far from certain. This is not only true within the very new (and still quite
unreliable) field of predictive climate science; it is true even in the context of ongoing climate phenomena. Indeed, politicians and
journalists frequently characterize dramatic or unusual climate phenomena as the product of
anthropogenic climate change, yet there is little evidence to support those claims.

For one thing, there is no observable upward trend in the number of “hot” days between 1895 and
2017; 11 of the 12 years with the highest number of such days occurred before 1960. Since 2005,
NOAA has maintained the U.S. Climate Reference Network, comprising 114 meticulously maintained
temperature stations spaced more or less uniformly across the lower 48 states, along with 21 stations in Alaska and two stations in
Hawaii. They are placed to avoid heat-island effects and other such distortions as much as possible. The
reported data
show no increase in average temperatures over the available 2005-2020 period. In addition, a
recent reconstruction of global temperatures over the past 1 million years — created using data
from ice-sheet formations — shows that there is nothing unusual about the current warm
period.

Rising sea levels are another frequently cited example of impending climate crisis. And yet sea
levels have been rising
since at least the mid-19th century. This rise is tied closely with the end of the Little Ice Age that occurred not long
before, which led to a rise in global temperatures, some melting of sea ice, and a thermal expansion of sea water. There is some
evidence showing an acceleration in sea-level rise beginning in the early 1990s: Satellite measurements of sea levels began in 1992
and show a sea-level rise of about 3.2 millimeters per year between 1993 and 2010. Before 1992, when sea levels were measured
with tidal gauges, the data showed an increase of about 1.7 millimeters per year on average from 1901 to 1990.

But because the datasets are from two different sources — satellite measurements versus tidal gauges — they are not
directly comparable, and therefore they cannot be interpreted as showing an acceleration in sea-level
rises. Moreover, the period beginning in 1993 is short in terms of global climate phenomena. Since sea
levels have risen at a constant rate, remained constant, or even fallen during similar relatively short periods, inferences drawn
from them are problematic. It is of course possible there has been an acceleration in sea-level rise, but even still, it would
not be clear whether such a development stemmed primarily from anthropogenic or natural causes; clearly, both processes are
relevant.

A study of changes in Arctic and Antarctic sea ice yields very different inferences. Since 1979, Arctic
sea ice has declined relative to the 30-year average (again, the degree to which this is the result of anthropogenic factors is not known).
Meanwhile, Antarctic sea ice has been growing relative to the 30-year average, and the global sea-ice total
has remained roughly constant since 1979.
Extreme weather occurrences are likewise used as evidence of an ongoing climate crisis, but
again, a study of the available data undercuts that assessment. U.S. tornado activity shows either no
increase or a downward trend since 1954. Data on tropical storms, hurricanes, and accumulated cyclone
energy (a wind-speed index measuring the overall strength of a given hurricane season) reveal little change since satellite
measurements of the phenomena began in the early 1970s. The number of wildfires in the United States
shows no upward trend since 1985, and global acreage burned has declined over past decades. The Palmer
Drought Severity Index shows no trend since 1895. And the IPCC’s Fifth Assessment Report, published in 2014, displays substantial divergence between
its discussion of the historical evidence on droughts and the projections on future droughts yielded by its climate models. Simply put, the available data
do not support the ubiquitous assertions about the causal link between greenhouse-gas accumulation, temperature change, and extreme weather
events and conditions.

Unable to demonstrate that observed climate trends are due to anthropogenic climate change — or even that these events are particularly unusual or
concerning — climate catastrophists will often turn to dire predictions about prospective climate phenomena. The
problem with such predictions is that they are almost always generated by climate models driven by
highly complex sets of assumptions about which there is significant dispute. Worse, these
models are notorious for failing to accurately predict already documented changes in climate . As
climatologist Patrick Michaels of the Competitive Enterprise Institute notes:

During all periods from 10 years (2006-2015) to 65 (1951-2015) years in length, the observed temperature
trend lies in the lower half of the collection of climate model simulations, and for several
periods it lies very close (or even below) the 2.5th percentile of all the model runs. Over shorter
periods, such as the last two decades, a plethora of mechanisms have been put forth to explain the observed/modeled divergence, but
none do so completely and many of the explanations are inconsistent with each other.

Similarly, climatologist John Christy of the University of Alabama in Huntsville observes that almost all of the 102 climate models
incorporated into the Coupled Model Intercomparison Project (CMIP) — a tracking effort conducted by the Lawrence Livermore National Laboratory — 
overstate past and current temperature trends by a factor of two to three, and at times even
more. It seems axiomatic to say we should not rely on climate models that are unable to predict the past or the present to
make predictions about the distant future.

The overall temperature trend is not the only parameter the models predict poorly. As an example, every CMIP climate
model predicts that increases in atmospheric concentrations of greenhouse gas should create an
enhanced heating effect in the mid-troposphere over the tropics — that is, at an altitude over the tropics of about
30,000-40,000 feet. The underlying climatology is simple: Most of the tropics is ocean, and as increases in greenhouse-gas
concentrations warm the Earth slightly, there should be an increase in the evaporation of ocean water in this region. When the
water vapor rises into the mid-troposphere, it condenses, releasing heat. And yet the satellites
cannot find this heating
effect — a reality suggesting that our understanding of climate and atmospheric phenomena is
not as robust as many seem to assume.

The poor predictive record of mainstream climate models is exacerbated by the tendency of the
IPCC and U.S. government agencies to assume highly unrealistic future increases in greenhouse-
gas concentrations. The IPCC’s 2014 Fifth Assessment Report, for example, uses four alternative “representative
concentration pathways” to outline scenarios of increased greenhouse-gas concentrations yielding anthropogenic warming. These
scenarios are known as RCP2.6, RCP4.5, RCP6, and RCP8.5. Since 1950, the average annual increase in greenhouse-gas
concentrations has been about 1.6 parts per million. The average annual increase from 1985 to 2019 was about 1.9 parts per million,
and from 2000 to 2019, it was about 2.2 parts per million. The largest increase that occurred was about 3.4 parts per million in 2016.
But the assumed average annual increases in greenhouse-gas concentrations through 2100 under the four RCPs are 1.1, 3.0, 5.5, and
an astounding 11.9 parts per million, respectively.

The studies generating the most alarmist predictions are the IPCC’s Special Report on Global Warming of 1.5°C and the U.S.
government’s Fourth National Climate Assessment, both of which were published in 2018. Both assume RCP8.5 as the scenario most
relevant for policy planning. The average annual greenhouse-gas increase under RCP8.5 is over five times the
annual average for 2000-2019 and almost four times the single biggest increase on record. Climatologist
Judith Curry, formerly of the Georgia Institute of Technology, describes such a scenario as “borderline impossible.”
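
The "over five times" and "almost four times" ratios follow directly from the figures quoted above; a quick arithmetic check, using the card's own rounded numbers:
```python
# Verify the RCP8.5 ratios asserted above (values are annual CO2-concentration
# increases in parts per million, as quoted in the card).
observed_avg_2000_2019 = 2.2
largest_single_year_2016 = 3.4
rcp_assumed_annual = {"RCP2.6": 1.1, "RCP4.5": 3.0, "RCP6": 5.5, "RCP8.5": 11.9}

rcp85 = rcp_assumed_annual["RCP8.5"]
print(f"RCP8.5 vs. 2000-2019 average:          {rcp85 / observed_avg_2000_2019:.1f}x")
print(f"RCP8.5 vs. largest increase on record: {rcp85 / largest_single_year_2016:.1f}x")
```
Both ratios come out at roughly 5.4x and 3.5x, matching the card's characterization.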

RCP6 is certainly more realistic. It predicts a temperature increase of 3 degrees Celsius by 2100 in the average of the CMIP models.
But on average, those CMIP models overstate the documented temperature record by a factor of at least two. Ultimately,
models with a poor record of successfully accounting for past data and highly unrealistic future
greenhouse-gas concentrations should not be considered a reasonable basis for future policy
formulation.

It doesn’t cause extinction.


Zeke Hausfather & Glen P. Peters 20. *Director of climate and energy at the Breakthrough
Institute in Oakland, California. **Research director at the CICERO Center for International
Climate Research in Oslo, Norway. "Emissions – the ‘business as usual’ story is misleading".
Nature. 1-29-2020. https://www.nature.com/articles/d41586-020-00177-3

In the lead-up to the 2014 IPCC Fifth Assessment Report (AR5), researchers developed four
scenarios for what might happen to greenhouse-gas emissions and climate warming by 2100.
They gave these scenarios a catchy title: Representative Concentration Pathways (RCPs)1. One
describes a world in which global warming is kept well below 2 °C relative to pre-industrial
temperatures (as nations later pledged to do under the Paris climate agreement in 2015); it is
called RCP2.6. Another paints a dystopian future that is fossil-fuel intensive and excludes any
climate mitigation policies, leading to nearly 5 °C of warming by the end of the century2,3. That
one is named RCP8.5.

RCP8.5 was intended to explore an unlikely high-risk future2. But it has been widely used by
some experts, policymakers and the media as something else entirely: as a likely ‘business as
usual’ outcome. A sizeable portion of the literature on climate impacts refers to RCP8.5 as
business as usual, implying that it is probable in the absence of stringent climate mitigation.
The media then often amplifies this message, sometimes without communicating the nuances.
This results in further confusion regarding probable emissions outcomes, because many climate
researchers are not familiar with the details of these scenarios in the energy-modelling
literature.

This is particularly problematic when the worst-case scenario is contrasted with the most
optimistic one, especially in high-profile scholarly work. This includes studies by the IPCC, such
as AR5 and last year’s special report on the impact of climate change on the ocean and
cryosphere4. The focus becomes the extremes, rather than the multitude of more likely
pathways in between.

Happily — and that’s a word we climatologists rarely get to use — the world imagined in RCP8.5
is one that, in our view, becomes increasingly implausible with every passing year5. Emission
pathways to get to RCP8.5 generally require an unprecedented fivefold increase in coal use by
the end of the century, an amount larger than some estimates of recoverable coal reserves6. It
is thought that global coal use peaked in 2013, and although increases are still possible, many
energy forecasts expect it to flatline over the next few decades7. Furthermore, the falling cost
of clean energy sources is a trend that is unlikely to reverse, even in the absence of new climate
policies7.
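
As a rough sketch of what a "fivefold increase in coal use by the end of the century" implies as a sustained growth rate — the 2020 start year is an assumption supplied here for illustration, not a figure from the article:
```python
# Compound annual growth rate implied by a fivefold increase in coal use
# between an assumed start year of 2020 and 2100.
start_year, end_year, multiple = 2020, 2100, 5.0
years = end_year - start_year
cagr = multiple ** (1 / years) - 1
print(f"{multiple:.0f}x over {years} years ~ {cagr:.1%} growth per year, sustained for eight decades")
```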

Assessment of current policies suggests that the world is on course for around 3 °C of warming
above pre-industrial levels by the end of the century — still a catastrophic outcome, but a long
way from 5 °C7,8. We cannot settle for 3 °C; nor should we dismiss progress.

Plan for progress

Some researchers argue that RCP8.5 could be more likely than was originally proposed. This is
because some important feedback effects — such as the release of greenhouse gases from
thawing permafrost9,10 — might be much larger than has been estimated by current climate
models. These researchers point out that current emissions are in line with such a worst-case
scenario11. Yet, in our view, reports of emissions over the past decade suggest that they are
actually closer to those in the median scenarios7. We contend that these critics are looking at
the extremes and assuming that all the dice are loaded with the worst outcomes.

Asking ‘what’s the worst that could happen?’ is a helpful exercise. It flags potential risks that
emerge only at the extremes. RCP8.5 was a useful way to benchmark climate models over an
extended period of time, by keeping future scenarios consistent. Perhaps it is for these reasons
that the climate-modelling community suggested RCP8.5 “should be considered the highest
priority”12.

We must all — from physical scientists and climate-impact modellers to communicators and
policymakers — stop presenting the worst-case scenario as the most likely one. Overstating the
likelihood of extreme climate impacts can make mitigation seem harder than it actually is. This
could lead to defeatism, because the problem is perceived as being out of control and
unsolvable. Pressingly, it might result in poor planning, whereas a more realistic range of
baseline scenarios will strengthen the assessment of climate risk.


No climate impact.
Michael Shellenberger 20, Founder and President of Environmental Progress and Co-Founder
of the Breakthrough Institute, “Why I Believe Climate Change Is Not the End of the World”,
Quillette, 7/8/2020, https://quillette.com/2020/07/08/why-i-believe-climate-change-is-not-the-
end-of-the-world/
What the IPCC had actually written in its 2018 report and press release was that in order to have a good chance of limiting warming
to 1.5 degrees Celsius from preindustrial times, carbon emissions needed to decline 45 percent by 2030. The
IPCC did not say
the world would end, nor that civilization would collapse, if temperatures rose above 1.5
degrees Celsius.
Scientists had a similarly negative reaction to the extreme claims made by Extinction Rebellion. Stanford University atmospheric
scientist Ken Caldeira, one of the first scientists to raise the alarm about ocean acidification, stressed that “ while
many
species are threatened with extinction, climate change does not threaten human extinction.”
MIT climate scientist Kerry Emanuel told me, “I don’t have much patience for the apocalypse criers. I don’t think it’s helpful to
describe it as an apocalypse.”

An AOC spokesperson told Axios, “We can quibble about the phraseology, whether it’s existential or cataclysmic.” But, he added,
“We’re seeing lots of [climate change–related] problems that are already impacting lives.”

But if that’s the case, the impact is dwarfed by the 92 percent decline in the decadal death toll from
natural disasters since its peak in the 1920s. In that decade, 5.4 million people died from natural disasters. In the
2010s, just 0.4 million did. Moreover, that decline occurred during a period when the global population nearly quadrupled.
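
A quick check of those figures, with the per-capita implication added under the card's own "nearly quadrupled" population factor (the calculation is illustrative, not part of the quoted evidence):
```python
# Decadal natural-disaster deaths quoted above, 1920s vs. 2010s.
deaths_1920s = 5.4e6
deaths_2010s = 0.4e6
population_growth_factor = 4.0  # "nearly quadrupled" per the card

absolute_decline = 1 - deaths_2010s / deaths_1920s
per_capita_decline = 1 - (deaths_2010s / population_growth_factor) / deaths_1920s
print(f"Decline in the decadal death toll: {absolute_decline:.1%}")   # roughly the 92% quoted
print(f"Approximate per-capita decline:    {per_capita_decline:.1%}")  # ~98%
```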

In fact, both rich and poor societies have become far less vulnerable to extreme weather events in recent decades. In 2019, the
journal Global Environmental Change published a major study that found death rates and economic damage dropped by 80 to 90
percent during the last four decades, from the 1980s to the present.

While global sea levels rose 7.5 inches (0.19 meters) between 1901 and 2010, the IPCC estimates sea levels will rise as much as 2.2
feet (0.66 meters) by 2100 in its medium scenario, and by 2.7 feet (0.83 meters) in its high-end scenario. Even
if these
predictions prove to be significant underestimates, the slow pace of sea level rise will likely
allow societies ample time for adaptation.
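
Putting the quoted totals on a common per-year footing (a simplification: the projected totals are treated here as accruing over roughly 2020-2100, whereas the IPCC defines its projections against its own reference period):
```python
# Implied average rates of sea-level rise for the figures quoted above.
historical_rise_m, historical_years = 0.19, 2010 - 1901
medium_rise_m, high_rise_m = 0.66, 0.83
projection_years = 2100 - 2020  # assumed window for illustration

print(f"1901-2010 observed:        ~{historical_rise_m / historical_years * 1000:.1f} mm per year")
print(f"Medium scenario to 2100:   ~{medium_rise_m / projection_years * 1000:.1f} mm per year")
print(f"High-end scenario to 2100: ~{high_rise_m / projection_years * 1000:.1f} mm per year")
```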

We have good examples of successful adaptation to sea level rise. The Netherlands, for instance,
became a wealthy nation despite having one-third of its landmass below sea level, including areas a full
seven meters below sea level, as a result of the gradual sinking of its landscapes.

And today, our capability for modifying environments is far greater than ever before. Dutch experts
today are already working with the government of Bangladesh to prepare for rising sea levels.

What about fires? Dr. Jon Keeley, a US Geological Survey scientist in California who has researched the topic for 40 years, told me, “We’ve looked at the history of climate and
fire throughout the whole state, and through much of the state, particularly the western half of the state, we don’t see any relationship between past climates and the amount
of area burned in any given year.”

In 2017, Keeley and a team of scientists modeled 37 different regions across the United States and found that “humans may not only influence fire regimes but their presence
can actually override, or swamp out, the effects of climate.” Keeley’s team found that the only statistically significant factors for the frequency and severity of fires on an annual
basis were population and proximity to development.

As for the Amazon, the New York Times reported, correctly, that “[the 2019] fires were not caused by climate change.”

In early 2020, scientists challenged the notion that rising carbon dioxide levels in the ocean were
making coral reef fish species oblivious to predators. The seven scientists who published their study in the journal
Nature had, three years earlier, raised questions about the marine biologist who had made such claims in the journal Science in
2016. After an investigation, James Cook University in Australia concluded that the biologist had fabricated her data.

When it comes to food production, the Food and Agriculture Organization of the United Nations (FAO) concludes
that crop yields will increase significantly, under a wide range of climate change scenarios. Humans today produce
enough food for ten billion people, a 25 percent surplus, and experts believe we will produce even more despite climate change.
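
The 25 percent surplus figure is consistent with a present world population of roughly eight billion; the population figure is an approximation supplied here for the check, not a number from the card:
```python
# Consistency check on the quoted food-surplus figure.
food_supply_supports = 10e9     # people the current food supply could feed
approx_population = 8e9         # rough current world population (assumption)
print(f"Implied surplus: {food_supply_supports / approx_population - 1:.0%}")
```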

Food production, the FAO finds, will depend more on access to tractors, irrigation, and fertilizer than on climate change, just as it did in the last century. The FAO projects that
even farmers in the poorest regions today, like sub-Saharan Africa, may see 40 percent crop yield increases from technological improvements alone.

In its fourth assessment report, the IPCC projected that by 2100, the global economy would be three to six times larger than it is
today, and that the costs of adapting to a high (4 degrees Celsius) temperature rise would reduce gross domestic product (GDP) just
4.5 percent.
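
To see what those two numbers mean together, here is the growth arithmetic, assuming (for illustration only) an 80-year horizon and reading the 4.5 percent figure as a one-time reduction in GDP:
```python
import math

# Growth rates implied by a 3x-6x larger economy in ~80 years, and how many
# years of that growth a one-time 4.5% GDP reduction corresponds to.
years = 80
gdp_cost = 0.045

for multiple in (3, 6):
    growth = multiple ** (1 / years) - 1
    years_of_growth = math.log(1 / (1 - gdp_cost)) / math.log(1 + growth)
    print(f"{multiple}x by ~2100 -> ~{growth:.1%} growth/yr; "
          f"a {gdp_cost:.1%} GDP cost equals ~{years_of_growth:.1f} years of growth")
```
On those assumptions, the adaptation cost amounts to roughly two to three years of growth deferred over the course of the century.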

Does any of that really sound like the end of the world?
The apocalypse now

Anyone interested in seeing the end of the world up close and in person could do little worse than to visit the Democratic Republic of the Congo in central Africa. The Congo has a way of putting first-world
prophecies of climate apocalypse into perspective. I traveled there in December 2014 to study the impact of widespread wood fuel use on people and wildlife, particularly on the fabled mountain gorillas.
Within minutes of crossing from the neighboring country of Rwanda into the Congolese city of Goma, I was taken aback by the extreme poverty and chaos: children as young as two years old perched on the
handlebars of motorcycles flying past us on roads pockmarked with giant potholes; tin-roofed shanties as homes; people crammed like prisoners into tiny buses with bars over the windows; trash everywhere;
giant mounds of cooled lava on the sides of the road, reminders of the volcanic anger just beneath the Earth’s surface.

In the 1990s and again in the early 2000s, Congo was the epicenter of the Great African War, the deadliest conflict since World War II, which involved nine African countries and resulted in the deaths of three to
five million people, mostly because of disease and starvation. Another two million people were displaced from their homes or sought asylum in neighboring countries. Hundreds of thousands of people, women,
and men, adults, and children, were raped, sometimes more than once, by different armed groups.

During our time in the Congo, armed militias roaming the countryside had been killing villagers, including children, with machetes. Some blamed Al-Shabaab terrorists coming in from Uganda, but nobody took
credit for the attacks. The violence appeared unconnected to any military or strategic objective. The national military, police, and United Nations Peacekeeping Forces, about 6,000 soldiers, were either unable or
unwilling to do anything about the terrorist attacks.

“Do not travel,” the United States Department of State said, bluntly, of the Congo on its website. “Violent crime, such as armed robbery, armed home invasion, and assault, while rare compared to petty crime, is
not uncommon, and local police lack the resources to respond effectively to serious crime. Assailants may pose as police or security agents.”

One reason I felt safe traveling to the eastern Congo and bringing my wife, Helen, was that the actor Ben Affleck had visited several times and even started a charity there to support economic development. If the
eastern Congo was safe enough for a Hollywood celebrity, I reasoned, it would be safe enough for Helen and me.

To make sure, I hired Affleck’s guide, translator, and “fixer,” Caleb Kabanda, a Congolese man with a reputation for keeping his clients safe. We spoke on the telephone before I arrived. I told Caleb I wanted to
study the relationship between energy scarcity and conservation. Referring to the North Kivu province capital of Goma, the sixth most populated city in the Congo, Caleb asked, “Can you imagine a city of nearly
two million people relying on wood for energy? It’s crazy!”

Ninety-eight percent of people in eastern Congo rely on wood and charcoal as their primary energy for cooking. In the Congo as a whole, nine out of 10 of its nearly 92 million people do, while just one out of five
has any access to electricity. The entire country relies on just 1,500 megawatts of electricity, which is about as much as a city of one million requires in developed nations.

The main road Caleb and I used to travel from Goma to the communities around Virunga Park had recently been paved, but there was little else in the way of infrastructure. Most roads were dirt roads. When it
rained, both the paved and unpaved roads and the surrounding homes were flooded because there was no flood control system. I was reminded of how much we take for granted in developed nations. We
practically forget that the gutters, canals, and culverts, which capture and divert water away from our homes, even exist.

Is climate change playing a role in Congo’s ongoing instability? If it is, it’s outweighed by other factors. Climate change, noted a large team of researchers in 2019, “has affected organized armed conflict within
countries. However, other drivers, such as low socioeconomic development and low capabilities of the state, are judged to be substantially more influential.”

There is only a barely functioning government in the Congo. When it comes to security and development, people are mostly on their own. Depending on the season, farmers suffer too much rain or not enough.
Recently, there has been flooding once every two or three years. Floods regularly destroy homes and farms.

Researchers with the Peace Research Institute Oslo note, “Demographic and environmental variables have a very moderate effect
on the risk of civil conflict.” The IPCC agrees. “There
is robust evidence of disasters displacing people
worldwide, but limited evidence that climate change or sea-level rise is the direct cause.”
Lack of infrastructure plus scarcity of clean water brings disease. As a result, Congo suffers some of the highest rates of cholera,
malaria, yellow fever, and other preventable diseases in the world.

“Lower levels of GDP are the most important predictor of armed conflict,” write the Oslo researchers,
who add, “Our results show that resource scarcity affects the risk of conflict less in low-income
states than in wealthier states.”

If resources determined a nation’s fate, then resource-scarce Japan would be poor and at war
while the Congo would be rich and at peace. Congo is astonishingly rich when it comes to its lands, minerals, forests,
oil, and gas.
There are many reasons why the Congo is so dysfunctional. It is massive—it is the second largest African nation in area, behind only Algeria—and difficult to govern as a single country. It was colonized by the
Belgians, who fled the country in the early 1960s without establishing strong government institutions, like an independent judiciary and a military.

Is it overpopulated? The population of Eastern Congo has doubled since the 1950s and 1960s. But the main factor is technological: the same area could produce much more food and support many more people if
there were roads, fertilizers, and tractors.

The Congo is a victim of geography, colonialism, and terrible post-colonial governments. Its economy grew from $7.4 billion in 2001 to $38 billion in 2017, but the annual per capita income of $561 is one of the
lowest in the world, leading many to conclude that much of the money that should flow to the people is being stolen.

For the last 20 years, the Rwandan government has been taking minerals from its neighbor and exporting them as its own. To protect and obscure its activities, Rwanda has financed and overseen the low-intensity
conflict in Eastern Congo, according to experts.

There were free elections in 2006 and optimism around the new president, Joseph Kabila, but he proved as corrupt as past leaders. After being re-elected in 2011, he stayed in power until 2018, when he installed
a candidate who won just 19 percent of the vote as compared to the opposition candidate, who won 59 percent. As such, Kabila and his allies in the legislature appear to be governing behind the scenes.

Low levels of GDP, not climate change, are correlated with armed conflict, such as in the Congo

Billions won’t die

It won’t cause extinction


Shannon Osaka 20, MPhil in Nature, Society and Environmental Governance from the
University of Oxford, AB in Environmental Science from Princeton University, Reporter at Grist,
and Kate Yoder, Associate Editor at Grist, Former Publishing Fellow at Goshen College, “Climate
change Is A Catastrophe. But Is It An ‘Existential Threat’?”, Grist Magazine, 3/3/2020,
https://grist.org/climate/is-the-climate-crisis-an-existential-threat-scientists-weigh-in/
That’s part of the reason that climate scientists have criticized activist rhetoric that humans have until 2030 to stop dangerous
climate change. Sure, it
might soon be too late to meet some of our most ambitious climate goals, such as keeping
warming to 1.5 degrees Celsius, yet any amount of action in the present will help create a less overheated planet in the
future.

So is climate change an existential threat? According to the scientific definition, likely not. As far as
scientists can predict, a warming planet won’t cause changes so severe that they threaten the
survival of the entire human species. And there is evidence that some of our most pessimistic
projections may be exaggerated (though a Hothouse Earth wouldn’t be so fun).
That’s the sort-of good news. The bad news is that saying climate change won’t kill all of humanity is … pretty much the lowest bar
possible.

In the meantime, politicians should be careful not to deploy the term “existential threat” too loosely.
In all likelihood, they don’t mean that human life on the planet will go extinct. They mean that climate
change is a really, really big deal and must be taken seriously. That should be sufficient reason to act.

Warming doesn’t cause extinction---new studies.


Nordhaus 20 Ted Nordhaus, an American author, environmental policy expert, and the director
of research at The Breakthrough Institute, citing new climate change forecasts. [Ignore the Fake
Climate Debate, 1-23-2020, https://www.wsj.com/articles/ignore-the-fake-climate-debate-
11579795816]//BPS
Beyond the headlines and social media, where Greta Thunberg, Donald Trump and the online armies of climate
“alarmists” and “deniers” do battle, there is a real climate debate bubbling along in scientific journals,
conferences and, occasionally, even in the halls of Congress. It gets a lot less attention than the boisterous and fake debate that
dominates our public discourse, but it is much more relevant to how the world might actually address the problem. In the real climate
debate, no one denies the relationship between human emissions of greenhouse gases and a warming climate. Instead, the
disagreement comes down to different views of climate risk in the face of multiple, cascading uncertainties. On one side of the debate
are optimists, who believe that, with improving technology and greater affluence, our societies will prove quite adaptable to a
changing climate. On the other side are pessimists, who are more concerned about the risks associated with rapid, large-scale and
poorly understood transformations of the climate system. But most pessimists do not believe that runaway climate
change or a hothouse earth are plausible scenarios, much less that human extinction is
imminent. And most optimists recognize a need for policies to address climate change, even if they don’t support the radical
measures that Ms. Thunberg and others have demanded. In the fake climate debate, both sides agree that economic growth and
reduced emissions vary inversely; it’s a zero-sum game. In the real debate, the relationship is much more complicated. Long-term
economic growth is associated with both rising per capita energy consumption and slower population growth. For this reason, as the
world continues to get richer, higher per capita energy consumption is likely to be offset by a lower population. A
richer world
will also likely be more technologically advanced, which means that energy consumption should be less
carbon-intensive than it would be in a poorer, less technologically advanced future. In fact, a number of the high-emissions
scenarios produced by the United Nations Intergovernmental Panel on Climate Change involve futures in which the world is relatively
poor and populous and less technologically advanced. Affluent, developed societies are also much better equipped to respond to
climate extremes and natural disasters. That’s why natural disasters kill and displace many more people in poor societies than in rich
ones. It’s not just seawalls and flood channels that make us resilient; it’s air conditioning and refrigeration, modern transportation and
communications networks, early warning systems, first responders and public health bureaucracies. New
research published in
the journal Global Environmental Change finds that global economic growth over the last decade has reduced
climate mortality by a factor of five, with the greatest benefits documented in the poorest nations. In low-lying
Bangladesh, 300,000 people died in Cyclone Bhola in 1970, when 80% of the population lived in extreme poverty. In 2019, with less
than 20% of the population living in extreme poverty, Cyclone Fani killed just five people. “Poor nations are most vulnerable to a
changing climate. The fastest way to reduce that vulnerability is through economic development.” So while it is true that poor nations
are most vulnerable to a changing climate, it is also true that the fastest way to reduce that vulnerability is through economic
development, which requires infrastructure and industrialization. Those activities, in turn, require cement, steel, process heat and
chemical inputs, all of which are impossible to produce today without fossil fuels. For this and other reasons, the world is unlikely to
cut emissions fast enough to stabilize global temperatures at less than 2 degrees above pre-industrial levels, the long-standing
international target, much less 1.5 degrees, as many activists now demand. But recent forecasts also suggest that many of
the worst-case climate scenarios produced in the last decade, which assumed unbounded economic growth and fossil-
fuel development, are also very unlikely. There is still substantial uncertainty about how sensitive global
temperatures will be to higher emissions over the long-term. But the best estimates now suggest that the world is
on track for 3 degrees of warming by the end of this century, not 4 or 5 degrees as was once feared. That is due in part
to slower economic growth in the wake of the global financial crisis, but also to decades of technology policy and energy-
modernization efforts. “We have better and cleaner technologies available today because policy-makers in the U.S. and elsewhere set
out to develop those technologies.” The energy
intensity of the global economy continues to fall. Lower-carbon
natural gas has displaced coal as the primary source of new fossil energy. The falling cost of wind and solar
energy has begun to have an effect on the growth of fossil fuels. Even nuclear energy has made a modest
comeback in Asia.

The risk of extinction from climate is very low


Farquhar et al., Oxford University PhD Candidate, ’17
[Sebastian Farquhar, PhD Candidate, Oxford University, with 5 co-authors from Oxford
University, EXISTENTIAL RISK: DIPLOMACY AND GOVERNANCE, Global Priorities Project, 2017, p.
8]

The most likely levels of global warming are very unlikely to cause human extinction.15 The
existential risks of climate change instead stem from tail risk climate change – the low
probability of extreme levels of warming – and interaction with other sources of risk. It is
impossible to say with confidence at what point global warming would become severe enough
to pose an existential threat. Research has suggested that warming of 11-12°C would render
most of the planet uninhabitable,16 and would completely devastate agriculture.17 This would
pose an extreme threat to human civilisation as we know it.18 Warming of around 7°C or more
could potentially produce conflict and instability on such a scale that the indirect effects could
be an existential risk, although it is extremely uncertain how likely such scenarios are.19
Moreover, the timescales over which such changes might happen could mean that humanity is
able to adapt enough to avoid extinction in even very extreme scenarios

Not existential AND their models fail.


Piper 19---Kelsey Piper, citing John Halstead climate change mitigation researcher at the
Founders Pledge. [Is climate change an "existential threat" — or just a catastrophic one? 6-28-
2019, https://www.vox.com/future-perfect/2019/6/13/18660548/climate-change-human-
civilization-existential-risk]

I also talked to some researchers who study existential risks, like John Halstead, who studies climate change mitigation at the
philanthropic advising group Founders Pledge, and who has
a detailed online analysis of all the (strikingly few)
climate change papers that address existential risk (his analysis has not been peer-reviewed yet).
Halstead looks into the models of potential temperature increases that Breakthrough’s report highlights. The models show a
surprisingly large
chance of extreme degrees of warming. Halstead points out that in many papers, this is the
result of the simplistic form of statistical modeling used. Other papers have made a convincing case that this
form of statistical modeling is an irresponsible way to reason about climate change, and that the
dire projections rest on a statistical method that is widely understood to be a bad approach for
that question.
Further, “the carbon effects don’t seem to pose an existential risk,” he told me. “People use 10 degrees as an illustrative example”
— of a nightmare scenario where climate change goes much, much worse than expected in every respect — “and looking at it,
even 10 degrees would not really cause the collapse of industrial civilization,” though the effects
would still be pretty horrifying. (On the question of whether an increase of 10 degrees would be survivable, there is much debate.)

Does it matter if climate change is an existential risk or just a really bad one?

That last distinction Halstead draws — of climate change as being awful but not quite an existential threat — is a controversial one.

That’s where a difference in worldviews looms large: Existential risk researchers are extremely concerned with the difference
between the annihilation of humanity and mass casualties that humanity can survive. To everyone else, those two outcomes seem
pretty similar.

To academics in philosophy and public policy who study the future of humankind, an existential risk is a very specific thing: a disaster
that destroys all future human potential and ensures that no generations of humans will ever leave Earth and explore our universe.
The death of 7 billion people is, of course, an unimaginable tragedy. But researchers who study existential risks argue that the
annihilation of humanity is actually much, much worse than that. Not only do we lose existing people, but we lose all the people who
could otherwise have had the chance to exist.

In this worldview, 7
billion humans dying is not just seven times as bad as 1 billion humans dying —
it’s much worse. This style of thinking seems plausible enough when you think about past tragedies; the Black Death, which
killed at least a tenth of all humans alive at the time, was not one-tenth as bad as a hypothetical plague that wiped us all out.

Most people don’t think about existential risks much. Many analyses of climate change — including the report Vice
based its article on — treat
the deaths of a billion people and the extinction of humanity as pretty
similar outcomes, interchangeably using descriptions of catastrophes that would kill hundreds of millions and
catastrophes that’d kill us all. And the existential risk conversation can come across as tone-deaf and off-puttingly academic, as if it’s
no big deal if merely hundreds of millions of people will die due to climate change.

Obviously, and this needs to be stressed, climate change is a big deal either way. But there are differences between catastrophe and
extinction. If the models tell us that all humans are going to die, then extreme solutions — which might save us, or might have
unprecedented, catastrophic negative consequences — might be worth trying. Think of plans to release aerosols into the
atmosphere to reflect sunlight and cool the planet back down in the manner that volcanic explosions do. It’d be an enormous
endeavor with significant potential downsides (we don’t even yet know all the risks it might pose), but if the alternative is extinction
then those risks would be worth taking.

But if the models tell us that climate change is devastating but survivable, as most models show, then those
last-ditch solutions should perhaps stay in the toolkit for now.

Then there’s the morale argument. Defenders of overstating the risks of climate change point out that, well, understating them isn’t
working. The IPCC may have chosen to maintain optimism about containing warming to 2 degrees Celsius in the hopes that it’d spur
people to action, but if so, it hasn’t really worked. Maybe alarmism will achieve what optimism couldn’t.

That’s how Spratt sees it. “Alarmism?” he said to me. “Should we be alarmed about where we’re going? Of course we should be.”

Swedish teenager Greta Thunberg has taken an arguably alarmist bent in her advocacy for climate solutions in the EU, saying, “Our
house is on fire. I don’t want your hope. ... I want you to panic.” She’s gotten strong reactions from politicians, suggesting that at
least sometimes a relentless focus on the severity of the emergency can get results.

So where does this all leave us? It’s worthwhile to look into the worst-case scenarios, and even to highlight and emphasize them. But
it’s important to accurately represent current climate consensus along the way. It’s hard to see how we
solve a problem we have widespread misapprehensions about in either direction, and when a warning is overstated or inaccurate, it
may sow more confusion than inspiration.

Climate change won’t kill us all. That matters. Yet it’s one of the biggest challenges ahead of us, and the results of our
failure to act will be devastating. That message — the most accurate message we’ve got — will have to stand on its own.

Extinction requires 12 degrees


Sebastian Farquhar 17, leads the Global Priorities Project (GPP) at the Centre for Effective
Altruism, et al., 2017, “Existential Risk: Diplomacy and Governance,”
https://www.fhi.ox.ac.uk/wp-content/uploads/Existential-Risks-2017-01-23.pdf
The most likely levels of global warming are very unlikely to cause human extinction.15 The
existential risks of climate change instead stem from tail risk climate change – the low probability of
extreme levels of warming – and interaction with other sources of risk. It is impossible to say with
confidence at what point global warming would become severe enough to pose an existential threat. Research has suggested that
warming of 11-12°C would render most of the planet uninhabitable,16 and would completely devastate
agriculture.17 This would pose an extreme threat to human civilisation as we know it.18 Warming of around 7°C or more could
potentially produce conflict and instability on such a scale that the indirect effects could be an existential risk, although it is extremely
uncertain how likely such scenarios are.19 Moreover, the
timescales over which such changes might happen
could mean that humanity is able to adapt enough to avoid extinction in even very extreme
scenarios.
The probability of these levels of warming depends on eventual greenhouse gas concentrations. According to some experts, unless
strong action is taken soon by major emitters, it is likely that we will pursue a medium-high emissions
pathway.20 If we do, the chance of extreme warming is highly uncertain but appears non-negligible. Current concentrations of
greenhouse gases are higher than they have been for hundreds of thousands of years,21 which means that there are significant
unknown unknowns about how the climate system will respond. Particularly concerning is the risk of positive feedback loops, such as
the release of vast amounts of methane from melting of the arctic permafrost, which would cause rapid and disastrous warming.22 The
economists Gernot Wagner and Martin Weitzman have used IPCC figures (which do not include modelling of feedback loops such as
those from melting permafrost) to estimate that if
we continue to pursue a medium-high emissions pathway, the
probability of eventual warming of 6°C is around 10%,23 and of 10°C is around 3%.24 These
estimates are of course highly uncertain.
It is likely that the world will take action against climate change once it begins to impose large
costs on human society, long before there is warming of 10°C. Unfortunately, there is significant inertia in the
climate system: there is a 25 to 50 year lag between CO2 emissions and eventual warming,25 and it is expected that 40% of the peak
concentration of CO2 will remain in the atmosphere 1,000 years after the peak is reached.26 Consequently, it is impossible to reduce
temperatures quickly by reducing CO2 emissions. If the world does start to face costly warming, the international community will
therefore face strong incentives to find other ways to reduce global temperatures.

Human resiliency checks


Seth Shostak 19, senior astronomer at the SETI Institute, Ph.D. in astrophysics from the
California Institute of Technology, 9/26/19, “Humanity will outlive climate change and nuclear
war, no matter how bad it gets,” https://qz.com/1716016/why-humans-will-outlive-climate-
change-and-nuclear-war/
That’s because apocalypse
is well-nigh impossible. We’re like ants: We’re vulnerable to being killed
en masse, but the species will survive because, like ants, we’re numerous and dispersed. No matter
how many supposedly humanity-ending threats you hurl—literally, in the case of ballistic missiles—humans
will continue to crawl the Earth. This comfort may be cold, but it’s still a fact.
Consider the enumerated threats, beginning with a pandemic. A century ago, the Spanish flu caused a staggering 20-50 million deaths,
more than WWI. Still, the toll amounted to less than 3% of the world population. As ghastly as it was, the Spanish flu didn’t even rise
to the level of decimation; viruses can slay, but they can’t annihilate. If past mortality is prologue, a millennial has less chance of
succumbing to a new pandemic than dying in an auto accident.

OK, well what about climate change, now recognized as a non-hoax by 75% of Americans? It’s not the heat per se that
will waste us, but the knock-on effects. Low-lying nations will turn into aquariums and Caribbean countries will be
pummeled and pelted by savage storms. The economic disruptions will be severe, including ones you might not have considered, such
as the damage to your investment portfolio: Five of the world’s 10 largest companies are in the fossil-fuel business.

The World Health Organization estimates that between 2030 and 2050, 5 million people
will perish due to the
consequences of climate change. Nonetheless, if aliens visit Earth in 2050, they’ll still find billions of
humans. Indeed, probably more than walk the planet today.
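
For scale — spreading the WHO figure across the 2030-2050 window and comparing it against a rough baseline of global deaths per year (the ~60 million baseline is an outside approximation, not a number from the card):
```python
# Scale check on the WHO estimate quoted above.
climate_deaths_2030_2050 = 5_000_000
window_years = 20
approx_global_deaths_per_year = 60_000_000  # rough baseline (assumption)

per_year = climate_deaths_2030_2050 / window_years
share = per_year / approx_global_deaths_per_year
print(f"~{per_year:,.0f} deaths per year, about {share:.1%} of deaths worldwide")
```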

Warming won’t be catastrophic


Dr. Benjamin Zycher 21, Senior Fellow at the American Enterprise Institute, Doctorate in
Economics from UCLA, Master in Public Policy from the University of California, Berkeley, and
Bachelor of Arts in Political Science from UCLA, Former Senior Economist at the RAND
Corporation, Former Adjunct Professor of Economics at the University of California, Los Angeles
(UCLA) and at the California State University Channel Islands, and Former Senior Economist at
the Jet Propulsion Laboratory, California Institute of Technology, “The Case for Climate Change
Realism”, 6/21/2021, https://www.aei.org/articles/the-case-for-climate-change-realism/
Unable to demonstrate that observed climate trends are due to anthropogenic climate change — or even that these events are particularly unusual or
concerning — climate catastrophists will often turn to dire predictions about prospective climate phenomena. The
problem with such predictions is that they are almost always generated by climate models driven by
highly complex sets of assumptions about which there is significant dispute . Worse, these
models are notorious for failing to accurately predict already documented changes in climate.
As climatologist Patrick Michaels of the Competitive Enterprise Institute notes:

During all periods from 10 years (2006-2015) to 65 (1951-2015) years in length, the observed temperature trend lies
in the lower half of the collection of climate model simulations, and for several periods it lies very
close (or even below) the 2.5th percentile of all the model runs. Over shorter periods, such as the last two decades, a
plethora of mechanisms have been put forth to explain the observed/modeled divergence, but none do so completely and many of the explanations
are inconsistent with each other.

Similarly, climatologist John Christy of the University of Alabama in Huntsville observes that almost all of the 102 climate models
incorporated into the Coupled Model Intercomparison Project (CMIP) — a tracking effort conducted by the Lawrence Livermore National Laboratory — 

overstate past and current temperature trends by a factor of two to three, and at times even
more. It seems axiomatic to say we should not rely on climate models that are unable to predict the past or the present to
make predictions about the distant future.

The overall temperature trend is not the only parameter the models predict poorly. As an example, every CMIP climate
model predicts that increases in atmospheric concentrations of greenhouse gas should create
an enhanced heating effect in the mid-troposphere over the tropics — that is, at an altitude over the tropics of
about 30,000-40,000 feet. The underlying climatology is simple: Most of the tropics is ocean, and as increases in greenhouse-gas
concentrations warm the Earth slightly, there should be an increase in the evaporation of ocean water in this region. When the
water vapor rises into the mid-troposphere, it condenses, releasing heat. And yet the satellites
cannot find this
heating effect — a reality suggesting that our understanding of climate and atmospheric
phenomena is not as robust as many seem to assume.
The poor predictive record of mainstream climate models is exacerbated by the tendency of
the IPCC and U.S. government agencies to assume highly unrealistic future increases in
greenhouse-gas concentrations. The IPCC’s 2014 Fifth Assessment Report, for example, uses four alternative
“representative concentration pathways” to outline scenarios of increased greenhouse-gas concentrations yielding anthropogenic
warming. These scenarios are known as RCP2.6, RCP4.5, RCP6, and RCP8.5. Since 1950, the average annual increase in greenhouse-
gas concentrations has been about 1.6 parts per million. The average annual increase from 1985 to 2019 was about 1.9 parts per
million, and from 2000 to 2019, it was about 2.2 parts per million. The largest increase that occurred was about 3.4 parts per million
in 2016. But the assumed average annual increases in greenhouse-gas concentrations through 2100 under the four RCPs are 1.1, 3.0,
5.5, and an astounding 11.9 parts per million, respectively.

The studies generating the most alarmist predictions are the IPCC’s Special Report on Global Warming of 1.5°C and the U.S.
government’s Fourth National Climate Assessment, both of which were published in 2018. Both assume RCP8.5 as the scenario most
relevant for policy planning. The average annual greenhouse-gas increase under RCP8.5 is over five times the
annual average for 2000-2019 and almost four times the single biggest increase on record.
Climatologist Judith Curry, formerly of the Georgia Institute of Technology, describes such a scenario as “borderline impossible.”
RCP6 is certainly more realistic. It predicts a temperature increase of 3 degrees Celsius by 2100 in the average of the CMIP models.
But on average, those CMIP models overstate the documented temperature record by a factor of at least two. Ultimately,
models with a poor record of successfully accounting for past data and highly unrealistic future greenhouse-gas
concentrations should not be considered a reasonable basis for future policy formulation.


No environment impact, emissions are inevitable, the 2 degree claim is silly


Judith Curry 19, President of Climate Forecast Applications Network (CFAN), Professor Emerita
of Earth and Atmospheric Sciences at the Georgia Institute of Technology, Ph.D. in atmospheric
science from the University of Chicago, 2/9/19, “Statement to the Committee on Natural
Resources of the United States House of Representatives,”
https://curryja.files.wordpress.com/2019/02/curry-testimony-house-natural-resources.pdf
The urgency (?) of CO2 emissions reductions

In the decades since the 1992 UNFCCC Treaty, global CO2 emissions have continued to increase, especially in developing countries.
In 2010, the world’s governments agreed that emissions need to be reduced so that global temperature increases are limited to
below 2 degrees Celsius.17 The target of 2°C (and increasingly 1.5°C)18 remains the focal point of international climate agreements
and negotiations.

The original rationale for the 2°C target is the idea that ‘tipping points’ − abrupt or nonlinear transition to
a different climate state − become likely to occur once this threshold has been crossed, with consequences
that are largely uncontrollable and beyond our management. The IPCC AR5 considered a number of potential tipping points,
including ice sheet collapse, collapse of the Atlantic overturning circulation, and permafrost
carbon release. Every single catastrophic scenario considered by the IPCC AR5 (WGII, Table 12.4) has a
rating of very unlikely or exceptionally unlikely and/or has low confidence. The only tipping
point that the IPCC considers likely in the 21st century is disappearance of Arctic summer sea ice
(which is fairly reversible, since sea ice freezes every winter).

In the absence of tipping points on the timescale of the 21st century, the 2°C limit is more
usefully considered by analogy to a highway speed limit:19 driving at 10 mph under the speed limit is not
automatically safe, and exceeding the limit by 10 mph is not automatically dangerous, although the faster one travels the greater the
danger from an accident. Analogously, the 2°C (or 1.5°C) limit should not be taken literally as a real
danger threshold. An analogy for considering the urgency of emissions reductions is your 401K account: if you begin making
contributions early, it will be easier to meet your retirement goals.

Nevertheless, the 2°C and 1.5°C limits are used to motivate the urgency of action to reduce CO2
emissions. At a recent UN Climate Summit, (former) Secretary-General Ban Ki-moon warned that: “Without significant cuts in
emissions by all countries, and in key sectors, the window of opportunity to stay within less than 2 degrees [of warming] will soon
close forever.”20 Actually, this
window of opportunity may remain open for quite some time. The
implications of the lower values of climate sensitivity found by Lewis and Curry21 and other recent studies is
that human caused warming is not expected to exceed the 2°C ‘danger’ level in the 21st
century. Further, there is growing evidence that the RCP8.5 scenario for future greenhouse gas
concentrations, which drives the largest amount of warming in climate model simulations, is
impossibly high, requiring a combination of numerous borderline impossible socioeconomic
scenarios.22 A slower rate of warming means there is less urgency to phase out greenhouse gas
emissions now, and more time to find ways to decarbonize the economy affordably and with a
minimum of unintended consequences. It also allows for the flexibility to revise our policies as
further information becomes available.

Is it possible that something truly dangerous and unforeseen could happen to Earth’s climate during
the 21st century? Yes it is possible, but natural climate variability (including geologic processes) may be
a more likely source of possible undesirable change than manmade warming. In any event,
attempting to avoid such a dangerous and unforeseen climate by reducing fossil fuel emissions
will be futile if natural climate and geologic processes are dominant factors . Geologic processes are an
important factor in the potential instability of the West Antarctic ice sheet that could contribute to substantial sea level rise in the
21st century.23

Under the Paris Agreement, individual countries have submitted to the UNFCCC their Nationally Determined
Contributions (NDCs). Under the Obama Administration, the U.S. NDC had a goal of reducing emissions by 28% below 2005 levels
by 2025. Apart from considerations of feasibility and cost, it has been estimated24 using the EPA MAGICC model that this
commitment will prevent 0.03°C in warming by 2100. When combined with current commitments from other
nations, only a small fraction of the projected future warming will be ameliorated by these
commitments. If climate models are indeed running too hot,25 then the amount of warming prevented would be even smaller.
Even if emissions immediately went to zero and the projections of climate models are to be
believed, the impact on the climate would not be noticeable until the 2nd half of the 21st
century. Most of the expected benefits to the climate from the UNFCCC emissions reductions policy will be realized in the 22nd
century and beyond.

Attempting to use carbon dioxide as a control knob to regulate climate on decadal to century timescales
is arguably futile. The UNFCCC emissions reductions policies have brought us to a point between a rock
and a hard place, whereby the emissions reduction policy with its extensive costs and questions
of feasibility are inadequate for making a meaningful dent in slowing down the expected
warming in the 21st century. And the real societal consequences of climate change and extreme
weather events (whether caused by manmade climate change or natural variability) remain largely unaddressed.
This is not to say that a transition away from burning fossil fuels doesn’t make sense over the course of the 21st century. People
prefer ‘clean’ over ‘dirty’ energy – provided that all other things are equal, such as reliability, security, and economy. However,
assuming that current wind and solar technologies are adequate for providing the required
amount and density of electric power for an advanced economy is misguided .26
The recent record-breaking cold outbreak in the Midwest is a stark reminder of the challenges of providing a reliable power supply in
the face of extreme weather events, where an inadequate power supply not only harms the economy, but jeopardizes lives and
public safety. Last week, central Minnesota experienced a natural gas ‘brownout,’ as Xcel Energy advised customers to turn
thermostats down to 60 degrees and avoid using hot water.27 Why? Because the wind wasn’t blowing during an exceptionally cold
period. Utilities pair natural gas plants with wind farms, where the gas plants can be ramped up and down quickly when the wind
isn’t blowing. With bitter cold temperatures and no wind, there wasn’t enough natural gas.

A transition to an electric power system driven solely by wind and solar would require a massive
amount of energy storage. While energy storage technologies are advancing, massive
deployment of cost-effective energy storage technologies is well beyond current capabilities.28
An unintended consequence of rapid deployment of wind and solar energy farms may be that
natural gas power plants become increasingly entrenched in the power supply system.

Apart from energy policy, there are a number of land use practices related to croplands, grazing lands,
forests and wetlands that could increase the natural sequestration of carbon and have ancillary
economic and ecosystem benefits.29 These co-benefits include improved biodiversity, soil
quality, agricultural productivity and wildfire behavior modification.

In evaluating the urgency of CO2 emissions reductions, we need to be realistic about what reducing emissions will actually
accomplish. Drastic reductions of emissions in the U.S. will not reduce global CO2 concentrations if
emissions in the developing world, particularly China and India, continue to increase. If we believe
the climate model simulations, we would not expect to see any changes in extreme weather/climate
events until late in the 21st century . The greatest impacts will be felt in the 22nd century and beyond, in terms of
reducing sea level rise and ocean acidification.

Resilience, anti-fragility and thrivability

Given that emissions reductions policies are very costly, politically contentious and are not expected to change the climate in a
meaningful way in the 21st century, adaptation strategies are receiving increasing attention in
formulating responses to climate change.
The extreme damages from recent hurricanes plus the recent billion dollar disasters from floods, droughts and wildfires, emphasize
that the U.S. is highly vulnerable to current weather and climate disasters. Even worse disasters were encountered in the U.S. during
the 1930’s and 1950’s. Possible scenarios of incremental worsening of weather and climate extremes over the course of the 21st
century don’t change the fundamental storyline that many regions of the U.S. are not well adapted to the current weather and
climate variability, let alone the range that has been experienced over the past two centuries.

As a practical matter, adaptation has been driven by local crises associated with extreme weather and
climate events, emphasizing the role of ‘surprises’ in shaping responses. Advocates of adaptation to climate
change are not arguing for simply responding to events and changes after they occur; they are arguing for anticipatory
adaptation. However, in adapting to climate change, we need to acknowledge that we cannot know how the climate will evolve
in the 21st century, we are certain to be surprised and we will make mistakes along the way.

‘Resilience’ is the ability to ‘bounce back’ in the face of unexpected events. Resilience carries a connotation of returning to the
original state as quickly as possible. The difference in impact and recovery from Hurricane Sandy striking New York City in 2012
versus the impact of Tropical Cyclone Nargis striking Myanmar in 200830 reflects very different vulnerabilities and capacities for
bouncing back.

To increase our resilience to extreme weather and climate events, we can ‘bounce forward’ to reduce future vulnerability by
evolving our infrastructures, institutions and practices. Nassim Nicholas Taleb’s concept of antifragility31 focuses on learning from adversity,
and developing approaches that enable us to thrive from high levels of volatility, particularly unexpected extreme events. Anti-
fragility goes beyond ‘bouncing back’ to becoming even better as a result of encountering and overcoming challenges. Anti-fragile
systems are dynamic rather than static, thriving and growing in new directions rather than simply maintaining the status quo.

Strategies to increase antifragility include: economic development, reducing the downside from volatility, developing a range of
options, tinkering with small experiments, and developing and testing transformative ideas. Antifragility is consistent with
decentralized models of policy innovation that create flexibility and redundance in the face of volatility. This ‘innovation dividend’ is
analogous to biodiversity in the natural world, enhancing resilience in the face of future shocks.32

Similar to anti-fragility, the concept of ‘thrivability’ has been articulated by Jean Russell:33 “It isn’t enough to repair the damage our
progress has brought. It is also not enough to manage our risks and be more shock-resistant. Now is not only the time to course
correct and be more resilient. It is a time to imagine what we can generate for the world. Not only can we work to minimize our
footprint but we can also create positive handprints. It is time to strive for a world that thrives.”

A focus on policies that support resilience, anti-fragility and thrivability avoids the hubris of thinking we can predict the future
climate. The relevant questions then become:

• How can we best promote the development of transformative ideas and technologies?

• How much resilience can we afford?

The threats from climate change (whether natural or human caused) are fundamentally regional, associated not only with regional
changes to the weather/climate, but with local vulnerabilities and cultural values and perceptions. In the least developed countries,
energy poverty and survivability is of overwhelming concern, where there are severe challenges to meeting basic needs and their
idea of clean energy is something other than burning dung inside their dwelling for cooking and heating. In many less developed
countries, particularly in South Asia, an overwhelming concern is vulnerability to extreme weather events such as floods and
hurricanes that can set back the local economies for a generation. In the developed world, countries are relatively less vulnerable to
climate change and extreme weather events and have the luxury of experimenting with new ideas: entrepreneurs not only want to
make money, but also to strive for greatness and transform the infrastructure for society.

Extreme weather/climate events such as landfalling major hurricanes, floods, extreme heat waves and droughts become
catastrophes through a combination of large populations, large and exposed infrastructure in vulnerable locations, and human
modification of natural systems that can provide a natural safety barrier (e.g. deforestation, draining wetlands). Addressing
current adaptive deficits and planning for climate compatible development will increase societal
resilience to future extreme events that may possibly be more frequent or severe in the future.
Ways forward

Climate scientists have made a forceful argument for a future threat from manmade climate change. Based
upon our
current assessment of the science, the threat does not seem to be an existential one on the time
scale of the 21st century, even in its most alarming incarnation. However, the perception of manmade
climate change as a near-term apocalypse and alignment with a range of other social objectives has narrowed
the policy options that we’re willing to consider.

Warming doesn’t trigger extinction


— peer-reviewed journal shows IPCC exaggeration
— history proves resilience
— no extinction: warming stays below Paris goals
— rock-breaking strategy could offset warming

IBD 18 [Investor's Business Daily, citing a peer-reviewed study by Lewis and Curry,
“Here's One Global Warming Study Nobody Wants You To See”, 4/25/18,
https://www.investors.com/politics/editorials/global-warming-computer-models-co2-
emissions/]

Settled Science: A new study published in a peer-reviewed journal finds that climate models
exaggerate the global warming from CO2 emissions by as much as 45%. If these findings hold
true, it's huge news. No wonder the mainstream press is ignoring it.

In the study, authors Nic Lewis and Judith Curry looked at actual temperature records and
compared them with climate change computer models. What they found is that the planet has
shown itself to be far less sensitive to increases in CO2 than the climate models say. As a result,
they say, the planet will warm less than the models predict, even if we continue pumping CO2
into the atmosphere.

As Lewis explains: "Our results imply that, for any future emissions scenario, future warming is
likely to be substantially lower than the central computer model-simulated level projected by
the (United Nations Intergovernmental Panel on Climate Change), and highly unlikely to exceed
that level."

How much lower? Lewis and Curry say that their findings show temperature increases will be
30%-45% lower than the climate models say. If they are right, then there's little to worry about,
even if we don't drastically reduce CO2 emissions.
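To see what a 30%-45% downward revision means in practice, the sketch below applies those percentages to a hypothetical model-projected warming of 3.0°C by 2100; the 3.0°C figure is a placeholder for illustration, not a number from the Lewis and Curry paper.

```python
# Illustration of the 30-45% downward revision described above.
# The 3.0 C model projection is a hypothetical placeholder, not a figure from Lewis and Curry.
model_projected_warming_c = 3.0
reductions = (0.30, 0.45)

for r in reductions:
    revised = model_projected_warming_c * (1 - r)
    print(f"{int(r * 100)}% lower than the model projection: {revised:.2f} C")
# Prints roughly 2.10 C and 1.65 C for this placeholder projection.
```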

The planet will warm from human activity, but not nearly enough to cause the sort of end-of-
the-world calamities we keep hearing about. In fact, the resulting warming would be below the
target set at the Paris agreement.

This would be tremendously good news.

The fact that the Lewis and Curry study appears in the peer-reviewed American Meteorological
Society's Journal of Climate
lends credibility to their findings. This is the same journal, after all, that recently published
widely covered studies saying the Sahara has been growing and the climate boundary in central
U.S. has shifted 140 miles to the east because of global warming.

The Lewis and Curry findings come after another study, published in the prestigious journal
Nature, that found the long-held view that a doubling of CO2 would boost global temperatures
as much as 4.5 degrees Celsius was wrong. The most temperatures would likely climb is 3.4
degrees.

It also follows a study published in Science, which found that rocks contain vast amounts of
nitrogen that plants could use to grow and absorb more CO2, potentially offsetting at least some
of the effects of CO2 emissions and reducing future temperature increases.
General Warming Theory Defense

CO2 not responsible for warming


Harris and Carter 14, Tom Harris, director of the Ottawa-based International Climate Science Coalition, and Bob Carter, former professor
and head of the School of Earth Sciences at James Cook University in Australia, “Leo vs. science: vanishing evidence for climate
change,” New York Post, 9/14/2014, http://nypost.com/2014/09/14/leo-v-science-vanishing-evidence-for-climate-change/

But he has identified the wrong monster. It is the climate scare itself that is the real threat to civilization.
DiCaprio is an actor, not a scientist; it’s no real surprise that his film is sensationalistic and error-riddled. Other climate-change
fantasists, who do have a scientific background, have far less excuse. Science is never settled, but the
current state of
“climate change” science is quite clear: There is essentially zero evidence that carbon dioxide
from human activities is causing catastrophic climate change. Yes, the “executive summary” of
reports from the UN’s Intergovernmental Panel on Climate Change continues to sound the alarm — but the
summary is written by the politicians. The scientific bulk of the report, while still tinged with improper
advocacy, has all but thrown in the towel. And the Nongovernmental International Panel on Climate Change lists
thousands of scientific papers that either debunk or cast serious doubt on the supposed “consensus”
model. Oregon-based physicist Gordon Fulks sums it up well: “CO2 is said to be responsible for global warming that is not occurring,
for accelerated sea-level rise that is not occurring, for net glacial and sea ice melt that is not occurring . . . and for increasing extreme
weather that is not occurring.” Consider: • According
to NASA satellites and all ground-based temperature
measurements, global warming ceased in the late 1990s. This, when CO2 levels have risen almost 10 percent
since 1997. The post-1997 CO2 emissions represent an astonishing 30 percent of all human-related emissions since the Industrial
Revolution began. That
we’ve seen no warming contradicts all CO2-based climate models upon which
global-warming concerns are founded. Rates of sea-level rise remain small and are even slowing,
over recent decades averaging about 1 millimeter per year as measured by tide gauges and 2 to 3 mm/year as inferred from
“adjusted” satellite data. Again, this
is far less than what the alarmists suggested. • Satellites also show
that a greater area of Antarctic sea ice exists now than any time since space-based
measurements began in 1979. In other words, the ice caps aren’t melting. • A 2012 IPCC report concluded that
there has been no significant increase in either the frequency or intensity of extreme weather
events in the modern era. The NIPCC 2013 report concluded the same. Yes, Hurricane Sandy was devastating — but it’s not
part of any new trend.

Climate science is wrong


Devine 15, Miranda Devine, 10-3-15, “Math errors discovered in climate model shows UN climate panel
overestimated global warming by at least 10x,” http://www.sott.net/article/303793-Math-
errors-discovered-in-climate-model-shows-UN-climate-panel-overestimated-global-warming-by-
at-least-10x, CMR

A mathematical discovery by Perth-based electrical engineer Dr David Evans may change everything about the climate debate, on the eve of the UN climate change conference in Paris next month. A former climate
modeller for the Government's Australian Greenhouse Office, with six degrees in applied
mathematics, Dr Evans has unpacked the architecture of the basic climate model which
underpins all climate science. He has found that, while the underlying physics of the model is correct, it
had been applied incorrectly. He has fixed two errors and the new corrected model finds the climate's
sensitivity to carbon dioxide (CO2) is much lower than was thought. It turns out the UN's Intergovernmental Panel on
Climate Change has over-estimated future global warming by as much as 10 times, he says. "Yes, CO2 has
an effect, but it's about a fifth or tenth of what the IPCC says it is. CO2 is not driving the climate; it caused less than 20 per cent of the global warming in the last few decades". Dr Evans says his discovery "ought to change the world".
"But the political obstacles are massive," he said. His discovery explains why none of the climate models used by the IPCC reflect the evidence of recorded temperatures. The models have failed to predict the pause
in global warming which has been going on for 18 years and counting. "The model architecture
was wrong," he says. "Carbon dioxide causes only minor warming. The climate is largely driven by
factors outside our control." There is another problem with the original climate model, which has been around since 1896. While climate scientists have
been predicting since the 1990s that changes in temperature would follow changes in carbon dioxide, the records over the past half million years show that not to be the case.

So, the new improved climate model shows CO2 is not the culprit in recent global warming. But
what is? Dr Evans has a theory: solar activity. What he calls "albedo modulation", the waxing and waning of
reflected radiation from the Sun, is the likely cause of global warming. He predicts global temperatures,
which have plateaued, will begin to cool significantly, beginning between 2017 and 2021. The cooling will be
about 0.3°C in the 2020s. Some scientists have even forecast a mini ice age in the 2030s. If Dr Evans is correct, then he has proven the theory on

carbon dioxide wrong and blown a hole in climate alarmism. He will have explained why the doomsday
predictions of climate scientists aren't reflected in the actual temperatures. "It took me years to figure this out, but finally there is a
potential resolution between the insistence of the climate scientists that CO2 is a big problem, and the empirical evidence that it doesn't have nearly as much effect as they say."

Dr Evans is an expert in Fourier analysis and digital signal processing, with a PhD, and two Masters
degrees from Stanford University in electrical engineering, a Bachelor of Engineering (for which he won the
University medal), Bachelor of Science, and Masters in Applied Maths from the University of Sydney.
No Way to Assess Risk

No individual risk can be tied to a specific probability or temperature increase, and most climate risks are small modifications to fundamental societal risks that we’ve dealt with for millennia
Their cards just laundry-list scary things, but most are examples of societal vulnerabilities that
warming only marginally alters, the probability of which is unquantifiable

Judith Curry 17, President of Climate Forecast Applications Network (CFAN), previously
Professor and Chair of the School of Earth and Atmospheric Sciences at the Georgia Institute of
Technology, 1/29/17, “The ‘threat’ of climate change,” https://judithcurry.com/2017/01/29/the-
threat-of-climate-change/

I think that use of these words misleads the public debate on climate change — any damages from
human caused climate change are not imminent, we cannot quantify the risk owing to deep
uncertainties, and any conceivable policy for reducing CO2 emissions will have little impact on
the hypothesized damages in the 21st century.
‘Threats’ or ‘reasons for concern’?

I do not question that the possibility of adverse impacts from human caused climate change
should be under consideration. However, the human caused impacts of climate change have been
overhyped from the beginning — the 1992 UNFCCC treaty on avoiding dangerous human interference on the climate.
This implied warming was dangerous before any work had actually been done on this.

Some much needed clarification is presented in a recent article published in Nature: IPCC reasons for concern regarding climate
change risks. This article provides a good overview of the current IPCC framework for considering dangerous impacts. A summary of
the main concerns:

The reasons for concern (RFCs) reported in AR5 are:

Risks to unique and threatened systems (indicated by RFC1)

Risks associated with extreme weather events (RFC2)

Risks associated with the distribution of impacts (RFC3)

Risks associated with global aggregate impacts (RFC4)

Risks associated with large-scale singular events (RFC5)

The eight overarching key risks are:

(i) Risk of death, injury, ill-health, or disrupted livelihoods in low-lying coastal zones and small island developing
states and other small islands due to storm surges, coastal flooding, and sea-level rise.

(ii) Risk of severe ill-health and disrupted livelihoods for large urban populations due to inland flooding in some
regions.

(iii) Systemic risks due to extreme weather events leading to breakdown of infrastructure networks and critical services
such as electricity, water supply, and health and emergency services.

(iv) Risk of mortality and morbidity during periods of extreme heat, particularly for vulnerable urban populations and those
working outdoors in urban or rural areas.

(v) Risk of food insecurity and the breakdown of food systems linked to warming, drought, flooding, and precipitation variability
and extremes, particularly for poorer populations in urban and rural settings.

(vi) Risk of loss of rural livelihoods and income due to insufficient access to drinking and irrigation water and reduced
agricultural productivity, particularly for farmers and pastoralists with minimal capital in semi-arid regions.

(vii) Risk of loss of marine and coastal ecosystems, biodiversity, and the ecosystem goods, functions, and
services they provide for coastal livelihoods, especially for fishing communities in the tropics and the Arctic.

(viii) Risk of loss of terrestrial and inland water ecosystems, biodiversity, and the ecosystem goods, functions,
and services they provide for livelihoods.”

I think that qualitatively, these are the appropriate risks to consider. Where I don’t find this analysis
particularly convincing is their links of ‘undetectable’, ‘moderate’, ‘high’, ‘very high’ to specific
levels of temperature increase.

The confounding societal effects on all of these risks are overwhelming, IMO, and very likely to be
of greater concern than actual temperature increase. Apart from (vii) and (viii) related to ecosystems, these risks
relate to vulnerability of social systems. These vulnerabilities have put societies at risk for extreme weather
events throughout recorded history — adding a ‘delta’ to risk from climate change does not
change the fundamental underlying societal vulnerabilities to extreme weather events.

The key point IMO is one that I made in a previous post Is climate change a ‘ruin’ problem? The short answer is
‘no’ — even under the most alarming projections, human caused climate change is not an
existential threat on the timescale of the 21st century.

Going from temperature rise to modeling specific Earth processes to modeling human interactions creates compounding uncertainty that makes accurate
forecasts impossible
Indur M. Goklany 15, science and technology policy analyst for the United States Department
of the Interior, where he holds the position of Assistant Director of Programs, Science and
Technology Policy, he was a member of the US delegation that established the IPCC and helped
develop its First Assessment Report, he served as a US delegate to the IPCC, and an IPCC
reviewer, he is a member of the GWPF’s Academic Advisory Council, November 2015, “Carbon
Dioxide the Good News”, http://www.thegwpf.org/content/uploads/2015/10/benefits1.pdf

The impacts of global warming are generally estimated using chains of linked computer models.
Each chain begins with a climate model, which itself is driven by a set of socioeconomic scenarios
based on assumptions for population, economic development and technological change over the
entire period of the analysis (often 50– 100 years or more). The climate model is followed by various biophysical,
economic and other downstream models to estimate changes in different aspects of human activity or
welfare, for example agriculture, forestry, health or biodiversity. The uncertain outputs of each upstream model serve
as the inputs of the subsequent downstream model, with the uncertainties cascading down the
chain so that the individual streams of uncertainty combine into a regular torrent.
For example, to estimate the impacts on agriculture and food security, the outputs of the climate model are fed into various crop
models to estimate yields, which then are linked to economic models to estimate supply and demand for the various crops. Supply
and demand are then reconciled via national, regional and global scale trade models.142 Notably, despite the cascade of
uncertainties, to date no climate change impact assessment has provided an objective estimate
of the cumulative uncertainty, starting with the socioeconomic scenarios through to the impact estimate. The ranges
of uncertainty presented in the IPCC impact reports are generally based on the uncertainties only from using
different climate scenarios. But these are much narrower than the true uncertainties that would have been
estimated had the full cascade of uncertainties been considered.
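To make the “cascade of uncertainties” concrete, the toy sketch below propagates a purely illustrative ±30% uncertainty through a four-stage model chain (climate, crop, economic, trade); the stage count and the ±30% figure are assumptions chosen for illustration, not numbers from Goklany's paper.

```python
# Toy illustration of uncertainty compounding through a chain of linked models.
# The four stages and the +/-30% per-stage uncertainty are illustrative assumptions,
# not figures from the source.
stages = ["climate", "crop yield", "economic", "trade"]
low_factor, high_factor = 0.7, 1.3  # each stage's output assumed known only to within +/-30%

low, high = 1.0, 1.0
for stage in stages:
    low *= low_factor
    high *= high_factor
    print(f"after {stage} model: plausible range {low:.2f}x to {high:.2f}x of the central estimate")

# After four stages the range spans roughly 0.24x to 2.86x the central estimate,
# far wider than the +/-30% uncertainty attached to any single link in the chain.
```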
Climate Change Won’t Hurt the Economy

Climate change will not hurt the economy

Negrea & Buchan, 9-16, 22, Dan Negrea served at the US State Department as a member of the
Secretary’s Policy Planning Staff and as the Special Representative for Commercial and Business
Affairs; Sam Buchan is the Director of the Center for Energy and Environment at the America
First Policy Institute. He served as the Director for International Economic Policy at the National
Economic Council and as a Senior Advisor to the Secretary of Energy, On Energy Policy, All-of-
the-Above Is the Only Sensible Choice, https://nationalinterest.org/feature/energy-policy-all-
above-only-sensible-choice-204826?page=0%2C1

Many critics will claim that dramatic changes in the climate warrant an equally dramatic
makeover of the global economy. Instead of bending to specious arguments, the emerging field
of climate economics provides a badly needed context to climate discussions, using facts and
figures instead of adjectives to create more accurate frames of reference. Practitioners of
climate economics have developed models showing that climate change is not the urgent
threat to humanity’s future some make it out to be. There is no need for drastic changes in
America’s economy, and there is time to develop technological solutions.

To wit: last year, two of us published an article reviewing several climate economics studies that
estimated the impact of climate change on the gross domestic product (GDP) of the United
States and of the world in 2050 and in 2100. The general message of these studies was that the
impact of climate change on future GDP levels appears to be manageable.

This year, a white paper produced by Biden’s own Council of Economic Advisors (CEA) and the
Office of Management and Budget (OMB) points in the same direction. It notes that the
president’s fiscal year 2023 budget states that the American GDP twenty-five years from now
will be 4.5 percent lower because of the effects of climate change than it would have been
without climate change effects. But the level of the GDP will still be higher than it is today. The
white paper concludes that Americans would be 1.71 times richer in twenty-five years without
climate change effects but only 1.66 times richer when reflecting climate change damages.
This scenario is based on research by the Network for Greening the Financial System (NGFS), an
international collaboration comprised of more than 100 central banks and financial regulators.

The Biden white paper also addresses the impact of several climate change policies on the
global GDP for the year 2100, again using research from the NGFS. The harshest GDP impact is
under the scenario that assumes the continuation of current policies, in which case the global
GDP at that time will be 13 percent lower because of climate change.

What does this mean? A 2021 UN Intergovernmental Panel on Climate Change (IPCC) study
projects that without the impact of climate change, global GDP will be $529 trillion in the year
2100. A 13 percent loss reduces it to $460 trillion. If we compare these two numbers with a
2021 global GDP of $96.1 trillion, we find that the world would be 5.5 times richer in 2100
without the impact of climate change, compared to 4.8 times richer under the most severe
climate change scenario. A study published in July 2020 by Danish climate scientist Bjorn
Lomborg has comparable findings: “Scenarios set out under the UN Climate Panel (IPCC) show
human welfare will likely increase to 450% of today's welfare over the 21st century. Climate
damages will reduce this welfare increase to 434%.”
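The GDP figures quoted above can be cross-checked with simple arithmetic. The sketch below reuses only the numbers in the card ($529 trillion projected 2100 GDP, a 13 percent loss, and 2021 global GDP of $96.1 trillion); it is a consistency check, not an independent estimate.

```python
# Consistency check of the GDP figures quoted in the card (arithmetic only).
gdp_2100_no_climate = 529.0   # trillions of dollars, IPCC-based projection cited above
loss_share = 0.13             # 13% loss under the continuation-of-current-policies scenario
gdp_2021 = 96.1               # trillions of dollars, 2021 global GDP cited above

gdp_2100_with_climate = gdp_2100_no_climate * (1 - loss_share)
print(f"2100 GDP with climate damages: ${gdp_2100_with_climate:.0f} trillion")     # ~$460 trillion
print(f"Growth multiple without damages: {gdp_2100_no_climate / gdp_2021:.1f}x")   # ~5.5x richer
print(f"Growth multiple with damages:    {gdp_2100_with_climate / gdp_2021:.1f}x") # ~4.8x richer
```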
Solvency Impossible
Even multi-trillion effort does NOT solve their climate impacts – most recent
IPCC report
Thomsen-Cheek 8-13-21 (Kira Thomsen-Cheek, aka SninkyPoo, Communications Manager
focusing on healthcare, IT and HR, University of Washington Medicine ICD-10 Program, BA
Michigan State University, “Climate Code Red: We Have 5.5 Years.” Daily Kos, 8-13-2021,
https://www.dailykos.com/stories/2021/8/13/2045542/-Climate-Code-Red-We-Have-5-5-Years)
With the IPCC report’s klaxons blaring “CODE RED,” the question arises: what is it in our power as individuals
to do to ensure that the world does not warm beyond 1.5°C by 2100? What actions would YOU be willing to take? Voting for
Democrats got us the White House back, and (slim) majorities in both houses of Congress. As tremendous as that was, it won’t have
been enough.

As reported in Earth and Sky– and ultimately derived from the IPCC report:

“The speed at which atmospheric carbon dioxide has increased since the industrial revolution (1750) is at least ten times faster than at
any other time during the last 800,000 years, and between four and five times faster than during the last 56 million years.

About 85% of carbon-dioxide emissions are from burning fossil fuels. The remaining 15% are generated from land use change, such
as deforestation and degradation.”

On August 9th, Greta Thunberg tweeted:

According to the new IPCC report, the carbon budget that gives us the best odds of staying
below 1,5°C runs out in less than 5 and a half years at our current emissions rate. Maybe someone should ask the
people in power how they plan to “solve” that?
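The “5.5 years” figure is a budget-divided-by-rate calculation. The sketch below shows the form of that arithmetic; the remaining-budget and annual-emissions numbers in it are illustrative placeholders chosen only to reproduce the quoted horizon, not figures taken from the IPCC report (the actual budget depends on the probability threshold chosen).

```python
# Form of the carbon-budget arithmetic behind the "runs out in ~5.5 years" claim.
# Both numbers below are illustrative placeholders, not values from the IPCC report.
remaining_budget_gtco2 = 230.0   # assumed remaining budget consistent with ~1.5 C (illustrative)
annual_emissions_gtco2 = 42.0    # assumed current global CO2 emissions per year (illustrative)

years_left = remaining_budget_gtco2 / annual_emissions_gtco2
print(f"Budget exhausted in roughly {years_left:.1f} years at the current emissions rate")  # ~5.5
```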

The people in power who could make the biggest difference – the leaders of the countries with the highest percentage of global
emissions – are presidents Joe Biden and Xi Jinping.

Under President Biden, the infrastructure bills currently making their way through the congressional
sausage factory are woefully insufficient to cope with the need to reduce carbon emissions
enough –or swiftly enough – to meet the US’s burden under the Paris Agreement. Biden’s stated goal is
to cut the nation’s emissions by at least 50 percent by the end of this decade.

That is a worthy aim, but it is not enough. The United States is responsible for only 15% of global
emissions annually, and per the IPCC report, our global carbon budget does not last until the end of this decade. It lasts until
about 2027.

Even the $3.5 trillion dollar plan largely written by Bernie Sanders (and crafted in response to the known
reality that Republicans would force Dems to water down the first plan) is unlikely to include
enough action on, or money for, the rapid slashing of emissions and radical retooling of the
economy that will be required to keep us under the 1.5°target.
As for the rest of the world? China is responsible for 28% of current annual global emissions –
almost twice our rate. Russia produces 5% of the world’s annual emissions, while India produces
7%. Every other country on Earth, lumped together, produce the remaining 21% .
While President Xi Jinping may have talked a good story at this past April’s virtual climate summit, his stated goals are, as
with President Biden’s, woefully insufficient.
"China will strive to peak carbon dioxide emissions before 2030 and achieve carbon neutrality before2060," the president said. "China
has committed to move from carbon peak to carbon neutrality in a much shorter time span than what might take many developed
countries, and that requires extraordinary hard efforts from China. We will strictly control coal-fired power generation projects. We
will strictly limit the increase in coal consumption over the 14th five-year-plan period and phase it down in the 15th five-year-plan
period."
Unless I am missing something, that was substantially the same as China’s position in 2020. The website Climate Action Tracker
categorized China’s climate response then as “highly insufficient,” and consistent
with global warming of between 3°C and 4°C by 2100 – i.e., more than double the 1.5°C that is now generally agreed upon as the upper
“acceptable” limit.
With the
2 largest emitters on Planet Earth both doing far, far less than is needed to reduce
emissions, perhaps there is a global push to at least stop looking for new sources of fossil fuels to
plunder?

That would sound reasonable. It is not, in fact, the case.


Yesterday, I happened upon the following thread tweeted out by Extinction Rebellion Cambridge. It appears to be immaculately
sourced. It is terrifying.

Here’s just a taste:

US: Drilling in Alaska; record offshore oil and gas exploration, extracting 17m barrels a day, $323bn to be spent in 4
years (expenditure on climate under Biden’s infrastructure plan is $36bn).

UK: recently granted 113 licenses over 259 drilling blocks in the North Sea, two new platforms installed just last week (tweet was
from this June).

Canada: the tar sands produce 3m barrels a day. Trudeau government just greenlit 3 new offshore sites off Newfoundland.

Uganda: *423* new wells with drilling led by Chinese national oil and gas company. Oil coming on stream 2025.

Nigeria: 100 new oil and gas sites coming into production between 2021 and 2024, one site alone producing 650,000 barrels every
day, starting next year.

New Zealand: new exploration in the waters off the poster child for progressive climate politics.

Iran: 2.4 million barrels a day, up 400,000 barrels a day from April 2020. New discovery adds 2.2 billion barrels.

In sum, under current policies – and policies currently in the pipeline – emissions will not be slashed
enough, or quickly enough, to avoid warming of more than 1.5°C.
Nations across the globe are continuing to seek out new sources of fossil fuels to power their
growing populations and economies.
Leaders – the politicians who enact policies and the global billionaire ruling class who buy so many of those politicians (cough – Joe
Manchin – cough) – appear to be perilously close to adopting a modified “business as usual” stance. What
would that look like, if it were to be …

U.S. action alone fails


I&I 21, Issues & Insights Editorial Board, “There’s Nothing The U.S. Can Do To Affect Global
Temperature”, Issues & Insights, 9/7/21, https://issuesinsights.com/2021/09/07/theres-
nothing-the-u-s-can-do-to-affect-global-temperature/

“We simulated the environmental impact of eliminating greenhouse gas emissions from the United
States completely,” Dayaratna said in testimony.
“Simulation results indicate that if
all carbon dioxide, methane, and nitrous oxide emissions were to be
eliminated from the United States completely, the result in terms of temperature reductions would be
less than 0.2 degrees Celsius, 0.03 degrees Celsius, and 0.02 degrees Celsius, respectively. These temperature
reductions would also be accompanied by minuscule changes in sea level rise (less than 2-centimeter
reduction).”

This isn’t hard to understand when it’s put next to the fact that more
than half of the world’s human greenhouse
gas emissions are produced by 25 cities, all but two of them in China, none of them in the U.S.

It’s truly asinine to believe that Washington and our state lawmakers can do anything about
greenhouse gas emissions when China and India have been busy building hundreds of coal
plants and that, as of last year, 350 coal-fired power plants were under construction worldwide. China – which, we must point
out, produces most of the solar panels installed in the West in factories powered by that country’s “mountain” of
coal – is not going to yield to John Kerry’s embarrassing begging that it cut emissions. Beijing will do
only what it wishes.
AT: Permafrost Methane
Methane is stable---no chance of permafrost release
Chris Mooney 13, science and environment author, staffer at the Washington Post, former
science writer for Mother Jones, “How Much Should You Worry About an Arctic Methane
Bomb?” 8/8/13, Mother Jones, http://www.motherjones.com/environment/2013/08/arctic-
methane-hydrate-catastrophe
According to the Nature commentary, that methane "is likely to be emitted as the seabed warms, either steadily over 50 years or suddenly." Such are
the scientific assumptions behind the paper's economic analysis. But are those assumptions realistic—and could that much methane
really be released suddenly from the Arctic? ¶ A number of prominent scientists and methane experts
interviewed for this article voiced strong skepticism about the Nature paper. "The scenario they used is so unlikely as to be completely pointless talking about," says Gavin Schmidt, a noted climate researcher at the NASA Goddard Institute for Space
Studies in New York.¶ Schmidt is hardly the only skeptic. "I don't have any problem with 50 gigatons, but they've got the time scale all wrong," adds David Archer, a geoscientist and expert on methane at the University of Chicago. "I would envision something like
that coming out, you know, over the centuries."¶ Still, the Nature paper is the most prominent airing yet of concerns that a climate
catastrophe could be brought on by the release of Arctic methane that is currently frozen in subsea deposits—concerns that seem to be mounting in
lockstep with the dramatic warming of the Arctic. That's why it's important to put these fears into context and try to determine just how much weight
they ought to be accorded.¶ Methane on Ice¶ Let's start with some basics on methane—CH4—a greenhouse gas that reaches the atmosphere from
sources as diverse as wetlands, gas drilling, and cow burps. Compared with carbon dioxide, methane is kind of like the boxer who punches himself out
in the early rounds, whereas carbon dioxide goes the distance and wins by TKO. Pound for pound, methane causes some 25 times as much global
warming as carbon dioxide does. But it only remains in the atmosphere for about nine years, on average, before chemical processes break it down.
Carbon dioxide, in contrast, has a far longer atmospheric residence time.¶ What this means is that methane is most worrisome if a lot of it gets into the
atmosphere over a relatively short time period—precisely the scenario contemplated by the Nature paper. So could that happen? ¶ The answer
depends on a complicated and uncertainty-laden issue—the stability of frozen deposits of subsea methane in the Arctic region. Frankly, it's hard to
imagine something harder to study: We're talking about deposits residing not only beneath one of the world's most remote and inaccessible oceans,
but also beneath the sea floor itself.¶ Much of the world's methane is concentrated in the form of so-called gas "hydrates," icelike solids that form from
methane and water at cold temperatures and high pressures, e.g., deep beneath the ocean floor. According to the US Geological Survey, the total
global carbon content of such methane hydrates is estimated to equal some 1,800 gigatons (to be sure, there is considerable uncertainty about this
estimate).¶ One thousand eight hundred gigatons would create a climate catastrophe if it were all to be suddenly released, but the vast
majority of subsea methane is under deep water, and quite stable. Only a relatively small fraction of global
methane hydrates are at issue in the Nature paper, and this methane is in a very peculiar situation: It is frozen in the subsea permafrost of relatively
shallow continental shelves in the Arctic region. This frozen sediment was once coastline, but was submerged as oceans rose following the last Ice Age.
And now, it is being bathed in warmer waters due to the warming of the Arctic. ¶ So how much should we worry that these particular methane hydrates
might melt, releasing gas that would then travel through both sediment and seawater to reach the atmosphere? That's where the scientific debate
begins—over both how much methane falls into this category, and how vulnerable it is to the warming that is now gripping the Arctic region. ¶ Peering
Beneath the East Siberian Sea¶ The methane disaster concerns gained major prominence with a 2010 paper
in Science by University of Alaska-Fairbanks researcher Natalia Shakhova and her colleagues, who examined methane emissions in a very remote
area of the Arctic, the East Siberian Sea north of Russia. The continental shelf underlying this ocean is more than 2 million square kilometers in size, and
its subsea permafrost lies only about 50 meters below the sea surface. Traveling to the remote region in Russian ice-breakers, Shakhova's team
sampled water content and air content at the sea surface repeatedly, over a series of years. They found high concentrations of methane in the water
—"50% of surface waters are supersaturated with methane," the paper reported—and some of the gas was also venting from the water into the
atmosphere.¶ Although the Science paper did not contain the figure, it seems clear that Shakhova
is the source for the idea that
a 50-gigaton release of methane could occur in a short time frame. Or as she put it in a 2008 abstract, "[W]e
consider release of up to 50 Gt of predicted amount of hydrate storage as highly possible for abrupt release at any time," adding that this could lead to
"catastrophic greenhouse warming." The Nature paper cited another 2010 paper by Shakhova and her colleagues in the journal Doklady Earth Sciences,
which uses the 50 gigaton figure in discussing possible methane emission scenarios. ¶ Shakhova did not respond to several requests for comment for
this article; her automatic email response said she was out doing fieldwork. But Peter Wadhams, the Cambridge physicist who is a coauthor of the Nature
paper, said that his work relied on that of Shakhova and her team because "they’ve done the most work there, working there every year, doing field
observations…we would rather base it on the estimates of the people actually working there, rather than the people who aren’t working there." Here is
a video of Shakhova discussing her research:¶ The trouble is that at this point, many other scientists don't accept that work—
or rather, don't agree about its implications. None seem to dispute the actual measurements taken by Shakhova and her team, but
as soon as the Science paper came out, a group of researchers questioned the idea that there was any cause for alarm. "A newly discovered [methane] source is not necessarily a changing source, much less a
source that is changing in response to Arctic warming," they wrote. The implication is that perhaps methane has always been in
the water at such levels, without methane hydrates having been disturbed —rather, the methane may be
from another source. According to one 2011 study, for instance, the observed methane probably came not from hydrates, but simply from "the
permafrost's still adjusting to its new aquatic conditions, even after 8,000 years." The
hydrates, in contrast, are thought to be
much deeper below the sea surface, due to basic physical constraints on their formation and
stability. According to the US Geological Survey, "in permafrost areas, methane hydrate is not stable until about 225 m
depth."¶ Indeed, according to Ed Dlugokencky, who monitors global atmospheric methane levels at the National Oceanic and Atmospheric
Administration (NOAA), "so far, there has not been a significant increase in methane emissions in the Arctic."
In other words, if methane is really starting to vent into the air in large quantities, Dlugokencky says he isn't seeing it. ¶ A Debate Over Hydrate Depth¶
And that's just the first reason that many scientists are skeptical. According to Carolyn Ruppel, who heads the Gas Hydrates Project at
the US Geological Survey, there just isn't that much vulnerable methane in submerged permafrost to
begin with. "We think very little hydrate on this planet is associated with permafrost, either subsea or terrestrial," she says. Inspired in part by the
Shakhova research, the USGS undertook to study the continental shelves of the Beaufort Sea, off Alaska and Canada. "We set out to test this idea that
all of the Arctic shelves were going to have high methane emissions," she says. "And at least for the US Beaufort shelf, we're not seeing them." ¶ Ruppel
acknowledges that due to Arctic warming, more methane is going to be released, much of it from permafrost on land. But, she
continues, "I would say one of the least likely sources is methane gas hydrates. You are limited by the

laws of physics," she adds—noting that the beginning of the zone of stability for these hydrates is some 220
meters deep. That's a recurrent refrain among skeptics—they say hydrates just can't form above a certain depth, and warming can't
penetrate such a depth very quickly. "You've got to go from the sea floor of 50 meters depth,
down to 200 meters where the hydrate is," explains the University of Chicago's David Archer. "So that just takes a long time."¶
Moreover, even if subsea permafrost methane hydrates do thaw, the liberated gas still has to travel through layers of sediment just to get to the ocean floor. So how does that happen? "That's kind of mysterious," says
Archer. Perhaps there will be open pathways for gas in some places, but perhaps there won't. Archer also notes that there have been undersea explosions or landslides that release methane in bursts, but "those kinds of things seem like they would be relatively small
compared to 50 gigatons, and they would happen sporadically in time over centuries, not everything
blows up in a few years."¶ Nonetheless, imagine that methane gas from melted hydrate makes it to the sea floor. It now exists as bubbles with, say, 50
meters to go before they reach the sea surface. Most of the bubbles won't make it, say scientists: They'll be dissolved
in seawater, and then the methane will be broken down by microorganisms over a period of
months. "If methane is in the ocean water column, most of it doesn't get out," explains Bill Reeburgh, a professor of earth system science at the
University of California-Irvine who has spent his career studying methane. "Most of it is oxidized" by bacteria, which turn it into
carbon dioxide and water, Reeburgh continues. "So all these stories about seeps, people seem to think the bubbles go straight to the atmosphere, and
they don't."¶ In other words, while the waters of the
East Siberian Sea may be full of dissolved methane , for many
scientists that doesn't prove that hydrates have been disturbed, or that the Arctic is starting to vent

large amounts of methane from below the sea floor into the atmosphere. Not yet, anyway.
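The scale comparison in the card can be made explicit with a little arithmetic. The sketch below uses only figures quoted above (the ~1,800 gigaton global hydrate carbon estimate, the 50 gigaton release scenario, and the roughly 25-to-1 pound-for-pound warming comparison with CO2); the CO2-equivalent line is a rough illustration, not a formal GWP calculation.

```python
# Rough scale check using only the figures quoted in the card above.
total_hydrate_carbon_gt = 1800.0   # USGS-cited estimate of global methane-hydrate carbon (Gt)
scenario_release_gt = 50.0         # sudden-release scenario discussed in the Nature commentary (Gt)
warming_ratio_vs_co2 = 25.0        # "pound for pound" comparison quoted in the card

fraction = scenario_release_gt / total_hydrate_carbon_gt
print(f"Scenario release as a share of the global hydrate estimate: {fraction:.1%}")  # ~2.8%

# Very rough CO2-equivalent of the 50 Gt scenario, ignoring timing and atmospheric chemistry.
print(f"Rough CO2-equivalent: ~{scenario_release_gt * warming_ratio_vs_co2:.0f} Gt")  # ~1250 Gt
```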
AT: Coral Reefs
Climate change won’t destroy coral reefs

Christopher Intagliata, March 22, 2022, Some Good News about Corals and Climate Change,
https://www.scientificamerican.com/podcast/episode/some-good-news-about-corals-and-
climate-change/

A nearly two-year-long study of Hawaiian corals suggests some species may be better
equipped to handle warmer, more acidic waters than previously believed. Full Transcript
Christopher Intagliata: Within a few decades, global temperatures are expected to climb to 1.5
degrees Celsius above pre-industrial levels. And that's gonna be really bad for corals, according
to the latest report out from the Intergovernmental Panel on Climate Change. Andréa Grottoli:
So the recent IPCC report says that up to 1.5 we can expect 10 to 30 percent coral survivorship.
And above that, it decreases precipitously. Intagliata: Andréa Grottoli is a distinguished
professor at the Ohio State University. Amid the doom and gloom of the IPCC report, Grottoli
has some rare GOOD news. Corals may be more adaptable to future conditions than we
thought. Her team studied three species of coral from the island of Oahu, in Hawaii. They put
them in tanks with either heat stress; more acidic water; or both. Grottoli: And what really
matters in this study is the one where both increases in temperature and ocean acidification,
because that's exactly what's happening on reefs now. Intagliata: Twenty two months later—
they assessed the winners and losers. They found that on average, more than half the corals
survived. Even after being punished with warmer, more acidic waters—the kind they'd face
under two degrees of global warming. Grottoli: The corals that survived, two of the three
species were actually physiologically performing normally. They were doing more than
surviving. They were coping, they'd acclimatized. They were doing well. Intagliata: The results
appear in the journal Scientific Reports. [Rowan H. McLachlan et al, Physiological acclimatization
in Hawaiian corals following a 22-month shift in baseline seawater temperature and pH] Grottoli
says the study provides hope the world's corals may be more resilient than we thought—
especially since one of the Hawaiian species they studied is widespread around the planet. But
will this good news motivate world leaders to rein in warming? Well, corals may be able to wait
just a little longer to find out.

Sky News, January 6, 2022, Climate change: Hope for millions as study finds damaged coral reefs
can still provide seafood, https://news.sky.com/story/climate-change-hope-for-millions-as-
study-finds-damaged-coral-reefs-can-still-provide-seafood-12510337

Bleached and damaged coral reefs are still able to supply nutritious seafood , a study has found.
Scientists led by Lancaster University used more than 20 years of data from the Seychelles, where tropical reefs
were damaged by a large coral bleaching event in 1998. The bleaching, caused by rising sea temperatures, killed 90% of the corals
found on the islands. Bleaching turns the corals white, and leaves them under stress and at risk of death. Scientists were unsure how
climate change could affect the nutrients available from reef fisheries. But the new findings reveal they may be
more resilient than previously thought. The research, published in One Earth, finds damaged reef
fisheries remain rich sources of micronutrients, even increasing in nutritional value for some
minerals. This will bring hope to more than six million people who work in the small-scale fisheries and rely on tropical reefs. The
fish they catch are vital to the health of millions of people in the tropics, which suffer from high levels of malnourishment. Global warming means coral bleaching
events are becoming more frequent and more severe, placing these vulnerable ecosystems under greater stress. Dr James Robinson,
who led the study, said the findings "underline the continuing importance of these fisheries for vulnerable coastal communities, and
the need to protect against overfishing to ensure long-term sustainability of reef fisheries". "We found that some micronutrient-rich
reef species become more abundant after coral bleaching, enabling fisheries to supply nutritious food despite climate change
impacts," he added, and called for the protection of these systems to be made a "priority". The scientists, who came from the
Seychelles, Australia, Canada, and Mozambique, calculated that reef fish are important sources of selenium and zinc, and contain
levels of calcium, iron, and omega-3 fatty acids comparable to foods like chicken and pork. Iron and zinc were found to be more
concentrated in fish caught on reefs dominated by microalgae and seaweeds. Co-author Professor Christina Hicks said the study
"suggests reef fisheries will continue to play a crucial role, even in the face of climate change, and highlights the vital importance of
investing in sustainable fisheries management". The researchers believe the results underline the need for more of the catches to be
retained for locals and promotion of traditional fish-based diets. They used a combination of experimental fishing, nutrient analysis,
and visual surveys of fish communities to inform the study.
AT: Ocean Acidification
No ocean acidification impact---their evidence assumes far higher CO2 levels
than we’ll ever reach, adaptation solves, and research is systemically biased
toward alarmism
Howard I. Browman 16, Institute of Marine Research, Marine Ecosystem Acoustics
Disciplinary Group, Austevoll Research Station, Norway, “Applying organized scepticism to ocean
acidification research,” ICES Journal of Marine Science, Volume 73, Number 3, February/March
2016, pp. 529-536

[OA = Ocean Acidification]

The first articles on OA were descriptions of the process itself (CO2-driven changes in the biogeochemistry of seawater and
sediments) and its implications. This was followed by an explosion of work (mainly laboratory-based) on the possible effects of OA
on various marine organisms, at first mainly calcifiers or the calcified hard parts of organisms without calcareous shells. These were
mostly restricted to part of one generation (a limited number of life history stages), or at most a single complete life cycle, with one
or a small number of biological endpoints measured as effect indicators. In
early work, treatment exposure levels
often greatly exceeded those predicted to occur hundreds of years into the future even without
any reduction in CO2 emissions. The majority of these early works reported significant negative
effects of high CO2, from which it was inferred that there would be a detrimental effect of OA over the
coming decades–centuries. Thereafter, longer-term effect studies began to appear, which first included single-
generation carry-overs and then multiple generations. By necessity, these have been on organisms with short generation times. As
the approach to CO2 exposures matured, very high treatment levels became less common. More
studies that showed no
effect of high CO2 (predicted for the next century)—and even beneficial effects (e.g. for some
phytoplankton and macrophytes)—appeared. Upwelling and vent systems were used as in situ case studies of natural
future OA-like conditions. Some in situ work mimics such systems by injecting CO2 and following the response of
organisms/communities locally. Results of experiments that included multiple stressors in addition to
CO2 were published. The most common of these has been temperature, but salinity, oxygen, and a variety of others have
also been included (in a global climate change context). Such studies typically report that the additional
driver(s) has a stronger effect than CO2, although it is difficult to isolate the effect of the individual variables. The
reality that the functional response curve of each driver will likely differ , as will the organism's ability to adapt
to them, further complicates interpretations of multiple driver experiments. Studies on the effect of CO2 on trophic interactions
(indirect effects) are sparse—such experiments are logistically complex and difficult to interpret. A small number of recent studies
integrate the results of the preceding body of work into risk assessments and scenario modelling, typically on economically
important species of fish and shellfish; most conclude that the prognosis is dire, although in the context of what follows, that
conclusion might be premature.

The preceding describes how OA research has matured. The following describes how it still has a way to go.

Applying organized scepticism to research on the effects of OA

Scientific or academic scepticism calls for critical scrutiny of research outputs before they are accepted as new knowledge (Merton,
1973). Duarte et al. (2014) stated that “…there is a perception that scientific skepticism has been abandoned or relaxed in many
areas…” of marine science. They argue that OA is one such area, and conclude that there is, at best, weak evidence to
support an OA-driven decline of calcifiers. Below, I raise some of the aspects of OA research to which I contend an
insufficient level of organized scepticism has been applied (in some cases, also to the articles in this theme issue). I arrived at
that conclusion after reading hundreds of articles on OA (including, to be fair, some that also raise these
issues) and overseeing the peer-review process for the very large number of submissions to this themed issue. Importantly, and as
Duarte et al. (2014) make clear, a retrospective application of scientific scepticism such as the one that follows could—and should—
be applied to any piece of/body of research.
Exposure levels, water chemistry, and limits to making inferences about the effect of a long-term driver from a short-term
experiment

Many early
studies on OA applied treatment levels that greatly exceeded even worst-case climate
change scenarios and did not report water chemistry in sufficient detail to determine if the
treatment mimicked future OA-driven seawater conditions. Although most recent work has improved with
respect to treatment levels, mimicking future water chemistry remains tricky.

A rationale commonly used to justify high CO2/low pH treatments is the need to identify at what levels organisms are affected.
However, the limits to making inferences about how an organism or ecosystem will respond to a climate-change scale variable (i.e.
one that changes over decades–centuries) from their response during a short-term challenge experiment (i.e. hours–days–weeks)
has not been adequately addressed—or even mentioned—in most studies. This is reflected in a confusion of terms common in OA
studies—when describing the outcome of a short-term CO2 challenge, authors often make the inferential leap and use “OA” when
discussing their results, without any caveats. Oddly, incorporation of the extensive toxicology literature is almost entirely missing
from OA studies, either when it comes to adopting established exposure protocols or to framing the inferences that can/cannot be
drawn from short-term experiments. Also missing
from most studies is anything more than a superficial
statement about the possibility for acclimation, adaptation, or evolution, something that is
necessary to extend the outcome of a short-term challenge experiment into an inference about the
effect of a long-term driver (see below).

Spatio-temporal variability in CO2 and pH

Biogeochemists are well aware of the spatio-temporal variability in CO2 and pH—daily (high productivity areas), seasonal (blooms),
interannual (higher temperatures), horizontal (coastal upwelling areas, high turbidity zones), and vertical (deep vs. surface waters)
ranges in these can be extensive (e.g. Wootton et al., 2008; Hofmann et al., 2011; Waldbusser and Salisbury, 2014; Kapsenberg et al.,
2015). Biologists have struggled to incorporate this variability into experiments designed to test the effects of OA, and into their
interpretations of the outcomes (Eriander et al., 2016). Some researchers have pointed out that organisms that are exposed to large
ranges in CO2 and pH during their daily lives (e.g. vertical migrators), life cycles (e.g. organisms that reside offshore as larvae but
move to the coast as juveniles or adults), or somewhere in their distributional ranges, should be more tolerant of OA (e.g. Lewis et
al., 2013).

Imbalanced focus on individuals that are affected and insufficient focus on inter-individual variability and within-experiment
selection bias interpretations of ecological impacts

Almost all CO2 challenge experiments produce a range of responses in the test organism—some
individuals are badly affected, others less, and some not at all. There are several issues associated with all such experiments that it is
important to be cognizant of and account for: (i) analyses and interpretations should not ignore or minimize individuals that are little
affected or unaffected (after all, these are the ones whose genes will be passed on to the next generation); (ii) inter-individual
variability should be highlighted; (iii) the longer that the experiment runs the more likely it is that an internal selection process for
the tolerant individuals has occurred. All of these are important in the context of the next section.

Acclimation, generational carry-over effects, adaptation, epigenetics, and evolution

Almost all experiments conducted to assess OA are short-term toxicity challenges . Therefore, using
them as the basis from which to make inferences about a process that will occur slowly over the
next decades–centuries must be made with appropriate caution . That is, the experiments and the
interpretations made from them must consider how populations might acclimatize, adapt, and evolve to
climate change, including OA (e.g. Donelson et al., 2011; Hoffmann and Sgrò, 2011; Sunday et al., 2013; Harvey et al., 2014).
Recent studies indicate that even the effects of OA that are considered most worrisome—various behavioural impairments resulting
from short-term exposure to high CO2 (see Nagelkerken and Munday, 2016)—might be reduced or overcome through adaptation
and evolution (Regan et al., 2016). More knowledge of the mechanisms of direct action of OA-related drivers—higher concentrations
of CO2, hydrogen ions (=lower pH), and/or carbonate chemistry (less carbonate ions)—and of indirect drivers such as the effects of
OA on food quality, are essential to understand what degree of adaption is possible. Readers should be
duly sceptical of
studies that completely ignore the possibility of adaptation when presenting their inferences
about OA, particularly scenario modelling of socio-economic impacts.
We must also do better to incorporate analogous work in other fields, for example, rapid evolution of tolerance to envirotoxins (e.g.
Whitehead et al., 2012) and environmental change (e.g. Collins et al., 2014; Stoks et al., 2015; Thibodeau et al., 2015) via a
combination of genetic and epigenetic mechanisms (Yona et al., 2015).

Publication bias

Negative results—those that do not support a research hypothesis (e.g. OA will have detrimental effects on marine organisms)
—can provide more balance for a subject area for which most published research reports
positive results. Negative results can indicate that a subject area is not mature or clearly enough defined, or that our
current methods and approaches are insufficient to produce a definitive result. Gould (1993) asserted
that positive results tell more interesting stories than negative results and are, therefore, easier to write
about and more interesting to read. He calls this a privileging of the positive. This privileging leads to a bias
that acts against the propagation of negative results in the scholarly literature (see also Browman,
1999). Further, it is also important to recognize that studies showing no effect of OA are less equivocal than those that do, for all of
the reasons noted above. Following from this, it is essential that authors writing about possible effects of OA present and discuss
research that is inconsistent with their results and/or their interpretations—openly, honestly, and rigorously. Readers should be duly
sceptical of articles that do not do this.

Algae adaptation means the ocean is resilient to climate change

MyNextfone, 9-14, 2014, http://www.mynextfone.co.uk/noncyclical-consumer-goods/adapt-to-higher-temperatures-acidification-evolution-often-overlooked-in-climate-h22041.html

Ocean algae can evolve fast to tackle climate change-study.

* Microbes adapt to higher temperatures, acidification

* Evolution often overlooked in climate projections

By Alister Doyle, Environment Correspondent

OSLO, Sept 14 (Reuters) - Tiny marine algae can evolve fast enough to cope with climate
change in a sign that some ocean life may be more resilient than thought to rising
temperatures and acidification, a study showed.

Evolution is usually omitted in scientific projections of how global warming will affect the
planet in coming decades because genetic changes happen too slowly to help larger creatures
such as cod, tuna or whales.

Sunday's study found that a type of microscopic algae that can produce 500 generations a year
- or more than one a day - can still thrive when exposed to warmer temperatures and levels of
ocean acidification predicted for the mid-2100s.

The Emiliania huxleyi phytoplankton studied are a main source of food for fish and other ocean
life and also absorb large amounts of carbon dioxide, the main greenhouse gas, as they grow.
Their huge blooms can sometimes be seen from space.
"Evolutionary processes need to be considered when predicting the effects of a warming and
acidifying ocean on phytoplankton," according to the German-led study in the journal Nature
Climate Change.

Sea urchins and coral reefs are resilient

MyNextfone, 9-14, 2014, http://www.mynextfone.co.uk/noncyclical-consumer-goods/adapt-to-higher-temperatures-acidification-evolution-often-overlooked-in-climate-h22041.html

Sunday's study showed that algae, taken from water 15 degrees C (59 Fahrenheit) warm off
Norway, tended to evolve to a smaller size in higher temperatures in experiments lasting more
than a year but also grew faster, producing a larger mass overall.

Stephen Palumbi, a professor of biology at Stanford University, said there was evidence that
some coral reefs or sea urchins could be more resilient than expected to ocean changes.
AT: Climate Change Destroys the Grid
Climate change will only have an incidental impact on the grid and renewables
won’t solve

Science Daily, January 5, 2022, Climate change could lead to power outages, higher power costs
on west coast of US, https://www.sciencedaily.com/releases/2022/01/220105103240.htm
Two new studies led by a North Carolina State University researcher offer a preview of what electricity consumers on the West Coast
could experience under two different future scenarios: one where excessive heat due to climate change strains power supplies, and
one where the grid shifts toward renewable energy while the climate follows historic trends. In both cases, they found power costs
and reliability remain vulnerable to extreme weather. "The
impacts of climate change and extreme weather
events on the grid, mostly in the form of drought and heat waves, are going to get worse
under climate change," said Jordan Kern, assistant professor of forestry and environment resources at NC State. "Even as
the West Coast grid moves away from fossil fuels toward wind and solar, these extreme weather events will still impact system
reliability and the price of power." Published in the journal Earth's Future, the two studies project future power supply and demand
under separate scenarios. In the first study, researchers used computer models to simulate the impacts of climate change on the
current power grid in California and the Pacific Northwest. They evaluated the grid's price and reliability under 11 different climate
scenarios between 2030 and 2060, drawing on multiple scientific models for how the climate would change under a "worst-case
scenario" of fossil fuel emissions, and another less severe scenario. advertisement "The worst-case scenario is worth looking at even
if there's some evidence that the world is going to reduce fossil fuel emissions enough to avoid it," Kern said. The
researchers
found greater risk of power blackouts in summer and early fall, largely driven by extreme heat
in California that causes high demand for power as people cool their homes. They projected there would
be shortfall events in all but one scenario where climate change affected power generation in both regions simultaneously.
However, they noted these power shortfalls would remain relatively rare. The maximum under
the worst case was 72 hours of West Coast-wide power supply shortages across 31 years. "As it
gets hotter and hotter and hotter, and demand for electricity gets higher, we expect the grid to fail," Kern said. "Those extreme heat
events are going to become much more severe." Extreme heat in California would also affect the price and supply of power in the
Pacific Northwest. Historically, the regions have shared power. "When prices go up to $1,000 per megawatt hour, that's the grid
ringing the alarm bell," Kern said. "They're making electricity so expensive partly in order to incentivize people to consume less." In a
second study, researchers evaluated the price of power through 2050 with more renewable energy sources added to the grid, while
assuming natural gas power plants would still be in place as back-up. They compared five scenarios for each market: two scenarios
that varied the mix of solar and wind by cost; one scenario with more batteries added to store power; a scenario in which many
people are adopting electric vehicles; and the status quo trend. They assessed the cost of electricity in these different systems under
100 representative years of both normal and extreme weather events that could occur under historic climate conditions -- without
additional climate warming. "With the West Coast grid now, we know certain things about how it will perform because it relies so
much on hydropower -- that a dry year is a bad and a wet year is good," Kern said. "What we wanted to know is: as you decarbonize
the grid out West, adding electric vehicles, batteries, solar and wind, does that shift at all?" Even with renewables, they found
extreme drought and heat would still drive the extremes in price -- with "good" years of the lowest prices driven by mild
temperatures and high streamflow, and the highest prices driven by extreme heat or drought. "When you think about the very worst
years, those conditions will still be driven by what drives those events today: lack of water or a heat wave in the middle of the
summer," Kern said. "Adding renewable energy does not change the very worst or best year, but it kind of shifts things around in the
middle." In California, the future scenario with increased wind energy led to the lowest prices, followed by solar. In the Pacific
Northwest, the scenarios with the highest amount of both wind and solar had the lowest prices. Supply shortfalls would be most
frequent under the pathway with the greatest demand for electric vehicles. "As the grid uses more wind and solar, the price goes
down because it's less expensive, and it pushes out natural gas," Kern said. "The exception is that when you have high demand for
power from electric vehicles, demand gets so high, it breaks the system. It's pretty rare in our models, but it happens when there
isn't much water and there's a heat wave." Kern said the reductions that they projected in greenhouse gas emissions under the five
scenarios were "conservative;" their models chart up to 50% de-carbonization through 2050, while most West Coast states have set
goals to make more substantial shifts sooner. "Our key finding was that as
the grid decarbonizes, you are still going
to be left with that vulnerability to water and heat," Kern said. "This is a system that can't run
away from that." The study, "The Effects of Climate Change on Interregional Electricity Market Dynamics on the U.S. West
Coast," was published online in Earth's Future on Dec. 7, 2021. In addition to Kern, the other authors were Joy Hill, David E. Rupp,
Nathalie Voisin and Gregory Characklis. The study was supported by the National Science Foundation INFEWS program under awards
1639268 T2 and 170082 T1. The second study, "Technology pathways could help drive the U.S. West Coast grid's exposure to
hydrometeorological uncertainty," was published online in Earth's Future on Dec. 28, 2021. In addition to Kern, the other authors
include Jacob Wessel, Nathalie Voisin, Konstantinos Oikonomou and Jannik Haas. The study was funded by the U.S. Department of
Energy Office of Science as part of research in the MultiSector Dynamics, Earth and Environmental System Modeling Program as well
as the National Science Foundation INFEWS program award 1639268.
AT: Species/Biodiversity
CO2 concentrations 18 times higher than current levels didn’t cause species
extinctions
Kathy J. Willis et al 10, Professor of Long-Term Ecology at the University of Oxford; Keith D.
Bennett, professor of late-Quaternary environmental change at Queen's University Belfast,
guest professor in palaeobiology at Uppsala University in Sweden, et al, 2010, “4°C and beyond:
what did this mean for biodiversity in the past?,” Systematics and Biodiversity, Vol. 8, No. 1, p.
3-9

Within a time-frame of Earth's history, current atmospheric CO2 levels at 380 ppmv are relatively low
compared with the past; geological evidence and geochemical models suggest intervals of time when
levels have been up to 18 times higher than present (Royer, 2008). The fossil record thus provides
plenty of opportunity to assess biotic responses to intervals of higher global atmospheric CO2
and temperatures. However, this only makes sense if it is also possible to examine the responses of extant species, which
have modern-day distributions; and where the position of global lithospheric plates is relatively similar to the present. Therefore,
an ideal time interval for consideration is the past 65 million years when many of the ancestors
of modern tropical and temperate trees had evolved (Willis & McElwain, 2002; Murat et al., 2004; Morley, 2007).
It is also fair to assume that these species had broadly similar ecological tolerances to present day; it has been
demonstrated in a number of studies that most species are remarkably conservative in their ecological
niches (Wiens & Graham, 2005), and that these remain relatively unchanged through time despite
populations persisting through intervals of wide amplitude fluctuations in climate (Svenning & Condit,
2008).

The most recent climate models and fossil evidence for the early Eocene Climatic Optimum (53–51 million years
ago) indicate that during this time interval atmospheric CO2 would have exceeded 1200 ppmv and
tropical temperatures were between 5–10 °C warmer than modern values (Zachos et al., 2008). There is also evidence for
relatively rapid intervals of extreme global warmth and massive carbon addition when global
temperatures increased by 5 °C in less than 10 000 years (Zachos et al., 2001). So what was the response of
biota to these ‘climate extremes’ and do we see the large-scale extinctions (especially in the Neotropics)
predicted by some of the most recent models associated with future climate changes (Huntingford et al., 2008)? In fact
the fossil record for the early Eocene Climatic Optimum demonstrates the very opposite. All the evidence
from low-latitude records indicates that, at least in the plant fossil record, this was one of the most biodiverse
intervals of time in the Neotropics (Jaramillo et al., 2006). It was also a time when the tropical forest biome was the
most extensive in Earth's history , extending to mid-latitudes in both the northern and southern hemispheres – and
there was also no ice at the Poles and Antarctica was covered by needle-leaved forest (Morley, 2007).
There were certainly novel ecosystems, and an increase in community turnover with a mixture of tropical and temperate species in
mid latitudes and plants persisting in areas that are currently polar deserts. [It should be noted; however, that at the earlier
Palaeocene–Eocene Thermal Maximum (PETM) at 55.8 million years ago in the US Gulf Coast, there was a rapid vegetation response
to climate change. There was major compositional turnover, palynological richness decreased, and regional extinctions occurred
(Harrington & Jaramillo, 2007). Reasons for these changes are unclear, but they may have resulted from continental drying, negative
feedbacks on vegetation to changing CO2 (assuming that CO2 changed during the PETM), rapid cooling immediately after the PETM,
or subtle changes in plant–animal interactions (Harrington & Jaramillo, 2007).]
AT: War

Doesn’t involve great powers


Tertrais, International Institute for Strategic Studies, Foundation for strategic research, ‘11
(Bruno, The Climate Wars Myth, http://csis.org/publication/twq-climate-wars-myth)

Some of the most catastrophic scenarios of climate change-induced conflict just do not stand up to scrutiny. To study the possible political consequences of changes in the
geography of the Arctic region due to climate change is one thing. To imagine this could
lead to armed clashes between Russia and the North Atlantic Treaty Organization
(NATO) is another. First, the diminution of the maximum extent of summer sea ice will
not transform the North-Western Passage and the Northern Maritime Route into vital
maritime trade arteries: they will be open only a few weeks or a few months a year. Second,
the real quantity of hydrocarbon resources in the region is still very much open to
debate; and such resources are, for the most part, located within national maritime
areas. Third, the attitude of all neighboring states regarding this region, including Russia,
reflects a clear preference for settling possible disputes in accordance with
accepted international law. Fourth, the scope of these disputes is not increasing;
rather the opposite: in April 2010, Norway and Russia settled their decades-long dispute on
the delimitation of their respective maritime areas in the Barents Sea.

The interruption of the North Atlantic Conveyor Belt (‘‘Gulf Stream’’) due to global
warming is a favorite of thrillers and science-fiction writers. The study of its consequences
by a consulting firm at the request of the U.S. Department of Defense’s Office of Net Assessment
a few years ago was widely noted.19 The problem is that the credibility of this scenario is
close to nil. Recent scientific research has shown that the Gulf Stream is animated much less by
thermohaline circulation (differences in the temperature and salinity of water) than by the
winds. Moreover, its role in shaping and regulating the climate of Northern Atlantic regions has
been seriously put in doubt.20

Finally, the argument according to which global warming will lead to an increase in the
number of natural catastrophes, with grave humanitarian consequences, should be taken
with a heavy pinch of (marine) salt. The only available evidence that global warming
will lead to more extreme weather events relies on modeling. Data do not really
sustain this hypothesis so far. There has not been any increase in global precipitation
in recent decades.21 Neither have droughts become more frequent or severe.22
Hurricane activity is not stronger, and its variation remains within the range of natural
variation.23 The number of hurricane events has tended to evolve downwards since
1970; in accumulated intensity, 2010 was its lowest in 30 years.24
The Emergency Events Database (EM-DAT) maintained at the Leuwen University in Belgium—one of the most widely used databases for natural disasters—shows a clear rise in the number of
weather-related catastrophes over the last 30 years. However, this rise can easily be explained
by demographic, economic, sociological, and political factors. EM-DAT only takes into account
events that have caused a significant number of victims (which is rising due to population
increase and the growing number of humans living on exposed areas), for which a state of
emergency has been declared, and a call to international help has been made (the frequency of
which is rising for political and media reasons).25 Furthermore, the number of reported
catastrophes has also increased—as compared to what it was say, a century ago—due to
improved detection and attention. There is every reason to believe that the human, social,
and economic consequences of natural catastrophes will be increasingly severe, but
this has little to do with climate change.

It should also be noted that natural disasters do not necessarily have only negative
consequences on national and international security. Quite the contrary: disasters appear to
prevent rather than promote civil conflict.26 A case in point is the 2004 Asian tsunami,
which indirectly contributed to the stabilization of the decades-old secessionist
conflict in the Indonesian province of Aceh (a peace agreement was signed in August 2005).
AT: Middle East War
Climate change not responsible for Middle East conflict and repression

Daoudy, March/April, 2022, Foreign Affairs, Scorched Earth: Climate and Conflict in the Middle
East, https://www.foreignaffairs.com/articles/middle-east/2022-02-22/scorched-earth

In the past decade, discussions about the Middle East in Western media, academia, and policy
circles have frequently revolved around the idea that climate change is driving much of the
conflict in the region. Although environmental shifts are affecting the region in crucial ways, this
emerging narrative mischaracterizes—or misunderstands—the way that political choices shape
how vulnerable populations interact with their environment.

Consider Syria: when that country spiraled into civil war in 2011, some observers pointed to
climate change as the instigating cause. Rising temperatures, the theory went, caused a major
drought in Syria from 2006 to 2010, which triggered agricultural failure. This, in turn, spurred
migration and discontent; the uprisings were a natural consequence. In 2015, U.S. President
Barack Obama put forward something akin to this argument. Climate change, he said, “helped
fuel the early unrest in Syria, which descended into civil war.”

This interpretation doesn’t stand up to scrutiny. After all, previous droughts had been severe
and did not lead to violent protests. And struggling farmers and migrants fleeing the drought
were not the instigators of the 2011 uprisings: the earliest protests were against political
repression.

Climate change did not instigate the civil war in Syria.

Politics shaped the environmental challenges preceding the Syrian crisis. After Bashar al-Assad
took power in 2000, the regime ramped up its commitment to neoliberal policies at the behest
of the World Bank, the International Monetary Fund, and domestic elites who stood to profit
from such structural adjustments. These developments came with drastic consequences for
rural populations. The uneven transition from Baathist socialism to what the regime dubbed a
“social market economy” made Syria’s rural poor even poorer. The discriminatory decisions the
government took in building infrastructure—such as the construction of the Tabqa dam, on the
Euphrates River, in the 1970s, which displaced thousands of residents—also left the country
vulnerable, 40 years later, to the rapid advance of the Islamic State (also known as ISIS), which
capitalized on the lack of local control over energy and water to take over wide swaths of rural
Syria. Since the escalation of the crisis in Syria into an all-out war, large groups of displaced
people moving from the country to Europe have joined the massive cohort of vulnerable
populations fleeing conflict-stricken areas. They have faced coercive border practices and
extremely precarious living conditions in refugee camps. And yet their number pales in
comparison to the number of internally displaced people in Syria.

There is no clear evidence, however, that climate change alone triggered these and similar new
migration trends. Multiple social, economic, and political factors lead people to migrate, and it
is difficult to isolate the environment from those other drivers. It is dangerous, moreover, to
point to climate change as the root of the region’s ills, because that supposition risks
promoting deceptively simple conflict-resolution measures and limiting the ability of
policymakers to lay the groundwork for real change.

One of the top priorities when it comes to improving conditions for the people most at risk in
countries such as Syria is recognizing the intersections between the environment and armed
conflict and the ways in which various parties have weaponized the region’s vulnerability to
climate-driven scarcity. Governments and nonstate actors have repeatedly targeted key
infrastructure, depriving people of vital goods and services. During the war in Yemen, for
example, Saudi forces have cut off local populations’ access to clean water and sanitation,
placing citizens at high risk for communicable illnesses. As a result, Save the Children classified
Yemen’s 2016 cholera epidemic as a “man-made crisis.”

In Syria, the government and nonstate actors alike have deliberately damaged water
resources and vital infrastructure as a wartime strategy. In 2013 and 2014, battles between
regime forces and ISIS destroyed water plants and sewage pipelines. At one point,
approximately 35 percent of Syria’s water treatment plants no longer functioned. Meanwhile,
ISIS’s capture of the Tabqa dam in 2013 represented a significant victory for the group: ISIS
threatened to cut off electricity delivery to Damascus, and it released 11 million cubic meters of
water to flood the surrounding farmland, forcing local populations into submission and the
central government into a no-strike agreement. Turkey also weaponized water during the
conflict: to squelch the rise of Kurdish autonomy in northeastern Syria, which threatened to
further radicalize Turkey’s own Kurdish population, Turkish troops shut off water to 460,000
people in the Syrian province of Hasakah and in three different refugee camps at a time when
COVID-19 was running rampant.

The targeting of other infrastructure has also put civilians at risk: when the Syrian government,
in conjunction with Russia, damaged oil refineries in the northeastern part of the country, the
leaks contaminated surrounding groundwater—a risk factor for gastrointestinal illness, damage
to the nervous and reproductive systems, and chronic diseases such as cancer. The Syrians and
the Russians aren’t alone in wreaking havoc: water shutoffs by Turkey, combined with low
rainfall, led the Khabur River to dry up; the river became a landfill and an open sewage site,
spreading disease to neighboring villages.

WATER FOR EVERYONE

Although the United States and European countries seem to be preparing to pivot away from
the Middle East, they and international organizations must work harder to foster international
norms that protect natural resources and infrastructure even in the midst of conflict.
Washington has a limited appetite for confronting such partners as Saudi Arabia on human
rights violations, but applying pressure on U.S. partners in the Middle East, including Riyadh, to
adopt a common set of standards on this issue could help protect civilians around the globe.
After all, there are no long-term winners when infrastructure is destroyed. In addition to the
devastating effects it has on civilians, obliterating basic services creates complications that
foreign actors would prefer to avoid.
In Syria and Yemen, the destruction of infrastructure has helped foster lucrative war economies,
with both pro- and anti-regime elites carrying out smuggling and extortion rackets in exchange
for food, water, and fuel. This dynamic doesn’t work to the benefit of even the most cynical
international actors operating in the region: when civilians can no longer look to the state to
provide necessities such as potable water, there is room for nonstate actors such as ISIS to make
inroads. In the end, the most vulnerable populations, such as refugees, pay the ultimate price.

In Yemen, people’s already insecure access to food supplies has been exacerbated by the
Saudi-led blockade of two major ports, Hodeidah and Salif, where 80 percent of food imports
enter the country. All the parties to the conflict there have used the food supply as a
shortsighted weapon. This includes the Houthis, the Shiite sect that is fighting the country’s
Saudi-backed central government, who have expropriated food aid provided by the World Food
Program for extortion rackets to fund their wartime operations. The COVID-19 pandemic has
only intensified the crisis by disrupting vital supply chains and limiting the purchasing power of
local populations.

No one should downplay the importance of climate change in today’s Middle East or in the
region’s future. But policymakers must also understand that the worst outcomes related to
environmental stress and scarcity in the region are caused not by long-term shifts in the
climate, which are difficult to control, but by short-term choices made and actions taken by
powerful people and institutions, which are far easier to influence. Grasping that fundamental
truth is the first step to both protecting the most vulnerable people in the region and helping
governments transition to more sustainable practices. The cost of those tasks will be high—but
the gains to human security and prosperity far greater.
AT: Food
Food shortage doesn’t cause war – best studies
Allouche, research Fellow – water supply and sanitation @ Institute for Development Studies,
frmr professor – MIT, ‘11

(Jeremy, “The sustainability and resilience of global water and food systems: Political analysis of
the interplay between security, resource scarcity, political systems and global trade,” Food
Policy, Vol. 36 Supplement 1, p. S3-S8, January)

The question of resource scarcity has led to many debates on whether scarcity (whether of food
or water) will lead to conflict and war. The underlining reasoning behind most of these
discourses over food and water wars comes from the Malthusian belief that there is an
imbalance between the economic availability of natural resources and population growth since
while food production grows linearly, population increases exponentially. Following this
reasoning, neo-Malthusians claim that finite natural resources place a strict limit on the growth
of human population and aggregate consumption; if these limits are exceeded, social
breakdown, conflict and wars result. Nonetheless, it seems that most empirical studies do not
support any of these neo-Malthusian arguments. Technological change and greater inputs of
capital have dramatically increased labour productivity in agriculture. More generally, the neo-
Malthusian view has suffered because during the last two centuries humankind has breached
many resource barriers that seemed unchallengeable.

Lessons from history: alarmist scenarios, resource wars and international relations

In a so-called age of uncertainty, a number of alarmist scenarios have linked the increasing use of
water resources and food insecurity with wars. The idea of water wars (perhaps more than food
wars) is a dominant discourse in the media (see for example Smith, 2009), NGOs (International
Alert, 2007) and within international organizations (UNEP, 2007). In 2007, UN Secretary General
Ban Ki-moon declared that ‘water scarcity threatens economic and social gains and is a potent
fuel for wars and conflict’ (Lewis, 2007). Of course, this type of discourse has an instrumental
purpose; security and conflict are here used for raising water/food as key policy priorities at the
international level.

In the Middle East, presidents, prime ministers and foreign ministers have also used this bellicose
rhetoric. Boutros Boutros-Ghali said: 'the next war in the Middle East will be over water, not
politics' (Boutros Boutros-Ghali in Butts, 1997, p. 65). The question is not whether the sharing of
transboundary water sparks political tension and alarmist declaration, but rather to what extent
water has been a principal factor in international conflicts. The evidence seems quite weak.
Whether by president Sadat in Egypt or King Hussein in Jordan, none of these declarations have
been followed up by military action.

The governance of transboundary water has gained increased attention these last decades. This
has a direct impact on the global food system as water allocation agreements determine the
amount of water that can used for irrigated agriculture. The likelihood of conflicts over water is
an important parameter to consider in assessing the stability, sustainability and resilience of
global food systems.

None of the various and extensive databases on the causes of war show water as a casus belli.
Using the International Crisis Behavior (ICB) data set and supplementary data from the
University of Alabama on water conflicts, Hewitt, Wolf and Hammer found only seven disputes
where water seems to have been at least a partial cause for conflict (Wolf, 1998, p. 251). In fact,
about 80% of the incidents relating to water were limited purely to governmental rhetoric
intended for the electorate (Otchet, 2001, p. 18).

As shown in The Basins At Risk (BAR) water event database, more than two-thirds of over 1800
water-related ‘events’ fall on the ‘cooperative’ scale (Yoffe et al., 2003). Indeed, if one takes into
account a much longer period, the following figures clearly demonstrate this argument.
According to studies by the United Nations Food and Agriculture Organization (FAO), organized
political bodies signed between the year 805 and 1984 more than 3600 water-related treaties,
and approximately 300 treaties dealing with water management or allocations in international
basins have been negotiated since 1945 (FAO, 1978 and FAO, 1984).

The fear around water wars have been driven by a Malthusian outlook which equates scarcity
with violence, conflict and war. There is however no direct correlation between water scarcity and
transboundary conflict. Most specialists now tend to agree that the major issue is not scarcity per
se but rather the allocation of water resources between the different riparian states (see for
example Allouche, 2005, Allouche, 2007 and [Rouyer, 2000] ). Water rich countries have been
involved in a number of disputes with other relatively water rich countries (see for example
India/Pakistan or Brazil/Argentina). The perception of each state’s estimated water needs really
constitutes the core issue in transboundary water relations. Indeed, whether this scarcity exists
or not in reality, perceptions of the amount of available water shapes people’s attitude towards
the environment (Ohlsson, 1999). In fact, some water experts have argued that scarcity drives
the process of co-operation among riparians (Dinar and Dinar, 2005 and Brochmann and
Gleditsch, 2006).

In terms of international relations, the threat of water wars due to increasing scarcity does not
make much sense in the light of the recent historical record. Overall, the water war rationale
expects conflict to occur over water, and appears to suggest that violence is a viable means of
securing national water supplies, an argument which is highly contestable.

The debates over the likely impacts of climate change have again popularised the idea of water
wars. The argument runs that climate change will precipitate worsening ecological conditions
contributing to resource scarcities, social breakdown, institutional failure, mass migrations and
in turn cause greater political instability and conflict (Brauch, 2002 and Pervis and Busby, 2004).
In a report for the US Department of Defense, Schwartz and Randall (2003) speculate about the
consequences of a worst-case climate change scenario arguing that water shortages will lead to
aggressive wars (Schwartz and Randall, 2003, p. 15). Despite growing concern that climate
change will lead to instability and violent conflict, the evidence base to substantiate the
connections is thin ( [Barnett and Adger, 2007] and Kevane and Gray, 2008).
AT: Sinks
Their Sag card says “there is the potential for sequestration of millions of
tonnes of CO2” from weed plants–that’s literally a joke compared to the
rainforest
Phillips, professor of tropical ecology – University of Leeds, 3/21/’15
(Oliver, “Amazon carbon sink declines as trees grow fast, die faster,” The Ecologist)

To date the Amazon has been a huge carbon sink, soaking up billions of tonnes of our emissions
from fossil fuels, write Oliver Phillips & Roel Brienen. But now that's changing, as trees grow
faster and die younger: the sink appears to be saturating.

Tropical forests are being exposed to unprecedented environmental change, with huge knock-
on effects. In the past decade, the carbon absorbed annually by the Amazon rain forest has
declined by almost a third.

At 6 million sq.km, the Amazon forest covers an area 25 times that of the UK, and spans large
parts of nine countries. The region contains a fifth of all species on earth, including more than
15,000 types of tree.

Its 300 billion trees store 20% of all the carbon in the Earth's biomass, and each year they
actively cycle 18 billion tonnes of carbon, twice as much as is emitted by all the fossil fuels burnt
in the world.
The Amazon Basin is also a hydrological powerhouse. Water vapour from the forest nurtures agriculture to the south, including the biofuel crops which
power many of Brazil's cars and the soybeans which feed increasing numbers of people (and cows) across the planet.

What happens to the Amazon thus matters to the world. As we describe in research published in Nature, the biomass dynamics of apparently intact
forests of the Amazon have been changing for decades now with important consequences.

Is climate changing the Amazon?

There are two competing narratives of how tropical forests should be responding to global changes. On one hand, there is the theoretical prospect (and
some experimental evidence) that more carbon dioxide will be 'good' for plants.

Carbon dioxide is the key chemical ingredient in photosynthesis, so more of it should lead to faster growth and thus more opportunities for trees and
whole forests to store carbon. In fact almost all global models of vegetation predict faster growth and, for a time at least, greater carbon storage.

Arrayed against this has been an opposing expectation, based on the physical climate impacts of the very same increase in atmospheric CO2. As the
tropics warm further, respiration by plants and soil microbes should increase faster than photosynthesis, meaning more carbon is pumped into the air
than is captured in the 'sink'.

More extreme seasons will also mean more droughts, slowing growth and sometimes even killing trees.

Which process will win?

The work we have led takes a simple approach. With many colleagues, we track the behaviour of individual trees through time across permanent plots
distributed right across South America's rain forests.

Together with hundreds of partners in the RAINFOR network, this close-up look at the Amazon ecosystem has been underway since the 1980s, allowing
an unprecedented assessment of how tropical forests have changed over the past three decades.

Our analysis - based on work across 321 plots, 30 years, eight nations, and involving almost 500 people - first of all confirms earlier results. The Amazon
forest has acted as a vast sponge for atmospheric carbon. That is, trees have been growing faster than they have been dying.
The difference - the 'sink' - has helped to put a modest brake on the rate of climate change by taking up an additional two billion tonnes of carbon
dioxide each year.

This extra carbon has been going into ostensibly mature forests, ecosystems which according to classical ecology should be at a dynamic equilibrium
and thus close to carbon-neutral.

Amazon trees are finding it harder to survive

However we also found a long and sustained increase in the rate of trees dying in Amazon
forests that are undisturbed by direct human impacts.

Tree mortality rates have surged by more than a third since the mid-1980s, while growth rates
have stalled over the past decade. This had a significant impact on the Amazon's capacity to
take-up carbon.

Recent droughts and unusually high temperatures in the Amazon are almost certainly behind
some of this 'mortality catch-up'. One major drought in 2005 killed millions of trees. However
the data shows tree mortality increases began well before then. Some other, non-climatic
mechanism may be killing off Amazonian trees.

The simplest answer is that faster growth, which is consistent with a CO2 stimulation, is now
causing trees to also die faster. As the extra carbon feeds through the system, trees not only
grow quicker but they also mature earlier. In short, they are living faster, and therefore dying
younger.

Thus, 30 years of painstakingly monitoring the Amazon has revealed a complex and changing
picture. Predictions of a continuing increase of carbon storage in tropical forests may be overly
optimistic - these models simply don't capture the important feed-through effect of faster
growth on mortality.

Forests' ability to store carbon is reducing

As the Amazon forest growth cycle has been accelerating, carbon is moving through it more
rapidly. One consequence of the increase in death should be an increase in the amount of
necromass - dead wood - on the forest floor.

While we haven't measured these changes directly, our model suggests the amount of dead
wood in the Amazon has increased by 30% (more than 3 billion tonnes of carbon) since the
1980s. Most of this decaying matter is destined to return to the atmosphere sooner rather than
later.

More than a quarter of current emissions are being taken up by the land sink, mostly by forests.
But a key element appears to be saturating.

This reminds us that the subsidy from nature is likely to be strictly time-limited, and deeper cuts
in emissions will be required to stabilise our climate.
Nuke War Turns Climate

Choi, writer for LiveScience, reprinted in Yahoo! News, 3/29/2014
(Charles Q., "'Small' Nuclear War Could Trigger Catastrophic Cooling,"
http://news.yahoo.com/small-nuclear-war-could-trigger-catastrophic-cooling-181056235.html)

To see what effects such a regional nuclear conflict might have on climate, scientists modeled a
war between India and Pakistan involving 100 Hiroshima-level bombs, each packing the
equivalent of 15,000 tons of TNT — just a small fraction of the world's current nuclear arsenal.
They simulated interactions within and between the atmosphere, ocean, land and sea ice
components of the Earth's climate system.

Scientists found the effects of such a war could be catastrophic.

"Most people would be surprised to know that even a very small regional nuclear war on the
other side of the planet could disrupt global climate for at least a decade and wipe out the
ozone layer for a decade," study lead author Michael Mills, an atmospheric scientist at the
National Center for Atmospheric Research in Colorado, told Live Science.

The researchers predicted the resulting firestorms would kick up about 5.5 million tons (5
million metric tons) of black carbon high into the atmosphere. This ash would absorb incoming
solar heat, cooling the surface below.
A2: 1% Risk
1% risk framing causes policy paralysis – short term and high
probability impacts outweigh
Lomborg 8 – Bjorn Lomborg 8, adjunct professor at the Copenhagen Business
School, where he founded and directs its Copenhagen Consensus Center, August 25,
2008, “A warming theory that has melted away,” The Guardian, online:
http://www.guardian.co.uk/commentisfree/2008/aug/25/climatechange.scienceofclimatechange
But it is interesting to assess Weitzman's argument (My arguments are partly indebted to Professor Nordhaus (pdf)).
Tickell (and many other campaigners) fancies Weitzman, because his economic argument seems to support draconian
climate policies. While very technical, it relies on a fairly straightforward gist. All risks you can think of –
even catastrophic ones – have non-zero risk. Thus, it is possible (if not very likely) that global warming
will not only increase the planet's temperature by 4C, but 10C. Heck, it might even increase beyond 20C – which
Weitzman with armchair climatology, suggests might have a probability of 1%. Since evidence for or against such
extremes is scarce, accumulating evidence can only slowly close us in on their true probability.
Yet, for any given amount of evidence, there will always be sufficiently outrageous risks (think 30C)
that are sufficiently unbound by evidence and sufficiently close to negative infinite
utility that the total net utility is negative infinity. Thus, we should be willing to spend all our
money to avoid it.¶ Now, in principle all economists would agree that non-trivial risks should be included in the
model, and for example, Nordhaus has done that analytically in cost-benefit models (they still show that large emission
cuts are not warranted). However, the Weitzman result curiously means that the more speculative and fuzzy the extreme
event, the more it counts in the total utility.¶ This is an argument driven by a technicality – essentially a claim that we are
willing to pay an infinite amount to avoid even an infinitesimal risk of annihilation. Yet we demonstratively
aren't – and shouldn't be. Civilization-ending asteroids hit the earth once every 100m years, but at present we only
spend $4m per year to track them. Maybe we should pay $1bn. But we shouldn't spend everything. ¶ This underscores the fatal flaw in the Weitzman argument. When we allow all scary, fuzzy concerns onto centre stage, there is no end to where we should spend all our money. Every conceivable policy
measure has a non-zero risk of catastrophe and so should be avoided at any cost.
Biotechnology, strangelets, runaway computer systems, nuclear proliferation, rogue weeds and bugs, pandemics, and
asteroids are just a small sample of the areas each of which we should spend all our money on. ¶ Tickell doesn't deal with
these arguments at all. As with Stern, he simply picks Weitzman because the policy conclusion fits. Tickell then claims that
spending $2tn annually on large-scale emissions cuts will provide the best insurance for mankind. But this ignores that
investments in energy R&D will probably cut 11 times more CO2 over the long term. Moreover, if our goal is not just to cut CO2
but to help people and the planet, we can do even more good by focusing on simple solutions
such as investing in nutrition, health and agricultural technologies. Instead of avoiding a couple of thousand extra malaria
deaths in a century through expensive CO2 cuts, maybe we should avoid a million malaria deaths now through low-
cost health policies.¶ Tickell's reply clearly shows what happens when policy drives the search for suitable facts. The IPCC
is simply ignored, Stern is praised for his policy usefulness, Weitzman embraced irrespective of his analysis
essentially leading to policy paralysis, driven by extreme and pervasive speculative risks. Not surprisingly, Tickell
ends by saying – without a shred of evidence – that his policy would be the best solution, "even without the threat of
global warming".
A2: Impact - Disease
At most, an additional 250,000 deaths per year from disease, and not until 2030–2050

New Vision, August 27, 2014, http://www.newvision.co.ug/news/659167-climate-change-poses-growing-health-threat-un.html

Warmer temperature and altered rainfall patterns may also extend the range of mosquitoes that spread malaria, dengue and chikungunya.

According to WHO figures, at current rates of change, an additional 250,000 lives could be lost per year between 2030 and 2050, with poor nations continuing to bear the brunt.
Malnutrition, which already kills 3.1 million people per year,
would be to blame for 95,000 of those deaths.
A2: Impact - Agriculture
Demand for meat is undermining agriculture, and agriculture and land-use change account for 30% of emissions

Peter Hannam, 8-27-2014, The Age (Melbourne, Australia), “Climate change may disrupt global food system within a decade, World Bank says”

The world is headed "down a dangerous path" with disruption of the food system possible within a decade as climate change undermines nations' ability to feed themselves, according to a senior World Bank official.
Rising urban populations are contributing to expanded demand for
meat, adding to nutrition shortages for the world's poor.
Increased greenhouse gas emissions from livestock as well as land
clearing will make farming more marginal in many regions,
especially in developing nations, said Rachel Kyte, World Bank
Group Vice President and special envoy for climate change.
"The challenges from waste to warming, spurred on by a growing
population with a rising middle-class hunger for meat, are
leading us down a dangerous path," Professor Kyte told the
Crawford Fund 2014 annual conference in Canberra on Wednesday.
"Unless we chart a new course, we will find ourselves staring
volatility and disruption in the food system in the face, not in
2050, not in 2040, but potentially within the next decade," she
said, according to her prepared speech.
Agriculture and land-use change account for about 30 per cent of
the greenhouse gas emissions blamed for global warming. Feed
efficiency can be so low in arid parts of Africa, where livestock
typically graze on marginal land and crop residues, that every
kilo of protein produced can contribute the equivalent of one
tonne of carbon dioxide - or 100 times more than in developed
nations, Professor Kyte said.
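
A quick unit check of the ratio in the last sentence, reading both figures as emissions per kilogram of protein (the article does not spell the denominator out, so treat this as an interpretation):

\[ 1\ \text{tonne CO}_2\text{e per kg protein} = 1000\ \text{kg CO}_2\text{e per kg protein}, \qquad \frac{1000}{100} = 10\ \text{kg CO}_2\text{e per kg protein} \]

i.e. the implied developed-country benchmark is roughly 10 kg of CO2-equivalent per kilogram of protein.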
A2: Impact - Ocean Acidification
Algae adaptation means the ocean is resilient to climate change

MyNextfone, 9-14, 2014, http://www.mynextfone.co.uk/noncyclical-consumer-goods/adapt-to-higher-temperatures-acidification-evolution-often-overlooked-in-climate-h22041.html

Ocean algae can evolve fast to tackle climate change-study.

* Microbes adapt to higher temperatures, acidification

* Evolution often overlooked in climate projections

By Alister Doyle, Environment Correspondent

OSLO, Sept 14 (Reuters) - Tiny marine algae can evolve fast enough to cope with climate
change in a sign that some ocean life may be more resilient than thought to rising
temperatures and acidification, a study showed.

Evolution is usually omitted in scientific projections of how global warming will affect the
planet in coming decades because genetic changes happen too slowly to help larger creatures
such as cod, tuna or whales.

Sunday's study found that a type of microscopic algae that can produce 500 generations a year
- or more than one a day - can still thrive when exposed to warmer temperatures and levels of
ocean acidification predicted for the mid-2100s.

The Emiliania huxleyi phytoplankton studied are a main source of food for fish and other ocean
life and also absorb large amounts of carbon dioxide, the main greenhouse gas, as they grow.
Their huge blooms can sometimes be seen from space.

"Evolutionary processes need to be considered when predicting the effects of a warming and
acidifying ocean on phytoplankton," according to the German-led study in the journal Nature
Climate Change.

Sea urchins and coral reefs are resilient

MyNextfone, 9-14, 2014, http://www.mynextfone.co.uk/noncyclical-consumer-goods/adapt-to-higher-temperatures-acidification-evolution-often-overlooked-in-climate-h22041.html

Sunday's study showed that algae, taken from water 15 degrees C (59 Fahrenheit) warm off
Norway, tended to evolve to a smaller size in higher temperatures in experiments lasting more
than a year but also grew faster, producing a larger mass overall.
Stephen Palumbi, a professor of biology at Stanford University, said there was evidence that
some coral reefs or sea urchins could be more resilient than expected to ocean changes.

The aff’s claims about the impacts of ocean acidification are not true. More acidic
oceans can actually be beneficial for many marine dwellers.
AAAS Science 9
(American Association for the Advancement of Science, 12/1/2009, AAAS Science, “Acidic Oceans May Be a Boon for Some Marine Dwellers”, http://news.sciencemag.org/2009/12/acidic-oceans-may-be-boon-some-marine-dwellers, 6/25/2014, AC)

Researchers fret that many species of invertebrates will disappear as the oceans acidify due to increased levels of atmospheric
carbon dioxide (CO2). But a new study concludes that some of these species may benefit from ocean
acidification, growing bigger shells or skeletons that provide more protection. The work suggests
that the effects of increased CO2 on marine environments will be more complex than previously thought. Bottom-
dwelling marine critters such as lobsters and corals encase themselves in shells or
exoskeletons made from calcium carbonate. Previous studies predict that rising ocean acidity will result in the
loss or weakening of these exoskeletons or shells and increase their owner's vulnerability to disease, predators, and
environmental stress. But marine scientist Justin Ries of the University of North Carolina, Chapel Hill, hypothesized that not
all ocean organisms would respond the same way to acidity because they use different forms
of calcium carbonate for their shells. Ries and two colleagues from the Woods Hole Oceanographic Institution in
Massachusetts exposed 18 species of marine organisms to seawater with four levels of acidity. The first environment matched
today's atmospheric CO2 levels, and two others were set at double and triple the pre-Industrial CO2 levels, mimicking
conditions predicted to occur over the next century. The fourth CO2 level was 10 times pre-Industrial levels. Although CO2
levels won't rise that high in our lifetime, Ries says they could within 500 to 700 years. The atmosphere did contain that much
CO2 during the Cretaceous period about 100 million years ago, Ries says. "This is an interval in which many of these organisms
lived and apparently did okay, despite the extremely elevated levels of atmospheric CO2 that existed at that time." Blue
crabs, lobsters, and shrimp prospered in the highest CO2 level, growing heavier shells, the
researchers report today in Geology. Ries says a bulkier shell might be more resistant to crushing by
predators. American oysters, scallops, temperate corals, and tube worms all fared poorly and grew thinner, weaker shells.
The biggest losers included clams and pencil urchins; their exoskeletons dissolved at the highest CO2 levels. Susceptibility
to acid depends in part on the type of calcium carbonate the animal makes, the researchers found.
But a shell's mineralogy alone was not the only factor. If critters were able to control pH at their calcification sites by buffering
the acid in the surrounding water, as the calcareous green algae did, they also fared better. But Ries points out that
this coping mechanism takes energy--how much isn't known--which could have side effects such as diverting energy from
maintaining an immune response. "The take-home message is that the responses to ocean acidification are going to be a lot
more nuanced and complex than we thought," Ries says.

Ocean Acidification Is Not a Threat – Previous Studies Used Predictions Instead of Empirical Data
Ridley, PhD in Zoology, 10 (Matt, 2010, The Global Warming Policy Foundation, “Threat From Ocean Acidification Greatly Exaggerated”, http://www.thegwpf.org/matt-ridley-threat-from-ocean-acidification-greatly-exaggerated/, 6/29/14, AEG)
Lest my critics still accuse me of cherry-picking studies, let me refer them also to the results of Hendriks et al. (2010, Estuarine, Coastal and Shelf Science 86:157). Far from being a cherry-picked study, this is a massive meta-analysis. The authors
observed that `warnings that ocean acidification is a major threat to marine biodiversity are
largely based on the analysis of predicted changes in ocean chemical fields’ rather than
empirical data. So they constructed a database of 372 studies in which the responses of 44
different marine species to ocean acidification induced by equilibrating seawater with CO2-
enriched air had been actually measured. They found that only a minority of studies
demonstrated `significant responses to acidification’ and there was no significant mean
effect even in these studies. They concluded that the world’s marine biota are `more
resistant to ocean acidification than suggested by pessimistic predictions identifying ocean
acidification as a major threat to marine biodiversity’ and that ocean acidification `may not
be the widespread problem conjured into the 21st century…Biological processes can
provide homeostasis against changes in pH in bulk waters of the range predicted during the
21st century.’ This important paper alone contradicts Hoegh-Guldberg’s assertion that `the
vast bulk of scientific evidence shows that calcifiers… are being heavily impacted already’.

No oceans impact – pH variation is natural

Hofmann et al 2011 (Gretchen E., Professor of Ecology, Evolution and Marine Biology –
University of California Santa Barbara, et al, “High-Frequency Dynamics of Ocean pH: A
Multi-Ecosystem Comparison,” PLoS ONE Vol. 6, No. 12)

Since the publication of two reports in 2005–2006 [1], [2], the drive to forecast the effects of anthropogenic ocean acidification (OA) on marine ecosystems and their resident calcifying marine organisms
has resulted in a growing body of research. Numerous laboratory studies testing the effects of altered
seawater chemistry (low pH, altered pCO2, and undersaturation states - Ω - for calcium carbonate polymorphs) on biogenic
calcification, growth, metabolism, and development have demonstrated a range of responses in marine organisms (for reviews
see [3]–[8]). However, the emerging picture of biological consequences of OA – from data gathered largely from
laboratory experiments – is not currently matched by equally available environmental data that describe
present-day pH exposures or the natural variation in the carbonate system experienced by most
marine organisms. Although researchers have documented variability in seawater carbonate chemistry on several
occasions in different marine ecosystems (e.g., [9]–[15]), this variation has been under-appreciated in these
early stages of OA research. Recently, a deeper consideration of ecosystem-specific variation in seawater chemistry has
emerged (e.g., [16]–[18]), one that is pertinent to the study of biological consequences of OA. Specifically, assessments of
environmental heterogeneity present a nuanced complement to current laboratory
experiments. The dynamics of specific natural carbonate chemistry on local scales provide
critical context because outcomes of experiments on single species are used in meta-analyses to
project the overall biological consequences of OA [7], [19], to forecast ecosystem-level outcomes [20], and
ultimately to contribute to policy decisions [21] and the management of fisheries [22], [23]. As noted earlier [24],
natural variability in pH is seldom considered when effects of ocean acidification are considered.
Natural variability may occur at rates much higher than the rate at which carbon
dioxide is decreasing ocean pH, about −0.0017 pH/year [25], [26]. This ambient fluctuation in pH may
have a large impact on the development of resilience in marine populations, or it may combine with the
steady effects of acidification to produce extreme events with large impacts [24]. In either case, understanding the
environmental variability in ocean pH is essential. Although data on the natural variation in the seawater CO2
system are emerging, nearly all high-resolution (e.g. hourly) time series are based on pCO2 sensors, with comparatively few
pH time series found in the literature. From a research perspective, the absence of information
regarding natural pH dynamics is a critical data gap for the biological and ecological arm of the
multidisciplinary investigation of OA. Our ability to understand processes ranging from physiological tolerances to
local adaptation is compromised. Specifically, laboratory experiments to test tolerances are often not designed to
encompass the actual habitat exposure of the organisms under study, a critical design criterion in organismal physiology that
also applies to global change biology [27]–[29]. It is noted that neither pH nor pCO2 alone provide the information sufficient to
fully constrain the CO2 system, and while it is preferred to measure both, the preference for measuring one over the other is
evaluated on a case-by-case basis and is often dictated by the equipment available. In this light, data that reveal present-day
pH dynamics in marine environments and therefore ground pH levels in CO2 perturbation experiments in an environmental
context are valuable to the OA research community in two major ways. First, estimates of organismal resilience are greatly
facilitated. Empiricists can contextualize lab experiments with actual environmental data, thereby improving them. Notably,
the majority of manipulative laboratory experiments in OA research (including our own) have been parameterized using pCO2
levels as per the IPCC emission scenario predictions [30]. One consequence of this practice is that organisms are potentially
tested outside of the current exposure across their biogeographic range, and tolerances are not bracketed appropriately. This
situation may not be a lethal issue (i.e. negating all past observations in experiments where environmental context was not
known); however, the lack of information about the ‘pH seascape’ may be translated through these organismal experiments in
a manner that clouds the perspective of vulnerability of marine ecosystems. For example, recent data on the heterogeneity of
pH in coastal waters of the Northeastern Pacific [31], [32] that are characterized by episodic upwelling has caused biologists to
re-examine the physiological tolerances of organisms that live there. Specifically, resident calcifying marine invertebrates and
algae are acclimatized to existing spatial and temporal heterogeneity [17], [18], and further, populations are likely adapted to
local to regional differences in upwelling patterns [33]. Secondly, in addition to improving laboratory experiments, data
regarding the nature of the pH seascape also facilitate hypothesis-generating science. Specifically, heterogeneity in the
environment with regard to pH and pCO2 exposure may result in populations that are acclimatized to variable pH or extremes
in pH. Although this process has been highlighted in thermal biology of marine invertebrates [34], such insight is not available
with regard to gradients of seawater chemistry that occur on biogeographic scales. With that said, recent field studies have
demonstrated that natural variation in seawater chemistry does influence organismal abundance and distribution [16], [35],
[36]. With our newfound access to pH time series data, we can begin to explore the biophysical link between environmental
seawater chemistry and resilience to baseline shifts in pH regimes, to identify at-risk populations as well as tolerant ones.
Additionally, the use of sensors in the field can identify hidden patterns in the CO2 system, revealing areas that are refugia to
acidification or carbonate undersaturation; such knowledge could enable protection, management, and remediation of critical
marine habitats and populations in the future. The recent development of sensors for in situ measurements of seawater pH
[37], [38] has resulted in the ability to record pH more readily in the field in a manner that can support biological and
ecological research. Since 2009, the Martz lab (SIO) has constructed 52 “SeaFET” pH sensors for 13 different collaborators (see
http://martzlab.ucsd.edu) working in a broad range of settings. Using subsamples of data from many of these sensors, here we
examine signatures of pH heterogeneity, presenting time series snapshots of sea-surface pH (upper 10 m) at 15 locations,
spanning various overlapping habitat classifications including polar, temperate, tropical, open ocean, coastal, upwelling,
estuarine, kelp forest, coral reef, pelagic, benthic, and extreme. Naturally, at many sites, multiple habitat classifications will
apply. Characteristic patterns observed in the 30-day snapshots provide biome-specific pH signatures. This comparative
dataset highlights the heterogeneity of present-day pH among marine ecosystems and underscores that contemporary marine
organisms are currently exposed to different pH regimes in seawater that are not predicted until 2100. Results: Overall, the
patterns of pH recorded at each of the 15 deployment sites (shown in Figure 1, Table 1) were strikingly different. Figure 2
presents the temporal pattern of pH variation at each of these sites, and, for the sake of comparison, these are presented as 30-
day time series “snapshots.” Note that all deployments generated >30 days of data except for sensors 3, 4, and 13, where the
sensors were deliberately removed due to time constraints at the study sites. Though the patterns observed among the
various marine ecosystems are driven by a variety of oceanographic forcing such as temperature, mixing, and biological
activity, we do not provide a separate analysis of controlling factors on pH at each location. Each time series was accompanied
by a different set of ancillary data, some rich with several co-located sensors, others devoid of co-located sensors. Given these
differences in data collection across sites, here we focus on the comparative pH sensor data as a means to highlight observed
pH variability and ecosystem-level differences between sites. For purposes of comparison, the metrics of variability presented
here are pH minima, maxima, range, standard deviation, and rate of change (see Table 2). The rate presented in Table 2 and
Figure 3 represents a mean instantaneous rate of change in pH hr−1, where a rate was calculated for each discrete time step as
the absolute value of pH difference divided by the length of time between two adjacent data points. In terms of general
patterns amongst the comparative datasets, the open ocean sites (CCE1 and Kingman Reef) and the Antarctic sites (Cape Evans
and Cindercones) displayed the least variation in pH over the 30-day deployment period. For example, pH range fluctuated
between 0.024 to 0.096 at CCE1, Kingman Reef, Cape Evans, and Cindercones (Figure 2A, B and Table 2). In distinct contrast to
the stability of the open ocean and Antarctic sites, sensors at the other five site classifications (upwelling, estuarine/near-
shore, coral reef, kelp forest, and extreme) captured much greater variability (pH fluctuations ranging between 0.121 to 1.430)
and may provide insight towards ecosystem-specific patterns. The sites in upwelling regions (Pt. Conception and Pt. Ano
Nuevo, Figure 2C), the two locations in Monterey Bay, CA (Figure 2D), and the kelp forest sites (La Jolla and Santa Barbara
Mohawk Reef, Figure 2F) all exhibited large fluctuations in pH conditions (pH changes>0.25). Additionally, at these 6 sites, pH
oscillated in semi-diurnal patterns, the most apparent at the estuarine sites. The pH recorded in coral reef ecosystems
exhibited a distinct diel pattern characterized by relatively consistent, moderate fluctuations (0.1<pH change<0.25; Figure 2E).
At the Palmyra fore reef site, pH maxima occurred in the early evening (~5:00 pm), and pH minima were recorded
immediately pre-dawn (~6:30 am). On a fringing reef site in Moorea, French Polynesia, a similar diel pattern was observed,
with pH maxima occurring shortly after sunset (~7:30 pm) and pH minima several hours after dawn (~10:00 am). Finally, the
greatest transitions in pH over time were observed at locations termed our “Extreme” sites - a CO2 venting site in Italy (site S2
in ref. [36]) and a submarine spring site in Mexico. For these sites, the patterns were extremely variable and lacked a
detectable periodicity (Figure 2G). The sites examined in this study do not comprehensively represent pH variability in coastal
ecosystems, partly because we focused on surface epipelagic and shallow benthic pH variability. Many organisms that may be
impacted by pH variability and ocean acidification reside at intermediate (>10 m) to abyssal depths. Notable regimes missing
from Figure 2 include seasonally stratified open ocean locations that exhibit intense spring blooms; the equatorial upwelling
zone; other temperate (and highly productive) Eastern Continental Boundary upwelling areas; subsurface oxygen minimum
zones and seasonal dead zones; and a wide variety of unique estuarine, salt marsh, and tide pool environments. Spring bloom
locations exhibit a marked increase in diel pCO2 variability during the peak bloom with a coincident drawdown similar in
magnitude but opposite in sign to the upwelling signals shown in Figure 2 [39]. Equatorial upwelling locations undergo
significant stochastic variability, as observed by pCO2 sensors in the TAO array (data viewable at
http://www.pmel.noaa.gov/). Intertidal vegetated and tide pool habitats may exhibit major pH fluctuations due to macrophyte
or animal respiratory cycles [15], while CO2 production in oxygen minimum zones can reduce pH to a limit of about 7.4 [40].
Due to local temperature differences, variable total alkalinity, and seasonal differences between deployment dates at each site,
a comparison of average pH across the datasets would be somewhat misleading. However, some information can be gleaned
from an examination of the averages: the overall binned average of all 15 mean values in Table 1 is 8.02±0.1. This pH value is
generally in agreement with the global open ocean mean for 2010 of 8.07, a value generated by combining climatology data for
temperature, salinity, phosphate, silicate [41]–[43], total alkalinity [44], and pCO2 [45] for the year 2000, corrected to 2010
using the average global rise of 1.5 µatm pCO2 yr−1. Rather than make a point-by-point comparison of the mean pH of each
dataset, we focus instead on the differences in observed variability amongst the sites. For this analysis, summary statistics of
the comparative datasets were ranked in order to examine the range of variability across all 15 sites (Fig. 3). Discussion:
Collected by 15 individual SeaFET sensors in seven types of marine habitats, data presented here
highlight natural variability in seawater pH. Based on Figure 3, it is evident that regions of the ocean exhibit a
continuum of pH variability. At sites in the open ocean (CCE-1), Antarctica, and Kingman reef (a coastal region in the
permanently stratified open Pacific Ocean with very low residence times, and thus representative of the surrounding open
ocean water), pH was very stable (SD<0.01 pH over 30 days). Elsewhere, pH was highly variable across a range
of ecosystems where sensors were deployed. The salient conclusions from this comparative dataset are two-
fold: (1) most non-open ocean sites are indeed characterized by natural variation in seawater chemistry that can now be
revealed through continuous monitoring by autonomous instrumentation, and (2) in some cases, seawater in these sites
reaches extremes in pH, sometimes daily, that are often considered to only occur in open
ocean systems well into the future [46]. Admittedly, pH is only part of the story with regard to the biological
impacts of OA on marine organisms. However, continuous long-term observations provided by sensors such as the SeaFET are
a great first step in elucidating the biophysical link between natural variation and physiological capacity in resident marine
organisms. In the end, knowledge of spatial and temporal variation in seawater chemistry is a critical resource for biological
research, for aquaculture, and for management efforts. From a biological perspective, the evolutionary history of the resident
organisms will greatly influence the adaptation potential of organisms in marine populations. Thus, present-day
natural variation will likely shape capacity for adaptation of resident organisms, influencing
the resilience of critical marine ecosystems to future anthropogenic acidification. Below we
discuss the comparative SeaFET-collected data and, where applicable, the biological consequences of the temporal
heterogeneity that we found in each of the marine ecosystems where sensors were deployed. As the most stable area, the open
ocean behaves in a predictable way and generally adheres to global models attempting to predict future CO2 conditions based
on equilibration of the surface ocean with a given atmospheric pCO2 (e.g. [47]). This can be shown with longer-term pH
records obtained with SeaFET sensors, which are available at the CCE-1 mooring (Fig. 4). The ambient pH values for this open
ocean location can be predicted to better than ±0.02 from the CO2-corrected climatology mentioned above; pH has dropped by
about 0.015 units since 2000. At CCE-1, the annual carbonate cycle followed the sea surface temperature cycle, and pH was
driven mostly by changes in the temperature dependence of CO2 system thermodynamics (Figure 4). SeaFET observations at
CCE-1 agree with the climatology to +0.017±0.014 pH units, with episodic excursions from the climatology but a general
return to the climatological mean. Although the annual cycle in the open ocean is somewhat predictable, it is notable that even
at these seemingly stable locations, climatology-based forecasts consistently underestimate natural variability. Our
observations confirm an annual mean variability in pH at CCE-1 of nearly 0.1, suggest an inter-annual variability of ~0.02 pH,
and capture episodic changes that deviate from the climatology (Figure 4). Similar underestimates of CO2 variability were
observed at nine other open ocean locations, where the Takahashi pCO2 climatology overlaps PMEL moorings with pCO2
sensors (not shown). Thus, on both a monthly (Fig. 2) and annual scale (Fig. 4), even the most stable open
ocean sites see pH changes many times larger than the annual rate of acidification. This natural
variability has prompted the suggestion that “an appropriate null hypothesis may be , until
evidence is obtained to the contrary, that major biogeochemical processes in the oceans other than
calcification will not be fundamentally different under future higher CO2/lower pH conditions” [24].
Similarly, the sensors deployed on the benthos in the Antarctic (Cindercones and Cape Evans, Figure 2B) recorded relatively
stable pH conditions when compared to other sites in the study. Very few data exist for the Southern Ocean; however, open-
water areas in this region experience a strong seasonal shift in seawater pH (~0.3–0.5 units) between austral summer and
winter [48], [49] due to a decline in photosynthesis during winter and a disequilibrium of air-sea CO2 exchange due to annual
surface sea ice and deep water entrainment [50]. Given the timing of deployment of our sensor in McMurdo Sound (austral
spring: October–November), the sensor did not capture the change in seawater chemistry that might have occurred in the
austral winter [49]. In general, due to sea ice conditions, observations from the Southern Ocean are limited, with water
chemistry data falling into two categories: (1) discrete sampling events during oceanographic cruises (e.g. US Joint Global
Ocean Flux Study, http://www1.whoi.edu/) and (2) single-point measurements from locations under sea ice [49], [51], [52].
Biologically speaking, the Southern Ocean is a region expected to experience acidification and undersaturated conditions
earlier in time than other parts of the ocean [47], and calcifying Antarctic organisms are thought to be quite vulnerable to
anthropogenic OA given the already challenging saturation states that are characteristic of cold polar waters [53]–[56]. Short-
term CO2 perturbation experiments have shown that Antarctic calcifying marine invertebrates are sensitive to decreased
saturation states [51], [57], although the number of species-level studies and community-level studies are very limited. The
Western Antarctic Peninsula and the sub-Antarctic islands will experience pronounced increases in temperature [54] and
could consequently undergo more variation and/or undersaturation given the increased potential for biological activity.
Importantly, depending on the patterns of seasonally-dependent saturation state that will be revealed with improved
observations [58], Antarctic organisms may experience more variation than might be expected, a situation that will
influence their resilience to future acidification. Three other types of study sites – the coastal
upwelling, kelp forest and estuarine/near-shore sites – all exhibited variability due to a combination of mixing, tidal
excursions, biological activity, and variable residence time (Fig. 2). Although these sites are all united by fairly obvious
heterogeneity in pH, organisms living in these areas encounter unique complexities in seawater
chemistry that will influence their physiological response, resilience, and potential for
adaptation. Typically, estuarine environments have riverine input that naturally creates very low saturation states [59]–
[61]. Seawater chemistry conditions in these areas often shift dramatically, challenging biogenic calcification by resident
organisms. Additionally, these species must also tolerate abiotic factors that interact with pH, such as temperature [62]. Two
sensors in the Monterey Bay region, L1 (at the mouth of Elkhorn Slough) and L20 (~2 km seaward and north of L1), recorded
rapid changes in pH. However, as opposed to riverine input, the low pH fluctuations observed here are likely due to isopycnal
shoaling or low CO2 water that is pulsing up to the near shore on internal tides. These locations may also experience high river
run-off in the rainy season, but such conditions were not reflected in the time series shown in Fig. 2. Organisms living in
upwelling regions may be acclimatized and adapted to extremes in seawater chemistry ; here, deep
CO2-enriched waters reach the surface and may shoal onto the benthos on the continental shelf [31], [32]. Data collected
from our upwelling sites support the patterns found by cruise-based investigations; pH
fluctuations were often sharp, and large transitions of up to ~0.35 pH units occurred over the
course of days (Fig. 2). Laboratory studies on calcifying marine invertebrates living in upwelling regions suggest that
these organisms maintain function under such stochastic conditions. However, overall performance may be reduced,
suggesting that these species are indeed threatened by future acidification [17], [18], [63]. For
kelp forests, although
there is less influence from riverine inputs, pH variation is quite dynamic at these sites in the coastal California
region (Fig 2; [18]). Patterns here are likely driven by fluctuations in coastal upwelling, biological
activity, currents, internal tides, seasonally shoaling isopleths, as well as the size of the kelp forest, which
may influence residence times via reduced flow. Kelps may respond positively to increased availability of
CO2 and HCO3−, which may allow for reduced metabolic costs and increased productivity [64].
Increased kelp production may elevate pH within the forest during periods of
photosynthesis, causing wider daily fluctuations in pH, though this is speculative at this time. As a result,
kelp forests, particularly those of surface canopy forming species such as Macrocystis pyrifera, may contain a greater level of
spatial heterogeneity in terms of the pH environment; vertical gradients in pH may form due to enhanced levels of
photosynthesis at shallower depths. Such gradients may increase the risk of low pH exposure for benthic species while
buffering those found within the surface canopy. Kelp forests provide habitat to a rich diversity of organisms from a wide
range of calcifying and non-calcifying taxa [65]. As with organisms from the other coastal locations (estuarine and upwelling),
the biota living within kelp forest environments are most likely acclimatized to this degree of natural variation. However,
continued declines in oxygenation and shoaling of hypoxic boundaries observed in recent decades in the southern California
bight [66], [67] are likely accompanied by a reduction in pH and saturation state. Thus, pH exposure regimes for the coastal
California region's kelp forest biota may be changing over relatively short time scales. Over longer temporal scales as pH and
carbonate saturation levels decrease, the relative abundances of these species may change, with community shifts favoring
non-calcified species, as exemplified by long-term studies in intertidal communities by Wootton et al. [15]. For all the marine
habitats described above, one very important consideration is that the extreme range of environmental variability does not
necessarily translate to extreme resistance to future OA. Instead, such a range of variation may mean that the organisms
resident in tidal, estuarine, and upwelling regions are already operating at the limits of their physiological tolerances (a la the
classic tolerance windows of Fox – see [68]). Thus, future acidification, whether it be atmospheric or from other sources, may
drive the physiology of these organisms closer to the edges of their tolerance windows. When environmental change is layered
upon their present-day range of environmental exposures, they may thereby be pushed to the “guardrails” of their tolerance
[20], [68]. In contrast to more stochastic changes in pH that were observed in some sites, our coral reef locations displayed a
strikingly consistent pattern of diel fluctuations over the 30-day recording period. Similar short-term pH time series with
lower daily resolution [69], [70] have reported regular diel pH fluctuation correlated to changes in total alkalinity and oxygen
levels. These environmental patterns of pH suggest that reef organisms may be acclimatized to consistent but moderate
changes in the carbonate system. Coral reefs have been at the center of research regarding the effects of OA on marine
ecosystems [71]–[73]. Along with the calcification biology of the dominant scleractinian corals and coralline algae, the
biodiversity on coral reefs includes many other calcifying species that will likely be affected [74]–[77]. Across the existing
datasets in tropical reef ecosystems, the biological response of calcifying species to variation in seawater chemistry is complex
(see [78]) –all corals or calcifying algal species will not respond similarly, in part because these calcifying reef-builders are
photo-autotrophs (or mixotrophs), with algal symbionts that complicate the physiological response of the animal to changes in
seawater chemistry. Finally, the
“Extreme” sites in our comparative dataset are of interest in that the low pH
levels observed here represent a natural analogue to OA conditions in the future , demonstrating how
the abundance and distribution of calcifying benthic organisms, as well as multi-species assemblages, can vary as a function of
seawater chemistry [16], [35], [36], [79]. The variability in seawater pH was higher at both the groundwater springs off the
coast of Mexico and the natural CO2 vents off the coast of Italy than at any of the other sensor locations. Offshore of Puerto
Morelos, Mexico (and at other sites along the Mesoamerican Reef), natural low-saturation (Ω~0.5, pH 6.70–7.30, due to non-
ventilated, high CO2, high alkalinity groundwater) submarine springs have been discharging for millennia. Here, variability in
pH is due to long-term respiration driving a low ratio of alkalinity to dissolved inorganic carbon in effluent ground water.
These sites provide insight into potential long-term responses of coral backreef ecosystems to low saturation conditions [79].
Unlike Puerto Morelos, the variability of pH at volcanic CO2 vents at Ischia, Italy is almost purely abiotically derived, due
entirely to CO2 venting and subsequent mixing. This site in the Mediterranean Sea hosts a benthic assemblage that reflects the
impacts of OA on rocky reef communities [16], [36]. Overall, the
‘extreme’ systems provide an opportunity to
examine how variability in pH and extreme events (sensu [80]) affects ecological processes.
Knowledge of this biophysical link is essential for forecasting ecological responses to
acidification in ecosystems with sharp fluctuations in pH, such as upwelling or estuarine environments. Despite
reductions in species richness, several calcifying organisms are found in low pH
conditions close to the vents [16] and the springs [79]. The persistence of calcifying organisms at these
extreme sites, where mean pH values are comparable to those that have reduced organism performance in laboratory
experiments (i.e., pHT 7.8; reviewed in [16]), suggest
that long exposures to such variability in pH, versus a
consistently low-pH environment, could play an important role in regulating organism performance.
Variability in pH could potentially promote acclimatization or adaptation to acidification through
repeated exposure to low pH conditions [24]; alternatively, transient exposures to high pH
conditions could buffer the effects of acidification by relieving physiological stress. Thus, the
ecological patterns coupled with the high fluctuations in pH at the extreme sites highlight the need to consider carbonate
chemistry variability in experiments and models aimed at understanding the impacts of acidification.
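
The variability metric the study reports (a mean of |ΔpH| divided by elapsed time across adjacent samples) is simple enough to sketch. The function name and the sample readings below are illustrative only, not the authors' code or data:

def mean_rate_of_change(times_hr, ph):
    """Mean of |pH difference| / elapsed hours across adjacent samples,
    i.e. the 'mean instantaneous rate of change in pH per hour' described above."""
    rates = [abs(ph[i + 1] - ph[i]) / (times_hr[i + 1] - times_hr[i])
             for i in range(len(ph) - 1)]
    return sum(rates) / len(rates)

# Hypothetical hourly readings from a variable coastal site (made-up values)
times = [0, 1, 2, 3, 4]
ph_readings = [8.05, 8.01, 7.92, 7.98, 8.07]
print(f"{mean_rate_of_change(times, ph_readings):.2f} pH units per hour")  # -> 0.07 pH units per hour

On these made-up numbers, a single day of coastal variability spans far more pH change than the roughly −0.0017 pH per year anthropogenic trend cited in the card accumulates over decades, which is the comparison the authors themselves draw.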

Ocean Acidification Isn’t Real – Data Is Incorrectly Gathered

Ball, PhD in Science, ’11 (Tim, 5/19/11, Dr. Tim Ball: A Different Perspective, “Analysis of Alarmism: Ocean Acidification”, http://drtimball.com/2011/analysis-of-alarmism-ocean-acidification/, 6/29/14, AEG)

The claim of ocean acidification is based on
estimates and computer models; these use the very questionable pre-industrial
atmospheric level of CO2 to calculate an increase of about 0.1 pH units. Of course, the
Intergovernmental Panel on Climate Change (IPCC) attributes the CO2 increase to human
production, which is wrong because the global carbon cycle is very vague about sources,
storage and length of time in each condition. For example, the error in the estimate of CO2
from the oceans each year is greater than the total human contribution. The idea that a 0.1
pH unit increase is significant is ludicrous when the estimate has a range of 0.3 units. There
is a subtle but important point here, because words are part of the scare component. Even if
you accept the claimed change, it is not acidification; it is proper to say the solution is becoming less alkaline, but that doesn’t sound threatening. More problematic is the validity of the measures. Although pH in seawater has been measured for many decades, a reliable
long-term trend of ocean water pH cannot be established due to data quality issues, in
particular the lack of strict and stable calibration procedures and standards. Moreover,
seawater pH is very sensitive to temperature, and temperature is not always recorded or
measured at sufficient accuracy to constrain the pH measurement. Even if CO2 increases to
560 ppm by 2050 as the IPCC predict, it would only result in a 0.2 unit reduction of pH. This
is still within the error of the estimate of global average.
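
For context on the units Ball is disputing, a minimal arithmetic sketch (my own, not part of his article): because pH is the negative base-10 logarithm of hydrogen-ion activity, a fixed drop in pH corresponds to a fixed percentage rise in hydrogen-ion concentration, whatever the starting value:

def h_ion_increase(delta_ph):
    """Fractional increase in [H+] for a drop of delta_ph pH units (pH = -log10[H+])."""
    return 10 ** delta_ph - 1

for drop in (0.1, 0.2, 0.3):
    print(f"pH drop of {drop:.1f} -> [H+] up about {h_ion_increase(drop) * 100:.0f}%")
# pH drop of 0.1 -> [H+] up about 26%
# pH drop of 0.2 -> [H+] up about 58%
# pH drop of 0.3 -> [H+] up about 100%

Whether changes of that size matter biologically is exactly what the cards in this section dispute; the sketch only translates the disputed pH figures into concentration terms.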

Climate Change and Ocean Acidification Inevitable – the plan does not solve worldwide emissions; US gas-driven cuts just export them

Marshall, New Scientist Environmental Editor, ‘12 (Michael, August 2012, New Scientist, “Lowest US Carbon Emissions Won’t Slow Climate Change”, http://www.newscientist.com/article/dn22196-lowest-us-carbon-emissions-wont-slow-climate-change.html#.U7CiKJSwLHg, 6/29/14, AEG)
It looks like good news, but it's not. The US has recorded a sharp fall in its greenhouse gas
emissions from energy use. Thanks to a rise in the use of natural gas, emissions are at their
lowest since 1992. The fall will boost the natural gas industry, but in reality the emissions
have simply been exported. According to the US Energy Information Administration (EIA),
energy-related CO2 emissions in the first quarter of 2012 were the lowest in two decades.
Emissions are normally high between January and March because people use more heating
in the winter, but last winter was mild in the US. The EIA says that an increase in gas-fired
power generation, and a corresponding decline in coal-fired, contributed to the fall in
emissions. Burning natural gas produces fewer emissions than burning coal, and natural gas
is currently unusually cheap in the US thanks to a glut of shale gas extracted by hydraulic
fracturing or "fracking". If gas companies continue to expand their shale gas operations, the
US could generate even more electricity from gas, and its emissions could fall for several
years, says Kevin Anderson of the University of Manchester, UK. However, this will not slow
down climate change. US coal consumption has fallen, but production is holding steady and
the surplus is being sold to Asia. As a result, the US is effectively exporting the coal-related
emissions. "Gas is less bad than burning the coal, but only if you keep the coal in the
ground," Anderson says. Proponents of natural gas argue that it is a "transition fuel" that we
can burn for a few years while we install low-carbon infrastructure such as wind farms and
nuclear power stations.

No ocean acidification

Duarte et al. 09 – (11/24/09, Carlos M. Duarte, research professor with the Spanish Council for Scientific Research at IMEDEA, I.E. Hendriks, and M. Álvarez, Department of Global Change Research, IMEDEA (CSIC-UIB), Instituto Mediterráneo de Estudios Avanzados, “Vulnerability of marine biodiversity to ocean acidification: A meta-analysis,” Estuarine, Coastal and Shelf Science, Volume 86, Issue 2, 20 January 2010, Pages 157-164, ScienceDirect)

In summary, our analysis shows that marine biota is more resistant to ocean acidification than
suggested by pessimistic predictions identifying ocean acidification as a major threat to marine
biodiversity ([Kleypas et al., 1999], [Orr et al., 2005], [Raven, 2005], [Sponberg, 2007] and [Zondervan et al., 2001]), which
may not be the widespread problem conjured into the 21st century . Ocean acidification will
enhance growth of marine autotrophs and reduce fertility and metabolic rates, but effects are
likely to be minor along the range of pCO2 predicted for the 21st century, and feedbacks between
positive responses of autotrophs and pH may further buffer the impacts. Particularly sensitive processes like calcification
may be affected, while bivalves seem to be most vulnerable to changes in ambient pH. Modellers and
chemical oceanographers need to improve their predictions on the impacts of ocean
acidification by incorporating natural variability in pCO2 in marine waters, the small-scale
physical processes that detach the organismal chemosphere from the bulk water properties and
the potential for homeostasis resulting from active processes at the cellular level. The predictions
need also consider how the gradual changes conducive to the changes in pH expected during the 21st century may depart from the
impacts extrapolated from experiments involving the sudden exposure of organisms to reduced pH. Ocean acidification needs be
carefully monitored and its effects better understood, while especially synergistic effects and complex interactions between
acidification and other stressors need to be studied, as these synergies may amplify the otherwise limited impacts of ocean
acidification. Science and society should not forget other major threats to marine biodiversity like overfishing, habitat destruction,
increased nutrient inputs and associated oxygen depletion and warming ([Dobson et al., 2006], [Jackson et al., 2001], [Kennish, 2002], [Thomas et al., 2004] and [Valiela, 2006]). The
attention that ocean acidification as a sole threat to marine
biodiversity has drawn recently might not be fully justified concerning the limited impact of
experimental acidification on organism processes as shown by the meta-analysis presented
here.
