George C. Marshall Institute / Cooler Heads Coalition
August 7, 1998, Washington, DC

Hot Times or Hot Air: The Sun in the Science of Global Warming*

Sallie Baliunas, Ph.D.; Senior Scientist, George C. Marshall Institute; Staff Astrophysicist, Harvard-Smithsonian Center for Astrophysics

The extent of human effects on global climate change remains a highly complex scientific matter, as the United Nations Intergovernmental Panel on Climate Change (IPCC) made clear in its last report: "Our ability to quantify the human influence on global climate is limited because the expected signal is still emerging from the noise of natural variability [NB: The signal has not yet been detected], and because there are uncertainties in key factors [NB: like natural factors]." (IPCC 1995, p. 5)

Given the complexities of climate, how can forecasts of climate change from increased greenhouse gases 100 years into the future be checked? One starts by noting that the increase in the different greenhouse gases in the atmosphere due to human activities over the last 100 years is effectively equal to roughly a 50% increase in the atmospheric concentration of carbon dioxide alone. That 50% effective increase in carbon dioxide concentration gives a way to test the accuracy of the computer forecasts of climate change simply by measuring the response of the climate to the increase in greenhouse gases that has already occurred.

1. Tests of climate projection scenarios against measurements

For example, all the IPCC 1995 scenarios make specific forecasts for warming that should have already occurred, including: (i) global warming of as much as 1 C; (ii) extremely large and rapid warming in the North Polar Region. Both forecasts will be discussed.
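The claim that the realized greenhouse-gas increase already constitutes a meaningful test can be made quantitative with a common logarithmic approximation for carbon dioxide forcing from the literature (the coefficient 5.35 W/m^2 is an assumption of this sketch, not a figure given in the talk):

```python
import math

def co2_forcing(c_ratio):
    """Simplified radiative forcing for a CO2 concentration ratio C/C0,
    using the common logarithmic approximation dF = 5.35 * ln(C/C0) W/m^2
    (a standard approximation, not a number quoted in the lecture)."""
    return 5.35 * math.log(c_ratio)

# Effective 50% increase already realized vs. a full doubling:
f_now = co2_forcing(1.5)   # ~2.2 W/m^2
f_2x  = co2_forcing(2.0)   # ~3.7 W/m^2, close to the ~4 W/m^2 cited later
print(f"50% increase: {f_now:.2f} W/m^2")
print(f"doubling:     {f_2x:.2f} W/m^2")
print(f"fraction of the doubling forcing already realized: {f_now / f_2x:.0%}")
```

Because the forcing is logarithmic, the effective 50% increase already supplies well over half the forcing of a full doubling, which is why the observed record can serve as a check on the forecasts.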
(i) Global temperature records — According to thermometer records collected near the surface, over land and sea, in the last 100 years, as well as temperature estimates from proxy records, the temperature averaged from different parts of the world rose about 0.5 C since the late 19th century [Figure 1]. Although global warming has occurred, its cause is an issue of scientific study. As noted above, the 100-year instrument record is important because it covers the period in which there has been an effective 50% increase in carbon dioxide. The computer scenarios say that there should already be a warming of as much as 1 C in the globally averaged temperature as a result of increased greenhouse gases in the atmosphere.

At first glance it seems that the observed warming occurred due to the increases in the minor greenhouse gases in the last 100 years and is good evidence for the validity of projections of global warming from human activities. That conclusion is insupportable for several reasons. First, at least half of the warming seen in the surface temperature record occurred before about 1940, while most of the greenhouse gas increase in the atmosphere occurred after 1940. That means that most of the temperature rise of the last 100 years occurred before the substantial increases in greenhouse gases in the atmosphere from human activities. Of the 0.5 C rise observed, at most only one or two tenths of a degree can be attributed to increased greenhouse gases.

A second problem is the uncertainty of the urban-heat-island effect in the temperature records: temperature measurements made in growing, modern cities can read excess warmth due to the effects of machinery, pavement, tree-cutting, etc. A correction, as best as possible, can be made, but the process can introduce a systematic error into the average record. Another uncertainty comes from uneven coverage in the surface records.
Good records with near-continual coverage for the last 100 years apply to only 18% of the surface. Coverage of the surface is better today; still, the Polar Regions as well as vast areas of the southern and tropical oceans are not adequately sampled. Beginning in 1978, NASA satellites were launched whose measurements have yielded a record of nearly globally-averaged temperature [Figure 2]. The measurements are made of the lower troposphere and were discussed in a series of lectures by Dr. John Christy. The forecasts claim the increase in atmospheric greenhouse gases is now so large that the global temperature should be rising about 0.2 C per decade. But the satellite readings (validated by independent balloon experiments) show that the temperature has not changed at all in the last 19 years in response to the increased greenhouse gases.

An uncertainty of all instrumental records is their shortness, which limits our ability to compare the warming of the early 20th century with natural fluctuations of the climate during other periods. No worldwide instrumental records go back further than the mid-19th century, but there are records or reconstructions of several regions going back further. For example, the natural variability of temperature in the mid-latitude Atlantic Ocean over the last 3000 years has been reconstructed from ocean-bottom sediments [Figure 3]. The longer perspective, hidden by the bias of the shortness of the instrumental records, shows substantial natural variability of temperature. Evident are periods like the Little Ice Age, reaching its coldest during the 17th and 18th centuries, and the Medieval Climate Optimum, a warming in the 10th - 11th centuries. That record does not stretch as far back as the Holocene Optimum, 6500 years ago, the warmest interval of the last 10,000 years after the end of the last major ice age. The record of natural variability indicates that the warming of the early 20th century is not unusual, either in amplitude or speed.
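The comparison of a forecast 0.2 C per decade against a flat 19-year record is a simple least-squares trend fit. As a sketch of the computation only, here is the fit applied to synthetic monthly anomalies with no underlying trend (illustrative noise, not the actual satellite data):

```python
import random

def trend_per_decade(years, temps):
    """Ordinary least-squares slope of temperature vs. time,
    scaled to degrees C per decade."""
    n = len(years)
    my = sum(years) / n
    mt = sum(temps) / n
    slope = (sum((y - my) * (t - mt) for y, t in zip(years, temps))
             / sum((y - my) ** 2 for y in years))
    return slope * 10.0

# Illustrative only: synthetic monthly anomalies standing in for a flat
# 19-year record; the real MSU series is not reproduced here.
random.seed(0)
years = [1979 + m / 12 for m in range(12 * 19)]
temps = [random.gauss(0.0, 0.2) for _ in years]  # noise about zero, no trend

print(f"fitted trend: {trend_per_decade(years, temps):+.2f} C/decade")
# A record consistent with the forecasts would instead fit near +0.2 C/decade.
```

The fitted slope on trendless noise comes out near zero, which is the sense in which the text says the satellite record contradicts the forecast trend.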
Neither the surface records nor the lower tropospheric temperature records show the substantial increasing warming trends expected from increased greenhouse gases, in contradiction to the projections from the computer scenarios.

(ii) Temperature Records of the Arctic — According to the computer forecasts, the polar areas are very sensitive to global warming. The forecasts say that the Arctic region should have warmed by as much as 2 - 4 C in the last 50 years in response to increased greenhouse gases in the atmosphere. Instrumental measurements contradict the intense warming trend for that region projected by the computer scenarios. On average over the last 40 years, the temperature record does not show an increasing warming trend. A longer-term view of the lower Arctic [Figure 4] comes from proxy records like tree-ring growth. In that record there is rapid warming in the early 20th century, but it began before 1920 and, again, must be largely unrelated to increased greenhouse gases. Since 1950, that record suggests that the Arctic has cooled despite the increases in greenhouse gases. In the test of the Arctic temperature record, the computer forecasts exaggerate the expected warming, this time by more than ten-fold.

As a result of looking at the temperature information, one concludes two things: (1) the climate scenarios exaggerate the warming that should have already occurred (and likely also the future warming) as a result of increased greenhouse gases; and (2) most of the warming this century cannot have been caused by increased greenhouse gases, because the warming predates the greatest increases in the greenhouse gases.

2. What's wrong with the computer forecasts?

The calculation of the response of the climate system to the energy input from increases in greenhouse gases is difficult because the climate system is a complex nonlinear system that is not well understood. The most sophisticated of the computer simulations track five million parameters.
Understanding of the physics of some of the processes (e.g., upper tropospheric water vapor) is incomplete, and the measured values of some parameters are poor. To resolve the climate system fully, the computer scenarios would need to span 16 orders of magnitude in spatial scale and include ten million trillion degrees of freedom; they presently cannot. The computer programs calculate the effect of adding a small amount of energy, 4 W m-2, to a simulated climate. That amount roughly corresponds to a doubling of the atmospheric concentration of carbon dioxide. But the uncertainty in calculating, e.g., the effects of humidity, clouds, and the flow of energy from the equatorial to the polar regions may each be at least five times larger. Such errors can give rise to area-by-area "flux adjustments" up to 25 times larger in some regions of several coupled ocean-atmosphere simulations. When the unknowns in the climate simulations are more than an order of magnitude larger, it seems difficult to accept the calculated response of the climate to the small, added energy expected from doubling carbon dioxide.

Water vapor — Compared to the temperature records, the computer projections exaggerate the warming that should have appeared. Why? The scenarios assume that the small warming from increases in the minor greenhouse gases like carbon dioxide is amplified by water vapor in the upper atmosphere. That is, water vapor is assumed to provide a strong, positive feedback for any minor warming. This assumption has been challenged by both developments in convection theory and new measurements. When the gain in temperature due to water vapor feedback is removed, doubling carbon dioxide yields roughly only 1 C warming or less, which is more consistent with the observed trends in the surface temperature records.

3. Natural factors of climate change: the sun

Another reason for exaggerated forecasts may rest in incomplete knowledge of natural variations, as the IPCC suggests.
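The "roughly 1 C without the water-vapor gain" figure can be checked on the back of an envelope. The sketch below assumes the textbook Planck (no-feedback) restoring response of about 3.2 W/m^2 per degree C, a standard value not stated in the talk:

```python
# Back-of-envelope: warming from doubled CO2 with and without an assumed
# water-vapor amplification. PLANCK_RESPONSE is a textbook value (assumed
# here), not a number taken from the lecture.
FORCING_2XCO2 = 4.0      # W/m^2, the doubling forcing quoted in the text
PLANCK_RESPONSE = 3.2    # W/m^2 per C, standard no-feedback restoring term

dT_no_feedback = FORCING_2XCO2 / PLANCK_RESPONSE
print(f"no-feedback warming for doubled CO2: {dT_no_feedback:.2f} C")

# A strong positive water-vapor feedback multiplies this by a gain factor;
# a gain of roughly 2-3 yields the larger warmings the scenarios project.
for gain in (2.0, 3.0):
    print(f"with assumed gain {gain:.0f}: {dT_no_feedback * gain:.1f} C")
```

Dividing the 4 W m-2 forcing by the no-feedback response gives about 1.2 C, consistent with the "roughly 1 C or less" stated above once the water-vapor gain is removed.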
One such natural factor may be changes in the brightness of the sun over decades and centuries. The magnetism on the sun's surface varies roughly every 11 years [Figure 5]. According to recent findings based on measurements from NASA satellites, the sun brightens and fades in its total energy output, in step with the 11-year cycle of magnetism. Not only the sun's brightness, but also its flux of high-energy particles that bombard the earth, changes with the surface magnetism of the sun. The observed association of brightness change with magnetic change over nearly two decades is the basis for thinking that changes in the sun's brightness occur over many decades, along with recorded changes in magnetism. If significant, brightness changes over decades could drive non-trivial global temperature change.

The climate record indicates a solar influence of this kind. An example [Figure 6] is the record of the sun's magnetism and reconstructed land temperatures of the Northern Hemisphere over 240 years. The two curves are highly correlated over several centuries. Those changes in the sun's magnetism may track changes in the sun's brightness, for which direct measurements are lacking. Assuming that the timing of the sun's magnetic changes is a proxy for the sun's changing brightness, computer simulations of the earth's climate suggest that changes of 0.4% in the sun's brightness could have produced global average temperature changes of about 0.5 C over the last 100 years.

Additional evidence points to the sun's signature in the climate record over many millennia. Every few centuries the sun's magnetism weakens to low levels sustained for several decades. An example is the magnetic low circa 1640 - 1720, when sunspots were rare and the sun's magnetic field weak. That period coincided with a climate cooling called the Little Ice Age, when the global temperature is estimated to have been roughly 1 C colder than today.
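The cited 0.4% brightness change can be converted into a climate forcing comparable with the greenhouse figures earlier in the talk. The conversion below uses standard assumptions not stated in the lecture: a mean total solar irradiance near 1366 W/m^2, a planetary albedo of about 0.3, and the geometric factor of 4 from spreading the intercepted beam over the whole sphere:

```python
# Rough conversion of a 0.4% change in the sun's brightness into a
# globally averaged climate forcing. TSI and ALBEDO are assumed standard
# values, not numbers given in the talk.
TSI = 1366.0      # W/m^2, assumed mean solar constant
ALBEDO = 0.3      # assumed planetary albedo

delta_tsi = 0.004 * TSI                         # change at the top of the atmosphere
delta_forcing = delta_tsi / 4.0 * (1 - ALBEDO)  # absorbed, globally averaged

print(f"change in solar irradiance:  {delta_tsi:.1f} W/m^2")
print(f"equivalent climate forcing:  {delta_forcing:.2f} W/m^2")
# Compare with the ~4 W/m^2 attributed to doubled CO2 earlier in the talk.
```

Under these assumptions the 0.4% brightness change corresponds to roughly 1 W/m^2 of forcing, a substantial fraction of the doubling forcing, which is why the talk treats it as capable of producing a few tenths of a degree of global temperature change.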
Quantitative records of the sun’s magnetism over millennia come from measurements of the isotopes radiocarbon (14C, in tree rings) and l