The Sun's Role In Global Warming
August 5, 1999
Those who promote the controversial theory that the Earth is getting warmer due to the burning of fossil fuels are overlooking an important factor, scientists say: the changing nature of the sun.
Computer simulations say the "greenhouse effect" should have raised the global temperature by about one degree Celsius -- or 1.8 degrees Fahrenheit -- over the past 100 years. In fact, temperatures rose only 0.5 degrees C over the century, peaking before 1940. They then declined until the 1970s and have since risen a modest 0.2 degrees C. Experts point out that because more than 80 percent of manmade carbon dioxide has entered the air since the 1940s, the earlier warming of 0.5 degrees C must be natural.
Sunspots may very well be that natural factor:
- The sun is brightest during peak sunspot periods, which recur every 11 years or so; because the sun's magnetic polarity reverses with each cycle, the full magnetic cycle runs about 22 years.
- Studies show that changes in the length of the cycle correlate closely with changes in North American land temperatures over three centuries.
- If recent NASA data are indicative, changes in the sun's magnetism track changes in the sun's brightness.
- If this is so, changes in the sunspot cycle would explain the average temperature change of 0.5 degrees Celsius over the past 100 years.
Changes in the cycle also correspond to temperature variations over many millennia. Sunspots were rare from about 1640 to 1710 -- a period that coincides with the coldest century of the millennium.
While direct temperature data are lacking for earlier centuries, the role of sunspots in climate change can be inferred from historical fluctuations in wheat prices. In periods of few sunspots, the climate gets colder. Indeed, five prolonged periods of few sunspots coincide with periods of higher wheat prices in England.
Source: Sallie Baliunas (George C. Marshall Institute, Mt. Wilson Observatory), "Why So Hot? Don't Blame Man, Blame the Sun," Wall Street Journal, August 5, 1999.