A dearth of bright spots on the sun might have contributed to a frigid period known as the "little ice age" in the middle of the past millennium, researchers suggest.
From the 1500s to the 1800s, much of Europe and North America was plunged into what came to be called the little ice age. The coolest part of this cold spell coincided with a 75-year period beginning in 1645 when astronomers detected almost no sunspots, a time now referred to as the Maunder Minimum.
Past studies had mulled over whether the decreased solar activity seen during the Maunder Minimum might have helped cause the little ice age. Although sunspots are cool, dark regions on the sun, their absence suggests there was less solar activity in general. Now scientists suggest that during that time the sun might also have had fewer intensely bright spots, known as faculae, potentially reducing its brightness enough to cool the Earth.
The dip in the number of faculae in the 17th century might have dimmed the sun by just 0.2 percent, which may have been enough to help trigger a brief, radical climate shift on Earth, researcher Peter Foukal, a solar physicist at research company Heliophysics in Nahant, Mass., told LiveScience.
"The sun may have dimmed more than we thought," Foukal said.
Foukal emphasized this dimming might not have been the only or even main cause of the cooling seen during the little ice age. "There were also strong volcanic effects involved — something like 17 huge volcanic eruptions then," he said.
Foukal also cautioned these findings regarding the sun did not apply to modern-day global warming. "Increased solar activity would not have anything to do with the global warming seen in the last 100 years," he explained.
Foukal and his colleagues detailed their findings May 27 at the American Astronomical Society meeting in Boston.