Climate change reaches all the way to the bottom of the sea.
The same greenhouse gas emissions that are causing the planet's climate to change are also causing the seafloor to dissolve. And new research has found that the ocean bottom is dissolving away faster in some places than in others.
The ocean is what's known as a carbon sink: It absorbs carbon dioxide from the atmosphere, and that carbon dioxide acidifies the water. In the deep ocean, where the pressure is high, this acidified seawater reacts with calcium carbonate from the shells of dead creatures. The reaction neutralizes the carbon dioxide, converting it into bicarbonate.
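The reaction described above is the standard carbonate dissolution reaction; written out as a balanced chemical equation, it looks like this:

```latex
\[
\mathrm{CO_2 + H_2O + CaCO_3 \;\longrightarrow\; Ca^{2+} + 2\,HCO_3^{-}}
\]
```

Each molecule of carbon dioxide consumed dissolves one unit of solid calcium carbonate and yields two bicarbonate ions, which is why more dissolved CO2 translates directly into a faster loss of carbonate from the seafloor.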
Over the millennia, this reaction has been a handy way to store carbon without throwing the ocean's chemistry wildly out of whack. But as humans have burned fossil fuels, more and more carbon has ended up in the ocean. In fact, according to NASA, about 48 percent of the excess carbon humans have pumped into the atmosphere has been locked away in the oceans.
All that carbon means more acidic oceans, which means faster dissolution of calcium carbonate on the seafloor. To find out how quickly humanity is burning through the ocean floor's calcium carbonate supply, researchers led by Princeton University atmospheric and ocean scientist Robert Key estimated the likely dissolution rate around the world, using water-current data, measurements of calcium carbonate in seafloor sediments and other key metrics such as ocean salinity and temperature. They then compared that rate with the rate before the industrial revolution.
Their results, published Oct. 29 in the journal Proceedings of the National Academy of Sciences, were a mix of good and bad news. The good news was that most areas of the ocean didn't yet show a dramatic difference between pre- and post-industrial rates of calcium carbonate dissolution. However, there are multiple hotspots where human-made carbon emissions are making a big difference — and those regions may be the canaries in the coal mine.
The biggest hotspot was the western North Atlantic, where anthropogenic carbon is responsible for between 40 and 100 percent of the calcium carbonate now dissolving. There were other, smaller hotspots in the Indian Ocean and the southern Atlantic, where large carbon deposits and fast bottom currents speed the rate of dissolution, the researchers wrote.
The western North Atlantic is where the boundary below which calcium carbonate no longer accumulates has risen by 980 feet (300 meters). This boundary, called the calcite compensation depth, occurs where the rain of calcium carbonate from dead animals is essentially canceled out by dissolution in the acidic water. Below this line, calcium carbonate does not accumulate.
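In symbols, the calcite compensation depth is simply the depth at which two opposing fluxes balance (this is the textbook definition; the symbols here are illustrative, not notation from the paper):

```latex
\[
F_{\text{supply}}(z_{\mathrm{CCD}}) = F_{\text{dissolution}}(z_{\mathrm{CCD}}),
\qquad
F_{\text{dissolution}}(z) > F_{\text{supply}}(z) \;\text{ for } z > z_{\mathrm{CCD}}
\]
```

where \(F_{\text{supply}}\) is the downward rain of calcium carbonate and \(F_{\text{dissolution}}\) is the rate at which acidified water dissolves it. Adding carbon to the ocean increases \(F_{\text{dissolution}}\) at every depth, so the balance point \(z_{\mathrm{CCD}}\) moves upward, which is the 980-foot shift the researchers observed.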
The boundary's rise indicates that, with more carbon in the ocean, dissolution reactions are happening more rapidly and at shallower depths. This line has moved up and down over the millennia with natural variations in Earth's atmospheric makeup. Scientists don't yet know what this alteration in the deep sea will mean for the creatures that live there, according to Earther, but future geologists will be able to see human-made climate change in the rocks eventually formed by today's seafloor. Some researchers have already dubbed this era the Anthropocene, defining it as the point at which human activities began to dominate the environment.
"Chemical burndown of previously deposited carbonate-rich sediments has already begun and will intensify and spread over vast areas of the seafloor during the next decades and centuries, thus altering the geological record of the deep sea," Key and his colleagues wrote. "The deep-sea benthic [bottom] environment, which covers ~60 percent of our planet, has indeed entered the Anthropocene."
Originally published on Live Science.