Twenty-seven million people across seven states and Mexico depend on the Colorado River for water. Without its continuous and consistent flow, cities such as Phoenix, Las Vegas, and Los Angeles would quickly become uninhabitable. Unfortunately, current climate change models predict a 7–20% decrease in Colorado River runoff by 2050 due to increasing temperatures and lower precipitation, both direct consequences of global warming. One strategy to help offset this impending blow to a crucial water source could be stabilizing soil surfaces and increasing vegetation cover, as the research of Painter et al. (2010) implies. Painter et al. analyzed the impact of dust on snowmelt rates throughout the Upper Colorado River Basin (UCRB) using a standard hydrologic simulation model to determine exactly how much snowmelt runoff is lost to varying levels of dust deposited on snow; runoff that could help counterbalance the declining levels already being observed in the Colorado River. The results suggest that dust-induced earlier snowmelt causes more water loss through evapotranspiration, makes runoff levels less constant and manageable, and deprives the region of substantial runoff in July, which could impede riparian vegetation health and fish survival and place further stress on Colorado Basin reservoirs in late summer.–Elise Wanger
Painter, T., Deems, J., Belnap, J., Hamlet, A., Landry, C., Udall, B., 2010. Response of Colorado River runoff to dust radiative forcing in snow. PNAS Early Edition 10, 1–6.
Painter et al. used the Variable Infiltration Capacity (VIC) model to simulate runoff rates from snowmelt after dust loading (ADL) in comparison to before dust loading (BDL). The VIC model divides the region into small, homogenized grid cells that each get analyzed separately for their predicted runoff and evapotranspiration rates. These cells can be layered by elevation and sized to a resolution small enough that each cell can reasonably be treated as uniform in precipitation rate, air temperature, wind speed, humidity, shortwave and longwave radiation, and snow albedo. Since Painter et al. concerned themselves only with the influence of dust, and therefore changed only the albedo factor and the subsequent melting rates, these fundamental variables provided enough accuracy to calculate significant differences without obscuring the predicted results. Painter et al. used a grid cell resolution of 1/8º for the years 1915–2003; each cell covered about 1/8º in longitude and latitude. The forcing variables for the model (daily temperature, radiation, precipitation, etc.) for the years 1915–2003 were taken from a physically based hydrology model from a previous study (Hamlet et al. 2007). Those calculations fit observed data from a study in the east-central UCRB, making them a reliable choice for the region.
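The per-cell logic can be sketched schematically in Python. This is a toy illustration of the gridded approach only, not the actual VIC code; the melt factor, the ET proxy, and every number below are invented for the example.

```python
# Toy sketch of a VIC-style gridded water balance (NOT the actual VIC code).
# Each grid cell carries its own uniform forcings; runoff and ET are
# computed per cell and then summed over the basin. All numbers are
# illustrative placeholders, not values from Painter et al.

def cell_water_balance(precip_mm, temp_c, albedo, swe_mm):
    """Advance one cell one day; returns (new_swe, runoff_mm, et_mm)."""
    melt_factor = 4.0 * (1.0 - albedo)   # hypothetical albedo-scaled degree-day factor
    melt = max(0.0, temp_c) * melt_factor
    melt = min(melt, swe_mm + precip_mm)  # can't melt more snow than exists
    new_swe = swe_mm + precip_mm - melt
    et = 0.1 * max(0.0, temp_c)          # crude temperature-driven ET proxy
    runoff = max(0.0, melt - et)
    return new_swe, runoff, et

# Sum over a (tiny) basin of independent cells:
cells = [  # (precip, temp, albedo, initial SWE) per 1/8-degree cell
    (2.0, 1.5, 0.85, 300.0),   # clean-snow cell
    (2.0, 1.5, 0.55, 300.0),   # dust-laden cell: lower albedo, faster melt
]
basin_runoff = sum(cell_water_balance(*c)[1] for c in cells)
```

The dust-laden cell produces more runoff on this day purely because its lower albedo raises the melt factor, which is the mechanism the paper isolates.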
The albedo factor indicates the reflectivity of a surface on a scale of 0–1. Zero means that all radiation reaching the surface gets absorbed, and 1.0 means that all radiation reaching the surface gets reflected back into the atmosphere. Pristine snow has an albedo typically around 0.9, reflecting most of the sunlight and keeping the snow cool and compact. Dust-laden snow, however, has a much lower albedo, since the larger, darker dust particles absorb shortwave and longwave radiation. As these dust particles absorb heat and radiate it to the surrounding snow, the snow gets denser and its grain size increases. Larger grains absorb heat more easily (having a larger surface area to absorb radiation), creating a positive feedback so that the snow melts even faster. This is why dust-on-snow produces earlier snowmelt, higher runoff rates earlier in the season, and earlier exposure of vegetation and soil.
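The arithmetic behind this is simple but worth seeing. The sketch below assumes an illustrative 800 W/m² of incoming shortwave radiation and made-up albedo values for clean versus dusty snow; the point is only that absorbed energy scales with (1 - albedo), so a modest drop in reflectivity multiplies the energy available for melt.

```python
# Absorbed energy scales with (1 - albedo): lowering snow reflectivity
# from 0.9 to 0.5 increases absorbed energy five-fold. All numbers are
# illustrative assumptions, not measurements from the study.

incoming = 800.0      # W/m^2, assumed midday shortwave on a spring snowfield

clean_albedo = 0.9    # near-pristine snow
dusty_albedo = 0.5    # heavily dust-laden snow (assumed)

absorbed_clean = incoming * (1 - clean_albedo)   # ~80 W/m^2
absorbed_dusty = incoming * (1 - dusty_albedo)   # ~400 W/m^2
```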
The model simulated predicted runoff rates from 1915–2003, estimating wind-blown dust emission levels from lake core sediments in the eastern UCRB. Since remote sensing, isotope analyses, and ensemble back-trajectories have all shown that the dust on UCRB snow derives from the Colorado Plateau and the Central Basin, previous studies could determine the percentage of sediment from dust emission for each period represented in the lake core samples. Painter et al. labeled this simulation ADL, since wind-blown dust levels from 1916 to the present have been amplified by human activities. In fact, the lake core sediments indicate wind-blown dust levels increased six-fold by the early 20th century. Ranching, agriculture, deforestation, oil drilling, and other anthropogenic disruptions left the surface soil more vulnerable to being picked up by wind. Biological crusts of bacteria, fungi, and other organisms create a moist mat of decomposing material netted in place by the extensive cell networks of the biota. The soil above has a physically derived crust as well: retained moisture makes the soil denser and bonds with many water-binding minerals. When the soils get broken up and turned over, moisture quickly evaporates and the cohesive biological mat fragments, leaving discontinuous, dry dirt that easily blows away.
Lake cores show that before around 1850, however, this was not the case. Snow before the silver mines and ranching was fairly clean, providing Painter et al. with an undisturbed set of dust-level data against which the ADL model could be compared. Using the same forcing variables as for 1916 onward (temperature, wind speeds, etc.), the model was run a second time with albedo factors reflective of pre-1850 dust levels: the BDL model. Dust levels as low as those in the BDL could not be fit with the ADL parameterization, because dust's influence on melting rates is more complex than simply lowering the albedo; "snow metamorphism" (the growth of denser, larger-grained snow during the melting process) also enhances melting. Although snow metamorphism is fairly consistent above a certain threshold dust level, at concentrations below this threshold its impact is probably smaller than a parameterization built on dust-laden snow can account for. Painter et al. therefore estimated the lower melting rates for the higher-albedo BDL case using a parameterization from data at Morteratschgletscher, Switzerland, and Storglaciären, Sweden, both of which receive less dust than the current UCRB. As with the previous parameterization, which was based on observations of high dust levels, the values from Switzerland and Sweden were graphed to indicate the relationship between dust load and melt rates. Painter et al. fit this relationship between albedo and melting rates with a straight line, thus overestimating the melting rate of clean snow: additional dust loads don't absorb as much radiation per unit mass as the initial deposits, so the relationship plateaus (the slope decreases) as dust amounts increase.
The first layer of dust therefore has the highest impact on albedo reduction, creating a non-linear progression; the difference between a thin layer of dark color and white is much more significant than between a thin dark layer and a slightly thicker one. The first layers of dust also increase snow grain size, and therefore melting rates, the most. Yet Painter et al. intentionally kept the relationship linear in order to take a conservative stance on ADL–BDL differences. If true BDL dust levels have less of an impact on radiative absorption than the model estimates, then the difference in runoff between disturbed and undisturbed dust loads is probably even greater than Painter et al.'s model infers. So the fact that this cautious parameterization still predicts 2.5-fold less dust-related absorption during the accumulation season (when more snow is falling than evaporating or melting), and 3-fold less during the ablation season (when more snow is being lost to evaporation and melting than is falling), makes quite a statement.
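To see why the straight-line fit is conservative, consider a hypothetical saturating dust-melt curve (every parameter below is invented for illustration, not taken from Painter et al.'s data). A line fit through observations in the high-dust regime, extrapolated back to zero dust, lands well above the true clean-snow melt rate, so the BDL melt is biased high and the ADL–BDL gap is understated.

```python
import math

# A saturating (concave) dust-melt relationship: the first dust deposits
# darken snow the most, so melt enhancement plateaus at high loads.
# All parameters are illustrative assumptions.

def true_melt(dust):                     # hypothetical melt rate, mm/day
    return 5.0 + 20.0 * (1 - math.exp(-dust / 10.0))

# Fit a straight line through two observations in the high-dust regime:
d1, d2 = 30.0, 60.0
slope = (true_melt(d2) - true_melt(d1)) / (d2 - d1)
intercept = true_melt(d1) - slope * d1

linear_clean = intercept                 # linear estimate at dust = 0
actual_clean = true_melt(0.0)            # true clean-snow rate: 5.0 mm/day
# linear_clean >> actual_clean: the line overestimates clean-snow melt,
# so the simulated BDL melts too fast and the ADL-BDL difference shrinks.
```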
Once Painter et al. entered all these data into the VIC model, they calculated the difference in melting rates between BDL and ADL snow by predicting when snow cover would reach only 10% of its peak level. Once 90% of the snow has melted, features of the ground surface (such as vegetation and rock) begin to influence melt rates much more than dust, and some areas maintain 5–10% of their snow year-round, making 10% snow cover a reasonable end-point for seeing the impact of dust load. From 1916–2003, the mean difference between the dates on which BDL and ADL snow cover reached 10% of peak was 21 days. In contrast, the mean difference between BDL and pristine snow cover was only seven days.
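The 10%-of-peak end-point is easy to compute from a daily snow-cover series. The series below are synthetic, made up purely to illustrate the procedure; only the end-point definition mirrors the paper's.

```python
# Find the first day after the peak on which snow cover drops to 10% of
# its peak value -- the melt-out end-point used to compare BDL and ADL
# timing. The daily series are synthetic, for illustration only.

def day_of_10pct(series):
    peak = max(series)
    peak_day = series.index(peak)
    for day in range(peak_day, len(series)):
        if series[day] <= 0.10 * peak:
            return day
    return None  # never drops below 10%: a perennial snow patch

bdl = [100, 200, 300, 280, 240, 190, 140, 90, 60, 40, 25, 15]   # slow melt
adl = [100, 200, 300, 240, 170, 100, 50, 25, 10, 5, 2, 1]       # dusty, fast melt

delay = day_of_10pct(bdl) - day_of_10pct(adl)  # extra snow-covered days under BDL
```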
As expected, melting rates differed most between BDL and ADL in areas with the most snow. Since deeper snow takes longer to melt in either model, the differing melt rates produce the greatest difference in days to reach 10% of peak snow cover where snow levels are high. Years with the most annual runoff (which implies more snow) also showed the greatest difference in runoff rates between BDL and ADL snow. This is why, even within the ADL or BDL runs alone, the date by which half the annual river flow has passed can differ by as much as 9–21 days from year to year; on average, though, ADL years still reach half the annual flow about two weeks earlier than BDL years, meaning that more water flows earlier in the season after dust disturbances (and correspondingly less flows later).
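The half-flow date used here can be computed directly from a daily flow series. The hydrographs below are synthetic; the ADL-like one simply peaks earlier, which is enough to pull the half-flow date forward.

```python
# The "half-flow date": the first day by which cumulative flow reaches
# half of the annual total. Earlier snowmelt shifts it earlier.
# Daily flow values are synthetic, for illustration only.

def half_flow_day(daily_flow):
    total = sum(daily_flow)
    running = 0.0
    for day, q in enumerate(daily_flow):
        running += q
        if running >= total / 2:
            return day

bdl_flow = [1, 1, 2, 4, 8, 8, 4, 2, 1, 1]   # later, snow-fed peak
adl_flow = [2, 4, 8, 8, 4, 2, 1, 1, 1, 1]   # dusty snow: peak shifted earlier

shift = half_flow_day(bdl_flow) - half_flow_day(adl_flow)  # days earlier under dust
```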
On the other hand, vegetation mitigates the difference between the BDL and ADL models, since forest canopies block dust and light radiation from reaching the snow layer. With or without dust, snow on the forest floor won't melt as readily, especially in evergreen forests, where the needles block sunlight and particulate deposition year-round. For this reason, studies measuring the impact of increased radiative forcing (the radiation hitting a surface, measured in watts per square meter) are currently being conducted in forests infested with pine beetles. Dead, needle-less pines would be expected to increase snowmelt rates simply by exposing more ground to sunlight.
Evapotranspiration levels also differ with respect to dust flux. Since snow that absorbs more radiation is wetter and warmer than pristine, hard snowpack, dust-laden snow not only melts more, but a greater portion returns to the atmosphere as vapor. Furthermore, shorter snow-covered seasons due to faster melting mean that vegetation and bare soil spend more time exposed to the hot, dry mountain air. Evapotranspiration (ET) is the combination of evaporation and transpiration. Water that vaporizes from soil, groundcover, lakes, or leaf surfaces (especially along forest canopies) counts as evaporation. Transpiration is also a vaporization process, conducted by the living plants themselves. Cells on leaves create openings called stomata, allowing water collected by the plant roots to be released as vapor into the atmosphere. This sometimes serves a useful purpose in cooling the plant down or facilitating the transport of minerals, but often it's just a byproduct of the need to open stomata to admit CO2 for photosynthesis.
The earlier exposure of vegetation and soils under ADL causes an estimated runoff loss of 1.0 bcm (billion cubic meters) of water per year, which equates to about 5% of the annual average. To put that amount in perspective, Las Vegas is allocated only half that amount per year, and Los Angeles only two-thirds. The increase in ET between the ADL and BDL models is greatest in June and July, when the difference in exposed soil is highest and the grain sizes most differentiated. By August, the majority of the UCRB receives daily afternoon rainstorms, which nullify the differences in ET between dust-on-snow and clean snow by melting down the snow and saturating the land's water-holding capacity. However, Painter et al. conjecture that regions without considerable late-summer rainfall (such as the Sierra Nevada in California) could have lower ET rates, because the ground would hold the most water before the ET rate peaks in June. In other words, the most snowmelt usually coincides with the highest period of ET, which quickly converts the just-melted water into vapor. If dust causes the melting period to precede the hot weather and the ET peak, the meltwater would remain intact and allow for high runoff rates later in the season, as would be routine in BDL environments. However, this possible exception to the rule (the rule being that earlier melt equals earlier runoff) has not been investigated. Scientists would have to confirm that a lower ET rate during the period of high water is enough to counter the smaller overall snowmelt by mid-to-late summer, and soil analyses would have to rule out the likely possibility that the soil in such areas simply stores the excess water: without afternoon rains, the soil would be further from its saturation threshold and could therefore store more water than in the UCRB.
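The scale of that loss follows from the article's own numbers: if 1.0 bcm is about 5% of mean annual runoff, the implied mean is roughly 20 bcm per year. The sketch below just restates the article's comparison in code; none of these are official allocation figures.

```python
# Back-of-envelope scale check using the article's figures only.

dust_loss_bcm = 1.0                             # estimated dust-related runoff loss per year
implied_annual_runoff = dust_loss_bcm / 0.05    # ~5% of annual mean -> ~20 bcm/yr
las_vegas_share = 0.5 * dust_loss_bcm           # article: Las Vegas gets about half this much
los_angeles_share = (2 / 3) * dust_loss_bcm     # article: Los Angeles gets about two-thirds
```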
Although the VIC model's assumptions are simple and the results insightful as to our alpine ecosystems' relationship with dust, as with any model the estimates don't perfectly reflect real-world conditions or all the factors for which the model does not account. Calibrating natural flows of billions of cubic meters of snowmelt from a smaller-scale grid model ignores many detailed complications that influence true runoff levels, and the representations of vegetation cover, although specified by foliage type (conifer, deciduous, shrub), are static, meaning that all conifer forests are assumed to intercept the same amounts of radiation and dust. Painter et al. also concede that changes in vegetation over time were not accounted for in the model. Industrial-era logging in the late 1800s would probably have produced significantly different runoff rates than the reforestation of the 1960s (which is probably one of the reasons the researchers started the simulations at 1916). Changes in stomatal resistance also did not factor into the model, although in the relatively clean air of the UCRB this dynamic may not be much of an issue. In the presence of increased CO2, plants conserve their water and open their stomata less, leading to less transpiration. If this phenomenon has been significant over the past 150 years, then BDL transpiration would be even lower and the difference in total runoff between ADL and BDL even more pronounced.
Most significantly, the VIC model does not factor in surface–atmosphere interactions, namely the way surface absorbance or reflectivity influences weather conditions in the troposphere. In the case of the BDL model, in which the majority of light radiation reflects off the snow and returns to the atmosphere, the surface and the overlying air might be cooler than the temperatures entered into the model (which are the same as for ADL). Cloud patterns, and thus precipitation, might differ in this cooler climate, causing snowmelt rates to be even lower than estimated. This means that, just as with the linear relationship fitted between dust load and melt rates, the model has steered towards the conservative route, understating the possible variation between BDL and ADL runoff.
Especially given the water loss that will continue with global warming, understanding the seasonal runoff rates of UCRB snow will allow policymakers to adjust water deliveries accordingly and maximize efficient allocation. From 2000 to 2010, unprecedented dust emissions caused sharply earlier runoff. Yet normal deliveries from Lake Powell to the Lower Basin continued as usual, without compensating for the seasonal variation, and by the end of the decade Lake Mead had gone from a nearly full reservoir of 30.8 bcm in 2000 to only 42% of capacity (12.8 bcm). Taking more water earlier, though, is an inadequate solution to a long-term problem, and does nothing to compensate for the overdraft of taking more from a decreasing source. A more effective long-term initiative that Painter et al. endorse is to reduce dust loading by increasing surface stabilization. This would entail further curbing of large-livestock grazing, restrictions on vehicles on dirt roads, and more regulated agricultural practices at lower elevations, including minimizing plowing and soil turnover. The physical crust can repair itself in a matter of days with a good rain, and the cryptobiotic mat takes only a few years to establish once the land is no longer disturbed. Reinstating native plants capable of germinating in drought years would also reduce soil exposure and stabilize the soil, since more dirt would be rooted in place and shielded by plant life.
Predictions of drought in the Colorado River Basin under global warming can't provide a complete picture of the severity of water loss without acknowledging the role of dust loads. Dust-on-snow generates irregular runoff rates and increases ET. The research of Painter et al. shows that as dust emissions accelerate the depletion of Colorado River reservoirs, countering the trend by restabilizing surface soils could be feasible.