A central tenet of ecohydrology in drylands is that runoff redistribution from bare to vegetated patches concentrates the key limiting resource of water, which can then enhance vegetation growth and biomass. Conversely, a reduction in vegetation patches, particularly those associated with herbaceous plants, can lead to a threshold-like response in which bare patches become highly interconnected, triggering a large increase in hillslope runoff and associated erosion. However, an assessment of how maximization of run-on to herbaceous patches relates to minimization of hillslope-scale runoff is generally lacking. To illustrate how runoff redistribution potentially changes in response to conversion of herbaceous patches to bare ones, we used a spatially distributed model, SPLASH (Simulator for Processes at the Landscape Surface–Subsurface Hydrology), with an example of a semiarid piñon–juniper woodland hillslope with seven combinations of bare and herbaceous patch cover, culminating in complete loss of herbaceous patches, for a 1-yr design storm. As expected, the amount of hillslope runoff increased curvilinearly with reductions in herbaceous cover as runoff per cell increased from bare patches and run-on per cell increased for herbaceous patches. Notably, the total amount of run-on to all herbaceous patches was greatest when the amount of bare cover was intermediate, highlighting a trade-off between the source area for generating runoff and the sink area for capturing run-on. The specific nature of patch–hillslope runoff redistribution responses certainly depends on several site-specific conditions, but the response exhibited in our example simulation may be indicative of a general type of response applicable to many rangelands.
We suggest that a more robust suite of such relationships could be valuable for managing rangelands by enabling explicit accounting for optimality and trade-offs in biomass per herbaceous patch, total herbaceous cover, and prevention of hillslope-scale connectivity of bare patches that triggers a large increase in runoff and associated erosion.
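The source–sink trade-off described in this abstract can be illustrated with a toy calculation (a deliberate simplification, not the SPLASH model): if runoff generation scales with the bare fraction and herbaceous patches intercept a fixed share of it, total run-on peaks at intermediate bare cover. All parameter values below are hypothetical.

```python
# Toy sketch of the bare-source / herbaceous-sink trade-off.
# Assumptions (not from the study): runoff generated is proportional to the
# bare fraction, and herbaceous patches capture a fixed share of that runoff.

def total_runon(f_bare, runoff_per_bare=1.0, capture=0.8):
    """Hypothetical hillslope-scale run-on captured by herbaceous patches."""
    f_herb = 1.0 - f_bare
    generated = runoff_per_bare * f_bare   # runoff from the bare source area
    return capture * generated * f_herb    # share intercepted by the sink area

fractions = [i / 10 for i in range(11)]
runon = [total_runon(f) for f in fractions]
best = fractions[runon.index(max(runon))]
print(best)  # → 0.5 (peak run-on at intermediate bare cover in this toy model)
```

In this simplification run-on behaves as f(1 − f), so the maximum falls at exactly 50% bare cover; in the simulated hillslope the peak location would depend on patch arrangement and storm characteristics.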
Remote sensing is used to map the actual distribution of some invasive plant species, such as leafy spurge (Euphorbia esula L.), whereas geospatial models are used to indicate the species' potential distribution over a landscape. Geographic data layers were acquired for Crook County, Wyoming, and the potential distribution of leafy spurge presence or absence were predicted with the use of the Weed Invasion Susceptibility Prediction (WISP) model. Hyperspectral imagery and field data were acquired in 1999 over parts of the study area. Leafy spurge presence or absence was classified with the use of the Spectral Angle Mapper with a 74% overall accuracy. However, the user accuracy was 93%, showing that where leafy spurge was indicated in the image, leafy spurge was usually found at that location. With the use of Kappa analysis, there was no agreement between WISP model predictions and either the field data or the classified hyperspectral image. Kappa analysis was then used to compare predictions based on single geographic data layers, to increase the power to detect subtle relationships between independent variables and leafy spurge distribution. The WISP model was revised for leafy spurge based on the remote-sensing analyses, and only a few variables contributed to predictions of leafy spurge distribution. The revised model had significantly increased accuracy, from 52.8% to 61.3% for the field data and from 30.4% to 80.3% for the hyperspectral image classification, primarily by reducing the areas predicted to have potential for invasion. It is generally more cost effective to deal with the initial stages of invasion by only a few plants, compared to an invasion that is large enough to be detected by remote sensing. By reducing the potential area for monitoring, management of invasive plants could be performed more efficiently by field crews.
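The Spectral Angle Mapper classification mentioned above assigns each pixel to the reference spectrum with which it forms the smallest spectral angle. A minimal sketch of that core computation (the band values and class labels below are invented for illustration):

```python
import math

def spectral_angle(target, reference):
    """Angle in radians between two spectra treated as vectors;
    smaller angle = more similar. Core of the Spectral Angle Mapper."""
    dot = sum(t * r for t, r in zip(target, reference))
    nt = math.sqrt(sum(t * t for t in target))
    nr = math.sqrt(sum(r * r for r in reference))
    # Clamp to guard against floating-point drift outside acos's domain.
    return math.acos(max(-1.0, min(1.0, dot / (nt * nr))))

# Hypothetical 4-band reflectances; a pixel is assigned to the endmember
# with the smallest angle, usually subject to a maximum-angle threshold.
pixel  = [0.21, 0.35, 0.40, 0.30]
spurge = [0.20, 0.34, 0.42, 0.31]
grass  = [0.10, 0.45, 0.20, 0.15]
angles = {"spurge": spectral_angle(pixel, spurge),
          "grass": spectral_angle(pixel, grass)}
print(min(angles, key=angles.get))  # → spurge
```

Because the angle depends only on spectral shape, not magnitude, SAM is relatively insensitive to illumination differences, which is one reason it is popular for mapping targets such as leafy spurge.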
Juniper encroachment into shrub steppe and grassland systems is one of the most prominent changes occurring in rangelands of western North America. Most studies on juniper change are conducted over small areas, although encroachment is occurring across large regions. Development of image-based methods to assess juniper encroachment over large areas would facilitate rapid monitoring and identification of priority areas for juniper management. In this study, we fused Landsat 5 Thematic Mapper and Light Detection and Ranging (lidar)–based juniper classifications to evaluate juniper expansion patterns in the Reynolds Creek Experimental Watershed of southwestern Idaho. Lidar applications for characterizing juniper encroachment attributes at finer scales were also explored. The fusion-based juniper classification model performed well (83% overall accuracy). A comparison of the resulting juniper presence/absence map to a 1965 vegetation cover map indicated 85% juniper expansion, which was consistent with tree-ring data. Comparisons of current and previous canopy-cover estimates also indicated an increase in juniper density within the historically mapped juniper distribution. Percent canopy cover of juniper varied significantly with land-cover types, highlighting areas where intensive juniper management might be prioritized.
Linear disturbances associated with on- and off-road vehicle use on rangelands have increased dramatically throughout the world in recent decades. This increase is due to a variety of factors including increased availability of all-terrain vehicles, infrastructure development (oil, gas, renewable energy, and ex-urban), and recreational activities. In addition to the direct impacts of road development, the presence and use of roads may alter resilience of adjoining areas through indirect effects such as altered site hydrologic and eolian processes, invasive seed dispersal, and sediment transport. There are few standardized methods for assessing impacts of transportation-related land-use activities on soils and vegetation in arid and semi-arid rangelands. Interpreting Indicators of Rangeland Health (IIRH) is an internationally accepted qualitative assessment that is applied widely to rangelands. We tested the sensitivity of IIRH to impacts of roads, trails, and pipelines on adjacent lands by surveying plots at three distances from these linear disturbances. We performed tests at 16 randomly selected sites in each of three ecosystems (Northern High Plains, Colorado Plateau, and Chihuahuan Desert) for a total of 208 evaluation plots. We also evaluated the repeatability of IIRH when applied to road-related disturbance gradients. Finally, we tested extent of correlations between IIRH plot attribute departure classes and trends in a suite of quantitative indicators. Results indicated that the IIRH technique is sensitive to direct and indirect impacts of transportation activities, with greater departure from reference condition near disturbances than far from disturbances. Trends in degradation of ecological processes detected with qualitative assessments were highly correlated with quantitative data. Qualitative and quantitative assessments employed in this study can be used to assess impacts of transportation features at the plot scale.
Through integration with remote sensing technologies, these methods could also potentially be used to assess cumulative impacts of transportation networks at the landscape scale.
Rapid vegetation sampling methods based on visual estimation are useful for monitoring changes in rangeland vegetation composition because large spatial and temporal scales are often involved and sampling resources are limited. Here we compared two sampling methods in their ability to detect changes in vegetation composition following rangeland development: 1) species percent cover estimates within subplots (the percent cover [PC] method) and 2) rankings of relative biomass of the 10 most abundant species across the whole plot and the ratio of two of them (the visual ranking [VR] method). Both methods were applied on 30 experimental plots at year 26 of a long-term factorial trial of five soil fertility levels and three sheep grazing intensities. Multivariate statistical methods showed significant effects of experimental treatments (fertilizer level and sheep grazing intensity) and of vegetation sampling method (VR vs. PC) on vegetation composition. Importantly, we detected no significant interactions involving sampling method, indicating that the effect of sampling method was consistent across experimental treatments. Effects of fertilizer on vegetation composition were an order of magnitude greater than the effect of sampling method, whereas the latter was twice as important as the effect of grazing. Results were robust to differential weights given to relative abundances vs. compositional changes. Differences between methods were primarily driven by the PC method giving lower abundance estimates of one species, lupin (a hybrid of Lupinus polyphyllus Lindl.), relative to the VR method. Our results support the use of the VR method as a rapid yet powerful method for monitoring changes in vegetation composition under rangeland development.
The effect of stocking rate on forage growth has attracted much research attention in forage science. Findings show that forage growth may be affected by stocking rate, and there is a consensus that high stocking rates lead to soil compaction, which could also in turn affect forage growth because of the changing soil hydrology and increased soil impedance to forage root penetration. In this study we used a modeling approach to investigate the effect of stocking rates on the growth of sand-bluestem forage at Fort Supply, Oklahoma. The GPFARM-Range model, which was originally developed and validated for Cheyenne, Wyoming, was recalibrated and enhanced to simulate soil compaction effects on forage growth at Fort Supply. Simulations without the consideration of soil compaction effects overestimated the forage growth under high stocking rate conditions (mean bias [MBE] = −591 kg · ha−1), and the agreement between the simulated and observed forage growth was poor (Willmott's d = 0.47). The implementation in the model of soil compaction effects associated with high stocking rates reduced the bias (MBE = −222 kg · ha−1) and improved the overall agreement between the observed and the simulated forage growth (d = 0.68). It was concluded that forage growth under increasing soil compaction could be predicted provided such sensitivities are included in forage growth models.
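The two agreement statistics reported in this abstract have standard definitions that are straightforward to compute. A minimal sketch, using hypothetical forage yields and a sign convention for MBE (observed minus simulated) chosen so that, as in the abstract, a negative value indicates overestimation:

```python
def mean_bias_error(obs, sim):
    """MBE = mean(observed - simulated); with this sign convention,
    a negative value indicates model overestimation."""
    return sum(o - s for o, s in zip(obs, sim)) / len(obs)

def willmott_d(obs, sim):
    """Willmott's index of agreement d (0 = no agreement, 1 = perfect)."""
    obar = sum(obs) / len(obs)
    num = sum((s - o) ** 2 for o, s in zip(obs, sim))
    den = sum((abs(s - obar) + abs(o - obar)) ** 2 for o, s in zip(obs, sim))
    return 1.0 - num / den

# Hypothetical forage yields (kg/ha): the "model" runs high at large values.
obs = [900, 1100, 1300, 1500]
sim = [1000, 1300, 1600, 1900]
print(round(mean_bias_error(obs, sim)))   # → -250
print(round(willmott_d(obs, sim), 2))     # → 0.81
```

Used together as in the study, MBE isolates systematic over- or underestimation while Willmott's d summarizes overall pattern agreement on a bounded scale.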
Leafy spurge (Euphorbia esula L.) is an aggressive exotic species that has been successfully suppressed in a variety of situations using classical biological control (flea beetles; Aphthona spp.). This 9-yr study investigated patterns of vegetation responses following significant reductions in leafy spurge cover and density by flea beetles in southeastern Montana. We hypothesized that the vegetation following leafy spurge suppression would be dominated by species and plant functional groups able to persist through heavy infestations. Flea beetles were first released in 1998, and by 2006 leafy spurge foliar cover was reduced 80% to 90% compared to 1998 values on both release and nonrelease plots. Although total cover of the resident vegetation, excluding leafy spurge, increased 72% to 88%, relative cover of the functional groups (native forbs, native sedges, native grasses, and non-native species) was similar among years and between release and nonrelease plots. Mean diversity and mean species richness values did not differ among years or between release and nonrelease plots (P < 0.05), but mean diversity on both release and nonrelease plots was significantly less than noninfested plots, although richness was similar (P < 0.05). Indicator species analysis revealed that non-native Poa spp. replaced leafy spurge as the dominant species on release and nonrelease plots. Conversely, noninfested plots contained a variety of native species with high indicator values. Although total abundance of the resident vegetation in 2006 was significantly greater than 1998, plant species composition and relative cover showed little change for the duration of the study. Failure of the native vegetation to recover to a community that approached nearby noninfested conditions may be attributed to a variety of interacting scenarios, some of which may be ameliorated by treating infestations as soon as possible to avoid long-term residual effects.
Medusahead (Taeniatherum caput-medusae [L.] Nevski) is an exotic annual grass invading western rangelands. Invasion by medusahead is problematic because it decreases livestock forage production, degrades wildlife habitat, reduces biodiversity, and increases fire frequency. Revegetation of medusahead-invaded sagebrush steppe is needed to increase ecosystem and economic productivity. Most efforts to revegetate medusahead-infested plant communities are unsuccessful because perennial bunchgrasses rarely establish after medusahead control. The effects of prescribed burning (spring or fall), fall imazapic application, and their combinations were evaluated for medusahead control and the establishment of seeded large perennial bunchgrasses. One growing season after treatments were applied, desert wheatgrass (Agropyron desertorum [Fisch. ex Link] Schult.) and squirreltail (Elymus elymoides [Raf.] Swezey) were drill seeded into treatment plots, except for the control treatment. Vegetation characteristics were measured for 2 yr postseeding (second and third year post-treatment). Medusahead was best controlled when prescribed burned and then treated with imazapic (P < 0.05). These treatments also had greater large perennial bunchgrass cover and density compared to other treatments (P < 0.05). Prescribed burning followed by imazapic application produced greater than 10-fold more perennial bunchgrass cover and 8-fold greater density than the control treatment. Prescribed burning, regardless of season, was not effective at controlling medusahead or promoting establishment of perennial bunchgrasses. The results of this study question the long-term effectiveness of using imazapic in revegetation efforts of medusahead-infested sagebrush steppe without first prescribed burning the infestation. Effective control of medusahead appears to be needed for establishment of seeded perennial bunchgrasses.
The results of this study demonstrate that seeding desert wheatgrass and squirreltail can successfully revegetate rangeland infested with medusahead when medusahead has been controlled with prescribed fire followed by fall application of imazapic.
Because of concerns about the impact of grazing management on surface water quality, a 3-yr study was conducted to determine grazing management and microclimate impacts on cattle distribution relative to a pasture stream and shade. Three treatments, continuous stocking with unrestricted stream access (CSU), continuous stocking with restricted stream access (CSR), and rotational stocking (RS), were evaluated on six 12.1-ha cool-season grass pastures stocked with 15 fall-calving Angus cows (Bos taurus L.) from mid-May through mid-October of each year. On 2 d · mo−1 from May through September of each year, a trained observer in each pasture recorded cattle position and activity every 10 min from 0600 to 1800 hours. In years 2 and 3, position of one cow per pasture was recorded with a Global Positioning System (GPS) collar at 10-min intervals 24 h · d−1 for 2 wk · mo−1 from May through September. In week 2 of collar deployment in May, July, and September, cattle had access to off-stream water. Ambient temperature, black globe temperature, relative humidity, and wind speed were recorded at 10-min intervals, and temperature-humidity (THI), black globe temperature-humidity (BGTHI), and heat load (HLI) indices were calculated. Based on GPS collars, the mean percentages of time cows in CSU pastures spent in the stream (1.1%) and streamside zone (10.5%) were greater (P < 0.05) than for cows in CSR (0.2% and 1.8%) or RS (0.1% and 1.5%) pastures. Based on GPS collar data, off-stream water did not affect the percentage of time cattle in CSU or CSR pastures spent in the stream. Probabilities that cattle in CSU and CSR pastures were in the stream or riparian zones increased (P < 0.05) as ambient temperature, black globe temperature, THI, BGTHI, and HLI increased. Rotational stocking and restricted stream access were effective strategies to decrease the amount of time cattle spent in or near a pasture stream.
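Several temperature-humidity index formulations exist, and the abstract does not state which variant was used. As an illustration only, one widely cited formulation (NRC, 1971) can be computed as follows, with hypothetical inputs:

```python
def thi(temp_c, rh_pct):
    """Temperature-humidity index, NRC (1971) formulation.
    One of several published variants; the study does not specify
    which it used. temp_c: dry-bulb temperature (deg C); rh_pct: RH (%)."""
    t_f = 1.8 * temp_c + 32          # convert to Fahrenheit
    return t_f - (0.55 - 0.0055 * rh_pct) * (t_f - 58)

# Hypothetical midsummer conditions: 30 C at 50% relative humidity.
print(round(thi(30.0, 50.0), 1))  # → 78.3
```

Under this formulation, higher humidity raises the index at a given temperature, consistent with the abstract's finding that stream use increased with THI.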
The Rainwater Basin region in Nebraska is critically important stopover habitat for spring waterfowl migrations, but the ability of these sites to produce sufficient food for migrating waterfowl is endangered by the invasion of reed canarygrass (Phalaris arundinacea L.). This species produces thick litter layers and abundant aboveground biomass, reducing germination and seedling survival of the annual plant species responsible for much of the seed production in the area. Cattle grazing often is used as a management tool in the Rainwater Basin to slow or reverse reed canarygrass invasion and to improve growing conditions for more desirable plant species. However, there has been little research on the impact of grazing on these factors. We studied the impacts of one-time, early-season (between April and June) cattle grazing on the abundance of reed canarygrass, bare ground, and litter. We hypothesized that cattle grazing would result in reduced reed canarygrass by the end of the 2-yr study, and that grazing would increase the abundance of bare ground and decrease the abundance of litter. Because grazing was expected to improve conditions for seed germination, we expected to find higher species richness in grazed areas. We found that grazing did not reduce the abundance of reed canarygrass, but the application of early-season grazing for two consecutive years did reduce litter and increase bare ground. Litter abundance increased by 7.5% in ungrazed plots and decreased by 8.6% in grazed plots. Bare ground increased 10.7% in grazed plots but decreased 1.2% in ungrazed plots. Species richness was not affected by grazing during this study. We concluded that grazing, as utilized in this study, is not sufficient to reduce reed canarygrass abundance, but can be used to mitigate some of the negative impacts of reed canarygrass invasion.
Prescribed fire and/or mechanical methods can be used to modify the quantity, continuity, and/or spatial arrangement of flammable fuel. Yet the consequences of fuel management, both in terms of ecological outcomes and in facilitating improved fire management, often are poorly documented. In the global biodiversity hotspot of southwest Western Australia, chaining and burning is a novel technique for manipulating fuels. Vegetation first is dislodged using a chain, then after a period of curing, burnt. We tested whether combining two disturbance events in this way results in different vegetation structure postfire than only burning, and whether the postfire sprouting capacity of community-dominant Eucalyptus spp. is compromised. Both chained and burnt and only burnt treatments had much less leaf litter and vegetation > 25 cm high than long-unburnt vegetation, indicating a fire management benefit of fuel modification. Chained and burnt strips had a threefold reduction in standing dead vegetation compared to only burnt samples. The stem number of Eucalyptus spp. was reduced by 20% in chained and burnt strips compared to only burnt vegetation, indicating that consecutive disturbances reduce resilience and might render sprouters vulnerable to subsequent disturbances. Balancing the fire management benefits of chaining and burning with the ecological consequences is a significant challenge facing land managers in this fire-prone landscape.
Mongolia's Eastern Steppe is one of the largest remaining temperate grassland ecosystems and is habitat for Mongolian gazelles (Procapra gutturosa). During four surveys, we quantified vegetation composition, forage quality, and trace elements to gain insights on characteristics of forage that could be influencing how gazelles are distributed across the steppe. Grasses made up between 57% and 68% of all species, Stipa spp. (24–42% of all grasses) being the most abundant. Forbs made up 6% to 23% of all species with Allium spp. (11–44% of all forbs) the most abundant. The shrubs and dwarf shrubs were least common (7–12% of all species) with Artemisia frigida Willd. (18–47% of all shrubs) most common. Spring crude protein values of green vegetation averaged 21.9%. Considered an important forage for gazelles, Stipa spp. was below optimum value in phosphorus (P) and magnesium (Mg). The forbs Allium spp. and Astragalus spp. and the dwarf shrub Artemisia frigida had some of the highest crude protein contents and were above optimum for all important elements (except P in Astragalus). Calcium (Ca) and the Ca:P ratio were above optimal at nearly all sites surveyed. Phosphorus levels in vegetation were 96% of minimum requirements for ungulates at maintenance whereas magnesium and calcium were 113% and 145% of minimum requirements for ungulates, respectively. Magnesium and phosphorus were below values considered optimal for lactation and bone development at 78% and 71% of sites, respectively. Gazelles likely satisfy their nutrient requirements by selectively foraging on species that contain high concentrations of critical minerals. During periods of peak demands, particularly calving and postcalving periods, regions with a high abundance of forbs commonly occurring in gazelle diets (Allium and Astragalus) might be of greater value to lactating females and growing calves and, therefore, sought out.
Wildfires in the United States can be destructive to human life and property. The ability to predict fire danger helps reduce the risks associated with wildfires by keeping firefighters on high alert and allowing better preparedness. In the state of Oklahoma, fire is a common occurrence. By looking at past wildfire records and researching the weather conditions under which they burned, we were able to determine the most important weather conditions affecting wildfire size. We looked at 10 different weather variables and found that minimum relative humidity (r = 0.98, P = 0.001), maximum and average wind speed (r = 0.95, P = 0.003; r = 0.95, P = 0.004, respectively), and precipitation (r = 0.88, P = 0.02) were the most important factors relating to wildfire size. Temperature variables did not have significant relationships with wildfire size categories. Additionally, we found that most of the largest wildfires occurred in January and December. This information can be used to adjust and improve current wildfire danger models and predictive abilities. We define conditions under which firefighters should be on high alert with hopes of improving their ability to expediently manage rangeland wildfires.
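The r values reported in this abstract are Pearson correlation coefficients between weather variables and wildfire size categories. A minimal sketch of that computation with invented data (the size classes and humidity values below are hypothetical, not the study's data):

```python
from statistics import mean

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x)
           * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

# Hypothetical inputs: representative wildfire size per size class (ha)
# vs. mean minimum relative humidity (%) observed for fires in that class.
size_ha = [10, 100, 1000, 10000, 50000]
min_rh = [38, 31, 25, 18, 12]
print(round(pearson_r(size_ha, min_rh), 2))  # negative: drier air, larger fires
```

Correlating against a handful of size categories (as the n here suggests) yields very few degrees of freedom, which is worth bearing in mind when interpreting the strong r values reported.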