Grassland conversion to cropland in the Prairie Pothole Region of the United States is a persistent obstacle to mitigating climate change. Several carbon offset markets have been designed to reward landowners for keeping lands in their native state when incentives to convert are high. We explore the role of a critical determinant in such programs: the additionality threshold. This factor, if appropriately selected and applied, reduces the participation of landowners who would choose to enroll in the program but would not have converted their land under business-as-usual conditions. Using a simple model relating land quality and land use to economic rents, we simulate potential avoided grassland conversion offset market participation across a range of cropland-over-pasture rent difference threshold (RDT) values. We find mitigation potential and simulated program costs vary widely depending on this parameter and the assumed carbon price: across the five states studied, the full range is 0.41 million tCO2e • yr–1 (0.2 RDT, $10 • tCO2e–1 carbon price) to 4.6 million tCO2e • yr–1 (1.2 RDT, $40 • tCO2e–1 carbon price), assuming average land use change emissions values for pastureland in the region. Total program costs for these offsets also exhibit a wide range, spanning $2–$120 million • yr–1 depending on parameterization. Results across the full range of RDTs (0.2–2) demonstrate a tendency toward higher RDTs for achieving high levels of avoided emissions, with cost efficiency maximized in the 1.4–1.8 RDT range. A state-level breakdown of results demonstrates the importance of modeling economic trends in land use and of setting region-specific additionality thresholds for avoided grassland conversion offsets. Although our study is specific to grassland conversion in one region of the United States, similar offset markets exist elsewhere, where additionality concerns are paramount. We believe our framework can be useful in improving protocol design.
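To make the additionality screen concrete, here is a minimal Python sketch (not the study's actual model) that enrolls only parcels whose relative cropland-over-pasture rent gap meets the RDT, then tallies avoided emissions and program cost. The parcel data, the eligibility rule, and the per-hectare emission factor are all illustrative assumptions.

```python
# Illustrative additionality screen for an avoided-conversion offset program.
# Parcel rents, the eligibility rule, and the emission factor are assumptions.
from dataclasses import dataclass

@dataclass
class Parcel:
    area_ha: float
    crop_rent: float     # expected cropland rent, $ / ha / yr
    pasture_rent: float  # current pasture rent, $ / ha / yr

EMISSIONS_T_PER_HA = 1.5  # assumed land-use-change emissions, tCO2e / ha / yr

def eligible(parcels, rdt):
    """Keep parcels whose relative rent gap meets the RDT, i.e., landowners
    with a credible business-as-usual incentive to convert."""
    return [p for p in parcels
            if (p.crop_rent - p.pasture_rent) / p.pasture_rent >= rdt]

def program_outcomes(parcels, rdt, carbon_price):
    enrolled = eligible(parcels, rdt)
    avoided = sum(p.area_ha * EMISSIONS_T_PER_HA for p in enrolled)
    cost = avoided * carbon_price  # $ / yr, paying per tCO2e avoided
    return avoided, cost

parcels = [Parcel(100, 180, 120), Parcel(60, 130, 125), Parcel(250, 300, 110)]
for rdt in (0.2, 1.2, 2.0):
    avoided, cost = program_outcomes(parcels, rdt, carbon_price=40.0)
    print(f"RDT={rdt}: avoided={avoided:,.0f} tCO2e/yr, cost=${cost:,.0f}/yr")
```

Raising the RDT shrinks enrollment to parcels under the strongest conversion pressure, which is why higher thresholds trade total avoided emissions against cost efficiency.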
In Great Britain, red-billed chough (Pyrrhocorax pyrrhocorax) breed in discrete populations along the west coast: on Islay and Colonsay, in the Inner Hebrides of Scotland; on the Isle of Man; in Wales; and in Cornwall. Chough are dependent on pastures grazed by sheep or cattle, and their survival therefore depends on sympathetic management of grassland. The Scottish population is in decline, whereas all other populations are growing or stable. Sixty-three farmers in these regions whose farms were known to support feeding chough were asked about their farm management in structured, questionnaire-based personal interviews. Islay farms were significantly larger and had more grazing area, with the lowest stocking densities. Welsh farms had the least cropping area and the smallest number of cattle. Cornwall had the smallest number of sheep per farm. Welsh farms were more likely than those in other regions not to house cattle during winter. Liver fluke in sheep and ticks and tick-borne disease were of higher concern on Islay than in other regions, and abortion in sheep was of highest concern on the Isle of Man. Islay farmers applied between 4× and 13× as many synthetic pyrethroid (SP) treatments to cattle per year as farmers in other regions, and the application rate of triclabendazole (TCBZ) to sheep was higher on Islay than in other regions. The rate of application of other products, including macrocyclic lactones, did not differ among regions. The study described here shows clear differences in farm grazing management, in the priority given to animal health problems, and in the frequency of application of veterinary parasiticides among four regions that provide feeding habitat for chough in the United Kingdom. These differences suggest that the viability of chough populations might be favored by higher-intensity grazing and by low rates of application of veterinary parasiticides of the TCBZ or SP classes, or both.
A 2-yr study of livestock/wildlife tracking was conducted on four streams in eastern Oregon. Binomial sampling of tracks proved to be an effective statistical method for monitoring the proportion of samples containing tracks on stream greenlines and testing the observed value against an established standard. Study results indicate that tracks are related to variables outside of the control of livestock grazing management.
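The binomial test itself is straightforward to apply; the sketch below compares a hypothetical observed track proportion against a hypothetical 20% management standard using SciPy.

```python
# Illustrative binomial test of track presence against a management standard;
# the counts and the 0.20 standard are hypothetical, not the study's values.
from scipy.stats import binomtest

n_samples = 120      # greenline sample points examined
n_with_tracks = 33   # sample points containing tracks
standard = 0.20      # assumed allowable proportion of samples with tracks

result = binomtest(n_with_tracks, n_samples, p=standard, alternative="greater")
print(f"observed proportion = {n_with_tracks / n_samples:.2f}")
print(f"P(exceeds standard by chance) = {result.pvalue:.3f}")
```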
Few studies have evaluated the response of ungulate populations to wind energy development. Recent demand for wind-generated electricity, coupled with a tendency for wind-energy facilities to be sited within suitable pronghorn (Antilocapra americana) winter range, makes this a critical issue for conservation of this icon of western North America. We evaluated pronghorn response to wind energy development at the winter home range scale, as well as within individual winter home ranges, using data collected from 47 adult female pronghorn equipped with Global Positioning System transmitters. At both scales, we developed separate resource selection models for pronghorn before (winter 2010) and after (winters 2011 and 2012) development of the Dunlap Ranch wind energy facility in south-central Wyoming to evaluate the potential impacts of wind energy infrastructure on pronghorn winter resource selection. In general, pronghorn winter resource selection was correlated with greater sagebrush (Artemisia spp.) cover, lower snow depth, and gentler slopes before and after wind energy development at both scales. At the larger scale, pronghorn selected home ranges closer to wind turbines during all winters. Within home ranges, pronghorn selected areas closer to future locations of wind turbines at Dunlap Ranch during 2010, before turbine erection. However, we found evidence that pronghorn avoided wind turbines within their winter home ranges in winters after development. This relationship was most evident during winter 2011, which coincided with the most severe winter of our study. Long-term replicated studies will be necessary to make inferences for pronghorn populations exposed to wind energy development in environments and at scales different from those we evaluated. Nonetheless, in the absence of additional information on how ungulates respond to wind energy development, our finding that pronghorn avoided wind turbines within their winter home ranges has important implications for future wind development projects, particularly in areas known to fulfill important seasonal requirements of pronghorn populations.
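Resource selection models of this kind are commonly fit as used-versus-available logistic regressions. The sketch below shows that general structure on simulated data; the covariates, coefficients, and selection signs are hypothetical and do not reproduce the study's fitted models.

```python
# Generic used-vs-available resource selection function (RSF) sketch.
# All covariate values and "true" selection coefficients are made up.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500
sagebrush = rng.uniform(0, 40, n)    # sagebrush cover, %
snow = rng.uniform(0, 30, n)         # snow depth, cm
slope = rng.uniform(0, 25, n)        # slope, degrees
turbine_km = rng.uniform(0, 10, n)   # distance to nearest turbine, km

# Simulate "use" with assumed selection for sagebrush and against snow/slope.
logit = 0.08 * sagebrush - 0.10 * snow - 0.12 * slope + 0.05 * turbine_km - 1.0
used = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

X = np.column_stack([sagebrush, snow, slope, turbine_km])
rsf = LogisticRegression().fit(X, used)
for name, coef in zip(["sagebrush", "snow", "slope", "turbine_km"], rsf.coef_[0]):
    print(f"{name}: {coef:+.3f}")  # sign suggests selection (+) or avoidance (-)
```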
The daily nutritional balance of free-ranging cattle is the net result of intake from available forage biomass and nutritive value weighed against the nutritional requirements of the animal. Plant phenology influences nutritive value and is dictated by time of year and an accumulation of photosynthetically active days. Growing degree day (GDD) is a concept that quantifies this relationship and has been used to predict nutritive value in perennial range grasses. GDD could be substituted for chemical analysis to inform grazing animal nutritional monitoring efforts. We hypothesized that in C4 grass-dominated rangelands, a cumulative GDD calculation would correlate with diet crude protein (CP) predictions obtained by fecal near infrared spectroscopy (FNIRS) from free-ranging cattle. Therefore, the objectives of our research were to evaluate the effectiveness of GDD in predicting FNIRS-derived determinations of grazing cattle diet CP in 1) two groups of three individual animals grazing a small native pasture and 2) large commercial-scale herds grazing expansive rangelands. For the first objective, cumulative GDD and FNIRS-predicted diet CP were strongly correlated (r2 = 0.76; P < 0.01). Relationships between cumulative GDD and FNIRS-predicted diet CP for the second objective varied considerably among ranches, ranging from a low r2 of 0.05 (P = 0.871) to a high r2 of 0.78 (P < 0.049). The same relationships for individual ranch/year combinations were stronger, ranging from a minimum r2 of 0.44 (P = 0.556) to a maximum of 0.95 (P = 0.051). The aggregate relationship between GDD and FNIRS-predicted CP for all ranch/year combinations was highly significant (r2 = 0.37; P < 0.001), but the standard error was 1.86% CP. The noninvasive, remotely sensed grazing animal nutritional monitoring method described here was accurate enough to inform tactical rangeland diet quality assessments but not accurate enough to inform operational-scale grazing management decisions.
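The GDD concept is a simple accumulation of daily mean temperature above a base temperature. The sketch below pairs a cumulative GDD series with a linear fit against FNIRS-style diet CP values; the base temperature and all data are hypothetical.

```python
# Cumulative growing degree days (GDD) and a linear fit against diet CP.
# The 10 degC base temperature and all data values are illustrative only.
import numpy as np
from scipy.stats import linregress

T_BASE = 10.0  # assumed base temperature (degC)

def daily_gdd(tmax, tmin, t_base=T_BASE):
    """Degree days accumulated in one day: mean temperature above base."""
    return max((tmax + tmin) / 2.0 - t_base, 0.0)

tmax = [28.0, 30.0, 31.0, 25.0, 22.0]   # hypothetical daily highs, degC
tmin = [14.0, 16.0, 17.0, 12.0, 10.0]   # hypothetical daily lows, degC
cum_gdd = np.cumsum([daily_gdd(hi, lo) for hi, lo in zip(tmax, tmin)])

cp = np.array([9.8, 9.1, 8.6, 8.3, 8.1])  # hypothetical FNIRS-predicted CP, %
fit = linregress(cum_gdd, cp)
print(f"r^2 = {fit.rvalue**2:.2f}, slope = {fit.slope:.3f} % CP per GDD")
```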
Anderson Michel Soares Bolzan, Olivier Jean François Bonnet, Marcelo Osorio Wallau, Catarine Basso, Adriana Pires Neves, Paulo César de Faccio Carvalho
Early relationships between young mammalian herbivores and social models (e.g., mothers or peers) have been proposed to play a major role in the process of diet learning. Diet selection is an important factor influencing animal development and ecology, especially in natural and seminatural grasslands with a large diversity of plant species. To explore the development of foraging behavior and diet selection by foals, six free-ranging Criollo foals and their respective mares were monitored through continuous bite monitoring from birth to 130 d old in the Pampas Grasslands of southern Brazil. Cumulative suckling time decreased exponentially from birth to 130 d old, while dry matter intake, foraging time, and bite mass of foals increased continuously. Three marked periods in the foals' foraging behavior development could be identified: 1) an exploratory phase (0 to 40 d old) marked by limited forage intake from a large diversity of plants; 2) a specialization phase (40 to 110 d old) with a marked increase in forage intake and a specialization on the same plants as those selected by the mares; and 3) a stabilization phase (after 110 d old) in which forage intake continued to increase but the diet composition of foals stabilized to closely resemble that of their respective mares. The higher diet diversity at young ages could be explained by the exploratory hypothesis, whereby foals test different forages to discover their environment, given that their nutritional needs are fulfilled by milk consumption rather than forage intake. As requirements shift toward solid foods, bite mass and foraging time increase and diet choices become similar to those of the mares. Our results detail how young foals develop their foraging behavior and suggest, without testing it and under the circumstances of this study, that they learn their diet through social transmission from their mothers.
Understanding fall precipitation effects on rangelands could improve forage production forecasting and inform predictions of potential climate change effects. We used a rainout shelter and water addition to test effects of seasonal precipitation on soil water and annual net primary production of C3 perennial grass, C4 perennial grass, annual grasses, forbs, and all plants combined. Treatments were 1) drought during September–October and April–May (DD); 2) drought plus irrigation during September–October and drought during April–May (WD); 3) year-long ambient conditions (WW); and 4) ambient plus irrigation during September–October (W + W). Treatments created conditions ranking among the driest and wettest September–October periods since 1937. Fall water effects on soil water were not detectable by May at 15 cm and 30 cm. Effects persisted into July at 60 cm and 90 cm, depths below the primary root zone. With spring drought, annual net primary production was 344 kg ha–1 greater when the previous fall was wet rather than dry. When spring was wet, no differences were detected between fall water treatments, whether fall precipitation was about 184% of the median (1 938 ± 117 kg ha–1) or 391% of the median (1 903 ± 117 kg ha–1). Fall water increased C3 perennial grass when spring was also wet and had no effect under spring drought, when forage production concerns are greatest. Fall water did not affect C4 perennial grass, and extremely wet fall conditions reduced forb production about 50%. The greatest effect of fall water was increased annual grass production. Even record high levels of fall water had minor effects on biomass, functional group composition, and soil water that were short-lived and overwhelmed by the influence of spring precipitation. Movement of fall water to deep soil by the growing season suggests the plants that would most benefit from fall precipitation are those that can use it during fall (winter annuals) or deep-rooted species (shrubs).
Surface litter protects rangeland soils against wind and water erosion and provides food and nesting materials for wildlife and insects. However, the ability of grassland systems to provide these services depends on the little-studied topic of seasonal surface litter decomposition. Seasonal and annual surface litter decomposition rates were determined between 2014 and 2015 in central and western South Dakota at three mixed-grass prairie locations. Residue bags containing surface litter were placed in the field in late fall (1 November) of 2014 and removed after the winter (1 April), spring (1 July), and summer + fall seasons (1 November) of 2015. The litter was analyzed for total C, total N, acid detergent fiber (ADF), and acid detergent lignin (ADL). Average winter temperatures ranged from –5°C to –15°C, while summer temperatures ranged from 10°C to 35°C. Litter decomposition was lowest during the winter (0.57–0.86 g [kg × day]–1) and greatest during the summer + fall (2.12–2.69 g [kg × day]–1). Over the entire year, 40.8–62% of the surface litter decomposed. Winter litter decomposition was positively correlated with air temperature (r = 0.62, P < 0.01) and snow depth (r = 0.61, P < 0.01) and negatively correlated with C/N ratio (r = –0.65, P < 0.01), ADF (r = –0.35, P < 0.05), and ADL (r = –0.25, P < 0.05) concentrations. These findings indicate that winter decomposition cannot be ignored and that winter surface litter decomposition increases with snow depth.
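For reference, a litter-bag decomposition rate in the units reported above is simply mass lost per kilogram of initial litter per day; the masses in this small sketch are made up.

```python
# Litter-bag decomposition rate in g of litter lost per kg of initial
# litter per day; the initial/final masses and duration are hypothetical.
def decomposition_rate(initial_g, final_g, days):
    return (initial_g - final_g) / initial_g * 1000.0 / days  # g kg^-1 d^-1

# Example: a 10.0 g bag losing 1.1 g over a 151-d winter (1 Nov to 1 Apr)
print(f"{decomposition_rate(10.0, 8.9, 151):.2f} g kg^-1 d^-1")
```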
Sagebrush ecosystems consist of different communities of species and subspecies of sagebrush marked by distinct ecotones along elevation gradients, yet few studies have quantified how ecosystem-scale carbon dioxide (net ecosystem exchange, NEE) and water fluxes (evapotranspiration, ET), as well as their environmental drivers, vary among communities dominated by different sub/species of sagebrush at daily and seasonal time scales. To address this knowledge gap, we measured daytime (6 a.m.–6 p.m.) NEE and ET using a tent chamber, along with associated environmental drivers, in three sagebrush communities spanning an elevation gradient of 1 425–2 111 m at the Reynolds Creek Critical Zone Observatory in southwestern Idaho. Daytime NEE and ET were greatest at the highest-elevation (snow-dominated) site throughout the study period, except for NEE in June. By late summer, NEE declined by > 80% at the lower (rain-dominated) sites but by only 50% at the highest site, compared with maximal values in June. In contrast, ET declined ∼95% in late summer compared with June at all three sites. Ecosystem-scale NEE and ET were mainly controlled by soil moisture and vapor pressure deficit at the rain-dominated sites and by deep soil moisture and air temperature at the snow-dominated site. Cumulative (June–August) modeled daytime NEE was greatest at the midelevation site, whereas cumulative daytime ET was greatest at the highest-elevation site. Ecosystem models often assume that sagebrush landscapes are homogeneous and do not differ in fluxes and controls, yet our data demonstrate that there are fundamental differences in CO2 and water fluxes and their controls among different shrub communities that should be accounted for in these models.
Plant functional traits can be used to predict ecosystem responses to climate gradients, yet precipitation explains very little of the variation in most traits. Soil water availability directly influences plant water uptake and thus may help improve plant trait–water relationships. However, this promise remains poorly realized because it has rarely been tested. Here, we provide the first study attempting to link climate factors, vertical soil water availability, and community composition at a regional scale. Our study paired field-measured vertical soil available water (0–300 cm) with community functional composition at 46 herbaceous grassland sites along a steep hydrothermal gradient on the Loess Plateau of Central China. Community functional composition was expressed via community-weighted means of eight traits. Structural equation modeling was employed to evaluate the role of vertical soil available water content, controlled by precipitation and air temperature, in shaping community-weighted traits. We found that soil available water content at depths of 20–100 cm was typically responsible for mediating the effects of precipitation and air temperature on plant community composition and emerged as the predominant factor explaining variation in grassland response traits, including leaf area, specific leaf area, and leaf dry matter content. These traits exhibited clear drought-induced shifts along soil desiccation gradients, responding to drier conditions with reduced leaf area and specific leaf area and increased leaf dry matter content. Our findings re-emphasize soil water availability as a core driver that needs to be considered in the restoration and management of dryland ecosystems.
Flood irrigation on western rangelands is important for diverse social and ecological reasons, providing forage for many agricultural operations and maintaining many critical wetlands across the region. However, recent debate over the efficiency of flood irrigation, and the resulting transition to other “more efficient” types of irrigation, has put many of the working wet meadows sustained by flood irrigation at risk. As the sustainability of these landscapes depends primarily on ranchers' management decisions, we sought to gain a deeper understanding of the factors influencing ranchers who flood irrigate and how these factors interrelate. We applied the Community Capitals Framework to explore what considerations act as enablers of and constraints on maintaining flood irrigation and to evaluate the role of each type of capital in enabling and constraining the coproduction of working wet meadows for ranchers and the environment. Our qualitative analysis of facilitated workshop transcripts and observation notes from two study areas within the Intermountain West showed that ranchers perceived constraining and enabling factors of flood irrigation related to all seven types of community capitals: natural, financial, built, cultural, human, social, and political. The irrigation methods used by ranchers were heavily influenced by environmental components of the landscape rather than reflecting a choice among alternative methods. Other prominent enablers included a commitment to maintaining the natural history of the landscape and the ranching lifestyle. Primary constraints included the impact of public misperception and concerns about the ability to pass the operation on to the next generation. Ranchers weighed multiple considerations simultaneously in a holistic, community-scale approach to management decisions and described how diverse enablers and constraints interacted to determine the viability of flood irrigation and ranching. These results indicate rancher decisions are driven by complex social-ecological considerations and demonstrate the importance of each capital type to rangeland conservation.
Sagebrush ecosystems of the western United States can transition from extended periods of relatively stable conditions to rapid ecological change if acute disturbances occur. Areas dominated by native sagebrush can transition from species-rich native systems to altered states dominated by non-native annual grasses if resistance to annual grasses is low. The non-native annual grasses provide relatively little value to wildlife, livestock, and humans and function as fuel that increases fire frequency. The more land area covered by annual grasses, the higher the potential for fire, which reduces the potential for native vegetation to reestablish even when restoration treatments are applied. Mapping areas of stability and areas of change using machine-learning algorithms allows identification both of the dominant abiotic variables that drive ecosystem dynamics and of those variables' important thresholds. We develop a decision-tree model with rulesets that estimate three classes of sagebrush condition (i.e., sagebrush recovery, tipping point [ecosystem degradation], and stable). We find that the rulesets primarily driving the sagebrush recovery class indicate midelevation areas (1 602 m), warm 30-yr July maximum temperatures (tmax) (30.62°C), and 30-yr March precipitation (ppt) averages of 26.26 mm, about 10% of the 30-yr annual ppt. Tipping point and stable classes occur at elevations that are lower (1 505 m) and higher (1 939 m), respectively, are more mesic during March and annually, and experience lower 30-yr July tmax averages. These variable averages can be used to understand current dynamics of sagebrush condition and to predict where future transitions may occur under novel conditions.
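A toy ruleset in the spirit of such a decision-tree model might look like the sketch below, which borrows the reported class averages as split points; the actual model's rulesets are learned from data and are considerably more complex.

```python
# Toy three-class ruleset using the abstract's reported class averages as
# illustrative split points; not the actual learned decision-tree model.
def classify_sagebrush(elev_m, july_tmax_c, march_ppt_mm):
    if elev_m < 1550:                        # near the tipping-point average (1 505 m)
        return "tipping point"
    if elev_m > 1750 and july_tmax_c < 30:   # higher, cooler sites (stable, ~1 939 m)
        return "stable"
    if july_tmax_c >= 30 and march_ppt_mm >= 26:  # midelevation, warm July, moist March
        return "sagebrush recovery"
    return "stable"

print(classify_sagebrush(1602, 30.6, 26.3))  # -> sagebrush recovery
print(classify_sagebrush(1939, 27.0, 30.0))  # -> stable
```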
Remotely sensed data products depicting physical and ecological attributes of a landscape are becoming invaluable tools in wildlife and rangeland management. However, if such geospatial tools and data layers are to be used in management, their accuracy and appropriateness for such use need to be vetted and validated. We assessed the accuracy of two National Land Cover Database (NLCD) shrubland products for use in western South Dakota—percent sagebrush and sagebrush height—by comparing them to ground-truthed data. Western South Dakota sagebrush communities are an ecotone between sagebrush (Artemisia spp.) and grassland. This ecotone is typified by shorter, lower-density sagebrush than the interior sagebrush steppe ecosystem, a distinction that could make it difficult to remotely detect and map sagebrush in this area. We determined NLCD correlations with ground estimates of sagebrush canopy cover (r = 0.17) and sagebrush height (r = 0.40). The NLCD percent sagebrush accurately predicted sagebrush presence ∼73–76% of the time for resampled 100-m pixels and 50-m mean values, respectively. Cohen's kappa values were estimated to determine whether the ground-truthed and remote-sensed data agreed on sagebrush presence. Kappa values were 0.26 ± 0.06 and 0.28 ± 0.06 for mean values within 50-m and resampled 100-m pixels, respectively, indicating a “fair” level of agreement between the ground-truthed and remote-sensed data types when determining presence of sagebrush. The NLCD data sufficiently described the presence of sagebrush in South Dakota, which is useful for estimating geographic distributions of sagebrush-obligate species, for species distribution models in which presence or absence of sagebrush is of interest, or for mapping the occurrence of sagebrush in South Dakota. Inaccuracies of the NLCD shrubland products in predicting sagebrush height and sagebrush canopy cover may limit their utility as continuous variables in species distribution, habitat selection, and suitability models or when assessing rangeland quality in South Dakota.
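Cohen's kappa for presence/absence agreement can be computed directly from paired ground-truthed and NLCD-derived labels, as in this sketch with made-up labels.

```python
# Agreement between ground-truthed and NLCD-derived sagebrush presence,
# measured with Cohen's kappa; both label vectors here are hypothetical.
from sklearn.metrics import cohen_kappa_score

ground = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]  # ground-truthed presence (1) / absence (0)
nlcd   = [1, 0, 0, 1, 0, 1, 1, 0, 0, 1]  # NLCD-derived presence at the same plots

kappa = cohen_kappa_score(ground, nlcd)
print(f"kappa = {kappa:.2f}")  # 0.21-0.40 is conventionally labeled "fair"
```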
Indigenous rangeland management practices, forage quality and availability, and livestock production by pastoralists and agro-pastoralists in miombo woodlands were investigated in a study conducted in Kilosa district, Tanzania. The study methods comprised household interviews, key informant and focus group discussions, and forage laboratory analyses. Preferred forage species and indigenous rangeland and livestock management practices among pastoral and agro-pastoral communities in miombo woodlands were identified, and the nutrient content of the forages was determined. In general, rangeland management in the study area faces challenges such as unclear or disputed land tenure regimes and a lack of technical knowledge. Moreover, the nutritional value of some native forage species identified in miombo was found to be too low to meet the nutrient requirements of livestock. Livestock in miombo contribute greatly to household livelihoods and food security, but forage scarcity was identified as a limiting factor. Overall, it was concluded that rangeland improvement practices are poor or nonexistent in allocated grazing areas in Kilosa's miombo woodlands.
Since Euro-American settlement of the region, the biological diversity of the northern Great Plains has been adversely affected, mainly by agricultural conversion. The role of invasive plants in the degradation of remaining prairies has gained attention in recent years but remains poorly understood. Floristic composition of US Fish and Wildlife Service (Service) prairies is significantly altered, mainly by invasion of smooth brome (Bromus inermis Leyss.), Kentucky bluegrass (Poa pratensis L.), and woody vegetation. We measured floristic composition of about 90 000 ha of Service-owned mixed-grass and tallgrass prairie in North Dakota, South Dakota, and northeastern Montana. Our primary objective was to identify factors associated with more intact native grass-forb plant assemblages, while conversely identifying features more aligned with Kentucky bluegrass, smooth brome, and low shrub invasion. Service-owned prairies had a higher frequency of native grass-forb assemblages farther from habitat edges, such as cropland boundaries and roads, and on harsher ecological sites with poorer soils, steeper slopes, or southern and western exposures. Kentucky bluegrass, smooth brome, and low shrubs differed in their respective responses to the explanatory variables we considered and also reportedly differ in response to management actions such as fire and grazing. Therefore, prairie managers can expect significant challenges during restoration management where two or more of these invaders occur. By understanding patterns of invasion related to edaphic, edge, and landscape features, prairie restorationists can focus on areas where the probability of restoration success is greater and better understand how these features might influence restoration success or failure.
Seeding is sometimes used in attempts to increase grass forage production in invaded rangelands, but insufficient long-term data prevent determining whether seeded grasses are likely to become and remain productive enough to justify this expensive practice. We quantified long-term seeding outcomes in a widespread Rocky Mountain foothill habitat invaded by leafy spurge (Euphorbia esula L.) and several exotic grasses. Fourteen yr after seeding, the most productive grass (bluebunch wheatgrass [Pseudoroegneria spicata (Pursh) Á. Löve]) produced 900 (100, 12 000) kg ha–1 [mean (95% CI)], about 70% of total plant community biomass. This result was not greatly altered by grazing, according to an unreplicated grazed experiment adjacent to our replicated ungrazed experiment. Regardless of treatment, E. esula gradually became less productive, and seeded and unseeded plots produced similar E. esula biomass 14 yr after seeding. P. spicata reduced exotic grasses by about 85%. Our results resemble those of another foothills study of another invasive forb (Centaurea stoebe L. ssp. micranthos [Gugler] Hayek) and a Great Plains study of E. esula, so foothills seeding outcomes seem somewhat insensitive to invader composition, and seeding can increase forage across much of E. esula's range. While there is always some risk that seeded grasses will fail to establish, our study combined with past studies identifies invaded habitats where seeded grasses have a good chance of forming persistent, productive stands.
Stuart P. Hardegree, Roger L. Sheley, Jeremy J. James, Patrick A. Reeves, Christopher M. Richards, Christina T. Walters, Chad S. Boyd, Corey A. Moffet, Gerald N. Flerchinger
Rangelands in the western United States exhibit extremely high temporal variability in seedbed microclimate, and this variability contributes to poor establishment of revegetation species that are typically planted in the fall. We conducted long-term simulations of cumulative germination as a function of planting date and identified alternative germination syndromes based on population-level responses to environmental variability. These germination syndromes reveal ecologically significant differences but also noteworthy similarities in species and seed lot response that can inform rangeland restoration planning and management. Seed germination may occur much sooner than assumed under the traditional paradigm of fall-planting/spring-emergence in the intermountain western United States, and seed germination per se does not appear to be a bottleneck for successful establishment in most years. Instead, simulations of germination response support recent hypotheses that postgermination/preemergent mortality may be the larger contributor to poor seedling establishment. Our data support two general strategies to improve the likelihood of seedling survival into the spring: seeding as late as possible in the fall and active diversification of germination syndromes within a given seed mix. Consistent application of these strategies could increase the probability that some seeds are always available to take advantage of any pulse of seedbed favorability in the late fall, winter, or early spring.
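The idea of contrasting germination syndromes can be illustrated with a simple thermal-time sketch in which cumulative germination rises logistically with degree-days accumulated after planting, at syndrome-specific rates; all parameters below are hypothetical and do not reproduce the study's simulations.

```python
# Logistic thermal-time sketch of two hypothetical germination syndromes;
# the degree-day parameters are illustrative, not the study's estimates.
import numpy as np

def cumulative_germination(cum_gdd, gdd_to_50pct, spread):
    """Fraction of a seed lot germinated at a given cumulative degree-day total."""
    return 1.0 / (1.0 + np.exp(-(cum_gdd - gdd_to_50pct) / spread))

cum_gdd = np.linspace(0, 400, 5)  # degree-days accumulated after planting
for name, g50, s in [("fast syndrome", 120.0, 30.0), ("slow syndrome", 250.0, 60.0)]:
    print(name, np.round(cumulative_germination(cum_gdd, g50, s), 2))
```

A mix containing both syndromes keeps some seeds poised to germinate whenever a favorable pulse occurs, which is the diversification strategy suggested above.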