Reevaluating assumptions about the ecology and management of sagebrush and salt desert shrub systems in the Great Basin and Intermountain West is a proper role for science. These are complex rangeland ecosystems, and our management applications need to account for this complexity. Understanding and reckoning with this complexity is vital to the future existence of these rangeland systems and their ability to provide critical goods and ecosystem services to society. The most influential ecological claim of the past 40 yr is based on ideas presented by Mack and Thompson (1982), that Great Basin and Intermountain West plant communities evolved with few or perhaps no large hooved grazing animals. Our thesis asserts that Mack and Thompson's position is based on 1) an oversimplification of complex, heterogeneous, and diverse ecosystems; 2) a poor understanding of the science, both in 1982 and now; and 3) the attribution of all recent ecological changes to a single land use. We review the archaeological and historical record of vegetation and large grazing animals in the region and then revisit Mack and Thompson's (1982) interpretations of the rangeland plants and plant communities, forage quality and nutrition, and soil biotic crusts east and west of the Rocky Mountains, adding the information necessary for a more comprehensive interpretation. We finish by proposing an alternative paradigm to guide management and conservation of sagebrush and salt desert systems of the Great Basin and Intermountain West and beyond.
Prairie dogs can reduce the carrying capacity of rangelands by up to 50% through direct consumption of vegetation and by clipping plants. Studies have shown that forage quality and digestibility are greater on prairie dog towns than off-town; however, research is lacking that quantifies rates of forage and nutrient intake by cattle. In 2012–2016, a study was conducted in South Dakota to evaluate livestock grazing behavior, diet quality, and forage intake on three plant communities in pastures occupied by prairie dogs. Plant communities studied were grass-dominated on-town sites (PDOG-GRASS), forb-dominated on-town sites (PDOG-FORB), and grass-dominated off-town sites (NO-PDOG). Three pastures with varying levels of prairie dog occupation (0%, 20%, and 40%) were studied. Each pasture was grazed by a herd of yearling steers, a subset of which were fitted with Global Positioning System (GPS) collars. Daily time spent grazing was estimated for each plant community. Forage quality and intake were estimated using ruminally fistulated steers that were allowed to graze in 30-min increments within each plant community during June, July, and August of each year. Rumen samples were weighed and analyzed for forage quality. Intake was calculated as the rate of organic matter consumed per minute multiplied by average grazing time based on GPS collar data. Livestock grazing preference decreased linearly (P < 0.001) on PDOG-GRASS sites and increased linearly (P = 0.001) on NO-PDOG sites through the growing season. Crude protein content was significantly higher (P = 0.002) on PDOG-FORB sites than on PDOG-GRASS and NO-PDOG sites. Few differences in forage quality were evident between the PDOG-GRASS and NO-PDOG communities. Organic matter intake rates were similar between PDOG-GRASS and NO-PDOG communities; PDOG-FORB intake rates were reduced 59% compared with NO-PDOG sites. This study will inform land managers of the potential forage contributions of on-town and off-town plant communities in pastures colonized by prairie dogs.
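As a rough illustration of the intake calculation described in this abstract (organic matter consumed per minute multiplied by GPS-derived grazing time), the following Python sketch uses hypothetical rates and grazing times; it is not the authors' code or data.

```python
# Minimal sketch (hypothetical values): daily organic-matter intake per plant
# community = intake rate from 30-min fistulated-steer samples (g OM/min)
# multiplied by GPS-derived daily grazing time (min/day).

def daily_intake_g(om_rate_g_per_min: float, grazing_min_per_day: float) -> float:
    """Daily organic-matter intake (g) = rate (g/min) x grazing time (min/day)."""
    return om_rate_g_per_min * grazing_min_per_day

communities = {
    # community: (OM intake rate g/min, mean daily grazing time min/day) -- placeholders
    "PDOG-GRASS": (8.0, 180.0),
    "PDOG-FORB":  (3.5, 150.0),
    "NO-PDOG":    (8.5, 200.0),
}

for name, (rate, minutes) in communities.items():
    print(f"{name}: {daily_intake_g(rate, minutes):.0f} g OM/day")
```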
Sagebrush is a vital habitat component for many endangered grassland species. As the need for sagebrush habitat restoration increases, models that enable restoration practitioners to calculate the seeding or planting densities required to obtain desired sagebrush cover within specific time frames are essential. We measured cover and density of naturally occurring silver sagebrush (Artemisia cana) stands subjected to different grazing management in Grasslands National Park. In 10 × 10 m plots, stem diameter and crown diameter of all individuals were measured; a subset of individuals was photographed to determine canopy cover and sampled (cut) for age determination by ring count. Strong relationships between morphological characteristics and age were found. Age was significantly correlated with stem diameter (r2 = 0.79), allowing nondestructive age estimations to be made for A. cana. Canopy cover was correlated with stem diameter and age (r2 = 0.49 to 0.67), with the relationship best described by reciprocal quadratic and rational models. We accurately modeled plot cover as the product of mean canopy cover, determined from the morphological models, and plot density (81% of cover estimates within 10% of measured values). Sagebrush mortality was estimated using age frequencies of the sagebrush stems and outplanted seedling survival rates from other studies. Using mortality rate to project how sagebrush density would change over time, we made projections of sagebrush cover for outplanted seedlings. These models indicate the largest cover increases can occur in areas that are lightly grazed by cattle, but the greatest short-term increases can occur in areas where cattle grazing is heavier. Before this study, the maximum canopy cover of sagebrush, and the age at which it is reached, were unknown and had to be estimated for restoration purposes. This study provides essential information for successful restoration of A. cana by modeling sagebrush cover as a function of density and stand age.
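The core relationship described here—plot cover as the product of mean individual canopy cover and plot density—can be sketched as follows; the numbers are hypothetical, not the fitted values from the study.

```python
# Minimal sketch (hypothetical numbers, not the study's fitted models): plot-level
# sagebrush cover modeled as mean individual canopy area times plant density.

def plot_cover_pct(mean_canopy_area_m2: float, density_plants_per_m2: float) -> float:
    """Plot cover (%) = mean canopy area per plant (m^2) x density (plants/m^2) x 100."""
    return 100.0 * mean_canopy_area_m2 * density_plants_per_m2

# e.g., plants averaging 0.25 m^2 of canopy at 0.8 plants per m^2 -> 20% cover
print(plot_cover_pct(0.25, 0.8))
```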
Quantifying rangeland vegetation amounts with remotely sensed satellite data is a proliferating field of study. Yet the resulting datasets are rarely related to use-based monitoring indicators (i.e., utilization or residual biomass), which are critical for adaptive management and for informing the subsequent year's grazing plans. To better assess our ability to use remotely sensed data products for grazing monitoring and adaptive management, we tested the relationships between a variety of vegetation biomass metrics derived from remotely sensed data on a bunchgrass-dominated grassland in northeast Oregon and two common indicators: stocking rate at the pasture scale (40–250 ha; a management indicator) and field-based utilization estimates at the plot scale (25–50 m; a grazing indicator). At the pasture scale, we correlated stocking rate to biomass metrics and found two metrics that had consistent relationships to stocking rate: fall mean biomass (r values range: –0.52 to –0.56; P values < 0.001) and the 10th percentile of the relative difference between summer and fall biomass (r values range: –0.47 to –0.52; P values < 0.01). Scatterplots from these correlations were then evaluated alongside managers' knowledge to interpret why some pastures deviated from the overall pattern. At the plot scale, we correlated in-field utilization estimates to biomass metrics and found consistent relationships with fall mean biomass (r values range: –0.32 to –0.47; P values < 0.001) and the relative difference between summer and fall biomass (r values range: –0.20 to –0.62; P values < 0.005). To further visualize the utilization correlations, we classified these two biomass maps into three categories guided by our utilization estimates. Significant changes in biomass due to management and interannual variation in biomass amounts stood out. The results and visualizations demonstrate how remotely sensed data relate to conventional grazing monitoring indicators and exemplify how remotely sensed data can be used to inform adaptive management.
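The pasture-scale analysis reported above (negative r values between stocking rate and fall biomass) amounts to a simple correlation, sketched below with hypothetical pasture data rather than the study's.

```python
# Minimal sketch (hypothetical data, not the study's): Pearson correlation between
# pasture-scale stocking rate and a remotely sensed biomass metric such as fall
# mean biomass, as in the pasture-scale analysis described above.
import numpy as np

stocking_rate = np.array([0.4, 0.7, 1.1, 1.5, 2.0, 2.4])     # AUM/ha, hypothetical
fall_mean_biomass = np.array([950, 900, 780, 700, 640, 600])  # kg/ha, hypothetical

r = np.corrcoef(stocking_rate, fall_mean_biomass)[0, 1]
print(f"Pearson r = {r:.2f}")  # negative r: heavier-stocked pastures retain less fall biomass
```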
Rangeland scientists have long relied on thermocouples for measuring temperature in the field, especially under the extreme conditions of wildland fire. But the electronics required to sense and record thermocouple data remain expensive both to purchase and to protect from exposure to heat and flames. Open-source, do-it-yourself (DIY) electronics platforms such as Arduino are increasingly popular among ecologists and have been shown to perform as well as proprietary commercial systems when recording thermocouple data. The FeatherFlame system is a reliable, low-cost solution for sampling wildland fire, ranging from US$240 to $490 for 1–6 thermocouple sensors, including all fire protection equipment. The low cost and multi-sensor capacity facilitate spatial replication, which allows fire ecologists to measure and report meaningful data on rate of spread and to calculate fire intensity.
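The abstract does not give the formulas it has in mind, but a common way to use spatially replicated thermocouples is to derive rate of spread from peak-temperature arrival times and then compute Byram's fireline intensity; the sketch below assumes that approach, with placeholder heat yield and fuel load.

```python
# Minimal sketch (assumptions noted): rate of spread from the peak-temperature
# times of two thermocouples a known distance apart, then fireline intensity via
# Byram's equation I = H * w * r. The abstract does not specify these formulas;
# the heat yield and fuel consumption values below are hypothetical placeholders.

def rate_of_spread_m_s(spacing_m: float, peak_time_a_s: float, peak_time_b_s: float) -> float:
    """Rate of spread (m/s) between two sensors a known distance apart."""
    return spacing_m / (peak_time_b_s - peak_time_a_s)

def byram_intensity_kw_m(heat_yield_kj_kg: float, fuel_kg_m2: float, ros_m_s: float) -> float:
    """Byram's fireline intensity (kW/m) = H (kJ/kg) x w (kg/m^2) x r (m/s)."""
    return heat_yield_kj_kg * fuel_kg_m2 * ros_m_s

ros = rate_of_spread_m_s(5.0, peak_time_a_s=120.0, peak_time_b_s=170.0)  # 0.1 m/s
print(byram_intensity_kw_m(18000.0, 0.35, ros))  # ~630 kW/m with these placeholders
```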
Chelsea E. Keefer, Samuel B. St. Clair, Janae Radke, Phil S. Allen, Benjamin W. Hoose, Savannah Fahning, Nicholas K. Hayward, Tamzen K. Stringham, Matthew D. Madsen
Seed germination during inhospitable environmental conditions can be a major barrier to direct seeding efforts in dryland systems. In the sagebrush steppe, Wyoming big sagebrush and low sagebrush are important shrub species used in restoration; however, seeding success is highly sporadic due to interannual and intraseasonal weather variability. It may be possible to improve restoration success by expanding the period of seed germination to increase the chances that some seeds will germinate within a window that is favorable for plant establishment. Our objective was to determine if we could expand the period of germination using plant growth regulators (PGRs) applied in a conglomerated seed coating to Wyoming big sagebrush and low sagebrush. The seed of each species was 1) left untreated; 2) conglomerated; 3) conglomerated and treated with two concentrations of a germination inhibitor, abscisic acid (ABA); or 4) conglomerated and treated with two different germination promoters, gibberellic acid (GA3) and 1-aminocyclopropane-1-carboxylic acid (ACC) (6 treatments total). Seeds were incubated in a loam soil at five constant temperatures (5–25°C) for approximately 3 mo. Results indicate that seed treatments with PGRs can delay or accelerate germination. For example, at 5°C, which is the temperature most similar to when the seeds germinate in the field, ABA delayed the time for 50% of the seeds to germinate by a maximum of 28 d and 38 d, and the germination promoters decreased this time by 9 d and 11 d, for Wyoming big sagebrush and low sagebrush, respectively. Field studies are now needed to determine if the bet-hedging strategy developed in this study will increase the likelihood of seeding success. Although our study focused on sagebrush, there is merit in evaluating the use of PGRs on other species, particularly where seed is being sown in highly variable environments.
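The comparison metric used here is the time for 50% of seeds to germinate (T50); one common way to estimate it from a cumulative germination curve is linear interpolation, sketched below with hypothetical counts rather than the study's data.

```python
# Minimal sketch (hypothetical data): interpolating the time for 50% of seeds to
# germinate (T50), the metric the abstract uses to compare PGR seed treatments.
import numpy as np

days = np.array([0, 7, 14, 21, 28, 42, 60, 90])
cum_germ_pct = np.array([0, 2, 10, 25, 40, 55, 70, 78])  # hypothetical cumulative %

t50 = float(np.interp(50, cum_germ_pct, days))  # linear interpolation on the rising curve
print(f"T50 = {t50:.0f} d")
```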
Approximately one third of California is grazed by livestock, with most forage produced on annual rangelands, the common term for rangelands with a significant annual herbaceous component. Given the state's interest in mitigating climate change and growing public attention on grazing systems that enhance carbon sequestration, we investigate the impact of grazing management on soil carbon cycling on California annual rangelands, drawing on soil science, rangeland management, and policy analysis literature. We conclude that using managed grazing for augmenting soil organic carbon on California annual rangelands presents significant challenges. Challenges include the heterogeneity, biogeochemical characteristics, and nonequilibrium nature of California's Mediterranean region, where ecological site conditions, soil type and texture, and climate moderate carbon sequestration. Enduring unknowns in the science underlying soil carbon and the dearth of relevant California-based studies further obscure the potential climate change mitigation effects of grazing systems. Given this, grazing management on California annual rangelands should not be prioritized as a climate change mitigation strategy, unless it is for the purposes of data collection and research. Alternative climate change mitigation opportunities on these landscapes include preventing rangeland conversion and enhancing soil carbon stocks through the suite of range management practices known to augment soil organic carbon or prevent erosion, including marginal cropland restoration, riparian restoration, organic amendments, and silvopasture. In this review, we argue that single-purpose management is generally not fitting for the diverse portfolio of social-ecological services produced on California's vast and varied rangelands. When assessing the value of grazing systems for augmenting soil organic carbon, policymakers, landowners, and other decision makers should consider the potential impacts on the numerous ecosystem services supported by the landscape. The multidisciplinary method presented in this review provides a critical framework for evaluating the appropriateness of working lands carbon policies as Natural Climate Solutions for climate change mitigation are developed in California and other geographies.
Bison serve as keystone species in prairie ecosystems of North America, yet few studies have evaluated the effects of bison carcasses on other animals. To determine which species forage on bison carcasses and whether such carcasses catalyze intraspecific and interspecific interactions, we used wildlife cameras to observe scavenger visits and activity over a 6-wk period following the death of an adult female bison from a conservation herd in northern Colorado. We captured more than 45 000 photos of avian and mammalian scavengers. Photos most often included coyote (51% of photos) and common raven (50%), followed by black-billed magpie (14%) and golden eagle (0.11%). Most photos (86%) captured individuals of the same species (intraspecific activity) visiting the carcass, while 14% had two or more different species (interspecific activity) at the carcass simultaneously. The species most often photographed together included the common raven and black-billed magpie (64%), followed by coyote and black-billed magpie (25%) and coyote and common raven (7%). Less than 1% of photos captured dominance behaviors among individuals. Our study demonstrates that bison carcasses have the potential to provide key food resources in the winter for shortgrass prairie animals. The role of bison carcasses in supporting grassland animal communities is likely to become increasingly important as efforts gain momentum to restore bison across their historic range in North America.
Net primary production (NPP) is a critical ecosystem property that researchers and land managers attempt to quantify across ecosystems globally. Although the belowground NPP (BNPP) component of total NPP often contributes 50% or more to total NPP, it is infrequently measured due to the amount of labor involved. Here, I present a rapid method to estimate BNPP or fine root production in rangelands using sequential or root ingrowth cores and modifications to existing root-washing methods. The details provided should allow anyone to estimate BNPP in rangelands worldwide with minimal investment in materials and labor.
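As a rough illustration of how ingrowth-core data are typically scaled to BNPP (the abstract describes the field and root-washing protocol, not this arithmetic), the sketch below assumes a standard area-and-time scaling with hypothetical core dimensions and incubation length.

```python
# Minimal sketch (assumed arithmetic, not the author's protocol): scaling fine-root
# mass recovered from a root ingrowth core to belowground net primary production.
import math

def bnpp_g_m2_yr(root_mass_g: float, core_diam_cm: float, incubation_days: float) -> float:
    """BNPP (g/m^2/yr) = root mass / core cross-sectional area, annualized."""
    core_area_m2 = math.pi * (core_diam_cm / 100.0 / 2.0) ** 2
    return (root_mass_g / core_area_m2) * (365.0 / incubation_days)

# hypothetical: 0.3 g of new roots in a 5-cm core over a 180-d incubation
print(bnpp_g_m2_yr(root_mass_g=0.3, core_diam_cm=5.0, incubation_days=180))
```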
Indicators of vegetation cover and structure are widely available for monitoring and managing rangeland wind erosion. Identifying which indicators are most appropriate for managers could improve wind erosion mitigation and restoration efforts. Vegetation cover directly protects the soil surface from erosive winds and reduces wind erosivity by extracting momentum from the air. The portion of the soil surface that is directly protected by vegetation is adequately described by fractional ground cover indicators. However, the aerodynamic sheltering effects of vegetation, which are more important for wind erosion than for water erosion, are not captured by these indicators. As wind erosion is a lateral process, the vertical structure and spatial distribution of vegetation are most important for controlling where, when, and how much wind erosion occurs on rangelands. These controlling factors can be described by indicators of the vegetation canopy gap size distribution and vegetation height, for which data are collected widely in the United States by standardized rangeland monitoring and assessment programs. In this paper we address why canopy gap size distribution and vegetation height are critical indicators of rangeland wind erosion and health. We review wind erosion processes to explain the physical role of these vegetation attributes. We then address the management implications, including the availability of data on these indicators for rangelands and the need to make the indicators and model estimates of wind erosion more accessible to the range management community.
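One way the canopy gap size distribution indicator discussed above is commonly summarized is as the proportion of a line transect falling in gaps at or above given size thresholds; the sketch below uses hypothetical gap-intercept data and illustrative size classes.

```python
# Minimal sketch (hypothetical transect data): proportion of a line transect in
# canopy gaps of at least a given size, a standard gap-intercept summary.
gaps_cm = [30, 75, 120, 40, 260, 55, 15, 180]  # hypothetical gaps along a 25-m line
transect_cm = 2500

for min_gap in (25, 50, 100, 200):
    frac = sum(g for g in gaps_cm if g >= min_gap) / transect_cm
    print(f"gaps >= {min_gap} cm: {frac:.0%} of transect")
```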
Invasive grasses reduce habitat quality for multiple taxa and can negatively impact forage quality for livestock. Large-scale experimental studies are needed to inform more effective grassland restoration that is grounded in practice. To this end, we studied the control of a common but highly invasive cool-season grass using a landscape-scale experiment, employing an adaptive management framework. The study design included three patches (average 8.7 ha) at each of seven sites. Treatments included 1) herbicide (glyphosate), 2) herbicide and native seeding, and 3) control. Four sites were grazed by domestic cattle using adaptive stocking. We sampled vegetation composition and structure during one pretreatment year (2014) and four post-treatment years (2015–2018). Our primary objective was to evaluate how these onetime treatments affected the cover of tall fescue (Schedonorus arundinaceus) and native grasses and forbs. Tall fescue cover was reduced after a one-time glyphosate application, and this reduction was maintained over 4 years on grazed and ungrazed sites. We observed increases of warm-season grasses after herbicide and seeding, with the strongest restoration observed on ungrazed sites. Native grasses did not differ strongly between treatments on grazed sites, where there was a resurgence of nonfescue exotic grasses. Percent cover of native forbs was near zero before seeding but ranged from low to moderate levels afterward. Our results indicate a one-time application of herbicide can be used to reduce but not eradicate the invasive grass tall fescue, although other exotic grasses may replace tall fescue, especially on grazed sites. For plant community restoration to be successful, sites should be rested from grazing to give native seedings time to establish. Although eradication of invasive grasses is often infeasible in productive landscapes, restoring at least some native vegetation has the potential to protect ecosystem services provided by grasslands.
Invasive annual grasses such as cheatgrass (Bromus tectorum L.) outcompete native grasses, increase fire frequency, and impact the functionality and productivity of rangeland ecosystems. Pre-emergent herbicide treatments are often used to control annual grasses but may limit timely restoration options due to negative effects on concurrently planted desired seeded species. We tested the efficacy of activated carbon-based herbicide protection coatings applied to individual bluebunch wheatgrass (Pseudoroegneria spicata [Pursh] Á. Löve) seeds for protecting seedlings from injury associated with pre-emergent herbicide (imazapic) application in a laboratory environment. Emergence of coated seed averaged 57% ± 5% compared with bare seed, which had 14% ± 10% emergence with imazapic application. Seedling height for coated seed averaged 7.56 ± 0.6 cm compared with 2.26 ± 0.4 cm in uncoated bare seed in the presence of imazapic. Coated seeds produced 87% more plant biomass than uncoated seeds. Our laboratory results suggest that treating individual seeds with an activated carbon-based coating dramatically reduces negative effects of pre-emergent herbicide on desired seeded species. Field studies are needed to confirm these results in an applied restoration context.
Herbivores regulate nutrient cycling of terrestrial ecosystems through trampling topsoil and vegetation, selective foraging, and excretion of feces and urine. However, the role of trampling in the soil-plant interaction is still unclear, partly due to the lack of empirical studies examining soil and plant stoichiometric responses to trampling. We conducted a 2-consecutive-yr field trial to explore the effects of simulated sheep trampling intensity on soil and plant carbon:nitrogen:phosphorus (C:N:P) stoichiometry in a typical steppe of the Loess Plateau, China. Results show that with the increase in trampling intensity, the soil bulk density at 0–10 cm decreased in 2016 while it increased in 2017. Although trampling increased soil total N and P concentration, the soil C:N, C:P, and N:P ratios remained stable. The aboveground biomass of three dominant species increased with trampling intensity in 2016 but decreased in 2017. Trampling increased plant N and P concentrations but decreased plant C concentration, and C:N and C:P ratios. Trampling affected plant N:P ratios depending on species, but usually exacerbated P limitation. Thus, the differential responses of soil and plant C:N:P stoichiometry to trampling intensity indicated that the effects of trampling on the plants did not convey equivalent impacts on the soil. Our study provides evidence that it is necessary to isolate the effects of livestock trampling from grazing on grasslands and highlights that at the stocking rate of 2.7 Tan sheep ha–1 (equivalent to ∼40 footsteps m–2), trampling does not negatively affect soil-plant interactions or stoichiometry, and hence such stocking rate may be compatible with rangeland restoration objectives.
The growing invasion of ecosystems by invasive alien plants (IAPs) has substantially affected biodiversity worldwide, compromising the provision of ecosystem services. In this study, we present evidence of the impacts of an IAP, Robinia pseudoacacia L., on native plant diversity in montane rangelands of South Africa and its threats to grazing, an ecosystem service. We assessed stand characteristics, understory vegetation composition, and rangeland condition in invaded and uninvaded sites. We observed a shift in grass communities after invasion by R. pseudoacacia, as invaded communities differed by 96% from uninvaded rangeland. Invaded habitat was dominated by nitrophilous, shade-tolerant alien ruderals that follow the primitive C3 carbon fixation pathway. Nitrogen fixation and the light-demanding properties of R. pseudoacacia are likely the main factors driving these changes. As a result, range condition was significantly lower in invaded habitats with smaller, denser trees (180 ± 24.3; mean ± standard error) than in adjacent uninvaded habitat (401 ± 24.3). These preliminary findings support an urgent need for sustainable control of R. pseudoacacia as an effective approach to halt further reduction in grazing capacity and losses in livestock production.
Severe drought and insect outbreaks have caused widespread mortality and dieback of trees in semiarid woodlands. Despite the extent and severity of these dieback events, little is known about how the woodland understory vegetation responds to severe drought and whether that response is mediated by changes to the tree layer. We sampled understory vegetation of 98 plots in 25 pinyon-juniper woodland sites in central Nevada with varying amounts of dieback, during the third yr of a severe regional drought. Twenty-five of these plots had predrought baseline data. We related covers of perennial grasses, perennial forbs, shrubs, and the exotic invasive cheatgrass to gradients of aridity, tree cover, soil water capacity, and tree dieback. Between 2005 and 2015, the covers of perennial grasses and forbs declined, while shrub cover remained mostly unchanged. We found that the greatest understory cover was associated with lower tree cover and greater tree dieback. We did not find evidence for rapid colonization of microhabitats created by tree mortality. The site-level response of vegetation to tree dieback depended on aridity: For both perennial grasses and cheatgrass, tree mortality had a positive effect only on dry sites. Cheatgrass abundance increased the most at dry sites with tree dieback, and almost every site with substantial dieback also had cheatgrass present. Our results show differential effects of drought and tree dieback on understory functional types, which may affect the postdrought successional trajectory of drought-impacted pinyon-juniper stands. Further research is needed to determine the mechanisms by which tree mortality affects the understory, whether through altering the litter layer, light levels, or soil water, and whether these effects persist over longer timescales.
Widespread invasive annual grasses, cheatgrass (Bromus tectorum L.) and Japanese brome (Bromus japonicus Thunb.), fluctuate greatly in abundance and compete with native species. Fire and herbicide have each provided various levels of short-term control. We tested the individual and combined effects of fall fire and the herbicide aminopyralid on annual brome to determine whether combined treatment increased or extended control. Treatments were a factorial arrangement of two fire (no fire; fall fire with 2-yr return interval) and three herbicide (no herbicide; alternate-yr herbicide; annual herbicide) treatments with five replications. Across years, fire doubled bare ground and reduced litter cover to half of that with no fire. Fire had no effect on germination of brome seed produced after fire. Fire reduced brome biomass one or two growing seasons after fire. During the first 3 study yr, brome biomass was < 29 kg·ha–1 with fire and 132 kg·ha–1 without fire. During the final year, brome biomass increased and was similar (704 kg·ha–1) across fire treatments. Nonbrome biomass was 19% greater with fire during 2015 and 2018. Nonburned plots shifted from dominance by C3 perennial grass to dominance by bromes by the last year. C3 perennial grass maintained dominance through 2017 with fire and was codominant with bromes during 2018. Aminopyralid reduced brome germination each year it was applied but did not affect brome biomass. C3 perennial grass dominated in both herbicide treatments through 2017. During 2018, bromes dominated with no herbicide or alternate-yr herbicide and were codominant with C3 perennial grass with annual herbicide treatment. All treatment combinations reduced forbs compared with the nonburned, no-herbicide treatment. Long-term control of annual bromes requires long-term commitment to repeated treatment. The combination of fall fire and spring application of aminopyralid did not extend annual brome control under the study conditions.
Dairy and livestock grazing operations can introduce pollutants such as fecal coliform to surface waters, posing risk to human health and ecological conditions. Agricultural best management practices (BMPs) that increase control of manure, runoff, and animal access to waterways can reduce deposition of microbial pollution into streams. Between 2000 and 2013, we monitored water quality (fecal indicator bacteria [FIB] consisting of fecal coliform [FC] and Escherichia coli [EC]) in three coastal watersheds that encompass intensively managed dairies, beef cattle grazing operations, and public use at Point Reyes National Seashore in Marin County, California. Concurrently, approximately 30 BMPs were implemented to manage cattle and improve dairy infrastructure. We also monitored a fourth adjacent watershed without any BMPs and limited direct livestock influence. The primary FIB parameter was changed from FC to EC midway through the study in 2007, so we combined FC data from 2000 to 2006 with EC data from 2007 to 2013 using previously published EC/FC ratios and data from this study to create a continuous time series. Competing Bayesian generalized linear mixed models examined whether FIB was best explained by year, 24-h rainfall, season, or annual rainfall and compared results with numeric regulatory objectives for surface waters. FIB from 2000 to 2013 declined at all 13 water quality stations that were downstream of BMPs implemented during the study, while there was a slight positive trend in the adjacent watershed without BMPs and limited livestock influence. There was a 54–99% reduction in FIB with a sixfold increase in the frequency of samples meeting regulatory criteria over the study period, and results were robust to varying assumptions of the relationship between FC and EC. These findings indicate that targeted BMPs can effectively reduce FIB, increasing the probability of meeting water quality objectives across varying types of livestock operations.
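The harmonization step described above—expressing the pre-2007 fecal coliform record on the E. coli scale using an EC/FC ratio—can be sketched as a simple conversion; the ratio below is a placeholder, not the published or study-derived value.

```python
# Minimal sketch (illustrative only): converting fecal coliform (FC) concentrations
# to E. coli (EC) equivalents with an EC/FC ratio to build a continuous time series.
EC_FC_RATIO = 0.6  # hypothetical conversion ratio, not the value used in the study

def fc_to_ec_equivalent(fc_mpn_100ml: float, ratio: float = EC_FC_RATIO) -> float:
    """Express an FC concentration (MPN/100 mL) on the EC scale."""
    return fc_mpn_100ml * ratio

# Example: pre-2007 FC samples expressed as EC equivalents
print([fc_to_ec_equivalent(x) for x in (120.0, 450.0, 80.0)])
```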
Royce E. Larsen, Matthew W.K. Shapero, Karl Striby, LynneDee Althouse, Daniel E. Meade, Katie Brown, Marc R. Horney, Devii R. Rao, Josh S. Davy, Craig W. Rigby, Kevin B. Jensen, Randy A. Dahlgren
Livestock obtain forage by grazing on rangeland. In California annual rangelands, residual dry matter is commonly used to determine proper grazing levels. Rangeland forage biomass and quality can degrade dramatically during the dormant summer period. We examined 25 sites across an annual rainfall gradient (183–492 mm) over 3 contrasting rainfall yr (2015–2017) that varied from 57% to 152% of average annual precipitation. Overall fractional biomass loss was 54.4% (range = 46.5–61.5%), with greater fractional losses occurring in dry years. Biomass losses were related to the amount of peak standing crop and plant composition—both a function of annual precipitation. Fractional seasonal losses from peak standing biomass were: 2015 = 962 kg/ha (61.5% seasonal; 9.7% monthly), 2016 = 1 541 kg/ha (55.0% seasonal; 8.7% monthly), and 2017 = 1 923 kg/ha (46.5% seasonal; 7.3% monthly). Forage quality metrics were strongly affected by summer weathering processes. Crude protein concentrations decreased by 33.6%, 27.7%, and 21.0% in 2015, 2016, and 2017, respectively. In contrast, relative concentrations of fiber and lignin (acid detergent fiber [ADF] = cellulose + lignin) in the weathered biomass showed increases for ADF: 44.6% (2015), 32.2% (2016), and 24.1% (2017). Increased lignin varied: 3.4% in 2015, 23.9% in 2016, and 28.0% in 2017. While ADF and lignin concentrations in the weathered biomass increased during the weathering process, the standing stocks (kg/ha) decreased by 39.3% (ADF) and 46.6% (lignin), compared with an overall weathered biomass loss of 54.4% and CP loss of 67.1%. The significant loss of aboveground biomass and forage quality as weathering processes occurred throughout the dry summer period affects livestock grazing strategies. Forage biomass and nutrient losses through the dry season should be considered when determining grazing strategies to achieve proper residual dry matter levels and nutrient supplementation regimes before the onset of the rainy season.
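The seasonal and monthly loss figures above are consistent with the monthly rate being the seasonal loss spread over a dry period of roughly 6.3 months; that dry-period length is an inference from the reported ratios, not a value stated in the abstract, and the sketch below assumes it.

```python
# Minimal sketch (assumed arithmetic, not the authors' calculations): fractional
# seasonal loss of standing forage and a simple (non-compounded) monthly rate,
# assuming a ~6.3-month dry period inferred from the reported ratios.

def fractional_loss(peak_kg_ha: float, residual_kg_ha: float) -> float:
    """Fraction of peak standing crop lost over the dry season."""
    return (peak_kg_ha - residual_kg_ha) / peak_kg_ha

def monthly_rate(seasonal_loss: float, dry_season_months: float = 6.3) -> float:
    """Simple average monthly loss rate over the dry period."""
    return seasonal_loss / dry_season_months

# e.g., 2015: peak of 962 kg/ha with a 61.5% seasonal loss
seasonal = fractional_loss(962.0, 962.0 * (1 - 0.615))
print(f"seasonal {seasonal:.1%}, monthly {monthly_rate(seasonal):.1%}")
```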
The objective was to determine if 3-axis accelerometers could be used to predict daily activity for cattle grazing rangeland. There were 48 Hereford × Angus 2-yr-old low- or high-residual feed intake (LRFI or HRFI) cows used in this 2-yr trial. Cattle grazed in 4 pasture treatments consisting of continuously grazed, control (CCON); continuously grazed, supplemented (CTRT); rotationally grazed, control (RCON); and rotationally grazed, supplemented pastures (RTRT). Three LRFI- and 3 HRFI-collared cows in each treatment had accelerometers mounted for 29 d in 2016 and 45 d in 2017, beginning mid-October. Grazing time (GT), resting time (RT), and walking time (WLK) were obtained for each cow by direct observation over 3 d each year and compared with accelerometer predicted behavior. In 2016, 1.6% of the days were rejected for halter-mounted accelerometers and 3.6% were rejected in 2017 for collar-mounted accelerometers. The GT and RT were more accurately predicted than was WLK with the percentage error of predicted against observed data being 11.94% for RT, 13.51% for GT, and 30.13% for WLK in 2017. Less observation data were available in 2016, but when considering other sampling periods for the same cows and halters, the error rate was 15.1% for RT, 19.3% for GT, and 52.6% for WLK. The accelerometers successfully identified patterns of grazing behavior and differentiated among climatic, grazing system, supplementation status, and residual feed intake classification influences on GT, RT, and WLK. In a more moderate climate year, HRFI cattle appeared to rest less (P < 0.08) and walk more (P < 0.07) than LRFI cattle. Similar patterns were observed for cattle in the CCON versus CTRT treatments, with supplemented cattle resting more (P < 0.05) and walking less (P < 0.05). Accelerometers appear to be effective in determining mechanistic adaptations in grazing behavior by beef cattle on range.
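The accuracy measure reported above is a percentage error of accelerometer-predicted activity against direct observation; the exact formula is not given in the abstract, so the sketch below assumes a mean absolute percentage error with hypothetical minutes of grazing.

```python
# Minimal sketch (hypothetical data): percentage error of accelerometer-predicted
# daily activity against direct observation. The study's exact error formula is
# not stated; mean absolute percentage error (MAPE) is assumed here.
import numpy as np

def mape(observed: np.ndarray, predicted: np.ndarray) -> float:
    """Mean absolute percentage error (%) of predicted vs. observed minutes."""
    return float(np.mean(np.abs(predicted - observed) / observed) * 100.0)

observed_grazing_min = np.array([510, 480, 530])    # hypothetical observed minutes/day
predicted_grazing_min = np.array([455, 540, 470])   # hypothetical accelerometer output
print(f"GT percentage error: {mape(observed_grazing_min, predicted_grazing_min):.1f}%")
```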