Context. Since the introduction of fallow deer (Dama dama) to Tasmania in the early 1830s, the management of the species has been conflicted; the species is partially protected as a recreational hunting resource, yet simultaneously recognised as an invasive species because of its environmental impact and the biosecurity risk that it poses. The range and abundance of fallow deer in Tasmania have evidently increased over the past three decades. In the 1970s, it was estimated that ∼7000–8000 deer were distributed in three distinct subpopulations occupying a region of ∼400 000 ha (generally centred around the original introduction sites). By the early 2000s, the estimated population size had more than tripled to ∼20 000–30 000 deer occupying 2.1 million ha. No study has attempted to predict the likely further growth of this population.
Aims. The purpose of our study was to provide a preliminary estimate of the future population range and abundance of fallow deer in Tasmania under different management scenarios.
Methods. We developed a spatially explicit, deterministic population model for fallow deer in Tasmania, based on estimates of demographic parameters linked to a species distribution model. Spatial variation in abundance was incorporated into the model by setting carrying capacity as a function of climate suitability.
Key results. On the basis of a conservative estimate of population growth for the species, and without active management beyond the current policy of hunting and crop protection permits, the abundance of fallow deer is estimated to increase substantially in the next 10 years. Uncontrolled, the population could exceed 1 million animals by the middle of the 21st century. This potential increase is a function of both local increases in abundance and extension of range.
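As an illustration of the kind of deterministic projection described above, the following sketch applies discrete-time logistic growth to the early-2000s abundance estimate. The growth rate and carrying capacity are assumed round numbers chosen for illustration only; the study's actual model is spatially explicit, with carrying capacity linked to climate suitability.

```python
# Hypothetical sketch of a deterministic logistic projection. Parameters
# (r, k, n0) are illustrative assumptions, not the study's fitted values.
def project(n0, r, k, years):
    """Discrete-time logistic projection: N[t+1] = N[t] + r*N[t]*(1 - N[t]/k)."""
    n = n0
    trajectory = [n]
    for _ in range(years):
        n = n + r * n * (1.0 - n / k)
        trajectory.append(n)
    return trajectory

# ~25 000 deer in the early 2000s, projected 45 years to mid-century,
# with an assumed growth rate of 12% per year and an assumed ceiling.
traj = project(n0=25_000, r=0.12, k=1_500_000, years=45)
print(f"Projected population after 45 years: {traj[-1]:,.0f}")
```

Under these assumed parameters the trajectory passes 1 million animals within the 45-year horizon, which is consistent with the order of magnitude reported above; different assumed rates shift the crossing date substantially.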
Conclusions. Our results identify areas at high risk of impact from fallow deer in the near future, including ecologically sensitive areas of Tasmania (e.g. the Tasmanian Wilderness World Heritage Area).
Implications. The research approach and results are presented as a contribution to debate and decisions about the management of fallow deer in Tasmania. In particular, they provide a considered basis for anticipating future impacts of deer in Tasmania and prioritising management to mitigate impact in ecologically sensitive areas.
Mickey Agha, Mason O. Murphy, Jeffrey E. Lovich, Joshua R. Ennen, Christian R. Oldham, Kathie Meyer, Curtis Bjurlin, Meaghan Austin, Sheila Madrak, Caleb Loughran, Laura Tennant, Steven J. Price
Context. There is little information available on how research activities might cause stress responses in wildlife, especially responses of threatened species such as the desert tortoise (Gopherus agassizii).
Aims. The present study aims to detect behavioural effects of researcher handling and winter precipitation on a natural population of desert tortoises in the desert of the south-western United States over the period 1997–2014, through extensive assessment of capture events during multiple research studies and capture–mark–recapture survivorship analysis.
Methods. Juvenile and adult desert tortoises were repeatedly handled with consistent methodology across 18 years during 10 study seasons. Using a generalised linear mixed-effects model, we assessed the effects of both research manipulation and abiotic conditions on probability of voiding. Additionally, we used a Cormack–Jolly–Seber model to assess the effects of winter precipitation and voiding on long-term apparent survivorship.
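The fixed-effects part of a binomial model like the one described can be sketched as a logistic function of handling time and precipitation. The coefficients below are invented purely to show the model's form (the study's model also included individual tortoise as a random effect, which is omitted here).

```python
import math

# Illustrative sketch of the fixed-effects structure of a logit-link model
# for voiding probability. All coefficients are hypothetical placeholders,
# not the study's fitted estimates.
def p_void(handling_min, winter_precip_mm, b0=-3.5, b_handle=0.08, b_rain=0.01):
    """P(voiding) = inverse-logit of a linear predictor in handling time
    (minutes) and winter precipitation (mm)."""
    eta = b0 + b_handle * handling_min + b_rain * winter_precip_mm
    return 1.0 / (1.0 + math.exp(-eta))

# Longer handling and wetter winters both raise the predicted probability.
print(p_void(10, 50))   # short handling, moderate winter rain
print(p_void(40, 150))  # long handling, wet winter
```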
Key results. Of 1008 total capture events, voiding was recorded on 83 (8.2%) occasions in 42 different individuals. Our top models indicated that increases in handling time led to significantly higher probabilities of voiding for juveniles, females and males. Similarly, increases in precipitation resulted in significantly higher probabilities of voiding for juveniles and females, but not for males. Tortoise capture frequency was negatively correlated with voiding occurrence. Cormack–Jolly–Seber models demonstrated a weak effect of winter precipitation on survivorship, but a negligible effect for both voiding behaviour and sex.
Conclusions. Handling-induced voiding by desert tortoises may occur during common research activities and in years of above-average winter precipitation. The increased likelihood of voiding in individuals with relatively few recaptures suggests that tortoises may initially have perceived researchers as predators, and therefore voided as a defensive strategy. Voiding does not appear to affect long-term survivorship in desert tortoises at this site.
Implications. This study has demonstrated that common handling practices may cause voiding behaviour in desert tortoises. These results suggest that, to minimise undesirable behavioural responses in studied desert tortoise populations, investigators should follow defined procedures or protocols that reduce the contact period to the extent feasible.
Context. Seasonal and individual variation in predator selection for primary and alternative prey can affect predator–prey dynamics, which can further influence invasive-predator impacts on rare prey.
Aims. We evaluated individual and seasonal variation in resource selection by feral cats (Felis silvestris catus) for areas with European rabbits (Oryctolagus cuniculus) around a breeding colony of endangered black-fronted terns (Chlidonias albostriatus) in the Upper Ohau River, within the Mackenzie Basin of New Zealand.
Methods. Within a feral cat population subject to localised control (within a 1-km area surrounding the tern colony), we mapped the movements of 17 individuals using GPS collars, and evaluated individual and seasonal variation in third-order resource selection (i.e. within home ranges) by using resource-selection functions with mixed effects. The year was divided into breeding and non-breeding seasons for terns.
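Third-order resource selection is commonly modelled with an exponential resource-selection function, w(x) = exp(β·x), which the sketch below illustrates. The covariates and coefficients here are hypothetical; in the study, coefficients were estimated with mixed effects so that selection could vary among individuals and between seasons.

```python
import math

# Sketch of an exponential resource-selection function (RSF) for
# within-home-range (third-order) selection. Covariate names and
# coefficient values are hypothetical illustrations only.
def rsf(rabbit_abundance, dist_to_colony_km, b_rabbit=0.6, b_colony=-0.3):
    """Relative selection weight for a habitat unit (not a probability):
    w(x) = exp(b_rabbit * rabbits + b_colony * distance-to-colony)."""
    return math.exp(b_rabbit * rabbit_abundance + b_colony * dist_to_colony_km)

# A cat with a positive rabbit coefficient selects high-rabbit areas:
high = rsf(rabbit_abundance=3.0, dist_to_colony_km=2.0)
low = rsf(rabbit_abundance=0.5, dist_to_colony_km=2.0)
print(high / low)  # relative selection ratio, > 1 when rabbits are selected
```

Because RSF weights are relative, only ratios between habitat units are interpretable, which is why individual variation in the sign of the rabbit coefficient (as reported below) changes which areas a given cat is predicted to use.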
Key results. Three of the eight feral cats monitored during the breeding season used the colony in proportion to availability and one selected it. These four individuals therefore pose a threat to the tern colony despite ongoing predator control. Selection by feral cats for areas with high relative rabbit abundance was not ubiquitous year-round, despite previous research showing that rabbits are their primary prey in the Mackenzie Basin.
Conclusions. Results suggest that rabbit control around the colony should reduce use by feral cats that select areas with high relative rabbit abundance (less than half the individuals monitored), but is unlikely to alleviate the impacts of those that select areas with low relative rabbit abundance. Hence, predator control is also required to target these individuals. Results thus support the current coupled-control of feral cats and rabbits within a 1-km buffer surrounding the tern colony. Future research should determine what scale of coupled-control yields the greatest benefits to localised prey, such as the tern colony, and whether rabbits aid hyperpredation of terns by feral cats via landscape supplementation.
Implications. The present study has highlighted the importance of considering seasonal and individual effects in resource selection by predators, and the role of primary prey, when designing management programs to protect rare prey.
Context. Livestock guardian dogs (LGDs, Canis familiaris) can be highly effective in protecting livestock from predators; however, how they accomplish this is poorly understood. Although it is clear that these dogs spend a high proportion of their time accompanying livestock, and confront predators that approach closely, it is unknown whether they also maintain territories around the areas used by their livestock and exclude predators from those territories.
Aims. We aimed to determine whether LGD behaviour towards predators is consistent with defence of a larger territory that encompasses the stock, or is based on repelling predators that closely approach livestock.
Methods. We used audio playbacks and scent placements to simulate incursions by dingoes (Canis dingo) at different locations within the LGD ranges, and used GPS tracking and automatic cameras to monitor responses to these incursions.
Key results. LGD responses depended on the location of the incursion. When simulated incursions were well inside the range (about the 50th kernel isopleth), the dogs responded by vocalising, leaving their livestock, and travelling up to 570 m from the stock to approach the incursion point and display challenging behaviour; when incursions were at the boundary of the range (at or beyond the 90th kernel isopleth), they vocalised but did not approach the incursion point, regardless of the location of the sheep. The LGDs in this study worked in groups, and group members responded differently to simulated incursions, some moving to challenge whereas others remained close to the sheep.
Conclusions. Our results showed that protection by LGDs extends beyond the immediate vicinity of livestock, and is consistent with the defence of a larger territory.
Implications. If predators are excluded from this territory, LGDs enforce a spatial separation of predators and livestock. This would reduce the risk of attack and also prevent the disturbance and stress to livestock that would be caused by frequent approaches by predators. Where possible, training and management of LGDs should allow them to range freely over large areas so that they can develop and exhibit territorial behaviour, and they should be deployed in groups so that group members can assume complementary roles.
Context. The giant anteater, Myrmecophaga tridactyla, is a large insectivorous mammal of the Cerrado that is classified as Vulnerable on the IUCN Red List. Despite frequent giant anteater casualties, there continues to be a lack of published data on how road and landscape attributes affect road-kill rates – information that could prove useful in guiding mitigation measures.
Aims. We seek to determine whether road and landscape attributes influence the incidence of road-kills of the giant anteater.
Methods. From February 2002 to December 2012 (except 2004), five roads in two regions of south-eastern Brazil were surveyed twice each month by car. We recorded temporal road-kill data for the giant anteater and related spatial road variables; these variables were also recorded at regular control sites every 2 km. We also collected traffic-volume data on stretches of the two roads to correlate with road-kills.
Key results. Of the 45 anteater casualties recorded, there was a predominance of adult males. On roads MG-428 and SP-334, anteater road-kills were more common in the dry season, negatively correlated with traffic volume, and related to the presence of native vegetation. Accordingly, road-kill sites tended to occur near cerrado and grasslands, and appeared more frequently on some straight stretches of roadway. Although topography was not shown to influence road-kill rates, the topography data do point to regular overpass/underpass locations that could allow population connectivity. Termitaria or ant nests were present at all road-kill sites, with 86% showing signs of feeding.
Conclusions. Native vegetation along roadways, together with straight road design, increases the probability of anteater road-kills by 40.1%.
Implications. For mitigation, mowing and removing insect nests on roadsides, as well as roadside wildlife fencing in cerrado and grassland areas, are suggested. Warning signs and radar to reduce vehicle speed are recommended for both human safety and anteater conservation. With regard to population connectivity, the absence of aggregated anteater road-kill data in this study meant that no particular crossing locations were identified. However, the collected topography data do show places that could be used for roadway crossings. The measures indicated may apply to similar species and types of topography on other continents.
Context. The Tasmanian Government is attempting to eradicate foxes from Tasmania, and carnivore-scat surveys using humans and dogs, combined with DNA testing, are the main methods of detection. Understanding the rate at which scats degrade is a key component for estimating the power of monitoring to detect cryptic predators, and will contribute to a broader understanding of the use of scat monitoring for informing eradication programs.
Aims. To estimate the degradation rate of fox scats and derive an estimate of the abundance of scats available to observers monitoring for fox presence.
Methods. In total, 486 fresh fox scats were placed at nine sites within three bioregions in Tasmania and left to degrade for up to 126 days. Scats were observed periodically by both humans and dogs to determine when they became unrecognisable and/or undetectable.
Key results. Recognition of scats by humans declined faster in summer than in winter and did not vary systematically among bioregions. Median survival times of scats were 19 days in summer and 26 days in winter. Recognition of scats by dogs was higher in summer than in winter, with dogs recognising scats past the time they became unrecognisable to humans. Using estimates of scat degradation derived from human observers, the equilibrium abundance of detectable scats within a fox home range was estimated to be 179–243 scats. However, the abundance of detectable scats on linear features subject to monitoring was estimated to be 10–15 scats.
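The equilibrium-abundance calculation above can be sketched as a steady-state balance: abundance = deposition rate × mean persistence time, with mean persistence obtained from median survival as median/ln(2) under an assumed exponential decay of scats. The deposition rate used below (6.5 scats per day per home range) is a hypothetical value for illustration, not a figure taken from the study.

```python
import math

# Back-of-envelope sketch of equilibrium scat abundance. Assumes exponential
# scat decay (mean persistence = median / ln 2) and a hypothetical deposition
# rate; the study's actual deposition assumptions may differ.
def equilibrium_scats(median_survival_days, deposition_per_day=6.5):
    """Steady-state scat abundance = deposition rate x mean persistence."""
    mean_persistence = median_survival_days / math.log(2)
    return deposition_per_day * mean_persistence

summer = equilibrium_scats(19)  # median scat survival of 19 days in summer
winter = equilibrium_scats(26)  # median scat survival of 26 days in winter
print(f"summer: {summer:.0f} scats, winter: {winter:.0f} scats")
```

With these assumptions the seasonal equilibria fall in roughly the 180–240 range, the same order of magnitude as the 179–243 scats per home range reported above; the much smaller figure for linear features reflects the small fraction of a home range that such features cover.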
Conclusions. Using our estimate of the abundance of scats on linear features, the current distribution of fox scats detected in Tasmania may not be as anomalous as has been suggested by others. However, fox detection from scats will be highly dependent on deposition patterns and distribution of scats on linear features and this should be critically reassessed in Tasmania.
Implications. Fox scats are not expected to exhibit systematic regional differences in degradation rates that might have an impact on monitoring strategies. Estimates of the abundance of scats detectable by observers are critical for assessing the effectiveness of scat-monitoring programs. We advocate that a rigorous assessment of future scat-monitoring programs in Tasmania be undertaken to determine their power to detect foxes.
Context. In young forests of the Pacific North-west of North America, the potential impacts of domestic grazing by cattle (Bos taurus) on forest ecosystems and on native ungulates such as mule deer (Odocoileus hemionus) are poorly understood. It is not clear how cattle and deer may interact in young forests used as summer range by both ungulates, and as winter range by deer, where pre-commercial thinning (PCT) and fertilisation enhance both timber and forage.
Aims. To test the following two hypotheses: (H1) that PCT and repeated fertilisation of young lodgepole pine (Pinus contorta var. latifolia) stands would increase relative habitat use by cattle; and (H2) that increased use of forested range by cattle would result in decreased use by mule deer.
Methods. Replicate study areas were located near Summerland, Kelowna, and Williams Lake in south-central British Columbia, Canada. Each study area had the following nine treatments: four pairs of stands thinned to densities of ∼250 (very low), ∼500 (low), ∼1000 (medium), and ∼2000 (high) stems ha–1 with one stand of each pair fertilised five times at 2-year intervals. Relative habitat use was measured by counting cowpies for cattle in summer and pellet groups for deer in summer and winter periods 1998–2003.
Key results. Relative habitat use by cattle was significantly enhanced by fertiliser treatments and heavy thinning, supporting H1. Relative habitat use by deer during summer periods was not affected by stand density, but was significantly higher in fertilised than unfertilised stands, with no difference in winter months, thereby not supporting H2.
Conclusions. Summer habitat use by mule deer appeared to be a function of forage opportunities, and no significant correlations in relative habitat use between cattle and mule deer during summer were detected. Negative correlations were better explained by the need of deer for tree cover during severe winter conditions than by a negative response to cattle grazing.
Implications. Domestic grazing by cattle may be compatible with native ungulates such as mule deer, at least in forest sites that are managed intensively for timber production. Fertilisation may result in sufficient forage production in the understorey vegetation of these forest ecosystems to compensate for the reduction in live forage biomass caused by cattle grazing.
Context. Hollow-bearing trees are an important breeding and shelter resource for wildlife in Australian native forests and hollow availability can influence species abundance and diversity in forest ecosystems. A persistent problem for forest managers is the ability to locate and survey hollow-bearing trees with a high level of accuracy at low cost over large areas of forest.
Aims. The aim of this study was to determine whether remote-sensing techniques could identify key variables useful in classifying the likelihood of a tree to contain hollows suitable for wildlife.
Methods. The data were high-resolution multispectral aerial imagery and light detection and ranging (Lidar). A ground-based survey of 194 trees (96 Eucalyptus crebra and 98 E. chloroclada and E. blakelyi) was used to train and validate tree-senescence classification models.
Key results. We found that trees in the youngest stage of tree senescence, which had a very low probability of hollow occurrence, could be distinguished using multispectral aerial imagery from trees in the later stages of tree senescence, which had a high probability of hollow occurrence. Independently, the canopy-height model used to estimate crown foliage density demonstrated the potential of Lidar-derived structural parameters as predictors of senescence and the hollow-bearing status of individual trees.
Conclusions. This study demonstrated a ‘proof of concept’ that remotely sensed tree parameters are suitable predictor variables for the hollow-bearing status of an individual tree.
Implications. Distinguishing early stage senescence trees from later-stage senescence trees using remote sensing offers potential as an efficient, repeatable and cost-effective way to map the distribution and abundance of hollow-bearing trees across the landscape. Further development is required to automate this process across the landscape, particularly the delineation of tree crowns. Further improvements may be obtained using a combination of these remote-sensing techniques. This information has important applications in commercial forest inventory and in biodiversity monitoring programs.
João J. S. Paula, Regina M. B. Bispo, Andreia H. Leite, Pedro G. S. Pereira, Hugo M. R. G. Costa, Carlos M. M. S. Fonseca, Miguel R. T. Mascarenhas, Joana L. V. Bernardino
Context. To assess the real impact of human-made structures on bird and bat communities, a large number of carcass-removal trials have been performed worldwide in recent decades. Recently, researchers have started to use camera traps to record the exact removal time of carcasses and to better understand the factors that influence removal.
Aims. In our study, we endeavoured to identify the factors that significantly affect carcass-persistence time, such as (1) season, (2) scavenger guild, (3) type of carcass, (4) habitat and (5) weather conditions. Additionally, we aimed to assess the performance of camera-trapping technology in comparison to the conventional method typically used in carcass-removal trials.
Methods. We conducted two trials at two wind farms, one during early spring and one during summer. In each trial, we used 30 bird carcasses and 30 mouse carcasses as surrogates for bats. Digital infrared camera traps were used to monitor each carcass. A chi-squared test was used to investigate differences between wind farms in the scavenger guild, and a log-rank test was used to compare carcass-persistence times between the two wind farms. Carcass-persistence times were analysed using both non-parametric and parametric survival models. Finally, we evaluated the percentage of carcasses removed during daytime and during night-time.
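The standard non-parametric tool for carcass-persistence data of this kind is the Kaplan–Meier estimator, which handles carcasses still present at the end of a trial as right-censored observations. The sketch below implements a minimal version; the data are invented for illustration, not drawn from the study.

```python
# Minimal Kaplan-Meier estimator of the kind used in non-parametric
# carcass-persistence analysis. Observations are (days, removed) pairs;
# removed=False means the carcass was still present at trial end
# (right-censored). Example data are invented for illustration.
def kaplan_meier(observations):
    """Return [(time, survival_probability)] at each removal event."""
    events = sorted(observations)
    at_risk = len(events)
    survival = 1.0
    curve = []
    for time, removed in events:
        if removed:
            survival *= (at_risk - 1) / at_risk
            curve.append((time, survival))
        at_risk -= 1  # censored carcasses leave the risk set without an event
    return curve

data = [(1, True), (2, True), (2, True), (4, True), (5, False),
        (7, True), (10, False)]
print(kaplan_meier(data))
```

A log-rank test, as used in the study, then compares two such survival curves (e.g. one per wind farm) by contrasting observed and expected removal events in the pooled risk sets.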
Key results. In our study area, carcass-persistence times were influenced by the scavenger guild present and by exposure to rain. Camera traps allowed us to record the exact removal time for the majority of the carcasses, reducing the number of visits to the study site roughly five-fold. However, there were also cases in which data were lost as a result of equipment failure or camera theft.
Conclusions. Results demonstrated the importance of undertaking site-specific carcass-removal trials. Camera-trapping is a valid option that reduces travel costs, although the cost of equipment purchase and the risk of camera theft should be taken into consideration.
Implications. When choosing camera-trapping, the main aspect to evaluate is the balance between the investment in equipment and the savings from reduced travel to the study site. Further studies are required on how the data collected affect the accuracy of the resulting carcass-removal correction factors.