Wheat crops usually yield more when grown after another species than when grown after wheat. Quantifying the yield increase and explaining the factors that affect it will assist farmers in deciding on crop sequences. This review quantifies the yield increase, based on >900 comparisons of wheat grown after a break crop with wheat grown after wheat. The mean increase in wheat yield varied with species of break crop, ranging from 0.5 t ha–1 after oats to 1.2 t ha–1 after grain legumes. Based on overlapping experiments, the observed ranking of break-crop species in terms of mean yield response of the following wheat crop was: oats < canola ≈ mustard ≈ flax < field peas ≈ faba beans ≈ chickpeas ≈ lentils ≈ lupins. The mean additional wheat yield after oat or oilseed break crops was independent of the yield level of the following wheat crop. The wheat yield response to legume break crops was not clearly independent of yield level and was relatively greater at high yields. The yield of wheat after two successive break crops was 0.1–0.3 t ha–1 greater than after a single break crop. The additional yield of a second wheat crop after a single break crop ranged from 20% of the effect on a first wheat crop after canola to 60% after legumes. The mean yield effect on a third wheat crop was negligible, except in persistently dry conditions. The variability of the break-crop effect on the yield of a second wheat crop was larger than that on a first wheat crop, particularly following canola. We discuss the responses in relation to the mechanisms by which break crops affect the soil and following crops. By quantifying the magnitude and persistence of break-crop effects, we aim to provide a basis for the decision to grow continuous cereal crops, strategic rotations or tactically selected break crops. In many wheat-growing areas, the large potential yield increases due to break crops are not fully exploited.
Research into quantifying the net benefits of break crops, determining the situations where the benefits are greatest, and improving the benefits of break crops promises to improve the efficiency of wheat-based cropping systems.
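The persistence figures in this review suggest a simple way to gauge a break crop's total value across a wheat sequence: the full first-crop response plus a species-dependent carryover fraction on the second wheat crop, with a negligible third-crop effect. A minimal sketch, using hypothetical values drawn from the reported ranges (the 0.8 t ha–1 canola response is an illustrative assumption, not a figure from the review):

```python
def cumulative_break_benefit(first_wheat_gain_t_ha, carryover_fraction):
    """Total extra wheat yield (t/ha) over the following wheat crops:
    the full effect on the first crop, a carried-over fraction of that
    effect on the second, and a negligible effect on the third."""
    return first_wheat_gain_t_ha * (1 + carryover_fraction)

# Hypothetical examples based on the ranges reported in the review:
# canola: ~20% of the first-crop effect carries to the second wheat crop;
# grain legumes: ~60% carries over.
print(round(cumulative_break_benefit(0.8, 0.2), 2))  # e.g. canola
print(round(cumulative_break_benefit(1.2, 0.6), 2))  # e.g. legumes
```

This kind of back-of-envelope sum is what underlies comparisons of a single break crop against continuous cereals over a multi-year horizon.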
Continuous-cropping systems based on no-till and crop residue retention have been widely adopted across the low-rainfall cereal belt in southern Australia in the last decade to manage climate risk and wind erosion. This paper reports on two long-term field experiments that were established in the late 1990s on texturally different soil types at a time of uncertainty about the profitability of continuous-cropping rotations in low-rainfall environments. Continuous-cereal systems significantly outyielded the traditional pasture–wheat systems in five of the 11 seasons at Waikerie (light-textured soil), resulting in a cumulative gross margin of AU$1600 ha–1 after the initial eight seasons, almost double that of the other treatments. All rotation systems at Kerribee (loam-textured soil) performed poorly, with only the 2003 season producing yields close to 3 t ha–1 and no profit achieved in the years 2004–08. For low-rainfall environments, the success of a higher-input cropping system largely depends on the ability to offset the losses in poor seasons by capturing greater benefits from good seasons; therefore, strategies to manage climatic risk are paramount. Fallow efficiency, or the efficiency with which rainfall was stored during the period between crops, averaged 17% at Kerribee and 30% at Waikerie, indicating that soil texture strongly influences soil evaporation. A ‘responsive’ strategy of continuous cereal with the occasional, high-value ‘break crop’ when seasonal conditions are optimal is considered superior to fixed or pasture–fallow rotations for controlling grass, disease or nutritional issues.
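Fallow efficiency as defined above is the gain in stored soil water over the fallow period divided by the rain that fell during that period. A minimal sketch with hypothetical soil-water and rainfall figures (the 200 mm fallow rainfall and soil-water values are illustrative assumptions chosen to reproduce the reported site means):

```python
def fallow_efficiency(soil_water_start_mm, soil_water_end_mm, fallow_rain_mm):
    """Fraction of fallow-period rainfall stored as plant-available
    soil water: (gain in stored soil water) / (fallow rainfall)."""
    return (soil_water_end_mm - soil_water_start_mm) / fallow_rain_mm

# Hypothetical values: a loam storing 34 mm of 200 mm fallow rain
# (cf. the 17% mean at Kerribee) versus a lighter soil storing 60 mm
# (cf. the 30% mean at Waikerie).
print(f"{fallow_efficiency(40, 74, 200):.0%}")   # loam-textured soil
print(f"{fallow_efficiency(40, 100, 200):.0%}")  # light-textured soil
```

The contrast makes the texture effect concrete: at the same rainfall, a soil that loses more water to evaporation carries far less stored water into the next crop.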
In low-rainfall environments, a high frequency of cereal crops has been favoured for optimising productivity and risk. However, cereals at high intensity often lead to declining water-use efficiency and increasing inputs to cope with emergent nutritional, disease and weed problems. The value of including breaks in the cropping sequence can involve a high level of uncertainty in low-rainfall areas where non-cereal crops are more risky and profitability is largely determined by the subsequent benefit to cereal productivity. In this study, we aimed to improve understanding of the magnitude and primary source of break benefits such as nutrition, water and disease management in a low-rainfall environment where a high level of within-field soil variability can also contribute to uncertainty about the value of breaks. In on-farm field experiments near Karoonda in the South Australian Mallee, breaks were grown in 2009 or 2010 on four distinct soil types across a dune–swale catena. The effect of these breaks on subsequent cereal production was measured for up to 3 years. In addition, the effects of breaks on the nutrients and water available to subsequent cereal crops, along with disease infection, were explored, and actual yields were compared with nitrogen- and water-limited potential yields. Consistent cumulative benefits to subsequent cereal crops of at least 1 t ha–1 after 3 years accrued from breaks grown on the different soil types. The inclusion of breaks had beneficial effects on the cycling and supply of nutrients along with some short-term impacts on infection by Rhizoctonia solani AG8 in subsequent cereals, whereas there were no conclusive effects of breaks on the supply of water to subsequent crops. This study suggests that the inclusion of both legume and brassica breaks is likely to be beneficial to subsequent cereal production where nitrogen is a factor limiting productivity in low-rainfall, semi-arid environments.
Western Australian grain production is dominated by wheat, but growing wheat continually in unbroken sequences leads to increasing problems with soil nutrient depletion, root and leaf disease build-up, high weed burdens, and possibly other less well-defined production constraints. These can adversely affect both production and grain quality. Including breaks in the crop sequence in the form of break crops, pasture, or fallow can reduce these problems, but these breaks can be expensive to implement, in terms of both direct cost and forgone revenue. It is therefore critical to predict the response of subsequent wheat crops to a break in order to choose crop sequences rationally.
We conducted a 4-year experiment at Wongan Hills, Western Australia, evaluating how wheat productivity in a wheat-based cropping sequence is affected by including wheat, barley, lupins, triazine-tolerant and Roundup Ready® canola, oaten hay, volunteer pasture, serradella pasture, and chemical fallow. Wheat yield responded positively to fallow, lupins, oaten hay, volunteer pastures and serradella but not to barley or canola when compared with continuous wheat. Responses depended on seasonal conditions; in a dry year, a very large response occurred after fallow but not after lupin or serradella, whereas in a wetter year, there were large responses after these crops. Fallowing, cutting hay, crop-topping lupins, and spray-topping volunteer and serradella pasture all reduced seedset of annual ryegrass dramatically, and reduced weed competition was a major contributor to the observed break crop responses. Nitrogen fixation by lupins and serradella and water storage by fallow in a dry year were also important, but soilborne diseases did not contribute to wheat yield responses. Some yield responses persisted for at least 3 years, and the contribution of effects of weed competition to yield responses increased over this time. These results emphasise the importance of understanding which productivity constraints are present in a cropping system at a given time when deciding whether a break is necessary and which is the most appropriate break. The results also emphasise the importance of managing the wheat crop after a break to maximise the response and its longevity.
During the last two decades in Western Australia, the traditional mixed farming system has been increasingly displaced by intensive crop sequences dominated by wheat. Intensive wheat sequences are usually maintained by using suitable breaks, including pasture, fallow, or alternative cereal, oilseed and legume crops, to control weeds and disease, or maintain the supply of nitrogen to crops. New cereal fungicide options may also assist to maintain intensive cereal systems by suppressing soilborne cereal diseases. To guide the successful diversification of intensive cereal systems, we evaluated the effect of a 2-year experimental matrix of 10 different sequence options. Wheat in the sequence was treated with the fluquinconazole fungicide Jockey (wheat J) to control soilborne pathogens, or with the usual seed dressing of flutriafol fungicide (wheat – J), used for control of bunts and smuts only. The 10 sequence options (wheat J, wheat – J, barley, grain oats, oaten hay, canola, lupin, field pea, oat–vetch green manure, bare fallow) were arranged so that all treatment combinations were grown in year 2 following the same 10 treatments in year 1. In year 3, wheat J was grown across the entire area as the test crop. In year 2, grain yields of all crops were reduced when crops were grown on their own residues, including wheat (22% reduction), canola (46%), lupin (40%) and field pea (51%). Wheat J significantly outyielded wheat – J by 300 kg ha–1 in year 1 (14% increase) and 535 kg ha–1 in year 2 (26% increase). Wheat J was more responsive to break crops than wheat – J in both year 1 and year 2. Break crops sown in year 1, such as canola, fallow, field pea, lupin and oaten hay, continued to have a positive effect on year 3 wheat J yields. This study has highlighted the importance of break crops to following cereal crops, and provided an example in which the seed-dressing fungicide fluquinconazole, in the presence of low levels of disease, consistently improved wheat yields.
In cropping systems where one type of crop dominates for economic reasons, farmers may employ alternative cropping or pasture options for strategic purposes such as controlling weed populations, reducing crop disease, and accumulating soil nitrogen. Tactical decisions regarding break crops often involve understanding the economic implications of several interacting biophysical factors, along with complex trade-offs between short-term benefits, such as immediate profit, and long-term ecological problems, such as an increased weed seedbank. Modelling analysis regarding tactical crop-sequencing and break-crop decisions has generally not addressed these longer term dynamic factors. In this study we adapted an analysis and modelling framework (LUSO), originally designed to aid understanding of the long-term strategic planning of agricultural crop and pasture rotations, so that it can be used to analyse immediate tactical decisions regarding break crops and sequencing, while still accounting for both short- and long-term implications of these decisions. We show how the revised framework was applied to two example scenarios and demonstrate that in both cases it can be used for simple decision-support, as well as more in-depth analysis and insight into the factors influencing the immediate decision.
Crop rotation, in which a legume, pasture, fallow or oilseed ‘break crop’ is grown after a cereal crop to manage soil-borne disease and weeds and, on occasions, to fix nitrogen, is one of the oldest techniques in agriculture. Valuing crop rotations is complicated because the profitability of particular crop species changes with the prevalence of biotic stresses and varies with seasonal factors such as rainfall. With the Land Use Sequence Optimiser (LUSO) and the Agricultural Production Systems Simulator (APSIM) crop model, we generate an optimum land-use strategy for various biotic stresses and land-use options for a semi-arid grain-growing region in Australia. Over a 10-year time horizon, we compare the performance and variability of an optimal sequence with three sequences recommended by local agronomists. The agronomists recommended strategic sequences to manage weeds and disease and to maximise profit. The optimal crop sequence, with perfect knowledge, selected a mixture of grain legume, oilseed, cereal crops and pastures to manage biotic stresses and generate profit. This sequence precisely timed a period of exploitation, when high-profit crops were repeatedly grown and biotic stresses increased, with a period of rehabilitation, when low-profit break crops were grown to reduce the biotic stresses. The agronomists’ strategic sequences were either slightly more exploitative, growing more crops and allowing the biotic stresses to increase, or more conservative, growing fewer profitable crops while managing the biotic stresses. Both strategic approaches were less profitable than the optimal crop sequence. The value of knowledge about a particular stress increases as its rate of accumulation in the farming system increases. With high levels of biotic weed stress, perfect knowledge was worth an additional AU$73 ha–1 year–1. In scenarios with lower levels of biotic weed stress, perfect knowledge was worth just $24 ha–1 year–1.
Several measures of risk were defined, but there was no trade-off between profit and risk. Variability at the crop or enterprise scale did not necessarily translate into variability in profit when viewed over 10 years. Tools such as LUSO can help to determine the optimal crop sequence for a given series of enterprise options and a given level of biotic stress and explore the variability and risk associated with different enterprise choices.
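The ‘value of perfect knowledge’ reported above can be read as the profit advantage of the optimal sequence over the best strategic sequence, annualized over the planning horizon. A toy illustration with entirely hypothetical cumulative gross margins chosen to reproduce the reported per-hectare values:

```python
def value_of_knowledge(optimal_profit_per_ha, strategic_profit_per_ha, years):
    """Annualized value (AU$ ha-1 year-1) of perfect knowledge: the
    cumulative profit advantage of the optimal crop sequence over the
    best strategic sequence, divided by the planning horizon."""
    return (optimal_profit_per_ha - strategic_profit_per_ha) / years

# Hypothetical 10-year cumulative gross margins (AU$ ha-1):
# high weed-stress scenario -> $73 ha-1 year-1 (cf. reported value)
# low weed-stress scenario  -> $24 ha-1 year-1 (cf. reported value)
print(value_of_knowledge(4730, 4000, 10))  # 73.0
print(value_of_knowledge(4240, 4000, 10))  # 24.0
```

The point of the calculation is the one made in the abstract: the faster a biotic stress accumulates, the more an accurate forecast of it is worth.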
A survey was conducted of commercial broadacre paddocks in the south-west cropping zone of Western Australia from 2010 to 2013. In total, 687 paddock years of data were sampled from 184 paddocks. The land use of each paddock was recorded together with measurements of weed density, the incidence of soilborne pathogen DNA, and soil inorganic nitrogen (nitrate and ammonium). The dynamics of these biophysical variables were related to the crop and pasture sequences employed.
Wheat was the most frequent land use (60% of paddock years), followed by canola and pasture (12% each), and lupins and barley (6% each). Four crop species, wheat, canola, barley and lupins, accounted for 84% of land use. By region, wheat, canola, barley and lupin accounted for 90% of land use in the Northern Agricultural Region (NAR), 83% in the Central Agricultural Region (CAR) and 78% in the Southern Agricultural Region (SAR). Conversely, pasture usage in the SAR was 21%, compared with 12% in the CAR and 7% in the NAR.
Over the surveyed paddocks, weed density, soilborne pathogens and soil N were maintained at levels suitable for wheat production. The inclusion of land uses other than wheat at the frequency reported maintained the condition of these biophysical variables.