Effect of harvest maturity stage and seeding rate on alfalfa yield and quality

Cultivated alfalfa (Medicago sativa L.) is a highly productive forage crop with great economic potential. Our objectives were to investigate the effects and interactions of environment, harvest maturity stage, seeding rate and cultivar on alfalfa dry matter yield and quality. The field experiment was carried out over 2010-2012 at the Experimental Field of the Institute of Field and Vegetable Crops in Novi Sad, at two locations, (I) Čenej and (II) Rimski Šančevi, characterized by contrasting soil conditions. The experimental treatments included two seeding rates of 8 and 16 kg ha-1, three harvest maturity stages, and four alfalfa cultivars included as a subplot in every treatment. Harvesting alfalfa five times within a year (cutting at the beginning of flowering) in the second and third year of stand life was the most efficient harvest regime, allowing full exploitation of cultivar genetic potential and environmental conditions. There was no difference in yield between harvests at the early and late flowering stages (15.9 t ha-1). Forage quality data showed an increase in nutritive value when alfalfa was harvested at an earlier maturity stage; maturity stage at the moment of harvest significantly affected all quality parameters. The results suggest that in the temperate climate of Southeast Europe there is little justification for higher alfalfa seeding rates under good establishment practices: increasing seeding rates above 16 kg ha-1 does not provide a long-term benefit in alfalfa production.


Introduction
Alfalfa is a forage species widely grown around the globe, mainly for hay, pasture and silage production, because of its high nutritive value and broad adaptability (Li & Brummer, 2012). In Serbia, alfalfa is the most important perennial forage legume, cultivated on around 200,000 ha, about 8% of the total agricultural arable land (Milić et al., 2014).
A central issue in alfalfa production is that yield and quality are inversely related (Sheaffer et al., 2000; Brink et al., 2010). Harvesting alfalfa at early maturity stages (i.e. pre-bud or early bud) increases forage quality but decreases yield. Many studies have reported a decrease in alfalfa forage quality when plants were cut from the early bud to the late flowering stage (Sheaffer et al., 2000; Kallenbach et al., 2002; Schwab et al., 2005; Katic et al., 2005; Lamb et al., 2006; Brink et al., 2010). Alfalfa yields are significantly affected by the choice of cultivar (Kallenbach et al., 2002; Lamb et al., 2006; Li & Brummer, 2012), but the quality of alfalfa hay is strongly affected by the harvest regime (maturity stage) and weed control. Still, there are reports that genetic background can affect forage quality and digestibility (Sheaffer et al., 1998; Hall et al., 2000; Kallenbach et al., 2002; Lamb et al., 2007; Milic et al., 2011; Rimi et al., 2012). In addition, it is of great importance to choose the most appropriate cutting schedule for each cultivar (Kallenbach et al., 2002; Orloff & Putnam, 2010; Rimi et al., 2012). Harvest timing is the most powerful tool under the alfalfa grower's control to affect yield and quality, and ultimately profitability; more so than cultivar choice, fertilization and other management factors (Orloff & Putnam, 2006). To achieve balanced yield and quality of alfalfa over the years, it is necessary to use cultivars that differ in earliness (Kallenbach et al., 2002; Orloff & Putnam, 2010; Rimi et al., 2014). In the USA, planting alfalfa cultivars of different dormancy groups can improve forage quality, because non-dormant cultivars can be cut at earlier stages and their hay quality declines more slowly with advancing plant maturity.
Besides differing in fall growth and winter survival, less dormant varieties tend to re-grow faster after cutting than more dormant varieties. Within a cutting schedule (frequency), the fall dormancy rating can be a powerful indicator of forage quality (Orloff & Putnam, 2006). Consequently, the selection of cultivar fall dormancy rating should be considered of secondary importance compared to the choice of harvest regime when improving alfalfa yield performance in temperate environments (Rimi et al., 2014). Brink et al. (2010) reported that, across the northern US states, only the spring and early summer harvests in humid regions need to be performed earlier and more frequently to obtain forage with high nutritive value, because forage quality components change faster in those regions; in more arid environments, the timing of the spring harvest appears less critical, owing to the slower rate of change in nutritive value compared to early and late summer. Variation in forage quality is driven by higher air temperatures and lower soil moisture; the seasonal variation is, at least partially, a result of the faster growth and maturation of alfalfa plants in summer (Katić et al., 2007; Brink et al., 2010).
Several teams have studied alfalfa stand establishment as a function of seeding rate, using both conventional (Hall et al., 2004; Lloveras et al., 2008) and, more recently, glyphosate-tolerant technologies (Bradley et al., 2010; Berti et al., 2014). These studies addressed the year of establishment of alfalfa fields, but also the effects of different seeding rates on alfalfa yield, plant density and forage quality in the years after establishment. Their results clearly demonstrate that seeding rates above 16 kg ha-1 of pure live seed have no significant impact on dry matter yield or forage quality (Hall et al., 2004; Bradley et al., 2010; Berti et al., 2014), regardless of whether the stand is conventional or glyphosate-tolerant. An effect on plant density is recorded when planting lower seeding rates of 8 kg ha-1 of pure live seed (Hall et al., 2004), in particular in the year of establishment (Bradley et al., 2010; Berti et al., 2014). Lower seeding rates can reduce stem density, but the lower density affects neither alfalfa yield nor digestibility during the year of establishment (Bradley et al., 2010). Seeding at rates higher than 17 kg ha-1 provided no measurable long-term benefit in terms of yield and forage quality (Hall et al., 2004; Lloveras et al., 2008; Bradley et al., 2010; Berti et al., 2014); instead, attention should be paid to the method of planting, seedbed preparation, and weed control.
It is expected that alfalfa yield and quality vary significantly with the phenological development stage of the plants, with agroecological conditions, and with the different reactions of the tested cultivars to the various cutting regimes. A foundation of successful alfalfa production is a compatible combination of management system and cultivar choice that increases forage yields without significant losses of nutritive value.
The objectives of this study were to: i) evaluate the effects of harvest management strategies based on maturity stages across different environments on alfalfa dry matter yield and quality; ii) compare the effects of seeding rate on alfalfa yields over the long term; and iii) evaluate the influence of genetic background (cultivar) on alfalfa yield and forage quality across environments and harvest maturity stages.

Experimental design
The field trials with data collection were carried out over 2010-2012 at the Experimental Field of the Institute of Field and Vegetable Crops in Novi Sad, Serbia, on chernozem soil, at two locations: Čenej (location I) and Rimski Šančevi (location II), which share the same climate but contrast in soil conditions in terms of K2O and P2O5 content (Table 1). The experimental design was a randomized complete block with three replicates in a split-plot factorial arrangement, where seeding rates were the main plots and all combinations of the four cultivars and maturity stages were the subplots. The trial was sown in early April 2009; that establishment year was excluded from the evaluation. The experimental treatments included: i) two seeding rates of 8 and 16 kg ha-1; ii) three harvest maturity stages, according to the 9-stage classification (Kalu & Fick, 1981): 5 - early flower stage (EFS), one node with one open flower (standard open), no seed pods (approximately 10% of plants with flowers); 6 - late flower stage (LFS), ≥ 2 nodes with open flowers, no seed pods (approximately 50% of plants with flowers); and 8 - late seed pod stage (LSP), ≥ 4 nodes with green seed pods; and iii) four alfalfa cultivars included as a subplot in every treatment. Alfalfa cultivars were planted at a depth of 1 cm using a Wintersteiger single-row seeder. Individual subplots were 1 m × 6 m (6 m2) with a row spacing of 20 cm. In the first year of plant life (2009), all standard cultivation practices were applied (weed and insect control). Plots were cut twice without data recording in order to ensure successful establishment of the experiment.
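The split-plot layout described above can be sketched in code. The following is a minimal, illustrative randomization under the stated design (seeding rates as main plots within each of three blocks; all cultivar × maturity combinations as subplots); it is not the authors' actual field plan, and the fixed seed is only for reproducibility of the example.

```python
# Illustrative split-plot randomization: 3 blocks x 2 main plots (seeding rate)
# x 12 subplots (4 cultivars x 3 maturity stages) = 72 subplots in total.
import itertools
import random

def randomize_layout(seed=42):
    rng = random.Random(seed)
    seeding_rates = [8, 16]  # kg ha-1, main-plot factor
    cultivars = ["NS Mediana ZMS V", "Nijagara", "Banat VS", "NS Alfa"]
    maturities = ["EFS", "LFS", "LSP"]
    layout = []
    for block in range(1, 4):          # three replicates (blocks)
        mains = seeding_rates[:]
        rng.shuffle(mains)             # randomize main plots within each block
        for rate in mains:
            subs = list(itertools.product(cultivars, maturities))
            rng.shuffle(subs)          # randomize subplots within each main plot
            for cultivar, maturity in subs:
                layout.append((block, rate, cultivar, maturity))
    return layout

layout = randomize_layout()
```

Each tuple identifies one 6 m2 subplot; the randomization respects the split-plot restriction that all subplots of one seeding rate are grouped within that main plot.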
At each location during 2010-2012, plots were cut at a stubble height of 7 cm using a Cibus forage harvester. In 2010 and 2011, plots were cut five times at the EFS, four times at the LFS and three times at the LSP. In the fourth year of plant life (2012), the number of harvests differed: three at the early flowering stage, two at the late flowering stage and one at the green pod stage at both locations (Table 2). The changes in the number of harvests in 2012 were related to stand age, cutting schedule and the severe drought during the growing season, which significantly influenced trial performance in that year.

Plant material
The research involved four commercial alfalfa cultivars in Serbia: NS Mediana ZMS V, Nijagara, Banat VS, and NS Alfa, developed at the Institute of Field and Vegetable Crops, Novi Sad, Serbia. The cultivars were derived using the same breeding method, but they have different genetic backgrounds, come from different breeding cycles and were created for different growing conditions. NS Mediana ZMS V was developed from crosses of local Serbian blue and yellow tetraploid alfalfa populations, and Nijagara was developed by individual selection from crosses of elite Serbian and US germplasm (Milic et al., 2014). These cultivars represent synthetic populations intended for hilly regions and heavy hydromorphic soils (they carry a certain amount of the M. falcata gene pool). Banat VS is a cultivar selected from the Pannonian ecotype of alfalfa, well adapted to dry regions and sandy soils. NS Alfa is a cultivar selected from Northwest European alfalfa populations, more tolerant to lodging and well adapted to colder regions due to better persistency (Katic et al., 2008).

Measurements
The effects of the applied treatments on dry matter yield (DMY) (t ha-1), crude protein (CP), neutral detergent fiber (NDF), acid detergent fiber (ADF), and acid detergent lignin (ADL) content of alfalfa were monitored. Green forage yield was determined by cutting the plot area, and the fresh weight was measured immediately. Subsamples of approximately 300 g were taken from every plot and dried in an oven at 60 °C for 72 h to determine the forage dry matter fraction, which was used to calculate dry matter yield. Forage quality analyses were performed on whole-plant samples taken from the second cut in 2010 and 2011 at both locations. Total forage nitrogen (N) was determined by the Kjeldahl technique (Bremmer & Breitenbach, 1983), and CP content was calculated by multiplying total forage N by 6.25. The fiber analyses (NDF, ADF and ADL) were carried out using the Filter Bag Technique on an Ankom 2000 Fiber Analyzer (Ankom Technology Corp., Fairport, NY, USA).
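The two calculations above (DMY from a fresh plot weight scaled by an oven-dried subsample, and CP from Kjeldahl N × 6.25) can be written out explicitly. The sketch below uses purely hypothetical plot values for illustration; only the formulas come from the text.

```python
# Sketch of the yield and protein calculations described above;
# all numeric inputs are illustrative, not data from the trial.

def dry_matter_yield(fresh_weight_kg, plot_area_m2, sub_fresh_g, sub_dry_g):
    """Dry matter yield (t ha-1) from plot fresh weight and an oven-dried subsample."""
    dm_fraction = sub_dry_g / sub_fresh_g                # subsample dried at 60 C for 72 h
    fresh_t_ha = fresh_weight_kg / plot_area_m2 * 10_000 / 1_000  # kg m-2 -> t ha-1
    return fresh_t_ha * dm_fraction

def crude_protein(total_n_g_kg):
    """Crude protein (g kg-1) from Kjeldahl total N using the 6.25 factor."""
    return total_n_g_kg * 6.25

# Hypothetical plot: 18 kg fresh forage from a 6 m2 subplot;
# a 300 g subsample weighs 75 g after drying (25% dry matter).
dmy = dry_matter_yield(18.0, 6.0, 300.0, 75.0)  # -> 7.5 t DM ha-1
cp = crude_protein(31.2)                        # -> 195.0 g kg-1
```

With these example inputs, 30 t ha-1 of fresh forage at 25% dry matter gives 7.5 t DM ha-1, and 31.2 g kg-1 total N corresponds to 195.0 g kg-1 CP.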

Statistical analysis
Analysis of variance was conducted to determine the effects of environment, seeding rate, plant maturity stage and alfalfa cultivar on DMY, CP, NDF, ADF and ADL content, using mixed model procedures of SAS (PROC MIXED, SAS ver. 9.3, SAS Institute, Cary, NC), as described in Littell et al. (2006). In the formulated model, each harvest year at one location was considered as one environment. For DMY, main effects and interactions among six environments, four alfalfa cultivars, three harvest maturity stages, and two seeding rates were compared. The four quality components (CP, NDF, ADF, ADL) were evaluated for main effects and interactions among four environments, four alfalfa cultivars and three harvest maturity stages at the seeding rate of 16 kg ha-1. In the overall analyses, environments were considered random, and seeding rate, harvest maturity and alfalfa cultivar were considered fixed. Least square means for DMY and forage quality components were compared using the PDIFF option of PROC MIXED (SAS Institute, Cary, NC). When a treatment effect was significant, the Tukey-Kramer test (Kramer, 1956) was used as the means separation procedure. Significant effects of treatments were declared at P < 0.05, unless otherwise indicated. To highlight the differences between treatments, an iterative algorithm (Piepho, 2004) was used, which assigns the same letters to treatments and their interactions when there is no statistically significant difference between them.
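The letter-display step can be sketched independently of the mixed model. The following is a minimal pure-Python illustration of the insert-and-absorb idea behind Piepho's (2004) algorithm (treatments that are not significantly different share at least one letter); it is not the SAS routine used in the study, and the treatment names and significance pattern in the example are hypothetical.

```python
# Compact letter display, insert-and-absorb style:
# start with one column containing all treatments, split a column for each
# significant pair, then drop redundant (absorbed) columns.

def letter_display(treatments, significant_pairs):
    """significant_pairs: set of frozensets naming pairs declared different."""
    columns = [set(treatments)]                  # initially everyone shares one letter
    for pair in significant_pairs:
        a, b = tuple(pair)
        for col in [c for c in columns if a in c and b in c]:
            columns.remove(col)
            columns.extend([col - {a}, col - {b}])   # "insert": split the column
    # "absorb": drop duplicates and columns contained in another column
    unique = []
    for col in columns:
        if col not in unique:
            unique.append(col)
    columns = [c for c in unique if not any(c < other for other in unique)]
    letters = {t: "" for t in treatments}
    for letter, col in zip("abcdefghijklmnopqrstuvwxyz", columns):
        for t in sorted(col):
            letters[t] += letter
    return letters

# Hypothetical pattern: LSP differs from both EFS and LFS; EFS and LFS do not differ.
groups = letter_display(
    ["EFS", "LFS", "LSP"],
    {frozenset({"EFS", "LSP"}), frozenset({"LFS", "LSP"})},
)
```

In this example EFS and LFS end up sharing a letter while LSP receives a different one, matching how such letters are read in the yield and quality tables.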

Weather
The location where the experiment was established (northern Serbia) has a continental semiarid to semi-humid climate, with cold winters and hot, humid summers with well-distributed rainfall. The mean monthly air temperature is 11 °C, and the annual precipitation sum is around 600 mm. The first trial year (2010) had extremely high precipitation during the growing season (684.4 mm), whereas the following two years (2011 and 2012) were extremely dry, with high temperatures and low precipitation during summer. The maximum precipitation in 2010 fell in the summer months (June, July, and August) and was nearly twice the long-term average for that period. At the beginning of the summers of 2011 and 2012, long-lasting high air temperatures and small precipitation amounts caused severe to extreme drought. The driest period was August, with 1.5 mm of precipitation in 2011 and 3.5 mm in 2012. The small amount of precipitation and high air temperatures, along with increased water consumption in August, further aggravated soil moisture conditions, so drought was again recorded at the end of September.

Results and discussion
Environment, seeding rate, maturity stage at harvest and cultivar affected total DMY (Table 3). The two-way interaction between environment and maturity stage was significant for DMY. There was considerable variation in dry matter yield (coefficient of variation 12.0%). Cutting frequency, or more accurately the maturity of the alfalfa at the time of harvest, determines forage quality and yield (Sheaffer et al., 2000; Brink et al., 2010; Orloff & Putnam, 2010). The overall analyses (Table 4) clearly demonstrated the advantage of cutting alfalfa at early stages of development in a temperate climate. There was no yield difference between harvests at the early and late flowering stages (15.9 t DM ha-1), while forage quality data showed an increase in nutritive value when alfalfa was harvested at an earlier maturity stage (Table 6). There was also variation between the environments, caused by weather variation throughout the growing seasons (2010-2012) and years of plant life. The highest yields were recorded at Čenej, which is characterized by better chemical soil properties, in the second (19.5 t DM ha-1) and third (22.4 t DM ha-1) year of stand life. At Rimski Šančevi, lower yields were obtained, mostly due to poorer soil quality. In general, yields in the fourth year of plant life (2012) were the lowest because of severe drought. Furthermore, in 2012 it was not possible to apply the desired cutting regime because of the dry season, and the distribution of harvests was different and shifted, as shown in Table 2.
Overall cultivar performance showed slight differences in yield across the studied treatments and their interactions. The highest yield was registered for cultivar Nijagara (15.6 t DM ha-1), which means that cultivar selection should still be considered an important issue together with maturity stage and environmental effects. The results clearly demonstrate that in regions with a temperate climate, such as Southeast Europe, cutting alfalfa five times within a year (harvesting at the beginning of flowering) in the second and third year of stand life is the most efficient harvest regime, allowing full exploitation of cultivar genetic potential and the environmental conditions of this particular region.
There was no statistical difference in yield between the two seeding rates (Table 4). The results of our study support the findings of Lloveras et al. (2008), who concluded that in temperate climates there is no reasonable justification for higher alfalfa seeding rates when plant stands establish well. Our seeding rate results are also in line with Hall et al. (2004), Lloveras et al. (2008), Bradley et al. (2010), and Berti et al. (2014), who point out that planting alfalfa at rates above 17 kg ha-1 yields no measurable long-term benefit in yield or forage quality, and that attention should instead be paid to establishment, i.e. seeding depth, seedbed preparation, and weed and insect control before and after planting. According to our results, alfalfa growers can plant lower seeding rates (below 16 kg ha-1), but in that case the highest priority should be given to good establishment practices.

The environment and maturity stage at harvest influenced the crude protein content in the trial (Table 5). Fiber (NDF and ADF) and acid detergent lignin content were affected only by the maturity stage. The two-way interaction between environment and maturity stage was significant only for NDF content (Figure 1). Numerous studies have documented the impact of harvest maturity stage on the forage quality of alfalfa (Sheaffer et al., 2000; Lamb et al., 2003; Brink et al., 2010; Rimi et al., 2012). Across the environments, the forage quality analyses showed that crude protein and neutral detergent fiber content were affected by environmental factors, but there were no differences in ADF and ADL content between environments (Tables 6 and 7). The highest crude protein content was registered in 2010 at Čenej (195.3 g kg-1). Environmental conditions at both locations varied considerably in 2010 and 2011; the growing season of 2011 was generally warmer and much drier than that of 2010.
The differences between the growing seasons, particularly in precipitation, contributed to the significant environmental effects, together with the differences caused by the soil properties of the two locations. Across the three harvest maturity stages, alfalfa fiber (NDF and ADF) and lignin (ADL) concentrations increased very rapidly from the early flower stage to the green pod stage (Tables 6 and 7).
The general trend for CP content was, as expected, the opposite of that for NDF, ADF, and ADL. No interaction between harvest maturity stage and cultivar was recorded for any quality component (Table 5), which is in agreement with Kallenbach et al. (2002) and Brink et al. (2010). For fiber concentrations (NDF and ADF), the results showed a strong impact of harvest maturity stage (Table 7), without any influence of genetic factors. There was also an environmental effect on NDF content, which appeared in the very humid 2010 season at Rimski Šančevi.
The forage quality data lead to two main observations. Firstly, harvest maturity has the greatest impact on forage quality, greater than the choice of cultivar; the same has previously been reported by several authors (Lamb et al., 2003; Putnam et al., 2005; Orloff & Putnam, 2010; Brink et al., 2010; Rimi et al., 2014). Other aspects of stand management besides harvest maturity, such as weed and insect control, can also affect forage quality and should not be forgotten. Secondly, environmental factors have a great impact on forage quality (Kallenbach et al., 2002). This is especially important for the seasonal variation of forage quality (low NDF and ADF concentrations and high CP content) in the first and last cuts of the year, when temperatures are lower, day length is shorter, and rainfall and soil moisture are optimal. The results for ADL content show the importance of maturity stage for improving alfalfa forage digestibility (Table 6). However, our results also suggest some genetic variability in ADL content, in accordance with other authors (Schwab et al., 2005; Lamb et al., 2007). In the overall analyses, the highest ADL level was found in cultivar NS Alfa (100.1 g kg-1), significantly higher than in the other cultivars in the trial. According to Katic et al. (2008), the selection goals set for this cultivar were fulfilled: NS Alfa met all the breeding criteria, with better tolerance to lodging (related to ADL content) and high forage yields with good quality performance.
Seasonal changes in the decline of alfalfa nutritive value have been reported by many authors (Kallenbach et al., 2002; Katić et al., 2007; Brink et al., 2010; Rimi et al., 2012). In humid regions, neutral detergent fiber (NDF) concentrations increase more rapidly during spring and early summer than in late summer or fall (Brink et al., 2010). The same authors described in detail how the slower rate of change in nutritive value later in the growing season suggests that harvest can be delayed to capture additional dry matter yield. Likewise, the timing of the spring harvest in arid environments appears less critical, owing to the slower rate of change in nutritive value compared to early and late summer, thus allowing producers to delay harvest to obtain maximum DMY. Under Serbian agro-ecological conditions this can be the case in dry seasons, especially given the large climate variation and the occurrence of more arid years in the previous decade.
Harvesting alfalfa more frequently improved forage nutritive value regardless of the fall dormancy group used. Kallenbach et al. (2002) clearly demonstrated that a gain of 10-15% in forage quality can be achieved by changing from a four-cut to a five-cut system, while the gain is closer to 7% when changing from a five-cut to a six-cut system. Our data support these results, and we can confirm that they are applicable not only to the US Midwest but also to temperate climates of Europe such as Serbia, using alfalfa cultivars with fall dormancy ratings of 5-6.
In general, the results of this study indicate that in alfalfa production in Southeast Europe increased attention should be focused on the impact of harvest maturity stage (time of cutting) on alfalfa yield and nutritive value. A better understanding of the relationship between yield and forage quality, especially fiber and lignin content, will help farmers to apply the most efficient harvest regime in order to optimize both the yield and the forage quality of alfalfa.
Our results come from one site with two sub-locations that had different soil characteristics, with wide variation in weather conditions during the trial. This allowed a highly accurate analysis of the impact of alfalfa maturity stage at the moment of harvest on yield and quality.

Conclusion
Our results indicate that only a four- or five-cut regime should be considered in order to obtain higher hay yields. Harvesting alfalfa five times per year, i.e. cutting alfalfa at the beginning of flowering under the agro-ecological conditions of Southeast Europe, significantly increases forage nutritive value without DMY reduction. Balancing the yield and quality of alfalfa depends strongly on the cutting regime, namely on the maturity stage of alfalfa, more than on cultivar selection.
Genetic background, or the cultivar factor, should also be taken into account, especially the choice of the dormancy group within the applied harvesting schedule. The most appropriate genetics and harvest stage should be chosen carefully in order to achieve forage yields with high nutritive value in temperate regions.
The results obtained in this study suggest that in the temperate climate of Southeast Europe there is little justification for higher alfalfa seeding rates under good establishment practices. Increasing seeding rates above 16 kg ha-1 does not provide a long-term benefit in alfalfa production, and more important issues should be considered, such as the method of planting and seedbed preparation. Lower seeding rates can be applied without significant yield losses if advanced stand establishment methods are used.
The basic principle of successful alfalfa production is to develop a compatible management system and cultivar choice that help farmers increase forage yields without significant losses of nutritive value, always bearing in mind the environmental factors.