Beef Research News
Brought to you by Kansas State University College of Veterinary Medicine - Farm Animal Section
Fecal shedding of Salmonella in BRD cases
A prospective cohort study was used to assess whether Salmonella shedding in commercial feedlot cattle treated with antimicrobials for respiratory disease was associated with the incidence of adverse health outcomes. Feces were collected per rectum from cattle that were examined for apparent respiratory disease, had a rectal temperature >40 degrees C, and subsequently received antimicrobial treatment. Salmonella were recovered from 918 (73.7%) of 1,245 fecal samples, and weekly prevalence estimates ranged from 49 to 100% over the 3-month study. Genotypic and phenotypic characteristics of Salmonella strains in the population were determined. Serogroup E Salmonella were most common (73.3%), followed by C1 (11.0%), C3 (8.6%), and B (1.1%). Predominant serotypes were Orion (46.5%), Anatum (19.8%), Kentucky (8.7%), Montevideo (7.5%), and Senftenberg (4.9%). All Salmonella had virulence genes invA and pagC, but few (3.9%) were positive for the antimicrobial resistance-associated integron gene intI1. Re-treatment and case fatality rates were numerically higher for individuals that were Salmonella-positive versus -negative at initial treatment, but were not statistically different on multivariate analysis. However, the case fatality rate was higher for cattle shedding Group B Salmonella than for cattle shedding other serogroups. Lots (groups) with a higher Salmonella prevalence at first treatment had a higher proportion of mortalities occurring in a hospital pen, had higher overall re-treatment rates, and were more likely to be sampled later in the study. Results indicate a high prevalence of Salmonella in cattle treated for respiratory disease, but effects associated with clinical outcomes may depend on the Salmonella strain and lot-level factors.
Alam, M.J., D.G. Renter, S.E. Ives, D.U. Thomson, L.C. Hollis, M.W. Sanderson, and T.G. Nagaraja. Potential associations between fecal shedding of Salmonella in feedlot cattle treated for apparent respiratory disease and subsequent adverse health outcomes. Veterinary Research. 2009;40:02.
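The prevalence and serogroup figures above can be cross-checked with simple arithmetic. A minimal sketch, using only the counts and percentages reported in the abstract (the per-serogroup isolate counts are back-calculated estimates, not figures from the paper):

```python
# Back-of-envelope check of the reported Salmonella prevalence figures.
positive, total = 918, 1245

prevalence = 100 * positive / total
print(f"Overall prevalence: {prevalence:.1f}%")  # matches the reported 73.7%

# Serogroup proportions among positive samples, as reported (%)
serogroups = {"E": 73.3, "C1": 11.0, "C3": 8.6, "B": 1.1}

# Approximate isolate counts implied by those percentages (illustrative only)
for group, pct in serogroups.items():
    print(f"Serogroup {group}: ~{round(positive * pct / 100)} isolates")
```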
Grazing stockpiled fescue with varying endophyte status
The objective of this study was to evaluate the performance of growing cattle when intensively grazing stockpiled endophyte-infected (E+), endophyte-free (E-), and non-toxic endophyte-infected (EN) tall fescue during the winter. The experiment was conducted over 5 consecutive winters. In each year, plots (1 ha each, 4 per treatment) were harvested for hay in August, fertilized in September, and forage was allowed to accumulate until grazing was initiated in early December. Each year, 48 Angus-cross tester cattle (4 per plot) were given a daily allotment of forage, under strip-grazing (frontal grazing) management, with a target residual height of 5 cm. Steers were used the first year, and heifers were used in subsequent years. The grazing periods for determination of pasture ADG were 86 d (yr 1), 70 d (yr 2), 86 d (yr 3), 72 d (yr 4), and 56 d (yr 5). Pasture ADG of cattle did not differ among treatments (P = 0.13) and averaged 0.51, 0.59, and 0.56 kg/d (SEM 0.03) for E+, E-, and EN, respectively. Serum prolactin concentrations of heifers grazing E+ were lower (P < 0.05) than those grazing E- and EN during all years except yr 2. In yr 2, E+ and E- did not differ (P = 0.11). Serum prolactin of heifers grazing E- and EN did not differ (P > 0.20) except in yr 4. During yr 4, serum prolactin of heifers grazing E- was greater (P = 0.05) than that of those grazing EN. Serum urea-N concentrations (SUN) tended to differ among treatments (P = 0.10) and there was a treatment by year interaction (P = 0.05). During yr 1 through 3, SUN did not differ (P > 0.15) among treatments. However, as the stands aged, E- had a greater invasion of other plant species, which increased the CP content of the sward; thus, heifers grazing E- during yr 5 had greater (P < 0.01) SUN than heifers grazing E+ and EN, which did not differ (P = 0.89).
Forage disappearance (DM basis) did not differ (P = 0.75) among treatments and was 4.7, 4.7, and 5.0 kg/animal daily (SEM 0.27) for E+, E-, and EN, respectively. Gain per ha was greater (P = 0.04) for E+ (257 kg) than for E- (220 kg) or EN (228 kg). In most years, animal grazing days on E+ were greater than on E- and EN. However, in yr 5, animal grazing days did not differ (P > 0.20) among treatments. The use of stockpiled E+ as a source of low-cost winter feed is a viable option for producers, whereas grazing of EN may be more beneficial during the spring and fall, when more severe negative effects of ergot alkaloids have been observed.
Drewnoski, M.E., E.J. Oliphant, B.T. Marshall, M.H. Poore, J.T. Green, and M.E. Hockett. Performance of growing cattle grazing stockpiled Jesup tall fescue with varying endophyte status. J. Anim. Sci. Published online first November 21, 2008. doi:10.2527/jas.2008-0977.
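The gain-per-hectare advantage of E+ despite similar ADG reflects more animal grazing days per hectare. A minimal sketch, assuming gain/ha is approximately pasture ADG multiplied by animal grazing days per hectare (that relationship is an illustrative simplification, not stated in the abstract):

```python
# Back out the animal grazing days per hectare implied by the reported
# pasture ADG and gain per hectare for each endophyte treatment.
adg = {"E+": 0.51, "E-": 0.59, "EN": 0.56}         # kg/d, from the abstract
gain_per_ha = {"E+": 257, "E-": 220, "EN": 228}    # kg/ha, from the abstract

for trt in adg:
    # gain/ha ~ ADG x animal grazing days/ha  =>  days ~ gain / ADG
    implied_days = gain_per_ha[trt] / adg[trt]
    print(f"{trt}: ~{implied_days:.0f} animal grazing days per ha")
```

The E+ stands support roughly 100 to 130 more animal grazing days per hectare than E- or EN under these assumptions, which is how the lower-gaining treatment produces the most gain per hectare.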
Genetic change based on economic objectives
The objective of this study was to evaluate and quantify the genetic progress achieved in a New Zealand Angus nucleus herd through long-term selection for an economically based, multi-trait breeding objective. A 4-trait breeding objective was implemented in 1976 and selected on through 1993 with traits consisting of slaughter weight and dressing percentage of harvest progeny and cull cows, and the number of calves weaned in the lifetime of each cow. These traits were related to gross income with none related to costs of production. To overcome this, economic weights were adjusted down for increased feed requirements of faster growing (and generally larger) animals. Performance and pedigree information was recorded on 16,189 animals from 1976 through 1993 and included weaning, yearling, and mature cow weights along with the lifetime number of calves weaned by each cow. These traits were used in the phenotypic selection indexes developed to predict the defined breeding objective. Individual performance was adjusted by least squares for major environmental fixed effects and deviated from contemporaneous means. Genetic and residual (co)variances were re-estimated for each of the traits using REML techniques and used to calculate EBV for each trait.
These EBV were in turn used to calculate annual genetic changes. The average annual genetic changes for weaning weight direct and maternal breeding value were 0.43 ± 0.05 and 0.03 ± 0.22 kg/yr, respectively. Corresponding annual genetic changes for postweaning BW gain, yearling weight, harvest weight, and mature BW were 0.29 ± 0.03, 0.72 ± 0.06, 1.7 ± 0.13, and 0.13 ± 0.09 kg, respectively. The annual change in number of calves weaned per cow lifetime was 0.006 ± 0.001 calves/cow and the change in dressing percentage was estimated to be –0.035 ± 0.003 %/yr. At the end of the program, 3.21 generations of selection had occurred with a mean accumulated selection differential of 3.87 SD. Change in objective traits due to selection was similar to or exceeded change predicted at the onset of the program with the exception of mature BW and dressing percentage. Genetic change in mature BW was not different from zero, whereas the predicted change was 29.3 kg. The overall genetic trend in the breeding objective exceeded that predicted at the onset of the program. Results of this study showed that selection on indexes developed to predict an economically based, multi-trait breeding objective will produce genetic change.
Enns, R.M. and G. B. Nicoll. Genetic change results from selection on an economic breeding objective in beef cattle. J. Anim Sci. 2008. 86:3348-3357.
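The annual genetic trends above can be scaled to the full selection program. A minimal sketch, multiplying each reported per-year trend by the 17-yr span (1976 through 1993); treating the trend as linear over the whole program is an illustrative assumption:

```python
# Cumulative genetic change implied by the reported annual trends.
years = 1993 - 1976  # 17 yr of selection

annual_trend = {
    "weaning wt direct (kg)": 0.43,
    "postweaning BW gain (kg)": 0.29,
    "yearling wt (kg)": 0.72,
    "harvest wt (kg)": 1.7,
    "mature BW (kg)": 0.13,
    "calves weaned/cow lifetime": 0.006,
    "dressing percentage (%)": -0.035,
}

for trait, rate in annual_trend.items():
    print(f"{trait}: ~{rate * years:+.2f} total over {years} yr")
```

For example, the 0.72 kg/yr yearling weight trend implies roughly +12 kg of accumulated genetic change over the program, consistent with the harvest weight trend accumulating to about +29 kg.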
Supplemental energy source & frequency effect on calf growth
Crossbred heifers (n = 120; 265 kg, SD = 37) were fed individually (84 d) to determine the effect of supplement type, concentration, and frequency on intake and performance and to estimate the energy value of dry distillers grains plus solubles (DDGS) in a high-forage diet. Treatments were arranged in a 3 x 2 x 2 factorial, with 3 supplements, 2 concentrations, and 2 frequencies of supplementation. Supplements, including dry-rolled corn (DRC), DRC with corn gluten meal (DRC + CGM), and DDGS, were fed at 0.21% (LOW) or 0.81% (HIGH) of BW daily and were provided daily (DAILY) or 3 times weekly (ALT). Heifers were fed to consume grass hay (8.7% CP) ad libitum. Individual DMI, diet composition, BW, and ADG were used to calculate energy values for DDGS and DRC. Supplement type, concentration, frequency, and interactions were tested using the MIXED procedure of SAS, with BW included as a covariate. Supplement x concentration interactions for gain (P = 0.01) and G:F (P < 0.01) were detected. At the LOW concentration, heifers supplemented with DDGS gained more and were more efficient (P ≤ 0.03) than those supplemented with DRC or DRC + CGM. No performance differences were observed (P ≥ 0.22) between DDGS and DRC + CGM in HIGH treatments, although both improved (P ≤ 0.01) gain and G:F relative to DRC. Calculated TDN content of DDGS was 18 to 30% greater than DRC. Gain and G:F were improved (P < 0.01) in heifers fed HIGH vs. LOW. Total intake was greater (P < 0.01) for HIGH than LOW, but LOW heifers consumed more hay (P < 0.01) than HIGH. The DAILY heifers consumed more (P < 0.01) hay and total DM than the ALT heifers. The DAILY heifers gained more (P < 0.01) than ALT, but G:F was not affected (P = 0.85) by supplementation frequency. In a high-forage diet, DDGS has greater energy value than corn.
Loy, T.W., T.J. Klopfenstein, G.E. Erickson, C.N. Macken, and J.C. MacDonald. Effect of supplemental energy source and frequency on growing calf performance. J. Anim. Sci. 2008. 86:3504-3510.
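The BW-based supplementation rates translate into concrete daily allowances. A minimal sketch using the mean initial BW from the abstract; the assumption that ALT delivers the same weekly total split over 3 feedings is an illustrative interpretation of "3 times weekly," not a detail stated in the abstract:

```python
# Daily supplement allowances implied by the LOW/HIGH treatment design,
# and the per-feeding amount if the weekly total is split over 3 feedings.
bw = 265  # kg, mean initial BW from the abstract

for label, pct in [("LOW", 0.0021), ("HIGH", 0.0081)]:
    daily = bw * pct                 # kg supplement per day
    alt_feeding = daily * 7 / 3      # kg per feeding, 3 times weekly (assumed)
    print(f"{label}: {daily:.2f} kg/d if DAILY, {alt_feeding:.2f} kg/feeding if ALT")
```

Under these assumptions a HIGH/ALT heifer receives about 5 kg of supplement at a single feeding, which helps explain why frequency affected intake and gain but not G:F.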
Therapeutic Ceftiofur effect on E. coli dynamics in dairy
The goal of this study was to follow ceftiofur-treated and untreated cattle in a normally functioning dairy to examine enteric Escherichia coli for changes in antibiotic resistance profiles and genetic diversity. Prior to treatment, all of the bacteria cultured from the cows were susceptible to ceftiofur. Ceftiofur-resistant E. coli was isolated from treated cows only during and immediately following the cessation of treatment, and the 12 blaCMY-2-positive isolates clustered into two genetic groups. E. coli counts dropped significantly in the treated animals (P < 0.027), reflecting a disappearance of the antibiotic-susceptible strains. The resistant bacterial population, however, did not increase in quantity within the treated cows; levels stayed low and were overtaken by a returning susceptible population. There was no difference in the genetic diversities of the E. coli between the treated and untreated cows prior to ceftiofur administration or after the susceptible population of E. coli returned in the treated cows. A cluster analysis of antibiotic susceptibility profiles resulted in six clusters, two of which were multidrug resistant and consisted solely of isolates from the treated cows immediately following treatment. The antibiotic treatment provided a window to detect the presence of ceftiofur-resistant E. coli but did not appear to cause its emergence or result in its amplification. The finding of resistant isolates following antibiotic treatment is not sufficient to estimate the strength of selection pressure, nor is it sufficient to demonstrate a causal link between antibiotic use and the emergence or amplification of resistance.
Singer, R.S., S.K. Patterson, and R.L. Wallace. Effects of therapeutic ceftiofur administration to dairy cattle on Escherichia coli dynamics in the intestinal tract. Appl. Environ. Microbiol. 2008;74:6956-6962.
Beef Research News is produced by the Farm Animal section at Kansas State University. To modify your subscription to this service please email Brad White at email@example.com
For more information please contact:
Beef Production Medicine
Q211 Mosier Hall
Manhattan, KS 66506