
Kansas State University

 

Beef Research News
Brought to you by Kansas State University College of Veterinary Medicine - Farm Animal Section
July 2009


Contents:

Diagnostic comparison of clinical observations and harvest lung lesions

Microbial investigations in BRD calves

Lung pathology in fatal feedlot pneumonias

Effects of delays in daily feed delivery time

Meta-analysis of conventional vs. non-conventional beef production


Diagnostic comparison of clinical observations and harvest lung lesions
Bovine respiratory disease (BRD) diagnosis during the postweaning phase of beef production is an important component of effective preventive health and treatment programs. Although identifying diseased animals based on signs of clinical illness (CI) is a common method in the beef industry, very little information is available on the accuracy of this approach. Previous investigators hypothesized that monitoring pulmonary lesions at harvest (LU) could be a more reliable indicator of disease status during the postweaning phase. A structured literature review was conducted to identify research that compared CI and LU. Because there is no true gold standard for diagnosing BRD, Bayesian methods were used to estimate the sensitivity and specificity of each diagnostic method relative to a BRD diagnosis at any time during the postweaning phase. Results indicate that the estimated diagnostic sensitivity and specificity of CI were 61.8% (97.5% probability interval [PI]: 55.7, 68.4) and 62.8% (97.5% PI: 60.0, 65.7), respectively. Use of LU for a BRD diagnosis was estimated to have a sensitivity of 77.4% (97.5% PI: 66.2, 87.3) and a specificity of 89.7% (97.5% PI: 86.0, 93.8). Further analysis revealed that the probabilities of LU having higher sensitivity and specificity than CI were 99.4% and 100%, respectively. Neither method was perfect, and both were relatively poor at correctly classifying truly diseased animals (sensitivity), but LU was more accurate than CI for BRD diagnosis. These results should be considered when either diagnostic method is used to evaluate BRD outcomes in clinical and research settings.

White BJ, Renter DG. Bayesian estimation of the performance of using clinical observations and harvest lung lesions for diagnosing bovine respiratory disease in post-weaned beef calves. J Vet Diagn Invest. 2009;21(4):446-453.
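For readers less familiar with the diagnostic terms above, sensitivity and specificity follow the standard 2 x 2 definitions against true disease status. A minimal Python sketch using hypothetical counts (not the paper's Bayesian model):

```python
# Illustrative only: standard definitions of diagnostic sensitivity and
# specificity. Counts are hypothetical, chosen to resemble the CI estimates.

def sensitivity(true_pos, false_neg):
    """Proportion of truly diseased animals the method correctly flags."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg, false_pos):
    """Proportion of truly healthy animals the method correctly clears."""
    return true_neg / (true_neg + false_pos)

# Hypothetical 2x2 counts for a clinical-illness (CI) classifier:
tp, fn, fp, tn = 62, 38, 37, 63
print(f"Sensitivity: {sensitivity(tp, fn):.1%}")  # 62.0%
print(f"Specificity: {specificity(tn, fp):.1%}")  # 63.0%
```

The Bayesian approach in the paper estimates these same quantities while treating the true disease status as unobserved; the definitions themselves are unchanged.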

Microbial investigations in BRD calves
Trans-tracheal aspirations were collected from 56 apparently healthy calves and 34 calves with clinical signs of pneumonia in six different herds during September and November 2002. The 90 samples were cultivated and investigated by PCR tests targeting the species Histophilus somni, Mannheimia haemolytica, Pasteurella multocida, Mycoplasma bovis, Mycoplasma dispar, and Mycoplasma bovirhinis. A PCR test amplifying the lktC-artJ intergenic region was evaluated and shown to be specific for the two species M. haemolytica and Mannheimia glucosida. All 90 aspirations were also analyzed for bovine respiratory syncytial virus (BRSV), parainfluenza-3 virus, and bovine coronavirus by antigen ELISA. Surprisingly, 63% of the apparently healthy calves harbored potentially pathogenic bacteria in the lower respiratory tract; 60% of these samples contained either pure cultures or many pathogenic bacteria in mixed culture. Among diseased calves, all samples showed growth of pathogenic bacteria in the lower respiratory tract, and all were classified as pure culture or many pathogenic bacteria in mixed culture. A higher percentage of samples was positive for all bacterial species in the group of diseased animals compared to the clinically healthy animals; however, this difference was significant only for M. dispar and M. bovirhinis. M. bovis was not detected in any of the samples. BRSV was detected in diseased calves in two herds but not in the clinically healthy animals. Among the diseased calves in these two herds, a significant increase in haptoglobin and serum amyloid A levels was observed compared to the healthy calves. The results indicate that haptoglobin might be the best choice for detecting disease under field conditions. For H. somni and M. haemolytica, a higher percentage of samples was found positive by PCR than by cultivation, whereas the opposite was found for P. multocida. Detection of P. multocida by PCR or cultivation was significantly associated with the disease status of the calves. For H. somni, a similar association with disease status was observed only for cultivation and not for PCR.

Angen O, Thomsen J, Larsen LE, Larsen J, Kokotovic B, Heegaard PM, Enemark JM. Respiratory disease in calves: microbiological investigations on trans-tracheally aspirated bronchoalveolar fluid and acute phase protein response. Vet Microbiol. 2009;137(1-2):165-171. Epub 2009 Jan 4.
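The group comparisons above ask whether detection percentages differ between diseased and apparently healthy calves. A hedged sketch of one common approach, the two-proportion z-test, with hypothetical counts (the authors' actual statistical methods may differ):

```python
# Hedged illustration: two-proportion z-test for comparing detection rates
# between two groups. Counts below are hypothetical, not the study's data.
import math

def two_proportion_z(pos1, n1, pos2, n2):
    """Return (z, two-sided p) for H0: the two detection proportions are equal."""
    p1, p2 = pos1 / n1, pos2 / n2
    pooled = (pos1 + pos2) / (n1 + n2)                      # pooled proportion
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p from the standard normal CDF via the error function
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# e.g., 25/34 diseased vs. 15/56 healthy positive for a given species:
z, p = two_proportion_z(25, 34, 15, 56)
```

With small groups such as these, an exact test (e.g., Fisher's) is often preferred; the z-test is shown only because it is compact and self-contained.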

Lung pathology in fatal feedlot pneumonias
This study charted 237 fatal cases of bovine respiratory disease (BRD) observed from May 2002 to May 2003 in a single Oklahoma feed yard. Postmortem lung samples were used for agent identification and histopathology. Late in the study, 94 skin samples (ear notches) were tested for Bovine viral diarrhea virus (BVDV) by immunohistochemistry (IHC). Bovine respiratory disease morbidity was 14.7%, and the all-cause mortality rate was 1.3%, with more than half of deaths (53.8%) attributed to BRD (0.7% of all animals). The agents isolated were the following: Mannheimia haemolytica (25.0%), Pasteurella multocida (24.5%), Histophilus somni (10.0%), Arcanobacterium pyogenes (35.0%), Salmonella spp. (0.5%), and Mycoplasma spp. (71.4%). Viruses recovered by cell culture were BVDV-1a noncytopathic (NCP; 2.7%), BVDV-1a cytopathic (CP) vaccine strain (1.8%), BVDV-1b NCP (2.7%), BVDV-2a NCP (3.2%), BVDV-2b CP (0.5%), and Bovine herpesvirus 1 (2.3%). Gel-based polymerase chain reaction (PCR) assays were 4.6% positive for Bovine respiratory syncytial virus and 10.8% positive for Bovine coronavirus. Bovine viral diarrhea virus IHC testing was positive in 5.3% of the animals. Mean values for the treatment data were as follows: fatal disease onset (32.65 days), treatment interval (29.15 days), number of antibiotic treatments (2.65), number of different antibiotics (1.89), and day of death (61.81 days).
Lesions included the following: 1) duration: acute (21%), subacute (15%), chronic (40.2%), healing (2.8%), normal (18.1%), and autolyzed (2.8%); 2) type of pneumonia: lobar bronchopneumonia (LBP; 27.1%), LBP with pleuritis (49.1%), interstitial pneumonia (5.1%), bronchointerstitial pneumonia (1.4%), septic (0.9%), embolic foci (0.5%), other (2.8%), normal (10.3%), and autolyzed (2.8%); and 3) bronchiolar lesions: bronchiolitis obliterans (39.7%), bronchiolar necrosis (26.6%), bronchiolitis obliterans/bronchiolar necrosis (1.4%), other bronchiolar lesions (6.5%), and bronchiolar lesion negative (25.7%). Statistically significant relationships were present among the agents, lesions, and the animal treatment, disease onset, and mortality data. Clinical illnesses observed in this study were lengthier than those reported 16–20 years ago, based on fatal disease onset, treatment interval, and day of death.

Fulton RW, Blood KS, Panciera RJ, Payton ME, Ridpath JF, Confer AW, Saliki JT, Burge LT, Welsh RD, Johnson BJ, Reck A. Lung pathology and infectious agents in fatal feedlot pneumonias and relationship with mortality, disease onset, and treatments. J Vet Diagn Invest. 2009;21(4):464-477.
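The BRD-specific mortality figure follows directly from the all-cause mortality rate and the share of deaths attributed to BRD. A quick check of the arithmetic:

```python
# Sketch of the rate arithmetic reported above: with 1.3% all-cause
# mortality and 53.8% of deaths attributed to BRD, BRD-specific mortality
# works out to roughly 0.7% of all animals.
all_cause_mortality = 0.013   # 1.3% of all animals died
brd_share_of_deaths = 0.538   # 53.8% of those deaths attributed to BRD

brd_mortality = all_cause_mortality * brd_share_of_deaths
print(f"BRD-specific mortality: {brd_mortality:.2%}")  # ~0.70%
```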

Effects of delays in daily feed delivery time
Four rumen-fistulated Holstein heifers (134 ± 1 kg initial BW) were used in a 4 x 4 Latin square design to determine the effects of delaying daily feed delivery time on intake, ruminal fermentation, behavior, and stress response. Each 3-wk experimental period was preceded by 1 wk in which all animals were fed at 0800 h. Feed bunks were cleaned at 0745 h and feed offered at 0800 h (T0, no delay), 0900 h (T1), 1000 h (T2), and 1100 h (T3) from d 1 to 21, with measurements taken during wk 1 and 3. Heifers were able to see each other at all times. Concentrate and barley straw were offered in separate compartments of the feed bunks, once daily and for ad libitum intake. Ruminal pH and saliva cortisol concentrations were measured at 0, 4, 8, and 12 h postfeeding on d 3 and 17 of each experimental period. Fecal glucocorticoid metabolites were measured on d 17. Increasing the delay in daily feed delivery time resulted in a quadratic response in concentrate DMI (low in T1 and T2; P = 0.002), whereas straw DMI was greatest in T1 and T3 (cubic P = 0.03). Treatments affected the distribution of DMI within the day, with a linear decrease observed between 0800 and 1200 h but a linear increase overnight (2000 to 0800 h), whereas T1 and T2 had reduced DMI between 1200 and 1600 h (quadratic P = 0.04). Water consumption (L/d) was not affected but decreased linearly when expressed as liters per kilogram of DMI (P = 0.01). Meal length was greatest and eating rate slowest in T1 and T2 (quadratic P < 0.001). Size of the first meal after feed delivery was reduced in T1 on d 1 (cubic P = 0.05) and decreased linearly on d 2 (P = 0.01) after the change. Concentrate eating and drinking time (shortest in T1) and straw eating time (longest in T1) followed a cubic trend (P < 0.02). Time spent lying down was shortest, and time spent ruminating while standing was longest, in T1 and T2.
Delay of feeding time resulted in greater daily maximum salivary cortisol concentration (quadratic P = 0.04), which was greatest at 0 h in T1 and at 12 h after feeding in T2 (P < 0.05). Daily mean fecal glucocorticoid metabolites were greatest in T1 and T3 (cubic P = 0.04). Ruminal pH showed a treatment effect at wk 1 because of increased values in T1 and T3 (cubic P = 0.01). Delaying feed delivery time was not detrimental to rumen function, but a stress response was triggered that led to reduced concentrate intake, eating rate, and size of first meal, and to increased straw intake. The increased salivary cortisol suggests that animal welfare may be compromised.

González LA, Correa LB, Ferret A, Manteca X, Ruíz-de-la-Torre JL, Calsamiglia S. Intake, water consumption, ruminal fermentation, and stress response of beef heifers fed after different lengths of delays in the daily feed delivery time. J Anim Sci. 2009;87:2709-2718.
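The linear, quadratic, and cubic trends reported above are typically tested with orthogonal polynomial contrasts for four equally spaced treatments (T0 through T3). A brief illustration with hypothetical treatment means, not the study's data:

```python
# Sketch: orthogonal polynomial contrast coefficients for 4 equally
# spaced treatment levels. A contrast near zero means that trend is absent.

CONTRASTS = {
    "linear":    (-3, -1,  1, 3),
    "quadratic": ( 1, -1, -1, 1),
    "cubic":     (-1,  3, -3, 1),
}

def contrast_value(means, coefs):
    """Weighted sum of treatment means; nonzero suggests that trend shape."""
    return sum(m * c for m, c in zip(means, coefs))

# Hypothetical concentrate DMI means (kg/d) for T0..T3, dipping in T1/T2,
# i.e., the "quadratic response (low in T1 and T2)" pattern in the abstract:
means = (4.0, 3.6, 3.6, 4.0)
for name, coefs in CONTRASTS.items():
    print(name, contrast_value(means, coefs))
```

In the study these contrasts would be tested within the mixed model for the Latin square, which is where the reported P-values come from; the sketch only shows how the trend shapes are scored.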

Meta-analysis of conventional vs. non-conventional beef production
Conventional feeding systems use pharmaceutical products not allowed in natural or organic systems for finishing cattle. This review of data compares the performance effects (ADG, G:F, DMI) of technologies used in conventional feeding programs that are prohibited in organic and/or natural programs. The technologies evaluated were steroid implants, monensin, tylosin, endectocides, and metaphylaxis with any antimicrobial. For inclusion in this analysis, studies were conducted in North America; reported randomization to treatment group; utilized beef cattle; contained an untreated control group; and were sourced from peer reviewed journals. Forest plots were used to visually examine the data for trends towards a uniform effect of the technology on the outcomes of interest (ADG, DMI, G:F). Technologies that displayed a uniform response compared to negative controls on the forest plot were then analyzed using mixed models. Examination of forest plots for endectocides, steroid implants, monensin and metaphylaxis technologies appeared to show performance advantages for treated cattle relative to cattle in negative control groups. An insufficient number of studies met the inclusion criteria to conduct meta-analyses comparing endectocides, monensin or tylosin to negative controls. Average daily gain in feeder cattle given metaphylaxis on arrival was 0.11 kg/d (P < 0.01) greater relative to cattle that did not receive metaphylaxis on arrival. Implanting heifers increased ADG by 0.08 kg/d compared to non-implanted controls (P = 0.09). Implants had no effect on G:F (P = 0.14) in heifers or on DMI (P = 0.44) relative to non-implanted control heifers. Implanting steers was associated with greater ADG by 0.25 kg/d (P < 0.01) and DMI by 0.53 kg/d (P < 0.01) relative to non-implanted control steers. Implants also improved G:F in steers relative to non-implanted steers by 0.02 (0.17 vs. 0.15; implanted vs. controls, P < 0.01) (n = 21 studies). 
When average estimated differences in ADG and G:F for implanted and non-implanted steers were incorporated into a breakeven model, implanted steers had a $77 per head lower cost of production than non-implanted steers and $349 per head lower cost of production than organically raised steers. These data illustrate the importance of capturing premiums when operating natural and organic production systems to maintain economic viability.

Wileman BW, Thomson DU, Reinhardt CD, Renter DG. Analysis of modern technologies commonly used in beef cattle production: conventional beef production versus non-conventional production using meta-analysis. J Anim Sci. Published online July 17, 2009. doi:10.2527/jas.2009-1778.
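One way an ADG advantage translates into a lower cost of production is by shortening days on feed. A simplified sketch with hypothetical inputs (the gain target and daily cost are assumptions for illustration, not the paper's breakeven model):

```python
# Hedged sketch: a faster-gaining animal reaches the same finish weight in
# fewer days, so feed and yardage costs per head fall. All inputs below
# are hypothetical; the paper's breakeven model is more detailed.

def feeding_cost(gain_needed_kg, adg_kg_d, daily_cost):
    """Cost to put on the target gain at a given average daily gain."""
    days_on_feed = gain_needed_kg / adg_kg_d
    return days_on_feed * daily_cost

gain = 200.0          # kg of gain to finish (hypothetical)
cost_per_day = 2.50   # $/head/day feed + yardage (hypothetical)

# Steer implant advantage from the abstract: +0.25 kg/d ADG over controls.
control = feeding_cost(gain, 1.30, cost_per_day)
implanted = feeding_cost(gain, 1.55, cost_per_day)
savings = control - implanted
print(f"Cost advantage per head: ${savings:.2f}")
```

The paper's $77 and $349 per-head figures also fold in the G:F difference and organic input premiums, which this sketch omits.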




Beef Research News is produced by the Farm Animal section at Kansas State University. To modify your subscription to this service, please email Brad White (bwhite@vet.ksu.edu).


For more information please contact:
Brad White, DVM, MS
Beef Production Medicine
Q211 Mosier Hall
Manhattan, KS 66506
bwhite@vet.ksu.edu