Vol. 79, No. 5, p. 1166-1172

doi:10.2527/2001.7951166x

Use of the preconditioned conjugate gradient algorithm as a generic solver for mixed-model equations in animal breeding applications.

S. Tsuruta, I. Misztal, and I. Strandén

Department of Animal and Dairy Science, University of Georgia, Athens 30602-2771, USA. shogo@arches.uga.edu

Abstract

The utility of the preconditioned conjugate gradient algorithm with a diagonal preconditioner for solving mixed-model equations in animal breeding applications was evaluated with 16 test problems. The problems included single- and multiple-trait analyses, with data on beef, dairy, and swine ranging from small examples to national data sets. Multiple-trait models included both low and high genetic correlations. Convergence was based on relative differences between left- and right-hand sides. The ordering of equations was fixed effects followed by random effects, with no special ordering within random effects. The preconditioned conjugate gradient program implemented in double precision converged for all models. However, when implemented in single precision, the preconditioned conjugate gradient algorithm did not converge for seven large models. The preconditioned conjugate gradient and successive overrelaxation algorithms were subsequently compared for 13 of the test problems. The preconditioned conjugate gradient algorithm was easy to implement with iteration on data for general models, whereas successive overrelaxation required specific programming for each set of models. On average, the preconditioned conjugate gradient algorithm converged in three times fewer rounds of iteration than successive overrelaxation. With straightforward implementations, programs using the preconditioned conjugate gradient algorithm may be two or more times faster than those using successive overrelaxation. However, programs using the preconditioned conjugate gradient algorithm would use more memory than would comparable implementations using successive overrelaxation. Extensive optimization of either algorithm could influence the rankings.
The preconditioned conjugate gradient implemented with iteration on data, a diagonal preconditioner, and in double precision may be the algorithm of choice for solving mixed-model equations when sufficient memory is available and ease of implementation is essential.
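To make the method concrete, the following is a minimal sketch of the preconditioned conjugate gradient algorithm with a diagonal (Jacobi) preconditioner, as described in the abstract. This is not the authors' code: the function name and interface are hypothetical, and the sketch stores the coefficient matrix explicitly for clarity, unlike the iteration-on-data scheme used in the paper, where the left-hand side is never formed in memory.

```python
import numpy as np

def pcg_diag(A, b, tol=1e-10, max_iter=1000):
    """Solve A x = b for a symmetric positive-definite A using
    conjugate gradients preconditioned with M = diag(A)."""
    M_inv = 1.0 / np.diag(A)           # diagonal (Jacobi) preconditioner
    x = np.zeros_like(b, dtype=float)
    r = b - A @ x                      # residual
    z = M_inv * r                      # preconditioned residual
    p = z.copy()                       # search direction
    rz = r @ z
    for it in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        # Relative convergence check, analogous in spirit to the
        # relative left-/right-hand-side criterion used in the paper.
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            return x, it + 1
        z = M_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x, max_iter

# Small symmetric positive-definite example (illustrative data only)
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
b = np.array([1.0, 2.0, 3.0])
x, n_iter = pcg_diag(A, b)
```

Note that each iteration needs only a matrix-vector product `A @ p`; in an iteration-on-data implementation that product is accumulated by reading the data file, which is what makes the algorithm generic across models while increasing memory use relative to successive overrelaxation.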

