**GaussFit**

A System for Least Squares and Robust Estimation

William Jefferys, Barbara McArthur, James McCartney

Department of Astronomy

University of Texas at Austin

GaussFit was developed as a platform to facilitate the solution of
least squares and robust estimation problems in astrometric data reduction,
particularly for data from the NASA Hubble Space Telescope. An environment was
required in which astrometric models could easily and quickly be written,
tested and modified. Previous methods had numerous limitations: each model had
to be programmed individually to its specific reduction requirements, and this
case-by-case programming consumed large amounts of time and was more
susceptible to error in the individual calculations.

1. Technical Description

GaussFit is a computer program, written in the "C" programming language, which includes a full-featured programming language in which models can be built to solve least squares and robust estimation problems. GaussFit runs on many computers and operating systems, including UNIX®, VMS®, XENIX® and the Apple Macintosh®. It is also integrated into the Space Telescope Science Institute's data reduction system (STSDAS). As with any numerical program, GaussFit obtains solutions more rapidly on machines with greater computational resources. It has been tested by at least forty researchers in the field, and is being used to solve problems in astronomy and other sciences.

2. Functional Operation

The program GaussFit consists of a number of parts. It has a compiler which takes the user’s model, written in the GaussFit programming language, and converts it into an assembly language program for an abstract machine. This program is then interpreted. Data are obtained from the user’s data and parameter files and used to form the equations of condition and constraint. To do this, the interpreter relies on a built-in algebraic manipulator called the "cotangent bundle machine" to calculate not only the value of every arithmetic operation, but also all of the required derivative information (which is therefore computed analytically, not numerically). After the condition and constraint equations have been formed, the resulting matrix is sent to the solution algorithm chosen by the user. For least squares estimation, this is a Householder orthogonal transformation method; for median-type estimators, it is based on the Barrodale and Roberts implementation of the simplex algorithm; and for other robust methods (Huber's metric, Tukey's biweight, and the "fair" metric [5]), Householder transformations are used together with either Newton’s method or Iteratively Reweighted Least Squares (IRLS). The data files are then updated to reflect the results of the calculation, and the process is iterated until the convergence criterion has been satisfied.

Additional techniques are implemented for special cases. For nonlinear modelling, the Levenberg-Marquardt method of parameter estimation has been implemented (a method which combines the inverse-Hessian and steepest descent methods). For systems which produce a matrix that is singular, numerically close to singular, or ill-conditioned, singular value decomposition (SVD) is being implemented. A bias elimination algorithm is also being implemented for cases in which the variance of the measurement error is not small relative to the curvature [6].
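The IRLS approach with Huber's metric can be illustrated in a few lines (a minimal numerical sketch, not GaussFit code; it assumes a linear model and a unit error scale for Huber's tuning constant k):

```python
import numpy as np

def huber_weights(r, k=1.345):
    """Huber weights: 1 for small residuals, k/|r| beyond the threshold k."""
    a = np.abs(r)
    return np.where(a <= k, 1.0, k / a)

def irls(X, y, k=1.345, tol=1e-10, max_iter=50):
    """Iteratively Reweighted Least Squares with Huber's metric."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]   # ordinary LS starting point
    for _ in range(max_iter):
        w = huber_weights(y - X @ beta, k)
        # Solve the weighted normal equations (X^T W X) b = X^T W y
        beta_new = np.linalg.solve(X.T @ (X * w[:, None]), X.T @ (w * y))
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta

# A straight line y = 1 + 2x with one gross outlier:
x = np.arange(10.0)
X = np.column_stack([np.ones_like(x), x])
y = 1.0 + 2.0 * x
y[5] += 50.0                                   # one "bad" observation
b_ls = np.linalg.lstsq(X, y, rcond=None)[0]    # pulled toward the outlier
b_rob = irls(X, y)                             # stays close to (1, 2)
```

The single outlier shifts the ordinary least squares slope by roughly 0.3, while the reweighted fit, which progressively downweights the discordant point, recovers parameters near the true (1, 2).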

To set up a problem for GaussFit, the user must provide several things. The first is a mathematical description of the model used to reduce the data: a short program in the GaussFit programming language that describes how the equations of condition and the constraints (if any) are to be calculated. This calculation can be in the form of equations, or it can be algorithmic in nature. Next, the user must provide an "environment" file. This file specifies such things as the names of the data and parameter files, iteration and convergence criteria, which matrices and other data will be printed, and the method of reduction to be used (particularly if the user chooses a robust method). The user must also supply the input data in a form acceptable to the program. Finally, the user must provide a file with initial approximations of the parameters to be fitted. After preparing the model, environment and data files, the user starts the program, which prompts for the model and environment file names and then runs to completion. Progress reports are printed to the screen periodically, and the results appear in the result file and in the modified data files.

3. Supportive Theory

The method of least squares, invented by Gauss nearly two centuries
ago, is a reliable workhorse for estimating parameters from noisy data. A number
of improvements to the basic method have been made and are implemented in this
program, but the fundamental idea of the method of least squares is very simple:
given a mathematical description which specifies the dependence of observed
(noisy) data on certain parameters, find the values of the parameters that
minimize the overall error. In the classical method of least squares, the
overall error is estimated by the sum of the (weighted) squares of the residuals
(i.e., differences between the values predicted by the model and the values
observed). The method is extended straightforwardly to include the case where
observations are correlated, and with more difficulty to the case where more
than one observation may appear in the mathematical relationship connecting the
observations and the parameters. Nonlinear equations can also be solved by a
method of successive approximations. The program GaussFit implements a
generalized least squares algorithm that handles all of these cases. Reduction
models can easily and quickly be written, tested and modified. GaussFit also
provides an experimental capability for robust estimation. It is based on a
generalization of the methods pioneered by Huber, and reduces to them under the
conditions he discussed. Robust estimation greatly improves the resistance of
the estimator to "outliers," that is, bad observations which are not known *a
priori* to be bad [4].
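The classical weighted case can be sketched in a few lines (an illustrative example, not GaussFit itself; the data and per-point standard deviations are invented):

```python
import numpy as np

# Observations of y = a + b*x with known, unequal standard deviations.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 7.0, 8.9])
sigma = np.array([0.1, 0.1, 0.3, 0.1, 0.1])   # the x = 2 point is noisier

# Minimize the sum of ((y - a - b*x) / sigma)^2 by scaling each
# row of the design matrix, and each observation, by 1/sigma.
A = np.column_stack([np.ones_like(x), x]) / sigma[:, None]
a_hat, b_hat = np.linalg.lstsq(A, y / sigma, rcond=None)[0]
```

The noisier point is automatically downweighted, and the fit lands near a = 1, b = 2.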

4. Unique Features

One of the onerous tasks that faces the implementer of a least
squares problem, particularly if the problem is nonlinear, is the calculation of
the partial derivatives with respect to the parameters and observations that are
required in order to form the equations of condition and the constraint
equations. GaussFit solves this problem automatically using a built-in algebraic
manipulator to calculate all of the required partial derivatives. Every
expression that the user’s model computes will carry all of the required
derivative information along with it. For example, if the calculation can only
be expressed algorithmically, GaussFit will automatically carry along all
derivative information at each step of the calculation. Notice that the
derivatives are calculated *analytically* at each step. No numerical
approximations are used. Because the required derivatives are all calculated
automatically, the user is relieved of the burden of worrying about them and can
concentrate on the task of specifying the correct equations of condition and
constraints. If the model to be fitted changes, the derivative information will
automatically be changed as well. This feature of GaussFit is a real time-saver,
particularly when fitting complex nonlinear models.
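The idea of carrying exact derivatives through every arithmetic operation can be sketched with forward-mode "dual numbers" (a toy illustration of the principle only, not GaussFit's actual cotangent bundle machine):

```python
class Dual:
    """A value paired with its derivative; every operation updates both,
    so the derivative is exact (analytic) at each step."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)  # product rule

def f(x):
    # A purely algorithmic calculation: a loop, not a closed formula.
    s = Dual(0.0)
    for _ in range(3):
        s = s + x * x
    return s      # computes 3x^2, so f'(x) = 6x

fx = f(Dual(2.0, 1.0))   # seed dx/dx = 1
# fx.val is 12.0 (= 3 * 2^2) and fx.der is 12.0 (= 6 * 2)
```

Even though `f` is defined by a loop rather than an equation, the derivative arrives along with the value, with no numerical differencing anywhere.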

5. Analysis of Capabilities

GaussFit is capable of handling situations that arise often enough to be of practical interest, but which have usually been ignored because they are not well understood by many users. It provides an easy and natural way to formulate general nonlinear problems; problems where the observation equations (equations of condition) contain more than one observation (the errors-in-variables case); problems with correlated observations; and problems where exact constraints among parameters must be enforced. Certain robust estimation methods are also available that generalize least squares to non-Euclidean metrics and provide greater immunity against "outliers" than does the classical least squares method.
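A flavor of the errors-in-variables case is given by orthogonal regression via SVD (a sketch under the simplifying assumption of equal, uncorrelated errors in x and y; GaussFit itself handles the general weighted, correlated case [1, 4]):

```python
import numpy as np

# Fit y = a + b*x when BOTH coordinates carry (equal) measurement error,
# by minimizing orthogonal distances to the line (total least squares).
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([0.1, 2.1, 3.9, 6.1, 7.9, 10.1])   # roughly y = 2x

xm, ym = x.mean(), y.mean()
M = np.column_stack([x - xm, y - ym])
# The right singular vector belonging to the smallest singular value
# is the normal to the best-fitting line through the centroid.
_, _, Vt = np.linalg.svd(M)
nx, ny = Vt[-1]
b_tls = -nx / ny            # slope
a_tls = ym - b_tls * xm     # intercept
```

The slope is recovered from the direction of smallest scatter, rather than by regressing y on x, so the two variables are treated symmetrically.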

References

1. W. H. Jefferys, M. J. Fitzpatrick and B. E. McArthur, "GaussFit - a
system for least squares and robust estimation", *Celestial Mechanics*, 41,
1988, 39-49.

2. W. H. Jefferys, M.J. Fitzpatrick, B.E. McArthur and J.E. McCartney, "GaussFit: A System for Least Squares and Robust Estimation", User's Manual, Department of Astronomy and McDonald Observatory, Austin, Texas, 1990, 75 pp.

3. F. Murtagh, "Linear Regression with Errors in Both Variables: A Short
Review", *Errors, Uncertainties and Bias in Astronomy,* C. Jaschek and F.
Murtagh (Eds.), Cambridge University Press.

4. W. H. Jefferys, "Robust Estimation when more than one variable per
equation of condition has error," *Biometrika*, 77, 1990, 597-607.

5. W.J.J. Rey, *Introduction to Robust and Quasi-Robust Statistical
Methods*, Springer-Verlag, New York, 1983.

6. W.A. Fuller, *Measurement Error Models*, John Wiley & Sons,
1987.

Acknowledgements

This software was developed under NASA Contract NAS8-32906, the support of which is gratefully acknowledged.