Employing small-σ asymptotics, we approximate the small-sample bias of the ordinary least-squares (OLS) estimator of the full coefficient vector in a linear regression model that includes a one-period lagged dependent variable and an arbitrary number of fixed regressors. This bias term is used to construct a corrected least-squares (CLS) estimator which is unbiased to O(σ²). Approximations are obtained for the mean squared error and the variance of the OLS and CLS estimators in small samples, but the complexity of the expressions makes direct comparison difficult. Empirical and artificial data are used to illustrate the theoretical findings, and a small-scale simulation study shows that the CLS estimator can be virtually unbiased. Its variance, which is of the same order as that of OLS, can be estimated by the standard expression for the variance of the OLS coefficient vector. The CLS estimator is easy to calculate; it provides a simple method of eliminating bias and may at the same time reduce the mean squared error of estimation. It is shown that the technique can also be used for bias reduction when estimating reduced-form equations in a dynamic simultaneous equation system.
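To make the setting concrete, the following is a minimal Monte Carlo sketch (not the paper's analytical O(σ²) correction) of the small-sample OLS bias in a model with one lagged dependent variable and a single fixed regressor. All numerical values (λ = 0.5, β = 1, σ = 1, T = 20) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
lam, beta, sigma = 0.5, 1.0, 1.0   # illustrative true coefficients and error s.d.
T, reps = 20, 5000                 # small sample size, Monte Carlo replications

# One fixed (non-stochastic) regressor, drawn once and held fixed across replications.
x = rng.standard_normal(T)

est = np.empty((reps, 2))
for r in range(reps):
    eps = sigma * rng.standard_normal(T)
    y = np.zeros(T + 1)            # fixed start-up value y_0 = 0
    for t in range(T):
        y[t + 1] = lam * y[t] + beta * x[t] + eps[t]
    # Regress y_t on (y_{t-1}, x_t) by OLS.
    X = np.column_stack([y[:-1], x])
    est[r] = np.linalg.lstsq(X, y[1:], rcond=None)[0]

bias = est.mean(axis=0) - np.array([lam, beta])
print("Monte Carlo OLS bias on (lambda, beta):", bias)
```

The simulated bias on the lagged-dependent-variable coefficient is what the paper approximates analytically; subtracting an estimate of this bias from the OLS estimate is the idea behind the CLS estimator.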