This paper examines the magnitude of error associated with linear approximations of nonlinear variables based on Taylor's series. Little attention has been given to the error term in previous empirical studies. The mathematical technique is presented for the single-variable and two-variable cases, with examples for each drawn from agricultural time-series data. Characteristics of time-series data are sometimes crucial in selecting an evaluation point that minimizes error. The importance of evaluation-point selection is illustrated for three categories of time-series data: (1) smooth trends, (2) trends with substantial variation, and (3) oscillatory series.
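To make the idea concrete, the sketch below (not from the paper; an illustrative assumption using f(x) = ln x) linearizes a nonlinear function about an evaluation point x0 and compares the actual approximation error with the Lagrange remainder bound |f''(c)|(x − x0)²/2, showing how the error grows as the data move away from the evaluation point:

```python
import math

def linearize(f, df, x0):
    """First-order Taylor approximation of f about the evaluation point x0."""
    return lambda x: f(x0) + df(x0) * (x - x0)

# Illustrative single-variable case: f(x) = ln(x), f'(x) = 1/x, f''(x) = -1/x^2.
f = math.log
df = lambda x: 1.0 / x
x0 = 100.0
approx = linearize(f, df, x0)

for x in (90.0, 100.0, 110.0, 150.0):
    err = abs(f(x) - approx(x))
    # Lagrange remainder bound: max |f''(c)| over [min(x, x0), max(x, x0)]
    # occurs at the smaller endpoint, since |f''| = 1/x^2 is decreasing.
    c = min(x, x0)
    bound = (1.0 / c ** 2) * (x - x0) ** 2 / 2.0
    print(f"x={x:6.1f}  error={err:.6f}  remainder bound={bound:.6f}")
```

The farther a data point lies from x0, the larger both the actual error and its bound, which is why the choice of evaluation point matters for series with substantial variation or oscillation.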