Abstract

This article delineates the principles of deriving optimal control policies from dynamic plant growth models. It starts with a brief outline of the structure of decision problems and the resulting requirements for models designed to support these decisions. The concepts of open-loop and closed-loop control are discussed next. Emphasis is placed on methods for solving the control problem. Three basic frameworks are considered: Pontryagin's Maximum Principle, Dynamic Programming, and Numerical Optimization methods. The article outlines the principal structure of these approaches and addresses their strengths and weaknesses. It concludes with an application of Numerical Optimization methods to greenhouse temperature strategies. The results demonstrate how the algorithms work and lead to conclusions on their applicability for research purposes and practical decision-making.
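To make the idea of an open-loop temperature strategy obtained by numerical optimization concrete, the sketch below optimizes a daily greenhouse setpoint trajectory against a simple, assumed logistic growth model. The growth model, the heating-cost term, and all parameter values are hypothetical illustrations and are not taken from the article; the sketch only shows the general structure of the approach described in the abstract.

```python
# Illustrative sketch only: an open-loop temperature strategy found by numerical
# optimization against an assumed plant growth model (logistic growth whose rate
# depends on temperature). All parameters below are hypothetical.
import numpy as np
from scipy.optimize import minimize

HORIZON = 14          # days in the planning horizon
DT = 1.0              # time step (days)
T_OUT = 10.0          # assumed constant outdoor temperature (deg C)
HEAT_COST = 0.02      # assumed cost per degree-day of heating
PRICE = 1.0           # assumed value per unit of biomass at harvest

def growth_rate(temp):
    """Assumed temperature response: peaks near 22 deg C (hypothetical)."""
    return max(0.0, 0.3 - 0.002 * (temp - 22.0) ** 2)

def simulate(temps, w0=0.05):
    """Run the assumed logistic growth model for a temperature trajectory."""
    w = w0
    for temp in temps:
        w += DT * growth_rate(temp) * w * (1.0 - w)   # logistic increment
    return w

def neg_profit(temps):
    """Objective: heating cost minus value of final biomass (to be minimized)."""
    heating = HEAT_COST * DT * np.sum(np.maximum(0.0, temps - T_OUT))
    return heating - PRICE * simulate(temps)

# Optimize the open-loop trajectory, one temperature setpoint per day.
x0 = np.full(HORIZON, 18.0)            # initial guess for the setpoints
bounds = [(12.0, 30.0)] * HORIZON      # admissible setpoint range
result = minimize(neg_profit, x0, method="L-BFGS-B", bounds=bounds)

print("Optimal daily setpoints (deg C):", np.round(result.x, 1))
print("Final biomass:", round(simulate(result.x), 3))
```

Because the strategy is computed once over the whole horizon without feedback from observed growth, it corresponds to the open-loop case mentioned in the abstract; a closed-loop variant would re-solve this problem as new state information becomes available.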
