This chapter focuses on dynamic optimization problems. In optimization, a process is regarded as dynamic when it can be described as a well-defined sequence of steps in time or space. Dynamic processes can be either discrete or continuous. Cascades, that is, systems characterized by a sequential arrangement of stages, are examples of dynamic discrete processes. In general, the optimization theories of discrete and continuous processes differ in their assumptions, formal descriptions and the strength of their optimality conditions, and thus they usually constitute two separate fields. In a continuous or discrete process described by an additive performance criterion, the optimal strategy and the optimal profit are functions of the initial state, the initial time and the total number of stages. A consequence of this property is that each final segment of an optimal path, whether continuous or discrete, is itself optimal with respect to its own initial state, initial time and corresponding number of stages. By applying discrete dynamic programming, the necessary optimality conditions can be obtained in the form of a discrete equation of the Hamilton–Jacobi type with a delayed time argument. From a physical standpoint, the constant Hamiltonian condition is a generalization of the energy conservation condition as applied to optimal discrete systems with free intervals of time.
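The backward recursion of discrete dynamic programming, and the property that every final segment of an optimal path is optimal from its own initial state, can be illustrated with a minimal sketch. The scalar state, the control grid, and the transition and stage-profit functions `f` and `g` below are illustrative assumptions, not taken from the chapter; the additive criterion and the Bellman recurrence are what the sketch demonstrates.

```python
def optimize_cascade(x0, n_stages, controls, f, g):
    """Backward recursion V_N(x) = max_u [ g(x, u) + V_{N-1}(f(x, u)) ].

    Returns the optimal profit and control sequence from initial state
    x0 over n_stages stages, showing that the optimal profit is a
    function of the initial state and the total number of stages.
    """
    def V(x, n):
        if n == 0:
            return 0.0, []          # no stages left: zero profit
        best = None
        for u in controls:          # enumerate the control grid
            profit, tail = V(f(x, u), n - 1)
            total = g(x, u) + profit
            if best is None or total > best[0]:
                best = (total, [u] + tail)
        return best
    return V(x0, n_stages)

# Illustrative (assumed) cascade dynamics and stage profit:
f = lambda x, u: 0.5 * x + u        # state transition between stages
g = lambda x, u: x - u * u          # additive stage profit

profit3, path3 = optimize_cascade(1.0, 3, [0.0, 0.25, 0.5], f, g)
profit2, path2 = optimize_cascade(f(1.0, path3[0]), 2, [0.0, 0.25, 0.5], f, g)
# Bellman's principle: the final two stages of the 3-stage optimum are
# themselves optimal from their own initial state.
assert path3[1:] == path2
```

The recursion above is exponential in the number of stages; a practical implementation would tabulate the profit function stage by stage over a state grid, which is the usual form of discrete dynamic programming for cascades.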