Continuous Time Dynamical Systems

State Estimation and Optimal Control with Orthogonal Functions

by B. M. Mohan

Optimal control deals with the problem of finding a control law for a given system such that a certain optimality criterion is achieved.

An optimal control is a set of differential equations describing the paths of the control variables that minimize the cost functional.
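To make this concrete, a standard continuous-time formulation (a generic sketch in conventional notation, not quoted from the book) seeks a control u(t) that minimizes a cost functional subject to the system dynamics:

\[
\min_{u(\cdot)} \; J = \phi\bigl(x(t_f)\bigr) + \int_{t_0}^{t_f} L\bigl(x(t),u(t),t\bigr)\,dt
\qquad \text{subject to} \qquad
\dot{x}(t) = f\bigl(x(t),u(t),t\bigr), \quad x(t_0) = x_0,
\]

where \(x(t)\) is the state, \(u(t)\) is the control, \(L\) is the running cost, and \(\phi\) is the terminal cost.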

First published: 2018
Publisher: Taylor & Francis Group
Language: English
