Central South University Optimal Control Lecture Slides
Chapter 7 Minimum Time Optimal Control
Chapter 8 Linear-Quadratic Optimal Control
Chapter 1 Introduction
1.1 Overview
1.2 Basic problems of optimization
1.3 Optimal control problems
$$u^*(t) = \arg\min_{u(t)} J[u(t)] = \arg\min_{u(t)} F[x(t), \dot{x}(t), u(t), t]$$

subject to:

$$\dot{x}(t) = f[x(t), u(t), t] \quad \text{(system state equation)}$$
$$u(t) \in U \subset R^m, \quad x(t) \in X \subset R^n, \quad t \in [t_0, t_f]$$
$$g_1[x(t), u(t), t] = 0, \quad g_2[x(t), u(t), t] \le 0$$
$$g_3[x(t_0), t_0] = 0, \quad g_4[x(t_0), t_0] \le 0 \quad \text{(initial constraints)}$$
$$g_5[x(t_f), t_f] = 0, \quad g_6[x(t_f), t_f] \le 0 \quad \text{(terminal constraints)}$$

where $J[u(t)] \in R$ is the performance index (objective or cost functional) and $u^*(t)$ is the optimal control.
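As one concrete instance of this formulation, consider for example a minimum-energy control problem for a double-integrator plant (an illustrative example; the specific dynamics and bounds are chosen here only for concreteness):

$$\min_{u(t)} J[u(t)] = \int_{0}^{t_f} u^2(t)\,dt \qquad \text{subject to } \dot{x}_1 = x_2,\ \dot{x}_2 = u,\ x(0) = x_0,\ x(t_f) = 0,\ |u(t)| \le 1.$$

Here $f[x, u, t] = (x_2, u)^T$ is the state equation, the integrand $u^2$ plays the role of $F$, the condition $x(t_f) = 0$ is the terminal constraint $g_5$, and the admissible control set is $U = \{u : |u| \le 1\}$.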
Approaches:
• The calculus of variations (variational methods)
• The minimum principle
• Dynamic programming
Optimization problems:
• Minimization or maximization
• Differential games (min-max)
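To give a flavor of the dynamic-programming approach listed above, the following sketch (in Python, with a scalar system and weights chosen purely for illustration) computes the optimal feedback gains of a discrete-time linear-quadratic problem by backward recursion on the value function:

```python
# Dynamic programming for a scalar discrete-time LQ problem (illustrative values):
#   minimize  sum_{k=0}^{N-1} (q*x_k^2 + r*u_k^2) + qf*x_N^2
#   subject to  x_{k+1} = a*x_k + b*u_k
# The value function stays quadratic, V_k(x) = P_k*x^2, so Bellman's recursion
# reduces to a backward Riccati recursion for P_k and the feedback gains K_k.
a, b = 1.0, 0.5            # system: x_{k+1} = a*x_k + b*u_k
q, r, qf = 1.0, 0.1, 5.0   # stage and terminal weights
N = 20                     # horizon length

P = qf                     # V_N(x) = qf*x^2
gains = []
for k in reversed(range(N)):
    K = (a * b * P) / (r + b * b * P)   # minimizer of the Bellman equation: u_k = -K*x_k
    P = q + a * a * P - a * b * P * K   # updated value-function coefficient P_k
    gains.append(K)
gains.reverse()                         # gains[k] now corresponds to stage k

# Simulate the resulting optimal feedback law from x_0 = 1.
x, cost = 1.0, 0.0
for k in range(N):
    u = -gains[k] * x
    cost += q * x * x + r * u * u
    x = a * x + b * u
cost += qf * x * x
print("optimal cost from x0 = 1:", cost)
```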
1.2 Basic problems of optimization
Optimization problems (II)
2) Dynamic Optimization (optimal control, functional extremum)

$$u^*(t) = \arg\min_{u(t)} J[u(t)] = \arg\min_{u(t)} F[x(t), \dot{x}(t), u(t), t]$$

subject to the system state equation $\dot{x}(t) = f[x(t), u(t), t]$ with $u(t) \in U \subset R^m$, $x(t) \in X \subset R^n$, $t \in [t_0, t_f]$, the constraints $g_1[x(t), u(t), t] = 0$ and $g_2[x(t), u(t), t] \le 0$, and the initial constraints $g_3[x(t_0), t_0] = 0$, $g_4[x(t_0), t_0] \le 0$.

1.3 Optimal control problems

Find the admissible control that minimizes a cost functional (performance index) over all solutions (trajectories) of the system's state equation, subject to the control constraint, the state constraint, and all other constraints.
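Numerically, one common way to attack such a problem is to discretize it and solve the resulting static optimization. The sketch below (a minimal illustration; the system $\dot{x} = -x + u$, the quadratic cost, and the horizon are made up for this example) parameterizes the control on a grid, propagates the state with forward Euler, and hands the discretized problem to a general-purpose optimizer:

```python
# "Discretize-then-optimize" sketch for a simple optimal control problem:
#   minimize  J = int_0^T (x^2 + u^2) dt   subject to   xdot = -x + u,  x(0) = 1,
# with the control constraint |u(t)| <= 1 handled as bounds on the grid values.
import numpy as np
from scipy.optimize import minimize

T, N = 2.0, 40
dt = T / N

def cost(u):
    x, J = 1.0, 0.0
    for k in range(N):
        J += (x * x + u[k] * u[k]) * dt   # rectangle rule for the cost integral
        x += (-x + u[k]) * dt             # forward Euler step of xdot = -x + u
    return J

res = minimize(cost, x0=np.zeros(N), method="L-BFGS-B",
               bounds=[(-1.0, 1.0)] * N)
print("approximate optimal cost:", res.fun)
```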
Optimal control
Suboptimal control;
Optimal control sensitivity;
Multi-goal optimal control;
Differential games;
……
1.1 Overview
Applied fields of optimal control
subject to:

$$f(\theta_1, \theta_2, \ldots, \theta_n) = 0 \quad \text{(equality constraint)}$$
$$g(\theta_1, \theta_2, \ldots, \theta_n) \le 0 \quad \text{(inequality constraint)}$$

where $J \in R$ is the performance index (objective function), $\theta = (\theta_1, \theta_2, \ldots, \theta_n)^T \in R^n$ is the vector of parameters to be optimized, and $\theta^*$ denotes the optimal parameters.
Department of Automation School of Information Science & Engineering Central South University Changsha, Hunan 410083, China
Contents
Chapter 1 Introduction
Chapter 1 Introduction
1.1 Overview
1.2 Basic problems of optimization
1.3 Optimal control problems
1.4 Solution methods of optimization
Optimal control problem:
Optimal control
Adaptive control
Predictive control
Robust control
Intelligent control
System modeling and identification
……
1.1 Overview
Main contents of optimal control
最优控制理论及参数优化,李国勇 等,国防工业出版社
Chapter 1 Introduction
1.1 Overview
1.2 Basic problems of optimization
1.3 Optimal control problems
1.4 Solution methods of optimization
Volume 1: Chapters 1 and 2
Modern Control Theory
— Optimal Control —
(An undergraduate elective course)
Hui PENG
PhD, Professor
( http://deptauto.csu.edu.cn/staffmember/HuiPeng.htm )
1.3 Optimal control problems
Performance index in optimal control problems
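Commonly used forms of the performance index (listed here as standard examples, with the notation $\varphi$, $L$, $Q$, $R$, $S$ introduced only for illustration) include the general Bolza form

$$J[u(t)] = \varphi[x(t_f), t_f] + \int_{t_0}^{t_f} L[x(t), u(t), t]\,dt,$$

together with its special cases: the Lagrange form ($\varphi = 0$; e.g. minimum-time control with $L = 1$, or minimum-energy control with $L = u^T R u$) and the Mayer form ($L = 0$, terminal cost only). The quadratic index used in linear-quadratic optimal control (Chapter 8) typically takes $\varphi = \tfrac{1}{2}\,x^T(t_f) S\, x(t_f)$ and $L = \tfrac{1}{2}\,(x^T Q x + u^T R u)$.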
1.4 Solution methods of optimization
1.1 Overview
Chronological History of Feedback Control
1624  Drebbel, Incubator (hatch chicks)
1788  Watt, Flyball governor
1868  Maxwell, Flyball stability analysis
1877  Routh, Stability
1890  Liapunov, Nonlinear stability
1910  Sperry, Gyroscope and autopilot
1927  Black, Feedback electronic amplifier
1927  Bush, Differential analyzer
1932  Nyquist, Nyquist stability criterion
1936  Callender, PID controller
1938  Bode, Frequency response methods
1942  Wiener, Optimal filter design
1947  Hurewicz, Sampled data systems
1947  Nichols, Nichols chart
1948  Evans, Root locus
1950  Kochenberger, Nonlinear analysis
1956  Pontryagin, Minimum principle
1957  Bellman, Dynamic programming
1.1 Overview
Main branches of optimal control
Optimal control of distributed parameter systems;
Stochastic optimal control;
Adaptive optimal control;
Optimal control of large-scale systems;
Advanced math
Linear algebra
Automatic control theory (classical)
Linear control system (state-space method)
References (books)
Optimal Control, F. L. Lewis and V. L. Syrmos, John Wiley & Sons
Dynamic Programming and Optimal Control, D. P. Bertsekas, Athena Scientific
Dynamic Optimization, A. E. Bryson, Addison Wesley
自动控制原理(第二版)(下), 吴麟主编, 清华大学出版社, 2006
系统最优化及控制, 符曦, 机械工业出版社
最优控制理论与系统(第二版), 胡寿松, 王执铨, 胡维礼, 科学出版社
最优控制应用基础, 邢继祥, 科学出版社, 2003
Chapter 2 Static Optimization
Chapter 3 Variational Methods
Chapter 4 The Pontryagin Minimum Principle
Chapter 5 Discrete-Time Optimal Control
Chapter 6 Dynamic Programming
Control engineering;
Space technology;
System engineering;
Economic management;
Financial engineering;
……
1.1 Overview
Some basic courses for studying optimal control
1.2 Basic problems of optimization
Optimization problems (I)
1) Static Optimization (parameter optimization, function extremum)

$$\theta^* = \arg\min_{\theta} J(\theta) = \arg\min_{\theta} F(\theta_1, \theta_2, \ldots, \theta_n)$$
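As a minimal numerical illustration of this static problem (the objective and constraints below are placeholders invented for the example), the sketch hands $F$, the equality constraint $f$, and the inequality constraint $g$ to SciPy's general-purpose solver:

```python
# Constrained static (parameter) optimization:
#   minimize F(theta)  subject to  f(theta) = 0  and  g(theta) <= 0.
# F, f_eq and g_ineq are illustrative placeholder functions.
import numpy as np
from scipy.optimize import minimize

def F(theta):                       # objective J = F(theta_1, ..., theta_n)
    return (theta[0] - 1.0) ** 2 + (theta[1] + 2.0) ** 2

def f_eq(theta):                    # equality constraint  f(theta) = 0
    return theta[0] + theta[1] - 1.0

def g_ineq(theta):                  # inequality constraint  g(theta) <= 0
    return theta[0] - 3.0

constraints = [
    {"type": "eq", "fun": f_eq},
    # SciPy's convention is fun(theta) >= 0 for "ineq", so negate g.
    {"type": "ineq", "fun": lambda th: -g_ineq(th)},
]

res = minimize(F, x0=np.zeros(2), method="SLSQP", constraints=constraints)
print("theta* =", res.x, "  J* =", res.fun)
```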
Classical Control Theory
Modern Control Theory
1.1 Overview
Modern Control Theory
Multivariable linear system theory (State-space methods)
Optimal estimation, Kalman filtering