Estimation of Distribution Algorithms (lecture slides)
Jinan, 2009
Details
₪ The population is replaced by a probability vector
■ P = {p1, p2, …, pl}, where l is the string length
₪ pi: probability of a 1 in the ith bit
₪ Generate n individuals by sampling from P
₪ Update P using the best individual
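As a minimal illustration (not code from the slides; the function name and the 4-bit setup are ours), this is how n individuals can be sampled from such a probability vector:

```python
import random

def sample_population(P, n):
    """Sample n binary individuals; P[i] is the probability of a 1 in bit i."""
    return [[1 if random.random() < p else 0 for p in P] for _ in range(n)]

# Example: a 4-bit probability vector initialised to 0.5 per position
P = [0.5, 0.5, 0.5, 0.5]
individuals = sample_population(P, n=5)
```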
Flowchart
What Models to use?
Start with:
₪ Probability vector for binary strings
₪ Gaussian distribution
Later:
₪ Dependency tree models (COMIT)
₪ Bayesian networks
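For the continuous case, a Gaussian model can be fitted to the selected parents and then sampled. The sketch below assumes an independent (univariate) Gaussian per variable; the function name and the parent data are illustrative, not from the slides:

```python
import random
import statistics

def gaussian_eda_step(parents):
    """Fit an independent Gaussian per variable to the selected parents,
    then sample one new individual from that model."""
    columns = list(zip(*parents))                     # values of each variable
    means = [statistics.mean(c) for c in columns]
    stdevs = [statistics.pstdev(c) for c in columns]
    return [random.gauss(m, s) for m, s in zip(means, stdevs)]

# Illustrative parents in a 2-variable continuous search space
parents = [[0.9, 2.1], [1.1, 1.9], [1.0, 2.0]]
offspring = gaussian_eda_step(parents)
```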
Some applications
₪ Function optimization
₪ Job-shop scheduling
₪ TSP
₪ Bin packing
₪ Knapsack problem
₪ Neural network weight training
Estimation of Distribution Algorithms: general framework
₪ Estimation of Distribution Algorithms make this idea explicit: they build a probability model of good individuals and sample new solutions from it
₪ Typically they operate as follows:
■ Step 0: Randomly generate a set of individuals (t = 0)
■ Step 1: Evaluate the individuals
■ While (not done):
■ Step 2: Select individuals to be parents
■ Develop a probability distribution/density function pt based on the parents
■ Step 3: Create offspring by sampling from pt
■ Step 4: Evaluate the offspring
■ Step 5: The offspring replace the parents (t = t + 1)
■ Step 6: Go to While
■ The probability model captures the trend from the best performers
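The following Python sketch follows these steps for a binary problem. The OneMax fitness, the population sizes, and the choice of a univariate (bit-wise frequency) model are illustrative assumptions of ours, not prescribed by the slides:

```python
import random

def onemax(x):                  # illustrative fitness: number of 1s
    return sum(x)

def eda(length=20, pop_size=50, n_parents=25, generations=100):
    # Step 0: randomly generate a set of individuals (t = 0)
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for t in range(generations):                        # while (not done)
        # Steps 1-2: evaluate and select the best individuals as parents
        pop.sort(key=onemax, reverse=True)
        parents = pop[:n_parents]
        # Build the distribution p_t (here: bit-wise frequency of 1s among parents)
        p = [sum(col) / n_parents for col in zip(*parents)]
        # Steps 3-5: create offspring by sampling p_t; offspring replace the parents
        pop = [[1 if random.random() < pi else 0 for pi in p] for _ in range(pop_size)]
    return max(pop, key=onemax)

best = eda()
```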
Basic PBIL
₪ Initialize probability vector P (each position = 0.5)
₪ while (generations++ < limit)
■ for each vector i do
■   for each position j do
■     generate Vi(j) according to P(j)
■   end-do
■   evaluate f(Vi)
■ end-do
■ Vmax ← the Vi with maximal f(Vi)
■ update P according to Vmax
■ if random(0,1] < Pmutate
■   mutate P
■ end-if
₪ end-while
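A compact, runnable Python version of this loop is sketched below. The OneMax fitness and the parameter values are illustrative choices of ours; the P update follows the rule given later in the slides (pi ← α·xi + (1 - α)·pi):

```python
import random

def pbil(fitness, length, n=20, limit=100, alpha=0.1, p_mutate=0.02, m=0.05):
    P = [0.5] * length                                   # probability vector
    for generation in range(limit):                      # while (generations < limit)
        # generate n vectors V_i, with V_i(j) sampled according to P(j)
        V = [[1 if random.random() < P[j] else 0 for j in range(length)]
             for _ in range(n)]
        best = max(V, key=fitness)                       # V_max: best of this generation
        # update P towards the best vector: p_j <- alpha*x_j + (1 - alpha)*p_j
        P = [alpha * x + (1 - alpha) * p for x, p in zip(best, P)]
        # with probability p_mutate, shift P towards random values
        if random.random() < p_mutate:
            P = [m * random.random() + (1 - m) * p for p in P]
    return P

P_final = pbil(fitness=sum, length=10)                   # OneMax as illustrative fitness
```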
Probability Vector PMBGAs (probabilistic model-building genetic algorithms)
Estimation of Distribution Algorithms: probability vector
₪ This probability-vector EDA is known as the Univariate Marginal Distribution Algorithm (UMDA)
₪ Update P towards the best individual x (learning rate α):
■ pi(t+1) = α·xi + (1 - α)·pi(t),  i = 1, 2, …, l
₪ Mutate P (mutation shift m):
■ pi(t+1) = m·U[0,1) + (1 - m)·pi(t+1)
₪ Let's try this on the small example below
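A minimal Python transcription of these two rules, with alpha as the learning rate and m as the mutation shift; the function names are ours:

```python
import random

def update(P, best, alpha):
    """pi(t+1) = alpha * xi + (1 - alpha) * pi(t), applied position by position."""
    return [alpha * x + (1 - alpha) * p for x, p in zip(best, P)]

def mutate(P, m):
    """pi(t+1) = m * U[0,1) + (1 - m) * pi(t+1), applied position by position."""
    return [m * random.random() + (1 - m) * p for p in P]
```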
PBIL Example
₪ t = 0, P = {0.5, 0.5, 0.5, 0.5}
₪ Generate 5 individuals:
■ {1010, 1100, 0100, 0111, 0001}
₪ Fitness (number of 1s): {2, 2, 1, 3, 1}
₪ Best individual: 0111; α = 0.1
₪ Update P:
■ p1 = 0.5·(1 - 0.1) = 0.45
■ p2 = p3 = p4 = 0.1·1 + 0.5·(1 - 0.1) = 0.55
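The same update can be checked in a few lines of Python (the comprehension applies pi ← α·xi + (1 - α)·pi bit by bit):

```python
alpha = 0.1
P = [0.5, 0.5, 0.5, 0.5]
best = [0, 1, 1, 1]                                      # best individual 0111
P = [alpha * x + (1 - alpha) * p for x, p in zip(best, P)]
print([round(p, 2) for p in P])                          # [0.45, 0.55, 0.55, 0.55]
```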
Population-Based Incremental Learning (PBIL)
₪ Introduced by Baluja (1994)
₪ Population-based search, such as the GA:
■ Create a probability vector by counting the number of 1s and 0s in each gene position
■ Generate a new population using the probability vector
■ No information is carried from generation to generation!
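A minimal sketch of these two steps (our own illustration, using the five individuals from the example above):

```python
import random

def rebuild_population(pop):
    """Build a probability vector from the frequency of 1s in each gene
    position, then regenerate the whole population from it."""
    n = len(pop)
    P = [sum(col) / n for col in zip(*pop)]
    return [[1 if random.random() < p else 0 for p in P] for _ in range(n)]

# The five individuals from the PBIL example above
pop = [[1, 0, 1, 0], [1, 1, 0, 0], [0, 1, 0, 0], [0, 1, 1, 1], [0, 0, 0, 1]]
new_pop = rebuild_population(pop)
```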
Idea
₪ In a genetic algorithm, crossover and mutation may destroy individuals that are already well optimized. To avoid this, a new class of evolutionary algorithms, Estimation of Distribution Algorithms, emerged.
₪ An EDA has no crossover and no mutation. It mainly relies on a probability model of the good individuals, and on sampling from this model to produce the next generation.
GA to EDA
₪ Supervised competitive learning (CL), e.g. LVQ
■ Winner-take-all reinforcement learning in ANNs
■ The winner serves as a prototype of the samples presented
₪ PBIL = GA + CL