Gradient Ascent Algorithm (MATLAB)
function grad_ascent(x,y,z,px,py,N,mu,xstart,ystart)
% Gradient ascent on the surface z = func(x,y): starting from (xstart,ystart),
% take N steps of size mu along a forward-difference estimate of the gradient.
delta = 1e-6;                   % finite-difference step; the built-in eps is too
                                % small here (x + eps == x for |x| >= 1)
xga(1) = xstart;
yga(1) = ystart;
zga(1) = func(xga(1),yga(1));
for i = 1:N
    gradx = ( func(xga(i)+delta,yga(i)) - func(xga(i),yga(i)) )/delta;
    grady = ( func(xga(i),yga(i)+delta) - func(xga(i),yga(i)) )/delta;
    xga(i+1) = xga(i) + mu*gradx;
    yga(i+1) = yga(i) + mu*grady;
    zga(i+1) = func(xga(i+1),yga(i+1));
end
hold off
contour(x,y,z,10)               % level curves of the surface
hold on
quiver(x,y,px,py)               % gradient field
plot(xga,yga)                   % path taken by the ascent
S = sprintf('Gradient Ascent: N = %d, Step Size = %f',N,mu);
title(S)
xlabel('x axis')
ylabel('y axis')
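The same forward-difference ascent loop can be sketched in Python. Since the listing never shows `func`, the sketch below assumes a hypothetical test surface f(x, y) = exp(-(x² + y²)), a single peak at the origin:

```python
import math

def f(x, y):
    # Hypothetical test surface (not the listing's func): a Gaussian bump at (0, 0).
    return math.exp(-(x**2 + y**2))

def grad_ascent(f, xstart, ystart, N, mu, delta=1e-6):
    """Climb f by forward-difference gradient estimates; return the visited path."""
    path = [(xstart, ystart)]
    x, y = xstart, ystart
    for _ in range(N):
        gradx = (f(x + delta, y) - f(x, y)) / delta
        grady = (f(x, y + delta) - f(x, y)) / delta
        x, y = x + mu * gradx, y + mu * grady
        path.append((x, y))
    return path

# Same start point and parameters as the demo's first run.
path = grad_ascent(f, 0.9 * 1.5, -0.3 * 1.5, N=100, mu=0.02)
print(path[-1])
```

Each step moves uphill by mu times the estimated gradient, so the path ends closer to the peak than it started; how close depends on N and mu, as the demo's three runs illustrate.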
DEMO
clear
print_flag = 1;                 % 1: print each figure; 0: pause between figures
width = 1.5;
xord = -width:.15:width;
yord = -width:.15:width;
[x,y] = meshgrid(xord,yord);
z = func(x,y);
hold off
surfl(x,y,z)                    % shaded surface plot of the function
xlabel('x axis')
ylabel('y axis')
if print_flag, print
else, input('Continue?'), end
[px,py] = gradient(z,.15,.15);  % spacing matches the .15 grid step above
xstart = 0.9*width;
ystart =-0.3*width;
N = 100;
mu = 0.02;
grad_ascent(x,y,z,px,py,N,mu,xstart,ystart)
if print_flag, print
else, input('Continue?'), end
N = 100;
mu = 0.06;
grad_ascent(x,y,z,px,py,N,mu,xstart,ystart)
if print_flag, print
else, input('Continue?'), end
N = 100;
mu = 0.18;
grad_ascent(x,y,z,px,py,N,mu,xstart,ystart)
if print_flag, print
else, input('Continue?'), end
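The three runs differ only in the step size mu (0.02, 0.06, 0.18) with N fixed at 100, which is what the plots are meant to compare. A self-contained Python sketch of that comparison, again on a hypothetical Gaussian test surface since the listing's `func` is not shown:

```python
import math

def f(x, y):
    # Hypothetical test surface (not the listing's func): a Gaussian bump at (0, 0).
    return math.exp(-(x**2 + y**2))

def climb(mu, N=100, x=1.35, y=-0.45, delta=1e-6):
    """Forward-difference gradient ascent from the demo's start point;
    returns the final height f(x, y) reached after N steps."""
    for _ in range(N):
        gx = (f(x + delta, y) - f(x, y)) / delta
        gy = (f(x, y + delta) - f(x, y)) / delta
        x, y = x + mu * gx, y + mu * gy
    return f(x, y)

for mu in (0.02, 0.06, 0.18):
    print(f"mu = {mu}: final height = {climb(mu):.4f}")
```

On this smooth single-peak surface a larger mu simply climbs faster within the same N steps; on surfaces with sharper curvature, a step size as large as 0.18 can instead overshoot and oscillate around the peak, which is the behavior the demo's plots are set up to reveal.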