Exercise 2 uses two functions: optimset and fminunc.

%  Set options for fminunc
options = optimset('GradObj', 'on', 'MaxIter', 400);
% Set the GradObj option to 'on', which tells fminunc that our function
% returns both the cost and the gradient. This allows fminunc to use the
% gradient when minimizing the function.
% Set the MaxIter option to 400, so that fminunc will run for at most 400
% steps before it terminates.

%  Run fminunc to obtain the optimal theta
%  This function will return theta and the cost
[theta, cost] = ...
    fminunc(@(t)(costFunction(t, X, y)), initial_theta, options);
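As a rough sketch of the two outputs that costFunction must provide when GradObj is 'on' (this is not the course's actual ex2 code, and the tiny data set is made up for illustration), here is the logistic-regression cost and gradient in plain Python:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def cost_function(theta, X, y):
    """Return both outputs costFunction provides when GradObj is 'on':
    the logistic-regression cost J and its gradient."""
    m = len(y)
    # hypothesis h(x) = sigmoid(theta' * x) for every row of X
    h = [sigmoid(sum(t * x for t, x in zip(theta, row))) for row in X]
    J = sum(-yi * math.log(hi) - (1 - yi) * math.log(1 - hi)
            for hi, yi in zip(h, y)) / m
    grad = [sum((hi - yi) * row[j] for hi, yi, row in zip(h, y, X)) / m
            for j in range(len(theta))]
    return J, grad

# Made-up data set: a bias column plus one feature
X = [[1.0, 0.5], [1.0, 1.5], [1.0, -0.5], [1.0, -1.5]]
y = [1, 1, 0, 0]
J, grad = cost_function([0.0, 0.0], X, y)
print(round(J, 4))  # 0.6931, i.e. log(2): at theta = 0 every prediction is 0.5
```

At theta = 0 the hypothesis is 0.5 for every example, so the cost equals log(2) regardless of the data, which makes a convenient sanity check.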

1. Inspecting the optimset function with help

>> help optimset
 optimset Create/alter optimization OPTIONS structure.
    OPTIONS = optimset('PARAM1',VALUE1,'PARAM2',VALUE2,...) creates an
    optimization options structure OPTIONS in which the named parameters have
    the specified values.  Any unspecified parameters are set to [] (parameters
    with value [] indicate to use the default value for that parameter when
    OPTIONS is passed to the optimization function). It is sufficient to type
    only the leading characters that uniquely identify the parameter.  Case is
    ignored for parameter names.
    NOTE: For values that are strings, the complete string is required.

    OPTIONS = optimset(OLDOPTS,'PARAM1',VALUE1,...) creates a copy of OLDOPTS
    with the named parameters altered with the specified values.

    OPTIONS = optimset(OLDOPTS,NEWOPTS) combines an existing options structure
    OLDOPTS with a new options structure NEWOPTS.  Any parameters in NEWOPTS
    with non-empty values overwrite the corresponding old parameters in
    OLDOPTS.

    optimset with no input arguments and no output arguments displays all
    parameter names and their possible values, with defaults shown in {}
    when the default is the same for all functions that use that parameter.
    Use optimset(OPTIMFUNCTION) to see parameters for a specific function.

    OPTIONS = optimset (with no input arguments) creates an options structure
    OPTIONS where all the fields are set to [].

    OPTIONS = optimset(OPTIMFUNCTION) creates an options structure with all
    the parameter names and default values relevant to the optimization
    function named in OPTIMFUNCTION. For example,
        optimset('fminbnd')
    or
        optimset(@fminbnd)
    returns an options structure containing all the parameter names and
    default values relevant to the function 'fminbnd'.

    optimset PARAMETERS for MATLAB
    Display     - Level of display [ off | iter | notify | final ]
    MaxFunEvals - Maximum number of function evaluations allowed
                  [ positive integer ]
    MaxIter     - Maximum number of iterations allowed [ positive scalar ]
    TolFun      - Termination tolerance on the function value [ positive scalar ]
    TolX        - Termination tolerance on X [ positive scalar ]
    FunValCheck - Check for invalid values, such as NaN or complex, from
                  user-supplied functions [ {off} | on ]
    OutputFcn   - Name(s) of output function [ {[]} | function ]
                  All output functions are called by the solver after each
                  iteration.
    PlotFcns    - Name(s) of plot function [ {[]} | function ]
                  Function(s) used to plot various quantities in every iteration

    Note to Optimization Toolbox users:
    To see the parameters for a specific function, check the documentation
    page for that function. For instance, enter
        doc fmincon
    to open the reference page for fmincon.
    You can also see the options in the Optimization Tool. Enter
        optimtool

    Examples:
    To create an options structure with the default parameters for FZERO
        options = optimset('fzero');
    To create an options structure with TolFun equal to 1e-3
        options = optimset('TolFun',1e-3);
    To change the Display value of options to 'iter'
        options = optimset(options,'Display','iter');

    See also optimget, fzero, fminbnd, fminsearch, lsqnonneg.

    Reference page for optimset
    Other functions named optimset

The optimset function creates or alters an optimization OPTIONS structure.
OPTIONS = optimset('PARAM1',VALUE1,'PARAM2',VALUE2,...) creates an optimization options structure OPTIONS in which the named parameters are set to the specified values; any unspecified parameter is left as [], meaning the optimization function will fall back to its own default for that parameter when OPTIONS is passed to it.
OPTIONS = optimset() creates an options structure OPTIONS with all fields set to [].
OPTIONS = optimset(OPTIMFUNCTION) creates an options structure for the optimization function OPTIMFUNCTION, with every parameter set to that function's default value.
For example, optimset('fminbnd') or optimset(@fminbnd) returns an options structure for 'fminbnd' in which every parameter takes its 'fminbnd' default.
Given the above, we can run optimset('fminunc') to inspect fminunc's parameters.

>> optimset('fminunc')
ans = 
                   Display: 'final'
               MaxFunEvals: '100*numberofvariables'
                   MaxIter: 400
                    TolFun: 1.0000e-06
                      TolX: 1.0000e-06
               FunValCheck: 'off'
                 OutputFcn: []
                  PlotFcns: []
           ActiveConstrTol: []
                 Algorithm: []
    AlwaysHonorConstraints: []
           DerivativeCheck: 'off'
               Diagnostics: 'off'
             DiffMaxChange: Inf
             DiffMinChange: 0
            FinDiffRelStep: []
               FinDiffType: 'forward'
         GoalsExactAchieve: []
                GradConstr: []
                   GradObj: 'off'
                   HessFcn: []
                   Hessian: 'off'
                  HessMult: []
               HessPattern: 'sparse(ones(numberofvariables))'
                HessUpdate: 'bfgs'
           InitialHessType: 'scaled-identity'
         InitialHessMatrix: []
          InitBarrierParam: []
     InitTrustRegionRadius: []
                  Jacobian: []
                 JacobMult: []
              JacobPattern: []
                LargeScale: 'on'
                  MaxNodes: []
                MaxPCGIter: 'max(1,floor(numberofvariables/2))'
             MaxProjCGIter: []
                MaxSQPIter: []
                   MaxTime: []
             MeritFunction: []
                 MinAbsMax: []
        NoStopIfFlatInfeas: []
            ObjectiveLimit: -1.0000e+20
      PhaseOneTotalScaling: []
            Preconditioner: []
          PrecondBandWidth: 0
            RelLineSrchBnd: []
    RelLineSrchBndDuration: []
              ScaleProblem: []
                   Simplex: []
       SubproblemAlgorithm: []
                    TolCon: []
                 TolConSQP: []
                TolGradCon: []
                    TolPCG: 0.1000
                 TolProjCG: []
              TolProjCGAbs: []
                  TypicalX: 'ones(numberofvariables,1)'
               UseParallel: 0

So options = optimset('GradObj', 'on', 'MaxIter', 400) creates an options structure named options that sets 'GradObj' to 'on' (use the user-supplied gradient function) and 'MaxIter' to 400 (a maximum of 400 iterations).
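Conceptually, optimset merges user overrides into a table of solver defaults. A minimal Python sketch of that merge semantics (not MATLAB's actual implementation; the default values shown are taken from the optimset('fminunc') listing above):

```python
# Defaults as reported by optimset('fminunc'): GradObj is 'off', MaxIter 400
defaults = {"GradObj": "off", "MaxIter": 400, "TolFun": 1e-6}

def optimset_like(**overrides):
    """optimset-style merge: named parameters win, the rest keep defaults."""
    opts = dict(defaults)   # start from the solver defaults
    opts.update(overrides)  # named parameters override them
    return opts

options = optimset_like(GradObj="on", MaxIter=400)
print(options["GradObj"])  # "on": the override replaces the default "off"
```

Anything not named, such as TolFun here, keeps its default, which is exactly what the help text means by "unspecified parameters ... use the default value".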
2. The statement [theta, cost] = fminunc(@(t)(costFunction(t, X, y)), initial_theta, options)
Use help fminunc to see how the fminunc function is used.

>> help fminunc
 fminunc finds a local minimum of a function of several variables.
    X = fminunc(FUN,X0) starts at X0 and attempts to find a local minimizer
    X of the function FUN. FUN accepts input X and returns a scalar
    function value F evaluated at X. X0 can be a scalar, vector or matrix.

    X = fminunc(FUN,X0,OPTIONS) minimizes with the default optimization
    parameters replaced by values in OPTIONS, an argument created with the
    OPTIMOPTIONS function.  See OPTIMOPTIONS for details. Use the
    SpecifyObjectiveGradient option to specify that FUN also returns a
    second output argument G that is the partial derivatives of the
    function df/dX, at the point X. Use the HessianFcn option to specify
    that FUN also returns a third output argument H that is the 2nd partial
    derivatives of the function (the Hessian) at the point X. The Hessian
    is only used by the trust-region algorithm.

    X = fminunc(PROBLEM) finds the minimum for PROBLEM. PROBLEM is a
    structure with the function FUN in PROBLEM.objective, the start point
    in PROBLEM.x0, the options structure in PROBLEM.options, and solver
    name 'fminunc' in PROBLEM.solver. Use this syntax to solve at the
    command line a problem exported from OPTIMTOOL.

    [X,FVAL] = fminunc(FUN,X0,...) returns the value of the objective
    function FUN at the solution X.

    [X,FVAL,EXITFLAG] = fminunc(FUN,X0,...) returns an EXITFLAG that
    describes the exit condition. Possible values of EXITFLAG and the
    corresponding exit conditions are listed below. See the documentation
    for a complete description.
       1  Magnitude of gradient small enough.
       2  Change in X too small.
       3  Change in objective function too small.
       5  Cannot decrease function along search direction.
       0  Too many function evaluations or iterations.
      -1  Stopped by output/plot function.
      -3  Problem seems unbounded.

    [X,FVAL,EXITFLAG,OUTPUT] = fminunc(FUN,X0,...) returns a structure
    OUTPUT with the number of iterations taken in OUTPUT.iterations, the
    number of function evaluations in OUTPUT.funcCount, the algorithm used
    in OUTPUT.algorithm, the number of CG iterations (if used) in
    OUTPUT.cgiterations, the first-order optimality (if used) in
    OUTPUT.firstorderopt, and the exit message in OUTPUT.message.

    [X,FVAL,EXITFLAG,OUTPUT,GRAD] = fminunc(FUN,X0,...) returns the value
    of the gradient of FUN at the solution X.

    [X,FVAL,EXITFLAG,OUTPUT,GRAD,HESSIAN] = fminunc(FUN,X0,...) returns
    the value of the Hessian of the objective function FUN at the
    solution X.

    Examples
    FUN can be specified using @:
        X = fminunc(@myfun,2)
    where myfun is a MATLAB function such as:
        function F = myfun(x)
        F = sin(x) + 3;

    To minimize this function with the gradient provided, modify
    the function myfun so the gradient is the second output argument:
        function [f,g] = myfun(x)
        f = sin(x) + 3;
        g = cos(x);
    and indicate the gradient value is available by creating options with
    OPTIONS.SpecifyObjectiveGradient set to true (using OPTIMOPTIONS):
        options = optimoptions('fminunc','SpecifyObjectiveGradient',true);
        x = fminunc(@myfun,4,options);

    FUN can also be an anonymous function:
        x = fminunc(@(x) 5*x(1)^2 + x(2)^2,[5;1])

    If FUN is parameterized, you can use anonymous functions to capture
    the problem-dependent parameters. Suppose you want to minimize the
    objective given in the function myfun, which is parameterized by its
    second argument c. Here myfun is a MATLAB file function such as
        function [f,g] = myfun(x,c)
        f = c*x(1)^2 + 2*x(1)*x(2) + x(2)^2; % function
        g = [2*c*x(1) + 2*x(2)               % gradient
             2*x(1) + 2*x(2)];
    To optimize for a specific value of c, first assign the value to c.
    Then create a one-argument anonymous function that captures that value
    of c and calls myfun with two arguments. Finally, pass this anonymous
    function to fminunc:
        c = 3;                              % define parameter first
        options = optimoptions('fminunc','SpecifyObjectiveGradient',true); % indicate gradient is provided
        x = fminunc(@(x) myfun(x,c),[1;1],options)

    See also optimoptions, fminsearch, fminbnd, fmincon, @, inline.

    Reference page for fminunc

fminunc is a function that finds a local minimum.
X = fminunc(FUN,X0) starts at X0 and attempts to find a local minimizer of the function FUN. FUN accepts an input X and returns a scalar value evaluated at X; X0 can be a scalar, vector, or matrix.
X = fminunc(FUN,X0,OPTIONS) minimizes with the optimization parameters in OPTIONS instead of the defaults.
For example, FUN can be specified with a function handle:
X = fminunc(@myfun,2)
function F = myfun(x)
F = sin(x) + 3;
To use a user-supplied gradient, modify myfun so that its second output argument is the gradient, and create an options structure options accordingly:
function [f,g] = myfun(x)
f = sin(x) + 3;
g = cos(x);
options = optimoptions('fminunc','SpecifyObjectiveGradient',true);
x = fminunc(@myfun,4,options);
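Since MATLAB is not shown here, the way a solver consumes the (value, gradient) pair can be sketched in Python with a bare-bones gradient descent. The fixed step size and iteration count are illustrative assumptions; fminunc's real algorithm is quasi-Newton, not this simple loop:

```python
import math

def myfun(x):
    # Same (value, gradient) pair as the MATLAB myfun above
    return math.sin(x) + 3, math.cos(x)

def descend(fun, x0, step=0.5, iters=100):
    """Toy gradient descent: a stand-in for how a solver uses the
    user-supplied gradient (fminunc actually uses quasi-Newton steps)."""
    x = x0
    for _ in range(iters):
        _f, g = fun(x)   # ask the objective for value and gradient
        x -= step * g    # move against the gradient
    return x

x = descend(myfun, 4.0)
print(round(x, 4))  # 4.7124, i.e. 3*pi/2: the local minimizer near x0 = 4
```

Starting from x0 = 4, the iteration settles on 3*pi/2, where the supplied gradient cos(x) vanishes and sin(x) + 3 attains its local minimum of 2.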

In summary, the statement [theta, cost] = fminunc(@(t)(costFunction(t, X, y)), initial_theta, options) means:
(1) @(t)(costFunction(t, X, y)) is an anonymous function. costFunction itself takes three inputs (theta, X, y) and returns two outputs (the cost J and the gradient). Wrapping it in the anonymous function leaves a single input t, with X and y captured as fixed values (both were defined earlier). This matches what fminunc expects for its first argument: a function of one input whose first output is the value to minimize and whose second output is the user-supplied gradient;
(2) initial_theta is the starting point of the search for a local minimum;
(3) options is the configuration for fminunc: at most 400 iterations, with the user-supplied gradient enabled ('GradObj' set to 'on');
(4) Per the help text "[X,FVAL] = fminunc(FUN,X0,...) returns the value of the objective function FUN at the solution X",
[theta, cost] = fminunc(@(t)(costFunction(t, X, y)), initial_theta, options) returns the local minimizer theta of costFunction together with cost, the value of costFunction at theta.
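The role of the anonymous function can be mirrored with a Python lambda: X and y are captured from the enclosing scope, leaving t as the only free argument, which is exactly the shape of function a minimizer expects. The cost function below is a made-up quadratic stand-in, not the course's costFunction:

```python
def cost_function(theta, X, y):
    # Made-up quadratic stand-in for the course's costFunction. X is
    # captured but unused here; the value is minimized when every
    # component of theta equals len(y).
    return sum((t - len(y)) ** 2 for t in theta)

X = [[1.0, 2.0]]   # placeholder design matrix
y = [0, 1]         # placeholder labels

# Mirrors @(t)(costFunction(t, X, y)): X and y are frozen, only t is free
objective = lambda t: cost_function(t, X, y)

print(objective([2.0, 2.0]))  # 0.0 at the minimizer [2.0, 2.0]
```

A minimizer handed objective never sees X or y; it only ever supplies candidate values of t, just as fminunc only supplies candidate theta vectors.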
