The design matrix X (in the bottom right of the slide) given in the example should have elements $x_1^{(1)}, \dots, x_1^{(m)}$ (subscript 1, with superscripts running from 1 to m), because across all m training examples there are only two features, $x_0$ and $x_1$. The X matrix is m by (n+1), NOT n by n.
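
To make the shape concrete, here is a minimal NumPy sketch that builds such a design matrix; the feature values and variable names (x1, X) are made up for illustration only:

```python
import numpy as np

# Hypothetical data: m = 4 training examples with a single feature x1.
x1 = np.array([2104.0, 1416.0, 1534.0, 852.0])
m = x1.shape[0]

# Design matrix X is m x (n+1): a column of ones (the x0 term)
# followed by one column per feature.
X = np.column_stack([np.ones(m), x1])
print(X.shape)  # (4, 2): m rows, n + 1 = 2 columns
```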

Gradient descent gives one way of minimizing J. Let’s discuss a second way of doing so, this time performing the minimization explicitly and without resorting to an iterative algorithm. In the “Normal Equation” method, we will minimize J by explicitly taking its derivatives with respect to the $\theta_j$'s and setting them to zero. This allows us to find the optimum $\theta$ without iteration. The normal equation formula is given below:
$\theta = (X^TX)^{-1}X^Ty$

There is no need to do feature scaling with the normal equation.
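
Below is a minimal NumPy sketch of the normal equation applied to the toy design matrix above; the target values y are made up, and np.linalg.pinv is used rather than a plain inverse so the computation still succeeds if $X^TX$ happens to be singular (that choice is an assumption, not something stated in the notes):

```python
import numpy as np

# Hypothetical training data: m = 4 examples, intercept column plus one feature.
X = np.array([[1.0, 2104.0],
              [1.0, 1416.0],
              [1.0, 1534.0],
              [1.0,  852.0]])
y = np.array([460.0, 232.0, 315.0, 178.0])

# Normal equation: theta = (X^T X)^{-1} X^T y
theta = np.linalg.pinv(X.T @ X) @ X.T @ y
print(theta)  # fitted parameters, one per column of X
```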
The following is a comparison of gradient descent and the normal equation:

| Gradient Descent | Normal Equation |
| --- | --- |
| Need to choose alpha | No need to choose alpha |
| Needs many iterations | No need to iterate |
| $O(kn^2)$ | $O(n^3)$, need to calculate the inverse of $X^TX$ |
| Works well when n is large | Slow if n is very large |

With the normal equation, computing the inversion has complexity $O(n^3)$. So if we have a very large number of features, the normal equation will be slow. In practice, when n exceeds 10,000 it might be a good time to go from a normal solution to an iterative process.
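
For contrast, here is a minimal batch gradient descent sketch for the same linear regression cost J; the learning rate alpha and iteration count are arbitrary illustrative choices, and with gradient descent the feature columns would normally be scaled first (unlike with the normal equation):

```python
import numpy as np

def gradient_descent(X, y, alpha=0.01, num_iters=1000):
    """Iteratively minimize J(theta); X is m x (n+1), y has length m.
    Assumes the feature columns of X have already been scaled."""
    m, n_plus_1 = X.shape
    theta = np.zeros(n_plus_1)
    for _ in range(num_iters):
        # Gradient of J(theta) = (1/(2m)) * ||X theta - y||^2  is  (1/m) * X^T (X theta - y)
        grad = (X.T @ (X @ theta - y)) / m
        theta -= alpha * grad
    return theta
```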
