Linear Regression with Multiple Variables

5 questions

1.

Suppose m=4 students have taken some class, and the class had a midterm exam and a final exam. You have collected a dataset of their scores on the two exams, which is as follows:

midterm exam    (midterm exam)²    final exam
89              7921               96
72              5184               74
94              8836               87
69              4761               78

You'd like to use polynomial regression to predict a student's final exam score from their midterm exam score. Concretely, suppose you want to fit a model of the form $h_\theta(x) = \theta_0 + \theta_1 x_1 + \theta_2 x_2$, where $x_1$ is the midterm score and $x_2$ is (midterm score)². Further, you plan to use both feature scaling (dividing by the "max − min", or range, of a feature) and mean normalization.

What is the normalized feature $x_1^{(1)}$? (Hint: midterm = 89, final = 96 is training example 1.) Round your answer to two decimal places.
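As a check on the arithmetic: the mean of the midterm column is (89 + 72 + 94 + 69)/4 = 81 and its range is 94 − 69 = 25, so example 1 normalizes to (89 − 81)/25 = 0.32. A minimal NumPy sketch of the same computation (the library choice is mine, not the quiz's):

```python
import numpy as np

# Midterm scores for the m=4 training examples, from the table above.
midterm = np.array([89.0, 72.0, 94.0, 69.0])

mean = midterm.mean()                # (89 + 72 + 94 + 69) / 4 = 81.0
rng = midterm.max() - midterm.min()  # 94 - 69 = 25.0

# Mean normalization followed by range ("max - min") scaling.
x1 = (midterm - mean) / rng
print(round(x1[0], 2))               # 0.32 -> normalized x_1 for example 1
```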

2.

You run gradient descent for 15 iterations with α=0.3 and compute J(θ) after each iteration. You find that the value of J(θ) increases over time. Based on this, which of the following conclusions seems most plausible?

Rather than use the current value of α, it'd be more promising to try a smaller value of α (say α=0.1).

Rather than use the current value of α, it'd be more promising to try a larger value of α (say α=1.0).

α=0.3 is an effective choice of learning rate.
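An increasing J(θ) is the classic symptom of a learning rate that is too large: each step overshoots the minimum and lands farther away than where it started. A toy illustration on J(θ) = θ² (my own example, not the quiz's cost function):

```python
# Gradient descent on J(theta) = theta^2, whose gradient is 2 * theta.
# The update theta -= alpha * 2 * theta multiplies theta by (1 - 2*alpha),
# so any alpha > 1 makes |theta| (and hence J) grow on every iteration.
def j_history(alpha, theta=10.0, iters=15):
    history = []
    for _ in range(iters):
        theta -= alpha * 2 * theta
        history.append(theta ** 2)   # J(theta) after this iteration
    return history

print(j_history(1.1)[:3])  # [144.0, 207.36, 298.5984] -- J increases: diverging
print(j_history(0.1)[:3])  # [64.0, 40.96, 26.2144]    -- J decreases: converging
```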

3.

Suppose you have m=23 training examples with n=5 features (excluding the additional all-ones feature for the intercept term, which you should add). The normal equation is $\theta = (X^T X)^{-1} X^T y$. For the given values of m and n, what are the dimensions of θ, X, and y in this equation?

X is 23×6, y is 23×6, θ is 6×6

X is 23×5, y is 23×1, θ is 5×5

X is 23×6, y is 23×1, θ is 6×1

X is 23×5, y is 23×1, θ is 5×1
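The dimensions follow from the construction: the all-ones column makes X an m × (n+1) matrix, y stacks the m labels into a column, and θ has one entry per column of X. A quick NumPy check with placeholder random data (np.linalg.solve stands in for the explicit inverse, a standard numerical substitution):

```python
import numpy as np

m, n = 23, 5
X = np.hstack([np.ones((m, 1)), np.random.rand(m, n)])  # 23 x 6 after the ones column
y = np.random.rand(m, 1)                                # 23 x 1

theta = np.linalg.solve(X.T @ X, X.T @ y)  # normal equation: theta = (X^T X)^-1 X^T y
print(X.shape, y.shape, theta.shape)       # (23, 6) (23, 1) (6, 1)
```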

4.

Suppose you have a dataset with m=1000000 examples and n=200000 features for each example. You want to use multivariate linear regression to fit the parameters θ to your data. Should you prefer gradient descent or the normal equation?

The normal equation, since gradient descent might be unable to find the optimal θ.

The normal equation, since it provides an efficient way to directly find the solution.

Gradient descent, since it will always converge to the optimal θ.

Gradient descent, since $(X^T X)^{-1}$ will be very slow to compute in the normal equation.
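A back-of-the-envelope estimate makes the trade-off concrete (the O(n³) inversion cost is the standard textbook figure; the arithmetic below is mine):

```python
n = 200_000                      # number of features

entries = n * n                  # X^T X is n x n: 4e10 entries
storage_gb = entries * 8 / 1e9   # ~320 GB just to store it in float64
inverse_flops = n ** 3           # inverting it is O(n^3): ~8e15 operations

print(f"{storage_gb:.0f} GB, about {inverse_flops:.1e} flops")
```

At this scale even forming $X^T X$ is impractical, while each gradient descent step costs only O(mn).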

5.

Which of the following are reasons for using feature scaling?

It prevents the matrix $X^T X$ (used in the normal equation) from being non-invertible (singular/degenerate).

It speeds up gradient descent by making it require fewer iterations to get to a good solution.

It is necessary to prevent the normal equation from getting stuck in local optima.

It speeds up gradient descent by making each iteration of gradient descent less expensive to compute.
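The iteration-count claim is easy to see on a contrived two-feature problem where one feature's range dwarfs the other's: the cost surface becomes a long narrow valley, which forces a tiny α and many more steps. A rough sketch (data, learning rates, and tolerance are all made up for illustration):

```python
import numpy as np

def iters_to_converge(X, y, alpha, tol=1e-6, max_iters=100_000):
    theta = np.zeros(X.shape[1])
    for i in range(1, max_iters + 1):
        grad = X.T @ (X @ theta - y) / len(y)  # least-squares gradient
        theta -= alpha * grad
        if np.linalg.norm(grad) < tol:
            return i
    return max_iters                           # did not converge within the cap

gen = np.random.default_rng(0)
x1 = gen.uniform(0, 1, 100)       # feature on [0, 1]
x2 = gen.uniform(0, 2000, 100)    # feature on [0, 2000]: badly scaled
y = 3 * x1 + 0.01 * x2 + gen.normal(0, 0.1, 100)

X_raw = np.column_stack([x1, x2])
X_scaled = (X_raw - X_raw.mean(0)) / (X_raw.max(0) - X_raw.min(0))

print(iters_to_converge(X_scaled, y, alpha=0.5))  # converges quickly
print(iters_to_converge(X_raw, y, alpha=1e-6))    # tiny alpha needed; crawls toward the cap
```

Note that scaling leaves the per-iteration cost unchanged; only the number of iterations drops, which is why the last option above is not a correct reason.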
