Following the lecture notes, the first programming assignment felt quite easy. Most of the code is already provided; you only have to write a small piece of the algorithm yourself, which really just amounts to turning a few mathematical formulas into code. The code is below.
PS: Even in the one-variable case I vectorized the per-variable computations, so the cost function and gradient descent below carry over unchanged to the multi-feature case.
Cost function:
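For reference, the quantity this function computes is the standard squared-error cost from the lectures, with the hypothesis written as an inner product so it works for any number of features:

$$J(\theta) = \frac{1}{2m}\sum_{i=1}^{m}\bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)^2, \qquad h_\theta(x) = \theta^T x$$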

function J = computeCost(X, y, theta)
%COMPUTECOST Compute cost for linear regression
%   J = COMPUTECOST(X, y, theta) computes the cost of using theta as the
%   parameter for linear regression to fit the data points in X and y

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta
%               You should set J to the cost.

s = 0;                               % accumulator for the squared errors
for iter = 1:m
    h = theta' * X(iter,:)';         % hypothesis for the iter-th example
    s = s + (h - y(iter))^2;
end
J = s/(2*m);

% =========================================================================

end
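Since the loop body already treats each row of X as a full feature vector, the loop itself can be dropped; a minimal vectorized sketch that computes the same J (assuming the usual m-by-(n+1) design matrix with a leading column of ones):

J = sum((X*theta - y).^2) / (2*m);   % vectorized form of the loop above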

Gradient descent:
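The update implemented below is the batch rule from the lectures, applied simultaneously to every parameter:

$$\theta_j := \theta_j - \frac{\alpha}{m}\sum_{i=1}^{m}\bigl(h_\theta(x^{(i)}) - y^{(i)}\bigr)\,x_j^{(i)}$$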

function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
%GRADIENTDESCENT Performs gradient descent to learn theta
%   theta = GRADIENTDESCENT(X, y, theta, alpha, num_iters) updates theta by
%   taking num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);

for iter = 1:num_iters

    % ====================== YOUR CODE HERE ======================
    % Instructions: Perform a single gradient step on the parameter vector
    %               theta.
    %
    % Hint: While debugging, it can be useful to print out the values
    %       of the cost function (computeCost) and gradient here.
    %

    s = zeros(size(theta));          % accumulator for the gradient sum
    for j = 1:m
        h = theta' * X(j,:)';        % hypothesis for the j-th example
        s = s + (h - y(j)) * X(j,:)';
    end
    theta = theta - alpha * s / m;   % simultaneous update of all parameters

    % ============================================================

    % Save the cost J in every iteration
    J_history(iter) = computeCost(X, y, theta);

end

end
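As with the cost, the inner loop over examples collapses into matrix operations; a minimal sketch of one vectorized iteration, equivalent to the accumulation above:

theta = theta - (alpha/m) * X' * (X*theta - y);   % all parameters updated at once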

Normal Equations:
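That is, the code below evaluates

$$\theta = (X^T X)^{-1} X^T y$$

directly, with no learning rate and no iteration.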

function [theta] = normalEqn(X, y)
%NORMALEQN Computes the closed-form solution to linear regression
%   NORMALEQN(X,y) computes the closed-form solution to linear
%   regression using the normal equations.

theta = zeros(size(X, 2), 1);

% ====================== YOUR CODE HERE ======================
% Instructions: Complete the code to compute the closed form solution
%               to linear regression and put the result in theta.
%

% ---------------------- Sample Solution ----------------------

theta = inv(X'*X)*X'*y;

% -------------------------------------------------------------

% ============================================================

end
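One caveat on this sample solution: inv(X'*X) breaks down (or becomes numerically unstable) when X'*X is singular or nearly so, for example with redundant features. A more robust sketch uses the pseudo-inverse, or MATLAB/Octave's least-squares backslash operator:

theta = pinv(X'*X) * X'*y;   % pseudo-inverse handles a (near-)singular X'*X
% or equivalently as a least-squares solve:
theta = X \ y;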

ex1_multi.m:

%% Machine Learning Online Class
%  Exercise 1: Linear regression with multiple variables
%
%  Instructions
%  ------------
%
%  This file contains code that helps you get started on the
%  linear regression exercise.
%
%  You will need to complete the following functions in this
%  exercise:
%
%     warmUpExercise.m
%     plotData.m
%     gradientDescent.m
%     computeCost.m
%     gradientDescentMulti.m
%     computeCostMulti.m
%     featureNormalize.m
%     normalEqn.m
%
%  For this part of the exercise, you will need to change some
%  parts of the code below for various experiments (e.g., changing
%  learning rates).
%

%% Initialization

%% ================ Part 1: Feature Normalization ================

%% Clear and Close Figures
clear ; close all; clc

fprintf('Loading data ...\n');

%% Load Data
data = load('ex1data2.txt');
X = data(:, 1:2);
y = data(:, 3);
m = length(y);

% Print out some data points
fprintf('First 10 examples from the dataset: \n');
fprintf(' x = [%.0f %.0f], y = %.0f \n', [X(1:10,:) y(1:10,:)]');

fprintf('Program paused. Press enter to continue.\n');
pause;

% Scale features and set them to zero mean
fprintf('Normalizing Features ...\n');

[X mu sigma] = featureNormalize(X);

% Add intercept term to X
X = [ones(m, 1) X];

%% ================ Part 2: Gradient Descent ================

% ====================== YOUR CODE HERE ======================
% Instructions: We have provided you with the following starter
%               code that runs gradient descent with a particular
%               learning rate (alpha).
%
%               Your task is to first make sure that your functions -
%               computeCost and gradientDescent already work with
%               this starter code and support multiple variables.
%
%               After that, try running gradient descent with
%               different values of alpha and see which one gives
%               you the best result.
%
%               Finally, you should complete the code at the end
%               to predict the price of a 1650 sq-ft, 3 br house.
%
% Hint: By using the 'hold on' command, you can plot multiple
%       graphs on the same figure.
%
% Hint: At prediction, make sure you do the same feature normalization.
%

fprintf('Running gradient descent ...\n');

% Choose some alpha value
alpha = 0.01;
num_iters = 400;

% Init Theta and Run Gradient Descent
theta = zeros(3, 1);
[theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters);

% Plot the convergence graph
figure;
plot(1:numel(J_history), J_history, '-b', 'LineWidth', 2);
xlabel('Number of iterations');
ylabel('Cost J');

% Display gradient descent's result
fprintf('Theta computed from gradient descent: \n');
fprintf(' %f \n', theta);
fprintf('\n');

% Estimate the price of a 1650 sq-ft, 3 br house
% ====================== YOUR CODE HERE ======================
% Recall that the first column of X is all-ones. Thus, it does
% not need to be normalized.
price = [1, ([1650 3] - mu) ./ sigma] * theta; % normalize both features with the training mu/sigma
% ============================================================

fprintf(['Predicted price of a 1650 sq-ft, 3 br house ' ...
         '(using gradient descent):\n $%f\n'], price);

fprintf('Program paused. Press enter to continue.\n');
pause;

%% ================ Part 3: Normal Equations ================

fprintf('Solving with normal equations...\n');

% ====================== YOUR CODE HERE ======================
% Instructions: The following code computes the closed form
%               solution for linear regression using the normal
%               equations. You should complete the code in
%               normalEqn.m
%
%               After doing so, you should complete this code
%               to predict the price of a 1650 sq-ft, 3 br house.
%

%% Load Data
data = csvread('ex1data2.txt');
X = data(:, 1:2);
y = data(:, 3);
m = length(y);

% Add intercept term to X
X = [ones(m, 1) X];

% Calculate the parameters from the normal equation
theta = normalEqn(X, y);

% Display normal equation's result
fprintf('Theta computed from the normal equations: \n');
fprintf(' %f \n', theta);
fprintf('\n');

% Estimate the price of a 1650 sq-ft, 3 br house
% ====================== YOUR CODE HERE ======================
price = theta' * [1; 1650; 3]; % no normalization needed here: X was not normalized in this part
% ============================================================

fprintf(['Predicted price of a 1650 sq-ft, 3 br house ' ...
         '(using normal equations):\n $%f\n'], price);
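Note that featureNormalize.m is called by this script but was not listed above. As a reference, here is a minimal sketch of what it has to do (a hypothetical implementation, not necessarily the graded one): shift each column to zero mean, scale to unit standard deviation, and return mu and sigma so the same transform can be reused at prediction time.

function [X_norm, mu, sigma] = featureNormalize(X)
%FEATURENORMALIZE Normalizes the features in X to zero mean and unit std
mu = mean(X);        % 1 x n row vector of per-column means
sigma = std(X);      % 1 x n row vector of per-column standard deviations
m = size(X, 1);
X_norm = (X - repmat(mu, m, 1)) ./ repmat(sigma, m, 1);
end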
