Stanford Machine Learning Exercises, Part 1: Linear Regression
source link: https://zxs.io/article/155
warmUpExercise.m
function A = warmUpExercise()
%WARMUPEXERCISE Example function in octave
%   A = WARMUPEXERCISE() is an example function that returns the 5x5 identity matrix

A = [];
% ============= YOUR CODE HERE ==============
% Instructions: Return the 5x5 identity matrix
%               In octave, we return values by defining which variables
%               represent the return values (at the top of the file)
%               and then set them accordingly.

A = eye(5);

% ===========================================

end
computeCost.m
function J = computeCost(X, y, theta)
%COMPUTECOST Compute cost for linear regression
%   J = COMPUTECOST(X, y, theta) computes the cost of using theta as the
%   parameter for linear regression to fit the data points in X and y

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta
%               You should set J to the cost.

J = sum((X*theta - y).^2) / (2*m);

% =========================================================================

end
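The single line above vectorizes the squared-error cost J(theta) = 1/(2m) * sum_i (theta' * x_i - y_i)^2. A quick sanity check, assuming ex1data1.txt (the file loaded by ex1.m below) is on the path:

% Quick check of computeCost on the exercise data
data = load('ex1data1.txt'); % two columns: population, profit
m = size(data, 1);
X = [ones(m, 1), data(:, 1)]; % add intercept column
y = data(:, 2);
% With theta = [0; 0] every prediction is 0, so J = sum(y.^2) / (2*m)
J = computeCost(X, y, zeros(2, 1));
fprintf('Cost at theta = [0; 0]: %f\n', J);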
plotData.m
function plotData(x, y)
%PLOTDATA Plots the data points x and y into a new figure
%   PLOTDATA(x,y) plots the data points and gives the figure axes labels of
%   population and profit.

% ====================== YOUR CODE HERE ======================
% Instructions: Plot the training data into a figure using the
%               "figure" and "plot" commands. Set the axes labels using
%               the "xlabel" and "ylabel" commands. Assume the
%               population and revenue data have been passed in
%               as the x and y arguments of this function.
%
% Hint: You can use the 'rx' option with plot to have the markers
%       appear as red crosses. Furthermore, you can make the
%       markers larger by using plot(..., 'rx', 'MarkerSize', 10);

figure; % open a new figure window
plot(x, y, 'rx', 'MarkerSize', 5);
xlabel('Population of City in 10,000s');
ylabel('Profit in $10,000s');

% ============================================================

end
gradientDescent.m
function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
%GRADIENTDESCENT Performs gradient descent to learn theta
%   theta = GRADIENTDESCENT(X, y, theta, alpha, num_iters) updates theta by
%   taking num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);

for iter = 1:num_iters

    % ====================== YOUR CODE HERE ======================
    % Instructions: Perform a single gradient step on the parameter vector
    %               theta.
    %
    % Hint: While debugging, it can be useful to print out the values
    %       of the cost function (computeCost) and gradient here.
    %

    % Vectorized update: theta := theta + (alpha/m) * X' * (y - X*theta)
    theta = theta + alpha/m * X' * (y - X * theta);

    % ============================================================

    % Save the cost J in every iteration
    J_history(iter) = computeCost(X, y, theta);

end

end
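Note the sign: theta := theta + (alpha/m) * X' * (y - X*theta) is the usual descent step theta := theta - (alpha/m) * X' * (X*theta - y) with the residual flipped. A quick way to confirm the implementation actually descends, assuming X and y have been built as in ex1.m below (intercept column added):

% Convergence check: the cost should never increase for a small enough alpha
[~, J_hist] = gradientDescent(X, y, zeros(2, 1), 0.01, 1500);
assert(all(diff(J_hist) <= 0));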
ex1.m
%% Machine Learning Online Class - Exercise 1: Linear Regression

%  Instructions
%  ------------
%
%  This file contains code that helps you get started on the
%  linear exercise. You will need to complete the following functions
%  in this exercise:
%
%     warmUpExercise.m
%     plotData.m
%     gradientDescent.m
%     computeCost.m
%     gradientDescentMulti.m
%     computeCostMulti.m
%     featureNormalize.m
%     normalEqn.m
%
%  For this exercise, you will not need to change any code in this file,
%  or any other files other than those mentioned above.
%
% x refers to the population size in 10,000s
% y refers to the profit in $10,000s
%

%% Initialization
clear ; close all; clc

%% ==================== Part 1: Basic Function ====================
% Complete warmUpExercise.m
fprintf('Running warmUpExercise ... \n');
fprintf('5x5 Identity Matrix: \n');
warmUpExercise()

fprintf('Program paused. Press enter to continue.\n');
pause;

%% ======================= Part 2: Plotting =======================
fprintf('Plotting Data ...\n')
data = load('ex1data1.txt');
X = data(:, 1); y = data(:, 2);
m = length(y); % number of training examples

% Plot Data
% Note: You have to complete the code in plotData.m
plotData(X, y);

fprintf('Program paused. Press enter to continue.\n');
pause;

%% =================== Part 3: Gradient descent ===================
fprintf('Running Gradient Descent ...\n')

X = [ones(m, 1), data(:,1)]; % Add a column of ones to x
theta = zeros(2, 1); % initialize fitting parameters

% Some gradient descent settings
iterations = 1500;
alpha = 0.01;

% compute and display initial cost
computeCost(X, y, theta)

% run gradient descent
theta = gradientDescent(X, y, theta, alpha, iterations);

% print theta to screen
fprintf('Theta found by gradient descent: ');
fprintf('%f %f \n', theta(1), theta(2));

% Plot the linear fit
hold on; % keep previous plot visible
plot(X(:,2), X*theta, '-')
legend('Training data', 'Linear regression')
hold off % don't overlay any more plots on this figure

% Predict values for population sizes of 35,000 and 70,000
predict1 = [1, 3.5] * theta;
fprintf('For population = 35,000, we predict a profit of %f\n',...
    predict1*10000);
predict2 = [1, 7] * theta;
fprintf('For population = 70,000, we predict a profit of %f\n',...
    predict2*10000);

fprintf('Program paused. Press enter to continue.\n');
pause;

%% ============= Part 4: Visualizing J(theta_0, theta_1) =============
fprintf('Visualizing J(theta_0, theta_1) ...\n')

% Grid over which we will calculate J
theta0_vals = linspace(-10, 10, 100);
theta1_vals = linspace(-1, 4, 100);

% initialize J_vals to a matrix of 0's
J_vals = zeros(length(theta0_vals), length(theta1_vals));

% Fill out J_vals
for i = 1:length(theta0_vals)
    for j = 1:length(theta1_vals)
        t = [theta0_vals(i); theta1_vals(j)];
        J_vals(i,j) = computeCost(X, y, t);
    end
end

% Because of the way meshgrids work in the surf command, we need to
% transpose J_vals before calling surf, or else the axes will be flipped
J_vals = J_vals';

% Surface plot
figure;
surf(theta0_vals, theta1_vals, J_vals)
xlabel('\theta_0'); ylabel('\theta_1');

% Contour plot
figure;
% Plot J_vals as 20 contours spaced logarithmically between 0.01 and 1000
contour(theta0_vals, theta1_vals, J_vals, logspace(-2, 3, 20))
xlabel('\theta_0'); ylabel('\theta_1');
hold on;
plot(theta(1), theta(2), 'rx', 'MarkerSize', 10, 'LineWidth', 2);
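Since Part 3's theta comes from an iterative method, a quick cross-check against the closed-form solution (run right after Part 3, while X, y, and theta are still in scope) shows how close gradient descent got:

% Compare the iterative fit with the closed-form least-squares fit
theta_exact = pinv(X' * X) * X' * y;
fprintf('Gradient descent: %f %f\n', theta(1), theta(2));
fprintf('Normal equation:  %f %f\n', theta_exact(1), theta_exact(2));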
featureNormalize.m
function [X_norm, mu, sigma] = featureNormalize(X)
%FEATURENORMALIZE Normalizes the features in X
%   FEATURENORMALIZE(X) returns a normalized version of X where
%   the mean value of each feature is 0 and the standard deviation
%   is 1. This is often a good preprocessing step to do when
%   working with learning algorithms.

% You need to set these values correctly
X_norm = X;
mu = zeros(1, size(X, 2));
sigma = zeros(1, size(X, 2));

% ====================== YOUR CODE HERE ======================
% Instructions: First, for each feature dimension, compute the mean
%               of the feature and subtract it from the dataset,
%               storing the mean value in mu. Next, compute the
%               standard deviation of each feature and divide
%               each feature by its standard deviation, storing
%               the standard deviation in sigma.
%
%               Note that X is a matrix where each column is a
%               feature and each row is an example. You need
%               to perform the normalization separately for
%               each feature.
%
% Hint: You might find the 'mean' and 'std' functions useful.
%

mu = mean(X);
sigma = std(X);
for i = 1:size(X, 2)
    X_norm(:,i) = (X(:,i) - mu(i)) ./ sigma(i);
end

% ============================================================

end
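The per-column loop works, but the whole normalization can also be written without it; a minimal vectorized sketch, assuming Octave (or MATLAB R2016b+) implicit broadcasting:

% Vectorized alternative: mu and sigma are 1-by-n row vectors,
% broadcast across every row of X
mu = mean(X);
sigma = std(X);
X_norm = (X - mu) ./ sigma;
% On older MATLAB versions without broadcasting, use bsxfun instead:
% X_norm = bsxfun(@rdivide, bsxfun(@minus, X, mu), sigma);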
computeCostMulti.m
function J = computeCostMulti(X, y, theta)
%COMPUTECOSTMULTI Compute cost for linear regression with multiple variables
%   J = COMPUTECOSTMULTI(X, y, theta) computes the cost of using theta as the
%   parameter for linear regression to fit the data points in X and y

% Initialize some useful values
m = length(y); % number of training examples

% You need to return the following variables correctly
J = 0;

% ====================== YOUR CODE HERE ======================
% Instructions: Compute the cost of a particular choice of theta
%               You should set J to the cost.

J = 1/(2*m) * (X*theta - y)' * (X*theta - y);

% =========================================================================

end
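This is the same quantity computeCost computes, written as a quadratic form: for the residual vector e = X*theta - y, e' * e equals sum(e.^2). A quick equivalence check on throwaway random data (the variable names here are illustrative, not from the exercise):

% computeCost and computeCostMulti should agree on any input
Xc = [ones(5, 1), rand(5, 2)]; % 5 examples: intercept plus 2 features
yc = rand(5, 1);
tc = rand(3, 1);
assert(abs(computeCost(Xc, yc, tc) - computeCostMulti(Xc, yc, tc)) < 1e-10);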
gradientDescentMulti.m
function [theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters)
%GRADIENTDESCENTMULTI Performs gradient descent to learn theta
%   theta = GRADIENTDESCENTMULTI(X, y, theta, alpha, num_iters) updates theta by
%   taking num_iters gradient steps with learning rate alpha

% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);

for iter = 1:num_iters

    % ====================== YOUR CODE HERE ======================
    % Instructions: Perform a single gradient step on the parameter vector
    %               theta.
    %
    % Hint: While debugging, it can be useful to print out the values
    %       of the cost function (computeCostMulti) and gradient here.
    %

    % Same vectorized update as the single-variable case
    theta = theta + alpha/m * X' * (y - X*theta);

    % ============================================================

    % Save the cost J in every iteration
    J_history(iter) = computeCostMulti(X, y, theta);

end

end
normalEqn.m
function [theta] = normalEqn(X, y)
%NORMALEQN Computes the closed-form solution to linear regression
%   NORMALEQN(X,y) computes the closed-form solution to linear
%   regression using the normal equations.

theta = zeros(size(X, 2), 1);

% ====================== YOUR CODE HERE ======================
% Instructions: Complete the code to compute the closed form solution
%               to linear regression and put the result in theta.
%

% ---------------------- Sample Solution ----------------------

theta = pinv(X' * X) * X' * y;

% -------------------------------------------------------------

% ============================================================

end
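pinv is the safe choice here because it also handles a singular X'*X (e.g., linearly dependent features). When X'*X is well-conditioned, the backslash operator solves the same system and is generally preferred numerically; a sketch of the alternatives:

% Solve the normal equations directly when X'*X is invertible
theta = (X' * X) \ (X' * y);
% Or solve the least-squares problem on X itself (QR-based, better conditioned)
theta = X \ y;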
ex1_multi.m
%% Machine Learning Online Class
%  Exercise 1: Linear regression with multiple variables
%
%  Instructions
%  ------------
%
%  This file contains code that helps you get started on the
%  linear regression exercise.
%
%  You will need to complete the following functions in this
%  exercise:
%
%     warmUpExercise.m
%     plotData.m
%     gradientDescent.m
%     computeCost.m
%     gradientDescentMulti.m
%     computeCostMulti.m
%     featureNormalize.m
%     normalEqn.m
%
%  For this part of the exercise, you will need to change some
%  parts of the code below for various experiments (e.g., changing
%  learning rates).
%

%% Initialization

%% ================ Part 1: Feature Normalization ================

%% Clear and Close Figures
clear ; close all; clc

fprintf('Loading data ...\n');

%% Load Data
data = load('ex1data2.txt');
X = data(:, 1:2);
y = data(:, 3);
m = length(y);

% Print out some data points
fprintf('First 10 examples from the dataset: \n');
fprintf(' x = [%.0f %.0f], y = %.0f \n', [X(1:10,:) y(1:10,:)]');

fprintf('Program paused. Press enter to continue.\n');
pause;

% Scale features and set them to zero mean
fprintf('Normalizing Features ...\n');

[X mu sigma] = featureNormalize(X);

% Add intercept term to X
X = [ones(m, 1) X];

%% ================ Part 2: Gradient Descent ================

% ====================== YOUR CODE HERE ======================
% Instructions: We have provided you with the following starter
%               code that runs gradient descent with a particular
%               learning rate (alpha).
%
%               Your task is to first make sure that your functions -
%               computeCost and gradientDescent already work with
%               this starter code and support multiple variables.
%
%               After that, try running gradient descent with
%               different values of alpha and see which one gives
%               you the best result.
%
%               Finally, you should complete the code at the end
%               to predict the price of a 1650 sq-ft, 3 br house.
%
% Hint: By using the 'hold on' command, you can plot multiple
%       graphs on the same figure.
%
% Hint: At prediction, make sure you do the same feature normalization.
%

fprintf('Running gradient descent ...\n');

% Choose some alpha value
alpha = 0.01;
num_iters = 400;

% Init Theta and Run Gradient Descent
theta = zeros(3, 1);
[theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters);

% Plot the convergence graph
figure;
plot(1:numel(J_history), J_history, '-b', 'LineWidth', 2);
xlabel('Number of iterations');
ylabel('Cost J');

% Display gradient descent's result
fprintf('Theta computed from gradient descent: \n');
fprintf(' %f \n', theta);
fprintf('\n');

% Estimate the price of a 1650 sq-ft, 3 br house
% ====================== YOUR CODE HERE ======================
% Recall that the first column of X is all-ones. Thus, it does
% not need to be normalized.
price = 0; % You should change this

% ============================================================

fprintf(['Predicted price of a 1650 sq-ft, 3 br house ' ...
    '(using gradient descent):\n $%f\n'], price);

fprintf('Program paused. Press enter to continue.\n');
pause;

%% ================ Part 3: Normal Equations ================

fprintf('Solving with normal equations...\n');

% ====================== YOUR CODE HERE ======================
% Instructions: The following code computes the closed form
%               solution for linear regression using the normal
%               equations. You should complete the code in
%               normalEqn.m
%
%               After doing so, you should complete this code
%               to predict the price of a 1650 sq-ft, 3 br house.
%
%% Load Data
data = csvread('ex1data2.txt');
X = data(:, 1:2);
y = data(:, 3);
m = length(y);

% Add intercept term to X
X = [ones(m, 1) X];

% Calculate the parameters from the normal equation
theta = normalEqn(X, y);

% Display normal equation's result
fprintf('Theta computed from the normal equations: \n');
fprintf(' %f \n', theta);
fprintf('\n');

% Estimate the price of a 1650 sq-ft, 3 br house
% ====================== YOUR CODE HERE ======================
price = 0; % You should change this

% ============================================================

fprintf(['Predicted price of a 1650 sq-ft, 3 br house ' ...
    '(using normal equations):\n $%f\n'], price);
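Both prediction blocks in ex1_multi.m are left at price = 0 in this post. A minimal sketch of how they could be completed (hypothetical, not part of the original solution): the gradient-descent model was trained on normalized features, so the new example must be scaled with the training mu and sigma before the intercept is prepended, while the normal-equation model was fit on raw features and needs no normalization.

% Hypothetical completion for the gradient-descent prediction:
x_new = ([1650, 3] - mu) ./ sigma; % normalize with the training statistics
price = [1, x_new] * theta;        % theta from gradientDescentMulti

% Hypothetical completion for the normal-equation prediction:
price = [1, 1650, 3] * theta;      % theta from normalEqn, raw features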