Machine Learning Assignment (1): Linear Regression in MATLAB


The full problem statement is too long to reproduce here; see the downloadable assignment document [link].

Problem 1

Brief: construct a 5×5 identity matrix.

function A = warmUpExercise()
% Return a 5x5 identity matrix
A = eye(5);
end

 

Result: calling warmUpExercise() displays the 5×5 identity matrix:

ans =

     1     0     0     0     0
     0     1     0     0     0
     0     0     1     0     0
     0     0     0     1     0
     0     0     0     0     1

Problem 2

Brief: implement single-variable linear regression.

Step 1: load the data file and plot the training data.

data = load('ex1data1.txt');
X = data(:, 1); y = data(:, 2);
m = length(y); % number of training examples
% Plot the data (implemented in plotData.m, shown in Step 2)
plotData(X, y);

 

Step 2: implement the plotData function to visualize the training samples.

function plotData(x, y)
figure;
plot(x,y,'rx','MarkerSize',10);
ylabel('Profit in $10,000s');
xlabel('Population of City in 10,000s');
end 

 

Step 3: run gradient descent to fit θ and overlay the linear fit on the data. (The cost function for linear regression is convex, so gradient descent converges to the global optimum, not merely a local one.)

X = [ones(m, 1), data(:,1)]; % Add a column of ones to x
theta = zeros(2, 1); % initialize fitting parameters
% Some gradient descent settings
iterations = 1500;
alpha = 0.01;
% run gradient descent
theta = gradientDescent(X, y, theta, alpha, iterations);
% print theta to screen
fprintf('Theta found by gradient descent:\n');
fprintf('%f\n', theta);
% Plot the linear fit
hold on; % keep previous plot visible
plot(X(:,2), X*theta, '-')
legend('Training data', 'Linear regression')
hold off % don't overlay any more plots on this figure  
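
With θ fitted, a quick sanity check (a sketch added here, not part of the original script) is to predict profits for two city sizes; both population and profit are in units of 10,000:

% Hypothetical usage of the fitted parameters
predict1 = [1, 3.5] * theta;   % population of 35,000
fprintf('For population = 35,000, predicted profit: %f\n', predict1 * 10000);
predict2 = [1, 7] * theta;     % population of 70,000
fprintf('For population = 70,000, predicted profit: %f\n', predict2 * 10000);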

 

Step 4: implement the gradientDescent function.

function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)

% Initialize some useful values
m = length(y); % number of training examples
J_history = zeros(num_iters, 1);

for iter = 1:num_iters
    theta = theta - (alpha/m) * (X' * (X*theta - y));  % vectorized update
    % Save the cost J in every iteration    
    J_history(iter) = computeCost(X, y, theta);
end

end
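
For reference, the one-line vectorized update above implements the standard batch gradient-descent rule

$$\theta := \theta - \frac{\alpha}{m} X^{\top} (X\theta - y),$$

which updates all components of θ simultaneously using every training example on each iteration.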

 

Step 5: implement the computeCost cost function.

function J = computeCost(X, y, theta)
m = length(y); % number of training examples
J = 1/(2*m)*sum((X*theta-y).^2);
end
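
This computes the usual squared-error cost for linear regression,

$$J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2, \qquad h_\theta(x) = \theta^{\top} x,$$

which the code evaluates in vectorized form as sum((X*theta - y).^2) / (2*m).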

 

Step 6: display the surface plot and contour plot of the cost function.

% Grid over which we will calculate J
theta0_vals = linspace(-10, 10, 100);
theta1_vals = linspace(-1, 4, 100);

% initialize J_vals to a matrix of 0's
J_vals = zeros(length(theta0_vals), length(theta1_vals));

% Fill out J_vals
for i = 1:length(theta0_vals)
    for j = 1:length(theta1_vals)
        t = [theta0_vals(i); theta1_vals(j)];
        J_vals(i,j) = computeCost(X, y, t);
    end
end

% Because of the way meshgrids work in the surf command, we need to
% transpose J_vals before calling surf, or else the axes will be flipped
J_vals = J_vals';
% Surface plot
figure;
surf(theta0_vals, theta1_vals, J_vals);
xlabel('\theta_0'); ylabel('\theta_1');

% Contour plot
figure;
% Plot J_vals as 20 contours spaced logarithmically between 0.01 and 1000
contour(theta0_vals, theta1_vals, J_vals, logspace(-2, 3, 20))
xlabel('\theta_0'); ylabel('\theta_1');
hold on;
plot(theta(1), theta(2), 'rx', 'MarkerSize', 10, 'LineWidth', 2);
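
As an optional sanity check (an addition, not in the original script), the minimum of J over the grid should land near the θ found by gradient descent:

% Locate the grid minimum (J_vals was transposed above, so rows index
% theta1 and columns index theta0)
[minJ, idx] = min(J_vals(:));
[r, c] = ind2sub(size(J_vals), idx);
fprintf('Grid minimum J = %f at theta0 = %f, theta1 = %f\n', ...
        minJ, theta0_vals(c), theta1_vals(r));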

 

Results: the surface plot of J(θ₀, θ₁) and the contour plot, with the gradient-descent solution marked by a red x (screenshots omitted).

Problem 3

Brief: implement multivariate linear regression.

Step 1: load the data file, normalize the features, and add the intercept term. (The column of ones is appended after normalization, since a constant column has zero standard deviation and cannot be standardized.)

data = load('ex1data2.txt');
X = data(:, 1:2);
y = data(:, 3);
m = length(y);
[X, mu, sigma] = featureNormalize(X);
% Add intercept term to X
X = [ones(m, 1) X];

 

Step 2: implement the featureNormalize function (z-score standardization: subtract the mean, divide by the standard deviation).

function [X_norm, mu, sigma] = featureNormalize(X)
% Standardize each feature (column) to zero mean and unit standard deviation
mu = mean(X, 1);
sigma = std(X, 0, 1);
% The element-wise subtraction and division below rely on implicit
% expansion (MATLAB R2016b and later); use bsxfun on older versions
X_norm = (X - mu) ./ sigma;
end
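
A minimal usage sketch with made-up values (illustrative only, not rows of ex1data2.txt):

A = [2104 3; 1600 3; 2400 3; 1416 2];      % hypothetical [size, bedrooms] rows
[A_norm, mu_A, sigma_A] = featureNormalize(A);
% Each column of A_norm now has mean 0 and standard deviation 1.
% Keep mu_A and sigma_A: any future input must be normalized the same way.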

 

Step 3: run gradient descent and plot the convergence of the cost function.

% Choose some alpha value
alpha = 0.05;
num_iters = 100;

% Init Theta and Run Gradient Descent 
theta = zeros(3, 1);
[theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters);

% Plot the convergence graph
figure;
plot(1:numel(J_history), J_history, '-b', 'LineWidth', 2);
xlabel('Number of iterations');
ylabel('Cost J');
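
A common extra experiment (a sketch, not part of the original script) is to sweep several learning rates and compare convergence: too small an α converges slowly, while too large an α can diverge. This uses the gradientDescentMulti function defined in Step 4:

% Hypothetical learning-rate sweep
alphas = [0.01, 0.03, 0.1, 0.3];
figure; hold on;
for k = 1:numel(alphas)
    [~, Jh] = gradientDescentMulti(X, y, zeros(3, 1), alphas(k), num_iters);
    plot(1:num_iters, Jh, 'LineWidth', 2);
end
xlabel('Number of iterations'); ylabel('Cost J');
legend('\alpha = 0.01', '\alpha = 0.03', '\alpha = 0.1', '\alpha = 0.3');
hold off;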

 

Step 4: implement the gradientDescentMulti function. (Because the update was already fully vectorized, it is identical to the single-variable gradientDescent.)

function [theta, J_history] = gradientDescentMulti(X, y, theta, alpha, num_iters)

m = length(y); % number of training examples
J_history = zeros(num_iters, 1);

for iter = 1:num_iters
    theta = theta - alpha/m*(X'*(X*theta-y));
    % Save the cost J in every iteration    
    J_history(iter) = computeCostMulti(X, y, theta);
end

end

 

Step 5: implement the computeCostMulti cost function.

function J = computeCostMulti(X, y, theta)
m = length(y); % number of training examples
J = 1/(2*m) * sum((X*theta - y).^2);
% Equivalent vectorized form: J = (X*theta - y)' * (X*theta - y) / (2*m);
end

 

Result: the convergence plot shows the cost J decreasing with the number of iterations (screenshot omitted).

Step 6: use the fitted model to predict the price of a 1,650 sq-ft, 3-bedroom house.

X1 = [1, 1650, 3];                   % intercept term plus the raw features
X1(2:3) = (X1(2:3) - mu) ./ sigma;   % scale with the training-set mu and sigma
price = X1 * theta;

Predicted price: (output omitted).

Step 7: solve with the normal equation.

%% Load Data
data = csvread('ex1data2.txt');
X = data(:, 1:2);
y = data(:, 3);
m = length(y);

% Add intercept term to X
X = [ones(m, 1) X];

% Calculate the parameters from the normal equation
theta = normalEqn(X, y);

 

Step 8: implement the normalEqn function.

function [theta] = normalEqn(X, y)
% Closed-form solution; pinv is more robust than inv or ^(-1) when X'*X
% is close to singular (e.g. redundant or linearly dependent features)
theta = pinv(X' * X) * X' * y;
end
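
This is the closed-form normal-equation solution

$$\theta = (X^{\top} X)^{-1} X^{\top} y,$$

which needs no feature scaling and no iteration, at the cost of inverting an n×n matrix (expensive when the number of features n is large).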

 

Step 9: predict the price of the same 1,650 sq-ft, 3-bedroom house using the normal-equation solution.

price = [1, 1650, 3] * theta;  % no scaling needed: this theta was fit on raw features

Predicted price: (output omitted; it closely matches the gradient-descent result).

 

