Question about gradient descent in MATLAB

4 views (last 30 days)
Yongsoon Hwang
Yongsoon Hwang on 18 Oct 2020
Answered: Satwik on 25 Mar 2025
Hello everyone,
I am working on exercise 1 of a machine learning course and I am having trouble with gradient descent.
I typed this code, but it is not working. Can anyone give me a suggestion, please?
Thank you very much!
unction [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
%GRADIENTDESCENT Performs gradient descent to learn theta
%   theta = GRADIENTDESCENT(X, y, theta, alpha, num_iters) updates theta by
%   taking num_iters gradient steps with learning rate alpha

data = load('ex1data1.txt'); % read comma separated data
y = data(:, 2);

% Initialize some useful values
m = length(y); % number of training examples
X = [ones(m, 1), data(:,1)]; % Add a column of ones to x
theta = zeros(2, 1);
J_history = zeros(num_iters, 1);

for iter = 1:num_iters
    % ====================== YOUR CODE HERE ======================
    % Instructions: Perform a single gradient step on the parameter vector
    %               theta.
    %
    % Hint: While debugging, it can be useful to print out the values
    %       of the cost function (computeCost) and gradient here.
    %
    for k=1:m;
        predictions = X*theta;
        theta = theta - (alpha/m*sum((predictions-y).*X))';
        % ============================================================
        % Save the cost J in every iteration
        J_history(iter) = computeCost(X, y, theta);
    end
end

Answers (1)

Satwik
Satwik on 25 Mar 2025
Hi,
Here are some points which should help troubleshoot and fix the 'gradientDescent' function:
  1. Correct syntax: Ensure the function definition starts with 'function' instead of 'unction'.
  2. Remove data loading inside the function: The 'gradientDescent' function should not load data from a file. Instead, 'X', 'y', and 'theta' should be passed as arguments. Load the data and initialize the variables outside the function.
  3. Compute the cost: Ensure you have a 'computeCost' function defined to calculate the cost for linear regression. Also note that the extra inner loop 'for k=1:m' is unnecessary once the update is vectorized; the whole parameter vector is updated in a single step per iteration.
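For context, the vectorized update performed at each iteration is the standard batch gradient descent step for linear regression (written here in the course's usual notation, with X including the column of ones):

$$\theta := \theta - \frac{\alpha}{m}\, X^{\top}(X\theta - y),
\qquad
J(\theta) = \frac{1}{2m}\sum_{i=1}^{m}\bigl(\theta^{\top}x^{(i)} - y^{(i)}\bigr)^{2}$$

The expression 'sum((predictions-y).*X)' in the original code attempts the same sum, but 'X'' * errors' computes it directly and avoids the transpose and dimension pitfalls.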
Given below is a modified version of the code implementing the above changes, along with an example of its usage:
function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
    m = length(y); % number of training examples
    J_history = zeros(num_iters, 1);
    for iter = 1:num_iters
        % Compute the predictions
        predictions = X * theta;
        % Compute the error
        errors = predictions - y;
        % Perform the gradient descent update
        theta = theta - (alpha / m) * (X' * errors);
        % Save the cost J in every iteration
        J_history(iter) = computeCost(X, y, theta);
    end
end

% Function to compute the cost for linear regression
function J = computeCost(X, y, theta)
    m = length(y); % number of training examples
    predictions = X * theta;
    sqErrors = (predictions - y).^2;
    J = 1 / (2 * m) * sum(sqErrors);
end

% Example usage
% Load data
data = load('ex1data1.txt');
X = data(:, 1);
y = data(:, 2);
m = length(y);

% Add intercept term to X
X = [ones(m, 1), X];

% Initialize fitting parameters
theta = zeros(2, 1);

% Set gradient descent parameters
alpha = 0.01;
num_iters = 1500;

% Run gradient descent
[theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters);

% Display the result
fprintf('Theta found by gradient descent: ');
fprintf('%f %f \n', theta(1), theta(2));
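To check the result visually, here is a minimal sketch, assuming the script above has already run so that 'X', 'y', 'theta', and 'J_history' exist in the workspace:

```matlab
% Plot the training data and the fitted line
figure;
plot(X(:,2), y, 'rx', 'MarkerSize', 8);   % second column of X is the raw feature
hold on;
plot(X(:,2), X * theta, 'b-');            % linear fit found by gradient descent
xlabel('x'); ylabel('y');
legend('Training data', 'Linear fit');

% Plot the cost history; J should decrease steadily if alpha is well chosen
figure;
plot(1:numel(J_history), J_history, 'b-');
xlabel('Iteration'); ylabel('Cost J');
```

If the cost curve increases or oscillates instead of decreasing, the learning rate 'alpha' is too large; try a smaller value such as 0.001.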
The following image shows the result of running the above script on dummy data:
I hope this helps!
