A question about some linear regression code
def gradientDescent(X, y, theta, alpha, iters):
    temp = np.matrix(np.zeros(theta.shape))
    parameters = int(theta.ravel().shape[1])   # number of theta entries
    cost = np.zeros(iters)
    for i in range(iters):
        error = (X * theta.T) - y              # m x 1 residual vector
        for j in range(parameters):
            term = np.multiply(error, X[:, j])
            temp[0, j] = theta[0, j] - ((alpha / len(X)) * np.sum(term))
        theta = temp
        cost[i] = computeCost(X, y, theta)
    return theta, cost
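Here is a minimal runnable sketch of the function above, with the cost function filled in as the standard mean-squared-error `computeCost` (an assumption on my part, since it is not shown in the post), plus a tiny fitting example:

```python
import numpy as np

def computeCost(X, y, theta):
    # Assumed cost: J(theta) = (1/2m) * sum((X*theta^T - y)^2)
    inner = np.power((X * theta.T) - y, 2)
    return np.sum(inner) / (2 * len(X))

def gradientDescent(X, y, theta, alpha, iters):
    temp = np.matrix(np.zeros(theta.shape))
    parameters = int(theta.ravel().shape[1])   # number of theta entries
    cost = np.zeros(iters)
    for i in range(iters):
        error = (X * theta.T) - y              # m x 1 residual vector
        for j in range(parameters):
            # element-wise product of the residuals with feature column j
            term = np.multiply(error, X[:, j])
            temp[0, j] = theta[0, j] - ((alpha / len(X)) * np.sum(term))
        theta = temp
        cost[i] = computeCost(X, y, theta)
    return theta, cost

# Tiny example: fit y = 2x on three points (first column is the bias term)
X = np.matrix([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]])
y = np.matrix([[2.0], [4.0], [6.0]])
theta0 = np.matrix(np.zeros((1, 2)))
theta, cost = gradientDescent(X, y, theta0, 0.1, 500)
```

With these data the cost should fall steadily and `theta` should approach roughly `[0, 2]`.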
In particular:
    np.multiply(error, X[:, j])
and
    temp[0, j] = theta[0, j] - ((alpha / len(X)) * np.sum(term))
I don't quite see how these apply the formula; they look different from the partial-derivative form of the gradient-descent update. Could anyone explain?
The formula I mean is the gradient-descent update rule:

$\theta_j := \theta_j - \alpha \frac{\partial}{\partial \theta_j} J(\theta)$
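For reference, here is a sketch of the standard derivation that connects that update rule to the code, assuming the usual squared-error cost for linear regression:

```latex
J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2
\qquad\text{where } h_\theta(x) = \theta^{T} x
% Taking the partial derivative with respect to theta_j:
\frac{\partial}{\partial \theta_j} J(\theta)
  = \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)}
% Substituting into the update rule gives:
\theta_j := \theta_j - \frac{\alpha}{m} \sum_{i=1}^{m}
  \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)}
```

If this matches the cost used in the post, then in the code `error` is $h_\theta(x) - y$ for all $m$ samples at once, `np.multiply(error, X[:, j])` forms the element-wise products $(h_\theta(x^{(i)}) - y^{(i)})\, x_j^{(i)}$, `np.sum(term)` is the $\sum_{i=1}^{m}$, and `alpha / len(X)` is $\alpha / m$.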