Stochastic Gradient Descent (SGD)

Introducing SGD

AI:

Stochastic Gradient Descent (SGD) is a popular optimization algorithm used in machine learning to find the minimum of a cost function. It is a variant of gradient descent that updates the model's parameters at each iteration by computing the gradient of the cost function with respect to the parameters using only a small subset of the training data.

In other words, instead of computing the gradient of the cost function over the entire dataset, SGD randomly selects a small batch of training data at each iteration, computes the gradient of the cost function on that batch alone, and moves the parameters a small step in the direction of the negative gradient.
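
To make the per-iteration update concrete, here is a minimal NumPy sketch of mini-batch SGD applied to a simple linear regression problem with a squared-error cost. The synthetic data, the batch size of 32, the learning rate of 0.05, and the number of epochs are illustrative assumptions, not details from the article.

```python
import numpy as np

# Minimal SGD sketch for linear regression with a squared-error cost.
# All data and hyperparameters below are illustrative assumptions.

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))             # 1000 samples, 3 features
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=1000)

w = np.zeros(3)                            # model parameters
lr = 0.05                                  # learning rate (step size)
batch_size = 32
epochs = 20

for epoch in range(epochs):
    perm = rng.permutation(len(X))         # shuffle so each batch is a random subset
    for start in range(0, len(X), batch_size):
        idx = perm[start:start + batch_size]
        Xb, yb = X[idx], y[idx]
        # Gradient of the mean squared error, computed on the mini-batch only
        grad = 2.0 / len(idx) * Xb.T @ (Xb @ w - yb)
        w -= lr * grad                     # step in the negative gradient direction

print("estimated weights:", w)             # should be close to true_w
```

Because each update uses only a small random batch, the gradient is a noisy estimate of the full-dataset gradient, but the updates are much cheaper, which is what makes SGD practical for large training sets.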
