Gradient Descent Optimization Algorithm

        Gradient Descent is one of the most commonly used optimization algorithms for training machine learning models: it minimizes the error between actual and predicted results. Gradient descent is also used to train Neural Networks.

 Gradient descent is one of the most popular algorithms to perform optimization and by far the most common way to optimize neural networks. At the same time, every state-of-the-art Deep Learning library contains implementations of various algorithms to optimize gradient descent. These algorithms, however, are often used as black-box optimizers, as practical explanations of their strengths and weaknesses are hard to come by.
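 Concretely, gradient descent repeatedly moves the parameters a small step against the gradient of the cost function: theta = theta - learning_rate * gradient(theta). As a minimal sketch (not from the original post; the toy function, starting point, and step size are chosen purely for illustration), here it is in Python minimizing f(x) = (x - 3)^2:

    # Minimal gradient descent on f(x) = (x - 3)**2, whose minimum is at x = 3.
    def gradient(x):
        return 2 * (x - 3)  # derivative of (x - 3)**2

    x = 0.0              # initial guess
    learning_rate = 0.1  # step size (often written as eta)

    for step in range(100):
        x = x - learning_rate * gradient(x)  # step against the gradient

    print(x)  # converges to roughly 3.0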


 Advantages

 Easy computation.

 Easy to implement.

 Easy to understand.


 Disadvantages

 May get trapped at a local minimum.

 Weights are changed only after computing the gradient on the whole dataset, so if the dataset is too large, it may take days to converge to the minima.

 

 The different types of Gradient Descent are:


 1. Batch Gradient Descent 

 

 In batch gradient descent, we use the entire dataset to calculate the gradient of the cost function for each iteration of gradient descent and then update the weights.
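 To make the idea concrete, here is an illustrative sketch of batch gradient descent fitting a simple linear model with NumPy (the synthetic data and hyperparameters are made up for the example, not taken from the post). Note that every update averages the gradient over all 100 samples:

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=100)
    y = 4.0 * X + 2.0 + rng.normal(scale=0.1, size=100)  # true w=4, b=2

    w, b, lr = 0.0, 0.0, 0.1

    for epoch in range(200):
        error = (w * X + b) - y
        # Gradients of mean squared error over the ENTIRE dataset.
        w -= lr * 2 * np.mean(error * X)
        b -= lr * 2 * np.mean(error)

    print(w, b)  # close to (4.0, 2.0)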


 2. Stochastic Gradient Descent 

 

 It’s a variant of Gradient Descent that tries to update the model’s parameters more frequently. Here, the model parameters are updated after the computation of the loss on each training example.
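 Here is a sketch of the same fitting problem with stochastic updates (again, the data and hyperparameters are invented for illustration). The only change from the batch version is that the parameters are updated after every single training example:

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=100)
    y = 4.0 * X + 2.0 + rng.normal(scale=0.1, size=100)

    w, b, lr = 0.0, 0.0, 0.01

    for epoch in range(20):
        for i in rng.permutation(len(X)):    # shuffle each epoch
            error = (w * X[i] + b) - y[i]
            w -= lr * 2 * error * X[i]       # update from ONE example
            b -= lr * 2 * error

    print(w, b)  # noisier path, but still close to (4.0, 2.0)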


 3. Mini batch Gradient Descent 

 

Mini-batch gradient descent is a variation of gradient descent where the batch size is greater than one and less than the total dataset. Mini-batch gradient descent is widely used, converges faster, and is more stable.
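A sketch of the mini-batch variant on the same toy problem (the batch size and other hyperparameters are illustrative). Each update averages the gradient over a small batch, here 16 samples:

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=100)
    y = 4.0 * X + 2.0 + rng.normal(scale=0.1, size=100)

    w, b, lr, batch_size = 0.0, 0.0, 0.05, 16

    for epoch in range(100):
        idx = rng.permutation(len(X))
        for start in range(0, len(X), batch_size):
            batch = idx[start:start + batch_size]
            error = (w * X[batch] + b) - y[batch]
            w -= lr * 2 * np.mean(error * X[batch])  # average over the batch
            b -= lr * 2 * np.mean(error)

    print(w, b)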


 

 Types of Optimizers 

 It’s difficult to overstate how popular gradient descent really is, and it’s used across the board, even in complex neural network architectures (backpropagation is basically gradient descent applied through a network). There are other optimizers based on gradient descent that are also used, such as Momentum, RMSProp, and Adam; one of them is sketched below.
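 As one example of such a variant, here is a minimal sketch of gradient descent with classical momentum (the post does not show this code; the toy objective and coefficients are illustrative). A velocity term accumulates past gradients so the updates keep moving in a consistent direction:

    # Gradient descent with momentum on the toy objective f(x) = (x - 3)**2.
    def gradient(x):
        return 2 * (x - 3)

    x, velocity = 0.0, 0.0
    lr, beta = 0.1, 0.9   # beta controls how strongly past gradients persist

    for step in range(200):
        velocity = beta * velocity - lr * gradient(x)  # accumulate direction
        x = x + velocity

    print(x)  # converges to roughly 3.0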





Conclusion 


Here, we learned about the gradient descent algorithm, its advantages and disadvantages, and the different types of optimizers.







