
Deep Learning Interview Questions and Answers

Question - Explain the Adam Optimizer in one minute.

Answer -

Adam (Adaptive Moment Estimation) is an optimization algorithm designed to handle sparse gradients on noisy problems. It combines momentum, which smooths the update direction and helps the model move past saddle points, with per-parameter adaptive learning rates derived from a running estimate of the squared gradients, giving faster and more stable convergence.
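
As a rough sketch of the per-parameter update Adam performs (not any particular library's implementation), the following Python/NumPy snippet shows the two moment estimates, their bias correction, and the adaptive step; the hyperparameter defaults (lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8) are the commonly cited ones and the quadratic toy objective is purely illustrative.

    import numpy as np

    def adam_update(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
        """One Adam step for a single parameter array.

        m, v are the running first and second moment estimates; t is the
        1-based step count. Returns the updated (param, m, v).
        """
        # Update biased first moment (momentum) and second moment (squared gradients).
        m = beta1 * m + (1 - beta1) * grad
        v = beta2 * v + (1 - beta2) * grad ** 2

        # Bias correction compensates for the zero initialisation of m and v.
        m_hat = m / (1 - beta1 ** t)
        v_hat = v / (1 - beta2 ** t)

        # Per-parameter step: a large accumulated squared gradient shrinks
        # that parameter's effective learning rate.
        param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
        return param, m, v

    # Usage sketch: minimise f(w) = ||w||^2 from a random start.
    w = np.random.randn(5)
    m = np.zeros_like(w)
    v = np.zeros_like(w)
    for t in range(1, 201):
        grad = 2 * w                  # gradient of ||w||^2
        w, m, v = adam_update(w, grad, m, v, t)
    print(w)                          # close to the zero vector

The first moment acts like momentum (averaging the gradient direction), while the second moment scales each parameter's step individually, which is what lets Adam cope with sparse or noisy gradients.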
