
Deep Learning Interview Questions and Answers

Question - Why are Sigmoid or Tanh not preferred as activation functions in the hidden layers of a neural network?

Answer -

A common problem with the Tanh and Sigmoid functions is that they saturate: for large positive or negative inputs their output flattens out and their gradient approaches zero. Once a unit saturates, the learning algorithm can no longer meaningfully update its weights, so the model stops improving. Because the Sigmoid derivative is at most 0.25, and both functions have near-zero gradients in their saturated regions, gradients shrink multiplicatively as they are backpropagated through many hidden layers. This is the vanishing gradient problem, and it prevents deep networks with Sigmoid or Tanh hidden layers from learning effectively. It can be addressed by using the Rectified Linear Unit (ReLU) activation in the hidden layers instead of Sigmoid or Tanh, combined with a suitable weight initialization such as Xavier initialization.
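As a minimal sketch of the remedy described above (not part of the original answer), the hypothetical PyTorch model below uses ReLU in its hidden layers together with Xavier initialization; the layer sizes and names are illustrative assumptions.

```python
import torch
import torch.nn as nn

class SimpleNet(nn.Module):
    """Small feed-forward net using ReLU hidden layers plus Xavier init."""

    def __init__(self, in_features=784, hidden=128, out_features=10):
        super().__init__()
        self.fc1 = nn.Linear(in_features, hidden)
        self.fc2 = nn.Linear(hidden, hidden)
        self.out = nn.Linear(hidden, out_features)
        # Xavier (Glorot) initialization keeps activation variance roughly
        # stable across layers, reducing the risk of early saturation.
        for layer in (self.fc1, self.fc2, self.out):
            nn.init.xavier_uniform_(layer.weight)
            nn.init.zeros_(layer.bias)
        # ReLU does not saturate for positive inputs, so gradients are not
        # squashed toward zero the way they are with Sigmoid/Tanh.
        self.act = nn.ReLU()

    def forward(self, x):
        x = self.act(self.fc1(x))
        x = self.act(self.fc2(x))
        return self.out(x)  # raw logits; apply a loss such as cross-entropy outside

model = SimpleNet()
logits = model(torch.randn(32, 784))  # example forward pass on a dummy batch
print(logits.shape)  # torch.Size([32, 10])
```

In practice, He initialization is also commonly paired with ReLU, but Xavier initialization (as mentioned in the answer) likewise helps keep gradients in a workable range.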
