
Deep Learning Interview Questions and Answers

Question - How is the transformer architecture better than RNNs in Deep Learning?

Answer -

With RNNs, sequential processing meant that programmers were up against:

  • high processing (compute) cost, and
  • the difficulty of parallelising execution, since each step depends on the previous one.
This motivated the transformer architecture. Instead of recurrence, it uses an attention mechanism that directly maps the dependencies between all tokens in a sequence, regardless of how far apart they are, which enabled huge progress in NLP models (a minimal sketch of self-attention is shown below).
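As an illustration only, here is a minimal sketch of scaled dot-product self-attention in NumPy (function name and toy values are assumptions, not part of the original answer). The key point it demonstrates is that every token attends to every other token in a single matrix multiplication, so the whole sequence is processed in parallel, unlike an RNN that must step through tokens one at a time.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Minimal sketch of scaled dot-product attention.

    Q, K, V: arrays of shape (seq_len, d_k) for queries, keys, values.
    All token-to-token dependencies are computed at once, which is what
    allows transformers to parallelise over the sequence.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                            # pairwise similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)    # softmax over keys
    return weights @ V                                         # weighted sum of values

# Toy example (hypothetical values): 4 tokens with 8-dimensional embeddings.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(x, x, x)    # self-attention: Q = K = V = x
print(out.shape)                               # (4, 8)
```

In a full transformer this operation is wrapped in multi-head attention with learned projection matrices; the sketch above shows only the core mechanism.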
