
Setting Up Your Deep Neural Network Optimization Problem (Week 1 Summary)


Starting with normalizing our input features: why do we need to normalize the input features in the first place? The short answer is that when the input features sit on very different scales, the cost function tends to be elongated, so we normalize the inputs to make it more symmetric; gradient descent can then take larger steps and find its way to the minimum more easily and quickly.
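A minimal sketch of what this normalization can look like in NumPy (the function name, shapes, and epsilon are illustrative assumptions, not from the post itself):

import numpy as np

def normalize_features(X, eps=1e-8):
    # Zero-center each column (feature) and scale it to unit variance.
    mu = X.mean(axis=0)                    # per-feature mean
    sigma = X.std(axis=0)                  # per-feature standard deviation
    return (X - mu) / (sigma + eps), mu, sigma

# Example: three features on wildly different scales.
X_train = np.random.rand(1000, 3) * np.array([1.0, 100.0, 0.01])
X_norm, mu, sigma = normalize_features(X_train)
# Reuse the training-set mu and sigma to normalize the test set,
# so both sets go through the exact same transformation.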
