
Modifications And Extensions To A Feed-Forward Neural Network

Sandro Skansi
Published 2018 · Computer Science

This chapter explores modifications and extensions to simple feed-forward neural networks, all of which can also be applied to other neural network architectures. The problem of local minima, one of the main problems in machine learning, is explored in detail. The main strategy presented against local minima is regularization, which adds a regularization term to the learning objective; both L1 and L2 regularization are explained in detail. The chapter also addresses the learning rate and shows how to implement it in backpropagation, in both the static and the dynamic setting. Momentum is explored as a technique that also helps against local minima by adding inertia to gradient descent. Stochastic gradient descent is covered in the form of mini-batch learning and pure online learning. The chapter concludes with a look at the vanishing and exploding gradient problems, setting the stage for deep learning.
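
As a rough illustration of the techniques summarized above (a sketch written for this page, not code from the chapter; the function names, parameters, and default values are assumptions), the NumPy snippet below combines a learning rate, an L2 regularization term, and momentum in a single gradient-descent step, and shows one way mini-batches could be drawn for stochastic gradient descent:

import numpy as np

def sgd_momentum_step(w, grad, velocity, learning_rate=0.01, momentum=0.9, l2_lambda=0.001):
    # Add the gradient of the L2 penalty (lambda * w) to the raw loss gradient.
    grad = grad + l2_lambda * w
    # Momentum keeps a running velocity, adding inertia to the descent.
    velocity = momentum * velocity - learning_rate * grad
    # Apply the update and return the new weights and velocity.
    return w + velocity, velocity

def iterate_minibatches(X, y, batch_size=32):
    # Shuffle the data and yield small batches for stochastic gradient descent;
    # batch_size=1 would correspond to pure online learning.
    idx = np.random.permutation(len(X))
    for start in range(0, len(X), batch_size):
        batch = idx[start:start + batch_size]
        yield X[batch], y[batch]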