Neural Networks and Back Propagation


Neural networks are built from artificial neurons, which are connected in a variety of ways to form networks. An Artificial Neural Network is a program designed to solve problems by mimicking the structure and function of our nervous system. "Neural" comes from neuron, and "network" means the neurons are connected like a graph. For a computing system to deserve these names, it needs a labeled directed graph structure in which the nodes perform simple computations.

Biological Terminology        Artificial Neural Network Terminology
Neuron                        Node/Unit/Cell
Synapse                       Connection/Edge/Link
Synaptic Efficiency           Connection Strength/Weight
Firing Frequency              Node Output

 A neural network is similar to the human brain in the following two ways:

  1. A neural network acquires knowledge through learning.
  2. The knowledge of the neural network is stored in the interconnection strengths, known as synaptic weights.

Types of Artificial Neural Networks

There are many types of neural networks; they differ from one another in their network architecture.

  • Fully connected Neural network
  • Layered Neural Network
  • Feedforward Neural Network
  • Radial basis function Neural Network
  • Kohonen Self Organizing Neural Network
  • Recurrent Neural Network(RNN)
  • Convolutional Neural Network
  • Acyclic Neural Network
  • Modular Neural Network

1-Fully Connected Network

A network in which every node is linked to every other node, and these connections can be either excitatory (positive weights), inhibitory (negative weights), or irrelevant (almost zero weights).

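As a minimal sketch (the 4-node network and the specific weight values below are invented for illustration), a fully connected network can be represented by a weight matrix in which positive entries are excitatory, negative entries are inhibitory, and near-zero entries are effectively irrelevant:

```python
import numpy as np

# Hypothetical 4-node fully connected network; W[i, j] is the weight of the
# connection from node j to node i.
W = np.array([
    [ 0.00,  0.80, -0.50,  0.01],
    [ 0.60,  0.00,  0.02, -0.90],
    [-0.30,  0.70,  0.00,  0.40],
    [ 0.05, -0.60,  0.90,  0.00],
])

# Classify each connection by the sign and magnitude of its weight.
for i in range(W.shape[0]):
    for j in range(W.shape[1]):
        if i == j:
            continue
        w = W[i, j]
        kind = "irrelevant" if abs(w) < 0.1 else ("excitatory" if w > 0 else "inhibitory")
        print(f"{j} -> {i}: weight {w:+.2f} ({kind})")
```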

2-Layered Network

These are networks in which the nodes are partitioned into subsets called layers, with no connection from a node in layer j to a node in layer k if j > k.

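As a rough illustration (the layer sizes and random weights below are arbitrary), a layered network can be stored as one weight matrix per pair of consecutive layers, so information only flows from lower-numbered to higher-numbered layers:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layer sizes: 3 input nodes, 4 hidden nodes, 2 output nodes.
layer_sizes = [3, 4, 2]

# One weight matrix per pair of consecutive layers; there are no connections
# from a higher-numbered layer back to a lower-numbered one.
weights = [rng.normal(size=(m, n)) for n, m in zip(layer_sizes[:-1], layer_sizes[1:])]

def forward(x, weights):
    """Pass an input vector through the layers, strictly in order."""
    out = x
    for W in weights:
        out = np.tanh(W @ out)   # any node output function would do here
    return out

print(forward(np.array([1.0, 0.5, -0.2]), weights))
```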

3-Acyclic Network

This is a subclass of the layered networks in which there are no intra-layer connections. In other words, a connection may exist between a node in layer i and a node in layer j for i < j, but no connection is allowed for i = j.


4-Modular Neural Network

Many problems are solved using neural networks whose architecture is composed of several modules, with sparse interconnections between them. Modules can be organized in several different ways, such as hierarchical organization, continuous refinement, and input modularity.

In ANNs, learning is the process of adjusting the weights of the connections between the nodes of the network. The learning ability of a neural network is determined by its architecture and by the algorithm chosen for training.
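As a minimal, hypothetical sketch of what "adjusting the weights" means in practice (the input, target, and learning rate below are invented), here is a single gradient-descent update of the weights of one linear unit, often called the delta rule:

```python
import numpy as np

x = np.array([0.5, -1.0, 0.25])   # one input vector (made up)
d = 1.0                           # desired output for that input
w = np.zeros(3)                   # connection weights, initially zero
eta = 0.1                         # learning rate

o = w @ x                         # current output of the unit
error = d - o                     # difference between desired and actual output
w += eta * error * x              # nudge the weights to reduce the error

print(w)                          # weights after one learning step
```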

Supervised Learning

In supervised learning, the training data consist of pairs of input and desired output values that are traditionally represented as data vectors. Supervised learning can also be framed as classification, for which a wide range of classifiers is available (multilayer perceptron, k-nearest neighbors, etc.).
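As a small sketch of supervised classification with one of the classifiers mentioned above, k-nearest neighbors (the toy points, labels, and query are made up for illustration):

```python
import numpy as np

# Training data: pairs of input vectors and desired class labels.
X_train = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y_train = np.array([0, 0, 1, 1])

def knn_predict(x, X_train, y_train, k=3):
    """Return the majority label among the k training inputs closest to x."""
    distances = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(distances)[:k]
    return np.bincount(y_train[nearest]).argmax()

print(knn_predict(np.array([0.2, 0.1]), X_train, y_train))   # expected: 0
```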

Unsupervised Learning

In the unsupervised methodology, no sample outputs are provided to the network against which it can measure its predictive performance for a given vector of inputs. The best-known form of unsupervised learning is clustering, where we group data into different clusters based on their similarity.
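As a rough sketch of clustering (k-means here is just one common choice, and the two-blob data and parameters are invented), the algorithm groups unlabeled points purely by similarity, with no desired outputs involved:

```python
import numpy as np

rng = np.random.default_rng(1)

# Unlabeled data: two loose blobs in 2-D, made up for illustration.
X = np.vstack([rng.normal(0.0, 0.3, (20, 2)), rng.normal(3.0, 0.3, (20, 2))])

k = 2
centers = X[rng.choice(len(X), size=k, replace=False)]   # initial cluster centers

for _ in range(10):
    # Assign each point to its nearest center, then move every center to the
    # mean of the points assigned to it.
    labels = np.argmin(np.linalg.norm(X[:, None] - centers[None, :], axis=2), axis=1)
    centers = np.array([X[labels == c].mean(axis=0) for c in range(k)])

print(centers)   # should end up near the two blob centers, (0, 0) and (3, 3)
```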

Back Propagation Algorithm

The backpropagation algorithm (Rumelhart and McClelland, 1986) is used in layered feed-forward Artificial Neural Networks. The backpropagation network is a multi-layer, feed-forward, supervised learning network based on the gradient descent learning rule. We provide the algorithm with examples of the inputs and outputs we want the network to compute, and the error (the difference between the actual and expected results) is calculated. The idea of the backpropagation algorithm is to reduce this error until the Artificial Neural Network learns the training data.

In ANNs implementing the backpropagation algorithm, the activation of each artificial neuron is a weighted sum of its inputs (the inputs x_i multiplied by their weights w_ji).
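In the notation commonly used for this algorithm (standard symbols rather than anything specific to this post), this weighted sum for neuron j can be written as:

```latex
A_j(\bar{x}, \bar{w}) = \sum_i x_i \, w_{ji}
```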

The best-known output function is the sigmoidal function.
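In the same notation, the sigmoidal output of neuron j is:

```latex
O_j(\bar{x}, \bar{w}) = \frac{1}{1 + e^{-A_j(\bar{x}, \bar{w})}}
```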

Since the error is the difference between the actual and the desired output, the error depends on the weights, and we need to adjust the weights to minimize the error. We can also define an error function for the output of each neuron.
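Below is a minimal sketch of these ideas in code (the layer sizes, learning rate, and toy XOR data are invented for illustration, not taken from this post): a small two-layer sigmoid network trained by gradient descent on the squared error E = 1/2 * sum((d_j - o_j)^2).

```python
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

rng = np.random.default_rng(42)

# Toy training set (XOR), chosen only to show the mechanics of the algorithm.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
D = np.array([[0], [1], [1], [0]], dtype=float)   # desired outputs

W1 = rng.normal(scale=0.5, size=(2, 4))   # input -> hidden weights
b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(4, 1))   # hidden -> output weights
b2 = np.zeros(1)
eta = 1.0                                 # learning rate

for epoch in range(10000):
    # Forward pass: each unit outputs the sigmoid of its weighted sum.
    H = sigmoid(X @ W1 + b1)              # hidden outputs
    O = sigmoid(H @ W2 + b2)              # network outputs

    # Error E = 1/2 * sum((D - O)**2); propagate its gradient backwards.
    delta_out = (O - D) * O * (1 - O)             # gradient at the output units
    delta_hid = (delta_out @ W2.T) * H * (1 - H)  # gradient at the hidden units

    # Gradient-descent updates that reduce the error.
    W2 -= eta * H.T @ delta_out
    b2 -= eta * delta_out.sum(axis=0)
    W1 -= eta * X.T @ delta_hid
    b1 -= eta * delta_hid.sum(axis=0)

print(np.round(O.ravel(), 2))   # typically approaches [0, 1, 1, 0]
```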


Applications of Neural Network

  • Image Processing and Pattern Recognition
  • Character recognition
  • Forecasting
  • Speech Recognition
  • Signature Verification

Advantages of Neural Network

  • They involve human-like thinking.
  • They are used in data mining and can handle noisy or missing data.
  • They can work with a large number of variables or parameters.
  • They provide general solutions with good predictive accuracy.
  • They have the property of continuous learning.

