
Parallel Artificial Neural Networks in .NET Framework

7 Aug 2015 · CPOL · 7 min read
A description of NeuralClasses created for .NET Framework and a simple character detector application

Introduction

To understand the human brain and the nature of consciousness, a proper understanding of neurons and neural networks is required. Even simple implementations of artificial neural networks on a sequential computer can be trained to do tasks the brain is good at, such as pattern recognition and classification, which are useful in voice recognition and image processing applications.

A biological neuron is an elongated cell that is excited by signals arriving through its dendrites; if the combined excitation exceeds some threshold, the cell generates an action potential that travels down the axon. The axon connects to other neurons, and these connections make up a neural network. How the brain learns and processes with this amazingly complex collection of interconnected neurons is a different issue altogether.

An artificial neuron is the building block of an Artificial Neural Network (ANN). It has multiple inputs and one output. Each input is multiplied by the weight of its connection, the results are accumulated, and the sum is compared with a threshold: if it exceeds the threshold the neuron fires (1), otherwise it remains unfired (0). The thresholding is done by an 'activation function', which is generally a differentiable sigmoid.

ANNs are commonly trained with the 'backpropagation' algorithm: inputs and their corresponding outputs are presented to the network, and the internal weights are updated so that the network produces the desired output for each input. Depending on the complexity of the inputs and the number of neurons, parameters such as the 'learning rate' and 'momentum' have to be set before training, so that the network converges and retains what it has learned. You can read more about backpropagation on external sites.
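As a concrete illustration of the model just described, the following self-contained sketch (not part of the library) computes one neuron's output as the weighted sum of its inputs plus a bias, passed through the logistic sigmoid:

```csharp
using System;
using System.Linq;

class NeuronSketch
{
    // Weighted sum of inputs plus bias, squashed by the logistic sigmoid into (0, 1).
    public static double Fire(double[] inputs, double[] weights, double bias)
    {
        double sum = inputs.Zip(weights, (x, w) => x * w).Sum() + bias;
        return 1.0 / (1.0 + Math.Exp(-sum));
    }

    static void Main()
    {
        // Two active inputs with strongly positive weights push the output toward 1.
        Console.WriteLine(Fire(new[] { 1.0, 1.0 }, new[] { 5.0, 5.0 }, -2.0));
    }
}
```

With large positive weighted input the output saturates near 1; with a zero sum it sits exactly at 0.5, the sigmoid's midpoint.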

As every ANN starts with an artificial neuron, we start with a class Neuron, which can be configured to use various activation functions, accumulate the values presented at its inputs and fire according to the chosen activation function. A collection of Neuron objects is called a Layer, which either receives inputs, produces outputs, or remains hidden within the ANN (a hidden layer). Methods included in the class are SetInputs(IEnumerable<double>), IEnumerable<double> GetOutputs() and CalculateLayer(), which updates all the neurons in the layer based on the inputs from the previous layer. ConnectAllTo is a method that links two layers, connecting each neuron in the previous layer to every neuron in the next with a random weight. The Layer class also has two events, Updated and Changed, which fire when the layer has been updated from the previous layer's outputs and when the input to the layer changes, respectively.

A collection of interconnected layers is called a NeuralNetwork, which has methods to save the trained and tested network to an XML file, or to read such an XML file back and recreate the saved network. ApplyInput(IEnumerable<double>) feeds input to the input layer of the network, and IEnumerable<double> ReadOutput() returns a collection of double values corresponding to the output neurons. It also has a virtual method, CalculateOutput(), which can be overridden to change the way the output is calculated; the default implementation assumes a feed-forward network with all layers sequential and interconnected.

So, to implement the backpropagation algorithm, the NeuralNetwork class is used as a base class for BackPropogationNetwork, which implements the method BackPropogate(DataSet data, double learningRate, double momentum) to train the network.

In this article, we shall discuss a simple pattern recognition application using these classes. All inputs and outputs are IEnumerable<double> collections whose sizes depend on the number of input and output neurons.

The .NET Framework provides the Parallel class, which can be used to distribute processor-intensive work across all the processors and threads on the system; here it is used for the feed-forward calculations.
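As a minimal, standalone illustration of the Parallel class (unrelated to the library itself), the sketch below fills an array using all available cores; each index is touched by exactly one iteration, so no locking is needed:

```csharp
using System;
using System.Threading.Tasks;

class ParallelDemo
{
    // Fill values[i] = i * i, with the iterations spread across the thread pool.
    public static double[] Squares(int n)
    {
        double[] values = new double[n];
        Parallel.For(0, n, i => values[i] = (double)i * i);
        return values;
    }

    static void Main()
    {
        double[] v = Squares(1000);
        Console.WriteLine(v[999]);   // prints 998001
    }
}
```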

Neuron

Each neuron can be initialized with any of five different activation function types, given by:

public enum ActivationType { BINARY, BIPOLAR, SIGMOID, BIPOLARSIGMOID, RAMP };
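The article does not show the library's Activate() method, but a plausible sketch of how each ActivationType could map to a function of the accumulated sum is the following (the exact formulas, particularly the range of RAMP, are my assumptions):

```csharp
using System;

static class ActivationSketch
{
    public enum ActivationType { BINARY, BIPOLAR, SIGMOID, BIPOLARSIGMOID, RAMP };

    // One plausible mapping from ActivationType to a function of the accumulated sum x.
    public static double Activate(ActivationType type, double x)
    {
        switch (type)
        {
            case ActivationType.BINARY:         return x >= 0 ? 1 : 0;           // hard threshold
            case ActivationType.BIPOLAR:        return x >= 0 ? 1 : -1;          // hard threshold, +-1
            case ActivationType.SIGMOID:        return 1.0 / (1.0 + Math.Exp(-x));
            case ActivationType.BIPOLARSIGMOID: return 2.0 / (1.0 + Math.Exp(-x)) - 1.0;
            case ActivationType.RAMP:           return Math.Max(0.0, Math.Min(1.0, x)); // linear, clipped to [0, 1]
            default: throw new ArgumentOutOfRangeException(nameof(type));
        }
    }
}
```

Note that only the sigmoid variants are differentiable everywhere, which is why backpropagation later assumes a sigmoid output.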

Within the neuron there is a list of connections, each pointing to a connected neuron and holding the weight of that connection. Connections are added to the list during initialization of the BackPropogationNetwork, when the layers are interconnected. The two methods below are the most important: AddConnection() creates connections to other neurons, and CalculateValue() reads the connections' inputs to the neuron and fires according to the activation function.

public void AddConnection(Neuron neuron, bool to, double weight = 0/*, double bias = 0*/)
{
    // 'to' selects the direction: true wires this neuron into 'neuron',
    // false wires 'neuron' into this neuron. Both ends keep a reference.
    Connection c;

    if (to)
        c = new Connection(this, neuron, weight);
    else
        c = new Connection(neuron, this, weight);

    this.Connections.Add(c);
    neuron.Connections.Add(c);
}

public double CalculateValue()  //NEURON UPDATE
{
    double val = 0;

    IEnumerable<Connection> inputConns = Connections.Where(r => r.toNeuron == this);

    foreach (Connection c in inputConns)
    {
        val += (c.fromNeuron.Value * c.Weight);
    }

    if (inputConns.Any())
    {
        val += Bias;
        AccumulateStore = val;
        Value = Activate(val);
    }
    return Value;
}

Layer

A layer is a collection of Neuron objects of the same kind, and it exposes the events Updated and Changed, which fire when one of the input neurons' values changes and when the output of the layer has been calculated by calling each Neuron's CalculateValue(). Below are CalculateLayer(), which calculates the output of all the neurons and returns it as an IEnumerable<double>, and ConnectAllTo(), which interconnects all the neurons of two layers. These methods are used within BackPropogationNetwork. As you can see, Parallel.ForEach is used to calculate the output of each neuron, which makes the task considerably faster in a multi-processor environment.

public IEnumerable<double> CalculateLayer()
{
    Parallel.ForEach(Neurons, n =>
    {
        n.CalculateValue();
    });
    RaiseUpdated();
    return Neurons.Select(r => r.Value);
}

public void ConnectAllTo(Layer nextlayer, double defaultWeight = 0)
{
    // Connects every neuron in the current layer to every neuron in the next,
    // with a random initial weight in [-0.5, 0.5).
    foreach (Neuron n1 in Neurons)
    {
        foreach (Neuron n2 in nextlayer.Neurons)
        {
            n1.AddConnection(n2, true, RandomProvider.random.NextDouble() - 0.5);
        }
    }
}
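The RandomProvider type referenced above is not shown in the article. One minimal stand-in (the method name NextWeight is my own, not the library's API) is a single shared Random instance guarded by a lock, since System.Random is not thread-safe if weight initialization ever runs from parallel code:

```csharp
using System;

// Minimal sketch of a shared random source for weight initialization.
// The article's actual RandomProvider may differ.
static class RandomProvider
{
    private static readonly Random inner = new Random();
    private static readonly object gate = new object();

    // Uniform random initial weight in [-0.5, 0.5).
    public static double NextWeight()
    {
        lock (gate)
        {
            return inner.NextDouble() - 0.5;
        }
    }
}
```

Initializing weights in a small symmetric range around zero keeps the sigmoid neurons out of their flat saturation regions at the start of training.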

Neural Network

A NeuralNetwork is a collection of Layers with interconnected neurons, with ApplyInput() and ReadOutput() methods to access the data. As it is easier for the end user to save and read the network as an XML file, or to copy the network data into another network for mutation or reproduction, I have included some methods to take care of just that.

public NeuralNetwork(NetworkData network)
{
    Layers = new Layer[network.Layers.Count];
    NumLayers = network.Layers.Count;
    foreach (LayerData ld in network.Layers)
    {
        Layers[network.Layers.IndexOf(ld)] = new Layer(network.Layers.IndexOf(ld), ld.NumNeuron, ld.ActType, ld.Bias);
    }

    foreach (ConnectionData cd in network.Connections)
    {
        Layers[cd.From.Layer].Neurons[cd.From.Node].AddConnection(Layers[cd.To.Layer].Neurons[cd.To.Node], true, cd.Weight);
    }

    InputIndex = network.InputLayerId;
    OutputIndex = network.OutputLayerId;
}

To extract the network data, including all the layer and connection properties, the method GetNetworkData() can be used:

public NetworkData GetNetworkData()
{
    List<ConnectionData> conns = new List<ConnectionData>();
    List<LayerData> lays = new List<LayerData>();

    foreach (Layer layer in Layers)
    {
        LayerData newlayerData = new LayerData() { NumNeuron = layer.NumNeurons, ActType = layer.ActType, Bias = new List<double>() };
        int layerid = layer.Index;
        foreach (Neuron neuron in layer.Neurons)
        {
            int neuronid = neuron.Index;
            newlayerData.Bias.Add(neuron.Bias);

            foreach (Connection conn in neuron.Connections.Where(r => r.fromNeuron == neuron))
            {
                conns.Add(new ConnectionData()
                {
                    From = new NeuronData() { Layer = layerid, Node = neuronid },
                    To = new NeuronData() { Layer = conn.toNeuron.SelfLayer.Index, Node = conn.toNeuron.Index },
                    Weight = conn.Weight
                });
            }
        }
        lays.Add(newlayerData);
    }

    return new NetworkData() { Connections = conns, InputLayerId = InputIndex, OutputLayerId = OutputIndex, Layers = lays };
}

The library also includes the methods 'public static void SaveNetworkToFile(NetworkData netdata, string filename)', 'public static NetworkData ReadNetworkFromFile(string filename)' and 'public static IEnumerable<DataSet> ReadDataSetFromFile(string filename)'. These are a great convenience when reading and writing neural networks or the training data, the latter held as collections of DataSet objects. We now proceed to the implementation of the backpropagation network.
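The implementation of these persistence methods is not shown in the article; a plausible sketch using XmlSerializer, with NetworkData reduced to a minimal stub (the real class also holds the layer and connection lists), might look like this:

```csharp
using System.IO;
using System.Xml.Serialization;

// Minimal stub for illustration; not the library's full NetworkData.
public class NetworkData
{
    public int InputLayerId { get; set; }
    public int OutputLayerId { get; set; }
}

public static class NetworkIo
{
    // Serialize the network description to an XML file.
    public static void SaveNetworkToFile(NetworkData netdata, string filename)
    {
        using (var stream = File.Create(filename))
            new XmlSerializer(typeof(NetworkData)).Serialize(stream, netdata);
    }

    // Recreate the network description from a previously saved XML file.
    public static NetworkData ReadNetworkFromFile(string filename)
    {
        using (var stream = File.OpenRead(filename))
            return (NetworkData)new XmlSerializer(typeof(NetworkData)).Deserialize(stream);
    }
}
```

XmlSerializer only needs public properties with getters and setters, which is exactly the shape of the data classes shown in this article.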

Backpropagation Network

It uses NeuralNetwork as the base class and overrides the virtual method CalculateOutput() to perform a feed-forward calculation. Using the backpropagation algorithm, which you can read about on external sites, I created this method to train an interconnected neural network that can be initialized with any number of input, output or hidden-layer neurons.

public BackPropogationNetwork(int numInputs, int numOutputs, int numHidden, int numHiddenLayers = 1): base(numHiddenLayers + 2)
{
    Layers[0] = new Layer(0, numInputs, Neuron.ActivationType.RAMP);
    Layers[2 + numHiddenLayers - 1] = new Layer(2 + numHiddenLayers - 1, numOutputs, Neuron.ActivationType.SIGMOID);
    for (int i = 0; i < numHiddenLayers; i++)
    {
        Layers[i + 1] = new Layer(i + 1, numHidden, Neuron.ActivationType.SIGMOID);
    }

    for (int i = 0; i < NumLayers - 1; i++)
    {
        Layers[i].ConnectAllTo(Layers[i + 1]);
    }

    InputIndex = 0;
    OutputIndex = NumLayers - 1;
}

The backpropagation algorithm can be implemented in the following manner:

public List<List<List<double>>> BackPropogate(DataSet datain, double learningRate, double momentum = 0)
{
    if (PrevDW == null)
    {
        // First call: allocate the per-connection weight-change history used for momentum.
        PrevDW = new List<List<List<double>>>();
        deltaArr = new List<List<double>>();
        for (int i = 0; i < NumLayers; i++)
        {
            deltaArr.Add(new List<double>());
            PrevDW.Add(new List<List<double>>());
            for (int n = 0; n < Layers[i].NumNeurons; n++)
            {
                deltaArr[i].Add(0);
                PrevDW[i].Add(new List<double>());
                IEnumerable<Connection> inConns = Layers[i].Neurons[n].Connections.Where(r => r.toNeuron == Layers[i].Neurons[n]);
                foreach (Connection c in inConns)
                {
                    PrevDW[i][n].Add(0);
                }
            }
        }
    }

    ApplyInput(datain.Inputs);
    CalculateOutput();

    // Backward pass 1: compute the error term (delta) for every neuron,
    // starting at the output layer and moving toward the input layer.
    Layer currentLayer = Layers[OutputIndex];
    while (currentLayer != Layers[InputIndex])
    {
        Parallel.ForEach(currentLayer.Neurons, n =>
        {
            double error = 0;

            if (currentLayer == Layers[OutputIndex])
            {
                error = datain.Outputs[n.Index] - n.Value;
            }
            else
            {
                foreach (Connection c in n.Connections.Where(r => r.fromNeuron == n))
                {
                    error += c.Weight * deltaArr[c.toNeuron.SelfLayer.Index][c.toNeuron.Index];
                }
            }
            error = error * n.Value * (1 - n.Value);    // derivative of the sigmoid
            deltaArr[currentLayer.Index][n.Index] = error;
        });
        currentLayer = Layers[currentLayer.Index - 1];
    }

    // Backward pass 2: update the weights and biases using the deltas,
    // with a momentum term based on the previous weight changes.
    currentLayer = Layers[OutputIndex];
    while (currentLayer != Layers[InputIndex])
    {
        for (int i = 0; i < currentLayer.NumNeurons; i++)
        {
            Neuron n = currentLayer.Neurons[i];
            foreach (Connection c in n.Connections.Where(r => r.toNeuron == n))
            {
                double dw = (deltaArr[c.toNeuron.SelfLayer.Index][c.toNeuron.Index] * learningRate * c.fromNeuron.Value) + (momentum * PrevDW[currentLayer.Index][i][n.Connections.IndexOf(c)]);
                c.Weight += dw;
                PrevDW[currentLayer.Index][i][n.Connections.IndexOf(c)] = dw;
            }
            n.Bias += deltaArr[currentLayer.Index][i] * learningRate;
        }
        currentLayer = Layers[currentLayer.Index - 1];
    }
    return PrevDW;
}

 

Application

After the ‘Neural.dll’ reference has been added to the project, all the above classes are available in the ‘NeuralNetworks’ assembly. To initialize a BackPropogationNetwork object, the numbers of input, hidden and output neurons and the number of hidden layers are given as parameters. Then we can begin training and running the network. For training bulk data, the method ‘BatchBackPropogate’ can be used, which takes as parameters a dataset, the number of iterations, the learning rate, the momentum and a BackgroundWorker object that runs the training on its own thread, so the main UI is not blocked while training. The worker also reports the training progress as a percentage, which can be neatly displayed in any WinForms or WPF UI.

The application will be a 16 x 16 pixel character recognizer, which recognizes individual characters in the given data. Each pixel is either ‘0’ or ‘1’, so the number of inputs is 16 x 16 = 256. The number of outputs is 16, each corresponding to one character, hence a total of 16 characters can be recognized (BCD encoding could decrease the number of outputs, but 16 outputs are chosen for simplicity's sake and for the calculation of the confidence level). I arbitrarily choose one hidden layer with 250 hidden neurons. We shall begin by training the network with various inputs over many iterations.

BackPropogationNetwork charecterNet = new BackPropogationNetwork(256, 16, 250);

The network can now be trained by backpropagation with various inputs and outputs and an appropriate learning rate and momentum. This is done in a BackgroundWorker, which is itself a parameter of the method 'BatchBackPropogate()', through which the progress is reported as a percentage.

public void BatchBackPropogate(DataSet[] dataSet, int iterations, double learningRate, double momentum = 0, BackgroundWorker worker = null)

dataSet is an array of DataSet objects:

public class DataSet
{
    public double[] Inputs { get; set; }
    public double[] Outputs { get; set; }
}

which are trained in succession, one after the other, ‘iterations’ times.
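The body of BatchBackPropogate is not shown in the article; a plausible sketch inferred from its signature follows, with BackPropogate stubbed out (and the DataSet class from above repeated) so the loop is self-contained. Note that reporting progress requires the worker to have WorkerReportsProgress set to true:

```csharp
using System.ComponentModel;

public class DataSet
{
    public double[] Inputs { get; set; }
    public double[] Outputs { get; set; }
}

public class TrainerSketch
{
    public int Calls;   // stand-in for the real weight updates, so the loop is testable

    // In the real library this runs one forward + backward pass over the network.
    public void BackPropogate(DataSet data, double learningRate, double momentum = 0)
    {
        Calls++;
    }

    // Run every DataSet through BackPropogate 'iterations' times,
    // reporting percent progress through the optional BackgroundWorker.
    public void BatchBackPropogate(DataSet[] dataSet, int iterations,
        double learningRate, double momentum = 0, BackgroundWorker worker = null)
    {
        for (int i = 0; i < iterations; i++)
        {
            foreach (DataSet d in dataSet)
                BackPropogate(d, learningRate, momentum);

            worker?.ReportProgress((i + 1) * 100 / iterations);
        }
    }
}
```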

After being trained with many inputs and outputs, the neural network is ready for testing, where we feed inputs forward to produce an output.

charecterNet.ApplyInput(GetInputFromBitmap(new Bitmap(pictureBox1.Image)));
charecterNet.CalculateOutput();

Then the result is read out with:

DetectionReturn detect = OutputToCharacter(charecterNet.ReadOutput());

DetectionReturn is a local application class which holds the recognized character and the confidence level of the recognition; it is returned by a method that compares the outputs and estimates which character was recognized. In the application provided for download, load the 'saved.xml' file and test out the first four digits, or train the network yourself to test it for fun.
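OutputToCharacter belongs to the demo application and its code is not shown here; a plausible sketch (the character alphabet and the definition of confidence as the winning activation are my assumptions) is:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class DetectionReturn
{
    public char Character;
    public double Confidence;
}

static class Recognizer
{
    const string Alphabet = "0123456789ABCDEF";   // 16 outputs -> 16 characters (assumed mapping)

    // Pick the output neuron with the highest activation, map its index to a
    // character, and report that activation as the confidence of the detection.
    public static DetectionReturn OutputToCharacter(IEnumerable<double> outputs)
    {
        double[] o = outputs.ToArray();
        int best = Array.IndexOf(o, o.Max());
        return new DetectionReturn { Character = Alphabet[best], Confidence = o[best] };
    }
}
```

Because each output neuron is a sigmoid in (0, 1), the winning activation doubles as a rough confidence score: a well-trained network pushes the correct output toward 1 and the rest toward 0.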

External Links

The library is uploaded at GitHub along with this application at https://github.com/hemanthk119/NeuralNetworks/


 

License

This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)


Written By
Software Developer
India