
Testing and Validation CNTK Models using C#

15 Nov 2017, CPOL, 1 min read

…continued from the previous post.

Once the model is built and the Loss and Validation functions satisfy our expectations, we need to validate and test the model using data which was not part of the training data set (unseen data). Model validation is very important because we want to see whether the model is trained well enough to evaluate unseen data approximately as well as the training data. A model which cannot predict the output on unseen data is called an overfitted model. Overfitting can happen when the model is trained so long that it shows very high performance on the training data set but evaluates poorly on the testing data.

We will continue with the implementation from the previous two posts and implement model validation. After the model is trained, the model and the trainer are passed to the Evaluation method. The Evaluation method loads the testing data and calculates the output using the passed model. Then it compares the calculated (predicted) values with the output from the testing data set and calculates the accuracy. The following source code shows the evaluation implementation.

C#
private static void EvaluateIrisModel(Function ffnn_model, Trainer trainer, DeviceDescriptor device)
{
    var dataFolder = "Data";//files must be in the same folder as the program
    var testPath = Path.Combine(dataFolder, "testIris_cntk.txt");
    var featureStreamName = "features";
    var labelsStreamName = "label";

    //extract features and label from the model
    var feature = ffnn_model.Arguments[0];
    var label = ffnn_model.Output;

    //stream configuration to distinct features and labels in the file
    var streamConfig = new StreamConfiguration[]
        {
            new StreamConfiguration(featureStreamName, feature.Shape[0]),
            new StreamConfiguration(labelsStreamName, label.Shape[0])
        };

    // prepare testing data
    var testMinibatchSource = MinibatchSource.TextFormatMinibatchSource(
        testPath, streamConfig, MinibatchSource.InfinitelyRepeat, true);
    var featureStreamInfo = testMinibatchSource.StreamInfo(featureStreamName);
    var labelStreamInfo = testMinibatchSource.StreamInfo(labelsStreamName);

    int batchSize = 20;
    //counters for misclassified and total evaluated samples
    int miscountTotal = 0, totalCount = 0;
    while (true)
    {
        var minibatchData = testMinibatchSource.GetNextMinibatch((uint)batchSize, device);
        if (minibatchData == null || minibatchData.Count == 0)
            break;
        totalCount += (int)minibatchData[featureStreamInfo].numberOfSamples;

        // expected labels are in the mini batch data.
        var labelData = minibatchData[labelStreamInfo].data.GetDenseData<float>(label);
        var expectedLabels = labelData.Select(l => l.IndexOf(l.Max())).ToList();

        var inputDataMap = new Dictionary<Variable, Value>() {
            { feature, minibatchData[featureStreamInfo].data }
        };

        var outputDataMap = new Dictionary<Variable, Value>() {
            { label, null }
        };

        ffnn_model.Evaluate(inputDataMap, outputDataMap, device);
        var outputData = outputDataMap[label].GetDenseData<float>(label);
        var actualLabels = outputData.Select(l => l.IndexOf(l.Max())).ToList();

        int misMatches = actualLabels.Zip(expectedLabels, (a, b) => a.Equals(b) ? 0 : 1).Sum();

        miscountTotal += misMatches;
        Console.WriteLine($"Validating Model: Total Samples = {totalCount}, 
                                              Mis-classify Count = {miscountTotal}");

        if (totalCount >= 20)
            break;
    }
    Console.WriteLine($"---------------");
    Console.WriteLine($"------TESTING SUMMARY--------");
    //cast to float is required, otherwise the integer division always yields accuracy = 1
    float accuracy = (1.0F - (float)miscountTotal / totalCount);
    Console.WriteLine($"Model Accuracy = {accuracy}");
}

The implemented method is called from the Training method shown in the previous post.

C#
EvaluateIrisModel(ffnn_model, trainer, device);
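
For context, here is a minimal sketch of where that call sits at the end of the training loop. The epoch count, minibatch size, and the trainMinibatchSource, featureStreamInfo, labelStreamInfo, feature and label names are assumptions standing in for the code from the previous post, not a copy of it.

C#
//sketch of the end of the Training method (names and values assumed for illustration)
int epochs = 800;
while (epochs > 0)
{
    //fetch the next minibatch of training data
    var minibatchData = trainMinibatchSource.GetNextMinibatch(65, device);

    //map the streams to the model's feature and label variables
    var arguments = new Dictionary<Variable, MinibatchData>
    {
        { feature, minibatchData[featureStreamInfo] },
        { label, minibatchData[labelStreamInfo] }
    };

    trainer.TrainMinibatch(arguments, device);
    epochs--;
}

//once training is finished, validate the model on the unseen test data
EvaluateIrisModel(ffnn_model, trainer, device);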

As can be seen in the following image, the model validation shows that the model predicts the unseen data with high accuracy:

This is the last post in the series of blog posts about using feed-forward neural networks to train the Iris data set with CNTK and C#.

The full source code for all three samples can be found here.

Filed under: .NET, C#, CNTK, CodeProject
Tagged: .NET, C#, CNTK, Code Project, CodeProject, Machine Learning

License

This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)


Written By
Software Developer (Senior)
Bosnia and Herzegovina
Bahrudin Hrnjica holds a Ph.D. degree in Technical Science/Engineering from the University of Bihać.
Besides teaching at the university, he has been in the software industry for more than two decades, focusing on development technologies such as .NET, Visual Studio, and desktop/web/cloud solutions.

He works on the development and application of different ML algorithms and has more than 10 years of experience in developing ML-oriented solutions and models. His fields of interest also include the development of predictive models with ML.NET and Keras, and he actively develops two ML-based .NET open-source projects: GPdotNET, a genetic programming tool, and ANNdotNET, a deep learning tool on the .NET platform. He works in multidisciplinary teams with the mission of optimizing and selecting ML algorithms to build ML models.

He is the author of several books and many online articles, writes a blog at http://bhrnjica.net, regularly holds lectures at local and regional conferences, user groups and Code Camp gatherings, and is also the founder of the Bihac Developer Meetup Group. Microsoft first awarded him the prestigious Microsoft MVP title in 2011, a recognition he still holds today.

Comments and Discussions

 
Question: I do have a quick question? (asiwel, 16-Nov-17 9:10)
Answer: Re: I do have a quick question? (Bahrudin Hrnjica, 17-Nov-17 0:59)
General: Re: I do have a quick question? (asiwel, 17-Nov-17 4:07)
General: Re: I do have a quick question? (Bahrudin Hrnjica, 21-Nov-17 8:43)
General: Re: I do have a quick question? (asiwel, 21-Nov-17 11:15)
Hi, Bahrudin. I've been busy fiddling with your code and my data and what appears to be amazingly obtuse documentation for CNTK for C#. (For that matter, the Python documentation is not much better yet IMO, but it does give clues here and there as to what all the fancy variables and formats and classes, etc., are supposed to mean and do.) Without that documentation, one can get examples to work without understanding exactly what is happening. However trying to modify or extend such examples becomes very difficult, I think.

But I have been having (sort of) great success with your codes and examples. I appreciate your tip about your createFFNN function. The funny thing is that there, I also had solved that problem myself ... and did it exactly the same way!

I am close but have not figured out how best to use a Test file in memory (even one in CNTK format) instead of a disk file in the Evaluation method. However there are many other questions, e.g.:

1) The LearningRate. TrainingParameterScheduleDouble() has several parameters, and the first is the learning rate. The next, I think, are for overriding minibatch and epoch defaults of some kind. Your (0.001125,1) worked OK for Iris, but on other, larger, data sets, I get understandable results by omitting the second argument and playing around with various more reasonable values for the first argument. But I do not find any documentation suggesting what values might be reasonable starters for different models. For my problem, I am finding .02 or even .05 speeds things up considerably for the SGDLearner.

2) However, using a MomentumSGDLearner, you need a learning rate and a momentum rate (both of which are of the type returned by TrainingParameterScheduleDouble()). No documentation about what size that latter rate should be? (0.002) seems to work OK. (A sketch of both learner setups follows after this list.)

3) What actually is the trainer.PreviousMinibatchEvaluationAverage()? Right near the end of a training (after many epochs) this appears to be the training misclassification rate - almost exactly. But when just starting and stopping after a few dozen or hundred epochs (while looking for bugs, etc. :) ), that "average" is usually a little bit less than the Evaluation Validation test results. Why?
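
Regarding questions 1) and 2) above, here is a minimal sketch of how the two learners can be constructed with TrainingParameterScheduleDouble. The rate values and the trainingLoss and prediction function names are illustrative assumptions, not recommendations from the article.

C#
//learning rate schedule (same constructor the article uses; values here are only illustrative)
var learningRate = new TrainingParameterScheduleDouble(0.02, 1);

//plain SGD learner
var sgdLearner = Learner.SGDLearner(ffnn_model.Parameters(), learningRate);

//momentum SGD learner additionally takes a momentum schedule and a unit-gain flag
var momentum = new TrainingParameterScheduleDouble(0.002, 1);
var momentumLearner = Learner.MomentumSGDLearner(ffnn_model.Parameters(), learningRate, momentum, true);

//either learner can then be passed to the trainer (loss and evaluation functions assumed from the previous post)
var trainer = Trainer.CreateTrainer(ffnn_model, trainingLoss, prediction, new List<Learner>() { momentumLearner });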

BTW: In your Evaluation method, for validation and testing, the line that computes the "validation" or "test" accuracy of the model always shows up = 1. A cast is needed to get the decimal places, like this:

float accuracy = (1.0F - (float)miscountTotal / totalCount);

Lots of other stuff to figure out! Great fun! Little cosmetic stuff like this (in your Training method):

Console.WriteLine($"The model trained on {yValues.Data.Shape.Dimensions[2]} cases " + $"to an accuracy of {acc}%");

Imagine having to reach way into the yValues object like that just to grab the number of cases/records/instances/samples/etc that you might be using right then to train on! (I read the data into memory, randomize, and split it for train and test early in my programs. Could easily do train, validate, test that way if I wanted to - and could figure out how to use those arrays in memory for validating models and testing results in the Evaluation method! Writing pieces to disk for later retrieval is a bummer when experimenting.)
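
As for evaluating test data held in memory rather than in a disk file, one option is to build a Value batch directly from a float array and feed it to Evaluate, bypassing the MinibatchSource entirely. A minimal sketch, assuming the same feature and label variables as in the Evaluation method and a flattened float[] of Iris features (4 values per sample; the sample values are only illustrative):

C#
//two in-memory Iris samples, flattened to 4 features each
float[] inputData = new float[] { 5.1f, 3.5f, 1.4f, 0.2f, 6.7f, 3.0f, 5.2f, 2.3f };

//create a batch Value directly from memory; the sample shape comes from the model's input variable
var featureValue = Value.CreateBatch<float>(feature.Shape, inputData, device);

var inputMap = new Dictionary<Variable, Value>() { { feature, featureValue } };
var outputMap = new Dictionary<Variable, Value>() { { label, null } };

ffnn_model.Evaluate(inputMap, outputMap, device);

//predicted class index for each in-memory sample
var predictedLabels = outputMap[label].GetDenseData<float>(label)
    .Select(l => l.IndexOf(l.Max())).ToList();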

modified 21-Nov-17 17:29pm.

General: Re: I do have a quick question? (Bahrudin Hrnjica, 22-Nov-17 0:43)
General: Re: Variable Learning rate (asiwel, 22-Nov-17 7:26)
General: Re: I do have a quick question and a solution (asiwel, 21-Nov-17 13:59)
General: Ha! Feel some vindicated. (asiwel, 25-Nov-17 11:12)
Question: Really Neat Project! (asiwel, 15-Nov-17 16:50)
Answer: Re: Really Neat Project! (Bahrudin Hrnjica, 17-Nov-17 1:04)
