Hello hello! Are there any AI experts present? :)
So this is a general question: which neural network should I choose?
1. What I have.
There is a stream of data that arrives one block at a time, and each block is an array of doubles:
{0.67, 0.70, 0.71, 0.70, 0.72}
{0.70, 0.71, 0.70, 0.69, 0.71}
{0.76, 0.74, 0.77, 0.73, 0.71}
{0.79, 0.74, 0.77, 0.71, 0.75}
Those arrays pop up one by one, and each has a class value of 1 (for instance); their category is "one". As the data flows, the next arrays (each array being a separate block of data) are:
{0.71, 0.70, 0.69, 0.67, 0.60}
{0.70, 0.69, 0.67, 0.60, 0.50}
{0.69, 0.67, 0.60, 0.50, 0.40}
{0.67, 0.60, 0.50, 0.40, 0.39}
Each of those arrays (or data blocks) has the category "two".
As you can see, the data is very similar; the arrays are nearly identical. But as a human being you can see that there is a drop in the second portion of data blocks. That is why there are two categories: "one" and "two". Category "one" means the flow of data is constant; category "two" means there is a drop.
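For what it's worth, the difference between the two categories above can already be captured by a simple hand-crafted feature. This is only a sketch in plain Python (the function name and the idea of "last minus first" are my own, not part of any library):

```python
def drop_feature(block):
    """Overall change across the block: near zero for constant data,
    clearly negative when there is a drop."""
    return block[-1] - block[0]

constant = [0.67, 0.70, 0.71, 0.70, 0.72]  # a category "one" block
dropping = [0.67, 0.60, 0.50, 0.40, 0.39]  # a category "two" block

print(drop_feature(constant))  # small, close to zero
print(drop_feature(dropping))  # clearly negative
```

Feeding a feature like this to the classifier, instead of (or alongside) the raw values, may make the two categories much easier to separate, since the raw arrays overlap heavily in absolute value.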
2. What I need.
I need the AI to distinguish those blocks of data: when the data is constant and when there is a drop. Not to predict what will happen next, but to classify what is happening right now, when a new block of data arrives. And it should be able to learn per new data block, without caching previous inputs and retraining on all accumulated inputs as one ever-growing training set.
I need something that can learn per new data block and still classify previous data correctly over time. If a new data category arrives, say "gain" or "rise", or any new category, it should be able to learn it while retaining the old knowledge, without relearning everything from scratch.
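For reference, this learn-per-block pattern is usually called online or incremental learning. A minimal sketch with scikit-learn's `SGDClassifier.partial_fit`, assuming each 5-element array is one training sample (note one real limitation: scikit-learn requires the full set of possible class labels on the first `partial_fit` call, so truly open-ended new categories would still need extra handling):

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

clf = SGDClassifier(random_state=0)
classes = np.array([1, 2])  # all labels must be declared up front

# Blocks arrive one at a time; train on each block and discard it,
# so no growing training set is cached in memory.
stream = [
    ([0.67, 0.70, 0.71, 0.70, 0.72], 1),
    ([0.71, 0.70, 0.69, 0.67, 0.60], 2),
    ([0.70, 0.71, 0.70, 0.69, 0.71], 1),
    ([0.67, 0.60, 0.50, 0.40, 0.39], 2),
]
for block, label in stream:
    clf.partial_fit([block], [label], classes=classes)

# Classify a brand-new block without retraining on the old ones.
print(clf.predict([[0.69, 0.67, 0.60, 0.50, 0.40]]))
```

A linear model trained this way will still suffer some drift toward recent data (the "catastrophic forgetting" you saw with the SVM variant), but it at least matches the one-block-at-a-time training requirement.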
Do you know of any solutions that can do that? I haven't tried convolutional nets, but as far as I know they learn the same way: they need all the data to be present. Or am I wrong?
Thank you.
What I have tried:
Multilayer perceptron - while it can learn the way I require, it has a lot of trouble distinguishing the data. It makes too many mistakes: one time it classifies the data correctly, another time it just does not know what is going on. It does have the lowest rate of forgetting old data, though.
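In case the perceptron attempt was hand-rolled: scikit-learn's `MLPClassifier` also supports per-block training through `partial_fit` (only with the `sgd` and `adam` solvers). A sketch, with an arbitrary hidden-layer size of my choosing:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

clf = MLPClassifier(hidden_layer_sizes=(16,), random_state=0)
classes = np.array([1, 2])  # required on the first partial_fit call

# First block arrives: train on it alone.
block_one = np.array([[0.67, 0.70, 0.71, 0.70, 0.72]])
clf.partial_fit(block_one, [1], classes=classes)

# Next block arrives: another single-block update, no replay of old data.
block_two = np.array([[0.67, 0.60, 0.50, 0.40, 0.39]])
clf.partial_fit(block_two, [2])

print(clf.predict(block_one))
```

With so few samples per update the network will be noisy at first; repeating `partial_fit` over many incoming blocks is what gradually stabilizes it.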
Adaptive Multi-Hyperplane Machine (a sort of SVM) - it learns great, but only if all the data is present up front. It cannot learn per new data block: if I train it on new data, it completely forgets the previous data.
Long short-term memory - a great classifier, but again, only if all the training data is present. So I would have to accumulate all inputs and retrain the network on all accumulated previous inputs every time a new input arrives. At some point I would run out of memory, of course.