Class definition

#define NNF_FUNC_SIGMA_LOGISTIC(x) ( 1 / (1 + exp(-(x))) )
#define NNF_FUNC_SIGMA_HYPERBOLIC(x) ( tanh((x)) )
#define NNF_FUNC_STEP(x,alfa) ( (x) >= (alfa) ? 1 : 0 )
#define NNF_FUNC_SIGN(x) ( (x) >= 0 ? 1 : -1 )
#define NNF_FUNC_SIMIL_SIGMOID(x,alfa) ((0.5*x)/(alfa+abs(x))+0.5)

enum
{
   NNF_SIGMA_LOGISTIC,
   NNF_SIGMA_HYPERBOLIC,
   NNF_STEP,
   NNF_SIGN,
   NNF_SIMIL_SIGMOID,
   NNF_SINC
};

class Error
{
   public: float SingleExampleError;
   public: float CumulativeError;
   public: int SingleExampleAccuracy;
   public: int CumulativeAccuracy;
   public: float Treshold;

   public: Error(); //Constructor
   public: void ErrorQuadratic(float Out[], float Des[], int Dim);
   public: void Accuracy(float Out[], float Des[], int Dim);

};

class NNFpp
{
   public: int layers; //Network layers
   public: int *neurons; //Number of neurons in each layer
   public: float epsilon; //Learning rate
   public: int activationFunction; //Activation function
   public: float alfa; //Parameter, only for activation functions that need it
   public: float ***weight; //Weight of connections between neurons
   public: float **bias; //Weights of bias
   public: float **value; //Neuron's value
   public: float **delta; //Neuron's delta

      private: int InitType; //Type of constructor used
      private: FILE *FileForReset; //File used by the constructor from file

   public: Error *Errors; //Errors

   public: TrainingData *TD; // a container for the training set


   public: NNFpp(int layrs, int neurs[], int range, int A_Function, float Epsilon);
   public: NNFpp(int layrs, int neurs[], int range, int A_Function, float FunctionParameter, float Epsilon);
   public: NNFpp(int layrs, int neurs[], int range, int A_Function[], float FunctionParameter[], float Epsilon);
   public: NNFpp(FILE *file);
   public: ~NNFpp(void);


   public: void LoadNet(FILE* loadFile);
   public: void SaveNet(FILE* saveFile);
   public: void ResetNet(int range);
   public: float* Execute(float input[]);
   public: void BackPropagateOnLine(float desired[]);
      private: float LearningBP(int i, int j, float error);
   public: bool TrainingSetFromFile(FILE *TrainingSet);
   public: bool TrainingSetFromFile(FILE *TrainingSet, int seed);
   public: void TestOnFile(FILE *test, FILE *output);
};


Class NNFpp


public: NNFpp(int layrs, int neurs[], int range, int A_Function, float Epsilon)

Parametrized constructor:
int layrs Number of layers
int neurs[] Number of neurons in each layer
int range Range for random initialization
int A_Function Activation function without parameter
float Epsilon Learning rate

This constructor creates a neural network that uses a simple activation function (one without a parameter).
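
A minimal usage sketch (the layer sizes, range and learning rate below are illustrative, and the header name is an assumption):

#include "NNFpp.h" // assumed header name for the framework

int neurs[] = {2, 3, 1};                          // 2 inputs, 3 hidden neurons, 1 output
NNFpp net(3, neurs, 1, NNF_SIGMA_LOGISTIC, 0.7f); // 3 layers, random weights in range 1, logistic sigmoid, epsilon 0.7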



public: NNFpp(int layrs, int neurs[], int range, int A_Function, float FunctionParameter, float Epsilon)

Parametrized constructor for activation function with parameter:
int layrs Number of layers
int neurs[] Number of neurons in each layer
int range Range for random initialization
int A_Function Activation function with parameter
float FunctionParameter Parameter for the step and simil-sigmoid functions
float Epsilon Learning rate

This constructor creates a neural network with an activation function that needs a parameter.
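
For example (the values are illustrative, and net assumes the framework header is included):

int neurs[] = {2, 8, 1};
NNFpp net(3, neurs, 1, NNF_SIMIL_SIGMOID, 1.0f, 0.7f); // simil-sigmoid with alfa = 1.0, epsilon 0.7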



public: NNFpp(int layrs, int neurs[], int range, int A_Function[], float FunctionParameter[], float Epsilon)

Parametrized constructor for multiple activation functions with parameters:
int layrs Number of layers
int neurs[] Number of neurons in each layer
int range Range for random initialization
int A_Function[] Activation functions vector
float FunctionParameter[] Parameters for functions
float Epsilon Learning rate

This constructor creates a neural network with a specific activation function for each layer. The A_Function[] and FunctionParameter[] vectors must both have dimension layers-1.
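
A sketch for a 3-layer net, so both vectors have layers-1 = 2 entries (the values are illustrative; the entries are presumably one per layer after the input):

int neurs[] = {2, 8, 1};
int functions[] = {NNF_SIMIL_SIGMOID, NNF_SIGMA_LOGISTIC}; // one activation function per non-input layer (assumed)
float parameters[] = {1.5f, 0.0f};                         // the parameter is used only by functions that need it
NNFpp net(3, neurs, 1, functions, parameters, 0.7f);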



public: NNFpp(FILE *file)

Constructor from file:
FILE *file Pointer to the ".nnf" file that contains the net.

This constructor loads a net directly from a ".nnf" file.
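
For example ("net.nnf" is an illustrative file name):

FILE *saved = fopen("net.nnf", "r");
if (saved != NULL)
{
   NNFpp net(saved);
   fclose(saved);
}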



public: ~NNFpp(void)

Destructor.



public: void LoadNet(FILE* loadFile)

Loads the net's definition from the file pointed to by loadFile.

Example of a ".nnf" file:

MLP NNF - Neural Net Framework

__neural net save file__

CONSTANTS
layers: 2
neurons per layer: 2 1
epsilon: 0.700000
activation function: simil_sigmoid
Parameter: 1.000000
WEIGHTS
-0.668014
0.066606 -0.181634

Another example of a ".nnf" file:

MLP NNF - Neural Net Framework

__neural net save file__

CONSTANTS
layers: 5
neurons per layer: 2 8 5 3 1
epsilon: 0.700000
activation function: simil_sigmoid
Parameter: 1.5
sigma_logistic
simil_sigmoid
Parameter: 1.5
sigma_logistic
simil_sigmoid
Parameter: 1.5
WEIGHTS
-0.724845
-0.735771 0.887631
0.960631
-0.056001 0.218543
-0.835322
-0.923338 -0.544969
-0.807550
-0.409711 0.803278
-0.842158
0.022248 0.309610
-0.908994
0.572131 0.855708
-0.881832
-0.777032 0.786554
0.932737
-0.019868 -0.090121
44.841660
16.150427 31.722534 7.707487 14.947559 15.555410 20.338844 12.373686 31.741299
-0.515915
0.550340 -0.876583 0.395734 0.712027 0.248573 -0.124729 0.482162 0.534043
0.264077
0.483932 0.760613 0.114109 -0.199194 0.449629 -0.941710 0.847713 0.468856
0.665395
-0.019196 0.298685 0.084017 0.418622 0.127232 -0.750114 -0.528672 -0.399945
-0.721549
-0.695181 0.459822 0.056856 0.246010 -0.523301 0.256264 -0.972106 -0.932676
-0.795709
0.385601 0.599048 0.237770 0.230262 -0.615284
0.114170
0.663747 0.780877 -0.561754 -0.517624 -0.972839
0.483444
0.142735 0.316019 0.247353 -0.539903 -0.412519
0.109417
-0.642264 -0.357330 -0.520864



public: void SaveNet(FILE* saveFile)

Saves the current net's definition to the file pointed to by saveFile.
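
For example (assuming net is an existing NNFpp instance; "net.nnf" is an illustrative file name):

FILE *out = fopen("net.nnf", "w");
if (out != NULL)
{
   net.SaveNet(out);
   fclose(out);
}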



public: void ResetNet(int range)

Resets the current net and generates new random weights within range. This function uses the information stored in InitType and FileForReset to reinitialize the net.



public: float* Execute(float input[])

Executes the net on the input vector input[].
Returns a float vector with the result of each output neuron.
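
For example, assuming net is an NNFpp instance with 2 input neurons:

float input[] = {1.0f, 1.0f};        // one value per input neuron
float *output = net.Execute(input);  // output[i] is the result of the i-th output neuron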



public: void BackPropagateOnLine(float desired[])

Performs one on-line learning step using backpropagation. desired[] is the vector of desired outputs.
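
A single on-line training step presumably looks like this (assuming net has 2 inputs and 1 output, and that Execute() has stored the forward-pass values before BackPropagateOnLine() is called):

float input[]   = {1.0f, 1.0f};
float desired[] = {2.0f};
net.Execute(input);               // forward pass
net.BackPropagateOnLine(desired); // weight update using the learning rate epsilon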



public: bool TrainingSetFromFile(FILE *TrainingSet)

Loads the file pointed to by TrainingSet and uses it for learning.
Example of ".nnf.ts" file:

NNF - Neural Net Framework

__neural net training set file__

SET
in: 2
out: 1
examples: 3

1 1
2

2 2
4

5 7
12

To use this function for training you must respect the structure of the file: in is the number of input neurons, out is the number of output neurons, and examples is the number of examples in the file. For each example, the first row contains the values of the input neurons and the second row contains the desired outputs, and so on.
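
A usage sketch ("data.nnf.ts" is an illustrative file name, net an existing NNFpp instance):

FILE *ts = fopen("data.nnf.ts", "r");
if (ts != NULL)
{
   net.TrainingSetFromFile(ts); // returns a bool, presumably false on failure
   fclose(ts);
}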

public: bool TrainingSetFromFile(FILE *TrainingSet, int seed)

This is an overload of the previous function: it loads the file pointed to by TrainingSet and uses it for learning; seed is used to shuffle the examples.


To use this function for training, the file must respect the structure of an ".nnf.ts" file.
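
For example (the seed value is arbitrary):

net.TrainingSetFromFile(ts, 42); // same as above, but the examples are shuffled using seed 42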


public: void TestOnFile(FILE *test, FILE *output)

Uses the file pointed to by test to test the neural network; at the end, the file pointed to by output contains the results.


The test file must respect the structure of an ".nnf.ts" file.
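
A usage sketch (file names are illustrative, net an existing NNFpp instance):

FILE *testFile = fopen("test.nnf.ts", "r");
FILE *results  = fopen("results.txt", "w");
if (testFile != NULL && results != NULL)
{
   net.TestOnFile(testFile, results);
}
if (testFile != NULL) fclose(testFile);
if (results != NULL) fclose(results);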


Class Error


This class is used to calculate errors during learning. You can evaluate two kinds of error: quadratic error and accuracy. The first considers the squared difference between each output neuron and its desired value; two variables are involved: SingleExampleError, which stores the error on the current example, and CumulativeError, which sums all the errors. The second is used when the net performs classification: Accuracy uses the value Treshold, so an output neuron counts as 1 if its value is greater than Treshold and as 0 otherwise; if the output matches the desired value, SingleExampleAccuracy is 1. CumulativeAccuracy sums all the SingleExampleAccuracy values. For more information see the Logical XOR tutorial.
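
A sketch of how the Errors member might be used after a forward pass (input, desired and the dimension 1 are illustrative; the exact call sequence may differ, see the Logical XOR tutorial):

float *out = net.Execute(input);
net.Errors->Treshold = 0.5f;                 // classification threshold
net.Errors->ErrorQuadratic(out, desired, 1); // 1 = number of output neurons
net.Errors->Accuracy(out, desired, 1);
printf("error: %f  accuracy: %d\n", net.Errors->SingleExampleError, net.Errors->SingleExampleAccuracy);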


Class TrainingData


This class is used internally, like the Examples class.


Activation function

NNF_SIGMA_LOGISTIC: 1 / (1 + exp(-x))
NNF_SIGMA_HYPERBOLIC: tanh(x)
NNF_STEP: 1 if x >= alfa, else 0
NNF_SIGN: 1 if x >= 0, else -1
NNF_SIMIL_SIGMOID: 0.5*x / (alfa + |x|) + 0.5
NNF_SINC
(The plot images of the activation functions are not reproduced here.)

Copyright (C) 2006 by Failla Pierluigi