Note: This page is always under construction.
Table of Contents |
---|
What is Multilayer Perceptron?
...
MLP can be used for both regression and classification. For both tasks, we first need to initialize the MLP model by specifying its parameters, which are listed as follows:
Parameter | Description
---|---
model path | The path where the trained model is stored.
learningRate | Controls the aggressiveness of learning. A large learning rate can accelerate training, but it may also overshoot the minimum and prevent convergence.
regularization | Controls the complexity of the model. A large regularization value keeps the weights small, which reduces overfitting but may cause underfitting.
momentum | Controls the speed of training. A large momentum can accelerate training, but it may also overshoot the minimum.
squashing function | Activation function used by the MLP. Candidate squashing functions: sigmoid, tanh.
cost function | Evaluates the error made during training. Candidate cost functions: squared error, cross entropy (logistic).
layer size array | An array specifying the number of neurons (excluding bias neurons) in each layer (including the input and output layers).
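To illustrate what the layer size array and the squashing function mean, the following is a minimal sketch (plain Java, not Hama code) of a forward pass through a { 2, 5, 1 } network with tanh squashing. The weights here are random placeholders for illustration only; in Hama they are learned during training.

```java
import java.util.Random;

// Sketch only: one forward pass through a {2, 5, 1} network.
// Each non-input layer neuron computes squash(bias + sum(weight * activation)).
public class ForwardPassSketch {

    // The tanh squashing function maps each neuron's weighted input into (-1, 1).
    static double squash(double x) {
        return Math.tanh(x);
    }

    static double forward(double[] input, int[] layerSizeArray, long seed) {
        Random rnd = new Random(seed); // random weights, for illustration only
        double[] activations = input;
        for (int l = 1; l < layerSizeArray.length; l++) {
            double[] next = new double[layerSizeArray[l]];
            for (int j = 0; j < next.length; j++) {
                double sum = rnd.nextDouble() - 0.5; // bias neuron's weight
                for (double a : activations) {
                    sum += (rnd.nextDouble() - 0.5) * a; // weight * activation
                }
                next[j] = squash(sum);
            }
            activations = next;
        }
        return activations[0]; // single output neuron
    }

    public static void main(String[] args) {
        double out = forward(new double[] { 0.0, 1.0 }, new int[] { 2, 5, 1 }, 42L);
        System.out.println("output = " + out);
    }
}
```

Because the output neuron also uses tanh, the network's output always lies in (-1, 1); with the sigmoid squashing function it would lie in (0, 1) instead.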
The following sample code shows how to initialize the model.
No Format |
---|
String modelPath = "/tmp/xorModel-training-by-xor.data";
double learningRate = 0.6;
double regularization = 0.02;
double momentum = 0.3;
String squashingFunctionName = "Tanh";
String costFunctionName = "SquaredError";
int[] layerSizeArray = new int[] { 2, 5, 1 };
SmallMultiLayerPerceptron mlp = new SmallMultiLayerPerceptron(learningRate,
    regularization, momentum, squashingFunctionName, costFunctionName,
    layerSizeArray);
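To show how learningRate, regularization, and momentum interact, here is a minimal sketch (plain Java, not Hama's internal code) of one gradient-descent weight update with L2 regularization and momentum. The update rule shown is a common textbook form and is an assumption about how these parameters are typically combined, not a transcription of Hama's implementation.

```java
// Sketch only: one weight update in gradient descent with momentum
// and L2 regularization.
public class WeightUpdateSketch {

    static double update(double weight, double gradient, double prevDelta,
                         double learningRate, double regularization, double momentum) {
        // L2 regularization adds regularization * weight to the gradient,
        // pulling large weights toward zero; momentum reuses part of the
        // previous step to smooth and accelerate training.
        double delta = -learningRate * (gradient + regularization * weight)
                       + momentum * prevDelta;
        return weight + delta;
    }

    public static void main(String[] args) {
        // Same hyper-parameter values as the initialization sample above.
        double w = update(0.5, 0.1, 0.02, 0.6, 0.02, 0.3);
        System.out.println("updated weight = " + w);
    }
}
```

With a larger learningRate the step away from the gradient grows, which speeds up training but risks stepping past the minimum; a larger regularization value shrinks the weight more aggressively on every step.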
Two class learning problem
...