In this paper we address the problem of optimal parameter selection for a Multilayer Perceptron, using a neural network with a single hidden layer trained with the "back propagation" algorithm on relatively simple two-dimensional classification problems (input patterns with only two variables). We show graphically the direct relation between the increasing complexity of the class regions and the need to add more neurons to the hidden layer. Finally, we summarize our findings as parameter selection recommendations that help avoid the tedious and blind "trial and error" method.
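As a rough illustration of the kind of experiment the abstract describes, the sketch below sweeps the number of hidden neurons of a single-hidden-layer MLP trained with backpropagation on a simple two-dimensional classification problem and reports test accuracy for each size. This is not the authors' code: the dataset (scikit-learn's make_moons) and all hyperparameters are assumptions chosen only to make the example self-contained.

```python
# Illustrative sketch (not the authors' code): compare single-hidden-layer
# MLPs of increasing size on a two-variable classification problem, rather
# than tuning the hidden layer by blind trial and error.
from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Two-variable input patterns forming two non-linearly separable classes
# (an assumed stand-in for the paper's 2-D problems).
X, y = make_moons(n_samples=500, noise=0.2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# Train networks with an increasing number of hidden neurons and compare
# their generalization accuracy on held-out data.
for n_hidden in (1, 2, 4, 8, 16):
    mlp = MLPClassifier(hidden_layer_sizes=(n_hidden,),
                        activation="logistic",  # sigmoid units, as in classic backprop
                        solver="sgd",
                        learning_rate_init=0.1,
                        max_iter=2000,
                        random_state=0)
    mlp.fit(X_train, y_train)
    print(f"{n_hidden:2d} hidden neurons -> test accuracy "
          f"{mlp.score(X_test, y_test):.3f}")
```

Plotting the decision boundary for each trained network would give the kind of graphical comparison between region complexity and hidden-layer size that the paper discusses.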