Instead of comparing statistical procedures with artificial neural networks, the two can be united in a common twice-multilayered neuronet. Any well-performing regression or pattern-recognition procedure can be regarded as a single neuron whose output is one of the output variables indicated in the input data sample. Repeating this procedure for every other variable indicated in the data sample yields the first layer of a neuronet with active neurons.
The output variables are then added to the variables of the data sample; they serve as highly effective secondary inputs for the neurons of the next layer. Repeating this operation constructs the second, third, and subsequent layers of the neuronet. Layers should be added as long as the error criterion keeps decreasing. This approach reveals the principal advantage of neural networks: a statistical procedure can be offered the full set of variables only once, whereas in twice-multilayered neuronets a corrected full set of variables can be offered at every layer.
Neuronets with active neurons make it possible to optimize the regression space: the effective variables (factors) are chosen at each level of the neuronet. For a single active neuron, regression analysis should be used when the input data sample is accurate and long, and GMDH algorithms when the data are noisy or short. A neuronet with a collective of active neurons opens a new possibility: generating and selecting new combinations of inputs, which can greatly increase modeling accuracy by extending the regression space.
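The layer-growing procedure above can be sketched in code. This is a minimal illustration, not the authors' implementation: each active neuron is assumed to be a pairwise regression on a short Ivakhnenko-style polynomial, the external error criterion is assumed to be mean squared error on a held-out validation sample, and names such as `twice_multilayer` and the `width` parameter (how many neuron outputs survive into the variable set) are illustrative choices.

```python
import numpy as np

def design(X, i, j):
    # Short polynomial of a pair of variables: a*xi + b*xj + c*xi*xj + d
    return np.column_stack([X[:, i], X[:, j], X[:, i] * X[:, j], np.ones(len(X))])

def fit_neuron(XA, yA, i, j):
    # One active neuron: least-squares regression on the training sample
    w, *_ = np.linalg.lstsq(design(XA, yA := yA, i, j) if False else design(XA, i, j), yA, rcond=None)
    return w

def twice_multilayer(XA, yA, XB, yB, width=3, max_layers=10):
    """Grow layers of active neurons while the external (validation)
    error criterion keeps decreasing; each surviving neuron's output
    is appended to the variable set offered to the next layer."""
    VA, VB = XA.copy(), XB.copy()   # current variable sets (train / validation)
    best_err, history = np.inf, []
    for _ in range(max_layers):
        m = VA.shape[1]
        candidates = []
        for i in range(m):
            for j in range(i + 1, m):
                w = fit_neuron(VA, yA, i, j)
                err = np.mean((design(VB, i, j) @ w - yB) ** 2)
                candidates.append((err, i, j, w))
        candidates.sort(key=lambda c: c[0])
        if candidates[0][0] >= best_err:   # criterion stopped decreasing: stop
            break
        best_err = candidates[0][0]
        history.append(best_err)
        keep = candidates[:width]          # surviving neurons of this layer
        VA = np.column_stack([VA] + [design(VA, i, j) @ w for _, i, j, w in keep])
        VB = np.column_stack([VB] + [design(VB, i, j) @ w for _, i, j, w in keep])
    return best_err, history

# Synthetic example: a target no single pairwise neuron can fit in one layer
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = X[:, 0] * X[:, 1] + 0.5 * X[:, 2] + rng.normal(scale=0.05, size=200)
err, hist = twice_multilayer(X[:100], y[:100], X[100:], y[100:])
```

Because the first layer's best neuron captures only the interaction term, the second layer combines that neuron's output with the remaining variable, so the validation error in `hist` drops strictly from layer to layer until no further improvement is found.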
