Group method of data handling

Group method of data handling (GMDH) is a family of inductive algorithms for computer-based mathematical modeling of multi-parametric datasets that features fully automatic structural and parametric optimization of models.

GMDH is used in such fields as data mining, knowledge discovery, prediction, complex systems modeling, optimization and pattern recognition.

GMDH algorithms are characterized by an inductive procedure that sorts out gradually complicated polynomial models and selects the best solution by means of the so-called "external criterion".

A GMDH model with multiple inputs and one output is a subset of components of the "base function" (1):

: Y(x_1, \dots, x_n) = a_0 + \sum\limits_{i=1}^{m} a_i f_i

where "f" are elementary functions dependent on different sets of inputs, "a" are coefficients and "m" is the number of the base function components.

In order to find the best solution, GMDH algorithms consider various component subsets of the base function (1), called "partial models". Coefficients of these models are estimated by the least squares method. GMDH algorithms gradually increase the number of partial model components and find a model structure of optimal complexity, indicated by the minimum value of an "external criterion". This process is called self-organization of models.
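
For illustration, here is a minimal sketch of fitting and checking a single partial model. It assumes NumPy, a toy synthetic dataset and an arbitrary choice of two elementary functions; all of these specifics are illustrative assumptions, not part of the method's definition.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy data (hypothetical): 40 observations, 3 inputs.
    X = rng.normal(size=(40, 3))
    y = 1.0 + 2.0 * X[:, 0] - 0.5 * X[:, 1] * X[:, 2] + 0.1 * rng.normal(size=40)

    # Split into sample A (coefficient estimation) and sample B (external criterion).
    XA, XB = X[:20], X[20:]
    yA, yB = y[:20], y[20:]

    # Elementary functions f_i of one partial model (an assumed, simple choice).
    terms = [
        lambda X: X[:, 0],
        lambda X: X[:, 1] * X[:, 2],
    ]

    def design(X, terms):
        """Design matrix: a constant column (for a_0) plus the selected f_i."""
        return np.column_stack([np.ones(len(X))] + [f(X) for f in terms])

    # Coefficients are estimated on A by least squares ...
    coeffs, *_ = np.linalg.lstsq(design(XA, terms), yA, rcond=None)

    # ... and the model is scored by an external criterion computed on B only.
    criterion = np.sum((yB - design(XB, terms) @ coeffs) ** 2)
    print(coeffs, criterion)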

The most popular base function used in GMDH is the gradually complicated Kolmogorov-Gabor polynomial (2):

: Y(x_1, \dots, x_n) = a_0 + \sum\limits_{i=1}^{n} a_i x_i + \sum\limits_{i=1}^{n} \sum\limits_{j=i}^{n} a_{ij} x_i x_j + \sum\limits_{i=1}^{n} \sum\limits_{j=i}^{n} \sum\limits_{k=j}^{n} a_{ijk} x_i x_j x_k + \cdots
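
As a sketch of how the terms of polynomial (2) can be enumerated in code, each term can be identified by a non-decreasing tuple of input indices, mirroring the nested sums with j >= i and k >= j above. The function name and tuple representation are assumptions made for this example.

    from itertools import combinations_with_replacement

    def kolmogorov_gabor_terms(n_inputs, max_degree):
        """Yield index tuples of polynomial (2): () is the constant a_0,
        (i,) a linear term a_i x_i, (i, j) with i <= j a quadratic term
        a_ij x_i x_j, and so on up to max_degree."""
        for degree in range(max_degree + 1):
            yield from combinations_with_replacement(range(n_inputs), degree)

    # Two inputs, degree 2: (), (0,), (1,), (0, 0), (0, 1), (1, 1),
    # i.e. a_0 + a_1 x_1 + a_2 x_2 + a_11 x_1^2 + a_12 x_1 x_2 + a_22 x_2^2.
    print(list(kolmogorov_gabor_terms(2, 2)))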

GMDH is also known as polynomial neural networks and statistical learning networks, owing to the implementation of the corresponding algorithms in several commercial software products.

History

The method originated in 1968 with Prof. Alexey G. Ivakhnenko at the Institute of Cybernetics in Kyiv (Ukraine). The approach was computer-based from the very beginning, so a set of computer programs and algorithms was the primary practical result achieved on the basis of the new theoretical principles. Thanks to the author's policy of open code sharing, the method quickly took root in a large number of scientific laboratories worldwide. At that time code sharing was quite a physical act, since the Internet is at least 5 years younger than GMDH. Despite this fact, the first investigation of GMDH outside the Soviet Union was made soon after by R. Shankar in 1972. Later on, different GMDH variants were published by Japanese and Polish scientists.

The period 1968–1971 is characterized by application of the regularity criterion only, for solving problems of identification, pattern recognition and short-term forecasting. Polynomials, logical nets, fuzzy Zadeh sets and Bayes probability formulas were used as reference functions. The authors were stimulated by the very high forecasting accuracy of the new approach. Noise immunity was not yet investigated.

Period 1972–1975. The problem of modeling noised data and an incomplete information basis was solved. Multicriteria selection and utilization of additional a priori information for increasing noise immunity were proposed. The best experiments showed that, with an extended definition of the optimal model by an additional criterion, the noise level can be ten times greater than the signal. The result was then improved using Shannon's theorem of general communication theory.

Period 1976–1979. The convergence of multilayered GMDH algorithms was investigated. It was shown that some multilayered algorithms have a "multilayerness error", analogous to the static error of control systems. In 1977 a solution of objective systems analysis problems by multilayered GMDH algorithms was proposed. It turned out that sorting-out by an ensemble of criteria finds the only optimal system of equations, and thereby reveals the elements of a complex object and their main input and output variables.

Period 1980–1988. Many important theoretical results were obtained. It became clear that full physical models cannot be used for long-term forecasting. It was proved that the non-physical models of GMDH are more accurate for approximation and forecasting than the physical models of regression analysis. Two-level algorithms which use two different time scales for modeling were developed.

Since 1989, new algorithms (AC, OCC, PF) for non-parametric modeling of fuzzy objects and SLP for expert systems have been developed and investigated. The present stage of GMDH development can be described as a blossoming of twice-multilayered neural nets and parallel combinatorial algorithms for multiprocessor computers.

External criteria

The external criterion is one of the key features of GMDH. A criterion describes requirements to the model, for example minimization of least squares. It is always calculated on a separate part of the data sample that has not been used for estimation of coefficients. There are several popular criteria:

* Criterion of Regularity (CR) - least squares of a model on sample B.
* Criterion of Unbiasedness - the sum of the CR value and a special CR in which the roles of A and B are swapped. The ratio of sample lengths must be 1:1, i.e. the size of A must be the same as the size of B.

If a criterion does not define the number of observations for the external dataset, the problem of the data dividing ratio arises, because the forecasting abilities of the identified model depend strongly on the dividing ratio.
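
As a rough sketch, again assuming NumPy and linear partial models (both assumptions made only for illustration), the two criteria can be written as follows.

    import numpy as np

    def cr(X_fit, y_fit, X_check, y_check):
        """Criterion of Regularity: coefficients estimated by least squares
        on one sample, squared error measured on the other sample."""
        F_fit = np.column_stack([np.ones(len(X_fit)), X_fit])
        F_check = np.column_stack([np.ones(len(X_check)), X_check])
        coeffs, *_ = np.linalg.lstsq(F_fit, y_fit, rcond=None)
        return np.sum((y_check - F_check @ coeffs) ** 2)

    def unbiasedness(XA, yA, XB, yB):
        """Criterion of Unbiasedness: the CR value plus the special CR with
        the roles of A and B swapped; A and B must be of equal size."""
        return cr(XA, yA, XB, yB) + cr(XB, yB, XA, yA)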

GMDH-type neural networks

There are many different ways to choose an order in which partial models are considered. The very first consideration order used in GMDH, originally called the multilayered inductive procedure, remains the most popular one. It is a sorting-out of gradually complicated models generated from the Kolmogorov-Gabor polynomial. The best model is indicated by the minimum of the external criterion characteristic. The multilayered procedure is equivalent to an artificial neural network with a polynomial activation function of neurons. Therefore, the algorithm with such an approach is usually referred to as a GMDH-type neural network or polynomial neural network.
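
The following sketch of one such layer uses the classic two-input quadratic neuron; the layer width parameter and the NumPy-based implementation details are assumptions made for this example.

    import itertools
    import numpy as np

    def quad_features(xi, xj):
        """Feature matrix of a two-input quadratic neuron:
        y = a0 + a1*xi + a2*xj + a3*xi*xj + a4*xi^2 + a5*xj^2."""
        return np.column_stack([np.ones_like(xi), xi, xj, xi * xj, xi**2, xj**2])

    def gmdh_layer(XA, yA, XB, yB, width):
        """One layer of the multilayered procedure: fit a quadratic partial
        model on sample A for every pair of inputs, rank the candidates by
        squared error on sample B, and pass the outputs of the best `width`
        neurons on as inputs to the next layer."""
        candidates = []
        for i, j in itertools.combinations(range(XA.shape[1]), 2):
            FA = quad_features(XA[:, i], XA[:, j])
            FB = quad_features(XB[:, i], XB[:, j])
            coeffs, *_ = np.linalg.lstsq(FA, yA, rcond=None)
            err = np.sum((yB - FB @ coeffs) ** 2)
            candidates.append((err, FA @ coeffs, FB @ coeffs))
        candidates.sort(key=lambda c: c[0])
        best = candidates[:width]
        XA_next = np.column_stack([out_A for _, out_A, _ in best])
        XB_next = np.column_stack([out_B for _, _, out_B in best])
        return best[0][0], XA_next, XB_next  # best criterion value, next-layer inputs

Under these assumptions, layers are stacked, each taking the surviving neuron outputs as its inputs, until the best external criterion value stops improving.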

Combinatorial GMDH

Another important approach to partial models consideration, which is becoming increasingly popular, is a brute-force combinatorial search, either limited or exhaustive. This approach has some advantages over polynomial neural networks, but requires considerable computational power and thus is not effective for objects with more than 30 inputs in the case of full search. An important achievement of Combinatorial GMDH is that it fully outperforms the linear regression approach if the noise level in the input data is greater than zero.

The basic combinatorial algorithm performs the following steps (a minimal code sketch follows the list):

* Divides the data sample into parts A and B.
* Generates structures for partial models.
* Estimates coefficients of partial models using the least squares method and sample A.
* Calculates the value of the external criterion for partial models using sample B.
* Chooses the best model (or set of models) indicated by the minimal value of the criterion.
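
Here is a minimal sketch of these steps for a linear base function with full search over component subsets; the NumPy representation and the function name are illustrative assumptions.

    import itertools
    import numpy as np

    def combi(XA, yA, XB, yB):
        """Basic Combinatorial (COMBI) GMDH: exhaustive search over all
        subsets of linear base-function components; the constant term a_0
        is always kept."""
        best = (np.inf, None, None)
        n = XA.shape[1]
        for r in range(n + 1):
            for subset in itertools.combinations(range(n), r):
                FA = np.column_stack([np.ones(len(XA))] + [XA[:, k] for k in subset])
                FB = np.column_stack([np.ones(len(XB))] + [XB[:, k] for k in subset])
                coeffs, *_ = np.linalg.lstsq(FA, yA, rcond=None)  # fit on A
                crit = np.sum((yB - FB @ coeffs) ** 2)            # score on B
                if crit < best[0]:                                # keep the best
                    best = (crit, subset, coeffs)
        return best

Under these assumptions, the returned triple holds the minimal criterion value, the selected component subset and its coefficients.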

In contrast to GMDH-type neural networks, the Combinatorial algorithm cannot be stopped at a certain level of complexity, because a point of increase of the criterion value can be simply a local minimum.

Algorithms

* Combinatorial (COMBI)
* Multilayered Iterative (MIA)
* GN
* Objective System Analysis (OSA)
* Harmonical
* Two-level (ARIMAD)
* Multiplicative-Additive (MAA)
* Objective Computer Clusterization (OCC)
* Pointing Finger (PF) clusterization algorithm
* Analogues Complexing (AC)
* Harmonical Rediscretization
* Algorithm on the base of Multilayered Theory of Statistical Decisions (MTSD)
* Group of Adaptive Models Evolution (GAME)

Bibliography

* A.G. Ivakhnenko. "Heuristic Self-Organization in Problems of Engineering Cybernetics". Automatica 6: pp.207–219, 1970.
* A.G. Ivakhnenko. "Polynomial Theory of Complex System". IEEE Trans. on Systems, Man and Cybernetics, Vol. SMC-1, No. 4, Oct. 1971, pp. 364-378.
* S.J. Farlow. "Self-Organizing Methods in Modelling: GMDH Type Algorithms". New-York, Bazel: Marcel Decker Inc., 1984, 350 p.
* H.R. Madala, A.G. Ivakhnenko. "Inductive Learning Algorithms for Complex Systems Modeling". CRC Press, Boca Raton, 1994.

List of software

* [http://www.knowledgeminer.net/aboutkm.htm KnowledgeMiner] - Commercial product.
* [http://pnn.pnnsoft.com/index.html PNN Discovery client] - Commercial product (PNNSoft).
* [http://neuron.felk.cvut.cz/game/project.html FAKE GAME Project] - Open source.
* [http://www.icyb.kiev.ua/~kai/oleksiy/gmdh/index.htm Parallel COMBI] - Freeware.

External links

* [http://www.gmdh.net www.gmdh.net] - Articles, books and software.
* [http://opengmdh.org opengmdh.org] - GMDH wiki and code development

