"Iterative principles of recognition in probabilistic neural networks"@en . "0893-6080" . . "Iterativn\u00ED principy rozpozn\u00E1v\u00E1n\u00ED v pravd\u011Bpodobnostn\u00EDch neuronov\u00FDch s\u00EDt\u00EDch"@cs . . "[88EC0A76EA66]" . "Hora, Jan" . "When considering the probabilistic approach to neural networks in the framework of statistical pattern recognition we assume approximation of class-conditional probability distributions by finite mixtures of product components. The mixture components can be interpreted as probabilistic neurons in neurophysiological terms and, in this respect, the fixed probabilistic description contradicts the well known short-term dynamic properties of biological neurons. By introducing iterative schemes of recognition we show that some parameters of probabilistic neural networks can be /released/ for the sake of dynamic processes without disturbing the statistically correct decision making. In particular, we can iteratively adapt the mixture component weights or modify the input pattern in order to facilitate correct recognition. Both procedures are shown to converge monotonically as a special case of the well known EM algorithm for estimating mixtures." . . "RIV/67985556:_____/08:00311199!RIV09-GA0-67985556" . . "373447" . "000259846600006" . . . . . . "2"^^ . "Iterative principles of recognition in probabilistic neural networks"@en . "Iterative principles of recognition in probabilistic neural networks" . . . "RIV/67985556:_____/08:00311199" . "GB - Spojen\u00E9 kr\u00E1lovstv\u00ED Velk\u00E9 Brit\u00E1nie a Severn\u00EDho Irska" . "Iterativn\u00ED principy rozpozn\u00E1v\u00E1n\u00ED v pravd\u011Bpodobnostn\u00EDch neuronov\u00FDch s\u00EDt\u00EDch"@cs . . "10"^^ . "Neural Networks" . . . "2"^^ . "21" . . "P(1M0572), P(GA102/07/1594), Z(AV0Z10750506)" . "Pravd\u011Bpodobnostn\u00ED p\u0159\u00EDstup pat\u0159\u00ED k nejnov\u011Bj\u0161\u00EDm metod\u00E1m n\u00E1vrhu neuronov\u00FDch s\u00EDt\u00ED. 
Z\u00E1kladn\u00ED paradigma pravd\u011Bpodobnostn\u00EDho p\u0159\u00EDstupu je jin\u00E9 ne\u017E v p\u0159\u00EDpad\u011B standardn\u00EDch metod. N\u00E1vrh \u201Eklasick\u00E9\u201C neuronov\u00E9 s\u00EDt\u011B zpravidla vych\u00E1z\u00ED z form\u00E1ln\u00EDho modelu neuronu a p\u0159edpokl\u00E1d\u00E1 n\u011Bjak\u00FD zp\u016Fsob propojen\u00ED neuron\u016F v s\u00EDti. Adaptace neuronov\u00E9 s\u00EDt\u011B pro dan\u00FD \u00FA\u010Del (rozpozn\u00E1v\u00E1n\u00ED vstupn\u00EDch objekt\u016F, aproximaci v\u00FDstupn\u00ED funkce a pod.) prob\u00EDh\u00E1 na z\u00E1klad\u011B n\u011Bjak\u00E9ho algoritmu u\u010Den\u00ED, kter\u00FD je navr\u017Een heuristicky, nebo je odvozen z vhodn\u011B zvolen\u00E9ho kriteria optim\u00E1ln\u00ED funkce s\u00EDt\u011B."@cs . . . "6" . "When considering the probabilistic approach to neural networks in the framework of statistical pattern recognition we assume approximation of class-conditional probability distributions by finite mixtures of product components. The mixture components can be interpreted as probabilistic neurons in neurophysiological terms and, in this respect, the fixed probabilistic description contradicts the well known short-term dynamic properties of biological neurons. By introducing iterative schemes of recognition we show that some parameters of probabilistic neural networks can be /released/ for the sake of dynamic processes without disturbing the statistically correct decision making. In particular, we can iteratively adapt the mixture component weights or modify the input pattern in order to facilitate correct recognition. Both procedures are shown to converge monotonically as a special case of the well known EM algorithm for estimating mixtures."@en . . . . "Iterative principles of recognition in probabilistic neural networks" . . "Grim, Ji\u0159\u00ED" . "Probabilistic neural networks; Distribution mixtures; EM algorithm; Recognition of numerals; Recurrent reasoning"@en .
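The English abstract describes iteratively adapting the mixture component weights for a fixed input pattern, with monotone convergence as a special case of the EM algorithm. The following is a minimal illustrative sketch of that kind of weight iteration, not the authors' actual implementation: the component density values for the pattern are held fixed, and each step replaces the weights by the component posteriors. All function and variable names here are assumptions chosen for illustration.

```python
import numpy as np

def iterate_weights(component_densities, x, w, n_iter=50):
    """Iteratively adapt mixture weights w for a fixed input pattern x.

    Each step replaces w_m by the posterior w_m * f_m(x) / sum_j w_j f_j(x),
    an EM-style update; the mixture value sum_m w_m f_m(x) is non-decreasing
    over the iterations. Illustrative sketch only, not the paper's code.
    """
    # Component density values at x are computed once and held fixed.
    f = np.array([fm(x) for fm in component_densities])
    log_lik = []
    for _ in range(n_iter):
        mix = np.dot(w, f)               # current mixture value at x
        log_lik.append(np.log(mix))
        w = w * f / mix                  # posterior becomes the new weight
    return w, log_lik

# Hypothetical example: two one-dimensional Gaussian-shaped components.
fs = [lambda x: np.exp(-0.5 * (x - 0.0) ** 2),
      lambda x: np.exp(-0.5 * (x - 3.0) ** 2)]
w_final, ll = iterate_weights(fs, x=2.5, w=np.array([0.5, 0.5]))
```

In this toy run the weights concentrate on the component nearest the pattern (here the one centred at 3.0), while the logged mixture values increase monotonically, mirroring the convergence property stated in the abstract.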