. "1"^^ . "Artificial Neural Networks" . " Learning Algorithms" . "http://www.isvav.cz/projectDetail.do?rowId=IAB1030006"^^ . "Na z\u00E1klad\u011B Kolmogorovovy v\u011Bty a p\u0159\u00EDbuzn\u00FDch v\u00FDsledk\u016F byly vytvo\u0159eny nov\u00E9 u\u010D\u00EDc\u00ED algoritmy pro dop\u0159edn\u00E9 neuronov\u00E9 s\u00EDt\u011B. Byla ur\u010Dena jejich slo\u017Eitost a navr\u017Eeno n\u011Bkolik mo\u017Enost\u00ED efektivn\u00ED paraleln\u00ED realizace na clusterech pracovn\u00EDch stanic."@cs . "10"^^ . . "10"^^ . . "V\u011Bt\u0161ina u\u010D\u00EDc\u00EDch algoritm\u016F pro dop\u0159edn\u00E9 neuronov\u00E9 s\u00EDt\u011B je zalo\u017Eena na gradientn\u00EDch metod\u00E1ch pou\u017E\u00EDvan\u00FDch p\u0159i neline\u00E1rn\u00ED optimalizaci. P\u0159esto\u017Ee t\u011Bmito metodami lze \u00FAsp\u011B\u0161n\u011B \u0159e\u0161it re\u00E1ln\u00E9 probl\u00E9my, maj\u00ED n\u011Bkter\u00E1 podstatn\u00E1 omezen\u00ED (numerick\u00E1 nestabilita, \u010Dasov\u00E1 a komunika\u010Dn\u00ED n\u00E1ro\u010Dnost). V posledn\u00EDch letech vzniklo n\u011Bkolik teoretick\u00FDch v\u00FDsledk\u016F o aproxima\u010Dn\u00EDch vlastnostech neuronov\u00FDch s\u00EDt\u00ED, ale doposud se v\u011Bnovala mal\u00E1 pozornost v\u00FDvoji nov\u00FDch u\u010D\u00EDc\u00EDch algoritm\u016F zalo\u017Een\u00FDch na t\u011Bchto v\u00FDsledc\u00EDch. Na z\u00E1klad\u011B podrobn\u00E9anal\u00FDzy v\u011Bt a d\u016Fkazov\u00E9ho apar\u00E1tu autor\u016F Kolmogorova, Sprechera, K\u016Frkov\u00E9, Leshna, Mhaskara a dal\u0161\u00EDch navrhneme alternativn\u00ED u\u010D\u00EDc\u00ED procedury pro dop\u0159edn\u00E9 neuronov\u00E9 s\u00EDt\u011B. U navr\u017Een\u00FDch algoritm\u016F budeme studovat jejich teoretick\u00E9, numerick\u00E9 a v\u00FDpo\u010Detn\u00ED vlastnosti. Chov\u00E1n\u00ED algoritm\u016F vyzkou\u0161\u00EDme na standardn\u00EDch testovac\u00EDch probl\u00E9mech." . . . . . . . . "2004-10-27+02:00"^^ . . . . "Alternativn\u00ED u\u010D\u00EDc\u00ED algoritmy pro dop\u0159edn\u00E9 neuronov\u00E9 s\u00EDt\u011B" . . . . "Artificial Neural Networks; Learning Algorithms; Parallel Computing"@en . . "Alternative learning procedures for reedforward neural networks"@en . . "0"^^ . "The majority of currently used learning algortihms for feedforward networks is based on gradient descent methods of non-linear optimization. They have proven successfull in solving real-world problems, nevertheless they are known to suffer from several shortcomings (numerical instability, time demands, communication overhead). Several theoretical results concerning the approximation power of neural networks have been established in the last decade, but a little effort has benn made to use these results-based on functinal approximation theory-for proposal of novel learning procedures. We plan to carefully study the results and proof techniques of Kolmogorov, Sprecher, K\u016Frkov\u00E1, Leshno, Mhaskar and others to derive alternative learning procedures for feedforward networks. Numerical and computational properties of these algortihms will be studied by means of theory and experiments on standard benchmarks."@en . "IAB1030006" . "0"^^ .