"2008-01-01+01:00"^^ . . "2010-04-16+02:00"^^ . "2010-12-31+01:00"^^ . . . . "complexity of neural networks; learning from data; nonlinear approximation; kernel methods"@en . "The goal of the project is to contribute to theoretical understanding which properties make networks with perceptron and kernel computational units efficient and flexible tools for learning from high-dimensional data. Estimates of model complexity of networks will be derived in terms of smoothness and oscillatory properties of data, their dimension and type of an activation function or a kernel. By inspection of these estimates, measures of data complexity with respect to various types of computational units will be proposed and\u00A0characterized in terms of special norms tailored to perceptrons and kernels. Relationships of these norms to norms defined in terms of smoothness (such as Sobolev and Bessel norms) will be described. Estimates of network complexity will be derived using tools from nonlinear approximation and optimization theory. On the basis of constructive proof techniques fuzzy rules will be derived and learning algorithms will be proposed, analyzed, implemented and tested on"@en . "The project contributed to the development of mathematical theory of perceptron and kernel networks and theory of their learning. Estimates of network complexity were derived in depenedence on smoothness and oscillatory properties of data, their dimension, and type of an activation function or a kernel. By inspection of these estimates, measures of data complexity with respect to various typ"@en . . "complexity of neural networks" . " learning from data" . . . " nonlinear approximation" . "http://www.isvav.cz/projectDetail.do?rowId=GA201/08/1744"^^ . . . . "GA201/08/1744" . "Complexity of perceptron and kernel networks"@en . "20"^^ . . "20"^^ . . . "2015-02-09+01:00"^^ . "Slo\u017Eitost perceptronov\u00FDch a j\u00E1drov\u00FDch s\u00EDt\u00ED" . "0"^^ . . . . . "1"^^ . "C\u00EDlem projektu je p\u0159isp\u011Bt k teoretick\u00E9mu porozumn\u011Bn\u00ED vlastnostem\u00A0neuronov\u00FDch s\u00EDt\u00ED s perceptronov\u00FDmi a j\u00E1drov\u00FDmi\u00A0jednotkami, kter\u00E9 zp\u016Fsobuj\u00ED, \u017Ee tyto s\u00EDt\u011B \u00A0jsou efektivn\u00ED a flexibiln\u00ED n\u00E1stroje pro u\u010Den\u00ED na z\u00E1klad\u011B\u00A0 vysokodimenzion\u00E1ln\u00EDch dat.\u00A0 Budou odvozeny odhady slo\u017Eitosti s\u00EDt\u00ED v z\u00E1vislosti na hladkosti a oscila\u010Dn\u00EDch vlastnostech dat, jejich dimenzi a typu aktiva\u010Dn\u00ED funkce nebo j\u00E1dra. Na z\u00E1klad\u011B anal\u00FDzy t\u011Bchto odhad\u016F budou navr\u017Eeny m\u00EDry slo\u017Eitosti dat vzhledem k r\u016Fzn\u00FDm typ\u016Fm v\u00FDpo\u010Detn\u00EDch jednotek.Tyto m\u00EDry budou charakterizov\u00E1ny pomoc\u00ED speci\u00E1ln\u00EDch norem a budou pops\u00E1ny vztahy t\u011Bchto norem k norm\u00E1m modeluj\u00EDc\u00EDm hladkost (Sobolevovy a Besselovy normy). Pro odhady\u00A0slo\u017Eitosti s\u00EDt\u00ED budou pou\u017Eity\u00A0metody z teorie neline\u00E1rn\u00ED aproximace a optimalizace. Na z\u00E1klad\u011B konstrukt\u00EDvn\u00EDch\u00A0d\u016Fkazov\u00FDch technik budou odvozena fuzzy pravidla a budou navr\u017Eeny, implementov\u00E1ny a testov\u00E1ny algoritmy u\u010Den\u00ED." . . "0"^^ . "Projekt p\u0159isp\u011Bl k rozvoji matematick\u00E9 teorie perceptronov\u00FDch a j\u00E1drov\u00FDch s\u00EDt\u00ED a teorii jejich u\u010Den\u00ED.  
Byly odvozeny odhady slo\u017Eitosti s\u00EDt\u00ED v z\u00E1vislosti na hladkosti a oscila\u010Dn\u00EDch vlastnostech dat, jejich dimenzi a typu aktiva\u010Dn\u00ED funkce a j\u00E1dra. Na z\u00E1klad\u011B anal\u00FDzy t\u011Bchto odhad\u016F byly navr\u017Eeny m\u00EDry slo\u017Eitosti dat vzhledem k r\u016Fzn\u00FDm typ\u016Fm v\u00FDpo\u010Detn\u00EDch jednotek. Byly porovn\u00E1ny slo\u017Eitostn\u00ED n\u00E1roky ne"@cs . .
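
The abstract refers to complexity estimates expressed through norms tailored to perceptron and kernel units. As an illustration only (not a result taken from this record), the LaTeX sketch below states the standard Maurey–Jones–Barron-type bound in terms of G-variation, which is the usual form such estimates take; the set G, the bound s_G, and the norm ‖·‖_G are assumptions of the sketch, not fields of the record.

% Illustration only: a standard Maurey--Jones--Barron-type complexity estimate
% in terms of G-variation (a norm tailored to a set G of computational units);
% all symbols here are assumptions of this sketch, not data from the record.
\[
  \bigl\| f - \operatorname{span}_n G \bigr\|
  \;\le\;
  \frac{s_G \, \|f\|_{G}}{\sqrt{n}},
  \qquad
  s_G = \sup_{g \in G} \|g\| .
\]
% Here G is a bounded subset of a Hilbert space formed by the functions
% computable by single units (e.g., perceptrons or kernel units),
% span_n G is the set of functions computable by networks with at most
% n such units, and \|f\|_G is the G-variation norm of f. Read as a
% complexity estimate, it says that roughly (s_G \|f\|_G / \varepsilon)^2
% units suffice for approximation accuracy \varepsilon, with a rate that
% does not depend explicitly on the input dimension.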