"RIV/67985556:_____/09:00315684" . . . "2"^^ . "[8DD570FEA618]" . "Andr\u00FDsek, Josef" . "23" . "Use of Kullback\u2013Leibler divergence for forgetting" . "2"^^ . . "RIV/67985556:_____/09:00315684!RIV09-GA0-67985556" . . "GB - Spojen\u00E9 kr\u00E1lovstv\u00ED Velk\u00E9 Brit\u00E1nie a Severn\u00EDho Irska" . . . . . . . . . "Non-symmetric Kullback\u2013Leibler divergence (KLD) measures proximity of probability density functions (pdfs). Bernardo (Ann. Stat. 1979; 7(3):686\u2013690) had shown its unique role in approximation of pdfs. The order of the KLD arguments is also implied by his methodological result. Functional approximation of estimation and stabilized forgetting, serving for tracking of slowly varying parameters, use the reversed order. This choice has the pragmatic motivation: recursive estimator often approximates the parametric model by a member of exponential family (EF) as it maps prior pdfs from the set of conjugate pdfs (CEF) back to the CEF. Approximations based on the KLD with the reversed order of arguments preserves this property. In the paper, the approximation performed within the CEF but with the proper order of arguments of the KLD is advocated. It is applied to the parameter tracking and performance improvements are demonstrated."@en . "Use of Kullback\u2013Leibler divergence for forgetting"@en . "K\u00E1rn\u00FD, Miroslav" . "1" . "Use of Kullback\u2013Leibler divergence for forgetting" . . . "International Journal of Adaptive Control and Signal Processing" . "Bayesian estimation; Kullback\u2013Leibler divergence; functional approximation of estimation; parameter tracking by stabilized forgetting; ARX model"@en . "0890-6327" . "15"^^ . "Non-symmetric Kullback\u2013Leibler divergence (KLD) measures proximity of probability density functions (pdfs). Bernardo (Ann. Stat. 1979; 7(3):686\u2013690) had shown its unique role in approximation of pdfs. The order of the KLD arguments is also implied by his methodological result. 
Functional approximation of estimation and stabilized forgetting, serving for tracking of slowly varying parameters, use the reversed order. This choice has the pragmatic motivation: recursive estimator often approximates the parametric model by a member of exponential family (EF) as it maps prior pdfs from the set of conjugate pdfs (CEF) back to the CEF. Approximations based on the KLD with the reversed order of arguments preserves this property. In the paper, the approximation performed within the CEF but with the proper order of arguments of the KLD is advocated. It is applied to the parameter tracking and performance improvements are demonstrated." . . "347892" . . . . "Use of Kullback\u2013Leibler divergence for forgetting"@en . . "Pou\u017Eit\u00ED Kullback\u2013Leibler divergence pro zapom\u00EDn\u00E1n\u00ED"@cs . "Pou\u017Eit\u00ED Kullback\u2013Leibler divergence pro zapom\u00EDn\u00E1n\u00ED"@cs . "P(1M0572), P(2C06001), P(GA102/08/0567), Z(AV0Z10750506)" . . . "Nesymetrick\u00E1 Kullback-Leiblerova divergence (KLD) m\u011B\u0159\u00ED bl\u00EDzkost pravd\u011Bpodobnostn\u00EDch hustot. D\u00E1 se uk\u00E1zat, \u017Ee jedna z jejich verz\u00ED je teoreticky lep\u0161\u00ED. \u010Cl\u00E1nek popisuje vyu\u017Eit\u00ED t\u00E9to skute\u010Dnosti ke zlep\u0161en\u00ED techniky zapom\u00EDn\u00E1n\u00ED."@cs . .
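The abstract turns on two points: the KLD is non-symmetric, so the order of its arguments matters, and forgetting performed inside a conjugate family keeps the estimator tractable. The sketch below (illustrative only, not code from the paper; the function names are hypothetical) uses the well-known closed form of the KLD between univariate Gaussians to show the asymmetry, and a λ-weighted geometric combination of a posterior and an alternative pdf as the usual "stabilized forgetting" operation that maps a Gaussian back to a Gaussian.

```python
import math

def kl_gauss(mu_p, sd_p, mu_q, sd_q):
    """KL(p || q) for univariate Gaussians p = N(mu_p, sd_p^2), q = N(mu_q, sd_q^2)."""
    return (math.log(sd_q / sd_p)
            + (sd_p**2 + (mu_p - mu_q)**2) / (2.0 * sd_q**2)
            - 0.5)

def geometric_forgetting(mu_post, sd_post, mu_alt, sd_alt, lam):
    """Stabilized forgetting sketch: the lam-weighted geometric mean of the
    posterior and an alternative pdf. For Gaussians the result is again
    Gaussian, with precisions (inverse variances) combining linearly."""
    prec = lam / sd_post**2 + (1.0 - lam) / sd_alt**2
    mu = (lam * mu_post / sd_post**2 + (1.0 - lam) * mu_alt / sd_alt**2) / prec
    return mu, math.sqrt(1.0 / prec)

# Non-symmetry: swapping the arguments changes the value in general.
d_pq = kl_gauss(0.0, 1.0, 1.0, 2.0)  # KL(p || q)
d_qp = kl_gauss(1.0, 2.0, 0.0, 1.0)  # KL(q || p), differs from d_pq

# Forgetting flattens the posterior toward the alternative as lam decreases.
mu_f, sd_f = geometric_forgetting(2.0, 0.5, 0.0, 3.0, lam=0.9)
```

Keeping the forgetting step inside the conjugate family is exactly the pragmatic motivation the abstract mentions: the flattened pdf can be fed straight back into the recursive Bayesian update without any further approximation.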