This HTML5 document contains 51 embedded RDF statements represented using HTML+Microdata notation.

The embedded RDF content will be recognized by any processor of HTML5 Microdata.
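Below is a minimal sketch of how the embedded Microdata items could be extracted programmatically. It assumes the page has been saved locally as page.html and that the third-party extruct library is installed (pip install extruct); the file name and base URL are illustrative placeholders, not part of this page.

    # Sketch only: extract the HTML5 Microdata items embedded in this page.
    # Assumes a local copy of the page (page.html) and the `extruct` library.
    import extruct

    with open("page.html", encoding="utf-8") as f:
        html = f.read()

    # Restrict extraction to the Microdata syntax; base_url is used to resolve
    # relative item IRIs (placeholder value, adjust to the actual page location).
    data = extruct.extract(
        html,
        base_url="http://linked.opendata.cz/",
        syntaxes=["microdata"],
    )

    for item in data["microdata"]:
        print(item.get("type"), sorted(item.get("properties", {}).keys()))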

Namespace Prefixes

Prefix    IRI
n21       http://linked.opendata.cz/ontology/domain/vavai/riv/typAkce/
dcterms   http://purl.org/dc/terms/
n16       http://purl.org/net/nknouf/ns/bibtex#
n11       http://linked.opendata.cz/resource/domain/vavai/riv/tvurce/
n8        http://linked.opendata.cz/resource/domain/vavai/projekt/
n17       http://linked.opendata.cz/ontology/domain/vavai/
n20       http://linked.opendata.cz/resource/domain/vavai/zamer/
n19       https://schema.org/
s         http://schema.org/
skos      http://www.w3.org/2004/02/skos/core#
n4        http://linked.opendata.cz/ontology/domain/vavai/riv/
n12       http://linked.opendata.cz/resource/domain/vavai/vysledek/RIV%2F67985556%3A_____%2F10%3A00348710%21RIV11-MSM-67985556/
n2        http://linked.opendata.cz/resource/domain/vavai/vysledek/
rdf       http://www.w3.org/1999/02/22-rdf-syntax-ns#
n6        http://linked.opendata.cz/ontology/domain/vavai/riv/klicoveSlovo/
n18       http://linked.opendata.cz/ontology/domain/vavai/riv/duvernostUdaju/
xsdh      http://www.w3.org/2001/XMLSchema#
n14       http://linked.opendata.cz/ontology/domain/vavai/riv/jazykVysledku/
n7        http://linked.opendata.cz/ontology/domain/vavai/riv/aktivita/
n13       http://linked.opendata.cz/ontology/domain/vavai/riv/druhVysledku/
n10       http://linked.opendata.cz/ontology/domain/vavai/riv/obor/
n9        http://reference.data.gov.uk/id/gregorian-year/

Statements

Subject Item
n2:RIV%2F67985556%3A_____%2F10%3A00348710%21RIV11-MSM-67985556
rdf:type
skos:Concept n17:Vysledek
dcterms:description
We point out a problem inherent in the optimization scheme of many popular feature selection methods. It follows from the implicit assumption that higher feature selection criterion value always indicates more preferable subset even if the value difference is marginal. This assumption ignores the reliability issues of particular feature preferences, overfitting and feature acquisition cost. We propose an algorithmic extension applicable to many standard feature selection methods allowing better control over feature subset preference. We show experimentally that the proposed mechanism is capable of reducing the size of selected subsets as well as improving classifier generalization.
dcterms:title
The Problem of Fragile Feature Subset Preference in Feature Selection Methods and A Proposal of Algorithmic Workaround
skos:prefLabel
The Problem of Fragile Feature Subset Preference in Feature Selection Methods and A Proposal of Algorithmic Workaround
skos:notation
RIV/67985556:_____/10:00348710!RIV11-MSM-67985556
n4:aktivita
n7:Z n7:P
n4:aktivity
P(1M0572), P(2C06019), P(GA102/07/1594), P(GA102/08/0593), Z(AV0Z10750506)
n4:dodaniDat
n9:2011
n4:domaciTvurceVysledku
n11:4788575 n11:6617972 n11:5728525
n4:druhVysledku
n13:D
n4:duvernostUdaju
n18:S
n4:entitaPredkladatele
n12:predkladatel
n4:idSjednocenehoVysledku
282138
n4:idVysledku
RIV/67985556:_____/10:00348710
n4:jazykVysledku
n14:eng
n4:klicovaSlova
feature selection; machine learning; over-fitting; classification; feature weights; weighted features; feature acquisition cost
n4:klicoveSlovo
n6:machine%20learning n6:over-fitting n6:feature%20acquisition%20cost n6:feature%20weights n6:weighted%20features n6:classification n6:feature%20selection
n4:kontrolniKodProRIV
[F3D86A04B48F]
n4:mistoKonaniAkce
Istanbul
n4:mistoVydani
Istanbul
n4:nazevZdroje
Proc. 2010 Int. Conf. on Pattern Recognition
n4:obor
n10:BD
n4:pocetDomacichTvurcuVysledku
3
n4:pocetTvurcuVysledku
3
n4:projekt
n8:1M0572 n8:GA102%2F07%2F1594 n8:2C06019 n8:GA102%2F08%2F0593
n4:rokUplatneniVysledku
n9:2010
n4:tvurceVysledku
Pudil, Pavel; Somol, Petr; Grim, Jiří
n4:typAkce
n21:WRD
n4:zahajeniAkce
2010-08-23+02:00
n4:zamer
n20:AV0Z10750506
s:numberOfPages
4
n16:hasPublisher
IEEE Computer Society
n19:isbn
978-0-7695-4109-9
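
The same statements could also be retrieved from a SPARQL endpoint rather than scraped from this page. The sketch below assumes the dataset is exposed at http://linked.opendata.cz/sparql (an assumption, not stated on this page) and uses the third-party SPARQLWrapper library (pip install SPARQLWrapper); the subject IRI is the n2-prefixed resource listed above, written out in full.

    # Sketch only: list all properties of this result resource via SPARQL.
    # The endpoint URL is an assumption; the subject IRI comes from the page.
    from SPARQLWrapper import SPARQLWrapper, JSON

    ENDPOINT = "http://linked.opendata.cz/sparql"  # assumed endpoint
    SUBJECT = (
        "http://linked.opendata.cz/resource/domain/vavai/vysledek/"
        "RIV%2F67985556%3A_____%2F10%3A00348710%21RIV11-MSM-67985556"
    )

    sparql = SPARQLWrapper(ENDPOINT)
    sparql.setQuery(f"SELECT ?p ?o WHERE {{ <{SUBJECT}> ?p ?o }}")
    sparql.setReturnFormat(JSON)

    results = sparql.query().convert()
    for binding in results["results"]["bindings"]:
        print(binding["p"]["value"], binding["o"]["value"])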