This HTML5 document contains 48 embedded RDF statements represented using HTML+Microdata notation.

The embedded RDF content will be recognized by any processor of HTML5 Microdata.
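As an illustration of what a Microdata-aware processor would extract from this page, the following is a minimal sketch assuming Python with the rdflib library; the subject, predicates and values are copied from the Statements section below, and rdflib itself is not part of this page.

# Minimal sketch (assuming Python with rdflib) of the kind of triples a
# Microdata processor would extract from this document. The values are
# copied from the Statements section below.
from rdflib import Graph, Literal, URIRef
from rdflib.namespace import DCTERMS, SKOS

g = Graph()
subject = URIRef(
    "http://linked.opendata.cz/resource/domain/vavai/vysledek/"
    "RIV%2F68407700%3A21230%2F12%3A00195563%21RIV13-MSM-21230___"
)
g.add((subject, DCTERMS.title,
       Literal("Distance Measures for HyperGP with Fitness Sharing")))
g.add((subject, SKOS.notation,
       Literal("RIV/68407700:21230/12:00195563!RIV13-MSM-21230___")))

# Serialize the two statements as Turtle for inspection.
print(g.serialize(format="turtle"))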

Namespace Prefixes

Prefix    IRI
n15       http://linked.opendata.cz/ontology/domain/vavai/riv/typAkce/
dcterms   http://purl.org/dc/terms/
n18       http://purl.org/net/nknouf/ns/bibtex#
n13       http://localhost/temp/predkladatel/
n8        http://linked.opendata.cz/resource/domain/vavai/riv/tvurce/
n20       http://linked.opendata.cz/resource/domain/vavai/subjekt/
n19       http://linked.opendata.cz/ontology/domain/vavai/
n17       https://schema.org/
s         http://schema.org/
rdfs      http://www.w3.org/2000/01/rdf-schema#
skos      http://www.w3.org/2004/02/skos/core#
n3        http://linked.opendata.cz/ontology/domain/vavai/riv/
n22       http://bibframe.org/vocab/
n2        http://linked.opendata.cz/resource/domain/vavai/vysledek/
rdf       http://www.w3.org/1999/02/22-rdf-syntax-ns#
n9        http://linked.opendata.cz/resource/domain/vavai/vysledek/RIV%2F68407700%3A21230%2F12%3A00195563%21RIV13-MSM-21230___/
n4        http://linked.opendata.cz/ontology/domain/vavai/riv/klicoveSlovo/
n11       http://linked.opendata.cz/ontology/domain/vavai/riv/duvernostUdaju/
xsdh      http://www.w3.org/2001/XMLSchema#
n23       http://linked.opendata.cz/ontology/domain/vavai/riv/jazykVysledku/
n14       http://linked.opendata.cz/ontology/domain/vavai/riv/aktivita/
n21       http://linked.opendata.cz/ontology/domain/vavai/riv/druhVysledku/
n12       http://linked.opendata.cz/ontology/domain/vavai/riv/obor/
n10       http://reference.data.gov.uk/id/gregorian-year/
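As a concrete illustration of how these prefixes are used, the following is a minimal sketch in plain Python (no dependencies) showing how the prefixed names that appear in the Statements section expand to full IRIs; the PREFIXES dict and the expand helper are illustrative only and not part of this page.

# Sketch: expand prefixed names from the Statements section into full IRIs
# using the prefix table above (only a few prefixes are shown here).
PREFIXES = {
    "dcterms": "http://purl.org/dc/terms/",
    "skos": "http://www.w3.org/2004/02/skos/core#",
    "n3": "http://linked.opendata.cz/ontology/domain/vavai/riv/",
    "n19": "http://linked.opendata.cz/ontology/domain/vavai/",
}

def expand(curie: str) -> str:
    # Split a prefixed name such as "n3:druhVysledku" and resolve it
    # against the table above.
    prefix, local = curie.split(":", 1)
    return PREFIXES[prefix] + local

assert expand("dcterms:title") == "http://purl.org/dc/terms/title"
assert expand("n3:druhVysledku") == (
    "http://linked.opendata.cz/ontology/domain/vavai/riv/druhVysledku"
)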

Statements

Subject Item
n2:RIV%2F68407700%3A21230%2F12%3A00195563%21RIV13-MSM-21230___
rdf:type
skos:Concept n19:Vysledek
rdfs:seeAlso
http://dl.acm.org/citation.cfm?id=2330241
dcterms:description
In this paper we propose a new algorithm called HyperGPEFS (HyperGP with Explicit Fitness Sharing). It is based on HyperNEAT, a well-established evolutionary method employing indirect encoding of artificial neural networks. Indirect encoding in HyperNEAT is realized via a special function called a Compositional Pattern Producing Network (CPPN), able to describe a neural network of arbitrary size. CPPNs are represented by network structures, which are evolved by means of a slightly modified version of another well-known algorithm, NEAT (NeuroEvolution of Augmenting Topologies). HyperGP is a variant of HyperNEAT in which the CPPNs are optimized by Genetic Programming (GP). Published results reported a promising improvement in the speed of convergence. Our approach further extends HyperGP by using fitness sharing to promote diversity of the population. Here, we thoroughly compare all three algorithms on six different tasks. Fitness sharing demands a definition of a tree distance measure. Alongside five other measures, we propose a generalized distance measure which, in conjunction with HyperGPEFS, significantly outperforms HyperNEAT and HyperGP on all but one of the testing problems. Although this paper focuses on indirect encoding, the proposed distance measures are generally applicable. (A generic sketch of explicit fitness sharing is given after the statements listing below.)
dcterms:title
Distance Measures for HyperGP with Fitness Sharing
skos:prefLabel
Distance Measures for HyperGP with Fitness Sharing
skos:notation
RIV/68407700:21230/12:00195563!RIV13-MSM-21230___
n19:predkladatel
n20:orjk%3A21230
n3:aktivita
n14:I
n3:aktivity
I
n3:dodaniDat
n10:2013
n3:domaciTvurceVysledku
n8:7035586 n8:9121870
n3:druhVysledku
n21:D
n3:duvernostUdaju
n11:S
n3:entitaPredkladatele
n9:predkladatel
n3:idSjednocenehoVysledku
131563
n3:idVysledku
RIV/68407700:21230/12:00195563
n3:jazykVysledku
n23:eng
n3:klicovaSlova
artificial neural networks; fitness sharing; gp; hypergp; hyperneat; indirect encodings; tree distance measures
n3:klicoveSlovo
n4:artificial%20neural%20networks n4:indirect%20encodings n4:tree%20distance%20measures n4:gp n4:hypergp n4:hyperneat n4:fitness%20sharing
n3:kontrolniKodProRIV
[7A81F11873B7]
n3:mistoKonaniAkce
Philadelphia
n3:mistoVydani
New York
n3:nazevZdroje
Proceedings of the fourteenth international conference on Genetic and evolutionary computation conference companion
n3:obor
n12:IN
n3:pocetDomacichTvurcuVysledku
2
n3:pocetTvurcuVysledku
2
n3:rokUplatneniVysledku
n10:2012
n3:tvurceVysledku
Šnorek, Miroslav Drchal, Jan
n3:typAkce
n15:WRD
n3:wos
000309611100069
n3:zahajeniAkce
2012-07-07+02:00
s:numberOfPages
8
n22:doi
10.1145/2330163.2330241
n18:hasPublisher
ACM
n17:isbn
978-1-4503-1177-9
n13:organizacniJednotka
21230
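The dcterms:description above summarizes explicit fitness sharing driven by a tree distance measure. The following is a minimal, generic sketch of that idea in Python; the tree_distance callable and the sigma_share and alpha parameters are placeholders and do not correspond to the specific distance measures proposed in the paper.

from typing import Callable, List, Sequence

def shared_fitness(
    raw_fitness: Sequence[float],
    population: Sequence[object],
    tree_distance: Callable[[object, object], float],
    sigma_share: float = 1.0,
    alpha: float = 1.0,
) -> List[float]:
    # Explicit fitness sharing: divide each individual's raw fitness by its
    # niche count, i.e. how crowded its neighbourhood is under the given
    # tree distance. Similar individuals thus share their fitness, which
    # promotes diversity in the population.
    shared = []
    for fit, individual in zip(raw_fitness, population):
        niche_count = 0.0
        for other in population:
            d = tree_distance(individual, other)
            if d < sigma_share:
                niche_count += 1.0 - (d / sigma_share) ** alpha
        # niche_count >= 1, since each individual's distance to itself is 0.
        shared.append(fit / niche_count)
    return shared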