About: What Is Decreased by the Max-sum Arc Consistency Algorithm?

An Entity of Type: http://linked.opendata.cz/ontology/domain/vavai/Vysledek, within Data Space: linked.opendata.cz, associated with source document(s)

Attributes / Values
rdf:type
Description
  • Inference tasks in Markov random fields (MRFs) are closely related to the constraint satisfaction problem (CSP) and its soft generalizations. In particular, MAP inference in an MRF is equivalent to the weighted (max-sum) CSP. A well-known tool for tackling CSPs is the family of arc consistency algorithms, also known as relaxation labeling. A promising approach to MAP inference in MRFs is linear programming relaxation solved by sequential tree-reweighted message passing (TRW-S). A less widely known algorithm equivalent to TRW-S is max-sum diffusion, which is slower but very simple. We give two theoretical results. First, we show that arc consistency algorithms and max-sum diffusion become the same thing when formulated in an abstract-algebraic way; we therefore argue that "max-sum arc consistency algorithm" or "max-sum relaxation labeling" is a more suitable name for max-sum diffusion. Second, we give a criterion that strictly decreases during these algorithms. (en)
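  For readers unfamiliar with max-sum diffusion as referenced in the abstract, the following is a minimal sketch of one diffusion sweep on a pairwise max-sum (weighted CSP) problem. It is not taken from the paper: the data layout, the function name diffusion_sweep, and the toy example are illustrative assumptions. Each update is a reparametrization that equalizes a unary quality theta_u(x_u) with the maximum of the corresponding pairwise pencil, which leaves the max-sum objective unchanged while (typically) lowering the standard upper bound computed from the maxima of the reparametrized qualities.

  ```python
  # Minimal sketch of max-sum diffusion (assumed layout, not the paper's code).
  import numpy as np

  def diffusion_sweep(theta_u, theta_uv):
      """One sweep of max-sum diffusion.

      theta_u  : dict  node -> 1-D array of unary qualities over labels
      theta_uv : dict  (u, v) -> 2-D array of pairwise qualities; rows are
                 labels of u, columns are labels of v (u < v by convention)
      For each pencil (u, (u, v), x_u) the update equalizes theta_u[x_u]
      with max over x_v of theta_uv[x_u, x_v]; this is an equivalent
      transformation of the max-sum problem.
      """
      for (u, v), T in theta_uv.items():
          # pencils rooted at u: rows of T
          row_max = T.max(axis=1)
          avg = (theta_u[u] + row_max) / 2.0
          theta_uv[(u, v)] = T - row_max[:, None] + avg[:, None]
          theta_u[u] = avg.copy()
          # pencils rooted at v: columns of T
          T = theta_uv[(u, v)]
          col_max = T.max(axis=0)
          avg = (theta_u[v] + col_max) / 2.0
          theta_uv[(u, v)] = T - col_max[None, :] + avg[None, :]
          theta_u[v] = avg.copy()
      return theta_u, theta_uv

  # toy example: two variables with two labels each
  theta_u = {0: np.array([0.0, 1.0]), 1: np.array([2.0, 0.0])}
  theta_uv = {(0, 1): np.array([[1.0, 0.0], [0.0, 3.0]])}
  for _ in range(50):
      theta_u, theta_uv = diffusion_sweep(theta_u, theta_uv)
  # upper bound on the max-sum objective after reparametrization
  bound = (sum(a.max() for a in theta_u.values())
           + sum(T.max() for T in theta_uv.values()))
  print(bound)
  ```

  The quantity printed at the end is the usual upper bound obtained by maximizing each reparametrized quality independently; the paper's contribution is a criterion that strictly decreases during such iterations, which this sketch does not reproduce.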
Title
  • What Is Decreased by the Max-sum Arc Consistency Algorithm? (en)
skos:prefLabel
  • What Is Decreased by the Max-sum Arc Consistency Algorithm? (en)
skos:notation
  • RIV/68407700:21230/07:03134582!RIV09-MSM-21230___
http://linked.open...avai/riv/aktivita
http://linked.open...avai/riv/aktivity
  • Z(MSM6840770038)
http://linked.open...vai/riv/dodaniDat
http://linked.open...aciTvurceVysledku
http://linked.open.../riv/druhVysledku
http://linked.open...iv/duvernostUdaju
http://linked.open...titaPredkladatele
http://linked.open...dnocenehoVysledku
  • 461236
http://linked.open...ai/riv/idVysledku
  • RIV/68407700:21230/07:03134582
http://linked.open...riv/jazykVysledku
http://linked.open.../riv/klicovaSlova
  • Markov random field; arc consistency; constraint satisfaction and optimisation; undirected graphical model (en)
http://linked.open.../riv/klicoveSlovo
http://linked.open...ontrolniKodProRIV
  • [77C66120A486]
http://linked.open...v/mistoKonaniAkce
  • Corvallis
http://linked.open...i/riv/mistoVydani
  • New York
http://linked.open...i/riv/nazevZdroje
  • ICML 2007: Proceedings of the 24th International Conference on Machine Learning
http://linked.open...in/vavai/riv/obor
http://linked.open...ichTvurcuVysledku
http://linked.open...cetTvurcuVysledku
http://linked.open...UplatneniVysledku
http://linked.open...iv/tvurceVysledku
  • Werner, Tomáš
http://linked.open...vavai/riv/typAkce
http://linked.open.../riv/zahajeniAkce
http://linked.open...n/vavai/riv/zamer
issn
  • 1053-587X
number of pages
http://purl.org/ne...btex#hasPublisher
  • ACM
https://schema.org/isbn
  • 978-1-59593-793-3
http://localhost/t...ganizacniJednotka
  • 21230
is http://linked.open...avai/riv/vysledek of