| Attributes | Values |
|---|---|
| rdf:type | |
| Description | Language models (LMs) are essential components of many applications such as speech recognition or machine translation. LMs factorize the probability of a string of words into a product of P(w_i\|h_i), where h_i is the context (history) of word w_i. Most LMs use the previous words as the context. The paper presents two alternative approaches: post-ngram LMs (which use the following words as context) and dependency LMs (which exploit the dependency structure of a sentence and can use e.g. the governing word as context). Dependency LMs could be useful whenever the topology of a dependency tree is available but its lexical labels are unknown, e.g. in tree-to-tree machine translation. Compared with a baseline interpolated trigram LM, both approaches achieve significantly lower perplexity for all seven tested languages (Arabic, Catalan, Czech, English, Hungarian, Italian, Turkish). (en) |
| Title | Perplexity of n-gram and Dependency Language Models (en) |
| skos:prefLabel | Perplexity of n-gram and Dependency Language Models (en) |
| skos:notation | RIV/00216208:11320/10:10078034!RIV11-GA0-11320___ |
| http://linked.open...avai/riv/aktivita | |
| http://linked.open...avai/riv/aktivity | |
| http://linked.open...iv/cisloPeriodika | |
| http://linked.open...vai/riv/dodaniDat | |
| http://linked.open...aciTvurceVysledku | |
| http://linked.open.../riv/druhVysledku | |
| http://linked.open...iv/duvernostUdaju | |
| http://linked.open...titaPredkladatele | |
| http://linked.open...dnocenehoVysledku | |
| http://linked.open...ai/riv/idVysledku | RIV/00216208:11320/10:10078034 |
| http://linked.open...riv/jazykVysledku | |
| http://linked.open.../riv/klicovaSlova | models; language; dependency; gram; perplexity (en) |
| http://linked.open.../riv/klicoveSlovo | |
| http://linked.open...odStatuVydavatele | DE - Federal Republic of Germany |
| http://linked.open...ontrolniKodProRIV | |
| http://linked.open...i/riv/nazevZdroje | Lecture Notes in Computer Science |
| http://linked.open...in/vavai/riv/obor | |
| http://linked.open...ichTvurcuVysledku | |
| http://linked.open...cetTvurcuVysledku | |
| http://linked.open...vavai/riv/projekt | |
| http://linked.open...UplatneniVysledku | |
| http://linked.open...v/svazekPeriodika | |
| http://linked.open...iv/tvurceVysledku | Mareček, David; Popel, Martin |
| issn | |
| number of pages | |
| http://localhost/t...ganizacniJednotka | |
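
The factorization described in the abstract — the probability of a word string as a product of P(w_i|h_i), with perplexity used as the evaluation metric — can be illustrated with a minimal sketch. This is not the paper's post-ngram or dependency model; it assumes a toy corpus and an add-one-smoothed bigram model (context h_i = the single previous word) purely to show how perplexity is computed from the factorized probabilities.

```python
import math
from collections import Counter

def train_bigram(corpus):
    """Count unigrams and bigrams over sentences padded with <s> and </s>."""
    uni, bi, vocab = Counter(), Counter(), set()
    for sent in corpus:
        toks = ["<s>"] + sent + ["</s>"]
        vocab.update(toks)
        uni.update(toks)
        bi.update(zip(toks, toks[1:]))
    return uni, bi, vocab

def prob(w, h, uni, bi, V):
    """Add-one-smoothed P(w | h), where the context h is the previous word."""
    return (bi[(h, w)] + 1) / (uni[h] + V)

def perplexity(corpus, uni, bi, V):
    """Perplexity = exp(-(1/N) * sum_i log P(w_i | h_i))."""
    log_p, n = 0.0, 0
    for sent in corpus:
        toks = ["<s>"] + sent + ["</s>"]
        for h, w in zip(toks, toks[1:]):
            log_p += math.log(prob(w, h, uni, bi, V))
            n += 1
    return math.exp(-log_p / n)

# Toy usage: train and evaluate on the same two-sentence corpus.
corpus = [["the", "cat", "sat"], ["the", "dog", "sat"]]
uni, bi, vocab = train_bigram(corpus)
print(perplexity(corpus, uni, bi, len(vocab)))
```

A lower perplexity means the model assigns higher probability to the evaluation text; the paper's comparison of post-ngram and dependency LMs against an interpolated trigram baseline uses this same quantity, just with richer contexts h_i.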