| Attributes | Values |
|---|---|
| Description | Recently, a new so-called energy complexity measure has been introduced and studied for feedforward perceptron networks. This measure is inspired by the fact that biological neurons require more energy to transmit a spike than not to fire, and the activity of neurons in the brain is quite sparse, with only about 1% of neurons firing. In this paper, we investigate the energy complexity of recurrent networks, which counts the number of active neurons at any time instant of a computation. We prove that any deterministic finite automaton with m states can be simulated by a neural network of optimal size s=\Theta(\sqrt{m}) with a time overhead of \tau=O(s/e) per input bit, using energy O(e), for any e such that e=\Omega(\log s) and e=O(s), which shows the time-energy tradeoff in recurrent networks. In addition, for a time overhead \tau satisfying \tau^\tau=o(s), we obtain a lower bound of s^{c/\tau} on the energy of such a simulation, for some constant c>0 and for infinitely many s. (en) |
| Title | Energy Complexity of Recurrent Neural Networks (en) |
| skos:prefLabel | Energy Complexity of Recurrent Neural Networks (en) |
| skos:notation | RIV/67985807:_____/14:00393985!RIV15-GA0-67985807 |
| http://linked.open...ai/riv/idVysledku | RIV/67985807:_____/14:00393985 |
| http://linked.open.../riv/klicovaSlova | neural network; finite automaton; energy complexity; optimal size (en) |
| http://linked.open...odStatuVydavatele | US - United States of America |
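The time-energy tradeoff stated in the abstract (size s = \Theta(\sqrt{m}), time overhead \tau = O(s/e) per input bit for energy budget e with \Omega(\log s) <= e <= O(s)) can be illustrated numerically. The sketch below is not from the paper: it takes all constants hidden by the Theta/O notation to be 1, purely to show how the bounds scale.

```python
import math

def tradeoff(m: int, e: int):
    """Constant-free illustration of the bounds in the abstract:
    a DFA with m states -> network of size s ~ sqrt(m); under an
    energy budget e (with log s <= e <= s), the time overhead per
    input bit is tau ~ s / e.  Hidden constants are taken as 1,
    for illustration only."""
    s = math.isqrt(m)                 # optimal size s = Theta(sqrt(m))
    assert math.log2(s) <= e <= s, "need e = Omega(log s) and e = O(s)"
    tau = s / e                       # time overhead tau = O(s/e)
    return s, tau

# Example: m = 10_000 states gives s = 100 neurons.
# A small energy budget e = 10 costs tau = 10 steps per input bit;
# the full budget e = 100 brings the overhead down to tau = 1.
print(tradeoff(10_000, 10))
print(tradeoff(10_000, 100))
```

Halving the energy budget doubles the per-bit time overhead, which is the tradeoff the paper quantifies.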