About: Natural user interface as a supplement of the holographic Raman tweezers

An entity of type http://linked.opendata.cz/ontology/domain/vavai/Vysledek, within data space linked.opendata.cz, associated with source document(s)

Attributes | Values
rdf:type
Description
  • Holographic Raman tweezers (HRT) manipulate microobjects by controlling the positions of multiple optical traps via a mouse or joystick. Several recent attempts have instead exploited touch tablets, 2D cameras or the Kinect game console. We proposed a multimodal “Natural User Interface” (NUI) approach integrating hand tracking, gesture recognition, eye tracking and speech recognition. For this purpose we exploited the low-cost “Leap Motion” and “MyGaze” sensors and a simple speech-recognition program, “Tazti”. We developed our own NUI software, which processes the signals from the sensors and sends control commands to the HRT, which in turn controls the positions of the trapping beams, the micropositioning stage and the acquisition system for the Raman spectra. The system allows various modes of operation suited to specific tasks. Virtual tools (called “pin” and “tweezers”) serving for the manipulation of particles are displayed on a transparent “overlay” window above the live camera image. The eye tracker identifies the position of the observed particle and uses it for autofocus. Laser-trap manipulation navigated by the dominant hand can be combined with gesture recognition of the secondary hand. Speech-command recognition is useful when both hands are busy. The proposed methods make manual control of the HRT more efficient and also provide a good platform for its future semi-automated and fully automated operation. (A sketch of the sensor-to-command dispatch described here is given below.) (en)
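The abstract describes an architecture in which low-cost sensors (a hand tracker, an eye tracker and a speech recognizer) feed a NUI program that translates their signals into HRT commands: trap positions, autofocus on the observed particle, tool selection and spectrum acquisition. The Python sketch below is a minimal, hypothetical illustration of such a dispatch loop; the event classes, command names and the HRTController interface are assumptions made for illustration and do not reproduce the authors' software or the Leap Motion, MyGaze or Tazti SDKs.

```python
# Hypothetical sketch of a multimodal NUI -> HRT dispatch loop.
# All class and method names are illustrative assumptions, not the
# authors' actual software or any vendor SDK.

from dataclasses import dataclass
from typing import Iterable


@dataclass
class HandEvent:          # dominant-hand position from the hand tracker
    x: float
    y: float
    z: float


@dataclass
class GestureEvent:       # secondary-hand gesture, e.g. "pinch" or "open"
    name: str


@dataclass
class GazeEvent:          # on-screen gaze point from the eye tracker
    u: float
    v: float


@dataclass
class SpeechEvent:        # recognized voice command, e.g. "acquire spectrum"
    phrase: str


class HRTController:
    """Stand-in for the interface to the holographic Raman tweezers."""

    def move_trap(self, x: float, y: float, z: float) -> None:
        print(f"trap -> ({x:.2f}, {y:.2f}, {z:.2f})")

    def autofocus_at(self, u: float, v: float) -> None:
        print(f"autofocus on particle near ({u:.0f}, {v:.0f}) px")

    def acquire_spectrum(self) -> None:
        print("acquiring Raman spectrum")

    def toggle_tool(self, tool: str) -> None:
        print(f"virtual tool switched to '{tool}'")


def dispatch(events: Iterable[object], hrt: HRTController) -> None:
    """Translate incoming sensor events into HRT control commands."""
    for ev in events:
        if isinstance(ev, HandEvent):
            hrt.move_trap(ev.x, ev.y, ev.z)      # dominant hand steers the trap
        elif isinstance(ev, GestureEvent):
            hrt.toggle_tool("tweezers" if ev.name == "pinch" else "pin")
        elif isinstance(ev, GazeEvent):
            hrt.autofocus_at(ev.u, ev.v)         # gaze point drives autofocus
        elif isinstance(ev, SpeechEvent) and ev.phrase == "acquire spectrum":
            hrt.acquire_spectrum()               # hands-free acquisition


if __name__ == "__main__":
    sample = [HandEvent(1.0, 2.0, 0.5),
              GazeEvent(412, 230),
              GestureEvent("pinch"),
              SpeechEvent("acquire spectrum")]
    dispatch(sample, HRTController())
```

In a real setup the sample event list would be replaced by callbacks from the sensor SDKs, but the overall structure, i.e. independent sensor streams merged into one command dispatcher in front of the HRT, matches the multimodal design outlined in the abstract.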
Title
  • Natural user interface as a supplement of the holographic Raman tweezers (en)
skos:prefLabel
  • Natural user interface as a supplement of the holographic Raman tweezers (en)
skos:notation
  • RIV/68081731:_____/14:00434981!RIV15-AV0-68081731
http://linked.open...avai/riv/aktivita
http://linked.open...avai/riv/aktivity
  • I, P(LO1212)
http://linked.open...vai/riv/dodaniDat
http://linked.open...aciTvurceVysledku
http://linked.open.../riv/druhVysledku
http://linked.open...iv/duvernostUdaju
http://linked.open...titaPredkladatele
http://linked.open...dnocenehoVysledku
  • 31728
http://linked.open...ai/riv/idVysledku
  • RIV/68081731:_____/14:00434981
http://linked.open...riv/jazykVysledku
http://linked.open.../riv/klicovaSlova
  • Holography; Interfaces; Cameras; Eye; Particles; Sensors; Speech recognition; Gesture recognition; Tablets; Software (en)
http://linked.open.../riv/klicoveSlovo
http://linked.open...ontrolniKodProRIV
  • [63BF9BE3E9E5]
http://linked.open...v/mistoKonaniAkce
  • San Diego
http://linked.open...i/riv/mistoVydani
  • Bellingham
http://linked.open...i/riv/nazevZdroje
  • Optical Trapping and Optical Micromanipulation XI (Proceedings of SPIE 9164)
http://linked.open...in/vavai/riv/obor
http://linked.open...ichTvurcuVysledku
http://linked.open...cetTvurcuVysledku
http://linked.open...vavai/riv/projekt
http://linked.open...UplatneniVysledku
http://linked.open...iv/tvurceVysledku
  • Zemánek, Pavel
  • Kaňka, Jan
  • Bernatová, Silvie
  • Jákl, Petr
  • Šerý, Mojmír
  • Tomori, Z.
  • Antalík, M.
  • Kesa, P.
http://linked.open...vavai/riv/typAkce
http://linked.open.../riv/zahajeniAkce
issn
  • 0277-786X
number of pages
http://bibframe.org/vocab/doi
  • 10.1117/12.2061024
http://purl.org/ne...btex#hasPublisher
  • SPIE
https://schema.org/isbn
  • 9781628411911