Multimodal HCI Integration

1999-01-5509

10/19/1999

Event
World Aviation Congress & Exposition
Authors
M. Vassiliou, V. Sundareswaran, S. Chen, K. Wang
Abstract
A multipurpose test-bed for integrating user interface and sensor technologies has been developed, based on a client-server architecture. Various interaction modalities (speech recognition, 3-D audio, pointing, wireless handheld-PC-based control and interaction, sensor interaction, etc.) are implemented as servers, encapsulating and exposing commercial and research software packages. The system allows for integrated user interaction with large and small displays using speech commands combined with pointing, spatialized audio, and other modalities. Simultaneous and independent speech recognition for two users is supported; users may be equipped with conventional acoustic or new body-coupled microphones.
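
As a rough illustration of the modality-as-server pattern the abstract describes, the sketch below (not taken from the paper; the class names, port number, and JSON message format are assumptions) wraps a single simulated speech modality behind a small TCP server and shows an integration client reading its events.

```python
# Minimal sketch of the "modality as server" idea: each interaction modality
# runs as a small TCP server that wraps an underlying recognition package and
# streams events to the integration client. Names, port, and message format
# are illustrative assumptions, not the paper's actual protocol.

import json
import socket
import socketserver
import threading


class SpeechModalityServer(socketserver.ThreadingTCPServer):
    """Encapsulates a (here simulated) speech recognizer behind a TCP interface."""
    allow_reuse_address = True


class SpeechHandler(socketserver.StreamRequestHandler):
    def handle(self):
        # In the real system this would pull results from a commercial or
        # research speech-recognition package; here we emit one fixed event.
        event = {"modality": "speech", "user": 1, "command": "select target"}
        self.wfile.write((json.dumps(event) + "\n").encode("utf-8"))


def run_server(port=9000):
    server = SpeechModalityServer(("127.0.0.1", port), SpeechHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server


def integration_client(port=9000):
    """Connects to a modality server and consumes one event."""
    with socket.create_connection(("127.0.0.1", port)) as sock:
        event = json.loads(sock.makefile().readline())
        # A real integration client would fuse this with pointing,
        # spatialized-audio, and sensor events from other modality servers.
        print(f"user {event['user']} issued '{event['command']}' via {event['modality']}")


if __name__ == "__main__":
    srv = run_server()
    integration_client()
    srv.shutdown()
```

Running the script starts the modality server in a background thread and prints the fused command on the client side; additional modalities would simply be further servers speaking the same event format.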
Details
DOI
https://doi.org/10.4271/1999-01-5509
Pages
7
Citation
Vassiliou, M., Sundareswaran, V., Chen, S., and Wang, K., "Multimodal HCI Integration," SAE Technical Paper 1999-01-5509, 1999, https://doi.org/10.4271/1999-01-5509.
Additional Details
Publisher
SAE International
Published
Oct 19, 1999
Product Code
1999-01-5509
Content Type
Technical Paper
Language
English