eCite Digital Repository
Adapting spoken and visual output for a pedestrian navigation system, based on given situational statements
Citation
Wasinger, R and Oliver, D and Heckmann, D and Braun, B and Brandherm, B and Stahl, C, Adapting spoken and visual output for a pedestrian navigation system, based on given situational statements, Proceedings of Workshop on Adaptivity and User Modelling in Interactive Software Systems (ABIS), 2003, Karlsruhe, Germany, pp. 343-346. (2003) [Refereed Conference Paper]
PDF not available (1Mb)
Copyright Statement
Copyright 2003 the Authors & ABIS
Official URL: http://km.aifb.kit.edu/ws/LLWA/abis/wasinger.pdf
Abstract
As mobile devices become more and more complex, there is an increasing desire for these devices to adapt to their users. This paper identifies parameters for different input sources (user, device and environment), and the parameters of media output (speech, graphics, sound and text), that may be modified to tailor user presentation in a pedestrian navigation system. We also provide an initial insight into some of the causal relationships between our input and output parameters, with a specific focus on the effects that speech can contribute to the presentation of media output.
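As a rough illustration of the kind of input-to-output mapping the abstract describes, the sketch below maps situational input parameters (user, device, environment) to a choice of output media (speech, graphics, sound, text). All parameter names and adaptation rules here are hypothetical assumptions for illustration, not the authors' actual model.

```python
# Hypothetical sketch (not the paper's implementation): adapting output
# media from situational input parameters, in the spirit of the abstract.
from dataclasses import dataclass


@dataclass
class Situation:
    # Input parameters grouped as in the abstract: user, device, environment.
    user_walking: bool          # user parameter (assumed)
    screen_small: bool          # device parameter (assumed)
    ambient_noise_high: bool    # environment parameter (assumed)


def choose_output(s: Situation) -> dict:
    """Return which output media (speech, graphics, sound, text) to favour."""
    return {
        # Speech frees the eyes while walking, but loses value in noise.
        "speech": s.user_walking and not s.ambient_noise_high,
        # Graphics suit larger, glanceable displays.
        "graphics": not s.screen_small,
        # Reading text is awkward while walking.
        "text": not s.user_walking,
        # Short non-speech sound cues remain usable in most situations.
        "sound": True,
    }


# Example: a walking user with a small-screen device in a quiet environment.
print(choose_output(Situation(user_walking=True,
                              screen_small=True,
                              ambient_noise_high=False)))
```

The design choice of returning a per-modality decision, rather than a single "best" modality, mirrors the abstract's framing that several output parameters may be modified in combination.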
Item Details
Item Type: | Refereed Conference Paper
---|---
Research Division: | Information and Computing Sciences
Research Group: | Information Systems
Research Field: | Computer-Human Interaction
Objective Division: | Information and Communication Services
Objective Group: | Computer Software and Services
Objective Field: | Application Software Packages (excl. Computer Games)
UTAS Author: | Wasinger, R (Dr Rainer Wasinger)
ID Code: | 90186
Year Published: | 2003
Deposited By: | Information and Communication Technology
Deposited On: | 2014-03-27
Last Modified: | 2014-10-08
Downloads: | 1