Generating speech user interfaces from interaction acts
Number of Authors: 3
2005 (English) Report (Refereed)
We have applied interaction acts, an abstract user-service interaction specification, to speech user interfaces to investigate how well the specification lends itself to a new type of user interface. We used interaction acts to generate a VoiceXML-based speech user interface, and identified two main issues connected to the differences between graphical user interfaces and speech user interfaces. The first issue concerns the structure of the user interface: generating speech user interfaces and GUIs from the same underlying structure easily results in a speech user interface that is too hierarchical and difficult to use. The second issue is user input: interpreting spoken user input is fundamentally different from handling user input in GUIs. We have shown that it is possible to generate speech user interfaces based on interaction acts. A small user study supports the results.
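The generation step described in the abstract, rendering an abstract interaction act as VoiceXML markup, can be sketched roughly as below. This is a hypothetical illustration only: the class name `InteractionAct`, its fields, and the mapping rules are assumptions for the sake of example, not the report's actual specification or API.

```python
# Hypothetical sketch of mapping an interaction act to a VoiceXML <field>.
# "InteractionAct" and its attributes are illustrative assumptions,
# not the actual interaction-acts specification from the report.
from dataclasses import dataclass, field
from xml.sax.saxutils import escape

@dataclass
class InteractionAct:
    kind: str                       # e.g. "select" or "input"
    prompt: str                     # text to be spoken to the user
    options: list = field(default_factory=list)

def to_vxml_field(act: InteractionAct) -> str:
    """Render one interaction act as a VoiceXML <field> element."""
    lines = [f'<field name="{escape(act.kind)}">',
             f'  <prompt>{escape(act.prompt)}</prompt>']
    if act.options:
        # Flatten a selection into one grammar of spoken alternatives
        # instead of nesting sub-menus, to avoid the overly hierarchical
        # dialogue structure the report identifies as a problem.
        alts = " | ".join(escape(o) for o in act.options)
        lines.append(f'  <grammar>[{alts}]</grammar>')
    lines.append('</field>')
    return "\n".join(lines)

act = InteractionAct("select", "Which service do you want?",
                     ["news", "weather"])
print(to_vxml_field(act))
```

Flattening selections into a single grammar, rather than mirroring the GUI's nested structure, is one plausible way to address the first issue the abstract raises.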
Place, publisher, year, edition, pages
Swedish Institute of Computer Science, 2005, 1st ed., 10 p.
SICS Technical Report, ISSN 1100-3154 ; 2005:13
Speech user interfaces, device independence, interaction acts, the Ubiquitous Interactor
Computer and Information Science
Identifiers
URN: urn:nbn:se:ri:diva-22093
OAI: oai:DiVA.org:ri-22093
DiVA: diva2:1041635