The Third International Conference on Advances in Computer-Human Interactions

ACHI 2010

February 10-16, 2010 - St. Maarten, Netherlands Antilles


Tutorials

T1

Model a Discourse and Transform it in Your User Interface
Prof. Dr. Hermann Kaindl, Vienna University of Technology, Austria

T2

Exploring Sensory Substitution Techniques: Crossmodal Audio-Tactile displays - Using the skin to hear
Dr. Maria Karam, Ryerson University, Canada

DETAILS

T1

Model a Discourse and Transform it in Your User Interface
Prof. Dr. Hermann Kaindl, Vienna University of Technology, Austria

Every Web application needs a user interface, and today often several of them, adapted for different devices (PCs, PDAs, mobile phones). Developing a user interface is difficult and expensive, since it normally requires both design and implementation; developing several user interfaces for different devices is more expensive still.

This tutorial shows how human-computer interaction can be based on discourse modeling, even without employing speech or natural language. Our discourse models are derived from results of Human Communication theories, Cognitive Science and Sociology. Such discourse models can specify an interaction design. This tutorial also demonstrates how such an interaction design can be used for automatic generation of user interfaces and for linking them to the application logic and the domain of discourse (much like in a recently accepted tool demo at IUI'09).

Prerequisite knowledge

The assumed attendee background is primarily an interest in designing interactions and user interfaces, especially for Web applications. No prior knowledge of Human Communication theories, Cognitive Science, Sociology, or HCI in general is required.


T2

Exploring Sensory Substitution Techniques: Crossmodal Audio-Tactile displays - Using the skin to hear
Dr. Maria Karam, Ryerson University, Canada

This tutorial will run for three hours, beginning with an introduction of the Emoti-Chair as a research tool for facilitating the creation of vibrotactile music. This will take place in the first half hour of the tutorial, when participants will have the chance to examine the system and try it for themselves.

Next, we will discuss and explain the basic theory behind the MHC system, addressing topics that influence our ability to feel music: the tactile mechanoreceptors stimulated by sound vibrations, their concentration in different locations on the body, and the limits of what we can feel according to current research. There will then be a short break.

At the start of the second hour, we will present the processes involved in translating music onto the Emoti-Chair, highlighting the many configurable features that can be used to experiment with different ways of presenting sound to the body as vibrations. During the second half of this hour, participants will have the opportunity to examine the software and hardware used to present vibrotactile music on the chair, and to become familiar with the system, its processes, and its capabilities in translating music to vibration.

After another short break, we will begin the hands-on portion of the tutorial, which will be based on group activities that allow participants to compose and experience tactile music.

In the music composition group, participants will be given a collection of digital and analog musical instruments that they can use to explore the chair and the different types of vibrations that sounds can induce. Participants will also be encouraged to explore the frequency distribution bands, tuning the chair to provide effective translations of different instruments and their signal ranges. We will be using a distributed version of the MHC, which allows eight individuals to work on separate channels of the Emoti-Chair, creating vibrations and sounds that can then be combined into a full vibrotactile composition.

A second group will be given the same type of system, only this time they can experience existing music through their own iPods, or our iTunes music library, via the MHC. They will be asked to experiment with the different settings on the chair, including frequency distribution, volume, and sounds. Each participant will have the chance to select their own frequency split and try it on different types of music. We will also ask members of the group to experience the vibrations while wearing headphones that play masking noise to block out the audio signal of the chair. This will give everyone the opportunity to experience the system as a deaf person might, offering an important perspective on the potential benefits the chair can offer people who are deaf.

By the end of the tutorial, all participants will have had the opportunity to learn about sensory substitution of music as touch, to create and experience vibrotactile music, and to gain an understanding of the MHC from first-hand exploration of both creating and feeling vibrotactile music.



Copyright (c) 2006-2010, IARIA