Talking about Tactile Experiences

Marianna Obrist (1), Sue Ann Seah (2), Sriram Subramanian (2)

(1) School of Computing Science, Newcastle University, UK; [email protected]

(2) Department of Computer Science, University of Bristol, UK; {s.a.seah, sriram.subramanian}@bristol.ac.uk

ABSTRACT A common problem with designing and developing applications with tactile interfaces is the lack of a vocabulary that allows one to describe or communicate about haptics. Here we present the findings from a study exploring participants’ verbalizations of their tactile experiences across two modulated tactile stimuli (16Hz and 250Hz) related to two important mechanoreceptors in the human hand. The study, with 14 participants, applied the explicitation interview technique to capture detailed descriptions of the diachronic and synchronic structure of tactile experiences. We propose 14 categories for a human-experiential vocabulary based on the categorization of the findings and tie them back to neurophysiological and psychophysical data on the human hand. We finally discuss design opportunities created through this experiential understanding in relation to the two mechanoreceptors.

Author Keywords Tactile experiences; human-experiential vocabulary; user study; mechanoreceptors; human hand; non-contact haptic system; ultrasound; explicitation interview technique.

ACM Classification Keywords H5.2. User interfaces: Evaluation/methodology, Haptic I/O.

General Terms Human Factors, Design, Documentation, Experimentation, Theory.

INTRODUCTION AND MOTIVATION With the proliferation of haptic devices there is a demand for designers to create content that includes tactile feedback. This demand has pushed to the forefront the lack of a vocabulary that allows one to describe and communicate about tactile experiences when designing such systems. For example, a designer might use words like ‘pulsing’ or ‘tapping’ to describe a sensation, but without any knowledge of how this relates to a specific tactile experience it is hard to design for this experience. Past attempts at creating a vocabulary around tactile experiences (e.g., [9,27,8,5,3,15]) have revealed two main challenges: first, participants’ limited ability to verbalize their tactile experiences (e.g., [8,27]) and, second, the difficulty in separating the expressions used to describe the tactile experiences from the object/material properties (e.g., [3,8]). We believe any vocabulary has to also tie back to neurophysiological and psychophysical research, which holds valuable knowledge about the characteristics of human mechanoreceptors (e.g., [2,12]). Without this tie-back, the vocabulary will be intricately linked to the experimental setup and haptic device used in the study (a concern raised in some of the earlier works [3,8]), thereby limiting our ability to generalize the results.

In this paper we explicitly address the challenges reported in previous work and relate participants’ verbalizations on tactile experiences to the specific characteristics of two selected receptors in the glabrous skin (non-hairy part) of the human hand: the Meissner corpuscle (RA) and Pacinian corpuscle (PC). We conducted a study with 14 participants using the explicitation interview technique, a qualitative method based on a psycho-phenomenological perspective, for stimulating participants to verbalize their tactile experiences [35]. For this study we used a haptic system to generate acoustic radiation pressures modulated at 16Hz and 250Hz to stimulate tactile sensations on the hand.

We analysed the participants’ verbalizations according to the diachronic and synchronic structure of the expressed experiences. The diachronic analysis revealed insights on the temporal unfolding of participants’ verbalizations (e.g., the first expressions used by participants) while the synchronic analysis provided insights on the specific configuration of an expressed tactile experience (e.g., descriptions of the words/expressions used). The findings are presented in 14 categories for both stimuli – 6 comparable, 6 distinct, and 2 non-distinct categories. The six comparable categories at first glance present similar facets of tactile experiences but participants’ verbalizations show the specific characteristics of these categories (e.g., ‘Pulsing’ described through ‘tapping’ or ‘dry rain drops’ versus ‘Flowing’ described as ‘flow of blood’ or ‘water’). The two non-distinct categories include tactile experiences that can be described as paraesthesia of the hand (like ‘tingling’, ‘tickly’, and ‘prickly’) and have no clear delineation between the two stimuli. The physiological mechanisms of these sensations are less understood and a matter of current investigation within the neuroscience community [22].

Each of the 14 categories is described in our results section, including further references to participants’ verbalizations, their relation to the two mechanoreceptors in the human hand, and their suitability to assist design activities in HCI focusing on tactile feedback.

The main contributions of this paper can be summarized as follows: 1) a human-experiential vocabulary (H-E-Vocabulary) defined by the verbal descriptors used to characterise the tactile experiences for the two stimuli; 2) relating the two stimuli and the resulting vocabulary to specific mechanoreceptors of the glabrous skin of the human hand; and, based on this understanding, 3) identifying design opportunities for stimulating tactile experiences linked to these mechanoreceptors.

TACTILE VOCABULARY Enriquez et al. [9] created haptic phonemes that can be perceptually distinguished and can serve as haptic icons (words) to which an arbitrary meaning (e.g., blueberry, strawberry, orange) can be assigned by designers. Their findings show that the phoneme-meaning associations can be learned in a reasonable time (25 minutes) and can persist if further trained. The haptic phoneme created by the authors is a form of haptic vocabulary that represents the smallest distinguishable haptic unit, but there is no relation between the vocabulary and experiences. Our paper has a different focus, as we are interested in the vocabulary used by participants themselves to describe tactile experiences.

In recent years there have been several attempts at establishing a vocabulary around tactile experiences by exploring it in relation to product design or in combination with other modalities [25]. However, most researchers note that their methodology is either limited or involves a work-around, as participants lack a vocabulary to express tactile experiences. O’Sullivan and Chang [27], for instance, point to the limitations of people’s ability to express haptic sensations. In their study discussing the combination of sound and vibrations they observed that participants had difficulties in describing and distinguishing haptic experiences, such as those caused by different vibrations.

Dagman et al. [8] investigated people’s ability to verbalize both visual and haptic qualities for ordinary products (i.e., coffee mug, telephone, hammer, coffee pot). They used a set of adjectives along pre-defined experience dimensions, which were completed with words from their participants. They note that they used this approach due to people’s lack of a haptic language. Jansson-Boyd [15] also questions whether there is such a thing as a tactile language, which goes, for instance, beyond the framework suggested for pleasant or unpleasant haptic experiences in product design [30]. The author [15] points out that unfamiliar stimuli reveal more consistent tactile patterns than familiar stimuli (e.g., everyday objects such as a coffee mug), as they do not follow pre-established preferences for tactile input (e.g., exploring the tactile experiences of a coffee mug embodies pre-defined meanings and opinions about the object, and influences participants’ verbalizations of their experiences).

Chang and Ishii [6] also argue for a common understanding of sensory experiences and introduce a ‘sensorial mapping’, comprising all human senses. They aim to support designers in matching digital information with sensory information for enriching users’ experiences. The mapping provides a starting point but remains on a general level with respect to tactile experiences. Furthermore, Brown and Kaye [3] conducted a study investigating how the material properties of a vibrating device change users’ haptic experiences. The study revealed that participants used similes and metaphors to describe the sensation of holding the different vibrating devices, using words such as ‘soft’ and ‘tingly’. However, the authors [3] note that the participants’ expressiveness may have come from the material used on the vibrating device rather than from the vibration itself.

A key element missing in exploring the tactile experience space is the human-experiential vocabulary for verbalizing the experiences [25]. Another problem that was noted by many of the above researchers is that it can be hard to separate the expressions used to describe the tactile experiences from the object/material properties [3,8]. To be able to create a human-experiential vocabulary for tactile experiences, one needs to understand the neurophysiological and psychophysical characteristics of the specific part of the body. From this understanding we can create tactile stimuli that target specific mechanoreceptors so that any vocabulary that emerges can be precisely linked to the characteristics of the receptors. This is essential to not only create a vocabulary that allows designers to talk about tactile experiences but to also enable developers to relate this vocabulary to specific tactile stimuli. Moreover, this form of vocabulary allows designers to create enhanced experiences, as they are better aware of how the sensory receptors contribute to the overall experience [31].

Below we give a brief overview of the mechanoreceptors of the human hand, two of which (the Meissner corpuscle and the Pacinian corpuscle) are relevant for situating the user study explained later on.

MECHANORECEPTORS IN THE HUMAN HAND The palmar surface of the hand, which is part of the glabrous skin (or non-hairy skin), consists of four types of tactile receptors or mechanoreceptors (see Figure 1) [2,12] that can be distinguished based on two characteristics: a) their rate of adaptation to skin deformation and b) the size of their receptive fields.

The rate of adaptation of the nerve fibers to which the receptors are connected can be broadly categorised as rapidly- or slowly-adapting. Rapidly-adapting fibers detect the transients of skin deformation, discharging only at the onset and at the removal of a stimulus, whereas slowly-adapting fibers detect sustained skin deformation, discharging at stimulus onset and for as long as the stimulus persists [24]. The Meissner corpuscle (RA) and the Pacinian corpuscle (PC) are innervated by rapidly-adapting fibers while the Merkel neurite complex (SAI) and SAII end organ (SAII) are innervated by slowly-adapting fibers.

The receptive field is the area on the skin where the ‘right’ stimulus will evoke neural discharge in a nerve fibre. The receptive fields of RA and SAI are smaller and well-defined when compared to PC and SAII. These larger receptive fields may cover a whole fingertip (PC and SAII) or even up to a whole hand (PC). The relative densities of the RA and SAI receptors decrease from the fingertip to the palm, whereas the PC and SAII receptors are more evenly distributed across the hand. This suggests the suitability of the RA and SAI receptors for spatial analysis and stimulus localization [17].

Neurophysiological and psychophysical studies have shown that each of these mechanoreceptors contributes to different aspects of perception (see [16,17]).

• The SAI receptor is responsible for the perception of form and texture as SAI is sensitive to points, edges, curvature and can resolve grooves up to 0.5 mm wide.

• The RA receptor is sensitive to low-frequency vibration. The perception of tiny, low-frequency skin motion (slip) helps provide the feedback signals required for grip control of an object.

• The PC receptor is sensitive to high-frequency skin deformation at the nanometer level and helps in the perception of distant textures or events when the vibrations are transmitted through a hand-held object.

• The SAII receptor is sensitive to skin stretch and is responsible for detecting object motion or forces parallel to the skin surface and for the perception of hand/finger shape.

The RA and PC receptors are most sensitive to stimuli that continuously change with time (i.e., vibration). The RA receptors generally respond to vibration in the 2 – 40 Hz frequency range with a relatively flat response curve, whereas the PC receptors operate in the range above 40 Hz with a distinct U-shaped response curve whose sensitivity peaks between 250 and 300 Hz [13]. By generating a vibrating stimulus in the frequency range of either the RA or PC receptor, evoked sensations should correspond to the stimulation of the particular receptor.

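These frequency ranges can be summarised in a small helper that indicates which rapidly-adapting receptor a given modulation frequency chiefly targets. The sketch below only restates the ranges cited above; the function name and the treatment of the cut-offs as hard boundaries are our simplifying assumptions.

```python
def dominant_receptor(mod_freq_hz):
    """Return the rapidly-adapting mechanoreceptor a vibrotactile
    modulation frequency chiefly targets, per the ranges cited in [13]."""
    if mod_freq_hz < 2:
        return None                         # below the RA vibration range
    if mod_freq_hz <= 40:
        return "RA (Meissner corpuscle)"    # ~2-40 Hz, flat response curve
    return "PC (Pacinian corpuscle)"        # >40 Hz, U-shaped curve, peak at 250-300 Hz

assert dominant_receptor(16) == "RA (Meissner corpuscle)"    # stimulus 1 in the study
assert dominant_receptor(250) == "PC (Pacinian corpuscle)"   # stimulus 2 in the study
```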
In our study we focus on 2 out of the 4 mechanoreceptors due to their characteristics. We chose the two rapidly-adapting receptors (RA and PC) as they are the only receptors that respond to vibration. The remaining 2 receptors (SAI and SAII) are sensitive to other types of tactile sensations, which are not vibrational (such as points, edges, curvature and skin stretch). We use the explicitation interview technique to elicit verbalizations of participants’ experiences when each of these two receptors is stimulated.

EXPLICITATION APPROACH The explicitation interview technique [35] brings together a psychological and phenomenological perspective to elicit verbalizations of subjective experiences. This psycho-phenomenological approach [34] is a form of guided introspection technique that seeks first-person accounts by using distinctions in language, internal sensory representations and imagery that have been incorporated from neuro-linguistic programming [34]. This interview technique is a retrospective approach and is thus typically applied after an experience has happened. For example, the technique has been used in the field of ergonomics for studying discomfort and negative emotions when driving a car [5], in the pedagogical domain for giving students access to their own thinking processes [36], and in human-robot interaction research for investigating how mental models are formed and influence participants’ evaluations when interacting with a robot [33]. Although it is a retrospective technique, Light [23], for instance, also used it in-situ to describe the thoughts that went through the minds of the participants as they approached and started entering text when performing web-based tasks.

The value of this interview technique lies in the way of asking questions that supports participants in expressing their experiences linked to a specific moment. The participant is encouraged to keep talking about the experiential (cognitive, perceptive, and affective) aspects of the moment without building on rational comments and explanations about it [29]. For example, the interviewer asks questions like “Please describe what you feel, see, hear, or perceive” and follows up with questions that help to place the participant in an evocation state so they talk about that specific lived experience (including action, sensorial perception, thoughts and emotions) in all its details rather than focusing on conceptual (theories, rules or knowledge), imaginary and symbolic verbalizations [35].

This technique helps to explore the unfolding of the experience in time, referring to the ‘diachronic’ dimension, and to examine the specific facets of the experience in a particular moment, referring to the ‘synchronic’ structure of an experience [29]. The goal of questions related to the diachronic structure is to understand how the description of an experience unfolds over time (using questions like “What happened after you opened the door?” and “What did you perceive next?”). With respect to the synchronic structure of an experience, the participant is questioned about a particular moment (using questions like “At the moment when you pushed the handle down how did it feel?” or “What else came in your mind at that moment?”). Thereby, the specific configuration of the experiential sensation at a given moment of time can be investigated.

Figure 1. The 4 mechanoreceptors linked to fiber types, the receptive field size, distribution on the hand and functions.

Compared to other interview methods using an open questioning approach, this technique is non-inducive but directive [29]. Non-inducive, because it keeps the participant talking about the experience without inducing any content, as it focuses on the structure of the experience. Directive, because it keeps the participant focused on the singular experience being explored. In this paper, we use this interview technique both in-situ and retrospectively. We primarily use it in-situ for exploring the tactile experiences, which have a rapidly decaying memory [10], and retrospectively to examine interdependencies between the different tactile stimuli.

STUDY SETUP The study was carried out as part of a larger two-day workshop with designers, artists and developers with a focus and interest in haptic systems. The workshop explored design possibilities with a non-contact haptic system and encouraged free and open exploration of the system without any application or scenario restrictions. This study was conducted on the second day of the workshop, when all participants had already had the chance to try out the haptic system as part of a tutorial and open exploration phase. We chose to do our study in this manner, as we wanted to provide the participants with a context to imagine so they could express their tactile experiences easily (as noted by [8,27,3]) while at the same time not shoe-horning each participant into a specific context (see the value of non pre-established opinions noted by [15]) and limiting the scope of this study. We felt that the selected user group was most able (in the first instance) to verbalize tactile experiences, knowing from previous research [26] that people have difficulties in expressing tactile experiences, even for ordinary objects.

The study system was set up in an open corner office at the workshop venue. This was a quiet corner where only the study participants (one at a time) and two researchers were present. One researcher exclusively managed the study system and setup for each participant (but did not participate in the interview process) and the other researcher carried out the explicitation interviews (see Figure 2). This study setup was out of sight from the other workshop happenings and no one else from the workshop could hear or observe the proceedings. We also instructed each participant not to talk about the study (the procedure and details) when they returned to the workshop. The study took about 20 minutes per participant.

Tactile system used and setting The system used in our study allowed users to experience non-contact tactile sensations using acoustic radiation pressures [14,1]. The system creates a focal point of fixed pressure (physical intensity) in mid-air using 40kHz ultrasound waves by applying the correct phase delays to an array of ultrasound transducers. This focal point of pressure, which was kept the same throughout the study, can then be felt when modulating the ultrasound waves within the frequency range of the mechanoreceptors of the human hand. The system was built with 64 ultrasound transducers arranged in an 8x8 array. We used this tactile system rather than a vibrator (such as piezo actuators) to allow the tactile stimuli to be experienced on both the palm and the fingertips. With vibration motors mounted on a surface, exploring different hand regions can be burdensome, and the onset and offset of touching the vibrating surface can be associated with other receptors, leading to unnecessary confounds in our study. The system used allowed the isolation of the haptic stimuli from the device, thus preventing any textures or geometry of the device from influencing the results.

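To make the focusing principle concrete, here is a minimal sketch of how per-transducer phase delays for a mid-air focal point could be computed, together with an amplitude-modulated 40kHz carrier. It is not the authors' implementation: the flat 8x8 geometry follows the description above, but the transducer pitch, focal height and function names are illustrative assumptions.

```python
import numpy as np

SPEED_OF_SOUND = 343.0    # m/s in air (approximate, room temperature)
CARRIER_FREQ = 40_000.0   # 40 kHz ultrasound carrier, as described above
PITCH = 0.01              # assumed spacing between transducers in metres

def phase_delays(focal_point, n=8):
    """Phase offset per transducer so all waves arrive in phase at focal_point.

    Transducers are assumed to lie on a flat n x n grid in the z=0 plane,
    centred on the origin; focal_point is (x, y, z) in metres.
    """
    wavelength = SPEED_OF_SOUND / CARRIER_FREQ
    coords = (np.arange(n) - (n - 1) / 2) * PITCH
    xs, ys = np.meshgrid(coords, coords)
    # Distance from each transducer to the focal point.
    dist = np.sqrt((xs - focal_point[0]) ** 2 +
                   (ys - focal_point[1]) ** 2 +
                   focal_point[2] ** 2)
    # Delay each element by its path-length deficit relative to the farthest
    # element, expressed as a carrier phase in radians.
    return 2 * np.pi * (((dist.max() - dist) / wavelength) % 1.0)

def modulated_carrier(t, mod_freq_hz):
    """40 kHz carrier whose amplitude is modulated at mod_freq_hz (e.g. 16 or 250 Hz)."""
    envelope = 0.5 * (1.0 + np.sin(2 * np.pi * mod_freq_hz * t))
    return envelope * np.sin(2 * np.pi * CARRIER_FREQ * t)

delays = phase_delays((0.0, 0.0, 0.15))   # focus 15 cm above the array (assumed height)
```

Driving every transducer with the same modulated carrier, shifted by its delay from phase_delays, concentrates the acoustic radiation pressure at the focal point, while the 16Hz or 250Hz envelope makes it perceivable via the RA or PC receptors respectively.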
Participants were allowed to explore a volume of 9 x 16 x 20.5 cm above the transducers (see Figure 3). In order to provide all participants with the same space for exploration we created a box around the ultrasound system. The transducer array was built 12.5 cm above the surface of the table and the entire setup was then covered by a white foam board box with a cut-out the same size as the exploration space to obscure all wires and electronic components. We did this as previous research has shown that people use different kinds of exploratory hand movements to experience tactile stimuli [21]. This active exploration emphasizes purposive touch that increases the information received by the brain to help discern surface qualities such as roughness and hardness [19]. The participants were instructed to explore the stimuli with their dominant hand and were informed not to touch any of the sides of the exploration space, to remove any other haptic cues besides the ones produced by our system. Participants were explicitly asked to explore the stimuli with their whole hand (palm and fingers). They wore a noise-cancelling headset to remove any audio cues.

Figure 2. Study setup with two researchers and the participant (left picture). Box to control the hand movements (right picture).

Figure 3. Dimensions of the box around the system used, including the location of the ultrasound transducers.

Participant details A total of 14 participants (ten males and four females) aged between 23 and 50 years (mean 34.8, SD 8.1) took part in this study. All participants were native English speakers based in the UK. No participant reported any neuropathy, vascular problems associated with their upper limbs, or diabetes. Participant 11 reported primary Raynaud’s on all three phalanges of the index, middle and ring finger on both hands. As a result, there was no data from participant 11 for the 16Hz stimulus, as s/he experienced no tactile sensation for this stimulus; this person’s threshold for the 16Hz stimulus was higher than the intensity provided in the study. Ten participants were right-handed.

Before we started the study, we measured the room temperature (which ranged between 26.3 °C and 29.0 °C) and skin temperature using a calibrated digital thermometer. Participants’ finger skin temperatures ranged between 26.3 °C and 33.7 °C (mean 31.1, SD 2.7) while their palm skin temperatures ranged between 29.9 °C and 34.8 °C (mean 33.3, SD 1.3). Studies have shown that the PC and RA receptors (the two receptors targeted in our study) have little variance in sensitivity within these skin temperature ranges [38].

Data collection and procedure Two modulation frequencies were presented to the participants: 16Hz (corresponding to the RA receptor) and 250Hz (corresponding to the Pacinian PC receptor). We picked these two frequencies as the receptors in question have peak sensitivity at these frequencies. The order of presentation was alternated between the participants to account for order effects. The participants were informed that the study involved two different stimuli and that each of the stimuli would be presented to them for a maximum of 5 minutes. Participants were given the option to withdraw their hand earlier in this exploration phase if they wished to or when they thought nothing more could be said about the sensations. After each stimulus, they were asked to answer three open questions on prepared A5 question cards: (1) What words would you use to describe how it felt on your hand, if at all? (2) Was there anything else you felt or thought or are there any sensations or pictures that come in your mind? and (3) Is there anything additional that you associate with it or not? These questions were inspired by the explicitation interview questioning technique and allowed participants to verbalize and reflect upon their tactile experiences in written form (and represented a break between the two stimuli). Participants were also informed that throughout the 5 minutes of exploring each stimulus, we would ask questions about their tactile experiences and encourage them to speak out loud whatever came to their mind.

As noted earlier, the interview technique was used in-situ and included retrospective questioning at the end, after both stimuli had been experienced. In-situ questioning for each individual stimulus is essential to reduce the effect of a rapidly decaying tactile memory, which lasts just a few seconds [10]. Retrospectively, each participant was asked to recall the particular moment when they switched from the first to the second stimulus (asking the question: “What was the first thought which come in your mind when you put in your hand for the second stimuli?”). We introduced this final step to explore the tactile experiences and interdependencies between the two stimuli, in particular in relation to the diachronic structure of an experience.

Analysis procedure All sessions were audio and video recorded, transcribed and then analysed according to the diachronic and synchronic structure of the experience [29]. The A5 question cards were also transcribed and included in the further analysis. In a first step, two coders, one with a background in user experience research and the other in psychophysical research, went through all transcripts independently. We picked coders with different expertise to draw from the data the two different perspectives we wanted to capture in this paper. Despite the different disciplinary backgrounds of the two coders, the first coding step resulted in a fair inter-coder reliability [4] (Cohen’s kappa of κ > 0.5).

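For readers unfamiliar with the statistic, the sketch below shows how agreement between two coders' category assignments can be checked with Cohen's kappa. The category labels and codings are invented for illustration; they are not the study's data.

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa: chance-corrected agreement between two coders' labels."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    expected = sum(freq_a[label] * freq_b[label]
                   for label in set(coder_a) | set(coder_b)) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical codings of six verbalizations into vocabulary categories.
coder_1 = ["Pulsing", "Weak", "Pulsing", "Prickly", "Focused", "Weak"]
coder_2 = ["Pulsing", "Weak", "Flowing", "Prickly", "Focused", "Weak"]
print(round(cohens_kappa(coder_1, coder_2), 2))  # ~0.79, above the 0.5 bar used here
```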
Each coder started by independently writing down the words and expressions used to describe each stimulus. These words (like “tingling”) or groups of words (expressions like “fan on low battery”) formed the initial groupings for each participant per coder. Each grouping was assigned to the diachronic and/or synchronic dimension of the experience by referring to the temporal evolvement and the context of the verbalizations in the transcripts. Repeated verbalizations of the same verbal descriptors were subsumed in one ‘word/expression’ (defining a unique code), adding the line number as a reference to the transcripts. The main author then merged the resulting groupings from both coders using NVivo 9 (qualitative data analysis software), resulting in 35 categories for stimuli 1 (16Hz) and 31 categories for stimuli 2 (250Hz). Each word or single expression from a participant belonged to only 1 of these 66 categories. In total we had 279 words/expressions for stimuli 1, and 283 words/expressions for stimuli 2. To verify the merged grouping, the second coder was instructed to review the coding results and examine the categories based on the underlying sources in NVivo (i.e., transcripts and respective codes).

In a next step, both coders examined the categories again together in order to ensure that only categories specifically focusing on the participants’ tactile experiences were included. Verbalizations related to participants’ specific characteristics (e.g., one participant mentioned suffering from RSI, another noted that he had climbed for several years resulting in a lower sensitivity in his hand) were extracted from the stimuli 1 or 2 grouping and defined as extra categories. Any comparisons made between the two stimuli (e.g., ‘It’s a faster vibration’ or ‘The other feels more like a natural sensation’) were subsumed in an extra category. Furthermore, specific codes related to the position of the sensation on the hand (finger, palm) were categorized separately in order to allow specific conclusions regarding the relation between the tactile experiences and the receptors in the human hand. Finally, expressions related to temperature (e.g., ‘warm’, ‘it feels warm’) were excluded from the stimuli 1 and 2 categories too. While these can be genuine expressions related to the specific stimulus, they could also have arisen from other contextual factors like the ambient temperature in the box. Using the box to provide a constant region of hand movements for exploration across the participants (an important requirement) may have caused expressions of a warmer sensation. The specific receptors in this study are not triggered by temperature and, in order to only classify unambiguous associations, we eliminated these expressions.

Finally, the original 35 categories for stimuli 1 and 31 categories for stimuli 2 were further assessed by each coder for similarities and differences and merged into a higher-level categorization, resulting in 7 categories for each stimulus. Each category was named to best describe the subsumed verbalizations expressed by the participants. We involved an expert in the field of tactile/haptic systems to review the wording of the final categorization, as finding the right wording for the higher-level abstractions is a key step during this formalisation process [29].

RESULTS AND DISCUSSION To draw out the relevance of our findings to HCI we will present the final high-level categorization (human-experiential vocabulary, referred to as H-E-Vocabulary) for the tactile experiences, include a discussion of potential design opportunities in each revealed category, and relate our findings back to characteristics of the RA and PC mechanoreceptors in the non-hairy skin of the human hand.

Overall, for stimuli 1 (16Hz) a total of 247 single references are classified along 7 categories and for stimuli 2 (250Hz) a total of 272 references are classified along 7 categories. Figure 4 provides an overview of all 14 categories for both stimuli along with the number of participants, the number of participants’ verbalizations (references) and examples for each category. Our 14 categories contain 6 comparable, 6 distinct, and 2 non-distinct categories. These categories represent a formative classification to compare the tactile sensations and offer an initial H-E-Vocabulary. We start our description by presenting the temporal evolution of participants’ expressed tactile experiences based on the insights from the diachronic analysis.

Tactile experiences unfolded over time In each case, when presented as the first stimulus, participants started off with expressions related to an airy-experience (i.e., the ‘Puffs of Air’ (16Hz) and ‘Breeze’ (250Hz) categories). When presented as the second stimulus, the analysis showed that words such as ‘weak/weaker’ and ‘strong/stronger’ were mentioned first (see details subsumed in the ‘Weak’ (16Hz) and ‘Strong’ (250Hz) categories). Apart from these two aspects (effects of order in stimulus presentation), the verbalizations regarding the two tactile stimuli unfolded differently and are summarized below.

With respect to the 16Hz stimulus, participants continued their exploration (after the airy-experience) by referring to the localization of the sensation, identified in the category ‘Focused’, which was followed by expressions on weak/subtle and prickly experiences. Participants further verbalized their tactile experiences with references to the irregular appearance of the sensation, describing it as coming and going. Finally, participants verbalized a pulsing vibration experience and associated the sensation with physical materials, such as thin textiles. In the cases where the 16Hz stimulus came after the 250Hz, participants expressed the non-constant and the pulsing vibration experience much earlier than the focused and prickly experiences.

For the 250Hz stimulus, after expressing the airy-experience, participants continued describing the tingling and constant experience, followed by references to the physicality of the tactile stimulus. Further expressions were then related to the vibrating experience, associated with flowing blood or water. Finally, participants pointed out the dispersed character of the experience on their palm and fingers. In the cases where the 250Hz stimulus came after the 16Hz, participants expressed the dispersed experience right after the strength of the tactile stimulation. The order of the other verbalizations on air, physicality, tingling and the vibration experience did not change notably.

Comparable experiences between both stimuli The six comparable categories at first glance present similar facets of tactile experiences but are distinguishable based on participants’ verbalizations and how they relate to the mechanoreceptors. We describe these differences in each category and point out potential design opportunities.

‘Puffs of Air’ versus ‘Breeze’ The participants’ initial references for both stimuli were verbalized around the airy-experience. The analysis of the detailed descriptions of this particular experience revealed that expressions used for the 250Hz experience were related to familiar concepts (e.g., ‘air-conditioning in the car’ or ‘air-conditioning on the airplane’) and included the sensation of a continuous blowing of wind/breeze against the hand. The expressions around 16Hz were less precise, referring to, for instance, a ‘hand-held fan on a low battery’ or ‘wind’. The sensitivity of the RA receptor is much lower than that of the PC one [5,16], meaning that for the same intensity of the stimuli, one is able to perceive sensations in the range of the PC much better than those in the range of the RA (below 40Hz). Thus, at 16Hz one feels light puffs of air and at 250Hz one feels a strong breeze. Participants in general described the airy-experience as something you would typically not recognize as air. The 250Hz stimulus was, for instance, associated with a mechanical breeze (“it feels like a breeze from the machine” [P14]) or a generated breeze (‘like a rush of air’ [P9]). In contrast to these examples of strong experiences over larger areas, the verbalizations around the 16Hz were more abstract, including references to electrostatic feelings: “just that thing of an electric current, when there’s a serious amount of it you can kind-of almost feel” [P14]. One participant expressed “I can feel a slight draft on my hand” [P7]. The comparable experience in these categories is distinguishable based on the participants’ abstractions of the experience (familiar/loose concepts), as well as the dimension of the perceived tactile experience (areas on the hand).

H-E-Vocabulary: These results suggest that words referring to light weather conditions such as ‘slight draft’ or ‘gentle wind’ are stimulating the RA receptors of the hand whereas words indicating a strong breeze such as ‘air-conditioning’ or ‘rush of air’ are stimulating the PC receptors. This knowledge is not only useful in terms of how a designer’s choice of words might be translated into specific tactile feedback, but can also be useful for creating tactile feedback in ubiquitous/ambient interfaces (e.g., a weather forecasting interface: 16Hz indicating lighter conditions, 250Hz indicating stormy weather).

‘Pulsing’ versus ‘Flowing’ In both stimuli participants expressed vibrating experiences. Expressions for the 16Hz stimulus included references to a certain pulse/rhythm of vibration (e.g., ‘drops from dry rain’ or ‘tapping’) and were associated with small and clearly defined concepts (e.g., ‘blowing raspberries’ or ‘touch of a guitar amp’). For the 250Hz, participants expressed their tactile experiences with references to something moving and associated it with broader concepts (e.g., ‘flowing water’, ‘electric current’ or the ‘flow of blood’). One participant described the flowing character of the vibration of the 250Hz as such: “It’s almost like flowing against my hand; like water or electricity or wind or something like that.” and continued describing it as: “Yeah, like your hand is against air or water or something like … there’s kind-of like an electric current flowing” [P11]. The result suggests that users refer to low frequency vibrations (16Hz for the RA receptor) as rhythmic or pulsing whereas vibrations at a higher frequency (250Hz for the PC receptor) are referred to as flowing or continuous.

H-E-Vocabulary: These results suggest that words referring to rhythmic vibrations such as ‘tapping’ or ‘pulsing’ are stimulating the RA receptors of the hand whereas words indicating continuous vibrations such as ‘flowing’ or ‘electric current’ are stimulating the PC receptors. This knowledge could be useful for tactile feedback in music interfaces (e.g., as a learning interface for different rhythms).

‘Soft Material’ versus ‘Dense Objects’ Although our system was using air pressure with non-contact stimuli, 5 out of 14 and 8 out of 14 participants described physical experiences for the 16Hz and 250Hz stimuli respectively. The 16Hz stimulus was, amongst others, related to touching thin material (e.g., ‘muslin but lighter’, ‘gauze or perforated material’ or ‘soft brushes’). The 250Hz stimulus was described with a broader sensation of physicality and materiality (e.g., ‘uneven wall’ or ‘piece of foam’). Participants used strong physically related expressions for the 16Hz stimulus: “feels like I’m maybe holding onto something” [P2] or “it feels more solid” [P14]. For the 250Hz stimulus one said, “It just feels more physical… it feels like I can sort-of push down” [P5]. The verbalizations for these categories did not dominate the categorization but provide quite clear insights into the differences in the perceived physical experiences. While the 16Hz stimulus was more related to soft and graspable materials, the experiences of the 250Hz reflected larger metaphors, such as a wall, which was further described as such: “going blind on a virtual wall” [P13]. The 250Hz stimulus seems to be more dispersed (relating to the receptive field of the PC) and ‘strong’, so participants feel an uneven wall or piece of foam, while the ‘softness’ of air and the ‘weakness’ of the 16Hz stimulus gives rise to references to thin or perforated materials. The ‘solid’ or ‘holding on to something’ feeling arises as the RA receptor is responsible for detecting skin motion (slip) between object and skin, generating a perception of holding an object.

Figure 4. Overview of the 14 categories (6 comparable, 6 distinct, 2 non-distinct) with number of participants, number of references (ref) and example verbalizations for both tactile stimuli and related mechanoreceptors.

H-E-Vocabulary: These results suggest that words referring to soft materials and textiles such as ‘soft brushes’ or ‘muslin’ are stimulating the RA receptors of the hand whereas words indicating larger, physical surfaces such as ‘wall’ or ‘piece of foam’ are stimulating the PC receptors. This knowledge can be useful for stimulating specific physical experiences and reaffirms work by Kildal [20] demonstrating that one can create a button-press illusion on a rigid surface through vibrations below the surface. The verbalization difference in the physical properties of the two stimuli can explain why Mousette [25] noted that “hard does not always feel exactly hard” (p229) when qualifying haptic experiences like relative hardness. One possible explanation for the different expressions could be that the user is being stimulated at different frequencies and is thus feeling sensations corresponding to different mechanoreceptors.

Distinct experiences between both stimuli Here we describe the six categories that are distinct for each stimulus: ‘Coming & Going’ versus ‘Constant’, ‘Pointed’ versus ‘Dispersed’, and ‘Weak’ versus ‘Strong’.

‘Coming & Going’ versus ‘Constant’ A very distinct difference between the two stimuli was found in the participants’ verbalizations of the availability of the tactile stimulation. The 16Hz stimulus was described as ‘coming and going’, ‘not constant’, and perceived at different locations. One participant described the 16Hz as “I feel like dancing … getting … finding where it is and then losing it again” [P13]. In contrast, the 250Hz stimulus was experienced as always accessible, not something which needs to be hunted for, as it is like “a static stream – can’t feel it changing” [P8]. This is either because the 16Hz experience is close to the threshold of the RA receptor, so some participants feel it coming and going, or because the receptor is responsible for detecting skin motion (slip) between object and skin, generating a perception of movement. The PC receptor is responsible for high frequency vibration and has a much lower threshold at 250Hz; thus participants were always able to feel the stimulus.

H-E-Vocabulary: This suggests that words referring to altering movements such as ‘coming & going’ or ‘dancing’ are stimulating the RA receptors of the hand whereas words indicating constant movements such as ‘stream’ or ‘not changing’ are stimulating the PC receptors. This can be useful for improving the interaction in games, for instance providing accelerator feedback in a racing game through 16Hz and 250Hz tactile stimuli for the steering wheel.

‘Pointed’ versus ‘Dispersed’ Participants localized the two tactile experiences on their hand or fingers differently. The 16Hz stimulus was described as ‘localised’, ‘focused’ or ‘reaches a point of focus’ while the 250Hz stimulus was described as dispersed, ‘like across the whole hand’, ‘across my palm and fingers’ or ‘not focused’. Based on the participants’ verbalizations, the 16Hz tactile experience is likely to guide participants around and provide them with different points of attention (e.g., “that is very identifiable” [P12], “It feels like it kind-of reaches a point of focus” [P9]) while the 250Hz is dispersed and less precise in indicating positions on the human hand (e.g., “maybe two less distinct points” [P12]). These experiences relate to the receptive-field size and distribution of the receptors. The RA receptor (16Hz) has a smaller receptive field with distinct borders, allowing for better localization compared to the PC receptor (250Hz), which has a larger receptive field with indistinct borders, limiting its ability to localize [17].

H-E-Vocabulary: This suggests that words referring to precise locations such as ‘localized’ or ‘centralized’ are stimulating the RA receptors of the hand whereas words indicating unclear positions such as ‘non focused’ or ‘diffused’ are stimulating the PC receptors. This knowledge can be useful, for example, in navigation/wayfinding systems as an unobtrusive tactile aid to guide people through a city.

‘Weak’ versus ‘Strong’ Participants described the 16Hz stimulus as ‘very very subtle’, ‘difficult to detect’, or ‘quite faint’. Its weakness was further expressed as “didn’t have much kind-of power in it to sort-of create a strong sensation in any way really” [P1]. In contrast, the 250Hz stimulus was described with respect to its powerful character, including expressions such as ‘pushing’ or ‘it seems quite forceful’. To specify the strong experience, participants described it as a “kind-of bad hand dryer in terms of its strength” [P14]. One participant related it to the hand as such: “like if I splay my fingers it kind-of seems to go through my fingers as if it had, you-know, capacity to go further than the box, that it’s contained within” [P1]. Comparing both tactile experiences, the 16Hz stimulus was lacking strength and required a higher level of attention: “It wouldn’t grab my attention” [P5]. This matches the sensitivity of the receptors; the RA receptor, although localized, has a much lower sensitivity and so is felt as weak and sporadic, while the PC receptor has a higher sensitivity and so is felt as strong and persisting.

H-E-Vocabulary: These results suggest that, if the physical intensity of the tactile stimuli is kept the same, words referring to weak intensity such as ‘faint’ or ‘subtle’ are stimulating the RA receptors of the hand whereas words indicating strong intensity such as ‘powerful’ or ‘forceful’ are stimulating the PC receptors. One should note that the perceived intensity of a stimulus varies across frequencies.

Non-distinct experience between both stimuli Participants expressed words such as ‘tingling’, ‘tickly’, and ‘prickly’ for both stimuli. Although the described tactile experiences are somewhat different, it is still difficult to distinguish them. These particular tactile experiences, also known as paraesthesia sensations (sensations on the skin with no apparent long-term physical effect), are challenging, as neuroscientists are still trying to understand the physiological mechanism of paraesthesia [22].

‘Prickly’ versus ‘Tingling’ Based on our analysis, we can however provide insights on the participants’ verbalizations for the two tactile stimuli, which we classified as ‘Prickly’ (16Hz) and ‘Tingling’ (250Hz) in our overall categorization. Descriptions for the 16Hz stimulus were more focused on the ‘palm’/‘pad and in the center’ (e.g., ‘pins and needles’ or ‘blowing through a straw’), while for 250Hz the experiences were described as more dispersed across the whole hand, including the palm and fingers (e.g., ‘fizzy sweets in your mouth’ or ‘electro static shocks’). One participant further associated the 16Hz sensation with the prickly experience after getting a cramp in your arm: “it’s the good feeling when the blood starts pushing through again” [P8], while the 250Hz tingling experience was associated with a numb experience: “when your palm is going to sleep or your hand is going to sleep or something if you’ve been lying on it funny” [P11]. The differences between the two stimuli can be seen in their localization on the human hand: 250Hz is related to a larger area (the whole hand), while 16Hz stimulates smaller parts of the human hand, such as specific parts of the palm (e.g., the centre of the palm or the fleshy area below the thumb). Stimulating the RA and PC receptors individually seems to give rise to paraesthesia experiences. Paraesthesia is associated with nerve fibres similar to those that innervate the mechanoreceptors [26]. This would mean that it is possible to artificially stimulate prickly and tingling sensations by using air within the frequency ranges of the two receptors.

H-E-Vocabulary: These results suggest that participants use words such as ‘pins and needles’ or ‘prickly’ more for the lower frequency (16Hz) that stimulates the RA receptor, whereas words such as ‘tingling’ or ‘ticklish’ are used more for the higher frequency (250Hz), which stimulates the PC receptor. This knowledge can be useful in health-related work (e.g., therapeutic interfaces for patients with diseases causing paraesthesia).

DESIGN IMPLICATIONS We can draw several implications for the design of haptic systems using non-contact tactile sensations from this paper. The participants used a nuanced and consistent choice of words to describe the different tactile experiences from each stimulus. This allowed us to explore participants’ words further, which can be expressly mapped to specifically stimulated mechanoreceptors, thereby tying their verbalizations back to specific tactile feedback. For example, we now have a solid starting point to hypothesise that when a designer wants to create a sensation like ‘Pulsing’, it refers to a specific vibration experience in the 16Hz frequency range and stimulates the Meissner corpuscle receptor. Our H-E-Vocabulary described in the previous sections provides further examples of the verbalizations used by participants for tactile experiences and can be further explored in relation to other efforts for designing haptic interactions [25].

The H-E-Vocabulary not only provides a means to talk about haptics but also identifies haptic sensations that can be created by each receptor and relates them to the receptor’s features (frequency, distribution, sensitivity, etc.). A haptic software development toolkit can be created from this vocabulary. Such a toolkit could provide API calls for each of the high-level tactile experiences in the H-E-Vocabulary and relate them to the features of the signal required to recreate them. As a result, programmers could describe tactile experiences at a high level, making them accessible to visual programming languages like Flash.

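As a hedged illustration of what such a toolkit call might look like, the sketch below maps the H-E-Vocabulary categories onto the modulation frequency and receptor suggested by our findings. The mapping follows the categories above; the class, function names and the set_modulation call are hypothetical, not an existing API.

```python
# Hypothetical mapping from H-E-Vocabulary categories to the modulation
# frequency (Hz) and the targeted receptor suggested by the study.
H_E_VOCABULARY = {
    "Puffs of Air":   (16, "RA"),  "Breeze":        (250, "PC"),
    "Pulsing":        (16, "RA"),  "Flowing":       (250, "PC"),
    "Soft Material":  (16, "RA"),  "Dense Objects": (250, "PC"),
    "Coming & Going": (16, "RA"),  "Constant":      (250, "PC"),
    "Pointed":        (16, "RA"),  "Dispersed":     (250, "PC"),
    "Weak":           (16, "RA"),  "Strong":        (250, "PC"),
    "Prickly":        (16, "RA"),  "Tingling":      (250, "PC"),
}

class UltrasoundArrayStub:
    """Stand-in for a real device driver; it only records the requested frequency."""
    def set_modulation(self, freq_hz):
        self.freq_hz = freq_hz

def render_experience(device, category):
    """High-level call: ask the device for the modulation frequency
    associated with a vocabulary category."""
    freq_hz, receptor = H_E_VOCABULARY[category]
    device.set_modulation(freq_hz)
    return f"{category}: {freq_hz} Hz, targets the {receptor} receptor"

device = UltrasoundArrayStub()
print(render_experience(device, "Pulsing"))   # Pulsing: 16 Hz, targets the RA receptor
```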
CONCLUSIONS Here we presented the results of a user study that used a specific interview technique to elicit the structure of tactile experiences related to two specific receptors (the Meissner and Pacinian corpuscles) in the human hand. Our analysis of the verbalizations resulted in 14 categories of tactile experiences and revealed that users are able to verbalize their tactile experiences specific to the receptors stimulated. We identified several opportunities to support the design of haptic systems for tactile experiences. The methodology used [35] to elicit the H-E-Vocabulary can be applied to identify verbalizations for other parts of the body and for the other mechanoreceptors and thermal receptors [39]. Future studies can validate the vocabulary and extend it with richer qualitative descriptions.

ACKNOWLEDGMENTS We thank the members of the Pervasive Media Studio for hosting the workshop, particularly Tom Carter, David Coyle, and Beatrice Cahour for their support and valuable comments, and the participants for taking part in this study. This work is supported by the EPSRC (EP/J004448/1) and the EU Marie Curie IEF Action (FP7-PEOPLE-2010-IEF).

REFERENCES

1. Alexander, J., Marshall, M. T., and Subramanian, S. Adding Haptic Feedback to Mobile TV. EA CHI ’11, ACM, 1975-1980.

2. Bolanowski, S. J., Gescheider, G. A., Verrillo, R. T., and Checkosky, C. M. Four channels mediate the mechanical aspects of touch. Journal of the Acoustical Society of America, 84(5) (1988), 1680-1694.

3. Brown, L., and Kaye, J. ‘J.’ Eggs-ploring the influence of material properties on haptic experience. HAID ’07.

4. Byrt, T. How Good Is That Agreement? Epidemiology, 7(5) (1996), 561

5. Cahour, B., and Salembier, P. The user phenomenological experience: evoking the lived activity with “re-situating” interviews. CHI 2012.

6. Chang, A., and Ishii, H. Sensorial interfaces. DIS '06. ACM, 50-59.

7. Choi, K., and Jun, C. A systematic approach to the Kansei factors of tactile sense regarding the surface roughness. Applied Ergonomics, 38 (1) (2007), 53-63.

8. Dagman, J., Karlsson, M. A., and Wikström L. Investigating the haptic aspects of verbalised product experiences. Intern. J. of Design, 4(3) (2010), 15–27.

9. Enriquez, M., MacLean, K., and Chita, C. Haptic phonemes: basic building blocks of haptic communication. ICMI '06. ACM, 302-309.

10. Gallace, A., and Spence, C. The Cognitive and Neural Correlates of Tactile Memory. Psychological Bulletin, 135(3) (2009), 380-406.

11. Gescheider, G. A, Wright, J. H., and Verrillo, R. T. Information-Processing Channels in the Tactile Sensory System: A Psychophysical and Physiological Analysis. New York: Taylor & Francis, 2009.

12. Gescheider, G. A., Bolanowski, S.J., Pope, J.V., and Verrillo, R.T. A four-channel analysis of the tactile sensitivity of the fingertip: frequency selectivity, spatial summation, and temporal summation. Somatosensory and Motor Research, 19(2) (2002), 114-124.

13. Gescheider, G. A., Bolanowski, S.J., and Hardick, K.R. The frequency selectivity of information-processing channels in the tactile sensory system. Somatosensory and Motor Research, 18(3) (2001), 191-201.

14. Hoshi, T., Takahashi, M., Iwamoto, T., and Shinoda, H. Noncontact Tactile Display Based on Radiation Pressure of Airborne Ultrasound. IEEE Tran. on Haptics, 3(3) (2010), 155-165.

15. Jansson-Boyd, C. V. Touch matters: exploring the relationship between consumption and tactile interaction. Social Semiotics. 21(4) (2011), 531-546.

16. Johnson, K. O. The roles and functions of cutaneous mechanoreceptors. Neurobiology, 11 (2001), 455-461.

17. Johnson, K. O., Yoshioka, T., and Vega-Bermudez, F. Tactile Functions of Mechanoreceptive Afferents Innervating the Hand. Journal of Clinical Neurophysiology, 17(6) (2000), 539-558.

18. Johansson, R. S., and Vallbo, A. B. Tactile sensibility in the human hand: relative and absolute densities of four types of mechanoreceptive units in glabrous skin. Journal Physiology, 286 (1979), 283–300.

19. Katz, D. The world of touch. Krueger, L. E. (Ed). Hillsdale, NJ: Erlbaum, 1989.

20. Kildal, J. Kooboh: Variable Tangible Properties in a Handheld Haptic-Illusion Box. Haptics: Perc., Devices, Mobility, and Comm., LNCS, 7283 (2012), 191-194.

21. Lederman, S.J., and Klatzky, R.L. Hand movements: A window into haptic object recognition. Cognitive Psychology, 19 (1987), 342-368.

22. Lennertz, R. C., Tsunozaki, M., Bautista, D. M., and Stucky, C. L. Physiological basis of tingling paresthesia evoked by hydroxy-alpha-sanshool. J Neurosci. 30(12) (2010), 4353-4361.

23. Light, A. Adding method to meaning: A Technique for exploring peoples' experiences with technology. Beh. & Information Technology, 25 (06) (2006), 175-187.

24. Mountcastle, V. B. The sensory hand: neural mechanisms of somatic sensation. Cambridge, MA: Harvard University Press, 2005.

25. Mousette, C. Simple Haptics: Sketching Perspectives for the design of Haptic Interactions, PhD Thesis, Umea University, Oct-2012.

26. Ochoa, J. L., Torebjörk, H. E. Paraesthesiae from ectopic impulse generation in human sensory nerves. Brain 103 (1980), 835-853.

27. O’Sullivan, C., and Chang, A. An Activity Classification for Vibrotactile Phenomena. HAID ’06, 145-156.

28. Pasquero, J., and Hayward, V. Tactile feedback can assist vision during mobile interactions. CHI '11. ACM, 3277-3280.

29. Petitmengin, C. Describing one's Subjective Experience in the Second Person. An Interview Method for the Science of Consciousness. Phenomenology and the Cognitive Sciences, 5 (2006), 229-269.

30. Sonneveld, M. H., and Schifferstein. H.N.J. The tactual experience of objects. In Schifferstein, H.N.J., Hekkert, P. (Eds.). Product Experience. Elsevier, 2008, 41-67.

31. Schifferstein, H. N. J. Multi sensory design. In Proc. DESIRE '11. ACM, 361-362.

32. Spence, C. The multisensory perception of flavor. The Psychologist, 23(9) (2010), 720-723.

33. Syrdal, D., Dautenhahn, K., Koay, K., Walters, M. and Otero, N. Exploring human mental models of robots through explicitation interviews. RO-MAN ‘10, 638-645.

34. Tosey, P., and Mathison, J. Exploring inner landscapes through psychophenomenology. Qual. Research in Organiz. & Management: Inter. J., 5(1) (2010), 63-82.

35. Vermersch, P. L’entretien d’explicitation. ESF, 1994.

36. Vermersch, P., and Maurel, M. Pratiques de l’entretien d’explicitation. ESF, 1997.

37. Williams, L., and Ackerman, J. Please touch the merchandise. Harvard Business Review, 2011.

38. Verrillo, R. T., and Bolanowski, S. J. Jr. The effects of skin temperature on the psychophysical responses to vibration on glabrous and hairy skin. Journal of the Acoustical Society of America, 80(2) (1986), 528-532.

39. Wilson, G., Halvey, M., Brewster, S. A., and Hughes, S. A. Some like it hot: thermal feedback for mobile devices. Proc. CHI '11, ACM, 2555-2564.
