
J.A. Jacko (Ed.): Human-Computer Interaction, Part II, HCII 2009, LNCS 5611, pp. 461–469, 2009. © Springer-Verlag Berlin Heidelberg 2009

Orientation Responsive Touch Interaction

Jinwook Kim, Jong-gil Ahn, and Heedong Ko

Korea Institute of Science and Technology, Imaging Media Research Center, P.O.Box 131, Cheongryang, Seoul 130-650, Korea

{jwkim,hide989,ko}@imrc.kist.re.kr

Abstract. We propose a novel touch-based interaction method that uses the orientation information of a touch region. To capture higher-dimensional touch information, including an orientation as well as a position, we develop robust algorithms to detect a contact shape and to estimate its orientation angle. We also suggest practical guidelines for using our method through experiments considering various conditions, and show possible service scenarios of aligning documents and controlling a media player.

Keywords: Touch Interaction, Interaction techniques, Touch direction, Touch orientation, Tabletop, Media player controller.

1 Introduction

We introduce a novel touch-based interaction method that uses the orientation information of a touch region. Touch-based interaction has attracted explosive interest owing to recently introduced hardware and software [1,2,3,4,5], which focus mostly on detecting the positions of multiple contact points. Our motivation is to use higher-dimensional information for touch interaction, not restricted to two-dimensional positions only. Fig. 1 shows consecutive images of a single contact shape during a touch interaction. Note that the center of the contact shape changes little compared to its relative orientation angle, which means that the orientation angle of a touch gives us a new, independent axis of interaction information.

We present how to detect a touch and retrieve the contact shape using computer vision methods, followed by robust estimation of the orientation angle of the approximated elliptical shape of the contact. We also suggest practical guidelines for our method through experiments considering various conditions. Finally, we enumerate possible interaction scenarios using our method.

Fig. 1. Images of a contact area during the touch interaction (frames at t = 0.0, 0.03, 0.06, 0.09, 0.12, 0.15, and 0.18 s)


2 Related Work

There has been a tremendous amount of research on rotating objects in computer applications using traditional input peripherals such as a mouse, a keypad, or a stylus pen. Compared to manual manipulation of physical objects, however, most of this work lacks sufficient degrees of freedom and makes it difficult to obtain relevant feedback during the manipulation.

In this case, automatic alignment can be useful. InfoTable [22] automatically rotates an item of interest to align it with the display boundary nearest to a user. STARS [15] and ConnecTables [17] introduce similar automatic rotation of objects. Several tabletop display systems have used the "corner to rotate" method and proved its usefulness. Kruger et al. [18] presented the RNT algorithm to rotate digital objects using a physically based model.

Recently, touch interaction using bare fingers has gathered explosive interest and is used in many touch input and display systems such as DiamondTouch [4], SmartSkin [9], and DViT [11]. Rotating objects in touch-based interaction becomes especially important in multi-user tabletop environments [2,12,13,14], because users expect the system to support intuitive manipulation using only their hands and fingers. Matsushita et al. [21] proposed the dual-touch method using two fingers, which is one of the most popular methods to rotate objects. Recently, Microsoft [2] announced a rotation technique using only one finger, but details of the algorithm and its performance have not been published officially yet.

3 Orientation Responsive Touch Interaction

While most touch interaction methods are point-based, the proposed method uses additional orientation information. We first identify the contact region between a finger and the touch interface, then estimate the contact shape to obtain orientation information. The orientation angle can then be used to rotate objects or to drive other subsystems requiring orientation information. In this section we investigate how to detect a touch area, retrieve the contact shape, and estimate the orientation angle of the shape.

3.1 Touch Detection

To detect a touch interaction event, we use the MTmini package [3], a software framework that expedites the development of multi-touch applications. A camera inside a box captures the silhouette of the hands and fingers touching the surface (Fig. 2). In our system, a Philips SPC 900NC USB camera is used. We can then identify the region of contact and generate the corresponding grayscale intensity image.
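To make the capture step concrete, here is a minimal sketch of a grayscale capture loop. This is our illustration, not part of the MTmini package; it assumes OpenCV and a camera at device index 0.

```python
import cv2

# Minimal capture sketch: grab frames from the camera under the table
# surface and convert them to the grayscale intensity image used below.
cap = cv2.VideoCapture(0)  # device index 0 is an assumption
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)  # grayscale intensity image
    # ... binarize and estimate the orientation (Sections 3.2 and 3.3) ...
    cv2.imshow("contact", gray)
    if cv2.waitKey(1) == 27:  # Esc quits
        break
cap.release()
cv2.destroyAllWindows()
```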

3.2 Contact Shape Retrieval

The grayscale intensity image acquired above is too blurry to identify an exact touch region, and this ambiguity may affect the accuracy of estimating the contact shape. We therefore convert the grayscale intensity image (Fig. 3(a)) into a binary image (Fig. 3(b)) by Otsu's method [6]. The method assumes that the image to be thresholded contains two classes of pixels (e.g., foreground and background) and then calculates the optimum threshold separating those two classes so that their combined spread (intra-class variance) is minimal.

Fig. 2. MTmini

Fig. 3. (a) Grayscale image. (b) Binary image

As shown in Fig. 3, the input grayscale image can be segmented clearly into two regions. One possible problem is that Otsu's method is too computationally intensive to be applied to every input frame. However, the threshold level changes little in a controlled working environment such as ours, so the value can be computed once in an initialization step and reused without any update afterward. See Section 4.2 for details on this observation.
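A minimal sketch of this compute-once strategy, assuming OpenCV (the image file name is a placeholder):

```python
import cv2

# Initialization: run Otsu's search once on a representative frame and
# cache the resulting threshold level.
init_gray = cv2.imread("touch_frame0.png", cv2.IMREAD_GRAYSCALE)  # placeholder frame
otsu_level, _ = cv2.threshold(init_gray, 0, 255,
                              cv2.THRESH_BINARY + cv2.THRESH_OTSU)

def binarize(gray):
    """Per-frame binarization reusing the cached Otsu threshold."""
    _, binary = cv2.threshold(gray, otsu_level, 255, cv2.THRESH_BINARY)
    return binary
```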

3.3 Orientation Estimation

Given a set of points on the boundary of the contact shape, we estimate an orientation angle by assuming an elliptical shape of the contact boundary. Let $p_i \in \mathbb{R}^2$ be the $i$-th point on the boundary. We translate all the points so that the centroid of the shape is located at the origin. Then we compute the covariance matrix

$X = \sum_i (p_i - p_c)(p_i - p_c)^T$,

where $p_c$ is the centroid of the shape represented as a column vector. Finally, a rotation matrix indicating how much the approximated ellipse is rotated can be computed using the polar decomposition of the covariance matrix $X$ [8].
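As a concrete illustration (our sketch, assuming NumPy; the paper publishes no code), the orientation can also be read off the covariance matrix as the direction of its principal eigenvector, which describes the same ellipse orientation that the paper obtains via the polar decomposition [8]:

```python
import numpy as np

def contact_orientation(points):
    """Estimate the orientation angle (radians) of an elliptical contact shape.

    `points` is an (N, 2) array of pixel coordinates on (or inside) the
    contact boundary.
    """
    pts = np.asarray(points, dtype=float)
    centered = pts - pts.mean(axis=0)          # move the centroid p_c to the origin
    cov = centered.T @ centered                # X = sum_i (p_i - p_c)(p_i - p_c)^T
    eigvals, eigvecs = np.linalg.eigh(cov)     # eigenvalues in ascending order
    major = eigvecs[:, np.argmax(eigvals)]     # principal (major) axis of the ellipse
    return np.arctan2(major[1], major[0])
```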

4 Experimental Results

In this section, we evaluate the proposed method under various conditions to find an efficient combination of the algorithms for the target hardware and software platform. First, we examine the consistency of the results across various input-image resolutions. Second, we compare the accuracy of the estimated orientation angle when Otsu's method is applied every frame versus when the threshold value is computed once initially. Finally, we analyze how much extracting the boundary of the contact area affects the orientation angle estimation.

4.1 Resolution Analysis

To verify that our method is applicable to various hardware setups, ranging from mobile interfaces to tabletop display systems, we compare results across various input-image resolutions. Fig. 4 shows the estimated orientation angle during a short period of touch interaction; binary images are used in this experiment. Our method works robustly regardless of the input resolution: the difference among tests at the various resolutions was measured at less than 2° during approximately 45° of total orientation change.

Fig. 4. Orientation angle (°) over time (s) estimated at input-image resolutions of 40×30, 80×60, 160×120, 320×240, 640×480, and 1280×960
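A hypothetical harness in the spirit of this experiment, reusing the contact_orientation sketch from Section 3.3 (the helper name and the resolution list are our assumptions):

```python
import cv2
import numpy as np

def angles_across_resolutions(binary, sizes=((40, 30), (160, 120), (640, 480))):
    """Estimate the orientation of one binary contact mask at several
    resolutions; all sizes share the mask's 4:3 aspect ratio, so uniform
    rescaling leaves the true angle unchanged."""
    angles = []
    for w, h in sizes:
        small = cv2.resize(binary, (w, h), interpolation=cv2.INTER_NEAREST)
        ys, xs = np.nonzero(small)
        angles.append(np.degrees(contact_orientation(np.column_stack([xs, ys]))))
    return angles
```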

4.2 Analysis of Otsu's Method Application

Our method requires converting the grayscale intensity image into a binary image. The conversion can be done with Otsu's method, which may be too expensive for mobile interfaces with low computational power. We therefore examine whether the threshold level needs to be updated every frame.

Fig. 5. Orientation angle (°) over time (s) estimated under the two Otsu's method conditions (Method 1 and Method 2)

In Method 1, we apply Otsu's method every frame; in Method 2, we compute the threshold level once in an initialization step and reuse the value without any update afterward. As shown in Fig. 5, the two methods produce very similar results, because illumination conditions and touch patterns change little in our hardware setup. Method 2 is therefore preferable considering computational cost.

4.3 Boundary Extraction Analysis

When estimating the touch orientation angle, our basic assumption is that the boundary shape of the contact region is elliptical, which suggests extracting boundary pixels from the binary image. However, the orientation of an ellipse is the same whether it is computed from the filled region or from its boundary, so considering all the pixels inside the contact region rather than only the boundary does not affect the orientation estimate much. Hence it is desirable to eliminate the unnecessary boundary extraction step if possible.

Fig. 6. (a) Intensity image. (b) Binary image. (c) Boundary image.

Fig. 7. Orientation angle (°) over time (s) estimated from the boundary image and from the full binary image

To verify this, we compare the results of applying Canny's edge detection algorithm [7] to obtain the boundary against using all the pixels in the contact region. As shown in Fig. 7, the maximum difference measured was less than 1.5°, so we can safely skip extracting a boundary from the binary image.
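A sketch of the two variants compared above, again reusing the contact_orientation helper from the Section 3.3 sketch (the Canny thresholds are placeholders):

```python
import cv2
import numpy as np

def orientation_from_mask(binary, use_boundary=False):
    """Estimate the contact orientation from all foreground pixels or,
    optionally, from the Canny boundary only [7]; Section 4.3 finds the
    two estimates differ by less than 1.5 degrees."""
    mask = cv2.Canny(binary, 100, 200) if use_boundary else binary
    ys, xs = np.nonzero(mask)
    return contact_orientation(np.column_stack([xs, ys]))
```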

5 Service Scenarios

We propose two service scenarios using the proposed orientation estimation method. The first one is to align documents in a tabletop display system and the second is to control widgets in a media player on small touch interfaces.

5.1 Document Alignment

Assume that multiple users share a common tabletop environment where several objects representing documents, media, and images are located arbitrarily, as shown in Fig. 8. The objects may not be aligned with a user's viewing direction. This typical situation reduces usability, especially for reading documents such as newspapers [20].

While the most popular methods to orient objects on touch interfaces are based on multiple contact points, the proposed method can reorient objects using only one finger, as Fig. 9 illustrates. Arguably, using only one finger is preferable in many applications.

Fig. 8. Documents and media arbitrarily located on a tabletop display

Fig. 9. Changing the orientation of a document for better alignment

5.2 Media Player Controller

Another example of orientation responsive touch interaction is controlling widgets in media players. Slider controls and buttons to adjust the volume or to seek through the media, shown in Fig. 10(a), can be replaced by the proposed interaction method. On small touch interfaces, touch-based slider widgets can be cumbersome because the working area is small while minute control of the widget is required. Our method, however, is robust to a small touch area and to the limited resolution of the captured touch image, and it is intuitive to use. Fig. 10(b) shows a dial widget using our method: users can adjust the control value minutely, and since our method does not require scrolling a finger across the touch interface, it fits small form-factor devices well.

Fig. 10. (a) Slider control widget. (b) Dial widget using the orientation responsive touch interaction method
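The paper does not specify the dial's control mapping; as one illustrative possibility, the relative orientation change during a touch can drive a normalized value such as the volume:

```python
class DialWidget:
    """Hypothetical dial: maps relative finger rotation to a value in [0, 1]."""

    def __init__(self, value=0.5, degrees_for_full_range=180.0):
        self.value = value
        self.scale = 1.0 / degrees_for_full_range
        self.ref_angle = None                   # angle at touch-down

    def on_touch(self, angle_deg):
        if self.ref_angle is None:              # first frame of this touch
            self.ref_angle = angle_deg
        delta = angle_deg - self.ref_angle      # rotation since the last frame
        self.value = min(1.0, max(0.0, self.value + delta * self.scale))
        self.ref_angle = angle_deg
        return self.value

    def on_release(self):
        self.ref_angle = None                   # next touch re-anchors the dial
```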

6 Conclusion

We have proposed a novel orientation responsive touch interaction method. Since our approach can estimate a contact orientation as well as a position, users can enjoy content in a new interaction style. Our technique applies only when images of the contact shape are available; however, many existing hardware setups for touch interaction can generate such contact images. Also, the computationally expensive image processing required to acquire a robust contact shape should be minimized when the method is applied to mobile devices with low computational power. Nevertheless, by introducing more dimensions of interaction information, the proposed method suggests a variety of orientation-sensitive interactions.

Acknowledgements

This work was supported by the IT R&D program of MKE/IITA [2009-F-033-01, Development of Real-time Physics Simulation Engine for e-Entertainment] and by Korea Institute of Science and Technology (KIST) through the Tangible Web Project.

References

1. Apple iPod touch, http://www.apple.com/ipodtouch
2. Microsoft Surface, http://www.microsoft.com/surface
3. MTmini package, http://ssandler.wordpress.com/MTmini
4. Dietz, P., Leigh, D.: DiamondTouch: a multi-user touch technology. In: Proceedings of UIST 2001, pp. 219–226. ACM Press, New York (2001)
5. Han, J.Y.: Low-cost multi-touch sensing through frustrated total internal reflection. In: Proceedings of UIST 2005, pp. 115–118. ACM Press, New York (2005)
6. Otsu, N.: A threshold selection method from gray-level histograms. IEEE Trans. Systems, Man, and Cybernetics 9(1) (1979)
7. Canny, J.F.: A computational approach to edge detection. IEEE Trans. Pattern Analysis and Machine Intelligence 8(6) (1986)
8. Higham, N.J.: Computing the polar decomposition with applications. SIAM J. Scientific and Statistical Computing 7(4) (1986)
9. Rekimoto, J.: SmartSkin: An Infrastructure for Freehand Manipulation on Interactive Surfaces. In: Proc. CHI 2002, pp. 113–120 (2002)
10. Streitz, N.A., Tandler, P., Müller-Tomfelde, C., Konomi, S.: i-LAND: An Interactive Landscape for Creativity and Innovation. In: Proc. CHI 1999, pp. 120–127 (1999)
11. SMART Technologies: Digital Vision Touch Technology. White Paper (2003), http://www.smarttech.com/dvit/
12. Shen, C., Lesh, N., Forlines, C., Vernier, F.: Sharing and Building Digital Group Histories. In: Proc. CSCW 2002, pp. 324–333 (2002)
13. Shen, C., Lesh, N.B., Moghaddam, B., Beardsley, P.A., Bardsley, R.S.: Personal Digital Historian: User Interface Design. In: Proc. CHI 2001 Extended Abstracts, pp. 29–30 (2001)
14. Shen, C., Everitt, K.M., Ryall, K.: UbiTable: Impromptu Face-to-Face Collaboration on Horizontal Interactive Surfaces. In: Dey, A.K., Schmidt, A., McCarthy, J.F. (eds.) UbiComp 2003. LNCS, vol. 2864, pp. 281–288. Springer, Heidelberg (2003)
15. Magerkurth, C., Stenzel, R., Prante, T.: STARS – a ubiquitous computing platform for computer augmented tabletop games. In: Extended Abstracts of UbiComp 2003, pp. 267–268. Springer, Heidelberg (2003)
16. Shen, C., Vernier, F., Forlines, C., Ringel, M.: DiamondSpin: An extensible toolkit for around-the-table interaction. In: Proceedings of CHI 2004, pp. 167–174. ACM Press, New York (2004)
17. Tandler, P., Prante, T., Müller-Tomfelde, C., Streitz, N., Steinmetz, R.: ConnecTables: dynamic coupling of displays for the flexible creation of shared workspaces. In: Proceedings of UIST 2001, pp. 11–20. ACM Press, New York (2001)
18. Kruger, R., Carpendale, S., Scott, S.D., Tang, A.: Fluid integration of rotation and translation. In: Proc. CHI 2005, pp. 601–610. ACM Press, New York (2005)
19. Hancock, M.S., Vernier, F., Wigdor, D., Carpendale, S., Shen, C.: Rotation and translation mechanisms for tabletop interaction. In: Proc. Tabletop 2006, pp. 79–86. IEEE Press, Los Alamitos (2006)
20. Koriat, A., Norman, J.: Reading Rotated Words. Journal of Experimental Psychology: Human Perception and Performance 11(4), 490–508 (1985)
21. Matsushita, N., Ayatsuka, Y., Rekimoto, J.: Dual touch: A two-handed interface for pen-based PDAs. In: Proceedings of UIST 2000, pp. 211–212 (2000)
22. Rekimoto, J., Saitoh, M.: Augmented surfaces: a spatially continuous work space for hybrid computing environments. In: Proceedings of CHI 1999, pp. 378–385. ACM Press, New York (1999)
23. http://www.youtube.com/watch?v=_wHQKbME39k

