Prototyping GNOME UI for Gestural Input

PROTOTYPING GNOME UI FOR GESTURAL INPUT

Presented by Adityo Pratomo (@kotakmakan) / Surya University and Hidayat Febiansyah (@havban) / Surya University

GNOME.Asia 2015

ABOUT US
We are lecturers at the Informatics Department at Surya University
Adityo Pratomo holds a Master's degree in Interaction Design from the University of Sydney and previously worked at an advertising agency
Hidayat Febiansyah is an IT wizard... jk, went with the yellow Jaket; he obtained a PhD from Sun Moon University, Republic of Korea

USER INPUT DEVICES
Devices act as a medium for users to give input to a computer

EVOLUTION OF USER INPUT DEVICES
Punch Card

Keyboard

Mouse

Scanner

Graphic Tablet

Touchscreen

Voice Input

Tangible User Interface

Air Gesture Input

TODAY'S CONDITION
Huge variety of input devices

(Still) Follows the WIMP (Windows, Icon, Menu, Pointer) style

Term coined in 1980 by Merzouga Wilberts
Popularized by the Apple Macintosh in 1984

Windows

Runs a self-contained program, isolated from other programs running at the same time in other windows

Icon

Shortcut for an action that the computer performs

Menu

Text or icon-based selection system that selects and executes programs or tasks

Pointer

An onscreen symbol that represents movement of a physical device that the user controls to select icons, objects, data elements, etc.

GESTURAL INPUT
Provides gesture (mostly finger) input for users

Touchpad

Implemented in many OSes

NEW GESTURAL INPUT DEVICES
Introduce free-form gestures as a means of interaction

Microsoft Kinect

Full-body input, including voice
No finger detection

Leap Motion

Finger input only
Has built-in finger gesture detection

Intel RealSense

Essentially a much smaller Kinect
Windows-only SDK

Generally speaking, hardware-wise, these devices do their job well (though they can still be improved)

PROBLEMS
The problems come from the software implementation, especially for productive, non-entertainment use

Metro UI

Supposed to be designed for touch
Falls flat for these kinds of interactions

Kinect Guidelines

Still involve choosing and confirming an action
Choose a big icon
Confirm by hovering there for a period of time

Direct Mouse to Leap Motion

Barely usable, since it requires high accuracy in picking an object
Lost in translation between 2D (where the feedback happens) and 3D (where the action happens)

OS Wise

Information is structured in a tree, explored by moving down branches and presented one node at a time (nothing wrong with this)
Interaction-wise, it requires moving between directories, picking up objects, confirming, and working on them (this requires a specific interaction technique)

GOALS AND MOTIVATIONS
To suggest an interaction model for OS-wide application of free-gesture input

We view this interaction model as something that's human and natural, appreciating humans' inherent motor abilities

To push further research in this area, without waiting for big corporations to do it (hooray Open Source)

INFLUENCES AND PREVIOUS WORK
This project takes cues from other projects that people have already worked on

GNOME Shell

Minimal use of icons; users are forced to memorize keyboard shortcuts

Controlling GNOME with Leap

http://www.joaquimrocha.com/2013/08/09/controlling-gnome-with-leap/

Leap Motion DBus

https://github.com/jamespcole/leapmotion-dbus

9 to 5

A Leap Motion puzzle game

https://www.youtube.com/watch?v=rz_wBzWCAXw

Kids Note Training System

A tool for kids to learn about musical notes and test their hearing

DESIGN CONSIDERATIONS
Applied to the Leap Motion
Complementary to mouse and keyboard
Can be used with only one hand
Focus on gesture use with minimal icons involved
Main use case is productivity, non-entertainment functionality

ISOLATING THE PROBLEM
The biggest problem is telling people how to choose an object using gestures

This is NOT a replacement for the mouse

The mouse is precise; gestures are not.

Giving a bigger icon doesn't necessarily mean it'll be easier to pick
It still requires dexterity to accurately choose the object
Plus, how do you confirm that choice? By inaccurately pressing the air?

OUR SOLUTION
To translate imprecise gestures into an act of choosing an object and confirming it

OUR PROTOTYPE
A Leap Motion-controlled windowing system
A UI-model prototype for the desktop

Both will demonstrate how choosing an object is done with gestures

ARCHITECTURE
Application
OS Windowing System
System Event
Leap Motion API
OS

FEATURES
Alternative live-tile windowing system prototype
All contents are live and directly editable
Activated using gestures
Mapping of various gestures to keyboard shortcuts
Translation of keyboard shortcuts into various events
Usage of GNOME windowing capability
Keyboard shortcut receiver inside the application

GESTURES
CircleGesture() – A circular movement by a finger.
SwipeGesture() – A straight-line movement by the hand with fingers extended.
ScreenTapGesture() – A forward tapping movement by a finger.
KeyTapGesture() – A downward tapping movement by a finger.
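As an illustration only (not code from the deck), here is a minimal leapjs sketch that enables these built-in recognizers and logs each gesture as the controller reports it; the wiring is an assumption based on the public leapjs API.

    // Minimal leapjs sketch: enable the built-in gesture recognizers and
    // print each gesture as the controller reports it.
    var Leap = require('leapjs');

    Leap.loop({ enableGestures: true }, function (frame) {
      frame.gestures.forEach(function (gesture) {
        // gesture.type is one of 'circle', 'swipe', 'screenTap' or 'keyTap';
        // continuous gestures also carry a state ('start', 'update', 'stop').
        console.log(gesture.type, gesture.state);
      });
    });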

MAPPING SHORTCUTS
Connecting the elements using keyboard shortcuts, e.g. ctrl+alt+=
No need to access lower-level components
Portable deployment

Leap Motion control is only a complementary input, not the primary one

Other input can still be performed with the existing keyboard or mouse
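As a hedged illustration of this mapping layer (not the project's actual code), the sketch below uses robotjs to emit two of the shortcuts mentioned in this deck; robotjs names its modifiers 'control', 'alt' and 'shift', and the function names here are hypothetical.

    // Sketch: turning recognized gestures into keyboard shortcuts with robotjs.
    var robot = require('robotjs');

    // System mode, swipe left -> next window (alt+tab)
    function nextWindow() {
      robot.keyTap('tab', ['alt']);
    }

    // Demo mode, keytap -> expand the focused widget (ctrl+alt+space)
    function expandFocusedWidget() {
      robot.keyTap('space', ['control', 'alt']);
    }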

COMPONENTS
Leapjs Controller

NWJS

RobotJs

With added functionality to accept multi-flag shortcuts

Packery.js

Other: node-open, jquery, scrollTo, Knobjs, Togglejs, node-gyp

https://developer.leapmotion.com/documentation/javascript/index.html

http://nwjs.io/

https://github.com/octalmage/robotjs

http://packery.metafizzy.co
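For context, an NW.js application of this kind is described by a package.json manifest. The sketch below is a hypothetical minimal manifest, assuming an entry page named index.html; the project name and window settings are illustrative, not taken from the repository.

    {
      "name": "leap-gnome-prototype",
      "main": "index.html",
      "window": {
        "title": "Leap GNOME prototype",
        "width": 1280,
        "height": 720
      }
    }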

CHARACTERISTICS
Open-sourceness
Platform agnostic

as long as the device supports HTML5, specifically node-webkit

Easy to configure: manage shortcuts

Easy to develop: common technology platforms

SYSTEM MODE
Running with the existing GNOME shortcut mode

Move among workspaces
ctrl+alt+- => Swipe down
ctrl+alt+= => Swipe up

Move among windows
alt+tab => Swipe left
alt+shift+tab => Swipe right

Inner application window movement
alt+` => Keytap
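A hedged sketch of how these system-mode bindings could be driven from a leapjs SwipeGesture: classify the swipe by its dominant direction component, then fire the matching GNOME shortcut through robotjs. This is an assumed implementation, and the '-' and '=' key names are assumed to be accepted by robot.keyTap.

    var robot = require('robotjs');

    // Called for a swipe gesture that has reached state 'stop'.
    function handleSystemModeSwipe(gesture) {
      var x = gesture.direction[0];   // unit vector of the swipe
      var y = gesture.direction[1];
      if (Math.abs(x) > Math.abs(y)) {
        // Horizontal swipe: move among windows
        if (x < 0) robot.keyTap('tab', ['alt']);            // swipe left  -> alt+tab
        else       robot.keyTap('tab', ['alt', 'shift']);   // swipe right -> alt+shift+tab
      } else {
        // Vertical swipe: move among workspaces
        // Assumption: '-' and '=' are valid robotjs key names.
        if (y > 0) robot.keyTap('=', ['control', 'alt']);   // swipe up   -> ctrl+alt+=
        else       robot.keyTap('-', ['control', 'alt']);   // swipe down -> ctrl+alt+-
      }
    }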

DEMO MODE
Move among widgets one step at a time
ctrl+alt+i => Swipe left
ctrl+alt+y => Swipe right

Move among widgets by rolling
ctrl+alt+i (repeated) => Rotate finger right
ctrl+alt+y (repeated) => Rotate finger left

Focused widget → full-size widget
ctrl+alt+space => Keytap
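A hedged sketch of the demo-mode "rolling" behaviour: a CircleGesture repeats ctrl+alt+i or ctrl+alt+y depending on rotation direction, and a KeyTapGesture expands the focused widget. The clockwise test, the clockwise-means-right orientation, and the function name are assumptions; real code would also throttle the repeats (for example, once per full turn using gesture.progress).

    var robot = require('robotjs');

    function handleDemoModeGesture(gesture, frame) {
      if (gesture.type === 'circle' && gesture.state === 'update') {
        // Clockwise if the drawing finger points roughly along the circle normal.
        var dir = frame.pointable(gesture.pointableIds[0]).direction;
        var dot = dir[0] * gesture.normal[0] +
                  dir[1] * gesture.normal[1] +
                  dir[2] * gesture.normal[2];
        var key = dot > 0 ? 'i' : 'y';          // rotate right -> ctrl+alt+i, left -> ctrl+alt+y
        robot.keyTap(key, ['control', 'alt']);  // fires repeatedly while the circle continues
      } else if (gesture.type === 'keyTap') {
        robot.keyTap('space', ['control', 'alt']); // focused widget -> full-size widget
      }
    }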

FUTURE GOALS
User testing
Integration with other technologies (VR, haptic sensors)
Possible integration with a file system that behaves like a graph database
Formalizing the system

CONCLUSION
Air gesture-based interaction will come along in the next few years (or even decades)
Software and hardware capabilities are there; they just need to be unified by a proper design system
Further development is required to integrate with other technologies and to user-test it rigorously

COME JOIN THE PARTY

[email protected]
[email protected]
https://www.github.com/lunchboxav/leap-gnome

