
Smart Devices: Advancing the User Experience

Page 1

Advancing the User Experience: Implicit Interaction Systems

Harman Innovation Series

Page 2

• Imagine the following scenario: You just bought a new trash can for your kitchen. It is the most advanced model from a company that specializes in Implicit Interaction Design.

• Images: <http://harmaninnovation.com/blog/advancing-user-experience-implicit-interaction-systems/>

Page 3

• From the outside, the trash can looks fairly standard, with a lid that opens when you step on a foot pedal, and so on. Nothing unusual, except maybe for the fact that the can has a power plug and wants to be plugged into an outlet.

Page 4

• You place the new trash bin in your kitchen and use it in a very standard way: you throw in an empty milk carton, an empty egg carton, a box of cereal that expired a few months ago, a banana peel and so on.

• A few days later, you go grocery shopping…

Page 5

• You just entered the grocery store and your phone buzzes with an e-mail. It reads: “Dear John, since you just entered a grocery store, may I remind you that you are likely out of eggs, bananas and milk, and maybe you want to get a new box of cereal since the last one expired.

• Sincerely, your kitchen trash can.”

Page 6

• What just happened?

• Well, your trash can is an enhanced version of a passive trash can. It was built in a way that lets you keep your normal habits and behaviors (e.g., step on a foot pedal to open the lid, throw trash in), but it also senses what you may want to know or do next based on that interaction.

Page 7

• This is Implicit Interaction Design.

• But before we go deeper into Implicit Interaction, let’s briefly describe what happened on a technology level:

Page 8

• The trash can is equipped with a scanner that recognizes the objects thrown into it. It may look for bar codes or use object recognition methods to directly recognize items. It may even have an artificial nose sensor that can identify objects just from their smell.
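
• As a rough illustration of that recognition step, here is a minimal Python sketch. The barcode table, the classify_image stub and the identify_item function are all hypothetical stand-ins for real sensors and a trained model:

```python
# Hypothetical identification step for the smart trash can.
# KNOWN_BARCODES and classify_image() are stand-ins for a real
# product database and a trained object-recognition model.

KNOWN_BARCODES = {
    "0123456789012": "milk",
    "0987654321098": "eggs",
}

def classify_image(image):
    """Stub for an object-recognition model (or an 'artificial nose')."""
    return "banana peel"  # placeholder prediction

def identify_item(barcode=None, image=None):
    """Prefer the cheap, reliable signal (a barcode), then fall back to vision."""
    if barcode in KNOWN_BARCODES:
        return KNOWN_BARCODES[barcode]
    if image is not None:
        return classify_image(image)
    return "unknown"

print(identify_item(barcode="0123456789012"))  # -> milk
```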

Page 9

• Like many products these days, it also includes wireless connectivity that allows it to communicate with other devices, such as the refrigerator and freezer, but also your cell phone. Such a mobile device can be used to identify your location (like the grocery store) and send messages when relevant.
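
• A sketch of that location trigger, assuming the can already keeps a list of discarded items and the phone reports a latitude/longitude (the store coordinates, radius and item list here are all invented):

```python
import math

GROCERY_STORE = (37.7749, -122.4194)  # hypothetical store coordinates
GEOFENCE_RADIUS_M = 100

def distance_m(a, b):
    """Rough equirectangular distance in meters; fine at city scale."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2)
    y = lat2 - lat1
    return math.hypot(x, y) * 6_371_000  # Earth radius in meters

def maybe_remind(phone_location, discarded_items):
    """Send a reminder only when the phone enters the store's geofence."""
    if distance_m(phone_location, GROCERY_STORE) <= GEOFENCE_RADIUS_M:
        return f"You are likely out of: {', '.join(sorted(discarded_items))}"
    return None

print(maybe_remind((37.7750, -122.4195), {"milk", "eggs", "bananas"}))
```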

Page 10

• Although this particular trash bin may not exist yet as a product (and there are many unsolved issues with this scenario, such as the reliability of the sensors, how to clean it, and a host of privacy problems, of course), it illustrates the idea of a system with a focus on Implicit Interaction.

Page 11

• Implicit Interaction is actually not a feature or component technology of a system, but rather an interaction type.

• As such, it can be contrasted with Explicit Interaction. Both are part of a continuum, with Implicit Interaction methods on one end and Explicit Interaction methods on the other end.

• And we are actually very familiar with the latter.

Page 12

• Explicit Interaction is what most user interfaces use these days. It means that the user has to perform an action explicitly to get a machine (computer, product, etc.) to do something; e.g., one presses a button and the machine turns on, or one rotates a knob and the audio volume increases.

• Humans have used these methods for centuries if not millennia.

Page 13

• Interestingly, modern systems do not have to just be either explicit or implicit. Very often these days, they are somewhere in between the two extremes.

Page 14

Various user interface technologies on the continuum of Explicit vs. Implicit Interaction

Page 15

• The schematic illustrates the continuum from Explicit to Implicit Interaction methods. Various user interface methods are located along this continuum.

• On the lower (explicit) end we find, for example, the keyboard and, with it, the command line interface, one of the most common Explicit Interaction methods in human-computer interaction today.

• Each keystroke generates an explicit event. When key presses are combined into a predefined command, the computer executes this command and then ignores the user until the next key press is registered.
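
• A toy command loop makes this fully explicit pattern concrete: nothing happens until a complete, predefined command arrives, and anything else is ignored (the commands here are invented for illustration):

```python
# Toy command-line loop: the program is idle until a complete,
# predefined command is typed; unknown input is simply rejected.

COMMANDS = {
    "on": lambda: print("machine on"),
    "off": lambda: print("machine off"),
}

def repl():
    while True:
        line = input("> ").strip().lower()     # explicit keystrokes, buffered
        if line == "quit":
            break
        action = COMMANDS.get(line)
        if action:
            action()                           # execute the predefined command
        else:
            print(f"unknown command: {line}")  # everything else is ignored
        # between commands, the system simply waits for the next input

repl()
```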

Page 16

• If we look at other common user interface methods, such as mouse and touch interfaces, they are still fairly explicit, but the distinction between explicit and implicit starts to become a bit less clear-cut.

• For example, multi-touch interfaces can be used to create soundscapes and visual art, and the input from the fingers is not an explicit command for the system to execute a clearly defined action.

• Instead, the system senses the position of the fingers on the surface, interprets these positions and reacts in a visual or even musical way.
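
• A minimal sketch of that kind of continuous interpretation, with an invented mapping from finger position to pitch and loudness:

```python
# Sketch: interpreting continuous touch positions rather than executing
# discrete commands. The x/y-to-sound mapping is a made-up example.

def interpret_touches(touches):
    """touches: list of (x, y) in [0, 1] normalized screen coordinates."""
    voices = []
    for x, y in touches:
        pitch_hz = 220 + x * (880 - 220)  # left-to-right sweeps two octaves
        volume = 1.0 - y                  # higher on screen = louder
        voices.append((round(pitch_hz), round(volume, 2)))
    return voices

print(interpret_touches([(0.1, 0.8), (0.9, 0.2)]))
# -> [(286, 0.2), (814, 0.8)]
```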

Page 17

• Interaction methods such as pointing and grasping are even more interesting because they are right in between explicit and implicit.

• For example, if I point at a book (or tap a photo of a book on Amazon.com) and maybe even say “I want to buy this book,” then I am using an Explicit Interaction method.

• Hands and hand gestures can also be used in an implicit way. For example, if I am involved in an animated face-to-face conversation, and I gesture wildly with my hands, then this would be more unconscious (most gesturing is not conscious).

Page 18

• This gesturing still supports my overall communication, and to most human conversational partners, these nonverbal signals have important meaning and help make the conversation clearer and smoother.

Page 19

• Imagine now that our computers (and in fact any system) could track such continuous hand gestures and “make sense” of them. While pointing and hand gestures are already used in gaming systems (e.g., Kinect), their use there is still an Explicit Interaction method.

Page 20

• Our eyes and our voices are highly interesting interaction methods because they are both implicit and explicit. For example, when it comes to recognizing the human voice, there is technology that does a good job at detecting commands.

Page 21

• For example, a voice interaction system in a car that can understand “Where is the closest hospital?” may provide the driver with directions but at the same time determine if the person is yelling the phrase, as in the case of an emergency vs. a casual request.

Page 22

• If a system can actually sense these subtle variations of the human voice, it would move toward a more Implicit Interaction system, and of course become much more useful overall to the user.
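
• One crude way to sense such a variation is loudness: here is a sketch that pairs the recognized text with an urgency estimate from the signal’s RMS energy (the threshold and sample values are made up):

```python
import math

YELL_THRESHOLD = 0.5  # hypothetical, would be tuned per microphone

def rms_energy(samples):
    """Root-mean-square amplitude of a mono audio frame in [-1, 1]."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def classify_utterance(samples, recognized_text):
    """Pair the explicit command with an implicit urgency signal."""
    urgency = "urgent" if rms_energy(samples) > YELL_THRESHOLD else "casual"
    return recognized_text, urgency

# A loud frame vs. a quiet one (fake samples):
print(classify_utterance([0.8, -0.7, 0.9], "Where is the closest hospital?"))
print(classify_utterance([0.05, -0.04, 0.06], "Where is the closest hospital?"))
```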

Page 23

• Our eyes are also largely controlled by unconscious processes.

• Recent advances in eye-tracking technology now allow eye tracking even on handheld devices (e.g., NTT Docomo has a tablet with the Tobii eye gaze sensor).

Page 24

• This opens up fantastic user interface opportunities for Explicit Interaction. For example, a device gets activated only when a user looks at it. It senses a user’s eye contact and powers up, without someone having to press an “On” button.

• This “intentional looking” could be part of a set of “eye gestures” which allow a user to select functions by just looking at a button, or even part of a device.

• Such eye gestures may be helpful for hands-free situations; e.g., in the car (hands on steering wheel), in a kitchen (dirty hands), but also in a medical setting such as operating rooms.
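
• A sketch of such dwell-based eye-gesture activation; the gaze-sensor interface and the 0.8-second dwell time are assumptions:

```python
import time

DWELL_SECONDS = 0.8  # how long a look must last to count as intentional

class GazeButton:
    """Activates a function when the user's gaze dwells on it (hypothetical API)."""

    def __init__(self, name, on_activate):
        self.name = name
        self.on_activate = on_activate
        self._gaze_started = None

    def update(self, gaze_on_target, now):
        if not gaze_on_target:
            self._gaze_started = None   # glancing away resets the timer
        elif self._gaze_started is None:
            self._gaze_started = now    # gaze just landed on the button
        elif now - self._gaze_started >= DWELL_SECONDS:
            self.on_activate()          # held long enough: treat as a "click"
            self._gaze_started = None

power = GazeButton("power", lambda: print("device on"))
t = time.monotonic()
for i in range(10):
    power.update(gaze_on_target=True, now=t + i * 0.1)  # 0.9 s of steady gaze
```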

Page 25

• The important point here is that eye gaze tracking technology has not only enabled the sensing of conscious eye gestures, but also the sensing of unconscious eye motions, which are in turn part of an Implicit Interaction method.

Page 26

• Imagine the possibilities if we could make full use of these emerging technologies and create products and systems that are not only waiting for explicit commands from the user, but are also highly sensitive to Implicit Interaction signals.

