One Click Search


IntentSearch: Capturing User Intention for One-Click Internet Image Search

Presented by

Nayana K. Raj, 0941943, PAACET

26-07-2012

INTRODUCTION

• A search engine that interprets the user's search intention using a one-click query image

• Uses four steps for image search

• Uses the textual information of the query keyword and the visual content of the query image to expand the image pool

EXPLANATION

Capturing the user's search intention from query keywords alone is difficult because text-based image search suffers from:

•Ambiguity of query keywords

• Users may not have enough knowledge to choose suitable keywords

•Hard for users to describe the visual content of target images

• Search becomes easier when both the textual and visual content of the query are used

•Web-scale image search engines mostly rely on surrounding text features.

• Users' search intention cannot be captured by query keywords alone

PROPOSED SYSTEM

• Image search based on both the textual and visual content of images

• The image pool is re-ranked using textual and visual features

EXISTING SYSTEM

Fig. 1: Top-ranked images returned from 'Bing' using "apple" as the query

The key contribution is to capture the user's search intention from this one-click query image in four steps:

•Adaptive similarity

•Keyword expansion

•Visual query expansion

•Image pool expansion

SEARCH TECHNIQUES

• The user first submits query keywords q

• A pool of images is retrieved by text-based search

• The user is asked to select a query image from the image pool

• The query image is classified into one of the predefined adaptive weight categories

• Images in the pool are re-ranked based on their visual similarities to the query image

• Similarities are computed using the adaptive weight schema
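The following is a minimal Python sketch of how these steps could fit together; the helper names (text_search, select_query_image, categorize_query, adaptive_similarity) are illustrative placeholders, not the paper's actual implementation.

```python
# Illustrative sketch of the one-click search pipeline described above.
# All helper callables are hypothetical placeholders passed in by the caller.

def one_click_search(keywords, select_query_image, text_search,
                     categorize_query, adaptive_similarity):
    # 1. Retrieve an initial image pool by text-based search.
    pool = text_search(keywords)

    # 2. The user selects one query image from the pool (the single click).
    query_image = select_query_image(pool)

    # 3. Classify the query image into a predefined adaptive weight category,
    #    which fixes the feature weights used below.
    category = categorize_query(query_image)

    # 4. Re-rank the pool by visual similarity to the query image,
    #    computed with the category-specific weight schema.
    scored = [(img, adaptive_similarity(query_image, img, category)) for img in pool]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return [img for img, _ in scored]
```

Note that the only extra effort asked of the user is the single click selecting the query image; no further feedback is requested.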

Visual feature design

Existing features: GIST, SIFT, Daubechies Wavelet, Histogram of Gradients (HoG)

New features: Attention-Guided Color Signature, Color Spatialet (CSpa), Multi-Layer Rotation Invariant Edge Orientation Histogram (EOH), Facial Feature
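As a rough illustration of what such descriptors compute, here is a simplified Python sketch of a block-wise color feature (in the spirit of the Color Spatialet) and a global gradient-orientation histogram (in the spirit of HoG/EOH); the exact feature definitions in the paper differ.

```python
import numpy as np

def block_color_descriptor(image, grid=3):
    """Simplified stand-in for a spatial color feature: mean RGB of each cell
    of a grid x grid partition. `image` is an HxWx3 array with values in [0, 1]."""
    h, w, _ = image.shape
    feats = []
    for r in range(grid):
        for c in range(grid):
            block = image[r * h // grid:(r + 1) * h // grid,
                          c * w // grid:(c + 1) * w // grid]
            feats.append(block.reshape(-1, 3).mean(axis=0))
    return np.concatenate(feats)

def gradient_orientation_histogram(gray, bins=8):
    """Simplified global histogram of gradient orientations, weighted by
    gradient magnitude. `gray` is an HxW grayscale array."""
    gy, gx = np.gradient(gray.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.arctan2(gy, gx)  # orientation in [-pi, pi]
    hist, _ = np.histogram(ang, bins=bins, range=(-np.pi, np.pi), weights=mag)
    return hist / (hist.sum() + 1e-12)  # normalize to sum to 1
```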

Adaptive Weight Schema

•Weight schema is used for similarity calculations

• Let us take images i and j. The adaptive similarity between i and j is:

Sq(i, j) = ∑ m=1..f  αmq · sm(i, j)

where sm(i, j) is the similarity between i and j on visual feature m, f is the number of visual features, and αmq expresses the importance of feature m for measuring similarity, determined by the category of the query image.
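A minimal sketch of this weighted combination in Python; the example feature similarities and category weights below are made-up numbers for illustration only, not values from the paper.

```python
import numpy as np

def adaptive_similarity(per_feature_sims, alpha):
    """Sq(i, j) = sum_{m=1..f} alpha_m^q * s_m(i, j).
    `per_feature_sims`: s_m(i, j) for each visual feature m,
    `alpha`: the weight schema alpha^q chosen for the query's category
    (assumed non-negative and summing to 1)."""
    per_feature_sims = np.asarray(per_feature_sims, dtype=float)
    alpha = np.asarray(alpha, dtype=float)
    return float(np.dot(alpha, per_feature_sims))

# Hypothetical example: three features with category-specific weights.
s = [0.8, 0.4, 0.6]              # s_1..s_3 between images i and j
alpha_portrait = [0.5, 0.2, 0.3]  # illustrative weights for one category
print(adaptive_similarity(s, alpha_portrait))  # 0.66
```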

Features for query categorization

• Existence of faces and the number of faces in the image

• Percentage of the image frame taken up by the face region

• Coordinates of the face centre relative to the centre of the image

• Directionality

• Color Spatial Homogeneousness (variance of values in different blocks of the Color Spatialet)

• Total energy of the edge map obtained from the Canny operator

• Edge Spatial Distribution
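A hypothetical sketch of extracting a few of these categorization features with OpenCV; the stock Haar cascade face detector and the Canny thresholds here are assumptions for illustration, not the detector or parameters used in the paper.

```python
import cv2

def categorization_features(bgr_image):
    """Compute a few illustrative features used to categorize the query image."""
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    h, w = gray.shape

    # Face-related features via a stock Haar cascade (an assumption).
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    faces = cascade.detectMultiScale(gray)
    n_faces = len(faces)
    face_area_ratio = sum(fw * fh for (_, _, fw, fh) in faces) / float(h * w)

    # Total energy of the edge map obtained from the Canny operator,
    # normalized by image size.
    edges = cv2.Canny(gray, 100, 200)
    edge_energy = float(edges.sum()) / (255.0 * h * w)

    return {
        "n_faces": n_faces,
        "face_area_ratio": face_area_ratio,
        "edge_energy": edge_energy,
    }
```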

Image Clustering

• Images associated with each candidate expansion word wi are grouped into clusters

• Each word wi has ti clusters:

C(wi) = { ci,1 , ... , ci,ti }

•Visual distance between the query image and a cluster c is calculated as the mean of the distances between the query image and the images in c.

• The cluster ci,j with the minimal distance is chosen as the visual query expansion, and its corresponding word wi is used to expand the query:

q = wi + q’
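A small Python sketch of selecting the expansion word and cluster by minimal mean distance to the query image; the data layout (a dict of per-word cluster lists of precomputed feature vectors) and the distance function are illustrative assumptions, not the paper's code.

```python
import numpy as np

def choose_expansion(query_feat, clusters, distance):
    """`clusters[word]` is a list of clusters, each a list of image feature
    vectors. The distance from the query image to a cluster is the mean of the
    distances to the images in it; the closest cluster gives the visual query
    expansion and its word the keyword expansion."""
    best = None
    for word, word_clusters in clusters.items():
        for cluster in word_clusters:
            d = np.mean([distance(query_feat, img_feat) for img_feat in cluster])
            if best is None or d < best[0]:
                best = (d, word, cluster)
    _, best_word, best_cluster = best
    return best_word, best_cluster
```

The chosen word expands the text query while the chosen cluster provides multiple positive visual examples, which together are used to enlarge and re-rank the image pool.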

ADVANTAGES

• User friendly

• Easy search for a particular image (on the internet)

• Can find whether an image is real or not

DISADVANTAGES

• Duplicate images are not detected

FUTURE ENHANCEMENT

• Use query log data, which provides valuable co-occurrence information about keywords, for keyword expansion (a small sketch follows this list)

• Include duplicate detection to improve the results
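A tiny sketch of how keyword co-occurrence statistics could be gathered from a query log, assuming the log is simply a list of keyword lists; this only illustrates the idea behind the proposed enhancement and is not part of the presented system.

```python
from collections import Counter
from itertools import combinations

def cooccurrence_counts(query_log):
    """Count how often pairs of keywords appear together in logged queries."""
    counts = Counter()
    for query in query_log:
        for a, b in combinations(sorted(set(query)), 2):
            counts[(a, b)] += 1
    return counts

# Hypothetical query log; frequently co-occurring words are expansion candidates.
log = [["apple", "fruit"], ["apple", "iphone"], ["apple", "fruit", "red"]]
print(cooccurrence_counts(log).most_common(2))
# e.g. [(('apple', 'fruit'), 2), (('apple', 'iphone'), 1)]
```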

CONCLUSION

• An Internet image search approach that requires only one-click user feedback

• Intention-specific weight schema

• No additional human feedback is needed

• Makes industrial-scale image search by both text and visual content feasible

REFERENCES

• J. Cui, F. Wen, and X. Tang, "Real Time Google and Live Image Search Re-Ranking," Proc. 16th ACM Int'l Conf. Multimedia, 2008.

• J. Cui, F. Wen, and X. Tang, "IntentSearch: Interactive On-Line Image Search Re-Ranking," Proc. 16th ACM Int'l Conf. Multimedia, 2008.

• "Bing Image Search," http://www.bing.com/images

• "Google Image Search," http://www.google.com/imagesearch