Object Proposal with Kernelized Partial Ranking

Jing Wang, joint work with Jie Shen and Ping Li
Rutgers University

August 15, 2017

Jing Wang · PR · August 15, 2017 · 1 / 15

Problem Setup

Object proposals: an ensemble of bounding boxes with a high potential of containing objects.

Goal: determine a small set of proposals that achieves high recall.


Existing solutions and shortcomings

Existing solutions
Step 1: extract multiple features for each box.
Step 2: train a ranking algorithm, typically Ranking SVM.

Shortcomings of Ranking SVM
1. High time complexity due to the pairwise constraints.
2. Linear kernels are usually used, owing to the computational and memory bottleneck of training a kernelized model.


Our solution

Our method: a kernelized partial ranking model.

Benefits
1. Reduces the number of constraints from O(n²) to O(nk), where n is the number of candidate proposals for an image and only the top-k are of interest.
2. Permits non-linear kernels.
3. Introduces a consistent weighted sampling (CWS) paradigm.


Problem Formulation

For each image, we have
- an ensemble of candidates B = {b_1, b_2, ..., b_n};
- a vector y = (y_1, y_2, ..., y_n), where each y_i is the IoU of candidate b_i with the ground truth.

We learn the prediction function

    f : X → Y.    (1)

The mapping function f is formulated as

    f(X; w) = (w · φ(x_1), ..., w · φ(x_n)),    (2)

where w is the weight vector we aim to learn, "·" denotes the inner product, and the feature map φ(x) maps x into a new feature space.
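The prediction step in Eq. (2) is just a batch of inner products. A minimal sketch (the name `score_proposals` and the toy numbers are illustrative, not from the paper):

```python
import numpy as np

def score_proposals(X_mapped, w):
    """f(X; w): score each already-mapped proposal feature phi(x_i)
    by an inner product with the weight vector w (Eq. 2)."""
    return X_mapped @ w

# Rank three toy proposals and keep the top 2 by score.
X_mapped = np.array([[0.2, 0.9],
                     [0.8, 0.1],
                     [0.5, 0.5]])
w = np.array([1.0, -1.0])
top2 = np.argsort(-score_proposals(X_mapped, w))[:2]
```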

Non-Linear Kernel

Given any d-dimensional vectors u and v with non-negative components, the min-max kernel is defined as

    g_mm(u, v) = Σ_{i=1}^{d} min{u_i, v_i} / Σ_{i=1}^{d} max{u_i, v_i},    (3)

where u_i and v_i denote the i-th components of u and v, respectively.
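Eq. (3) is a few lines of NumPy; a sketch (the function name `minmax_kernel` is mine):

```python
import numpy as np

def minmax_kernel(u, v):
    """Min-max kernel g_mm(u, v) of Eq. (3) for non-negative vectors."""
    u, v = np.asarray(u, dtype=float), np.asarray(v, dtype=float)
    denom = np.maximum(u, v).sum()
    return np.minimum(u, v).sum() / denom if denom > 0 else 0.0

# g_mm equals 1 only when u == v, and shrinks as the vectors diverge.
print(minmax_kernel([1, 2, 0, 4], [2, 2, 1, 3]))  # 6/9
```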


Consistent Weighted Sampling

CWS approximates the min-max kernel by linear functions.

Algorithm 1: Consistent Weighted Sampling (CWS)¹
Require: feature vector u ∈ R^d with non-negative elements; number of trials S.
Ensure: consistent samples (i*_1, i*_2, ..., i*_S) and (t*_1, t*_2, ..., t*_S).
1: for s = 1, 2, ..., S do
2:   for all i = 1, 2, ..., d do
3:     r_i ~ Gamma(2, 1), c_i ~ Gamma(2, 1), β_i ~ Uniform(0, 1).
4:     t_i ← ⌊log(u_i)/r_i + β_i⌋, y_i ← exp(r_i(t_i − β_i)), a_i ← c_i/(y_i exp(r_i)).
5:   end for
6:   i*_s = argmin_i a_i, t*_s = t_{i*_s}.
7: end for

¹ Sergey Ioffe. Improved Consistent Sampling, Weighted Minhash and L1 Sketching, ICDM, 2010.
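A sketch of Algorithm 1 in NumPy (`cws_sample` and the seeding scheme are my own choices; the consistency across different inputs comes from drawing the same r, c, β in every trial, here via a shared seed):

```python
import numpy as np

def cws_sample(u, S, seed=0):
    """One (i*, t*) pair per trial, following Algorithm 1. Using the
    same seed for every input vector shares the random draws
    (r, c, beta), which is what makes the sampling *consistent*."""
    u = np.asarray(u, dtype=float)
    rng = np.random.default_rng(seed)
    samples = []
    for _ in range(S):
        r = rng.gamma(2.0, 1.0, size=u.size)
        c = rng.gamma(2.0, 1.0, size=u.size)
        beta = rng.uniform(0.0, 1.0, size=u.size)
        with np.errstate(divide="ignore"):
            t = np.floor(np.log(u) / r + beta)   # u_i = 0 -> t_i = -inf
            y = np.exp(r * (t - beta))
            a = c / (y * np.exp(r))              # zero entries get a_i = inf
        i_star = int(np.argmin(a))
        samples.append((i_star, int(t[i_star])))
    return samples
```

The collision rate between the samples of two vectors estimates g_mm(u, v), which is exactly the guarantee stated in the theorem that follows.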


CWS: Theoretical Guarantee

Theorem (Collision probability¹)
For any two non-negative vectors u and v, let (i*_{s,u}, t*_{s,u}) and (i*_{s,v}, t*_{s,v}) be the consistent samples produced by Algorithm 1 at the s-th trial. Then we have

    Pr[(i*_{s,u}, t*_{s,u}) = (i*_{s,v}, t*_{s,v})] = g_mm(u, v).    (4)

Due to the above theorem, we immediately have the following result:

    E[1{(i*_{s,u}, t*_{s,u}) = (i*_{s,v}, t*_{s,v})}] = g_mm(u, v),    (5)

where the indicator function 1{event} outputs 1 if the event happens and 0 otherwise.

¹ Sergey Ioffe. Improved Consistent Sampling, Weighted Minhash and L1 Sketching, ICDM, 2010.
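The collision identity (4)-(5) is what enables kernel linearization: run S trials, one-hot encode each sample (i*, t*) into a hashed bucket, and the inner product of two such encodings concentrates around g_mm(u, v). A self-contained sketch (the name `cws_features` and the bucketing scheme are mine; bucket collisions add a small upward bias of roughly 1/n_bins):

```python
import numpy as np

def cws_features(u, S, n_bins=128, seed=0):
    """Explicit feature map for the min-max kernel: S CWS trials
    (Algorithm 1), each sample (i*, t*) hashed into one of n_bins
    one-hot buckets. A shared seed keeps the draws consistent."""
    u = np.asarray(u, dtype=float)
    rng = np.random.default_rng(seed)
    feat = np.zeros(S * n_bins)
    for s in range(S):
        r = rng.gamma(2.0, 1.0, size=u.size)
        c = rng.gamma(2.0, 1.0, size=u.size)
        beta = rng.uniform(0.0, 1.0, size=u.size)
        with np.errstate(divide="ignore"):
            t = np.floor(np.log(u) / r + beta)
            a = c / (np.exp(r * (t - beta)) * np.exp(r))
        i_star = int(np.argmin(a))
        feat[s * n_bins + hash((i_star, int(t[i_star]))) % n_bins] = 1.0
    return feat / np.sqrt(S)  # <phi(u), phi(v)> = collision frequency
```

A linear ranker trained on such explicit features then behaves like a kernelized one, which is what keeps the training tractable.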


Partial Ranking Model

Given a training set {(X^j, y^j)}_{j=1}^{N}, where N denotes the number of training images.

Assumptions
1. y^j = (y^j_1, ..., y^j_n) is in non-ascending order.
2. φ(·) is the feature map of the min-max kernel.

The convex optimization problem:

    min_w   (1/2)‖w‖²_2,
    s.t.    w · φ(x^j_p) ≥ w · φ(x^j_q),  ∀ j ∈ [N], p ∈ [k], q ∈ [n]\[k],    (6)

where [N] denotes the integer set {1, ..., N}, and likewise for [n] and [k].


Learning with Large Margin Model

A soft-margin formulation:

    min_{w, ξ_1, ..., ξ_N}   (1/2)‖w‖²_2 + C Σ_{j=1}^{N} ξ_j,
    s.t.   w · (φ(x^j_p) − φ(x^j_q)) ≥ 1 − ξ_j,  ∀ j ∈ [N], ∀ p ∈ [k], ∀ q ∈ [n]\[k],

where ξ_j is a non-negative slack variable and C is a non-negative trade-off parameter.
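One way to optimize this objective (a sketch only; the slides do not state the paper's actual solver) is subgradient descent on the equivalent unconstrained hinge form. Because each image shares a single slack ξ_j, only its most violated pair, the weakest top-k proposal against the strongest remaining one, contributes to the gradient. `train_partial_ranking` and all hyperparameters are illustrative:

```python
import numpy as np

def train_partial_ranking(images, k, C=1.0, lr=0.01, epochs=60):
    """Subgradient descent on (1/2)||w||^2 + C * sum_j xi_j, where
    xi_j = max(0, 1 - (min_p w.phi(x_p) - max_q w.phi(x_q))).
    images: list of (n, D) feature matrices with rows sorted by
    descending IoU, so rows [0, k) are the top-k for that image."""
    w = np.zeros(images[0].shape[1])
    for _ in range(epochs):
        for Phi in images:
            scores = Phi @ w
            p = int(np.argmin(scores[:k]))       # weakest top-k proposal
            q = k + int(np.argmax(scores[k:]))   # strongest remaining proposal
            grad = w.copy()                      # gradient of the regularizer
            if scores[p] - scores[q] < 1.0:      # shared slack is active
                grad -= C * (Phi[p] - Phi[q])
            w -= lr * grad
    return w
```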


Overview of learning procedure

Figure 1: Overview of the learning procedure: (1) image → (2) object proposals → (3) training samples (top k as positives, last n − k as negatives) → (4) features → (5) kernel linearization via consistent weighted sampling → (6) training the partial ranking model.


Experiment

Dataset: PASCAL VOC 2007.
Evaluation metrics: Recall and Average Recall (AR).
Baselines: BING, CPMC, GOP, EB, Endres, MCG, OBJ, Rigor, Rantalankila, RS, M-MCG, RP, SS.
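For reference, the two metrics can be computed as below (a sketch; I assume AR here averages recall over IoU thresholds 0.5 to 0.95 in steps of 0.05, as in the standard proposal benchmark, and the function names are mine):

```python
import numpy as np

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    iw = min(a[2], b[2]) - max(a[0], b[0])
    ih = min(a[3], b[3]) - max(a[1], b[1])
    inter = max(0.0, iw) * max(0.0, ih)
    union = ((a[2] - a[0]) * (a[3] - a[1])
             + (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / union if union > 0 else 0.0

def recall_at(gt_boxes, proposals, thr):
    """Fraction of ground-truth boxes matched by some proposal at IoU >= thr."""
    hits = sum(any(iou(g, p) >= thr for p in proposals) for g in gt_boxes)
    return hits / len(gt_boxes)

def average_recall(gt_boxes, proposals, thrs=np.linspace(0.5, 0.95, 10)):
    """Mean recall over a grid of IoU thresholds."""
    return float(np.mean([recall_at(gt_boxes, proposals, t) for t in thrs]))
```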


Experimental Results

Figure 2: Comparison with all baselines. (a) Recall versus IoU threshold with 100 proposals per image; (b) Recall versus IoU threshold with 500 proposals per image; (c) Recall at 0.7 IoU versus number of proposals; (d) Average recall versus number of proposals. Curves are shown for BING, CPMC, EB, Endres, GOP, MCG, M-MCG, OBJ, Rantalankila, Rigor, RP, RS, SS, and our variants PR-EB, PR-GOP, PR-MCG, PR-OBJ, PR-Rigor, PR-RS, PR-SS.

Experimental Results on each class

Table 1: Average Recall on each of the 20 classes of the VOC 2007 test set with 300 proposals per image.

| Algorithm    | aero | bicycle | bird | boat | bottle | bus  | car  | cat  | chair | cow  | table | dog  | horse | mbike | person | plant | sheep | sofa | train | tv   |
|--------------|------|---------|------|------|--------|------|------|------|-------|------|-------|------|-------|-------|--------|-------|-------|------|-------|------|
| BING         | 0.27 | 0.27    | 0.22 | 0.16 | 0.13   | 0.27 | 0.20 | 0.40 | 0.19  | 0.22 | 0.32  | 0.34 | 0.28  | 0.27  | 0.24   | 0.20  | 0.21  | 0.37 | 0.34  | 0.24 |
| CPMC         | 0.55 | 0.44    | 0.44 | 0.33 | 0.17   | 0.61 | 0.43 | 0.76 | 0.34  | 0.52 | 0.56  | 0.71 | 0.57  | 0.51  | 0.37   | 0.31  | 0.47  | 0.71 | 0.62  | 0.53 |
| EB           | 0.51 | 0.51    | 0.45 | 0.37 | 0.20   | 0.58 | 0.41 | 0.61 | 0.32  | 0.50 | 0.50  | 0.62 | 0.57  | 0.52  | 0.37   | 0.32  | 0.47  | 0.57 | 0.57  | 0.55 |
| Endres       | 0.49 | 0.53    | 0.40 | 0.33 | 0.18   | 0.59 | 0.47 | 0.74 | 0.39  | 0.49 | 0.60  | 0.70 | 0.56  | 0.55  | 0.37   | 0.33  | 0.46  | 0.75 | 0.66  | 0.49 |
| GOP          | 0.41 | 0.43    | 0.34 | 0.26 | 0.12   | 0.55 | 0.38 | 0.72 | 0.27  | 0.37 | 0.58  | 0.63 | 0.50  | 0.47  | 0.32   | 0.26  | 0.33  | 0.67 | 0.59  | 0.39 |
| MCG          | 0.52 | 0.49    | 0.42 | 0.32 | 0.25   | 0.63 | 0.45 | 0.73 | 0.39  | 0.52 | 0.53  | 0.67 | 0.57  | 0.52  | 0.42   | 0.32  | 0.47  | 0.70 | 0.63  | 0.59 |
| M-MCG        | 0.60 | 0.52    | 0.47 | 0.39 | 0.25   | 0.62 | 0.46 | 0.73 | 0.41  | 0.55 | 0.57  | 0.70 | 0.58  | 0.55  | 0.43   | 0.35  | 0.52  | 0.71 | 0.65  | 0.58 |
| OBJ          | 0.33 | 0.33    | 0.27 | 0.23 | 0.13   | 0.41 | 0.28 | 0.45 | 0.21  | 0.29 | 0.42  | 0.41 | 0.37  | 0.31  | 0.26   | 0.21  | 0.26  | 0.46 | 0.41  | 0.30 |
| Rantalankila | 0.45 | 0.35    | 0.35 | 0.25 | 0.16   | 0.42 | 0.35 | 0.63 | 0.33  | 0.42 | 0.44  | 0.59 | 0.40  | 0.41  | 0.25   | 0.25  | 0.35  | 0.59 | 0.44  | 0.51 |
| Rigor        | 0.37 | 0.29    | 0.27 | 0.22 | 0.09   | 0.40 | 0.29 | 0.62 | 0.17  | 0.30 | 0.35  | 0.52 | 0.37  | 0.36  | 0.20   | 0.16  | 0.24  | 0.55 | 0.41  | 0.31 |
| RP           | 0.54 | 0.38    | 0.33 | 0.28 | 0.13   | 0.50 | 0.34 | 0.66 | 0.31  | 0.38 | 0.55  | 0.59 | 0.43  | 0.42  | 0.28   | 0.24  | 0.37  | 0.67 | 0.52  | 0.48 |
| RS           | 0.04 | 0.14    | 0.12 | 0.08 | 0.02   | 0.13 | 0.08 | 0.19 | 0.12  | 0.13 | 0.08  | 0.18 | 0.16  | 0.16  | 0.09   | 0.13  | 0.14  | 0.10 | 0.14  | 0.19 |
| SS           | 0.62 | 0.48    | 0.41 | 0.33 | 0.14   | 0.56 | 0.39 | 0.72 | 0.33  | 0.44 | 0.60  | 0.68 | 0.50  | 0.49  | 0.34   | 0.27  | 0.41  | 0.71 | 0.61  | 0.50 |
| VGG          | 0.42 | 0.48    | 0.41 | 0.34 | 0.29   | 0.44 | 0.45 | 0.54 | 0.34  | 0.50 | 0.47  | 0.55 | 0.52  | 0.49  | 0.48   | 0.36  | 0.45  | 0.52 | 0.50  | 0.47 |
| ZF           | 0.40 | 0.46    | 0.36 | 0.32 | 0.23   | 0.41 | 0.41 | 0.51 | 0.27  | 0.45 | 0.47  | 0.53 | 0.49  | 0.44  | 0.43   | 0.32  | 0.40  | 0.50 | 0.48  | 0.38 |
| PR-EB        | 0.51 | 0.55    | 0.49 | 0.41 | 0.33   | 0.59 | 0.48 | 0.62 | 0.42  | 0.53 | 0.52  | 0.63 | 0.57  | 0.57  | 0.45   | 0.43  | 0.53  | 0.58 | 0.56  | 0.52 |
| PR-GOP       | 0.42 | 0.43    | 0.38 | 0.30 | 0.17   | 0.54 | 0.43 | 0.63 | 0.33  | 0.48 | 0.50  | 0.57 | 0.43  | 0.42  | 0.33   | 0.27  | 0.42  | 0.58 | 0.50  | 0.48 |
| PR-MCG       | 0.56 | 0.55    | 0.45 | 0.38 | 0.28   | 0.64 | 0.50 | 0.72 | 0.41  | 0.58 | 0.61  | 0.68 | 0.59  | 0.57  | 0.46   | 0.40  | 0.52  | 0.71 | 0.66  | 0.56 |
| PR-OBJ       | 0.34 | 0.34    | 0.29 | 0.25 | 0.17   | 0.41 | 0.29 | 0.45 | 0.24  | 0.30 | 0.44  | 0.42 | 0.38  | 0.34  | 0.28   | 0.25  | 0.27  | 0.47 | 0.41  | 0.30 |
| PR-Rigor     | 0.50 | 0.38    | 0.32 | 0.30 | 0.09   | 0.47 | 0.34 | 0.68 | 0.21  | 0.35 | 0.48  | 0.60 | 0.47  | 0.43  | 0.22   | 0.19  | 0.29  | 0.62 | 0.56  | 0.28 |
| PR-RS        | 0.50 | 0.45    | 0.41 | 0.35 | 0.20   | 0.53 | 0.44 | 0.59 | 0.39  | 0.46 | 0.45  | 0.55 | 0.44  | 0.46  | 0.35   | 0.36  | 0.45  | 0.56 | 0.46  | 0.53 |
| PR-SS        | 0.69 | 0.60    | 0.54 | 0.45 | 0.27   | 0.66 | 0.52 | 0.79 | 0.51  | 0.57 | 0.70  | 0.76 | 0.61  | 0.61  | 0.47   | 0.44  | 0.55  | 0.80 | 0.68  | 0.65 |

Thank you!

