
Development of PEAR

Lior Weizman

In the implementation of PEAR for functional MRI, we aim to solve the following minimization problem:

$$\arg\min_{A \in C,\, P \in \mathbb{R}^{N \times K}} D(A, P) = \arg\min_{A \in C,\, P \in \mathbb{R}^{N \times K}} \frac{1}{2}\|E\{A + P\} - y\|_2^2 + \lambda\|F_t\{P\}\|_1 \qquad (1)$$

where $F_t$ is a unitary transform, $y$ are the measurements, $E$ is a known transformation, and $C$ is the set of matrices with a known rank $r$. The dimensions of $A$ and $P$ are $N \times K$. Since (1) is an optimization problem over two variables, it can be solved via alternating minimization. In this approach, each iteration minimizes with respect to one variable while keeping the other fixed, and then switches between the variables. In our case, we start with an arbitrary initial point $P_0$ and for $n \geq 1$ we iteratively compute:

$$A_n = \arg\min_{A \in C} D(A, P_{n-1}) \qquad (2)$$

$$P_n = \arg\min_{P \in \mathbb{R}^{N \times K}} D(A_n, P) \qquad (3)$$
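As a toy illustration of the alternating scheme in (2)-(3), the sketch below minimizes a smooth scalar stand-in for $D$ (not PEAR's actual objective, which has an $\ell_1$ term and a rank constraint): $D(a, p) = \tfrac{1}{2}(a + p - y)^2 + \tfrac{\lambda}{2}p^2$, whose joint minimizer is $(a, p) = (y, 0)$. Each sub-step is an exact minimization over one variable with the other held fixed.

```python
# Toy scalar analogue of alternating minimization; all values illustrative.
y, lam = 2.0, 1.0
a, p = 0.0, 5.0                  # arbitrary initial point, playing the role of P0
for _ in range(60):
    a = y - p                    # exact argmin over a with p fixed
    p = (y - a) / (1.0 + lam)    # exact argmin over p with a fixed
# p is halved each sweep (p -> p/(1+lam)), so (a, p) converges to (y, 0).
```

The alternation converges geometrically here because each sub-problem has a closed form; in PEAR the sub-problems (2) and (3) are themselves solved iteratively.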

We solve each sub-problem (2), (3) using the Incremental Subgradient Proximal (ISP) method [1], which solves problems of the form:

$$\min_{X \in D} \left\{ \sum_{i=1}^{m} F_i(X) \right\} \qquad (4)$$

$$F_i(X) = f_i(X) + h_i(X) \qquad (5)$$

where the $F_i(X)$ are convex functions, the functions $f_i$ are suitable for a proximal iteration, and the components $h_i$ are not and are therefore preferably treated with a subgradient iteration. In our case $m = 1$ in each sub-problem, so the subscript $i$ can be omitted. The general solution of (4) is given by [1], Equations (4.14)-(4.15); iterate:

$$Z_k = \arg\min_{X \in \mathbb{R}^{N \times K}} \left\{ f(X) + \frac{1}{2\alpha_k}\|X - X_{k-1}\|^2 \right\} \qquad (6)$$

$$X_k = P_C(Z_k - \alpha_k g_k) \qquad (7)$$
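To make the two-step iteration (6)-(7) concrete, here is a minimal one-dimensional instance (the functions and parameter values are illustrative, not PEAR's): $f(x) = \lambda|x|$ is handled by its proximal map (soft thresholding), $h(x) = \tfrac{1}{2}(x - y)^2$ by a gradient step (a valid subgradient, since $h$ is smooth), and $C = [-10, 10]$ by clipping.

```python
import math

def soft(x, t):
    """Proximal map of t*|x|: scalar soft thresholding."""
    return math.copysign(max(abs(x) - t, 0.0), x)

# Minimize lam*|x| + 0.5*(x - y)^2 over C = [-10, 10] with ISP steps (6)-(7).
y, lam = 2.0, 0.5
x = 0.0
for k in range(1, 2001):
    alpha = 1.0 / k                            # diminishing step size
    z = soft(x, alpha * lam)                   # proximal step (6) on f
    g = z - y                                  # gradient of h at z
    x = min(max(z - alpha * g, -10.0), 10.0)   # projected subgradient step (7)
# The minimizer is y - lam = 1.5 (soft thresholding of y), and x approaches it.
```

With a constant step size the iterates hover within $O(\alpha)$ of the minimizer; the diminishing $\alpha_k = 1/k$ used here removes that bias at the cost of slower convergence.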

where $g_k \in \partial h(Z_k)$.

For (2), based on the definition in (1) and omitting terms that are independent of $A$, the ISP formulation is $h(A) = \frac{1}{2}\|E\{A + P_{n-1}\} - y\|_2^2$ and $f(A) = 0$. Since in our case $h(A)$ is smooth and differentiable, the subgradient set $\partial h(Z_k)$ is a singleton equal to the gradient:

$$\nabla h(A) = E^H\{E\{A + P_{n-1}\} - y\} \qquad (8)$$

and the ISP step is:

$$A_n = P_C\left(A_{n-1} - \alpha_k E^H\{E\{A_{n-1} + P_{n-1}\} - y\}\right) = R_r\left(A_{n-1} - \alpha_k E^H\{E\{A_{n-1} + P_{n-1}\} - y\}\right) \qquad (9)$$

where $R_r(Q)$ is the singular value hard thresholding operator with rank threshold $r$, defined as $R_r(Q) = \sum_{i=1}^{r} \sigma_i u_i v_i^H$, where $\sigma_1 \geq \sigma_2 \geq \dots \geq \sigma_m$ are the singular values of $Q$, and $u_i$ and $v_i$ are the singular vectors associated with $\sigma_i$. In [2] it has been shown experimentally that adding matrix shrinkage to the hard thresholding operator, to obtain iterative hard thresholding + matrix shrinkage (IHT+MS) [3], provides better results. This is done by replacing $R_r(\cdot)$ with $R_r(S_\mu(\cdot))$, where $S_\mu(Q) = U[\Sigma - \mu I]_+ V^H$, $\Sigma = \mathrm{diag}(\sigma_1, \sigma_2, \dots, \sigma_m)$, and $U$ and $V$ hold the left and right singular vectors associated with $\Sigma$.

For (3), based on the definition in (1) and omitting terms that are independent of $P$, the ISP formulation is $h(P) = \frac{1}{2}\|E\{A_n + P\} - y\|_2^2$ and $f(P) = \lambda\|F_t\{P\}\|_1$. As a result, (6) for our case is:

$$Z_k = \arg\min_{P \in \mathbb{R}^{N \times K}} \left\{ \lambda\|F_t\{P\}\|_1 + \frac{1}{2\alpha_k}\|P - P_{k-1}\|^2 \right\} \qquad (10)$$

February 15, 2017

DRAFT
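The two spectral operators $R_r$ and $S_\mu$ can be sketched in a few lines of NumPy. This is a minimal illustration built directly from the SVD definitions above; the function names are mine, not from any PEAR code base.

```python
import numpy as np

def hard_rank_threshold(Q, r):
    """R_r(Q): keep only the r largest singular values (best rank-r approximation)."""
    U, s, Vh = np.linalg.svd(Q, full_matrices=False)
    s[r:] = 0.0                        # zero out all but the top-r singular values
    return (U * s) @ Vh

def matrix_shrinkage(Q, mu):
    """S_mu(Q) = U [Sigma - mu*I]_+ V^H: shrink every singular value by mu."""
    U, s, Vh = np.linalg.svd(Q, full_matrices=False)
    return (U * np.maximum(s - mu, 0.0)) @ Vh
```

The IHT+MS projection used in (9) then composes the two: `hard_rank_threshold(matrix_shrinkage(Q, mu), r)`.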

Since $F_t$ is unitary, we define $Q = F_t\{P\}$ and solve:

$$\arg\min_{Q \in \mathbb{R}^{N \times K}} \left\{ \lambda\|Q\|_1 + \frac{1}{2\alpha_k}\|F_t^H\{Q\} - P_{k-1}\|^2 \right\} \qquad (11)$$

The solution of (11) is obtained iteratively (for $l = 1, \dots, L$) using the iterative shrinkage-thresholding algorithm (ISTA), whose general step is:

$$Q_{l+1} = \Lambda_\lambda\left(Q_l - \frac{1}{\alpha_k} F_t\{F_t^H\{Q_l\} - P_{k-1}\}\right) \qquad (12)$$

where $Q_0 = F_t\{P_{n-1}\}$, $\Lambda_\lambda$ denotes the soft-thresholding operator with parameter $\lambda$, and $Z_k = F_t^H\{Q_L\}$. For (7), since the projection is onto $\mathbb{R}^{N \times K}$ it is the identity, and we get:

$$P_k = Z_k - \alpha_k g_k = Z_k - \alpha_k E^H\{E\{A_n + Z_k\} - y\} \qquad (13)$$
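As a sanity check on the $Z$-step, the following sketch (taking $F_t$ to be the orthonormal temporal DFT purely for illustration) verifies that with $Q_0 = F_t\{P_{n-1}\}$ the bracketed correction term in (12) vanishes, so a single ISTA step ($L = 1$) reduces to $Z = F_t^H\{\Lambda_\lambda(F_t\{P_{n-1}\})\}$:

```python
import numpy as np

def soft(X, lam):
    """Lambda_lam: entrywise (complex-safe) soft-thresholding operator."""
    mag = np.abs(X)
    return np.where(mag > lam, (1.0 - lam / np.maximum(mag, 1e-12)) * X, 0.0)

# Assumed for illustration: Ft is the orthonormal (hence unitary) DFT along time.
Ft = lambda X: np.fft.fft(X, axis=1, norm="ortho")
FtH = lambda X: np.fft.ifft(X, axis=1, norm="ortho")

rng = np.random.default_rng(1)
P_prev = rng.standard_normal((8, 16))        # stands in for P_{n-1}
alpha, lam = 0.5, 0.1

Q0 = Ft(P_prev)                              # Q_0 = Ft{P_{n-1}}
# One ISTA step (12): Ft^H{Q_0} - P_{n-1} = 0 by construction,
# so Q_1 = Lambda_lam(Q_0) exactly (up to floating-point noise).
Q1 = soft(Q0 - (1.0 / alpha) * Ft(FtH(Q0) - P_prev), lam)
Z = FtH(Q1)                                  # Z_k with L = 1
```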

Putting it all together and setting $\alpha_k = \alpha$ and $L = 1$, the solution of (1) is given by:

Algorithm 1. Iterate for $n \geq 1$:

$$Y_n = A_{n-1} - \alpha E^H\{E\{A_{n-1} + P_{n-1}\} - y\}$$
$$A_n = R_r(S_\mu(Y_n))$$
$$Z_n = F_t^H\{\Lambda_\lambda(F_t\{P_{n-1}\})\}$$
$$P_n = Z_n - \alpha E^H\{E\{A_n + Z_n\} - y\}$$
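Algorithm 1 can be sketched end to end on synthetic data. Everything in this sketch is an assumption for demonstration only: $E$ is taken to be an elementwise 0/1 sampling mask (so $E^H = E$ and $E\{E\{X\}\} = E\{X\}$), $F_t$ is the orthonormal temporal DFT, and the sizes and parameters ($\alpha$, $\lambda$, $\mu$, $r$) are arbitrary, not values from the PEAR paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N, K, r = 32, 16, 2

# Synthetic ground truth: low-rank A plus P sparse in the temporal Fourier domain.
A_true = rng.standard_normal((N, r)) @ rng.standard_normal((r, K))
Q_sparse = np.zeros((N, K))
Q_sparse[rng.integers(0, N, 10), rng.integers(0, K, 10)] = 3.0
P_true = np.fft.ifft(Q_sparse, axis=1, norm="ortho").real

mask = (rng.random((N, K)) < 0.6).astype(float)
E = lambda X: mask * X                   # hypothetical sampling operator; E^H = E
y = E(A_true + P_true)                   # measurements

Ft = lambda X: np.fft.fft(X, axis=1, norm="ortho")   # unitary temporal transform
FtH = lambda X: np.fft.ifft(X, axis=1, norm="ortho")

def Rr(Q, rank):
    """Singular value hard thresholding: keep the `rank` largest singular values."""
    U, s, Vh = np.linalg.svd(Q, full_matrices=False)
    s[rank:] = 0.0
    return (U * s) @ Vh

def Smu(Q, mu):
    """Matrix shrinkage: subtract mu from every singular value, clipping at zero."""
    U, s, Vh = np.linalg.svd(Q, full_matrices=False)
    return (U * np.maximum(s - mu, 0.0)) @ Vh

def soft(X, lam):
    """Entrywise (complex-safe) soft thresholding, Lambda_lam."""
    mag = np.abs(X)
    return np.where(mag > lam, (1.0 - lam / np.maximum(mag, 1e-12)) * X, 0.0)

alpha, lam, mu = 1.0, 0.05, 0.01         # illustrative parameter choices
A = np.zeros((N, K), dtype=complex)
P = np.zeros((N, K), dtype=complex)
for n in range(100):
    Yn = A - alpha * E(E(A + P) - y)     # gradient step on the data term
    A = Rr(Smu(Yn, mu), r)               # IHT+MS rank projection
    Z = FtH(soft(Ft(P), lam))            # sparsity step in the F_t domain
    P = Z - alpha * E(E(A + Z) - y)      # gradient step for the P component
```

With $\alpha = 1$ and an idempotent 0/1 mask, the final $P$-step makes the iterate exactly data-consistent on the sampled entries, i.e. $E\{A_n + P_n\} = y$, while $A_n$ remains rank-$r$ by construction.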

REFERENCES

[1] S. Sra, S. Nowozin, and S. J. Wright, Optimization for Machine Learning. MIT Press, 2012.
[2] M. Chiew, S. M. Smith, P. J. Koopmans, N. N. Graedel, T. Blumensath, and K. L. Miller, "k-t FASTER: a novel method for accelerating fMRI data acquisition using low rank constraints," Magnetic Resonance in Medicine, vol. 74, no. 2, pp. 353–364, 2015.
[3] D. Goldfarb and S. Ma, "Convergence of fixed-point continuation algorithms for matrix rank minimization," Foundations of Computational Mathematics, vol. 11, no. 2, pp. 183–210, 2011.

