On the growth factor for Hadamard matrices

Christos Kravvaritis, joint work with Marilena Mitrouli

University of Athens, Department of Mathematics

Householder Symposium XVII - Zeuthen 2008

Outline

Introduction: Gaussian Elimination, Definitions, Importance of this study
History of the problem: Determinants, Preliminary Results
Solution: The proposed idea, Pivots from the beginning, Pivots from the end, Numerical experiments, Pivot patterns
Summary
References

Gaussian Elimination

- Linear system Ax = b, A = [a_{ij}] ∈ R^{n×n}

- Gaussian Elimination (GE):

    A = \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & & \vdots \\ a_{n1} & a_{n2} & \cdots & a_{nn} \end{pmatrix}
    \longrightarrow
    A^{(n-1)} = \begin{pmatrix} a_{11} & a_{12} & a_{13} & \cdots & a_{1n} \\ 0 & a_{22}^{(1)} & a_{23}^{(1)} & \cdots & a_{2n}^{(1)} \\ 0 & 0 & a_{33}^{(2)} & \cdots & a_{3n}^{(2)} \\ \vdots & \vdots & & \ddots & \vdots \\ 0 & 0 & 0 & \cdots & a_{nn}^{(n-1)} \end{pmatrix}

Backward error analysis for GE −→ growth factor

    g(n, A) = max_{i,j,k} |a_{ij}^{(k)}| / max_{i,j} |a_{ij}|

Theorem (Wilkinson)
The computed solution x̂ to the linear system Ax = b using GE with partial pivoting satisfies

    (A + ΔA) x̂ = b   with   ‖ΔA‖_∞ ≤ c n^3 g(n, A) ‖A‖_∞ u.
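As an illustration of these definitions (my own sketch, not part of the original slides), here is a minimal NumPy routine that runs GE with complete pivoting, records the pivot magnitudes, and returns g(n, A); the function name and the 4 × 4 ±1 test matrix are my own choices (the test matrix is the H_4 shown later in the talk).

```python
import numpy as np

def growth_factor_complete_pivoting(A):
    """GE with complete pivoting; returns (g(n, A), pivot magnitudes)."""
    A = np.array(A, dtype=float)
    n = A.shape[0]
    max_orig = np.max(np.abs(A))
    max_seen = max_orig
    pivots = []
    for k in range(n):
        # complete pivoting: swap the largest remaining entry into position (k, k)
        sub = np.abs(A[k:, k:])
        i, j = np.unravel_index(np.argmax(sub), sub.shape)
        A[[k, k + i], :] = A[[k + i, k], :]
        A[:, [k, k + j]] = A[:, [k + j, k]]
        pivots.append(abs(A[k, k]))
        if k < n - 1:
            factors = A[k + 1:, k] / A[k, k]
            A[k + 1:, k:] -= np.outer(factors, A[k, k:])
        max_seen = max(max_seen, np.max(np.abs(A[k:, k:])))  # track max |a_ij^(k)|
    return max_seen / max_orig, pivots

H4 = np.array([[1,  1,  1,  1],
               [1, -1,  1, -1],
               [1,  1, -1, -1],
               [1, -1, -1,  1]])
print(growth_factor_complete_pivoting(H4))   # (4.0, [1.0, 2.0, 2.0, 4.0])
```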

Definition
Completely pivoted (CP) matrices: no row and column exchanges are needed during GE with complete pivoting.


Definition
A Hadamard matrix H of order n (symb. H_n) is a ±1 matrix satisfying

    HH^T = H^T H = nI_n.

Important property: every two distinct rows and columns of a Hadamard matrix are orthogonal.

Example

    H_4 = \begin{pmatrix} 1 & 1 & 1 & 1 \\ 1 & -1 & 1 & -1 \\ 1 & 1 & -1 & -1 \\ 1 & -1 & -1 & 1 \end{pmatrix}
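A quick sketch of my own that builds Hadamard matrices by the Sylvester doubling H_{2n} = [[H_n, H_n], [H_n, −H_n]] mentioned later in the talk and checks the defining property; the H_4 above and the H_16 example that follows have this Sylvester structure.

```python
import numpy as np

def sylvester_hadamard(n):
    """Sylvester doubling H_{2m} = [[H_m, H_m], [H_m, -H_m]]; n must be a power of 2."""
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

for n in (4, 16):
    H = sylvester_hadamard(n)
    assert np.array_equal(H @ H.T, n * np.eye(n))   # HH^T = n I_n
    assert np.array_equal(H.T @ H, n * np.eye(n))   # H^T H = n I_n
print(sylvester_hadamard(4))                         # the H_4 shown above
```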

Example

H_16 =
     1  1  1  1  1  1  1  1  1  1  1  1  1  1  1  1
     1 -1  1 -1  1 -1  1 -1  1 -1  1 -1  1 -1  1 -1
     1  1 -1 -1  1  1 -1 -1  1  1 -1 -1  1  1 -1 -1
     1 -1 -1  1  1 -1 -1  1  1 -1 -1  1  1 -1 -1  1
     1  1  1  1 -1 -1 -1 -1  1  1  1  1 -1 -1 -1 -1
     1 -1  1 -1 -1  1 -1  1  1 -1  1 -1 -1  1 -1  1
     1  1 -1 -1 -1 -1  1  1  1  1 -1 -1 -1 -1  1  1
     1 -1 -1  1 -1  1  1 -1  1 -1 -1  1 -1  1  1 -1
     1  1  1  1  1  1  1  1 -1 -1 -1 -1 -1 -1 -1 -1
     1 -1  1 -1  1 -1  1 -1 -1  1 -1  1 -1  1 -1  1
     1  1 -1 -1  1  1 -1 -1 -1 -1  1  1 -1 -1  1  1
     1 -1 -1  1  1 -1 -1  1 -1  1  1 -1 -1  1  1 -1
     1  1  1  1 -1 -1 -1 -1 -1 -1 -1 -1  1  1  1  1
     1 -1  1 -1 -1  1 -1  1 -1  1 -1  1  1 -1  1 -1
     1  1 -1 -1 -1 -1  1  1 -1 -1  1  1  1  1 -1 -1
     1 -1 -1  1 -1  1  1 -1 -1  1  1 -1  1 -1 -1  1

Definition
Two matrices are said to be Hadamard equivalent or H-equivalent if one can be obtained from the other by a sequence of the operations:
1. interchange any pairs of rows and/or columns;
2. multiply any rows and/or columns through by −1.


Why Hadamard matrices?

1. Numerous applications in various areas of modern Mathematics:
   - Statistics - Theory of Experimental Designs
   - Coding Theory
   - Cryptography
   - Combinatorics
   - Image Processing
   - Signal Processing
   - Analytical Chemistry

2. Interesting properties regarding the size of the pivots appearing after application of GE

History

- Tornheim, 1964: g(n, H) ≥ n for a CP n × n Hadamard matrix H

- Cryer, 1968: Conjecture: g(n, A) ≤ n, with equality iff A is a Hadamard matrix

- Day & Peterson, 1988: proved the equality only for the Hadamard-Sylvester class
      H_{2n} = \begin{pmatrix} H_n & H_n \\ H_n & -H_n \end{pmatrix}
  Conjecture: p_{n−3} = n/4

- Gould, 1991: found a 13 × 13 matrix with growth 13.0205.
  The first part of Cryer's conjecture is false; the second part still remains open.

- Edelman & Mascarenhas, 1995: g(12, H_12) = 12

- Edelman & Friedman, 1998: found the first H_16 with p_{n−3} = n/2.
  Day & Peterson's conjecture is false.

- K. & Mitrouli, 2007: g(16, H_16) = 16

Difficulty of the problem

The pivot pattern is not invariant under H-equivalence operations, i.e. H-equivalent matrices may have different pivot patterns.

A naive exhaustive computer search over all H-equivalent H_16 requires (16!)^2 (2^16)^2 ≈ 10^36 trials.

In addition, the pivot pattern of each one of these matrices would have to be computed → many years of computations!
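To make the non-invariance concrete, here is a small experiment of my own (it is not the exhaustive search discussed above, and the tie-breaking rule inside complete pivoting is an implementation choice): random H-equivalence operations applied to the Sylvester H_16 typically produce several distinct pivot patterns.

```python
import numpy as np

def cp_pivots(A):
    """Pivot magnitudes of GE with complete pivoting (first-index tie-breaking)."""
    A = A.astype(float).copy()
    n = A.shape[0]
    piv = []
    for k in range(n):
        i, j = np.unravel_index(np.argmax(np.abs(A[k:, k:])), (n - k, n - k))
        A[[k, k + i], :] = A[[k + i, k], :]
        A[:, [k, k + j]] = A[:, [k + j, k]]
        piv.append(abs(A[k, k]))
        if k < n - 1:
            A[k + 1:, k:] -= np.outer(A[k + 1:, k] / A[k, k], A[k, k:])
    return tuple(np.round(piv, 6))

H = np.array([[1]])
for _ in range(4):
    H = np.block([[H, H], [H, -H]])                      # Sylvester H_16

rng = np.random.default_rng(0)
patterns = set()
for _ in range(50):
    r, c = rng.permutation(16), rng.permutation(16)       # row/column interchanges
    dr, dc = rng.choice([-1, 1], 16), rng.choice([-1, 1], 16)  # sign changes
    patterns.add(cp_pivots(H[r][:, c] * dr[:, None] * dc))
print(len(patterns), "distinct pivot patterns among 50 H-equivalent copies")
```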

Solution

Main idea 1: Calculate pivots with:

Lemma
Let A be a CP matrix and A(j) denote the absolute value of the upper left j × j principal minor of A.
(i) [Gantmacher 1959] The magnitude of the pivots appearing after application of GE operations on A is given by

    p_j = A(j) / A(j − 1),   j = 1, 2, . . . , n,   A(0) = 1.        (1)

(ii) [Cryer 1968] The maximum j × j leading principal minor of A, when the first j − 1 rows and columns are fixed, is A(j).

→ it is important to calculate minors!
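A minimal numerical check of formula (1), assuming the leading principal minors are nonzero; the helper name and the H_4 test case are my own choices.

```python
import numpy as np

def pivots_from_minors(A):
    """Pivot magnitudes via (1): p_j = A(j) / A(j-1), A(j) = |leading j x j minor|."""
    n = A.shape[0]
    minors = [1.0] + [abs(np.linalg.det(A[:j, :j])) for j in range(1, n + 1)]
    return [minors[j] / minors[j - 1] for j in range(1, n + 1)]

H4 = np.array([[1,  1,  1,  1],
               [1, -1,  1, -1],
               [1,  1, -1, -1],
               [1, -1, -1,  1]], dtype=float)
print(pivots_from_minors(H4))   # [1.0, 2.0, 2.0, 4.0] -- the CP pivots of this H_4
```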

First known effort for calculating the n − 1, n − 2 and n − 3 minors of Hadamard matrices:

F. R. Sharpe, The maximum value of a determinant, Bull. Amer. Math. Soc. 14, 121–123 (1907)

n − 4 minors of Hadamard matrices and a related computer algorithm:

C. Koukouvinos, M. Mitrouli and J. Seberry, An algorithm to find formulae and values of minors of Hadamard matrices, Linear Algebra Appl. 330, 129–147 (2001)


Preliminary Results

Lemma
Let A = (k − λ)I_v + λJ_v = \begin{pmatrix} k & λ & \cdots & λ \\ λ & k & \cdots & λ \\ \vdots & \vdots & \ddots & \vdots \\ λ & λ & \cdots & k \end{pmatrix}, where k, λ are integers. Then

    det A = [k + (v − 1)λ](k − λ)^{v−1}                                  (2)

and for k ≠ λ, −(v − 1)λ, A is nonsingular with

    A^{−1} = \frac{1}{k^2 + (v − 2)kλ − (v − 1)λ^2} { [k + (v − 1)λ] I_v − λ J_v }.   (3)

Lemma
Let B = \begin{pmatrix} B_1 & B_2 \\ B_3 & B_4 \end{pmatrix}, with B_1 nonsingular. Then

    det B = det B_1 · det(B_4 − B_3 B_1^{−1} B_2).                        (4)
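A small numerical sanity check of formulas (2)–(4) (my own illustration; the talk uses them symbolically): the closed-form determinant and inverse of (k − λ)I_v + λJ_v, and the Schur-complement determinant identity on a random ±1/0 block matrix of my choosing.

```python
import numpy as np

v, k, lam = 5, 3, -1
A = (k - lam) * np.eye(v) + lam * np.ones((v, v))
detA = (k + (v - 1) * lam) * (k - lam) ** (v - 1)                      # formula (2)
Ainv = ((k + (v - 1) * lam) * np.eye(v) - lam * np.ones((v, v))) / (
    k**2 + (v - 2) * k * lam - (v - 1) * lam**2)                       # formula (3)
assert np.isclose(np.linalg.det(A), detA)
assert np.allclose(A @ Ainv, np.eye(v))

# formula (4): det B = det B1 * det(B4 - B3 B1^{-1} B2) for nonsingular B1
rng = np.random.default_rng(0)
B = rng.integers(-1, 2, size=(6, 6)).astype(float)
B1, B2, B3, B4 = B[:3, :3], B[:3, 3:], B[3:, :3], B[3:, 3:]
if abs(np.linalg.det(B1)) > 1e-12:
    assert np.isclose(np.linalg.det(B),
                      np.linalg.det(B1) * np.linalg.det(B4 - B3 @ np.linalg.inv(B1) @ B2))
```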


Main idea 2: Calculation of pivots from the beginning and from the end with different techniques

    p_1, p_2, . . . , p_8    computed from the beginning (−→)
    p_16, p_15, . . . , p_10    computed from the end (←−)

and the remaining pivot is recovered from

    p_9 = det H / ∏_{i=1, i≠9}^{16} p_i
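A self-contained numerical illustration of my own of the identity behind the middle pivot: since the pivots p_j = A(j)/A(j−1) telescope, their product is |det H|, so p_9 can be recovered once the other fifteen pivots are known. The Sylvester H_16 is used here because all of its leading principal minors are nonzero.

```python
import numpy as np

H = np.array([[1]])
for _ in range(4):
    H = np.block([[H, H], [H, -H]])              # Sylvester H_16

minors = [1.0] + [abs(np.linalg.det(H[:j, :j])) for j in range(1, 17)]
p = [minors[j] / minors[j - 1] for j in range(1, 17)]       # pivots via (1)

p9 = abs(np.linalg.det(H)) / np.prod([p[i] for i in range(16) if i != 8])
print(round(p9, 6), round(p[8], 6))              # both equal 2 for this matrix
```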


Pivots from the beginning

- We specify all possible j × j matrices that can appear as the upper left corner of a CP H_16 → algorithm Exist (symbolical, implemented in Maple)

Preliminaries

    U_j = [u^1 u^2 . . . u^{2^{j−1}}]

is the j × 2^{j−1} matrix whose columns u^1, . . . , u^{2^{j−1}} are all possible ±1 columns with first entry +1 (for j = 4 it is written out in Step 2 below); u_i denotes how many columns of type u^i appear among the first j rows of H.

Lemma
For the first j rows, j ≥ 3, of a normalized Hadamard matrix H of order n, n > 3, and for all the 2^{j−1} possible columns u^1, . . . , u^{2^{j−1}} of U_j, it holds

    0 ≤ u_i ≤ n/4,  for i = 1, . . . , 2^{j−1}.

Implementation of algorithm Exist

Step 1  We want to establish whether

    H_4 = \begin{pmatrix} 1 & 1 & 1 & 1 \\ 1 & -1 & 1 & -1 \\ 1 & 1 & -1 & -1 \\ 1 & -1 & -1 & 1 \end{pmatrix} = [u^1 u^6 u^4 u^7]

always exists in the upper left 4 × 4 corner of an H_16.

Step 2

            u^1 u^2 u^3 u^4 u^5 u^6 u^7 u^8
             1   1   1   1   1   1   1   1
    U_4 =    1   1   1   1  -1  -1  -1  -1
             1   1  -1  -1   1   1  -1  -1
             1  -1   1  -1   1  -1   1  -1

Step 3  Counting of columns + orthogonality of every two distinct rows of U_4 ⇒

    u_1 + u_2 + u_3 + u_4 + u_5 + u_6 + u_7 + u_8 = n
    u_1 + u_2 + u_3 + u_4 − u_5 − u_6 − u_7 − u_8 = 0
    u_1 + u_2 − u_3 − u_4 + u_5 + u_6 − u_7 − u_8 = 0
    u_1 + u_2 − u_3 − u_4 − u_5 − u_6 + u_7 + u_8 = 0
    u_1 − u_2 + u_3 − u_4 + u_5 − u_6 + u_7 − u_8 = 0
    u_1 − u_2 + u_3 − u_4 − u_5 + u_6 − u_7 + u_8 = 0
    u_1 − u_2 − u_3 + u_4 + u_5 − u_6 − u_7 + u_8 = 0

Solution:

    u_1 = 4 − u_8,  u_2 = u_8,  u_3 = u_8,  u_4 = 4 − u_8,
    u_5 = u_8,  u_6 = 4 − u_8,  u_7 = 4 − u_8

Lemma ⇒ u_8 = 0, 1, 2, 3, 4

    u_8 = 0: (u_1, . . . , u_8) = (4, 0, 0, 4, 0, 4, 4, 0)
    u_8 = 1: (u_1, . . . , u_8) = (3, 1, 1, 3, 1, 3, 3, 1)
    u_8 = 2: (u_1, . . . , u_8) = (2, 2, 2, 2, 2, 2, 2, 2)
    u_8 = 3: (u_1, . . . , u_8) = (1, 3, 3, 1, 3, 1, 1, 3)
    u_8 = 4: (u_1, . . . , u_8) = (0, 4, 4, 0, 4, 0, 0, 4)

→ always u_1, u_4, u_6, u_7 ≥ 1
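The same counting can be brute-forced numerically; the sketch below (mine, not the Maple algorithm Exist) sets up the count/orthogonality equations for the 4-row case and enumerates 0 ≤ u_i ≤ n/4, recovering exactly the five solutions above.

```python
import numpy as np
from itertools import product

n = 16
# the 8 possible +/-1 column types with first entry +1, in the order u^1, ..., u^8
T = np.array([[1, a, b, c] for a in (1, -1) for b in (1, -1) for c in (1, -1)]).T  # 4 x 8

pairs = [(i, j) for i in range(4) for j in range(i, 4)]
M = np.array([T[i] * T[j] for i, j in pairs])            # sign pattern of each equation
b = np.array([n if i == j else 0 for i, j in pairs])     # row norms n, inner products 0

U = np.array(list(product(range(n // 4 + 1), repeat=8))) # all candidates with 0 <= u_i <= n/4
sols = U[np.all(U @ M.T == b, axis=1)]
print(sols)   # the five solution vectors from the slide; in each one u_1, u_4, u_6, u_7 >= 1
```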

Results

Table: j × j minors of H_16

    j    H_16(j)
    4    16
    5    32, 48
    6    128, 160
    7    256, 384, 512, 576
    8    1024, 1536, 2048, 2304, 2560, 3072, 4096


Pivots from the end

- Computation of (n − j) × (n − j) minors → algorithm Minors (symbolical, implemented in Maple)

- Partition  H = \begin{pmatrix} M & U_j \\ U_j^T & D \end{pmatrix}

- solve the familiar linear system
- form DD^T, partitioned appropriately in blocks
- compute det DD^T with consecutive applications of (4), with the help of (2) and (3)
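A numerical cross-check of my own of the "pivots from the end" idea on the Sylvester H_16 (the algorithm Minors itself works symbolically): with the partition above and M of order j, the minor |det D| is obtained from det(DD^T) = (det D)^2, and it agrees with the complementary-minor relation |det D| = n^{n/2−j} |det M|, consistent with the tables that follow.

```python
import numpy as np

H = np.array([[1]])
for _ in range(4):
    H = np.block([[H, H], [H, -H]])               # Sylvester H_16

n, j = 16, 4
M, D = H[:j, :j], H[j:, j:]                       # H = [[M, U_j], [U_j^T, D]]
det_D = np.sqrt(np.linalg.det(D @ D.T))           # |det D| via det(DD^T) = (det D)^2
print(round(det_D), round(n ** (n // 2 - j) * abs(np.linalg.det(M))))  # both 1048576 = 16 * 16^4
```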


Numerical experiments

Table: Values of minors of orders n − 1, . . . , n − 7 for Hadamard matrices of order n = 12, 16, 20

    order   values of minors
    n − 1   n^{n/2−1}
    n − 2   0, 2n^{n/2−2}
    n − 3   0, 4n^{n/2−3}
    n − 4   0, 8n^{n/2−4}, 16n^{n/2−4}
    n − 5   0, 16n^{n/2−5}, 32n^{n/2−5}, 48n^{n/2−5}
    n − 6   0, 32n^{n/2−6}, 64n^{n/2−6}, 96n^{n/2−6}, 128n^{n/2−6}, 160n^{n/2−6}
    n − 7   0, 64n^{n/2−7}, 128n^{n/2−7}, 192n^{n/2−7}, 256n^{n/2−7}, 320n^{n/2−7}, 384n^{n/2−7}, 448n^{n/2−7}, 512n^{n/2−7}, 576n^{n/2−7}

The conjecture for minors of Hadamard matrices

All possible (n − j) × (n − j), j ≥ 1, minors of Hadamard matrices are

    0  or  p · n^{(n/2)−j},   for p = 2^{j−1}, 2 · 2^{j−1}, 3 · 2^{j−1}, . . . , s · 2^{j−1},

where s · 2^{j−1} = max{det(A) | A ∈ R^{j×j}, with entries ±1} and the value 0 is excluded for the case j = 1.

Table: Possible determinant values for n × n ±1 matrices

    n    det
    1    1
    2    0, 2
    3    0, 4
    4    0, 8, 16
    5    0, 16, 32, 48
    6    0, 32, 64, 96, 128, 160
    7    0, 64, 128, 192, 256, 320, 384, 448, 512, 576
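The small cases of this table can be reproduced by brute force; the sketch below (my own) normalizes the first row and column to +1, which changes at most the sign of the determinant, and enumerates the remaining entries for n ≤ 5.

```python
import numpy as np
from itertools import product

# Possible |det| values of n x n +/-1 matrices, n <= 5, enumerating matrices whose
# first row and column are all +1 (row/column sign changes only flip the sign).
for n in range(1, 6):
    vals = set()
    for bits in product((1, -1), repeat=(n - 1) ** 2):
        A = np.ones((n, n), dtype=int)
        if n > 1:
            A[1:, 1:] = np.reshape(bits, (n - 1, n - 1))
        vals.add(round(abs(np.linalg.det(A))))
    print(n, sorted(vals))
# 1 [1] / 2 [0, 2] / 3 [0, 4] / 4 [0, 8, 16] / 5 [0, 16, 32, 48]
```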


Pivot patterns

Table: representative pivot patterns of H_16, grouped by equivalence class (classes I, II, III, IV/V)

    (1,2,2,4,2,4,4,8,2,4,4,8,4,8,8,16)
    (1,2,2,4,3,8/3,4,6,8/3,4,4,8,4,8,8,16)
    (1,2,2,4,3,8/3,2,4,4,4,4,8,8,8,8,16)
    (1,2,2,4,3,10/3,8/(10/3),4,16/3,5,16/(10/3),16/3,4,8,8,16)
    (1,2,2,4,2,4,4,4,4,4,4,8,4,8,8,16)
    (1,2,2,4,2,4,4,6,8/3,4,6,16/3,4,8,8,16)
    (1,2,2,4,2,4,4,8,2,4,4,8,4,8,8,16)
    (1,2,2,4,2,4,4,6,8/3,4,4,8,4,8,8,16)
    (1,2,2,4,2,4,4,4,9/2,16/(18/5),16/(10/3),16/3,4,8,8,16)
    (1,2,2,4,3,10/3,18/5,4,4,16/(18/5),16/(10/3),16/3,4,8,8,16)
    (1,2,2,4,2,4,4,8,2,4,4,8,4,8,8,16)
    (1,2,2,4,2,4,4,4,9/2,16/(18/5),16/(10/3),16/3,4,8,8,16)

Appropriate substitution of minors in (1) → we obtain all 34 possible pivot patterns of H_16

For a CP matrix A:

    g(n, A) = max_{1≤r≤n} |a_rr^{(r−1)}| / |a_11|

Hence g(16, H_16) = 16
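For a CP matrix the formula above reduces to the largest pivot divided by the first, so g can be read off any pivot pattern directly; the small snippet below (mine) uses the first pattern listed in the table above.

```python
pattern = [1, 2, 2, 4, 2, 4, 4, 8, 2, 4, 4, 8, 4, 8, 8, 16]   # one of the 34 patterns
g = max(pattern) / pattern[0]
print(g)   # 16.0, consistent with g(16, H16) = 16
```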

Conclusions - Discussion - Open Problems

- pivot patterns of H_20, H_24 etc.
- High complexity → more effective implementation
- Parallel implementation of the two main independent tasks
- Classification of the pivot patterns of H_16
- the value 8 as fourth pivot from the end for H_16
- sharper bound than n/4 (maybe n/8)
- connection of Hadamard minors with ±1 determinants

- Statistical approach:
  L. N. Trefethen and R. S. Schreiber, Average-case stability of Gaussian elimination, SIAM J. Matrix Anal. Appl. 11, 335–360 (1990)

- Generalization for ODs: An orthogonal design (OD) of order n and type (u_1, u_2, . . . , u_t), u_i positive integers, is an n × n matrix D with entries from the set {0, ±x_1, ±x_2, . . . , ±x_t} that satisfies

    DD^T = D^T D = (∑_{i=1}^t u_i x_i^2) I_n.
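As a small check of the OD definition (my own example, using the classical OD(4; 1, 1, 1, 1) quaternion-type array; the random values for x_1, . . . , x_4 are arbitrary):

```python
import numpy as np

x1, x2, x3, x4 = np.random.default_rng(1).normal(size=4)
D = np.array([[ x1,  x2,  x3,  x4],
              [-x2,  x1, -x4,  x3],
              [-x3,  x4,  x1, -x2],
              [-x4, -x3,  x2,  x1]])
# D D^T = D^T D = (1*x1^2 + 1*x2^2 + 1*x3^2 + 1*x4^2) I_4
s = x1**2 + x2**2 + x3**2 + x4**2
assert np.allclose(D @ D.T, s * np.eye(4)) and np.allclose(D.T @ D, s * np.eye(4))
```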

References

Progress in the growth factor for Hadamard matrices:

L. Tornheim, Pivot size in Gauss reduction, Tech. Report, Calif. Res. Corp., Richmond, Calif., February 1964.
C. W. Cryer, Pivot size in Gaussian elimination, Numer. Math. 12, 335–345 (1968)
J. Day and B. Peterson, Growth in Gaussian elimination, Amer. Math. Monthly 95, 489–513 (1988)
A. Edelman and W. Mascarenhas, On the complete pivoting conjecture for a Hadamard matrix of order 12, Linear Multilinear Algebra 38, 181–187 (1995)
A. Edelman and D. Friedman, A counterexample to a Hadamard matrix pivot conjecture, Linear Multilinear Algebra 44, 53–56 (1998)
C. Kravvaritis and M. Mitrouli, Evaluation of minors associated to weighing matrices, Linear Algebra Appl. 426, 774–809 (2007)

Progress in the growth factor generally:

A. M. Cohen, A note on pivot size in Gaussian elimination, Linear Algebra Appl. 8, 361–368 (1974)
T. A. Driscoll and K. L. Maki, Searching for rare growth factors using multicanonical Monte Carlo methods, SIAM Review 49, 673–692 (2007)
A. Edelman, The complete pivoting conjecture for Gaussian elimination is false, The Mathematica Journal 2, 58–61 (1992)
N. Gould, On growth in Gaussian elimination with pivoting, SIAM J. Matrix Anal. Appl. 12, 354–361 (1991)
N. J. Higham, Accuracy and Stability of Numerical Algorithms, SIAM, Philadelphia, 2002.
N. J. Higham and D. J. Higham, Large growth factors in Gaussian elimination with pivoting, SIAM J. Matrix Anal. Appl. 10, 155–164 (1989)

Books on orthogonal matrices:

A. V. Geramita and J. Seberry, Orthogonal Designs: Quadratic Forms and Hadamard Matrices, Marcel Dekker, New York-Basel (1979)
A. S. Hedayat, N. J. A. Sloane and J. Stufken, Orthogonal Arrays: Theory and Applications, Springer, New York, 1999.
K. J. Horadam, Hadamard Matrices and Their Applications, Princeton University Press, Princeton (2007)

Thank you very much for your attention!

http://ckravv.googlepages.com
