CSE342/542 - Lecture 2, January 10, 2013

Slides from Bebis and other sources

Probability Review
• Prior probability
– P(Face) = 0.1 means that "in the absence of any other information, there is a 10% chance that any image contains a face".

• Posterior probability
– P(Face/Eyes) = 0.95 means that "there is a 95% chance that the image contains a face given that it has eyes".

2  

Probability Review
• Conditional probabilities can be defined in terms of unconditional probabilities:

P(A/B) = P(A,B) / P(B),   P(B/A) = P(B,A) / P(A)



• Conditional probabilities lead to the chain rule:
P(A,B) = P(A/B) P(B) = P(B/A) P(A)
3
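A minimal Python sketch of these definitions, using a small made-up joint table for two binary events A and B (all numbers are illustrative, not from the slides):

# Joint probabilities P(A=a, B=b) for two binary events (illustrative values).
joint = {
    (True, True): 0.08,
    (True, False): 0.02,
    (False, True): 0.12,
    (False, False): 0.78,
}

def p_B(b):
    # P(B=b), obtained by summing the joint over all values of A
    return sum(p for (a, bb), p in joint.items() if bb == b)

def p_A_given_B(a, b):
    # P(A/B) = P(A,B) / P(B)
    return joint[(a, b)] / p_B(b)

# Chain rule check: P(A,B) = P(A/B) P(B)
print(joint[(True, True)], p_A_given_B(True, True) * p_B(True))  # both 0.08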

Probability Review
• Law of Total Probability
– If A1, A2, …, An is a partition of mutually exclusive events and B is any event, then
P(B) = P(B/A1) P(A1) + P(B/A2) P(A2) + … + P(B/An) P(An)

4  

Bayes' Theorem
• Conditional probabilities lead to Bayes' rule:
P(A/B) = P(B/A) P(A) / P(B)

• Example: consider the probability of Disease given Symptom:
P(Disease/Symptom) = P(Symptom/Disease) P(Disease) / P(Symptom)
where P(Symptom) = P(Symptom/Disease) P(Disease) + P(Symptom/NoDisease) P(NoDisease)
5

Bayes' Theorem
• Meningitis causes a stiff neck 50% of the time.
• A patient comes in with a stiff neck – what is the probability that he has meningitis?
• Need to know two things:
– The prior probability of a patient having meningitis (1/50,000)
– The prior probability of a patient having a stiff neck (1/20)

P(M/S) = P(S/M) P(M) / P(S) = 0.5 × (1/50,000) / (1/20) = 0.0002
6
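The same calculation as a short Python sketch, using only the probabilities stated on the slide:

# Quantities from the slide.
p_m = 1 / 50_000      # prior P(M): patient has meningitis
p_s = 1 / 20          # prior P(S): patient has a stiff neck
p_s_given_m = 0.5     # P(S/M): meningitis causes a stiff neck 50% of the time

# Bayes' rule: P(M/S) = P(S/M) P(M) / P(S)
p_m_given_s = p_s_given_m * p_m / p_s
print(p_m_given_s)    # ≈ 0.0002, matching the slide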

Bayes' Theorem
• If A1, A2, …, An is a partition of mutually exclusive events and B is any event, then Bayes' rule is given by:
P(Ai/B) = P(B/Ai) P(Ai) / [ P(B/A1) P(A1) + … + P(B/An) P(An) ]

7  

Random Variables
• A discrete random variable X is one that can assume only a finite or countably infinite number of distinct values.
• A continuous random variable is one that can assume an uncountably infinite number of values.
• The collection of probabilities associated with the different values of a random variable is called the probability distribution of the random variable.
8

Random Variables
• For a discrete random variable, the probability distribution is called the probability mass function (pmf).
• For a continuous random variable, it is called the probability density function (pdf).

9  

Random Variables
• For n random variables, the joint pmf assigns a probability for each possible combination of values:

p(x1, x2, …, xn) = P(X1 = x1, X2 = x2, …, Xn = xn)
• Specifying the joint pmf requires an enormous number of values
– k^n, assuming n random variables where each one can assume one of k discrete values.

10  

Random Variables
• From a joint probability, we can compute the probability of any subset of the variables by marginalization:
– Example – case of a joint pmf:
p(x1) = Σx2 p(x1, x2)

– Example – case of a joint pdf:
p(x1) = ∫ p(x1, x2) dx2

11  
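A short numpy sketch of marginalization in the discrete (pmf) case; the 2×3 joint table is made up for illustration:

import numpy as np

# Joint pmf p(x1, x2) over x1 in {0, 1} and x2 in {0, 1, 2}; entries sum to 1.
joint = np.array([[0.10, 0.20, 0.05],
                  [0.15, 0.30, 0.20]])

p_x1 = joint.sum(axis=1)   # p(x1) = sum over x2 of p(x1, x2)  -> [0.35, 0.65]
p_x2 = joint.sum(axis=0)   # p(x2) = sum over x1 of p(x1, x2)  -> [0.25, 0.50, 0.25]
print(p_x1, p_x2)

For a continuous joint pdf the sums above become integrals over the marginalized variables.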

Expectation
• Expected value of a function g(x):
E[g(X)] = Σx g(x) p(x)   (discrete case)
E[g(X)] = ∫ g(x) p(x) dx   (continuous case)

12  
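A sketch of the discrete case in Python, assuming a small illustrative pmf and g(x) = x^2 chosen purely as an example:

import numpy as np

# Discrete random variable X with values x and pmf p (illustrative numbers).
x = np.array([0, 1, 2, 3])
p = np.array([0.1, 0.4, 0.3, 0.2])

def expectation(g):
    # E[g(X)] = sum over x of g(x) p(x) for a discrete random variable
    return np.sum(g(x) * p)

print(expectation(lambda v: v))       # E[X]   = 1.6
print(expectation(lambda v: v ** 2))  # E[X^2] = 3.4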

Variance
• Variance of a random variable X is defined as
Var(X) = E[(X - E[X])^2] = E[X^2] - (E[X])^2

• Sample variance for a random variable X is defined as

13  
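Continuing the sketch above: the variance computed from the pmf, and a sample variance computed from observed data. Note that numpy's ddof argument selects the 1/N or 1/(N-1) normalization; which convention the slide's formula uses is not shown here:

import numpy as np

x = np.array([0, 1, 2, 3])
p = np.array([0.1, 0.4, 0.3, 0.2])

# Var(X) = E[(X - E[X])^2] = E[X^2] - (E[X])^2
mean = np.sum(x * p)
var = np.sum((x - mean) ** 2 * p)
print(var)   # 0.84

# Sample variance from an observed (illustrative) sample of X.
sample = np.array([2.0, 1.0, 3.0, 1.0, 0.0, 2.0])
print(sample.var(ddof=1))   # divides by N - 1 (unbiased estimate)
print(sample.var(ddof=0))   # divides by N (maximum-likelihood estimate)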

Covariance and Correlation
• Covariance of two random variables:
Cov(X,Y) = E[(X - E[X])(Y - E[Y])]

• Correlation coefficient between two random variables:
ρ(X,Y) = Cov(X,Y) / (σX σY)

• Sample covariance matrix is given by

14  
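A numpy sketch of the sample versions of these quantities; the paired data below are made up, and np.cov uses the N - 1 normalization by default:

import numpy as np

# Two paired samples (illustrative data).
x = np.array([2.1, 2.5, 3.6, 4.0, 4.8])
y = np.array([8.0, 10.0, 12.0, 14.0, 15.5])

# Sample covariance matrix: variances on the diagonal, Cov(X, Y) off the diagonal.
print(np.cov(x, y))

# Correlation coefficient rho = Cov(X, Y) / (std(X) std(Y)), always in [-1, 1].
print(np.corrcoef(x, y)[0, 1])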

Covariance
• Covariance matrix of two random variables:
Σ = [ Var(X)    Cov(X,Y)
      Cov(Y,X)  Var(Y) ]

• Covariance matrix of n random variables: the n × n matrix Σ with entries Σij = Cov(Xi, Xj)

15  

Correlation
• X and Y are uncorrelated if
Cov(X,Y) = 0, i.e., E[XY] = E[X] E[Y]

• If n random variables are independent, then

• Note that if X and Y are independent, then their correlation coefficient is zero, but not all uncorrelated variables are independent.
16

Evaluation Metrics
Let the problem statement be: classifying between dogs and cats. Dogs are labeled as the positive class and cats are labeled as the negative class.

                       Predicted Negative    Predicted Positive
Actual Negative        A (true negative)     C (false positive)
Actual Positive        D (false negative)    B (true positive)

Term             Meaning                  Example
True positive    Correctly identified     Dog identified as dog
False positive   Incorrectly identified   Cat identified as dog
True negative    Correctly rejected       Cat identified as cat
False negative   Incorrectly rejected     Dog identified as cat

http://www.cs.rpi.edu/~leen/misc-publications/SomeStatDefs.html

17  

Evaluation Metrics

                       Predicted Negative    Predicted Positive
Actual Negative        A (true negative)     C (false positive)
Actual Positive        D (false negative)    B (true positive)

Metric                                 Formula
Average classification accuracy        (TN + TP) / (TN + TP + FN + FP)
Type I error (false positive rate)     FP / (TN + FP)
Type II error (false negative rate)    FN / (FN + TP)
True positive rate                     TP / (TP + FN)
True negative rate                     TN / (TN + FP)

http://www.cs.rpi.edu/~leen/misc-publications/SomeStatDefs.html

18  
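A Python sketch of these formulas; the counts tn, fp, fn, tp correspond to the cells A, C, D, B in the table above and are made up for illustration:

# Illustrative counts from a dog (positive) vs cat (negative) classifier.
tn, fp, fn, tp = 50, 10, 5, 35   # A, C, D, B

accuracy = (tn + tp) / (tn + tp + fn + fp)   # average classification accuracy
fpr = fp / (tn + fp)                         # Type I error (false positive rate)
fnr = fn / (fn + tp)                         # Type II error (false negative rate)
tpr = tp / (tp + fn)                         # true positive rate
tnr = tn / (tn + fp)                         # true negative rate

print(accuracy, fpr, fnr, tpr, tnr)   # ≈ 0.85, 0.167, 0.125, 0.875, 0.833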

Evaluation Metrics

                       Predicted Negative    Predicted Positive
Actual Negative        A (true negative)     C (false positive)
Actual Positive        D (false negative)    B (true positive)

Metric                                 Formula
Average classification accuracy        (TN + TP) / (TN + TP + FN + FP)
Type I error (false positive rate)     FP / (TN + FP)
Type II error (false negative rate)    FN / (FN + TP)
True positive rate                     TP / (TP + FN)
True negative rate                     TN / (TN + FP)

Prevalent in computer vision related classification problems.

http://www.cs.rpi.edu/~leen/misc-publications/SomeStatDefs.html

19  

Evaluation Metrics

Metric         Formula
Precision      TP / (TP + FP)
Recall         TP / (TP + FN)
Sensitivity    TP / (TP + FN)
Specificity    TN / (TN + FP)

Precision: Fraction of retrieved instances that are relevant
Recall: Fraction of relevant instances that are retrieved

20  
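Using the same illustrative counts as before, a sketch of precision and recall (note that recall and sensitivity share the same formula):

tn, fp, fn, tp = 50, 10, 5, 35   # illustrative counts (A, C, D, B)

precision = tp / (tp + fp)    # fraction of predicted positives that are actually positive
recall = tp / (tp + fn)       # fraction of actual positives that are retrieved
specificity = tn / (tn + fp)  # fraction of actual negatives that are correctly rejected

print(precision, recall, specificity)   # ≈ 0.778, 0.875, 0.833
# Precision of 1.0 means no false positives; recall of 1.0 means no false negatives.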

Evaluation Metrics

Metric         Formula
Precision      TP / (TP + FP)
Recall         TP / (TP + FN)
Sensitivity    TP / (TP + FN)
Specificity    TN / (TN + FP)

Precision: Fraction of retrieved instances that are relevant
Recall: Fraction of relevant instances that are retrieved

What does a precision score of 1.0 mean?
What does a recall score of 1.0 mean?
21

Evaluation Metrics

Metric         Formula
Precision      TP / (TP + FP)
Recall         TP / (TP + FN)

Precision: Fraction of retrieved instances that are relevant
Recall: Fraction of relevant instances that are retrieved
More prevalent in the information retrieval domain.

What does a precision score of 1.0 mean?
What does a recall score of 1.0 mean?
22

Evaluation Metrics

Metric         Formula
Precision      TP / (TP + FP)
Recall         TP / (TP + FN)
Sensitivity    TP / (TP + FN)
Specificity    TN / (TN + FP)

Sensitivity: Proportion of actual positives which are correctly identified
Specificity: Proportion of actual negatives which are correctly identified

23  

Evaluation Metrics

Metric                                        Formula
Sensitivity                                   TP / (TP + FN)
Specificity                                   TN / (TN + FP)
Predictive value for a positive result (PV+)  TP / (TP + FP)
Predictive value for a negative result (PV-)  TN / (TN + FN)

Sensitivity: Proportion of actual positives which are correctly identified
Specificity: Proportion of actual negatives which are correctly identified

What does a sensitivity score of 1.0 mean?
What does a specificity score of 1.0 mean?
24
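A brief sketch of the predictive values with the same hypothetical counts; PV+ uses the same formula as precision:

tn, fp, fn, tp = 50, 10, 5, 35   # illustrative counts (A, C, D, B)

ppv = tp / (tp + fp)   # PV+: probability the condition is present given a positive test
npv = tn / (tn + fn)   # PV-: probability the condition is absent given a negative test
print(ppv, npv)        # ≈ 0.778, 0.909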

Evaluation Metrics

Metric                                        Formula
Sensitivity                                   TP / (TP + FN)
Specificity                                   TN / (TN + FP)
Predictive value for a positive result (PV+)  TP / (TP + FP)
Predictive value for a negative result (PV-)  TN / (TN + FN)

Sensitivity: Proportion of actual positives which are correctly identified
Specificity: Proportion of actual negatives which are correctly identified
More prevalent in research related to medical sciences.

What does a sensitivity score of 1.0 mean?
What does a specificity score of 1.0 mean?
25

Evaluation Metrics
• Type I error or false positive rate
– The chance of incorrectly classifying a (randomly selected) sample as positive
• Type II error or false negative rate
– The chance of incorrectly classifying a (randomly selected) sample as negative
• Precision
– Probability that a (randomly selected) retrieved document is relevant
• Recall
– Probability that a (randomly selected) relevant document is retrieved in a search
26

Evaluation Metrics
• Sensitivity
– The chance of correctly identifying positive samples
– A sensitive test helps rule out disease (when the result is negative)
• Specificity
– The chance of correctly classifying negative samples
– A very specific test rules in disease with a higher degree of confidence
• Predictive value of a positive result
– If the test is positive, what is the probability that the patient actually has the disease?
• Predictive value of a negative result
– If the test is negative, what is the probability that the patient does not have the disease?
27

Performance Evaluation
• Classification is of two types:
– Authentication / verification (1:1 matching)
• Is she Richa?
• Is this an image of a cat?
– Identification (1:n matching)
• Whose photo is this?
• This image belongs to which class?

28  

Performance Evaluation
• Receiver operating characteristics (ROC) curve
– For authentication/verification
– False positive rate vs true positive rate
• Detection error tradeoff (DET) curve
– False positive rate vs false negative rate
• Cumulative match curve (CMC)
– Rank vs identification accuracy

29  
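A minimal sketch of how an ROC curve is traced for a verification system: sweep a decision threshold over the match scores and record (false positive rate, true positive rate) at each threshold. The scores and labels below are made up:

import numpy as np

# Match scores from a hypothetical verifier; labels: 1 = genuine pair, 0 = impostor pair.
scores = np.array([0.95, 0.80, 0.75, 0.60, 0.55, 0.40, 0.30, 0.20])
labels = np.array([1, 1, 0, 1, 0, 1, 0, 0])

def roc_points(scores, labels):
    # Return (FPR, TPR) pairs obtained by accepting every score >= threshold.
    points = []
    for t in sorted(set(scores), reverse=True):
        pred = scores >= t
        tp = np.sum(pred & (labels == 1))
        fp = np.sum(pred & (labels == 0))
        fn = np.sum(~pred & (labels == 1))
        tn = np.sum(~pred & (labels == 0))
        points.append((fp / (fp + tn), tp / (tp + fn)))
    return points

for fpr, tpr in roc_points(scores, labels):
    print(fpr, tpr)

The DET curve plots false positive rate against false negative rate (1 - TPR) over the same threshold sweep, while the CMC curve applies to identification rather than verification.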

Quiz on Monday, January 14, 2013

30  
