Journal of Natural Sciences Research                                                         www.iiste.org
ISSN 2224-3186 (Paper) ISSN 2225-0921 (Online)
Vol.2, No.4, 2012

JACKKNIFE ALGORITHM FOR THE ESTIMATION OF LOGISTIC REGRESSION PARAMETERS

H.O. Obiora-Ilouno¹* and J.I. Mbegbu²

1. Department of Statistics, Nnamdi Azikiwe University, Awka

2. Department of Mathematics, University of Benin, Benin City, Edo State, Nigeria.

* E-mail of the corresponding author: obiorailounoho@yahoo.com

Abstract
This paper proposes an algorithm for the estimation of the parameters of logistic regression analysis using the jackknife. The jackknife delete-one and delete-d algorithms were used to provide estimates of the logistic regression coefficients. The jackknife standard deviation provides an estimate of the variability of the sample estimates and is a good measure of precision. The method is illustrated with real-life data, and the results obtained from the jackknife samples are compared with the results from ordinary logistic regression using the maximum likelihood method. The comparison reveals that the jackknife estimates of the parameters, standard deviations and confidence intervals are very close to the results of the ordinary logistic regression analysis; the jackknife therefore provides a good approximation and shows that there is no bias in the jackknife coefficients.
Keywords: Jackknife algorithm, Logistic regression, dichotomous variable, maximum likelihood

INTRODUCTION
Sometimes one is interested in predicting whether something happens or not: for example, whether a patient survives a treatment, a person contracts a disease, or a student passes a course. These are binary measures. Logistic regression regresses a dichotomous dependent variable on a set of independent variables, and is especially useful where the data set is very large and the predictor variables do not behave in orderly ways or obey the assumptions required for ordinary linear regression or discriminant analysis (Michael, 2008; Russell and Chritine, 2009; Ryan, 1997). Logistic regression applies maximum likelihood estimation after transforming the dependent variable into a logit variable, which can then be used to determine the effect of the independent variables on the dependent variable (Russell and Chritine, 2009). This is accomplished using an iterative estimation algorithm. Jackknifing is used in statistical inference to estimate the bias and standard error of a statistic when a random sample of observations is used. The basic idea behind the jackknife estimators lies in systematically recomputing the statistic, leaving out one or more observations at a time from the sample. From this new set of replicates of the statistic, an estimate of the bias and an estimate of the variance of the statistic can be calculated (Efron, 1982; Efron and Tibshirani, 1993).
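As a minimal sketch of this leave-one-out idea (not taken from the paper, and using the sample mean only as a stand-in statistic), the jackknife bias and standard error can be computed in R as follows:

#Minimal delete-one jackknife sketch for a generic statistic (here the mean).
#Illustrative only; the function and variable names are hypothetical.
jack.bias.se=function(x,stat=mean)
{
n=length(x)
theta.hat=stat(x) #estimate from the full sample
theta.i=sapply(1:n,function(i) stat(x[-i])) #leave-one-out replicates
theta.dot=mean(theta.i) #mean of the replicates
bias=(n-1)*(theta.dot-theta.hat) #jackknife estimate of the bias
se=sqrt((n-1)/n*sum((theta.i-theta.dot)^2)) #jackknife estimate of the standard error
c(bias=bias,se=se)
}
jack.bias.se(rnorm(30)) #example call on 30 simulated observations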

MATERIALS AND METHOD
Material: The aim of this paper is to illustrate jackknife logistic regression parameter estimation. The data used are secondary data collected from the delivery ward of the General Hospital, Onitsha, Anambra State, Nigeria. Maternal age, parity and baby's sex were considered as independent variables in order to determine their effect on the gestation period of n = 256 mothers. The R programming language was used for the statistical analysis of these data.


1   JACKKNIFE DELETE-ONE ALGORITHM

The jackknife delete-one procedure is as follows:



Step 1: Given a randomly drawn sample of size n from a population consisting of a dichotomous dependent variable, label the elements of the vector $Z_i = (Y_i, X_{ji})'$,

where $Y_i = (y_1, y_2, y_3, \ldots, y_n)'$ and the matrix $X_{ji} = (x_{j1}, x_{j2}, x_{j3}, \ldots, x_{jn})'$; $j = 1, 2, 3, \ldots, k$ and $i = 1, 2, 3, \ldots, n$.

Omit the first row of the vector $Z_i = (Y_i, X_{ji})'$ and label the remaining $n-1$ observation sets $Y_i^{(J)} = (y_2^{(J)}, y_3^{(J)}, \ldots, y_n^{(J)})'$ and $X_{ji}^{(J)} = (x_{j2}^{(J)}, x_{j3}^{(J)}, \ldots, x_{jn}^{(J)})'$ as the first delete-one jackknife sample $Z_1^{(J)}$, and estimate the coefficient $\hat{\beta}^{(J_1)}$ from $Z_1^{(J)}$ using the maximum likelihood estimate of the logistic regression model in the jackknife sample (Efron, 1982; Sahinler and Topuz, 2007). The maximum likelihood estimates of $\beta_i$ in the logistic regression model are those values of $\beta_i$ that maximize the log-likelihood function. The logistic regression model is

$E(Y_i) = \dfrac{\exp(\beta' X_i)}{1 + \exp(\beta' X_i)}$                      (1)

where $\beta' x_i = \beta_0 + \beta_1 X_{i1} + \ldots + \beta_{p-1} X_{i(p-1)}$.

                         1          
         β0                       
                        xi1 
          β0 
β p×1 =           ; X = x          
        M  ip×1  i 2 
                       M          
         β p −1 
                       x          
                          i ( p −1) 

E (Y ) = [1 + exp(− β ′xi )]
                                          −1
                                                                                                             (2)

where the $Y_i$ are ordinary Bernoulli random variables with expected values $E(Y_i) = \pi_i$, i.e.

$E(Y_i) = \pi_i = \dfrac{\exp(\beta' X_i)}{1 + \exp(\beta' X_i)}$

Since each observation $Y_i$ is an ordinary Bernoulli random variable, with $P(Y_i = 1) = \pi_i$ and $P(Y_i = 0) = 1 - \pi_i$, its probability distribution can be represented as

$f_i(Y_i) = \pi_i^{Y_i} (1 - \pi_i)^{1 - Y_i}, \quad Y_i = 0, 1; \; i = 1, \ldots, n$

Since Yi observations are independent, their joint probability function is





$g(Y_1, \ldots, Y_n) = \prod_{i=1}^{n} f_i(Y_i) = \prod_{i=1}^{n} \pi_i^{Y_i} (1 - \pi_i)^{1 - Y_i}$                      (3)

To find the maximum likelihood estimates we take the log of both sides:

$\log[g(Y_1, \ldots, Y_n)] = \log\left[\prod_{i=1}^{n} \pi_i^{Y_i} (1 - \pi_i)^{1 - Y_i}\right]$

$= \sum_{i=1}^{n} Y_i \log(\pi_i) + \sum_{i=1}^{n} (1 - Y_i) \log(1 - \pi_i)$

$= \sum_{i=1}^{n} Y_i \log\!\left(\dfrac{\pi_i}{1 - \pi_i}\right) + \sum_{i=1}^{n} \log(1 - \pi_i)$                      (4)

Therefore, the log-likelihood function for multiple logistic regression is

$\log L(\beta) = \sum_{i=1}^{n} Y_i (\beta' X_i) - \sum_{i=1}^{n} \log(1 + \exp(\beta' X_i))$                      (5)



Step 2:
Computer-intensive numerical search procedures are employed to find the values of $\beta_0, \beta_1, \ldots, \beta_{p-1}$ that maximize $\log L(\beta)$, using the Gauss-Newton method. These maximum likelihood estimates will be denoted by $b_0, b_1, b_2, \ldots, b_{p-1}$:

$b_{p \times 1} = (b_0, b_1, \ldots, b_{p-1})'$

The fitted logistic response function and fitted values can be expressed as

$\hat{E}(Y_i) = \hat{\pi}_i = \dfrac{\exp(b' x_i)}{1 + \exp(b' x_i)} = [1 + \exp(-b' x_i)]^{-1}$                      (6)

Here standard R code and the Minitab package were used to carry out the numerical search procedure for the maximum likelihood estimates by iteratively re-weighted least squares.
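For instance, assuming the data are held in a data frame named data with hypothetical column names gestation, m.age, parity and b.sex (the paper does not give the variable names), the full-sample maximum likelihood fit reduces in R to a single glm() call, which performs the iteratively re-weighted least squares search internally:

#Full-sample logistic regression by maximum likelihood; a sketch with hypothetical column names.
fit=glm(gestation~m.age+parity+factor(b.sex),family=binomial(link="logit"),data=data)
summary(fit) #coefficients b0,...,b3 with their standard errors and P-values
confint.default(fit) #Wald 95% confidence intervals for the coefficients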

Step 3:
Iterative procedure:
a. Obtain starting values for the regression parameters, denoted by b(0). These can be obtained by ordinary least squares regression of Y on the predictor variables using a first-order linear model.

b. Using these starting values, obtain

$\hat{\pi}'_i(0) = [b(0)]' x_i$                      (7)

$\hat{\pi}_i(0) = \dfrac{\exp[\hat{\pi}'_i(0)]}{1 + \exp[\hat{\pi}'_i(0)]}$                      (8)

c. Calculate the new response variable

$Y'_i(0) = \hat{\pi}'_i(0) + \dfrac{Y_i - \hat{\pi}_i(0)}{\hat{\pi}_i(0)[1 - \hat{\pi}_i(0)]}$                      (9)

and the weights $W_i(0) = \hat{\pi}_i(0)[1 - \hat{\pi}_i(0)]$.

d. Regress $Y'_i(0)$ in equation (9) on the predictor variables $X_1, \ldots, X_{p-1}$ using weighted least squares to obtain b(1). Repeat steps (a) through (d) using the latest revised estimated regression coefficients until there is little if any change in the estimated coefficients, which indicates convergence (Neter et al., 1996; Hamadu, 2010; Ryan, 1997).
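A bare-bones R version of steps (a) to (d), written directly from equations (7) to (9) rather than taken from the paper's own programs, might look like the following (X is the design matrix with a leading column of ones and Y the 0/1 response; the function name is hypothetical):

#Hand-coded iteratively re-weighted least squares for logistic regression; a sketch only.
irls.logistic=function(X,Y,tol=1e-8,maxit=50)
{
b=solve(t(X)%*%X,t(X)%*%Y) #(a) starting values from OLS of Y on the predictors
for (it in 1:maxit)
{
eta=as.vector(X%*%b) #(b) linear predictor, equation (7)
pi.hat=exp(eta)/(1+exp(eta)) #fitted probabilities, equation (8)
w=pi.hat*(1-pi.hat) #the weights W_i(0)
z=eta+(Y-pi.hat)/w #(c) working response, equation (9)
b.new=solve(t(X)%*%(w*X),t(X)%*%(w*z)) #(d) weighted least squares step
if (max(abs(b.new-b))<tol) {b=b.new; break}
b=b.new
}
as.vector(b)
}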

Then, omit the second row of the vector $Z_i = (Y_i, X_{ji})'$ and label the remaining $n-1$ observation sets as $Z_2^{(J)}$, and estimate the coefficient $\hat{\beta}^{(J_2)}$ from $Z_2^{(J)}$ using the maximum likelihood estimate of the logistic regression in the same way. Alternately, omit each row of the observation set in turn and estimate the coefficient $\hat{\beta}_i^{(J)}$ using the maximum likelihood estimate of the logistic regression, where $Y_i = (y_1, y_2, y_3, \ldots, y_n)'$, the matrix of the independent variables is $X_{ji} = (x_{j1}, x_{j2}, x_{j3}, \ldots, x_{jn})'$, and $\hat{\beta}^{(J_i)}$ is the jackknife logistic regression coefficient vector estimated after deleting the ith observation set from $Z_i$.


Step 4:
Obtain the probability distribution $F(\hat{\beta}^{(J_i)})$ of the jackknife estimates $\hat{\beta}^{(J_1)}, \hat{\beta}^{(J_2)}, \ldots, \hat{\beta}^{(J_n)}$.


Step 5:
Calculate the jackknife regression coefficient estimate, which is the mean of the $F(\hat{\beta}^{(J_i)})$ distribution:

$\hat{\beta}^{(J)} = \dfrac{\sum_{i=1}^{n} \hat{\beta}^{(J_i)}}{n} = \bar{\beta}^{(J_i)}$                      (10)

Step 6:
The delete-one jackknife logistic regression equation is thus $\hat{Y} = [1 + \exp(-b^{(J)\prime} X)]^{-1}$, where $b^{(J)}$ is the unbiased estimator of $\beta$.
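Using the same hypothetical data frame and column names as in the earlier glm() sketch, a compact sketch of the whole delete-one procedure (Steps 1 to 6) is:

#Delete-one jackknife for the logistic regression coefficients: refit the model n times,
#each time leaving out one observation, then average the replicates (a sketch only).
n=nrow(data)
B=matrix(NA,nrow=n,ncol=4) #one row per jackknife replicate
for (i in 1:n)
{
fit.i=glm(gestation~m.age+parity+factor(b.sex),family=binomial(link="logit"),data=data[-i,])
B[i,]=coef(fit.i) #jackknife replicate beta^(J_i)
}
beta.jack=colMeans(B) #equation (10): mean of the replicates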

2 JACKKNIFE DELETE-D ALGORITHM
The steps of the jackknife delete-d are as follows:
Step 1:
Given a randomly drawn sample of size n, $(Z_1, Z_2, \ldots, Z_n)$, from a population, divide the sample into S independent groups of size d.

Step 2:
Omit the first d observation sets from the full sample at a time and estimate the logistic regression coefficient using

the maximum likelihood estimate $\hat{\beta}^{(J_1)}$ from the remaining $(n-d)$-sized observation set.

Step 3:
Omit the second d observation sets from the full sample at a time and estimate the logistic regression coefficient $\hat{\beta}^{(J_2)}$ from the remaining $(n-d)$-sized observation set.

Step 4:
Alternately, omit each group of d of the n observation sets and estimate the coefficients $\hat{\beta}_k^{(J)}$, where $\hat{\beta}_k^{(J)}$ is the jackknife regression coefficient vector estimated after deleting the kth group of d observation sets from the full sample. Thus,

$S = \dbinom{n}{d}$

delete-d jackknife samples are obtained, $k = 1, 2, \ldots, S$.
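In R the number of delete-d samples is choose(n, d), and combn(n, d) enumerates the index sets to omit; for example, with the illustrative values n = 256 and d = 2:

#Number of delete-d jackknife samples and the index sets to omit (illustrative values only).
n=256; d=2
S=choose(n,d) #S = 32640 delete-2 samples
u=combn(n,d) #each column of u holds the d row indices omitted in one replicate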

Step 5:
Obtain the probability distribution $F(\hat{\beta}^{(J)})$ of the delete-d jackknife estimates $\hat{\beta}^{(J_1)}, \hat{\beta}^{(J_2)}, \hat{\beta}^{(J_3)}, \ldots, \hat{\beta}^{(J_S)}$.

Step 6:
Calculate the jackknife regression coefficient estimate, which is the mean of the $F(\hat{\beta}^{(J)})$ distribution:

$\hat{\beta}^{(J)} = \dfrac{\sum_{k=1}^{S} \hat{\beta}^{(J_k)}}{S} = \bar{\beta}^{(J_k)}$                      (11)

The jackknife confidence interval is

$X \pm Z_{\alpha} \dfrac{\delta}{\sqrt{n}}$                      (12)
The jackknife delete-one estimate of the standard error is defined by

$\hat{se}_{jack} = \left[\dfrac{n-1}{n} \sum_{i} \left(\hat{\theta}_{(i)} - \hat{\theta}_{(.)}\right)^2\right]^{1/2}$                      (13)

where $\hat{\theta}_{(.)} = \sum_i \hat{\theta}_{(i)} / n$.

The jackknife delete-d estimate of the standard error is

$\left[\dfrac{r}{\binom{n}{d}} \sum_{s} \left(\hat{\theta}_{(s)} - \hat{\theta}_{(.)}\right)^2\right]^{1/2}$                      (14)


(Efron, 1982; Sahinler and Topuz, 2007)
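Given a matrix of replicate estimates, equations (12) and (13) can be evaluated directly in R; a sketch, reusing the n x 4 matrix B of delete-one replicates from the earlier sketch and substituting the jackknife standard error into the normal-theory interval of equation (12):

#Jackknife delete-one standard errors (equation 13) and approximate 95% limits for each coefficient.
theta.dot=colMeans(B) #mean replicate for each coefficient
se.jack=sqrt((nrow(B)-1)/nrow(B)*colSums(sweep(B,2,theta.dot)^2)) #equation (13)
ci.lower=theta.dot-1.96*se.jack #approximate 95% lower limits
ci.upper=theta.dot+1.96*se.jack #approximate 95% upper limits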

Illustrative example
The logistic regression model was fitted to the data in Table 1, regressing gestation period on mother's age, parity and baby's sex.

 Observation             1     2     3     4    …   254   255   256
 Gestation period Y      1     0     0     1    …     0     1     0
 Mother's age           27    30    30    25    …    36    30    31
 Parity                  5     1     1     2    …     2     5     6
 Baby's sex              1     0     1     1    …     1     0     1

Table 1: The data used in the calculation of the logistic regression and jackknife results, n = 256

Let Yi be the response of the ith randomly selected subject (gestation period), which assumes the value 1 (positive response) or 0 (negative response) for i = 1, 2, …, n.

Let X1, X2 and X3 be the independent variables mother's age, parity and sex of the baby respectively, where 1 represents a baby boy and 0 represents a baby girl.


Results:

First, the logistic regression model was fitted to the data in Table 1 and the results are summarized in Table 2 below. The regression model is significant, as the P-value is 0.000 (P < 0.05).

The jackknife samples were generated by omitting each d = 1, 2 or 3 sample(s) respectively of the n = 256 observation sets, and the coefficients were estimated as $\hat{\beta}^{(J)}$.

 Variable                        Coef. ($\hat{\beta}$)   S.E.($\hat{\beta}$)   P-value   95% conf. interval (lower, upper)
 Constant ($\hat{\beta}_0$)         0.555092               0.856362             0.517              –
 Mother's age ($\hat{\beta}_1$)    -0.0601875              0.0321786            0.061         0.88, 1.00
 Parity ($\hat{\beta}_2$)           0.407925               0.0899854            0.00          1.26, 1.79
 Baby's sex ($\hat{\beta}_3$)      -0.0108895              0.268384             0.968         0.58, 1.67

Table 2: The summary statistics of the regression coefficients for the binary logistic regression

Log-likelihood = -164.996
Test that all slopes are zero: G = 23.335, df = 3, P-value = 0.000
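The log-likelihood and the G statistic reported above (the test that all slopes are zero) are standard outputs of a fitted glm object; a hedged sketch, using the hypothetical fit object from the earlier full-sample sketch:

#Overall fit statistics from a glm object (illustrative only).
logLik(fit) #log-likelihood of the fitted model
G=fit$null.deviance-fit$deviance #likelihood-ratio statistic G
df=fit$df.null-fit$df.residual #degrees of freedom of the test
p.value=pchisq(G,df,lower.tail=FALSE) #P-value of the test that all slopes are zero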








 r      Variables          1     2     3    …    256       β̂0(J)      β̂1(J)      β̂2(J)      β̂3(J)
 1      Gestation          –     1     1    …     1
        Mother's age       –    23    38    …    27        0.4583     -0.0573     0.4139      0.0089
        Parity             –     3     6    …     5
        Baby's sex         –     0     1    …     1
 2      Gestation          1     –     1    …     1
        Mother's age      33     –    38    …    27        0.5472     -0.0591     0.4036     -0.0237
        Parity             5     –     6    …     5
        Baby's sex         1     –     1    …     1
 4      Gestation          1     1     –    …     1
        Mother's age      33    23     –    …    27        0.5302     -0.0589     0.4017     -0.0018
        Parity             5     3     –    …     5
        Baby's sex         1     0     –    …     1
 ⋮          ⋮              ⋮     ⋮     ⋮    …     ⋮           ⋮           ⋮           ⋮           ⋮
 256    Gestation          1     1     1    …     –
        Mother's age      33    23    38    …     –        0.5334     -0.0589     0.4036     -0.0187
        Parity             5     3     6    …     –
        Baby's sex         1     0     1    …     –
        β̂(Ji) (mean)                                       0.5552     -0.0602     0.4080     -0.0109

Table 3: The illustration of the jackknife delete-1 logistic regression procedure from the data in Table 1 for the
estimation of the regression parameters


                               Constant (β̂0)   Mother's age (β̂1)   Parity (β̂2)   Baby's sex (β̂3)
 Ordinary logistic coef.          0.55509           -0.0602            0.40793         -0.0109
 Delete-1 jackknife coef.         0.55519           -0.0602            0.40797         -0.0109
 Delete-2 jackknife coef.         0.5553            -0.0607            0.40809         -0.0109
 Delete-3 jackknife coef.         0.5554            -0.0602            0.40805         -0.0109

Table 4: The summaries of the ordinary logistic regression and the jackknife delete-1, delete-2 and delete-3
values of the logistic regression coefficients





 Variable    Ord. S.E.   Del-1 S.E.   Del-2 S.E.   Del-3 S.E.   Ord. P-value   Del-1 P-value   Del-2 P-value   Del-3 P-value
 Constant     0.8564      0.8581       0.8599       0.8619         0.517           0.548           0.563          0.5387
 M. age       0.03218     0.03224      0.03226      0.03236        0.061           0.067           0.0734         0.067
 Parity       0.08999     0.09030      0.0901       0.09078        0.00            0.000           0.000          0.000
 B. sex       0.2684      0.2689       0.2692       0.26997        0.968           0.946           0.9508         0.929

Table 5: The summary statistics of the ordinary logistic regression standard errors and P-values and of the
jackknife delete-1, delete-2 and delete-3 standard errors and P-values.

3 Discussion and Conclusions:
Jackknife samples were generated by omitting one, two or three of the n observations, corresponding to the delete-one, delete-two and delete-three jackknife respectively, for the estimation of the logistic regression coefficients $\hat{\beta}^{(J_i)}$. The ordinary logistic regression estimates of $\beta_0, \beta_1, \ldots, \beta_3$ from the data in Table 1 are b0 = 0.55509, b1 = -0.06019, b2 = 0.40793 and b3 = -0.01089 respectively, and the estimated precisions of these estimates are S(b0) = 0.85636, S(b1) = 0.03218, S(b2) = 0.08999 and S(b3) = 0.26838, with 95% confidence intervals 0.88 ≤ β1 ≤ 1.00, 1.26 ≤ β2 ≤ 1.79 and 0.58 ≤ β3 ≤ 1.67 respectively. The jackknife results for the estimated logistic regression coefficients, their precision (standard error) and confidence intervals, shown in Tables 4 and 5, reveal that the jackknife delete-one, delete-two and delete-three estimated coefficients are quite close to the analytical results. The estimated precisions and confidence intervals of the jackknife delete-one, delete-two and delete-three are also very close to those of the analytical results. This demonstrates the agreement of the developed algorithm with the theoretical method.

Delete-d R Algorithm

#This R code defines a function 'jack' for performing the delete-d jackknife for logistic regression
#p is the number of columns in the data; p=4 means 1 dependent variable and 3 independent variables
#d is the number of rows to be deleted in each replicate
jack=function(data,p,d)
{
n=length(data[,1]) #the sample size
u=combn(n,d) #assign the matrix of all possible combinations of d rows to delete to u
output=matrix(0,ncol=p,nrow=ncol(u)) #define the output matrix of coefficients
y=data[,1] #the response (dependent) vector
x=data[,2:p] #the matrix of covariates (independent variables)
for (i in 1:(ncol(u)))
{
dd=c(u[,i])
yn=y[-dd] #delete d rows of the dependent variable
xn=x[-dd,] #delete d rows of the independent variables
logreg=glm(formula=yn~xn[,1]+xn[,2]+factor(xn[,3]),family=binomial(link="logit"),na.action=na.pass) #assuming 3 independent variables with one treated as a factor
coef=logreg$coef
output[i,]=c(coef) #store the regression coefficients
}
output

}
#This part can be used to obtain a jackknife estimate of the regression coefficients
u=jack(data,4,2)
beta=c(mean(u[,1]),mean(u[,2]),mean(u[,3]),mean(u[,4])) #mean of the replicates for each coefficient
(Venables and Smith, 2007)
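To obtain the corresponding delete-d jackknife standard errors from the matrix u returned by jack(), one possible follow-up (not part of the original code) applies equation (14) with r = (n - d)/d, the usual choice, since the paper does not define r:

#Delete-d jackknife standard error of each coefficient from the replicate matrix u (a sketch only).
n=nrow(data); d=2
r=(n-d)/d
theta.dot=colMeans(u) #mean replicate for each coefficient
se.d=sqrt(r/choose(n,d)*colSums(sweep(u,2,theta.dot)^2)) #equation (14)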


References

[1] Efron, B. (1982). The Jackknife, the Bootstrap and Other Resampling Plans. CBMS-NSF Regional Conference Series in Applied Mathematics, Philadelphia, Pennsylvania. 5-27.
[2] Efron, B. and Tibshirani, R.J. (1993). An Introduction to the Bootstrap. Chapman and Hall, New York.
[3] Hamadu, D. (2010). A Bootstrap Approach to Bias-Reduction of Non-linear Parameters in Regression Analysis. Journal of Science Research Development, Vol. 12, 110-127.
[4] Michael, P.L. (2008). Logistic Regression. Circulation, American Heart Association. Doi:10.1161.
[5] Neter, J., Kutner, M.H., Nachtsheim, C.J., and Wasserman, W. (1996). Regression Analysis. Fourth Edition, McGraw-Hill, U.S.A. pp. 429-536.
[6] Russell, C. and Chritine, C. (2009). Resampling Methods of Analysis in Simulation Studies. Proceedings of the Winter Simulation Conference, 45-59.
[7] Ryan, P.T. (1997). Modern Regression Methods. John Wiley and Sons Inc., Third Avenue, New York. pp. 255-308.
[8] Sahinler, S. and Topuz, D. (2007). Bootstrap and Jackknife Resampling Algorithms for Estimation of Regression Parameters. Journal of Applied Quantitative Methods, Vol. 2, No. 2, 188-199.
[9] Venables, W.N. and Smith, D.M. (2007). An Introduction to R: A Programming Environment for Data Analysis and Graphics. Version 2.6.1, 1-100.





08448380779 Call Girls In Friends Colony Women Seeking Men
 
Unblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen FramesUnblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen Frames
 
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | DelhiFULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
 
08448380779 Call Girls In Greater Kailash - I Women Seeking Men
08448380779 Call Girls In Greater Kailash - I Women Seeking Men08448380779 Call Girls In Greater Kailash - I Women Seeking Men
08448380779 Call Girls In Greater Kailash - I Women Seeking Men
 
My Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 PresentationMy Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 Presentation
 
A Call to Action for Generative AI in 2024
A Call to Action for Generative AI in 2024A Call to Action for Generative AI in 2024
A Call to Action for Generative AI in 2024
 

Jackknife Algorithm Estimates Logistic Regression

These independent variables were considered in order to determine their effect on the gestation period of n = 256 mothers. The R programming language was used for the statistical analysis of these data.

1 JACKKNIFE DELETE-ONE ALGORITHM

The jackknife delete-one procedure is as follows:
Step 1: Given a randomly drawn sample of size n from a population with a dichotomous dependent variable, label the elements of the vector $Z_i = (Y_i, X_{ji})'$, where $Y_i = (y_1, y_2, y_3, \ldots, y_n)'$ and the matrix $X_{ji} = (x_{j1}, x_{j2}, x_{j3}, \ldots, x_{jn})'$; $j = 1, 2, 3, \ldots, k$ and $i = 1, 2, 3, \ldots, n$. Omit the first row of the vector $Z_i = (Y_i, X_{ji})'$ and label the remaining $n-1$ observation sets $Y_i^{(J)} = (y_2^{(J)}, y_3^{(J)}, \ldots, y_n^{(J)})'$ and $X_{ji}^{(J)} = (x_{j2}^{(J)}, x_{j3}^{(J)}, \ldots, x_{jn}^{(J)})'$ as the first delete-one jackknife sample $Z_1^{(J)}$, and estimate the coefficient $\hat{\beta}_1^{(J)}$ from $Z_1^{(J)}$ using the maximum likelihood estimate of the logistic regression model in the jackknife sample (Efron, 1982; Sahinler and Topuz, 2007).

The maximum likelihood estimates of $\beta$ in the logistic regression model are those values of $\beta$ that maximize the log-likelihood function, where

$$E(Y_i) = \frac{\exp(\beta' X_i)}{1 + \exp(\beta' X_i)} \qquad (1)$$

with $\beta' x_i = \beta_0 + \beta_1 X_{i1} + \cdots + \beta_{p-1} X_{i(p-1)}$ and

$$\beta_{p \times 1} = \begin{pmatrix} \beta_0 \\ \beta_1 \\ \vdots \\ \beta_{p-1} \end{pmatrix}, \qquad X_{i\,p \times 1} = \begin{pmatrix} 1 \\ x_{i1} \\ x_{i2} \\ \vdots \\ x_{i(p-1)} \end{pmatrix},$$

so that

$$E(Y_i) = [1 + \exp(-\beta' x_i)]^{-1}. \qquad (2)$$

The $Y_i$ are ordinary Bernoulli random variables with expected values $E(Y_i) = \pi_i$, where

$$E(Y_i) = \pi_i = \frac{\exp(\beta' X_i)}{1 + \exp(\beta' X_i)}.$$

Since each observation $Y_i$ is an ordinary Bernoulli random variable with $P(Y_i = 1) = \pi_i$ and $P(Y_i = 0) = 1 - \pi_i$, its probability distribution can be represented as

$$f_i(Y_i) = \pi_i^{Y_i} (1 - \pi_i)^{1 - Y_i}, \qquad Y_i = 0, 1; \quad i = 1, \ldots, n.$$
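To make equations (1) and (2) concrete before continuing the derivation, the following minimal R sketch (not part of the original paper; the coefficient and covariate values are purely illustrative) evaluates the logistic response $\pi_i$ and the corresponding Bernoulli probability for a single observation.

# Illustrative coefficient vector (intercept, mother's age, parity, baby's sex)
beta <- c(0.555, -0.060, 0.408, -0.011)
# Illustrative covariate vector for one subject: intercept term, age 27, parity 5, baby boy
x_i  <- c(1, 27, 5, 1)

eta_i <- sum(beta * x_i)          # linear predictor beta' x_i
pi_i  <- 1 / (1 + exp(-eta_i))    # equation (2): E(Y_i) = [1 + exp(-beta' x_i)]^(-1)
pi_i                              # identical to plogis(eta_i)

# Bernoulli probability of an observed response y_i, as in f_i(Y_i)
y_i <- 1
dbinom(y_i, size = 1, prob = pi_i)   # pi_i^y_i * (1 - pi_i)^(1 - y_i)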
Since the $Y_i$ observations are independent, their joint probability function is

$$g(Y_1, \ldots, Y_n) = \prod_{i=1}^{n} f_i(Y_i) = \prod_{i=1}^{n} \pi_i^{Y_i} (1 - \pi_i)^{1 - Y_i}. \qquad (3)$$

To find the maximum likelihood estimates we take the log of both sides:

$$\log[g(Y_1, \ldots, Y_n)] = \log\left[ \prod_{i=1}^{n} \pi_i^{Y_i} (1 - \pi_i)^{1 - Y_i} \right] = \sum_{i=1}^{n} Y_i \log(\pi_i) + \sum_{i=1}^{n} (1 - Y_i) \log(1 - \pi_i) = \sum_{i=1}^{n} Y_i \log\!\left(\frac{\pi_i}{1 - \pi_i}\right) + \sum_{i=1}^{n} \log(1 - \pi_i). \qquad (4)$$

Therefore, the log-likelihood function for multiple logistic regression is

$$\log L(\beta) = \sum_{i=1}^{n} Y_i (\beta' X_i) - \sum_{i=1}^{n} \log(1 + \exp(\beta' X_i)). \qquad (5)$$

Step 2: Computer-intensive numerical search procedures are employed to find the values of $\beta_0, \beta_1, \ldots, \beta_{p-1}$ that maximize $\log L(\beta)$, using the Gauss-Newton method. These maximum likelihood estimates are denoted by $b_0, b_1, b_2, \ldots, b_{p-1}$:

$$b_{p \times 1} = \begin{pmatrix} b_0 \\ b_1 \\ \vdots \\ b_{p-1} \end{pmatrix}.$$

The fitted logistic response function and fitted values can be expressed as

$$\hat{E}(Y_i) = \hat{\pi}_i = \frac{\exp(b' x_i)}{1 + \exp(b' x_i)} = [1 + \exp(-b' x_i)]^{-1}. \qquad (6)$$

Here standard statistical R code and the Minitab package were used to carry out the numerical search for the maximum likelihood estimates by iteratively re-weighted least squares.
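As a sketch of what the standard R code in Step 2 amounts to, the snippet below fits the logistic regression by maximum likelihood with glm(), whose default fitting method is the iteratively re-weighted least squares just mentioned. The data frame name and column names (births, gestation, age, parity, sex) are assumptions made for illustration, not taken from the paper.

# Assumed data frame 'births' with columns gestation (0/1), age, parity and sex (0/1)
# births <- read.csv("births.csv")    # hypothetical source of the n = 256 records

fit <- glm(gestation ~ age + parity + factor(sex),
           family = binomial(link = "logit"),
           data = births)             # maximum likelihood fit via IRLS

summary(fit)$coefficients             # estimates b0,...,b3 with standard errors and Wald tests
logLik(fit)                           # maximized log-likelihood, cf. equation (5)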
Step 3: Iterative procedure:

a. Obtain starting values for the regression parameters, denoted by b(0). These can be obtained by ordinary least squares regression of Y on the predictor variables using a first-order linear model.

b. Using these starting values, obtain

$$\hat{\pi}'_i(0) = [b(0)]' x_i \qquad (7)$$

$$\hat{\pi}_i(0) = \frac{\exp[\hat{\pi}'_i(0)]}{1 + \exp[\hat{\pi}'_i(0)]} \qquad (8)$$

c. Calculate the new response variable

$$Y'_{i(0)} = \hat{\pi}'_i(0) + \frac{Y_i - \hat{\pi}_i(0)}{\hat{\pi}_i(0)[1 - \hat{\pi}_i(0)]} \qquad (9)$$

and the weights $W_i(0) = \hat{\pi}_i(0)[1 - \hat{\pi}_i(0)]$.

d. Regress $Y'_{i(0)}$ in equation (9) on the predictor variables $X_1, \ldots, X_{p-1}$ by re-weighted least squares to obtain b(1). Repeat steps (a) through (d), using the latest revised estimated regression coefficients, until there is little if any change in the estimated coefficients, which indicates convergence (Neter et al., 1996; Hamadu, 2010; Ryan, 1997).

Then omit the second row of the vector $Z_i = (Y_i, X_{ji})'$, label the remaining $n-1$ observation sets as $Z_2^{(J)}$, and estimate the coefficient $\hat{\beta}^{(J_2)}$ from $Z_2^{(J)}$, again using the maximum likelihood estimate of the logistic regression. Alternately, omit each row of the observation set in turn and estimate the coefficients $\hat{\beta}_i^{(J)}$ by maximum likelihood, where $Y_i = (y_1, y_2, y_3, \ldots, y_n)'$, the matrix of the independent variables is $X_{ji} = (x_{j1}, x_{j2}, x_{j3}, \ldots, x_{jn})'$, and $\hat{\beta}^{(J_i)}$ is the jackknife logistic regression coefficient vector estimated after deleting the ith observation set from $Z_i$.

Step 4: Obtain the probability distribution $F(\hat{\beta}^{(J_i)})$ of the jackknife estimates $\hat{\beta}^{(J_1)}, \hat{\beta}^{(J_2)}, \ldots, \hat{\beta}^{(J_n)}$.

Step 5: Calculate the jackknife regression coefficient estimate, which is the mean of the $F(\hat{\beta}^{(J_i)})$ distribution:

$$\hat{\beta}^{(J)} = \frac{\sum_{i=1}^{n} \hat{\beta}^{(J_i)}}{n} = \bar{\beta}^{(J_i)}. \qquad (10)$$

Step 6: The delete-one jackknife logistic regression equation is thus

$$\hat{Y} = \left[ 1 + \exp(-b^{(J)'} X) \right]^{-1},$$

where $b^{(J)}$ is the unbiased estimator of $\beta$.
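A compact R sketch of the delete-one algorithm as a whole is given below (the paper's own code, shown later, covers the delete-d case). It refits the model with each observation left out in turn and averages the replicates as in equation (10); the data frame and column names are the same hypothetical ones used in the earlier sketch.

n    <- nrow(births)                          # assumed data frame with the n = 256 records
reps <- matrix(NA_real_, nrow = n, ncol = 4)  # one row of coefficients per jackknife sample

for (i in 1:n) {
  fit_i <- glm(gestation ~ age + parity + factor(sex),
               family = binomial(link = "logit"),
               data = births[-i, ])           # delete-one jackknife sample Z_i^(J)
  reps[i, ] <- coef(fit_i)                    # beta-hat^(J_i)
}

beta_jack <- colMeans(reps)                   # equation (10): mean of the replicates
beta_jack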
2 JACKKNIFE DELETE-D ALGORITHM

The steps of the jackknife delete-d algorithm are as follows:

Step 1: Given a randomly drawn sample of size n, $(Z_1, Z_2, \ldots, Z_n)$, from a population, divide the sample into S independent groups of size d.

Step 2: Omit the first d observation sets from the full sample at a time and estimate the logistic regression coefficient $\hat{\beta}^{(J_1)}$ by maximum likelihood from the remaining $(n-d)$-sized observation set.

Step 3: Omit the second d observation sets from the full sample at a time and estimate the logistic regression coefficient $\hat{\beta}^{(J_2)}$ from the remaining $(n-d)$-sized observation set.

Step 4: Alternately omit each set of d of the n observations and estimate the coefficients $\hat{\beta}_k^{(J)}$, where $\hat{\beta}_k^{(J)}$ is the jackknife regression coefficient vector estimated after deleting the kth set of d observations from the full sample. Thus,

$$S = \binom{n}{d}$$

delete-d jackknife samples are obtained, $k = 1, 2, \ldots, S$.

Step 5: Obtain the probability distribution $F(\hat{\beta}^{(J)})$ of the delete-d jackknife estimates $\hat{\beta}^{(J_1)}, \hat{\beta}^{(J_2)}, \hat{\beta}^{(J_3)}, \ldots, \hat{\beta}^{(J_S)}$.

Step 6: Calculate the jackknife regression coefficient estimate, which is the mean of the $F(\hat{\beta}^{(J)})$ distribution:

$$\hat{\beta}^{(J)} = \frac{\sum_{k=1}^{S} \hat{\beta}^{(J_k)}}{S} = \bar{\beta}^{(J_k)}. \qquad (11)$$

The jackknife confidence interval is

$$\bar{X} \pm Z_{\alpha} \frac{\delta}{\sqrt{n}}. \qquad (12)$$

The jackknife delete-one estimate of the standard error is defined by

$$\widehat{se}_{jack} = \left[ \frac{n-1}{n} \sum_{i=1}^{n} \left( \hat{\theta}_{(i)} - \hat{\theta}_{(.)} \right)^{2} \right]^{1/2}, \qquad \text{where } \hat{\theta}_{(.)} = \frac{1}{n} \sum_{i=1}^{n} \hat{\theta}_{(i)}. \qquad (13)$$

The jackknife delete-d estimate of the standard error is

$$\widehat{se}_{jack(d)} = \left[ \frac{r}{\binom{n}{d}} \sum_{s} \left( \hat{\theta}_{(s)} - \hat{\theta}_{(.)} \right)^{2} \right]^{1/2}, \qquad r = \frac{n-d}{d}. \qquad (14)$$

(Efron, 1982; Sahinler and Topuz, 2007)
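Equations (12) and (13) can be applied directly to the matrix of delete-one replicates produced by the loop sketched above. Below is a minimal sketch, assuming that matrix is called reps and that n is defined as before; the interval uses the jackknife standard error in the spirit of equation (12).

theta_bar <- colMeans(reps)                              # theta-hat_(.) for each coefficient
se_jack   <- sqrt(((n - 1) / n) *
                  colSums(sweep(reps, 2, theta_bar)^2))  # equation (13): delete-one standard error

# Approximate 95% jackknife confidence interval for each coefficient
z <- qnorm(0.975)
cbind(lower = theta_bar - z * se_jack,
      upper = theta_bar + z * se_jack)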
Illustrative example

The logistic regression model was fitted to the data in Table 1, regressing gestation period on mother's age, parity and baby's sex.

Table 1: The data used in the calculation of the logistic regression and jackknife results, with n = 256

Observation          1    2    3    4   ...  254  255  256
Gestation period Y   1    0    0    1   ...  0    1    0
Mother's age         27   30   30   25  ...  36   30   31
Parity               5    1    1    2   ...  2    5    6
Baby's sex           1    0    1    1   ...  1    0    1

Let $Y_i$ be the response of the ith randomly selected subject (gestation period), which assumes the value 1 (positive response) or 0 (negative response) for $i = 1, 2, \ldots, n$. Let $X_1$, $X_2$ and $X_3$ be the independent variables mother's age, parity and sex of the baby respectively, where 1 represents a baby boy and 0 represents a baby girl.

Results: First, the logistic regression model was fitted to the data in Table 1 and the results are summarized in Table 2 below. The regression model is significant, as the P-value is 0.000 (P < 0.05). The jackknife samples were then generated by omitting each d = 1, 2 or 3 sample(s) of the n = 256 observation sets and estimating the coefficients $\hat{\beta}^{(J)}$.

Table 2: Summary statistics of the regression coefficients for the binary logistic regression

Variable                        Ord. $\hat{\beta}$   S.E.($\hat{\beta}$)   P-value   95% conf. interval (lower, upper)
Constant ($\hat{\beta}_0$)      0.555092             0.856362              0.517     -
Mother's age ($\hat{\beta}_1$)  -0.0601875           0.0321786             0.061     0.88, 1.00
Parity ($\hat{\beta}_2$)        0.407925             0.0899854             0.000     1.26, 1.79
Baby's sex ($\hat{\beta}_3$)    -0.0108895           0.268384              0.968     0.58, 1.67

Log-likelihood = -164.996. Test that all slopes are zero: G = 23.335, df = 3, P-value = 0.000.
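The log-likelihood and the G statistic reported under Table 2 (the likelihood-ratio test that all slopes are zero) could be reproduced along the following lines. This is a sketch under the same assumptions as before: fit is the hypothetical glm object fitted earlier and births the assumed data frame.

fit0 <- glm(gestation ~ 1, family = binomial, data = births)  # intercept-only (null) model

logLik(fit)                                          # reported as -164.996 for the paper's data
G  <- as.numeric(2 * (logLik(fit) - logLik(fit0)))   # reported as G = 23.335
df <- length(coef(fit)) - 1                          # 3 slope parameters
p  <- pchisq(G, df = df, lower.tail = FALSE)         # P-value of the test
c(G = G, df = df, p.value = p)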
Table 3: Illustration of the jackknife delete-1 logistic regression procedure on the data in Table 1: the coefficient estimates obtained after deleting the rth observation set, and their mean

r       $\hat{\beta}_0^{(J)}$   $\hat{\beta}_1^{(J)}$   $\hat{\beta}_2^{(J)}$   $\hat{\beta}_3^{(J)}$
1       0.4583                  -0.0573                 0.4139                  0.0089
2       0.5472                  -0.0591                 0.4036                  -0.0237
4       0.5302                  -0.0589                 0.4017                  -0.0018
...     ...                     ...                     ...                     ...
256     0.5334                  -0.0589                 0.4036                  -0.0187
Mean $\bar{\beta}^{(J_i)}$   0.5552   -0.0602   0.4080   -0.0109

Table 4: Summaries of the ordinary logistic regression and the jackknife delete-1, delete-2 and delete-3 estimates of the logistic regression coefficients

                            Constant ($\hat{\beta}_0$)   Mother's age ($\hat{\beta}_1$)   Parity ($\hat{\beta}_2$)   Baby's sex ($\hat{\beta}_3$)
Ordinary logistic coef.     0.55509                      -0.0602                          0.40793                    -0.0109
Delete-1 jackknife coef.    0.55519                      -0.0602                          0.40797                    -0.0109
Delete-2 jackknife coef.    0.5553                       -0.0607                          0.40809                    -0.0109
Delete-3 jackknife coef.    0.5554                       -0.0602                          0.40805                    -0.0109
Table 5: Summary statistics of the logistic regression standard errors and P-values, and the jackknife delete-1, delete-2 and delete-3 standard errors and P-values

Variable   Ord. S.E.   Del-1 S.E.   Del-2 S.E.   Del-3 S.E.   Ord. P-value   Del-1   Del-2   Del-3
Constant   0.8564      0.8581       0.8599       0.8619       0.517          0.548   0.563   0.5387
M. age     0.03218     0.03224      0.03226      0.03236      0.061          0.067   0.0734  0.067
Parity     0.08999     0.09030      0.0901       0.09078      0.00           0.000   0.000   0.000
B. sex     0.2684      0.2689       0.02692      0.26997      0.968          0.946   0.9508  0.929

3 Discussion and Conclusions

Jackknife samples were generated by omitting one, two or three of the n observations, corresponding to the delete-one, delete-two and delete-three jackknife respectively, for the estimation of the logistic regression coefficients $\hat{\beta}^{(J_i)}$. Ordinary logistic regression on the data in Table 1 gives $b_0 = 0.55509$, $b_1 = -0.06019$, $b_2 = 0.40793$ and $b_3 = -0.01089$ for $\beta_0, \beta_1, \ldots, \beta_3$ respectively; the estimated precisions of these estimates are $S(b_0) = 0.85636$, $S(b_1) = 0.03218$, $S(b_2) = 0.08999$ and $S(b_3) = 0.26838$, with 95% confidence intervals $0.88 \le \beta_1 \le 1.00$, $1.26 \le \beta_2 \le 1.79$ and $0.58 \le \beta_3 \le 1.67$ respectively. The jackknife results for the estimation of the logistic regression coefficients, their precision (standard error) and their confidence intervals, shown in Tables 4 and 5, reveal that the jackknife delete-one, delete-two and delete-three estimated coefficients are quite close to the analytical results. The estimated precisions and the confidence intervals of the jackknife delete-one, delete-two and delete-three are also very close to those of the analytical result. This confirms the agreement of the developed algorithm with the theoretical method.

Delete-d R algorithm

# This R code defines a function 'jack' for performing the delete-d jackknife for logistic regression.
# p is the number of columns in the data: p = 4 means 1 dependent variable and 3 independent variables.
# d is the number of rows to be deleted in each jackknife sample.
jack = function(data, p, d)
{
  n = length(data[, 1])                         # the sample size
  u = combn(n, d)                               # matrix of all possible combinations of d rows to delete
  output = matrix(0, ncol = p, nrow = ncol(u))  # define the output matrix of coefficients
  y = data[, 1]                                 # the response vector
  x = data[, 2:p]                               # the matrix of covariates
  for (i in 1:ncol(u))
  {
    dd = c(u[, i])
    yn = y[-dd]                                 # delete d rows of the dependent variable
    xn = x[-dd, ]                               # delete d rows of the independent variables
    logreg = glm(formula = yn ~ xn[, 1] + xn[, 2] + factor(xn[, 3]),
                 family = binomial(link = "logit"),
                 na.action = na.pass)           # assuming 3 independent variables, the third as a factor
    coef = logreg$coef
    output[i, ] = c(coef)                       # store the regression coefficients
  }
  output
}

# This part can be used to obtain a jackknife estimate of the regression coefficients
u = jack(data, 4, 2)
beta = c(mean(u[, 1]), mean(u[, 2]), mean(u[, 3]), mean(u[, 4]))

(Venables and Smith, 2007)
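The delete-d standard error of equation (14) can then be computed from the matrix returned by jack(). The lines below are a sketch, assuming the u = jack(data, 4, 2) call above and the scaling factor $r = (n-d)/d$ used in equation (14).

n <- nrow(data)                       # sample size used in the jack() call
d <- 2                                # number of rows deleted per jackknife sample
S <- nrow(u)                          # number of delete-d samples, S = choose(n, d)
r <- (n - d) / d                      # scaling factor assumed in equation (14)

theta_bar <- colMeans(u)                                          # theta-hat_(.) per coefficient
se_jack_d <- sqrt((r / S) * colSums(sweep(u, 2, theta_bar)^2))    # equation (14)
se_jack_d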
References

[1] Efron, B. (1982). The Jackknife, the Bootstrap and Other Resampling Plans. CBMS-NSF Regional Conference Series in Applied Mathematics, Philadelphia, Pennsylvania, 5-27.
[2] Efron, B. and Tibshirani, R.J. (1993). An Introduction to the Bootstrap. Chapman and Hall, New York.
[3] Hamadu, D. (2010). A Bootstrap Approach to Bias-Reduction of Non-linear Parameter in Regression Analysis. Journal of Science Research Development, Vol. 12, 110-127.
[4] Michael, P.L. (2008). Logistic Regression. Circulation, American Heart Association. doi:10.1161.
[5] Neter, J., Kutner, M.H., Nachtsheim, C.J. and Wasserman, W. (1996). Regression Analysis. Fourth Edition, McGraw-Hill, U.S.A., pp. 429-536.
[6] Russell, C. and Chritine, C. (2009). Resampling Method of Analysis in Simulation Studies. Proceedings of the Winter Simulation Conference, 45-59.
[7] Ryan, P.T. (1997). Modern Regression Methods. John Wiley and Sons Inc., Third Avenue, New York, pp. 255-308.
[8] Sahinler, S. and Topuz, D. (2007). Bootstrap and Jackknife Resampling Algorithms for Estimation of Regression Parameters. Journal of Applied Quantitative Methods, Vol. 2, No. 2, 188-199.
[9] Venables, W.N. and Smith, D.M. (2007). An Introduction to R: A Programming Environment for Data Analysis and Graphics. Version 2.6.1, 1-100.