glpls1a.mlogit.cv.error {gpls}                                R Documentation

Leave-one-out cross-validation error using MIRWPLS and MIRWPLSF models

Description

Leave-one-out cross-validation training-set error for fitting an MIRWPLS or MIRWPLSF model for multi-group classification.

Usage

glpls1a.mlogit.cv.error(train.X, train.y, K.prov = NULL, eps = 0.001, lmax = 100, mlogit = TRUE, br = TRUE)

Arguments

train.X n by p design matrix (with no intercept term) for the training set
train.y response vector with class labels 1 to C+1 for (C+1)-group classification; the baseline class should be 1 (see the input sketch after this list)
K.prov number of PLS components
eps tolerance for convergence
lmax maximum number of iterations allowed
mlogit if TRUE, use the multinomial logit model; otherwise fit all C-1 logistic models (vs baseline class 1) separately
br if TRUE, Firth's bias reduction procedure is used
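
A minimal sketch (simulated data, with an arbitrary choice of K.prov) of the input format the arguments describe; the call spells out every argument from the Usage section:

train.X <- matrix(rnorm(30 * 4), ncol = 4)  ## n = 30, p = 4, no intercept column
train.y <- sample(1:3, 30, replace = TRUE)  ## labels 1 to 3; class 1 is the baseline
glpls1a.mlogit.cv.error(train.X, train.y, K.prov = 2, eps = 0.001,
                        lmax = 100, mlogit = TRUE, br = TRUE)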

Value

error LOOCV training error
error.obs indices of the misclassified observations
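
Assuming the return value is a list with the two components listed above, a short sketch of how they might be inspected:

x <- matrix(rnorm(20), ncol = 2)
y <- sample(1:3, 10, TRUE)
res <- glpls1a.mlogit.cv.error(x, y, br = TRUE)
res$error                          ## LOOCV training error
x[res$error.obs, , drop = FALSE]   ## observations that were misclassified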

Author(s)

Beiying Ding, Robert Gentleman

References

  • Ding, B.Y. and Gentleman, R. (2003) Classification using generalized partial least squares.
  • Marx, B.D. (1996) Iteratively reweighted partial least squares estimation for generalized linear regression. Technometrics 38(4): 374-381.

See Also

glpls1a.cv.error, glpls1a.train.test.error, glpls1a, glpls1a.mlogit, glpls1a.logit.all

Examples

x <- matrix(rnorm(20), ncol = 2)
y <- sample(1:3, 10, TRUE)

## no bias reduction
glpls1a.mlogit.cv.error(x, y, br = FALSE)
glpls1a.mlogit.cv.error(x, y, mlogit = FALSE, br = FALSE)
## bias reduction
glpls1a.mlogit.cv.error(x, y, br = TRUE)
glpls1a.mlogit.cv.error(x, y, mlogit = FALSE, br = TRUE)
    
    
