glpls1a.cv.error {gpls}	R Documentation

Leave-one-out cross-validation error using IRWPLS and IRWPLSF model

Description

Leave-one-out cross-validation training set classification error for fitting an IRWPLS or IRWPLSF model for two-group classification.

Usage

glpls1a.cv.error(train.X, train.y, K.prov = NULL, eps = 1e-3, lmax = 100,
                 family = "binomial", link = "logit", br = TRUE)


Arguments

train.X	n by p design matrix (with no intercept term) for the training set
train.y	response vector (0 or 1) for the training set
K.prov	number of PLS components; the default is the rank of train.X
eps	tolerance for convergence
lmax	maximum number of iterations allowed
family	glm family; "binomial" is the only relevant one here
link	link function; "logit" is the only one practically implemented now
br	TRUE if Firth's bias reduction procedure is to be used



Value

error	LOOCV training error
error.obs	indices of the misclassified observations
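The two components returned here can be illustrated with a hand-rolled leave-one-out loop, using ordinary logistic regression via glm() as a stand-in for the IRWPLS/IRWPLSF fit (a conceptual sketch, not the package's implementation; loocv_error is a hypothetical helper):

```r
## Conceptual LOOCV sketch: refit on n-1 rows, predict the held-out row,
## and record a misclassification at the 0.5 probability cutoff.
## Plain glm() stands in here for the IRWPLS/IRWPLSF fit.
loocv_error <- function(X, y) {
  miss <- vapply(seq_along(y), function(i) {
    fit <- suppressWarnings(
      glm(y[-i] ~ X[-i, , drop = FALSE], family = binomial(link = "logit"))
    )
    eta <- sum(coef(fit) * c(1, X[i, ]))     # linear predictor for held-out row
    as.integer((plogis(eta) > 0.5) != y[i])  # 1 if misclassified
  }, integer(1))
  list(error = mean(miss),            # LOOCV training error
       error.obs = which(miss == 1))  # indices of misclassified observations
}
```

As in the Value section above, error is the fraction of held-out observations misclassified and error.obs gives their row indices.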



Author(s)

Beiying Ding, Robert Gentleman


References

  • Ding, B.Y. and Gentleman, R. (2003) Classification using generalized partial least squares.
  • Marx, B.D. (1996) Iteratively reweighted partial least squares estimation for generalized linear regression. Technometrics 38(4): 374-381.

See Also

glpls1a.train.test.error, glpls1a, glpls1a.mlogit, glpls1a.logit.all


Examples

 x <- matrix(rnorm(20),ncol=2)
 y <- sample(0:1,10,TRUE)
 ## no bias reduction
 glpls1a.cv.error(x,y,br=FALSE)
 ## bias reduction and 1 PLS component
 glpls1a.cv.error(x,y,K.prov=1,br=TRUE)

    [Package gpls version 1.0.6 Index]