Fits a nonparametric classification procedure based on the DD-plot (depth-versus-depth plot) in G dimensions (\(G = g \times p\): g group levels and p data depths).

Usage

classif.DD(
  group,
  fdataobj,
  depth = "FM",
  classif = "glm",
  w,
  par.classif = list(),
  par.depth = list(),
  control = list(verbose = FALSE, draw = TRUE, col = NULL, alpha = 0.25)
)

Arguments

group

Factor of length n with g levels.

fdataobj

A data.frame (multivariate covariates), an fdata object (functional covariates), or a list containing both types of covariate.

depth

Character vector specifying the type of depth functions to use, see Details.

classif

Character vector specifying the type of classifier method to use, see Details.

w

Optional vector of weights, one per entry of the depth argument; see Details.

par.classif

List of parameters for classif procedure.

par.depth

List of parameters for depth function.

control

List of parameters for controlling the process.

If verbose=TRUE, report extra information on progress.

If draw=TRUE, plot the DD-plot of the samples based on data depth.

col, the colors for the points in the DD-plot.

alpha, the alpha transparency used in the background of the DD-plot, a number in [0,1].
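
For instance, a minimal sketch of a customized control list (the values shown are illustrative, not defaults):

ctrl <- list(verbose = FALSE, draw = TRUE, col = gray(c(0, 0.5)), alpha = 0.2)
# out <- classif.DD(group, fdataobj, control = ctrl)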

Value

  • group.est: Vector of estimated group labels produced by the selected classification method.

  • misclassification: Probability of misclassification.

  • prob.classification: Probability of correct classification by group level.

  • dep: Data frame with the depth of the curves for functional data (or points for multivariate data) in fdataobj w.r.t. each group level.

  • depth: Character vector specifying the type of depth functions used.

  • par.depth: List of parameters for depth function.

  • classif: Type of classifier used.

  • par.classif: List of parameters for classif procedure.

  • w: Optional case weights.

  • fit: Object fitted by the classif method using the depths as covariates.
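
For instance, a minimal sketch of inspecting these components (out01 is fitted in the Examples section below):

out01$group.est            # estimated group labels
out01$misclassification    # overall misclassification probability
out01$prob.classification  # correct classification probability per group level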

Details

Performs the group classification of a training dataset by DD-classifier estimation in the following steps.

  1. The function computes the selected depth measures of the points in fdataobj w.r.t. a subsample of each of the g group levels and each of the p data dimensions (\(G = g \times p\)). The user can specify the parameters for the depth function in par.depth (a sketch is given at the end of this step).

    (i) Type of depth function for functional data, see Depth:

    • "FM": Fraiman and Muniz depth.

    • "mode": h-modal depth.

    • "RT": Random Tukey depth.

    • "RP": Random projection depth.

    • "RPD": Double random projection depth.

    (ii) Type of depth function for multivariate functional data, see depth.mfdata:

    • "FMp": Fraiman and Muniz depth with common support, assuming that all p fdata objects share the same support (the same rangeval); see depth.FMp.

    • "modep": h-modal depth using a p-dimensional metric; see depth.modep.

    • "RPp": Random projection depth using a p-variate depth on the projections; see depth.RPp.

    If the procedure requires computing a distance, as in the "knn" or "np" classifiers or the "mode" depth, the user must supply a proper distance function: metric.lp for functional data and metric.dist for multivariate data.

    (iii) Type of depth function for multivariate data, see Depth.Multivariate:

    • "SD": Simplicial depth (for bivariate data).

    • "HS": Half-space depth.

    • "MhD": Mahalanobis depth.

    • "RD": Random projections depth.

    • "LD": Likelihood depth.
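
    A minimal sketch of selecting a depth and passing its parameters through par.depth (gfat and ab are built in the Examples section below; the explicit metric is an illustrative assumption, not a default):

    out.fm   <- classif.DD(gfat, ab, depth = "FM", classif = "glm")
    out.mode <- classif.DD(gfat, ab, depth = "mode", classif = "np",
                           par.depth = list(metric = metric.lp))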

  2. The function calculates the misclassification rate based on the data depths computed in step (1), using the following classifiers:

    • "MaxD": Maximum depth.

    • "DD1": Search for the best separating polynomial of degree 1.

    • "DD2": Search for the best separating polynomial of degree 2.

    • "DD3": Search for the best separating polynomial of degree 3.

    • "glm": Logistic regression computed using Generalized Linear Models; see classif.glm.

    • "gam": Logistic regression computed using GAM models; see classif.gsam.

    • "lda": Linear Discriminant Analysis computed using lda.

    • "qda": Quadratic Discriminant Analysis computed using qda.

    • "knn": k-Nearest Neighbour classification computed using classif.knn.

    • "np": Non-parametric Kernel classifier computed using classif.np.

    The user can specify the parameters for the classifier in par.classif, such as the smoothing parameter par.classif[["h"]] if classif="np", or the number of nearest neighbours par.classif[["knn"]] if classif="knn".
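
    For instance, a minimal sketch (the bandwidth and number of neighbours are illustrative values, not defaults):

    out.np  <- classif.DD(gfat, ab, depth = "FM", classif = "np",
                          par.classif = list(h = 0.15))
    out.knn <- classif.DD(gfat, ab, depth = "FM", classif = "knn",
                          par.classif = list(knn = 5))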

    The polynomial classifiers ("DD1", "DD2" and "DD3") use the original procedure proposed by Li et al. (2012), by default rotating the DD-plot (exchanging abscissa and ordinate) through the par.classif argument rotate=TRUE. Notice that the maximum depth classifier can be considered a particular case of DD1, fixing the slope at 1 (par.classif=list(pol=1)).
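
    A minimal sketch of this equivalence (same objects as above):

    out.maxd <- classif.DD(gfat, ab, depth = "FM", classif = "MaxD")
    out.dd1  <- classif.DD(gfat, ab, depth = "FM", classif = "DD1",
                           par.classif = list(pol = 1))
    # out.maxd and out.dd1 implement the same decision rule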

    The number of possible different polynomials depends on the sample size n and increases polynomially with the order \(k\). With \(g\) groups, the procedure therefore applies a multiple-start optimization scheme to save time:

    • Generate all combinations of the \(n\) elements taken \(k\) at a time, giving \(g \times \binom{n}{k}\) candidate solutions. When this number exceeds nmax=10000, a random sample of 10000 combinations is selected.

    • Smooth the empirical loss with the logistic function \(1/(1+e^{-tx})\), as illustrated below. The classification rule is constructed by optimizing the best noptim combinations from this random sample (by default, noptim=1 and tt=50/range of the depth values). Note that Li et al. (2012) found the optimization results to be stable for \(t \in [50, 200]\) when the depths are standardized with an upper bound of 1.
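
    As an illustration of the smoothing step (smooth01 is a hypothetical helper, not part of the package):

    # Differentiable surrogate for the 0/1 loss: x is the signed distance of a
    # DD-plot point to the candidate polynomial; tt controls the sharpness.
    smooth01 <- function(x, tt = 50) 1 / (1 + exp(-tt * x))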

    The original procedure of Li et al. (2012) does not need to try many initial polynomials (nmax=1000) and optimizes only the best one (noptim=1), but we recommend repeating the last step for several solutions, for example nmax=250 and noptim=25. The user can change the parameters pol, rotate, nmax, noptim and tt through the argument par.classif.
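
    A minimal sketch of tuning these parameters (values taken from the recommendation above):

    out.dd2 <- classif.DD(gfat, ab, depth = "FM", classif = "DD2",
                          par.classif = list(rotate = TRUE, nmax = 250, noptim = 25))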

    The classif.DD procedure extends to multi-class problems by majority voting for the polynomial classifiers and by the one-vs-the-rest method for the logistic classifiers ("glm" and "gam").
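
    A minimal sketch for g > 2 groups (glearn and mlearn are built in the Examples section below; majority voting is applied internally for the polynomial rule):

    out.mc <- classif.DD(glearn, mlearn, depth = "FM", classif = "DD1")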

References

Cuesta-Albertos, J.A., Febrero-Bande, M. and Oviedo de la Fuente, M. (2017). The DDG-classifier in the functional setting. Test, 26(1), 119-142. doi:10.1007/s11749-016-0502-6.

Li, J., Cuesta-Albertos, J.A. and Liu, R.Y. (2012). DD-classifier: Nonparametric classification procedure based on DD-plot. Journal of the American Statistical Association, 107(498), 737-753.

See also

See also predict.classif.DD.

Author

This version was created by Manuel Oviedo de la Fuente and Manuel Febrero Bande, and includes the original version of the polynomial classifier created by Jun Li, Juan A. Cuesta-Albertos and Regina Y. Liu.

Examples

if (FALSE) { # \dontrun{
# DD-classif for functional data
data(tecator)
ab <- tecator$absorp.fdata
ab1 <- fdata.deriv(ab, nderiv = 1)
ab2 <- fdata.deriv(ab, nderiv = 2)
gfat <- factor(as.numeric(tecator$y$Fat>=15))

# DD-classif for p=1 functional data set
out01 <- classif.DD(gfat,ab,depth="mode",classif="np")
out02 <- classif.DD(gfat,ab2,depth="mode",classif="np")
# DD-plot in gray scale
ctrl <- list(draw=TRUE,col=gray(c(0,.5)),alpha=.2)
out02bis <- classif.DD(gfat,ab2,depth="mode",classif="np",control=ctrl)

# 2 depth functions (same curves) 
ldat <- mfdata("ab" = ab, "ab2" = ab2)
out03 <- classif.DD(gfat,list(ab2,ab2),depth=c("RP","mode"),classif="np")
# DD-classif for p=2 functional data set
# Weighted version 
out04 <- classif.DD(gfat, ldat, depth="mode",
                    classif="np", w=c(0.5,0.5))
# Model version
out05 <- classif.DD(gfat,ldat,depth="mode",classif="np")
# Integrated version (for multivariate functional data)
out06 <- classif.DD(gfat,ldat,depth="modep",classif="np")

# DD-classif for multivariate data
data(iris)
group <- iris[,5]
x <- iris[,1:4]
out07 <- classif.DD(group,x,depth="LD",classif="lda")
summary(out07)
out08 <- classif.DD(group, list(x,x), depth=c("MhD","LD"),
                    classif="lda")
summary(out08)

# DD-classif for functional data: g levels 
data(phoneme)
mlearn <- phoneme[["learn"]]
glearn <- as.numeric(phoneme[["classlearn"]])-1
out09 <- classif.DD(glearn,mlearn,depth="FM",classif="glm")
out10 <- classif.DD(glearn,list(mlearn,mlearn),depth=c("FM","RP"),classif="glm")
summary(out09)
summary(out10)
} # }