
Computes functional (ridge or penalized) regression between a functional explanatory variable \(X(t)\) and a scalar response \(Y\) using Principal Components Analysis.
$$Y=\big<X,\beta\big>+\epsilon=\int_{T}X(t)\beta(t)\,dt+\epsilon$$ where \( \big< \cdot , \cdot \big>\) denotes the inner product on \(L_2\) and \(\epsilon\) are random errors with mean zero, finite variance \(\sigma^2\) and \(E[X(t)\epsilon]=0\).
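In practice the integral is approximated on the discretization grid of the curves. The following toy base-R sketch (the grid, curve and coefficient function are made up for illustration and are not part of fda.usc) spells out that approximation for a single observation:

set.seed(1)
tt    <- seq(0, 1, length.out = 101)               # discretization grid of T
Xt    <- sin(2 * pi * tt) + rnorm(101, sd = 0.1)   # one observed curve X(t)
betat <- 2 * tt                                    # a hypothetical beta(t)
h     <- tt[2] - tt[1]                             # grid spacing
y     <- sum(Xt * betat) * h + rnorm(1, sd = 0.05) # <X, beta> by a Riemann sum, plus error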

Usage

fregre.pc(
  fdataobj,
  y,
  l = NULL,
  lambda = 0,
  P = c(0, 0, 1),
  weights = rep(1, len = n),
  ...
)

Arguments

fdataobj

fdata class object or fdata.comp class object created
by create.pc.basis function.

y

Scalar response with length n.

l

Index of components to include in the model. If l is NULL (the default), l = 1:3 is used.

lambda

Amount of penalization. Default value is 0, i.e. no penalization is used.

P

If P is a vector, its entries are the coefficients that define the penalty matrix object. If P is a matrix, it is used directly as the penalty matrix; see P.penalty and the sketch after this argument list.

weights

Optional vector of weights for the n observations; by default all weights are 1 (see Usage).

...

Further arguments passed to or from other methods.
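As noted for the P argument above, the vector form is expanded into a penalty matrix via P.penalty. A hedged sketch of both forms with the tecator data used in the Examples (the two calls are meant to express the same second-derivative penalty; the internal handling of the matrix form may differ in minor numerical details):

library(fda.usc)
data(tecator)
x <- tecator$absorp.fdata[1:129, ]
y <- tecator$y$Fat[1:129]
# Vector form: coefficients c(0, 0, 1) request a penalty on the second derivative
res.vec <- fregre.pc(x, y, l = 1:3, lambda = 1, P = c(0, 0, 1))
# Matrix form: build the penalty matrix explicitly on the argvals grid
Pmat <- P.penalty(x$argvals, P = c(0, 0, 1))
res.mat <- fregre.pc(x, y, l = 1:3, lambda = 1, P = Pmat)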

Value

An object of class fregre.fd (a list) with the following components:

  • call: The matched call of fregre.pc function.

  • coefficients: A named vector of coefficients.

  • residuals: y minus fitted values.

  • fitted.values: Estimated scalar response.

  • beta.est: Estimated beta parameter, an object of class fdata.

  • df.residual: The residual degrees of freedom. In ridge regression, the effective degrees of freedom are used.

  • r2: Coefficient of determination.

  • sr2: Residual variance.

  • Vp: Estimated covariance matrix for the parameters.

  • H: Hat matrix.

  • l: Index of principal components selected.

  • lambda: Amount of shrinkage.

  • P: Penalty matrix.

  • fdata.comp: Fitted object returned by the fdata2pc function.

  • lm: lm object.

  • fdataobj: Functional explanatory data.

  • y: Scalar response.
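The components above are typically inspected as in this short sketch (reusing the tecator data from the Examples section):

library(fda.usc)
data(tecator)
res <- fregre.pc(tecator$absorp.fdata[1:129, ], tecator$y$Fat[1:129])
res$coefficients   # intercept and principal component coefficients
plot(res$beta.est) # estimated beta(t), an fdata object
res$r2             # coefficient of determination
res$df.residual    # residual degrees of freedom
summary(res$lm)    # underlying lm fit on the selected components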

Details

The function computes the \(\left\{\nu_k\right\}_{k=1}^{\infty}\) orthonormal basis of functional principal components to represent the functional data as \(X_i(t)=\sum_{k=1}^{\infty}\gamma_{ik}\nu_k\) and the functional parameter as \(\beta(t)=\sum_{k=1}^{\infty}\beta_k\nu_k\), where \(\gamma_{ik}=\Big< X_i(t),\nu_k\Big>\) and \(\beta_{k}=\Big<\beta,\nu_k\Big>\).
The response can be fitted by:

  • \(\lambda=0\), no penalization, $$\hat{y}=\nu_k^{\top}(\nu_k^{\top}\nu_k)^{-1}\nu_k^{\top}y$$

  • Ridge regression, \(\lambda>0\) and \(P=1\), $$\hat{y}=\nu_k^{\top}(\nu_k^{\top} \nu_k+\lambda I)^{-1}\nu_k^{\top}y$$

  • Penalized regression, \(\lambda>0\) and \(P\neq0\). For example, \(P=c(0,0,1)\) penalizes the second derivative (curvature) via the penalty matrix P=P.penalty(fdataobj["argvals"],P), $$\hat{y}=\nu_k^{\top}(\nu_k^{\top} \nu_k+\lambda \nu_k^{\top} \textbf{P}\nu_k)^{-1}\nu_k^{\top}y$$ (see the sketch below).
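To make the role of the principal component scores concrete, here is a minimal sketch in which the non-penalized fit is reproduced as ordinary least squares on the scores and the ridge case is solved directly in score space. It assumes the tecator data from the Examples and that the score matrix is stored in the x component of the fdata2pc output (as in prcomp); fregre.pc's internal computations (centering, score scaling, degrees of freedom) may differ in detail, so this is illustrative rather than a re-implementation:

library(fda.usc)
data(tecator)
x <- tecator$absorp.fdata[1:129, ]
y <- tecator$y$Fat[1:129]
pc <- fdata2pc(x, ncomp = 3)   # gamma_ik scores and eigenfunctions nu_k
S  <- pc$x[, 1:3]
# lambda = 0: OLS on the scores; fitted values should agree with fregre.pc up to rounding
fit.ols <- lm(y ~ S)
res     <- fregre.pc(x, y, l = 1:3)
max(abs(fitted(fit.ols) - res$fitted.values))
# Ridge (P = identity) solved directly in score space: (S'S + lambda I)^{-1} S'y
lambda <- 1
Sc     <- scale(S, center = TRUE, scale = FALSE)
bhat   <- solve(crossprod(Sc) + lambda * diag(ncol(Sc)), crossprod(Sc, y - mean(y)))
yhat   <- mean(y) + drop(Sc %*% bhat)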

References

Cai TT, Hall P. 2006. Prediction in functional linear regression. Annals of Statistics 34: 2159-2179.

Cardot H, Ferraty F, Sarda P. 1999. Functional linear model. Statistics and Probability Letters 45: 11-22.

Hall P, Hosseini-Nasab M. 2006. On properties of functional principal components analysis. Journal of the Royal Statistical Society B 68: 109-126.

Febrero-Bande, M., Oviedo de la Fuente, M. (2012). Statistical Computing in Functional Data Analysis: The R Package fda.usc. Journal of Statistical Software, 51(4), 1-28. https://www.jstatsoft.org/v51/i04/

N. Kraemer, A.-L. Boulesteix, and G. Tutz (2008). Penalized Partial Least Squares with Applications to B-Spline Transformations and Functional Data. Chemometrics and Intelligent Laboratory Systems, 94, 60-69. doi:10.1016/j.chemolab.2008.06.009

See also

See also: fregre.pc.cv, summary.fregre.fd and predict.fregre.fd.

Alternative methods: fregre.basis and fregre.np.

Author

Manuel Febrero-Bande, Manuel Oviedo de la Fuente manuel.oviedo@udc.es

Examples

if (FALSE) { # \dontrun{
data(tecator)
absorp <- tecator$absorp.fdata
ind <- 1:129
x <- absorp[ind, ]
y <- tecator$y$Fat[ind]
# Default fit: first three principal components, no penalization
res <- fregre.pc(x, y)
summary(res)
# Manually selected components
res2 <- fregre.pc(x, y, l = c(1, 3, 4))
summary(res2)
# Functional Ridge Regression
res3 <- fregre.pc(x, y, l = c(1, 3, 4), lambda = 1, P = 1)
summary(res3)
# Functional Regression with 2nd derivative penalization
res4 <- fregre.pc(x, y, l = c(1, 3, 4), lambda = 1, P = c(0, 0, 1))
summary(res4)
# Compare the estimated beta(t) curves from the four fits
betas <- c(res$beta.est, res2$beta.est,
           res3$beta.est, res4$beta.est)
plot(betas)
} # }
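In the same spirit, a hedged sketch of prediction for the held-out tecator curves with predict.fregre.fd (listed in See also); the 1:129 versus remaining-curves split is only illustrative:

library(fda.usc)
data(tecator)
absorp <- tecator$absorp.fdata
ind <- 1:129
res <- fregre.pc(absorp[ind, ], tecator$y$Fat[ind])
pred <- predict(res, absorp[-ind, ])  # predictions for the new curves
plot(tecator$y$Fat[-ind], pred, xlab = "Observed Fat", ylab = "Predicted Fat")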