Computes functional regression between functional explanatory variables \((X^{1}(t_1),\ldots,X^{q}(t_q))\) and a scalar response \(Y\) using a backfitting algorithm.
Arguments
- formula: an object of class formula (or one that can be coerced to that class): a symbolic description of the model to be fitted. The procedure only considers functional covariates (it is not implemented for non-functional covariates). The details of model specification are given under Details.
- family: a description of the error distribution and link function to be used in the model. This can be a character string naming a family function, a family function or the result of a call to a family function. (See family for details of family functions.)
- data: List containing the variables in the model; see the sketch after this list for the expected structure.
- weights: Weights to be used in the fitting process.
- par.metric: List of arguments, one element per covariate, passed to the metric function.
- par.np: List of arguments, one element per covariate, passed to the fregre.np.cv function.
- offset: this can be used to specify an a priori known component to be included in the linear predictor during fitting.
- control: a list of parameters for controlling the fitting process; by default it contains maxit, epsilon, trace and inverse.
- ...: Further arguments passed to or from other methods.
- inverse: ="svd" (by default) or ="solve"; the matrix inversion method used.
Value
- result: List of nonparametric estimates, one element per functional covariate.
- fitted.values: Estimated scalar response.
- residuals: y minus fitted.values.
- effects: Estimated effects of the fitted model.
- alpha: Constant (intercept) term of the additive predictor.
- family: The family object used in the fit.
- linear.predictors: Fitted values on the link scale.
- deviance: Deviance of the fitted model.
- aic: Akaike information criterion of the fitted model.
- null.deviance: Deviance of the null (intercept-only) model.
- iter: Number of iterations of the fitting algorithm.
- w: Working weights from the final iteration.
- eqrank: Equivalent degrees of freedom used by each smoother.
- prior.weights: The weights initially supplied.
- y: Scalar response.
- H: Hat matrix; see Opsomer and Ruppert (1997) for more details.
- converged: Logical indicating whether the algorithm converged.
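For example, given the fitted object res from Example 3 in the Examples section (a brief sketch; only components listed above are accessed):

res$converged            # logical: did the algorithm converge?
res$iter                 # number of iterations used
head(res$fitted.values)  # estimated scalar response
head(res$residuals)      # y minus fitted.values
res$family               # family and link used in the fit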
Details
The smooth functions \(f(\cdot)\) are estimated nonparametrically by an iterative local scoring algorithm, applying Nadaraya-Watson weighted kernel smoothers via fregre.np.cv at each step; see Febrero-Bande and Gonzalez-Manteiga (2012) for more details.
Consider the fitted response \(\hat{Y}=g^{-1}(H_{Q}y)\), where \(H_{Q}\) is the weighted hat matrix. Opsomer and Ruppert (1997) solve a system of equations to fit the unknowns \(f(\cdot)\), computing the additive smoother matrix \(H_k\) such that \(\hat{f}_k(X^k)=H_{k}Y\) and \(H_Q=H_1+\cdots+H_q\). The additive model is fitted as follows:
$$\hat{Y}=g^{-1}\Big(\sum_{i=1}^{q}\hat{f}_i(X^{i})\Big)$$
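To make the backfitting step concrete, the following is a minimal, self-contained sketch of Gaussian (identity-link) backfitting with scalar covariates standing in for the functional ones; the helpers nw_smooth and backfit_gaussian are illustrative names only and are not part of fda.usc, which instead smooths over functional (semi)metric distances via fregre.np.cv.

# Nadaraya-Watson smoother with a Gaussian kernel (illustrative helper)
nw_smooth <- function(x, y, h = 0.3) {
  K <- dnorm(outer(x, x, "-") / h)       # kernel weights K((x_i - x_j)/h)
  as.vector((K %*% y) / rowSums(K))      # weighted local averages
}

# Backfitting loop for a Gaussian additive model (illustrative helper)
backfit_gaussian <- function(y, X, h = 0.3, maxit = 100, epsilon = 1e-6) {
  f <- matrix(0, nrow(X), ncol(X))       # additive components, one column per covariate
  alpha <- mean(y)                       # intercept
  for (it in seq_len(maxit)) {
    f_old <- f
    for (k in seq_len(ncol(X))) {
      # smooth the partial residuals against the k-th covariate
      partial <- y - alpha - rowSums(f[, -k, drop = FALSE])
      f[, k] <- nw_smooth(X[, k], partial, h)
      f[, k] <- f[, k] - mean(f[, k])    # centre each component for identifiability
    }
    if (max(abs(f - f_old)) < epsilon) break
  }
  list(alpha = alpha, f = f, fitted = alpha + rowSums(f), iter = it)
}

# Toy usage with two simulated scalar covariates
set.seed(1)
X <- cbind(runif(200), runif(200))
y <- sin(2 * pi * X[, 1]) + X[, 2]^2 + rnorm(200, sd = 0.1)
fit <- backfit_gaussian(y, X)
fit$iter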
References
Febrero-Bande, M. and Gonzalez-Manteiga, W. (2012). Generalized Additive Models for Functional Data. TEST, Springer-Verlag. doi:10.1007/s11749-012-0308-0
Opsomer, J.D. and Ruppert, D. (1997). Fitting a bivariate additive model by local polynomial regression. Annals of Statistics, 25, 186-211.
See also
fregre.gsam, fregre.glm and fregre.np.cv.
Examples
if (FALSE) { # \dontrun{
library(fda.usc)
data(tecator)
ab <- tecator$absorp.fdata[1:100]
ab2 <- fdata.deriv(ab, 2)
yfat <- tecator$y[1:100, "Fat"]

# Example 1: Changing the arguments par.np and family
yfat.cat <- ifelse(yfat < 15, 0, 1)
xlist <- list("df" = data.frame(yfat.cat), "ab" = ab, "ab2" = ab2)
f2 <- yfat.cat ~ ab + ab2
par.NP <- list("ab"  = list(Ker = AKer.norm, type.S = "S.NW"),
               "ab2" = list(Ker = AKer.norm, type.S = "S.NW"))
res2 <- fregre.gkam(f2, family = binomial(), data = xlist, par.np = par.NP)
res2

# Example 2: Changing the arguments par.metric and family link
par.metric <- list("ab"  = list(metric = semimetric.deriv, nderiv = 2, nbasis = 15),
                   "ab2" = list("metric" = semimetric.basis))
res3 <- fregre.gkam(f2, family = binomial("probit"), data = xlist,
                    par.metric = par.metric, control = list(maxit = 2, trace = FALSE))
summary(res3)

# Example 3: Gaussian family (by default)
# Only 1 iteration (the default is maxit = 100)
xlist <- list("df" = data.frame(yfat), "ab" = ab, "ab2" = ab2)
f <- yfat ~ ab + ab2
res <- fregre.gkam(f, data = xlist, control = list(maxit = 1, trace = FALSE))
res
} # }