symbolic description of the model of type y ~ x or y ~ x | z, specifying the variables influencing the mean and the precision of y, respectively. For details see betareg.
partition
symbolic description of the partitioning variables, e.g., ~ p1 + p2. The argument partition can be omitted if formula is a three-part formula of type y ~ x | z | p1 + p2.
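As a sketch of the two equivalent call forms described above (assuming y, x, z, p1, p2 are placeholder column names in a data frame d, not variables from this package):

```r
## the two calls below specify the same beta regression tree:
## partitioning variables given separately via 'partition' ...
betatree(y ~ x | z, partition = ~ p1 + p2, data = d)
## ... or folded into a three-part formula
betatree(y ~ x | z | p1 + p2, data = d)
```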
data, subset, na.action, weights, offset, cluster
arguments controlling data/model processing passed to mob.
link
character specification of the link function in the mean model (mu). Currently, “logit”, “probit”, “cloglog”, “cauchit”, “log”, “loglog” are supported. Alternatively, an object of class “link-glm” can be supplied.
link.phi
character specification of the link function in the precision model (phi). Currently, “identity”, “log”, “sqrt” are supported. Alternatively, an object of class “link-glm” can be supplied.
control
a list of control arguments for the beta regression specified via betareg.control.
…
further control arguments for the recursive partitioning passed to mob_control.
Details
Beta regression trees are an application of model-based recursive partitioning (implemented in mob, see Zeileis et al. 2008) to beta regression (implemented in betareg, see Cribari-Neto and Zeileis 2010). See also Grün et al. (2012) for further details.
Various methods are provided for “betatree” objects, most of which inherit their behavior from “mob” objects (e.g., print, summary, coef, etc.). The plot method employs the node_bivplot panel-generating function.
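A sketch of typical method calls on a fitted “betatree” object (assuming bt is a tree fitted as in the Examples section):

```r
## methods inherited from "mob"/"modelparty" objects
## (sketch; bt is assumed to be a fitted "betatree")
print(bt)              ## display the tree structure
coef(bt)               ## coefficients of the terminal-node models
summary(bt, node = 2)  ## summary of the fitted model in node 2
plot(bt)               ## tree display via node_bivplot panels
```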
Value
betatree() returns an object of S3 class “betatree” which inherits from “modelparty”.
References
Cribari-Neto F, Zeileis A (2010). Beta Regression in R. Journal of Statistical Software, 34(2), 1–24. doi:10.18637/jss.v034.i02
Grün B, Kosmidis I, Zeileis A (2012). Extended Beta Regression in R: Shaken, Stirred, Mixed, and Partitioned. Journal of Statistical Software, 48(11), 1–25. doi:10.18637/jss.v048.i11
Zeileis A, Hothorn T, Hornik K (2008). Model-Based Recursive Partitioning. Journal of Computational and Graphical Statistics, 17(2), 492–514.
See Also
betareg, betareg.fit, mob
Examples
library("betareg")
options(digits = 4)
suppressWarnings(RNGversion("3.5.0"))

## data with two groups of dyslexic and non-dyslexic children
data("ReadingSkills", package = "betareg")
## additional random noise (not associated with reading scores)
set.seed(1071)
ReadingSkills$x1 <- rnorm(nrow(ReadingSkills))
ReadingSkills$x2 <- runif(nrow(ReadingSkills))
ReadingSkills$x3 <- factor(rnorm(nrow(ReadingSkills)) > 0)

## fit beta regression tree: in each node
## - accuracy's mean and precision depend on iq
## - partitioning is done by dyslexia and the noise variables x1, x2, x3
## only dyslexia is correctly selected for splitting
bt <- betatree(accuracy ~ iq | iq, ~ dyslexia + x1 + x2 + x3,
  data = ReadingSkills, minsize = 10)
plot(bt)
Call:
betatree(formula = accuracy ~ iq | iq, data = ReadingSkills)

Quantile residuals:
   Min     1Q Median     3Q    Max 
-2.426 -0.631 -0.067  0.778  1.555 

Coefficients (mean model with logit link):
            Estimate Std. Error z value Pr(>|z|)    
(Intercept)   0.3809     0.0486    7.83  4.8e-15 ***
iq           -0.0862     0.0549   -1.57     0.12    

Phi coefficients (precision model with log link):
            Estimate Std. Error z value Pr(>|z|)    
(Intercept)    4.808      0.414   11.61   <2e-16 ***
iq             0.826      0.395    2.09    0.036 *  
---
Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1

Type of estimator: ML (maximum likelihood)
Log-likelihood: 27.3 on 4 Df
Pseudo R-squared: 0.0391
Number of iterations: 16 (BFGS) + 2 (Fisher scoring)
## IGNORE_RDIFF_END

## add a numerical variable with relevant information for splitting
ReadingSkills$x4 <- rnorm(nrow(ReadingSkills),
  c(-1.5, 1.5)[ReadingSkills$dyslexia])
bt2 <- betatree(accuracy ~ iq | iq, ~ x1 + x2 + x3 + x4,
  data = ReadingSkills, minsize = 10)
plot(bt2)