asmahani r-universe repository: https://asmahani.r-universe.dev
Packages updated in asmahani (cranlike-server 0.17.1; Mon, 20 Feb 2023 22:40:06 GMT)

[asmahani] DBR 1.4.1 (maintainer: Alireza Mahani, alireza.s.mahani@gmail.com)
Bayesian beta regression, adapted for the bounded discrete responses commonly seen in surveys. Estimation is done via Markov chain Monte Carlo sampling, using a Gibbs wrapper around the univariate slice sampler (Neal (2003) <DOI:10.1214/aos/1056562461>), as implemented in the R package MfUSampler (Mahani and Sharabiani (2017) <DOI:10.18637/jss.v078.c01>).
Updated: Mon, 20 Feb 2023 22:40:06 GMT | Build: failure (https://github.com/r-universe/asmahani/actions/runs/8276249841)
Source: https://github.com/cran/DBR
Vignette: "Bayesian Discretised Beta Regression" (DBR.Rnw -> DBR.pdf; 2022-03-23 11:30:11, 2022-08-06 19:20:02)

[asmahani] CFC 1.2.0 (maintainer: Alireza S. Mahani, alireza.s.mahani@gmail.com)
Numerical integration of cause-specific survival curves to
arrive at cause-specific cumulative incidence functions, with
three usage modes: 1) Convenient API for parametric survival
regression followed by competing-risk analysis, 2) API for CFC,
accepting user-specified survival functions in R, and 3) Same
as 2, but accepting survival functions in C++. For mathematical
details and software tutorial, see Mahani and Sharabiani (2019)
<DOI:10.18637/jss.v089.i09>.
Updated: Mon, 09 Jan 2023 07:00:06 GMT | Build: success (https://github.com/r-universe/asmahani/actions/runs/8229084624)
Source: https://github.com/cran/CFC

[asmahani] BSGW 0.9.4 (maintainer: Alireza S. Mahani, alireza.s.mahani@gmail.com)
Bayesian survival model using Weibull regression on both
scale and shape parameters. Dependence of shape parameter on
covariates permits deviation from proportional-hazard
assumption, leading to dynamic - i.e. non-constant with time -
hazard ratios between subjects. Bayesian Lasso shrinkage in the
form of two Laplace priors - one for scale and one for shape
coefficients - allows for many covariates to be included.
Cross-validation helper functions can be used to tune the
shrinkage parameters. Markov chain Monte Carlo (MCMC) sampling
using a Gibbs wrapper around Radford Neal's univariate slice
sampler (R package MfUSampler) is used for coefficient
estimation.
Updated: Mon, 12 Dec 2022 12:10:08 GMT | Build: success (https://github.com/r-universe/asmahani/actions/runs/8313285757)
Source: https://github.com/cran/BSGW

[asmahani] MfUSampler 1.1.0 (maintainer: Alireza S. Mahani, alireza.s.mahani@gmail.com)
Convenience functions for multivariate MCMC using
univariate samplers including: slice sampler with stepout and
shrinkage (Neal (2003) <DOI:10.1214/aos/1056562461>), adaptive
rejection sampler (Gilks and Wild (1992)
<DOI:10.2307/2347565>), adaptive rejection Metropolis (Gilks et
al (1995) <DOI:10.2307/2986138>), and univariate Metropolis
with Gaussian proposal.
Updated: Thu, 08 Dec 2022 07:02:34 GMT | Build: success (https://github.com/r-universe/asmahani/actions/runs/8586903383)
Source: https://github.com/cran/MfUSampler
Vignette: "Multivariate-from-Univariate MCMC Sampler: R Package MfUSampler" (MfUSampler.Rnw -> MfUSampler.pdf; 2014-12-22, 2015-08-13)

[asmahani] sns 1.2.2 (maintainer: Alireza Mahani, alireza.s.mahani@gmail.com)
Stochastic Newton Sampler (SNS) is a
Metropolis-Hastings-based, Markov Chain Monte Carlo sampler for
twice differentiable, log-concave probability density functions
(PDFs) where the proposal density function is a multivariate
Gaussian resulting from a second-order Taylor-series expansion
of log-density around the current point. The mean of the
Gaussian proposal is the full Newton-Raphson step from the
current point. A Boolean flag allows for switching from SNS to
Newton-Raphson optimization (by choosing the mean of proposal
function as next point). This can be used during burn-in to get
close to the mode of the PDF (which is unique due to
concavity). For high-dimensional densities, mixing can be
improved via a 'state space partitioning' strategy, in which SNS
is applied to disjoint subsets of state space, wrapped in a
Gibbs cycle. Numerical differentiation is available when
analytical expressions for the gradient and Hessian are not,
and facilities for validation and numerical differentiation of
the log-density are provided. Note: Formerly
available versions of the MfUSampler can be obtained from the
archive
<https://cran.r-project.org/src/contrib/Archive/MfUSampler/>.
Updated: Wed, 02 Nov 2022 10:02:22 GMT | Build: success (https://github.com/r-universe/asmahani/actions/runs/8260791858)
Source: https://github.com/cran/sns
Vignette: "Stochastic Newton Sampler: The R Package sns" (SNS.Rnw -> SNS.pdf; 2015-01-30, 2022-11-02 10:02:22)

[asmahani] SAMUR 1.1 (maintainer: Alireza S. Mahani, alireza.s.mahani@gmail.com)
Augmenting a matched data set by generating multiple
stochastic, matched samples from the data using a
multi-dimensional histogram constructed from dropping the input
matched data into a multi-dimensional grid built on the full
data set. The resulting stochastic, matched sets will likely
provide a collectively higher coverage of the full data set
compared to the single matched set. Each stochastic match is
without duplication, thus allowing downstream validation
techniques such as cross-validation to be applied to each set
without concern for overfitting.
Updated: Wed, 31 Aug 2022 12:00:14 GMT | Build: success (https://github.com/r-universe/asmahani/actions/runs/8415693898)
Source: https://github.com/cran/SAMUR

[asmahani] MatchLinReg 0.8.1 (maintainer: Alireza S. Mahani, alireza.s.mahani@gmail.com)
Core functions as well as diagnostic and calibration tools for combining matching and linear regression for causal inference in observational studies.
Updated: Tue, 30 Aug 2022 12:30:08 GMT | Build: success (https://github.com/r-universe/asmahani/actions/runs/8518662283)
Source: https://github.com/cran/MatchLinReg
Vignette: "Matching and Linear Regression" (MatchLinReg.pdf.asis -> MatchLinReg.pdf; 2022-08-30 12:30:08, 2022-08-30 12:30:08)

[asmahani] EnsemblePCReg 1.1.4 (maintainer: Alireza S. Mahani, alireza.s.mahani@gmail.com)
Extends the base classes and methods of 'EnsembleBase'
package for Principal-Components-Regression-based (PCR)
integration of base learners. Default implementation uses
cross-validation error to choose the optimal number of PC
components for the final predictor. The package takes advantage
of the file method provided in 'EnsembleBase' package for
writing estimation objects to disk in order to circumvent the RAM
bottleneck. Special save and load methods are provided to allow
estimation objects to be saved to permanent files on disk, and
to be loaded again into temporary files in a later R session.
Users and developers can extend the package by extending the
generic methods and classes provided in 'EnsembleBase' package
as well as this package.
Updated: Mon, 18 Apr 2022 20:54:29 GMT | Build: success (https://github.com/r-universe/asmahani/actions/runs/8496314497)
Source: https://github.com/cran/EnsemblePCReg
Vignette: "Multi-stage heterogeneous ensemble meta-learning with hands-off user-interface and on-demand prediction using principal components regression: The R package EnsemblePCReg" (EnsemblePCReg.pdf.asis -> EnsemblePCReg.pdf; 2016-02-13 00:33:03, 2016-06-29 02:38:08)

[asmahani] RegressionFactory 0.7.4 (maintainer: Alireza S. Mahani, alireza.s.mahani@gmail.com)
The expander functions rely on the mathematics developed
for the Hessian-definiteness invariance theorem for linear
projection transformations of variables, described in authors'
paper, to generate the full, high-dimensional gradient and
Hessian from the lower-dimensional derivative objects. This
greatly relieves the computational burden of generating the
regression-function derivatives, which in turn can be fed into
any optimization routine that utilizes such derivatives. The
theorem guarantees that Hessian definiteness is preserved,
meaning that reasoning about this property can be performed in
the low-dimensional space of the base distribution. This is
often a much easier task than its equivalent in the full,
high-dimensional space. Definiteness of Hessian can be useful
in selecting optimization/sampling algorithms such as
Newton-Raphson optimization or its sampling equivalent, the
Stochastic Newton Sampler. Finally, in addition to being a
computational tool, the regression expansion framework is of
conceptual value by offering new opportunities to generate
novel regression problems.
Updated: Mon, 26 Oct 2020 05:30:07 GMT | Build: success (https://github.com/r-universe/asmahani/actions/runs/8407330105)
Source: https://github.com/cran/RegressionFactory
Vignette: "Expander Framework for Generating High-Dimensional GLM Gradient and Hessian from Low-Dimensional Base Distributions: R Package MfUSampler" (RegressionFactory.Rnw -> RegressionFactory.pdf; 2015-01-01, 2020-10-26 05:30:07)

[asmahani] EnsemblePenReg 0.7 (maintainer: Alireza S. Mahani, alireza.s.mahani@gmail.com)
Extending the base classes and methods of EnsembleBase
package for Penalized-Regression-based (Ridge and Lasso)
integration of base learners. Default implementation uses
cross-validation error to choose the optimal lambda (shrinkage
parameter) for the final predictor. The package takes advantage
of the file method provided in EnsembleBase package for writing
estimation objects to disk in order to circumvent the RAM
bottleneck. Special save and load methods are provided to allow
estimation objects to be saved to permanent files on disk, and
to be loaded again into temporary files in a later R session.
Users and developers can extend the package by extending the
generic methods and classes provided in EnsembleBase package as
well as this package.
Updated: Wed, 14 Sep 2016 18:50:35 GMT | Build: success (https://github.com/r-universe/asmahani/actions/runs/8586840908)
Source: https://github.com/cran/EnsemblePenReg

[asmahani] EnsembleBase 1.0.2 (maintainer: Alireza S. Mahani, alireza.s.mahani@gmail.com)
Extensible S4 classes and methods for batch training of
regression and classification algorithms such as Random Forest,
Gradient Boosting Machine, Neural Network, Support Vector
Machines, K-Nearest Neighbors, Penalized Regression (L1/L2),
and Bayesian Additive Regression Trees. These algorithms
constitute a set of 'base learners', which can subsequently be
combined together to form ensemble predictions. This package
provides cross-validation wrappers to allow for downstream
application of ensemble integration techniques, including
best-error selection. All base learner estimation objects are
retained, allowing for repeated prediction calls without the
need for re-training. For large problems, an option is provided
to save estimation objects to disk, along with prediction
methods that utilize these objects. This allows users to train
and predict with large ensembles of base learners without being
constrained by system RAM.
Updated: Tue, 13 Sep 2016 22:30:52 GMT | Build: success (https://github.com/r-universe/asmahani/actions/runs/8407420371)
Source: https://github.com/cran/EnsembleBase

[asmahani] EnsembleCV 0.8 (maintainer: Alireza S. Mahani, alireza.s.mahani@gmail.com)
Extends the base classes and methods of EnsembleBase
package for cross-validation-based integration of base
learners. Default implementation calculates average of repeated
CV errors, and selects the base learner / configuration with
minimum average error. The package takes advantage of the file
method provided in EnsembleBase package for writing estimation
objects to disk in order to circumvent the RAM bottleneck. Special
save and load methods are provided to allow estimation objects
to be saved to permanent files on disk, and to be loaded again
into temporary files in a later R session. The package can be
extended, e.g. by adding variants of the current
implementation.
Updated: Tue, 13 Sep 2016 22:20:51 GMT | Build: success (https://github.com/r-universe/asmahani/actions/runs/8313507302)
Source: https://github.com/cran/EnsembleCV

[asmahani] BayesMixSurv 0.9.1 (maintainer: Alireza S. Mahani, alireza.s.mahani@gmail.com)
Bayesian Mixture Survival Models using Additive
Mixture-of-Weibull Hazards, with Lasso Shrinkage and
Stratification. As a Bayesian dynamic survival model, it
relaxes the proportional-hazard assumption. Lasso shrinkage
controls overfitting, given the increase in the number of free
parameters in the model due to the presence of two Weibull
components in the hazard function.
Updated: Thu, 08 Sep 2016 10:24:27 GMT | Build: success (https://github.com/r-universe/asmahani/actions/runs/8399884588)
Source: https://github.com/cran/BayesMixSurv
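The CFC entry above describes numerically integrating cause-specific survival curves into cumulative incidence functions, CIF_k(t) = integral from 0 to t of h_k(u) * S(u) du. A minimal sketch of that integration (an illustration in Python, not CFC's R/C++ API; all names here are ours), using constant cause-specific hazards so the result can be checked against the closed form:

```python
import math

def cif_trapezoid(hazards, t_max, n=2000):
    """Cause-specific cumulative incidence by numerical integration:
    CIF_k(t) = integral_0^t h_k(u) * S(u) du, with S the all-cause survival."""
    dt = t_max / n
    total = sum(hazards)                                   # all-cause hazard (constant here)
    S = [math.exp(-total * i * dt) for i in range(n + 1)]  # overall survival on the grid
    cifs = []
    for h in hazards:
        acc, cif = 0.0, [0.0]
        for i in range(1, n + 1):
            acc += 0.5 * dt * h * (S[i - 1] + S[i])        # trapezoid rule
            cif.append(acc)
        cifs.append(cif)
    return S, cifs

S, cifs = cif_trapezoid([0.3, 0.7], t_max=5.0)
# Closed form for constant hazards: CIF_k(t) = h_k/(h1+h2) * (1 - exp(-(h1+h2)*t))
exact = 0.3 * (1 - math.exp(-5.0))
print(abs(cifs[0][-1] - exact) < 1e-4)  # True: integration matches the closed form
```

Note the accounting identity this makes visible: the two incidences plus the remaining overall survival sum to one at any horizon.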
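The BSGW entry notes that letting the Weibull shape parameter depend on covariates produces hazard ratios that change over time. A tiny illustration of that point (our own sketch, not BSGW code): for h(t) = (a/b)*(t/b)^(a-1), two subjects with different shapes a have a time-varying hazard ratio, so proportional hazards fails.

```python
def weibull_hazard(t, shape, scale):
    # h(t) = (shape/scale) * (t/scale)^(shape - 1)
    return (shape / scale) * (t / scale) ** (shape - 1)

# Two subjects with the same scale but different shapes: their hazard
# ratio is a power of t, hence non-constant over time.
def hazard_ratio(t):
    return weibull_hazard(t, 1.5, 2.0) / weibull_hazard(t, 0.8, 2.0)

print(hazard_ratio(0.5) < 1.0 < hazard_ratio(4.0))  # True: the ratio even crosses 1
```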
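Several entries above (DBR, BSGW, MfUSampler) rest on Neal's (2003) univariate slice sampler with stepping-out and shrinkage, wrapped in a Gibbs cycle for multivariate targets. A self-contained sketch of one univariate update (illustrative Python, not MfUSampler's R implementation):

```python
import math, random

def slice_sample(x0, logf, w=1.0, m=100):
    """One update of Neal's (2003) univariate slice sampler,
    with stepping-out and shrinkage."""
    logy = logf(x0) + math.log(random.random())   # log of the auxiliary slice height
    # Step out: randomly position an interval of width w around x0, then
    # expand each end (up to m total expansions) until it exits the slice.
    L = x0 - w * random.random()
    R = L + w
    j = random.randrange(m)
    k = m - 1 - j
    while j > 0 and logf(L) > logy:
        L -= w; j -= 1
    while k > 0 and logf(R) > logy:
        R += w; k -= 1
    # Shrink: sample uniformly on [L, R], pulling the interval in on rejection.
    while True:
        x1 = L + (R - L) * random.random()
        if logf(x1) > logy:
            return x1
        if x1 < x0:
            L = x1
        else:
            R = x1

# Sample a standard normal and check the first two moments roughly.
random.seed(1)
logf = lambda x: -0.5 * x * x
xs, x = [], 0.0
for _ in range(20000):
    x = slice_sample(x, logf)
    xs.append(x)
mean = sum(xs) / len(xs)
var = sum((v - mean) ** 2 for v in xs) / len(xs)  # both should be near 0 and 1
```

For a multivariate target, the MfUSampler-style usage applies this update to each coordinate in turn while holding the others fixed, which is the "Gibbs wrapper" the descriptions refer to.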
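The sns entry describes a Metropolis-Hastings sampler whose Gaussian proposal comes from a second-order Taylor expansion of the log-density, with the full Newton-Raphson step as the proposal mean. A one-dimensional sketch for an illustrative log-concave target (a toy of ours, not the package's multivariate implementation):

```python
import math, random

def logf(x):  return -0.5 * x * x - 0.25 * x ** 4   # log-concave target (unnormalised)
def grad(x):  return -x - x ** 3
def hess(x):  return -1.0 - 3.0 * x * x             # strictly negative everywhere

def log_q(y, x):
    """Log-density of the Gaussian proposal centred at the Newton step from x."""
    mu = x - grad(x) / hess(x)        # full Newton-Raphson step
    var = -1.0 / hess(x)              # inverse of the negative Hessian
    return -0.5 * math.log(2 * math.pi * var) - 0.5 * (y - mu) ** 2 / var

def sns_step(x):
    mu = x - grad(x) / hess(x)
    sd = math.sqrt(-1.0 / hess(x))
    y = random.gauss(mu, sd)
    # Metropolis-Hastings correction for the state-dependent proposal
    log_alpha = logf(y) + log_q(x, y) - logf(x) - log_q(y, x)
    return y if math.log(random.random()) < log_alpha else x

random.seed(7)
x, xs = 1.0, []
for _ in range(20000):
    x = sns_step(x)
    xs.append(x)
mean = sum(xs) / len(xs)                  # target is symmetric about 0
m2 = sum(v * v for v in xs) / len(xs)     # second moment, well below 1 here
```

The burn-in flag the entry mentions corresponds to returning the proposal mean mu instead of sampling, which walks deterministically toward the unique mode.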
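The RegressionFactory entry describes expanding low-dimensional base derivatives into the full gradient and Hessian of a GLM-style log-likelihood: with eta = X*beta and per-observation derivatives g_i and h_i, the chain rule for the linear projection gives G = X^T g and H = X^T diag(h) X, and negativity of every h_i carries over to negative-definiteness of H. A small sketch of that expansion (our illustration of the stated mathematics, not the package's code), using a Poisson log-likelihood as the base distribution:

```python
import math

# Toy data: 4 observations, 2 coefficients (intercept + one covariate)
X = [[1.0, 0.5], [1.0, -1.2], [1.0, 2.0], [1.0, 0.1]]
y = [1.0, 0.0, 3.0, 1.0]
beta = [0.2, 0.3]

eta = [sum(x_j * b_j for x_j, b_j in zip(row, beta)) for row in X]

# Base (univariate) derivatives of the Poisson log-likelihood
# l_i(eta_i) = y_i * eta_i - exp(eta_i):
g = [y_i - math.exp(e_i) for y_i, e_i in zip(y, eta)]   # dl/deta
h = [-math.exp(e_i) for e_i in eta]                     # d2l/deta2, always < 0

# Expansion to the full gradient and Hessian: G = X^T g, H = X^T diag(h) X
n, p = len(X), len(beta)
G = [sum(X[i][j] * g[i] for i in range(n)) for j in range(p)]
H = [[sum(X[i][j] * h[i] * X[i][k] for i in range(n)) for k in range(p)]
     for j in range(p)]

# Every h_i < 0, so definiteness invariance guarantees H is negative
# definite; for p = 2, check the leading principal minors.
det = H[0][0] * H[1][1] - H[0][1] * H[1][0]
print(H[0][0] < 0 and det > 0)  # True
```

This is why the entry says reasoning about definiteness can be done in the low-dimensional space of the base distribution: the sign pattern of the scalar h_i settles the matter for the full H.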