The goal of higrad is to implement the Hierarchical Incremental GRAdient Descent (HiGrad) algorithm. Like SGD, HiGrad is a first-order algorithm for finding the minimizer of a function in online learning; in addition, it attaches confidence intervals to its predictions to quantify their uncertainty.
This is a basic example that shows how to fit a linear regression with higrad on simulated data. The predictions obtained at the end come with 95% confidence intervals.
library(higrad)
# generate a data set for linear regression
n <- 1e6
d <- 50
sigma <- 1
theta <- rep(0, d)
x <- matrix(rnorm(n * d), n, d)
y <- as.numeric(x %*% theta + rnorm(n, 0, sigma))
# fit the linear regression with higrad using the default setting
fit <- higrad(x, y, model = "lm")
# predict for 10 new samples
newx <- matrix(rnorm(10 * d), 10, d)
pred <- predict(fit, newx)
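For intuition, the averaging idea behind HiGrad's confidence intervals can be sketched outside the package. The sketch below (in Python, so it runs standalone) uses a single burn-in segment followed by one flat split into K independent SGD threads on a one-dimensional least-squares problem; the real HiGrad splits hierarchically and handles the resulting correlation structure, so treat this as a simplified illustration, not the package's implementation. All names, step sizes, and segment lengths here are arbitrary choices for the demo.

```python
import math
import random
import statistics

random.seed(0)

THETA_TRUE = 2.0   # true coefficient, chosen arbitrarily for the demo
SIGMA = 1.0        # noise level of the simulated responses

def sgd_segment(theta, n_steps, step0):
    """Run n_steps of SGD for 1-D least squares on freshly simulated data.

    Returns the final iterate and the average of the iterates
    (Polyak-style averaging), which serves as the segment's estimate."""
    total = 0.0
    for t in range(1, n_steps + 1):
        x = random.gauss(0, 1)
        y = THETA_TRUE * x + random.gauss(0, SIGMA)
        grad = (theta * x - y) * x          # gradient of 0.5 * (theta*x - y)^2
        theta -= step0 / math.sqrt(t) * grad
        total += theta
    return theta, total / n_steps

# Burn-in segment shared by all threads.
theta0, _ = sgd_segment(0.0, 2000, 0.1)

# Split into K independent threads; each thread's estimate is its iterate average.
K = 4
estimates = [sgd_segment(theta0, 20000, 0.1)[1] for _ in range(K)]

# The K estimates are roughly i.i.d. around the minimizer, so a t-interval applies.
center = statistics.mean(estimates)
spread = statistics.stdev(estimates)
t_crit = 3.182   # 97.5% t quantile with K - 1 = 3 df, hard-coded for the sketch
half = t_crit * spread / math.sqrt(K)
print(f"estimate = {center:.3f}, 95% CI = [{center - half:.3f}, {center + half:.3f}]")
```

The key point mirrored from HiGrad is that running several SGD threads from a common warm start yields approximately independent estimates, whose sample spread gives an honest uncertainty measure at no extra data cost compared with one long SGD run.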