The madgrad package is an R port of the original madgrad by Aaron Defazio and Samy Jelassi. See the arXiv paper for details on the method.
madgrad is not yet on CRAN. The development version can be installed from GitHub with:
# install.packages("devtools")
devtools::install_github("mlverse/madgrad")
This is a small example showing how to use madgrad with torch to minimize a function. Of course, madgrad is not the best algorithm for this task; it is better suited to neural network training, and a toy sketch of that use follows the example below.
library(madgrad)
library(torch)
torch_manual_seed(1)
# The function we want to minimize (a log-scaled sum of squares).
f <- function(x, y) {
  log((1.5 - x + x*y)^2 + (2.25 - x - x*(y^2))^2 + (2.625 - x + x*(y^3))^2)
}
x <- torch_tensor(-5, requires_grad = TRUE)
y <- torch_tensor(-2, requires_grad = TRUE)
# MADGRAD optimizer over both parameters.
opt <- optim_madgrad(params = list(x, y), lr = 0.1)
for (i in 1:100) {
  opt$zero_grad()   # reset gradients from the previous step
  z <- f(x, y)      # forward pass: evaluate the function
  z$backward()      # compute gradients
  opt$step()        # update x and y
}
x
#> torch_tensor
#> 2.2882
#> [ CPUFloatType{1} ]
y
#> torch_tensor
#> 0.2412
#> [ CPUFloatType{1} ]
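
Since madgrad is aimed mainly at neural network training, here is a minimal sketch of that use. It assumes the standard torch workflow (nn_linear, nnf_mse_loss); the data, model size, number of epochs, and learning rate are arbitrary choices for illustration, not recommendations.

library(madgrad)
library(torch)

# Toy regression data: y = 2 * x + 1 plus noise (illustrative only).
x_train <- torch_randn(100, 1)
y_train <- 2 * x_train + 1 + 0.1 * torch_randn(100, 1)

# A single linear layer as the model.
model <- nn_linear(1, 1)

# MADGRAD over the model parameters; lr = 0.01 is an arbitrary choice here.
opt <- optim_madgrad(model$parameters, lr = 0.01)

for (epoch in 1:200) {
  opt$zero_grad()
  loss <- nnf_mse_loss(model(x_train), y_train)
  loss$backward()
  opt$step()
}

# The learned weight and bias should end up close to 2 and 1.
model$parameters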