First, we check whether the JuliaCall package is installed, and install it if needed.
if (!requireNamespace("JuliaCall", quietly = TRUE)) install.packages("JuliaCall")
Then we load it.
library("JuliaCall")
We need to tell R about the directory where Julia's binaries are installed:
julia_setup(JULIA_HOME = "/home/jchiquet/julia-1.0.0/bin")
## Julia version 1.0.0 at location /home/jchiquet/julia-1.0.0/bin will be used.
## Loading setup script for JuliaCall...
## Finish loading setup script for JuliaCall.
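Note that if the julia executable is already on your PATH (or the JULIA_HOME environment variable is set), julia_setup() should be able to locate it without this argument.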
It is then possible to install Julia packages via R:
julia_install_package_if_needed("Optim")
julia_install_package_if_needed("LineSearches")
Commands are then executed line by line, which is a bit annoying, but it works!
julia_library("Optim")
julia_library("LineSearches")
julia_command("rosenbrock(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2")
## rosenbrock (generic function with 1 method)
julia_eval("optimize(rosenbrock, zeros(2), BFGS())")
## Julia Object of type Optim.MultivariateOptimizationResults{BFGS{InitialStatic{Float64},HagerZhang{Float64,Base.RefValue{Bool}},getfield(Optim, Symbol("##17#19"))},Float64,Array{Float64,1},Float64,Float64,Array{OptimizationState{Float64,BFGS{InitialStatic{Float64},HagerZhang{Float64,Base.RefValue{Bool}},getfield(Optim, Symbol("##17#19"))}},1}}.
## Results of Optimization Algorithm
## * Algorithm: BFGS
## * Starting Point: [0.0,0.0]
## * Minimizer: [0.9999999926033423,0.9999999852005353]
## * Minimum: 5.471433e-17
## * Iterations: 16
## * Convergence: true
## * |x - x'| ≤ 0.0e+00: false
## |x - x'| = 3.47e-07
## * |f(x) - f(x')| ≤ 0.0e+00 |f(x)|: false
## |f(x) - f(x')| = 1.20e+03 |f(x)|
## * |g(x)| ≤ 1.0e-08: true
## |g(x)| = 2.33e-09
## * Stopped by an increasing objective: false
## * Reached Maximum Number of Iterations: false
## * Objective Calls: 53
## * Gradient Calls: 53
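JuliaCall can also pass objects between the two sessions, instead of encoding everything as strings. A minimal sketch (not from the original post), reusing the rosenbrock function defined above: julia_assign() copies an R object into the Julia session, and julia_eval() converts Julia results back to R.
julia_assign("x0", c(-1.2, 1.0)) # copy an R vector into the Julia session as x0
julia_eval("Optim.minimizer(optimize(rosenbrock, x0, BFGS()))") # comes back as an R numeric vector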
It is also possible to include Julia chunks just like we do with R in Rmarkdown! First, I needed to link my Julia binary to the usual place where Julia is installed and where XRJulia will look for it:
sudo ln -s /home/jchiquet/julia-1.0.0/bin/julia /usr/local/bin/julia
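A Julia chunk is then declared like an R chunk, simply with julia as the engine name (a minimal sketch; the julia engine for knitr is provided by JuliaCall):
```{julia}
# this code runs in the Julia session started by JuliaCall
2 + 2
```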
The Julia code below then runs directly inside such chunks:
using Pkg
Pkg.add("Optim")
Pkg.add("LineSearches")
using Optim, LineSearches
rosenbrock(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2
## rosenbrock (generic function with 1 method)
result = optimize(rosenbrock, zeros(2), BFGS())
## Results of Optimization Algorithm
## * Algorithm: BFGS
## * Starting Point: [0.0,0.0]
## * Minimizer: [0.9999999926033423,0.9999999852005353]
## * Minimum: 5.471433e-17
## * Iterations: 16
## * Convergence: true
## * |x - x'| ≤ 0.0e+00: false
## |x - x'| = 3.47e-07
## * |f(x) - f(x')| ≤ 0.0e+00 |f(x)|: false
## |f(x) - f(x')| = 1.20e+03 |f(x)|
## * |g(x)| ≤ 1.0e-08: true
## |g(x)| = 2.33e-09
## * Stopped by an increasing objective: false
## * Reached Maximum Number of Iterations: false
## * Objective Calls: 53
## * Gradient Calls: 53
optimize(rosenbrock, zeros(2), BFGS(linesearch = LineSearches.BackTracking()))
## Results of Optimization Algorithm
## * Algorithm: BFGS
## * Starting Point: [0.0,0.0]
## * Minimizer: [0.9999999926655744,0.9999999853309254]
## * Minimum: 5.379380e-17
## * Iterations: 23
## * Convergence: true
## * |x - x'| ≤ 0.0e+00: false
## |x - x'| = 1.13e-09
## * |f(x) - f(x')| ≤ 0.0e+00 |f(x)|: false
## |f(x) - f(x')| = 1.57e-01 |f(x)|
## * |g(x)| ≤ 1.0e-08: true
## |g(x)| = 8.79e-11
## * Stopped by an increasing objective: false
## * Reached Maximum Number of Iterations: false
## * Objective Calls: 31
## * Gradient Calls: 24
optimize(rosenbrock, zeros(2), BFGS(linesearch = LineSearches.BackTracking(order = 2)))
## Results of Optimization Algorithm
## * Algorithm: BFGS
## * Starting Point: [0.0,0.0]
## * Minimizer: [0.9999999926644578,0.9999999853284671]
## * Minimum: 5.381020e-17
## * Iterations: 23
## * Convergence: true
## * |x - x'| ≤ 0.0e+00: false
## |x - x'| = 4.73e-09
## * |f(x) - f(x')| ≤ 0.0e+00 |f(x)|: false
## |f(x) - f(x')| = 9.42e-01 |f(x)|
## * |g(x)| ≤ 1.0e-08: true
## |g(x)| = 1.76e-10
## * Stopped by an increasing objective: false
## * Reached Maximum Number of Iterations: false
## * Objective Calls: 29
## * Gradient Calls: 24
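Comparing the three runs: both backtracking variants reach essentially the same minimizer, taking 23 iterations instead of 16 but only about 30 objective calls instead of 53, so the cheaper line searches win in total function evaluations here.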
For the Rosenbrock example, the analytical gradient can be shown to be:
function g!(G, x)
G[1] = -2.0 * (1.0 - x[1]) - 400.0 * (x[2] - x[1]^2) * x[1]
G[2] = 200.0 * (x[2] - x[1]^2)
end
## g! (generic function with 1 method)
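These expressions come from differentiating the Rosenbrock function coordinate by coordinate:
\[
\frac{\partial f}{\partial x_1} = -2\,(1 - x_1) - 400\,x_1\,(x_2 - x_1^2),
\qquad
\frac{\partial f}{\partial x_2} = 200\,(x_2 - x_1^2).
\]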
optimize(rosenbrock, g!, zeros(2), LBFGS())
## Results of Optimization Algorithm
## * Algorithm: L-BFGS
## * Starting Point: [0.0,0.0]
## * Minimizer: [0.999999999999928,0.9999999999998559]
## * Minimum: 5.191703e-27
## * Iterations: 24
## * Convergence: true
## * |x - x'| ≤ 0.0e+00: false
## |x - x'| = 4.58e-11
## * |f(x) - f(x')| ≤ 0.0e+00 |f(x)|: false
## |f(x) - f(x')| = 8.50e+07 |f(x)|
## * |g(x)| ≤ 1.0e-08: true
## |g(x)| = 1.44e-13
## * Stopped by an increasing objective: false
## * Reached Maximum Number of Iterations: false
## * Objective Calls: 67
## * Gradient Calls: 67
In addition to providing gradients, you can provide a Hessian function h! as well. In our current case this is:
function h!(H, x)
H[1, 1] = 2.0 - 400.0 * x[2] + 1200.0 * x[1]^2
H[1, 2] = -400.0 * x[1]
H[2, 1] = -400.0 * x[1]
H[2, 2] = 200.0
end
## h! (generic function with 1 method)
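Its entries are just the second partial derivatives of the function:
\[
\frac{\partial^2 f}{\partial x_1^2} = 2 - 400\,x_2 + 1200\,x_1^2,
\qquad
\frac{\partial^2 f}{\partial x_1\,\partial x_2} = -400\,x_1,
\qquad
\frac{\partial^2 f}{\partial x_2^2} = 200.
\]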
optimize(rosenbrock, g!, h!, zeros(2)) # newton
## Results of Optimization Algorithm
## * Algorithm: Newton's Method
## * Starting Point: [0.0,0.0]
## * Minimizer: [0.9999999999999994,0.9999999999999989]
## * Minimum: 3.081488e-31
## * Iterations: 14
## * Convergence: true
## * |x - x'| ≤ 0.0e+00: false
## |x - x'| = 3.06e-09
## * |f(x) - f(x')| ≤ 0.0e+00 |f(x)|: false
## |f(x) - f(x')| = 3.03e+13 |f(x)|
## * |g(x)| ≤ 1.0e-08: true
## |g(x)| = 1.11e-15
## * Stopped by an increasing objective: false
## * Reached Maximum Number of Iterations: false
## * Objective Calls: 44
## * Gradient Calls: 44
## * Hessian Calls: 14
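As a final aside (not in the original post), hand-coding g! and h! is optional: assuming a version of Optim.jl that supports the autodiff keyword, the derivatives can be generated by forward-mode automatic differentiation instead.
# Newton's method again, with gradient and Hessian obtained via ForwardDiff
optimize(rosenbrock, zeros(2), Newton(); autodiff = :forward)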