Replies: 3 comments 1 reply
-
Hi @edljk, it is possible, but it is not immediately available. It is possible to wrap a model so that the Hessian is substituted by an L-BFGS approximation, but that approximation is only available as a matrix-vector product, because we don't form the matrix in the L-BFGS case. A user recently asked about this for the Percival solver, in which case it is possible, and here is an example: https://gist.github.com/abelsiqueira/c4e4c590fb7f0d755f756e21e11e40ad

The issue is that DCISolver.jl uses factorizations, so the strategy above doesn't work there. What is possible is to create a similar NLPModel wrapper that provides a dense BFGS matrix, or maybe just a simple diagonal matrix, when the Hessian is asked for. So I see three possibilities:

- wrap the model with the L-BFGS operator (matrix-vector products only), which works for Percival but not for DCISolver.jl;
- write a wrapper that returns a dense BFGS matrix when the Hessian is asked for;
- write a wrapper that returns a simple diagonal matrix instead.

Let me know if this is what you are looking for, and please tell me a bit about your use case.
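For reference, here is a minimal sketch of the first option, the operator-only route (the objective and starting point are just illustrative, and it assumes ADNLPModels, NLPModels and NLPModelsModifiers are installed):

using ADNLPModels, NLPModels, NLPModelsModifiers

f(x) = (x[1] - 1)^2 + 4 * (x[2] - x[1]^2)^2
nlp = ADNLPModel(f, [-1.2; 1.0])

# Replace the exact Hessian by an L-BFGS approximation.
qnlp = LBFGSModel(nlp)

# Only Hessian-vector products are available; the matrix is never formed,
# which is why factorization-based solvers such as DCISolver.jl cannot use
# this wrapper directly.
v = hess_op(qnlp, qnlp.meta.x0) * ones(2)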
-
Hi @abelsiqueira, thank you so much for the very precise answer. Here is what I ended up with, following your Percival example:

using NLPModels, ADNLPModels, Percival, NLPModelsModifiers, ManualNLPModels

A = [1.0 1.0]
b = [1.0]
f(x) = (x[1] - 1)^2 + 4 * (x[2] - x[1]^2)^2
g!(gx, x) = begin
    gx[1] = 2 * (x[1] - 1) - 16 * x[1] * (x[2] - x[1]^2)
    gx[2] = 8 * (x[2] - x[1]^2)
    gx
end
nlp = NLPModel(
    [-1.2; 1.0],
    f,
    grad = g!,
    cons = (x -> A * x .- b, [0.0], [0.0]),
)
# Wrap the model so that the Hessian is replaced by an L-BFGS approximation.
qnlp = LBFGSModel(nlp)
xt = zeros(2)
gt = zeros(2)
cb = (nlp, solver, stats) -> begin
    @show stats.iter solver.x
    # Handle the L-BFGS update via the callback: push the latest (s, y) pair
    # into the quasi-Newton operator (here nlp is the LBFGSModel passed to percival).
    if stats.iter > 1
        xt .-= solver.x
        xt .*= -1   # s = xₖ₊₁ - xₖ
        gt .-= solver.gx
        gt .*= -1   # y = ∇f(xₖ₊₁) - ∇f(xₖ)
        push!(nlp, xt, gt)
    end
    xt .= solver.x
    gt .= solver.gx
end
stats = percival(qnlp, callback=cb, verbose=1)
@show hess_op(qnlp, stats.solution) |> Matrix
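Not part of the snippet above, but as a quick sanity check one could also build the same problem with ADNLPModels and compare the exact Hessian at the solution with the limited-memory approximation (names reused from the code above; the two need not coincide, since L-BFGS only approximates curvature along the visited directions):

using LinearAlgebra
adnlp = ADNLPModel(f, [-1.2; 1.0], x -> A * x .- b, [0.0], [0.0])
H_exact = Matrix(hess(adnlp, stats.solution))      # exact objective Hessian
H_lbfgs = Matrix(hess_op(qnlp, stats.solution))    # L-BFGS approximation
@show norm(H_exact - H_lbfgs)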
-
It works perfectly.
-
Is it possible to use DCISolver.jl when only the analytical cost gradient and the Jacobian of the constraints are available in the model?
Is there a way to use, for instance, a BFGS / L-BFGS approximation of the Hessian?
Thanks for the nice work!