Ipopt.jl is a wrapper for the Ipopt solver.
This wrapper is maintained by the JuMP community and is not a COIN-OR project.
If you need help, please ask a question on the JuMP community forum.
If you have a reproducible example of a bug, please open a GitHub issue.
Ipopt.jl is licensed under the MIT License. The underlying solver, coin-or/Ipopt, is licensed under the Eclipse Public License.
Install Ipopt.jl using the Julia package manager:

```julia
import Pkg
Pkg.add("Ipopt")
```
In addition to installing the Ipopt.jl package, this will also download and install the Ipopt binaries. You do not need to install Ipopt separately.
To use a custom binary, read the Custom solver binaries section of the JuMP documentation.
For details on using a different linear solver, see the Linear Solvers section below. You do not need a custom binary to change the linear solver.
You can use Ipopt with JuMP as follows:
```julia
using JuMP, Ipopt

model = Model(Ipopt.Optimizer)
set_attribute(model, "max_cpu_time", 60.0)
set_attribute(model, "print_level", 0)
```
Ipopt.jl v1.10.0 moved the `Ipopt.Optimizer` object to a package extension. As a consequence, `Ipopt.Optimizer` is now type unstable, and it will be inferred as `Ipopt.Optimizer()::Any`.
In most cases, this should not impact performance. If it does, there are two work-arounds.
First, you can use a function barrier:
```julia
using JuMP, Ipopt

function main(optimizer::T) where {T}
    model = Model(optimizer)
    return
end

main(Ipopt.Optimizer)
```
Although the outer `Ipopt.Optimizer` is type unstable, the `optimizer` inside `main` will be properly inferred.
Second, you may explicitly get and use the extension module:
```julia
using JuMP, Ipopt

const IpoptMathOptInterfaceExt =
    Base.get_extension(Ipopt, :IpoptMathOptInterfaceExt)

model = Model(IpoptMathOptInterfaceExt.Optimizer)
```
The Ipopt optimizer supports the following constraints and attributes.
List of supported objective functions:
* `MOI.ObjectiveFunction{MOI.ScalarAffineFunction{Float64}}`
* `MOI.ObjectiveFunction{MOI.ScalarNonlinearFunction}`
* `MOI.ObjectiveFunction{MOI.ScalarQuadraticFunction{Float64}}`
* `MOI.ObjectiveFunction{MOI.VariableIndex}`
List of supported variable types:

* `MOI.Reals`
List of supported constraint types:
* `MOI.ScalarAffineFunction{Float64}` in `MOI.EqualTo{Float64}`
* `MOI.ScalarAffineFunction{Float64}` in `MOI.GreaterThan{Float64}`
* `MOI.ScalarAffineFunction{Float64}` in `MOI.Interval{Float64}`
* `MOI.ScalarAffineFunction{Float64}` in `MOI.LessThan{Float64}`
* `MOI.ScalarNonlinearFunction` in `MOI.EqualTo{Float64}`
* `MOI.ScalarNonlinearFunction` in `MOI.GreaterThan{Float64}`
* `MOI.ScalarNonlinearFunction` in `MOI.Interval{Float64}`
* `MOI.ScalarNonlinearFunction` in `MOI.LessThan{Float64}`
* `MOI.ScalarQuadraticFunction{Float64}` in `MOI.EqualTo{Float64}`
* `MOI.ScalarQuadraticFunction{Float64}` in `MOI.GreaterThan{Float64}`
* `MOI.ScalarQuadraticFunction{Float64}` in `MOI.Interval{Float64}`
* `MOI.ScalarQuadraticFunction{Float64}` in `MOI.LessThan{Float64}`
* `MOI.VariableIndex` in `MOI.EqualTo{Float64}`
* `MOI.VariableIndex` in `MOI.GreaterThan{Float64}`
* `MOI.VariableIndex` in `MOI.Interval{Float64}`
* `MOI.VariableIndex` in `MOI.LessThan{Float64}`
List of supported model attributes:
* `MOI.BarrierIterations`
* `MOI.NLPBlock`
* `MOI.NLPBlockDualStart`
* `MOI.Name`
* `MOI.ObjectiveSense`
* `MOI.SolveTimeSec`
List of supported optimizer attributes:

* `MOI.RawOptimizerAttribute`
* `MOI.Silent`
* `MOI.TimeLimitSec`
Supported options are listed in the Ipopt documentation.
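As a sketch of how these pieces fit together, the following model uses several of the supported objective and constraint types and sets one documented Ipopt option (`"tol"`); the variable names and data are illustrative only:

```julia
using JuMP, Ipopt

model = Model(Ipopt.Optimizer)
set_attribute(model, "tol", 1e-8)              # a documented Ipopt option
@variable(model, 0 <= x <= 2)                  # variable bounds (MOI.VariableIndex constraints)
@variable(model, y >= 0)
@constraint(model, x + 2y <= 4)                # MOI.ScalarAffineFunction-in-LessThan
@constraint(model, x^2 + y^2 <= 4)             # MOI.ScalarQuadraticFunction-in-LessThan
@constraint(model, exp(x) + y >= 1.5)          # MOI.ScalarNonlinearFunction-in-GreaterThan
@objective(model, Min, (x - 1)^2 + (y - 2)^2)  # quadratic objective
optimize!(model)
value(x), value(y)
```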
Solver-specific callbacks

Ipopt provides a callback that can be used to log the status of the optimization during a solve. It can also be used to terminate the optimization by returning `false`. Here is an example:
```julia
using JuMP, Ipopt, Test

model = Model(Ipopt.Optimizer)
set_silent(model)
@variable(model, x >= 1)
@objective(model, Min, x + 0.5)
x_vals = Float64[]
function my_callback(
    alg_mod::Cint,
    iter_count::Cint,
    obj_value::Float64,
    inf_pr::Float64,
    inf_du::Float64,
    mu::Float64,
    d_norm::Float64,
    regularization_size::Float64,
    alpha_du::Float64,
    alpha_pr::Float64,
    ls_trials::Cint,
)
    push!(x_vals, callback_value(model, x))
    @test isapprox(obj_value, 1.0 * x_vals[end] + 0.5, atol = 1e-1)
    # return `true` to keep going, or `false` to terminate the optimization.
    return iter_count < 1
end
MOI.set(model, Ipopt.CallbackFunction(), my_callback)
optimize!(model)
@test MOI.get(model, MOI.TerminationStatus()) == MOI.INTERRUPTED
@test length(x_vals) == 2
```
See the Ipopt documentation for an explanation of the arguments to the callback. They are identical to the output contained in the logging table printed to the screen.
To access the current solution and primal, dual, and complementarity violations of each iteration, use `Ipopt.GetIpoptCurrentViolations` and `Ipopt.GetIpoptCurrentIterate`. The two functions are identical to the ones in the Ipopt C interface.
Ipopt.jl wraps the Ipopt C interface with minimal modifications.
A complete example is available in the `test/C_wrapper.jl` file.
For simplicity, the five callbacks required by Ipopt are slightly different from the C interface. They are as follows:
""" eval_f(x::Vector{Float64})::Float64 Returns the objective value `f(x)`. """ function eval_f end """ eval_grad_f(x::Vector{Float64}, grad_f::Vector{Float64})::Nothing Fills `grad_f` in-place with the gradient of the objective function evaluated at `x`. """ function eval_grad_f end """ eval_g(x::Vector{Float64}, g::Vector{Float64})::Nothing Fills `g` in-place with the value of the constraints evaluated at `x`. """ function eval_g end """ eval_jac_g( x::Vector{Float64}, rows::Vector{Cint}, cols::Vector{Cint}, values::Union{Nothing,Vector{Float64}}, )::Nothing Compute the Jacobian matrix. * If `values === nothing` - Fill `rows` and `cols` with the 1-indexed sparsity structure * Otherwise: - Fill `values` with the elements of the Jacobian matrix according to the sparsity structure. !!! warning If `values === nothing`, `x` is an undefined object. Accessing any elements in it will cause Julia to segfault. """ function eval_jac_g end """ eval_h( x::Vector{Float64}, rows::Vector{Cint}, cols::Vector{Cint}, obj_factor::Float64, lambda::Float64, values::Union{Nothing,Vector{Float64}}, )::Nothing Compute the Hessian-of-the-Lagrangian matrix. * If `values === nothing` - Fill `rows` and `cols` with the 1-indexed sparsity structure * Otherwise: - Fill `values` with the Hessian matrix according to the sparsity structure. !!! warning If `values === nothing`, `x` is an undefined object. Accessing any elements in it will cause Julia to segfault. """ function eval_h end
If you get a termination status `MOI.INVALID_MODEL`, it is probably because you have some undefined value in your model, for example, a division by zero. Fix this by removing the division, or by imposing variable bounds so that you cut off the undefined region.
Instead of:

```julia
model = Model(Ipopt.Optimizer)
@variable(model, x)
@NLobjective(model, Min, 1 / x)
```

do:

```julia
model = Model(Ipopt.Optimizer)
@variable(model, x >= 0.0001)
@NLobjective(model, Min, 1 / x)
```
Linear Solvers

To improve performance, Ipopt supports a number of linear solvers.
Obtain a license and download `HSL_jll.jl` from https://licences.stfc.ac.uk/products/Software/HSL/LibHSL.
Install this download into your current environment using:
```julia
import Pkg
Pkg.develop(path = "/full/path/to/HSL_jll.jl")
```
Then, use a linear solver in HSL by setting the `hsllib` and `linear_solver` attributes:
```julia
using JuMP, Ipopt
import HSL_jll

model = Model(Ipopt.Optimizer)
set_attribute(model, "hsllib", HSL_jll.libhsl_path)
set_attribute(model, "linear_solver", "ma86")
```
The available HSL solvers are `"ma27"`, `"ma57"`, `"ma77"`, `"ma86"`, and `"ma97"`. We recommend using either sequential BLAS and LAPACK backends or a multithreaded version limited to one thread when employing the linear solvers `"ma86"` or `"ma97"`. These solvers already leverage parallelism via OpenMP, and enabling multiple threads in BLAS and LAPACK may result in thread oversubscription.
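As a sketch, one way to keep the loaded BLAS and LAPACK backend single-threaded from within Julia (the OpenMP thread count used by the HSL solvers themselves is not affected by this call):

```julia
import LinearAlgebra

# Restrict BLAS/LAPACK to one thread to avoid oversubscription when the HSL
# solver already parallelizes via OpenMP.
LinearAlgebra.BLAS.set_num_threads(1)
```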
Due to the security policy of macOS, Mac users may need to delete the quarantine attribute of the ZIP archive before extracting. For example:
```
xattr -d com.apple.quarantine HSL_jll.jl-2025.7.21.zip
```
Download Pardiso from https://www.pardiso-project.org. Save the shared library somewhere, and record the filename.
Then, use Pardiso by setting the `pardisolib` and `linear_solver` attributes:
```julia
using JuMP, Ipopt

model = Model(Ipopt.Optimizer)
set_attribute(model, "pardisolib", "/full/path/to/libpardiso")
set_attribute(model, "linear_solver", "pardiso")
```
If you use Ipopt.jl with Julia ≥ v1.9, the linear solver SPRAL is available. You can use it by setting the `linear_solver` attribute:
```julia
using JuMP, Ipopt

model = Model(Ipopt.Optimizer)
set_attribute(model, "linear_solver", "spral")
```
Note that the following environment variables must be set before starting Julia:
```
export OMP_CANCELLATION=TRUE
export OMP_PROC_BIND=TRUE
```
With Julia v1.9 or later, Ipopt and the linear solvers MUMPS (default), SPRAL, and HSL are compiled with libblastrampoline (LBT), a library that can change between BLAS and LAPACK backends at runtime. Note that the BLAS and LAPACK backends loaded at runtime must be compiled with 32-bit integers. The default BLAS and LAPACK backend is OpenBLAS, and we rely on the Julia artifact `OpenBLAS32_jll.jl` if no backend is loaded before `using Ipopt`.
Using LBT, we can also switch dynamically to other BLAS backends such as Intel MKL, BLIS, and Apple Accelerate. Because Ipopt and the linear solvers rely heavily on BLAS and LAPACK routines, using an optimized backend for a particular platform can improve performance.
Sequential BLAS and LAPACK

If you have `ReferenceBLAS32_jll.jl` and `LAPACK32_jll.jl` installed, switch to the sequential, reference versions of BLAS and LAPACK with:
```julia
import LinearAlgebra
using ReferenceBLAS32_jll, LAPACK32_jll
LinearAlgebra.BLAS.lbt_forward(libblas32)
LinearAlgebra.BLAS.lbt_forward(liblapack32)
using Ipopt
```
If you have MKL.jl installed, switch to MKL by adding `using MKL` to your code:
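```julia
# Load MKL before Ipopt so that libblastrampoline forwards BLAS and LAPACK
# calls to MKL.
using MKL
using Ipopt
```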
If you have `BLIS32_jll.jl` and `LAPACK32_jll.jl` installed, switch to BLIS with:
```julia
import LinearAlgebra
using blis32_jll, LAPACK32_jll
LinearAlgebra.BLAS.lbt_forward(blis32)
LinearAlgebra.BLAS.lbt_forward(liblapack32)
using Ipopt
```
If you are using macOS ≥ v13.4 and you have AppleAccelerate.jl installed, add `using AppleAccelerate` to your code:
```julia
using AppleAccelerate
using Ipopt
```
Check what backends are loaded using:
```julia
import LinearAlgebra
LinearAlgebra.BLAS.lbt_get_config()
```