Update to MOI v0.8 and ModelMode
blegat committed Dec 18, 2018
1 parent 68793fb commit 83df5c2
Showing 23 changed files with 197 additions and 195 deletions.
2 changes: 1 addition & 1 deletion REQUIRE
@@ -1,5 +1,5 @@
julia 0.7
MathOptInterface 0.7 0.8
MathOptInterface 0.8 0.9
ForwardDiff 0.5 0.11
Calculus
DataStructures
2 changes: 1 addition & 1 deletion docs/src/constraints.md
@@ -154,7 +154,7 @@ DocTestSetup = quote
@objective(model, Max, -2x);
JuMP.optimize!(model);
mock = JuMP.backend(model).optimizer.model;
MOI.set(mock, MOI.DualStatus(), MOI.FeasiblePoint)
MOI.set(mock, MOI.DualStatus(), MOI.FEASIBLE_POINT)
MOI.set(mock, MOI.ConstraintDual(), JuMP.optimizer_index(con), -2.0)
end
```
20 changes: 10 additions & 10 deletions docs/src/quickstart.md
@@ -30,8 +30,8 @@ julia> model = Model(with_optimizer(GLPK.Optimizer))
A JuMP Model
Feasibility problem with:
Variables: 0
Model mode: Automatic
CachingOptimizer state: NoOptimizer
Model mode: AUTOMATIC
CachingOptimizer state: NO_OPTIMIZER
Solver name: No optimizer attached.
```

@@ -99,9 +99,9 @@ DocTestSetup = quote
# Now we load in the solution. Using a caching optimizer removes the need to
# load a solver such as GLPK for building the documentation.
mock = JuMP.backend(model).optimizer.model
MOI.set(mock, MOI.TerminationStatus(), MOI.Optimal)
MOI.set(mock, MOI.PrimalStatus(), MOI.FeasiblePoint)
MOI.set(mock, MOI.DualStatus(), MOI.FeasiblePoint)
MOI.set(mock, MOI.TerminationStatus(), MOI.OPTIMAL)
MOI.set(mock, MOI.PrimalStatus(), MOI.FEASIBLE_POINT)
MOI.set(mock, MOI.DualStatus(), MOI.FEASIBLE_POINT)
MOI.set(mock, MOI.ResultCount(), 1)
MOI.set(mock, MOI.ObjectiveValue(), 10.6)
MOI.set(mock, MOI.VariablePrimal(), JuMP.optimizer_index(x), 2.0)
@@ -120,9 +120,9 @@ to a setting such as a time limit. We can ask the solver why it stopped using
the `JuMP.termination_status` function:
```jldoctest quickstart_example
julia> JuMP.termination_status(model)
Optimal::TerminationStatusCode = 1
OPTIMAL::TerminationStatusCode = 1
```
In this case, `GLPK` returned `Optimal`, which means that it has found the optimal
In this case, `GLPK` returned `OPTIMAL`, which means that it has found the optimal
solution.

```@meta
@@ -134,14 +134,14 @@ a primal-dual pair of feasible solutions with zero duality gap.
We can verify the primal and dual status as follows to confirm this:
```jldoctest quickstart_example
julia> JuMP.primal_status(model)
FeasiblePoint::ResultStatusCode = 1
FEASIBLE_POINT::ResultStatusCode = 1
julia> JuMP.dual_status(model)
FeasiblePoint::ResultStatusCode = 1
FEASIBLE_POINT::ResultStatusCode = 1
```
Note that the primal and dual status only indicate that the primal and dual
solutions are feasible, and it is only because we verified that the termination
status is `Optimal` that we can conclude that they form an optimal solution.
status is `OPTIMAL` that we can conclude that they form an optimal solution.
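The renamed statuses suggest a compact solve-and-check idiom. The following is a minimal sketch, not part of this commit, assuming GLPK.jl is installed and using the JuMP 0.19 API (`with_optimizer`, `objective_value`); the bound `0 <= x <= 2` and coefficient `5` are illustrative:

```julia
using JuMP, GLPK
const MOI = JuMP.MOI  # MathOptInterface alias provided by JuMP

model = Model(with_optimizer(GLPK.Optimizer))
@variable(model, 0 <= x <= 2)
@objective(model, Max, 5x)
JuMP.optimize!(model)

# With MOI v0.8 the status instances are SCREAMING_SNAKE_CASE.
if JuMP.termination_status(model) == MOI.OPTIMAL &&
        JuMP.primal_status(model) == MOI.FEASIBLE_POINT
    println(JuMP.objective_value(model))
else
    error("Solver stopped with status $(JuMP.termination_status(model))")
end
```

Checking the termination status before the primal status mirrors the advice above: feasibility of the returned point alone does not imply optimality.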

Finally, we can query the result of the optimization. First, we can query the
objective value:
14 changes: 7 additions & 7 deletions docs/src/solvers.md
@@ -32,18 +32,18 @@ that is completely transparent to JuMP. While the MOI API may seem very
demanding, it allows MOI models to be a succession of lightweight MOI layers
that fill the gap between JuMP requirements and the solver capabilities.

JuMP models can be created in three different modes: Automatic, Manual and
Direct.
JuMP models can be created in three different modes: `AUTOMATIC`, `MANUAL` and
`DIRECT`.

## Automatic and Manual modes

In Automatic and Manual modes, two MOI layers are automatically applied to the
optimizer:
In `AUTOMATIC` and `MANUAL` modes, two MOI layers are automatically applied to
the optimizer:

* `CachingOptimizer`: maintains a cache of the model so that when the optimizer
does not support an incremental change to the model, the optimizer's internal
model can be discarded and restored from the cache just before optimization.
The `CachingOptimizer` has two different modes: Automatic and Manual
The `CachingOptimizer` has two different modes: `AUTOMATIC` and `MANUAL`
corresponding to the two JuMP modes with the same names.
* `LazyBridgeOptimizer` (this can be disabled using the `bridge_constraints`
keyword argument to [`Model`](@ref) constructor): when a constraint added is
@@ -81,8 +81,8 @@ TODO: how to control the caching optimizer states
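The three modes are selected at construction time. A hedged sketch, not part of this commit, assuming GLPK.jl is installed (`MOIU` is JuMP's alias for `MathOptInterface.Utilities`):

```julia
using JuMP, GLPK
const MOIU = JuMP.MOIU

# AUTOMATIC (the default): the CachingOptimizer manages solver state for you.
auto = Model(with_optimizer(GLPK.Optimizer))

# MANUAL: you drive the CachingOptimizer state transitions yourself.
manual = Model(with_optimizer(GLPK.Optimizer), caching_mode = MOIU.MANUAL)

# DIRECT: no cache and no bridge layer; calls go straight to the optimizer.
direct = JuMP.direct_model(GLPK.Optimizer())

JuMP.mode(auto)    # JuMP.AUTOMATIC
JuMP.mode(manual)  # JuMP.MANUAL
JuMP.mode(direct)  # JuMP.DIRECT
```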

## Direct mode

JuMP models can be created in Direct mode using the [`JuMP.direct_model`](@ref)
function.
JuMP models can be created in `DIRECT` mode using the
[`JuMP.direct_model`](@ref) function.
```@docs
JuMP.direct_model
```
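One observable consequence of skipping the `CachingOptimizer` is that, in `DIRECT` mode, `JuMP.optimizer_index` coincides with `JuMP.index`, since there is no model-to-optimizer map to translate through. A small sketch, not part of this commit, with GLPK.jl assumed:

```julia
using JuMP, GLPK

model = JuMP.direct_model(GLPK.Optimizer())
@variable(model, x >= 0)

JuMP.mode(model)                          # JuMP.DIRECT
JuMP.optimizer_index(x) == JuMP.index(x)  # indices coincide in DIRECT mode
```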
4 changes: 2 additions & 2 deletions docs/src/variables.md
@@ -30,8 +30,8 @@ julia> model = Model()
A JuMP Model
Feasibility problem with:
Variables: 0
Model mode: Automatic
CachingOptimizer state: NoOptimizer
Model mode: AUTOMATIC
CachingOptimizer state: NO_OPTIMIZER
Solver name: No optimizer attached.
julia> @variable(model, x[1:2])
44 changes: 22 additions & 22 deletions src/JuMP.jl
@@ -129,11 +129,11 @@ end
# Model

# Model has three modes:
# 1) Automatic: moi_backend field holds a CachingOptimizer in Automatic mode.
# 2) Manual: moi_backend field holds a CachingOptimizer in Manual mode.
# 3) Direct: moi_backend field holds an AbstractOptimizer. No extra copy of the model is stored. The moi_backend must support add_constraint etc.
# 1) AUTOMATIC: moi_backend field holds a CachingOptimizer in AUTOMATIC mode.
# 2) MANUAL: moi_backend field holds a CachingOptimizer in MANUAL mode.
# 3) DIRECT: moi_backend field holds an AbstractOptimizer. No extra copy of the model is stored. The moi_backend must support add_constraint etc.
# Methods to interact with the CachingOptimizer are defined in solverinterface.jl.
@enum ModelMode Automatic Manual Direct
@enum ModelMode AUTOMATIC MANUAL DIRECT

abstract type AbstractModel end
# All `AbstractModel`s must define methods for these functions:
@@ -152,8 +152,8 @@ mutable struct Model <: AbstractModel
variable_to_fix::Dict{MOIVAR, MOIFIX}
variable_to_integrality::Dict{MOIVAR, MOIINT}
variable_to_zero_one::Dict{MOIVAR, MOIBIN}
# In Manual and Automatic modes, CachingOptimizer.
# In Direct mode, will hold an AbstractOptimizer.
# In MANUAL and AUTOMATIC modes, CachingOptimizer.
# In DIRECT mode, will hold an AbstractOptimizer.
moi_backend::MOI.AbstractOptimizer
# Hook into a solve call...function of the form f(m::Model; kwargs...),
# where kwargs get passed along to subsequent solve calls.
@@ -171,7 +171,7 @@ end
end

"""
Model(; caching_mode::MOIU.CachingOptimizerMode=MOIU.Automatic,
Model(; caching_mode::MOIU.CachingOptimizerMode=MOIU.AUTOMATIC,
bridge_constraints::Bool=true)
Return a new JuMP model without any optimizer; the model is stored in
@@ -182,7 +182,7 @@ optimizer are automatically bridged to equivalent supported constraints when
an appropriate transformation is defined in the `MathOptInterface.Bridges`
module or is defined in another module and is explicitly added.
"""
function Model(; caching_mode::MOIU.CachingOptimizerMode=MOIU.Automatic,
function Model(; caching_mode::MOIU.CachingOptimizerMode=MOIU.AUTOMATIC,
solver=nothing)
if solver !== nothing
error("The solver= keyword is no longer available in JuMP 0.19 and " *
@@ -197,7 +197,7 @@ end

"""
Model(optimizer_factory::OptimizerFactory;
caching_mode::MOIU.CachingOptimizerMode=MOIU.Automatic,
caching_mode::MOIU.CachingOptimizerMode=MOIU.AUTOMATIC,
bridge_constraints::Bool=true)
Return a new JuMP model using the optimizer factory `optimizer_factory` to
@@ -272,19 +272,19 @@ MathOptInterface or solver-specific functionality.
"""
backend(model::Model) = model.moi_backend

moi_mode(model::MOI.ModelLike) = Direct
moi_mode(model::MOI.ModelLike) = DIRECT
function moi_mode(model::MOIU.CachingOptimizer)
if model.mode == MOIU.Automatic
return Automatic
if model.mode == MOIU.AUTOMATIC
return AUTOMATIC
else
return Manual
return MANUAL
end
end

"""
mode(model::Model)
Return mode (Direct, Automatic, Manual) of model.
Return mode (DIRECT, AUTOMATIC, MANUAL) of model.
"""
mode(model::Model) = moi_mode(backend(model))
# The type of backend(model) is unknown so we directly redirect to another
Expand Down Expand Up @@ -313,14 +313,14 @@ end
solver_name(model::Model)
If available, returns the `SolverName` property of the underlying optimizer.
Returns `"No optimizer attached"` in `Automatic` or `Manual` modes when no
Returns `"No optimizer attached"` in `AUTOMATIC` or `MANUAL` modes when no
optimizer is attached. Returns
"SolverName() attribute not implemented by the optimizer." if the attribute is
not implemented.
"""
function solver_name(model::Model)
if mode(model) != Direct &&
MOIU.state(backend(model)) == MOIU.NoOptimizer
if mode(model) != DIRECT &&
MOIU.state(backend(model)) == MOIU.NO_OPTIMIZER
return "No optimizer attached."
else
return try_get_solver_name(backend(model))
@@ -451,19 +451,19 @@ Base.copy(v::AbstractArray{VariableRef}, new_model::Model) = (var -> VariableRef

function optimizer_index(v::VariableRef)
model = owner_model(v)
if mode(model) == Direct
if mode(model) == DIRECT
return index(v)
else
@assert backend(model).state == MOIU.AttachedOptimizer
@assert backend(model).state == MOIU.ATTACHED_OPTIMIZER
return backend(model).model_to_optimizer_map[index(v)]
end
end

function optimizer_index(cr::ConstraintRef{Model})
if mode(cr.model) == Direct
if mode(cr.model) == DIRECT
return index(cr)
else
@assert backend(cr.model).state == MOIU.AttachedOptimizer
@assert backend(cr.model).state == MOIU.ATTACHED_OPTIMIZER
return backend(cr.model).model_to_optimizer_map[index(cr)]
end
end
@@ -478,7 +478,7 @@ return false.
See also [`dual`](@ref) and [`shadow_price`](@ref).
"""
has_duals(model::Model) = dual_status(model) != MOI.NoSolution
has_duals(model::Model) = dual_status(model) != MOI.NO_SOLUTION

"""
dual(cr::ConstraintRef)
14 changes: 7 additions & 7 deletions src/constraints.jl
@@ -218,7 +218,7 @@ end
function moi_add_constraint(model::MOI.ModelLike, f::MOI.AbstractFunction,
s::MOI.AbstractSet)
if !MOI.supports_constraint(model, typeof(f), typeof(s))
if moi_mode(model) == Direct
if moi_mode(model) == DIRECT
bridge_message = "."
elseif moi_bridge_constraints(model)
bridge_message = " and there are no bridges that can reformulate it into supported constraints."
@@ -282,8 +282,8 @@ end
The change in the objective from an infinitesimal relaxation of the constraint.
This value is computed from [`dual`](@ref) and can be queried only when
`has_duals` is `true` and the objective sense is `MinSense` or `MaxSense`
(not `FeasibilitySense`). For linear constraints, the shadow prices differ at
`has_duals` is `true` and the objective sense is `MIN_SENSE` or `MAX_SENSE`
(not `FEASIBILITY_SENSE`). For linear constraints, the shadow prices differ at
most in sign from the `dual` value depending on the objective sense.
## Notes
@@ -308,9 +308,9 @@ function shadow_price_less_than_(dual_value, sense::MOI.OptimizationSense)
# shadow price is nonnegative (because relaxing a constraint can only
# improve the objective). By MOI convention, a feasible dual on a LessThan
# set is nonpositive, so we flip the sign when maximizing.
if sense == MOI.MaxSense
if sense == MOI.MAX_SENSE
return -dual_value
elseif sense == MOI.MinSense
elseif sense == MOI.MIN_SENSE
return dual_value
else
error("The shadow price is not available because the objective sense " *
@@ -322,9 +322,9 @@ end
function shadow_price_greater_than_(dual_value, sense::MOI.OptimizationSense)
# By MOI convention, a feasible dual on a GreaterThan set is nonnegative,
# so we flip the sign when minimizing. (See comment in the method above).
if sense == MOI.MaxSense
if sense == MOI.MAX_SENSE
return dual_value
elseif sense == MOI.MinSense
elseif sense == MOI.MIN_SENSE
return -dual_value
else
error("The shadow price is not available because the objective sense " *
12 changes: 6 additions & 6 deletions src/copy.jl
@@ -68,7 +68,7 @@ the reference map.
## Note
Model copy is not supported in Direct mode, i.e. when a model is constructed
Model copy is not supported in `DIRECT` mode, i.e. when a model is constructed
using the [`direct_model`](@ref) constructor instead of the [`Model`](@ref)
constructor. Moreover, independently of whether an optimizer was provided at
model construction, the new model will have no optimizer, i.e., an optimizer
@@ -90,10 +90,10 @@ cref_new = reference_map[cref]
```
"""
function copy_model(model::Model)
if mode(model) == Direct
error("Cannot copy a model in Direct mode. Use the `Model` constructor",
" instead of the `direct_model` constructor to be able to copy",
" the constructed model.")
if mode(model) == DIRECT
error("Cannot copy a model in `DIRECT` mode. Use the `Model` ",
"constructor instead of the `direct_model` constructor to be ",
"able to copy the constructed model.")
end
caching_mode = backend(model).mode
new_model = Model(caching_mode = caching_mode)
@@ -147,7 +147,7 @@ and its copy.
## Note
Model copy is not supported in Direct mode, i.e. when a model is constructed
Model copy is not supported in `DIRECT` mode, i.e. when a model is constructed
using the [`direct_model`](@ref) constructor instead of the [`Model`](@ref)
constructor. Moreover, independently of whether an optimizer was provided at
model construction, the new model will have no optimizer, i.e., an optimizer
18 changes: 9 additions & 9 deletions src/macros.jl
@@ -796,17 +796,17 @@ Constructs a vector of `QuadConstraint` objects. Similar to `@QuadConstraint`, e
Return an expression whose value is an `MOI.OptimizationSense` corresponding
to `sense`. Sense is either the symbol `:Min` or `:Max`, corresponding
respectively to `MOI.MinSense` and `MOI.MaxSense` or it is another symbol,
respectively to `MOI.MIN_SENSE` and `MOI.MAX_SENSE` or it is another symbol,
which should be the name of a variable or expression whose value is an
`MOI.OptimizationSense`.
In the last case, the expression throws an error using the `_error`
function in case the value is not an `MOI.OptimizationSense`.
"""
function moi_sense(_error::Function, sense)
if sense == :Min
expr = MOI.MinSense
expr = MOI.MIN_SENSE
elseif sense == :Max
expr = MOI.MaxSense
expr = MOI.MAX_SENSE
else
# Refers to a variable that holds the sense.
# TODO: Better document this behavior
@@ -829,8 +829,8 @@ end
@objective(model::Model, sense, func)
Set the objective sense to `sense` and objective function to `func`. The
objective sense can be either `Min`, `Max`, `MathOptInterface.MinSense`,
`MathOptInterface.MaxSense` or `MathOptInterface.FeasibilitySense`; see
objective sense can be either `Min`, `Max`, `MathOptInterface.MIN_SENSE`,
`MathOptInterface.MAX_SENSE` or `MathOptInterface.FEASIBILITY_SENSE`; see
[`MathOptInterface.ObjectiveSense`](http://www.juliaopt.org/MathOptInterface.jl/v0.6/apireference.html#MathOptInterface.ObjectiveSense).
In order to set the sense programmatically, i.e., when `sense` is a Julia
variable whose value is the sense, one of the three
@@ -847,8 +847,8 @@ julia> model = Model()
A JuMP Model
Feasibility problem with:
Variables: 0
Model mode: Automatic
CachingOptimizer state: NoOptimizer
Model mode: AUTOMATIC
CachingOptimizer state: NO_OPTIMIZER
Solver name: No optimizer attached.
julia> @variable(model, x)
@@ -867,8 +867,8 @@ julia> @objective(model, Max, 2x - 1)
To set a quadratic objective and set the objective sense programmatically,
proceed as follows:
```jldoctest @objective
julia> sense = JuMP.MOI.MinSense
MinSense::OptimizationSense = 0
julia> sense = JuMP.MOI.MIN_SENSE
MIN_SENSE::OptimizationSense = 0
julia> @objective(model, sense, x^2 - 2x + 1)
x² - 2 x + 1
2 changes: 1 addition & 1 deletion src/nlp.jl
@@ -1153,7 +1153,7 @@ function register(m::Model, s::Symbol, dimension::Integer, f::Function, ∇f::Fu
end

# TODO: Add a docstring.
# Ex: set_NL_objective(model, MOI.MinSense, :($x + $y^2))
# Ex: set_NL_objective(model, MOI.MIN_SENSE, :($x + $y^2))
function set_NL_objective(model::Model, sense::MOI.OptimizationSense, x)
return set_objective(model, sense, NonlinearExprData(model, x))
end
4 changes: 2 additions & 2 deletions src/objective.jl
@@ -103,8 +103,8 @@ julia> model = Model()
A JuMP Model
Feasibility problem with:
Variables: 0
Model mode: Automatic
CachingOptimizer state: NoOptimizer
Model mode: AUTOMATIC
CachingOptimizer state: NO_OPTIMIZER
Solver name: No optimizer attached.
julia> @variable(model, x)