
Commit c25f2de: JuMP extension added (#52)

Commit message:
* Made things compatible with ExaModelsMOI
* Null added
* Added docstrings
* Works for luksan instance
* Works for OPF, but need to address the scalability issue
* A bit dirty, but works
* added testing
* Testing and addressing Josh's comments
* Update readme
* Update documentation
* bug fix for affine objective
* printf added
* Added utils
* Updated testing
* test should work now
* docstring updated
* md file clean up
* removed comment
* separated MOI

1 parent: 598ba84

32 files changed: +1236 −558 lines

Project.toml

Lines changed: 10 additions & 5 deletions

```diff
@@ -5,37 +5,42 @@ version = "0.4.2"
 
 [deps]
 NLPModels = "a4795742-8479-5a88-8948-cc11e1c8c1a6"
+Printf = "de0858da-6303-5e67-8744-51eddeeeb8d7"
 SolverCore = "ff4d7338-4cf1-434d-91df-b86cb86fb843"
 
 [weakdeps]
+SpecialFunctions = "276daf66-3868-5448-9aa4-cd146d93841b"
+JuMP = "4076af6c-e467-56ae-b986-b466b2749572"
+KernelAbstractions = "63c18a36-062a-441e-b654-da1e3ab1ce7c"
 AMDGPU = "21141c5a-9bdb-4563-92ae-f87d6854732e"
 CUDA = "052768ef-5323-5732-b1bb-66c8b64840ba"
-KernelAbstractions = "63c18a36-062a-441e-b654-da1e3ab1ce7c"
-SpecialFunctions = "276daf66-3868-5448-9aa4-cd146d93841b"
 oneAPI = "8f75cd03-7ff8-4ecb-9b8f-daf728133b1b"
+MathOptInterface = "b8f27783-ece8-5eb3-8dc8-9495eed66fee"
 
 [extensions]
 ExaModelsAMDGPU = "AMDGPU"
 ExaModelsCUDA = "CUDA"
 ExaModelsKernelAbstractions = "KernelAbstractions"
 ExaModelsOneAPI = "oneAPI"
 ExaModelsSpecialFunctions = "SpecialFunctions"
+ExaModelsJuMP = "JuMP"
+ExaModelsMOI = "MathOptInterface"
 
 [compat]
-AMDGPU = "0.5"
-CUDA = "4"
+AMDGPU = "0.7"
+CUDA = "5"
 KernelAbstractions = "0.9"
 NLPModels = "0.18, 0.19, 0.20"
 SolverCore = "0.3"
 SpecialFunctions = "2"
 julia = "1.9"
 oneAPI = "1"
+MathOptInterface = "1.19"
 
 [extras]
 CUDA = "052768ef-5323-5732-b1bb-66c8b64840ba"
 Downloads = "f43a241f-c20a-4ad4-852c-f6b1247861c6"
 ForwardDiff = "f6369f11-7733-5829-9624-2563aa707210"
-JuMP = "4076af6c-e467-56ae-b986-b466b2749572"
 KernelAbstractions = "63c18a36-062a-441e-b654-da1e3ab1ce7c"
 MadNLP = "2621e9c9-9eb4-46b1-8089-e8c72242dfb6"
 NLPModels = "a4795742-8479-5a88-8948-cc11e1c8c1a6"
```
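The new `ExaModelsJuMP` and `ExaModelsMOI` entries rely on Julia's package-extension mechanism (available since Julia 1.9, consistent with the `julia = "1.9"` compat bound): a package listed under `[weakdeps]` triggers loading of the module named under `[extensions]` only when the user loads both the host package and the trigger package. A minimal sketch with hypothetical package names:

```toml
# Project.toml of a hypothetical package "MyPkg"
[weakdeps]
JuMP = "4076af6c-e467-56ae-b986-b466b2749572"

[extensions]
# ext/MyPkgJuMP.jl defines `module MyPkgJuMP` and is loaded
# automatically once the user runs `using MyPkg, JuMP`
MyPkgJuMP = "JuMP"
```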

README.md

Lines changed: 4 additions & 4 deletions

```diff
@@ -6,7 +6,7 @@
 
 | **License** | **Documentation** | **Build Status** | **Coverage** | **Citation** |
 |:-----------------:|:----------------:|:----------------:|:----------------:|:----------------:|
-| [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://github.com/sshin23/ExaModels.jl/blob/main/LICENSE) | [![doc](https://img.shields.io/badge/docs-stable-blue.svg)](https://sshin23.github.io/ExaModels.jl/stable) [![doc](https://img.shields.io/badge/docs-dev-blue.svg)](https://sshin23.github.io/ExaModels.jl/dev) | [![build](https://github.com/sshin23/ExaModels.jl/actions/workflows/test.yml/badge.svg)](https://github.com/sshin23/ExaModels.jl/actions/workflows/test.yml) | [![codecov](https://codecov.io/gh/sshin23/ExaModels.jl/branch/main/graph/badge.svg?token=8ViJWBWnZt)](https://codecov.io/gh/sshin23/ExaModels.jl) | [![arXiv](https://img.shields.io/badge/arXiv-2307.16830-b31b1b.svg)](https://arxiv.org/abs/2307.16830) |
+| [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://github.com/exanauts/ExaModels.jl/blob/main/LICENSE) | [![doc](https://img.shields.io/badge/docs-stable-blue.svg)](https://exanauts.github.io/ExaModels.jl/stable) [![doc](https://img.shields.io/badge/docs-dev-blue.svg)](https://exanauts.github.io/ExaModels.jl/dev) | [![build](https://github.com/exanauts/ExaModels.jl/actions/workflows/test.yml/badge.svg)](https://github.com/exanauts/ExaModels.jl/actions/workflows/test.yml) | [![codecov](https://codecov.io/gh/exanauts/ExaModels.jl/branch/main/graph/badge.svg?token=8ViJWBWnZt)](https://codecov.io/gh/exanauts/ExaModels.jl) | [![arXiv](https://img.shields.io/badge/arXiv-2307.16830-b31b1b.svg)](https://arxiv.org/abs/2307.16830) |
 
 ## Overview
 ExaModels.jl employs what we call **[SIMD](https://en.wikipedia.org/wiki/Single_instruction,_multiple_data) abstraction for [nonlinear programs](https://en.wikipedia.org/wiki/Nonlinear_programming)** (NLPs), which allows for the **preservation of the parallelizable structure** within the model equations, facilitating **efficient, parallel [reverse-mode automatic differentiation](https://en.wikipedia.org/wiki/Automatic_differentiation)** on the **[GPU](https://en.wikipedia.org/wiki/Graphics_processing_unit) accelerators**.
@@ -27,7 +27,7 @@ for double-precision arithmetic.
 
 ## Highlight
 The performance comparison of ExaModels with other algebraic modeling systems for evaluating different NLP functions (obj, con, grad, jac, and hess) are shown below. Note that Hessian computations are the typical bottlenecks.
-![benchmark](https://raw.githubusercontent.com/sshin23/ExaModels.jl/main/docs/src/assets/benchmark.svg)
+![benchmark](https://raw.githubusercontent.com/exanauts/ExaModels.jl/main/docs/src/assets/benchmark.svg)
 ## Supporting ExaModels.jl
-- Please report issues and feature requests via the [GitHub issue tracker](https://github.com/sshin/ExaModels.jl/issues).
-- Questions are welcome at [GitHub discussion forum](https://github.com/sshin23/ExaModels.jl/discussions).
+- Please report issues and feature requests via the [GitHub issue tracker](https://github.com/exanauts/ExaModels.jl/issues).
+- Questions are welcome at [GitHub discussion forum](https://github.com/exanauts/ExaModels.jl/discussions).
```

benchmark/Project.toml

Lines changed: 2 additions & 2 deletions

```diff
@@ -32,7 +32,7 @@ ExaModelsBenchmarkKernelAbstractions = "KernelAbstractions"
 ExaModelsBenchmarkOneAPI = "oneAPI"
 
 [compat]
-AMDGPU = "0.5"
-CUDA = "4"
+AMDGPU = "0.7"
+CUDA = "5"
 KernelAbstractions = "0.9"
 oneAPI = "1"
```

docs/Project.toml

Lines changed: 1 addition & 0 deletions

```diff
@@ -7,6 +7,7 @@ JuMP = "4076af6c-e467-56ae-b986-b466b2749572"
 KernelAbstractions = "63c18a36-062a-441e-b654-da1e3ab1ce7c"
 Literate = "98b081ad-f1c9-55d3-8b20-4c87d4299306"
 MadNLP = "2621e9c9-9eb4-46b1-8089-e8c72242dfb6"
+MathOptInterface = "b8f27783-ece8-5eb3-8dc8-9495eed66fee"
 NLPModels = "a4795742-8479-5a88-8948-cc11e1c8c1a6"
 NLPModelsIpopt = "f4238b75-b362-5c4c-b852-0801c9a21d71"
 PowerModels = "c36e90e8-916a-50a6-bd94-075b64ef4655"
```

docs/make.jl

Lines changed: 3 additions & 2 deletions

```diff
@@ -10,6 +10,7 @@ if !(@isdefined _PAGES)
         "Mathematical Abstraction" => "simd.md",
         "Tutorial" => [
             "guide.md",
+            "jump.md",
             "performance.md",
             "gpu.md",
             "develop.md",
@@ -24,7 +25,7 @@ end
 
 if !(@isdefined _JL_FILENAMES)
     const _JL_FILENAMES =
-        ["guide.jl", "quad.jl", "distillation.jl", "opf.jl", "gpu.jl", "performance.jl"]
+        ["guide.jl", "jump.jl", "quad.jl", "distillation.jl", "opf.jl", "gpu.jl", "performance.jl"]
 end
 
 for jl_filename in _JL_FILENAMES
@@ -52,7 +53,7 @@ bib = CitationBibliography(joinpath(@__DIR__, "src", "refs.bib"))
 
 
 makedocs(
-    bib,
+    plugins = [bib],
     sitename = "ExaModels.jl",
     modules = [ExaModels],
     authors = "Sungho Shin",
```
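The `makedocs` change in the last hunk tracks the Documenter 1.0 API, in which a `CitationBibliography` is passed through the `plugins` keyword rather than as a positional argument. A minimal sketch, assuming Documenter >= 1.0 and a compatible DocumenterCitations release:

```julia
using Documenter, DocumenterCitations

# The bibliography plugin reads BibTeX entries from refs.bib
bib = CitationBibliography(joinpath(@__DIR__, "src", "refs.bib"))

makedocs(
    plugins = [bib],        # Documenter 1.x: plugins are passed by keyword
    sitename = "ExaModels.jl",
)
```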

docs/src/.gitignore

Lines changed: 2 additions & 1 deletion

```diff
@@ -3,4 +3,5 @@ quad.md
 distillation.md
 opf.md
 gpu.md
-performance.md
+performance.md
+jump.md
```

docs/src/develop.md

Lines changed: 1 addition & 1 deletion

````diff
@@ -2,7 +2,7 @@
 
 ExaModels.jl's API only uses simple julia funcitons, and thus, implementing the extensions is straightforward. Below, we suggest a good practice for implementing an extension package.
 
-Let's say that we want to implement an extension package for the example problem in [Getting Started](@ref). An extension package may look like:
+Let's say that we want to implement an extension package for the example problem in [Getting Started](@ref guide). An extension package may look like:
 ```
 Root
 ├───Project.toml
````

docs/src/distillation.jl

Lines changed: 1 addition & 1 deletion

```diff
@@ -1,4 +1,4 @@
-# # Example: Distillation Column
+# # [Example: Distillation Column](@id distillation)
 
 function distillation_column_model(T = 3; backend = nothing)
 
```
docs/src/guide.jl

Lines changed: 10 additions & 10 deletions

```diff
@@ -1,4 +1,4 @@
-# # Getting Started
+# # [Getting Started](@id guide)
 # ExaModels can create nonlinear prgogramming models and allows solving the created models using NLP solvers (in particular, those that are interfaced with `NLPModels`, such as [NLPModelsIpopt](https://github.com/JuliaSmoothOptimizers/NLPModelsIpopt.jl) and [MadNLP](https://github.com/MadNLP/MadNLP.jl). This documentation page will describe how to use `ExaModels` to model and solve nonlinear optimization problems.
 
 # We will first consider the following simple nonlinear program [lukvsan1998indefinitely](@cite):
@@ -10,11 +10,11 @@
 # ```
 # We will follow the following Steps to create the model/solve this optimization problem.
 # - Step 0: import ExaModels.jl
-# - Step 1: create a [`ExaCore`](@ref ExaCore) object, wherein we can progressively build an optimization model.
-# - Step 2: create optimization variables with [`variable`]((@ref variable)), while attaching it to previously created `ExaCore`.
-# - Step 3 (interchangable with Step 3): create objective function with [`objective`](@ref objective), while attaching it to previously created `ExaCore`.
-# - Step 4 (interchangable with Step 2): create constraints with [`constraint`](@ref constraint), while attaching it to previously created `ExaCore`.
-# - Step 5: create an [`ExaModel`](@ref ExaModel) based on the `ExaCore`.
+# - Step 1: create a [`ExaCore`](@ref) object, wherein we can progressively build an optimization model.
+# - Step 2: create optimization variables with [`variable`](@ref), while attaching it to previously created `ExaCore`.
+# - Step 3 (interchangeable with Step 4): create objective function with [`objective`](@ref), while attaching it to previously created `ExaCore`.
+# - Step 4 (interchangeable with Step 3): create constraints with [`constraint`](@ref), while attaching it to previously created `ExaCore`.
+# - Step 5: create an [`ExaModel`](@ref) based on the `ExaCore`.
 
 # Now, let's jump right in. We import ExaModels via (Step 0):
 using ExaModels
@@ -55,10 +55,10 @@ sol = solution(result, x)
 
 
 # ExaModels provide several APIs similar to this:
-# - [`solution`](@ref solution) inquires the primal solution.
-# - [`multiplier`](@ref multiplier) inquires the dual solution.
-# - [`multiplier_L`](@ref multiplier_L) inquires the lower bound dual solution.
-# - [`multiplier_U`](@ref multiplier_U) inquires the upper bound dual solution.
+# - [`solution`](@ref) inquires the primal solution.
+# - [`multipliers`](@ref) inquires the dual solution.
+# - [`multipliers_L`](@ref) inquires the lower bound dual solution.
+# - [`multipliers_U`](@ref) inquires the upper bound dual solution.
 
 # This concludes a short tutorial on how to use ExaModels to model and solve optimization problems. Want to learn more? Take a look at the following examples, which provide further tutorial on how to use ExaModels.jl. Each of the examples are designed to instruct a few additional techniques.
 # - [Example: Quadrotor](): modeling multiple types of objective values and constraints.
```
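For comparison, Steps 0 through 5 listed above map onto ExaModels' native API roughly as follows for the same Luksan-Vlcek problem. This is a sketch based on the generator-style `variable`/`objective`/`constraint` calls used in these docs; treat the exact keyword spellings as assumptions:

```julia
using ExaModels

c = ExaCore()                                      # Step 1: empty model core
N = 10
x = variable(c, N; start = (mod(i, 2) == 1 ? -1.2 : 1.0 for i = 1:N))  # Step 2
objective(c, 100 * (x[i-1]^2 - x[i])^2 + (x[i-1] - 1)^2 for i = 2:N)   # Step 3
constraint(                                        # Step 4
    c,
    3x[i+1]^3 + 2x[i+2] - 5 + sin(x[i+1] - x[i+2]) * sin(x[i+1] + x[i+2]) +
    4x[i+1] - x[i] * exp(x[i] - x[i+1]) - 3 for i = 1:N-2
)
m = ExaModel(c)                                    # Step 5: finalized model
```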

docs/src/index.md

Lines changed: 8 additions & 8 deletions

```diff
@@ -8,7 +8,7 @@ Welcome to the documentation of [ExaModels.jl](https://github.com/sshin23/ExaMod
 **Please help us improve ExaModels and this documentation!** ExaModels is in the early stage of development, and you may encounter unintended behaviors or missing documentations. If you find anything is not working as intended or documentation is missing, please [open issues](https://github.com/sshin/ExaModels.jl/issues) or [pull requests](https://github.com/sshin/ExaModels.jl/pulls) or start [discussions](https://github.com/sshin/ExaModels.jl/discussions).
 
 ## What is ExaModels.jl?
-ExaModels.jl is an [algebraic modeling](https://en.wikipedia.org/wiki/Algebraic_modeling_language) and [automatic differentiation](https://en.wikipedia.org/wiki/Automatic_differentiation) tool in [Julia Language](https://julialang.org/), specialized for [SIMD](https://en.wikipedia.org/wiki/Single_instruction,_multiple_data) abstraction of [nonlinear programs](https://en.wikipedia.org/wiki/Nonlinear_programming). ExaModels.jl employs what we call [SIMD](https://en.wikipedia.org/wiki/Single_instruction,_multiple_data) abstraction for [nonlinear programs](https://en.wikipedia.org/wiki/Nonlinear_programming) (NLPs), which allows for the preservation of the parallelizable structure within the model equations, facilitating efficient [automatic differentiation](https://en.wikipedia.org/wiki/Automatic_differentiation) either on the single-thread CPUs, multi-threaded CPUs, as well as [GPU accelerators](https://en.wikipedia.org/wiki/Graphics_processing_unit). More details about SIMD abstraction can be found [here](/simd).
+ExaModels.jl is an [algebraic modeling](https://en.wikipedia.org/wiki/Algebraic_modeling_language) and [automatic differentiation](https://en.wikipedia.org/wiki/Automatic_differentiation) tool in [Julia Language](https://julialang.org/), specialized for [SIMD](https://en.wikipedia.org/wiki/Single_instruction,_multiple_data) abstraction of [nonlinear programs](https://en.wikipedia.org/wiki/Nonlinear_programming). ExaModels.jl employs what we call [SIMD](https://en.wikipedia.org/wiki/Single_instruction,_multiple_data) abstraction for [nonlinear programs](https://en.wikipedia.org/wiki/Nonlinear_programming) (NLPs), which allows for the preservation of the parallelizable structure within the model equations, facilitating efficient [automatic differentiation](https://en.wikipedia.org/wiki/Automatic_differentiation) either on the single-thread CPUs, multi-threaded CPUs, as well as [GPU accelerators](https://en.wikipedia.org/wiki/Graphics_processing_unit). More details about SIMD abstraction can be found [here](@ref simd).
 
 ## Key differences from other algebraic modeling tools
 ExaModels.jl is different from other algebraic modeling tools, such as [JuMP](https://github.com/jump-dev/JuMP.jl) or [AMPL](https://ampl.com/), in the following ways:
@@ -40,10 +40,10 @@ a 9241 bus system, derivative evaluation using ExaModels.jl on GPUs
 can be up to two orders of magnitude faster compared to JuMP or
 AMPL. Some benchmark results are available below. The following
 problems are used for benchmarking:
-- [LuksanVlcek problem](../guide)
-- [Quadrotor control problem](../quad)
-- [Distillation column control problem](../dist)
-- [AC optimal power flow problem](../opf)
+- [LuksanVlcek problem](@ref guide)
+- [Quadrotor control problem](@ref quad)
+- [Distillation column control problem](@ref distillation)
+- [AC optimal power flow problem](@ref opf)
 
 ![benchmark](./assets/benchmark.svg)
 
@@ -54,9 +54,9 @@ ExaModels can be used with any solver that can handle `NLPModel` data type, but
 
 ## Documentation Structure
 This documentation is structured in the following way.
-- The remainder of [this page](.) highlights several key aspects of ExaModels.jl.
-- The mathematical abstraction---SIMD abstraction of nonlinear programming---of ExaModels.jl is discussed in [Mathematical Abstraction page](./simd).
-- The step-by-step tutorial of using ExaModels.jl can be found in [Tutorial page](./guide).
+- The remainder of this page highlights several key aspects of ExaModels.jl.
+- The mathematical abstraction---SIMD abstraction of nonlinear programming---of ExaModels.jl is discussed in [Mathematical Abstraction page](@ref simd).
+- The step-by-step tutorial of using ExaModels.jl can be found in [Tutorial page](@ref guide).
 - This documentation does not intend to discuss the engineering behind the implementation of ExaModels.jl. Some high-level idea is discussed in [a recent publication](https://arxiv.org/abs/2307.16830), but the full details of the engineering behind it will be discussed in the future publications.
 
 
```
docs/src/jump.jl

Lines changed: 29 additions & 0 deletions

```diff
@@ -0,0 +1,29 @@
+# # JuMP Interface
+
+# We have an experimental interface to JuMP models. A JuMP model can be directly converted to an `ExaModel`. It is as simple as this:
+
+using ExaModels, JuMP
+
+N = 10
+jm = Model()
+
+@variable(jm, x[i = 1:N], start = mod(i, 2) == 1 ? -1.2 : 1.0)
+@constraint(
+    jm,
+    s[i = 1:N-2],
+    3x[i+1]^3 + 2x[i+2] - 5 + sin(x[i+1] - x[i+2])sin(x[i+1] + x[i+2]) + 4x[i+1] -
+    x[i]exp(x[i] - x[i+1]) - 3 == 0.0
+)
+@objective(jm, Min, sum(100(x[i-1]^2 - x[i])^2 + (x[i-1] - 1)^2 for i = 2:N))
+
+em = ExaModel(jm)
+
+# Here, note that only scalar objectives/constraints created via the `@constraint` and `@objective` APIs are supported. Older syntax like `@NLconstraint` and `@NLobjective` is not supported.
+# We can solve the model using any of the solvers supported by ExaModels. For example, we can use Ipopt:
+
+using NLPModelsIpopt
+
+result = ipopt(em)
```

docs/src/opf.jl

Lines changed: 1 addition & 1 deletion

```diff
@@ -1,4 +1,4 @@
-# # Example: Optimal Power Flow
+# # [Example: Optimal Power Flow](@id opf)
 
 function parse_ac_power_data(filename)
     data = PowerModels.parse_file(filename)
```

docs/src/quad.jl

Lines changed: 1 addition & 1 deletion

```diff
@@ -1,4 +1,4 @@
-# # Example: Quadrotor
+# # [Example: Quadrotor](@id quad)
 
 function quadrotor_model(N = 3; backend = nothing)
 
```

docs/src/simd.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -1,4 +1,4 @@
-# SIMD Abstraction
+# [SIMD Abstraction](@id simd)
 
 In this page, we explain what SIMD abstraction of nonlinear program is, and why it can be beneficial for scalable optimization of large-scale optimization problems. More discussion can be found in our [paper](https://arxiv.org/abs/2307.16830).
 
```
