
Commit 02bbd70

ericphanson and blegat authored
More MOIified implementation, again (#504)
* reland on master
* add tests
* remove more complicated variants
* remove 2's from names
* simplify
* rename
* some tweaks
* wip
* fix strange bug in `transpose`
* move fallback somewhere better
* tweak
* slightly cleanup, restore comment
* misc
* format
* complex vcat
* get `sdp_relative_entropy` working
* use `evaluate`
* format
* Convert remaining conic forms, get more tests passing
* get some more tests passing
* format, more complex variable fixes
* fix
* small fixes
* fixes
* wip
* fix bug in NormCone usage
* add `latex_formulation` helper
* rename `template` to `conic_form!`
* swap order of arguments in `conic_form!`
* rename to `add_constraint!`
* improve handling of infeasibility at problem creation time
* pass sign info along
* wip
* wip
* wip
* fix suspected bug
* cleanup
* try to improve clarity
* tweak objective values for satisfy problems
* format
* satisfy
* vexity of problem is that of its objective
* format
* fix sign/monotonicity/curvature for problems
* refactor, tighten up operate code
* fixes
* fix
* wip
* Remove most `head` and `id_hash` fields
* wip
* fix
* add `isequal` and `hash` definitions
* don't early exit tests
* format
* fix order
* update tests
* fixes
* Change handling of ComplexVariable
* up
* modernize workflows
* restore some comments
* further tighten up types
* more fixes
* fix geomean evaluate & add tests
* wip
* restore `sdp_lieb_ando` problem formulation
* restrict types further
* Some cleanup
* Restore `norm` from master
* update docs
* cleanup
* update CI to trigger when marked ready for review
* relax types & improve perf
* fix vexity (needs test)
* Revert "relax types & improve perf" (reverts commit 82ad0e0)
* try adding cache
* fix
* relax types & improve perf
* use SuiteSparseGraphBLAS
* use WeakKeyIdDict
* wip
* switch back to SparseArrays to avoid GC corruption
* format
* comment out SuiteSparseGraphBLAS code
* fix tests
* fix tests, add show test, confused about constraint vexity
* wip
* fix?
* up
* up
* up
* fix
* try to optimize operations
* Update MOI wrapper
* Fixes
* Fix for Julia v1.6
* Fixes
* Remove warmstart in docs
* Fix
* Remove warmstart
* Add missing docstring
* Remove unused code
* Fix
* Fix format
* Update antidiag
* Remove unused import
* Rename and add docs
* Fix
* Fix
* Fix format
* Fix
* Add coverage
* Fix format
* Fix
* Add MOI wrapper tests
* Refactor instantiate
* Fix doc
* Fix MOI wrapper tests
* Fix format
* Remove size
* Prevent non-empty model being given
* Fixes
* Fix
* Fix format

---------

Co-authored-by: Benoît Legat <benoit.legat@gmail.com>
1 parent 4c4bead commit 02bbd70

File tree

104 files changed (+3474 −3663 lines)


.github/workflows/cancel.yml

Lines changed: 0 additions & 20 deletions
This file was deleted.

.github/workflows/ci.yml

Lines changed: 8 additions & 1 deletion
@@ -10,14 +10,21 @@ on:
       - 'src/**'
       - 'Project.toml'
   pull_request:
-    types: [opened, synchronize, reopened]
+    types: [opened, synchronize, reopened, ready_for_review]
     paths:
       - '.github/workflows/ci.yml'
       - 'test/**'
       - 'src/**'
       - 'Project.toml'
+concurrency:
+  # Skip intermediate builds: always.
+  # Cancel intermediate builds: only if it is a pull request build.
+  group: ${{ github.workflow }}-${{ github.ref }}
+  cancel-in-progress: ${{ startsWith(github.ref, 'refs/pull/') }}
 jobs:
   test:
+    # Run on push's or non-draft PRs
+    if: (github.event_name == 'push') || (github.event.pull_request.draft == false) || (github.event_name == 'workflow_dispatch')
     name: Julia ${{ matrix.version }} - ${{ matrix.os }} - ${{ matrix.arch }} - ${{ github.event_name }}
     runs-on: ${{ matrix.os }}
     strategy:

.github/workflows/docs.yml

Lines changed: 8 additions & 1 deletion
@@ -8,13 +8,20 @@ on:
       - 'src/**'
       - 'docs/**'
   pull_request:
-    types: [opened, synchronize, reopened]
+    types: [opened, synchronize, reopened, ready_for_review]
     paths:
       - '.github/workflows/docs.yml'
       - 'src/**'
       - 'docs/**'
+concurrency:
+  # Skip intermediate builds: always.
+  # Cancel intermediate builds: only if it is a pull request build.
+  group: ${{ github.workflow }}-${{ github.ref }}
+  cancel-in-progress: ${{ startsWith(github.ref, 'refs/pull/') }}
 jobs:
   build:
+    # Run on push's or non-draft PRs
+    if: (github.event_name == 'push') || (github.event.pull_request.draft == false) || (github.event_name == 'workflow_dispatch')
     runs-on: ubuntu-latest
     env:
       GKSwstype: nul

.github/workflows/format_check.yml

Lines changed: 9 additions & 2 deletions
@@ -5,9 +5,16 @@ on:
       - master
       - release-*
   pull_request:
-    types: [opened, synchronize, reopened]
+    types: [opened, synchronize, reopened, ready_for_review]
+concurrency:
+  # Skip intermediate builds: always.
+  # Cancel intermediate builds: only if it is a pull request build.
+  group: ${{ github.workflow }}-${{ github.ref }}
+  cancel-in-progress: ${{ startsWith(github.ref, 'refs/pull/') }}
 jobs:
   build:
+    # Run on push's or non-draft PRs
+    if: (github.event_name == 'push') || (github.event.pull_request.draft == false) || (github.event_name == 'workflow_dispatch')
     runs-on: ubuntu-latest
     steps:
       - uses: julia-actions/setup-julia@latest
@@ -18,7 +25,7 @@ jobs:
         shell: julia --color=yes {0}
         run: |
           using Pkg
-          Pkg.add(PackageSpec(name="JuliaFormatter", version="0.22.4"))
+          Pkg.add(PackageSpec(name="JuliaFormatter", version="1"))
           using JuliaFormatter
           format(".", verbose=true)
           out = String(read(Cmd(`git diff`)))

.github/workflows/nightly_ci.yml

Lines changed: 8 additions & 1 deletion
@@ -10,14 +10,21 @@ on:
       - 'src/**'
       - 'Project.toml'
   pull_request:
-    types: [opened, synchronize, reopened]
+    types: [opened, synchronize, reopened, ready_for_review]
     paths:
      - '.github/workflows/nightly_ci.yml'
       - 'test/**'
       - 'src/**'
       - 'Project.toml'
+concurrency:
+  # Skip intermediate builds: always.
+  # Cancel intermediate builds: only if it is a pull request build.
+  group: ${{ github.workflow }}-${{ github.ref }}
+  cancel-in-progress: ${{ startsWith(github.ref, 'refs/pull/') }}
 jobs:
   test:
+    # Run on push's or non-draft PRs
+    if: (github.event_name == 'push') || (github.event.pull_request.draft == false) || (github.event_name == 'workflow_dispatch')
     name: Julia ${{ matrix.version }} - ${{ matrix.os }} - ${{ matrix.arch }} - ${{ github.event_name }}
     runs-on: ${{ matrix.os }}
     strategy:

.gitignore

Lines changed: 1 addition & 0 deletions
@@ -8,3 +8,4 @@ Manifest.toml
 benchmark/*.json
 test/Project.toml
 dev
+testproblem/D0W.jls

Project.toml

Lines changed: 4 additions & 2 deletions
@@ -1,6 +1,6 @@
 name = "Convex"
 uuid = "f65535da-76fb-5f13-bab9-19810c17039a"
-version = "0.15.4"
+version = "0.16.0"
 
 [deps]
 AbstractTrees = "1520ce14-60c1-5f80-bbc7-55ef81b5835c"
@@ -15,6 +15,7 @@ Test = "8dfed614-e22c-5e08-85e1-65c5234f0b40"
 [compat]
 AbstractTrees = "0.2, 0.3, 0.4"
 BenchmarkTools = "1"
+Clarabel = "0.5"
 ECOS = "1"
 GLPK = "1"
 LDLFactorizations = "0.8.1, 0.9, 0.10"
@@ -24,11 +25,12 @@ SCS = "1"
 julia = "1.6"
 
 [extras]
+Clarabel = "61c947e1-3e6d-4ee4-985a-eec8c727bd6e"
 ECOS = "e2685f51-7e38-5353-a97d-a921fd2c8199"
 GLPK = "60bf3e95-4087-53dc-ae20-288a0d20c6a6"
 Random = "9a3f8284-a2c9-5f02-9a11-845980a1fd5c"
 SCS = "c946c3f1-0d1f-5ce8-9dea-7daa1f7e2d13"
 Statistics = "10745b16-79ce-11e8-11f9-7d13ad32a3b2"
 
 [targets]
-test = ["ECOS", "GLPK", "Random", "SCS", "Statistics"]
+test = ["Clarabel", "ECOS", "GLPK", "Random", "SCS", "Statistics"]

benchmark/254.jl

Lines changed: 29 additions & 0 deletions
@@ -0,0 +1,29 @@
+using Convex, Clarabel, JLD2, Downloads
+import MathOptInterface as MOI
+file = JLD2.load(
+    Downloads.download(
+        "https://github.com/cossio/cvx_example_data/raw/master/cvx_example.jld2",
+    ),
+)
+
+m = 2000
+
+let
+    Adj = file["Adj"]
+    N = file["N"]
+    if m < size(Adj, 1)
+        Adj = Adj[1:m, :]
+        N = N[1:m, :, :]
+    end
+    x = Variable(2730)
+    S, V, T = size(N)
+    lN = log.(N)
+    M = vec(sum(N[:, :, 2:T]; dims = (2, 3)))
+    H = Adj * x
+    problem = maximize(
+        -sum(logsumexp(lN[:, v, t] - H) for t in 1:T-1, v in 1:V) - dot(M, H),
+        [x ≥ -1e2, x ≤ 1e2],
+    )
+
+    @time context = Convex.Context(problem, MOI.Utilities.Model{Float64}())
+end
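The benchmark above is dominated by Convex.jl's `logsumexp` atom. As a quick standalone reminder of the scalar function that atom models, here is a numerically stable log-sum-exp in plain Julia (a sketch for illustration; `lse` is a hypothetical helper name, not part of Convex.jl):

```julia
# Numerically stable log-sum-exp: computes log(sum(exp.(x))) by shifting by
# the maximum so that exp never overflows. This is only the scalar identity
# the `logsumexp` atom models, not Convex.jl's conic implementation.
function lse(x)
    m = maximum(x)
    return m + log(sum(exp.(x .- m)))
end

lse([0.0, 0.0])        # == log(2)
lse([1000.0, 1000.0])  # finite, whereas naive log(sum(exp.(x))) overflows to Inf
```

The shift-by-maximum trick is why the atom stays well behaved even when entries of `lN[:, v, t] - H` are large in magnitude.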

benchmark/Project.toml

Lines changed: 7 additions & 0 deletions
@@ -1,2 +1,9 @@
 [deps]
+Clarabel = "61c947e1-3e6d-4ee4-985a-eec8c727bd6e"
+Convex = "f65535da-76fb-5f13-bab9-19810c17039a"
+Downloads = "f43a241f-c20a-4ad4-852c-f6b1247861c6"
+JLD2 = "033835bb-8acc-5ee8-8aae-3f567f8a3819"
+JuMP = "4076af6c-e467-56ae-b986-b466b2749572"
+MathOptInterface = "b8f27783-ece8-5eb3-8dc8-9495eed66fee"
 PkgBenchmark = "32113eaa-f34f-5b0d-bd6c-c81e245fc73d"
+Test = "8dfed614-e22c-5e08-85e1-65c5234f0b40"

benchmark/alternating_minimization.jl

Lines changed: 118 additions & 0 deletions
@@ -0,0 +1,118 @@
+using Clarabel
+using Convex
+using MathOptInterface
+const MOI = MathOptInterface
+using JuMP
+using Test
+
+# Generate fake data matrix
+function gen_data(m, n, k)
+    return (10 * rand(m, k) * 2 * rand(k, n))
+end
+
+function gen_masks(A, holdout)
+    training_mask = rand(size(A)...) .> 1 - holdout
+    validation_mask = .!training_mask
+    return training_mask, validation_mask
+end
+
+function alternating_minimization(f, A, M, Y_init, k, MAX_ITERS)
+    m, n = size(A)
+
+    X = Variable(m, k)
+    Y = Variable(k, n)
+
+    objective = (
+        norm(vec(M .* A) - vec(M .* (X * Y)), 2) +
+        γ1 * norm(vec(X), 2) +
+        γ2 * norm(vec(Y), 1)
+    )
+
+    constraints = [X * Y >= ϵ]
+
+    problem = minimize(objective, constraints)
+
+    Y.value = Y_init
+    for i in 1:MAX_ITERS
+        fix!(Y)
+        res = f(problem)
+        free!(Y)
+        fix!(X)
+        res = f(problem)
+        free!(X)
+    end
+
+    return problem, X, Y
+end
+
+function JuMP_setup!(model, X, Y, A, M, k)
+    m, n = size(A)
+
+    @variable(model, t1 >= 0)
+    @variable(model, t2 >= 0)
+    @variable(model, Y_abs[1:n*k] >= 0)
+    @constraint(model, vcat(t1, vec(M .* A - M .* (X * Y))) in SecondOrderCone())
+    @constraint(model, vcat(t2, vec(X)) in SecondOrderCone())
+    @constraint(model, vec(Y) .<= Y_abs)
+    @constraint(model, -vec(Y) .<= Y_abs)
+    @objective(model, Min, t1 + γ1 * t2 + γ2 * sum(Y_abs))
+    @constraint(model, X * Y .>= ϵ)
+end
+
+function alternating_minimization_JuMP(A, M, Y_init, k, MAX_ITERS)
+    m, n = size(A)
+
+    function Y_fix_model(Y)
+        model = Model(() -> Clarabel.Optimizer(verbose = false))
+        @variable(model, X[1:m, j = 1:k])
+        JuMP_setup!(model, X, Y, A, M, k)
+        return X, model
+    end
+
+    function X_fix_model(X)
+        model = Model(() -> Clarabel.Optimizer(verbose = false))
+        @variable(model, Y[1:k, j = 1:n])
+        JuMP_setup!(model, X, Y, A, M, k)
+        return Y, model
+    end
+
+    local Xval, model
+    Yval = Y_init
+    for i in 1:MAX_ITERS
+        X, model = Y_fix_model(Yval)
+        JuMP.optimize!(model)
+        Xval = value.(X)
+
+        Y, model = X_fix_model(Xval)
+        JuMP.optimize!(model)
+        Yval = value.(Y)
+    end
+
+    return model, Xval, Yval
+end
+
+const γ1 = 1.0
+const γ2 = 1.0
+const ϵ = 0.0001
+MAX_ITERS = 2
+
+m, n, k = 125, 125, 3
+holdout = 0.80
+
+A = gen_data(m, n, k)
+Mt, Mv = gen_masks(A, holdout)
+
+Y_init = rand(k, n)
+@info "Running with Convex.jl..." (m, n, k)
+@time p1, X1, Y1 =
+    alternating_minimization(A, Mt, Y_init, k, MAX_ITERS) do problem
+        return solve!(problem, () -> Clarabel.Optimizer(verbose = false))
+    end;
+
+@info "Running with JuMP..." (m, n, k)
+@time model, X3, Y3 = alternating_minimization_JuMP(A, Mt, Y_init, k, MAX_ITERS);
+
+@testset "Same results" begin
+    @test evaluate(X1) ≈ X3 atol = 1e-2 rtol = 1e-2
+    @test evaluate(Y1) ≈ Y3 atol = 1e-2 rtol = 1e-2
+end

benchmark/benchmarks.jl

Lines changed: 2 additions & 2 deletions
@@ -33,6 +33,6 @@ problems = [
 ]
 
 SUITE["formulation"] = ProblemDepot.benchmark_suite(problems) do problem
-    model = MOIU.MockOptimizer(MOIU.Model{Float64}())
-    return Convex.load_MOI_model!(model, problem)
+    opt = MOIU.MockOptimizer(MOIU.Model{Float64}())
+    return Convex.Context(problem, opt)
 end
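This hunk reflects the commit's central API shift: a formulation is now loaded into an MOI backend by constructing a `Convex.Context`, replacing the removed `Convex.load_MOI_model!` entry point. A minimal sketch of the new usage, assuming Convex.jl at this commit and MathOptInterface are installed (the problem and variable names here are illustrative, not from the benchmark suite):

```julia
using Convex
import MathOptInterface as MOI

# Build a small problem and load its conic formulation into a mock MOI
# backend. Constructing the Context performs the work that
# `Convex.load_MOI_model!(model, problem)` did before this commit.
x = Variable(2)
problem = minimize(sum(x), [x >= 1])
opt = MOI.Utilities.MockOptimizer(MOI.Utilities.Model{Float64}())
context = Convex.Context(problem, opt)
```

Timing this construction (as `benchmark/254.jl` above does with `@time`) isolates formulation cost from solve cost, which is exactly what the `SUITE["formulation"]` benchmarks measure.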

benchmark/runjudge.jl

Lines changed: 1 addition & 1 deletion
@@ -14,7 +14,7 @@ group_target = benchmarkpkg(
 
 group_baseline = benchmarkpkg(
     dirname(@__DIR__),
-    mkconfig(id = "baseline"),
+    mkconfig(id = "eph/baseline"),
     resultfile = joinpath(@__DIR__, "result-baseline.json"),
 )
 
docs/examples_literate/mixed_integer/aux_files/antidiag.jl

Lines changed: 26 additions & 30 deletions
@@ -10,13 +10,7 @@
 #############################################################################
 import Convex.sign,
     Convex.monotonicity, Convex.curvature, Convex.evaluate, Convex.conic_form!
-using Convex:
-    AbstractExpr,
-    ConstVexity,
-    Nondecreasing,
-    has_conic_form,
-    cache_conic_form!,
-    get_conic_form
+using Convex: AbstractExpr, ConstVexity, Nondecreasing
 export antidiag
 
 ### Diagonal
@@ -79,31 +73,33 @@ antidiag(x::AbstractExpr, k::Int = 0) = AntidiagAtom(x, k)
 # 3. We populate coeff with 1s at the correct indices
 # The canonical form will then be:
 # coeff * x - d = 0
-function conic_form!(
+function Convex.new_conic_form!(
+    context::Convex.Context{T},
     x::AntidiagAtom,
-    unique_conic_forms::Convex.UniqueConicForms,
-)
-    if !has_conic_form(unique_conic_forms, x)
-        (num_rows, num_cols) = x.children[1].size
-        k = x.k
+) where {T}
+    (num_rows, num_cols) = x.children[1].size
+    k = x.k
 
-        if k >= 0
-            start_index = k * num_rows + num_rows
-            sz_diag = Base.min(num_rows, num_cols - k)
-        else
-            start_index = num_rows + k
-            sz_diag = Base.min(num_rows + k, num_cols)
-        end
-
-        select_diag = spzeros(sz_diag, length(x.children[1]))
-        for i in 1:sz_diag
-            select_diag[i, start_index] = 1
-            start_index += num_rows - 1
-        end
+    if k >= 0
+        start_index = k * num_rows + num_rows
+        sz_diag = Base.min(num_rows, num_cols - k)
+    else
+        start_index = num_rows + k
+        sz_diag = Base.min(num_rows + k, num_cols)
+    end
 
-        objective = conic_form!(x.children[1], unique_conic_forms)
-        new_obj = select_diag * objective
-        cache_conic_form!(unique_conic_forms, x, new_obj)
+    select_diag = spzeros(T, sz_diag, length(x.children[1]))
+    for i in 1:sz_diag
+        select_diag[i, start_index] = 1
+        start_index += num_rows - 1
     end
-    return get_conic_form(unique_conic_forms, x)
+
+    objective = conic_form!(context, Convex.only(Convex.children(x)))
+    return Convex.operate(
+        Convex.add_operation,
+        T,
+        Convex.sign(x),
+        select_diag,
+        objective,
+    )
 end
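To see concretely what the rewritten atom builds, the selection-matrix arithmetic above can be replayed in isolation. The sketch below is a standalone copy of just the index logic (`antidiag_selector` is a hypothetical helper name for illustration, not part of Convex.jl), applied to a 3×3 matrix:

```julia
using SparseArrays

# Standalone copy of the index logic from `new_conic_form!` above: build the
# sz_diag × (num_rows * num_cols) matrix that picks the k-th antidiagonal
# out of vec(A), where vec uses Julia's column-major vectorization.
function antidiag_selector(num_rows, num_cols, k)
    if k >= 0
        start_index = k * num_rows + num_rows
        sz_diag = min(num_rows, num_cols - k)
    else
        start_index = num_rows + k
        sz_diag = min(num_rows + k, num_cols)
    end
    select_diag = spzeros(Int, sz_diag, num_rows * num_cols)
    for i in 1:sz_diag
        select_diag[i, start_index] = 1
        start_index += num_rows - 1  # step one row up and one column right
    end
    return select_diag
end

A = collect(reshape(1:9, 3, 3))      # A[3,1] == 3, A[2,2] == 5, A[1,3] == 7
antidiag_selector(3, 3, 0) * vec(A)  # the main antidiagonal: [3, 5, 7]
```

The stride of `num_rows - 1` is what walks the antidiagonal in the flattened vector: moving up one row subtracts 1 from the linear index while moving right one column adds `num_rows`.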
