Is Julia.JuMP 15x slower than Python.Cvxpy?


I was trying to solve a simple optimization problem, first via the Python.Cvxpy framework and then via the Julia.JuMP framework, but the Julia.JuMP formulation is 15x slower.

My optimization problem:

  1. In Python.Cvxpy: (runtime: 4 sec)
# Run: time python this_file.py
import cvxpy as cp
import numpy as np
n = 2
b = np.array([2,3])
c1 = np.array([[3,4],[1,0],[0,1]])
c2 = [1,0,0]

x = cp.Variable(n)
prob = cp.Problem(cp.Minimize(b @ x), [c1 @ x >= c2])
prob.solve(solver=cp.MOSEK)   # FOSS alternative: prob.solve(solver=cp.GLPK)

print('Solution:', prob.value)
  2. In Julia.JuMP: (runtime: 1 min 7 sec)
# Run: time julia this_file.jl
using JuMP
using Mosek, MosekTools   # FOSS alternative: using GLPK

function compute()
    n = 2
    b = [2,3]
    c1 = [3 4 ; 1 0 ; 0 1]
    c2 = [1,0,0]

    prob = Model(optimizer_with_attributes(Mosek.Optimizer))   
    # FOSS alternative: Model(optimizer_with_attributes(GLPK.Optimizer))
    @variable(prob, x[1:n])
    @objective(prob, Min, b'*x)
    @constraint(prob, c1*x .>= c2)
    JuMP.optimize!(prob)

    println("Solution: ", JuMP.objective_value(prob))
end;

compute()

Any tips or tricks to speed up the Julia.JuMP code?


1 Answer

Oscar Dowson (accepted answer):

More than 1 minute is excessive. Did you update packages or something else that forced a recompile?

Here's what I get:

(base) oscar@Oscars-MBP lore % cat ~/Desktop/discourse.jl
@time using JuMP
@time using GLPK

function compute()
    n = 2
    b = [2,3]
    c1 = [3 4 ; 1 0 ; 0 1]
    c2 = [1,0,0]

    prob = Model(GLPK.Optimizer)
    @variable(prob, x[1:n])
    @objective(prob, Min, b' * x)
    @constraint(prob, c1 * x .>= c2)
    optimize!(prob)
    println("Solution: ", objective_value(prob))
end

@time compute()
@time compute()
(base) oscar@Oscars-MBP lore % time ~/julia --project=/tmp/jump ~/Desktop/discourse.jl
  4.070492 seconds (8.34 M allocations: 599.628 MiB, 4.17% gc time, 0.09% compilation time)
  0.280838 seconds (233.24 k allocations: 16.040 MiB, 41.37% gc time)
Solution: 0.6666666666666666
 12.746518 seconds (17.74 M allocations: 1.022 GiB, 3.71% gc time, 44.57% compilation time)
Solution: 0.6666666666666666
  0.000697 seconds (2.87 k allocations: 209.516 KiB)
~/julia --project=/tmp/jump ~/Desktop/discourse.jl  22.63s user 0.55s system 100% cpu 23.102 total

Breaking it down:

  • Total: 23 seconds
  • Of which, 4 seconds is using JuMP
  • 13 seconds is the first solve
  • ~0 seconds is the second solve
  • so that leaves 6 seconds to start Julia

We're working on improving the load time of using JuMP and our "time-to-first-solve" latency, but there are a few things you can do in the meantime.

  1. Don't run scripts via julia file.jl. Open Julia once and use the REPL; that avoids the 6-second startup overhead (see the REPL sketch after this list).
  2. Solve more than one JuMP model in a session. You only need to pay the 13 seconds once. The second solve was quick.
  3. Solve bigger models. If the solve time is measured in minutes, you probably don't care about 13 seconds of start-up.
  4. Use PackageCompiler (https://github.com/JuliaLang/PackageCompiler.jl) to avoid some of the latency issues; a sysimage sketch follows this list.
  5. Use a different tool. If your workflow is to solve lots of small optimization problems and you can't do the above things, at this moment JuMP might not be the right tool for the job (although we plan on improving the latency issues going forward).
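
To illustrate tip 1, here is a minimal REPL workflow, assuming the script above is saved as discourse.jl; include pays the load-and-compile cost once, and later calls in the same session reuse the already-compiled code:

julia> include("discourse.jl")   # loads JuMP/GLPK and runs the timed solves once
julia> compute()                 # subsequent solves in this session are fast
Solution: 0.6666666666666666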
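
And for tip 4, a minimal sketch of building a custom sysimage with PackageCompiler, assuming JuMP and GLPK are installed in the active project; the file names here are illustrative:

# build_sysimage.jl -- run once; the build itself takes several minutes
using PackageCompiler
create_sysimage(
    [:JuMP, :GLPK];                              # packages baked into the image
    sysimage_path = "jump.so",                   # output sysimage (illustrative name)
    precompile_execution_file = "discourse.jl",  # warm-up script whose code paths get compiled
)

Launching Julia with julia --sysimage jump.so then skips most of the using and first-solve compilation.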