From cfe02da4c04191a42f57d41e8d1c1dcf9864b8e9 Mon Sep 17 00:00:00 2001
From: Ronny Bergmann
Date: Mon, 8 Jul 2024 17:20:46 +0200
Subject: [PATCH] Refine docs.

---
 src/solvers/interior_point_Newton.jl | 9 +++++----
 1 file changed, 5 insertions(+), 4 deletions(-)

diff --git a/src/solvers/interior_point_Newton.jl b/src/solvers/interior_point_Newton.jl
index 6c4f9ee2ae..01e58efdbf 100644
--- a/src/solvers/interior_point_Newton.jl
+++ b/src/solvers/interior_point_Newton.jl
@@ -41,12 +41,13 @@ The interior point Newton method iteratively solves ``F(p, μ, λ, s) = 0`` such
 by a Newton method, that is
 
 ```math
-\operatorname{grad} F(p, μ, λ, s)[X, Y, Z, W] = -F(p, μ, λ, s),
+\operatorname{Jacobian} F(p, μ, λ, s)[X, Y, Z, W] = -F(p, μ, λ, s),
 \text{ where }
-X ∈ T_p\mathcal M, Y,W ∈ ℝ^m, Z ∈ ℝ^n
+X ∈ T_p\mathcal M, Y,W ∈ ℝ^m, Z ∈ ℝ^n,
 ```
-together denote the new search direction.
-This can for example be done in the reduced form.
+see [`CondensedKKTVectorFieldJacobian`](@ref) and [`CondensedKKTVectorField`](@ref) for the Jacobian
+and the vector field of the reduced form this is usually solved in, respectively.
+From the resulting `X` and `Z` in the reduced form, the other two, `Y` and `W`, can be computed.
 Note that since the vector field ``F`` includes the gradients of the constraint functions ``g,h``,
 its gradient or Jacobian requires the Hessians of the constraints.
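
To make the back-substitution mentioned in the last added sentence concrete, here is a sketch of how `Y` and `W` can be recovered once `X` and `Z` are known. It assumes the slack-feasibility and perturbed complementarity components of ``F`` are ``g(p) + s = 0`` and ``μ ⊙ s = β\mathbf{1}`` with barrier parameter ``β``; the exact signs and scaling conventions are assumptions and may differ from those used in [`CondensedKKTVectorFieldJacobian`](@ref).

```math
\begin{aligned}
W_i &= -g_i(p) - s_i - ⟨\operatorname{grad} g_i(p), X⟩, \\
Y_i &= \frac{β - μ_i s_i - μ_i W_i}{s_i}, \qquad i = 1,…,m,
\end{aligned}
```

which follows from linearising the two eliminated components of ``F``. Solving the condensed system for ``X`` and ``Z`` first keeps the Newton system on ``T_p\mathcal M × ℝ^n`` instead of the full product space.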