
Commit 2da8f19

doc: fix bugs in doc (abess-team#105)
1 parent 6d1627b commit 2da8f19


10 files changed: +297 -329 lines changed


docs/source/feature/DataScienceTool.rst

Lines changed: 2 additions & 2 deletions
@@ -79,9 +79,9 @@ Information Criterion
 
 
 Information criterion is a statistical measure used to assess the goodness of fit of a model while penalizing model complexity. It helps in selecting the optimal model from a set of competing models. In the context of sparsity-constrained optimization, information criterion can be used to evaluate different sparsity levels and identify the most suitable support size.
-.. There is another way to evaluate sparsity levels, which is information criterion. The smaller the information criterion, the better the model.
+There is another way to evaluate sparsity levels, which is information criterion. The smaller the information criterion, the better the model.
 There are four types of information criterion can be implemented in ``skscope.utilities``: Akaike information criterion `[1]`_, Bayesian information criterion (BIC, `[2]`_), extend BIC `[3]`_, and special information criterion (SIC `[4]`_).
-.. If sparsity is list and ``cv=None``, the solver will use information criterions to evaluate the sparsity level.
+If sparsity is list and ``cv=None``, the solver will use information criterions to evaluate the sparsity level.
 The input parameter ``ic_method`` in the solvers of skscope can be used to choose the information criterion. It should be a method to compute information criterion which has the same parameters with this example:
 
 .. code-block:: python

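For context, the ``ic_method`` callable that this section describes follows a fixed signature. Below is a minimal sketch; the parameter names (objective_value, dimensionality, effective_params_num, train_size) are an assumption here, since the example code block is truncated in this diff:

    def my_ic_method(objective_value, dimensionality, effective_params_num, train_size):
        # AIC-style criterion: twice the objective (e.g. a negative
        # log-likelihood) plus a penalty on the number of selected
        # (nonzero) parameters; smaller values indicate a better model.
        return 2 * objective_value + 2 * effective_params_num

    # Hypothetical usage: candidate sparsity levels are then ranked by
    # this criterion whenever ``sparsity`` is a list and ``cv=None``.
    # solver = ScopeSolver(dimensionality=p, sparsity=[1, 2, 3], ic_method=my_ic_method)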
docs/source/feature/Variants.rst

Lines changed: 3 additions & 3 deletions
@@ -8,7 +8,7 @@ In addition to standard sparsity-constrained optimization (SCO) problems, ``sksc
 Group-structured parameters
 ----------------------------
 
-In certain cases, we may encounter group-structured parameters where all parameters are divided into non-overlapping groups. Examples of such scenarios include group variable selection under linear model `[1]`_, `multitask learning <../userguide/examples/GeneralizedLinearModels/multiple-response-linear-regression.html>`__, and so on.
+In certain cases, we may encounter group-structured parameters where all parameters are divided into non-overlapping groups. Examples of such scenarios include group variable selection under linear model `[1]`_, `multitask learning <../gallery/GeneralizedLinearModels/multiple-response-linear-regression.html>`__, and so on.
 
 When dealing with group-structured parameters, we treat each parameter group as a unit, selecting or deselecting all the parameters in the group simultaneously. This problem is referred to as group SCO (GSCO).

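As background for GSCO, group structure is usually encoded by tagging each parameter with a group index. A minimal sketch, assuming the solvers accept a ``group`` argument so that ``sparsity`` counts selected groups rather than individual parameters:

    import jax.numpy as jnp
    from skscope import ScopeSolver

    p = 10
    group = [0, 0, 1, 1, 2, 2, 3, 3, 4, 4]  # five non-overlapping groups of two parameters

    def objective(params):
        # toy quadratic loss whose minimizer activates groups 1 and 3 only
        target = jnp.zeros(p).at[jnp.array([2, 3, 6, 7])].set(1.0)
        return jnp.sum((params - target) ** 2)

    solver = ScopeSolver(dimensionality=p, sparsity=2, group=group)  # select 2 of the 5 groups
    params = solver.solve(objective)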
@@ -174,8 +174,8 @@ In some cases, there may be additional constraints on the intrinsic structure of
 
 .. math::
     \arg\min_{\theta \in R^s, \theta \in \mathcal{C}} f(\theta).
-
-A typical example is the Gaussian graphical model for continuous random variables, which constrains :math:`\theta` on symmetric positive-definite spaces (see this example `<../userguide/examples/GraphicalModels/sparse-gaussian-precision-matrix.html>`__). Although the default numeric solver cannot solve this problem, ``skscope`` provides a flexible interface that allows for its replacement. Specifically, users can change the default numerical optimization solver by properly setting the ``numeric_solver`` in the solver.
+
+A typical example is the Gaussian graphical model for continuous random variables, which constrains :math:`\theta` on symmetric positive-definite spaces (see this example `gaussian precision matrix <../gallery/GraphicalModels/sparse-gaussian-precision-matrix.html>`__). Although the default numeric solver cannot solve this problem, ``skscope`` provides a flexible interface that allows for its replacement. Specifically, users can change the default numerical optimization solver by properly setting the ``numeric_solver`` in the solver.
 
 > Notice that, the accepted input of ``numeric_solver`` should have the same interface as ``skscope.numeric_solver.convex_solver_LBFGS``.

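For context, a replacement for ``numeric_solver`` is just a callable with the default solver's interface. The argument names below are an assumption chosen to mirror ``convex_solver_LBFGS``, not a confirmed signature:

    from skscope import ScopeSolver
    from skscope.numeric_solver import convex_solver_LBFGS

    def my_numeric_solver(objective_func, value_and_grad, init_params, optim_variable_set, data):
        # A real implementation would run a constrained optimizer here,
        # e.g. one restricted to symmetric positive-definite matrices;
        # this sketch simply delegates to the default L-BFGS routine.
        return convex_solver_LBFGS(objective_func, value_and_grad, init_params, optim_variable_set, data)

    solver = ScopeSolver(dimensionality=10, sparsity=3, numeric_solver=my_numeric_solver)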
docs/source/gallery/GeneralizedLinearModels/gamma-regression.ipynb

Lines changed: 1 addition & 112 deletions
@@ -317,110 +317,7 @@
     "id": "c4d3720f",
     "metadata": {},
     "source": [
-     "Now the `solver.params` contains the coefficients of gamma model with no more than 5 variables. That is, those variables with a coefficient 0 is unused in the model:"
-    ]
-   },
-   {
-    "cell_type": "code",
-    "execution_count": 10,
-    "id": "e416367f",
-    "metadata": {
-     "scrolled": true
-    },
-    "outputs": [
-     {
-      "name": "stdout",
-      "output_type": "stream",
-      "text": [ ... the printed ``solver.params`` vector, mostly zeros, with nonzero entries 10.96270773, 1.2021258, 0.99600871, 1.74258709, 1.18841825, and 1.46535362 ... ]
-     }
-    ],
-    "source": [
-     "print(solver.params)"
+     "Now the `solver.params` contains the coefficients of gamma model with no more than 5 variables."
     ]
    },
    {
@@ -557,14 +454,6 @@
     "- [2] Abess docs, \"make_glm_data\".\n",
     "https://abess.readthedocs.io/en/latest/Python-package/datasets/glm.html\n"
    ]
-  },
-  {
-   "cell_type": "code",
-   "execution_count": null,
-   "id": "84ae9a94",
-   "metadata": {},
-   "outputs": [],
-   "source": []
   }
  ],
  "metadata": {

docs/source/gallery/GeneralizedLinearModels/index.rst

Lines changed: 1 addition & 0 deletions
@@ -12,4 +12,5 @@ Generalized Linear Models
    gamma-regression
    multiple-response-linear-regression
    multinomial-logistic-regression
+   poisson-identity-link
 .. Inverse-gaussian-regression


docs/source/gallery/GeneralizedLinearModels/logistic-regression.ipynb

Lines changed: 67 additions & 144 deletions
Large diffs are not rendered by default.

docs/source/gallery/LinearModelAndVariants/quantile-expectile-regression.ipynb

Lines changed: 15 additions & 23 deletions
Large diffs are not rendered by default.

docs/source/userguide/install.rst

Lines changed: 0 additions & 11 deletions
@@ -52,17 +52,6 @@ Note that ``--recurse-submodules`` is required since there are some submodules i
 
 Thanks to the editable mode with the flag ``-e``, we needn't re-build the package :ref:`skscope <skscope_package>` when the source python code changes.
 
-If the dependence packages has been installed, we can build the package faster by
-
-.. code-block:: Bash
-
-    python setup.py develop
-
-where the function of the flag ``develop`` is similar with ``-e`` of command ``pip``.
-
-This command will not check or prepare the required environment, so it can save a lot of time.
-Thus, we can use ``pip`` with first building and ``python`` with re-building.
-
 
 
 

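For reference, the editable install that the kept text refers to looks like the following sketch (in the same Bash register as the doc's own code-block); the clone URL is an assumption based on the project's GitHub organization, abess-team:

    # clone with submodules, then install in editable mode so that
    # source changes take effect without re-building
    git clone --recurse-submodules https://github.com/abess-team/skscope.git
    cd skscope
    pip install -e .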
docs/source/userguide/quickstart.rst

Lines changed: 1 addition & 1 deletion
@@ -147,7 +147,7 @@ Further reading
 
 - `JAX library <https://jax.readthedocs.io/en/latest/index.html>`__
 
-- A bunch of `machine learning methods <examples/index.html>`__ implemented on the ``skscope``
+- A bunch of `machine learning methods <gallery/index.html>`__ implemented on the ``skscope``
 
 - More `advanced features <../feature/index.html>`__ implemented in ``skscope``
 
docs/source/userguide/whatscope.rst

Lines changed: 6 additions & 6 deletions
@@ -6,39 +6,39 @@ What is ``skscope``?
 
 ``skscope`` is a powerful open-source Python package specifically developed to tackle sparsity-constrained optimization (SCO) problems with utmost efficiency. With SCO's broad applicability in machine learning, statistics, signal processing, and other related domains, ``skscope`` can find extensive usage in these fields. For example, it excels in solving classic SCO problems like variable selection (also known as feature selection or compress sensing). Even more impressively, it goes beyond that and handles a diverse range of intriguing real-world problems:
 
-1. `Robust variable selection <examples/LinearModelAndVariants/robust-regression.html>`__
+1. `Robust variable selection <gallery/LinearModelAndVariants/robust-regression.html>`__
 
 .. image:: figure/variable_selection.png
    :width: 300
    :align: center
 
-2. `Nonlinear variable selection <examples/Miscellaneous/hsic-splicing.html>`__
+2. `Nonlinear variable selection <gallery/Miscellaneous/hsic-splicing.html>`__
 
 .. image:: figure/nonlinear_variable_selection.png
    :width: 666
    :align: center
 
 
-3. `Spatial trend filtering <examples/FusionModels/spatial-trend-filtering.html>`__
+3. `Spatial trend filtering <gallery/FusionModels/spatial-trend-filtering.html>`__
 
 .. image:: figure/trend_filter.png
    :width: 666
    :align: center
 
-4. `Network reconstruction <examples/GraphicalModels/sparse-gaussian-precision.html>`__
+4. `Network reconstruction <gallery/GraphicalModels/sparse-gaussian-precision.html>`__
 
 .. image:: figure/precision_matrix.png
    :width: 666
    :align: center
 
-5. `Portfolio selection <examples/Miscellaneous/portfolio-selection.html>`__
+5. `Portfolio selection <gallery/Miscellaneous/portfolio-selection.html>`__
 
 .. image:: figure/portfolio_selection.png
    :width: 300
    :align: center
 
 
-These above examples represent just a glimpse of the practical problems that ``skscope`` can effectively address. With its efficient optimization algorithms and versatility, ``skscope`` proves to be an invaluable tool for a wide range of disciplines. Currently, we offer over 20 examples in our comprehensive `example gallery <examples/index.html>`__.
+These above examples represent just a glimpse of the practical problems that ``skscope`` can effectively address. With its efficient optimization algorithms and versatility, ``skscope`` proves to be an invaluable tool for a wide range of disciplines. Currently, we offer over 20 examples in our comprehensive `example gallery <gallery/index.html>`__.
 
 
 .. How does ``skscope`` work?
