Add doc #26

Merged: 18 commits, Nov 21, 2024
59 changes: 59 additions & 0 deletions .github/workflows/test_doc.yml
@@ -0,0 +1,59 @@
name: "Test and deploy"

on:
push:
branches: [ test_doc ]

jobs:
build:

runs-on: ubuntu-latest

steps:
- uses: actions/checkout@main
- name: Set up Python 3.10.5
uses: actions/setup-python@v2
with:
python-version: 3.10.5

- name: Install dependencies
run: |
sudo apt-get install build-essential graphviz libgraphviz-dev
pip install --upgrade pygraphviz graphviz

# pip install --no-deps --index-url https://test.pypi.org/simple/ --pre macapype

pip install -e .[doc]

python -c "import skullTo3d; print(skullTo3d.__version__)"

cd ..
git clone https://github.com/macatools/macapype.git

cd macapype
pip install -e .

python -c "import macapype; print(macapype.__version__)"

cd ../skullTo3d
pwd


- name: Test with pytest
run:
py.test --cov skullTo3d

- name: Build the Doc 🔧
run: |
cd docs
make clean
make html
touch _build/html/.nojekyll

- name: Deploy Github Pages 🚀
uses: JamesIves/github-pages-deploy-action@v4.4.3
with:
branch: gh-pages
folder: docs/_build/html/
clean: true
ssh-key: ${{ secrets.DEPLOY_KEY }}
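
Note: the deploy step above assumes a repository secret named DEPLOY_KEY containing the private half of a deploy key with write access to the repository; that setup lives in the repo settings, not in this PR. A typical way to provision such a key might be:

    # generate a dedicated key pair locally, without a passphrase
    ssh-keygen -t ed25519 -f deploy_key -N ""
    # register deploy_key.pub under Settings > Deploy keys (with write access),
    # and store the private file deploy_key as the DEPLOY_KEY repository secret
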
139 changes: 139 additions & 0 deletions docs/Makefile
@@ -0,0 +1,139 @@
# Makefile for Sphinx documentation
#

# You can set these variables from the command line.
SPHINXOPTS    =
SPHINXBUILD   = sphinx-build
PAPER         =
BUILDDIR      = _build

# User-friendly check for sphinx-build
ifeq ($(shell which $(SPHINXBUILD) >/dev/null 2>&1; echo $$?), 1)
$(error The '$(SPHINXBUILD)' command was not found. Make sure you have Sphinx installed, then set the SPHINXBUILD environment variable to point to the full path of the '$(SPHINXBUILD)' executable. Alternatively you can add the directory with the executable to your PATH. If you don't have Sphinx installed, grab it from http://sphinx-doc.org/)
endif

# Internal variables.
PAPEROPT_a4     = -D latex_paper_size=a4
PAPEROPT_letter = -D latex_paper_size=letter
ALLSPHINXOPTS   = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .
# the i18n builder cannot share the environment and doctrees with the others
I18NSPHINXOPTS  = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) .

.PHONY: help
help:
	@echo "Please use \`make <target>' where <target> is one of"
	@echo "  html-noplot to make standalone HTML files, without plotting anything"
	@echo "  html        to make standalone HTML files"
	@echo "  dirhtml     to make HTML files named index.html in directories"
	@echo "  singlehtml  to make a single large HTML file"
	@echo "  pickle      to make pickle files"
	@echo "  htmlhelp    to make HTML files and an HTML help project"
	@echo "  qthelp      to make HTML files and a qthelp project"
	@echo "  latex       to make LaTeX files, you can set PAPER=a4 or PAPER=letter"
	@echo "  latexpdf    to make LaTeX files and run them through pdflatex"
	@echo "  changes     to make an overview of all changed/added/deprecated items"
	@echo "  linkcheck   to check all external links for integrity"
	@echo "  doctest     to run all doctests embedded in the documentation (if enabled)"
	@echo "  coverage    to run coverage check of the documentation (if enabled)"

.PHONY: clean clean_no_plot
clean:
	rm -rf $(BUILDDIR)/*
	rm -rf auto_examples/
	rm -rf generated/*
	rm -rf modules/*

clean_no_plot:
	rm -rf $(BUILDDIR)/*
	rm -rf generated/*
	rm -rf modules/*

.PHONY: html-noplot
html-noplot:
	$(SPHINXBUILD) -D plot_gallery=0 -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
	@echo
	@echo "Build finished. The HTML pages are in $(BUILDDIR)/html."

.PHONY: html
html:
	$(SPHINXBUILD) -b html $(ALLSPHINXOPTS) $(BUILDDIR)/html
	@echo
	@echo "Build finished. The HTML pages are in $(BUILDDIR)/html."

.PHONY: dirhtml
dirhtml:
	$(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml
	@echo
	@echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml."

.PHONY: singlehtml
singlehtml:
	$(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml
	@echo
	@echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml."

.PHONY: pickle
pickle:
	$(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle
	@echo
	@echo "Build finished; now you can process the pickle files."

.PHONY: htmlhelp
htmlhelp:
	$(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp
	@echo
	@echo "Build finished; now you can run HTML Help Workshop with the" \
	      ".hhp project file in $(BUILDDIR)/htmlhelp."

.PHONY: qthelp
qthelp:
	$(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp
	@echo
	@echo "Build finished; now you can run "qcollectiongenerator" with the" \
	      ".qhcp project file in $(BUILDDIR)/qthelp, like this:"
	@echo "# qcollectiongenerator $(BUILDDIR)/qthelp/macapype.qhcp"
	@echo "To view the help file:"
	@echo "# assistant -collectionFile $(BUILDDIR)/qthelp/macapype.qhc"

.PHONY: latex
latex:
	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
	@echo
	@echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex."
	@echo "Run \`make' in that directory to run these through (pdf)latex" \
	      "(use \`make latexpdf' here to do that automatically)."

.PHONY: latexpdf
latexpdf:
	$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
	@echo "Running LaTeX files through pdflatex..."
	$(MAKE) -C $(BUILDDIR)/latex all-pdf
	@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."

.PHONY: changes
changes:
	$(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes
	@echo
	@echo "The overview file is in $(BUILDDIR)/changes."

.PHONY: linkcheck
linkcheck:
	$(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck
	@echo
	@echo "Link check complete; look for any errors in the above output" \
	      "or in $(BUILDDIR)/linkcheck/output.txt."

.PHONY: doctest
doctest:
	$(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest
	@echo "Testing of doctests in the sources finished, look at the" \
	      "results in $(BUILDDIR)/doctest/output.txt."

.PHONY: coverage
coverage:
	$(SPHINXBUILD) -b coverage $(ALLSPHINXOPTS) $(BUILDDIR)/coverage
	@echo "Testing of coverage in the sources finished, look at the" \
	      "results in $(BUILDDIR)/coverage/python.txt."

136 changes: 136 additions & 0 deletions docs/command.rst
@@ -0,0 +1,136 @@
:orphan:

.. _command:

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Launching a processing pipeline
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

Commands
********

The main script, segment_petra.py, is located in the workflows directory and should be run as a Python script:

.. code:: bash

    $ python workflows/segment_petra.py

**N.B.: if you have installed the PyPI version (e.g. using pip install macapype) or a Docker/Singularity version, you can replace the previous command with the following one:**

.. code:: bash

    $ segment_petra



For containers (Docker and Singularity), here are some examples; add your own bindings:

.. code:: bash

    $ docker run -v binding_to_host:binding_guest macatools/skullto3d:latest segment_petra

.. code:: bash

    $ singularity run -B binding_to_host:binding_guest /path/to/containers/skullto3d_v0.0.4.1.sif segment_petra
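
For instance, binding a local data directory into the container could look like this (hypothetical paths, shown for Docker; the Singularity form is analogous with -B):

.. code:: bash

    $ docker run -v ~/Data_maca:/data macatools/skullto3d:latest segment_petra -data /data -out /data/results -soft ANTS_skull -species macaque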

Expected input data
*******************


All the data must be organized in BIDS format for the pipelines to run properly (see the `BIDS specification <https://bids-specification.readthedocs.io/en/stable/index.html>`_ for more details).

In particular:

* the _T1w suffix (BIDS) is expected for T1-weighted images
* the _T2w suffix (BIDS) is expected for T2-weighted images
* the _PDw (BIDS) or petra (non-BIDS) suffixes are expected for petra images

* the _acq-CT_T2star suffix (BIDS, but non-canonical) is expected for CT images
* the _angio suffix is expected for angiography images
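
For illustration, a minimal input dataset might look like this (subject, session and file names are hypothetical):

.. code:: none

    Data_maca/
    ├── dataset_description.json
    └── sub-Apache/
        └── ses-01/
            └── anat/
                ├── sub-Apache_ses-01_T1w.nii.gz
                ├── sub-Apache_ses-01_T2w.nii.gz
                └── sub-Apache_ses-01_PDw.nii.gz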


Command line parameters
***********************

--------------------------------------
The following parameters are mandatory
--------------------------------------

* -data *like in macapype*
  the path to your BIDS dataset (an existing BIDS-formatted directory)

* -out *like in macapype*
  the path where the output results will be written (an existing path)

* -soft *like in macapype*
  can be one of: SPM or ANTS, with optional suffixes (suffixes can be combined; see the example below)

  * add _skull after SPM or ANTS to process skull or angio data *specific to skullTo3d*; otherwise the main macapype pipelines are launched (only brain segmentation is performed)
  * add _robustreg (at the end) for a more robust registration, performed in two steps *like in macapype*
  * add _test (at the end) to check that the full pipeline is coherent (only graph.dot and graph.png are generated) *like in macapype*
  * add _prep (at the end) to perform data preparation only (no brain extraction or segmentation) *like in macapype*
  * add _noseg (at the end) to perform data preparation and brain extraction (no segmentation) *like in macapype*
  * add _seq (at the end) to run in sequential mode (all iterables are processed one after the other; equivalent to -nprocs 1) *like in macapype*
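
For instance, a quick sanity check of the full skull pipeline before a real run might combine suffixes like this (hypothetical paths; _test only generates graph.dot and graph.png):

.. code:: bash

    $ segment_petra -data ~/Data_maca -out ./local_test -soft ANTS_skull_test -species macaque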


-----------------------------------------------
The following parameters are mutually exclusive
-----------------------------------------------
*(but one of them is mandatory)*

* -params *(mandatory if -species is omitted)*
  a json file specifying the global parameters of the analysis. See :ref:`Parameters <params>` for more details

* -species *(mandatory if -params is omitted)*
  followed by the NHP species corresponding to the images, e.g. {macaque | marmo | baboon | chimp}

--------------------------------------
The following parameters are optional
--------------------------------------
*(but highly recommended)*

* -brain_dt *equivalent to -dt in macapype*
  specifies the datatype(s) available for brain segmentation (can be "T1" or "T1 T2").
  **Note**: defaults to T1 if the option is omitted

* -skull_dt *specific to skullTo3d*
  specifies the datatype(s) available for skull segmentation (can be "T1", "petra", "CT" or "angio", or a space-separated combination of these; see the example below).
  **Note**: defaults to T1 if the option is omitted.
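
For example, a hypothetical invocation requesting brain segmentation from T1 and T2 and skull segmentation from both petra and CT images (paths are placeholders; datatypes are passed space-separated):

.. code:: bash

    $ segment_petra -data ~/Data_maca -out ./local_test -soft ANTS_skull -species macaque -brain_dt T1 T2 -skull_dt petra CT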

* -deriv creates a derivatives directory containing all important files, properly named following the BIDS derivatives convention

* -pad additionally exports (into derivatives) important files in native (original) space

-------------------------------------------
The following parameters are also optional
-------------------------------------------

* -indiv or -indiv_params : a json file overriding the default parameters (both the macapype defaults and the parameters specified in the -params json file) for specific subjects/sessions. See :ref:`Individual Parameters <indiv_params>` for more details

* -sub (-subjects), -ses (-sessions), -acq (-acquisions), -rec (-reconstructions) restrict processing to a subset of the BIDS dataset: specific subjects, sessions, acquisition types and reconstruction types, respectively. Multiple values can be listed, separated by spaces. **Note**: if not specified, the full BIDS dataset is processed

* -mask allows you to specify a precomputed binary mask file, skipping brain extraction. The typical usage is: run the pipeline up to brain_extraction_pipe, modify the mask by hand, then reuse it for segmentation. This works best when a single subject/session is specified, since only one mask file can be given at a time.

**Warning: the mask must be in the same space as the data, and this option only works with -soft ANTS so far**

* -nprocs : an integer specifying the number of processes to allocate to the parallel engine of macapype

  * typically equal to the number of subject/session combinations (i.e. iterables); see the worked example below
  * can be doubled if both T1 and T2 pipelines are run (at least the first steps will benefit from it)
  * default = 4 if unspecified; if set to 0, sequential processing is used (equivalent to -soft with _seq, see above)
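
As a worked example: with 2 subjects and 2 sessions there are 4 iterables, and with both T1 and T2 pipelines running this can be doubled, so 8 processes would be a reasonable choice (hypothetical paths):

.. code:: bash

    $ segment_petra -data ~/Data_maca -out ./local_test -soft ANTS_skull -species macaque -brain_dt T1 T2 -nprocs 8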

***********************
Command line examples
***********************


.. code:: bash

    $ python workflows/segment_petra.py -data ~/Data_maca -out ./local_test -soft ANTS_skull -params params.json


.. code:: bash

    $ python workflows/segment_petra.py -data ~/Data_maca -out ./local_test -soft ANTS_skull_robustreg -species macaque

.. code:: bash

    $ python workflows/segment_petra.py -data ~/Data_maca -out ./local_test -soft ANTS_skull -params params.json -sub Apache Baron -ses 01 -rec mean -deriv -pad