Please refer to our publication (cited below).
Figure 1: a, SHAPR consists of a 2D encoder, which embeds 2D images into a 128-dimensional latent space, and a 3D decoder, which reconstructs 3D shapes from the latent space representation. To train SHAPR, we segment 3D microscopy images (we show an exemplary single red blood cell). We pair a 2D segmentation with the microscopy image of the same slice as input to the encoder. During supervised training (Fig. 1, step 1), we minimize the reconstruction loss (see Methods), which is the sum of the Dice loss and the binary cross-entropy loss between the 3D segmentations y and SHAPR predictions p. For an input image of 64 x 64 pixels, we provide the pixel sizes for each layer in the gray boxes and the filter sizes on top of each box. b, In the second step, we fine-tune SHAPR by adding a discriminator. The discriminator is trained to differentiate between SHAPR output p and ground truth segmentation r and to minimize the adversarial loss. It thereby challenges SHAPR to output realistic 3D objects.

If you use SHAPR, please cite:

```bibtex
@article{waibel2022shapr,
  title={SHAPR predicts 3D cell shapes from 2D microscopic images},
  author={Waibel, Dominik JE and Kiermeyer, Niklas and Atwell, Scott and Sadafi, Ario and Meier, Matthias and Marr, Carsten},
  journal={iScience},
  volume={25},
  number={11},
  pages={105298},
  year={2022},
  publisher={Elsevier}
}
```
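To make the reconstruction loss from the figure caption concrete, below is a minimal sketch of a combined Dice + binary cross-entropy loss in TensorFlow/Keras. This is an illustration only, not the implementation used in this repository; the function name `reconstruction_loss` and the smoothing constant are our own choices.

```python
import tensorflow as tf

def reconstruction_loss(y_true, y_pred, smooth=1e-6):
    """Illustrative Dice + binary cross-entropy loss for 3D segmentations.

    y_true and y_pred are float tensors of shape (batch, 64, 64, 64, 1)
    with voxel values in [0, 1]; `smooth` avoids division by zero.
    """
    batch = tf.shape(y_true)[0]
    y_true_f = tf.reshape(y_true, [batch, -1])
    y_pred_f = tf.reshape(y_pred, [batch, -1])

    # Soft Dice loss, computed per sample.
    intersection = tf.reduce_sum(y_true_f * y_pred_f, axis=-1)
    dice = (2.0 * intersection + smooth) / (
        tf.reduce_sum(y_true_f, axis=-1) + tf.reduce_sum(y_pred_f, axis=-1) + smooth
    )
    dice_loss = 1.0 - dice

    # Binary cross-entropy, averaged over all voxels of each sample.
    bce = tf.keras.losses.binary_crossentropy(y_true_f, y_pred_f)

    # Sum of both terms, averaged over the batch, as described in the caption above.
    return tf.reduce_mean(dice_loss + bce)
```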
Fork and clone the repository, then install SHAPR from the repository root:

```console
$ cd SHAPR
$ pip3 install -e .
```
We recommend first setting up a virtual environment and then installing the package:

```console
$ cd SHAPR
$ python3 -m venv .venv_shapr
$ source .venv_shapr/bin/activate
(.venv_shapr) $ pip3 install -e .
```
Activating the virtual environment changes your shell prompt to show which virtual environment you are using. You can deactivate the virtual environment with:

```console
(.venv_shapr) $ deactivate
```
You can also use the virtual environment as a kernel for Jupyter Notebook. First, install the `ipykernel` package while the virtual environment is activated:

```console
(.venv_shapr) $ pip3 install ipykernel
```

Then manually add the virtual environment as a kernel to Jupyter Notebook:

```console
(.venv_shapr) $ python -m ipykernel install --name=.venv_shapr
```

Now, when you open Jupyter Notebook, you can select `.venv_shapr` as the kernel.
You can also install SHAPR using:

```console
$ pip install shapr
```
To run the code, you need Python 3.6 or higher. In addition, these are the main dependencies:

- cudatoolkit: 10.1.243 # only if a GPU is available
- cudnn: 7.6.5 # only if a GPU is available
- h5py: 2.10.0
- hdf5: 1.10.6
- imageio: 2.9.0
- keras: 2.4.3
- matplotlib: 3.3.4
- numpy: 1.20.3
- python: 3.6.7
- scikit-image: 0.18.1
- scikit-learn: 0.24.1
- scipy: 1.6.2
- tensorflow: 2.4.1
- tensorflow-gpu: 2.4.1 # only if a GPU is available
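To quickly verify that the main dependencies are importable in your environment, you can run a small check like the one below (our own sketch, not part of SHAPR; note that scikit-image and scikit-learn are imported as `skimage` and `sklearn`):

```python
import importlib

# Import names of the main Python dependencies listed above.
packages = ["tensorflow", "keras", "numpy", "scipy", "skimage",
            "sklearn", "h5py", "imageio", "matplotlib"]

for name in packages:
    try:
        module = importlib.import_module(name)
        print(f"{name}: {getattr(module, '__version__', 'unknown version')}")
    except ImportError:
        print(f"{name}: not installed")
```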
Please find an example of how to run SHAPR from a Jupyter notebook in `/SHAPR/docs/jupyter notebook/Run SHAPR from notebook.ipynb`.

You can also run SHAPR using a `params.json` file; a sample is provided in `SHAPR/docs/sample/params.json`.
To run SHAPR, you should set the following parameters (a minimal `params.json` covering them is sketched after this list):

- `path`: path to a folder that includes three subfolders:
  - `obj`: containing the 3D ground truth segmentations,
  - `mask`: containing the 2D masks,
  - `image`: containing the images from which the 2D masks were segmented (e.g. brightfield).
- `result_path`: path to a folder for saving the prediction results.
- `pretrained_weights_path`: path to a folder for saving and reloading the pretrained model.
- `random_seed`: seed for the random generator in order to keep the results reproducible.
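For reference, a minimal `params.json` covering these parameters could look like the following; the paths are placeholders, and the sample file shipped in `SHAPR/docs/sample/params.json` may contain additional parameters:

```json
{
    "path": "/data/shapr_project",
    "result_path": "/data/shapr_results",
    "pretrained_weights_path": "/data/shapr_weights",
    "random_seed": 0
}
```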
The setting parameters are read from the `settings` object. You may change them either by editing their default values in the `/SHAPR/params.json` file or directly via the package API:

```console
> from shapr import settings
> settings.path = "a/new/path"
```

We have added an example of a `params.json` file to `SHAPR/docs/sample/params.json`. If you want to use it, please adapt the paths to your project, copy the `params.json` to `/SHAPR/params.json`, and then execute `/SHAPR/shapr/run_train_script.py`.
You can also print all the parameters and their values using the `print()` function:

```console
> print(settings)
------ settings parameters ------
path: "path value"
result_path: "result_path value"
pretrained_weights_path: "pretrained_weights_path value"
random_seed: 0
```
## Running functions:
You can run training and evaluation on the test samples by calling the `run_train()` and `run_evaluation()` functions, respectively:

```console
> from shapr import run_train, run_evaluation
> run_train()
> run_evaluation()
```
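Putting the configuration and training steps together, a minimal end-to-end script might look like the sketch below; the paths are placeholders for your own data and output folders:

```python
from shapr import settings, run_train, run_evaluation

# Placeholder paths; replace them with your own folders.
settings.path = "/data/shapr_project"                # contains image/, mask/ and obj/
settings.result_path = "/data/shapr_results"
settings.pretrained_weights_path = "/data/shapr_weights"
settings.random_seed = 0

print(settings)    # check the configuration before starting

run_train()        # train SHAPR (supervised and adversarial steps)
run_evaluation()   # predict 3D shapes for the test samples
```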
Please find the datasets used for this publication on Zenodo: https://doi.org/10.5281/zenodo.7031924
SHAPR expects the data in the following folder structure (see the sample), with corresponding files having the same name. 2D microscopy images (64x64 px) should be contained in the `image` folder, 2D segmentations (64x64 px) in the `mask` folder, and 3D segmentations (64x64x64 px) in the `obj` folder.
```
path
├── image
│   ├── 000003-num1.png
│   ├── 000004-num9.png
│   ├── 000006-num1.png
│   ├── ...
│   └── 059994-num1.png
├── mask
│   ├── 000003-num1.png
│   ├── 000004-num9.png
│   ├── 000006-num1.png
│   ├── ...
│   └── 059994-num1.png
└── obj
    ├── 000003-num1.png
    ├── 000004-num9.png
    ├── 000006-num1.png
    ├── ...
    └── 059994-num1.png
```
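Because SHAPR pairs files by name across the three folders, a small sanity check like the following (our own sketch, not part of SHAPR) can catch missing counterparts before training; `data_path` is a placeholder:

```python
import os

data_path = "/data/shapr_project"   # placeholder; point this at your dataset

folders = ["image", "mask", "obj"]
names = {f: set(os.listdir(os.path.join(data_path, f))) for f in folders}

# Filenames present in every folder form complete image/mask/obj triplets.
common = set.intersection(*names.values())
for folder, files in names.items():
    missing = files - common
    if missing:
        print(f"{folder}: {len(missing)} file(s) without a counterpart, "
              f"e.g. {sorted(missing)[:3]}")

print(f"{len(common)} complete triplets found.")
```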
We welcome any contributions. For suggested changes, please send a pull request to the `develop` branch.