
Commit 9f37d02

Merge pull request #413 from Deep-MI/dev
PR for eTIV fix and new stable release
2 parents 8d6e7ee + 5ab400d commit 9f37d02

91 files changed: +9967 −2748 lines changed

.dockerignore

Lines changed: 10 additions & 0 deletions

@@ -1,2 +1,12 @@
 **/.git
 **/.github
+checkpoints
+Singularity
+images
+venv
+Tutorial
+**/*.md
+Docker/Dockerfile*
+*.txt
+venv
+/srun_fastsurfer.sh

.gitignore

Lines changed: 2 additions & 0 deletions

@@ -0,0 +1,2 @@
+/BUILD.info
+/.idea/**

CONTRIBUTING.md

Lines changed: 47 additions & 0 deletions

@@ -54,6 +54,53 @@ Enhancement suggestions are tracked as [GitHub issues](issues).
 - **Describe the current behavior** and **explain which behavior you expected to see instead** and why.
 - **Explain why this enhancement would be useful** to most users.
 
+## Contributing Code
+
+1. [Fork](https://docs.github.com/en/get-started/quickstart/fork-a-repo) this repository to your GitHub account
+2. Clone your fork to your computer (`git clone https://github.com/<username>/FastSurfer.git`)
+3. Change into the project directory (`cd FastSurfer`)
+4. Add the Deep-MI repo as upstream (`git remote add upstream https://github.com/Deep-MI/FastSurfer.git`)
+5. Update information from upstream (`git fetch upstream`)
+6. Checkout the upstream dev branch (`git checkout -b dev upstream/dev`)
+7. Create your feature branch from dev (`git checkout -b my-new-feature`)
+8. Commit your changes (`git commit -am 'Add some feature'`)
+9. Push the branch to your GitHub fork (`git push origin my-new-feature`)
+10. Create a new pull request on the GitHub web interface from that branch into the Deep-MI **dev branch** (not into stable, which is the default)
+
+If a lot has changed in the meantime or the pull request shows conflicts, you should rebase your branch onto the current upstream dev.
+This is the preferred way, but it is only possible if you are the sole developer of your branch:
+
+10. Switch to the dev branch (`git checkout dev`)
+11. Update your dev branch (`git pull upstream dev`)
+12. Switch to your feature branch (`git checkout my-new-feature`)
+13. Rebase your branch onto dev (`git rebase dev`), resolve conflicts and continue until complete
+14. Force-push the updated feature branch to your GitHub fork (`git push -f origin my-new-feature`)
+
+If other people co-develop the my-new-feature branch, rewriting history with a rebase is not possible.
+Instead you need to merge upstream dev into your branch:
+
+10. Switch to the dev branch (`git checkout dev`)
+11. Update your dev branch (`git pull upstream dev`)
+12. Switch to your feature branch (`git checkout my-new-feature`)
+13. Merge dev into your feature branch (`git merge dev`), resolve conflicts and commit
+14. Push to origin (`git push origin my-new-feature`)
+
+Either method updates the pull request and resolves conflicts, so that we can merge it once it is complete.
+Once the pull request has been merged by us, you can delete the feature branch in your clone and on your fork:
+
+15. Switch to the dev branch (`git checkout dev`)
+16. Delete the feature branch (`git branch -D my-new-feature`)
+17. Delete the branch on your GitHub fork, either via the GUI or via the command line (`git push origin --delete my-new-feature`)
+
+This procedure ensures that your local dev branch always follows our dev branch and never diverges. You can, once in a while, push the dev branch, or similarly update stable and push it to your fork (origin), but that is not strictly necessary.
+
+The next time you contribute a feature, you do not need to repeat steps 1-6 above; simply:
+- Switch to the dev branch (`git checkout dev`)
+- Make sure it is identical to upstream (`git pull upstream dev`)
+- Check out a new feature branch and continue from step 7 above.
+
+If for some reason your dev branch has diverged (which should never happen, as you never commit to it), you can reset it with `git reset --hard upstream/dev`. Make absolutely sure you are on your dev branch (not the feature branch) and be aware that this will delete any local changes!
+
 ## Attribution
 
 This guide is based on the **contributing-gen**. [Make your own](https://github.com/bttger/contributing-gen)!
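Condensed for reference, the one-time setup (steps 1-6) and a single feature cycle from the Contributing Code section added above look like this in the shell. `<username>` and `my-new-feature` are the placeholders used in the text, and the repository must first be forked on GitHub:

# One-time setup (steps 1-6): after forking the repository on GitHub
git clone https://github.com/<username>/FastSurfer.git
cd FastSurfer
git remote add upstream https://github.com/Deep-MI/FastSurfer.git
git fetch upstream
git checkout -b dev upstream/dev

# Each feature (steps 7-10, including the shortcut for later contributions):
git checkout dev
git pull upstream dev
git checkout -b my-new-feature
git commit -am 'Add some feature'
git push origin my-new-feature
# then open a pull request from my-new-feature into the Deep-MI dev branch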

CerebNet/apply_warp.py

Lines changed: 85 additions & 0 deletions

@@ -0,0 +1,85 @@
+
+# Copyright 2022 Image Analysis Lab, German Center for Neurodegenerative Diseases (DZNE), Bonn
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+# IMPORTS
+from os.path import join
+import numpy as np
+import nibabel as nib
+
+from CerebNet.datasets import utils
+
+
+def save_nii_image(img_data, save_path, header, affine):
+    img_out = nib.Nifti1Image(img_data, header=header, affine=affine)
+    print(f"Saving {save_path}")
+    nib.save(img_out, save_path)
+
+
+def store_warped_data(img_path, lbl_path, warp_path, result_path, patch_size):
+
+    img, img_file = utils.load_reorient_rescale_image(img_path)
+
+    lbl_file = nib.load(lbl_path)
+    label = np.asarray(lbl_file.get_fdata(), dtype=np.int16)
+
+    warp_field = np.asarray(nib.load(warp_path).get_fdata())
+    img = utils.map_size(img, base_shape=warp_field.shape[:3])
+    label = utils.map_size(label, base_shape=warp_field.shape[:3])
+    warped_img = utils.apply_warp_field(warp_field, img, interpol_order=3)
+    warped_lbl = utils.apply_warp_field(warp_field, label, interpol_order=0)
+    utils.map_subseg2label(warped_lbl, label_type='cereb_subseg')
+    roi = utils.bounding_volume(label, patch_size)
+
+    img = utils.map_size(warped_img[roi], patch_size)
+    label = utils.map_size(warped_lbl[roi], patch_size)
+
+    img_file.header['dim'][1:4] = patch_size
+    img_file.set_data_dtype(img.dtype)
+    lbl_file.header['dim'][1:4] = patch_size
+    save_nii_image(img,
+                   join(result_path, "T1_warped_cropped.nii.gz"),
+                   header=img_file.header,
+                   affine=img_file.affine)
+    save_nii_image(label,
+                   join(result_path, "label_warped_cropped.nii.gz"),
+                   header=lbl_file.header,
+                   affine=lbl_file.affine)
+
+
+if __name__ == '__main__':
+    import argparse
+    parser = argparse.ArgumentParser()
+    parser.add_argument("--img_path",
+                        help="path to T1 image",
+                        type=str)
+    parser.add_argument("--lbl_path",
+                        help="path to label image",
+                        type=str)
+    parser.add_argument("--result_path",
+                        help="folder to store the results",
+                        type=str)
+
+    parser.add_argument("--warp_filename",
+                        help="Warp field file",
+                        default='1Warp.nii.gz',
+                        type=str)
+
+    args = parser.parse_args()
+    warp_path = join(args.result_path, args.warp_filename)
+    store_warped_data(args.img_path,
+                      args.lbl_path,
+                      warp_path=warp_path,
+                      result_path=args.result_path,
+                      patch_size=(128, 128, 128))
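For orientation, a hypothetical invocation of the new script; only the flags (`--img_path`, `--lbl_path`, `--result_path`, `--warp_filename`) come from the argument parser above, while the paths are made-up placeholders:

# Hypothetical example call (placeholder paths); expects the warp field at
# <result_path>/1Warp.nii.gz and writes T1_warped_cropped.nii.gz and
# label_warped_cropped.nii.gz into the result folder.
python CerebNet/apply_warp.py \
    --img_path /data/subject1/T1.nii.gz \
    --lbl_path /data/subject1/cereb_labels.nii.gz \
    --result_path /data/subject1/warped \
    --warp_filename 1Warp.nii.gz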

CerebNet/datasets/realistic_deformations.sh

Lines changed: 5 additions & 3 deletions

@@ -15,7 +15,7 @@
 
 
 # IMPORTS
-if ["$ANTSPATH" = "" ] || [ -f "$ANTSPATH/antsRegistrationSyNQuick.sh" ]
+if [ "$ANTSPATH" = "" ] || [ -f "$ANTSPATH/antsRegistrationSyNQuick.sh" ]
 then
 exit "environment \$ANTSPATH not defined or invalid. \$ANTSPATH must contain antsRegistrationSyNQuick.sh."
 fi
@@ -114,14 +114,16 @@ output_dir=$moving_dataroot/subj2subj_reg
 mkdir -p $output_dir
 
 if [ "$unlabeled_subject" != "UNDEFINED" ]
+then
 IFS=',' read -r -a unlabeled_subject_array <<< "$unlabeled_subject"
 fi
 
 if [ "$labeled_subject" != "UNDEFINED" ]
+then
 IFS=',' read -r -a labeled_subject_array <<< "$labeled_subject"
 fi
 
-if [ ${#labeled_subject_array} != 1 ] && [ ${#unlabeled_subject_array} != 1 ] && [ ${#labeled_subject_array} != ${#unlabeled_subject_array}]
+if [ ${#labeled_subject_array} != 1 ] && [ ${#unlabeled_subject_array} != 1 ] && [ ${#labeled_subject_array} != ${#unlabeled_subject_array} ]
 then
 exit "invalid parameters for labeled_subject and unlabeled_subject"
 fi
@@ -179,7 +181,7 @@ do
 else
 echo "$result_path$output_warp already exists, skipping $sub1 -> $sub2."
 fi
-elif
+else
 echo "WARNING: $lab_img or $unlab_img does not exist, skipping $sub1 -> $sub2."
 fi
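The middle hunk hinges on bash's `IFS=',' read -r -a` idiom for splitting a comma-separated subject list into an array. A minimal standalone sketch of that pattern, with illustrative values rather than names taken from the script:

#!/bin/bash
# Split a comma-separated list into a bash array and inspect it.
subjects="sub-01,sub-02,sub-03"                    # illustrative input
IFS=',' read -r -a subject_array <<< "$subjects"
echo "count: ${#subject_array[@]}"                 # prints: count: 3
echo "first: ${subject_array[0]}"                  # prints: first: sub-01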

CerebNet/inference.py

Lines changed: 6 additions & 5 deletions

@@ -14,9 +14,9 @@
 
 # IMPORTS
 import time
-from os import makedirs, cpu_count
+from os import makedirs
 from os.path import join, dirname, isfile
-from typing import Dict, List, Tuple, Optional, Literal
+from typing import Dict, List, Tuple, Optional
 from concurrent.futures import Future, ThreadPoolExecutor
 
 import nibabel as nib
@@ -26,6 +26,7 @@
 from tqdm import tqdm
 
 from FastSurferCNN.utils import logging
+from FastSurferCNN.utils.threads import get_num_threads
 from FastSurferCNN.utils.mapper import JsonColorLookupTable, TSVLookupTable
 from FastSurferCNN.utils.common import (
     find_device,
@@ -34,8 +35,8 @@
     NoParallelExecutor,
 )
 from CerebNet.data_loader.augmentation import ToTensorTest
-from CerebNet.data_loader.dataset import SubjectDataset, Plane, LocalizerROI, PLANES
-from CerebNet.datasets.utils import crop_transform, load_reorient_lia
+from CerebNet.data_loader.dataset import SubjectDataset, Plane, PLANES
+from CerebNet.datasets.utils import crop_transform
 from CerebNet.models.networks import build_model
 from CerebNet.utils import checkpoint as cp
 
@@ -64,7 +65,7 @@ def __init__(
         self.pool = None
         self._threads = None
         self.threads = threads
-        torch.set_num_threads(cpu_count() if self._threads is None else self._threads)
+        torch.set_num_threads(get_num_threads() if self._threads is None else self._threads)
         self.pool = (
             ThreadPoolExecutor(self._threads) if async_io else NoParallelExecutor()
         )
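The import swap from `os.cpu_count()` to the project's `get_num_threads()` helper presumably makes the default thread count respect the CPU set actually granted to the process (for example under Slurm or cgroups) rather than the machine-wide total; that reading is an assumption, since the helper's implementation is not part of this diff. The distinction itself is easy to demonstrate in a shell, assuming GNU coreutils and util-linux are available:

nproc --all           # all installed processing units, roughly what os.cpu_count() reports
nproc                 # units available to the current process (honours CPU affinity)
taskset -c 0-3 nproc  # with affinity restricted to 4 CPUs this prints 4, even on a larger machine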
