Commit 7aeb60c

Fix training bug in multi-GPU mode
1 parent: 1049c66

File tree

2 files changed: +5 −1 lines changed


configs/rec/rec_latex_ocr.yml

Lines changed: 1 addition & 0 deletions

@@ -19,6 +19,7 @@ Global:
   rec_char_dict_path: ppocr/utils/dict/latex_ocr_tokenizer.json
   save_res_path: ./output/rec/predicts_latexocr.txt
   d2s_train_image_shape: [1,256,256]
+  find_unused_parameters: True
 
 Optimizer:
   name: AdamW

tools/train.py

Lines changed: 4 additions & 1 deletion

@@ -217,7 +217,10 @@ def main(config, device, logger, vdl_writer, seed):
     )
 
     if config["Global"]["distributed"]:
-        model = paddle.DataParallel(model)
+        find_unused_parameters = config["Global"].get("find_unused_parameters", False)
+        model = paddle.DataParallel(
+            model, find_unused_parameters=find_unused_parameters
+        )
     # start train
     program.train(
         config,
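The change reads an optional `find_unused_parameters` flag from the `Global` config section and forwards it to `paddle.DataParallel`. With that flag set, the distributed gradient synchronizer tolerates parameters that receive no gradient in an iteration (common when a model has conditional branches), instead of stalling during all-reduce; because the flag defaults to `False`, configs that omit it keep the old behavior. A minimal sketch of just the config-lookup logic, with a plain dict standing in for the parsed YAML config (`wrap_for_distributed` is a hypothetical helper name; the real code inlines this in `main` and needs no Paddle here):

```python
def wrap_for_distributed(model, global_config):
    # Mirror the commit's logic: read the optional flag with a False default,
    # so existing configs that never set the key are unaffected.
    find_unused_parameters = global_config.get("find_unused_parameters", False)
    # In tools/train.py this value is passed to:
    #   paddle.DataParallel(model, find_unused_parameters=find_unused_parameters)
    # Here we just return the resolved flag so the fallback is visible.
    return find_unused_parameters

# The LaTeX-OCR config sets the flag explicitly:
assert wrap_for_distributed(object(), {"find_unused_parameters": True}) is True
# Any config that omits the key keeps the old default:
assert wrap_for_distributed(object(), {}) is False
```

The `dict.get(key, default)` pattern is what makes the change backward compatible: only `configs/rec/rec_latex_ocr.yml` opts in, and every other recipe continues to construct `DataParallel` with the default `False`.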
