From 784fba44d14e17896338e9f8d75b7f3ccf27584f Mon Sep 17 00:00:00 2001
From: Genius Patrick <74176172+geniuspatrick@users.noreply.github.com>
Date: Fri, 2 Jun 2023 17:35:51 +0800
Subject: [PATCH] docs: set long_description to the contents of README.md as
the description on PyPI (#669)
---
README.md | 348 +++++++-----------
README_CN.md | 272 ++++++--------
RELEASE.md | 93 ++++-
docs/en/index.md | 66 ++--
docs/zh/index.md | 21 +-
.../data}/imagenet1000_clsidx_to_labels.txt | 0
infer.py | 2 +-
setup.py | 13 +-
tutorials/README.md | 1 -
tutorials/data/test/dog/dog.jpg | Bin 33315 -> 0 bytes
10 files changed, 406 insertions(+), 410 deletions(-)
rename {tutorials => examples/data}/imagenet1000_clsidx_to_labels.txt (100%)
delete mode 100644 tutorials/README.md
delete mode 100644 tutorials/data/test/dog/dog.jpg
diff --git a/README.md b/README.md
index 6b7dd28c1..59260420c 100644
--- a/README.md
+++ b/README.md
@@ -1,4 +1,4 @@
-
+
# MindCV
@@ -20,83 +20,48 @@ English | [中文](README_CN.md)
[Get Started](#get-started) |
[Tutorials](#tutorials) |
[Model List](#model-list) |
-[Supported Algorithms](#supported-algorithms) |
-[Notes](#notes)
+[Supported Algorithms](#supported-algorithms)
## Introduction
+
MindCV is an open-source toolbox for computer vision research and development based on [MindSpore](https://www.mindspore.cn/en). It collects a series of classic and SoTA vision models, such as ResNet and SwinTransformer, along with their pre-trained weights and training strategies. SoTA methods such as auto augmentation are also provided for performance improvement. With the decoupled module design, it is easy to apply or adapt MindCV to your own CV tasks.
-
- Major Features
+### Major Features
- **Easy-to-Use.** MindCV decomposes the vision framework into various configurable components. It is easy to customize your data pipeline, models, and learning pipeline with MindCV:
-```python
->>> import mindcv
-# create a dataset
->>> dataset = mindcv.create_dataset('cifar10', download=True)
-# create a model
->>> network = mindcv.create_model('resnet50', pretrained=True)
-```
+ ```pycon
+ >>> import mindcv
+ # create a dataset
+ >>> dataset = mindcv.create_dataset('cifar10', download=True)
+ # create a model
+ >>> network = mindcv.create_model('resnet50', pretrained=True)
+ ```
-Users can customize and launch their transfer learning or training task in one command line.
+ Users can customize and launch their transfer learning or training task in one command line.
-``` python
-# transfer learning in one command line
->>> !python train.py --model=swin_tiny --pretrained --opt=adamw --lr=0.001 --data_dir={data_dir}
-```
+ ```shell
+ # transfer learning in one command line
+ python train.py --model=swin_tiny --pretrained --opt=adamw --lr=0.001 --data_dir=/path/to/data
+ ```
- **State-of-The-Art.** MindCV provides various CNN-based and Transformer-based vision models including SwinTransformer. Their pretrained weights and performance reports are provided to help users select and reuse the right model:
-- **Flexibility and efficiency.** MindCV is built on MindSpore which is an efficent DL framework that can be run on different hardware platforms (GPU/CPU/Ascend). It supports both graph mode for high efficiency and pynative mode for flexibility.
-
-
-
+- **Flexibility and efficiency.** MindCV is built on MindSpore which is an efficient DL framework that can be run on different hardware platforms (GPU/CPU/Ascend). It supports both graph mode for high efficiency and pynative mode for flexibility.
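Building on the `create_dataset`/`create_model` calls shown under the Easy-to-Use feature above, here is a minimal sketch of a sanity-check forward pass; it is not part of the patch, and the random 224x224 input is an illustrative assumption:

```python
# Minimal sketch: sanity-check forward pass with a model created via mindcv.
# The input shape (1, 3, 224, 224) is an assumption chosen for illustration.
import numpy as np
import mindspore as ms
import mindcv

network = mindcv.create_model("resnet50", pretrained=False)  # set pretrained=True to load weights
network.set_train(False)                                      # inference mode
dummy = ms.Tensor(np.random.rand(1, 3, 224, 224).astype(np.float32))
logits = network(dummy)
print(logits.shape)  # (1, 1000) for an ImageNet-1k classifier head
```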
-### Benchmark Results
+## Model Zoo
-The performance of the models trained with MindCV is summarized in [benchmark_results.md](./benchmark_results.md), where the training recipes and weights are both available.
+The performance of the models trained with MindCV is summarized [here](https://mindspore-lab.github.io/mindcv/modelzoo/), where the training recipes and weights are both available.
-Model introduction and training details can be viewed in each subfolder under [configs](configs).
+Model introduction and training details can be viewed in each sub-folder under [configs](configs).
## Installation
-### Dependency
-
-- mindspore >= 1.8.1
-- numpy >= 1.17.0
-- pyyaml >= 5.3
-- tqdm
-- openmpi 4.0.3 (for distributed mode)
-
-To install the dependency, please run
-```shell
-pip install -r requirements.txt
-```
-
-MindSpore can be easily installed by following the official [instructions](https://www.mindspore.cn/install) where you can select your hardware platform for the best fit. To run in distributed mode, [openmpi](https://www.open-mpi.org/software/ompi/v4.0/) is required to install.
-
-The following instructions assume the desired dependency is fulfilled.
+See [Installation](https://mindspore-lab.github.io/mindcv/installation/) for details.
-### Install with PyPI
-
-The released version of MindCV can be installed via `PyPI` as follows:
-```shell
-pip install mindcv
-```
-
-### Install from Source
-
-The latest version of MindCV can be installed as follows:
-```shell
-pip install git+https://github.com/mindspore-lab/mindcv.git
-```
-
-> Notes: MindCV can be installed on Linux and Mac but not on Windows currently.
-
-## Get Started
+## Getting Started
### Hands-on Tutorial
@@ -104,7 +69,7 @@ To get started with MindCV, please see the [Quick Start](docs/en/tutorials/quick
Below are a few code snippets for your taste.
-```python
+```pycon
>>> import mindcv
# List and find a pretrained vision model
>>> mindcv.list_models("swin*", pretrained=True)
@@ -118,17 +83,19 @@ Below are a few code snippets for your taste.
**Image classification demo**
+Right-click on the image below and save it as `dog.jpg`.
+
-Infer the input image with a pretrained SoTA model,
+Classify the downloaded image with a pretrained SoTA model:
-```python
->>> !python infer.py --model=swin_tiny --image_path='./tutorials/data/test/dog/dog.jpg'
+```pycon
+>>> !python infer.py --model=swin_tiny --image_path='./dog.jpg'
{'Labrador retriever': 0.5700152, 'golden retriever': 0.034551315, 'kelpie': 0.010108651, 'Chesapeake Bay retriever': 0.008229004, 'Walker hound, Walker foxhound': 0.007791956}
```
-The top-1 prediction result is labrador retriever (拉布拉多犬), which is the breed of this cut dog.
+The top-1 prediction result is Labrador retriever, which is the breed of this cute dog.
### Training
@@ -136,85 +103,92 @@ It is easy to train your model on a standard or customized dataset using `train.
- Standalone Training
-``` shell
-# standalone training
-python train.py --model=resnet50 --dataset=cifar10 --dataset_download
-```
+ ```shell
+ # standalone training
+ python train.py --model=resnet50 --dataset=cifar10 --dataset_download
+ ```
-Above is an example for training ResNet50 on CIFAR10 dataset on a CPU/GPU/Ascend device
+ Above is an example of training ResNet50 on the CIFAR10 dataset on a CPU/GPU/Ascend device.
- Distributed Training
-For large datasets like ImageNet, it is necessary to do training in distributed mode on multiple devices. This can be achieved with `mpirun` and parallel features supported by MindSpore.
+ For large datasets like ImageNet, it is necessary to do training in distributed mode on multiple devices. This can be achieved with `mpirun` and parallel features supported by MindSpore.
-```shell
-# distributed training
-# assume you have 4 GPUs/NPUs
-mpirun -n 4 python train.py --distribute \
- --model=densenet121 --dataset=imagenet --data_dir=/path/to/imagenet
-```
-> Notes: If the script is executed by the root user, the `--allow-run-as-root` parameter must be added to `mpirun`.
+ ```shell
+ # distributed training
+ # assume you have 4 GPUs/NPUs
+ mpirun -n 4 python train.py --distribute \
+ --model=densenet121 --dataset=imagenet --data_dir=/path/to/imagenet
+ ```
+ > Notes: If the script is executed by the root user, the `--allow-run-as-root` parameter must be added to `mpirun`.
-Detailed parameter definitions can be seen in `config.py` and checked by running `python train.py --help'.
+ Detailed parameter definitions can be seen in `config.py` and checked by running `python train.py --help`.
-To resume training, please set the `--ckpt_path` and `--ckpt_save_dir` arguments. The optimizer state including the learning rate of the last stopped epoch will also be recovered.
+ To resume training, please set the `--ckpt_path` and `--ckpt_save_dir` arguments. The optimizer state including the learning rate of the last stopped epoch will also be recovered.
- Config and Training Strategy
-You can configure your model and other components either by specifying external parameters or by writing a yaml config file. Here is an example of training using a preset yaml file.
+ You can configure your model and other components either by specifying external parameters or by writing a yaml config file. Here is an example of training using a preset yaml file.
-```shell
-mpirun --allow-run-as-root -n 4 python train.py -c configs/squeezenet/squeezenet_1.0_gpu.yaml
-```
+ ```shell
+ mpirun --allow-run-as-root -n 4 python train.py -c configs/squeezenet/squeezenet_1.0_gpu.yaml
+ ```
-**Pre-defined Training Strategies:** We provide more than 20 training recipes that achieve SoTA results on ImageNet currently. Please look into the [`configs`](configs) folder for details. Please feel free to adapt these training strategies to your own model for performance improvement, which can be easily done by modifying the yaml file.
+ **Pre-defined Training Strategies:**
+ We currently provide more than 20 training recipes that achieve SoTA results on ImageNet.
+ Please look into the [`configs`](configs) folder for details.
+ Please feel free to adapt these training strategies to your own model for performance improvement, which can be easily done by modifying the yaml file.
- Train on ModelArts/OpenI Platform
-To run training on the [ModelArts](https://www.huaweicloud.com/intl/en-us/product/modelarts.html) or [OpenI](https://openi.pcl.ac.cn/) cloud platform:
+ To run training on the [ModelArts](https://www.huaweicloud.com/intl/en-us/product/modelarts.html) or [OpenI](https://openi.pcl.ac.cn/) cloud platform:
+
+ ```text
+ 1. Create a new training task on the cloud platform.
+ 2. Add run parameter `config` and specify the path to the yaml config file on the website UI interface.
+ 3. Add run parameter `enable_modelarts` and set True on the website UI interface.
+ 4. Fill in other blanks on the website and launch the training task.
+ ```
+
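For the Distributed Training item above, the following is a rough sketch of the MindSpore-side setup that `mpirun -n 4 python train.py --distribute` relies on; it is an assumption about what `--distribute` configures, not the repo's actual `train.py`:

```python
# Rough sketch of data-parallel initialization when launched via mpirun/OpenMPI.
from mindspore import context
from mindspore.communication import init, get_rank, get_group_size

init()  # attach to the communication world created by mpirun
context.set_auto_parallel_context(
    parallel_mode=context.ParallelMode.DATA_PARALLEL,  # replicate the model, shard the batches
    gradients_mean=True,                               # average gradients across devices
    device_num=get_group_size(),
)
print(f"rank {get_rank()} of {get_group_size()} initialized")
```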
+**Graph Mode and PyNative Mode**:
+
+By default, the training pipeline `train.py` is run in [graph mode](https://www.mindspore.cn/tutorials/zh-CN/r1.8/advanced/pynative_graph/mode.html#%E9%9D%99%E6%80%81%E5%9B%BE) on MindSpore, which is optimized for efficiency and parallel computing with a compiled static graph.
+In contrast, [pynative mode](https://www.mindspore.cn/tutorials/zh-CN/r1.8/advanced/pynative_graph/mode.html#%E5%8A%A8%E6%80%81%E5%9B%BE) is optimized for flexibility and easy debugging. You may alter the parameter `--mode` to switch to pure pynative mode for debugging purposes.
+
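As a small illustration of the mode switch described above (assuming `--mode` maps onto MindSpore's context setting):

```python
# Graph mode compiles a static graph; pynative mode executes eagerly for debugging.
from mindspore import context

context.set_context(mode=context.GRAPH_MODE)       # efficient compiled execution (default)
# context.set_context(mode=context.PYNATIVE_MODE)  # eager execution for step-by-step debugging
```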
+**Mixed Mode**:
+
+[Pynative mode with ms_function](https://www.mindspore.cn/tutorials/zh-CN/r1.8/advanced/pynative_graph/combine.html) is a mixed mode that combines flexibility and efficiency in MindSpore. To apply pynative mode with ms_function for training, please run `train_with_func.py`, e.g.,
-```text
-1. Create a new training task on the cloud platform.
-2. Add run parameter `config` and specify the path to the yaml config file on the website UI interface.
-3. Add run parameter `enable_modelarts` and set True on the website UI interface.
-4. Fill in other blanks on the website and launch the training task.
+```shell
+python train_with_func.py --model=resnet50 --dataset=cifar10 --dataset_download --epoch_size=10
```
+> Note: this is an **experimental** function under improvement. It is not stable on MindSpore 1.8.1 or earlier versions.
+
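A toy example of this mixed mode follows; it is not taken from `train_with_func.py`, and the function and tensors below are illustrative only:

```python
# The decorated function is compiled into a graph while the surrounding code
# keeps running in pynative mode.
import numpy as np
import mindspore as ms
from mindspore import context, ms_function

context.set_context(mode=context.PYNATIVE_MODE)

@ms_function
def scaled_sum(x, y):
    return (x + y) * 2

a = ms.Tensor(np.ones((2, 2), np.float32))
b = ms.Tensor(np.ones((2, 2), np.float32))
print(scaled_sum(a, b))
```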
### Validation
-To evalute the model performance, please run `validate.py`
+To evaluate the model performance, please run `validate.py`
```shell
# validate a trained checkpoint
python validate.py --model=resnet50 --dataset=imagenet --data_dir=/path/to/data --ckpt_path=/path/to/model.ckpt
```
-- Validation while Training
+**Validation while Training**
You can also track the validation accuracy during training by enabling the `--val_while_train` option.
```shell
python train.py --model=resnet50 --dataset=cifar10 \
- --val_while_train --val_split=test --val_interval=1
+ --val_while_train --val_split=test --val_interval=1
```
The training loss and validation accuracy for each epoch will be saved in `{ckpt_save_dir}/results.log`.
-More examples about training and validation can be seen in [examples/scripts](examples/scripts).
-
-- Graph Mode and Pynative Mode
-
-By default, the training pipeline `train.py` is run in [graph mode](https://www.mindspore.cn/tutorials/zh-CN/r1.8/advanced/pynative_graph/mode.html#%E9%9D%99%E6%80%81%E5%9B%BE) on MindSpore, which is optimized for efficiency and parallel computing with a compiled static graph. In contrast, [pynative mode](https://www.mindspore.cn/tutorials/zh-CN/r1.8/advanced/pynative_graph/mode.html#%E5%8A%A8%E6%80%81%E5%9B%BE) is optimized for flexibility and easy debugging. You may alter the parameter `--mode` to switch to pure pynative mode for debugging purpose.
-
-[Pynative mode with ms_function ](https://www.mindspore.cn/tutorials/zh-CN/r1.8/advanced/pynative_graph/combine.html) is a mixed mode for comprising flexibility and efficiency in MindSpore. To apply pynative mode with ms_function for training, please run `train_with_func.py`, e.g.,
-
-``` shell
-python train_with_func.py --model=resnet50 --dataset=cifar10 --dataset_download --epoch_size=10
-```
->Note: this is an **experimental** function under improvement. It is not stable on MindSpore 1.8.1 or earlier versions.
-
+More examples about training and validation can be seen in [examples](examples/scripts).
## Tutorials
+
We provide the following jupyter notebook tutorials to help users learn to use MindCV.
- [Learn about configs](docs/en/tutorials/configuration.md)
@@ -228,7 +202,7 @@ We provide the following jupyter notebook tutorials to help users learn to use M
Currently, MindCV supports the model families listed below. More models with pre-trained weights are under development and will be released soon.
-
+
Supported models
* Big Transfer ResNetV2 (BiT) - https://arxiv.org/abs/1912.11370
@@ -272,134 +246,86 @@ Please see [configs](./configs) for the details about model performance and pret
## Supported Algorithms
-
+
+
Supported algorithms
* Augmentation
- * [AutoAugment](https://arxiv.org/abs/1805.09501)
- * [RandAugment](https://arxiv.org/abs/1909.13719)
- * [Repeated Augmentation](https://openaccess.thecvf.com/content_CVPR_2020/papers/Hoffer_Augment_Your_Batch_Improving_Generalization_Through_Instance_Repetition_CVPR_2020_paper.pdf)
- * RandErasing (Cutout)
- * CutMix
- * Mixup
- * RandomResizeCrop
- * Color Jitter, Flip, etc
+ * [AutoAugment](https://arxiv.org/abs/1805.09501)
+ * [RandAugment](https://arxiv.org/abs/1909.13719)
+ * [Repeated Augmentation](https://openaccess.thecvf.com/content_CVPR_2020/papers/Hoffer_Augment_Your_Batch_Improving_Generalization_Through_Instance_Repetition_CVPR_2020_paper.pdf)
+ * RandErasing (Cutout)
+ * CutMix
+ * MixUp
+ * RandomResizeCrop
+ * Color Jitter, Flip, etc
* Optimizer
- * Adam
- * Adamw
- * [Lion](https://arxiv.org/abs/2302.06675)
- * Adan (experimental)
- * AdaGrad
- * LAMB
- * Momentum
- * RMSProp
- * SGD
- * NAdam
+ * Adam
+ * AdamW
+ * [Lion](https://arxiv.org/abs/2302.06675)
+ * Adan (experimental)
+ * AdaGrad
+ * LAMB
+ * Momentum
+ * RMSProp
+ * SGD
+ * NAdam
* LR Scheduler
- * Warmup Cosine Decay
- * Step LR
- * Polynomial Decay
- * Exponential Decay
+ * Warmup Cosine Decay
+ * Step LR
+ * Polynomial Decay
+ * Exponential Decay
* Regularization
- * Weight Decay
- * Label Smoothing
- * Stochastic Depth (depends on networks)
- * Dropout (depends on networks)
+ * Weight Decay
+ * Label Smoothing
+ * Stochastic Depth (depends on networks)
+ * Dropout (depends on networks)
* Loss
- * Cross Entropy (w/ class weight and auxiliary logit support)
- * Binary Cross Entropy (w/ class weight and auxiliary logit support)
- * Soft Cross Entropy Loss (automatically enabled if mixup or label smoothing is used)
- * Soft Binary Cross Entropy Loss (automatically enabled if mixup or label smoothing is used)
+ * Cross Entropy (w/ class weight and auxiliary logit support)
+ * Binary Cross Entropy (w/ class weight and auxiliary logit support)
+ * Soft Cross Entropy Loss (automatically enabled if mixup or label smoothing is used)
+ * Soft Binary Cross Entropy Loss (automatically enabled if mixup or label smoothing is used)
* Ensemble
- * Warmup EMA (Exponential Moving Average)
+ * Warmup EMA (Exponential Moving Average)
+
-## Notes
-### What is New
-- 2023/04/28
-1. Add some new models, listed as following
- - [VGG](configs/vgg)
- - [DPN](configs/dpn)
- - [ResNet v2](configs/resnetv2)
- - [MnasNet](configs/mnasnet)
- - [MixNet](configs/mixnet)
- - [RepVGG](configs/repvgg)
- - [ConvNeXt](configs/convnext)
- - [Swin Transformer](configs/swintransformer)
- - [EdgeNeXt](configs/edgenext)
- - [CrossViT](configs/crossvit)
- - [XCiT](configs/xcit)
- - [CoAT](configs/coat)
- - [PiT](configs/pit)
- - [PVT v2](configs/pvt_v2)
- - [MobileViT](configs/mobilevit)
-2. Bug fix:
- - Setting the same random seed for each rank
- - Checking if options from yaml config exist in argument parser
- - Initializing flag variable as `Tensor` in Optimizer `Adan`
-
-- 2023/03/25
-1. Update checkpoints for pretrained ResNet for better accuracy
- - ResNet18 (from 70.09 to 70.31 @Top1 accuracy)
- - ResNet34 (from 73.69 to 74.15 @Top1 accuracy)
- - ResNet50 (from 76.64 to 76.69 @Top1 accuracy)
- - ResNet101 (from 77.63 to 78.24 @Top1 accuracy)
- - ResNet152 (from 78.63 to 78.72 @Top1 accuracy)
-2. Rename checkpoint file name to follow naming rule ({model_scale-sha256sum.ckpt}) and update download URLs.
-
-- 2023/03/05
-1. Add Lion (EvoLved Sign Momentum) optimizer from paper https://arxiv.org/abs/2302.06675
- - To replace adamw with lion, LR is usually 3-10x smaller, and weight decay is usually 3-10x larger than adamw.
-2. Add 6 new models with training recipes and pretrained weights for
- - [HRNet](configs/hrnet)
- - [SENet](configs/senet)
- - [GoogLeNet](configs/googlenet)
- - [Inception V3](configs/inception_v3)
- - [Inception V4](configs/inception_v4)
- - [Xception](configs/xception)
-3. Support gradient clip
-4. Arg name `use_ema` changed to **`ema`**, add `ema: True` in yaml to enable EMA.
-
-- 2023/01/10
-1. MindCV v0.1 released! It can be installed via PyPI `pip install mindcv` now.
-2. Add training recipe and trained weights of googlenet, inception_v3, inception_v4, xception
-
-- 2022/12/09
-1. Support lr warmup for all lr scheduling algorithms besides cosine decay.
-2. Add repeated augmentation, which can be enabled by setting `--aug_repeats` to be a value larger than 1 (typically, 3 or 4 is a common choice).
-3. Add EMA.
-4. Improve BCE loss to support mixup/cutmix.
-
-- 2022/11/21
-1. Add visualization for loss and acc curves
-2. Support epochwise lr warmup cosine decay (previous is stepwise)
-- 2022/11/09
-1. Add 7 pretrained ViT models.
-2. Add RandAugment augmentation.
-3. Fix CutMix efficiency issue and CutMix and Mixup can be used together.
-4. Fix lr plot and scheduling bug.
-- 2022/10/12
-1. Both BCE and CE loss now support class-weight config, label smoothing, and auxiliary logit input (for networks like inception).
-- 2022/09/13
-1. Add Adan optimizer (experimental)
-
-### How to Contribute
+## What is New
+
+- 2023/5/30
+1. New Models:
+ - AMP(O2) version of [VGG](configs/vgg)
+ - [GhostNet](configs/ghostnet)
+ - AMP(O3) version of [MobileNetV2](configs/mobilenetv2) and [MobileNetV3](configs/mobilenetv3)
+ - (x,y)_(200,400,600,800)mf of [RegNet](configs/regnet)
+ - b1g2, b1g4 & b2g4 of [RepVGG](configs/repvgg)
+ - 0.5 of [MnasNet](configs/mnasnet)
+ - b3 & b4 of [PVTv2](configs/pvt_v2)
+2. New Features:
+ - 3-Augment, Augmix, TrivialAugmentWide
+3. Bug Fixes:
+ - ViT pooling mode
+
+See [RELEASE](RELEASE.md) for detailed history.
+
+## How to Contribute
We appreciate all kinds of contributions, including issues and PRs, to make MindCV better.
-Please refer to [CONTRIBUTING.md](CONTRIBUTING.md) for the contributing guideline. Please follow the [Model Template and Guideline](docs/en/how_to_guides/write_a_new_model.md) for contributing a model that fits the overall interface :)
+Please refer to [CONTRIBUTING.md](CONTRIBUTING.md) for the contributing guideline.
+Please follow the [Model Template and Guideline](docs/en/how_to_guides/write_a_new_model.md) for contributing a model that fits the overall interface :)
-### License
+## License
This project follows the [Apache License 2.0](LICENSE.md) open-source license.
-### Acknowledgement
+## Acknowledgement
MindCV is an open-source project jointly developed by the MindSpore team, Xidian University, and Xi'an Jiaotong University.
Sincere thanks to all participating researchers and developers for their hard work on this project.
We also acknowledge the computing resources provided by [OpenI](https://openi.pcl.ac.cn/).
-### Citation
+## Citation
If you find this project useful in your research, please consider citing:
diff --git a/README_CN.md b/README_CN.md
index 7405d2c68..4fc4aec12 100644
--- a/README_CN.md
+++ b/README_CN.md
@@ -1,4 +1,4 @@
-
+
# MindCV
@@ -20,87 +20,46 @@
[快速入门](#快速入门) |
[教程](#教程) |
[模型列表](#模型列表) |
-[支持算法](#支持算法) |
-[日志](#日志)
+[支持算法](#支持算法)
## 简介
-MindCV是一个基于 [MindSpore](https://www.mindspore.cn/)
-开发的,致力于计算机视觉相关技术研发的开源工具箱。它提供大量的计算机视觉领域的经典模型和SoTA模型以及它们的预训练权重和训练策略。同时,还提供了自动增强等SoTA算法来提高模型性能。通过解耦的模块设计,您可以轻松地将MindCV应用到您自己的CV任务中。
+MindCV是一个基于 [MindSpore](https://www.mindspore.cn/) 开发的,致力于计算机视觉相关技术研发的开源工具箱。它提供大量的计算机视觉领域的经典模型和SoTA模型以及它们的预训练权重和训练策略。同时,还提供了自动增强等SoTA算法来提高模型性能。通过解耦的模块设计,您可以轻松地将MindCV应用到您自己的CV任务中。
-
- 主要特性
+### 主要特性
- **高易用性** MindCV将视觉任务分解为各种可配置的组件,用户可以轻松地构建自己的数据处理和模型训练流程。
-```python
->>> import mindcv
-# 创建数据集
->>> dataset = mindcv.create_dataset('cifar10', download=True)
-# 创建模型
->>> network = mindcv.create_model('resnet50', pretrained=True)
-```
+ ```pycon
+ >>> import mindcv
+ # 创建数据集
+ >>> dataset = mindcv.create_dataset('cifar10', download=True)
+ # 创建模型
+ >>> network = mindcv.create_model('resnet50', pretrained=True)
+ ```
-用户可通过预定义的训练和微调脚本,快速配置并完成训练或迁移学习任务。
+ 用户可通过预定义的训练和微调脚本,快速配置并完成训练或迁移学习任务。
-```shell
-# 配置和启动迁移学习任务
-python train.py --model swin_tiny --pretrained --opt=adamw --lr=0.001 --data_dir=/path/to/dataset
-```
+ ```shell
+ # 配置和启动迁移学习任务
+ python train.py --model swin_tiny --pretrained --opt=adamw --lr=0.001 --data_dir=/path/to/dataset
+ ```
- **高性能** MindCV集成了大量基于CNN和Transformer的高性能模型,如SwinTransformer,并提供预训练权重、训练策略和性能报告,帮助用户快速选型并将其应用于视觉模型。
- **灵活高效** MindCV基于高效的深度学习框架MindSpore开发,具有自动并行和自动微分等特性,支持不同硬件平台上(CPU/GPU/Ascend),同时支持效率优化的静态图模式和调试灵活的动态图模式。
-
-
-### 性能结果
+## 模型支持
-
-基于MindCV进行模型实现和重训练的汇总结果详见[benchmark_results.md](./benchmark_results.md), 所用到的训练策略和训练后的模型权重均可通过表中链接获取。
+基于MindCV进行模型实现和重训练的汇总结果详见[模型仓库](https://mindspore-lab.github.io/mindcv/zh/modelzoo/), 所用到的训练策略和训练后的模型权重均可通过表中链接获取。
各模型讲解和训练说明详见[configs](configs)目录。
-
## 安装
-### 依赖
-
-- mindspore >= 1.8.1
-- numpy >= 1.17.0
-- pyyaml >= 5.3
-- tqdm
-- openmpi 4.0.3 (分布式模式需要使用)
-
-运行以下脚本,安装相关依赖。
-
-```shell
-pip install -r requirements.txt
-```
-
-用户可遵从[官方指导](https://www.mindspore.cn/install) 并根据自身使用的硬件平台选择最适合您的MindSpore版本来进行安装。如果需要在分布式条件下使用,还需安装[openmpi](https://www.open-mpi.org/software/ompi/v4.0/) 。
-
-之后的说明将默认用户已正确安装好相关依赖。
-
-### PyPI安装
-
-MindCV的已发布版本可以通过PyPI安装。
-
-```shell
-pip install mindcv
-```
-
-### 源码安装
-
-Git上最新的MindCV可以通过以下指令安装。
-
-```shell
-pip install git+https://github.com/mindspore-lab/mindcv.git
-```
-
-> 注:MindCV可以在Linux和Mac系统安装,但是目前还不能在Windows系统上安装。
+详情请见[安装](https://mindspore-lab.github.io/mindcv/zh/installation/)页面。
## 快速入门
@@ -110,7 +69,7 @@ pip install git+https://github.com/mindspore-lab/mindcv.git
以下是一些供您快速体验的代码样例。
-```python
+```pycon
>>> import mindcv
# 列出满足条件的预训练模型名称
>>> mindcv.list_models("swin*", pretrained=True)
@@ -118,78 +77,94 @@ pip install git+https://github.com/mindspore-lab/mindcv.git
# 创建模型
>>> network = mindcv.create_model('swin_tiny', pretrained=True)
# 验证模型的准确率
->>> !python validate.py - -model = swin_tiny - -pretrained - -dataset = imagenet - -val_split = validation
+>>> !python validate.py --model=swin_tiny --pretrained --dataset=imagenet --val_split=validation
{'Top_1_Accuracy': 0.808343989769821, 'Top_5_Accuracy': 0.9527253836317136, 'loss': 0.8474242982580839}
```
**图片分类示例**
+右键点击如下图片,另存为`dog.jpg`。
+
-使用加载了预训练参数的SoTA模型对一张图片进行推理。
+使用加载了预训练参数的SoTA模型对图片进行推理。
-```python
->>> !python infer.py - -model = swin_tiny - -image_path = './tutorials/data/test/dog/dog.jpg'
-{'Labrador retriever': 0.5700152, 'golden retriever': 0.034551315, 'kelpie': 0.010108651,
- 'Chesapeake Bay retriever': 0.008229004, 'Walker hound, Walker foxhound': 0.007791956}
+```pycon
+>>> !python infer.py --model=swin_tiny --image_path='./dog.jpg'
+{'Labrador retriever': 0.5700152, 'golden retriever': 0.034551315, 'kelpie': 0.010108651, 'Chesapeake Bay retriever': 0.008229004, 'Walker hound, Walker foxhound': 0.007791956}
```
预测结果排名前1的是拉布拉多犬,正是这张图片里的狗狗的品种。
### 模型训练
-通过`train.py`,用户可以很容易地在标准数据集或自定义数据集上训练模型,用户可以通过外部变量或者yaml配文件来设置训练策略(如数据增强、学习路策略)。
+通过`train.py`,用户可以很容易地在标准数据集或自定义数据集上训练模型,用户可以通过外部变量或者yaml配置文件来设置训练策略(如数据增强、学习率策略)。
- 单卡训练
-```shell
-# 单卡训练
-python train.py --model resnet50 --dataset cifar10 --dataset_download
-```
+ ```shell
+ # 单卡训练
+ python train.py --model resnet50 --dataset cifar10 --dataset_download
+ ```
-以上代码是在CIFAR10数据集上单卡(CPU/GPU/Ascend)训练ResNet的示例,通过`model`和`dataset`参数分别指定需要训练的模型和数据集。
+ 以上代码是在CIFAR10数据集上单卡(CPU/GPU/Ascend)训练ResNet的示例,通过`model`和`dataset`参数分别指定需要训练的模型和数据集。
- 分布式训练
-对于像ImageNet这样的大型数据集,有必要在多个设备上以分布式模式进行训练。基于MindSpore对分布式相关功能的良好支持,用户可以使用`mpirun`来进行模型的分布式训练。
+ 对于像ImageNet这样的大型数据集,有必要在多个设备上以分布式模式进行训练。基于MindSpore对分布式相关功能的良好支持,用户可以使用`mpirun`来进行模型的分布式训练。
-```shell
-# 分布式训练
-# 假设你有4张GPU或者NPU卡
-mpirun --allow-run-as-root -n 4 python train.py --distribute \
- --model densenet121 --dataset imagenet --data_dir ./datasets/imagenet
-```
+ ```shell
+ # 分布式训练
+ # 假设你有4张GPU或者NPU卡
+ mpirun --allow-run-as-root -n 4 python train.py --distribute \
+ --model densenet121 --dataset imagenet --data_dir ./datasets/imagenet
+ ```
-完整的参数列表及说明在`config.py`中定义,可运行`python train.py --help`快速查看。
-
-如需恢复训练,请指定`--ckpt_path`和`--ckpt_save_dir`参数,脚本将加载路径中的模型权重和优化器状态,并恢复中断的训练进程。
+ 完整的参数列表及说明在`config.py`中定义,可运行`python train.py --help`快速查看。
+ 如需恢复训练,请指定`--ckpt_path`和`--ckpt_save_dir`参数,脚本将加载路径中的模型权重和优化器状态,并恢复中断的训练进程。
- 超参配置和预训练策略
+ 您可以编写yaml文件或设置外部参数来指定配置数据、模型、优化器等组件及其超参。以下是使用预设的训练策略(yaml文件)进行模型训练的示例。
-您可以编写yaml文件或设置外部参数来指定配置数据、模型、优化器等组件及其超参。以下是使用预设的训练策略(yaml文件)进行模型训练的示例。
+ ```shell
+ mpirun --allow-run-as-root -n 4 python train.py -c configs/squeezenet/squeezenet_1.0_gpu.yaml
+ ```
-```shell
-mpirun --allow-run-as-root -n 4 python train.py -c configs/squeezenet/squeezenet_1.0_gpu.yaml
-```
+ **预定义的训练策略**
+ MindCV目前提供了超过20种模型训练策略,在ImageNet上取得SoTA性能。
+ 具体的参数配置和详细精度性能汇总请见[`configs`](configs)文件夹。
+ 您可以便捷地将这些训练策略用于您的模型训练中以提高性能(复用或修改相应的yaml文件即可)。
-**预定义的训练策略** MindCV目前提前了超过20种模型训练策略,在ImageNet取得SoTA性能。具体的参数配置和详细精度性能汇总请见[`configs`](configs)文件夹。您可以便捷将这些训练策略用于您的模型训练中以提高性能(复用或修改相应的yaml文件即可)
+- 在ModelArts/OpenI平台上训练
+ 在[ModelArts](https://www.huaweicloud.com/intl/en-us/product/modelarts.html)或[OpenI](https://openi.pcl.ac.cn/)云平台上进行训练,需要执行以下操作:
-- 在ModelArts/OpenI平台上训练
+ ```text
+ 1、在云平台上创建新的训练任务。
+ 2、在网站UI界面添加运行参数`config`,并指定yaml配置文件的路径。
+ 3、在网站UI界面添加运行参数`enable_modelarts`并设置为True。
+ 4、在网站上填写其他训练信息并启动训练任务。
+ ```
-在[ModelArts](https://www.huaweicloud.com/intl/en-us/product/modelarts.html)或[OpenI](https://openi.pcl.ac.cn/)云平台上进行训练,需要执行以下操作,:
+**静态图和动态图模式**
-```
-1、在云平台上创建新的训练任务。
-2、在网站UI界面添加运行参数`config`,并指定yaml配置文件的路径。
-3、在网站UI界面添加运行参数`enable_modelarts`并设置为True。
-4、在网站上填写其他训练信息并启动培训任务。
+在默认情况下,模型训练(`train.py`)在MindSpore上以[图模式](https://www.mindspore.cn/tutorials/zh-CN/r1.8/advanced/pynative_graph/mode.html) 运行,该模式使用静态图编译,对性能和并行计算进行了优化。
+相比之下,[pynative模式](https://www.mindspore.cn/tutorials/zh-CN/r1.8/advanced/pynative_graph/mode.html#%E5%8A%A8%E6%80%81%E5%9B%BE)的优势在于灵活性和易于调试。为了方便调试,您可以将参数`--mode`设为1以将运行模式设置为调试模式。
+
+**混合模式**
+
+[基于ms_function的混合模式](https://www.mindspore.cn/tutorials/zh-CN/r1.8/advanced/pynative_graph/combine.html) 是兼顾了MindSpore效率和灵活性的混合模式。用户可通过`train_with_func.py`脚本使用该混合模式进行训练。
+
+```shell
+python train_with_func.py --model=resnet50 --dataset=cifar10 --dataset_download --epoch_size=10
```
+> 注:此为试验性质的训练脚本,仍在改进,在MindSpore 1.8.1或更早版本上使用此模式目前并不稳定。
+
### 模型验证
使用`validate.py`可以便捷地验证训练好的模型。
@@ -199,30 +174,18 @@ mpirun --allow-run-as-root -n 4 python train.py -c configs/squeezenet/squeezenet
python validate.py --model=resnet50 --dataset=imagenet --data_dir=/path/to/data --ckpt_path=/path/to/model.ckpt
```
-- 训练过程中进行验证
+**训练过程中进行验证**
当需要在训练过程中,跟踪模型在测试集上精度的变化时,请启用参数`--val_while_train`,如下
```shell
python train.py --model=resnet50 --dataset=cifar10 \
- --val_while_train --val_split=test --val_interval=1
+ --val_while_train --val_split=test --val_interval=1
```
各轮次的训练损失和测试精度将保存在`{ckpt_save_dir}/results.log`中。
-
-- 静态图和动态图模式
-
-在默认情况下,模型训练(`train.py`)在MindSpore上以[图模式](https://www.mindspore.cn/tutorials/zh-CN/r1.8/advanced/pynative_graph/mode.html) 运行,该模式对使用静态图编译对性能和并行计算进行了优化。相比之下,[pynative模式](https://www.mindspore.cn/tutorials/zh-CN/r1.8/advanced/pynative_graph/mode.html#%E5%8A%A8%E6%80%81%E5%9B%BE)的优势在于灵活性和易于调试。
-为了方便调试,您可以将参数`--mode`设为1以将运行模式设置为调试模式。
-
-[基于ms_function的混合模式](https://www.mindspore.cn/tutorials/zh-CN/r1.8/advanced/pynative_graph/combine.html) 是兼顾了MindSpore的效率和灵活的混合模式。用户可通过使用`train_with_func.py`文件来使用该混合模式进行训练。
-
-```shell
-python train_with_func.py --model=resnet50 --dataset=cifar10 --dataset_download --epoch_size=10
-```
-
-> 注:此为试验性质的训练脚本,仍在改进,在1.8.1或更早版本的MindSpore上使用此模式目前并不稳定。
+更多训练和验证的示例请见[示例](examples/scripts)。
## 教程
@@ -232,14 +195,14 @@ python train_with_func.py --model=resnet50 --dataset=cifar10 --dataset_download
- [模型推理](docs/zh/tutorials/inference.md)
- [自定义数据集上的模型微调训练](docs/zh/tutorials/finetune.md)
- [如何自定义模型]() //coming soon
-- [视觉ransformer性能优化]() //coming soon
+- [视觉Transformer性能优化]() //coming soon
- [部署推理服务](docs/zh/tutorials/deployment.md)
## 模型列表
目前,MindCV支持以下模型。
-
+
支持模型
* Big Transfer ResNetV2 (BiT) - https://arxiv.org/abs/1912.11370
@@ -286,7 +249,7 @@ python train_with_func.py --model=resnet50 --dataset=cifar10 --dataset_download
## 支持算法
-
+
支持算法列表
* 数据增强
@@ -295,13 +258,13 @@ python train_with_func.py --model=resnet50 --dataset=cifar10 --dataset_download
* [Repeated Augmentation](https://openaccess.thecvf.com/content_CVPR_2020/papers/Hoffer_Augment_Your_Batch_Improving_Generalization_Through_Instance_Repetition_CVPR_2020_paper.pdf)
* RandErasing (Cutout)
* CutMix
- * Mixup
+ * MixUp
* RandomResizeCrop
* Color Jitter, Flip, etc
* 优化器
* Adam
- * Adamw
- * [Lion](https://arxiv.org/abs/2302.06675)
+ * AdamW
+ * [Lion](https://arxiv.org/abs/2302.06675)
* Adan (experimental)
* AdaGrad
* LAMB
@@ -329,9 +292,22 @@ python train_with_func.py --model=resnet50 --dataset=cifar10 --dataset_download
-## 日志
+## 更新
+
+- 2023/5/30
+1. 新模型:
+ - [VGG](configs/vgg)混合精度(O2)版本
+ - [GhostNet](configs/ghostnet)
+ - [MobileNetV2](configs/mobilenetv2) 和 [MobileNetV3](configs/mobilenetv3)混合精度(O3)版本
+ - [RegNet](configs/regnet)的(x,y)_(200,400,600,800)mf版本
+ - [RepVGG](configs/repvgg)的b1g2, b1g4 & b2g4版本
+ - [MnasNet](configs/mnasnet)的0.5版本
+ - [PVTv2](configs/pvt_v2)的b3 & b4版本
+2. 新特性:
+ - 3-Augment, Augmix, TrivialAugmentWide
+3. 错误修复:
+ - ViT 池化模式
-### 更新
- 2023/04/28
1. 增添了一些新模型,列出如下:
- [VGG](configs/vgg)
@@ -361,68 +337,64 @@ python train_with_func.py --model=resnet50 --dataset=cifar10 --dataset_download
- ResNet50精度从76.64提升到76.69
- ResNet101精度从77.63提升到78.24
- ResNet152精度从78.63提升到78.72
-2. 按照规则(model_scale-sha256sum.ckpt)更新预训练权重名字和相应下载URL链接。
+2. 按照规则(model_scale-sha256sum.ckpt)更新预训练权重名字和相应下载URL链接
- 2023/03/05
1. 增加Lion (EvoLved Sign Momentum)优化器,论文 https://arxiv.org/abs/2302.06675
- - Lion所使用的学习率一般比Adamw小3到10倍,而权重衰减(weigt_decay)要大3到10倍.
+ - Lion所使用的学习率一般比AdamW小3到10倍,而权重衰减(weight_decay)要大3到10倍
2. 增加6个模型及其训练策略、预训练权重:
- - [HRNet](configs/hrnet)
- - [SENet](configs/senet)
- - [GoogLeNet](configs/googlenet)
- - [Inception V3](configs/inception_v3)
- - [Inception V4](configs/inception_v4)
- - [Xception](configs/xception)
-3. Support gradient clip
+ - [HRNet](configs/hrnet)
+ - [SENet](configs/senet)
+ - [GoogLeNet](configs/googlenet)
+ - [Inception V3](configs/inception_v3)
+ - [Inception V4](configs/inception_v4)
+ - [Xception](configs/xception)
+3. 支持梯度裁剪
- 2023/01/10
-1. MindCV v0.1发布! 支持通过PyPI安装 (`pip install mindcv`).
+1. MindCV v0.1发布! 支持通过PyPI安装 (`pip install mindcv`)
2. 新增4个模型的预训练权重及其策略: googlenet, inception_v3, inception_v4, xception
- 2022/12/09
-
-1. 支持在所有学习率策略中添加学习率预热操作,除cosine decay策略外。
-2. 支持`Repeated Augmenation`操作,可以通过`--aug_repeats`对其进行设置,设置值应大于1(通常为3或4)。
-3. 支持EMA。
-4. 通过支持mixup和cutmix操作进一步优化BCE损失函数。
+1. 支持在所有学习率策略中添加学习率预热操作,除cosine decay策略外
+2. 支持`Repeated Augmentation`操作,可以通过`--aug_repeats`对其进行设置,设置值应大于1(通常为3或4)
+3. 支持EMA
+4. 通过支持mixup和cutmix操作进一步优化BCE损失函数
- 2022/11/21
-
-1. 支持模型损失和正确率的可视化。
-2. 支持伦次维度的cosine decay策略的学习率预热操作(之前仅支持步维度)。
+1. 支持模型损失和正确率的可视化
+2. 支持轮次维度的cosine decay策略的学习率预热操作(之前仅支持步维度)
- 2022/11/09
-
-1. 支持2个ViT预训练模型。
-2. 支持RandAugment augmentation操作。
-3. 提高了CutMix操作的可用性,CutMix和Mixup目前可以一起使用。
-4. 解决了学习率画图的bug。
+1. 支持2个ViT预训练模型
+2. 支持RandAugment augmentation操作
+3. 提高了CutMix操作的可用性,CutMix和Mixup目前可以一起使用
+4. 解决了学习率画图的bug
- 2022/10/12
-
-1. BCE和CE损失函数目前都支持class-weight config操作、label smoothing操作、auxilary logit input操作(适用于类似Inception模型)。
+1. BCE和CE损失函数目前都支持class-weight config操作、label smoothing操作、auxiliary logit input操作(适用于类似Inception模型)
- 2022/09/13
+1. 支持Adan优化器(试用版)
-1. 支持Adan优化器(试用版)。
-
-### 贡献方式
+## 贡献方式
欢迎开发者用户提issue或提交代码PR,或贡献更多的算法和模型,一起让MindCV变得更好。
-有关贡献指南,请参阅[CONTRIBUTING.md](CONTRIBUTING.md)。请遵循[模型编写指南](docs/zh/how_to_guides/write_a_new_model.md)所规定的规则来贡献模型接口:)
+有关贡献指南,请参阅[CONTRIBUTING.md](CONTRIBUTING.md)。
+请遵循[模型编写指南](docs/zh/how_to_guides/write_a_new_model.md)所规定的规则来贡献模型接口:)
-### 许可证
+## 许可证
本项目遵循[Apache License 2.0](LICENSE.md)开源协议。
-### 致谢
+## 致谢
MindCV是由MindSpore团队、西安电子科技大学、西安交通大学联合开发的开源项目。
衷心感谢所有参与的研究人员和开发人员为这个项目所付出的努力。
十分感谢 [OpenI](https://openi.pcl.ac.cn/) 平台所提供的算力资源。
-### 引用
+## 引用
如果你觉得MindCV对你的项目有帮助,请考虑引用:
diff --git a/RELEASE.md b/RELEASE.md
index 46f39cf6f..88c451d4d 100644
--- a/RELEASE.md
+++ b/RELEASE.md
@@ -1,6 +1,97 @@
-
# Release Note
+- 2023/5/30
+1. New Models:
+ - AMP(O2) version of [VGG](configs/vgg)
+ - [GhostNet](configs/ghostnet)
+ - AMP(O3) version of [MobileNetV2](configs/mobilenetv2) and [MobileNetV3](configs/mobilenetv3)
+ - (x,y)_(200,400,600,800)mf of [RegNet](configs/regnet)
+ - b1g2, b1g4 & b2g4 of [RepVGG](configs/repvgg)
+ - 0.5 of [MnasNet](configs/mnasnet)
+ - b3 & b4 of [PVTv2](configs/pvt_v2)
+2. New Features:
+ - 3-Augment, Augmix, TrivialAugmentWide
+3. Bug Fixes:
+ - ViT pooling mode
+
+- 2023/04/28
+1. Add some new models, listed as follows
+ - [VGG](configs/vgg)
+ - [DPN](configs/dpn)
+ - [ResNet v2](configs/resnetv2)
+ - [MnasNet](configs/mnasnet)
+ - [MixNet](configs/mixnet)
+ - [RepVGG](configs/repvgg)
+ - [ConvNeXt](configs/convnext)
+ - [Swin Transformer](configs/swintransformer)
+ - [EdgeNeXt](configs/edgenext)
+ - [CrossViT](configs/crossvit)
+ - [XCiT](configs/xcit)
+ - [CoAT](configs/coat)
+ - [PiT](configs/pit)
+ - [PVT v2](configs/pvt_v2)
+ - [MobileViT](configs/mobilevit)
+2. Bug fix:
+ - Setting the same random seed for each rank
+ - Checking if options from yaml config exist in argument parser
+ - Initializing flag variable as `Tensor` in Optimizer `Adan`
+
+## 0.2.0
+
+- 2023/03/25
+1. Update checkpoints for pretrained ResNet for better accuracy
+ - ResNet18 (from 70.09 to 70.31 @Top1 accuracy)
+ - ResNet34 (from 73.69 to 74.15 @Top1 accuracy)
+ - ResNet50 (from 76.64 to 76.69 @Top1 accuracy)
+ - ResNet101 (from 77.63 to 78.24 @Top1 accuracy)
+ - ResNet152 (from 78.63 to 78.72 @Top1 accuracy)
+2. Rename checkpoint file name to follow naming rule ({model_scale-sha256sum.ckpt}) and update download URLs.
+
+- 2023/03/05
+1. Add Lion (EvoLved Sign Momentum) optimizer from paper https://arxiv.org/abs/2302.06675
+ - To replace adamw with lion, LR is usually 3-10x smaller, and weight decay is usually 3-10x larger than adamw.
+2. Add 6 new models with training recipes and pretrained weights for
+ - [HRNet](configs/hrnet)
+ - [SENet](configs/senet)
+ - [GoogLeNet](configs/googlenet)
+ - [Inception V3](configs/inception_v3)
+ - [Inception V4](configs/inception_v4)
+ - [Xception](configs/xception)
+3. Support gradient clip
+4. Arg name `use_ema` changed to **`ema`**, add `ema: True` in yaml to enable EMA.
+
+## 0.1.1
+
+- 2023/01/10
+1. MindCV v0.1 released! It can be installed via PyPI `pip install mindcv` now.
+2. Add training recipe and trained weights of googlenet, inception_v3, inception_v4, xception
+
+## 0.1.0
+
+- 2022/12/09
+1. Support lr warmup for all lr scheduling algorithms besides cosine decay.
+2. Add repeated augmentation, which can be enabled by setting `--aug_repeats` to be a value larger than 1 (typically, 3 or 4 is a common choice).
+3. Add EMA.
+4. Improve BCE loss to support mixup/cutmix.
+
+- 2022/11/21
+1. Add visualization for loss and acc curves
+2. Support epochwise lr warmup cosine decay (previous is stepwise)
+
+- 2022/11/09
+1. Add 7 pretrained ViT models.
+2. Add RandAugment augmentation.
+3. Fix CutMix efficiency issue and CutMix and Mixup can be used together.
+4. Fix lr plot and scheduling bug.
+
+- 2022/10/12
+1. Both BCE and CE loss now support class-weight config, label smoothing, and auxiliary logit input (for networks like inception).
+
+## 0.0.1-beta
+
+- 2022/09/13
+1. Add Adan optimizer (experimental)
+
## MindSpore Computer Vision 0.0.1
### Models
diff --git a/docs/en/index.md b/docs/en/index.md
index b8451e993..11c9a73de 100644
--- a/docs/en/index.md
+++ b/docs/en/index.md
@@ -85,7 +85,7 @@ Below are a few code snippets for your taste.
- Classify the dowloaded image with a pretrained SoTA model:
+ Classify the downloaded image with a pretrained SoTA model:
```pycon
>>> !python infer.py --model=swin_tiny --image_path='./dog.jpg'
@@ -156,7 +156,7 @@ It is easy to train your model on a standard or customized dataset using `train.
[Pynative mode with ms_function](https://www.mindspore.cn/tutorials/zh-CN/r1.8/advanced/pynative_graph/combine.html) is a mixed mode for comprising flexibility and efficiency in MindSpore. To apply pynative mode with ms_function for training, please run `train_with_func.py`, e.g.,
- ``` shell
+ ```shell
python train_with_func.py --model=resnet50 --dataset=cifar10 --dataset_download --epoch_size=10
```
@@ -201,42 +201,42 @@ We provide the following jupyter notebook tutorials to help users learn to use M
Supported algorithms
* Augmentation
- * [AutoAugment](https://arxiv.org/abs/1805.09501)
- * [RandAugment](https://arxiv.org/abs/1909.13719)
- * [Repeated Augmentation](https://openaccess.thecvf.com/content_CVPR_2020/papers/Hoffer_Augment_Your_Batch_Improving_Generalization_Through_Instance_Repetition_CVPR_2020_paper.pdf)
- * RandErasing (Cutout)
- * CutMix
- * MixUp
- * RandomResizeCrop
- * Color Jitter, Flip, etc
+ * [AutoAugment](https://arxiv.org/abs/1805.09501)
+ * [RandAugment](https://arxiv.org/abs/1909.13719)
+ * [Repeated Augmentation](https://openaccess.thecvf.com/content_CVPR_2020/papers/Hoffer_Augment_Your_Batch_Improving_Generalization_Through_Instance_Repetition_CVPR_2020_paper.pdf)
+ * RandErasing (Cutout)
+ * CutMix
+ * MixUp
+ * RandomResizeCrop
+ * Color Jitter, Flip, etc
* Optimizer
- * Adam
- * AdamW
- * [Lion](https://arxiv.org/abs/2302.06675)
- * Adan (experimental)
- * AdaGrad
- * LAMB
- * Momentum
- * RMSProp
- * SGD
- * NAdam
+ * Adam
+ * AdamW
+ * [Lion](https://arxiv.org/abs/2302.06675)
+ * Adan (experimental)
+ * AdaGrad
+ * LAMB
+ * Momentum
+ * RMSProp
+ * SGD
+ * NAdam
* LR Scheduler
- * Warmup Cosine Decay
- * Step LR
- * Polynomial Decay
- * Exponential Decay
+ * Warmup Cosine Decay
+ * Step LR
+ * Polynomial Decay
+ * Exponential Decay
* Regularization
- * Weight Decay
- * Label Smoothing
- * Stochastic Depth (depends on networks)
- * Dropout (depends on networks)
+ * Weight Decay
+ * Label Smoothing
+ * Stochastic Depth (depends on networks)
+ * Dropout (depends on networks)
* Loss
- * Cross Entropy (w/ class weight and auxiliary logit support)
- * Binary Cross Entropy (w/ class weight and auxiliary logit support)
- * Soft Cross Entropy Loss (automatically enabled if mixup or label smoothing is used)
- * Soft Binary Cross Entropy Loss (automatically enabled if mixup or label smoothing is used)
+ * Cross Entropy (w/ class weight and auxiliary logit support)
+ * Binary Cross Entropy (w/ class weight and auxiliary logit support)
+ * Soft Cross Entropy Loss (automatically enabled if mixup or label smoothing is used)
+ * Soft Binary Cross Entropy Loss (automatically enabled if mixup or label smoothing is used)
* Ensemble
- * Warmup EMA (Exponential Moving Average)
+ * Warmup EMA (Exponential Moving Average)
diff --git a/docs/zh/index.md b/docs/zh/index.md
index 15ad16e21..f6d703a9d 100644
--- a/docs/zh/index.md
+++ b/docs/zh/index.md
@@ -55,7 +55,7 @@ MindCV是一个基于 [MindSpore](https://www.mindspore.cn/) 开发的,致力
## 安装
-详情请见[安装](./installation.md)页面
+详情请见[安装](./installation.md)页面。
## 快速入门
@@ -73,7 +73,7 @@ MindCV是一个基于 [MindSpore](https://www.mindspore.cn/) 开发的,致力
# 创建模型
>>> network = mindcv.create_model('swin_tiny', pretrained=True)
# 验证模型的准确率
->>> !python validate.py - -model = swin_tiny - -pretrained - -dataset = imagenet - -val_split = validation
+>>> !python validate.py --model=swin_tiny --pretrained --dataset=imagenet --val_split=validation
{'Top_1_Accuracy': 0.808343989769821, 'Top_5_Accuracy': 0.9527253836317136, 'loss': 0.8474242982580839}
```
@@ -96,7 +96,7 @@ MindCV是一个基于 [MindSpore](https://www.mindspore.cn/) 开发的,致力
### 模型训练
-通过`train.py`,用户可以很容易地在标准数据集或自定义数据集上训练模型,用户可以通过外部变量或者yaml配文件来设置训练策略(如数据增强、学习路策略)。
+通过`train.py`,用户可以很容易地在标准数据集或自定义数据集上训练模型,用户可以通过外部变量或者yaml配置文件来设置训练策略(如数据增强、学习率策略)。
- 单卡训练
@@ -133,18 +133,17 @@ MindCV是一个基于 [MindSpore](https://www.mindspore.cn/) 开发的,致力
!!! tip "预定义的训练策略"
MindCV目前提前了超过20种模型训练策略,在ImageNet取得SoTA性能。
具体的参数配置和详细精度性能汇总请见[`configs`](https://github.com/mindspore-lab/mindcv/tree/main/configs)文件夹。
- 您可以便捷将这些训练策略用于您的模型训练中以提高性能(复用或修改相应的yaml文件即可)
-
+ 您可以便捷地将这些训练策略用于您的模型训练中以提高性能(复用或修改相应的yaml文件即可)。
- 在ModelArts/OpenI平台上训练
- 在[ModelArts](https://www.huaweicloud.com/intl/en-us/product/modelarts.html)或[OpenI](https://openi.pcl.ac.cn/)云平台上进行训练,需要执行以下操作,:
+ 在[ModelArts](https://www.huaweicloud.com/intl/en-us/product/modelarts.html)或[OpenI](https://openi.pcl.ac.cn/)云平台上进行训练,需要执行以下操作:
```text
1、在云平台上创建新的训练任务。
2、在网站UI界面添加运行参数`config`,并指定yaml配置文件的路径。
3、在网站UI界面添加运行参数`enable_modelarts`并设置为True。
- 4、在网站上填写其他训练信息并启动培训任务。
+ 4、在网站上填写其他训练信息并启动训练任务。
```
!!! tip "静态图和动态图模式"
@@ -160,7 +159,7 @@ MindCV是一个基于 [MindSpore](https://www.mindspore.cn/) 开发的,致力
python train_with_func.py --model=resnet50 --dataset=cifar10 --dataset_download --epoch_size=10
```
- > 注:此为试验性质的训练脚本,仍在改进,在1.8.1或更早版本的MindSpore上使用此模式目前并不稳定。
+ > 注:此为试验性质的训练脚本,仍在改进,在MindSpore 1.8.1或更早版本上使用此模式目前并不稳定。
### 模型验证
@@ -177,7 +176,7 @@ python validate.py --model=resnet50 --dataset=imagenet --data_dir=/path/to/data
```shell
python train.py --model=resnet50 --dataset=cifar10 \
- --val_while_train --val_split=test --val_interval=1
+ --val_while_train --val_split=test --val_interval=1
```
各轮次的训练损失和测试精度将保存在`{ckpt_save_dir}/results.log`中。
@@ -206,13 +205,13 @@ python validate.py --model=resnet50 --dataset=imagenet --data_dir=/path/to/data
* [Repeated Augmentation](https://openaccess.thecvf.com/content_CVPR_2020/papers/Hoffer_Augment_Your_Batch_Improving_Generalization_Through_Instance_Repetition_CVPR_2020_paper.pdf)
* RandErasing (Cutout)
* CutMix
- * Mixup
+ * MixUp
* RandomResizeCrop
* Color Jitter, Flip, etc
* 优化器
* Adam
* AdamW
- * [Lion](https://arxiv.org/abs/2302.06675)
+ * [Lion](https://arxiv.org/abs/2302.06675)
* Adan (experimental)
* AdaGrad
* LAMB
diff --git a/tutorials/imagenet1000_clsidx_to_labels.txt b/examples/data/imagenet1000_clsidx_to_labels.txt
similarity index 100%
rename from tutorials/imagenet1000_clsidx_to_labels.txt
rename to examples/data/imagenet1000_clsidx_to_labels.txt
diff --git a/infer.py b/infer.py
index 06889d736..af262fa81 100644
--- a/infer.py
+++ b/infer.py
@@ -47,7 +47,7 @@ def main():
logits = nn.Softmax()(network(ms.Tensor(img)))[0].asnumpy()
preds = np.argsort(logits)[::-1][:5]
probs = logits[preds]
- with open("./tutorials/imagenet1000_clsidx_to_labels.txt", encoding="utf-8") as f:
+ with open("./examples/data/imagenet1000_clsidx_to_labels.txt", encoding="utf-8") as f:
idx2label = ast.literal_eval(f.read())
# print(f"Predict result of {args.image_path}:")
cls_prob = {}
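For reference, here is a standalone sketch of how the relocated label file is consumed; the class index used below is an assumption based on the standard ImageNet-1k ordering and is not part of the patch:

```python
# The file stores a Python dict literal mapping class index -> label text,
# which is why infer.py parses it with ast.literal_eval.
import ast

with open("./examples/data/imagenet1000_clsidx_to_labels.txt", encoding="utf-8") as f:
    idx2label = ast.literal_eval(f.read())

print(idx2label[208])  # expected: "Labrador retriever" under the usual ImageNet-1k indexing
```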
diff --git a/setup.py b/setup.py
index 3d1fac604..9783485f6 100644
--- a/setup.py
+++ b/setup.py
@@ -1,19 +1,28 @@
#!/usr/bin/env python
+from pathlib import Path
+
from setuptools import find_packages, setup
+# read the contents of the README file for the PyPI long description
+this_directory = Path(__file__).parent
+long_description = (this_directory / "README.md").read_text(encoding="utf-8")
+
+# read the `__version__` global variable in `version.py`
exec(open("mindcv/version.py").read())
setup(
name="mindcv",
- author="MindSpore Ecosystem",
- author_email="mindspore-ecosystem@example.com",
+ author="MindSpore Lab",
+ author_email="mindspore-lab@example.com",
url="https://github.com/mindspore-lab/mindcv",
project_urls={
"Sources": "https://github.com/mindspore-lab/mindcv",
"Issue Tracker": "https://github.com/mindspore-lab/mindcv/issues",
},
description="A toolbox of vision models and algorithms based on MindSpore.",
+ long_description=long_description,
+ long_description_content_type="text/markdown",
license="Apache Software License 2.0",
include_package_data=True,
packages=find_packages(include=["mindcv", "mindcv.*"]),
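A quick local sanity check that the README picked up by `setup.py` is readable before it becomes the PyPI long description; this is a sketch assuming it is run from the repository root:

```python
# Verify README.md loads as UTF-8 and is non-empty.
from pathlib import Path

text = Path("README.md").read_text(encoding="utf-8")
print(f"{len(text)} characters; starts with: {text.lstrip().splitlines()[0]!r}")
```

The rendered Markdown can additionally be checked with `twine check` on a built distribution, assuming `build` and `twine` are installed.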
diff --git a/tutorials/README.md b/tutorials/README.md
deleted file mode 100644
index 322cfd37f..000000000
--- a/tutorials/README.md
+++ /dev/null
@@ -1 +0,0 @@
-This folder contains jupyter notebooks used in the tutorial.
diff --git a/tutorials/data/test/dog/dog.jpg b/tutorials/data/test/dog/dog.jpg
deleted file mode 100644
index 81c91b1d3778e40b3882b39a750cc08624a30112..0000000000000000000000000000000000000000
GIT binary patch
literal 0
HcmV?d00001
literal 33315
zC$F%)jyWtdcdq)W90&a2gD+UU5i$r(ie(#EN64G3vM?>%p
z$_&`v@i&<*UB_SO3Scxs%mUkBp{t)@;;Xc;uW-uyaXnD2(SnzG#0~lxa(0Y_=#6X1cC39fVT++
z?-rwpo*bt5{ab`1V&(g3B1iCQEhN|;E-!05jst$WyKmYO{&(YX=_<~{%m&SgcJ>&n
zXU3X8;4-u3u5A&7G(6t`2=15HZukalC`n-#WF=m+ikJ8Jaw`rRs;{hXsONQp--Qx1
zPpiy=_qdSX+U9vojBtv&v5*T=kcE_=+YQ0vOBsG0qj-q2&IEGlT+SD|hP>SlL~E5K
zQTKccrJq~-!3g(8w(g8<$N^2fp2X*8X=RN)$U_xy
zpvsud+5{H9Y!8q@-bE;O+vl`Io5+h{KVxPhPF4=?`|i{tvgvJ5+Ze!Af5bxo?KT|Y
z{l9@~&$^^3;O*nmn~~XU_2q{-Hs)bHdzIJQYGpXHFGTxU;B(smtYK}WD-QaBvBr2z
zRfzn8;$}{HJz3HAnBbjZF+Q$NaE;zjc96q->+2BK%qmcUsvIO_nOrv5Sdb
zsNM5^mf8748*9
zX7?Rs22oi*xXA>47m|)~bSL?vI)~5Abj>sfWfwY+&Ed!a)=>eMS}2dUbEZ~nvTRCG
z45-TziCOA@p-92}cOp9N(6=vjToq;BW-(n6#4R9IS9`sun7)s(WPg7s3RgP_VvE;3
z_qdwtTGhGF12Ev1Se}dbk!D-I7aiOlxx
z&1f0${j+*obN13*YU237$oxyzNnT#v7|XJHtO{62W&&wVGl&Pb8@DRuf6sUnHW5O;
zl`_2`gS6N6M){yA3`78T^(Sif-Y%D!>QkC$Ht$m&W1SyCWA~G1KfCA`^D5{fok+G(
z$C%S`PR?3?YYCrf_U`zP6_aO0j%E`aBHq2$WgC&T!&}+6PXvfyY0mD8q>lR4E^x-_
zOkOBTn`(1;L(z82HZHtd`mmrf;C~x3*>w+(ia1hi+Ww65$hHC-FC$8>c#Hjz?38+m
zOZ*S8tD0w>GHMMNPYb)7kkhlSDxdG4;T@0omgQ5a@h<)(w$BK;Ba6P}pATSCm}k@4
z9oujxr0uhmKGXEclAjA9YNz!88cL+E1>@zOY?aT6Tk^Z-P9B#6!LM3TYrYP97a_>Q
z7zPh##+m{lLTZW1#|NB7|W<#)J)y%K>O@)TzBmC{!GTE6CeCfp6g_8@7>CN
z7W!5-=a%IvEdVsDGr{AKhUIViS`{{v7eV6@GUoPz6!ULa#meiN+25+u5mRpmCtq`=
z({bj5ZQj`=l&
zS23?KB`jVpB`Z>3&Zk|$@2on|)SkOigd>CbSV*nt?r$iNSvdbQ_o+~6E7fI*f_D>D9BphWG&RB|65eJ!UvW-LM
zY&V4>u8LR#Ty1V9>z;wCny#uoR=~D@+e4}(RN~oE{Gtl}&~zefTu3e)4J#
zoY!K%1`B)R4hvY8s9M#*8C9c6I(HK4tDK!p$XvV=+Ld+h6Z6bcMvZ+8_VHZff^YKz
zf-1ky$C((S1=tb=Fk6JUs<-tq-h0TSb&{#Z#~pq=8!n*>2U<%4}Z^yu;?_pWb9{Ohvu
zxSKp1u&2M-te5s+l5^6L@LY&Lp8tDQbCf|&f0Kh@_`XCuD|kQ&rrBkFW_8bNxJ1oG
zS|rGR*nm97R6vU7p|B>p=oQm_(w*tli`BO)<`pEl#^XoHXE9{?o$OwRw>v0aJOt7v
z&nyF>sAos=91AIz*wtK2QYQ>1k2*S$Of7%Y*dduOy>mvRosC;j(Ot7R=rtzC8r%HBnIZ+kIkd2*Vtb5o@ZA|>r-gGVPE2LSR>um8q&^mOhnCnS@8ok
zFEZ4axcC_SQVr{*D{LJkWyRHAgo69YSW?Xf^*~zJK34WU{#Ac9`h2u(dLjwelQR2-
zCw2q8Vr%ko!eOwAtSF}Hn*W)Bkhp$MWZ-QP-PBPBu%=y|
zM!Y;TQwE;<$VvBTuA9vho9}!P#O}8u;laAjq6;uv#eD@{sN2X5O68xCF*$gFTrROf
zn7P;WyjXsei}+mRLw>L<(uv%`&qs@
z0|Qz0Qq6RIZU#KJVE|CxkcO3klOBy5#?IW!Ma-2TfPipaS+Pt^N902Y9&zNo+lRv_7aleJl5ggn3OEM*8h
z%~0_BY^3@i@VL?tssTW#J``BqhW_OzV;*NSLPT*vwG
zfA0Y+q&-mJ8QyR3n)i9~Gek01UsYjac#=QLW=BlX$R{%4&!%YM)J6rH_`I)Lz*Cui
zr7?yW_=YY`Mk>HZR;kpidq3MIkpbIuOqfak1ixJiaDP9Jmt3UtWIEOQbN=&=3$8mV
z1?bYl6#-52k{*lms7-ZtumxjY8WLr8PJQDpIuGVIev^~y6VSL7C~P0%hzxyJu>fbm
zk%MAy3*C2>TG|0yYQ%IsT*$?7J-ItDv7SQX7v0rhCd9A-<7vG#1@;RF*EhegZ^;AW20~&y9XtTC^U8NqYBqn2V%&S69VJfTr218X$
zN@u!zOXMc5DmKOK*hM#1?-o3Is8L6bjOu4rdPCu{hwplVL4xloa18$8XPkj
zC?;<>Pl1QMt`#)7uYlMJ$%&ljN{R(890X;
z9vZj4N8|S>lbM1r#)9O6F)IguhQ5$T+lDG!QZ4AJnQC65rbX&+2)R|k%D&YD-z1Pw0&TA
ziXfBJbTAQCR7$lQ6b&*C%~qQJct=@wUT;RAf=;qjpfUw>pJ#S3k^B#U$nxwFm^09M
z!P2Ko0WRRk@=yj1<*vhk@!Z`900tnL+18mWKozzTM?iKiiNGV5Z^;ZIduRD|N5TQ?0hBR^Q*)w&PV;N2r+I6bJku7&z%m9oSjNpj%VJl<*wGZw90E23
zazeCc@D{%aL)ENeU9K&6
zMbw9x_A@P;0x&R#bwjigG>VV+Ce=cj0WrIF331$Dw@+o0P`gCb1JU-E0}Vb}7_BWV
zj(ER%?|4+ysWIx|uul}UDeQkZDeK=O=hGKa)6k~5EVdtSg4vP3#5T7JHm1*Ou8J%%
zmv?{W;FV6qkbTDv>b3y8b)=vt1_=bgs4V$=_X|o{fsRI@a
zY;+Vc^4+GS&m1XyyLg}#mQwbIE^7J|SAW@es~xDB=NKCC=VB+veo^MwyWbSHU!(Y_
z9ZsKK22?LpbG+WW|b-+fMO=1(@K&KUC<-|70(#esJ7_S&CTU
zfXmJ&Xxkz~5tc0{(e1p>j<9DqGEjuJjzu;obR40?6W){
zV1_u{mMl2P#rd?*x2m&HOk4_Z-rrcD8d1kD`F7h|;`+
z^Nwi$`YRL0WBm+S4R@_!G#_vz$sA_rg>d&&S&4JBB3U
zUdq3MyM3GLz^_r#Wo3WhkD5rBB(f&4a_3;&@3UE7TgsChI@rqYq?@cfTfqplYyl{m
zmk_tDdxrPB-MJBXcx@A&U3YN8_QYNq6;*dOf}397MpY?7S~kcn+Vi9k6QINWf?U33
zb<;H~WvK}{(;(3bPlc0-0l%C-7jCYprP=P7ld#kD(iRklrX
zY(>W3c8E@iE5ZPSu-%RsECKQ>2gnu^u!ZofroLgP4vi^1+7z+5Jlz`1UuHKi$yeeU
z1AoJtjF}BX#FoZFsedA2<634UBxnj1{|n-W3JaIp+Ublf2&tb_#gD3-HHfJ2bxfCQy3cUayP(Q?
zT=)YMXb>m_H)>sTT)dc&z_FY9v#OSz!DcbY6o`#C?yxuIXI`Wsk%=uUx_p+7r=u{)
zZ#^+K6jbD-ymEp9k@j7^yaEaFj5%P%HkXCh#uv)
z*1h`xPkE;%jwnExATU*zmG|?Q^R&*WD-WC*5ren78S*D6->pt`2e6@g4veaeigp$j
zp1J6HH~;l4e+&VemzUejVCi9@U`G39Kaw->WznW@2V~hcEJ}#)vixxrdF01by1(+*
z&8PTi{V=9Sv-*YyyiweUy9?_$s7GpXUq{|vF$8)$8f3CVB1*eB1<&(gHl
zkSu&{ti)2@8`~P!9kGO4wOc0Xslf}(3%^q70t7x0%}r6c+A3?V=%WHal&Pb=jk#`)
zNLPu-2G@N*G(Jz*ZKSKuq8zS9J!d653D9cU$GZ!6qvi8^#bQ+(_11MHLajnYE`xZ>
zHwcm`I|`slz8w{3+W6m07
zs)`?qk5~A_T283&2L+*Ek1xBDY=L>-C<(stwf8XN1344_0kHixDFaMqktY+<>n|Pk
z0RSR4=sP@lRttY~N85wRZUrUDxe^;{Z66u%o5HygeQzG^Am$KMT2rdOnOEQZPA6JY
z>M=!v!EE;+3}1VgWm(ytUS$^~dN(mob=MR*TEvYXlZ^OaL|U*V7PHl#$7NpHWhyOk
zl(ep-I!V_YY?zqDpP;!0MLQ2GR$8sP`tp`yx_FW7swPgu&!{a;*=VkcF7FO6-yKD;
zL<1TObiig9^X#XPBb+r}#B+WL2=b2c&xwXxq;3Eu+I6w6EnOY7h_UDL2^Ahm`oUnG
z9XkCi%Vme%VFxVF(Q1=?n8oXVNc`;cZ4@&0i-Kxm_;2t(ccQ}FLJnyljbRGegZEV=
zy1hR|-8te!hnm-}r{*0|NNr+!PJBC^FM(Ia4IdDsg?1IKIls{QZh||1(zBnE)E#Wx
zOOJ9PN^R?;IOlk+VxHuEH~)(&oo~Sd)|bo}^H=h&sk;iixWdAv!zcFOqMHHbnKKS3
z{@dGf`cvZVV9%~FYWZEaT!5aq-p1IB$jN?QYwuH_yiSff;F_HKKeRR*KVq;W1xa(w
zFu$qVf0nkc7rw?tzAJ9{BM_EDpG8m0Ki7t9;zs4oH|I=LKFEU2tmd(jz3j3=CT0Rh
zcz|&^f(4=$j1G-vw<2noi-}`)*}QJE$Pi7~ure~e40+S{*pnaqlo7qDY+GlB^uW3L
zx4V+B+q^ew#n6Bg?ORH%Gsndbm}=9t^(}UQ#xhi@Z;{AtBMi#KX6Dteb12gaWAilh
z)3U+jL*gFcyJ(s^-nl={CQ3bLiMR1RWgyu2&Y}8@yO=@B3{RxaA%h)mUi-G7`6!
zQT|1r2KE9J7*@?oGtN^Mjs`#GZ?relOI=#ZbG|BTtDJtDS+RRQ6o1hy_$_r?swk<71x6Os-#TNd)xz&)>0Q~`X2bFXzr$6gKE
zWcdbX%$)^(U2n65Lu>erx*&yt^L8|JLdZkn2QPySzTy~@dDr~!PP-Ce()l!XS9eYO
z>t#O-oqX&ggg+EB3)EfEF`XT`#?Plp*9c()$bpYFC_
zkzo|Ukimd~zG@(3Hq%t`z#*()qI8f0MzkdD|D)U1va^;5!CZ$574o3BAI@9(JXz{D
zs*sb1HjW%kU3t;Hh8=``$*0G3X%V>o$kl=a3KJ7cJ^0h2VSuU8OGmkl-R57!xp#!%
zEdv$-vM(LaKf2tO_3if*sfh5hw?a!LQ)4XvZep@FYRc01%#5SeD(vGKIj2#Jc>zo<
z@m{uV%AU3dl)l00}km$S-lF8F6ek;D+%=vS%
zw-)!dfD&tO!uyn7-yXlk4s1ZzH?$TRB{@3p4f^xfGdnD9=el71Ctyn;y
z#8;`?O4KBe6|l*-@k%?oR&H&Gy}h4hc0HlU8%uK#d|wIbd7c7|Rx}~dy8&fig
zIS;G)ERXFPKq2>YK?S#gs$O?xHK6qocVleBG;CXHD|VF&dmTw3<5_9ovKZX6#}$3|J@wB(ie#SZ3=knR|(42
z6%%>)Ms(FET_@hSnq7^gV5(fKcF!QF5*DHsj%k>53dcxvvtFYL;h%1(JG#O?563cP
zlySj-(m9o8p2sB7Q#606o44pb$B+wCbNbk#`nIS}6f1B10>Jd^003`gkfX~N=>8+P
z-9i9BCWX|X^73~T79y^2Vjy36Tld
zJ#6pV(zDddC9iLVH=c)##7H*`G%
z#rA^jA#-!R_#_7Yp8Li}H~$xk%ft4U!EWrtDd3>k1um4-UD4E#{O<|_TabmT35NaU
znoyP>dUs44u{E*<{0XJT&LG~~U5M5Z=F@$A5PzlE^9w>j4UUk843;v=j*O{d6K);x
z>bx&|DxGXrzZuP03U1I6QtU8;(^Of$!s+14DO!_M*F0r2n|~XRtdU;V{sY90h6Pl$
z*$_>y_bgHETg)DIshAA?rns1k1JB3^)Xn`famR1b-K}o_s9?;4{lx^RA~RKYhqIJ_
zn_wS)bNy2@dIdSUkbI>&o68O71mSo$NqOC7`kN2rQV0<)H^IzY>f2n?e(K<
zNETRS?SdVD&SN2YuPbi8?DJx4|lwgLKls`
zUB2o3!SJBH+gvAoKQ^w|_ImG2?*=B-?8&F|!gj28jdUDb%TKorALFeJ*!vIj&{6O$
z8W!ddgZM}^-oG_t^2lsG-nst>j?!X;IIVsWwOsJ!kvmK;Wt_tSrXL$O!4XN@Mg+cc2UJEH5if5-O+667PUPp|=qC7s}?pQRoue@>NfvU_!@i6lOB2sBj
zZ$k)=Zqwiba}Y0nHh1%F)gIu)V^29fbba@R>`T_e>{)@&dRSE6bq7|XL^hzX=c0kC
zvL~3sYMmy@?
z6)+i;k3iuZSuR`tK=xc_>nXXZ(S(Ze;pg)Ei+SGV)_p-71MA>#x_p!Y3u$;5XtcADrqI4iY%jE(l
zql{BVYVYsBYj+)~4mzw;3dSj(WFdbjc9?Je=I7RCo)_U?>NVo3V8oJ^Y5meD(+J9`u^WWE=3+D*#sg$y%M>L
z29SkXag0EzlIl>^5#A(z#M|YIG9Q3!TfB1KgtJc*NX|t1H2jDE^xnw$gK@$
z(4!)8A>J$I#OK&d^W4(Zi>Pho$43P7_vTI=Ss@jG!ow~P%du1hoY(SBa9CBP&J(tt
zr=@b7W+i_oK{=yA&%4M>@VI4f`{33=Dp4)Jz@j}0=>1BtFLf(_vHz_8`8*bYLrDV^
z6BubuNP}ZalBqdxfM~^Z
zRHm%DwUymsUZX`p?yIv`(p{vggn94a`Be!0M#oTa@tgNf_k8MQBA7d-d*{)vB_>(#
zs@jVG0}$iEe!k6->4~BpCIr;?byD)J$DVJgFD!36L^Wvs+43z#R?=rPfu^RqwX=)G
znM^^V*mK6Ma#B=wQ;qL$Hg5{SRCceS*e-j>8Bs+VnS1`)(=$vD+I~}^e(X_!bzA(r
zPbRpCIE__7dA@|Vbnrtv2OCC`^ITc-00fYe%|^II!RI3Q6cKZ7>$P~92gNsD0GttX
z)`=yMzC8A{*o9>i-@K#4BKVV|Ti|+L22ASy6pnK4`Zv;tEf@g3Ah>ZtE!Tis?Hv<8
zj7i|m_ModWQtrl~wx#LJOk9Iw{w%=zy3ZO7U&y{ocgon)TwoPZbdrPdHyIi!p|BW`ZDE8^2Ru=F9^*Gr)*h$z5Mlx=
zYMaf>!pkFKo`Owh6fjKq$sT5f_zzBX>s`90*JW(xf04@`HL8+$AI
zR*LSr5)p=bKqTK*?+{%@xgD4;dxGxloE+5;Zb=?%WPv081H9VA*ZW?fHRViozH*Uj
zqbnfwTm~85Kw!OQ%rqLJmB9HYqznL%?D{qiWD3I6#1vgD)lK;_uA;t^iZQbQs;SjN
zdR>=?cK(-!ZY(h7V?LdE$_EU1XL&-7crQ@Ro-fa@QOCa91Ycs+je?*yABR8N)rgQ*
z5O0{gQvS{S7nJ}S6{iWvu)aO2Sh)2e^F^G$b40N{FLc`6{ip)@9yiE*NBvVlF|TBIFymOKOVWl?Fa}{(a8fRVAob_(0nd(Q;FGv{(EBDTEhf
zMX0^K!{4uFFM>P#`;lqs1!|sceO{xX3-(!7gR2*!y?L0GokM0W5uVM5;kAzUu;r3o
zA%4f?5aZ7-g4V|c&iCsrM=)i^r4A2!>O%dr%vDd=a#$lvA7`f?5(8X-0LSk=-q!Z_
zJ&QC)@jM`))_M)eSn&gs$@VDtoyKD^(qZ|1NN)@--geOb3L0gQHOnEtP5YHw9IL^H
zZ&y8C-7j{L5G-d{;pAJmM*Rb{t!wyFhgL+oTHN2f6iKI|C8gZ22KEw_$@wmg&(Qdl
zf9e??!bdbtIFh}4=&rqsbhZZL{mJwzMeT6%#}f)3*7?Cn@89_xG~Xi}EzY)X^>>kV
z517uw(U=RYToSXvmFhkb2>tgU_=nLpo^0!#nC
z6ED-2aOLk-A<_wxp>b`vA29#9-|3V>g)212JK-c6YuH6ojtSMPE~5Xt+n)_rEp|$d
zZA8i!6#p_jfb#lEdVClI8dF8*|AZ<`+EE9b8Nyi3dfCacGtoDvDJ{IUDQnyt^wp5_
z)8^$OwRyuH#kkz-={`5u7XpkrD(YEl7$Ym_^3M7e1|6vAP>JgKFh|+|WJ?x%{igFt
za$C2q-ZR2A@i^ONVg3WuNZXI?1;JjBgPY#Ps0QGVSOPSKYU*LqSO>>-lj@U2A5#wzaTI55>pjSfaYXE?k
zZI<@fCa75P;Vfnv^%Th&`bf!9W9io%Gi_(MFHG0-`r+R+^-<7iG1B#w_VAhLPa%9Y
z;@R#Cnsg8sGuYC!@9g%_H4;KAmMu)aN!RIdz~A%kf6l<+pcTbvh~-@QX3#&T*smMe
zAa0gJ%I4`9goj@=aW8xe`AA!B9_II#0Cw&H35sPC(sBfP3jQ$GHV{;Tdw@Y-9?2>8)*je>AnSN{Ey7qFma4fb=fdz_Kpe&OWhJjNi6I
z@IOGDD=?GmZ&hhy%q8=0o2!|8EJe(y{cNrhIen8INZ3x|02Z2(l#pwEARRQJ})(
zyPNYw&X946L-6K*f
z>>aYkKSHYD84WwN;6{bRtd!9vFa1{j83sz}}btg4u)cZH5;RqRj}x%Cif*
zFrJJ+j)BVr???K+!4jdqOvAFMy;m;uQ3j9uo?zsMmwZ*WfzJ;@5tT*Iv}B4oKyk>i
za9*(qV43)xArF;rr;W#@2AkjTFliYv@}4|?BYAAqn4_DL50n?34nI*o%w-Fa+e7
z<|XZWKL&(B;qth7yRKeVQhb^sO>yWe&>LjDoFs}=TJR>!oKAqMFT;|3{sY*XR6`Wr
z#8&*FQa}<)sp*(JEhHq+sh0$>z%RB}1Zq>&p^yfJi;l4%`5D)&w;yTpOe{(4bWq-7
zFV6k+FNLtnji{4*-Ytl{Vb>*k?k>P~bbqNb^t$~;<&0``zUQ^OD!R>Jy;s%aWt|oc
z!@tt@f-eKIE2upni6;x3dU{x6rf9o3Tatz7`yGT!zD0a|TeL$2HvDgHt&H4QnGQ{v
zI)TE{X|m@cGnhtlS@b0de_>QXuzN29j0ci=k;{)a#sY8(j~&r`9Im4m3PO1@AHf;5
z*77`U`IKD%NC3gWxOh&5&s2D+OPFC?
zYS!n53lEP7TOu;#Y3$Lf9qLBGv|m;w`5&Vk6z
zk$-UH5@xr!`S&Et2M+fKo;yV|hm$#LR$dUM*}Yf&M})^*QS!Ran=()mQ32Oi)BiRz@IcFJ%{Yoem?XGWp=I0ZVGril*c_p0HNQ4v(&m<=
z`6R0Tr2fEYUMAYb*!3^vxQJhB1`=~;r(~WPwuh-h{yTJS-!?ygkCs?@TK*5+Q964oma#G_aBim=@g
z^B246vxXX6)8CQ@@lVx7t96;=m>TzTT=4(*fQ;s>{~CLS`Eixet-tlxeRp9rHfGVN
zl6$yEJ|-Zgk(KCZ#cP5O1A+jo$%5T6J6t)LmrYXbYQS?+ma*qo*d4Uk8cr!fH0pnV
zK+NB52LwOopSloZGZU}Kwc~X<(liW+I@q3N3zVJz#F+rS?R&uJ#pIhbbL)Hg4#e?a
z{o=B5GT~l-PwwaGXxSH*dhQCZ3$@D<7EC3-LghX1^7@-xB_w@$^NabroR&7wI#1sy
zE_Va-g#+QgA*NBLd{RSQzz00Ss13O;>ymz^6EdrVh2M_2XO{nM;=;6h{JZoCO2GpU
zztiPWAFBjr&8E;Ie1?NM<8b~8DgvB$sK0U39#HeYsu%ZbpnxdAS4O3BViGC1)L7D+
zRC7c?=)F_8!6IhE1bnjR*yq|G$~8S(tm}*1Fk0i!04@Z-%5>dq1h~b5jq#p!j9o8z
z3vR`Kd(M0Oj(TTf!sQwj7vxbMo3xiweA?ROpvqfU|D~oQIQZNW>GWas7oZH+e|^?W
z*~rz&Yl7t(3i0o8G*Vc;c#GuidzknoWxnN=HA1$Dt|)`fi`Z?xl0wPKPB#OfRSK9B
z+i>*@%$RMxaTHSE-2TXlPJRD}htCqS!P4){$aL#=dBoAYbqBOC>ci^fE}z8%uFZ`$
z+!R@vAD(t0TBMwfd8acpNae|`YcW!B@7u3`SyU
zvp90Tbi~5lJ<3c5;eOUbW#ee|_?rMq^5U}sv2T{`G>4-&2p-SYbHNLl<>!`WWGVJQ
zC}FgVZIml@v#bI5(d*X=PkjMsS!V#CMeu!C_GJcG7{;eBwl54)2jfgOC%*pRsN~RA
zlY0<_I|GWbLmMj)zV>$a^x;}(3Fe5e<8)>y4l45V%zBm)edOMl@OwXR!Cm_4RPD!l
z{18F-A>*hFtZic(@0oQv1(F?Y9@fmlXG1!A#cG9NtiYHTJEGuG$A%w2cabgID772^
z0lfRF52{CSgo#xmswGYR7}TA(ke<{UTMCWzx`0D4^SVNg+gLy#Te$eJ+PoHC*n~5x
zroeToSIUphd{tUF;Or4+KWT_uH^QA~!YCQ*9Tq?#dKrePo0jYrQElV-O@OTXJ=)oA
z>}4YIXEw@U`r_~lJbE}Ms=hVh7N!UT8|4CG_z=2y
zYyPuU{uk2V7Dw7YbZfhD0Q(vPl9NL;J=UXohA2)iAV5fy=S5S;lL&%5>Pz^)b3a#Z
zv3EP2N4!BYUHK|cz{dStn#gXu0BH$gh8BR_l+-d!fSWGNUj*qf=ybHKh
z{810_FvyAK?VHPXeWF_xg2rCtlpi0PMY`T4c+D#~Ckao}EC9EIete{WzR;j`uO=qX
zRcWFBZsFBjOy0ds+wX)*e==I^8oa_55|GBt;D|t8f9NYbHx1F~H~BU4zdbPGBQk$%
zS45!FAnJ%%sHvGU_?C9S@`)ll&^!kJwq%?*yzl1h&J-oe)}ux`GlFQ%?E>o2yx7#c
z|J2{|mc9}hG19Rt&wF}tmkwjHmT<3oVDJ0206<7R%7nlzz{6bPUBHdk=7Pu1aT7A<
zARloVbZ95EL+za~HMKP(_rs2WbiMPV%JtT^9SCXz@+8XdHBA(1yWH{n?>5{8mbf@=
zM8ddBC@Ux52m=vKM^96JO#p;07p&{26i3W3OZ4Oo91VCYmGA3LTF=0PN7iZ1){Qey
zY7DjUrVuBoieo%nz_pDEzl!5pQsZUog^dR7EFrEcV*syJvh|h{_qxxq=
z>x#PsP*2YNkOu2rcHn7!kF9=hjS^lvyHh#laOC0~HztcG;=OC24ujpGh()SUIs)}r
zsQU$2K8!ixsVc4)TNJ=us2crDo)Z0
zf35WhSBuRy&9<9Kd5B~R)GDb;AT4S*t4qh6t(Pf@wRyI!2)yR9HaR7|kTHi42K5<8
z-34=Qas3TIVW_P_$KmY|MdpH(-6V_q&RRt(7mSFG)%J>{!^K_i6Ps#HvwyqdFBbg*
zDj=nNJatGK(OJ+m!t5@$j%VeYcEH|%B;Z@)-iMDaeeF_dq7Ff!2*{?v@ozFf(gO4O
zJEija=iVB0Jv5Afqy0d)v`cVv|2?|3EMfYi=a^oMp7KL8E2ZOPAJ^0<2EGZ+$!i}>
zCWL5JXTT08cNGAR+^yQ?fJtBMe}I&6V*rxn?a)E6g(=46WS6vH$6u2mO8*?e-0y?H
z-9_Wk8q#g^XgpS7`ps73YT;5zqdr|Lx5WMu!j~Ii+Lc1Os?Is4DUl|=f>4aekUzwV
z04_uC8hxpO!Qb7J#JmG|zUkFG+sks99ufC`C2s#OuHEyEtw`QAs-C$5I362HGlTAZ
z2J2@Hkhwr@hG${@4rGfI9pR{sV|(wjP%@So~mN
zwkbo5y|k>A0bfmMo29}13z1&8(g&n%0lLm_1a}m?-p&GXe5Fd=^;IB7R`pXr~Op}ZmJQ!+EmLO$szTYwC
zB^f~Rv(-Blud@j)Hbz3REsY%FsRIDGG1IrT0ryLaQX?b6<0d9kh|?Ltw;D>SNFA?oa
z^xsoiO4`ni7xYH2_di-G1|`(a6gP$Dac0fzUoo3m3K+ENHPiTKaAGG6>fBXp%)iJpuIJ95R`Eh=F
za=&40O~Rt-ep`~%Fc?p$Q&FM^5jhh
zF=8-b%(l#SBYMB6{~SyN6B(f5k1gjtTmA;0r5aEsKK*!-mP|jwYw$-0WxDW)2Fql@
zRwAnTrPS0q~bWZYN?FNVGc!j
zMAdqfcYZig56o^8h89Q8=6zI>!9?o`llpG4*x1#@8$%KZZ@%3_RBHY|Y5m^_$SDj(ObuZk-QFObqSpS`GTak^&c33AhU$b9HhH
zeB|pX-nE5mXqPpD=>hM2;-;v6S2ThFhsLr*VM+5b1m!2ZFexH8cW(2Uzkf!5|Jm